WO2016132678A1 - Wearable camera system and method for synchronously reproducing video data - Google Patents

Wearable camera system and method for synchronously reproducing video data Download PDF

Info

Publication number
WO2016132678A1
Authority
WO
WIPO (PCT)
Prior art keywords
video data
wearable camera
vehicle
camera
wearable
Prior art date
Application number
PCT/JP2016/000333
Other languages
French (fr)
Japanese (ja)
Inventor
稔 萩尾
康志 横光
和彦 山口
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2016132678A1 publication Critical patent/WO2016132678A1/en

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/91 Television signal processing therefor
    • H04N 5/93 Regeneration of the television signal or of selected parts thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to a wearable camera system including a wearable camera that can be worn on, for example, a human body, clothes, or a hat, and a video data synchronous reproduction method for reproducing video data captured by the wearable camera.
  • As prior art using a wearable camera, there is, for example, the wearable surveillance camera system described in Patent Document 1.
  • The wearable surveillance camera system shown in Patent Document 1 encodes a video signal and an audio signal from CCD camera means and microphone means attached to the body, together with a date/time information signal from built-in clock means, by encoding means housed in pouch means attached to the body, and records the date/time information, converted into character information, superimposed on the captured image.
  • The present disclosure provides a wearable camera system including at least a first wearable camera that can be worn by a user and a back-end server. The back-end server includes a communication unit that receives first video data captured by the first wearable camera and second video data captured by an external device, a storage unit that stores the first video data and the second video data received by the communication unit in association with each other, and a playback unit that plays back the first video data and the second video data stored in the storage unit. The first video data has first recording time information at which it was captured by the first wearable camera, the second video data has second recording time information at which it was captured by the external device, and the playback unit synchronously reproduces the first video data and the second video data using the first recording time information and the second recording time information.
  • The present disclosure also provides a video data synchronized playback method in a wearable camera system including at least a first wearable camera that can be worn by a user and a back-end server. In the method, first video data captured by the first wearable camera and second video data captured by an external device are received; the first video data, which has first recording time information captured by the first wearable camera, and the second video data, which has second recording time information captured by the external device, are stored in a storage unit in association with each other; and the first video data and the second video data stored in the storage unit are reproduced in synchronization using the first recording time information and the second recording time information.
  • FIG. 1 is an explanatory diagram regarding the outline of the wearable camera system of the present embodiment and the use of video data captured by the wearable camera.
  • FIG. 2 is a block diagram showing an example of the internal configuration of the in-vehicle camera system and the in-vehicle PC of the present embodiment.
  • FIG. 3 is a block diagram illustrating an example of an internal configuration of the wearable camera according to the present embodiment.
  • FIG. 4 is a block diagram showing an example of the internal configuration of the back-end server of this embodiment.
  • FIG. 5 is a diagram showing a state in which the user wears the wearable camera of the present embodiment.
  • FIG. 6 is a front view showing an example of the appearance of the wearable camera of the present embodiment.
  • FIG. 7 is a diagram illustrating an example of a data structure of recorded data captured by the in-vehicle camera.
  • FIG. 8 is a diagram illustrating an example of a data structure of recorded data captured by the wearable camera.
  • FIG. 9 is a diagram illustrating an example of a data structure of frame parameter information in video data.
  • FIG. 10 is a diagram illustrating an example of the data structure of the frame parameter information in the video data of the recording data captured by the in-vehicle camera.
  • FIG. 11 is a diagram illustrating an example of the data structure of the frame parameter information in the video data of the recording data captured by the wearable camera.
  • FIG. 12 is a flowchart for explaining an example of an operation procedure in which the wearable camera of the present embodiment synchronizes time with the in-vehicle camera system.
  • FIG. 13 is a flowchart for explaining an example of an operation procedure in which the back-end server of the present embodiment reproduces video data of recorded data captured by the wearable camera and the in-vehicle camera.
  • When a police officer wearing a wearable camera rushes to a designated site in a police car, for example, the police officer simply operates the wearable camera to capture and record the situation at the scene.
  • In addition, an in-vehicle camera mounted in the police car is operated to capture images from an angle different from that of the wearable camera.
  • Each piece of video data obtained by the wearable camera and the in-vehicle camera is transferred to the back-end server in the police station and then played back, so that the person in charge in the police station can compare the pieces of video data and observe the situation at the scene.
  • When a plurality of pieces of video data are obtained by imaging the same incident (for example, not only the pieces of video data captured by the above-described wearable camera and vehicle-mounted camera, but also pieces of video data captured by a plurality of wearable cameras), if the playback times of the multiple videos played back by the back-end server are not synchronized, their time-series context cannot be determined, making it difficult to grasp the situation at the site.
  • If, in the same incident, the target persons (that is, the subjects of the wearable camera or the in-vehicle camera) are different and the playback times of the multiple videos played back by the back-end server are not synchronized, it becomes difficult to grasp the time-series context of the situation. Even if the target person is the same in the same incident, the person in charge in the police station can confirm the chronological context by tracking the target person when each piece of video data is played back, but this takes time and effort. For this reason, if the reproduction times of the respective pieces of video data are not synchronized, there is an unsolved problem that it is difficult to grasp the situation at the site.
  • The present disclosure therefore has an object to provide a wearable camera system and a video data synchronous reproduction method that allow a user to easily grasp the on-site situation by synchronizing the reproduction times of the respective pieces of video data when reproducing a plurality of pieces of video data obtained by imaging the same incident.
  • Hereinafter, an embodiment that specifically discloses the wearable camera system and the video data synchronized playback method according to the present disclosure will be described in detail with reference to the drawings.
  • In the present embodiment, a wearable camera system is described that includes at least a wearable camera (BWC: Body Worn Camera) that can be worn by a user (for example, police officer 7; the same applies hereinafter) and a back-end server.
  • An image is captured by the wearable camera, and an image is also captured by an external device (for example, an in-vehicle camera or another wearable camera).
  • The video data captured by the wearable camera and the video data captured by the external device are received by the back-end server.
  • The back-end server stores the first video data having the first recording time information captured by the wearable camera and the second video data having the second recording time information captured by the external device in association with each other in the storage unit. Further, the back-end server uses the first recording time information and the second recording time information to reproduce the first video data and the second video data stored in the storage unit in synchronization.
  • FIG. 1 is an explanatory diagram regarding the outline of the wearable camera system 100 of this embodiment and the use of video data captured by the wearable camera 10.
  • the wearable camera 10 of this embodiment is an imaging device that a user (for example, police officer 7) can wear on a body, clothes, a hat, or the like.
  • The wearable camera 10 has a communication function for communicating (for example, by wireless communication) with an in-vehicle system 60 mounted on a car (for example, a police car in which the police officer rides) or with a back-end server (that is, the back-end server SV) in the police station to which the police officer 7 belongs.
  • The wearable camera 10 and the in-vehicle system 60 constitute a front-end system 100A, while the management software 70 on the network, the back-end server SV, and the in-station PC 71, which is a PC in the police station, constitute a back-end system 100B.
  • the management software 70 is executed by, for example, the in-station PC 71 or the back-end server SV.
  • the wearable camera system 100 is used in the police station 5, for example.
  • the police officer 7 operates the wearable camera 10 to image the situation on the scene or a specific subject (for example, a victim of the incident, a suspect, a visitor around the scene).
  • the wearable camera 10 transfers video data obtained by imaging to, for example, a back-end system 100B in the police station 5 in accordance with a user operation.
  • The user of the wearable camera 10 is not limited to a police officer; the camera may also be used in various other establishments (for example, a security company). In this embodiment, a police officer is mainly exemplified as the user.
  • The front-end system 100A includes the wearable camera 10 that can be worn by the police officer 7 dispatched to the forefront of the scene, a portable terminal (for example, a smartphone) held or used by the police officer or kept in the police car, and the in-vehicle system 60 installed in the police car 6.
  • the in-vehicle system 60 includes an in-vehicle camera 61, an in-vehicle recorder 62, an in-vehicle PC 63, a communication unit, and the like, and configures an in-vehicle camera system, a video management system, and the like.
  • The in-vehicle camera 61 is installed at a predetermined position of the police car 6 and captures images around the police car 6 (for example, the front of the police car or the back seat in the police car) at regular intervals or at predetermined timings. That is, the in-vehicle camera 61 includes, for example, a front camera (not shown) for imaging the front of the police car 6 and a back-seat camera (not shown) for imaging the back seat in the police car 6 (for example, the seat on which a suspect sits).
  • the video data captured by the in-vehicle camera 61 is accumulated in the in-vehicle recorder 62, for example, by executing a recording operation.
  • a plurality of in-vehicle cameras 61 may be provided.
  • the in-vehicle camera 61 may have a microphone (not shown) that collects sound inside and outside the police car 6 as a front camera or a back seat camera. In this case, it is also possible to pick up (record) sound produced by the police officer 7 or the suspect in the police car 6.
  • the in-vehicle recorder 62 stores video data captured by the in-vehicle camera 61.
  • the in-vehicle recorder 62 can acquire and store video data captured by the wearable camera 10. Further, the in-vehicle recorder 62 may manage meta information such as attribute information given to the video data.
  • The in-vehicle PC 63 may be a PC that is fixedly installed in the police car 6, or may be a wireless communication device such as a portable PC, a smartphone, a mobile phone, a tablet terminal, or a PDA (Personal Digital Assistant) used outside the police car 6.
  • the in-vehicle PC 63 enables cooperation between the in-vehicle system 60 and the wearable camera 10 (specifically, communication between the in-vehicle system 60 and the wearable camera 10) by executing management software (not shown). Further, a UI (User Interface) (for example, an operation device, a display device, and an audio output device) of the in-vehicle PC 63 is also used as a UI for operating the in-vehicle recorder 62.
  • When the police officer 7 is dispatched from the police station 5 for a predetermined assignment (for example, patrol), the police officer 7 wears the wearable camera 10 and boards the police car 6 equipped with the in-vehicle system 60 to head to the site.
  • In the front-end system 100A, for example, the in-vehicle camera 61 of the in-vehicle system 60 captures an image of the scene where the police car 6 arrives, and the police officer 7 gets off the police car 6 and captures a closer, more detailed image of the scene with the wearable camera 10.
  • Video data of a moving image or a still image captured by the wearable camera 10 is stored in a storage device such as a memory of the wearable camera 10, for example.
  • the wearable camera 10 transfers (uploads) various data including video data captured by the wearable camera 10 from the storage device of the wearable camera 10 to the back-end system 100B.
  • Various data including video data captured by the wearable camera 10 may be transferred directly from the wearable camera 10 to the back-end system 100B, or may be transferred to the back-end system 100B via the in-vehicle system 60.
  • Video data of a moving image or a still image captured by the in-vehicle camera 61 is stored in a storage such as a hard disk (HDD (Hard Disk Drive)), SSD (Solid State Drive) or the like provided in the in-vehicle recorder 62 of the in-vehicle system 60, for example.
  • the in-vehicle system 60 (for example, the in-vehicle recorder 62) transfers (uploads) various data including video data captured by the in-vehicle camera 61 from the storage of the in-vehicle system 60 to the back-end system 100B.
  • Data transfer to the back-end system 100B is performed, for example, by connecting via wireless communication from the field, or, when the patrol is completed and the police officer returns to the police station 5, via wireless communication, wired communication, or manually (for example, by carrying a storage medium).
  • an operation of transferring (uploading) video data captured by the wearable camera 10 to the back-end system 100B via the in-vehicle system 60 will be mainly described.
  • video data of a moving image or a still image captured by the wearable camera 10 is transferred from the wearable camera 10 to the in-vehicle recorder 62 of the in-vehicle system 60 and stored.
  • the moving image or still image data captured by the wearable camera 10 or the in-vehicle camera 61 and stored in the in-vehicle recorder 62 is transferred (uploaded) from the in-vehicle recorder 62 to the back-end system 100B.
  • the video data captured by the wearable camera 10 can be stored in the storage of the in-vehicle PC 63 and transferred (uploaded) from the in-vehicle PC 63 to the back-end system 100B.
  • The back-end system 100B includes the back-end server SV installed in the police station 5 or another location, the management software 70 for communicating with the front-end system 100A, and the in-station PC 71.
  • the back-end server SV includes a storage 308 configured using an HDD, SSD, or the like inside (see FIG. 4) or outside.
  • the back-end server SV accumulates video data and other data transferred from the front-end system 100A in the storage 308, and constructs a database used in each department in the police station.
  • the back-end server SV receives video data transferred from, for example, the wearable camera 10 or the in-vehicle system 60 (for example, the in-vehicle recorder 62) and stores it in the storage 308.
  • The video data stored in the back-end system 100B is used, for example, by the person in charge in the relevant department in the police station 5 for investigation and verification of the incident and, as required, is copied to a predetermined storage medium (for example, a DVD: Digital Versatile Disk) and submitted as evidence in a predetermined setting (for example, a trial).
  • Identification information of the police officer 7 (for example, an officer ID), identification information of the wearable camera 10 used by the police officer (for example, a camera ID), and identification information of the police car 6 used by the police officer 7 (for example, a car ID) are registered, so that, with respect to the recorded video data, the back-end server SV can clearly distinguish which police officer used which camera and when.
  • This registration is performed by the person in charge in the police station 5, or by the police officer 7 to be dispatched, operating the operation device (not shown) of the in-station PC 71, with the in-station PC 71 executing the management software 70.
  • Information other than the officer ID, camera ID, and car ID may also be input via the operation device of the in-station PC 71.
  • the management software 70 includes, for example, an application for managing the personnel of the police officer 7, an application for managing the dispatch of the police car 6 and the like, and an application for managing the taking-out of the wearable camera 10.
  • the management software 70 includes an application for searching and extracting specific video data based on attribute information from a plurality of video data stored in the back-end server SV, for example.
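  • As an illustration of such attribute-based search, the following minimal Python sketch filters stored video records by their attribute information; the record fields and example values are assumptions for illustration and are not part of the disclosure.

        from dataclasses import dataclass

        @dataclass
        class RecordedVideo:
            # Hypothetical record for one uploaded video; field names are assumed.
            camera_id: str
            officer_id: str
            attribute: str  # e.g. "theft", "drunk driving", "parking violation"
            path: str

        def search_by_attribute(videos: list[RecordedVideo], attribute: str) -> list[RecordedVideo]:
            # Return every stored video tagged with the given attribute information.
            return [v for v in videos if v.attribute == attribute]

        videos = [
            RecordedVideo("BWC-0001", "OFF-007", "theft", "/storage/rcd2_0001"),
            RecordedVideo("ICV-0002", "OFF-007", "parking violation", "/storage/rcd1_0002"),
        ]
        thefts = search_by_attribute(videos, "theft")  # -> the first record only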
  • FIG. 2 is a block diagram showing an example of the internal configuration of the in-vehicle camera system 65 and the in-vehicle PC 63.
  • the in-vehicle camera 61 and the in-vehicle recorder 62 constitute an in-vehicle camera system (ICV: In Car Video System) 65.
  • the in-vehicle camera 61 captures video at a predetermined frame rate, and generates frame parameter information (see FIG. 9) indicating frame information for each frame of video data obtained by the imaging.
  • the vehicle-mounted camera 61 generates video data by adding frame parameter information for each frame to the video data (see FIG. 7).
  • The frame parameter information will be described later with reference to FIG. 9.
  • The in-vehicle recorder 62 includes a CPU (Central Processing Unit) 101, a communication unit 102, a flash ROM (Read Only Memory) 104, a RAM (Random Access Memory) 105, a microcomputer (μCON, sometimes referred to as a microcontroller) 106, a GPS (Global Positioning System) 107, a GPIO (General Purpose Input / Output) 108, a switch 109, an LED (Light Emitting Diode) 110, and a storage 111.
  • the CPU 101 performs, for example, control processing for overall control of operations of each unit of the in-vehicle recorder 62, data input / output processing with other units, data calculation (calculation) processing, and data storage processing.
  • the communication unit 102 communicates with an external device (for example, the wearable camera 10 or 10A or the in-vehicle PC 63) via a wireless line or a wired line.
  • the wireless communication includes, for example, wireless LAN (W-LAN (Local Area Network)) communication, near field communication (NFC: Near Field Communication), and Bluetooth (registered trademark).
  • the wireless LAN communication is performed in accordance with, for example, the IEEE 802.11n standard of Wi-fi (registered trademark).
  • the wired communication includes, for example, wired LAN communication and USB (Universal Serial Bus) communication.
  • the CPU 101 and the communication unit 102 are connected via, for example, PCI (Peripheral Component InterConnect) or USB.
  • the communication unit 102 performs wireless communication or wired communication with the in-vehicle camera 61, the in-vehicle PC 63, the wearable camera 10, the in-station PC 71 of the police station 5, and the back-end server SV, for example.
  • the communication unit 102 is wirelessly connected to the wearable cameras 10 and 10A, and receives various data including video data transmitted from the wearable cameras 10 and 10A.
  • the flash ROM 104 is a memory that stores a program and data for controlling the CPU 101, for example, and holds various setting information.
  • the RAM 105 is a work memory used in the operation of the CPU 101, for example, and is provided with either one or a plurality of RAMs.
  • The microcomputer 106 is connected to each unit related to the external interface (for example, the GPS 107, the GPIO 108, the switch 109, and the LED 110) and performs control related to the external interface.
  • the microcomputer 106 is connected to the CPU 101 via, for example, a UART (Universal Asynchronous Receiver Transmitter).
  • The GPS 107 receives, for example, current position information and time information of the in-vehicle recorder 62 from a GPS transmitter (not shown) and outputs them to the CPU 101. This time information is used for correcting the system time of the in-vehicle recorder 62. Further, the in-vehicle camera 61 periodically accesses the in-vehicle recorder 62 and adjusts its time so that the system time of the in-vehicle camera 61 and the system time of the in-vehicle recorder 62 are synchronized (matched). Accordingly, the system times within the in-vehicle camera system 65 shown in FIG. 2 are synchronized.
  • The method by which the system time of the in-vehicle recorder 62 is corrected using the output of the GPS 107 described above, and by which the in-vehicle camera 61 periodically accesses the in-vehicle recorder 62, is not limited.
  • an in-vehicle recorder 62 may be provided with an NTP (Network Time Protocol) server, and the in-vehicle camera 61 may periodically access the in-vehicle recorder 62 as an NTP server.
  • Similarly, the wearable cameras 10 and 10A periodically access the in-vehicle recorder 62, so that the system times of the in-vehicle recorder 62 (that is, the in-vehicle camera system 65) and the wearable cameras 10 and 10A are synchronized.
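  • As one concrete way to realize this periodic access, the following Python sketch queries an NTP server assumed to run on the in-vehicle recorder 62, using the third-party ntplib package; the host address and the way the resulting offset is applied are assumptions.

        import time
        import ntplib  # third-party package: pip install ntplib

        def fetch_clock_offset(ntp_host: str) -> float:
            # Ask the recorder's NTP server for the estimated offset between
            # its clock and the local (camera) clock, in seconds.
            client = ntplib.NTPClient()
            response = client.request(ntp_host, version=3)
            return response.offset

        # "192.168.1.10" is a placeholder address for the in-vehicle recorder 62.
        offset = fetch_clock_offset("192.168.1.10")
        print("corrected system time:", time.time() + offset)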
  • the GPIO 108 is, for example, a parallel interface, and inputs and outputs signals between an external device (not shown) connected via the GPIO 108 and the CPU 101.
  • Various sensors (for example, a speed sensor, an acceleration sensor, and a door opening / closing sensor) may be connected via the GPIO 108.
  • the switch 109 is a switch such as a button provided as an input device for the user to perform operation input of the in-vehicle recorder 62.
  • The switch 109 includes, for example, a recording button for starting or stopping recording of the video data captured by the in-vehicle camera 61, and an adding button for giving attribute information or meta information to the video data captured by the in-vehicle camera 61.
  • The LED 110 is provided as a display device that indicates the operation state of the in-vehicle recorder 62. For example, by turning on, turning off, or blinking, the LED 110 indicates the power-on state (on / off state) of the in-vehicle recorder 62, the recording state, the connection state of the in-vehicle recorder 62 to the LAN, and the usage state of the LAN connected to the in-vehicle recorder 62.
  • the storage 111 is configured by, for example, an SSD or HDD, and stores and accumulates video data captured and recorded by the in-vehicle camera 61. In addition, when video data is transferred from the wearable cameras 10 and 10A, the storage 111 stores and accumulates video data captured and recorded by the wearable cameras 10 and 10A.
  • the storage 111 may store data other than video data.
  • the storage 111 is connected to the CPU 101 via, for example, SATA (Serial ATA). A plurality of storages 111 may be provided.
  • the in-vehicle PC 63 includes a CPU 201, an I / O (Input / Output) control unit 202, a communication unit 203, a memory 204, an input unit 205, a display unit 206, and a speaker 207.
  • the in-vehicle PC 63 can communicate with the in-vehicle recorder 62, and can also communicate with the back-end server SV of the back-end system 100B and the in-station PC 71.
  • the CPU 201 authenticates whether or not the police officer 7 can log in to the in-vehicle system 60 by, for example, an input operation of the police officer 7 on a login screen (not shown) to the in-vehicle system 60 displayed on the display unit 206.
  • the input operation of the police officer 7 is an operation for inputting, for example, an officer ID and a password.
  • Various kinds of information related to police officers permitted to log in are stored in advance in, for example, the memory 204, and the CPU 201 determines whether the police officer 7 may log in to the in-vehicle system 60 using the login permission information stored in advance in the memory 204.
  • the login may be a login to the in-vehicle system 60 via the in-vehicle PC 63, or a login to an application that operates the in-vehicle system 60 mounted in the in-vehicle PC 63.
  • The I / O control unit 202 controls data input / output between the CPU 201 and each unit of the in-vehicle PC 63 (for example, the communication unit 203, the input unit 205, the display unit 206, and the speaker 207), and relays data between them.
  • the I / O control unit 202 may be configured integrally with the CPU 201.
  • the communication unit 203 performs wired or wireless communication with, for example, the in-vehicle recorder 62 or the back-end system 100B side.
  • When the police officer 7 has logged in to the in-vehicle system 60, the communication unit 203 transfers (copies) the login information stored in the memory 204 to the wearable camera 10; when the police officer 7 has not logged in, the login information is not transferred to the wearable camera 10.
  • the login information includes, for example, an officer ID, a camera ID, and a car ID.
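  • A minimal sketch of this login information as a data structure follows; the field names and example values are assumptions for illustration.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class LoginInfo:
            # Login information transferred to the wearable camera (names assumed).
            officer_id: str  # which police officer logged in
            camera_id: str   # which wearable camera the officer took out
            car_id: str      # which police car the officer uses

        info = LoginInfo(officer_id="OFF-007", camera_id="BWC-0001", car_id="CAR-6")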
  • the memory 204 is configured by using, for example, RAM, ROM, nonvolatile or volatile semiconductor memory, functions as a work memory when the CPU 201 operates, and stores a predetermined program and data for operating the CPU 201.
  • the memory 204 stores login information related to the police officer 7 who is permitted to log in to the in-vehicle system 60, for example.
  • the input unit 205 is a UI for receiving an input operation of the police officer 7 and notifying the CPU 201 via the I / O control unit 202, and is a pointing device such as a mouse or a keyboard.
  • the input unit 205 may be configured using, for example, a touch panel or a touch pad that is arranged corresponding to the screen of the display unit 206 and can be operated with a finger of the police officer 7 or a stylus pen.
  • the display unit 206 is a display device configured using, for example, an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence), and displays various types of information.
  • When video data captured (recorded) by the wearable cameras 10 and 10A is input in response to, for example, an input operation of the police officer 7, the display unit 206 displays the video included in the video data on the screen under the instruction of the CPU 201.
  • When video data including sound picked up (recorded) by the wearable cameras 10 and 10A is input, the speaker 207 outputs the audio included in the video data.
  • the display unit 206 and the speaker 207 may be configured separately from the in-vehicle PC 63.
  • FIG. 3 is a block diagram showing an example of the internal configuration of the wearable cameras 10 and 10A.
  • the wearable camera 10 shown in FIG. 3 includes an imaging unit 11, a GPIO 12, a RAM 13, a ROM 14, and a storage unit 15.
  • the wearable camera 10 includes an EEPROM (Electrically Erasable Programmable ROM) 16, an RTC (Real Time Clock) 17, and a GPS 18.
  • the wearable camera 10 includes an MCU (Micro Controller Unit) 19, a communication unit 21, a USB 22, a contact terminal 23, a power supply unit 24, and a battery 25.
  • the wearable camera 10 includes a recording switch SW1, a snapshot switch SW2, and an attribute information addition switch SW3 as an example of an operation input unit.
  • the wearable camera 10 includes LEDs 26a, 26b, and 26c and a vibrator 27 as an example of a state display unit.
  • The imaging unit 11 includes, for example, an imaging lens 11a (see FIG. 6) and a solid-state imaging device such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • the imaging unit 11 captures a video at a predetermined frame rate, and outputs video data of the subject obtained by the imaging to the MCU 19.
  • the GPIO 12 is, for example, a parallel interface, and inputs and outputs signals between the recording switch SW1, the snapshot switch SW2, the attribute information addition switch SW3, the LEDs 26a to 26c, the vibrator 27, and the MCU 19.
  • various sensors for example, acceleration sensors are connected to the GPIO 12.
  • the RAM 13 is a work memory used in the operation of the MCU 19, for example.
  • the ROM 14 is a memory that stores in advance a program and data for controlling the MCU 19, for example.
  • the storage unit 15 is configured by a storage medium such as an SD memory, for example, and stores video data obtained by imaging by the imaging unit 11. When an SD memory is used as the storage unit 15, it can be attached to and detached from the housing body of the wearable cameras 10 and 10A.
  • the EEPROM 16 stores, for example, identification information (for example, a serial number as a camera ID) for identifying the wearable cameras 10 and 10A, and other setting information.
  • the other setting information includes, for example, login information (for example, car ID and officer ID) obtained by setting registration at the in-station PC 71 and logging into the in-vehicle recorder 62.
  • the RTC 17 counts the current time information and outputs it to the MCU 19.
  • the GPS 18 receives current position information and time information of the wearable cameras 10 and 10A from a GPS transmitter (not shown), and outputs them to the MCU 19. This time information is used for correcting the system time of the wearable cameras 10 and 10A.
  • The MCU 19 has a function as a control unit and performs, for example, control processing for overall control of the operations of each unit of the wearable cameras 10 and 10A, data input / output processing with each unit of the wearable cameras 10 and 10A, data calculation processing, and data storage processing, and operates in accordance with the programs and data stored in the ROM 14.
  • the MCU 19 uses, for example, the RAM 13, obtains current time information from the RTC 17, and obtains current position information from the GPS 18.
  • The communication unit 21 defines the connection between the communication unit 21 and the MCU 19 in, for example, the physical layer, which is the first layer of the OSI (Open Systems Interconnection) reference model, and performs wireless LAN (W-LAN) communication (for example, Wi-fi (registered trademark)).
  • the communication unit 21 may perform wireless communication such as near field communication (NFC) and Bluetooth (registered trademark).
  • the USB 22 is a serial bus and enables, for example, the wearable camera 10 to be connected to the in-vehicle system 60 or the in-station PC 71 in the police station 5.
  • the contact terminal 23 is a terminal for electrically connecting to a cradle (not shown) or an external adapter (not shown), and is connected to the MCU 19 via the USB 22 and connected to the power supply unit 24. Via the contact terminal 23, the wearable camera 10 can be charged and data including video data can be communicated.
  • The contact terminal 23 is provided with, for example, a "charging terminal V+", a "CON.DET terminal", "data terminals D- and D+", and a "ground terminal" (all not shown). The CON.DET terminal is a terminal for detecting a voltage and a voltage change.
  • the data terminals D- and D+ are terminals for transferring video data captured by the wearable camera 10 to an external PC or the like via, for example, a USB connector terminal.
  • the power supply unit 24 supplies the battery 25 with power supplied from a cradle or an external adapter via the contact terminal 23, for example, and charges the battery 25.
  • the battery 25 is constituted by a rechargeable secondary battery, for example, and supplies power to each part of the wearable camera 10.
  • the recording switch SW1 is a push button switch for inputting an operation instruction for starting / stopping recording (moving image capturing) by a user's pressing operation, for example.
  • the snapshot switch SW2 is, for example, a push button switch that inputs an operation instruction for capturing a still image by a user's pressing operation.
  • the attribute information addition switch SW3 is a push button switch for inputting an operation instruction for adding attribute information to the video data, for example, by a pressing operation of the user.
  • the recording switch SW1, the snapshot switch SW2, and the attribute information addition switch SW3 are arranged at positions that can be easily operated by the police officer 7 even in an emergency, for example (see, for example, FIG. 6).
  • Each of the switches SW1 to SW3 is not limited to the above form, and may be another form of operation input device that allows the user to input an operation instruction.
  • the LED 26 a is a display unit that indicates, for example, the power-on state (on / off state) of the wearable camera 10 and the state of the battery 25.
  • LED26b is a display part which shows the state (recording state) of the imaging operation of the wearable camera 10, for example.
  • LED26c is a display part which shows the state of the communication mode of the wearable camera 10, for example.
  • the MCU 19 detects input of each of the recording switch SW1, the snapshot switch SW2, and the attribute information addition switch SW3, and performs processing for the switch input that has been operated.
  • When the MCU 19 detects an operation input of the recording switch SW1, it controls the start or stop of the imaging operation in the imaging unit 11 and stores the video data obtained from the imaging unit 11 in the storage unit 15 as moving-image video data.
  • When the MCU 19 detects an operation input of the snapshot switch SW2, it stores the video data from the imaging unit 11 at the time the snapshot switch SW2 was operated in the storage unit 15 as still-image video data.
  • When the MCU 19 detects an operation input of the attribute information addition switch SW3, it assigns preset attribute information to the video data and stores the attribute information in the storage unit 15 in association with the video data. At this time, the MCU 19 detects, for example, the state of an attribute selection switch (not shown) for selecting the attribute to be associated with the video data, and assigns attribute information according to that setting. Further, the MCU 19 operates the communication unit 21 in a preset communication mode. When the recording operation is started, the MCU 19 drives the LEDs 26a to 26c and the vibrator 27 in accordance with a preset notification mode, and notifies the outside of the state of the recording operation by LED display and / or vibrator vibration.
  • the MCU 19 generates frame parameter information (see FIG. 9) indicating various types of frame information in the video data by using, for example, respective outputs of the RTC 17 and the GPS 18 for each frame of the video data output from the imaging unit 11.
  • the MCU 19 includes a system time counting unit 19A that holds the system time of the wearable cameras 10 and 10A, and may correct the system time using the output of the RTC 17, or may be synchronized with the system time of the in-vehicle camera system 65. Therefore, the system time may be corrected by periodically accessing the in-vehicle camera system 65 (for example, the in-vehicle recorder 62 as an NTP server) without using the output of the RTC 17.
  • The MCU 19 generates the frame parameter information using the output of the system time counting unit 19A (that is, the system time of the wearable cameras 10 and 10A), and further generates video data by adding the frame parameter information to the video data stream (see FIG. 8).
  • The frame parameter information will be described later with reference to FIG. 9.
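  • A minimal Python sketch of building this per-frame parameter block (the 4-byte second count, 2-byte millisecond part, and 4-byte frame number of FIG. 9) is shown below; the byte order and the exact placement of the block in the stream are assumptions not stated in the document.

        import struct
        import time

        # 4-byte "Record time (sec)", 2-byte "Record time (msec)", 4-byte
        # "Frame count"; the big-endian byte order (">") is an assumption.
        FRAME_PARAM = struct.Struct(">IHI")

        def make_frame_parameter(frame_count: int, system_time: float) -> bytes:
            # Split the system time into whole seconds since the reference
            # date and a millisecond remainder, then pack all three fields.
            sec = int(system_time)
            msec = int((system_time - sec) * 1000)
            return FRAME_PARAM.pack(sec, msec, frame_count)

        # e.g. appended to the encoded bytes of each frame of the video data:
        param = make_frame_parameter(frame_count=1, system_time=time.time())
        assert len(param) == 10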
  • FIG. 4 is a block diagram showing an example of the internal configuration of the back-end server SV of the present embodiment.
  • the back-end server SV illustrated in FIG. 4 includes a CPU 301, a memory 302, an I / O control unit 303, a communication unit 304, an input unit 305, an output unit 306, a storage control unit 307, and a storage 308.
  • the CPU 301 performs, for example, control processing for overall control of operations of each unit of the back-end server SV, data input / output processing with other units, data calculation (calculation) processing, and data storage processing.
  • The CPU 301, as an example of a playback unit, instructs the I / O control unit 303 to output to the output unit 306 the two pieces of recorded data RCD1 and RCD2 stored in the storage 308 (for example, the recorded data RCD1 captured and recorded by the in-vehicle camera system 65 and the recorded data RCD2 captured and recorded by the wearable camera 10).
  • As a result, the recorded data RCD1 captured and recorded by the in-vehicle camera system 65 and the recorded data RCD2 captured and recorded by the wearable camera 10 are each reproduced by the output unit 306.
  • When reproducing the two pieces of recorded data RCD1 and RCD2, the CPU 301 arranges two screens (windows) side by side on the output unit 306 (for example, a display) for comparison and plays the recorded data back with their recording times synchronized.
  • A method of reproducing the two pieces of recorded data RCD1 and RCD2 so that their recording times are synchronized will be described later with reference to FIG. 13.
  • the person in charge in the police station having jurisdiction over the back-end system 100B can grasp the situation on the site extensively and clearly.
  • the memory 302 is configured using, for example, RAM, ROM, nonvolatile or volatile semiconductor memory, functions as a work memory when the CPU 301 operates, and stores a predetermined program and data for operating the CPU 301.
  • When the CPU 301 instructs the I / O control unit 303 to reproduce (output) the two pieces of recorded data RCD1 and RCD2 (see FIGS. 7 and 8) read from the storage 308, the memory 302 temporarily stores the recorded data RCD1 and RCD2.
  • The I / O control unit 303 performs control related to data input / output between the CPU 301 and each unit of the back-end server SV (for example, the communication unit 304, the input unit 305, and the output unit 306), and relays data between them.
  • the I / O control unit 303 may be configured integrally with the CPU 301.
  • the communication unit 304 performs wired or wireless communication with the in-vehicle camera system 65 of the front end system 100A or the wearable cameras 10 and 10A via the management software 70.
  • the communication unit 304 as an example of a receiving unit receives the recording data RCD1 and RCD2 transmitted from the in-vehicle camera system 65.
  • the communication unit 304 may receive the recording data RCD2 directly transmitted from the wearable cameras 10 and 10A.
  • The input unit 305 is a UI that receives input operations of the person in charge in the police station 5 having jurisdiction over the back-end system 100B and notifies the CPU 301 via the I / O control unit 303, and is, for example, a pointing device such as a mouse, or a keyboard.
  • the input unit 305 may be configured using, for example, a touch panel or a touch pad that is arranged corresponding to the screen output from the output unit 306 and can be operated with the finger of the person in charge or the stylus pen.
  • the output unit 306 includes, for example, a display device configured using an LCD or an organic EL and / or a speaker that outputs sound, and displays various data on a screen or outputs sound.
  • When the recorded data RCD1 captured (recorded) by the in-vehicle camera system 65 and the recorded data RCD2 captured (recorded) by the wearable camera 10 are input in response to an input operation of the person in charge in the police station 5 having jurisdiction over the back-end system 100B, the output unit 306 displays the video included in the video data of each piece of recorded data on a separate screen under the instruction of the CPU 301.
  • the storage control unit 307 reads various data stored in the storage 308 or writes various data to the storage 308 in accordance with an instruction from the CPU 301 or the I / O control unit 303.
  • the storage 308 is configured by, for example, an SSD or an HDD, and stores recording data RCD1 captured and recorded by the in-vehicle camera system 65 (the in-vehicle camera 61), and recorded data RCD2 captured and recorded by the wearable camera 10. accumulate. Further, the storage 308 stores and accumulates recording data captured and recorded by the wearable cameras 10 and 10A when the recording data is directly transferred from the wearable cameras 10 and 10A. The storage 308 may store various data other than the recorded data. A plurality of storages 308 may be provided.
  • FIG. 5 is a schematic diagram showing a state where the wearable camera 10 is worn by the police officer 7.
  • FIG. 6 is a front view illustrating an example of the appearance of the wearable camera 10.
  • The wearable camera 10 is attached to the clothes or body of the police officer 7, for example to the chest, or to a hat, via a clip or other fastener, so as to capture an image of the field of view from a position close to the viewpoint of the police officer 7. With the wearable camera 10 attached, the police officer 7 operates the recording switch SW1 to image a surrounding subject.
  • the wearable camera 10 is provided with, for example, an imaging lens 11a of the imaging unit 11, a recording switch SW1, and a snapshot switch SW2 in front of a substantially rectangular parallelepiped casing 10K.
  • For example, when the recording switch SW1 is pressed an odd number of times, recording (moving image capturing) starts, and when it is pressed an even number of times, the recording ends.
  • When the snapshot switch SW2 is pressed, a still image is captured at that moment.
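  • The odd/even press behavior of the recording switch SW1 amounts to a simple toggle, as in the following sketch (the class and method names are illustrative, not from the disclosure).

        class RecordingControl:
            def __init__(self) -> None:
                self.recording = False

            def on_record_switch_pressed(self) -> None:
                # Odd-numbered presses start recording, even-numbered presses stop it.
                self.recording = not self.recording
                print("recording started" if self.recording else "recording stopped")

        ctrl = RecordingControl()
        ctrl.on_record_switch_pressed()  # 1st (odd) press: start
        ctrl.on_record_switch_pressed()  # 2nd (even) press: stop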
  • Attribute information addition switch SW3 is provided on the left side when viewed from the front of the housing 10K of the wearable camera 10.
  • the attribute information is classification information for identifying the type of video data.
  • the attribute information is given according to the user's operation of the attribute information addition switch SW3 of the wearable camera 10, the operation of the switch 109 of the in-vehicle recorder 62, or the operation of the in-vehicle PC 63 based on the setting contents of the attribute information set in advance.
  • the attribute information for example, an attribute relating to an incident that has occurred in the field, such as theft, drunk driving, or parking violation, is used.
  • the LEDs 26a to 26c are arranged on the upper surface when viewed from the front of the housing 10K of the wearable camera 10, as shown in FIG. Thereby, the police officer 7 can easily visually recognize the LEDs 26a to 26c in a state where the wearable camera 10 is worn.
  • On the other hand, the LEDs 26a to 26c may not be seen by anyone other than the police officer 7.
  • a contact terminal 23 is provided on the lower surface of the wearable camera 10 as viewed from the front of the housing 10K.
  • FIG. 7 is a diagram illustrating an example of a data structure of the recording data RCD1 captured by the in-vehicle camera 61.
  • FIG. 8 is a diagram illustrating an example of a data structure of the recording data RCD2 imaged by the wearable camera 10.
  • FIG. 9 is a diagram illustrating an example of a data structure of the frame parameter information FPM1 and FPM2 in the video data VDO1 and VDO2.
  • FIG. 10 is a diagram illustrating an example of a data structure of the frame parameter information FPM1 in the video data VDO1 of the recording data RCD1 captured by the in-vehicle camera 61.
  • FIG. 11 is a diagram illustrating an example of a data structure of the frame parameter information FPM2 in the video data VDO2 of the recording data RCD2 captured by the wearable camera 10.
  • When the wearable camera 10 and the in-vehicle camera 61 of the present embodiment capture and record video, as shown in FIGS. 7 and 8, meta information MTD1 and MTD2 including attribute information related to the captured video data VDO1 and VDO2 are generated, and the video data and meta information are stored in the memory of each apparatus as recorded data RCD1 and RCD2 in which the two are associated with each other.
  • the recording data RCD2 stored in the memory (for example, the storage unit 15) of the wearable camera 10 includes video data VDO2 and meta information MTD2.
  • the recording data RCD1 captured and recorded by the in-vehicle camera 61 includes video data VDO1 and meta information MTD1.
  • the recorded data RCD1 is stored in the storage 111 in the in-vehicle recorder 62.
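  • The association between video data and meta information in one piece of recorded data can be sketched as follows; the field names and types are assumptions based on FIGS. 7 and 8.

        from dataclasses import dataclass

        @dataclass
        class RecordData:
            # One piece of recorded data (RCD1/RCD2): video data kept in
            # association with its meta information (MTD1/MTD2).
            video_data: bytes  # VDO1/VDO2: encoded frames, each carrying frame parameter info
            meta_info: dict    # MTD1/MTD2: attribute information about the video data

        rcd2 = RecordData(video_data=b"", meta_info={"attribute": "theft", "camera_id": "BWC-0001"})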
  • the recorded data RCD1 and RCD2 are preferably transferred to and stored in the back-end server SV via the in-vehicle recorder 62 in terms of communication environment (for example, line speed).
  • The recorded data RCD2 captured and recorded by the wearable camera 10 may be transferred directly from the wearable camera 10; however, in order to shorten the transfer time, it is preferable to transfer the video data from the wearable camera 10 to the in-vehicle recorder 62 during a period in which the in-vehicle recorder 62 is not transferring video data or the like to the back-end server SV (for example, during the travel time after the patrol shift working hours end and before returning to the police station), and then transfer it from the in-vehicle recorder 62 to the back-end server SV.
  • The frame parameter information FPM1 and FPM2 is generated for each frame of the video captured by the in-vehicle camera 61 and the wearable camera 10, and indicates various information about that frame (for example, the recording time of the frame in seconds, the recording time of the frame in milliseconds, and the frame number in the video data).
  • In the frame parameter information FPM1 and FPM2 shown in FIG. 9, “Record time (sec)”, “Record time (msec)”, and “Frame count” are shown.
  • “Record time (sec)” has a size of 4 bytes and indicates the recording time (second unit) of the frame. Specifically, it is the total number of seconds from a predetermined reference date.
  • the predetermined reference date is not particularly limited, but may be, for example, January 1, 1970, or may be the first day of the year (for example, January 1, 2015).
  • “Record time (msec)” has a size of 2 bytes and indicates the recording time of the frame (in milliseconds). Accordingly, the sum of “Record time (sec)” and “Record time (msec)” is the time (recording time) when the corresponding frame of the actual video data VDO1, VDO2 was recorded.
  • “Frame count” has a size of 4 bytes and indicates a frame number in the video data VDO1 and VDO2.
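  • Reading these three fields back from a frame parameter block can be sketched as follows, assuming the same 10-byte layout and byte order as in the packing sketch above; where the block sits relative to the frame data is not specified in the document.

        import struct

        FRAME_PARAM = struct.Struct(">IHI")  # same assumed layout as when packing

        def read_frame_parameter(block: bytes) -> tuple[float, int]:
            # Recover the recording time (seconds elapsed from the reference
            # date) and the frame number from one 10-byte parameter block.
            sec, msec, frame_count = FRAME_PARAM.unpack(block)
            return sec + msec / 1000.0, frame_count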
  • In the example of FIG. 10, the video data VDO1 of the corresponding frame was captured and recorded by the in-vehicle camera 61 at the time when “1002231 + 0.743” seconds had elapsed from the predetermined reference date. In other words, the time when “1002231 + 0.743” seconds had elapsed from the predetermined reference date is the recording start time of the video data VDO1.
  • Similarly, in the example of FIG. 11, the video data VDO2 of the corresponding frame was captured and recorded by the wearable camera 10 at the time when “1002722 + 0.057” seconds had elapsed from the predetermined reference date.
  • In other words, the recording start time of the video data VDO2 is the time when “1002722 + 0.057” seconds had elapsed from the predetermined reference date.
  • That is, although the videos of the in-vehicle camera 61 and the wearable camera 10 are images of the same incident site, their recording times differ and are not synchronized. This means that the in-vehicle camera 61 started recording earlier than the wearable camera 10.
  • the back-end server SV uses the frame parameter information FPM1 and FPM2 regarding the first frame in the video data VDO1 and VDO2 of the respective recording data RCD1 and RCD2 to reproduce the video data VDO1 and VDO2. Start.
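  • With the start times of FIGS. 10 and 11, the playback offset between the two streams works out as follows (a worked computation, not code from the disclosure):

        # Recording start times, in seconds elapsed from the reference date:
        start_vdo1 = 1002231 + 0.743  # in-vehicle camera 61 (FIG. 10)
        start_vdo2 = 1002722 + 0.057  # wearable camera 10 (FIG. 11)

        # The in-vehicle camera started first, so VDO1 plays alone for:
        offset = start_vdo2 - start_vdo1
        print(f"VDO2 starts {offset:.3f} s into VDO1 playback")  # 490.314 s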
  • FIG. 12 is a flowchart for explaining an example of an operation procedure in which the wearable camera 10 according to the present embodiment synchronizes time with the in-vehicle camera system 65.
  • It is assumed that the system times of the in-vehicle camera 61 and the in-vehicle recorder 62 in the in-vehicle camera system 65 are synchronized (that is, coincident).
  • the wearable camera 10 executes initial processing of the wearable camera 10 main body (S1).
  • The initial processing is, for example, reading initial setting values for various processes into the memory of the wearable camera 10 (for example, the storage unit 15), or booting the OS (Operating System) of the wearable camera 10 and reading each setting value of the wearable camera 10 to make it operable.
  • the wearable camera 10 determines whether or not it is within the communication area (communication area) of the in-vehicle camera system (ICV) 65 after completing the initial process of step S1 (S2). If it is determined that the wearable camera 10 is not within the communication range of the in-vehicle camera system 65 (S2, NO), the operation of the wearable camera 10 proceeds to step S5. On the other hand, when the wearable camera 10 determines that it is within the communication range of the in-vehicle camera system 65 (S2, YES), it establishes a communication link with the in-vehicle camera system 65 (S3).
  • Next, the wearable camera 10 accesses the in-vehicle camera system 65 and adjusts the output (system time) of the system time counting unit 19A of the wearable camera 10 so that it matches (that is, synchronizes with) the system time of the in-vehicle camera system 65 (S4).
  • For example, the in-vehicle camera system 65 (either the in-vehicle camera 61 or the in-vehicle recorder 62) is provided with an NTP server, and the wearable camera 10 can perform this adjustment by accessing the NTP server and obtaining the system time.
  • a unique protocol for adjusting the system time between the wearable camera 10 and the in-vehicle camera system 65 may be used.
  • Although FIG. 12 describes the synchronization of the system time between the in-vehicle camera system 65 and the wearable camera 10, the description of FIG. 12 applies in the same manner to the case of synchronizing the system time between the wearable camera 10 and the wearable camera 10A.
  • After step S4, the wearable camera 10 waits for a trigger to start recording (S5). That is, the wearable camera 10 stands by until a trigger operation for starting recording (for example, pressing of the recording switch SW1, or the start of automatic recording when a predetermined condition is satisfied) is performed.
  • The processes of steps S2 to S4 shown in FIG. 12 are periodically repeated by the wearable camera 10. Even if it is determined in step S2 that the wearable camera 10 is not within the communication range of the in-vehicle camera system 65, the system times of the wearable camera 10 and the in-vehicle camera system 65 need to be synchronized by the time the recorded data is transmitted to the back-end server SV (in other words, the process of step S4 needs to be executed). Accordingly, the processes of steps S2 to S4 shown in FIG. 12 need to be repeated, not performed only once, taking into account, for example, that the police officer 7 wearing the wearable camera 10 moves around.
  • the wearable camera 10 and the in-vehicle camera system 65 can synchronize the system time when the wearable camera 10 enters the communication range of the in-vehicle camera system 65. It is.
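To make the flow of steps S2 to S4 concrete, here is a minimal sketch of the periodic loop in which the camera keeps re-checking whether it is within range and re-synchronizing; the camera and icv objects and their helper methods (in_communication_range, establish_link, sync_system_time) are hypothetical stand-ins, not the device's actual firmware.

```python
import time

SYNC_INTERVAL_SEC = 60  # hypothetical re-check period

def periodic_time_sync(camera, icv):
    """Illustrative loop for steps S2 to S4 of FIG. 12; the camera and icv
    objects and their methods are hypothetical stand-ins."""
    while True:
        # S2: is the wearable camera within the communication range of the
        # in-vehicle camera system?
        if camera.in_communication_range(icv):
            # S3: establish a communication link with the in-vehicle camera system.
            link = camera.establish_link(icv)
            # S4: adjust the camera's system time to match the system time
            # of the in-vehicle camera system.
            camera.sync_system_time(link)
        # Repeat periodically, because the officer wearing the camera moves
        # in and out of range before recorded data is uploaded.
        time.sleep(SYNC_INTERVAL_SEC)
```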
Although FIG. 12 illustrates the operation procedure related to time synchronization between the wearable camera 10 and the in-vehicle camera system 65, when the wearable camera 10 transfers (uploads), for example, the recording data RCD2 to the back-end server SV, it may similarly synchronize its system time with the back-end server SV of the back-end system 100B using the same processing as in steps S2 to S4 shown in FIG. 12. Likewise, the in-vehicle recorder 62 of the in-vehicle camera system 65 may synchronize its system time with the back-end server SV of the back-end system 100B, for example, during or before the transfer (upload) of the recording data RCD1 to the back-end server SV. In this case, for example, the back-end system 100B holds an NTP server (not shown), and the back-end server SV adjusts its system time by periodically accessing the NTP server.
FIG. 13 is a flowchart illustrating an example of an operation procedure in which the back-end server SV of the present embodiment reproduces the video data VDO1 and VDO2 of the recording data RCD1 and RCD2 captured by the in-vehicle camera 61 and the wearable camera 10, respectively.
As a premise of the description of FIG. 13, the recording data RCD1 captured by the in-vehicle camera 61 and the recording data RCD2 captured by the wearable camera 10 are stored in the storage 308 of the back-end server SV, and the system times of the wearable camera 10 and the in-vehicle camera system 65 have been synchronized according to the method shown in FIG. 12.

First, the back-end server SV reads the recording data RCD1 captured by the in-vehicle camera 61 from the storage 308 into the memory 302 and acquires its recording start time information (S11). That is, based on the frame parameter information FPM1 included in the video data VDO1 in the recording data RCD1, the back-end server SV acquires information indicating when the in-vehicle camera 61 started recording the first frame of the video data VDO1 (S11).

Similarly, the back-end server SV reads the recording data RCD2 captured by the wearable camera 10 from the storage 308 into the memory 302 and acquires its recording start time information (S12). That is, based on the frame parameter information FPM2 included in the video data VDO2 in the recording data RCD2, the back-end server SV acquires information indicating when the wearable camera 10 started recording the first frame of the video data VDO2 (S12).
Next, the back-end server SV determines whether or not the recording start time of the video data VDO1 of the recording data RCD1 is earlier than the recording start time of the video data VDO2 of the recording data RCD2 (S13). When the back-end server SV determines that the recording start time of the video data VDO1 of the recording data RCD1 is ahead of the recording start time of the video data VDO2 of the recording data RCD2 (S13, YES), it starts reproduction of the video data VDO1 of the recording data RCD1 captured and recorded by the in-vehicle camera system 65 and outputs the video to the screen (S14). When, after playback of the video data VDO1 has started, the recording start time of the video data VDO2 of the recording data RCD2 captured by the wearable camera 10 is reached (S15, YES), that is, when the playback time of the video data VDO1 reaches the recording start time of the video data VDO2, the back-end server SV starts reproduction of the video data VDO2 and outputs it to the screen (S16). When the reproduction of both sets of video data is finished, the back-end server SV ends the reproduction of the video data VDO1 and VDO2.
On the other hand, when the back-end server SV determines that the recording start time of the video data VDO2 of the recording data RCD2 is ahead of the recording start time of the video data VDO1 of the recording data RCD1 (S13, NO), it starts reproduction of the video data VDO2 of the recording data RCD2 captured and recorded by the wearable camera 10 and outputs the video to the screen (S18). When, after playback of the video data VDO2 has started, the recording start time of the video data VDO1 of the recording data RCD1 captured by the in-vehicle camera system 65 is reached (S19, YES), that is, when the playback time of the video data VDO2 reaches the recording start time of the video data VDO1, the back-end server SV starts reproduction of the video data VDO1 and outputs it to the screen (S20). When the reproduction of both sets of video data is finished, the back-end server SV ends the reproduction of the video data VDO1 and VDO2.
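The branch structure of steps S13 to S20 can be summarized in code. The sketch below, with hypothetical start_playback and playback_position controls, starts whichever video has the earlier recording start time first and starts the other once the first stream's playback reaches the second stream's recording start time; it is an illustration of FIG. 13, not the server's actual implementation.

```python
import time

def play_synchronized(video1, start1, video2, start2,
                      start_playback, playback_position):
    """Illustration of FIG. 13 (S13-S20): start whichever video has the
    earlier recording start time first, then start the other once the
    first video's playback reaches the other's recording start time.
    start1/start2 are recording start times in seconds; start_playback and
    playback_position are hypothetical playback controls."""
    # S13: compare the recording start times taken from FPM1 and FPM2.
    if start1 <= start2:
        first, first_start, second, second_start = video1, start1, video2, start2
    else:
        first, first_start, second, second_start = video2, start2, video1, start1

    # S14 / S18: start reproducing the earlier recording and show it on screen.
    start_playback(first)

    # S15 / S19: wait until the playback position of the first video reaches
    # the recording start time of the second video.
    while first_start + playback_position(first) < second_start:
        time.sleep(0.01)

    # S16 / S20: start reproducing the second video in another window.
    start_playback(second)
```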
As described above, in the wearable camera system 100 of the present embodiment, the wearable camera 10 captures the situation of the scene to which the police officer 7 rushes, and the in-vehicle camera system 65 (specifically, the in-vehicle camera 61) captures the situation of the same scene. The back-end server SV receives the recording data RCD2 having the video data VDO2 and the recording data RCD1 having the video data VDO1, and stores the recording data RCD1 and the recording data RCD2 in association with each other in the storage 308. The back-end server SV can associate the recording data RCD1 with the recording data RCD2, for example, by assigning them a common case number. Further, the back-end server SV uses the frame parameter information related to the first frame of the video data VDO1 and the frame parameter information related to the first frame of the video data VDO2 to reproduce the video data VDO1 and the video data VDO2 in synchronization.
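As a minimal sketch with hypothetical names, the following illustrates how recorded data could be grouped in storage under a common case number so that all video data for the same incident can be retrieved together for synchronized playback.

```python
from collections import defaultdict

# Hypothetical in-memory stand-in for the storage 308: recordings are grouped
# under a common case number so that data captured for the same incident
# stays associated.
case_index = defaultdict(list)

def store_recording(case_number, recording):
    """Associate one piece of recorded data (e.g. RCD1 or RCD2) with a case."""
    case_index[case_number].append(recording)

def recordings_for_case(case_number):
    """Fetch every recording of one incident, ready for synchronized playback."""
    return case_index[case_number]

store_recording("2016-000123", {"source": "in-vehicle camera 61", "data": "RCD1"})
store_recording("2016-000123", {"source": "wearable camera 10", "data": "RCD2"})
print(recordings_for_case("2016-000123"))
```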
Thereby, when the wearable camera system 100 reproduces, on the back-end server SV, the plurality of video data VDO1 and VDO2 captured for the same incident, the playback times of the video data VDO1 and VDO2 captured from different angles are synchronized, so that the situation of the scene to which the police officer 7 rushed can be grasped broadly, clearly, and easily. In other words, with the wearable camera system 100, the situation of the scene (for example, the time-series context within the same case, the victim, the suspect, and bystanders in the vicinity) can be accurately grasped. In addition, since the in-vehicle camera 61 is installed in the police car, part of the scene may be blocked by a police officer or an obstacle and thus fail to appear in the video data VDO1 captured and recorded by the in-vehicle camera 61. Even in that case, because the playback times of the video data VDO1 and VDO2 are synchronized, the person in charge in the police station 5 having jurisdiction over the back-end system 100B can view and grasp the blocked portion in the video data VDO2 captured and recorded by the wearable camera 10.
The video data VDO1 has been described as video data captured and recorded by the in-vehicle camera 61, but it may instead be video data captured and recorded by the wearable camera 10A shown in FIG. 1. In this case as well, the back-end server SV makes it possible to grasp the situation at the scene at the time of imaging (recording) over a broad range.
As described above, the wearable camera 10 of the present embodiment, when within the communication range of the in-vehicle camera 61, can adjust its own system time to coincide with the system time of the in-vehicle camera 61 (in other words, the system time of the in-vehicle camera system 65). Further, the wearable camera 10 of the present embodiment periodically determines whether or not it is within the communication range of the in-vehicle camera 61. Thereby, even when the wearable camera 10 has once determined that it is out of the communication range of the in-vehicle camera 61, the system time of the wearable camera 10 itself can still be adjusted to coincide with the system time of the in-vehicle camera 61 when, for example, the police officer 7 wearing the wearable camera 10 moves back into the communication range of the in-vehicle camera 61.
Further, when one of the recording times indicated by the frame parameter information FPM1 and FPM2 of the video data VDO1 and VDO2 precedes the other, the back-end server SV of the present embodiment starts reproduction of the video data corresponding to the preceding recording time first, and starts reproduction of the video data corresponding to the other recording time when the other recording time coincides with the one recording time. Thereby, the recording time information of each of the frame parameters FPM1 and FPM2 included in the video data VDO1 and VDO2 is obtained, and by using this recording time information it is possible to reproduce the video data VDO1 and VDO2 so that their recording times coincide with each other and to accurately output the situation at the scene.
Note that the back-end server SV may periodically determine whether or not a difference has arisen between the playback times of the video data VDO1 and VDO2 after step S16 or step S20, and adjust the difference if it determines that one exists. That is, it is preferable that the back-end server SV controls the video data VDO1 and VDO2 output on the two screens (two windows) by slowing down the playback speed of one or both of the video data so that no deviation occurs in the playback times.
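As an illustration of this drift check, the following sketch, with hypothetical playback_time and set_playback_rate controls, periodically compares the two playback positions and briefly slows the stream that has run ahead until the deviation is absorbed.

```python
def correct_drift(p1, p2, playback_time, set_playback_rate, tolerance=0.033):
    """One periodic drift check between two playback windows; the playback
    controls are hypothetical. The tolerance is roughly one frame at 30 fps."""
    drift = playback_time(p1) - playback_time(p2)
    if abs(drift) <= tolerance:
        # In sync: let both streams play at normal speed.
        set_playback_rate(p1, 1.0)
        set_playback_rate(p2, 1.0)
    elif drift > 0:
        # p1 has run ahead: slow it down slightly until p2 catches up.
        set_playback_rate(p1, 0.95)
    else:
        # p2 has run ahead: slow it down instead.
        set_playback_rate(p2, 0.95)
```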
The present disclosure is useful as a wearable camera system and a video data synchronous reproduction method that reproduce a plurality of video data capturing the same incident while synchronizing the playback times of the respective video data, so that the situation at the scene can be grasped easily.

Abstract

When a plurality of pieces of video data obtained by image capture of the same event are reproduced, the reproduction times of the respective pieces of video data are synchronized, thereby allowing the situation of the scene to be easily ascertained. A server comprises: a communication unit that receives first video data obtained by image capture using a first wearable camera and second video data obtained by image capture using an external device; a storage unit that stores the first and second video data, which are received from the communication unit, in such a manner that the first and second video data are associated with each other; and a reproduction unit that reproduces the first and second video data stored in the storage unit. The first video data has first recording time information of the image capture using the first wearable camera, while the second video data has second recording time information of the image capture using the external device. The reproduction unit uses the first and second recording time information to synchronize and reproduce the first and second video data.

Description

Wearable camera system and video data synchronous reproduction method
The present disclosure relates to a wearable camera system including a wearable camera that can be worn on, for example, a human body, clothes, or a hat, and a video data synchronous reproduction method for reproducing video data captured by the wearable camera.

In recent years, for example, in order to support the duties of police officers or security guards, the introduction of wearable cameras worn and used by police officers or security guards has been under consideration.

Here, as a prior art using a wearable camera, there is, for example, the wearable surveillance camera system described in Patent Document 1. The wearable surveillance camera system of Patent Document 1 encodes the video signal and audio signal from CCD camera means and microphone means worn on the body, together with a date/time information signal from built-in clock means, by encoding back-end server means housed in pouch means worn on the body, and records the date/time information, converted into character information, superimposed on the captured video.

JP 2006-148842 A

The present disclosure is a wearable camera system including at least a first wearable camera that can be worn by a user and a back-end server. The back-end server includes: a communication unit that receives first video data captured by the first wearable camera and second video data captured by an external device; a storage unit that stores the first video data and the second video data received by the communication unit in association with each other; and a playback unit that plays back the first video data and the second video data stored in the storage unit. The first video data has first recording time information of the imaging by the first wearable camera, the second video data has second recording time information of the imaging by the external device, and the playback unit plays back the first video data and the second video data in synchronization using the first recording time information and the second recording time information.

The present disclosure is also a video data synchronous reproduction method in a wearable camera system including at least a first wearable camera that can be worn by a user and a back-end server. The method includes: capturing video with the first wearable camera; capturing video with an external device; receiving first video data captured by the first wearable camera and second video data captured by the external device; storing, in a storage unit, the first video data, which has first recording time information of the imaging by the first wearable camera, and the second video data, which has second recording time information of the imaging by the external device, in association with each other; and synchronously reproducing the first video data and the second video data stored in the storage unit using the first recording time information and the second recording time information.

According to the present disclosure, when a plurality of video data capturing the same incident are reproduced, the reproduction times of the respective video data are synchronized, so that the situation at the scene can be easily grasped.
FIG. 1 is an explanatory diagram regarding the outline of the wearable camera system of the present embodiment and the use of video data captured by the wearable camera.
FIG. 2 is a block diagram showing an example of the internal configuration of the in-vehicle camera system and the in-vehicle PC of the present embodiment.
FIG. 3 is a block diagram showing an example of the internal configuration of the wearable camera of the present embodiment.
FIG. 4 is a block diagram showing an example of the internal configuration of the back-end server of the present embodiment.
FIG. 5 is a diagram showing a state in which the user wears the wearable camera of the present embodiment.
FIG. 6 is a front view showing an example of the appearance of the wearable camera of the present embodiment.
FIG. 7 is a diagram showing an example of the data structure of recorded data captured by the in-vehicle camera.
FIG. 8 is a diagram showing an example of the data structure of recorded data captured by the wearable camera.
FIG. 9 is a diagram showing an example of the data structure of frame parameter information in video data.
FIG. 10 is a diagram showing an example of the data structure of frame parameter information in the video data of recorded data captured by the in-vehicle camera.
FIG. 11 is a diagram showing an example of the data structure of frame parameter information in the video data of recorded data captured by the wearable camera.
FIG. 12 is a flowchart explaining an example of an operation procedure in which the wearable camera of the present embodiment synchronizes its time with the in-vehicle camera system.
FIG. 13 is a flowchart explaining an example of an operation procedure in which the back-end server of the present embodiment reproduces the video data of recorded data captured by the wearable camera and the in-vehicle camera, respectively.
When a police officer wearing a wearable camera rushes to a designated scene in a police car, in order to record video of the situation at the scene, the police officer may not only operate the wearable camera to capture and record the scene, but also operate the in-vehicle camera mounted in the police car to capture video from an angle different from that of the wearable camera. The video data obtained by the imaging of the wearable camera and the in-vehicle camera are transferred to a back-end server in the police station and then played back so that the person in charge in the police station can compare the respective video data and observe the situation at the scene.

Here, when there are a plurality of video data capturing the same incident (not limited to the above case of video data captured by a wearable camera and an in-vehicle camera, but also including the case of video data captured by a plurality of wearable cameras), there has been the problem that, if the playback times of the plurality of videos played back by the back-end server are not synchronized, their time-series context cannot be determined and it is difficult to grasp the situation at the scene.

For example, even within the same incident, when the target persons (that is, the subjects of the wearable camera or the in-vehicle camera) differ, if the playback times of the plurality of videos played back by the back-end server are not synchronized, it is difficult to grasp from these videos the time-series context of the scene. If the target person is the same within the same incident, the person in charge in the police station can confirm the time-series context by tracking that person when the respective video data are played back, but this confirmation work is laborious. For this reason, the problem that the situation at the scene is difficult to grasp when the playback times of the respective video data are not synchronized remains unsolved.

In Patent Document 1 described above, the date and time information of the imaging is superimposed on the video data, but no mention is made of synchronization when video data captured by a plurality of cameras are played back by the back-end server in the police station described above.

In order to solve the above-described problem, an object of the present disclosure is to provide a wearable camera system and a video data synchronous reproduction method that, when a plurality of video data capturing the same incident are reproduced, synchronize the reproduction times of the respective video data so that the situation at the scene can be easily grasped.
Hereinafter, an embodiment that specifically discloses the wearable camera system and the video data synchronous reproduction method according to the present disclosure (hereinafter referred to as "the present embodiment") will be described in detail with reference to the drawings.

In the present embodiment, in a wearable camera system including at least a wearable camera (BWC: Body Worn Camera) that can be worn by a user (for example, a police officer 7; the same applies hereinafter) and a back-end server, video is captured by the wearable camera and video is captured by an external device (for example, an in-vehicle camera or another wearable camera). The video data captured by the wearable camera and the video data captured by the external device are received by the back-end server. The back-end server stores, in a storage unit, the first video data, which has first recording time information of the imaging by the wearable camera, and the second video data, which has second recording time information of the imaging by the external device, in association with each other. Further, using the first recording time information and the second recording time information, the back-end server plays back the first video data and the second video data stored in the storage unit in synchronization.
FIG. 1 is an explanatory diagram regarding the outline of the wearable camera system 100 of the present embodiment and the use of video data captured by the wearable camera 10. The wearable camera 10 of the present embodiment is an imaging device that a user (for example, the police officer 7) can wear on the body, clothes, a hat, or the like. The wearable camera 10 has a communication function for communicating (for example, by wireless communication) with an in-vehicle system 60 mounted in a vehicle (for example, a police car in which a police officer rides), with a back-end server in the police station to which the police officer 7 belongs (that is, the back-end server SV), and so on.

In the wearable camera system 100 shown in FIG. 1, the wearable camera 10 and the in-vehicle system 60 constitute a front-end system 100A, while management software 70 on a network, the back-end server SV, and an in-station PC 71, which is a PC in the police station, constitute a back-end system 100B. The management software 70 is executed by, for example, the in-station PC 71 or the back-end server SV.

Hereinafter, the present embodiment will be described assuming, for example, that the wearable camera system 100 is used in a police station 5. The police officer 7 operates the wearable camera 10 to image the situation at the scene or a specific subject (for example, a victim of the incident, a suspect, or bystanders around the scene). The wearable camera 10 transfers the video data obtained by the imaging to, for example, the back-end system 100B in the police station 5 in accordance with the user's operation. Note that the user of the wearable camera 10 is not limited to a police officer; the camera may also be used in various other establishments (for example, a security company). In the present embodiment, a police officer is mainly taken as the example of the user.

The front-end system 100A includes the wearable camera 10, which can be worn by the police officer 7 dispatched to the front line of the scene, a portable terminal (for example, a smartphone; not shown) held by the police officer or placed in the police car the police officer rides, and the in-vehicle system 60 installed in the police car 6. The in-vehicle system 60 includes an in-vehicle camera 61, an in-vehicle recorder 62, an in-vehicle PC 63, a communication unit, and the like, and constitutes an in-vehicle camera system, a video management system, and so on.
The in-vehicle camera 61 is installed at a predetermined position in the police car 6 and captures video of the surroundings of the police car 6 (for example, the area in front of the police car or the back seat inside the police car) constantly or at predetermined timings. That is, the in-vehicle camera 61 includes, for example, a front camera (not shown) for imaging the area in front of the police car 6 and a back-seat camera (not shown) for imaging the back seat in the police car 6 (for example, the seat on which a suspect is seated). The video data captured by the in-vehicle camera 61 is accumulated in the in-vehicle recorder 62, for example, by executing a recording operation. A plurality of in-vehicle cameras 61 may be provided.

The in-vehicle camera 61 may also have, as the front camera or the back-seat camera, a microphone (not shown) that picks up sound inside and outside the police car 6. In this case, the voice of the police officer 7 or of a suspect inside the police car 6 can also be picked up (recorded).

The in-vehicle recorder 62 accumulates the video data captured by the in-vehicle camera 61. The in-vehicle recorder 62 can also acquire and accumulate video data captured by the wearable camera 10. In addition, the in-vehicle recorder 62 may manage meta information such as attribute information assigned to the video data.
The in-vehicle PC 63 may be a PC fixedly mounted in the police car 6, or a wireless communication device used outside the police car 6, such as a portable PC, a smartphone, a mobile phone, a tablet terminal, or a PDA (Personal Digital Assistant).

By executing management software (not shown), the in-vehicle PC 63 enables cooperation between the in-vehicle system 60 and the wearable camera 10 (specifically, communication between the in-vehicle system 60 and the wearable camera 10). The UI (User Interface) of the in-vehicle PC 63 (for example, its operation device, display device, and audio output device) is also used as a UI for operating the in-vehicle recorder 62.
When the police officer 7 is dispatched from the police station 5 on a predetermined assignment (for example, patrol), the police officer 7, wearing the wearable camera 10 for example, gets into the police car 6 equipped with the in-vehicle system 60 and heads for the scene. In the front-end system 100A, for example, video of the scene at which the police car 6 has arrived is captured by the in-vehicle camera 61 of the in-vehicle system 60, and the police officer 7 gets out of the police car 6 and captures closer, more detailed video of the scene with the wearable camera 10.

Video data of moving images or still images captured by the wearable camera 10 is saved, for example, in a storage device such as the memory of the wearable camera 10. The wearable camera 10 transfers (uploads) various data including the captured video data from its storage device to the back-end system 100B. The various data including the video data captured by the wearable camera 10 may be transferred directly from the wearable camera 10 to the back-end system 100B, or may be transferred to the back-end system 100B via the in-vehicle system 60.

Video data of moving images or still images captured by the in-vehicle camera 61 is saved, for example, in storage such as a hard disk (HDD: Hard Disk Drive) or an SSD (Solid State Drive) provided in the in-vehicle recorder 62 of the in-vehicle system 60. The in-vehicle system 60 (for example, the in-vehicle recorder 62) transfers (uploads) various data including the video data captured by the in-vehicle camera 61 from the storage of the in-vehicle system 60 to the back-end system 100B.

Data transfer to the back-end system 100B is performed, for example, by connecting via wireless communication from the scene, or, when the patrol ends and the officer returns to the police station 5, by wireless communication, wired communication, or manually (for example, by carrying a storage medium).

In the present embodiment, the operation of transferring (uploading) video data captured by the wearable camera 10 to the back-end system 100B via the in-vehicle system 60 will mainly be described. In this case, the video data of moving images or still images captured by the wearable camera 10 is transferred from the wearable camera 10 to the in-vehicle recorder 62 of the in-vehicle system 60 and saved there. Then, the video data of moving images or still images captured by the wearable camera 10 or the in-vehicle camera 61 and saved in the in-vehicle recorder 62 is transferred (uploaded) from the in-vehicle recorder 62 to the back-end system 100B.

Note that video data captured by the wearable camera 10 can also be saved in the storage or the like of the in-vehicle PC 63 and transferred (uploaded) from the in-vehicle PC 63 to the back-end system 100B.
The back-end system 100B includes the back-end server SV installed in the police station 5 or at another location, the management software 70 for communicating with the front-end system 100A, and the in-station PC 71.

The back-end server SV includes a storage 308 configured using an HDD, an SSD, or the like, either internally (see FIG. 4) or externally. In the back-end system 100B, the back-end server SV accumulates the video data and other data transferred from the front-end system 100A in the storage 308 and builds a database used by each department in the police station. The back-end server SV receives video data transferred from, for example, the wearable camera 10 or the in-vehicle system 60 (for example, the in-vehicle recorder 62) and saves it in the storage 308.

The video data accumulated in the back-end system 100B is used, for example, by the persons in charge of the relevant departments in the police station 5 for investigating and verifying incidents, and, as necessary, the video data is copied to a predetermined storage medium (for example, a DVD: Digital Versatile Disk) and submitted as evidence in a predetermined setting (for example, a trial). In the present embodiment, the wearable camera 10 worn by the police officer 7 makes it possible to acquire and store evidence video of the scene more accurately.

When the police officer 7 is dispatched from the police station 5 to the scene and uses the wearable camera 10, identification information of the police officer 7 (for example, an officer ID (Officer ID)), identification information of the wearable camera 10 used by the officer (for example, a camera ID (Camera ID)), and identification information of the police car 6 used by the police officer 7 (for example, a car ID (Car ID)) are set and registered using the in-station PC 71 or the like. This makes it possible to clearly distinguish, for the video data accumulated in the back-end server SV, when each video was captured, by which police officer, and with which camera.
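As a minimal sketch, with hypothetical names and values, the following Python dataclass illustrates the kind of registration record that this setting registration could produce, so that video data on the back-end server can later be attributed to a specific officer, camera, and police car.

```python
from dataclasses import dataclass

@dataclass
class DispatchRegistration:
    """Hypothetical record produced by the setting registration."""
    officer_id: str  # identifies the police officer (Officer ID)
    camera_id: str   # identifies the wearable camera (Camera ID)
    car_id: str      # identifies the police car (Car ID)

# Registered via the in-station PC before the officer is dispatched;
# all values below are made up for illustration.
registration = DispatchRegistration(
    officer_id="OFFICER-0007", camera_id="BWC-0010", car_id="CAR-0006")
print(registration)
```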
The setting registration of the police officer 7 and the wearable camera 10 is performed, for example, by the person in charge in the police station 5 or the police officer 7 being dispatched operating the operation device (not shown) of the in-station PC 71, with the in-station PC 71 executing the management software 70. In this setting registration, information other than the Officer ID, Camera ID, and Car ID may also be input via the operation device of the in-station PC 71.

That is, the management software 70 includes, for example, an application for managing the personnel of police officers 7, an application for managing the dispatch of police cars 6 and the like, and an application for managing the checkout of wearable cameras 10. The management software 70 also includes an application for searching for and extracting specific video data, based on attribute information, from the plurality of video data accumulated in the back-end server SV, for example.
FIG. 2 is a block diagram showing an example of the internal configuration of the in-vehicle camera system 65 and the in-vehicle PC 63. In the in-vehicle system 60 shown in FIG. 2, the in-vehicle camera 61 and the in-vehicle recorder 62 constitute an in-vehicle camera system (ICV: In Car Video System) 65.

The in-vehicle camera 61 captures video at a predetermined frame rate and, for each frame of the video data obtained by the imaging, generates frame parameter information (see FIG. 9) indicating information about that frame. The in-vehicle camera 61 generates the video data by adding the per-frame frame parameter information to the video data (see FIG. 7). The frame parameter information will be described later with reference to FIG. 9.
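Although the actual layout of the frame parameter information is defined by FIG. 9, the following minimal Python sketch, with hypothetical field names, illustrates the idea of per-frame metadata that carries, among other things, the recording time later used for synchronized playback (cf. steps S11 and S12 of FIG. 13).

```python
from dataclasses import dataclass

@dataclass
class FrameParameterInfo:
    """Hypothetical per-frame metadata in the spirit of FIG. 9; the real
    layout is defined by the patent's figures, not by this sketch."""
    frame_number: int      # position of the frame within the video data
    recording_time: float  # system time when the frame was recorded
                           # (seconds since the epoch, for illustration)

def recording_start_time(frames):
    """The recording start time is taken from the first frame's parameter
    information (cf. steps S11 and S12 of FIG. 13)."""
    return frames[0].recording_time
```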
The in-vehicle recorder 62 includes a CPU (Central Processing Unit) 101, a communication unit 102, a flash ROM (Read Only Memory) 104, a RAM (Random Access Memory) 105, a microcontroller (μCON; also referred to as a microcomputer) 106, a GPS (Global Positioning System) 107, a GPIO (General Purpose Input/Output) 108, a switch 109, an LED (Light Emitting Diode) 110, and a storage 111.

The CPU 101 performs, for example, control processing for the overall supervision of the operation of each unit of the in-vehicle recorder 62, input/output processing of data with the other units, data calculation (computation) processing, and data storage processing.

The communication unit 102 communicates with external devices (for example, the wearable cameras 10 and 10A and the in-vehicle PC 63) via a wireless or wired line. Wireless communication includes, for example, wireless LAN (W-LAN: Wireless Local Area Network) communication, near field communication (NFC: Near Field Communication), and Bluetooth (registered trademark). Wireless LAN communication is performed, for example, in accordance with the IEEE 802.11n standard of Wi-Fi (registered trademark). Wired communication includes, for example, wired LAN communication and USB (Universal Serial Bus) communication. The CPU 101 and the communication unit 102 are connected via, for example, PCI (Peripheral Component Interconnect) or USB.

The communication unit 102 performs wireless or wired communication with, for example, the in-vehicle camera 61, the in-vehicle PC 63, the wearable camera 10, the in-station PC 71 of the police station 5, and the back-end server SV. The communication unit 102 connects wirelessly to the wearable cameras 10 and 10A and receives various data, including video data, transmitted from the wearable cameras 10 and 10A.

The flash ROM 104 is a memory that stores, for example, programs and data for controlling the CPU 101, and holds various setting information. The RAM 105 is a work memory used, for example, in the operation of the CPU 101; one or more RAMs are provided.

The microcontroller 106 is, for example, a kind of microcomputer; it is connected to the units related to the external interfaces (for example, the GPS 107, the GPIO 108, the switch 109, and the LED 110) and performs control related to those external interfaces. The microcontroller 106 is connected to the CPU 101 via, for example, a UART (Universal Asynchronous Receiver Transmitter).
The GPS 107 receives, for example, the current position information and time information of the in-vehicle recorder 62 from a GPS transmitter (not shown) and outputs them to the CPU 101. This time information is used for correcting the system time of the in-vehicle recorder 62. In addition, the in-vehicle camera 61 periodically accesses the in-vehicle recorder 62 and adjusts its time so that the system time of the in-vehicle camera 61 and the system time of the in-vehicle recorder 62 are synchronized (coincide). Accordingly, through the correction of the system time in the in-vehicle recorder 62, the system times in the in-vehicle camera system 65 shown in FIG. 2 (in other words, the respective system times of the in-vehicle camera 61 and the in-vehicle recorder 62) are assumed to be synchronized (that is, to coincide).

Note that the method for making the system times of the in-vehicle camera 61 and the in-vehicle recorder 62 in the in-vehicle camera system 65 coincide is not limited to the method described above, in which the system time of the in-vehicle recorder 62 is corrected using the output of the GPS 107 and the in-vehicle camera 61 periodically accesses the in-vehicle recorder 62. For example, a method may be used in which the in-vehicle recorder 62 is provided with an NTP (Network Time Protocol) server and the in-vehicle camera 61 periodically accesses the in-vehicle recorder 62 as that NTP server. Furthermore, as will be described in detail later with reference to FIG. 12, in the present embodiment the wearable cameras 10 and 10A periodically access the in-vehicle recorder 62, whereby the system times of the in-vehicle recorder 62 (that is, the in-vehicle camera system 65) and of the wearable cameras 10 and 10A are synchronized.
The GPIO 108 is, for example, a parallel interface, and inputs and outputs signals between the CPU 101 and external devices (not shown) connected via the GPIO 108. Various sensors (for example, a speed sensor, an acceleration sensor, and a door open/close sensor) are connected to the GPIO 108, for example.

The switch 109 is a switch, such as a button, provided as an input device for the user to perform operation input to the in-vehicle recorder 62. The switch 109 includes, for example, a recording button for starting or stopping the recording of video data captured by the in-vehicle camera 61, and an assignment button for assigning attribute information or meta information to the video data captured by the in-vehicle camera 61.

The LED 110 is provided as a display device that indicates the operating state of the in-vehicle recorder 62. The LED 110 indicates, for example, the power-on state (on/off state) of the in-vehicle recorder 62, the recording state, the connection state of the in-vehicle recorder 62 to the LAN, and the usage state of the LAN connected to the in-vehicle recorder 62, by lighting up, turning off, blinking, and so on.

The storage 111 is configured by, for example, an SSD or an HDD, and stores and accumulates the video data captured and recorded by the in-vehicle camera 61. When video data is transferred from the wearable cameras 10 and 10A, the storage 111 also stores and accumulates the video data captured and recorded by the wearable cameras 10 and 10A. The storage 111 may also accumulate data other than video data. The storage 111 is connected to the CPU 101 via, for example, SATA (Serial ATA). A plurality of storages 111 may be provided.
The in-vehicle PC 63 includes a CPU 201, an I/O (Input/Output) control unit 202, a communication unit 203, a memory 204, an input unit 205, a display unit 206, and a speaker 207. The in-vehicle PC 63 can communicate with the in-vehicle recorder 62, and can also communicate with the back-end server SV of the back-end system 100B and with the in-station PC 71.

The CPU 201 authenticates whether or not the police officer 7 may log in to the in-vehicle system 60, for example in response to the police officer 7's input operation on a login screen (not shown) for the in-vehicle system 60 displayed on the display unit 206. The input operation of the police officer 7 is, for example, an operation of entering an officer ID, a password, and the like. Various information on the police officers 7 who are permitted to log in is stored in advance in, for example, the memory 204, and the CPU 201 determines whether or not the police officer 7 may log in to the in-vehicle system 60 using the login-permission information stored in advance in the memory 204. The login may be a login to the in-vehicle system 60 via the in-vehicle PC 63, or a login to an application, installed on the in-vehicle PC 63, that operates the in-vehicle system 60.
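A minimal sketch of the login check described above, with hypothetical names and a pre-populated credential table standing in for the permission information in the memory 204:

```python
# Hypothetical stand-in for the login-permission information held in the
# memory 204; a real system would store hashed credentials, not plaintext.
REGISTERED_OFFICERS = {"OFFICER-0007": "secret-password"}

def authenticate_login(officer_id, password):
    """Return True if this officer may log in to the in-vehicle system 60."""
    return REGISTERED_OFFICERS.get(officer_id) == password

assert authenticate_login("OFFICER-0007", "secret-password")
assert not authenticate_login("OFFICER-0007", "wrong-password")
```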
The I/O control unit 202 controls the input and output of data between the CPU 201 and the units of the in-vehicle PC 63 (for example, the communication unit 203, the input unit 205, the display unit 206, and the speaker 207), and relays data from and to the CPU 201. The I/O control unit 202 may be configured integrally with the CPU 201.

The communication unit 203 performs wired or wireless communication with, for example, the in-vehicle recorder 62 or the back-end system 100B side. When the police officer 7 is logged in to the in-vehicle system 60, the communication unit 203 transfers the login information stored in the memory 204 to the wearable camera 10 so that it is copied there; when the police officer 7 is not logged in to the in-vehicle system 60, the login information is not transferred to the wearable camera 10. The login information includes, for example, an officer ID, a camera ID, and a car ID.

The memory 204 is configured using, for example, a RAM, a ROM, and a nonvolatile or volatile semiconductor memory; it functions as a work memory during the operation of the CPU 201 and stores predetermined programs and data for operating the CPU 201. The memory 204 stores, for example, the login information of the police officers 7 who are permitted to log in to the in-vehicle system 60.

The input unit 205 is a UI for receiving the input operations of the police officer 7 and notifying the CPU 201 of them via the I/O control unit 202, and is a pointing device such as a mouse or a keyboard. The input unit 205 may also be configured using, for example, a touch panel or a touch pad that is arranged to correspond to the screen of the display unit 206 and can be operated with a finger of the police officer 7 or with a stylus pen.

The display unit 206 is a display device configured using, for example, an LCD (Liquid Crystal Display) or organic EL (Electroluminescence), and displays various information. When video data captured (recorded) by the wearable camera 10 or 10A is input, for example in response to an input operation of the police officer 7, the display unit 206 displays the video contained in the video data on its screen under an instruction from the CPU 201.

When video data including audio captured (recorded) by the wearable camera 10 or 10A is input, for example in response to an input operation of the police officer 7, the speaker 207 outputs the audio contained in the video data under an instruction from the CPU 201. The display unit 206 and the speaker 207 may be configured separately from the in-vehicle PC 63.
FIG. 3 is a block diagram showing an example of the internal configuration of the wearable cameras 10 and 10A. The wearable camera 10 shown in FIG. 3 includes an imaging unit 11, a GPIO 12, a RAM 13, a ROM 14, and a storage unit 15. The wearable camera 10 also includes an EEPROM (Electrically Erasable Programmable ROM) 16, an RTC (Real Time Clock) 17, and a GPS 18, as well as an MCU (Micro Controller Unit) 19, a communication unit 21, a USB 22, a contact terminal 23, a power supply unit 24, and a battery 25.

The wearable camera 10 includes, as an example of an operation input unit, a recording switch SW1, a snapshot switch SW2, and an attribute information assignment switch SW3. The wearable camera 10 includes, as an example of a state display unit, LEDs 26a, 26b, and 26c and a vibrator 27.

The imaging unit 11 includes, for example, an imaging lens 11a (see FIG. 6), a solid-state imaging element such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and a signal processing unit for converting the output of the solid-state imaging element into video data having frames in a predetermined format that a person can recognize. The imaging unit 11 captures video at a predetermined frame rate and outputs the video data of the subject obtained by the imaging to the MCU 19.

The GPIO 12 is, for example, a parallel interface, and inputs and outputs signals between the MCU 19 and the recording switch SW1, the snapshot switch SW2, the attribute information assignment switch SW3, the LEDs 26a to 26c, and the vibrator 27. Various sensors (for example, an acceleration sensor) are also connected to the GPIO 12.

The RAM 13 is a work memory used, for example, in the operation of the MCU 19. The ROM 14 is a memory that stores in advance, for example, the programs and data for controlling the MCU 19. The storage unit 15 is configured by a storage medium such as an SD memory, for example, and stores the video data obtained by imaging with the imaging unit 11. When an SD memory is used as the storage unit 15, it can be attached to and detached from the housing body of the wearable camera 10 or 10A.
The EEPROM 16 stores, for example, identification information identifying the wearable camera 10 or 10A (for example, a serial number serving as the camera ID) and other setting information. The other setting information includes, for example, login information obtained through setting registration at the in-station PC 71 or by logging in to the in-vehicle recorder 62 (for example, the car ID and officer ID). The RTC 17 counts the current time information and outputs it to the MCU 19. The GPS 18 receives the current position information and time information of the wearable camera 10 or 10A from a GPS transmitter (not shown) and outputs them to the MCU 19. This time information is used for correcting the system time of the wearable camera 10 or 10A.

The MCU 19 functions as a control unit; it performs, for example, control processing for the overall supervision of the operation of each unit of the wearable camera 10 or 10A, input/output processing of data with each unit of the wearable camera 10 or 10A, data calculation (computation) processing, and data storage processing, and it operates in accordance with the programs and data stored in the ROM 14. During operation, the MCU 19 uses, for example, the RAM 13, obtains the current time information from the RTC 17, and obtains the current position information from the GPS 18.

The communication unit 21 defines the connection between the communication unit 21 and the MCU 19 at, for example, the physical layer, which is the first layer of the OSI (Open Systems Interconnection) reference model, and performs, in accordance with that definition, wireless communication over, for example, a wireless LAN (W-LAN) (for example, Wi-Fi (registered trademark)). The communication unit 21 may also perform wireless communication such as near field communication (NFC) or Bluetooth (registered trademark). The USB 22 is a serial bus and enables, for example, the wearable camera 10 to be connected to the in-vehicle system 60 or to the in-station PC 71 in the police station 5.
 コンタクトターミナル23は、クレードル(不図示)または外付けアダプタ(不図示)等と電気的に接続するための端子であり、USB22を介してMCU19に接続され、電源部24と接続される。コンタクトターミナル23を介して、ウェアラブルカメラ10の充電、および映像データを含むデータの通信が可能となっている。コンタクトターミナル23には、例えば「充電端子V+」、「CON.DET端子」、「データ端子D-,D+」および「グランド端子」(いずれも図示省略)が設けられる。CON.DET端子は、電圧および電圧変化を検出するための端子である。データ端子D-,D+は、例えばUSBコネクタ端子を介して、外部PC等に対してウェアラブルカメラ10で撮像した映像データ等を転送するための端子である。コンタクトターミナル23とクレードル(不図示)または外付けアダプタ(不図示)のコネクタとが接続されることにより、ウェアラブルカメラ10と外部機器との間でデータ通信が可能となる。 The contact terminal 23 is a terminal for electrically connecting to a cradle (not shown) or an external adapter (not shown), and is connected to the MCU 19 via the USB 22 and connected to the power supply unit 24. Via the contact terminal 23, the wearable camera 10 can be charged and data including video data can be communicated. The contact terminal 23 is provided with, for example, “charging terminal V +”, “CON.DET terminal”, “data terminals D−, D +” and “ground terminal” (all not shown). CON. The DET terminal is a terminal for detecting a voltage and a voltage change. The data terminals D− and D + are terminals for transferring video data captured by the wearable camera 10 to an external PC or the like via, for example, a USB connector terminal. By connecting the contact terminal 23 and a connector of a cradle (not shown) or an external adapter (not shown), data communication is possible between the wearable camera 10 and an external device.
 The power supply unit 24 feeds the battery 25 with power supplied from, for example, a cradle or an external adapter via the contact terminal 23, thereby charging the battery 25. The battery 25 is constituted by, for example, a rechargeable secondary battery and supplies power to each unit of the wearable camera 10.
 The recording switch SW1 is, for example, a push-button switch for inputting an operation instruction to start or stop recording (moving-image capture) by a user's pressing operation. The snapshot switch SW2 is, for example, a push-button switch for inputting an operation instruction to capture a still image by a user's pressing operation. The attribute information addition switch SW3 is, for example, a push-button switch for inputting an operation instruction to add attribute information to video data by a user's pressing operation. The recording switch SW1, the snapshot switch SW2, and the attribute information addition switch SW3 are arranged at positions that the police officer 7 can easily operate even in an emergency (see, for example, FIG. 6). Each of the switches SW1 to SW3 is not limited to the above form and may be another type of operation input device that allows the user to input an operation instruction.
 The LED 26a is a display unit that indicates, for example, the power state (on/off state) of the wearable camera 10 and the state of the battery 25. The LED 26b is a display unit that indicates, for example, the state of the imaging operation (recording state) of the wearable camera 10. The LED 26c is a display unit that indicates, for example, the state of the communication mode of the wearable camera 10.
 The MCU 19 detects inputs on each of the recording switch SW1, the snapshot switch SW2, and the attribute information addition switch SW3, and processes whichever switch input was operated. When the MCU 19 detects an operation input on the recording switch SW1, it controls the start or stop of the imaging operation of the imaging unit 11 and saves the video data obtained from the imaging unit 11 in the storage unit 15 as moving-image video data. When the MCU 19 detects an operation input on the snapshot switch SW2, it saves the video data produced by the imaging unit 11 at the moment the snapshot switch SW2 was operated in the storage unit 15 as still-image video data.
 When the MCU 19 detects an operation input on the attribute information addition switch SW3, it adds preset attribute information to the video data and saves it in the storage unit 15 in association with the video data. At this time, the MCU 19 detects, for example, the state of an attribute selection switch (not shown) for selecting the attribute to associate with the video data and adds attribute information according to that setting. The MCU 19 also operates the communication unit 21 in a preset communication mode. When a recording operation starts, the MCU 19 drives the LEDs 26a to 26c and the vibrator 27 according to a preset notification mode and notifies the outside of the state of the recording operation by LED display and/or vibrator vibration.
 For each frame of the video data output from the imaging unit 11, the MCU 19 generates frame parameter information (see FIG. 9) indicating various kinds of information about that frame, using, for example, the respective outputs of the RTC 17 and the GPS 18. Alternatively, the MCU 19 has a system time counting unit 19A that holds the system time of the wearable cameras 10 and 10A; it may correct the system time using the output of the RTC 17, or, to synchronize with the system time of the in-vehicle camera system 65, it may correct the system time by periodically accessing the in-vehicle camera system 65 (for example, the in-vehicle recorder 62 acting as an NTP server) without using the output of the RTC 17.
 The MCU 19 uses the output of the system time counting unit 19A (that is, the system time of the wearable cameras 10 and 10A) to generate the frame parameter information, and further adds the frame parameter information to the video data stream to generate the video data (see FIG. 8). The frame parameter information will be described later with reference to FIG. 9.
 FIG. 4 is a block diagram showing an example of the internal configuration of the back-end server SV of the present embodiment. The back-end server SV shown in FIG. 4 includes a CPU 301, a memory 302, an I/O control unit 303, a communication unit 304, an input unit 305, an output unit 306, a storage control unit 307, and a storage 308.
 The CPU 301 performs, for example, control processing for overall supervision of the operation of each unit of the back-end server SV, data input/output processing with the other units, data arithmetic (calculation) processing, and data storage processing.
 The CPU 301, as an example of a playback unit, instructs the I/O control unit 303 to play back on the output unit 306 the two pieces of recorded data RCD1 and RCD2 stored in the storage 308 (that is, the recorded data RCD1 captured and recorded by the in-vehicle camera system 65 and the recorded data RCD2 captured and recorded by the wearable camera 10). As a result, the recorded data RCD1 captured and recorded by the in-vehicle camera system 65 and the recorded data RCD2 captured and recorded by the wearable camera 10 are each played back on the output unit 306. In the present embodiment, when playing back the two pieces of recorded data RCD1 and RCD2, the CPU 301 arranges two screens (windows) side by side on the output unit 306 (for example, a display) for comparison and plays them back so that their recording times are synchronized. The method of playing back the two pieces of recorded data RCD1 and RCD2 so that their recording times are synchronized will be described later with reference to FIG. 13. This allows the person in charge in the police station having jurisdiction over the back-end system 100B to grasp the situation at the scene broadly and clearly.
 The memory 302 is configured using, for example, a RAM, a ROM, and a nonvolatile or volatile semiconductor memory; it functions as a work memory during the operation of the CPU 301 and stores predetermined programs and data for operating the CPU 301. For example, when the CPU 301 instructs the I/O control unit 303 to play back (output) the two pieces of recorded data RCD1 and RCD2 (see FIGS. 7 and 8) read from the storage 308, the memory 302 temporarily stores the recorded data RCD1 and RCD2.
 The I/O control unit 303 performs control related to data input/output between the CPU 301 and each unit of the back-end server SV (for example, the communication unit 304, the input unit 305, and the output unit 306), relaying data from and to the CPU 301. The I/O control unit 303 may be configured integrally with the CPU 301.
 The communication unit 304 performs wired or wireless communication with the in-vehicle camera system 65 of the front-end system 100A or with the wearable cameras 10 and 10A via the management software 70. The communication unit 304, as an example of a receiving unit, receives the recorded data RCD1 and RCD2 transmitted from the in-vehicle camera system 65. The communication unit 304 may also receive recorded data RCD2 transmitted directly from the wearable cameras 10 and 10A.
 The input unit 305 is a UI that accepts input operations from the person in charge in the police station 5 having jurisdiction over the back-end system 100B and notifies the CPU 301 via the I/O control unit 303; it is, for example, a pointing device such as a mouse or a keyboard. The input unit 305 may also be configured using a touch panel or touch pad that is arranged to correspond to the screen output by the output unit 306 and can be operated with the finger of the person in charge or a stylus pen.
 The output unit 306 includes a display device configured using, for example, an LCD or organic EL, and/or a speaker that outputs sound, and displays various data on its screen or outputs sound. When the recorded data RCD2 captured (recorded) by the wearable camera 10 and the recorded data RCD1 captured (recorded) by the in-vehicle camera system 65 are input in response to, for example, an input operation by the person in charge in the police station 5 having jurisdiction over the back-end system 100B, the output unit 306 displays the video contained in the video data of each piece of recorded data on a different screen under the instruction of the CPU 301.
 The storage control unit 307 reads various data stored in the storage 308 or writes various data to the storage 308 in accordance with instructions from the CPU 301 or the I/O control unit 303.
 The storage 308 is configured by, for example, an SSD or an HDD and stores and accumulates the recorded data RCD1 captured and recorded by the in-vehicle camera system 65 (specifically, the in-vehicle camera 61) and the recorded data RCD2 captured and recorded by the wearable camera 10. When recorded data is transferred directly from the wearable cameras 10 and 10A, the storage 308 stores and accumulates the recorded data captured and recorded by the wearable cameras 10 and 10A. The storage 308 may also accumulate various data other than recorded data. A plurality of storages 308 may be provided.
 FIG. 5 is a schematic diagram showing a state in which the police officer 7 is wearing the wearable camera 10. FIG. 6 is a front view showing an example of the appearance of the wearable camera 10. As shown in FIG. 5, the wearable camera 10 is used attached to the clothing or body worn by the police officer 7, for example on the chest, so as to capture video of the field of view from a position close to the viewpoint of the police officer 7, or it may be attached to a hat via a fastener such as a clip. With the wearable camera 10 attached, the police officer 7 operates the recording switch SW1 to image the surrounding subjects.
 As shown in FIG. 6, the wearable camera 10 is provided with, for example, the imaging lens 11a of the imaging unit 11, the recording switch SW1, and the snapshot switch SW2 on the front face of a substantially rectangular parallelepiped housing 10K. For example, recording (moving-image capture) starts when the recording switch SW1 is pressed an odd number of times and ends when it is pressed an even number of times. Each time the snapshot switch SW2 is pressed, for example, a still image is captured at the moment of the press.
 The attribute information addition switch SW3 is provided on the left side face of the housing 10K of the wearable camera 10 as viewed from the front. When the police officer 7 presses the attribute information addition switch SW3, attribute information corresponding to the setting state (for example, theft, drunk driving, or parking violation) is added to the video data currently being recorded or to the video data recorded immediately before. The attribute information is classification information for identifying the type of video data. It is added, based on preset attribute information settings, in accordance with the user's operation of the attribute information addition switch SW3 of the wearable camera 10, the operation of the switch 109 of the in-vehicle recorder 62, or an operation on the in-vehicle PC 63. As the attribute information, attributes relating to incidents that occurred at the scene, such as theft, drunk driving, or parking violation, are used.
 As shown in FIG. 6, the LEDs 26a to 26c are arranged on the top face of the housing 10K of the wearable camera 10 as viewed from the front. This allows the police officer 7 to easily see the LEDs 26a to 26c while wearing the wearable camera 10. The LEDs 26a to 26c may also be made invisible to anyone other than the police officer 7 himself or herself. Although not shown, the contact terminal 23 is provided on the bottom face of the housing 10K of the wearable camera 10 as viewed from the front.
 FIG. 7 is a diagram showing an example of the data structure of the recorded data RCD1 captured by the in-vehicle camera 61. FIG. 8 is a diagram showing an example of the data structure of the recorded data RCD2 captured by the wearable camera 10. FIG. 9 is a diagram showing an example of the data structure of the frame parameter information FPM1 and FPM2 in the video data VDO1 and VDO2. FIG. 10 is a diagram showing an example of the data structure of the frame parameter information FPM1 in the video data VDO1 of the recorded data RCD1 captured by the in-vehicle camera 61. FIG. 11 is a diagram showing an example of the data structure of the frame parameter information FPM2 in the video data VDO2 of the recorded data RCD2 captured by the wearable camera 10.
 When the wearable camera 10 or the in-vehicle camera 61 of the present embodiment captures and records video, as shown in FIGS. 7 and 8, it generates meta information MTD1, MTD2 containing attribute information related to the captured video data VDO1, VDO2, and saves the associated pair in the memory within each device as recorded data RCD1, RCD2. In other words, the recorded data RCD2 saved in the memory of the wearable camera 10 (for example, the storage unit 15) contains the video data VDO2 and the meta information MTD2. Similarly, the recorded data RCD1 captured and recorded by the in-vehicle camera 61 contains the video data VDO1 and the meta information MTD1. The recorded data RCD1 is saved in the storage 111 in the in-vehicle recorder 62. In view of the communication environment (for example, line speed), the recorded data RCD1 and RCD2 are preferably transferred to and accumulated in the back-end server SV via the in-vehicle recorder 62. In other words, the recorded data RCD2 captured and recorded by the wearable camera 10 may be transferred directly from the wearable camera 10, but to shorten the transfer time it is preferable to transfer the video data from the wearable camera 10 to the in-vehicle recorder 62 during a period in which the in-vehicle recorder 62 is not transferring video data or the like to the back-end server SV (for example, the travel time after a patrol shift ends until returning to the police station), and then to transfer it from the in-vehicle recorder 62 to the back-end server SV.
 As shown in FIG. 9, the frame parameter information FPM1 and FPM2 is generated for each frame of the video captured by the in-vehicle camera 61 and the wearable camera 10, respectively, and indicates various kinds of information about the frame (for example, the recording time of the frame in seconds, the recording time of the frame in milliseconds, and the frame number within the video data). The frame parameter information FPM1 and FPM2 shown in FIG. 9 contains "Record time (sec)", "Record time (msec)", and "Frame count". "Record time (sec)" has a size of 4 bytes and indicates the recording time of the frame in seconds, specifically the total number of seconds elapsed from a predetermined reference date. The predetermined reference date is not particularly limited; it may be, for example, January 1, 1970, or the first day of the current year (for example, January 1, 2015). "Record time (msec)" has a size of 2 bytes and indicates the recording time of the frame in milliseconds. Accordingly, "Record time (sec)" plus "Record time (msec)" (converted to seconds) gives the time (recording time) at which the corresponding frame of the actual video data VDO1 or VDO2 was recorded. "Frame count" has a size of 4 bytes and indicates the frame number within the video data VDO1 or VDO2.
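 As an illustrative sketch only (the patent specifies just the field sizes and meanings, not an encoding), the three fields could be packed and unpacked as follows in Python; the big-endian byte order and the helper names are assumptions:

    import struct

    # Frame parameter information layout: 4-byte "Record time (sec)",
    # 2-byte "Record time (msec)", 4-byte "Frame count" (10 bytes total).
    FPM_FORMAT = ">IHI"  # byte order assumed for illustration

    def pack_fpm(record_time_sec, record_time_msec, frame_count):
        # Serialize one frame's parameter information into 10 bytes.
        return struct.pack(FPM_FORMAT, record_time_sec, record_time_msec, frame_count)

    def unpack_fpm(blob):
        # Recover (sec, msec, frame_count) from a packed record.
        return struct.unpack(FPM_FORMAT, blob)

    def recording_time(record_time_sec, record_time_msec):
        # Recording time of the frame in seconds from the reference date.
        return record_time_sec + record_time_msec / 1000.0

 For the first frame in FIG. 10, for example, recording_time(1002231, 743) evaluates to 1002231.743 seconds from the reference date.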
 A more detailed description is given below using the specific examples in FIGS. 10 and 11.
 In the frame parameter information FPM1 shown in FIG. 10, the video data VDO1 of the corresponding frame (frame number "1") was captured and recorded by the in-vehicle camera 61 at the point when "1002231 + 0.743" seconds had elapsed from the predetermined reference date. In other words, the time at which "1002231 + 0.743" seconds had elapsed from the predetermined reference date is the recording start time of the video data VDO1. Similarly, in the frame parameter information FPM2 shown in FIG. 11, the video data of the corresponding frame (frame number "1") was captured and recorded by the wearable camera 10 at the point when "1002372 + 0.057" seconds had elapsed from the predetermined reference date. In other words, the time at which "1002372 + 0.057" seconds had elapsed from the predetermined reference date is the recording start time of the video data VDO2.
 That is, according to the frame parameter information FPM1 and FPM2 shown in FIGS. 10 and 11, the in-vehicle camera 61 and the wearable camera 10 captured video of the scene of the same incident, but their recording times differ and are not synchronized; the in-vehicle camera 61 started recording before the wearable camera 10 did. In the present embodiment, the back-end server SV uses the frame parameter information FPM1 and FPM2 relating to the first frames in the video data VDO1 and VDO2 of the respective recorded data RCD1 and RCD2 to start playback so that the video data VDO1 and VDO2 are synchronized.
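 Working through the figures above (an illustrative calculation, not stated in the patent text): the recording start times are 1002231.743 s and 1002372.057 s from the reference date, so the start-time difference is 1002372.057 - 1002231.743 = 140.314 s. In synchronized playback, the video data VDO1 is therefore started first, and the video data VDO2 is started 140.314 s (140 s and 314 ms) into the playback of VDO1.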
 Next, the operating procedure for synchronizing the system time between the in-vehicle camera system 65 and the wearable camera 10 will be described with reference to FIG. 12. FIG. 12 is a flowchart explaining an example of the operating procedure by which the wearable camera 10 of the present embodiment synchronizes its time with the in-vehicle camera system 65. As a premise of the description of FIG. 12, the system times of the in-vehicle camera 61 and the in-vehicle recorder 62 in the in-vehicle camera system 65 are synchronized (that is, they coincide).
 In FIG. 12, the wearable camera 10 executes initial processing of the wearable camera 10 main body (S1). The initial processing is, for example, reading initial setting values for various processes into the memory of the wearable camera 10 (for example, the storage unit 15), or booting the OS (Operating System) of the wearable camera 10 so that each unit of the wearable camera 10 reads its setting values and becomes operable.
 Having finished the initial processing of step S1, the wearable camera 10 determines whether it is within the communication range (communication area) of the in-vehicle camera system (ICV) 65 (S2). If it is determined that the wearable camera 10 is not within the communication range of the in-vehicle camera system 65 (S2, NO), the operation of the wearable camera 10 proceeds to step S5. On the other hand, if the wearable camera 10 determines that it is within the communication range of the in-vehicle camera system 65 (S2, YES), it establishes a communication link with the in-vehicle camera system 65 (S3).
 After step S3, the wearable camera 10 accesses the in-vehicle camera system 65 and adjusts the output (system time) of its system time counting unit 19A so that its system time matches (that is, is synchronized with) that of the in-vehicle camera system 65 (S4). As for the time adjustment for synchronizing the system time, for example, an NTP server may be provided in the in-vehicle camera system 65 (either the in-vehicle camera 61 or the in-vehicle recorder 62), and the wearable camera 10 can make the adjustment by accessing this NTP server to obtain the system time. Alternatively, a proprietary protocol for adjusting the system time between the wearable camera 10 and the in-vehicle camera system 65 may be used. Although FIG. 12 describes synchronizing the system time between the in-vehicle camera system 65 and the wearable camera 10, the description of FIG. 12 applies equally to synchronizing the system time between the wearable camera 10 and the wearable camera 10A.
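 As a minimal sketch of the NTP-based adjustment (not part of the patent disclosure), using the third-party Python library ntplib; the in-vehicle recorder's address is a hypothetical value:

    import ntplib  # third-party NTP client library

    ICV_NTP_HOST = "192.168.1.62"  # hypothetical address of the in-vehicle recorder 62

    def clock_offset_from_icv():
        # Ask the ICV-side NTP server for the estimated offset between
        # this camera's clock and the server's clock, in seconds.
        client = ntplib.NTPClient()
        response = client.request(ICV_NTP_HOST, version=3, timeout=2)
        return response.offset

 The system time counting unit 19A would then be advanced or retarded by the returned offset so that the two clocks coincide.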
 After step S4, the wearable camera 10 waits for a trigger to start recording (S5). That is, the wearable camera 10 stands by until a trigger operation for starting recording (for example, pressing of the recording switch SW1, or automatic recording start when a predetermined condition is satisfied) is performed.
 The processing of steps S2 to S4 shown in FIG. 12 (see the dotted lines) is repeated periodically by the wearable camera 10. This is because, even if it is determined in step S2 that the wearable camera 10 is not within the communication range of the in-vehicle camera system 65, the system times of the wearable camera 10 and the in-vehicle camera system 65 must be synchronized (in other words, the processing of step S4 must be executed) whenever recorded data is transmitted to the back-end server SV. Accordingly, the processing of steps S2 to S4 shown in FIG. 12 needs to be repeated periodically rather than performed only once, taking into account, for example, that the police officer 7 wearing the wearable camera 10 may move around. In this way, when the wearable camera 10 comes within the communication range of the in-vehicle camera system 65, for example because the police officer 7 wearing the wearable camera 10 has moved, the wearable camera 10 and the in-vehicle camera system 65 can synchronize their system times.
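 The periodic repetition of steps S2 to S4 could be sketched as follows; the camera object and its three methods are hypothetical stand-ins for the range check (S2), link establishment (S3), and time adjustment (S4), and the 60-second interval is an illustrative assumption:

    import threading

    SYNC_INTERVAL = 60.0  # seconds between repetitions; an assumption

    def periodic_sync(camera):
        # Repeat steps S2 to S4: range check, link, time adjustment.
        if camera.in_icv_range():          # S2
            camera.establish_link()        # S3
            camera.sync_time_with_icv()    # S4
        threading.Timer(SYNC_INTERVAL, periodic_sync, args=(camera,)).start()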
 Although FIG. 12 describes the operating procedure for time synchronization between the wearable camera 10 and the in-vehicle camera system 65, the wearable camera 10 may also synchronize its system time with the back-end server SV of the back-end system 100B, for example while or before transferring (uploading) the recorded data RCD2 to the back-end server SV, using processing similar to steps S2 to S4 shown in the figure. Furthermore, the in-vehicle recorder 62 of the in-vehicle camera system 65 may synchronize its system time with the back-end server SV of the back-end system 100B, for example while or before transferring (uploading) the recorded data RCD1 to the back-end server SV. In this case, the back-end system 100B holds an NTP server (not shown), and the back-end server SV adjusts its system time by periodically accessing this NTP server.
 Next, the operating procedure by which the back-end server SV of the present embodiment plays back the video data VDO1 and VDO2 of the recorded data RCD1 and RCD2 captured by the in-vehicle camera 61 and the wearable camera 10, respectively, will be described with reference to FIG. 13. FIG. 13 is a flowchart explaining an example of that operating procedure. As a premise of the description of FIG. 13, the recorded data RCD1 captured by the in-vehicle camera 61 and the recorded data RCD2 captured by the wearable camera 10 are stored in the storage 308 of the back-end server SV, and the system times of the wearable camera 10 and the in-vehicle camera system 65 are synchronized in accordance with the method shown in FIG. 12.
 In FIG. 13, the back-end server SV reads the recorded data RCD1 captured by the in-vehicle camera 61 from the storage 308 into the memory 302 and acquires its recording start time information (S11). That is, based on the frame parameter information FPM1 contained in the video data VDO1 within the recorded data RCD1, the back-end server SV acquires information indicating when the in-vehicle camera 61 started recording the first frame of the video data VDO1 (S11).
 Similarly, the back-end server SV reads the recorded data RCD2 captured by the wearable camera 10 from the storage 308 into the memory 302 and acquires its recording start time information (S12). That is, based on the frame parameter information FPM2 contained in the video data VDO2 within the recorded data RCD2, the back-end server SV acquires information indicating when the wearable camera 10 started recording the first frame of the video data VDO2 (S12).
 The back-end server SV determines whether the recording start time of the video data VDO1 of the recorded data RCD1 precedes the recording start time of the video data VDO2 of the recorded data RCD2 (S13). If the back-end server SV determines that the recording start time of the video data VDO1 of the recorded data RCD1 precedes that of the video data VDO2 of the recorded data RCD2 (S13, YES), it starts playback of the video data VDO1 of the recorded data RCD1 captured and recorded by the in-vehicle camera system 65 and outputs the video to a screen (S14).
 After starting playback of the video data VDO1, when the recording start time of the video data VDO2 of the recorded data RCD2 captured by the wearable camera 10 is reached (S15, YES), that is, when a time has elapsed since the start of playback of the video data VDO1 equal to the difference between "Record time (sec)" + "Record time (msec)" of the frame parameter information FPM2 for the first frame of the video data VDO2 and "Record time (sec)" + "Record time (msec)" of the frame parameter information FPM1 for the first frame of the video data VDO1, the back-end server SV starts playback of the video data VDO2 on a playback screen separate from the playback screen of the video data VDO1 and outputs its video (S16).
 Here, when the video data VDO1 and VDO2 end, or when the person in charge in the police station 5 having jurisdiction over the back-end system 100B performs an operation to stop playback of the video data VDO1 and VDO2, the back-end server SV ends playback of the video data VDO1 and VDO2.
 On the other hand, if the back-end server SV determines that the recording start time of the video data VDO1 of the recorded data RCD1 does not precede the recording start time of the video data VDO2 of the recorded data RCD2 (S13, NO), it starts playback of the video data VDO2 of the recorded data RCD2 captured and recorded by the wearable camera 10 and outputs the video to a screen (S18).
 After starting playback of the video data VDO2, when the recording start time of the video data VDO1 of the recorded data RCD1 captured by the in-vehicle camera system 65 is reached (S19, YES), that is, when a time has elapsed since the start of playback of the video data VDO2 equal to the difference between "Record time (sec)" + "Record time (msec)" of the frame parameter information FPM1 for the first frame of the video data VDO1 and "Record time (sec)" + "Record time (msec)" of the frame parameter information FPM2 for the first frame of the video data VDO2, the back-end server SV starts playback of the video data VDO1 on a playback screen separate from the playback screen of the video data VDO2 and outputs its video (S20).
 Here, when the video data VDO1 and VDO2 end, or when the person in charge in the police station 5 having jurisdiction over the back-end system 100B performs an operation to stop playback of the video data VDO1 and VDO2, the back-end server SV ends playback of the video data VDO1 and VDO2.
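 The branching in steps S13 to S20 reduces to a simple rule: start the earlier-recorded stream first, then schedule the other after the start-time difference. The following is a minimal sketch of that sequencing (not part of the patent disclosure); the start_playback callback and the (sec, msec) tuples for the first frames are illustrative assumptions:

    import threading

    def start_time(fpm):
        # fpm is the (sec, msec) pair of a stream's first frame.
        return fpm[0] + fpm[1] / 1000.0

    def synchronized_start(fpm1, fpm2, start_playback):
        t1 = start_time(fpm1)  # recording start time of VDO1
        t2 = start_time(fpm2)  # recording start time of VDO2
        if t1 <= t2:                        # S13 YES: VDO1 recorded first
            first, second, delay = "VDO1", "VDO2", t2 - t1   # S14, S15
        else:                               # S13 NO: VDO2 recorded first
            first, second, delay = "VDO2", "VDO1", t1 - t2   # S18, S19
        start_playback(first)
        threading.Timer(delay, start_playback, args=(second,)).start()  # S16 / S20

 With the FIG. 10 and FIG. 11 values, synchronized_start((1002231, 743), (1002372, 57), ...) starts VDO1 at once and VDO2 after 140.314 s, where the third argument is whatever function opens a playback window for the named stream.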
 As described above, in the wearable camera system 100 of the present embodiment, the wearable camera 10 captures video of the situation at the scene to which the police officer 7 has rushed, and the in-vehicle camera system 65 (specifically, the in-vehicle camera 61) installed in the police car in which the police officer 7 rides captures video of the situation at the same scene. The back-end server SV receives the recorded data RCD2 containing the video data VDO2 and the recorded data RCD1 containing the video data VDO1, and saves the recorded data RCD1 and RCD2 in the storage 308 in association with each other. The back-end server SV can associate the recorded data RCD1 with the recorded data RCD2 by, for example, assigning a common case number. Further, the back-end server SV uses the frame parameter information relating to the first frame of the video data VDO1 and the frame parameter information relating to the first frame of the video data VDO2 to play back the video data VDO1 and VDO2 in synchronization.
 As a result, when playing back on the back-end server SV a plurality of pieces of video data VDO1 and VDO2 capturing the same incident, the wearable camera system 100 can synchronize the playback times of the video data VDO1 and VDO2 captured from different angles, so the situation at the scene to which the police officer 7 rushed can be grasped broadly, clearly, and easily. For example, if a part of the scene that one camera (for example, the in-vehicle camera 61) could not fully capture was being captured at that moment by the other camera (for example, the wearable camera 10), the wearable camera system 100 makes it possible to accurately grasp the situation at the scene (for example, the chronological sequence of events within the same incident, the victim, the suspect, and the bystanders). Since the in-vehicle camera 61 is installed in the police car, even when the video data VDO1 captured and recorded by the in-vehicle camera 61 could not cover a sufficient range, for example because the view was blocked by a police officer or an obstacle, the playback times of the video data VDO1 and VDO2 are synchronized, so the person in charge in the police station 5 having jurisdiction over the back-end system 100B can see, in the video data VDO2 captured and recorded by the wearable camera 10, the portion that was blocked.
 In the present embodiment, the video data VDO1 has been described as video data captured and recorded by the in-vehicle camera 61, but it may instead be video data captured and recorded by the wearable camera 10A shown in FIG. 2. In that case, when the video data captured and recorded by the plurality of wearable cameras 10 and 10A is played back with synchronized playback times, the back-end server SV makes it possible to grasp the situation at the scene at the time of imaging (recording) even more broadly.
 The wearable camera 10 of the present embodiment synchronizes with the system time of the in-vehicle camera 61 (in other words, the system time of the in-vehicle camera system 65) when it is within the communication range of the in-vehicle camera 61. Thus, when the wearable camera 10 comes within the communication range of the in-vehicle camera 61 because the police officer 7 wearing it has moved, the wearable camera 10 can adjust its own system time to coincide with the system time of the in-vehicle camera 61 (in other words, the system time of the in-vehicle camera system 65).
 The wearable camera 10 of the present embodiment also periodically determines whether it is within the communication range of the in-vehicle camera 61. Thus, even if the wearable camera 10 once determines that it is not within the communication range of the in-vehicle camera 61, it can adjust its own system time to coincide with the system time of the in-vehicle camera 61 when it later comes within that communication range, for example because the police officer 7 wearing the wearable camera 10 has moved.
 Further, when one of the recording times indicated by the frame parameter information FPM1 and FPM2 of the video data VDO1 and VDO2 precedes the other, the back-end server SV of the present embodiment starts playback of the video data corresponding to the preceding recording time first, and starts playback of the video data corresponding to the other recording time when the other recording time coincides with the first. Thus, even when the times at which the wearable camera 10 and the in-vehicle camera 61 captured and recorded their video differ, the back-end server SV can easily obtain the recording time information in each of the frame parameters FPM1 and FPM2 contained in the video data VDO1 and VDO2, and by using this recording time information it can play back the video data VDO1 and VDO2 so that their recording times coincide, accurately presenting the situation at the scene.
 Various embodiments have been described above with reference to the drawings, but it goes without saying that the present disclosure is not limited to these examples. It is obvious that those skilled in the art can conceive of various changes or modifications within the scope described in the claims, and these are naturally understood to belong to the technical scope of the present disclosure. In addition, the constituent elements in the above embodiments may be combined arbitrarily without departing from the spirit of the present disclosure.
 For example, if the performance of the CPU 301 of the back-end server SV is not sufficiently high, the playback times of the video data VDO1 and VDO2 may drift apart. For this reason, after step S16 or step S20 the back-end server SV may periodically determine whether there is a discrepancy between the playback times of the video data VDO1 and VDO2 and, if it determines that there is, adjust for it. That is, the back-end server SV preferably slows the playback speed of one or both pieces of video data and controls playback so that no discrepancy arises between the playback times of the video data VDO1 and VDO2 output on the two screens (two windows).
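 Such a drift check could be sketched as follows (an illustration only, not the patent's implementation); the Player objects with position() and set_speed() methods, the 40 ms tolerance, and the 2 s check interval are all hypothetical:

    import threading

    DRIFT_TOLERANCE = 0.040  # seconds of tolerated mismatch (assumption)
    CHECK_INTERVAL = 2.0     # seconds between checks (assumption)

    def keep_in_sync(player1, player2, offset):
        # offset: recording start-time difference between the two streams,
        # so that ideally player1.position() == player2.position() + offset.
        drift = player1.position() - (player2.position() + offset)
        if drift > DRIFT_TOLERANCE:
            player1.set_speed(0.98)  # slow the leading stream slightly
            player2.set_speed(1.0)
        elif drift < -DRIFT_TOLERANCE:
            player1.set_speed(1.0)
            player2.set_speed(0.98)
        else:
            player1.set_speed(1.0)
            player2.set_speed(1.0)
        threading.Timer(CHECK_INTERVAL, keep_in_sync,
                        args=(player1, player2, offset)).start()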
 The present disclosure is useful as a wearable camera system and a video data synchronized playback method that, when playing back a plurality of pieces of video data capturing the same incident, synchronize the playback times of the respective pieces of video data so that the situation at the scene can be easily grasped.
5 Police station
6 Police car
7 Police officer
10, 10A Wearable camera
11 Imaging unit
12, 108 GPIO
13, 105 RAM
14 ROM
15 Storage unit
16 EEPROM
17 RTC
18, 107 GPS
19 MCU
19A System time counting unit
21, 102, 203, 304 Communication unit
22 USB
23 Contact terminal
24 Power supply unit
25 Battery
26a, 26b, 26c LED
27 Vibrator
60 In-vehicle system
61 In-vehicle camera
62 In-vehicle recorder
63 In-vehicle PC
65 In-vehicle camera system
70 Management software
71 In-station PC
100 Wearable camera system
100A Front-end system
100B Back-end system
101, 201, 301 CPU
104 Flash ROM
106 Microcomputer
109 Switch
110 LED
111 Storage
202, 303 I/O control unit
204, 302 Memory
205, 305 Input unit
206 Display unit
207 Speaker
SV Back-end server
306 Output unit
307 Storage control unit
308 Storage

Claims (7)

  1.  A wearable camera system including at least a first wearable camera that can be worn by a user, and a server, wherein
     the server comprises:
     a communication unit that receives first video data captured by the first wearable camera and second video data captured by an external device;
     a storage unit that stores the first video data and the second video data received by the communication unit in association with each other; and
     a playback unit that plays back the first video data and the second video data stored in the storage unit,
     the first video data has first recording time information captured by the first wearable camera,
     the second video data has second recording time information captured by the external device, and
     the playback unit plays back the first video data and the second video data in synchronization, using the first recording time information and the second recording time information.
  2.  The wearable camera system according to claim 1, wherein
     the external device is an in-vehicle camera mounted on a vehicle in which the user rides.
  3.  The wearable camera system according to claim 1, wherein
     the external device is a second wearable camera different from the first wearable camera.
  4.  The wearable camera system according to claim 1, wherein
     the first wearable camera synchronizes with the time of the external device when it is within the communication range of the external device.
  5.  The wearable camera system according to claim 4, wherein
     the first wearable camera periodically determines whether it is within the communication range of the external device.
  6.  The wearable camera system according to claim 1, wherein,
     when one recording time of the first recording time information and the second recording time information precedes the other, the playback unit starts playback of the video data corresponding to the one recording time, and starts playback of the video data corresponding to the other recording time when the other recording time coincides with the one recording time.
  7.  A video data synchronized playback method in a wearable camera system including at least a first wearable camera that can be worn by a user, and a server, the method comprising:
     capturing video with the first wearable camera;
     capturing the video with an external device;
     receiving first video data captured by the first wearable camera and second video data captured by the external device;
     storing in the server, in association with each other, the first video data having first recording time information captured by the first wearable camera and the second video data having second recording time information captured by the external device; and
     playing back, in synchronization, the first video data and the second video data stored in the server, using the first recording time information and the second recording time information.
PCT/JP2016/000333 2015-02-16 2016-01-25 Wearable camera system and method for synchronously reproducing video data WO2016132678A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-027608 2015-02-16
JP2015027608A JP6115873B2 (en) 2015-02-16 2015-02-16 Wearable camera system and video data synchronous reproduction method

Publications (1)

Publication Number Publication Date
WO2016132678A1 true WO2016132678A1 (en) 2016-08-25

Family

ID=56688934

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/000333 WO2016132678A1 (en) 2015-02-16 2016-01-25 Wearable camera system and method for synchronously reproducing video data

Country Status (2)

Country Link
JP (1) JP6115873B2 (en)
WO (1) WO2016132678A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000333160A (en) * 1999-05-19 2000-11-30 Nippon Dry Chem Co Ltd Device for grasping situation of fire site
JP2003283900A (en) * 2002-03-22 2003-10-03 Sanyo Electric Co Ltd Digital camera
JP2003284049A (en) * 2002-03-26 2003-10-03 Kubota Corp Monitoring system
JP2005217609A (en) * 2004-01-28 2005-08-11 Victor Co Of Japan Ltd Video information reproducing apparatus
JP2005328128A (en) * 2004-05-12 2005-11-24 Matsushita Electric Ind Co Ltd Synchronous regeneration device
JP2009267600A (en) * 2008-04-23 2009-11-12 Canon Inc Image processing device, control method thereof, and program

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2553659A (en) * 2017-07-21 2018-03-14 Weheartdigital Ltd A System for creating an audio-visual recording of an event
GB2553659B (en) * 2017-07-21 2018-08-29 Weheartdigital Ltd A System for creating an audio-visual recording of an event
US11301508B2 (en) 2017-07-21 2022-04-12 Filmily Limited System for creating an audio-visual recording of an event
JP6341526B1 (en) * 2018-01-16 2018-06-13 新生環境株式会社 Self-action recording camera
JP2019124816A (en) * 2018-01-16 2019-07-25 新生環境株式会社 Self behavior recording camera

Also Published As

Publication number Publication date
JP6115873B2 (en) 2017-04-19
JP2016152443A (en) 2016-08-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16752071

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16752071

Country of ref document: EP

Kind code of ref document: A1