US20200358947A1 - Wearable camera, video playback system, and video playback method - Google Patents
- Publication number
- US20200358947A1
- Authority
- US
- United States
- Prior art keywords
- bookmark
- video data
- wearable camera
- data file
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N5/23219—
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/62—Text, e.g. of license plates, overlay texts or captions on TV images
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N5/232939—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/44—Event detection
Definitions
- This disclosure relates to a wearable camera, a video playback system, and a video playback method.
- Wearable cameras have been introduced in order to support the work of police officers, security guards, and the like (for example, see PTL 1).
- The wearable camera is attached to the body of the police officer, the security guard, or the like, or to the clothes they wear, and captures video of a scene and the like.
- The police officer, the security guard, or the like may play back the video captured (recorded) by the wearable camera on a terminal apparatus after returning to the police station or the office, and create a report on the case while watching the played-back video.
- The video data captured by the wearable camera may, however, include footage that is not relevant to the case (report). Watching footage that is not relevant to the case while creating the report makes report creation time consuming, which is a problem.
- A non-limiting example of this disclosure contributes to the provision of a wearable camera and a signal adding method capable of simplifying report creation.
- A wearable camera includes: a storage section that stores therein video data of a captured moving image; and a controller that adds a bookmark signal to the video data when an event included in the video data is detected, the bookmark signal indicating a location at which playback of the video data starts when the video data is played back.
- A video playback system includes: a wearable camera; and a display apparatus that plays back video data of a moving image acquired by the wearable camera, in which: the wearable camera includes: a storage section that stores therein video data of a captured moving image; and a controller that adds a bookmark signal to the video data when an event included in the video data is detected, the bookmark signal indicating a location at which playback of the video data starts when the video data is played back; and the display apparatus starts playback of the video data from a location on the video data indicated by the bookmark signal.
- A video playback method is a method for a video playback system including a wearable camera and a display apparatus that plays back video data of a moving image acquired by the wearable camera, the video playback method including: storing, by the wearable camera, video data of a captured moving image; adding, by the wearable camera, a bookmark signal to the video data when an event included in the video data is detected, the bookmark signal indicating a location at which playback of the video data starts when the video data is played back; and starting, by the display apparatus, playback of the video data from a location on the video data indicated by the bookmark signal.
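The claimed flow, store video data, add a bookmark signal on event detection, and start playback from the bookmarked location, can be sketched as follows. This is an illustrative Python sketch; the names `Bookmark`, `VideoData`, `add_bookmark`, and `playback_start` are assumptions for illustration, not terms from the disclosure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Bookmark:
    time_s: float  # time at which the event was detected
    event: str     # content of the detected event (bookmark classification)

@dataclass
class VideoData:
    bookmarks: List[Bookmark] = field(default_factory=list)

def add_bookmark(video: VideoData, time_s: float, event: str) -> None:
    # The controller adds a bookmark signal when an event is detected.
    video.bookmarks.append(Bookmark(time_s, event))

def playback_start(video: VideoData, event: str) -> float:
    # The display apparatus starts playback from the location
    # indicated by the matching bookmark signal.
    for bm in video.bookmarks:
        if bm.event == event:
            return bm.time_s
    return 0.0  # no matching bookmark: play from the beginning

video = VideoData()
add_bookmark(video, 12.5, "conversation detection")
add_bookmark(video, 34.0, "gunshot detection")
print(playback_start(video, "gunshot detection"))  # 34.0
```

The sketch keeps the bookmark as data attached to the video rather than a separate file, mirroring the claim language that the signal is added "to the video data".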
- The inclusive or specific aspects above may be implemented by a system, an apparatus, a method, an integrated circuit, a computer program, or a recording medium, and may be implemented by a freely selected combination of the system, the apparatus, the method, the integrated circuit, the computer program, and the recording medium.
- According to this disclosure, report creation can be simplified.
- FIG. 1 illustrates a configuration example of a wearable camera system according to Embodiment 1;
- FIG. 2 describes an example of bookmarks added to video data of a wearable camera;
- FIG. 3 describes an example of the addition of bookmarks;
- FIG. 4 illustrates a playback screen example of a terminal apparatus;
- FIG. 5 describes an example of the addition of bookmarks;
- FIG. 6 describes an example of the addition of bookmarks;
- FIG. 7 describes an example of the addition of bookmarks;
- FIG. 8 describes an example of the addition of bookmarks;
- FIG. 9 illustrates an example of bookmarks of the face detection and the number plate detection;
- FIG. 10 illustrates a playback screen example of the terminal apparatus;
- FIG. 11 illustrates an example of the upper body of a police officer wearing the wearable camera and a biological sensor;
- FIG. 12 illustrates an external appearance example of the wearable camera;
- FIG. 13 illustrates a block configuration example of the wearable camera;
- FIG. 14 illustrates a block configuration example of a server;
- FIG. 15 illustrates a block configuration example of the terminal apparatus;
- FIG. 16 is a flowchart illustrating an operation example of the wearable camera;
- FIG. 17 is a flowchart illustrating an operation example of blur processing of the terminal apparatus; and
- FIG. 18 is a flowchart illustrating an operation example of a wearable camera according to Embodiment 2.
- Embodiments of the present invention are described in detail below with reference to the accompanying drawings, as appropriate. Unnecessarily detailed descriptions may be omitted. For example, detailed descriptions of features that are already well known, and overlapping descriptions of configurations that are substantially the same, may be omitted. This prevents the description below from becoming unnecessarily redundant and facilitates understanding by a person skilled in the art.
- FIG. 1 illustrates a configuration example of a wearable camera system according to Embodiment 1.
- The wearable camera system includes wearable camera 1, in-vehicle system 2, server 3, and terminal apparatus 4.
- In-vehicle system 2 , server 3 , and terminal apparatus 4 are connected to each other via network 5 .
- Wearable camera 1 is connected to server 3 and terminal apparatus 4 via in-vehicle system 2 and network 5 .
- Network 5 may include networks such as the Internet and a wireless communication network of a mobile phone and the like, for example.
- Wearable camera 1 is worn on or possessed by a police officer, for example (for example, see FIG. 11 ). Wearable camera 1 communicates with in-vehicle system 2 by short-range wireless communication such as Wi-Fi (R) or Bluetooth (R), for example.
- In-vehicle system 2 is installed in police vehicle A 1 , for example.
- In-vehicle system 2 includes an in-vehicle camera (not shown), a control apparatus (not shown) such as a personal computer, and a communication apparatus (not shown), for example.
- In-vehicle system 2 receives video data that is captured by wearable camera 1 from wearable camera 1 , for example. In-vehicle system 2 transmits the video data received from wearable camera 1 to server 3 via network 5 . In-vehicle system 2 transmits the video data captured by the in-vehicle camera to server 3 via network 5 .
- Server 3 stores the video data captured by wearable camera 1 and the video data captured by the in-vehicle camera of in-vehicle system 2 therein.
- Server 3 stores a report created by terminal apparatus 4 and the like therein.
- Terminal apparatus 4 is used by a police officer in police station A 2 , for example. Terminal apparatus 4 accesses server 3 in accordance with the operation by the police officer, and displays the video data stored in server 3 on the display apparatus. Terminal apparatus 4 creates a report relating to the case and the like, for example, in accordance with the operation by the police officer. Terminal apparatus 4 transmits the created report to server 3 via network 5 .
- In the example described above, wearable camera 1 is connected to server 3 and terminal apparatus 4 via in-vehicle system 2 and network 5, but the present invention is not limited thereto. Wearable camera 1 may be connected to server 3 and terminal apparatus 4 via network 5 without going through in-vehicle system 2.
- FIG. 2 describes an example of bookmarks added to the video data of wearable camera 1 .
- Wearable camera 1 detects a predetermined event. For example, wearable camera 1 detects an event shown in the right column in FIG. 2 .
- When wearable camera 1 detects an event, wearable camera 1 adds a bookmark signal (hereinafter also referred to simply as a bookmark) including the detected event content to the video data that is being captured (for example, see FIG. 3).
- Wearable camera 1 detects a dash, a fall, or a fight with a suspect by the police officer wearing wearable camera 1, using a gyro sensor and an acceleration sensor described below.
- In this case, wearable camera 1 adds a bookmark indicating that the police officer has dashed, fallen, or fought with the suspect to the video data.
- Wearable camera 1 detects the excited state of the police officer wearing wearable camera 1 by a biological sensor described below. When wearable camera 1 detects the excited state of the police officer, wearable camera 1 adds a bookmark indicating that the police officer has entered an excited state to the video data.
- Wearable camera 1 detects that a predetermined image is included in the video data.
- In this case, wearable camera 1 adds a bookmark indicating that the predetermined image is included to the video data.
- The predetermined images to be detected include a person, a face, a vehicle, a number plate, an edged tool or a gun, abnormal behavior of a person, a crowd, a color, the color of the clothes of a person, and the color of a vehicle, for example.
- The color may be a color other than that of the clothes of a person or of a vehicle, and may be the color of a building or the like, for example.
- Wearable camera 1 detects that a conversation or predetermined words are included in the collected sound.
- In this case, wearable camera 1 adds a bookmark indicating that the conversation or the predetermined words are included to the video data.
- Wearable camera 1 detects that a gunshot or an explosion sound is included in the collected sound.
- In this case, wearable camera 1 adds a bookmark indicating that the gunshot or the explosion sound is included to the video data.
- The events detected by wearable camera 1 may be classified into the detection of an action of the police officer, the living body detection, the image detection, the audible sound detection, and the special sound detection, for example, as shown in the left column in FIG. 2.
- The bookmark added to the video data may be referred to as attribute information, a tag, or metadata.
- The event may be understood to be an event relevant to the case.
- For example, the police officer starts to run in order to chase a suspect; therefore, the "dash detection" in FIG. 2 can be said to be an event relevant to the case.
- FIG. 3 describes an example of the addition of bookmarks.
- FIG. 3 illustrates some frames of the video data captured by wearable camera 1 .
- The horizontal axis in FIG. 3 indicates time.
- Wearable camera 1 detects a person from the video data that is being captured.
- In this case, a bookmark including time t 1 and an event indicating that the person is detected is added to the video data.
- Wearable camera 1 detects a dash of the police officer wearing wearable camera 1 from the acceleration sensor and the gyro sensor described below.
- In this case, a bookmark including time t 2 and an event indicating that the dash is detected is added to the video data.
- Wearable camera 1 detects a conversation from the audible sound in the video data.
- In this case, a bookmark including time t 3 and an event indicating that the conversation is detected is added to the video data.
- Wearable camera 1 detects a gunshot from the audible sound in the video data.
- In this case, a bookmark including time t 4 and an event indicating that the gunshot is detected is added to the video data.
- The bookmark may include, besides the time and the information indicating the event content, the place at which the event is detected and the like, as described with reference to FIG. 5 to FIG. 9.
- When wearable camera 1 detects a predetermined event from the sensors, the video, or the audible sound from the microphone, wearable camera 1 adds a bookmark to the video data. For example, when wearable camera 1 detects an event shown in the right column in FIG. 2, wearable camera 1 adds a bookmark to the video data as illustrated in FIG. 3.
- The police officer ends the image capturing on wearable camera 1.
- The police officer transmits the video data of wearable camera 1 to server 3 via in-vehicle system 2 and network 5.
- The police officer returns to police station A 2, and creates a report on the case with use of terminal apparatus 4.
- The police officer accesses server 3 with use of terminal apparatus 4, and plays back on terminal apparatus 4 the video data with which a report on the case is to be created.
- The police officer creates a report on the case on the basis of the video data played back on terminal apparatus 4.
- FIG. 4 illustrates a playback screen example of terminal apparatus 4 .
- Terminal apparatus 4 accepts information on the video data to be played back from the police officer.
- Terminal apparatus 4 accesses server 3 and receives from server 3 the video data corresponding to the information accepted from the police officer.
- Terminal apparatus 4 displays the video of the received video data on the display apparatus as illustrated in playback screen 4 a in FIG. 4 .
- The video data includes the bookmarks.
- Terminal apparatus 4 displays the bookmarks included in the video data as illustrated in bookmark list 4 b in FIG. 4 .
- On bookmark list 4 b, the time at which each bookmark was added to the video data and the event content of the bookmark added at that time are displayed in association with each other.
- When a bookmark in bookmark list 4 b is selected, terminal apparatus 4 starts the playback from the video location in the video data of the selected bookmark. For example, when the "conversation detection" in bookmark list 4 b is selected, terminal apparatus 4 plays back the video from the location of the bookmark of the "conversation detection", or from the time included in that bookmark. In other words, terminal apparatus 4 cues the video from the location selected in bookmark list 4 b.
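The cueing behavior of terminal apparatus 4 can be sketched as follows. The function and variable names below are illustrative assumptions; the list entries mirror the time/event pairs shown in bookmark list 4 b:

```python
def cue_playback(bookmark_list, selection, total_s):
    """Return the playback start time (seconds) for the bookmark
    selected in the bookmark list; play from 0 if no bookmark with
    the selected event content exists."""
    for time_s, event in bookmark_list:
        if event == selection:
            return min(time_s, total_s)  # never cue past the end
    return 0.0

# Time/event pairs as they might appear in bookmark list 4b.
bookmarks = [(62.0, "person detection"),
             (118.5, "conversation detection"),
             (241.0, "gunshot detection")]
print(cue_playback(bookmarks, "conversation detection", 300.0))  # 118.5
```

Selecting "gunshot detection" instead would cue the player to 241.0 seconds, matching the description that playback starts from the place of the selected bookmark.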
- As described above, wearable camera 1 detects an event, and adds a bookmark including the time at which the event is detected and the content of the detected event to the video data that is being captured.
- As a result, terminal apparatus 4 can play back the video data from the location at which the bookmark is added or from the time included in the bookmark. Therefore, the creation of the report on the case becomes easier for the police officer.
- For example, suppose the police officer selects the "conversation detection" in bookmark list 4 b in FIG. 4. Terminal apparatus 4 then plays back the video data from the location at which the bookmark of the "conversation detection" is added.
- The police officer selects the "gunshot detection" in bookmark list 4 b in FIG. 4.
- Terminal apparatus 4 then plays back the video data from the location at which the bookmark of the "gunshot detection" is added.
- In this way, the police officer can play back the video from the video location from which the police officer desires to report on the case, and the creation of the report becomes easier.
- The detected event contents may be added, erased, or changed by a server (not shown) in police station A 2.
- For example, the color in the "detection of color of clothes of person" shown in FIG. 2 may be changed by the server in police station A 2.
- The words in the "detection of predetermined words" shown in FIG. 2 may also be changed by the server in police station A 2.
- Wearable camera 1 may include location information of wearable camera 1 in the bookmark added to the video data.
- In other words, wearable camera 1 may include the information on the capturing place in the bookmark.
- Wearable camera 1 can acquire the current location information by a Global Positioning System (GPS) described below, for example.
- FIG. 5 describes an example of the addition of bookmarks.
- Wearable camera 1 collects the sound by the microphone described below, and detects a predetermined sound. When wearable camera 1 detects a predetermined sound, wearable camera 1 adds a bookmark to the video data.
- For example, when wearable camera 1 detects a "conversation" from the collected sound, wearable camera 1 adds a bookmark indicated by arrow A 11 a in FIG. 5 to the video data.
- The bookmark indicated by arrow A 11 a in FIG. 5 includes the time at which the event of the "conversation" is detected, the place, and the bookmark classification.
- The bookmark classification may be understood to be information indicating the content of the detected event.
- When wearable camera 1 detects a "gunshot" from the collected sound, wearable camera 1 adds a bookmark indicated by arrow A 11 b in FIG. 5 to the video data.
- The bookmark indicated by arrow A 11 b in FIG. 5 includes the time at which the event of the "gunshot" is detected, the place, and the bookmark classification.
- The video data may include a plurality of conversations.
- Wearable camera 1 may distinguish a plurality of conversations, and add identifiers to the bookmark classifications of the bookmarks corresponding to the conversations. For example, when three conversations are included in the video data, wearable camera 1 may add numbers to the bookmark classifications included in the three bookmarks corresponding to the three conversations as identifiers. For example, wearable camera 1 may set the bookmark classifications included in each of the three bookmarks to be “Conversation 1”, “Conversation 2”, and “Conversation 3”.
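The numbering of repeated classifications described above ("Conversation 1", "Conversation 2", "Conversation 3") can be sketched as a simple per-classification counter. The function name is an illustrative assumption:

```python
from collections import defaultdict

def label_bookmarks(events):
    """Append a running number to each bookmark classification so that
    repeated detections of the same kind get distinct identifiers."""
    counts = defaultdict(int)  # occurrences seen so far per classification
    labels = []
    for ev in events:
        counts[ev] += 1
        labels.append(f"{ev} {counts[ev]}")
    return labels

print(label_bookmarks(["Conversation", "Conversation", "Conversation"]))
# ['Conversation 1', 'Conversation 2', 'Conversation 3']
```

The same scheme covers the person detection in FIG. 7 ("Person 1", "Person 2", ...), since the counter is kept separately for each classification.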
- FIG. 6 describes an example of the addition of bookmarks.
- Wearable camera 1 detects a predetermined movement of the police officer wearing wearable camera 1 by the acceleration sensor and the gyro sensor described below. When wearable camera 1 detects a predetermined movement of the police officer, wearable camera 1 adds a bookmark to the video data.
- For example, when wearable camera 1 detects a dash of the police officer from the signal of the sensor, wearable camera 1 adds a bookmark indicated by arrow A 12 a in FIG. 6 to the video data.
- The bookmark indicated by arrow A 12 a in FIG. 6 includes the time at which the event of the "dash" is detected, the place, and the bookmark classification.
- When wearable camera 1 detects a fall of the police officer from the signal of the sensor, wearable camera 1 adds a bookmark indicated by arrow A 12 b in FIG. 6 to the video data.
- The bookmark indicated by arrow A 12 b in FIG. 6 includes the time at which the event of the "fall" is detected, the place, and the bookmark classification.
- FIG. 7 describes an example of the addition of bookmarks.
- Wearable camera 1 detects a predetermined image from the video data. For example, wearable camera 1 detects a person, a face, a vehicle, a number plate, an edged tool or a gun, an abnormal behavior, a crowd, color, the color of the clothes of a person, and the color of a vehicle included in the video data by image analysis. When wearable camera 1 detects a predetermined image from the video data, wearable camera 1 adds a bookmark to the video data.
- For example, when wearable camera 1 detects a person from the video data at a certain time, wearable camera 1 adds a bookmark indicated by arrow A 13 a in FIG. 7 to the video data.
- The bookmark indicated by arrow A 13 a in FIG. 7 includes the time at which the event of the "person" is detected, the place, and the bookmark classification.
- When wearable camera 1 detects a person from the video data at another time, wearable camera 1 adds a bookmark indicated by arrow A 13 b in FIG. 7 to the video data.
- The bookmark indicated by arrow A 13 b in FIG. 7 includes the time at which the event of the "person" is detected, the place, and the bookmark classification.
- The video data may include a plurality of different people.
- Wearable camera 1 may distinguish the plurality of different people, and add identifiers to the bookmark classifications of the bookmarks corresponding to the people. For example, when three people are included in the video data, wearable camera 1 may add numbers to the bookmark classifications included in three bookmarks corresponding to the three people as identifiers. For example, wearable camera 1 may set the bookmark classifications included in the three bookmarks to be “Person 1”, “Person 2”, and “Person 3”.
- The person illustrated in FIG. 7 at the two times is the same person, and hence the bookmarks indicated by arrows A 13 a and A 13 b include the same "Person 1" as the bookmark classification.
- FIG. 8 describes an example of the addition of bookmarks.
- Wearable camera 1 may add a bookmark to the video data in accordance with predetermined words from the police officer possessing wearable camera 1 .
- Wearable camera 1 may start recording in accordance with predetermined words from the police officer possessing wearable camera 1 .
- Wearable camera 1 may stop recording in accordance with predetermined words from the police officer possessing wearable camera 1 .
- Wearable camera 1 may perform pre-recording. When the police officer says "REC start", for example, wearable camera 1 starts recording from the pre-recorded video indicated by arrow A 14 a in FIG. 8.
- When the police officer says "Bookmark", wearable camera 1 adds a bookmark indicated by arrow A 14 b in FIG. 8 to the video data.
- The bookmark indicated by arrow A 14 b in FIG. 8 includes the time at which the event of the audible sound "Bookmark" is detected, the place, and the bookmark classification.
- When the police officer says predetermined words, wearable camera 1 stops recording the video data, as indicated by arrow A 14 c in FIG. 8.
- As described above, wearable camera 1 includes a bookmark in the video data in accordance with predetermined words spoken by the police officer.
- As a result, by saying predetermined words during the recording of the video data, the police officer can add a bookmark at the place in the video data that the police officer desires to watch when creating a report.
- Wearable camera 1 also starts and stops recording in accordance with the words spoken by the police officer. As a result, the police officer can easily start and stop recording without operating a switch of wearable camera 1.
- FIG. 9 illustrates an example of bookmarks of the face detection and the number plate detection.
- In the examples above, the bookmark includes the time, the place, and the bookmark classification, but the present invention is not limited thereto.
- For example, wearable camera 1 may include the coordinates of the face on the image and a snapshot of the face in the bookmark.
- Wearable camera 1 may also include the coordinates of the number plate on the image and a snapshot of the number plate in the bookmark.
- The bookmark of the face detection may include the time, the place, the bookmark classification, the coordinates indicating the location of the face on the image, and a snapshot of the face.
- The bookmark classification for the face detection may include identifiers for identifying faces, for example, as with the bookmark classification for the person detection in FIG. 7.
- For example, the bookmark classification for the face detection may be indicated as "face 1", "face 2", and "face 3".
- The bookmark of the number plate detection may include the time, the place, the bookmark classification, the coordinates indicating the location of the number plate on the image, and a snapshot of the number plate.
- The bookmark classification for the number plate detection may include identifiers for identifying number plates, for example, as with the bookmark classification for the person detection in FIG. 7.
- For example, the bookmark classification for the number plate detection may be indicated as "plate 1", "plate 2", and "plate 3".
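A face-detection bookmark carrying the fields described above might be represented as a small record. The class and field names below are illustrative assumptions, and the example values are invented for the sketch:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class FaceBookmark:
    time: str                          # time at which the face was detected
    place: str                         # place at which the face was detected
    classification: str                # e.g. "face 1"
    coords: Tuple[int, int, int, int]  # (x, y, width, height) of the face on the image
    snapshot: bytes                    # cropped snapshot of the detected face

bm = FaceBookmark(time="10:51:22", place="Main St.", classification="face 1",
                  coords=(120, 40, 64, 64), snapshot=b"")
print(bm.classification)  # face 1
```

A number-plate bookmark would have the same shape with plate coordinates and a plate snapshot; the coordinates are what later allow a blur to be placed at the correct screen position.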
- As a result, terminal apparatus 4 can place a blur on the face included in the video data when the video data is played back.
- Terminal apparatus 4 can also place a blur on the number plate included in the video data when the video data is played back.
- FIG. 10 illustrates a playback screen example of terminal apparatus 4 .
- The playback screen example in FIG. 10 partially omits the illustration of bookmark list 4 b, a playback button, and the like, relative to the playback screen example illustrated in FIG. 4.
- Wearable camera 1 includes the coordinates of the face in the bookmark of the face detection.
- Wearable camera 1 includes the coordinates of the number plate in the bookmark of the number plate detection.
- As a result, terminal apparatus 4 can place a blur on the face and the number plate that appear on the playback screen of the video data.
- For example, suppose the video data includes the bookmark of the face detection.
- Terminal apparatus 4 then displays the face on the screen with a blur placed on it, on the basis of the coordinates indicating the location of the face included in the bookmark of the face detection.
- Similarly, suppose the video data includes the bookmark of the number plate detection.
- Terminal apparatus 4 then displays the number plate on the screen with a blur placed on it, on the basis of the coordinates indicating the location of the number plate included in the bookmark of the number plate detection.
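The blur placed at the bookmarked coordinates can be sketched with a toy grayscale frame represented as a nested list. Averaging the rectangle is a crude stand-in for a real blur filter; the function name and frame values are illustrative assumptions:

```python
def blur_region(image, x, y, w, h):
    """Return a copy of image with the rectangle at (x, y) of size
    (w, h) replaced by its average value, i.e. a crude blur over the
    face or number plate coordinates taken from the bookmark."""
    region = [image[r][c] for r in range(y, y + h) for c in range(x, x + w)]
    avg = sum(region) // len(region)
    out = [row[:] for row in image]  # do not modify the original frame
    for r in range(y, y + h):
        for c in range(x, x + w):
            out[r][c] = avg
    return out

frame = [[0, 0, 0, 0],
         [0, 10, 20, 0],
         [0, 30, 40, 0],
         [0, 0, 0, 0]]
blurred = blur_region(frame, 1, 1, 2, 2)  # coords as stored in the bookmark
print(blurred[1][1], blurred[2][2])  # 25 25
```

Because the coordinates travel with the bookmark, terminal apparatus 4 needs no face or plate detection of its own at playback time; it only applies the blur where the bookmark says.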
- As described above, wearable camera 1 includes the coordinates of the face on the image in the bookmark of the face detection.
- Wearable camera 1 also includes the coordinates of the number plate on the image in the bookmark of the number plate detection.
- As a result, terminal apparatus 4 can place a blur on the face and the number plate when the video data is played back, and can thereby protect privacy.
- The bookmark classification in the bookmark of the face detection may include identifiers for identifying faces.
- Terminal apparatus 4 may distinguish faces on the basis of the identifiers in the bookmark classification, and place a blur on the face. For example, terminal apparatus 4 can place a blur on the face of a person other than the suspect or the criminal, and prevent the blur from being placed on the face of the suspect or the criminal in accordance with the operation by the police officer that operates terminal apparatus 4 .
- The bookmark classification in the bookmark of the number plate detection may include identifiers for identifying number plates.
- Terminal apparatus 4 may distinguish number plates on the basis of the identifiers in the bookmark classification, and place a blur on the number plates. For example, terminal apparatus 4 can place a blur on the number plate of a vehicle other than the vehicle of the suspect or the criminal, and prevent the blur from being placed on the number plate of the vehicle of the suspect or the criminal in accordance with the operation by the police officer that operates terminal apparatus 4 .
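The selective blurring described above, blur everyone except the faces the operator exempts, reduces to filtering by the identifiers in the bookmark classifications. The function name and identifier strings are illustrative assumptions:

```python
def faces_to_blur(face_ids, exempt_ids):
    """Return the face identifiers that should receive a blur: every
    detected face except those the operator marks exempt (for example,
    the suspect or the criminal), preserving detection order."""
    exempt = set(exempt_ids)
    return [fid for fid in face_ids if fid not in exempt]

# The operator exempts "Face 2" (e.g. the suspect) from blurring.
print(faces_to_blur(["Face 1", "Face 2", "Face 3"], ["Face 2"]))
# ['Face 1', 'Face 3']
```

The same filter applies unchanged to number plate identifiers, since both are just identifier lists carried in the bookmark classifications.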
- In the example above, the bookmark classification includes identifiers for identifying faces, but the present invention is not limited thereto.
- Wearable camera 1 may include identifiers for identifying faces in the bookmark separately from the bookmark classification.
- The bookmark classification does not necessarily need to include identifiers for identifying faces.
- Likewise, in the example above, the bookmark classification includes identifiers for identifying number plates, but the present invention is not limited thereto.
- Wearable camera 1 may include identifiers for identifying number plates in the bookmark separately from the bookmark classification.
- The bookmark classification does not necessarily need to include identifiers for identifying number plates.
- As with the bookmark of the face detection and the bookmark of the number plate detection, the other bookmark classifications do not necessarily need to include identifiers.
- The identifiers for identifying conversations and the identifiers for identifying people may be included in the bookmarks separately from the bookmark classifications.
- FIG. 11 illustrates an example of the upper body of the police officer wearing wearable camera 1 and biological sensor 6 .
- In FIG. 11, the same parts as those in FIG. 1 are denoted by the same reference characters.
- Wearable camera 1 is worn or held on the front part of the uniform of police officer U 1 so as to capture an image of the area ahead of police officer U 1.
- Wearable camera 1 may be fixed on the front part of the uniform in a state of being hung from the neck with a strap, for example.
- Wearable camera 1 may be fixed on the front part of the uniform by engaging an attachment (for example, an attachment clip) that is attached to a rear surface of a case of wearable camera 1 with a counterpart attachment that is attached to the front part of the uniform with each other.
- Biological sensor 6 is worn on the wrist of police officer U 1 , for example. Biological sensor 6 acquires living body information such as the heart rate, sweating, and the body temperature of police officer U 1 from the wrist of police officer U 1 . Biological sensor 6 transmits the acquired living body information to wearable camera 1 .
- Wearable camera 1 receives the living body information transmitted from biological sensor 6 . Wearable camera 1 determines whether the police officer wearing wearable camera 1 is in an excited state on the basis of the received living body information. When wearable camera 1 detects the excited state of the police officer during the recording of the video data, wearable camera 1 adds a bookmark to the video data.
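- The disclosure does not specify how the excited state is determined from the living body information. The following is a minimal illustrative sketch assuming a simple heart-rate threshold; the threshold value and the field names are hypothetical, not part of the disclosure:

```python
# Hypothetical sketch: deciding whether the police officer is in an excited
# state from living body information received from biological sensor 6.
# The threshold and the record fields are assumptions, not from the disclosure.

EXCITED_HEART_RATE_BPM = 120  # assumed threshold

def is_excited(living_body_info: dict) -> bool:
    """Return True when the living body information suggests an excited state."""
    return living_body_info.get("heart_rate_bpm", 0) >= EXCITED_HEART_RATE_BPM

# Example: a reading received over short-range wireless communication
reading = {"heart_rate_bpm": 135, "body_temperature_c": 37.1}
print(is_excited(reading))  # True for this reading
```

In a real implementation the decision would likely combine heart rate, sweating, and body temperature, but the single-signal threshold is enough to show where the bookmark-adding trigger would fire.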
- FIG. 12 illustrates an external appearance example of wearable camera 1 .
- switches 11 and 12 are disposed on the front surface of the case of wearable camera 1 .
- Switch 15 is disposed on the side surface of the case of wearable camera 1 .
- Light emitting diodes (LEDs) 16 a to 16 c are disposed on the upper surface of the case of the wearable camera.
- Wearable camera 1 starts capturing (recording) a moving image when switch 11 is short-pressed. Wearable camera 1 stops capturing (recording) the moving image when switch 11 is long-pressed.
- Wearable camera 1 captures (records) a still image in accordance with the pressing of switch 12 .
- Camera lens 13 forms an optical image of an object on an imaging surface of an imaging element.
- Microphone 14 collects the sound around wearable camera 1 .
- Wearable camera 1 communicates with external devices in accordance with the pressing of switch 15 .
- wearable camera 1 transmits information (including recorded video data) stored in a storage section described below to in-vehicle system 2 in accordance with the pressing of switch 15 .
- LEDs 16 a to 16 c indicate the state of wearable camera 1 .
- LEDs 16 a to 16 c indicate whether wearable camera 1 is recording or not.
- LEDs 16 a to 16 c indicate whether wearable camera 1 is communicating with an external device or not, for example.
- FIG. 13 illustrates a block configuration example of wearable camera 1 .
- wearable camera 1 includes controller 21 , camera 22 , gyro sensor 23 , acceleration sensor 24 , switch 25 , microphone 26 , speaker 27 , short-range communicator 28 , communicator 29 , GPS receiver 30 , and storage section 31 .
- Controller 21 controls the entirety of wearable camera 1 .
- the functions of controller 21 may be implemented by processors such as a central processing unit (CPU) and a digital signal processor (DSP), for example.
- Camera 22 includes an imaging element, and camera lens 13 illustrated in FIG. 12 .
- the imaging element is a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, for example.
- Camera 22 outputs a video signal output from the imaging element to controller 21 as a digital signal, for example.
- Controller 21 stores the digital signal output from camera 22 to storage section 31 .
- Gyro sensor 23 measures the angular velocity about three axes (an x-axis, a y-axis, and a z-axis) of a rectangular coordinate system, for example. Gyro sensor 23 outputs the measured angular velocity to controller 21 as a digital signal, for example.
- Acceleration sensor 24 measures the acceleration of the rectangular coordinate system in the direction of the three axes, for example. Acceleration sensor 24 outputs the measured acceleration to controller 21 as a digital signal, for example. Controller 21 detects the movements of the police officer wearing wearable camera 1 starting to walk, starting to run, fall, fighting, and the like from the angular velocity output from gyro sensor 23 and the acceleration output from acceleration sensor 24 .
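- The disclosure does not describe how starting to walk, starting to run, a fall, or fighting is detected from the sensor outputs. The following is a minimal illustrative sketch that classifies motion from the acceleration magnitude alone; all thresholds and category names are assumptions:

```python
import math

# Hypothetical sketch: classifying the police officer's movement from the
# three-axis acceleration measured by acceleration sensor 24. A real
# implementation would also use the angular velocity from gyro sensor 23.

def accel_magnitude(ax: float, ay: float, az: float) -> float:
    """Magnitude of the three-axis acceleration vector, in m/s^2."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def classify_motion(ax: float, ay: float, az: float) -> str:
    g = 9.8  # gravitational acceleration
    m = accel_magnitude(ax, ay, az)
    if m > 2.5 * g:
        return "fall_or_impact"  # large spike suggests a fall or fighting
    if m > 1.5 * g:
        return "running"
    if m > 1.1 * g:
        return "walking"
    return "still"

print(classify_motion(0.0, 0.0, 9.8))   # "still" (only gravity)
print(classify_motion(0.0, 0.0, 16.0))  # "running"
```

When `classify_motion` returns anything other than `"still"`, controller 21 would add the corresponding bookmark to the video data.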
- Switch 25 is an input apparatus that accepts the operation of the user.
- Switch 25 corresponds to switches 11 , 12 , and 15 illustrated in FIG. 12 .
- Switch 25 outputs information in accordance with the operation of the user to controller 21 as a digital signal, for example.
- Microphone 26 collects the sound around wearable camera 1 or the voice of the police officer wearing wearable camera 1 .
- Microphone 26 corresponds to microphone 14 illustrated in FIG. 12 .
- Microphone 26 outputs the signal of the collected sound to controller 21 as a digital signal, for example.
- Microphone 26 can be understood to be a sensor that collects sound.
- Speaker 27 converts the audible sound signal output from controller 21 into audible sound, and outputs the audible sound.
- Short-range communicator 28 performs short-range wireless communication with in-vehicle system 2 of police vehicle A 1 by Wi-Fi or Bluetooth, for example.
- Short-range communicator 28 performs wireless communication with biological sensor 6 by short-range wireless communication such as Wi-Fi or Bluetooth, for example.
- short-range communicator 28 may perform short-range wireless communication with in-vehicle system 2 via a mobile terminal such as a smartphone possessed by the police officer, for example.
- Short-range communicator 28 may perform short-range wireless communication with biological sensor 6 via a mobile terminal such as a smartphone possessed by the police officer, for example.
- Communicator 29 communicates with server 3 via network 5 .
- GPS receiver 30 receives GPS signals transmitted from a plurality of GPS transmitters. GPS receiver 30 calculates the location of wearable camera 1 on the basis of the received GPS signals. GPS receiver 30 outputs the calculated location of wearable camera 1 to controller 21 . Note that the location of wearable camera 1 may be calculated by controller 21 on the basis of the GPS signals received by GPS receiver 30 .
- Images (moving images or still images) taken by camera 22 are stored in storage section 31 .
- the images stored in storage section 31 are saved as evidence images, for example, and cannot be erased.
- In storage section 31 , a program and data to be executed by a processor may also be stored.
- Storage section 31 may be formed by a read only memory (ROM), a random access memory (RAM), a flash memory, and a hard disk drive (HDD), for example.
- the storage section that stores the video data therein and the storage section that stores the program or the data therein may be different storage sections.
- FIG. 14 illustrates a block configuration example of server 3 .
- server 3 includes CPU 41 , RAM 42 , HDD 43 , communication interface 44 , and bus 45 .
- CPU 41 functions as controller 41 a by the execution of the program.
- RAM 42 , HDD 43 , and communication interface 44 are connected to CPU 41 via bus 45 .
- In RAM 42 , an application program and an operating system (OS) program to be executed by CPU 41 are temporarily stored.
- In RAM 42 , various data necessary for the processing by CPU 41 are temporarily stored.
- In HDD 43 , an OS, an application program, and the like are stored.
- In HDD 43 , the video data of the videos captured by wearable camera 1 worn by the user and by the in-vehicle camera installed in police vehicle A 1 is also stored.
- Communication interface 44 communicates with in-vehicle system 2 and terminal apparatus 4 via network 5 .
- FIG. 15 illustrates a block configuration example of terminal apparatus 4 .
- terminal apparatus 4 includes CPU 51 , RAM 52 , HDD 53 , communication interface 54 , user interface 55 , and bus 56 .
- CPU 51 functions as controller 51 a by the execution of the program.
- CPU 51 is connected to RAM 52 , HDD 53 , communication interface 54 , and user interface 55 via bus 56 .
- In RAM 52 , an application program and an OS program to be executed by CPU 51 are temporarily stored.
- In RAM 52 , various data necessary for the processing by CPU 51 are temporarily stored.
- In HDD 53 , an OS, an application program, and the like are stored.
- Communication interface 54 communicates with server 3 and in-vehicle system 2 of police vehicle A 1 via network 5 .
- a keyboard apparatus and a display are connected to user interface 55 .
- CPU 51 exchanges data with the keyboard apparatus, the display, and the like via user interface 55 .
- FIG. 16 is a flowchart illustrating an operation example of wearable camera 1 .
- FIG. 16 illustrates the operation example of wearable camera 1 from when the recording starts to when the recording stops.
- Controller 21 of wearable camera 1 performs pre-recording for a certain amount of time.
- Controller 21 of wearable camera 1 starts recording in accordance with the operation of switch 11 by the police officer wearing wearable camera 1 (Step S 1 ). Controller 21 of wearable camera 1 starts recording after going back by a certain amount of time. Note that controller 21 of wearable camera 1 may start recording in accordance with the voice of the police officer wearing wearable camera 1 .
- Microphone 26 of wearable camera 1 collects special sound.
- the special sound is a gunshot, for example.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 2 ).
- Controller 21 of wearable camera 1 includes the time at which the special sound is detected, the place, and the bookmark classification in the bookmark.
- Gyro sensor 23 and acceleration sensor 24 of wearable camera 1 measure the action (movement) of the police officer.
- the action of the police officer is a dash, for example.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 3 ).
- Controller 21 of wearable camera 1 includes the time at which the dash of the police officer is detected, the place, and the bookmark classification in the bookmark.
- controller 21 of wearable camera 1 may detect a face, a number plate, and color by monitoring the video data captured by camera 22 at a certain interval. Controller 21 of wearable camera 1 may detect a conversation by monitoring the sound collected by microphone 26 at a certain interval.
- Microphone 26 of wearable camera 1 collects the sound of the conversation of people.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 4 ).
- Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark.
- Camera 22 of wearable camera 1 takes an image of the face and the number plate.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 5 ).
- Controller 21 of wearable camera 1 includes the time at which the face is detected, the place, the bookmark classification, the identifier for identifying a face, the coordinates of the face, and a snapshot of the face in the bookmark of the face detection.
- Controller 21 of wearable camera 1 includes the time at which the number plate is detected, the place, the bookmark classification, the identifier for identifying a number plate, the coordinates of the number plate, and a snapshot of the number plate in the bookmark of the number plate detection.
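- The bookmark contents listed above (time, place, bookmark classification, identifier, coordinates, and snapshot) can be modeled as a small record. The following sketch is illustrative only; the field names and types are assumptions, as the disclosure does not define a storage format:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Bookmark:
    """Illustrative bookmark record; all field names are assumptions."""
    time: str                 # time at which the event was detected
    place: str                # location, e.g., a GPS fix from GPS receiver 30
    classification: str       # e.g., "face", "number_plate", "sound", "conversation"
    identifier: Optional[int] = None  # distinguishes one face / plate from another
    coordinates: Optional[Tuple[int, int, int, int]] = None  # x, y, width, height
    snapshot: Optional[bytes] = None  # cropped image of the face / number plate

# A face-detection bookmark as described for Step S5
face_bm = Bookmark(
    time="2019-05-08T10:15:00",
    place="35.6895,139.6917",
    classification="face",
    identifier=1,
    coordinates=(120, 80, 64, 64),
)
print(face_bm.classification)  # "face"
```

A sound- or conversation-detection bookmark would leave `identifier`, `coordinates`, and `snapshot` unset, matching the description that those bookmarks carry only the time, the place, and the bookmark classification.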
- In Step S 5 , camera 22 of wearable camera 1 takes images of both the face and the number plate, but camera 22 may take an image of either one of the face and the number plate.
- In that case, controller 21 of wearable camera 1 adds the bookmark of whichever one of the face and the number plate is taken by camera 22 of wearable camera 1 to the video data.
- Camera 22 of wearable camera 1 takes an image of a building and the like.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 6 ).
- Controller 21 of wearable camera 1 includes the time at which the color is detected, the place, and the bookmark classification in the bookmark of the color detection.
- Microphone 26 of wearable camera 1 collects the sound of the conversation of people.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 7 ).
- Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark.
- When a face and a number plate are detected from the video data captured by camera 22 , controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 8 ).
- controller 21 of wearable camera 1 includes the same identifier as that in Step S 5 in the bookmark.
- controller 21 of wearable camera 1 includes an identifier different from that in Step S 5 in the bookmark.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 9 ).
- Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark.
- Controller 21 of wearable camera 1 stops recording in accordance with the operation of switch 11 by the police officer wearing wearable camera 1 (Step S 10 ). Note that controller 21 of wearable camera 1 may stop recording in accordance with the voice of the police officer wearing wearable camera 1 .
- FIG. 17 is a flowchart illustrating an operation example of blur processing of terminal apparatus 4 .
- Controller 51 a of terminal apparatus 4 reads a target video (video data) from server 3 in accordance with the operation by the police officer creating a report, for example (Step S 21 ). At this time, controller 51 a of terminal apparatus 4 reads the bookmarks added to the video data.
- Controller 51 a of terminal apparatus 4 extracts the bookmark of the face detection and the bookmark of the number plate detection out of the bookmarks read in Step S 21 (Step S 22 ).
- Controller 51 a of terminal apparatus 4 starts the blur processing (Step S 23 ).
- Controller 51 a of terminal apparatus 4 acquires the coordinates of the face and the coordinates of the number plate from the bookmark of the face detection and the bookmark of the number plate detection extracted in Step S 22 , and specifies the location of the face and the location of the number plate on the image on the basis of the acquired coordinates (Step S 24 ). At this time, controller 51 a of terminal apparatus 4 may distinguish the faces included in the image on the basis of the identifiers included in the bookmark of the face detection. Controller 51 a of terminal apparatus 4 may distinguish the number plates included in the image on the basis of the identifiers included in the bookmark of the number plate detection. As a result, controller 51 a of terminal apparatus 4 can place a blur on faces other than the face of the suspect or the criminal, for example. Controller 51 a of terminal apparatus 4 can place a blur on the number plates of vehicles other than the number plate of the vehicle of the suspect or the criminal, for example.
- Controller 51 a of terminal apparatus 4 places a blur on the locations of the face and the number plate specified in Step S 24 (Step S 25 ).
- Controller 51 a of terminal apparatus 4 performs blur processing of the face and the number plate on the basis of the remaining bookmarks of the face detection and bookmarks of the number plate detection extracted in Step S 22 (Steps S 26 , S 27 , . . . , S 28 , and S 29 ).
- Controller 51 a of terminal apparatus 4 ends the blur processing (Step S 30 ).
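- The blur processing in Steps S 23 to S 30 can be sketched as follows. The mean filter below is an assumed stand-in for the unspecified blur algorithm, and the frame and bookmark representations are hypothetical; a real implementation would operate on decoded video frames with an image processing library:

```python
# Hypothetical sketch of the blur processing: for each face / number plate
# bookmark, replace the pixels inside the bookmarked rectangle with the mean
# of that rectangle. Bookmarks whose identifier is in keep_identifiers (e.g.,
# the suspect's face or vehicle) are left unblurred, as described above.

def blur_region(frame, x, y, w, h):
    """Blur the rectangle (x, y, w, h) of a 2-D grayscale frame in place."""
    pixels = [frame[r][c] for r in range(y, y + h) for c in range(x, x + w)]
    mean = sum(pixels) // len(pixels)
    for r in range(y, y + h):
        for c in range(x, x + w):
            frame[r][c] = mean

def apply_blur(frame, bookmarks, keep_identifiers=()):
    """Blur every bookmarked region except those whose identifier is kept."""
    for bm in bookmarks:
        if bm["identifier"] in keep_identifiers:
            continue
        x, y, w, h = bm["coordinates"]
        blur_region(frame, x, y, w, h)

frame = [[(r * 4 + c) for c in range(4)] for r in range(4)]  # tiny 4x4 "frame"
bookmarks = [{"identifier": 1, "coordinates": (0, 0, 2, 2)},
             {"identifier": 2, "coordinates": (2, 2, 2, 2)}]
apply_blur(frame, bookmarks, keep_identifiers={1})  # identifier 1 is the suspect
print(frame[2][2] == frame[3][3])  # True: the blurred region is now uniform
```

Because the coordinates come straight from the bookmarks, the terminal apparatus does not need to re-run face or number-plate detection during the blur processing.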
- Wearable camera 1 includes camera 22 , storage section 31 that stores therein the video data of the video captured by camera 22 , and controller 21 that adds a bookmark to the video data when an event is detected from a signal of a sensor or from the video.
- The sensor may be biological sensor 6 , gyro sensor 23 , acceleration sensor 24 , or microphone 26 , for example.
- The bookmark added to the video data is a signal indicating a place (event) from which terminal apparatus 4 can start playback of the video data.
- Terminal apparatus 4 receives the video data of the video for which a report is to be created from server 3 in accordance with the operation by the police officer creating a report on the case.
- Terminal apparatus 4 can display events on the display apparatus from the bookmarks added to the video data received from server 3 as illustrated in bookmark list 4 b in FIG. 4 , for example.
- When an event displayed in bookmark list 4 b in FIG. 4 is selected by the police officer, for example, terminal apparatus 4 can play back the video data from the selected event. Therefore, the police officer creating a report can play back the video data from the place (event) necessary for the report creation, and the report creation becomes easier.
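- Starting playback from a selected event amounts to seeking to the offset of the bookmark time from the recording start. The following sketch is illustrative; the timestamp format and the helper name are assumptions:

```python
from datetime import datetime

# Hypothetical sketch: compute the playback start offset (in seconds)
# for a bookmark selected from the bookmark list.

def seek_offset_seconds(recording_start: str, bookmark_time: str) -> float:
    fmt = "%Y-%m-%dT%H:%M:%S"
    start = datetime.strptime(recording_start, fmt)
    event = datetime.strptime(bookmark_time, fmt)
    return (event - start).total_seconds()

offset = seek_offset_seconds("2019-05-08T10:00:00", "2019-05-08T10:15:30")
print(offset)  # 930.0 -- the player would seek 15 min 30 s into the video
```

The terminal apparatus would pass this offset to its video player so that playback begins at the bookmarked event rather than at the start of the recording.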
- the in-vehicle camera installed in police vehicle A 1 may also add a bookmark to the video data as with wearable camera 1 .
- The in-vehicle camera may include a block configuration similar to the block configuration illustrated in FIG. 13 , for example. However, the block configuration of the in-vehicle camera does not necessarily need to include gyro sensor 23 and acceleration sensor 24 .
- terminal apparatus 4 places a blur on the face and the number plate included in the video, but the present invention is not limited thereto.
- Server 3 may place a blur on the face and the number plate included in the video.
- Server 3 may transmit the video data on which a blur is placed to terminal apparatus 4 .
- the police officer possesses wearable camera 1 , but the present invention is not limited thereto.
- A security guard, for example, may possess wearable camera 1 .
- terminal apparatus 4 may display the location information (for example, the address) included in the selected bookmark on the display apparatus. As a result, the police officer can write where the video is captured in the report.
- controller 21 of wearable camera 1 detects an event from the signal of biological sensor 6 , the signal of gyro sensor 23 and acceleration sensor 24 , the signal of microphone 26 , and the video of camera 22 , but the present invention is not limited thereto. Controller 21 of wearable camera 1 may detect an event from at least one of the signal of biological sensor 6 , the signal of gyro sensor 23 and acceleration sensor 24 , the signal of microphone 26 , and the video of camera 22 . For example, controller 21 of wearable camera 1 may detect an event from two, that is, the signal (sound) of microphone 26 and the video of camera 22 .
- wearable camera 1 monitors the video data captured by camera 22 at a certain interval, and detects a face, a number plate, or color. Then, wearable camera 1 adds a bookmark to the video data each time a face, a number plate, or color is detected.
- wearable camera 1 adds a bookmark to the video data when the detection of a face, a number plate, or color is started, and when a face, a number plate, or color is no longer detected.
- wearable camera 1 adds a bookmark to the video data when a face, a number plate, or color enters a video range (capture range), and when the face, the number plate, or the color in the video range exits the video range.
- wearable camera 1 monitors the sound collected by microphone 26 at a certain interval, and detects a conversation. Then, wearable camera 1 adds a bookmark to the video data each time a conversation is detected.
- wearable camera 1 adds a bookmark to the video data when the conversation detection is started, and when the conversation is no longer detected.
- wearable camera 1 adds a bookmark to the video data when a conversation starts, and when the conversation ends. Parts different from those in Embodiment 1 are described below.
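- The behavior described above, in which bookmarks are added only when detection starts and when detection ends, corresponds to edge triggering on the detection state. The following illustrative sketch assumes a per-interval boolean detection flag (the detector itself is a stand-in for the face, number plate, color, or conversation detection):

```python
# Hypothetical sketch of Embodiment 2: emit a bookmark only on the transitions
# of the detection state (enter / exit of the video range), instead of one
# bookmark per monitoring interval as in Embodiment 1.

def edge_bookmarks(detections):
    """detections: per-interval booleans (True = face/plate/color detected).
    Returns (interval_index, event) pairs for the start and end transitions."""
    bookmarks = []
    previous = False
    for i, detected in enumerate(detections):
        if detected and not previous:
            bookmarks.append((i, "detection_start"))
        elif previous and not detected:
            bookmarks.append((i, "detection_end"))
        previous = detected
    return bookmarks

# A face enters the video range at interval 2 and exits at interval 5:
print(edge_bookmarks([False, False, True, True, True, False, False]))
# [(2, 'detection_start'), (5, 'detection_end')]
```

Only two bookmarks are emitted for the whole detection span, which is how Embodiment 2 reduces the processing load relative to interval-by-interval bookmarking.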
- FIG. 18 is a flowchart illustrating an operation example of wearable camera 1 according to Embodiment 2.
- FIG. 18 illustrates an operation example of wearable camera 1 from when the recording starts to when the recording stops.
- Controller 21 of wearable camera 1 performs pre-recording for a certain amount of time.
- The processing in Steps S 41 to S 43 in FIG. 18 is similar to the processing in Steps S 1 to S 3 illustrated in FIG. 16 , and the description thereof is omitted.
- Controller 21 of wearable camera 1 may detect a face, a number plate, and color by monitoring the video data captured by camera 22 at a certain interval. Controller 21 of wearable camera 1 may detect a conversation by monitoring the sound collected by microphone 26 at a certain interval.
- Microphone 26 of wearable camera 1 collects the sound of the conversation of people. Controller 21 of wearable camera 1 detects the start of the conversation collected by microphone 26 , and adds a bookmark to the video data (Step S 44 ). Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark.
- Camera 22 of wearable camera 1 takes an image of the face and the number plate.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 45 ).
- Controller 21 of wearable camera 1 includes the time at which the face is detected, the place, the bookmark classification, the identifier for identifying the face, the coordinates of the face, and a snapshot of the face in the bookmark of the face detection.
- Controller 21 of wearable camera 1 includes the time at which the number plate is detected, the place, the bookmark classification, the identifier for identifying the number plate, the coordinates of the number plate, and a snapshot of the number plate in the bookmark of the number plate detection.
- controller 21 of wearable camera 1 does not add the bookmarks for the face and the number plate detected in Step S 45 to the video data until the face and the number plate detected in Step S 45 exit the video range. Meanwhile, when a face and a number plate that are different from the face and the number plate detected in Step S 45 are detected, controller 21 of wearable camera 1 adds bookmarks that are different from those for the face and the number plate detected in Step S 45 to the video data.
- Camera 22 of wearable camera 1 takes an image of a building and the like.
- controller 21 of wearable camera 1 adds a bookmark to the video data (Step S 46 ).
- Controller 21 of wearable camera 1 includes the time at which the color is detected, the place, and the bookmark classification in the bookmark of the color detection.
- Controller 21 of wearable camera 1 does not add the bookmark for the color detected in Step S 46 to the video data until the color detected in Step S 46 exits the video range. Meanwhile, when a color of a building that is different from the color of the building and the like detected in Step S 46 is detected, controller 21 of wearable camera 1 adds a bookmark different from that for the color detected in Step S 46 to the video data.
- Controller 21 of wearable camera 1 detects the end of the conversation detected in Step S 44 (Step S 47 ). Controller 21 of wearable camera 1 includes the time at which the end of the conversation is detected, the place, and the bookmark classification in the bookmark.
- Controller 21 of wearable camera 1 detects the end of the image capturing of the face and the number plate detected in Step S 45 (Step S 48 ). Controller 21 of wearable camera 1 includes the time at which the detection of the face and the number plate has ended, the place, and the bookmark classification in the bookmark.
- Controller 21 of wearable camera 1 detects the end of the image capturing of the color detected in Step S 46 (Step S 49 ). Controller 21 of wearable camera 1 includes the time at which the color detection has ended, the place, and the bookmark classification in the bookmark.
- Step S 50 in FIG. 18 is similar to the processing in Step S 10 illustrated in FIG. 16 , and the description thereof is omitted.
- controller 21 of wearable camera 1 adds a bookmark to the video data when a face, a number plate, or color enters the video range, and when the face, the number plate, or the color included in the video range exits the video range.
- Wearable camera 1 adds a bookmark to the video data when a conversation starts and when the conversation ends. As a result, controller 21 of wearable camera 1 does not need to add bookmarks to the video data at a certain interval, and the processing load can be reduced.
- Expressions such as “ . . . section”, “ . . . or”, and “ . . . er” used in the components may be replaced with other expressions such as “ . . . circuitry”, “ . . . device”, “ . . . unit”, or “ . . . module”.
- The function blocks used in the description of the embodiments described above may be partially or entirely implemented as an LSI, which is an integrated circuit, and each of the processes described in the embodiments described above may be partially or entirely controlled by one LSI or a combination of LSIs.
- the LSI may be formed by individual chips, or may be formed by one chip so as to include a part or all of the function blocks.
- the LSI may include input and output of data.
- the LSI may be called an IC, a system LSI, a super LSI, and an ultra LSI in accordance with the difference in the degree of integration.
- the method of forming an integrated circuit is not limited to the LSI, and may be implemented by a dedicated circuit, a general purpose processor, or a dedicated processor.
- An FPGA that is programmable and a reconfigurable processor capable of reconfiguring the connection and the setting of the circuit cell in the LSI may be used after manufacturing the LSI.
- This disclosure may be implemented as digital processing or analog processing.
- the function blocks may be naturally integrated with use of the technology.
- the application of the biotechnology and the like is possible.
- This disclosure is useful in a wearable camera that records a video.
Abstract
Description
- This application is entitled to and claims the benefit of Japanese Patent Application No. 2019-088153, filed on May 8, 2019, the disclosure of which, including the specification, drawings, and abstract, is incorporated herein by reference in its entirety.
- This disclosure relates to a wearable camera, a video playback system, and a video playback method.
- In recent years, wearable cameras have been introduced in order to support the work of police officers, security guards, or the like (for example, see PTL 1). The wearable camera is attached on the body of the police officer, the security guard, or the like or the clothes worn by them, and captures a video of a scene and the like.
- The police officer, the security guard, or the like may play back, for example, the video captured (recorded) by the wearable camera on a terminal apparatus after returning to the police station or the office, and create a report on the case while watching the played back video.
-
PTL 1 - Japanese Patent Application Laid-Open No. 2016-181767
- The video data captured by the wearable camera may include videos that are not relevant to the case (report), for example. Creating the report while watching a video that still includes such irrelevant videos is time consuming, which is a problem.
- A non-limiting example of this disclosure contributes to the provision of a wearable camera and a signal adding method capable of simplifying the report creation.
- A wearable camera according to one aspect of the present disclosure includes: a storage section that stores therein video data of a captured moving image; and a controller that adds a bookmark signal to the video data when an event included in the video data is detected, in which the bookmark signal indicates a location at which playback of the video data starts, when the video data is played back.
- A video playback system according to one aspect of the present disclosure includes: a wearable camera; and a display apparatus that plays back video data of a moving image acquired by the wearable camera, in which: the wearable camera includes: a storage section that stores therein video data of a captured moving image; and a controller that adds a bookmark signal to the video data when an event included in the video data is detected, in which the bookmark signal indicates a location at which playback of the video data starts, when the video data is played back; and the display apparatus starts playback of the video data from a location on the video data indicated by the bookmark signal.
- A video playback method according to one aspect of the present disclosure is a method for a video playback system including a wearable camera and a display apparatus that plays back video data of a moving image acquired by the wearable camera, the video playback method including: storing, by the wearable camera, video data of a captured moving image; adding, by the wearable camera, a bookmark signal to the video data when an event included in the video data is detected, the bookmark signal indicating a location at which playback of the video data starts, when the video data is played back; and starting, by the display apparatus, playback of the video data from a location on the video data indicated by the bookmark signal.
- Note that inclusive or specific aspects above may be implemented by a system, an apparatus, a method, an integrated circuit, a computer program, or a recording medium, and may be implemented by a freely-selected combination of the system, the apparatus, the method, the integrated circuit, the computer program, and the recording medium.
- According to one aspect of this disclosure, the report creation can be simplified.
- Further advantages and effects in one example of this disclosure are clarified from the description and the accompanying drawings. Those advantages and/or effects are provided by a number of embodiments and features described in the description and the accompanying drawings, but not all necessarily need to be provided in order to obtain one or more of the same features.
-
FIG. 1 illustrates a configuration example of a wearable camera system according to Embodiment 1; -
FIG. 2 describes an example of bookmarks added to video data of a wearable camera; -
FIG. 3 describes an example of the addition of bookmarks; -
FIG. 4 illustrates a playback screen example of a terminal apparatus; -
FIG. 5 describes an example of the addition of bookmarks; -
FIG. 6 describes an example of the addition of bookmarks; -
FIG. 7 describes an example of the addition of bookmarks; -
FIG. 8 describes an example of the addition of bookmarks; -
FIG. 9 illustrates an example of bookmarks of the face detection and the number plate detection; -
FIG. 10 illustrates a playback screen example of the terminal apparatus; -
FIG. 11 illustrates an example of an upper body of a police officer wearing the wearable camera and a biological sensor; -
FIG. 12 illustrates an external appearance example of the wearable camera; -
FIG. 13 illustrates a block configuration example of the wearable camera; -
FIG. 14 illustrates a block configuration example of a server; -
FIG. 15 illustrates a block configuration example of the terminal apparatus; -
FIG. 16 is a flowchart illustrating an operation example of the wearable camera; -
FIG. 17 is a flowchart illustrating an operation example of blur processing of the terminal apparatus; and -
FIG. 18 is a flowchart illustrating an operation example of a wearable camera according to Embodiment 2. - Embodiments of the present invention are described in detail below with reference to the accompanying drawings, as appropriate. Unnecessarily detailed descriptions may be omitted. For example, detailed descriptions of features that are already well known or overlapping descriptions for configurations that are substantially the same may be omitted. This is for preventing the description below from becoming unnecessarily redundant, and facilitating the understanding of a person skilled in the art.
- Note that the accompanying drawings and the description below are provided so that a person skilled in the art would sufficiently understand this disclosure, and it is not intended to thereby limit the subject matter described in the appended claims.
-
FIG. 1 illustrates a configuration example of a wearable camera system according to Embodiment 1. As illustrated in FIG. 1, the wearable camera system includes wearable camera 1, in-vehicle system 2, server 3, and terminal apparatus 4. - In-vehicle system 2, server 3, and terminal apparatus 4 are connected to each other via network 5. Wearable camera 1 is connected to server 3 and terminal apparatus 4 via in-vehicle system 2 and network 5. Network 5 may include networks such as the Internet and a wireless communication network of a mobile phone and the like, for example. -
Wearable camera 1 is worn on or possessed by a police officer, for example (for example, see FIG. 11). Wearable camera 1 communicates with in-vehicle system 2 by short-range wireless communication such as Wi-Fi (R) or Bluetooth (R), for example. - In-vehicle system 2 is installed in police vehicle A1, for example. In-vehicle system 2 includes an in-vehicle camera (not shown), a control apparatus (not shown) such as a personal computer, and a communication apparatus (not shown), for example. - In-vehicle system 2 receives video data that is captured by wearable camera 1 from wearable camera 1, for example. In-vehicle system 2 transmits the video data received from wearable camera 1 to server 3 via network 5. In-vehicle system 2 transmits the video data captured by the in-vehicle camera to server 3 via network 5. -
Server 3 stores the video data captured by wearable camera 1 and the video data captured by the in-vehicle camera of in-vehicle system 2 therein. Server 3 stores a report created by terminal apparatus 4 and the like therein. - Terminal apparatus 4 is used by a police officer in police station A2, for example. Terminal apparatus 4 accesses
server 3 in accordance with the operation by the police officer, and displays the video data stored in server 3 on the display apparatus. Terminal apparatus 4 creates a report relating to the case and the like, for example, in accordance with the operation by the police officer. Terminal apparatus 4 transmits the created report to server 3 via network 5. - Note that
wearable camera 1 is connected to server 3 and terminal apparatus 4 via in-vehicle system 2 and network 5, but the present invention is not limited thereto. Wearable camera 1 may be connected to server 3 and terminal apparatus 4 via network 5 and not via in-vehicle system 2. -
FIG. 2 describes an example of bookmarks added to the video data of wearable camera 1. Wearable camera 1 detects a predetermined event. For example, wearable camera 1 detects an event shown in the right column in FIG. 2. When wearable camera 1 detects an event shown in the right column in FIG. 2, wearable camera 1 adds a bookmark signal (hereinafter may be referred to as a bookmark) including the detected event content to the video data that is being captured (for example, see FIG. 3). - For example,
wearable camera 1 detects a dash, a fall, or a fight with a suspect by the police officer wearing wearable camera 1, by a gyro sensor and an acceleration sensor described below. When wearable camera 1 detects the dash, the fall, or the fight with the suspect, wearable camera 1 adds a bookmark indicating that the police officer has dashed, fallen, or fought with the suspect to the video data. -
Wearable camera 1 detects the excited state of the police officer wearing wearable camera 1 by a biological sensor described below. When wearable camera 1 detects the excited state of the police officer, wearable camera 1 adds a bookmark indicating that the police officer has entered an excited state to the video data. - When a predetermined image is included in the video data that is being captured,
wearable camera 1 detects that a predetermined image is included in the video data. When wearable camera 1 detects that a predetermined image is included in the video data, wearable camera 1 adds a bookmark indicating that a predetermined image is included to the video data. As shown in the image detection in FIG. 2, the predetermined images to be detected include a person, a face, a vehicle, a number plate, an edged tool or a gun, abnormal behavior of a person, a crowd, a color, the color of the clothes of a person, and the color of a vehicle, for example. The color may be a color other than the color of the clothes of a person or of a vehicle, and may be the color of a building and the like, for example. - When a conversation or predetermined words are included in the sound collected by a microphone described below,
wearable camera 1 detects that a conversation or predetermined words are included in the collected sound. When wearable camera 1 detects that a conversation or predetermined words are included in the collected sound, wearable camera 1 adds a bookmark indicating that a conversation or predetermined words are included to the video data. - When a gunshot or an explosion sound is included in the sound collected by the microphone described below,
wearable camera 1 detects that a gunshot or an explosion sound is included in the collected sound. When wearable camera 1 detects that a gunshot or an explosion sound is included in the collected sound, wearable camera 1 adds a bookmark indicating that a gunshot or an explosion sound is included to the video data. - Note that the events detected by
wearable camera 1 may be classified into the detection of an action of the police officer, the living body detection, the image detection, the audible sound detection, and the special sound detection, for example, as shown in the left column in FIG. 2. - The bookmark added to the video data may be referred to as attribute information, a tag, or metadata.
- The event may be understood to be an event relevant to the case. For example, when the suspect suddenly starts to run, the police officer starts to run in order to chase the suspect. Therefore, the “dash detection” in
FIG. 2 can be said to be an event relevant to the case. -
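The bookmark signal described above can be pictured as a small record attached to the video data that is being captured. The following Python sketch is for illustration only; the field names (`time`, `place`, `classification`) and the list-based container are assumptions made for explanation, not the format actually used by wearable camera 1:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Bookmark:
    # Time (seconds from the start of recording) at which the event was detected
    time: float
    # Capturing place, e.g. latitude/longitude acquired from a GPS receiver
    place: Tuple[float, float]
    # Content of the detected event, e.g. "person detection", "dash detection"
    classification: str

@dataclass
class VideoData:
    frames: list = field(default_factory=list)
    bookmarks: List[Bookmark] = field(default_factory=list)

    def add_bookmark(self, time, place, classification):
        # Add a bookmark signal to the video data that is being captured
        self.bookmarks.append(Bookmark(time, place, classification))

video = VideoData()
video.add_bookmark(1.0, (35.68, 139.76), "person detection")
video.add_bookmark(2.5, (35.68, 139.76), "dash detection")
```

A playback terminal can then list these records and jump to the stored times, which is the behavior described for terminal apparatus 4 below.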
FIG. 3 describes an example of the addition of bookmarks. FIG. 3 illustrates some frames of the video data captured by wearable camera 1. The horizontal axis in FIG. 3 indicates time. - At time t1,
wearable camera 1 detects a person from the video data that is being captured. In this case, a bookmark including time t1 and an event indicating that the person is detected is added to the video data. - At time t2,
wearable camera 1 detects a dash of the police officer wearing wearable camera 1 from the acceleration sensor and the gyro sensor described below. In this case, a bookmark including time t2 and an event indicating that the dash is detected is added to the video data. - At time t3,
wearable camera 1 detects a conversation from the audible sound in the video data. In this case, a bookmark including time t3 and an event indicating that a conversation is detected is added to the video data. - At time t4,
wearable camera 1 detects a gunshot from the audible sound in the video data. In this case, a bookmark including time t4 and an event indicating that a gunshot is detected is added to the video data. - Note that the bookmark may include the place in which the event is detected and the like, besides the time and the information indicating the event content, as described in
FIG. 5 to FIG. 9. - The operation example in
FIG. 1 is described. The police officer starts image capturing on wearable camera 1. When wearable camera 1 detects a predetermined event from the sensors, the video, or the audible sound from the microphone, wearable camera 1 adds a bookmark to the video data. For example, when wearable camera 1 detects an event shown in the right column in FIG. 2, wearable camera 1 adds a bookmark to the video data as illustrated in FIG. 3. - The police officer ends the image capturing on
wearable camera 1. The police officer transmits the video data of wearable camera 1 to server 3 via in-vehicle system 2 and network 5. - The police officer returns to police station A2, and creates a report on the case with use of terminal apparatus 4. For example, the police officer accesses
server 3 with use of terminal apparatus 4, and plays back, on terminal apparatus 4, the video data with which a report on the case is to be created. The police officer creates a report on the case on the basis of the video data played back on terminal apparatus 4. -
FIG. 4 illustrates a playback screen example of terminal apparatus 4. Terminal apparatus 4 accepts information on the video data to be played back from the police officer. Terminal apparatus 4 accesses server 3 and receives the video data corresponding to the information accepted from the police officer. Terminal apparatus 4 displays the video of the received video data on the display apparatus as illustrated in playback screen 4 a in FIG. 4. - The video data includes the bookmarks. Terminal apparatus 4 displays the bookmarks included in the video data as illustrated in
bookmark list 4 b in FIG. 4. - On
bookmark list 4 b, the time at which the bookmark is added to the video data and the event content of the bookmark added at that time are displayed in association with each other. When a bookmark displayed on bookmark list 4 b is selected, terminal apparatus 4 starts the playback from the place in the video data of the selected bookmark. For example, when the “conversation detection” in bookmark list 4 b is selected, terminal apparatus 4 plays back the video from the place of the bookmark of the “conversation detection” or the time included in the bookmark of the “conversation detection”. In other words, terminal apparatus 4 cues the video from the place selected in bookmark list 4 b. - As described above,
wearable camera 1 detects an event, and adds a bookmark including the time at which the event is detected and the content of the detected event to the video data that is being captured. As a result, terminal apparatus 4 can play back the video data from the place at which the bookmark is added or the time included in the bookmark. Therefore, the creation of the report on the case becomes easier for the police officer. - For example, when the police officer writes the content of conversation with the suspect in the report, the police officer selects the “conversation detection” in
bookmark list 4 b in FIG. 4. Terminal apparatus 4 plays back the video data from the place at which the bookmark of the “conversation detection” is added. When the police officer writes the situation in which a gunshot has occurred in the report, the police officer selects the “gunshot detection” in bookmark list 4 b in FIG. 4. Terminal apparatus 4 plays back the video data from the place at which the bookmark of the “gunshot detection” is added. As a result, the police officer can play back the video from the place from which the police officer desires to report the case, and the creation of the report becomes easier. - Note that, in
wearable camera 1, the detected event content may be added, erased, and changed by a server (not shown) in police station A2. For example, the color in the “detection of color of clothes of person” shown in FIG. 2 may be changed by the server in police station A2. The words in the “detection of predetermined words” shown in FIG. 2 may be changed by the server in police station A2. - Another example of the addition of the bookmarks by
wearable camera 1 is described. Wearable camera 1 may include location information of wearable camera 1 in the bookmark added to the video data. In other words, wearable camera 1 may include the information on the capturing place in the bookmark. Wearable camera 1 can acquire the current location information by a Global Positioning System (GPS) receiver described below, for example. -
FIG. 5 describes an example of the addition of bookmarks. Wearable camera 1 collects the sound by the microphone described below, and detects a predetermined sound. When wearable camera 1 detects a predetermined sound, wearable camera 1 adds a bookmark to the video data. - For example, when
wearable camera 1 detects “conversation” from the collected sound, wearable camera 1 adds a bookmark indicated by arrow A11 a in FIG. 5 to the video data. The bookmark indicated by arrow A11 a in FIG. 5 includes the time at which the event of “conversation” is detected, the place, and the bookmark classification. The bookmark classification may be understood to be information indicating the content of the detected event. - For example, when
wearable camera 1 detects “gunshot” from the collected sound, wearable camera 1 adds a bookmark indicated by arrow A11 b in FIG. 5 to the video data. The bookmark indicated by arrow A11 b in FIG. 5 includes the time at which the event of “gunshot” is detected, the place, and the bookmark classification. - Note that the video data may include a plurality of conversations.
Wearable camera 1 may distinguish a plurality of conversations, and add identifiers to the bookmark classifications of the bookmarks corresponding to the conversations. For example, when three conversations are included in the video data, wearable camera 1 may add numbers as identifiers to the bookmark classifications included in the three bookmarks corresponding to the three conversations. For example, wearable camera 1 may set the bookmark classifications included in each of the three bookmarks to be “Conversation 1”, “Conversation 2”, and “Conversation 3”. -
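The numbering of distinguished conversations (and, later in this description, of people, faces, and number plates) can be sketched as a running counter per event kind. The function name and the labels below are illustrative assumptions:

```python
from collections import defaultdict

def assign_identifiers(event_kinds):
    # Append a running number to each event kind, e.g. "Conversation 1",
    # "Conversation 2", so bookmarks for distinct occurrences can be told apart
    counters = defaultdict(int)
    labeled = []
    for kind in event_kinds:
        counters[kind] += 1
        labeled.append(f"{kind} {counters[kind]}")
    return labeled

labels = assign_identifiers(["Conversation", "Conversation", "Person", "Conversation"])
# labels == ["Conversation 1", "Conversation 2", "Person 1", "Conversation 3"]
```

In a real detector the counter would only advance when the camera decides that a newly detected conversation or person is different from those already seen.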
FIG. 6 describes an example of the addition of bookmarks. Wearable camera 1 detects a predetermined movement of the police officer wearing wearable camera 1 by the acceleration sensor and the gyro sensor described below. When wearable camera 1 detects a predetermined movement of the police officer, wearable camera 1 adds a bookmark to the video data. - For example, when
wearable camera 1 detects a dash of the police officer from the signal of the sensor, wearable camera 1 adds a bookmark indicated by arrow A12 a in FIG. 6 to the video data. The bookmark indicated by arrow A12 a in FIG. 6 includes the time at which the event of “dash” is detected, the place, and the bookmark classification. - For example, when
wearable camera 1 detects the fall of the police officer from the signal of the sensor, wearable camera 1 adds a bookmark indicated by arrow A12 b in FIG. 6 to the video data. The bookmark indicated by arrow A12 b in FIG. 6 includes the time at which the event of “fall” is detected, the place, and the bookmark classification. -
FIG. 7 describes an example of the addition of bookmarks. Wearable camera 1 detects a predetermined image from the video data. For example, wearable camera 1 detects a person, a face, a vehicle, a number plate, an edged tool or a gun, an abnormal behavior, a crowd, a color, the color of the clothes of a person, and the color of a vehicle included in the video data by image analysis. When wearable camera 1 detects a predetermined image from the video data, wearable camera 1 adds a bookmark to the video data. - For example, when
wearable camera 1 detects a person from the video data at a certain time, wearable camera 1 adds a bookmark indicated by arrow A13 a in FIG. 7 to the video data. The bookmark indicated by arrow A13 a in FIG. 7 includes the time at which the event of “person” is detected, the place, and the bookmark classification. - For example, when
wearable camera 1 detects a person from the video data at another time, wearable camera 1 adds a bookmark indicated by arrow A13 b in FIG. 7 to the video data. The bookmark indicated by arrow A13 b in FIG. 7 includes the time at which the event of “person” is detected, the place, and the bookmark classification. - Note that the video data may include a plurality of different people.
Wearable camera 1 may distinguish the plurality of different people, and add identifiers to the bookmark classifications of the bookmarks corresponding to the people. For example, when three people are included in the video data, wearable camera 1 may add numbers as identifiers to the bookmark classifications included in the three bookmarks corresponding to the three people. For example, wearable camera 1 may set the bookmark classifications included in the three bookmarks to be “Person 1”, “Person 2”, and “Person 3”. The person illustrated in FIG. 7 is the same person, and hence the bookmarks indicated by arrows A13 a and A13 b include the same “Person 1” as the bookmark classification. -
FIG. 8 describes an example of the addition of bookmarks. Wearable camera 1 may add a bookmark to the video data in accordance with predetermined words from the police officer possessing wearable camera 1. -
Wearable camera 1 may start recording in accordance with predetermined words from the police officer possessing wearable camera 1. Wearable camera 1 may stop recording in accordance with predetermined words from the police officer possessing wearable camera 1. - For example,
wearable camera 1 may perform pre-recording. Wearable camera 1 starts recording from a pre-recorded video indicated by arrow A14 a in FIG. 8 when the police officer says “REC start”, for example. - For example, when the police officer says “Bookmark”,
wearable camera 1 adds a bookmark indicated by arrow A14 b in FIG. 8 to the video data. The bookmark indicated by arrow A14 b in FIG. 8 includes the time at which the event of the audible sound “Bookmark” is detected, the place, and the bookmark classification. - For example, when the police officer says “REC stop”,
wearable camera 1 stops recording the video data as indicated by arrow A14 c in FIG. 8. - As described above,
wearable camera 1 includes a bookmark in the video data in accordance with predetermined words spoken by the police officer. As a result, by saying predetermined words during the recording of the video data, the police officer can add a bookmark at a place in the video data that the police officer desires to watch when creating a report. -
Wearable camera 1 starts and stops recording in accordance with the words spoken by the police officer. As a result, the police officer can easily start and stop recording without operating a switch of wearable camera 1. -
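The voice control described in FIG. 8 amounts to mapping recognized keywords to recorder actions. A minimal sketch, assuming speech recognition has already yielded the spoken words (the dictionary layout and handler name are hypothetical, not part of the embodiment):

```python
def handle_words(words, recorder):
    # Map the predetermined words of FIG. 8 to recorder actions
    if words == "REC start":
        recorder["recording"] = True       # recording begins from the pre-recorded video
    elif words == "REC stop":
        recorder["recording"] = False      # stop recording the video data
    elif words == "Bookmark" and recorder["recording"]:
        # Add a bookmark at the current playback time of the recording
        recorder["bookmarks"].append(recorder["elapsed"])
    return recorder

rec = {"recording": False, "bookmarks": [], "elapsed": 0.0}
handle_words("REC start", rec)
rec["elapsed"] = 12.5                      # 12.5 s into the recording
handle_words("Bookmark", rec)
handle_words("REC stop", rec)
```

The actual camera would feed the recognizer continuously and ignore keywords spoken while recording is stopped, as the `recording` guard above suggests.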
FIG. 9 illustrates an example of bookmarks of the face detection and the number plate detection. In the description above, the bookmark includes the time, the place, and the bookmark classification, but the present invention is not limited thereto. In the face detection, wearable camera 1 may include the coordinates of the face on the image and a snapshot of the face in the bookmark. In the number plate detection, wearable camera 1 may include the coordinates of the number plate on the image and a snapshot of the number plate in the bookmark. - For example, as indicated by arrow A21 a in
FIG. 9, the bookmark of the face detection may include the time, the place, the bookmark classification, the coordinates indicating the location of the face on the image, and a snapshot of the face. The bookmark classification for the face detection may include identifiers for identifying faces, for example, as with the bookmark classification for the person detection in FIG. 7. For example, the bookmark classification for the face detection may be indicated as “face 1”, “face 2”, and “face 3”. - For example, as indicated by arrow A21 b in
FIG. 9, the bookmark of the number plate detection may include the time, the place, the bookmark classification, the coordinates indicating the location of the number plate on the image, and a snapshot of the number plate. The bookmark classification for the number plate detection may include identifiers for identifying number plates, for example, as with the bookmark classification for the person detection in FIG. 7. For example, the bookmark classification for the number plate detection may be indicated as “plate 1”, “plate 2”, and “plate 3”. - When
wearable camera 1 includes the coordinates of the face in the bookmark of the face detection, terminal apparatus 4 can place a blur on the face included in the video data when the video data is played back. When wearable camera 1 includes the coordinates of the number plate in the bookmark of the number plate detection, terminal apparatus 4 can place a blur on the number plate included in the video data when the video data is played back. -
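An extended bookmark of the face detection, carrying coordinates and a snapshot reference in addition to the time, place, and classification, could look like the following sketch. The field names, the `(x, y, width, height)` convention, and the snapshot file path are all illustrative assumptions:

```python
def make_face_bookmark(time, place, face_id, bbox, snapshot_path):
    # bbox: (x, y, width, height) of the face on the image, in pixels.
    # snapshot_path: where a cropped snapshot of the face is saved.
    return {
        "time": time,
        "place": place,
        "classification": f"face {face_id}",  # identifier distinguishes faces
        "coordinates": bbox,
        "snapshot": snapshot_path,
    }

bm = make_face_bookmark(12.0, (35.68, 139.76), 1, (120, 80, 64, 64), "face_1.jpg")
```

A bookmark of the number plate detection would have the same shape with a “plate N” classification, which is what lets the terminal apparatus locate the region to blur on playback.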
FIG. 10 illustrates a playback screen example of terminal apparatus 4. The playback screen example in FIG. 10 partially omits the illustration of bookmark list 4 b, a playback button, and the like with respect to the playback screen example illustrated in FIG. 4. - As described above,
wearable camera 1 includes the coordinates of the face in the bookmark of the face detection. Wearable camera 1 includes the coordinates of the number plate in the bookmark of the number plate detection. As a result, as illustrated in FIG. 10, terminal apparatus 4 can place a blur on the face and the number plate that appear on the playback screen of the video data. - For example, the video data includes the bookmark of the face detection. Terminal apparatus 4 displays the face on the screen by placing a blur on the face on the screen on the basis of the coordinates indicating the location of the face included in the bookmark of the face detection.
- For example, the video data includes the bookmark of the number plate detection. Terminal apparatus 4 displays the number plate on the screen by placing a blur on the number plate on the screen on the basis of the coordinates indicating the location of the number plate included in the bookmark of the number plate detection.
- As described above,
wearable camera 1 includes the coordinates of the face on the image in the bookmark of the face detection. Wearable camera 1 includes the coordinates of the number plate on the image in the bookmark of the number plate detection. As a result, terminal apparatus 4 can place a blur on the face and the number plate when the video data is played back, and can protect privacy. - Note that, as described in
FIG. 9, the bookmark classification in the bookmark of the face detection may include identifiers for identifying faces. Terminal apparatus 4 may distinguish faces on the basis of the identifiers in the bookmark classification, and place a blur on the face. For example, terminal apparatus 4 can place a blur on the face of a person other than the suspect or the criminal, and prevent the blur from being placed on the face of the suspect or the criminal, in accordance with the operation by the police officer who operates terminal apparatus 4. - As described in
FIG. 9, the bookmark classification in the bookmark of the number plate detection may include identifiers for identifying number plates. Terminal apparatus 4 may distinguish number plates on the basis of the identifiers in the bookmark classification, and place a blur on the number plates. For example, terminal apparatus 4 can place a blur on the number plate of a vehicle other than the vehicle of the suspect or the criminal, and prevent the blur from being placed on the number plate of the vehicle of the suspect or the criminal, in accordance with the operation by the police officer who operates terminal apparatus 4. - In the description above, the bookmark classification includes identifiers for identifying faces, but the present invention is not limited thereto.
Wearable camera 1 may include identifiers for identifying faces in the bookmark apart from the bookmark classification. In this case, the bookmark classification does not necessarily need to include identifiers for identifying faces. - In the description above, the bookmark classification includes identifiers for identifying number plates, but the present invention is not limited thereto.
Wearable camera 1 may include identifiers for identifying number plates in the bookmark apart from the bookmark classification. In this case, the bookmark classification does not necessarily need to include identifiers for identifying number plates. - Also for the bookmark classification for the conversation detection described in
FIG. 5 and the bookmark classification for the person detection described in FIG. 7, each bookmark classification does not necessarily need to include identifiers, as with the bookmark of the face detection and the bookmark of the number plate detection. The identifiers for identifying conversations and the identifiers for identifying people may be included in the bookmarks apart from the bookmark classifications. -
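The selective blurring described above — blur every detected face except those the operator excludes — can be sketched as follows. The averaging box blur, the frame layout (a nested list of grayscale values), and the bookmark field names are illustrative assumptions, not the processing actually performed by terminal apparatus 4:

```python
def blur_region(frame, x, y, w, h):
    # Crude box blur: replace the rectangle given by the bookmark's
    # coordinates with the region's average pixel value
    region = [frame[r][c] for r in range(y, y + h) for c in range(x, x + w)]
    avg = sum(region) // len(region)
    for r in range(y, y + h):
        for c in range(x, x + w):
            frame[r][c] = avg
    return frame

def apply_blurs(frame, face_bookmarks, exclude_ids):
    # Blur every bookmarked face except those excluded by the operator
    # (e.g. the suspect's face, which should remain visible)
    for bm in face_bookmarks:
        if bm["id"] not in exclude_ids:
            blur_region(frame, *bm["coordinates"])
    return frame

# A toy 8x8 grayscale "frame" with two bookmarked face regions
frame = [[(i * 8 + j) % 256 for j in range(8)] for i in range(8)]
bookmarks = [
    {"id": "face 1", "coordinates": (0, 0, 2, 2)},
    {"id": "face 2", "coordinates": (4, 4, 2, 2)},
]
apply_blurs(frame, bookmarks, exclude_ids={"face 2"})
```

Because each bookmark carries its own identifier, the exclusion set can be driven directly by the police officer's selections on the playback screen.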
FIG. 11 illustrates an example of the upper body of the police officer wearing wearable camera 1 and biological sensor 6. In FIG. 11, the same parts as those in FIG. 1 are denoted by the same reference characters. -
Wearable camera 1 is worn or held on the front part of the uniform of police officer U1 so as to capture an image of the area ahead of police officer U1. Wearable camera 1 may be fixed on the front part of the uniform in a state of being hung from the neck with a strap, for example. Wearable camera 1 may be fixed on the front part of the uniform by engaging an attachment (for example, an attachment clip) that is attached to a rear surface of a case of wearable camera 1 with a counterpart attachment that is attached to the front part of the uniform. - Biological sensor 6 is worn on the wrist of police officer U1, for example. Biological sensor 6 acquires living body information such as the heart rate, sweating, and the body temperature of police officer U1 from the wrist of police officer U1. Biological sensor 6 transmits the acquired living body information to
wearable camera 1. -
Wearable camera 1 receives the living body information transmitted from biological sensor 6. Wearable camera 1 determines whether the police officer wearing wearable camera 1 is in an excited state on the basis of the received living body information. When wearable camera 1 detects the excited state of the police officer during the recording of the video data, wearable camera 1 adds a bookmark to the video data. -
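The excited-state decision can be sketched as a threshold test on the received living body information. The heart-rate rule below is an assumed illustrative criterion; the embodiment does not specify how the determination is made:

```python
def is_excited(heart_rate_bpm, resting_bpm=70, factor=1.5):
    # Treat a heart rate well above the officer's resting rate as an
    # excited state (assumed rule; sweating and body temperature could
    # be combined in the same way)
    return heart_rate_bpm > resting_bpm * factor

# Heart-rate samples received from biological sensor 6 during recording
samples = [72, 80, 130, 72]
excited_times = [i for i, hr in enumerate(samples) if is_excited(hr)]
# excited_times == [2]: a bookmark would be added at that sample's time
```

A production implementation would debounce the signal over a time window rather than bookmark every sample crossing the threshold.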
FIG. 12 illustrates an external appearance example of wearable camera 1. As illustrated in FIG. 12, switches 11 and 12, camera lens 13, and microphone 14 are disposed on the front surface of the case of wearable camera 1. Switch 15 is disposed on the side surface of the case of wearable camera 1. Light emitting diodes (LEDs) 16 a to 16 c are disposed on the upper surface of the case of wearable camera 1. -
Wearable camera 1 starts capturing (recording) a moving image when switch 11 is short-pressed. Wearable camera 1 stops capturing (recording) the moving image when switch 11 is long-pressed. -
Wearable camera 1 captures (records) a still image in accordance with the pressing of switch 12. -
Camera lens 13 forms an optical image of an object on an imaging surface of an imaging element. -
Microphone 14 collects the sound around wearable camera 1. -
Wearable camera 1 communicates with external devices in accordance with the pressing of switch 15. For example, wearable camera 1 transmits information (including recorded video data) stored in a storage section described below to in-vehicle system 2 in accordance with the pressing of switch 15. - LEDs 16 a to 16 c indicate the state of
wearable camera 1. For example, LEDs 16 a to 16 c indicate whether wearable camera 1 is recording or not. LEDs 16 a to 16 c indicate whether wearable camera 1 is communicating with an external device or not, for example. -
FIG. 13 illustrates a block configuration example of wearable camera 1. As illustrated in FIG. 13, wearable camera 1 includes controller 21, camera 22, gyro sensor 23, acceleration sensor 24, switch 25, microphone 26, speaker 27, short-range communicator 28, communicator 29, GPS receiver 30, and storage section 31. -
Controller 21 controls the entirety of wearable camera 1. The functions of controller 21 may be implemented by processors such as a central processing unit (CPU) and a digital signal processor (DSP), for example. -
Camera 22 includes an imaging element and camera lens 13 illustrated in FIG. 12. The imaging element is a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor, for example. Camera 22 outputs a video signal output from the imaging element to controller 21 as a digital signal, for example. Controller 21 stores the digital signal output from camera 22 in storage section 31. -
Gyro sensor 23 measures the angular velocity about three axes (an x-axis, a y-axis, and a z-axis) of a rectangular coordinate system, for example. Gyro sensor 23 outputs the measured angular velocity to controller 21 as a digital signal, for example. -
Acceleration sensor 24 measures the acceleration in the directions of the three axes of the rectangular coordinate system, for example. Acceleration sensor 24 outputs the measured acceleration to controller 21 as a digital signal, for example. Controller 21 detects movements of the police officer wearing wearable camera 1, such as starting to walk, starting to run, falling, and fighting, from the angular velocity output from gyro sensor 23 and the acceleration output from acceleration sensor 24. -
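The movement detection performed by controller 21 can be sketched as a classification over the two sensor outputs. The thresholds and categories below are illustrative assumptions; a real detector would use tuned filters or trained models on the continuous sensor streams:

```python
import math

def classify_movement(accel, gyro_deg_s):
    # accel: (ax, ay, az) in m/s^2 from acceleration sensor 24;
    # gyro_deg_s: angular velocity magnitude in deg/s from gyro sensor 23.
    magnitude = math.sqrt(sum(a * a for a in accel))
    if magnitude < 2.0:
        return "fall"   # near free fall: total acceleration far below 1 g
    if magnitude > 15.0 and gyro_deg_s > 100.0:
        return "fight"  # violent motion combined with rapid rotation
    if magnitude > 12.0:
        return "dash"   # strong acceleration while starting to run
    return "none"

movement = classify_movement((9.0, 9.0, 5.0), 30.0)  # classified as a dash
```

When a classification other than "none" is returned, controller 21 would add the corresponding bookmark ("dash", "fall", or "fight") to the video data.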
Switch 25 is an input apparatus that accepts the operation of the user. Switch 25 corresponds to switches 11, 12, and 15 illustrated in FIG. 12. Switch 25 outputs information in accordance with the operation of the user to controller 21 as a digital signal, for example. -
Microphone 26 collects the sound around wearable camera 1 or the voice of the police officer wearing wearable camera 1. Microphone 26 corresponds to microphone 14 illustrated in FIG. 12. Microphone 26 outputs the signal of the collected sound to controller 21 as a digital signal, for example. Microphone 26 can be understood to be a sensor that collects sound. -
Speaker 27 converts the audible sound signal output from controller 21 into audible sound, and outputs the audible sound. - Short-
range communicator 28 performs short-range wireless communication with in-vehicle system 2 of police vehicle A1 by Wi-Fi or Bluetooth, for example. Short-range communicator 28 performs wireless communication with biological sensor 6 by short-range wireless communication such as Wi-Fi or Bluetooth, for example. - Note that short-
range communicator 28 may perform short-range wireless communication with in-vehicle system 2 via a mobile terminal such as a smartphone possessed by the police officer, for example. Short-range communicator 28 may perform short-range wireless communication with biological sensor 6 via a mobile terminal such as a smartphone possessed by the police officer, for example. -
Communicator 29 communicates with server 3 via network 5. -
GPS receiver 30 receives a GPS signal transmitted from a plurality of GPS transmitters. GPS receiver 30 calculates the location of wearable camera 1 on the basis of the received GPS signal. GPS receiver 30 outputs the calculated location of wearable camera 1 to controller 21. Note that the location of wearable camera 1 may be calculated by controller 21 on the basis of the GPS signal received by GPS receiver 30. - Images (moving images or still images) taken by
camera 22 are stored in storage section 31. The images stored in storage section 31 are saved as evidence images, for example, and cannot be erased. In storage section 31, a program or data executed by a processor may be stored. Storage section 31 may be formed by a read only memory (ROM), a random access memory (RAM), a flash memory, and a hard disk drive (HDD), for example. The storage section that stores the video data therein and the storage section that stores the program or the data therein may be different storage sections. -
FIG. 14 illustrates a block configuration example of server 3. As illustrated in FIG. 14, server 3 includes CPU 41, RAM 42, HDD 43, communication interface 44, and bus 45. - The entire apparatus of
server 3 is controlled by CPU 41. CPU 41 functions as controller 41 a by the execution of a program. RAM 42, HDD 43, and communication interface 44 are connected to CPU 41 via bus 45. - In
RAM 42, an application program and a program of an operating system (OS) to be executed by CPU 41 are temporarily stored. In RAM 42, various data necessary for the processing by CPU 41 are temporarily stored. - In
HDD 43, an OS, an application program, and the like are stored. In HDD 43, the video data of the videos captured by wearable camera 1 worn by the user and by the in-vehicle camera installed in police vehicle A1 are stored. -
Communication interface 44 communicates with in-vehicle system 2 and terminal apparatus 4 via network 5. -
FIG. 15 illustrates a block configuration example of terminal apparatus 4. As illustrated in FIG. 15, terminal apparatus 4 includes CPU 51, RAM 52, HDD 53, communication interface 54, user interface 55, and bus 56. - The entire apparatus of terminal apparatus 4 is controlled by
CPU 51. CPU 51 functions as controller 51a by the execution of the program. CPU 51 is connected to RAM 52, HDD 53, communication interface 54, and user interface 55 via bus 56. - In
RAM 52, an application program and a program of an OS to be executed by CPU 51 are temporarily stored. In RAM 52, various data necessary for the processing by CPU 51 are temporarily stored. - In
HDD 53, an OS, an application program, and the like are stored. -
Communication interface 54 communicates with server 3 and in-vehicle system 2 of police vehicle A1 via network 5. - A keyboard apparatus and a display, for example, are connected to
user interface 55. CPU 51 exchanges data with the keyboard apparatus, the display, and the like via user interface 55. -
FIG. 16 is a flowchart illustrating an operation example of wearable camera 1. FIG. 16 illustrates the operation example of wearable camera 1 from when the recording starts to when the recording stops. Controller 21 of wearable camera 1 performs pre-recording for a certain amount of time. -
Controller 21 of wearable camera 1 starts recording in accordance with the operation of switch 11 by the police officer wearing wearable camera 1 (Step S1). Controller 21 of wearable camera 1 starts the recording from a point a certain amount of time earlier, using the pre-recorded data. Note that controller 21 of wearable camera 1 may start recording in accordance with the voice of the police officer wearing wearable camera 1. -
Microphone 26 of wearable camera 1 collects a special sound. The special sound is a gunshot, for example. When the special sound is collected by microphone 26, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S2). Controller 21 of wearable camera 1 includes the time at which the special sound is detected, the place, and the bookmark classification in the bookmark. -
Gyro sensor 23 and acceleration sensor 24 of wearable camera 1 measure the action (movement) of the police officer. The action of the police officer is a dash, for example. When a dash is detected from the action of the police officer measured by gyro sensor 23 and acceleration sensor 24, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S3). Controller 21 of wearable camera 1 includes the time at which the dash of the police officer is detected, the place, and the bookmark classification in the bookmark. - Now,
controller 21 of wearable camera 1 may detect a face, a number plate, and color by monitoring the video data captured by camera 22 at a certain interval. Controller 21 of wearable camera 1 may detect a conversation by monitoring the sound collected by microphone 26 at a certain interval. -
Microphone 26 of wearable camera 1 collects the sound of the conversation of people. When the sound of a conversation is collected by microphone 26, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S4). Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark. -
Camera 22 of wearable camera 1 takes an image of the face and the number plate. When a face and a number plate are detected from the video data captured by camera 22, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S5). Controller 21 of wearable camera 1 includes the time at which the face is detected, the place, the bookmark classification, the identifier for identifying a face, the coordinates of the face, and a snapshot of the face in the bookmark of the face detection. Controller 21 of wearable camera 1 includes the time at which the number plate is detected, the place, the bookmark classification, the identifier for identifying a number plate, the coordinates of the number plate, and a snapshot of the number plate in the bookmark of the number plate detection. - Note that, in Step S5,
camera 22 of wearable camera 1 takes images of the face and the number plate, but may take an image of only one of them. In this case, controller 21 of wearable camera 1 adds the bookmark for whichever of the face or the number plate was taken by camera 22 of wearable camera 1 to the video data. -
Camera 22 of wearable camera 1 takes an image of a building and the like. When a predetermined color is detected in the building and the like taken by camera 22, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S6). Controller 21 of wearable camera 1 includes the time at which the color is detected, the place, and the bookmark classification in the bookmark of the color detection. -
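The bookmark contents described in Steps S2 to S6 (time, place, and classification for every event, plus an identifier, coordinates, and a snapshot for face and number-plate detections) can be sketched as a simple record. The field names below are illustrative assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Bookmark:
    # Fields common to every bookmark (Steps S2-S6).
    time: str             # time at which the event was detected
    place: str            # location, e.g. derived from GPS receiver 30
    classification: str   # e.g. "gunshot", "dash", "conversation", "face", "number_plate", "color"
    # Extra fields carried only by face / number-plate bookmarks (Step S5).
    identifier: Optional[int] = None                         # distinguishes individual faces/plates
    coordinates: Optional[Tuple[int, int, int, int]] = None  # bounding box (x, y, width, height)
    snapshot: Optional[bytes] = None                         # cropped image of the detected region

# A gunshot bookmark carries only the common fields.
gunshot = Bookmark(time="10:14:12", place="35.68N,139.76E", classification="gunshot")
# A face bookmark additionally carries an identifier, coordinates, and a snapshot.
face = Bookmark(time="10:15:00", place="35.68N,139.76E", classification="face",
                identifier=1, coordinates=(120, 80, 64, 64), snapshot=b"...")
```

Keeping the face/plate-specific fields optional lets a single record type cover all of the bookmark classifications listed above.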
Microphone 26 of wearable camera 1 collects the sound of the conversation of people. When the sound of a conversation is collected by microphone 26, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S7). Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark. - As in Step S5, when a face and a number plate are detected from the video data captured by
camera 22, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S8). When the face and the number plate detected from the video data are the same as the face and the number plate in Step S5, controller 21 of wearable camera 1 includes the same identifier as that in Step S5 in the bookmark. When the face and the number plate detected from the video data are different from the face and the number plate in Step S5, controller 21 of wearable camera 1 includes an identifier different from that in Step S5 in the bookmark. - As in Step S7, when a conversation is detected from the sound collected by
microphone 26, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S9). Controller 21 of wearable camera 1 includes the time at which the conversation is detected, the place, and the bookmark classification in the bookmark. -
Controller 21 of wearable camera 1 stops recording in accordance with the operation of switch 11 by the police officer wearing wearable camera 1 (Step S10). Note that controller 21 of wearable camera 1 may stop recording in accordance with the voice of the police officer wearing wearable camera 1. -
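Steps S2 through S9 above share one pattern: an event is detected, and a bookmark carrying the time, place, and classification (plus the extra face/number-plate fields) is appended to the recording. A minimal sketch of that pattern, with illustrative function and field names not taken from the disclosure:

```python
def add_bookmark(bookmarks, classification, time, place, **extra):
    """Append one bookmark to the recording's bookmark list, as each of
    Steps S2-S9 does when its event fires. `extra` carries the optional
    face/number-plate fields (identifier, coordinates, snapshot)."""
    bookmark = {"time": time, "place": place, "classification": classification}
    bookmark.update(extra)
    bookmarks.append(bookmark)
    return bookmark

bookmarks = []
add_bookmark(bookmarks, "gunshot", "10:14:12", "crossing A")   # Step S2
add_bookmark(bookmarks, "dash", "10:14:30", "crossing A")      # Step S3
add_bookmark(bookmarks, "face", "10:15:00", "crossing A",      # Step S5
             identifier=1, coordinates=(120, 80, 64, 64))
```

Each detector (microphone, gyro/acceleration sensors, video monitor) can then call the same helper with its own classification.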
FIG. 17 is a flowchart illustrating an operation example of the blur processing of terminal apparatus 4. Controller 51a of terminal apparatus 4 reads a target video (video data) from server 3 in accordance with the operation by the police officer creating a report, for example (Step S21). At this time, controller 51a of terminal apparatus 4 reads the bookmarks added to the video data. -
Controller 51a of terminal apparatus 4 extracts the bookmark of the face detection and the bookmark of the number plate detection out of the bookmarks read in Step S21 (Step S22). -
Controller 51a of terminal apparatus 4 starts the blur processing (Step S23). -
Controller 51a of terminal apparatus 4 acquires the coordinates of the face and the coordinates of the number plate from the bookmark of the face detection and the bookmark of the number plate detection extracted in Step S22, and specifies the location of the face and the location of the number plate on the image on the basis of the acquired coordinates (Step S24). At this time, controller 51a of terminal apparatus 4 may distinguish the faces included in the image on the basis of the identifiers included in the bookmark of the face detection. Controller 51a of terminal apparatus 4 may distinguish the number plates included in the image on the basis of the identifiers included in the bookmark of the number plate detection. As a result, controller 51a of terminal apparatus 4 can place a blur on faces other than the face of the suspect or the criminal, for example. Controller 51a of terminal apparatus 4 can place a blur on the number plates of vehicles other than the number plate of the vehicle of the suspect or the criminal, for example. -
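Using the coordinates and identifiers carried in the bookmarks, the selection logic of Step S24 can be sketched as follows. The function name and the identifier whitelist are illustrative assumptions; the disclosure does not fix an API:

```python
def regions_to_blur(bookmarks, keep_identifiers=frozenset()):
    """Return the bounding boxes to blur on a frame: every face /
    number-plate region whose identifier is NOT in `keep_identifiers`
    (e.g. the suspect's face can be left un-blurred)."""
    return [bm["coordinates"]
            for bm in bookmarks
            if bm["classification"] in ("face", "number_plate")
            and bm.get("identifier") not in keep_identifiers]

frame_bookmarks = [
    {"classification": "face", "identifier": 1, "coordinates": (120, 80, 64, 64)},    # suspect
    {"classification": "face", "identifier": 2, "coordinates": (300, 90, 60, 60)},    # bystander
    {"classification": "number_plate", "identifier": 7, "coordinates": (200, 400, 120, 40)},
    {"classification": "conversation"},  # non-visual bookmark: never blurred
]
# Keep the suspect (identifier 1) visible; blur every other face and plate.
blur = regions_to_blur(frame_bookmarks, keep_identifiers={1})
```

The returned boxes would then be handed to whatever image-blurring routine the terminal apparatus uses in Step S25.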
Controller 51a of terminal apparatus 4 places a blur on the locations of the face and the number plate specified in Step S24 (Step S25). -
Controller 51a of terminal apparatus 4 performs the blur processing of the face and the number plate on the basis of the remaining bookmarks of the face detection and bookmarks of the number plate detection extracted in Step S22 (Steps S26, S27, . . . , S28, and S29). -
Controller 51a of terminal apparatus 4 ends the blur processing (Step S30). - As described above,
camera 22, storage section 31 that stores therein the video data of the video captured by camera 22, and controller 21 that adds a bookmark to the video data when an event is detected from a signal of a sensor or from the video are included in wearable camera 1. The sensor may be biological sensor 6, gyro sensor 23, acceleration sensor 24, or microphone 26, for example. The bookmark added to the video data is a signal for playing back, in terminal apparatus 4, the video data from a place at which an event is specified. - As a result, for example, terminal apparatus 4 receives the video data of the video for which a report is to be created from
server 3 in accordance with the operation by the police officer creating a report on the case. Terminal apparatus 4 can display events on the display apparatus from the bookmarks added to the video data received from server 3, as illustrated in bookmark list 4b in FIG. 4, for example. When an event displayed in bookmark list 4b in FIG. 4 is selected by the police officer, for example, terminal apparatus 4 can play back the video data from the selected event. Therefore, the police officer creating a report can play back the video data from the place (event) necessary for the report creation, and the report creation becomes easier. - The in-vehicle camera installed in police vehicle A1 may also add a bookmark to the video data as with
wearable camera 1. The in-vehicle camera may have a block configuration similar to the block configuration illustrated in FIG. 13, for example. However, the block configuration of the in-vehicle camera does not necessarily need to include gyro sensor 23 and acceleration sensor 24. - In the description above, terminal apparatus 4 places a blur on the face and the number plate included in the video, but the present invention is not limited thereto.
Server 3 may place a blur on the face and the number plate included in the video. Server 3 may transmit the video data on which a blur is placed to terminal apparatus 4. - In the description above, the police officer possesses
wearable camera 1, but the present invention is not limited thereto. For example, a security guard may possess wearable camera 1. - When a bookmark is selected in
bookmark list 4b in FIG. 4, terminal apparatus 4 may display the location information (for example, the address) included in the selected bookmark on the display apparatus. As a result, the police officer can write in the report where the video was captured. - In the description above,
controller 21 of wearable camera 1 detects an event from the signal of biological sensor 6, the signal of gyro sensor 23 and acceleration sensor 24, the signal of microphone 26, and the video of camera 22, but the present invention is not limited thereto. Controller 21 of wearable camera 1 may detect an event from at least one of the signal of biological sensor 6, the signal of gyro sensor 23 and acceleration sensor 24, the signal of microphone 26, and the video of camera 22. For example, controller 21 of wearable camera 1 may detect an event from two of them, that is, the signal (sound) of microphone 26 and the video of camera 22. - In
Embodiment 1, wearable camera 1 monitors the video data captured by camera 22 at a certain interval, and detects a face, a number plate, or color. Then, wearable camera 1 adds a bookmark to the video data each time a face, a number plate, or color is detected. - Meanwhile, in
Embodiment 2, wearable camera 1 adds a bookmark to the video data when the detection of a face, a number plate, or color is started, and when a face, a number plate, or color is no longer detected. In other words, wearable camera 1 adds a bookmark to the video data when a face, a number plate, or color enters a video range (capture range), and when the face, the number plate, or the color in the video range exits the video range. - In
Embodiment 1, wearable camera 1 monitors the sound collected by microphone 26 at a certain interval, and detects a conversation. Then, wearable camera 1 adds a bookmark to the video data each time a conversation is detected. - Meanwhile, in
Embodiment 2, wearable camera 1 adds a bookmark to the video data when the conversation detection is started, and when the conversation is no longer detected. In other words, wearable camera 1 adds a bookmark to the video data when a conversation starts, and when the conversation ends. Parts different from those in Embodiment 1 are described below. -
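The difference between the two embodiments can be sketched as an edge-triggered detector: Embodiment 2 emits a bookmark only when an object (or a conversation) enters or leaves the set of detected items, instead of on every monitoring interval. The function and object labels below are illustrative assumptions:

```python
def edge_triggered_bookmarks(detections_per_interval):
    """Given the set of detected objects at each monitoring interval,
    emit (interval, object, "enter"/"exit") bookmarks only on changes,
    as in Embodiment 2 (Embodiment 1 would emit one per interval)."""
    bookmarks = []
    previous = set()
    for t, current in enumerate(detections_per_interval):
        current = set(current)
        for obj in sorted(current - previous):   # newly entered the video range
            bookmarks.append((t, obj, "enter"))
        for obj in sorted(previous - current):   # exited the video range
            bookmarks.append((t, obj, "exit"))
        previous = current
    return bookmarks

# "face-1" is visible during intervals 0-2; "plate-7" only during interval 1.
events = edge_triggered_bookmarks([{"face-1"}, {"face-1", "plate-7"}, {"face-1"}, set()])
```

With this scheme, an object visible for many consecutive intervals produces exactly two bookmarks rather than one per interval, which is the processing-load reduction claimed at the end of this embodiment.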
FIG. 18 is a flowchart illustrating an operation example of wearable camera 1 according to Embodiment 2. FIG. 18 illustrates an operation example of wearable camera 1 from when the recording starts to when the recording stops. Controller 21 of wearable camera 1 performs pre-recording for a certain amount of time. - The processing in Steps S41 to S43 in
FIG. 18 is similar to the processing in Steps S1 to S3 illustrated in FIG. 16, and the description thereof is omitted. -
Controller 21 of wearable camera 1 may detect a face, a number plate, and color by monitoring the video data captured by camera 22 at a certain interval. Controller 21 of wearable camera 1 may detect a conversation by monitoring the sound collected by microphone 26 at a certain interval. -
Microphone 26 of wearable camera 1 collects the sound of the conversation of people. Controller 21 of wearable camera 1 detects the start of the conversation collected by microphone 26, and adds a bookmark to the video data (Step S44). Controller 21 of wearable camera 1 includes the time at which the start of the conversation is detected, the place, and the bookmark classification in the bookmark. -
Camera 22 of wearable camera 1 takes an image of the face and the number plate. When a face and a number plate are detected from the video data captured by camera 22, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S45). Controller 21 of wearable camera 1 includes the time at which the face is detected, the place, the bookmark classification, the identifier for identifying the face, the coordinates of the face, and a snapshot of the face in the bookmark of the face detection. Controller 21 of wearable camera 1 includes the time at which the number plate is detected, the place, the bookmark classification, the identifier for identifying the number plate, the coordinates of the number plate, and a snapshot of the number plate in the bookmark of the number plate detection. - Note that
controller 21 of wearable camera 1 does not add further bookmarks for the face and the number plate detected in Step S45 to the video data until the face and the number plate detected in Step S45 exit the video range. Meanwhile, when a face and a number plate that are different from the face and the number plate detected in Step S45 are detected, controller 21 of wearable camera 1 adds bookmarks that are different from those for the face and the number plate detected in Step S45 to the video data. -
Camera 22 of wearable camera 1 takes an image of a building and the like. When a predetermined color is detected in the building and the like taken by camera 22, controller 21 of wearable camera 1 adds a bookmark to the video data (Step S46). Controller 21 of wearable camera 1 includes the time at which the color is detected, the place, and the bookmark classification in the bookmark of the color detection. - Note that
controller 21 of wearable camera 1 does not add a further bookmark for the color detected in Step S46 to the video data until the color detected in Step S46 exits the video range. Meanwhile, when a color of a building that is different from the color of the building and the like detected in Step S46 is detected, controller 21 of wearable camera 1 adds a bookmark different from that for the color detected in Step S46 to the video data. -
Controller 21 of wearable camera 1 detects the end of the conversation detected in Step S44 (Step S47). Controller 21 of wearable camera 1 includes the time at which the end of the conversation is detected, the place, and the bookmark classification in the bookmark. -
Controller 21 of wearable camera 1 detects the end of the image capturing of the face and the number plate detected in Step S45 (Step S48). Controller 21 of wearable camera 1 includes the time at which the detection of the face and the number plate has ended, the place, and the bookmark classification in the bookmark. -
Controller 21 of wearable camera 1 detects the end of the image capturing of the color detected in Step S46 (Step S49). Controller 21 of wearable camera 1 includes the time at which the color detection has ended, the place, and the bookmark classification in the bookmark. - The processing in Step S50 in
FIG. 18 is similar to the processing in Step S10 illustrated in FIG. 16, and the description thereof is omitted. - As described above,
controller 21 of wearable camera 1 adds a bookmark to the video data when a face, a number plate, or color enters the video range, and when the face, the number plate, or the color included in the video range exits the video range. Wearable camera 1 adds a bookmark to the video data when a conversation starts and when the conversation ends. As a result, controller 21 of wearable camera 1 does not need to add bookmarks to the video data at a certain interval, and the processing load can be reduced. - In the embodiments described above, expressions such as “ . . . section”, “ . . . or”, and “ . . . er” used in the components may be replaced with other expressions such as “ . . . circuitry”, “ . . . device”, “ . . . unit”, or “ . . . module”.
- The embodiments have been described above with reference to the accompanying drawings, but this disclosure is not limited to those examples. It is clear that a person skilled in the art could conceive of various changes or variations within the scope of the appended claims. The changes or variations as above are also understood to belong to the technical scope of this disclosure. The components in the embodiments may be combined in a freely selected manner without departing from the spirit of this disclosure.
- This disclosure can be implemented by software, hardware, or software in cooperation with hardware. The function blocks used in the description of the embodiments described above may be partially or entirely implemented as an LSI, which is an integrated circuit, and each of the processes described in the embodiments described above may be partially or entirely controlled by one LSI or a combination of LSIs. The LSI may be formed by individual chips, or may be formed by one chip so as to include a part or all of the function blocks. The LSI may include input and output of data. The LSI may be called an IC, a system LSI, a super LSI, or an ultra LSI in accordance with the difference in the degree of integration.
- The method of forming an integrated circuit is not limited to the LSI, and may be implemented by a dedicated circuit, a general purpose processor, or a dedicated processor. An FPGA that is programmable and a reconfigurable processor capable of reconfiguring the connection and the setting of the circuit cell in the LSI may be used after manufacturing the LSI. This disclosure may be implemented as digital processing or analog processing.
- When technology for forming an integrated circuit that replaces the LSI appears through the progress of semiconductor technology or another derivative technology, the function blocks may naturally be integrated with use of that technology. Application of biotechnology and the like is also possible.
- This disclosure is useful in a wearable camera that records a video.
- 1 Wearable camera
2 In-vehicle system - 4 Terminal apparatus
4 a Playback screen
4 b Bookmark list - 6 Biological sensor
- 13 Camera lens
- 23 Gyro sensor
24 Acceleration sensor - 28 Short-range communicator
- 30 GPS receiver
31 Storage section - 44, 54 Communication interface
- 55 User interface
A1 Police vehicle
A2 Police station
Claims (16)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019088153A JP2020184678A (en) | 2019-05-08 | 2019-05-08 | Wearable camera and signal addition method |
JP2019-088153 | 2019-05-08 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200358947A1 true US20200358947A1 (en) | 2020-11-12 |
US10855913B1 US10855913B1 (en) | 2020-12-01 |
Family
ID=73045226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/570,098 Active US10855913B1 (en) | 2019-05-08 | 2019-09-13 | Wearable camera, video playback system, and video playback method |
Country Status (2)
Country | Link |
---|---|
US (1) | US10855913B1 (en) |
JP (1) | JP2020184678A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023133197A1 (en) * | 2022-01-07 | 2023-07-13 | Getac Corporation | Incident category selection optimization |
EP4333446A1 (en) * | 2022-08-30 | 2024-03-06 | Nokia Technologies Oy | Control of user device based on detecting an event of interest |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4403039B2 (en) * | 2004-08-12 | 2010-01-20 | 株式会社日立製作所 | Image processing apparatus and image processing system |
EP1911263A4 (en) * | 2005-07-22 | 2011-03-30 | Kangaroo Media Inc | System and methods for enhancing the experience of spectators attending a live sporting event |
JP6265602B2 (en) * | 2013-01-29 | 2018-01-24 | 株式会社日立国際電気 | Surveillance camera system, imaging apparatus, and imaging method |
JP2016181767A (en) | 2015-03-23 | 2016-10-13 | パナソニックIpマネジメント株式会社 | Wearable camera and wearable camera system |
JP2019021996A (en) * | 2017-07-12 | 2019-02-07 | パナソニックIpマネジメント株式会社 | Wearable camera, wearable camera system, and information recording method |
US11024137B2 (en) * | 2018-08-08 | 2021-06-01 | Digital Ally, Inc. | Remote video triggering and tagging |
-
2019
- 2019-05-08 JP JP2019088153A patent/JP2020184678A/en active Pending
- 2019-09-13 US US16/570,098 patent/US10855913B1/en active Active
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023133197A1 (en) * | 2022-01-07 | 2023-07-13 | Getac Corporation | Incident category selection optimization |
US11785266B2 (en) | 2022-01-07 | 2023-10-10 | Getac Technology Corporation | Incident category selection optimization |
EP4333446A1 (en) * | 2022-08-30 | 2024-03-06 | Nokia Technologies Oy | Control of user device based on detecting an event of interest |
Also Published As
Publication number | Publication date |
---|---|
US10855913B1 (en) | 2020-12-01 |
JP2020184678A (en) | 2020-11-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: PANASONIC I-PRO SENSING SOLUTIONS CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:051276/0124 Effective date: 20191203 |
|
AS | Assignment |
Owner name: PANASONIC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAGIO, MINORU;ARAI, SHINICHI;OGUCHI, TAKAE;SIGNING DATES FROM 20190910 TO 20190911;REEL/FRAME:051535/0559 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: PANASONIC I-PRO SENSING SOLUTIONS CO., LTD., JAPAN Free format text: MERGER;ASSIGNOR:PANASONIC I-PRO SENSING SOLUTIONS CO., LTD.;REEL/FRAME:054757/0114 Effective date: 20200401 |
|
AS | Assignment |
Owner name: PANASONIC I-PRO SENSING SOLUTIONS CO., LTD., JAPAN Free format text: ADDRESS CHANGE;ASSIGNOR:PANASONIC I-PRO SENSING SOLUTIONS CO., LTD.;REEL/FRAME:055479/0932 Effective date: 20200401 |
|
AS | Assignment |
Owner name: I-PRO CO., LTD., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:PANASONIC I-PRO SENSING SOLUTIONS CO., LTD.;REEL/FRAME:063101/0966 Effective date: 20220401 Owner name: I-PRO CO., LTD., JAPAN Free format text: CHANGE OF ADDRESS;ASSIGNOR:I-PRO CO., LTD.;REEL/FRAME:063102/0075 Effective date: 20221001 |