WO2012146273A1 - Method and system for inserting a video marker - Google Patents


Info

Publication number
WO2012146273A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
snapshot
sensor
digital
digital video
Prior art date
Application number
PCT/EP2011/056596
Other languages
English (en)
Inventor
Martin HONNER
Original Assignee
Better4Drive Ug (Haftungsbeschränkt)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Better4Drive Ug (Haftungsbeschränkt) filed Critical Better4Drive Ug (Haftungsbeschränkt)
Priority to PCT/EP2011/056596 priority Critical patent/WO2012146273A1/fr
Publication of WO2012146273A1 publication Critical patent/WO2012146273A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N 1/00323 Connection or combination with a measuring, monitoring or signaling apparatus, e.g. for transmitting measured information to a central location
    • H04N 1/0035 User-machine interface; Control console
    • H04N 1/00352 Input means
    • H04N 1/00381 Input by recognition or interpretation of visible user gestures
    • H04N 1/00403 Voice input means, e.g. voice commands
    • H04N 1/21 Intermediate information storage
    • H04N 1/2104 Intermediate information storage for one or a few pictures
    • H04N 1/2112 Intermediate information storage for one or a few pictures using still video cameras
    • H04N 1/212 Motion video recording combined with still video recording
    • H04N 1/2125 Display of information relating to the still picture recording
    • H04N 1/2129 Recording in, or reproducing from, a specific memory area or areas, or recording or reproducing at a specific moment
    • H04N 1/2133 Recording or reproducing at a specific moment, e.g. time interval or time-lapse

Definitions

  • the present invention generally relates to electronic data processing, and more particularly, relates to methods, computer program products and systems for marking videos.
  • Video surveillance is typically run on stationary systems for security purposes in security-sensitive areas.
  • there is, however, also a need for individual mobile video surveillance, for example when driving on the road and observing an accident, when skiing or performing other sports, or in the context of training and education, where it is interesting to analyse situations after the event.
  • DailyRoads Voyager is an application for mobile devices powered by the Android operating system, allowing for video recording from vehicles.
  • the application acts as a video black box recording everything, but only keeping what the user is really interested in.
  • the user is usually the driver of a car, who can now capture video sequences of important road events.
  • the application records and saves what happens in front of the camera. For this, when an event is launched, the current video is terminated and stored in a specific folder for to-be-retained videos, and recording is restarted afterwards. Therefore, there is a small gap in the video close to the important event, between the termination and the restart of the recording, which is typically in the range of a few seconds.
  • the method includes: recording a digital video from a digital camera on digital storage means; then receiving a sensor event, wherein the sensor event indicates a request to mark a portion of the digital video while the digital video is being recorded on the digital storage means; then, in response to receiving the sensor event, defining at least one recorded video frame of the video as a snapshot; and storing snapshot metadata associated with the snapshot, wherein the snapshot metadata includes data to retrieve the portion of the digital video.
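  • These steps can be illustrated with a minimal sketch (not taken from the patent; class and field names are illustrative): recording continues without interruption, and a sensor event only stores metadata locating the snapshot.

```python
import time


class VideoMarker:
    """Illustrative sketch of the marking method: record continuously,
    and on a sensor event store only metadata that locates the snapshot,
    without stopping or restarting the recording (hence no video gap)."""

    def __init__(self):
        self.recording_start = None
        self.frames = []            # stands in for the digital storage means
        self.snapshot_metadata = []

    def start_recording(self):
        self.recording_start = time.monotonic()

    def record_frame(self, frame):
        # recording continues regardless of sensor events -> no gap
        self.frames.append(frame)

    def on_sensor_event(self, event):
        # define the most recently recorded frame as the snapshot and
        # store metadata sufficient to retrieve the marked portion later
        elapsed = time.monotonic() - self.recording_start
        self.snapshot_metadata.append({
            "event": event,
            "elapsed_s": elapsed,
            "frame_index": len(self.frames) - 1,
        })
```

A sensor event thus costs only one small metadata record, while the frame stream itself is never interrupted.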
  • the solution of claim 1 allows for continuous video recording without any gaps in the vicinity of relevant events and provides the opportunity to easily find the relevant sequence in the continuous video recording through the stored snapshot metadata.
  • Further aspects of the invention are a computer program product according to independent claim 9 and a computer system including a digital camera, digital storage means, sensor means and a video processor component according to independent claim 10.
  • the computer system can run the computer program to execute the method of claim 1.
  • Alternative embodiments of the invention can be implemented by a further video data retrieval component for retrieving the portion of the digital video by using predefined time interval information, wherein the predefined time interval information and the snapshot metadata is used to calculate a start time of the portion before the snapshot and an end time of the portion after the snapshot; and storing the retrieved portion as a separate digital video on the digital storage means.
  • FIG. 1 is a simplified block diagram of a computer system according to one embodiment of the invention.
  • FIG. 2 is a simplified flow chart of a computer implemented method for marking a video according to one embodiment of the invention
  • FIG. 3 is a simplified flow chart of a further embodiment of the computer implemented method
  • FIG. 4 illustrates creation and use of a snapshot manager object as being used in one embodiment of the invention
  • FIG. 5 illustrates the handling of multiple sensor events according to one embodiment of the invention
  • FIG. 6 and FIG. 7 show alternative options for handling multiple sensor events according to further embodiments of the invention.
  • FIG. 8 shows a class diagram for decoupling video recording from camera preview.
  • FIG. 1 is a simplified block diagram of a computer system 9000 according to one embodiment of the invention.
  • the computer system 9000 includes a video processor component 9400.
  • the video processor component 9400 has a video interface 9420, which is configured to receive digital video data from a digital camera 9100.
  • the digital camera can be directly integrated into the computer system 9000 or it can be a remote camera being communicatively coupled through the video interface.
  • the communicative coupling can be achieved through an appropriate data bus or through wireless connection standards, such as the open wireless technology standard for exchanging data over short distances (Bluetooth).
  • the video processor component 9400 further has a sensor interface 9430, which is configured to receive a sensor event from sensor means 9300.
  • the video processor component 9400 can use digital storage means 9200 communicatively coupled to the video processor 9400 to persistently record the video data on the storage means.
  • the video processor 9400 can define a part of the video being recorded as a snapshot, where the snapshot is associated with the received event.
  • the part of the video which is defined as the snapshot can be a single frame of the video.
  • a frame of the video is one of the many still images which compose the complete moving picture of the video.
  • a typical number of frames per second in digital videos is 30. However, higher or lower numbers can be found in current digital video cameras.
  • the snapshot may be defined as being a number of frames, which are associated with the point in time when the event is received. For example, all frames that relate to one second of the recording time of the video define the snapshot. In this case a snapshot does not relate to a single still picture but to a short moving picture sequence being composed of a couple of frames in a respective recording time interval, which is short enough to define the snapshot associated with the received event.
  • the video processor component 9400 can then store metadata associated with the snapshot, wherein the snapshot metadata includes data for later retrieval of a portion of the digital video, which has been marked through the snapshot.
  • the portion can correspond to a predefined time interval defining a certain period before and after the snapshot.
  • the metadata may include information about the elapsed recording time when the event was received.
  • the metadata could also include a frame identifier which relates to the one or more frames associated with the snapshot or any other information, which is appropriate to localize the snapshot in the recorded video.
  • the snapshot metadata can be stored in the digital storage means 9200 or in any other appropriate storage means, such as, for example, a cache memory (not shown) which is part of the video processor component itself.
  • the digital storage means 9200 can be provisioned as a remote resource on demand via a computer network, such as a private network or the Internet (cloud computing).
  • the storage means may be remotely located on a computer which is part of a cloud and can be accessed through an appropriate data interface (e.g., SERIAL-ATA, SCSI, SCA-2 or any other appropriate interface type) by the computer system 9000 online (e.g., via the Internet).
  • a cloud in this context is a group of computing devices with storage means, which can be flexibly allocated on demand to provide specific services.
  • the video processor component may use display means 9600 to display the video to a user either while the video is being recorded or after recording.
  • the display means itself, for example a TV set, may include video processing capabilities allowing it to directly retrieve the video data from the digital storage means for display.
  • the video processor component 9400 can insert a snapshot indicator into the digital video, wherein the snapshot indicator is configured to give feedback to a user about the existence of the snapshot while observing the recorded video.
  • the snapshot indicator can be added to the video data to be stored.
  • the data added to the video may be a sound indicator like a short ring or beep tone to convey to the observer of the recorded video that a snapshot was defined where the sound is audible during video replay.
  • a visual indicator may be added to the video data to indicate the snapshot to the observer.
  • the computer system 9000 further includes a video data retrieval component 9410.
  • the video data retrieval component 9410 is communicatively coupled to the video processor component 9400 but may also be implemented as an integrated component of the video processor component 9400 or of the display means 9600.
  • the video data retrieval component 9410 is configured to retrieve the portion of the digital video around the snapshot by using predefined time interval information.
  • the predefined time interval information and the snapshot metadata are used to calculate a start time of the portion before the snapshot and an end time of the portion after the snapshot.
  • the time interval information may be a predefined system parameter or it may be set by the user as a configuration parameter or it may automatically be determined by the system based on predefined rules. For example, if the video is recorded while driving a car, the portion of the video which is to be retrieved around the snapshot can depend on the speed at which the car was driving when the event for defining the snapshot was received or it may depend on location coordinates of the car . The portion may also depend on the brightness conditions which were sensed at the time the event was received or any other condition which is meaningful to determine the length of the portion. After the portion is retrieved from the stored digital video, the retrieved portion may be stored as a separate digital video on the digital storage means.
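  • A rule that derives the portion length from the vehicle speed, as described above, might be sketched as follows; the threshold and scaling values are purely illustrative assumptions, not taken from the patent.

```python
def portion_interval_s(speed_kmh, base_interval_s=20.0):
    """Illustrative rule: the faster the car was driving when the event
    was received, the longer the retained interval around the snapshot,
    so that a comparable stretch of road is covered.
    All numeric values here are assumptions."""
    if speed_kmh <= 50:
        return base_interval_s
    # add one extra second per 10 km/h above 50 km/h
    return base_interval_s + (speed_kmh - 50) / 10.0
```

Analogous rules could use location coordinates or brightness conditions instead of speed.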
  • the digital camera 9100 is communicatively coupled to the video processor component 9400, so that video data can be transmitted to the video processor component at sufficient bandwidth.
  • the coupling can occur through a video bus where the camera is integrated in a mobile device and the computer system 9000 is integrated in the same mobile device. Examples for such mobile devices are mobile phones (e.g., smartphones), notebooks, mobile navigation devices, laptops, camcorders, personal digital assistants and tablet computers.
  • the camera may be integrated in a vehicle.
  • the camera may be integrated in the rear mirror of a car or in the front lights to observe the traffic in front of the car.
  • the communicative coupling may be implemented through wireless connection protocols, such as for example the Bluetooth technology.
  • the camera may also be integrated into a helmet for observing sports activities or for other occasions like fire brigade actions, where helmets are used.
  • the sensor means 9300 is communicatively coupled to the video processing component 9400 and can include one or more sensors.
  • an acceleration sensor may be used alone or in combination with a speed sensor to raise a sensor event, which indicates a critical situation while driving a car.
  • if a relevant negative acceleration is measured, a combined sensor event may be launched indicating the critical situation.
  • the sensor means can also include sensors that receive direct user input for launching a sensor event. Such sensors may be shock sensors, gesture sensors, touch sensors, and audio sensors.
  • any combination of different sensor types and appropriate sensor event rules can be used.
  • a person skilled in the art knows how to combine sensors and define such rules.
  • a rule can be defined which requires an additional sensor signal combined with the sensed acceleration.
  • Such an additional sensor signal may result from a reduced speed signal (indicating a crash or at least a brake application) or it may result from an audio signal where the user confirms orally (e.g., by saying "now") that he/she intentionally hit the acceleration sensor.
  • Another example is a location sensor which provides information about the current location of the sensor.
  • Such a location sensor can be implemented with a Global Positioning System (GPS) sensor or, alternatively, by a mobile phone using GSM localisation, which is performed by multilateration based on the signal strength to nearby antenna masts.
  • the location coordinates delivered by the location sensor may then trigger a sensor event.
  • a sensor event can be triggered when the location sensor delivers location coordinates which are nearby the location coordinates of a predefined point of interest (e.g., a sight, a toll station, etc.)
  • a location sensor in a vehicle approaching the Golden Gate Bridge may trigger the event when the received GPS coordinates indicate that the Bridge is just in front of the vehicle.
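  • A point-of-interest trigger of this kind can be sketched as a proximity check against the predefined coordinates; the haversine great-circle distance and the 500 m radius are illustrative assumptions.

```python
import math


def near_poi(lat, lon, poi_lat, poi_lon, radius_m=500.0):
    """Sketch of a location-based trigger: return True (i.e., raise a
    sensor event) when the current GPS fix is within radius_m of a
    predefined point of interest. The radius is an assumption."""
    r_earth = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat), math.radians(poi_lat)
    dphi = math.radians(poi_lat - lat)
    dlmb = math.radians(poi_lon - lon)
    # haversine formula for the great-circle distance
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    dist_m = 2 * r_earth * math.asin(math.sqrt(a))
    return dist_m <= radius_m
```

In the Golden Gate Bridge example, the event would fire once successive GPS fixes fall inside the radius around the bridge's coordinates.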
  • the rules defining a sensor event can be handled outside of the computer system 9000.
  • the system may assume that every sensor event being received is supposed to define a snapshot, because filtering or aggregation of sensor events occurs, for example, in some sensor network infrastructure, as known by a person skilled in the art.
  • the video processor component may include a rule engine which is capable of identifying relevant sensor events and which triggers the definition of a snapshot only in case that one or more received sensor events fulfill a requirement of a certain snapshot definition triggering rule, such as for example the one described above.
  • Such rules may be formal descriptions of situations like the examples given above with regards to the speed and acceleration sensing.
  • the computer system 9000 may itself perform the sensor event filtering and aggregation that it needs to handle more complex combined sensor events for triggering a snapshot definition.
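  • A rule of the kind described above, requiring an acceleration event to be confirmed by a speed-drop or voice signal, could be sketched as follows; the event names and the confirmation window are assumptions of this sketch.

```python
def should_define_snapshot(events, window_s=2.0):
    """Sketch of a simple rule engine: an acceleration event alone does
    not trigger a snapshot; it must be confirmed within window_s seconds
    by a braking (speed-drop) event or an oral confirmation.
    `events` is a list of (kind, timestamp_s) tuples."""
    accel = [t for kind, t in events if kind == "acceleration"]
    confirm = [t for kind, t in events if kind in ("speed_drop", "voice_confirm")]
    # trigger only if some confirmation lies close enough to an acceleration
    return any(abs(tc - ta) <= window_s for ta in accel for tc in confirm)
```

More elaborate rule engines would express such conditions declaratively rather than hard-coding them.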
  • the storage means 9200 can be any commercially available storage means which is appropriate to store digital video data, such as hard disks, flash memories (e.g., SD cards, USB sticks), or optical discs (e.g., DVD, CD) .
  • the computer system 9000 can be part of the same hardware platform as the camera and/or sensor means.
  • the system 9000 can be hosted by a remote server computer where the video and sensor event data are received through a wireless communication technology.
  • the video and sensor event data can be broadcasted through an existing mobile telecommunications network with sufficient bandwidth.
  • Such telecommunication networks may be implemented according to the GSM or UMTS standard or any other standard allowing data transmission of the video data at sufficient bandwidth.
  • FIG. 2 is a simplified flow chart of a computer implemented method for marking a video which can be executed by a computer system as described under FIG. 1 .
  • the method starts by recording 4100 a digital video from a digital camera on digital storage means.
  • the computer system 9000 receives 4200 a sensor event, wherein the sensor event indicates a request to mark a portion of the digital video while the digital video is being recorded on the digital storage means.
  • a sensor event can be an acceleration event, indicating that respective acceleration sensor means sensed an acceleration exceeding a predefined threshold value.
  • the sensor event may also result from a user interaction with respective sensor means.
  • Examples for direct user interaction which may trigger sensor events, are: a user giving a spoken command to a microphone; a user touching a touchscreen of a smart phone or any other touch-screen enabled device (e.g., navigational system); a user pushing a device with a shock sensor; a user indicating a gesture to a gesture recognition sensor.
  • the computer system may be able to perform sensor event filtering and/or aggregation by using a rule engine as described above.
  • the video processor component 9400 defines 4300 at least one recorded video frame of the video as a snapshot.
  • the video processing component inserts 4350 a snapshot indicator into the digital video file on the storage means 9200, wherein the snapshot indicator is configured to give feedback to a user about the existence of the snapshot while observing the recorded video.
  • snapshot indicators may be audio or video signals inserted into the video recording.
  • Snapshot metadata associated with the snapshot are then stored 4400, wherein the snapshot metadata includes data to retrieve the portion of the digital video as explained under FIG. 1 .
  • the computer system 9000 may then retrieve 4500 the portion of the digital video from the stored video file by using predefined time interval information.
  • the predefined time interval information can be defined as described under FIG. 1 and the snapshot metadata is then used to calculate a start time of the portion before the snapshot and an end time of the portion after the snapshot.
  • the retrieved portion may be stored 4600 as a separate digital video on the digital storage means. It may also be sent to the display means 9600 after the retrieval without storing it in a separate video file.
  • the advantage of storing it in a separate video file is that only video portions associated with snapshots need to be kept, thus freeing up the memory which was occupied by the entire digital video before.
  • the recorded digital video can be deleted to free up memory after all relevant snapshot associated portions have been extracted and are stored as smaller separate digital video files.
  • the computer system 9000 can perform this memory clean-up while the recording is running.
  • the images which were recorded outside the predefined time intervals around each respective snapshot are automatically deleted by the system.
  • the deletion can be triggered by the sensor event that causes the storage of the snapshot metadata but it also can be triggered after a predefined recording time when no relevant sensor events were received during the recording. This frees up additional memory while recording is running and further results in increasing the maximum recording time in case limited data storage means is used for recording.
  • a further advantage of this embodiment is that right after the termination of the recording the final video only includes video scenes which are relevant to the user. This avoids tedious search activities of the user after the end of the recording to find and retrieve a certain video scene.
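  • The clean-up described above can be sketched as computing the set of frame indices to retain around each snapshot; frames outside this set may be deleted while recording continues. Centring the interval on the snapshot is an assumption of this sketch.

```python
def frames_to_keep(n_frames, fps, snapshot_times_s, interval_s):
    """Sketch of the memory clean-up: return the frame indices that lie
    inside the predefined interval around each snapshot (centred on it);
    all other frames can be deleted to free up memory."""
    keep = set()
    half = interval_s / 2.0
    for t in snapshot_times_s:
        first = max(0, int((t - half) * fps))
        last = min(n_frames - 1, int((t + half) * fps))
        keep.update(range(first, last + 1))
    return keep
```

Deletion can then run either on each sensor event or after a predefined recording time with no relevant events.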
  • if the system 9000 receives 4700 a further sensor event after the first snapshot metadata have been stored, and the further sensor event indicates a further request to mark a further portion of the digital video while the digital video is being recorded, then the system defines 4710 at least one further recorded video frame of the video as a further snapshot.
  • the system may then perform a test in which the time distance between the snapshot and the further snapshot is determined. The system may compare the calculated time distance with a predefined snapshot overlapping threshold ST, and store 4730 the further snapshot metadata, including data to retrieve the further portion of the digital video associated with the further snapshot, only if the time distance is above the snapshot overlapping threshold ST.
  • otherwise, the system may simply wait for a new sensor event.
  • sensor events which are so close in time that they would fall into the video portion associated with the previously stored snapshot metadata are thus not stored, as the information is redundant and the relevant scene is already included in the video portion for the previous snapshot. This behavior can save memory and improve system performance, because system operations which would not add useful information for the user are avoided.
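  • The overlap check can be sketched as follows; comparing only against the most recently stored snapshot is an assumption of this sketch.

```python
def accept_snapshot(new_time_s, stored_times_s, overlap_threshold_s):
    """Sketch of the threshold test: a further snapshot is only stored
    if its time distance to the last stored snapshot exceeds the
    snapshot overlapping threshold ST; otherwise the relevant scene is
    already covered by the previous snapshot's portion."""
    if not stored_times_s:
        return True  # first snapshot is always accepted
    return (new_time_s - stored_times_s[-1]) > overlap_threshold_s
```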
  • FIG. 3 is a simplified flow chart of a further embodiment of the computer implemented method 3000 and shows, in combination with FIGs. 4 to 7, one example of snapshot handling.
  • the video processing component 9400 starts the continuous video recording 3300 and further creates 3200 a snapshot manager object 9450 (cf. FIG. 4). This occurs at t0 in the time dimension 1000 (cf. FIG. 4).
  • the snapshot manager object is initialized 100 at t0 by creating a data record 420 indicating the start time of the video.
  • additional metadata can be stored with the data record 420, such as for example, speed or Global Positioning System (GPS) coordinates.
  • a first sensor event is received 3310, which triggers a first updating 3210 of the snapshot manager object 9450.
  • a first snapshot is defined and the snapshot metadata ss1 associated with the snapshot are added 101 as a second record 421 to the snapshot manager object.
  • the snapshot metadata ss1 may have the same data structure as the metadata stored with the initialization data record 420 for the start time.
  • a snapshot indicator may be added to the recorded video data 9210.
  • a second sensor event is received 3320 by the video processing component.
  • the system may now check 3220, whether the time distance between the time of the first snapshot ss1 and the time of the second event is above the predefined snapshot overlap threshold ST. If this is the case, the snapshot manager object will be updated 3230 a second time with the snapshot metadata record ss2 associated with the snapshot defined by the second sensor event. In other words, a third data record 422, including data to retrieve the portion of the digital video which relates to the second event, is added to the snapshot manager object. The system will then wait for the receipt 3330 of the next further event and repeat the previous steps.
  • the snapshot manager object may be persistently stored as a separate data file when the video recording is ended. In an alternative embodiment, the snapshot manager object may be persistently stored as a separate data file already while the video is still being recorded.
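  • The snapshot manager object of FIG. 4 can be sketched as follows; the record layout and the JSON persistence format are assumptions of this sketch, not taken from the patent.

```python
import json


class SnapshotManager:
    """Sketch of the snapshot manager object 9450: data record 420 holds
    the recording start time (plus optional metadata such as speed or
    GPS coordinates); records 421, 422, ... hold the snapshot metadata
    ss1, ss2, ...; the object can be persisted as a separate data file."""

    def __init__(self, start_time_s, extra=None):
        # data record 420: recording start plus optional metadata
        self.records = [{"type": "start", "t": start_time_s, **(extra or {})}]

    def add_snapshot(self, event_time_s, extra=None):
        # records 421, 422, ...: one per accepted sensor event
        self.records.append({"type": "snapshot", "t": event_time_s, **(extra or {})})

    def persist(self, path):
        # store the manager object as a separate file (JSON is assumed)
        with open(path, "w") as f:
            json.dump(self.records, f)
```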
  • FIG. 5 further illustrates the handling of multiple sensor events according to one embodiment of the invention.
  • the portions 9221, 9222 of the recorded video 9220 can later be retrieved by the video data retrieval component 9410 by looking up the snapshot metadata in the stored snapshot manager object file and calculating the respective portion start times t_ps1, t_ps2 and portion end times t_pe1, t_pe2 by using the predefined time interval information.
  • the time interval information may be in a format like (Δt, %), where Δt is the length of the time interval and % indicates the percentage of Δt that lies before the snapshot and thereby defines the portion start time. Any other format can be used which is appropriate to calculate the start and end times of the portions. There are commercial solutions available to cut the video according to the calculated start and end times of each respective portion and to store the corresponding video files as separate videos.
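  • With the (Δt, %) format, the portion start and end times can be computed as follows; clamping the start time at the beginning of the recording, for snapshots close to the start, is an added assumption.

```python
def portion_bounds(snapshot_time_s, dt_s, pct_before):
    """Compute the portion start and end times from the (Δt, %) format:
    dt_s is the total portion length Δt, and pct_before the percentage
    of Δt that lies before the snapshot. The start is clamped at 0 for
    snapshots near the beginning of the recording (an assumption)."""
    start = max(0.0, snapshot_time_s - dt_s * pct_before / 100.0)
    end = snapshot_time_s + dt_s * (100.0 - pct_before) / 100.0
    return start, end
```

For example, (Δt = 20 s, 50 %) centres a 20-second portion on the snapshot.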
  • FIG. 6 and FIG. 7 show alternative options for handling multiple sensor events according to further embodiments of the invention.
  • the threshold check 3220 may result in a time distance between the time of the first snapshot ss1 and the time of the second event which is below the predefined snapshot overlap threshold ST.
  • FIG. 6 shows one embodiment, where a further sensor event is received before the end of a time period that corresponds to the snapshot overlap threshold ST.
  • the further sensor event will not trigger a snapshot manager update because it is received at a point in time where at least a substantial part of the corresponding video portion is already included in the video portion 9221 associated with the first snapshot ss1. This saves memory space in the snapshot manager object and avoids an updating step, without a substantial loss of video information with regards to the retrievable portion 9221.
  • FIG. 7 shows how to handle such an overlap scenario where a loss of information is not acceptable or where, for example, the further sensor event occurs only very shortly (e.g., less than 10% of ST) before t_pe1.
  • the system still defines a new snapshot ssf at the time t_ssf when the further event is received.
  • a corresponding data record is added to the snapshot manager object and a corresponding snapshot indicator may be inserted in the recorded video data 9220.
  • FIG. 8 shows a class diagram for decoupling video recording from camera preview.
  • the recording 4100 of the digital video may be performed as a background process of computer system's operating system.
  • smartphones running on the Android operating system normally run the video recording as a foreground process.
  • Video recording is typically interrupted in the following examples: a user stops the recording; a telephone call comes in; the user is performing a navigation operation in the main menu of the smartphone; the user pushes the back button; or the user turns off the display.
  • the video recording requires the display of the camera preview in the foreground and stops immediately when the camera preview in the foreground is terminated.
  • in the solution described here, the dependency between the video recording task and the camera preview function is dissolved.
  • a service is started, which includes all relevant objects and information for the video recording.
  • the service uses the required resources, such as, for example, the camera and the microphone of the smartphone.
  • the camera preview can now run as a background process and there is no longer any need to show the camera preview in the foreground to continue the recording in case any of the above interruptions occurs.
  • the video recording uses an instance of the class MediaRecorder of the Android class library. Coding block 1 shows a coding example for implementing such a background-process-based video recording solution on the Android operating system.
  • Embodiments of the invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
  • the invention can be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
  • a computer program such as the computer program of claim 9, can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
  • Method steps of the invention can be performed by one or more programmable processors executing a computer program to perform functions of the invention by operating on input data and generating output. Method steps can also be performed by, and apparatus of the invention can be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computing device.
  • a processor will receive instructions and data from a read-only memory or a random access memory or both.
  • the essential elements of a computer are at least one processor for executing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Such storage devices may also be provisioned on demand and be accessible through the Internet (cloud computing).
  • Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in special purpose logic circuitry.
  • the invention can be implemented on a computer having a display device, e.g., a cathode ray tube (CRT) or liquid crystal display (LCD) monitor, for displaying information to the user, and an input device, such as a keyboard, touchscreen, or touchpad, or a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • the invention can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of such back-end, middleware, or front-end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network.
  • Examples of communication networks include a local area network (LAN) and a wide area network (WAN), e.g., the Internet or wireless LAN or telecommunication networks.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
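The Coding block 1 referenced in the bullets above is not reproduced in this extraction. As an illustrative sketch only (the class name, output path, and encoder choices are assumptions, not the patent's actual code), a Service that owns the MediaRecorder so that recording survives when the camera preview leaves the foreground could look like this:

```java
import android.app.Service;
import android.content.Intent;
import android.media.MediaRecorder;
import android.os.IBinder;

// Illustrative sketch, not the patent's Coding block 1: the Service holds the
// MediaRecorder instance, so the recording continues when the camera preview
// activity is no longer in the foreground.
public class RecordingService extends Service {
    private MediaRecorder recorder;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(getFilesDir() + "/recording.mp4"); // illustrative path
        try {
            recorder.prepare();
            recorder.start();      // keeps recording for the lifetime of the service
        } catch (java.io.IOException e) {
            stopSelf();            // abort the service if the recorder cannot start
        }
        return START_STICKY;       // ask the system to restart the service if killed
    }

    @Override
    public void onDestroy() {
        if (recorder != null) {
            recorder.stop();
            recorder.release();
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;               // started service, not a bound service
    }
}
```

In a real application the service would additionally need the CAMERA and RECORD_AUDIO permissions, and on recent Android versions it would have to run as a foreground service; the sketch omits these details.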

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention relates to a computer-implemented method, a system, and a computer program product for marking a video, comprising a video processing component for recording a digital video from a digital camera onto a digital storage means; receiving a sensor event, the sensor event indicating a request to mark a portion of the digital video while the digital video is being recorded onto the digital storage means; in response to receiving the sensor event, defining at least one recorded video frame of the video as a snapshot; and storing snapshot metadata associated with the snapshot, the snapshot metadata comprising data for retrieving the portion of the digital video.
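The marking step described in the abstract (receive a sensor event while recording, define a recorded frame as a snapshot, store metadata for retrieving that part of the video) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class names and the frame-rate-based frame index computation are assumptions:

```java
import java.util.ArrayList;
import java.util.List;

// Snapshot metadata: enough information to retrieve the marked part of the video.
class SnapshotMetadata {
    final String videoFile;  // digital storage means holding the video
    final long frameIndex;   // recorded video frame defined as the snapshot
    final long timestampMs;  // position of the marked part within the video

    SnapshotMetadata(String videoFile, long frameIndex, long timestampMs) {
        this.videoFile = videoFile;
        this.frameIndex = frameIndex;
        this.timestampMs = timestampMs;
    }
}

// Receives sensor events while the video is being recorded and stores
// snapshot metadata for each marking request.
class VideoMarker {
    private final String videoFile;
    private final double frameRate;  // frames per second of the recording
    private final List<SnapshotMetadata> snapshots = new ArrayList<>();

    VideoMarker(String videoFile, double frameRate) {
        this.videoFile = videoFile;
        this.frameRate = frameRate;
    }

    // Called when a sensor event requests marking at the given recording time.
    SnapshotMetadata onSensorEvent(long elapsedMs) {
        long frameIndex = Math.round(elapsedMs / 1000.0 * frameRate);
        SnapshotMetadata meta = new SnapshotMetadata(videoFile, frameIndex, elapsedMs);
        snapshots.add(meta);  // a real system would persist this alongside the video
        return meta;
    }

    List<SnapshotMetadata> getSnapshots() {
        return snapshots;
    }
}
```

For example, a sensor event 2 seconds into a 30 fps recording maps to frame index 60, and the stored metadata is sufficient to seek to and retrieve that part of the video later.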
PCT/EP2011/056596 2011-04-26 2011-04-26 Procédé et système pour l'insertion d'un marqueur vidéo WO2012146273A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/056596 WO2012146273A1 (fr) 2011-04-26 2011-04-26 Procédé et système pour l'insertion d'un marqueur vidéo

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/056596 WO2012146273A1 (fr) 2011-04-26 2011-04-26 Procédé et système pour l'insertion d'un marqueur vidéo

Publications (1)

Publication Number Publication Date
WO2012146273A1 true WO2012146273A1 (fr) 2012-11-01

Family

ID=44202084

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/056596 WO2012146273A1 (fr) 2011-04-26 2011-04-26 Procédé et système pour l'insertion d'un marqueur vidéo

Country Status (1)

Country Link
WO (1) WO2012146273A1 (fr)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6133947A (en) * 1995-11-15 2000-10-17 Casio Computer Co., Ltd. Image processing system capable of displaying photographed image in combination with relevant map image
US6359643B1 (en) * 1998-08-31 2002-03-19 Intel Corporation Method and apparatus for signaling a still image capture during video capture
US20030032447A1 (en) * 2001-08-10 2003-02-13 Koninklijke Philips Electronics N.V. Conversation rewind
US6720998B1 (en) * 1998-12-26 2004-04-13 Lg Semicon Co., Ltd. Device for managing snap shot in USB camera and method therefor
WO2006081053A2 (fr) * 2005-01-24 2006-08-03 Moderator Systems, Inc. Systeme d'authentification d'evenements sans fil
WO2008007878A1 (fr) * 2006-07-10 2008-01-17 Ubtechnology Co., Ltd Système de boîte noire pour véhicule
US20080136940A1 (en) * 2006-12-06 2008-06-12 Samsung Electronics Co., Ltd. Method and apparatus for automatic image management
US20090309989A1 (en) * 2006-06-30 2009-12-17 Nikon Corporation Camera capable of taking movie
US20100020221A1 (en) * 2008-07-24 2010-01-28 David John Tupman Camera Interface in a Portable Handheld Electronic Device
US20100152949A1 (en) * 2008-12-15 2010-06-17 Delphi Technologies, Inc. Vehicle event recording system and method
US20100171829A1 (en) * 2007-09-28 2010-07-08 Mariko Yago Drive recorder
US20100321533A1 (en) * 2009-06-23 2010-12-23 Samsung Electronics Co., Ltd Image photographing apparatus and method of controlling the same

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9374477B2 (en) 2014-03-05 2016-06-21 Polar Electro Oy Wrist computer wireless communication and event detection
US9936084B2 (en) 2014-03-05 2018-04-03 Polar Electro Oy Wrist computer wireless communication and event detection
EP2916250B1 (fr) * 2014-03-05 2018-05-16 Polar Electro Oy Communication sans fil d'ordinateur de poignet et détection d'événement
EP3091729A4 (fr) * 2014-07-29 2017-03-15 Panasonic Intellectual Property Management Co., Ltd. Dispositif d'imagerie
US11388338B2 (en) * 2020-04-24 2022-07-12 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Video processing for vehicle ride
US11396299B2 (en) * 2020-04-24 2022-07-26 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Video processing for vehicle ride incorporating biometric data
CN113660517A (zh) * 2021-08-17 2021-11-16 浙江大华技术股份有限公司 视频文件的存储方法及装置、存储介质、电子装置

Similar Documents

Publication Publication Date Title
US20190371036A1 (en) Digital media editing
EP3232343A1 (fr) Procédé et appareil de gestion de données vidéo, terminal et serveur
WO2017211206A1 (fr) Procédé et dispositif de marquage vidéo, et procédé et système de surveillance vidéo
EP2860968B1 (fr) Dispositif de traitement de données, procédé pour le traitement de données, et programme
US20150094118A1 (en) Mobile device edge view display insert
US10802811B2 (en) Information processing device, information processing method, computer program, and server device
WO2022042389A1 (fr) Procédé et appareil d'affichage de résultat de recherche, support lisible et dispositif électronique
EP3461136B1 (fr) Procédé et dispositif de lecture de vidéo
WO2012146273A1 (fr) Procédé et système pour l'insertion d'un marqueur vidéo
CN107305561B (zh) 图像的处理方法、装置、设备及用户界面系统
US20130222154A1 (en) System and method for providing traffic notifications
JP6732677B2 (ja) 動画収集システム、動画収集装置、および動画収集方法
US20210073278A1 (en) Providing Access to Videos Generated from a Vehicle Camera System
CN102967315A (zh) 一种导航地图完善方法和装置
CN107845161B (zh) 一种获取飞行记录的方法和装置
CN110991260B (zh) 场景标注方法、装置、设备及存储介质
US10013623B2 (en) System and method for determining the position of an object displaying media content
CN109492163B (zh) 一种列表展示的记录方法、装置、终端设备及存储介质
CN114241415A (zh) 车辆的位置监控方法、边缘计算设备、监控设备及系统
CN110543347A (zh) 生成截屏图像的方法、装置以及电子设备
US11182959B1 (en) Method and system for providing web content in virtual reality environment
CN112116826A (zh) 生成信息的方法和装置
US20170200465A1 (en) Location-specific audio capture and correspondence to a video file
CN114025116B (zh) 视频生成方法、装置、可读介质和电子设备
CN117135341A (zh) 图像处理的方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11721252

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11721252

Country of ref document: EP

Kind code of ref document: A1