WO2019110755A1 - Apparatus, system and method for capture of buffered images - Google Patents


Info

Publication number
WO2019110755A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
user
computational device
sensor
predetermined period
Prior art date
Application number
PCT/EP2018/083861
Other languages
French (fr)
Inventor
Sjoerd PITSTRA
Joost Godee
Original Assignee
Roader Media Bv
Priority date
Filing date
Publication date
Application filed by Roader Media Bv filed Critical Roader Media Bv
Publication of WO2019110755A1 publication Critical patent/WO2019110755A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure

Definitions

  • any device featuring a data processor and the ability to execute one or more instructions may be described as a computer, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a "network” or a "computer network”.
  • Figure 1 illustrates exemplary processing operations associated with the image recording and distribution according to at least some embodiments of the present invention
  • Figures 2A and 2B illustrate exemplary systems according to at least some embodiments of the present invention
  • Figures 3A and 3B illustrate various exemplary configurations of user computational devices and image capture devices according to at least some embodiments of the present invention
  • Figure 4 shows a non-limiting exemplary detailed flow for capturing images with an image capturing device 402 followed by communication to a user computational device 410;
  • Figure 5 shows a similar implementation as that described with regard to figure 4, except in this case the image capturing device is contained within user computational device 501 as an image capturing device 520;
  • Figure 6 shows another schematic implementation featuring a standalone camera
  • Figures 7A and 7B show various exemplary configurations of a board for such a standalone camera;
  • Figures 8 and 9 show a non-limiting exemplary charging wearable cable;
  • Figure 10 shows the function of the charging wearable cable or lanyard
  • Figure 11 shows the charging wearable cable function in car mode
  • Figure 12 shows a non-limiting exemplary clip, according to at least some embodiments of the present invention.
  • Figure 13 shows a non-limiting example of a light interface camera.
  • Figure 1 illustrates exemplary processing operations associated with the image recording and distribution according to at least some embodiments of the present invention.
  • An image sensor 100 detects and conveys the information that constitutes an image.
  • the image is stored in the RAM (DDR) memory 101 for a maximum period of 300 seconds, although the possible period of time is variable, for example from 1 second to 300 seconds or any integral value in between.
  • the oldest image frame is deleted from the RAM and the latest frame is added to RAM memory.
  • a maximum of less than 300 seconds is used. If the buffering time is flexible in this alternative embodiment, the user may be allowed to specify any integral value between 1 sec and this lower maximum.
  • upon a trigger 102, images spanning a length of time in seconds, or a number of image frames, predefined by a user setting, are captured from the RAM memory and stored as one or multiple files in Flash memory 103.
  • the length of time in seconds back in time or number of image frames before the trigger in 102 is variable and adjustable 104.
  • the trigger 102 may be caused by a manual press on a button, by use of a touch interface, by voice activation or by the detection of an event, for example.
  • Such an event may be detected in input from one or more of an accelerometer, a GPS sensor, a G-sensor, light-sensor, heart-rate sensor, an image capturing device (using image analysis) and a microphone (using audio analysis).
  • An event may be detected when an apparatus is detected to be at a certain location, when a certain person is detected using the image capturing device, when a user’s heart rate is below a first threshold and/or above a second threshold, or when certain sound, e.g. a gunshot, is detected.
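As a minimal sketch of one such rule, the heart-rate condition above (trigger when the rate falls below a first threshold and/or rises above a second threshold) can be written as a simple predicate. The threshold values here are illustrative assumptions only, not values from the specification:

```python
def heart_rate_event(bpm, low_threshold=50, high_threshold=160):
    """Return True when the measured heart rate should raise the trigger.

    Fires when the rate drops below the first threshold or rises above
    the second one; both default thresholds are hypothetical placeholders.
    """
    return bpm < low_threshold or bpm > high_threshold
```

In a device, a predicate like this would be evaluated on each sensor reading, and a True result would raise the trigger 102.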
  • the trigger 102 may be created internally or may be a remote trigger.
  • the length of time in seconds or number of image frames after the trigger in 102 is variable and adjustable 105.
  • an example of 10 seconds is shown but the length of time can be any number of seconds, preferably up to 300 seconds, as set by the user.
  • Two different file sizes or resolutions can be stored simultaneously as two separate files 103. File sizes can vary in number of pixels and/or frames per second.
  • When the connection is established 107, a file as recorded in 103 is automatically transferred/transmitted/uploaded to the User Computational Device 108, described herein as being provided to user computational device 108. The file will be visible in the User Computational Device 108. Through operation of User Computational Device 108, the file can then be modified and provided to an external server and/or shared using social media platforms 109.
  • FIG. 2A shows a non-limiting exemplary implementation of a system, 200, featuring the user computational device 202, with an image capturing device 204, shown in this case as connected to, or implemented within, user computational device 202.
  • image capturing device 204 may also be in communication with user computational device 202, or connected to user computational device 202, externally rather than being connected to or implemented within, as is the case in system 201.
  • Figure 2B shows a non-limiting exemplary system 210, in which user computational device 202 is in separate communication with image capturing device 204, which may optionally be wired or wireless. In both cases, user computational device 202 is preferably in communication with a server through computer network 212.
  • Computer network 212 is shown in figures 2A and 2B.
  • Server 207 in figure 2A, and server 208 in figure 2B, optionally and preferably store images sent from, or through, user computational device 202, for example for storage, or for forwarding or transmitting to other platforms, and the like.
  • server 207, or server 208 may also feature software components 213, which may optionally perform one or more image processing techniques or methods.
  • Figure 3A shows an exemplary, illustrative user computational device 300, in a system 310, in which user computational device 300 contains image capturing device 312.
  • image capturing device 312 may alternatively be external to user computational device 300, and in communication with it.
  • User computational device 300 is in communication with a social media server 301, and preferably also with a server 302.
  • Server 302 optionally functions as described with regard to servers 207 and 208 of figures 2A and 2B.
  • Social media server 301 may optionally be any type of social media server, including, but not limited to Facebook, Twitter, Instagram, Snap, and the like.
  • Preferably user computational device 300 communicates with social media server 301, and server 302, through a computer network 314.
  • Social media server 301 is also preferably in communication with server 302 through such a computer network 314.
  • Server 302 can optionally process images according to user or operator settings; modify images, add filters or perform image recognition for objects, faces, numbers and codes (such as license plates); store the location of recording; post on the user's behalf to a social media account on server 301; and store the images, for example for other uses.
  • Figure 3B shows a similar implementation to figure 3A, except that image capturing device 320 is external to user computational device 321. Otherwise, the functions of social media server 325 and server 326 are similar to those as described with regard to social media server 301 and server 302 of figure 3A.
  • Figure 4 shows a non-limiting exemplary detailed flow for capturing images with an image capturing device 402 followed by communication to a user computational device 410.
  • a device which may optionally be any type of image capturing device.
  • the image sensor 401 captures images, which are provided through an image capturing device 402 and then provided through image buffering 403 to a Buffer memory 404.
  • buffer memory 404 is a temporary memory or storage, for holding the images for a predetermined period of time before the images are flushed.
  • if an image storage activation event occurs, shown as storage activation event 405, then the images are held for a second predetermined period of time after event 405 before being flushed.
  • the first and second predetermined periods of time are the same but alternatively they may be different.
  • such a period of time may optionally comprise -10 seconds before and +10 seconds after event 405, but can be any number above -300 and below 300. That is, the period of time may optionally run from 1 second to 300 seconds, or any integral number in between, for each of the first and second predetermined periods of time.
  • although the time is given as negative, such as -10 seconds for example, in regard to the operation of buffer memory 404 for storage there is preferably no change before and after event 405.
  • event 405 preferably determines whether the images are stored permanently and for how long of a period of time the captured images are so transferred to permanent storage. This enables buffering to occur as described with regard to figure 1, so that preferably there is always a plurality of images captured for a certain time period.
  • the images are captured as single static images, which may optionally be separated by a certain period of time, and alternatively are captured as video (that is, as a plurality of images which when displayed in rapid sequence, can be displayed as video data).
  • each such image to be stored is preferably transferred to an image storage process 406, for storage in a permanent memory 407, shown in a non-limiting example as Flash memory 407.
  • a wireless communication 409 or optionally and alternatively a wired communication (not shown), sends the images through a process 408 to a user computational device 410.
  • the images may continue to be stored in permanent memory 407 or they may alternatively be flushed from permanent memory 407.
  • the image capturing device is external to user computational device 410, but as previously described it may also be embedded with, integrated with, or internal to user computational device 410.
  • Figure 5 shows a similar implementation as that described with regard to figure 4, except in this case the image capturing device is contained within user computational device 501 as an image capturing device 520. Images are captured or otherwise brought to user computational device 501 by image capturing device 520. Image editing is performed in stage 502, and image sharing in stage 503. Image sharing in stage 503 may optionally be performed with regard to sharing to a social media platform 504. Image sharing may also optionally be provided to a server 505 for storage and/or processing of images by the system provider, a server 506 for storage and/or processing of images as set by the user or a server 507 for storage and/or processing of images by a government agency, such as an emergency or law enforcement agency.
  • Figure 6 shows another schematic implementation featuring a standalone camera 601.
  • the video is buffered 602 in the camera 601.
  • the camera 601 is capable of wireless communication with a user computational device 604.
  • User computational device 604 features a screen 605 for displaying the images of the video received from the camera 601, and one or more buttons 606, for manipulating the image and/or in-app sharing of the video. Optionally all of these are implemented as software through device 604.
  • the information may be provided to a standalone computer, featuring a display 607, and one or more capabilities to enrich or adjust the content 608, e.g. through a social platform.
  • Camera 601, when on, captures and buffers video.
  • when a button is pushed, or upon receipt of a different internal or external trigger (e.g. based on input from a G-sensor, GPS sensor, heart-rate sensor, light-sensor, image input and/or sound input), the camera 601 will be activated.
  • the camera 601 will now write a video clip with a length of XX sec. before the camera 601 is activated and XX sec. after the camera 601 is activated, to flash memory.
  • This video is directly sent wirelessly to the smartphone (mobile device) 604.
  • Smartphone 604 receives the video clip and opens it in the Roader app. The video clip can be reviewed in the app and instantly shared to various social platforms.
  • Figure 7A shows the board front side for the camera as a standalone device, showing a main CPU 701, lens and image sensor 703 and an LED ring 705.
  • the LED ring 705 indicates when information is being captured and/or when it is being stored, which may be triggered by a user pressing a button (not shown), for example.
  • Figure 7B shows the board back side, showing a USB connector 711 to accept accessories and micro switches 713 for inputs, e.g. to allow pressing of the button to cause the images to be stored beyond the buffer time.
  • Preferably four such buttons are provided for the mechanical operation of the device.
  • FIG. 8 shows a non-limiting exemplary charging wearable cable.
  • the process for using a wearable cable is provided in stage 801. Both ends are plugged into the camera bridge as a detachable accessory, and the cable functions as a lanyard in stage 802. One end of the cable is then unplugged and is plugged into a charging device in stage 803.
  • one end of the cable is unplugged and is then plugged into a computer to transfer data in stage 804, which may also be used for charging.
  • the lanyard cable is multi-directional when plugged into a car holder or other charger. The car holder sends a signal to the camera to change recording mode to car mode.
  • the lanyard cable can also function as an extra battery with a supplied power-bank connected to the cable. The signal to the CPU that is needed to switch the recording mode from wearable to car mode is preferably made by making a connection between two out of the 24 USB-C pins. By sensing this bridge connection the software detects the required change.
  • the bridge ports are connected to the connector in the accessory part.
  • FIG. 9 shows the charging wearable cable lanyard function as a detachable connector through a USB bridge 901.
  • the camera device in this case, is shown as featuring a detachable accessory, which is the USB bridge 901.
  • the USB bridge 901 features a plurality of connectors 903 and 905, two connectors of which are suitable for connecting to the lanyard, and one connector of which is suitable for connecting to the camera itself.
  • Figure 10 shows the function of the charging wearable cable and lanyard 1000.
  • the electrical cable and lanyard 1000 is shown connected to both ends of the connection bridge 1002, that is to both connectors 1001.
  • the lens 1003 of the camera 1004 is shown, as is the camera 1004 itself.
  • while the lanyard 1000 is acting as a charging accessory, it may also optionally include a further charging capability to recharge the camera 1004.
  • one end of the cable is unplugged from the connector on the connection bridge, and is connected to a charger 1005, which is then connected to an electrical power source.
  • Figure 10-3 shows the power-bank 1021, which may optionally be incorporated in the lanyard, and then Figure 10-3A shows a different implementation of the power-bank, power-bank and length adjuster 1007, featuring a length adjustment in which the cable passes through the power-bank so as to be able to adjust the length of the lanyard, and also optionally to keep the power-bank at the back of the user's neck and out of the way.
  • Figure 11 shows the charging wearable cable function in car mode.
  • the camera may optionally be connected to a car holder 1101, or other devices, to keep it from moving around.
  • One end of the cable is connected to a connector on the USB bridge, while the other one is connected to a car adaptor 1103 for charging.
  • Figure 11-1b shows another implementation in which the car holder 1101 allows the camera to be connected upside down, for example, to a dashboard or other holder.
  • the car holder 1101 may be a camera mount or part of a camera mount, for example.
  • a connection may be made between two pins, e.g. two unused pins of 24 available USB-C pins. These pins may be unused due to the camera transferring data in USB 2.0 mode instead of USB 3.0/3.1 mode, for example.
  • the camera can then send a signal over one of these two pins/wires and if it receives this signal back over the other of the two pins/wires, then the software knows that the camera is used in a car and can switch to a car mode instead of to a wearable mode.
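The loopback check described here can be sketched as follows. The `write_pin` and `read_pin` callables are hypothetical stand-ins for whatever GPIO interface the camera's firmware actually exposes, which the text does not specify:

```python
def detect_car_mode(write_pin, read_pin):
    """Drive a test pattern onto one of the two bridged USB-C pins and
    check whether it comes back on the other; a matching pattern means
    the pins are bridged (car holder present), so car mode is selected.
    """
    for level in (1, 0, 1):      # a short pattern guards against a line stuck high
        write_pin(level)
        if read_pin() != level:
            return False         # no bridge detected: stay in wearable mode
    return True                  # bridge detected: switch to car mode
```

For example, a bridged pair can be simulated with a shared variable that `read_pin` returns whatever `write_pin` last wrote.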
  • the car mode like in the wearable mode, upon receipt of an internal or external trigger, X seconds of video before and Y seconds of video after receipt of the trigger are transmitted to another device, e.g. a smartphone.
  • the camera may continuously store files of 5 to 10 minutes in permanent storage up to a maximum amount or size. Older files are overwritten by newer files, but the space allocated to storing these files is larger than the space allocated to the buffer of X seconds.
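This rolling permanent store can be sketched as below; the capacity accounting in bytes is an illustrative assumption, since the text only says older files are overwritten up to a maximum amount or size:

```python
from collections import OrderedDict

class RollingStore:
    """Keeps recorded files up to a maximum total size; when a new file
    does not fit, the oldest files are dropped (on the device, overwritten).
    """

    def __init__(self, max_total_bytes):
        self.max_total_bytes = max_total_bytes
        self.files = OrderedDict()   # filename -> size in bytes, oldest first

    def add(self, name, size):
        # Evict oldest files until the new one fits within the budget.
        while self.files and sum(self.files.values()) + size > self.max_total_bytes:
            self.files.popitem(last=False)
        self.files[name] = size
```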
  • when the video segment is transmitted to the other device, e.g. the smartphone, and the user watches the video segment on this other device, he is offered the possibility to access the camera to view the one or more video files that include at least a part of this video segment, and possibly video files occurring before or after the events captured in the video segment.
  • the connection between the two pins can alternatively be made somewhere other than in the car holder 1101, e.g. in a small converter device connected to the car holder 1101.
  • FIG. 12 shows a non-limiting exemplary clip 1201, according to at least some embodiments of the present invention.
  • the clip 1201 is a detachable accessory like the USB bridge 1203. It is alternatively incorporated within the USB bridge or is connected to the USB bridge.
  • the USB bridge 1203 may be replaced by a bike mounting accessory, for example.
  • a magnetic connection 1205 is provided to connect the clip 1201 to the camera 1202. In an alternative embodiment, the magnetic connection may be used to connect the clip to the USB bridge instead of to the camera.
  • the clip 1201 is disconnected from the camera 1202.
  • the two are shown connected.
  • Figure 13 shows a non-limiting example of a camera with a light interface.
  • the user pushes and holds, e.g. a (power) button, for five seconds while the camera is on.
  • the white light 1311 shows a very slow pulse, e.g. with a one second interval.
  • the user pushes, e.g. this button, for three seconds to make a connection.
  • the white light 1311 slowly pulses, e.g.
  • the user presses, e.g. a front housing touch area, one time to record, e.g. X seconds before pressing, e.g. the touch area, and X seconds after pressing, e.g. the touch area.
  • the white light 1311 slowly rotates 1313 with counter-clockwise motion, or alternatively clockwise motion, in a predetermined interval. Such an interval may optionally comprise a 0.3 second interval.
  • audio feedback is provided at the beginning and end of the recording.
  • the user can push, e.g. the touch area, again to cause the recording to continue for an additional period of time, such as for example 10 seconds.
  • the white light 1311 slowly rotates 1313 again to show that the camera is still recording.
  • it may be pushed, e.g. touched, yet again to add another additional period, e.g.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An apparatus for temporary storage of a plurality of images comprises an image capture device for capturing the plurality of images, a temporary storage for receiving the plurality of images for a predetermined period of time, a permanent storage, and a communication module for transfer of the images to a separate computational device. The plurality of images is captured by the image capture device for the predetermined period of time before the signal is received. The plurality of images is captured by the image capture device for a second predetermined period of time after the signal is received. The stored images in the temporary storage are transferred to the permanent storage upon receipt of a signal and transferred from the permanent storage to the separate computational device upon said transfer to the permanent storage.

Description

APPARATUS, SYSTEM AND METHOD FOR CAPTURE OF BUFFERED IMAGES
FIELD OF THE INVENTION
The present invention is of an apparatus, system and method for capture of buffered images, and in particular, for such an apparatus, system and method for continuous temporary storage of a plurality of images.
BACKGROUND OF THE INVENTION
Users frequently carry digital cameras with them, as separate devices or integrated into cellular telephones. However, unless they continuously capture images with their devices, users are often not able to capture video of the moment that they wish to capture. Certain cameras, such as the GoPro camera series, are designed to continuously capture and permanently store video for this reason. This results in a large amount of unnecessary video data, which is then difficult for users to sort through and edit.
BRIEF SUMMARY OF THE INVENTION
The present invention overcomes these drawbacks of the background art by providing an apparatus, system and method for continuous temporary storage of a plurality of images. The continuously stored images are preferably only stored temporarily, such that images from a particular period of time are buffered continuously. Images that are older than a specific period of time are removed, to be replaced by new images. Upon receipt of a signal, such as for example activation by the user and/or through an automatic timer or signal, temporarily buffered images are stored in a permanent storage. Non-limiting examples of such a signal include pressing a button by the user, or receipt of a signal from a sensor, such as for example and without limitation sensor input from an accelerometer, a G-sensor, light-sensor or sound input. Optionally, the images that are stored in permanent storage are then removed from the temporary buffer. Also optionally, once the signal is given for images to be stored in the permanent storage, the process of storing such images in the permanent storage continues until a particular period of time has elapsed.
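A minimal sketch of this continuous temporary buffer follows, assuming a fixed frame rate; the rate and buffering period chosen here are illustrative, not values mandated by the text:

```python
from collections import deque

FPS = 30              # assumed capture rate, frames per second
BUFFER_SECONDS = 30   # illustrative buffering period (the text allows 1-300 s)

# A bounded deque behaves like the circular temporary buffer described
# above: once full, appending a new frame automatically discards the
# oldest one, so only the most recent BUFFER_SECONDS of images remain.
frame_buffer = deque(maxlen=FPS * BUFFER_SECONDS)

def on_new_frame(frame):
    """Called for every image delivered by the image sensor."""
    frame_buffer.append(frame)
```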
Optionally, the period of time during which images are buffered continuously ranges from 1 second to 300 seconds. Also optionally, the duration of the video that is transferred into permanent storage also ranges from 1 second to 300 seconds.
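The pre/post-signal capture described above can be sketched as follows; the frame counts and the iterator used as a post-trigger frame source are assumptions for illustration:

```python
from collections import deque

FPS = 30
frame_buffer = deque(maxlen=FPS * 300)   # circular temporary buffer

def capture_clip(post_frame_source, pre_seconds=10, post_seconds=10):
    """On the trigger signal, keep the last `pre_seconds` worth of
    buffered frames, continue capturing for `post_seconds`, and return
    the combined clip that would then be written to permanent storage.
    """
    pre = list(frame_buffer)[-pre_seconds * FPS:]                    # before the signal
    post = [next(post_frame_source) for _ in range(post_seconds * FPS)]  # after the signal
    return pre + post
```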
The apparatus optionally features an image capturing device, such as a camera, in communication with a user computational device. The camera preferably features a temporary buffer, for temporarily storing a plurality of images. Upon activation by the user, for example by pressing a button or through a software signal, the temporarily buffered images are then stored in a permanent memory, such as flash memory for example.
Alternatively or additionally, the image capturing device is integrated with the user computational device. In this case, the storage, optionally including both the temporary and permanent storage, is preferably provided through the user computational device.
The user computational device is optionally in communication with a remote server for storage of images. Optionally the remote server is in communication with, or is an integral part of, a social media platform, for enabling sharing of the images. The social media platform optionally includes, but is not limited to Facebook, Twitter, Instagram, Snap, and the like.
As a non-limiting example, optionally the following process is performed. The image sensor detects and conveys the information that constitutes an image. The digital image is processed by a CPU (central processing unit), whether of the camera or of the user computational device. After processing, the digital video file is stored in a temporary storage, such as a DDR (double data rate) storage. Upon activation or provision of a signal, as previously described, the CPU then creates a video file (MP4) from the temporary storage.
Such a video file is then preferably stored in a more permanent storage, such as a flash memory storage, such as for example and without limitation a NAND flash memory. Optionally the CPU stores multiple files in various image sizes (pixel x pixel, low resolution and high resolution) and frame rates. Optionally, if the camera is separate from the user computational device, then preferably a handshake is performed, to start transferring the video file from the camera to the user computational device. Also optionally the low resolution file is uploaded/pushed to the user computational device, for example by using Bluetooth.
For ease of transfer, the high resolution file is preferably transferred from the permanent storage, such as a flash storage, to the user computational device through a wireless or wired connection. For example and without limitation, such transfer may optionally occur through Bluetooth, WiFi or through a cable. Optionally the user induces such transfer through the user computational device, for example through an app. Alternatively, the user activates WiFi or Bluetooth on the separate camera device, which then performs a separate transfer handshake with the user computational device.
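The two-file transfer described above can be sketched as a simple channel-selection rule: the small low-resolution file is pushed over Bluetooth for a quick preview, while the large high-resolution file waits for a faster link. This is an illustrative assumption about how such a choice might be implemented; the function name, link names and priority order are not from the application.

```python
def pick_transfer_channel(file_role, available_links):
    """file_role: 'low_res' or 'high_res';
    available_links: set of links currently negotiated with the phone.
    Returns the link to use, or None if the transfer must wait."""
    if file_role == "low_res" and "bluetooth" in available_links:
        return "bluetooth"               # quick preview push of the small file
    for fast in ("cable", "wifi"):       # prefer the fastest available link
        if fast in available_links:
            return fast
    return None                          # wait until a transfer handshake succeeds
```

Under this rule a high-resolution file is never sent over Bluetooth alone; it is held in flash until WiFi or a cable connection is established.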
Such a transfer process enables the images to be stored on, and manipulated through, the user computational device. For example and without limitation, the images could be shared through a social sharing platform from the user computational device. Alternatively, the images could be shared directly from the separate camera device.
Optionally, file processing is performed on the separate camera device. Alternatively, such file processing is performed by the user computational device.
Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
Although the present invention is described with regard to a "computing device", a "computer", or "mobile device", it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computer, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a "network" or a "computer network".
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in order to provide what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the drawings:
Figure 1 illustrates exemplary processing operations associated with the image recording and distribution according to at least some embodiments of the present invention;
Figures 2A and 2B illustrate exemplary systems according to at least some embodiments of the present invention;
Figures 3A and 3B illustrate various exemplary configurations of user computational devices and image capture devices according to at least some embodiments of the present invention;
Figure 4 shows a non-limiting exemplary detailed flow for capturing images with an image capturing device 402 followed by communication to a user computational device 410;
Figure 5 shows a similar implementation as that described with regard to figure 4, except in this case the image capturing device is contained within user computational device 501 as an image capturing device 520;
Figure 6 shows another schematic implementation featuring a standalone camera;
Figures 7A and 7B show various exemplary configurations of a board for such a standalone camera;
Figures 8 and 9 show a non-limiting exemplary charging wearable cable;
Figure 10 shows the function of the charging wearable cable or lanyard;
Figure 11 shows the charging wearable cable function in car mode;
Figure 12 shows a non-limiting exemplary clip, according to at least some embodiments of the present invention; and
Figure 13 shows a non-limiting example of a light interface camera.
DESCRIPTION OF AT LEAST SOME EMBODIMENTS
Figure 1 illustrates exemplary processing operations associated with the image recording and distribution according to at least some embodiments of the present invention. An image sensor 100 detects and conveys the information that constitutes an image. The image is stored in RAM (DDR) memory 101 for a maximum period of 300 seconds, although the possible period of time is variable, for example from 1 second to 300 seconds or any integral value in between. After the maximum period of 300 seconds is exceeded, the oldest image frame is deleted from the RAM and the latest frame is added to the RAM memory. In an alternative embodiment, a maximum of less than 300 seconds is used. If the buffering time is flexible in this alternative embodiment, the user may be allowed to specify any integral value between 1 second and this lower maximum. Upon a trigger 102, an image sequence with a length of time in seconds, or a number of image frames, predefined by a user setting, is captured from the RAM memory and stored as one or multiple files in Flash memory 103. The length of time in seconds back in time, or the number of image frames before the trigger 102, is variable and adjustable 104. The trigger 102 may be caused by a manual press on a button, by use of a touch interface, by voice activation or by the detection of an event, for example. Such an event may be detected in input from one or more of an accelerometer, a GPS sensor, a G-sensor, a light-sensor, a heart-rate sensor, an image capturing device (using image analysis) and a microphone (using audio analysis). An event may be detected when an apparatus is detected to be at a certain location, when a certain person is detected using the image capturing device, when a user's heart rate is below a first threshold and/or above a second threshold, or when a certain sound, e.g. a gunshot, is detected. The trigger 102 may be created internally or may be a remote trigger.
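The rolling-buffer behavior described for Figure 1 (oldest frames evicted once the window is exceeded, pre-trigger frames captured from RAM on a trigger) can be sketched as follows. This is a minimal sketch under the assumptions stated in the comments; the class and method names are illustrative, not from the application.

```python
import collections

class RollingFrameBuffer:
    """Temporary (RAM) buffer: frames older than max_seconds are evicted,
    mirroring the loop-recording behavior described for Figure 1."""

    def __init__(self, max_seconds=300):
        if not 1 <= max_seconds <= 300:
            raise ValueError("buffer duration must be 1..300 seconds")
        self.max_seconds = max_seconds
        self._frames = collections.deque()  # (timestamp, frame) pairs, oldest first

    def add(self, timestamp, frame):
        self._frames.append((timestamp, frame))
        # Delete the oldest frames once they fall outside the window.
        while self._frames and timestamp - self._frames[0][0] > self.max_seconds:
            self._frames.popleft()

    def capture(self, trigger_time, pre_seconds):
        """On a trigger, copy the last pre_seconds of frames out of RAM,
        ready to be written to permanent (flash) storage."""
        return [f for (t, f) in self._frames if trigger_time - t <= pre_seconds]
```

A user-adjustable pre-trigger window then simply changes the `pre_seconds` argument passed to `capture`, while the eviction window stays fixed at the buffer maximum.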
The length of time in seconds, or the number of image frames, after the trigger 102 is variable and adjustable 105. In Figure 1, an example of 10 seconds is shown, but the length of time can be any number of seconds, preferably up to 300 seconds, as set by the user. Two different file sizes or resolutions can be stored simultaneously as two separate files 103. File sizes can vary in number of pixels and/or frame rate per second. Upon a trigger-initiated recording as described in 103, the device will initiate a wireless connection 106 to a User Computational Device 108. When the connection is established 107, a file as recorded in 103 is automatically transferred/transmitted/uploaded to the User Computational Device 108, described herein as being provided to user computational device 108. The file will be visible in the User Computational Device 108. Through operation of User Computational Device 108, the file can then be modified, provided to an external server and/or shared using social media platforms 109.
The system and apparatus, which may optionally be used for implementing the method of figure 1, may optionally be any type of image capture device coupled to, embedded in, connected with, or in communication with a computational device. Figure 2A shows a non-limiting exemplary implementation of a system 200, featuring the user computational device 202, with an image capturing device 204, shown in this case as connected to, or implemented within, user computational device 202. Optionally, image capturing device 204 may also be in communication with user computational device 202, or connected to user computational device 202, externally rather than being connected to or implemented within it, as is the case in system 201.
Figure 2B shows a non-limiting exemplary system 210, in which user computational device 202 is in separate communication with image capturing device 204, which may optionally be wired or wireless. In both cases, user computational device 202 is preferably in communication with a server 208 through a computer network 212. Computer network 212 is shown in figures 2A and 2B. Server 207 in figure 2A, and server 208 in figure 2B, optionally and preferably store images sent from, or through, user computational device 202, for example for storage, for forwarding or for transmitting to other platforms, and the like. Optionally, server 207, or server 208, may also feature software components 213, which may optionally perform one or more image processing techniques or methods.
Figure 3A shows an exemplary, illustrative user computational device 300, in a system 310, in which user computational device 300 contains image capturing device 312. Again, image capturing device 312 may alternatively be external to user computational device 300, and in communication with it. User computational device 300 is in communication with a social media server 301, and preferably also with a server 302. Server 302 optionally functions as described with regard to servers 207 and 208 of figures 2A and 2B. Social media server 301 may optionally be any type of social media server, including, but not limited to, Facebook, Twitter, Instagram, Snap, and the like. Preferably user computational device 300 communicates with social media server 301, and server 302, through a computer network 314, which may optionally be the internet as previously described. Social media server 301 is also preferably in communication with server 302 through such a computer network 314. Server 302 can optionally process images according to user or operator settings; modify images, add filters or perform image recognition for objects, faces, numbers and codes (such as license plates); store the location of recording; post on the user's behalf on a social media account on server 301; and store the images, for example for other uses.
Figure 3B shows a similar implementation to figure 3 A, except that image capturing device 320 is external to user computational device 321. Otherwise, the functions of social media server 325 and server 326 are similar to those as described with regard to social media server 301 and server 302 of figure 3A.
Figure 4 shows a non-limiting exemplary detailed flow for capturing images with an image capturing device 402 followed by communication to a user computational device 410. As shown with regard to a system 400, there is provided a device, which may optionally be any type of image capturing device. In this case, the image sensor 401 captures images, which are provided through an image capturing device 402 and then provided through image buffering 403 to a Buffer memory 404. Preferably buffer memory 404 is a temporary memory or storage, for holding the images for a predetermined period of time before the images are flushed.
Also optionally, if an image storage activation event occurs, shown as storage activation event 405, then the images are held for a second predetermined period of time after event 405 before being flushed. Optionally the first and second predetermined periods of time are the same, but alternatively they may be different. For example, such a period of time may optionally comprise -10 seconds before and +10 seconds after event 405, but can be any number between -300 and +300. That is, the period of time may optionally run from 1 second to 300 seconds, or any integral number in between, for each of the first and second predetermined periods of time. Although the pre-event time is given as negative, such as -10 seconds for example, the operation of buffer memory 404 for storage preferably does not change before and after event 405. Instead, event 405 preferably determines whether the images are stored permanently, and for how long a period of time the captured images are so transferred to permanent storage. This enables buffering to occur as described with regard to figure 1, so that preferably there is always a plurality of images captured for a certain time period. Optionally the images are captured as single static images, which may optionally be separated by a certain period of time, and alternatively are captured as video (that is, as a plurality of images which, when displayed in rapid sequence, can be displayed as video data).
Next, once it is determined that something is to be transferred to permanent storage, each such image to be stored is preferably transferred to an image storage process 406, for storage in a permanent memory 407, shown in a non-limiting example as Flash memory 407. Then upon retrieval, a wireless communication 409, or optionally and alternatively a wired communication (not shown), sends the images through a process 408 to a user computational device 410. The images may continue to be stored in permanent memory 407 or they may alternatively be flushed from permanent memory 407.
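The selection around storage activation event 405 (pre-event frames from the temporary buffer 404, post-event frames from continued capture, combined into the clip handed to the image storage process 406) might look like the following sketch. The function name, signature and data shapes are illustrative assumptions, not part of the application.

```python
def select_clip(buffered, post_frames, event_time, pre_seconds):
    """buffered: iterable of (timestamp, frame) pairs from the temporary buffer;
    post_frames: frames captured during the second predetermined period;
    returns the combined clip destined for permanent (flash) storage."""
    if not 1 <= pre_seconds <= 300:
        raise ValueError("pre-event window must be 1..300 seconds")
    # Frames from the first predetermined period before the event...
    pre = [frame for (t, frame) in buffered if event_time - t <= pre_seconds]
    # ...followed by frames captured after the event.
    return pre + list(post_frames)
```

Note that the buffer itself operates identically before and after the event, as the text above stresses; only this selection step decides what is transferred to flash memory 407.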
In this example, the image capturing device is external to user computational device 410, but as previously described it may also be embedded with, integrated with, or internal to user computational device 410.
Figure 5 shows a similar implementation as that described with regard to figure 4, except in this case the image capturing device is contained within user computational device 501 as an image capturing device 520. Images are captured or otherwise brought to user computational device 501 by image capturing device 520. Image editing is performed in stage 502, and image sharing in stage 503. Image sharing in stage 503 may optionally be performed with regard to sharing to a social media platform 504. Image sharing may also optionally be provided to a server 505 for storage and/or processing of images by the system provider, a server 506 for storage and/or processing of images as set by the user or a server 507 for storage and/or processing of images by a government agency, such as an emergency or law enforcement agency.
Figure 6 shows another schematic implementation featuring a standalone camera 601. The video is buffered 602 in the camera 601. The camera 601 is capable of wireless communication 603 to a user computational device 604. User computational device 604, e.g. a smartphone, features a screen 605 for displaying the images of the video received from the camera 601, and one or more buttons 606 for manipulating the image and/or in-app sharing of the video. Optionally all of these are implemented as software through device 604. Alternatively or additionally, the information may be provided to a standalone computer, featuring a display 607, and one or more capabilities to enrich or adjust the content 608, e.g. through a social platform.
Camera 601, when on, captures and buffers video. When a button is pushed, or upon receipt of a different internal or external trigger (e.g. based on input from a G-sensor, GPS sensor, heart-rate sensor, light-sensor, image input and/or sound input), the camera 601 is activated. The camera 601 then writes a video clip, with a length of XX sec. before the camera 601 is activated and XX sec. after the camera 601 is activated, to flash memory. This video is directly sent wirelessly to smartphone (mobile device) 604. Smartphone 604 receives the video clip and opens it in the Roader app. The video clip can be reviewed in the app and instantly shared to various social platforms.
Figure 7A shows the board front side for the camera as a standalone device, showing a main CPU 701, a lens and image sensor 703 and an LED ring 705. The LED ring 705 indicates when information is being captured and/or when it is being stored, which may be triggered by a user pressing a button (not shown), for example. Figure 7B shows the board back side, showing a USB connector 711 to accept accessories and micro switches 713 for inputs, e.g. to allow pressing of the button to cause the images to be stored beyond the buffer time. Preferably four such buttons are provided for the mechanical operation of the device.
Figure 8 shows a non-limiting exemplary charging wearable cable. The process for using a wearable cable is provided in stage 801. Both ends are plugged into the camera bridge as a detachable accessory, and the cable functions as a lanyard in stage 802. One end of the cable is then unplugged and is plugged into a charging device in stage 803.
Optionally, alternatively or additionally, one end of the cable is unplugged and is then plugged into a computer to transfer data in stage 804, which may also be used for charging. The lanyard cable is multi-directional when plugged into a car holder or other charger. The car holder sends a signal to the camera to change the recording mode to car mode. The lanyard cable can also function as an extra battery, with a supplied power-bank connected to the cable. The signal to the CPU that is needed to switch the recording mode from wearable mode to car mode is preferably made by making a connection between two out of the 24 USB-C pins. By sensing this bridge connection, the software detects the required change. The bridge ports are connected in the accessory part to the connector.
Figure 9 shows the charging wearable cable lanyard function as a detachable connector through a USB bridge 901. The camera device, in this case, is shown as featuring a detachable accessory, which is the USB bridge 901. The USB bridge 901 features a plurality of connectors 903 and 905, two connectors of which are suitable for connecting to the lanyard, and one connector of which is suitable for connecting to the camera itself.
Figure 10 shows the function of the charging wearable cable and lanyard 1000. In Figure 10-1, the electrical cable and lanyard 1000 is shown connected to both ends of the connection bridge 1002, that is, to both connectors 1001. The lens 1003 of the camera 1004 is shown, as is the camera 1004 itself. While the lanyard 1000 is acting as a charging accessory, it may also optionally include a further charging capability to recharge the camera 1004. When it is time to fully charge the camera 1004, and optionally also the lanyard 1000, as shown in Figure 10-2, one end of the cable is unplugged from the connector on the connection bridge, and is connected to a charger 1005, which is then connected to an electrical power source. Figure 10-3 shows the power-bank 1021, which may optionally be incorporated in the lanyard. Figure 10-3A shows a different implementation of the power-bank, a combined power-bank and length adjuster 1007, featuring a length adjustment in which the cable passes through the power-bank, so as to be able to adjust the length of the lanyard, and also optionally to keep the power-bank at the back of the user's neck and out of the way.
Figure 11 shows the charging wearable cable function in car mode. As shown in Figure 11-1a, the camera may optionally be connected to a car holder 1101, or to other devices, to keep it from moving around. One end of the cable is connected to a connector on the USB bridge, while the other one is connected to a car adaptor 1103 for charging. Figure 11-1b shows another implementation in which the car holder 1101 allows the camera to be connected upside down, for example, to a dashboard or other holder. The car holder 1101 may be a camera mount or part of a camera mount, for example.
In the car holder 1101, a connection may be made between two pins, e.g. two unused pins of 24 available USB-C pins. These pins may be unused due to the camera transferring data in USB 2.0 mode instead of USB 3.0/3.1 mode, for example. The camera can then send a signal over one of these two pins/wires and if it receives this signal back over the other of the two pins/wires, then the software knows that the camera is used in a car and can switch to a car mode instead of to a wearable mode. In the car mode, like in the wearable mode, upon receipt of an internal or external trigger, X seconds of video before and Y seconds of video after receipt of the trigger are transmitted to another device, e.g. a smartphone.
In the car mode, additional functionality is offered. For example, the camera may continuously store files of 5 to 10 minutes in permanent storage, up to a maximum amount or size. Older files are overwritten by newer files, but the space allocated to storing these files is larger than the space allocated to the buffer of X seconds. When, after receipt of the trigger, the video segment is transmitted to the other device, e.g. the smartphone, and the user watches the video segment on this other device, the user is offered the possibility to access the camera to view the one or more video files that include at least a part of this video segment, and possibly video files occurring before or after the events captured in the video segment.
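The car-mode segment loop described above (fixed-length files kept up to a storage budget, oldest overwritten first) can be sketched like this. The class and the byte-budget model are illustrative assumptions; an implementation on the device would overwrite flash files in place rather than tracking sizes in a dictionary.

```python
import collections

class SegmentRecorder:
    """Keeps continuously recorded segment files within a byte budget,
    dropping the oldest segment first when the budget is exceeded."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.segments = collections.OrderedDict()  # name -> size in bytes, oldest first

    def store(self, name, size):
        self.segments[name] = size
        # Evict (overwrite) oldest segments once the allocation is exceeded.
        while sum(self.segments.values()) > self.max_bytes:
            self.segments.popitem(last=False)
```

Because this allocation is larger than the X-second trigger buffer, a user reviewing a triggered clip on the smartphone can still reach back into these longer segments for surrounding footage.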
This connection between the two pins can alternatively be made somewhere else than in the car holder 1101, e.g. in a small converter device connected to the car holder 1 101.
Figure 12 shows a non-limiting exemplary clip 1201, according to at least some embodiments of the present invention. The clip 1201 is a detachable accessory, like the USB bridge 1203. It is alternatively incorporated within the USB bridge or is connected to the USB bridge. The USB bridge 1203 may be replaced by a bike mounting accessory, for example. A magnetic connection 1205 is provided to connect the clip 1201 to the camera 1202. In an alternative embodiment, the magnetic connection may be used to connect the clip to the USB bridge instead of to the camera. At moment 1200, the clip 1201 is disconnected from the camera 1202. At moment 1210, the two are shown connected.
Figure 13 shows a non-limiting example of a camera with a light interface. At moment 1301, there is no light indication given, so the camera is off. To turn off the camera, the user pushes and holds, e.g., a (power) button for five seconds while the camera is on. After the user pushes, e.g., this button for five seconds while the camera is off, it turns on in standby mode. In camera standby mode, at moment 1302, the white light 1311 shows a very slow pulse, e.g. with a one second interval. Then the user pushes, e.g., this button for three seconds to make a connection. While connecting, at moment 1303, the white light 1311 slowly pulses, e.g. with a 0.3 second interval, and optionally audio feedback is provided. After the camera has connected, the user presses, e.g., a front housing touch area one time to record, e.g., X seconds before pressing the touch area and X seconds after pressing the touch area. During camera recording, at moment 1304, the white light 1311 slowly rotates 1313 with counter-clockwise motion, or alternatively clockwise motion, in a predetermined interval. Such an interval may optionally comprise a 0.3 second interval. During camera recording, optionally audio feedback is provided at the beginning and end of the recording. During recording, at moment 1305, the user can push, e.g., the touch area again to cause the recording to continue for an additional period of time, such as for example 10 seconds. Preferably, the white light 1311 slowly rotates 1313 again to show that the camera is still recording. At moment 1306, it may be pushed, e.g. touched, yet again to add another additional period, e.g. 10 seconds, and so forth, with the audio feedback optionally being provided at the end of the recording and the white light 1311 slowly rotating 1313 to indicate that the camera is still recording.
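The light interface of Figure 13 is essentially a small state machine, which the following sketch models; the state names, event names and the mapping to light patterns are illustrative assumptions drawn from the description, not terms used in the application.

```python
# Light pattern per interface state: (pattern kind, interval in seconds).
LIGHT_PATTERNS = {
    "off":        None,               # no light indication
    "standby":    ("pulse", 1.0),     # very slow pulse, ~1 s interval
    "connecting": ("pulse", 0.3),     # slow pulse, ~0.3 s interval
    "connected":  None,               # no pattern described for this state
    "recording":  ("rotate", 0.3),    # rotating LED ring, ~0.3 s interval
}

def next_state(state, event):
    """Map a user action onto the next interface state; unknown
    (state, event) pairs leave the state unchanged."""
    transitions = {
        ("off", "hold_5s"): "standby",           # power on into standby
        ("standby", "hold_5s"): "off",           # power off
        ("standby", "hold_3s"): "connecting",    # start connecting
        ("connecting", "connected"): "connected",
        ("connected", "touch"): "recording",     # record X s before/after
        ("recording", "touch"): "recording",     # extend the recording
    }
    return transitions.get((state, event), state)
```

A firmware loop would combine the two: after each event it looks up the new state and drives the LED ring with the corresponding pattern.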
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub- combination.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the scope of the appended claims. All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.

Claims

WHAT IS CLAIMED IS:
1. An apparatus for temporary storage of a plurality of images, comprising an image capture device for capturing the plurality of images, a temporary storage for receiving the plurality of images for a predetermined period of time, a permanent storage, and a communication module for transfer of the images to a separate computational device; wherein the plurality of images is captured by the image capture device for said predetermined period of time before said signal is received, the plurality of images is captured by the image capture device for a second predetermined period of time after said signal is received, and the stored images in the temporary storage are transferred to the permanent storage upon receipt of a signal and transferred from the permanent storage to the separate computational device upon said transfer to the permanent storage.
2. The apparatus of claim 1, wherein said predetermined period of time ranges from 1 second to 300 seconds, after which the images are removed from the temporary storage.
3. The apparatus of claim 1 or 2, wherein said second predetermined period of time ranges from 1 second to 300 seconds, after which the images are not transferred to the permanent storage.
4. The apparatus of any of the above claims, wherein said first or second predetermined period of time is set by a user of the apparatus.
5. The apparatus of any of the above claims, wherein said signal comprises activation of the apparatus by a user.
6. The apparatus of claim 5, further comprising a button for receiving said activation of the apparatus by the user.
7. The apparatus of any of the above claims, wherein said signal comprises input from a sensor selected from one or more of an accelerometer, a GPS sensor, a heart-rate sensor, a G-sensor, light-sensor, image input or sound input.
8. The apparatus of any of the above claims, further comprising software for processing image files.
9. The apparatus of any of the above claims, further comprising software for supporting social sharing through said communication module.
10. The apparatus of any of the above claims, wherein images from said predetermined period of time are buffered continuously, such that images that are older than said predetermined period of time are removed, to be replaced by new images.
11. The apparatus of any of the above claims, further comprising a button for being activated to activate the image capture device.
12. The apparatus of any of the above claims, wherein said signal is sent upon detecting an event in input from a sensor selected from one or more of an accelerometer, a GPS sensor, a G-sensor, light-sensor, heart-rate sensor, said image capturing device, a further image capturing device and a microphone.
13. The apparatus of any of the above claims, further configured to transmit a signal over one wire of a cable and switch to a car mode or a wearable mode in dependence on receiving back the signal over another wire of said cable.
14. A system, comprising the apparatus of any of the above claims and further comprising a user computational device in communication with the apparatus.
15. The system of claim 14, wherein said user computational device comprises the apparatus.
16. The system of claim 14, wherein said user computational device is in wireless or wired communication with the apparatus.
17. The system of claim 16, wherein said user computational device comprises software for processing image files.
18. The system of claims 16 or 17, wherein said user computational device, the apparatus or both comprise software for supporting transfer of images from the apparatus to said user computational device.
19. The system of any of the above claims, further comprising a remote server for receiving the images.
20. The system of claim 19, wherein said remote server comprises a social media platform for sharing the images.
PCT/EP2018/083861 2017-12-06 2018-12-06 Apparatus, system and method for capture of buffered images WO2019110755A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762595140P 2017-12-06 2017-12-06
US62/595,140 2017-12-06

Publications (1)

Publication Number Publication Date
WO2019110755A1 true WO2019110755A1 (en) 2019-06-13

Family

ID=64661376

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/083861 WO2019110755A1 (en) 2017-12-06 2018-12-06 Apparatus, system and method for capture of buffered images

Country Status (1)

Country Link
WO (1) WO2019110755A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090051768A1 (en) * 2006-08-31 2009-02-26 Dekeyser Paul Loop Recording With Book Marking
US20130128067A1 (en) * 2008-11-07 2013-05-23 Justin Boland Wireless handset interface for video recording camera control
CN105635318A (en) * 2016-03-02 2016-06-01 腾讯科技(深圳)有限公司 Image acquisition method and system
EP3425529A1 (en) * 2016-03-02 2019-01-09 Tencent Technology (Shenzhen) Company Limited Image acquisition method, controlled device and server


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18815661

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04.09.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18815661

Country of ref document: EP

Kind code of ref document: A1