WO2013140017A1 - Method, apparatus and computer program product for managing transmission bandwidth - Google Patents

Method, apparatus and computer program product for managing transmission bandwidth

Info

Publication number
WO2013140017A1
Authority
WO
Grant status
Application
Patent type
Prior art keywords
image
data
frame
viewfinder
apparatus
Prior art date
Application number
PCT/FI2012/050282
Other languages
French (fr)
Inventor
Mattipekka KRONQVIST
Original Assignee
Eye Solutions Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of content streams, manipulating MPEG-4 scene graphs
    • H04N21/23406Processing of video elementary streams, e.g. splicing of content streams, manipulating MPEG-4 scene graphs involving management of server-side video buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of content streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of content streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/60Selective content distribution, e.g. interactive television, VOD [Video On Demand] using Network structure or processes specifically adapted for video distribution between server and client or between remote clients; Control signaling specific to video distribution between clients, server and network components, e.g. to video encoder or decoder; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/61Network physical structure; Signal processing
    • H04N21/6106Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
    • H04N21/6131Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a mobile phone network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television, VOD [Video On Demand]
    • H04N21/60Selective content distribution, e.g. interactive television, VOD [Video On Demand] using Network structure or processes specifically adapted for video distribution between server and client or between remote clients; Control signaling specific to video distribution between clients, server and network components, e.g. to video encoder or decoder; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6373Control signals issued by the client directed to the server or network components for rate control, e.g. request to the server to modify its transmission rate

Abstract

One aspect of the invention discloses a method comprising receiving by an apparatus an image captured by an image sensor of a camera; and scaling the image to correspond with a resolution of a display of the apparatus. The scaled image represents a viewfinder image. The method further comprises preparing the viewfinder image to be stored in a data frame buffer; storing the viewfinder image in the data frame buffer; selecting a viewfinder image from the data frame buffer for transmission; and providing the selected viewfinder image for transmission to a wireless communications network.

Description

METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR MANAGING TRANSMISSION BANDWIDTH

FIELD OF THE INVENTION

The invention relates to wireless communications. More specifically, the invention relates to a method, an apparatus for wireless communication and a computer program product for managing transmission bandwidth.

BACKGROUND OF THE INVENTION

Modern data communication networks provide various ways to stream a live feed from a sending entity to a receiver. Depending on the solution, the bit rate used to stream information may be very high (e.g. streaming a high definition movie) or moderate (e.g. a live web cam stream between two persons). A common denominator for these solutions is that there is enough transmission capacity available for the streaming.

The situation becomes more difficult when data streams are transmitted via a wireless communication network (e.g. a mobile communication network) where the available bandwidth is often very limited. Let's assume that a mobile telephone is used to record live video at a location and another user wants to see the recorded video as soon as possible to see what happens at the location. The video is first recorded with a camera of the mobile telephone for some time, e.g. several minutes. When the recording is stopped, the recorded video clip is sent to the user. The transmission time of the video clip depends on various factors, e.g. the total data traffic load in the mobile communication network, the mobile communication network technology, the location of the mobile telephone etc. As a result, it may take a considerable amount of time to transmit the video clip to the user. Because the video clip is sent as a whole and its transmission time is in any case much longer than the length of the clip, the user receiving the video clip sees its first frame only after receiving the whole clip. A further problem is that the user receiving the video clip is not able to see what is happening right now at the location.

Based on the above, there is a need for an improved solution which enables a receiving user to receive an essentially real-time video stream, especially via a mobile communication network, and see what happens at the location right now.

SUMMARY

According to a first aspect of the invention, there is provided a method comprising executing with an apparatus a data frame recording process and a data frame transmission process repeatedly and in parallel. The data frame recording process comprises capturing an image by an image sensor, preparing a viewfinder image on the basis of the captured image, and storing the viewfinder image to an image buffer as the latest viewfinder image. In some embodiments the data frame transmission process comprises, when determining that a data frame is to be transmitted, selecting the last stored viewfinder image from the data frame buffer; and transmitting the data frame to a wireless communications network.

In some other embodiments the data frame transmission process comprises transmitting the viewfinder image to a wireless communications network when the viewfinder image is ready for transmission.

In one embodiment, the data frame comprises at least one of an image frame and an audio frame. In one embodiment the data frame buffer comprises two or more viewfinder images, and the latest viewfinder image is selected for transmission.

In one embodiment, the viewfinder image recording process comprises storing a viewfinder image and an audio frame corresponding to the viewfinder image to the data buffer; and the image transmission process comprises transmitting the viewfinder image and the audio frame from the data buffer to the wireless communications network.

In one embodiment, the method further comprises receiving a resolution indication via the wireless communication network from a server which sets a resolution which should be used in the transmission of the viewfinder images; comparing the resolution of the viewfinder image to the resolution indication; and reducing the resolution of the viewfinder image, if the comparison indicates that the resolution of the viewfinder image is larger than the resolution indication indicates.

In one embodiment, the method further comprises receiving from a server via the wireless communications network an indication setting at least one of a delay between two consecutive frame transmissions and a bandwidth limitation for transmitting the data frames.

In one embodiment, the method further comprises before storing the data frame in the data buffer, dropping an oldest data frame from the data buffer when the data buffer is full.

In one embodiment, the method further comprises receiving a first viewfinder image and a second viewfinder image; comparing the first viewfinder image and the second viewfinder image to detect if they differ from each other by a certain amount; and if the comparison indicates that the first viewfinder image and the second viewfinder image differ from each other by at least the certain amount, initiating the transmission of the selected viewfinder image.

In one embodiment, the method further comprises defining one or more areas in the viewfinder images, wherein the comparing is performed on the basis of image contents within the one or more areas.

According to a second aspect of the invention there is provided an apparatus for wireless communication. The apparatus comprises a transceiver for receiving and transmitting data to and from a wireless communication network; means for receiving an image captured by an image sensor of a camera; means for scaling the image to correspond with a resolution of a display of the apparatus, wherein the scaled image represents a viewfinder image; means for preparing the viewfinder image to be stored in a data frame buffer; means for storing the viewfinder image in the data frame buffer; means for selecting a viewfinder image from the data frame buffer for transmission; and means for providing the selected viewfinder image for transmission to a wireless communications network.

In one embodiment, the data frame comprises at least one of an image frame and an audio frame.

In one embodiment, the camera is a part of the apparatus.

In one embodiment, the data frame buffer comprises two or more viewfinder images, and the apparatus is adapted to select the latest viewfinder image for transmission.

In one embodiment, the means for scaling are adapted to reduce the resolution of the image.

In one embodiment, the data frame recording process means is configured to store an image frame and an audio frame corresponding to the image frame to the data buffer, and the data frame transmission process means is configured to transmit the image frame and the audio frame from the data buffer to the wireless communications network.

In one embodiment, the apparatus further comprises means for receiving a resolution indication via the wireless communication network from a server which sets a resolution which should be used to capture image frames.

In one embodiment, the apparatus further comprises means for receiving from a server via the wireless communications network an indication setting at least one of a delay between two consecutive frame transmissions and a bandwidth limitation for transmitting the data frames.

In one embodiment, the data frame recording process means is configured, before the data frame is stored in the data buffer, to drop an oldest data frame from the data buffer when the data buffer is full.

In one embodiment, the apparatus is configured to prioritize the data frame transmission process over the data frame recording process.

In one embodiment, the apparatus further comprises means for receiving a first viewfinder image and a second viewfinder image; means for comparing the first viewfinder image and the second viewfinder image to detect if they differ from each other by a certain amount; and means for initiating the transmission of the selected viewfinder image if the comparison indicates that the first viewfinder image and the second viewfinder image differ from each other by at least the certain amount.

In one embodiment, the apparatus further comprises means for defining one or more areas in the viewfinder images, wherein the means for comparing are adapted to perform the comparison on the basis of image contents within the one or more areas.

According to a third aspect of the invention there is provided a computer program product comprising a computer-readable medium bearing computer program code embodied therein for use with a computer, the computer program code comprises code for performing the above method.

It is possible to combine one or more of the above embodiments to provide a further embodiment of the invention.

Advantages of various embodiments of the invention comprise the ability to provide an essentially real-time feed from an apparatus when the transmission bandwidth available to the apparatus is limited, and to limit the transmission bandwidth used by the apparatus.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this specification, illustrate embodiments of the invention and together with the description help to explain the principles of the invention. In the drawings:

Figure 1 discloses a block diagram of a method according to one embodiment of the present invention;

Figure 2 discloses a block diagram of an apparatus for wireless communication according to one embodiment of the invention;

Figure 3 discloses a block diagram of a storage server according to one embodiment of the invention;

Figure 4 discloses a block diagram of an apparatus according to another embodiment of the invention;

Figure 5 discloses some elements arranged in the memory of the apparatus according to one embodiment of the invention;

Figure 6 illustrates as an arrow diagram the operation of an example embodiment of the invention when viewfinder images are captured and transmitted;

Figure 7 depicts an example of a system in which the invention can be implemented; and

Figure 8 depicts an example of some elements of a surveillance system in which the invention can be implemented.

DETAILED DESCRIPTION OF THE INVENTION

Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings.

Figure 1 discloses a block diagram of a method according to one embodiment of the present invention. The method is intended to be implemented in an apparatus for wireless communication which is able to receive and transmit data to/from a mobile telephone network. The apparatus, e.g. a mobile phone, is equipped with a camera which is able to record still images and/or video.

Figure 5 discloses some elements arranged in the memory of the apparatus according to one embodiment of the invention.

The method is divided into two separate processes: (1) a data frame recording process 100 and (2) a data frame transmission process 106. The data frame recording process prepares 102 a data frame to be stored in a data frame buffer 216. Preparing may mean that the camera is used to record an image of an intended target. Preparing may also mean that the image is extracted from a video feed recorded by the camera. In another embodiment, the data frame may comprise only audio data recorded with the mobile phone. If the data frame comprises only audio, the length of the audio in one data frame is selectable, e.g. one second of audio in one data frame.

In some embodiments the data frame recording process comprises capturing an image by an image sensor 212 of the camera 210, preparing a viewfinder image on the basis of the captured image, and storing the viewfinder image to an image buffer 218, which may be part of the data frame buffer 216. Additionally, the data frame recording process may also comprise storing the captured image in full resolution to the image buffer 218 or to a separate image buffer, e.g. to a second image buffer 220. If both the viewfinder image and the full resolution image are stored to the same image buffer, it may be necessary to mark the images so that it can be determined which images are viewfinder images and which images are full resolution images.
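
Purely as an illustration of one possible reading of this recording step (the function, the scale_to helper and the buffer objects below are hypothetical and not part of the application), a Python sketch might look like the following:

from collections import deque

def scale_to(image, size):
    # Stand-in for the real scaling step; here it only records the target size.
    return {"pixels": image, "size": size}

def record_step(capture, display_size, viewfinder_buffer, full_res_buffer=None):
    image = capture()                              # read the image sensor
    viewfinder = scale_to(image, display_size)     # prepare the viewfinder image
    viewfinder_buffer.append(viewfinder)           # store it as the latest viewfinder image
    if full_res_buffer is not None:                # optionally keep the full resolution copy
        full_res_buffer.append(image)

viewfinder_buffer = deque(maxlen=8)                # stand-in for the image buffer 218
full_res_buffer = deque(maxlen=8)                  # stand-in for the second image buffer 220
record_step(lambda: "raw sensor data", (320, 240), viewfinder_buffer, full_res_buffer)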

The viewfinder images and full resolution images can also be called viewfinder image frames and full resolution image frames, respectively, or by a common name: image frames.

Yet in another embodiment, the data frame may comprise both the image frame and an audio frame corresponding to the image frame. In one embodiment, the audio frame corresponding to the image frame means that if image frames are recorded at a speed of three frames/second, an audio frame corresponding to an image frame includes audio for the duration of a single image frame, i.e. 1/3 s in this case.

When the data frame has been prepared, the data frame is stored 102 in the data frame buffer 216. Since every buffer has a predetermined length, the data frame buffer 216 may at some point become full. If the data frame buffer 216 is full when trying to store the data frame in the data frame buffer 216, the oldest data frame is deleted from the buffer. In some embodiments it may not be necessary to actually delete the oldest data frame; it may suffice to replace the oldest data frame with the newest data frame, e.g. by overwriting the oldest data frame with the new data frame.
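
Both alternatives described above (deleting the oldest frame before storing, or overwriting it in place) can be sketched in a few lines of Python; the names and the capacity value are illustrative assumptions only:

def store_frame(buffer, frame, capacity=8):
    # Variant 1: delete the oldest data frame when the buffer is full, then store the new one.
    if len(buffer) >= capacity:
        buffer.pop(0)                      # drop the oldest data frame
    buffer.append(frame)

def overwrite_frame(buffer, frame, capacity, write_index):
    # Variant 2: keep a write index and overwrite the slot holding the oldest data frame.
    if len(buffer) < capacity:
        buffer.append(frame)               # buffer not yet full
    else:
        buffer[write_index] = frame        # overwrite the oldest data frame in place
    return (write_index + 1) % capacity    # position of the next-oldest slot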

In some embodiments the data frame transmission process 106 first comprises determining that a data frame is to be transmitted. The determination may be based on an application parameter. The parameter may define a default interval at which data frames are to be sent towards a wireless communications network, e.g. a mobile communication network. The parameter may also be adjustable by a user of the mobile phone. In another embodiment, the parameter is adjusted based on instructions received from a server via the wireless communications network, or directly from another apparatus such as another mobile phone.

When determining that a data frame is to be transmitted, the last stored data frame is selected 108 from the data frame buffer. The data frame buffer at this point may include several data frames but the last stored data frame (i.e. the newest data frame) is selected when data frame transmission is about to happen. The selected data frame is transmitted 110 towards the wireless communications network.

In some other embodiments it may not always be the last stored data frame which is selected; it may also be the second last stored data frame which is selected to be transmitted.

With the above solution a receiver receiving data frames sent from the apparatus is able to get an instant image view and/or audio relating to the documented situation.

For example, when the apparatus records and stores the first data frame (VF1), that data frame is transmitted. Meanwhile data frames VF2, VF3 and VF4 have been stored in the data frame buffer 216. Now, when the next transmission is performed, the latest data frame (i.e. VF4) is sent from the data frame buffer. If the apparatus is able to transmit more data frames, it sends the latest (or in some embodiments, the second latest) stored data frame from the data frame buffer. Thus, the apparatus may not always transmit data in chronological order, since when the latest data frame is sent the data frame buffer may still store older data frames which have not been sent yet.
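
The VF1...VF4 example above can be played through with a small, self-contained Python snippet (the buffer size of 8 is an arbitrary assumption):

from collections import deque

frames = deque(maxlen=8)                  # stand-in for the data frame buffer 216

frames.append("VF1")
print("transmit", frames[-1])             # the first transmission sends VF1

for vf in ("VF2", "VF3", "VF4"):          # recorded while VF1 was being transmitted
    frames.append(vf)

print("transmit", frames[-1])             # the next transmission sends VF4;
                                          # VF2 and VF3 remain in the buffer unsent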

In one embodiment of Figure 1, at some point the processing power of the apparatus may not be sufficient to fully execute the data frame recording process and the data frame transmission process in parallel (i.e. virtually at the same time) if the processes use the same resources. In such a case the apparatus may prioritize the data frame transmission process over the data frame recording process to ensure that data frames are continuously transmitted. The prioritization is implemented e.g. by slowing down the data frame recording process, i.e. recording and storing fewer data frames.
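
One very simple way to realise such a prioritization, shown here only as a hedged sketch (the backlog threshold and intervals are invented numbers), is to let the recording process choose a longer capture interval whenever untransmitted frames pile up:

def recording_interval(pending_frames, normal_interval=1.0 / 3, slow_interval=1.0):
    # Prioritise the transmission process: when data frames pile up untransmitted,
    # record and store fewer data frames by waiting longer between captures.
    return slow_interval if pending_frames > 2 else normal_interval

print(recording_interval(0))   # no backlog: capture roughly three frames per second
print(recording_interval(5))   # backlog detected: slow down to one frame per second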

In another embodiment of Figure 1, the apparatus may prioritize transmitting image frames or audio frames if the transmission bandwidth is limited.

In another embodiment of Figure 1, the apparatus may receive from a server via the wireless communication network a resolution indication. The resolution indication indicates to the apparatus which resolution should be used to capture images. Thus the resolution indication may be used to limit the data transmission bandwidth usage of the apparatus if the resolution of the viewfinder image is too large for the situation. This may be deduced e.g. by comparing the resolution of the viewfinder image with the resolution indication; if the comparison indicates that the resolution of the viewfinder image is larger than the resolution indication indicates, the resolution of the viewfinder image may be reduced. In another embodiment, the resolution indication may indicate to the apparatus to increase the used resolution, if the resolution has previously been decreased.
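
A minimal sketch of this comparison, assuming the resolution indication arrives as a (width, height) pair (the function name is hypothetical):

def apply_resolution_indication(viewfinder_size, indicated_size):
    vf_w, vf_h = viewfinder_size
    max_w, max_h = indicated_size
    # Only reduce when the viewfinder resolution exceeds what the indication allows.
    if vf_w <= max_w and vf_h <= max_h:
        return viewfinder_size
    # Fit inside the indicated resolution while keeping the aspect ratio.
    scale = min(max_w / vf_w, max_h / vf_h)
    return (int(vf_w * scale), int(vf_h * scale))

print(apply_resolution_indication((640, 480), (320, 240)))   # reduced to (320, 240)
print(apply_resolution_indication((320, 240), (640, 480)))   # left unchanged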

In another embodiment of Figure 1, the apparatus may receive from a server via the wireless communications network an indication setting at least one of a delay between two consecutive frame transmissions and a bandwidth limitation for transmitting the data frames. In other words, it is possible to manage the transmission bandwidth usage of the apparatus. This enables a solution which provides optimal bandwidth usage, as opposed to trying to push more data than the bandwidth allows or pushing less data than the bandwidth allows.
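
As a hedged sketch of how such an indication could translate into a pause between transmissions (the figures are arbitrary examples, not values from the application):

def transmission_delay(frame_size_bytes, delay_indication_s=0.0, bandwidth_limit_bps=None):
    # The server may set a minimum delay between two consecutive frame transmissions
    # and/or a bandwidth limitation; honour whichever is the more restrictive.
    bandwidth_delay = 0.0
    if bandwidth_limit_bps:
        bandwidth_delay = (frame_size_bytes * 8) / bandwidth_limit_bps
    return max(delay_indication_s, bandwidth_delay)

# A 30 kB frame under a 100 kbit/s limitation needs at least 2.4 s between transmissions.
print(transmission_delay(30_000, delay_indication_s=1.0, bandwidth_limit_bps=100_000))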

Figure 2 discloses a simplified block diagram of an exemplary apparatus for wireless communication that is suitable for use in practicing the exemplary embodiments of at least a part of this invention. In Figure 2, the apparatus 200 may include at least one processor 202, at least one memory 204 coupled to the processor 202, a suitable transceiver 206 (having a transmitter (TX) and a receiver (RX)) coupled to the processor 202 and to an antenna unit 208, and a camera 210 coupled to the processor 202.

The processor 202, or some other form of generic central processing unit (CPU) or special-purpose processor such as a digital signal processor (DSP), may operate to control the various components of the apparatus 200 in accordance with embedded software and/or firmware stored in the memory 204 and/or stored in memory contained within the processor 202 itself. In addition to the embedded software or firmware, the processor 202 may execute other applications or application modules stored in the memory 204 or made available via wireless network communications. The application software may comprise a compiled set of machine-readable instructions that configures the processor 202 to provide the desired functionality, or the application software may be high-level software instructions to be processed by an interpreter or compiler to indirectly configure the processor 202.

The transceiver 206 is for bidirectional wireless communications with a wireless communication network. In some embodiments, the transceiver 206, portions of the antenna unit 208, and an analog baseband processing unit may be combined in one or more processing units and/or application specific integrated circuits (ASICs). The antenna unit 208 may be provided to convert between wireless signals and electrical signals, enabling the apparatus 200 to send and receive information from the wireless communications network. The antenna unit 208 may include antenna tuning and/or impedance matching components, RF power amplifiers, and/or low noise amplifiers.

The memory 204 or another memory coupled to the processor may be arranged to store data frames in a data frame buffer. A data frame may comprise an image frame and/or an audio frame corresponding to the image frame. If audio is to be stored in the data frame buffer, the apparatus 200 also comprises a microphone. The microphone may be a built-in microphone of the apparatus 200 or a separate microphone coupled to the apparatus 200. In some embodiments the data frame buffer 216 may comprise the image buffer 218 and optionally the audio frame buffer 222, or a first image buffer 218 for storing viewfinder images, the second image buffer 220 for storing full resolution images, and optionally the audio frame buffer 222.

The apparatus 200 is in one embodiment a mobile phone. It may also be any other apparatus that is able to establish a connection to a wireless communication network. The wireless communication network is any network that is able to wirelessly receive and transmit information to/from apparatuses. Examples of such networks include the Global System for Mobile Communications (GSM), Universal Mobile Telecommunications System (UMTS) and other mobile communication networks. In another embodiment, the wireless network is not a mobile communication network but a local wireless communication network.

In some embodiments the apparatus 200 may operate as follows. The user may start an imaging application 502, or it may be started by some other means, for example by an operating system of the apparatus 200. The imaging application 502 comprises computer code which, when executed by the processor 202, causes the apparatus 200 to take images and send them to the wireless network 400. The imaging application 502 may, for example, receive image information from an image sensor 212 of the camera and form so-called viewfinder images which are provided to a display 214 of the apparatus. These viewfinder (preview) images may have a smaller resolution than images to be stored and possibly provided for further processing. It may also be possible that viewfinder images are taken at a lower frame rate than images of a video clip or a video stream. The viewfinder images are stored e.g. to an image buffer 218 as frames or in another appropriate format. When one viewfinder image has been stored, the next viewfinder image may be stored to another location in the image buffer 218 if the previously stored viewfinder image is still needed in the buffer, for example if it has not yet been displayed and/or transmitted to the wireless network.

When there is at least one viewfinder image stored in the image buffer 218 and the apparatus 200 is able to send an image to the wireless network, one viewfinder image is selected for transmission and provided to the transceiver 206. The transceiver 206 takes care of the actual transmission operation by performing optional compression, encoding, transform operations, channel coding etc. so that the viewfinder image can be transmitted by the antenna unit 208 as signals which the wireless communication network is able to process.

In some situations the apparatus 200 may not be able to transmit the viewfinder images at the same frame rate at which the viewfinder images are provided by the camera 210. In such situations the newest (latest) image in the image buffer 218 may be selected for transmission. It may therefore happen that some of the viewfinder images are not transmitted at all, or that some or all of the older viewfinder images in the image buffer are transmitted at a later stage when the apparatus 200 is provided with enough bandwidth for the transmission and/or the apparatus 200 is otherwise able to transmit the latest viewfinder image and some previously stored viewfinder images.

In some embodiments it is not always the latest image which is selected to be transmitted; it may also be the second last image, or even both the latest and the second last image may be selected for transmission if the bandwidth enables it.

Figure 5 depicts an example of elements which may belong to the apparatus 200 to implement some embodiments of the present invention. Some or all of these elements may be implemented as software, hardware or both. In the example of Figure 5 these elements are located in the memory 204 of the apparatus 200 as software code, but the elements need not be located in the memory, and a part of the elements may be implemented outside the memory 204, e.g. in the firmware of the apparatus 200.

Figure 6 illustrates as an arrow diagram the operation of an example embodiment of the invention when viewfinder images are captured and transmitted.

The image capturing element 504 instructs 600 the camera 210 to take 602 an image by the image sensor 212 and to read 604 the image sensor to obtain the image information. The image information comprises pixel values which represent the image in a matrix form. The resolution of the image, i.e. the number of rows and columns of the image, depends inter alia on the resolution of the image sensor.

The image may be a grey scale image wherein a pixel value represents the brightness of the pixel, or a colour image in which one pixel value may comprise sub-pixel values which represent the colour information of the pixel. For example, in the so-called red-green-blue (RGB) colour system, each pixel contains red, green and blue sub-pixels. The number of sub-pixels of different colours may not be the same. For example, each pixel may comprise one red, one blue and two green sub-pixels.

In some embodiments the resolution of the image may be reduced by taking fewer pixels into account or by combining some pixel values in a region so that the pixel values within the region are represented as one pixel value, which may be an average of the pixel values or the result of another appropriate mathematical operation. The resolution may be handled e.g. by a resolution control element 506.
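
For example, representing each 2 x 2 region by the average of its pixel values could be sketched as follows (assuming NumPy is available and a grey scale image whose dimensions are divisible by the block size):

import numpy as np

def downscale_by_averaging(image, block=2):
    h, w = image.shape
    # Group the pixels into block x block regions and represent each region
    # by one pixel value: the average of the pixel values within the region.
    return image.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

image = np.arange(16, dtype=float).reshape(4, 4)   # a toy 4 x 4 grey scale image
print(downscale_by_averaging(image))               # a 2 x 2 image of region averages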

The arrangement of Figure 5 also comprises a viewfinder element 508 which takes care of providing image information to the display of the apparatus 200 so that the display can be used as a viewfinder. The viewfinder element 508 reads 606 the captured image information and uses the resolution control element 506 to reduce 608, when necessary, the resolution of the image to correspond with the resolution of the display of the apparatus 200. When the resolution has been reduced to correspond with the resolution of the display, the viewfinder element 508 provides 610 the image information to the display so that the captured image can be shown to the user of the apparatus 200.

The viewfinder element 508 also stores 612 the image information to the image buffer 218 at a vacant location. If there are no vacant storage places for the captured viewfinder image, the viewfinder image is stored e.g. at the location of the oldest image in the image buffer so that the oldest image is replaced with the captured (latest) image.

An image transmission element 510 takes care of operations for transmitting 614 the viewfinder image. The image transmission element 510 may, for example, examine whether the transceiver 206 is ready for transmission. If so, the image transmission element 510 reads 616 the latest image from the image buffer and provides 618 the image information to the transceiver 206. In some embodiments the image transmission element 510 may also perform some image processing operations before transmission. For example, the viewfinder image may be compressed to reduce the information to be transmitted, and it may also be encapsulated or transformed into an appropriate image format, such as JPEG.
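
A possible sketch of such an optional compression step, assuming the Pillow library is available (the quality value and function name are arbitrary choices):

import io
from PIL import Image

def prepare_for_transmission(viewfinder_image, quality=60):
    # Compress the viewfinder image into JPEG before handing it to the transceiver,
    # which reduces the amount of information to be transmitted.
    payload = io.BytesIO()
    viewfinder_image.save(payload, format="JPEG", quality=quality)
    return payload.getvalue()

# Stand-in viewfinder image; a real implementation would read it from the image buffer 218.
frame = Image.new("RGB", (320, 240), color=(40, 80, 120))
print(len(prepare_for_transmission(frame)), "bytes ready for the transceiver")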

The image transmission element 510 may also examine if it is possible to transmit more than one viewfinder image between capturing of two viewfinder images. In such a case the image transmission element 510 may further select the second last viewfinder image in the image buffer 218 to be transmitted.

Captured images may also be stored into the data frame buffer 216, e.g. to the second image buffer 220, in the original (full) resolution or in a reduced resolution so that a video stream can be obtained. Some such images are illustrated with labels FR1, FR2 and FRn in Figure 5. The individual images of the video stream may then be transmitted to the wireless network and/or stored into a storage medium. The transmission of the video stream may be performed when there is enough bandwidth for the transmission or by using another communication route or channel, if available.

In some embodiments the transmission of the viewfinder images may depend on one or more conditions so that the transmission may be initiated when one or more of the conditions are fulfilled. For example, the apparatus 200 may use a microphone 215 to monitor the environment of the apparatus 200, and when an audible sound is detected the microphone generates a signal which is analysed e.g. by the processor 202. When the processor 202 determines that the signal level is higher than a threshold, it begins to transmit the viewfinder images. In some other embodiments the processor 202 may filter the signal to determine whether the audible sound is in a certain wavelength range and, if so, begins the transmission of the viewfinder images. One purpose of the filtering is to prevent any kind of audible signal from triggering the transmission. This may be useful, e.g., when the purpose of the audible sound surveillance is to detect if someone breaks a window to intrude into a building, to detect if there is a leakage in a gas pipe, etc.
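
One hedged interpretation of this sound-triggered behaviour, assuming NumPy and an arbitrarily chosen frequency band and threshold (the application does not fix these values), is to measure how much of the microphone signal's energy falls inside the band of interest:

import numpy as np

def audio_triggers_transmission(samples, sample_rate, band=(2000.0, 6000.0), threshold=0.1):
    # Estimate the share of the signal energy inside the band of interest so that
    # not every audible sound starts the transmission of viewfinder images.
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / max(spectrum.sum(), 1e-12) >= threshold

t = np.arange(0, 0.1, 1.0 / 44100)
print(audio_triggers_transmission(np.sin(2 * np.pi * 3000 * t), 44100))   # True: tone in band
print(audio_triggers_transmission(np.zeros_like(t), 44100))               # False: silence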

In yet some embodiments the conditions for initiating the transmission of the viewfinder images may relate to the contents of the video signal captured by the camera 210, e.g. as follows. An example of this kind of system is depicted in Figure 8. The camera 210 captures viewfinder images which are stored to the image buffer 218. The viewfinder element 508 may, for example, inform 804 an image analyser 802 that a new viewfinder image has been captured and stored into the image buffer 218. When there are at least two viewfinder images VF1, VF2 stored in the image buffer 218, the image analyser 802 compares the contents of these two images or otherwise tries to determine how much these two viewfinder images differ from each other. If these two viewfinder images differ enough from each other, the transmission of the viewfinder images is initiated 806. The decision whether there are enough differences may be based on different kinds of features. For example, the number of differing pixel values at the same spatial locations in these two viewfinder images may be used as a basis for the decision, as well as a possible movement of an object within the viewing area of the camera, a change in the luminosity, etc.

In some embodiments only one or more regions of the viewing area may be subject to the comparison, so that only changes within such area(s) trigger the transmission of the viewfinder images.
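
A minimal sketch of such a comparison, assuming NumPy, using the fraction of changed pixels as the difference measure and an optional mask for the defined area(s) (the threshold and image sizes are invented for the example):

import numpy as np

def frames_differ_enough(first, second, min_changed_fraction=0.02, region_mask=None):
    # Count the pixels whose values changed between the two viewfinder images and
    # trigger the transmission when the changed fraction reaches the certain amount.
    changed = first != second
    if region_mask is not None:
        changed = changed & region_mask          # consider only the defined area(s)
        total = int(region_mask.sum())
    else:
        total = changed.size
    return changed.sum() / max(total, 1) >= min_changed_fraction

vf1 = np.zeros((240, 320), dtype=np.uint8)
vf2 = vf1.copy()
vf2[100:140, 150:200] = 255                      # an object appears in part of the scene
print(frames_differ_enough(vf1, vf2))            # True: enough pixels changed
print(frames_differ_enough(vf1, vf1))            # False: identical viewfinder images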

The above described embodiments may be utilized e.g. in connection with a surveillance system which monitors the surroundings of an area or a building to detect an occurrence of an unwanted condition. This kind of operation may reduce the usage of the communication bandwidth because viewfinder images are only transmitted when necessary.

The transmission of the viewfinder images follows the principles presented above, i.e. the newest viewfinder image is transmitted and, if the bandwidth of the wireless communication network allows, one or more of the previous, unsent viewfinder images are transmitted, if such images exist in the image buffer 218.

It should be noted here that also in the above described surveillance embodiments the transmission of the viewfinder images may be accompanied by the transmission of audio frames.

In some surveillance embodiments the triggering of the transmission of viewfinder images may also trigger the recording of full resolution images FR1, FR2,... to the memory so that the recorded full resolution images may later be analysed to detect the cause of the triggering. When the bandwidth allows, the full resolution images may also be transmitted to the server 300 so that the analyses can be performed e.g. in a control room. In some embodiments the transmission of the full resolution images may be requested by the server 300 and/or the transmission may be initiated by the apparatus 200 when it determines that there is enough communication capacity for the transmission.

In some embodiments it is possible that the full resolution images are also recorded e.g. to the second image buffer 220 in a circular manner, wherein a certain number of captured full resolution images are stored in the memory and, when the circular buffer is full, a new image is stored onto the oldest image. This kind of operation makes it possible to include in the analyses some full resolution images captured before the occurrence of the triggering event. In other words, the second image buffer 220 contains some full resolution images which have been captured a few moments before the occurrence of the event which triggered the transmission of the viewfinder images and the recording of the full resolution images for analysis purposes.

Figure 3 discloses a simplified block diagram of an exemplary storage server 300 that is suitable for use in practicing the exemplary embodiments of at least part of this invention, and Figure 7 depicts an example of a system in which the invention can be implemented. In Figure 3, the storage server 300 may include at least one processor 302, at least one memory 304 coupled to the processor 302, and a network interface 306 coupled to the processor 302. The processor 302, or some other form of generic central processing unit (CPU) or special-purpose processor such as a digital signal processor (DSP), may operate to control the various components of the storage server 300 in accordance with embedded software or firmware stored in the memory 304 or stored in memory contained within the processor 302 itself. In addition to the embedded software or firmware, the processor 302 may execute other applications or application modules stored in the memory 304. The application software may comprise a compiled set of machine-readable instructions that configures the processor 302 to provide the desired functionality, or the application software may be high-level software instructions to be processed by an interpreter or compiler to indirectly configure the processor 302.

The storage server 300 is configured to receive data frames from an apparatus 200 depicted in Figures 2, 4 and 7 via the network interface 306. The data frames may include image frames and/or audio frames corresponding to the image frames. The apparatus transmits the latest data frame in its data frame buffer to the storage server 300. Thus, the apparatus 200 may not always transmit data in chronological order, since the latest or the second latest data frame is always sent and the data frame buffer may still store older data frames which have not been sent yet. The storage server 300 has various ways to act upon receiving the data frames.

In one embodiment, the storage server reconstructs image frames received from the apparatus into a video stream to be played back to a receiver 700 (Figure 7). This may require that enough data frames have been received from the apparatus 200 in chronological order.

In another embodiment, the storage server may stream the received image frames directly to a receiver 700. The storage server may use the same data frame transmission process as the apparatus 200. In other words, the storage server sends to the receiver the data frame that was most recently received from the apparatus. Depending on the receiver to which the data frames are sent, the receiver may not be able to receive data from the storage server as fast as the apparatus sends data to the storage server. In such a case, the storage server stores all data frames received from the apparatus and, when the transmission towards the receiver occurs, the storage server sends the latest data frame received from the apparatus. Thus, the receiver may not receive all data frames that the apparatus sent to the storage server. Furthermore, the data frames the receiver receives from the storage server may not be in chronological order.

In one embodiment of Figure 3, the storage server may send a resolution indication to the apparatus 200. The indication indicates to the apparatus which resolution should be used to capture image frames. The resolution indication may be used to further limit the data transmission bandwidth usage of the apparatus.

In another embodiment of Figure 3, the storage server may send to the apparatus an indication setting at least one of a delay between two consecutive frame transmissions and a bandwidth limitation for transmitting the data frames. In other words, it is possible for the storage server to manage the transmission bandwidth usage of the apparatus.

In another embodiment of Figure 3, the storage server stores all data frames received from the apparatus temporarily or permanently. This provides an opportunity to generate a chronological stream from the data frames. This may be necessary e.g. when it is later necessary to see what happened during a certain time interval in the stream.

In another embodiment of Figure 3, the storage server may push data frames to several receivers and/or several receivers may pull the same or different data from the storage server.

Another embodiment of the invention may provide a system that comprises an apparatus for wireless communication as disclosed in relation to Figure 2 and a storage server as disclosed in relation to Figure 3.

Embodiments of the present invention may be implemented in software, hardware, application logic or a combination of software, hardware and application logic. In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.

Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.

It is also noted herein that while the above describes example embodiments of the invention, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.

Claims

1. A method, comprising:
receiving by an apparatus an image captured by an image sensor of a camera;
scaling the image to correspond with a resolution of a display of the apparatus, wherein the scaled image represents a viewfinder image;
preparing the viewfinder image to be stored in a data frame buffer;
storing the viewfinder image in the data frame buffer;
selecting a viewfinder image from the data frame buffer for transmission; and
providing the selected viewfinder image for transmission to a wireless communications network.
2. The method according to claim 1, wherein the data frame comprises at least one of an image frame and an audio frame.
3. The method according to claim 1 or 2, wherein the data frame buffer comprises two or more viewfinder images, and the latest viewfinder image is selected for transmission.
4. The method according to claim 1, 2 or 3, wherein the scaling comprises reducing the resolution of the image.
5. The method according to any of the claims 1 to 4, wherein the data frame recording process comprises storing an image frame and an audio frame corresponding to the viewfinder image to the data frame buffer; and transmitting the image frame and the audio frame from the data buffer to the wireless communications network.
6. The method according to any of claims 2 to 5, further comprising:
receiving a resolution indication via the wireless communication network from a server which sets a resolution which should be used to capture image frames;
comparing the resolution of the viewfinder image to the resolution indication; and
reducing the resolution of the viewfinder image, if the comparison indicates that the resolution of the viewfinder image is larger than the resolution indication indicates.
7. The method according to any of claims 1 to 6, further comprising:
receiving from a server via the wireless communications network an indication setting at least one of a delay between two consecutive frame transmissions and a bandwidth limitation for transmitting the data frames.
8. The method according to any of the claims 1 to 7, further comprising:
before storing the data frame in the data buffer, dropping an oldest data frame from the data buffer when the data buffer is full.
9. The method according to claim 1 comprising transmitting the viewfinder image to a wireless communications network when the viewfinder image is ready for transmission.
10. The method according to any of the claims 1 to 9, further comprising:
receiving a first viewfinder image and a second viewfinder image;
comparing the first viewfinder image and the second viewfinder image to detect if the first viewfinder image and the second viewfinder image differ from each other by a certain amount; and
if the comparison indicates that the first viewfinder image and the second viewfinder image differ from each other by at least the certain amount, initiating the transmission of the selected viewfinder image.
11. The method according to claim 10 further comprising defining one or more areas in the viewfinder images, wherein the comparing is performed on the basis of image contents within the one or more areas .
12. An apparatus for wireless communication, comprising:
a transceiver for receiving and transmitting data to and from a wireless communication network;
means for receiving an image captured by an image sensor of a camera;
means for scaling the image to correspond with a resolution of a display of the apparatus, wherein the scaled image represents a viewfinder image;
means for preparing the viewfinder image to be stored in a data frame buffer;
means for storing the viewfinder image in the data frame buffer;
means for selecting a viewfinder image from the data frame buffer for transmission; and
means for providing the selected viewfinder image for transmission to a wireless communications network.
13. The apparatus according to claim 12, wherein the data frame comprises at least one of an image frame and an audio frame.
14. The apparatus according to claim 12 or 13, wherein the camera is a part of the apparatus.
15. The apparatus according to claim 12, 13 or 14, wherein the data frame buffer comprises two or more viewfinder images, and the apparatus is adapted to select the latest viewfinder image for transmission.
16. The apparatus according to any of the claims 12 to 15, wherein the means for scaling are adapted to reduce the resolution of the image.
17. The apparatus according to any of the claims 12 to 16, wherein the apparatus further comprises means for storing an audio frame corresponding to the viewfinder image to the data frame buffer; and further wherein the apparatus is adapted to transmit the image frame and the audio frame from the data buffer to the wireless communications network.
18. The apparatus according to any of the claims 12 to 17, further comprising:
means for receiving a resolution indication via the wireless communication network from a server which sets a resolution which should be used to capture viewfinder images.
19. The apparatus according to any of the claims 12 to 18, further comprising:
means for receiving from a server via the wireless communications network an indication setting at least one of a delay between two consecutive frame transmissions and a bandwidth limitation for transmitting the data frames.
20. The apparatus according to any of the claims 12 to 19, wherein the means for storing are adapted, before the data frame is stored in the data buffer, to drop an oldest data frame from the data buffer when the data frame buffer is full.
21. The apparatus according to any of the claims 12 to 20, wherein the apparatus is adapted to prioritize the data frame transmission process over a data frame recording process.
22. The apparatus according to any of the claims 12 to 21, further comprising:
means for receiving a first viewfinder image and a second viewfinder image;
means for comparing the first viewfinder image and the second viewfinder image to detect if the first viewfinder image and the second viewfinder image differ from each other by a certain amount; and
means for initiating the transmission of the selected viewfinder image if the comparison indicates that the first viewfinder image and the second viewfinder image differ from each other by at least the certain amount.
23. The apparatus according to claim 22 further comprising means for defining one or more areas in the viewfinder images, wherein the means for comparing are adapted to perform the comparison on the basis of image contents within the one or more areas.
24. A computer program product comprising a computer-readable medium bearing computer program code embodied therein for use with a computer, the computer program code comprises code for performing the method of any of the claims 1 to 11.
PCT/FI2012/050282 2012-03-21 2012-03-21 Method, apparatus and computer program product for managing transmission bandwidth WO2013140017A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2012/050282 WO2013140017A1 (en) 2012-03-21 2012-03-21 Method, apparatus and computer program product for managing transmission bandwidth

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2012/050282 WO2013140017A1 (en) 2012-03-21 2012-03-21 Method, apparatus and computer program product for managing transmission bandwidth

Publications (1)

Publication Number Publication Date
WO2013140017A1 (en)

Family

ID=46025767

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2012/050282 WO2013140017A1 (en) 2012-03-21 2012-03-21 Method, apparatus and computer program product for managing transmission bandwidth

Country Status (1)

Country Link
WO (1) WO2013140017A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994003014A1 (en) * 1992-07-24 1994-02-03 Koz Mark C Low power video security monitoring system
US20020141619A1 (en) * 2001-03-30 2002-10-03 Standridge Aaron D. Motion and audio detection based webcamming and bandwidth control
US20050018766A1 (en) * 2003-07-21 2005-01-27 Sony Corporation And Sony Electronics, Inc. Power-line communication based surveillance system
US20070073937A1 (en) * 2005-09-15 2007-03-29 Eugene Feinberg Content-Aware Digital Media Storage Device and Methods of Using the Same
US20070254640A1 (en) * 2006-04-27 2007-11-01 Bliss Stephen J Remote control and viewfinder for mobile camera phone

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016197792A1 (en) * 2015-12-17 2016-12-15 中兴通讯股份有限公司 Data sharing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12718279

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 12718279

Country of ref document: EP

Kind code of ref document: A1