AU2004200370B2 - Video player with code sensor - Google Patents

Video player with code sensor

Info

Publication number
AU2004200370B2
Authority
AU
Australia
Prior art keywords
pct
video
player
auoo
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2004200370A
Other versions
AU2004200370A1 (en)
Inventor
Paul Lapstun
Kia Silverbrook
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Silverbrook Research Pty Ltd
Original Assignee
Silverbrook Research Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU16806/01A external-priority patent/AU767226B2/en
Application filed by Silverbrook Research Pty Ltd filed Critical Silverbrook Research Pty Ltd
Publication of AU2004200370A1 publication Critical patent/AU2004200370A1/en
Application granted granted Critical
Publication of AU2004200370B2 publication Critical patent/AU2004200370B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Description

VIDEO PLAYER OPERATING WITH CODED SUBSTRATE

FIELD OF INVENTION

The present invention relates to devices for receiving machine-readable input and for outputting a human discernible output, usually in the form of audio or visual information. More particularly, the invention relates to a video player which may be used to scan or sense machine-readable coded data on a surface and to output a still image, a video clip or audio material, or both audio and visual material.
CO-PENDING APPLICATIONS

Various methods, systems and apparatus relating to the present invention are disclosed in the following co-pending applications filed by the applicant or assignee of the present invention simultaneously with the present invention: PCT/AU00/01442, PCT/AU00/01444, PCT/AU00/01446, PCT/AU00/01445, PCT/AU00/01450, PCT/AU00/01453, PCT/AU00/01448, PCT/AU00/01447, PCT/AU00/01459, PCT/AU00/01451, PCT/AU00/01454, PCT/AU00/01452, PCT/AU00/01443, PCT/AU00/01455, PCT/AU00/01456, PCT/AU00/01457, PCT/AU00/01458 and PCT/AU00/01449.
The disclosures of these co-pending applications are incorporated herein by cross-reference.

Various methods, systems and apparatus relating to the present invention are disclosed in the following co-pending applications filed by the applicant or assignee of the present invention on 20 October 2000: PCT/AU00/01273, PCT/AU00/01279, PCT/AU00/01288, PCT/AU00/01282, PCT/AU00/01276, PCT/AU00/01280, PCT/AU00/01274, PCT/AU00/01289, PCT/AU00/01275, PCT/AU00/01277, PCT/AU00/01286, PCT/AU00/01281, PCT/AU00/01278, PCT/AU00/01287, PCT/AU00/01285, PCT/AU00/01284 and PCT/AU00/01283.
The disclosures of these co-pending applications are incorporated herein by cross-reference.
Various methods, systems and apparatus relating to the present invention are disclosed in the following co-pending applications filed by the applicant or assignee of the present invention on 15 September 2000: PCT/AU00/01108, PCT/AU00/01110 and PCT/AU00/01111.
The disclosures of these co-pending applications are incorporated herein by cross-reference.
Various methods, systems and apparatus relating to the present invention are disclosed in the following co-pending applications filed by the applicant or assignee of the present invention on 30 June 2000: PCT/AU00/00762, PCT/AU00/00763, PCT/AU00/00761, PCT/AU00/00760, PCT/AU00/00759, PCT/AU00/00758, PCT/AU00/00764, PCT/AU00/00765, PCT/AU00/00766, PCT/AU00/00767, PCT/AU00/00768, PCT/AU00/00773, PCT/AU00/00774, PCT/AU00/00775, PCT/AU00/00776, PCT/AU00/00777, PCT/AU00/00770, PCT/AU00/00769, PCT/AU00/00771, PCT/AU00/00772, PCT/AU00/00754, PCT/AU00/00755, PCT/AU00/00756 and PCT/AU00/00757.
The disclosures of these co-pending applications are incorporated herein by cross-reference.
Various methods, systems and apparatus relating to the present invention are disclosed in the following co-pending applications filed by the applicant or assignee of the present invention on 24 May 2000: PCT/AU00/00518, PCT/AU00/00523, PCT/AU00/00528, PCT/AU00/00533, PCT/AU00/00538, PCT/AU00/00543, PCT/AU00/00554, PCT/AU00/00560, PCT/AU00/00565, PCT/AU00/00570, PCT/AU00/00575, PCT/AU00/00581, PCT/AU00/00589, PCT/AU00/00592, PCT/AU00/00519, PCT/AU00/00524, PCT/AU00/00529, PCT/AU00/00534, PCT/AU00/00539, PCT/AU00/00544, PCT/AU00/00556, PCT/AU00/00561, PCT/AU00/00566, PCT/AU00/00571, PCT/AU00/00576, PCT/AU00/00580, PCT/AU00/00583, PCT/AU00/00594, PCT/AU00/00520, PCT/AU00/00525, PCT/AU00/00530, PCT/AU00/00535, PCT/AU00/00540, PCT/AU00/00545, PCT/AU00/00557, PCT/AU00/00562, PCT/AU00/00567, PCT/AU00/00572, PCT/AU00/00577, PCT/AU00/00582, PCT/AU00/00593, PCT/AU00/00595, PCT/AU00/00521, PCT/AU00/00526, PCT/AU00/00531, PCT/AU00/00536, PCT/AU00/00541, PCT/AU00/00547, PCT/AU00/00558, PCT/AU00/00563, PCT/AU00/00568, PCT/AU00/00573, PCT/AU00/00578, PCT/AU00/00587, PCT/AU00/00590, PCT/AU00/00596, PCT/AU00/00522, PCT/AU00/00527, PCT/AU00/00532, PCT/AU00/00537, PCT/AU00/00542, PCT/AU00/00546, PCT/AU00/00559, PCT/AU00/00564, PCT/AU00/00569, PCT/AU00/00574, PCT/AU00/00579, PCT/AU00/00588, PCT/AU00/00591, PCT/AU00/00597, PCT/AU00/00598, PCT/AU00/00516, PCT/AU00/00517 and PCT/AU00/00511.
The disclosures of these co-pending applications are incorporated herein by cross-reference.
BACKGROUND
Purpose-specific devices such as televisions, video cassette player/recorders and camcorders, as well as more general-purpose devices such as personal computers, can be used to play back video material such as movies. Many of these devices can also be used to record video material, such as home movies.
In general, these devices don't provide access to situated video, e.g. to a video clip associated with a concert poster encountered at a train station, or a demonstration video associated with a page in a workshop manual. Each device must be used to seek out the desired video material through a virtual space accessible through the device, or the video material must be brought to the device in a device-compatible physical format.
The present invention utilizes methods, systems and devices related to a system referred to as "netpage", described in our co-pending applications listed above, wherein invisible coded data is disposed on various surfaces to render the surfaces interactive in the manner of graphical user interfaces to computer systems.
SUMMARY OF INVENTION

In one broad form the invention provides a hand-held video player with one or more sensors capable of sensing images including coded data. Images including coded data are sensed by the player and decoded. The decoded information is transmitted to a computer system which associates the decoded data with video data stored on the system, using previously stored association data. The video data is transmitted back to the player and output to the user on a display screen.
Accordingly, in one broad form, the invention provides a video player including: at least one sensor for sensing coded data on or in a substrate and for generating first data; a transmitter for transmitting said first data, or second data at least partially based on the first data, to a computer system; a receiver for receiving, from the computer system, video data associated with an identity derived from the first data; and at least one display device for outputting a visual display based at least partially on the video data.
The player preferably includes a touch screen and the visual output includes interactive elements by which the user may modify the visual output by interaction with the touch screen.
The player may also include memory into which a file or files are downloaded for subsequent viewing.
A video player according to one embodiment of the present invention is a compact device which, when used to click a video clip object or hyperlink on a page including machine-readable code, plays back the associated video clip. It contains a sensor for reading the machine-readable code, a transceiver for communicating with a base station, a compressed video and audio decoder, a color display, a display controller, an audio digital-to-analog converter, a small audio amplifier, a speaker, a stereo headphone socket, a volume control potentiometer, and a controlling processor. It also typically contains playback controls such as stop, play/pause, skip back and skip forward, although these may alternatively be provided in printed form on a coded substrate. Video clip information, such as title, chapter number, playing time, and pause status, is shown transiently on the color display, superimposed on the video.
When the user touches the video player to a video clip object or hyperlink on a coded page, the player senses the page via its page sensor, and decodes the link in the usual way via the base station. It then streams the video from the page network via the base station, subject to the user's interaction with the playback controls.
Audio is routed to the speaker, or, if a plug is present in the stereo headphone socket, to the attached headphones or other external audio device. Audio may also be routed to wireless headphones via the transceiver, either directly from the base station or via the player.
The video player may operate out of range of a netpage base station if it is configured to utilize an embedded mobile telephone transceiver, or if it is itself embedded in a mobile telephone with which it is configured to interoperate.
BRIEF DESCRIPTION OF DRAWINGS

Figure 1 is a perspective view from above of a video player according to an embodiment of the invention;
Figure 2 is a perspective view from below of the figure 1 player;
Figure 3 is an exploded perspective view from above of the figure 1 player;
Figure 4 is a side view from the rear of the player of figure 1;
Figure 5 is a plan view from above of the figure 1 player;
Figure 6 is a side view from the right of the figure 1 player;
Figure 7 is a cross-sectional view taken along line AA of the figure 1 player;
Figure 8 is a perspective view from above of the internal components of the figure 1 player;
Figure 9 is a perspective view from below of the internal components of the figure 1 player;
Figure 10 is an exploded perspective view of the underside of the PCB of the figure 1 player;
Figure 11 is a perspective view showing the figure 1 player in use on a coded substrate; and
Figure 12 is a block diagram of the electronics of the player.
DETAILED DESCRIPTION OF PREFERRED AND OTHER EMBODIMENTS

Referring to the drawings, there is shown a display unit 100. The display unit has a color LCD screen 102, five control buttons 104, a speaker 106, a volume control 108, an audio out jack 110 and an infrared sensor device 112.
The body of the player is formed of upper and lower moldings 114 and 116 respectively, which encapsulate a PCB 118; all of the electronic components are mounted on or connected to this PCB. Power for the player is supplied by a rechargeable 3 volt lithium ion battery 120. The battery is not user replaceable and is located within the body. The two moldings 114 and 116 clip together at the upper edge 122 and are held together by three screws 124 extending through apertures 126 in the lower molding to engage in threaded studs (not shown) in the upper molding 114. The screws pass through holes 128 in the PCB and sandwich the PCB between lower bosses 130 formed in the lower molding 116 and the studs in the upper molding 114.
The LCD screen 102 is mounted on the top of the PCB and is preferably a full color thin film transistor type display, or a display having similar or better contrast and switching speeds; such displays allow video to be played at acceptable qualities. The video player may include a touch screen overlay 132.
A combined data and power connector 134 is connected to the PCB and is accessible via opening 136 in the body. The connector 134 includes a power input socket 138 for recharging of the battery 120 and a data socket 140 for input/output of data to and from the player. The connector 134 is connected to the PCB via a flexible PCB 142 and connectors 144 and 146.
The speaker 106 is also connected to the PCB at its lower edge via connector 148 and is positioned between the PCB and the lower edge of the body. The speaker may be hard wired to the PCB rather than being provided with a removable connector. The audio output is used to play audio associated with the video, and optionally to provide operating feedback to the user.
The battery 120 is also positioned between the PCB and the lower edge of the body. A removable connector 150 is provided to connect the battery to the PCB. However, since the battery is not intended to be user replaceable, a permanent connection may be used. The battery is preferably shaped to fit the body and accordingly has a beveled corner 152.
The volume control 108 and audio out jack 110 are mounted directly on the right hand side of the PCB and are accessible via openings 154 and 156 respectively in the right hand side of the body.
The PCB also includes a compressed audio visual decoder 178, processor chip 180, dynamic RAM (DRAM) 182, flash or ROM 184, display controller 250 for controlling the LCD, a transceiver chip 188 and an aerial 190.
Mounted on the lower surface of the PCB, at the top left hand corner, is the sensor device 112 which is capable of detecting infrared markings on a substrate. The sensor device 112 comprises an LED 160 and an image sensor 162. The image sensor 162 is sensitive to infrared light, either inherently or by use of filters, and the LED 160 emits infrared light, again inherently or by use of filters. In use, infrared light is emitted from the LED 160 and passes through a light guide 164 and then through an aperture 166 in the body formed by the two moldings 114, 116. Light returning from the substrate passes through the aperture 166 and the optical molding 164 and is focused onto the image sensor 162. The optical molding preferably includes a lens 168 and may include a beam splitter/combiner to allow light from the LED and light to the image sensor to follow the same path.
The infrared LED 160 may be strobed in synchrony with image capture to prevent motion-blurring of captured tag images. The image sensor typically consists of a 200x200 pixel CMOS or CCD image sensor with a near-infrared bandpass filter.
The free end portion 170 of the light guide 164 is cylindrical and a force sensor collar 172 is slidably mounted on the cylindrical portion. The collar extends beyond the free end of portion 170 so that when the player is pushed against a surface the collar, rather than the lens 168, contacts the surface. The collar has a limited amount of movement so that when pressed against a surface there will be a small gap between the surface and the lens 168. The lens 168 is designed to have the surface in focus at most orientations of the player to the surface; it will be appreciated that with different contact angles the distance will change, and so the depth of field needs to encompass this range.
The collar 172 extends around and behind the sensor device 112 and engages a micro switch 174.
The micro switch is biased to an open position; pushing on the collar against the micro switch 174 overcomes the biasing means in the switch 174 and closes it. When the force is removed the biasing means urges the collar outward and opens the switch again. A separate biasing means may be provided external of the switch 174 in addition to the micro switch's own biasing means. The sensor device is capable of detecting infrared absorptive tags, such as netpage tags. For a full description of the electronic processes involved, reference is made to our co-pending application PCT/AU00/00565 referred to earlier. The CCD 162, the LED 160 and processing functions incorporated in the processor chip 180 are similar to those disclosed in the co-pending application.
The player is thus capable of sensing and decoding netpage tags on a substrate. Image data captured by the CCD 162 is sent to the processor chip 180 and decoded to a region ID (or page ID) and a tag ID. The region ID and tag ID, together with other necessary information, are transmitted to the netpage system via the transceiver chip 188 and aerial 190. The netpage system resolves the region ID and tag ID to corresponding document data which it transmits back to the player. This is received by the processor 180 via the transceiver. The processor then renders the data for display on the LCD 102 via the display controller 250.
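By way of illustration only, the click-to-playback exchange just described can be sketched as a simple request/response flow. The Python below is a hypothetical sketch under stated assumptions: the names (TagReading, PageServer, resolve, on_click) and the lookup table are stand-ins for the netpage protocol of the co-pending applications, not its actual interface.

    # Illustrative sketch only: all names and the lookup scheme are hypothetical.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass(frozen=True)
    class TagReading:
        region_id: int  # identifies the page or region the sensed tag belongs to
        tag_id: int     # identifies the tag, and hence a position, within that region

    class PageServer:
        """Stands in for the computer system holding previously stored association data."""
        def __init__(self):
            # association data: (region, tag) -> video resource
            self._associations = {(7041, 12): "rtsp://example.net/video/concert-promo"}

        def resolve(self, reading: TagReading) -> Optional[str]:
            return self._associations.get((reading.region_id, reading.tag_id))

    def on_click(reading: TagReading, server: PageServer) -> None:
        """What the player does when touched to a video clip object on a coded page."""
        video = server.resolve(reading)      # transmit the IDs, receive the video reference
        if video is None:
            print("no video associated with this tag")
        else:
            print(f"streaming {video} to the display")  # decoder / display controller path

    on_click(TagReading(region_id=7041, tag_id=12), PageServer())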
Component digital video for standard-definition television (SDTV) is sampled at 720 active luminance samples and 360 active chrominance (red and blue color difference) samples per line, with about 488 active lines per frame in 525/59.94 systems and 576 active lines per frame in 625/50 systems. 525/59.94 systems have a frame rate of 29.97 Hz, corresponding to an interlaced field rate of 59.94 Hz, while 625/50 systems have a frame rate of 25 Hz, corresponding to an interlaced field rate of 50 Hz. With 8-bit sampling per component, i.e. 16 bits per pixel at the luminance sampling rate, both systems have a data rate of about 21 Mbytes/s. This is somewhat lower than the data rate implied by the 13.5 MHz luminance sample rate, since each line contains inactive samples and each frame contains inactive lines. Component digital video for SDTV is known as 4:2:2, in reference to the ratios of its luminance and chrominance sampling rates to a base frequency of 3.375 MHz.
In 4:2:0 video, chrominance is also subsampled by a factor of two with respect to luminance in the vertical dimension, giving 12 bits per pixel at the luminance sampling rate, and a data rate of about 16 Mbytes/s. 4:2:0 is commonly used in JPEG and MPEG compression, discussed below.
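As a check on the figures above, the raw data rates follow directly from the sampling structure. The short calculation below reproduces the approximate 21 Mbyte/s (4:2:2) and 16 Mbyte/s (4:2:0) figures; it assumes the active-line counts quoted above and is illustrative only.

    # Raw (uncompressed) SDTV data rates implied by the sampling structure above.
    def bytes_per_second(samples_per_line, active_lines, frame_rate, bits_per_sample=8):
        return samples_per_line * active_lines * frame_rate * bits_per_sample / 8

    # 4:2:2 -- 720 luminance + 2 x 360 chrominance samples per active line (16 bits/pixel)
    for name, lines, fps in [("525/59.94", 488, 29.97), ("625/50", 576, 25.0)]:
        rate = bytes_per_second(720 + 360 + 360, lines, fps)
        print(f"4:2:2 {name}: {rate / 1e6:.1f} Mbyte/s")   # ~21 Mbyte/s

    # 4:2:0 -- chrominance also halved vertically, i.e. 12 bits per pixel on average
    for name, lines, fps in [("525/59.94", 488, 29.97), ("625/50", 576, 25.0)]:
        rate = 720 * lines * fps * 12 / 8
        print(f"4:2:0 {name}: {rate / 1e6:.1f} Mbyte/s")   # ~16 Mbyte/s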
Video is also commonly sampled at, or subsampled to, 360 (352) luminance samples per line and 240 lines per frame (in 525/59.94) or 288 lines per frame (in 625/50), particularly in low-rate versions of MPEG and related video compression standards, giving a data rate of about 4.8 Mbytes/s. The MPEG variant is referred to as SIF (Standard Interchange Format), while the International Telecommunications Union's videoconferencing variant is referred to as CIF (Common Intermediate Format). The square-pixel equivalent is normally taken to be 320x240.
For both storage and communications purposes, there is a strong imperative to compress a digital video stream. In the JPEG still image compression standard, blocks of an image are transformed into the frequency domain via a discrete cosine transform (DCT). This has the effect of concentrating image energy in relatively lower-frequency coefficients, allowing higher-frequency coefficients to be more crudely quantized, which is the principal source of compression in JPEG. At compression ratios of 10:1 and below, JPEG introduces negligible image degradation. In the MPEG video compression standard, periodic reference frames are coded using a scheme similar to JPEG, to support random access. Remaining frames are coded using local block-based motion vectors, with reference to an earlier frame or to an earlier frame and a future frame.
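The transform-and-quantize idea behind JPEG-style intra coding can be illustrated with a minimal numpy sketch. This is not the full JPEG pipeline (there is no zig-zag ordering or entropy coding) and the quantization step matrix is an arbitrary assumption chosen only to show coarser quantization of higher frequencies.

    import numpy as np

    def dct_matrix(n=8):
        # Orthonormal DCT-II basis: row u, column x -> alpha(u) * cos(pi*(2x+1)*u / 2n)
        u = np.arange(n)[:, None]
        x = np.arange(n)[None, :]
        c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * u / (2 * n))
        c[0, :] = np.sqrt(1.0 / n)
        return c

    C = dct_matrix()
    block = np.arange(64, dtype=float).reshape(8, 8)   # stand-in for a smooth 8x8 image block
    coeffs = C @ block @ C.T                           # 2-D DCT: energy gathers in low frequencies
    step = 1.0 + 4.0 * np.add.outer(np.arange(8), np.arange(8))  # coarser steps at high frequencies
    quantized = np.round(coeffs / step)                # the lossy step: many high-frequency zeros
    restored = C.T @ (quantized * step) @ C            # dequantize and apply the inverse DCT
    print(int((quantized == 0).sum()), "of 64 coefficients quantized to zero")
    print("worst-case reconstruction error:", round(float(np.abs(restored - block).max()), 2))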
The MPEG-1 compression standard, targeted at the 1.5 Mbit/s data rate of compact discs, is generally credited with achieving the same quality as VHS tapes at a bit rate of 1.2 Mbit/s, when operating at SIF resolution at 30 Hz non-interlaced.
Digital audio is usually sampled at 44.1 kHz, i.e. at twice the 22.05 kHz upper cutoff frequency of the 25th critical band of human hearing. Slightly higher sampling frequencies, such as 48 kHz, are sometimes used because in reality it is impractical to lowpass filter the audio with a sharp cutoff at 22.05 kHz prior to sampling. With typical quantization of 16 bits per channel, a stereo signal therefore generates 1.41 Mbit/s, and this is consequently the data rate of many common digital audio applications, including, for example, the audio compact disc (if error correction overhead is ignored).
Because the typical 1.41 Mbit/s digital audio data rate is non-trivial, there is a strong incentive to also compress the digital audio signal. The most successful digital audio compression schemes have a perceptual basis, i.e. they exploit the frequency-dependence of the threshold of human hearing, and signal-dependent masking, whereby a relatively louder tone can locally raise the threshold curve and thus mask relatively softer adjacent tones. Audio compression also typically relies on traditional compression techniques such as entropy coding. In multi-channel audio, inter-channel redundancy is also commonly exploited.
The MPEG Audio Layer 3 (MP3) standard uses perceptual coding to achieve 'near-CD' and 'CD' quality reproduction at compression ratios of between 16:1 and 12:1, i.e. reducing the data rate from 1.41 Mbit/s to between 88 Kbit/s and 118 Kbit/s.
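The audio figures quoted above follow from the same kind of arithmetic; the short calculation below simply reproduces them and is illustrative only.

    # Reproducing the audio data-rate figures quoted above.
    sample_rate = 44_100      # Hz
    bits = 16                 # per sample per channel
    channels = 2              # stereo
    pcm = sample_rate * bits * channels
    print(f"uncompressed stereo PCM: {pcm / 1e6:.2f} Mbit/s")         # 1.41 Mbit/s

    for ratio in (16, 12):
        print(f"MP3 at {ratio}:1 -> {pcm / ratio / 1e3:.0f} kbit/s")  # ~88 and ~118 kbit/s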
The player includes a dedicated compressed video and audio decoder 178 which produces square-pixel progressive-scan digital video and digital audio output. For example, to handle MPEG-1 encoded video and audio, a video and audio decoder similar to a C-Cube CL680 decoder may be used. To handle MPEG-2 encoded video and audio, a video and audio decoder similar to C-Cube's ZiVA-3 decoder may be used. An MPEG-1 decoder typically uses a 4 Mbit DRAM during decoding, while an MPEG-2 decoder typically uses a 16 Mbit SRAM during decoding. The decoder memory 179 may be dedicated to the decoder, or may be part of a memory 182 shared with the processor.
Any of a number of other video and audio encoding standards may be supported via suitable video and audio decoders, including RealNetworks' RealVideo.
Digital video decompressed by the decoder is displayed on the color display 102 via a display controller 250.
Digital audio decompressed by the decoder is converted to analog via a digital-to-analog converter (DAC) 183, is amplified by an amplifier 185 subject to the volume control, and is output to a speaker 106 or to an external audio device via an audio jack 110. The speaker 106 is disabled when the audio jack is in use.
The processor chip contains a processor unit 181 which controls and coordinates the various electronic components of the player. The processor unit 181 executes software which monitors, via the tag sensor 162, the identity of the underlying page and the position of the player relative to the page; communicates the identity and position data to the netpage base station via a wireless transceiver 188; receives video clip information and streaming audio data from the base station via the transceiver 188; displays clip information on the status display 102; decompresses streaming audio data to the audio output via the audio decoder 183; and interprets user input captured via the user interface buttons 104. The embedded software executed by the processor is stored in the non-volatile memory 184, typically in the form of ROM and/or flash memory. Identity information unique to the player, as well as communications encryption keys, are also stored in non-volatile memory. During execution the processor utilizes faster volatile memory, typically in the form of a 64 Mbit (8 Mbyte) dynamic RAM (DRAM).
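The division of labour described in the preceding paragraph can be pictured as a simple event loop running on the processor unit 181. The sketch below is a hypothetical outline only; the object and method names mirror the reference numerals used above but are placeholders, not the player's actual embedded firmware.

    # Hypothetical outline of the embedded control loop (placeholder names throughout).
    def main_loop(tag_sensor, transceiver, decoder, display, buttons):
        while True:
            if tag_sensor.clicked():                       # force-sensor micro switch 174 closed
                region_id, tag_id = tag_sensor.read()      # image sensor 162 plus tag decoding
                transceiver.send_click(region_id, tag_id)  # to the netpage base station (188)

            packet = transceiver.poll()                    # clip information or compressed data
            if packet is not None:
                if packet.is_clip_info:
                    display.show_overlay(packet.title, packet.playing_time)
                else:
                    decoder.feed(packet.payload)           # dedicated video/audio decoder 178

            for press in buttons.poll():                   # play/pause, stop, skip (buttons 104)
                transceiver.send_control(press)            # playback is adjusted at the server end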
Assuming a compressed video data rate of 1.2 Mbit/s (MPEG-1 SIF at 30 Hz), the player's memory 182 can hold about one minute of compressed video (including stereo audio). With higher compression ratios or more memory, correspondingly longer clips can be held. If streaming playback is used by the player, then only a small video buffer is required to eliminate transmission jitter, and a significantly smaller memory may be used.
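The "about one minute" figure follows directly from the numbers just quoted; the arithmetic (illustrative only) is:

    video_bit_rate = 1.2e6     # bit/s, MPEG-1 SIF at 30 Hz including stereo audio
    dram_capacity = 64e6       # bits, the 64 Mbit (8 Mbyte) DRAM 182
    print(f"roughly {dram_capacity / video_bit_rate:.0f} s of compressed video")  # ~53 s, about a minute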
The processor unit 181 communicates with the other components via a shared bus 187. The processor unit 181, the bus 187, and any number of other components may be integrated into a single chip. As indicated in the block diagram, the integrated components may include the digital transceiver controller 189, the video decoder interface 191, and the tag image sensor interface 193. A parallel interface 195 is interposed between the bus 187 and the buttons 104, LED 160, touch sensor 132 and contact switch 174. In a more highly integrated chip, the integrated components may also include the video decoder 178, the audio DAC 183, the tag image sensor 162, and the memory 182. The analog radio transceiver 188 is unlikely to be integrated in the same chip, but may be integrated in the same package.
Since the player incorporates a dedicated video/audio decoder 178, the processor unit 181 only needs to be powerful enough to control and coordinate the other components. Alternatively, the video/audio decoder may be omitted, and a more powerful processor can be used to decode the compressed video and audio in software.
The transceiver 188 is typically a short-range radio transceiver. It may support any of a number of wireless transmission standards, including Bluetooth/IEEE 802.15, IEEE 802.11, HomeRF/SWAP, HIPERLAN, and OpenAir. Bluetooth/IEEE 802.15, IEEE 802.11-1997, HIPERLAN, OpenAir, and HomeRF/SWAP all support transmission rates in the range of 1 to 2 Mbit/s. IEEE 802.11b supports transmission rates of 5.5 Mbit/s and 11 Mbit/s. HIPERLAN also supports a transmission rate of 24 Mbit/s in an alternative mode. Beyond these currently-supported wireless LAN (WLAN) standards, next-generation WLAN standards promise to support transmission rates of 100 Mbit/s and beyond.
The player may alternatively be connected to the base station by cable, or may utilize a non-radio-frequency wireless transport, such as infrared. IEEE 802.11, for example, optionally utilizes an infrared transport. IrDA also utilizes an infrared transport.
The player may alternatively or additionally contain a mobile telephone transceiver for longer-range communication with a netpage server via a mobile telephone network. If the transceiver supports a third-generation 'always-on' packet-switched connection, then the player may download or stream audio content at will. If the transceiver only supports a circuit-switched connection, then the player may choose to connect (and potentially stream audio content) only when it encounters a hyperlink.
If the player incorporates a longer-range transceiver, then it may act as a netpage base station for wireless netpage pens and other netpage sensing devices.
Assuming a compressed video data rate of 1.2 Mbit/s (MPEG-1 SIF at 30 Hz), the receiver must support a data rate of at least 1.2 Mbit/s. This is within the capabilities of some of the wireless transmission standards described above. Lower (or higher) data rates can be readily achieved by decreasing (or increasing) frame resolution, frame rate and/or image quality. The netpage network can dynamically recode a compressed video stream to match a particular player's capabilities, if necessary, either at a server or at a base station.
The player is controlled by five buttons, generally indicated by 104. The five buttons are a power button 192, a play button 194, a stop button 196, a rewind button 198 and a fast forward button 200. The buttons are sandwiched between the upper molding 114 and the PCB 118 and act on switches 202, 204, 206, 208 and 210 respectively.
The player is provided with a stand 212 comprising two legs 214, 216 hinged together at adjacent ends by a pin 218 engaging in alternating sets of cylindrical apertures 220, 222 in the two legs. The leg 214 is mounted at its other end for rotation on the lower molding 116 by integral pins 224. The leg 216 is mounted at its other end for rotation and sliding on the lower molding 116 by integral pins 226. The pins 226 engage in slots 228 in downward extending rails 230. The slots are long enough to allow the two legs to be laid flat between the two rails. The rails extend from the general plane of the lower molding more than the thickness of the legs so that when folded flat the player rests on the rails, not the legs. The slots are not smooth but have a number of protrusions 232 which limit movement of the pins, so that the legs may be placed in one of a number of positions to tilt the player relative to the surface on which it rests.
Referring to figure 11, in use the user has one or more substrates 200 having netpage tags 202 tiled over their surfaces. For clarity only some of the tags are shown. The substrate 200 may be paper, electronic paper such as that used by E Ink Corporation, a plastics sheet or any other suitable substrate. The substrate carries one or more entries of human readable text 204, usually titles of one or more videos available for viewing by the user.
The human readable text may optionally include a summary or a small logo or picture 224. The entries themselves may be "active" in that the user may select the entry by selecting the summary or a separate selection "button" may be provided to select the video. The entire page may be tiled with netpage tags or tags may only be provided in "active" areas. To select a video the user merely clicks the sensor device 112 on the relevant "active" area.
The sensor device 112 senses one or more of the tags 202, decodes the coded information and transmits this decoded information to the netpage system. As discussed in our co-pending application PCT/AU00/00561, each tag incorporates data which identifies the page it is on and its location within the page. The netpage system is thus capable of determining what video file(s) are associated with the sensed tags and so can extract the video file(s) and transmit them to the player for display on the LCD.
Data is preferably compressed in MPEG format and streamed to the player 100. The data is received via the aerial 190 and transceiver 188 and passed to the dedicated MPEG decoder 178 for decoding.
The decoded data is then transferred to the display controller 250 for display on the color display 102.
Streaming video is buffered in the player's DRAM 182 to eliminate transmission jitter. The size of the required buffer depends on wireless bandwidth contention and contention for the server supplying the streaming data.
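One simple way to reason about the buffer size is that it must cover the longest expected interruption in delivery at the stream's bit rate. The sizing rule and the 2 s worst-case stall in the sketch below are assumptions for illustration only, not figures from the specification.

    def jitter_buffer_bytes(stream_bit_rate, worst_case_stall_s):
        # Enough buffered data to keep the decoder fed through the longest expected stall.
        return stream_bit_rate * worst_case_stall_s / 8

    # e.g. a 1.2 Mbit/s stream and an assumed 2 s worst-case stall
    print(f"{jitter_buffer_bytes(1.2e6, 2.0) / 1e3:.0f} kbytes of buffering")   # ~300 kbytes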
Control of the video playback is via the control buttons 104. When the user presses one of the buttons, the processor 180 determines the function of the button and sends an appropriate instruction to the netpage system. According to the instruction, the system modifies or stops transmission of data to the player.
After a video clip has been selected but before playback has commenced, the display may change to show the title and, optionally, the playing time of the video. Optionally, controls such as play, fast forward, rewind, stop and pause may be displayed, such that the user may control the player via the screen rather than the dedicated control buttons.
The video player optionally includes a microphone, video camera and a record button. It can then be used to record audio and/or video input, thus providing another kind of netpage input. Recorded input may, for example, be associated with a location on a netpage, in the form of an annotation, by clicking at the location with the video player. Subsequent clicks at the same location using a video player then cause the audio and/or video annotation to be played back. If the surfaces of physical objects are universally netpage-enabled, i.e. tagged with unique netpage tags, then audio and/or video annotations can be placed almost anywhere. Such audio and video annotations may be private or public. When they are private they may only be played back by their author. When they are public they may be played back by anyone.
When incorporating a microphone and video camera, the video player can be configured to act as a wireless telephone or video telephone under the control of a telephony application. Since the player lacks a user interface for dialing numbers, numbers can be selected from a netpage in the manner described in our co-pending application PCT/AU00/01442.
A video clip may be associated with a netpage in the form of a hyperlink, in which case activation of the hyperlink by the video player is ultimately handled by an application whose responsibility it becomes to provide the video clip to the player. A video clip may also be logically embedded as a video clip object in a page description, in which case clip activation is ultimately handled by the page server which holds the page description. Any click in the zone of the video clip object is interpreted by the page server as video clip activation. In either case the actual video clip may be stored on a separate remote server, which may become involved in the streaming playback or download of the video clip.
The video player can download a video clip activated by the user into its internal memory before making it available for playback, or it can stream the video clip on demand from the remote server in response to the user interacting with the player's playback controls. The player may also include non-volatile storage, such as flash memory, magnetic disk, CD writer or CD rewriter for storage of downloaded video data. The schemes outlined in our co-pending application PCT/AU00/01459 regarding storage of downloaded data and device identity may be used for storage of video data.
The player typically incorporates power management. After a period of inactivity the player may inactivate the status display. After a longer period of inactivity the processor may enter a power-conserving quiescent state. Power management may be coupled with the tag sensor micro switch, allowing wake-up on page interaction. The player may also incorporate an accelerometer for this purpose.
Whilst the invention has been described with reference to the netpage system which uses invisible tags, the invention is not limited to the netpage system or the use of invisible tags. If desired, the invention may utilize tags or codes which are visible to the average unaided human eye, such as bar codes. The tags need not merely encode an identity which is then used to look up the relevant files. The tags may encode instructions at a higher level. For example, a tag may encode an instruction of "play chapter 99". If invisible tags are used they need not be limited to the tags disclosed in relation to the netpage system. Other tagging systems are available and any suitable tagging system may be used. The invention is not limited to the use of inks which absorb certain wavelengths or fluoresce at certain wavelengths. Magnetic inks, surface modification, including apertures, and modification of the structure of the substrate itself all fall within the scope of the invention. The systems and methods used to link the video player of the present invention to the source of the video files are not limited to netpage systems. A video player may be linked by a cable to a single computer, rather than a network of computers.
The present invention has been described with reference to a preferred embodiment and number of specific alternative embodiments. However, it will be appreciated by those skilled in the relevant fields that a number of other embodiments, differing from those specifically described, will also fall within the spirit and scope of the present invention. Accordingly, it will be understood that the invention is not intended to be limited to the specific embodiments described in the present specification, including documents incorporated by cross-reference as appropriate. The scope of the invention is only limited by the attached claims.

Claims (8)

1. A video player including: at least one sensor for sensing coded data on or in a substrate; a transmitter for transmitting, to a computer system, first data at least partially based on the sensed data; a receiver for receiving, from the computer system, video data associated with an identity derived from the sensed data; and at least one display device for outputting a visual display based at least partially on the video data.
2. The video player of claim 1 further including a memory for storing received video data.
3. The video player of claim 2 wherein at least part of the memory is user replaceable.
4. The video player of claim 2 wherein the video player has an identity and further includes processor means to store the identity or data indicative of the identity with or in any file stored in memory.
5. The video player of claim 1 further including a means to display information relating to received video data.
6. The video player of claim 1 including an actuator to enable a user to activate the at least one sensor.
7. The video player of claim 6 wherein the actuator is a force sensor, a button ...
8. The video player of claim 1 including a motion sensor to enable a user to actuate the video device.
AU2004200370A 1999-12-01 2004-02-03 Video player with code sensor Ceased AU2004200370B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AUPQ4392 1999-12-01
AU16806/01A AU767226B2 (en) 1999-12-01 2000-11-27 Video player with code sensor
PCT/AU2000/001451 WO2001040987A1 (en) 1999-12-01 2000-11-27 Video player with code sensor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
AU16806/01A Division AU767226B2 (en) 1999-12-01 2000-11-27 Video player with code sensor

Publications (2)

Publication Number Publication Date
AU2004200370A1 AU2004200370A1 (en) 2004-02-26
AU2004200370B2 true AU2004200370B2 (en) 2006-08-24

Family

ID=39364038

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2004200370A Ceased AU2004200370B2 (en) 1999-12-01 2004-02-03 Video player with code sensor

Country Status (1)

Country Link
AU (1) AU2004200370B2 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995017794A1 (en) * 1993-12-22 1995-06-29 Pics Previews, Inc. Method and apparatus for previewing product information and effecting a transaction

Also Published As

Publication number Publication date
AU2004200370A1 (en) 2004-02-26

Similar Documents

Publication Publication Date Title
CA2392904C (en) Video player with code sensor
US7934654B2 (en) Method of associating recorded audio with position
AU2004200370B2 (en) Video player with code sensor
AU2004214597B2 (en) System for retrieving and playing audio files

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired