GB2477174A - Right angled camera housing - Google Patents

Right angled camera housing

Info

Publication number
GB2477174A
GB2477174A
Authority
GB
United Kingdom
Prior art keywords
camera
user
fixture
cameras
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1016811A
Other versions
GB201016811D0 (en)
Inventor
Naveen Chawla
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of GB201016811D0
Priority to PCT/IB2011/050138 (WO2011089538A1)
Publication of GB2477174A
Legal status: Withdrawn

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/64Constructional details of receivers, e.g. cabinets or dust covers
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/51Housings
    • H04N5/2252

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A system which comprises one or more fixtures 1, 2, wherein each fixture forms at least one empty inner right-angle shape and has at least one camera 8 attached to it in a known position and orientation. Two fixtures may be attached to corners of a television display and may be backed by a magnet, a ferrous metal, adhesive or suction pumps. The cameras may use fisheye lenses. The system may automatically calculate the TV screen size either by using the camera on a first fixture to determine the pixel locations of known features on a second fixture, or by using an ultrasonic transmitter 4 and receiver 3. Based on the known position of each camera, the system may estimate a user's 3D viewing location and match this to the location of a virtual camera set relative to a virtual 3D environment. The system may also estimate the 3D locations of stereoscopic glasses or devices 13, 14 held or worn by the user. This system does not require the stereo-calibration step that hinders multiple-camera 3D tracking systems.

Description

A Stereo-Calibration-Less Multiple-Camera Human-Tracking System For Human-Computer 3D Interaction

One of the problems of multiple-camera 3D tracking is stereo calibration. What is proposed is a system that does not require a stereo calibration step by the user or installer of the system.
In 3D human-computer interaction (e.g. with virtual object 1 in Fig.1), it is important to establish a 3D coordinate space relative to the screen (17 in Fig. 1). Normally one would perform a stereo calibration step to establish the position and orientation of the cameras in order to enable 3D tracking of objects located within viewing range of two or more of the cameras.
However, for home use this step is cumbersome, and it must be redone every time any camera is moved even slightly, for example when someone accidentally brushes past.
What is therefore proposed first is the use of right-angled brackets such as 1 and 2 in Fig. 1, which the user would mount on corners of the screen, in line with the screen's edges, via a self-attaching method such as magnets, a magnet paired with a ferrous metal (one of which is attached to the screen by adhesive), adhesive, or suction pumps. The size of the user's screen would then either be input into the system by the user or calculated automatically by the system. Automatic calculation can be done in two ways: by mounting at least one ultrasonic receiver (3 in Fig. 1) and transmitter (4 in Fig. 1) on different brackets and using a time-of-flight calculation to find the distance between the two, and hence deduce the size of the screen; or by using at least one camera (5 in Fig. 2) to pick up two or more objects or features in known locations on the other bracket, such as LEDs 6 and 7 in Fig. 2, and using their pixel locations on the camera (or cameras), together with the known geometry, to calculate the actual distance from that camera to those features.
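To make the two automatic measurements concrete, here is a minimal Python sketch; the function names, the 343 m/s sound speed, the 800 px focal length and the 40 mm LED baseline are illustrative assumptions rather than details from the filing. The ultrasonic route converts a one-way time of flight into a distance; the camera route uses the pinhole relation distance ≈ focal length × baseline / pixel separation, valid when the LED pair on the far bracket is roughly parallel to the image plane.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degC (assumed ambient conditions)

def distance_from_time_of_flight(t_seconds):
    """Ultrasonic route: bracket-to-bracket distance from the one-way
    flight time between transmitter 4 and receiver 3."""
    return SPEED_OF_SOUND * t_seconds

def distance_from_led_pair(pixel_separation, led_baseline_m, focal_length_px):
    """Camera route: if two LEDs a known distance apart on the far bracket
    appear pixel_separation pixels apart, the range to them is roughly
    f * baseline / pixel_separation (pinhole model, fronto-parallel pair)."""
    return focal_length_px * led_baseline_m / pixel_separation

# Example: LEDs 40 mm apart imaged 32 px apart by a camera with f = 800 px
# gives a corner-to-corner distance of 1.0 m; the screen's width and height
# then follow from the known bracket placement (e.g. along one edge or a diagonal).
print(distance_from_led_pair(32.0, 0.040, 800.0))   # -> 1.0
print(distance_from_time_of_flight(0.002915))       # -> ~1.0
```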
The cameras do not need to be exposed as shown by 8 in Fig. 1; that is an under-the-hood drawing. The screen-corner brackets could typically provide casing for the cameras, as denoted by 16 in Fig. 1, for discreetness and extra protection and security. The casing in front of the cameras would need to be transparent to whatever wavelengths of electromagnetic radiation the cameras need to detect, or could be transparent to those wavelengths exclusively.
Using screen-corner-mounted brackets is significantly better than using, for example, a single bar at the top of the screen, because the user does not have to perform any measurements to place them precisely. The units also pack more compactly than a long bar would, allowing for more convenient transit, storage and stocking of the system.
The cameras used for 3D tracking, denoted by 8 in Fig. 1, would be pre-mounted on the brackets securely and precisely, so that their position and orientation relative to their bracket are already known. This information, combined with the screen-size calculation described above, yields the position and orientation of the cameras relative to the screen. That, in turn, is sufficient to deduce the 3D location, relative to the screen, of any tracked object in view of two or more of the cameras.
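As an illustration of that last step, a tracked point seen by two cameras can be triangulated by back-projecting each detection to a ray in screen coordinates and taking the midpoint of the rays' closest approach. A minimal sketch, with made-up names and example numbers:

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Return the midpoint of closest approach between two viewing rays
    p = o1 + s*d1 and p = o2 + t*d2, all in screen coordinates. o1, o2 are
    the camera centres (known from the bracket geometry); d1, d2 are the
    back-projected directions of the tracked pixel in each camera."""
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    o1, o2 = np.asarray(o1, float), np.asarray(o2, float)
    r = o2 - o1
    b = d1 @ d2
    denom = 1.0 - b * b
    if denom < 1e-9:                      # rays nearly parallel: no reliable fix
        raise ValueError("rays do not converge")
    s = (r @ d1 - b * (r @ d2)) / denom   # parameter along ray 1
    t = (b * (r @ d1) - r @ d2) / denom   # parameter along ray 2
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

# Two corner cameras 1 m apart both looking at a point 0.6 m in front of
# the screen (directions here are made up for illustration):
p = triangulate_rays([0, 0, 0], [0.5, 0.3, 0.6], [1, 0, 0], [-0.5, 0.3, 0.6])
print(p)   # -> [0.5, 0.3, 0.6]
```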
Therefore, a second method proposed here for avoiding a stereo calibration step is to produce a television or monitor with cameras already mounted in accurately predefined and known locations and orientations relative to its screen, as denoted by 19 in Fig. 3.
Typically, for realistic 3D interaction, all versions of the system would need to track, or estimate using tracking, either the 3D locations of the user's left and right eyes or a single compromise position between the two. At least one virtual camera (two if stereoscopic presentation is used) is then placed in the virtual 3D world to match that location, and its viewing frustum is set to the asymmetric pyramid formed by drawing straight lines from the user's viewing position (or each eye position respectively) to the four corners of the screen at any given time. Example virtual camera frustums for a stereoscopic system, for the left and right eyes, are denoted by 9 and 10 in Fig. 1. Markers mounted on the user's stereoscopic glasses, such as LEDs 11 and 12 in Fig. 1, provide one way of estimating those positions. Another way would be to track the user's head and estimate the left and right eye positions from it; this requires somewhat more computation than tracking LEDs but would be far more desirable.
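The asymmetric frustum described here is the standard off-axis projection: its near-plane extents follow by similar triangles from the tracked eye position to the screen edges. A minimal sketch, assuming a screen-centred coordinate frame and metre units (the function name and numbers are illustrative):

```python
def off_axis_frustum(eye, half_w, half_h, near):
    """Near-plane extents (l, r, b, t) of the asymmetric frustum joining a
    tracked eye to the screen corners. Frame: origin at the screen centre,
    x right, y up, z out of the screen toward the viewer; eye = (ex, ey, ez)
    with ez > 0; half_w/half_h are half the screen's width/height in metres."""
    ex, ey, ez = eye
    k = near / ez                    # similar triangles: screen plane -> near plane
    return ((-half_w - ex) * k,      # left
            ( half_w - ex) * k,      # right
            (-half_h - ey) * k,      # bottom
            ( half_h - ey) * k)      # top

# One frustum per tracked eye for stereoscopic rendering, e.g. on a screen
# 0.8 m x 0.5 m with the left eye 3 cm left of centre and 0.6 m away:
l, r, b, t = off_axis_frustum((-0.03, 0.05, 0.60), 0.40, 0.25, near=0.1)
# Feed (l, r, b, t, near, far) to glFrustum or an equivalent projection matrix.
```

For stereoscopic presentation the function is evaluated once per estimated eye position, giving the two frustums denoted by 9 and 10 in Fig. 1.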
The system would also typically need to track another part of the user, or a device that the user is in control of, to enable interaction with the virtual world.
An example of such a user-controlled device is given by the proposed handsets 13 and 14 in Fig. 1. Markers mounted on each device, such as LEDs 15 in Fig. 1, provide a simpler way of tracking the device's position and orientation than tracking the unmarked device or user directly.
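Recovering a handset's position and orientation from such markers is a perspective-n-point (PnP) problem. A sketch using OpenCV's solvePnP with a hypothetical four-LED layout and assumed intrinsics; the detections are synthesized from a known pose purely so the example is self-contained:

```python
import numpy as np
import cv2

# Hypothetical layout of LEDs 15 on a handset, in the handset's own frame (m)
led_model = np.array([[0.00, 0.00, 0.00], [0.04, 0.00, 0.00],
                      [0.00, 0.04, 0.00], [0.00, 0.00, 0.03]], np.float32)
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], np.float32)  # assumed intrinsics

# Synthesize detections from a known pose so the example is self-checking,
# then recover that pose with solvePnP (a perspective-n-point solver).
rvec_true = np.array([0.1, 0.2, 0.0], np.float32)
tvec_true = np.array([0.05, -0.02, 0.80], np.float32)
pixels, _ = cv2.projectPoints(led_model, rvec_true, tvec_true, K, None)

ok, rvec, tvec = cv2.solvePnP(led_model, pixels, K, None)
# rvec/tvec: handset orientation and position in this camera's frame; composing
# with the camera's known screen-relative pose gives screen coordinates.
print(ok, rvec.ravel(), tvec.ravel())
```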
The system is significantly beneficial in comparison to systems that require stereo calibration because it would be extremely quick and easy for the user to set up.
A good way of increasing the cameras' field of view is to use fisheye lenses on them.
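Fisheye images cannot be back-projected with the plain pinhole model, so the ray construction used for triangulation changes accordingly. A sketch under the idealized equidistant model r = f·θ (a common fisheye mapping; a real lens would need a calibrated distortion profile, and f here is an assumed focal length in pixels):

```python
import numpy as np

def fisheye_pixel_to_ray(u, v, cx, cy, f):
    """Back-project a fisheye pixel to a unit viewing ray under the
    equidistant model r = f * theta. (cx, cy) is the principal point
    and f the focal length, both in pixels."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    if r < 1e-9:
        return np.array([0.0, 0.0, 1.0])   # centre pixel looks straight ahead
    theta = r / f                           # angle off the optical axis
    s = np.sin(theta) / r
    return np.array([dx * s, dy * s, np.cos(theta)])

# A pixel far from the centre still yields a valid ray (~53 deg off-axis here),
# which can be fed straight into the triangulation sketch above.
print(fisheye_pixel_to_ray(1100.0, 400.0, 640.0, 400.0, 500.0))
```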
GB1016811A 2010-01-25 2010-10-06 Right angled camera housing Withdrawn GB2477174A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2011/050138 WO2011089538A1 (en) 2010-01-25 2011-01-12 A stereo-calibration-less multiple-camera human-tracking system for human-computer 3d interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1001152.6A GB201001152D0 (en) 2010-01-25 2010-01-25 A right-angled-bracket self-adhesive calibration-less multi-camera tracking system for realistic human-computer 3D interaction

Publications (2)

Publication Number Publication Date
GB201016811D0 GB201016811D0 (en) 2010-11-17
GB2477174A (en) 2011-07-27

Family

ID=42046007

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1001152.6A Ceased GB201001152D0 (en) 2010-01-25 2010-01-25 A right-angled-bracket self-adhesive calibration-less multi-camera tracking system for realistic human-computer 3D interaction
GB1016811A Withdrawn GB2477174A (en) 2010-01-25 2010-10-06 Right angled camera housing

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1001152.6A Ceased GB201001152D0 (en) 2010-01-25 2010-01-25 A right-angled-bracket self-adhesive calibration-less multi-camera tracking system for realistic human-computer 3D interaction

Country Status (1)

Country Link
GB (2) GB201001152D0 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010053762A (en) * 1999-12-01 2001-07-02 이계철 Apparatus for pointing at two dimensional monitor by tracing of eye's movement
JP2005328332A (en) * 2004-05-14 2005-11-24 Matsushita Electric Ind Co Ltd Three-dimensional image communication terminal
CN201163394Y (en) * 2008-03-06 2008-12-10 北京汇冠新技术有限公司 Touch detection apparatus used for notebook computer

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
(LOGITECH) "Logitech Webcam C300 - 1.3MP" [online], 2009. Available from: http://www.amazon.co.uk/Logitech-960-000354-Webcam-C300-1-3MP/dp/B002CNN0MS/ref=sr_1_7?s=computers&ie=UTF8&qid=1289571218&sr=1-7 [Accessed 12 November 2010] *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3242274A4 (en) * 2014-12-31 2018-06-20 Alt Limited Liability Company Method and device for displaying three-dimensional objects
CN108073330A (en) * 2018-02-09 2018-05-25 业成科技(成都)有限公司 Touch device and preparation method thereof
CN108073330B (en) * 2018-02-09 2020-12-15 业成科技(成都)有限公司 Touch device and manufacturing method thereof

Also Published As

Publication number Publication date
GB201001152D0 (en) 2010-03-10
GB201016811D0 (en) 2010-11-17


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)