EP2069889A2 - Image capture method and haptic input device - Google Patents

Image capture method and haptic input device

Info

Publication number
EP2069889A2
Authority
EP
European Patent Office
Prior art keywords
haptic
image
capture
sensors
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP07823674A
Other languages
English (en)
French (fr)
Inventor
Denis Chene
Charles Lenay
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orange SA
Original Assignee
France Telecom SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by France Telecom SA filed Critical France Telecom SA
Publication of EP2069889A2 (de)
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/00411Display of information to the user, e.g. menus the display also being used for user input, e.g. touch screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console

Definitions

  • TITLE Device for capturing images and haptic input
  • the invention relates to a device for capturing images and haptic input.
  • haptic communication refers to the sense of touch and gesture: tactile perception, proprioception and kinaesthesia.
  • a so-called "haptic" communication through a network consists of exchanging information that can be perceived by touch, in other words information relying on tactile sensory perception.
  • as an illustrative example, consider a person transmitting to another, distant person the contour of a shape, for example a heart, in tactile form.
  • the outline of the heart is captured by the transmitting person, by contact and movement of his index finger, or of a stylus, on a touch-sensitive surface; in reception, the other person perceives the transmitted outline tactilely, for example using the fingertips of one hand, on a tactile rendering surface.
  • Such a communication mode can be combined with more conventional communication modes, such as video and/or audio.
  • the patent application US 2005/0235032 describes an audio, video and haptic teleconference system, comprising
  • a video device comprising a display screen and an image capture camera
  • an audio device comprising a microphone and a loudspeaker
  • a haptic device comprising
  • a first deformable touch-sensitive membrane for detecting, by contact, a displacement and/or an exerted force and generating a corresponding haptic signal
  • a second deformable tactile-rendering membrane adapted to deform, that is to move, on reception of a haptic signal.
  • Two remote people, each equipped with this teleconference system, can not only talk to and see each other but also touch each other, for example to shake hands.
  • Such a system requires the user to position his camera correctly in order to aim at the object or phenomenon whose images he wishes to capture and transmit to his interlocutor, especially when he wants the latter to be able to touch what he sees.
  • the present invention aims to allow a user to touch what he sees in a simpler way.
  • the invention relates to an image capture and haptic input device comprising image capture means and haptic input means, characterized in that the image capture means and the haptic input means are mounted on the same support surface, and in that, the image capture means comprising at least one image sensor, the haptic input means surround said at least one image sensor, the whole being arranged so as to capture images of an object and then to capture tactile information on said object, by bringing the device and the object closer together.
  • by "image sensor" is meant any device capable of converting into a corresponding electrical signal the energy of optical radiation emitted or reflected by an object or a phenomenon, and which makes it possible to reconstitute images of this object or phenomenon. It may be a camera generating images from visible-band radiation, an infrared sensor, or an image sensor operating in any other spectral band.
  • there are two types of haptic input means: those requiring physical contact with the object to capture tactile information, and those capable of sensing tactile information remotely, for example using laser beams to capture the shape of an object.
  • the device of the invention can be used, initially, to capture images of an object by progressively bringing the device and the object closer to one another, so as to capture more and more detailed visual images of the object; then, in a second step, when the object enters the device's tactile detection zone, to capture tactile information relating to the object using the haptic input means surrounding the image sensor.
  • the user can thus collect increasingly precise visual information, as well as tactile information relating to the object, with the simple gesture of gradually bringing his device closer to the object and then pressing it against it.
  • the arrangement of the haptic input means around the image sensor makes it possible to optimize the match between what is captured at the tactile level and what is captured at the image level, in other words between the "tactile" image and the visual image of the object, while ensuring a "visuo-tactile continuity".
  • by visuo-tactile continuity is meant the seamless linking of visual image capture and tactile input. This result is obtained when, while zooming in on an object, the last sharp image is captured substantially at the moment the first tactile information is captured.
  • the zone of visual sharpness (the zone in which the object must be located for the image sensor to capture sharp images) and the tactile detection zone (the zone in which the object must be located for tactile input to be possible) either overlap slightly or have boundaries that are adjacent, or at least close to each other.
  • the start of tactile input can slightly precede, slightly follow, or exactly coincide with the end of visual sharpness (that is to say, the onset of blur).
  • in one variant, tactile input begins before the end of visual sharpness, in other words before the blur.
  • in another variant, tactile input begins shortly after the onset of blur.
  • the "loss of the visual" and the "contact" boundary thus substantially coincide.
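The continuity condition above can be sketched in code. A minimal illustration (not from the patent; the zone boundaries and function name are hypothetical) of how the visual and tactile capture zones hand over as the device approaches the object:

```python
# Illustrative sketch: choosing the active capture channel(s) as the device
# approaches the object. The zone boundaries below are hypothetical values.

def capture_mode(distance_mm: float,
                 sharpness_min_mm: float = 50.0,   # closest distance still in focus
                 touch_max_mm: float = 55.0) -> str:
    """Return which capture channel(s) are active at a given object distance.

    Visuo-tactile continuity requires the two zones to overlap slightly or
    share a boundary: here the tactile zone [0, touch_max_mm] begins just
    before the visual sharpness zone [sharpness_min_mm, inf) ends.
    """
    visual = distance_mm >= sharpness_min_mm       # image still sharp
    tactile = distance_mm <= touch_max_mm          # object within tactile detection
    if visual and tactile:
        return "both"       # overlap: last sharp image meets first touch data
    if visual:
        return "visual"
    if tactile:
        return "tactile"
    return "none"           # only possible if the zones were disjoint

# Approaching the object: visual -> both -> tactile, with no gap.
modes = [capture_mode(d) for d in (200, 53, 10)]
```

With these example boundaries the tactile zone begins (55 mm) just before visual sharpness ends (50 mm), so the two capture channels briefly overlap, giving the "visuo-tactile continuity" described above.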
  • the various elementary tactile images captured by the discrete tactile sensors surrounding the image sensor make it possible to reconstruct an enveloping tactile image of the visually captured object, in which only the central part, less important in terms of tactile perception, is not captured tactilely, because of the presence of the image sensor.
  • the haptic input means comprise a plurality of tactile sensors distributed around said at least one image sensor.
  • the touch sensors can be evenly distributed around the image sensor. This makes it possible to use the input means optimally, capturing as much tactile information as possible.
  • the use of discrete touch sensors facilitates the construction of the device simply by mounting the image sensor and touch sensors on the same support.
  • the touch sensors can be arranged in a circle around the lens.
  • the haptic input means may comprise a matrix of tactile sensors, the image sensor then being disposed in the center of said matrix.
  • in another variant, the device comprises a plurality of image sensors disposed in the spaces separating the touch sensors.
  • the touch sensors may be arranged in parallel rows, each row having a plurality of sensors separated by gaps, and the image sensors may be disposed in these interstices.
  • adjacent rows of sensors are advantageously offset from each other so as to obtain a staggered arrangement of the touch sensors.
  • the device thus makes it possible to obtain a detailed capture of an object at the image level as well as at the tactile level, while ensuring conformity, a match, between what is captured at the image level and what is captured at the tactile level.
  • in one embodiment, the haptic input means comprise a deformable sensitive membrane and the image sensor is positioned in a central zone of said membrane.
  • the invention also relates to the use of the image capture and haptic input device defined above for, first, capturing images of an object by bringing the device and said object closer to each other and then, in a second step, capturing tactile information relating to the object when the device is put in contact with it.
  • the invention also relates to a terminal for communication through a network, comprising a device for capturing images and haptic input as defined above.
  • This may be for example a mobile phone or any other communication equipment.
  • FIG. 1 represents a first embodiment of the visual and haptic capture device
  • FIG. 2 represents a second embodiment of the visual and haptic capture device
  • FIG. 3 represents a third embodiment of the visual and haptic capture device
  • FIG. 4 represents a fourth embodiment of the visual and haptic capture device
  • FIG. 5 schematically represents a visual and haptic communication using the visual and haptic capture device of one of FIGS. 1 to 4.
  • the image capture and haptic capture device of the invention comprises image capture means comprising at least one discrete image sensor and haptic input means.
  • image sensor is intended to denote a sensor capable of converting into a corresponding electrical signal the energy of an optical radiation emitted or reflected by an object, or a scene, a phenomenon, and which makes it possible to reconstitute images of this object, this scene or this phenomenon. It can be an image sensor operating in the visible band, such as an ordinary camera, in the infrared band or in any other spectral band.
  • the electrical signal generated by the image sensor is then processed and converted, in a well-known manner, by processing means into a digital-type signal, which will be called the "image signal".
  • the haptic input means are adapted to detect the shape and/or the distribution of the pressure forces exerted by an element (object, finger, etc.), by contact, and to generate a corresponding electrical signal, which is then converted, in a known manner, by processing means into a digital-type signal, which will be called the "haptic signal".
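As a rough illustration of this conversion step, here is a sketch (an assumption, not the patent's actual processing chain) that quantizes analog pressure readings from the tactile sensors into a digital haptic signal:

```python
# Hypothetical sketch: mapping analog pressure readings from discrete tactile
# sensors to a digital "haptic signal", here one 8-bit code per sensor.

def to_haptic_signal(pressures, full_scale=10.0, bits=8):
    """Map analog pressure readings (0..full_scale, arbitrary units)
    to digital codes 0..2**bits - 1, clamping out-of-range values."""
    levels = (1 << bits) - 1
    signal = []
    for p in pressures:
        p = min(max(p, 0.0), full_scale)      # clamp to the sensor range
        signal.append(round(p / full_scale * levels))
    return signal

codes = to_haptic_signal([0.0, 5.0, 10.0, 12.0])   # last value saturates
```

The full-scale range, bit depth and function name are illustrative; a real device would use whatever ADC resolution its tactile sensors provide.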
  • the haptic input means comprise a plurality of discrete tactile sensors 2. These sensors 2 make it possible to detect the presence of an object or to measure a force exerted by contact.
  • in the first embodiment, shown in FIG. 1, the image capture means comprise a discrete image sensor, in this case a video camera 1 whose objective is shown in the figure, and the haptic capture means comprise a plurality of discrete tactile sensors 2.
  • the objective of the camera 1 and the discrete tactile sensors 2 are mounted on the same support 3.
  • the tactile sensors 2 are arranged in a circle around the lens 1 of the camera, close to the latter, and are evenly distributed on this circle.
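The even distribution on a circle can be sketched as follows; the sensor count and radius are hypothetical values chosen only for illustration:

```python
# Sketch: computing (x, y) positions for N tactile sensors evenly distributed
# on a circle around the camera objective, taken as the origin.
import math

def ring_positions(n: int, radius: float):
    """Return (x, y) centers of n sensors evenly spaced on a circle."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

ring = ring_positions(8, 12.0)   # e.g. 8 sensors on a 12 mm radius
```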
  • FIG. 2 shows a second embodiment of the device of the invention, which differs from the first described above in that the haptic input means comprise a plurality of discrete tactile sensors 2 arranged in a matrix 4.
  • the objective 1 of the camera is disposed in the center of this matrix 4, in other words at the intersection of the median column of sensors 2 and the median row of sensors 2.
  • the tactile sensors 2 are thus distributed uniformly around the objective of the camera 1.
  • FIG. 3 shows a third embodiment of the device of the invention, in which the haptic input means comprise several parallel rows 6 of discrete tactile sensors 2 and the image capture means comprise a plurality of discrete image sensors 5.
  • these are infrared sensors; however, image sensors operating in any other spectral band, for example in the visible band, could be used.
  • the neighboring rows 6 are offset relative to each other so that the touch sensors 2 are arranged in a generally staggered pattern.
  • the image sensors 5 are positioned in the interstices formed between the touch sensors 2 of each row 6.
  • each image sensor 5, except for those at the edge of the image capture and haptic input area, is surrounded by four touch sensors 2 distributed uniformly around the image sensor 5 in question.
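The staggered arrangement of FIG. 3 can be sketched in code. The layout parameters below (row and column counts, pitch) are hypothetical; the sketch only illustrates how offset rows of tactile sensors leave interstices for the image sensors:

```python
# Sketch of a staggered layout: tactile sensors in offset rows, image sensors
# placed in the interstices between neighbouring tactile sensors of a row.

def staggered_layout(rows: int, cols: int, pitch: float = 10.0):
    """Return (tactile, image) lists of (x, y) positions.

    Odd rows are shifted by half a pitch (staggered arrangement); each image
    sensor sits in the gap between two tactile sensors of the same row, so an
    interior image sensor ends up with four tactile neighbours: two in its
    own row and, because of the stagger, one directly above and one below.
    """
    tactile, image = [], []
    for r in range(rows):
        shift = (pitch / 2) if r % 2 else 0.0
        for c in range(cols):
            x = c * pitch + shift
            tactile.append((x, r * pitch))
            if c < cols - 1:                  # gap to the next sensor in the row
                image.append((x + pitch / 2, r * pitch))
    return tactile, image

tactile, image = staggered_layout(rows=3, cols=4)
```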
  • FIG. 4 shows a fourth embodiment of the image capture and haptic capture device, in which the tactile input means comprise a deformable touch-sensitive membrane 7, adapted to detect the shape and/or the distribution of the pressure forces exerted by an object in contact with it, and to generate a corresponding electrical signal, which is converted into a haptic signal by processing means.
  • This type of membrane is well known to those skilled in the art and will not be described in more detail here.
  • the device further comprises an image capture camera 1, whose objective, shown in FIG. 4, is positioned in the central portion of the touch-sensitive membrane 7.
  • in the various embodiments of the invention, the image capture means and the haptic input means are mounted on the same support surface.
  • the arrangement of the haptic input means around a given image sensor makes it possible to optimize the match between what is captured at the image level and what is captured at the tactile level.
  • an even distribution of the tactile input means around the image sensor allows the input means to be used optimally, capturing as much tactile information as possible.
  • the device thus provides a "visuo-tactile continuity": the capture of visual images and the tactile capture are linked without apparent interruption, continuously. This result is obtained when, while zooming in on an object, the last sharp image is captured substantially at the moment the first tactile information is captured.
  • the zone of visual sharpness (the zone in which the object must be located for the image sensor to capture sharp images) and the tactile detection zone (the zone in which the object must be located for tactile input to be possible) either overlap slightly or have boundaries that are adjacent, or at least close to each other.
  • the start of tactile input can slightly precede, slightly follow, or exactly coincide with the end of visual sharpness (that is to say, the onset of blur).
  • in one variant, tactile input begins before the end of visual sharpness, in other words before the blur.
  • in another variant, tactile input begins shortly after the onset of blur.
  • the invention also relates to the use of the image capture and haptic capture device previously described for, at first, capturing images of an object by bringing the device and the object closer together, in order to visualize more and more details of the object, and then, in a second step, capturing tactile information relating to the object.
  • the haptic input means and the image capture means are arranged so as to capture images of an object and then to enter tactile information on said object, by bringing the device and the object closer together.
  • when the haptic input means require physical contact, the tactile information is captured through a contact between the device and the object, which follows their mutual approach.
  • when the haptic input means are able to capture tactile information remotely, for example by using laser beams to capture the shape of an object, capturing the tactile information does not require the user to continue the approach until contact is made.
  • the image capture and haptic capture device of the invention can be integrated into a piece of equipment for communication through a network, for example a mobile phone.
  • a user equipped with such a mobile phone UEi can thus, for example while shopping, show another, distant person, equipped with communication equipment incorporating a display screen and a tactile rendering surface, a wallpaper in more and more detail, then let that person touch the relief of the wallpaper, by gradually bringing the mobile phone's device closer to, and then into contact with, the wallpaper.
  • Image capture and touch input are performed one after the other without interruption, in other words continuously.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
EP07823674A 2006-08-03 2007-07-31 Bildaufnahmeverfahren und haptisches eingabegerät Withdrawn EP2069889A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0653261 2006-08-03
PCT/FR2007/051762 WO2008015365A2 (fr) 2006-08-03 2007-07-31 Dispositif de capture d'images et de saisie haptique

Publications (1)

Publication Number Publication Date
EP2069889A2 true EP2069889A2 (de) 2009-06-17

Family

ID=37672318

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07823674A Withdrawn EP2069889A2 (de) 2006-08-03 2007-07-31 Bildaufnahmeverfahren und haptisches eingabegerät

Country Status (3)

Country Link
US (1) US20090189874A1 (de)
EP (1) EP2069889A2 (de)
WO (1) WO2008015365A2 (de)

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090309997A1 (en) * 2008-06-13 2009-12-17 Sony Ericsson Mobile Communications Ab Image Capturing Device
SE533704C2 (sv) 2008-12-05 2010-12-07 Flatfrog Lab Ab Pekkänslig apparat och förfarande för drivning av densamma
FR2952810B1 (fr) * 2009-11-23 2012-12-14 Univ Compiegne Tech Procede d'interaction, stimulateur sensoriel et systeme d'interaction adaptes a la mise en oeuvre dudit procede
WO2012012549A2 (en) 2010-07-21 2012-01-26 The Regents Of The University Of California Method to reduce radiation dose in multidetector ct while maintaining image quality
US9403053B2 (en) 2011-05-26 2016-08-02 The Regents Of The University Of California Exercise promotion, measurement, and monitoring system
EP2793688A4 (de) * 2011-12-19 2015-05-06 Univ California System und verfahren zur quantifizierung einer körperabtastung für verbesserte medizinische diagnosen
US9588619B2 (en) * 2012-01-31 2017-03-07 Flatfrog Laboratories Ab Performance monitoring and correction in a touch-sensitive apparatus
US9483771B2 (en) 2012-03-15 2016-11-01 At&T Intellectual Property I, L.P. Methods, systems, and products for personalized haptic emulations
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
US10201746B1 (en) 2013-05-08 2019-02-12 The Regents Of The University Of California Near-realistic sports motion analysis and activity monitoring
WO2015005847A1 (en) 2013-07-12 2015-01-15 Flatfrog Laboratories Ab Partial detect mode
US10146376B2 (en) 2014-01-16 2018-12-04 Flatfrog Laboratories Ab Light coupling in TIR-based optical touch systems
WO2015108480A1 (en) 2014-01-16 2015-07-23 Flatfrog Laboratories Ab Improvements in tir-based optical touch systems of projection-type
EP3161594A4 (de) 2014-06-27 2018-01-17 FlatFrog Laboratories AB Nachweis von oberflächenverschmutzung
CN107209608A (zh) 2015-01-28 2017-09-26 平蛙实验室股份公司 动态触摸隔离帧
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
EP3537269A1 (de) 2015-02-09 2019-09-11 FlatFrog Laboratories AB Optisches berührungssystem
WO2016140612A1 (en) 2015-03-02 2016-09-09 Flatfrog Laboratories Ab Optical component for light coupling
JP2018536944A (ja) 2015-12-09 2018-12-13 フラットフロッグ ラボラトリーズ アーベーFlatFrog Laboratories AB 改善されたスタイラスの識別
WO2018096430A1 (en) 2016-11-24 2018-05-31 Flatfrog Laboratories Ab Automatic optimisation of touch signal
KR102495467B1 (ko) 2016-12-07 2023-02-06 플라트프로그 라보라토리즈 에이비 개선된 터치 장치
CN116679845A (zh) 2017-02-06 2023-09-01 平蛙实验室股份公司 触摸感测装置
US20180275830A1 (en) 2017-03-22 2018-09-27 Flatfrog Laboratories Ab Object characterisation for touch displays
EP4036697B1 (de) 2017-03-28 2026-01-07 FlatFrog Laboratories AB Optische berührungserfassungsvorrichtung
WO2019045629A1 (en) 2017-09-01 2019-03-07 Flatfrog Laboratories Ab IMPROVED OPTICAL COMPONENT
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US12055969B2 (en) 2018-10-20 2024-08-06 Flatfrog Laboratories Ab Frame for a touch-sensitive device and tool therefor
WO2020153890A1 (en) 2019-01-25 2020-07-30 Flatfrog Laboratories Ab A videoconferencing terminal and method of operating the same
WO2020209011A1 (ja) * 2019-04-08 2020-10-15 ソニー株式会社 移動制御装置及び移動体
ES2991658T3 (es) 2019-11-25 2024-12-04 Flatfrog Lab Ab Un aparato táctil
US12282653B2 (en) 2020-02-08 2025-04-22 Flatfrog Laboratories Ab Touch apparatus with low latency interactions
JP7681911B2 (ja) 2020-02-10 2025-05-23 フラットフロッグ ラボラトリーズ アーベー 改良型タッチ検知装置

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5459329A (en) * 1994-09-14 1995-10-17 Georgia Tech Research Corporation Video based 3D tactile reconstruction input device having a deformable membrane
US7209160B2 (en) * 1995-09-20 2007-04-24 Mcnelley Steve H Versatile teleconferencing eye contact terminal
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
JP2933023B2 (ja) * 1996-08-28 1999-08-09 日本電気株式会社 景色画像の入力及び触覚出力装置
US6368268B1 (en) * 1998-08-17 2002-04-09 Warren J. Sandvick Method and device for interactive virtual control of sexual aids using digital computer networks
US6803906B1 (en) * 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
US6786863B2 (en) * 2001-06-07 2004-09-07 Dadt Holdings, Llc Method and apparatus for remote physical contact
JP4029675B2 (ja) * 2002-06-19 2008-01-09 セイコーエプソン株式会社 画像・触覚情報入力装置
US20050235032A1 (en) * 2004-04-15 2005-10-20 Mason Wallace R Iii System and method for haptic based conferencing
US7535468B2 (en) * 2004-06-21 2009-05-19 Apple Inc. Integrated sensing display
US7626569B2 (en) * 2004-10-25 2009-12-01 Graphics Properties Holdings, Inc. Movable audio/video communication interface system
US7800594B2 (en) * 2005-02-03 2010-09-21 Toshiba Matsushita Display Technology Co., Ltd. Display device including function to input information from screen by light
US9019209B2 (en) * 2005-06-08 2015-04-28 3M Innovative Properties Company Touch location determination involving multiple touch location processes
US20070002130A1 (en) * 2005-06-21 2007-01-04 David Hartkop Method and apparatus for maintaining eye contact during person-to-person video telecommunication

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2008015365A2 *

Also Published As

Publication number Publication date
WO2008015365A2 (fr) 2008-02-07
US20090189874A1 (en) 2009-07-30
WO2008015365A3 (fr) 2008-04-10

Similar Documents

Publication Publication Date Title
EP2069889A2 (de) Bildaufnahmeverfahren und haptisches eingabegerät
CA2835047C (fr) Procede pour commander une action, notamment une modification de nettete, a partir d'une image numerique en couleurs
JP6185213B2 (ja) 撮像装置及び撮像装置の画像処理方法及びプログラム
US9667877B2 (en) Imaging device and imaging method
FR3021133B1 (fr) Terminal mobile et procede de commande dudit terminal mobile
CN104995911B (zh) 图像处理装置、摄影装置、滤波器生成装置、图像复原方法以及程序
Xiao et al. Mobile imaging: the big challenge of the small pixel
US20150334359A1 (en) Image processing device, image capture device, image processing method, and non-transitory computer-readable medium
JP4412710B2 (ja) 光電変換装置の設計方法
FR2878641A1 (fr) Procede de navigation automatique contrainte vers des regions d'interet d'une image
CN105009563A (zh) 复原滤波器生成装置和方法、图像处理装置、摄像装置、复原滤波器生成程序以及记录介质
US9881362B2 (en) Image processing device, image-capturing device, image processing method, and program
FR2999852A1 (fr) Module de camera et systeme de surveillance dote de celui-ci
WO2014046038A1 (ja) 撮像装置及び合焦確認表示方法
WO2017208702A1 (ja) 撮像素子及び撮像装置
US9984448B2 (en) Restoration filter generation device and method, image processing device and method, imaging device, and non-transitory computer-readable medium
JP2004208301A (ja) 画像センサ及び画像キャプチャシステム並びにアレイを用いる方法
EP1351498B1 (de) Echtzeitverarbeitungsmethode eines Bildsignales
EP1834475A2 (de) Videotelefonieendgerät mit intuitiven einstellungen
JP5542248B2 (ja) 撮像素子及び撮像装置
FR2799911A1 (fr) Systeme de controle du stress
EP3408726A2 (de) Persönlicher digitaler assistent aus einem smartphone, einer tastatur und einem tablet zur aufnahme von bildern
CN104871526B (zh) 图像处理装置、摄像装置、图像处理方法、图像处理程序
EP3386191B1 (de) Array-sensor mit zeitkodierung ohne arbitrierung
EP1168810B1 (de) Mobiltelefon versehen mit einer Kamera

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20090302

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ORANGE

17Q First examination report despatched

Effective date: 20140630

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20161013