WO2003025859A1 - Dispositif d'interface (Interface device) - Google Patents
Interface device
- Publication number
- WO2003025859A1 (PCT/JP2002/009496)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hand sign
- individual
- image
- user
- interface device
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- the present invention relates to an interface device, and more particularly to an interface device for an indoor space that is operated by hand gestures and that has an individual-identification function based on face and height.
- as shown in Fig. 1, there are many systems that identify speakers from the position and orientation of an individual's face. These systems use one camera or a few cameras 61 to find the face of the user 64, identify the individual by face, and measure face orientation, gaze direction, and the like, relying on facial features such as the eyes, nose, and mouth.
- the recognition target area 63 is therefore limited to a seated user or a person watching a certain display 62 or the like, so such a system can be used only in a very confined place.
- identification in such systems mainly targets the direction of the face, the direction of the line of sight, and nodding actions.
- in the system of Fig. 2, cameras 71 (71-1, 71-2, 71-3, ..., 71-n) are placed so as to expand the target area and capture the entire body of the user 72; however, this is not an interface device, and it provides no gesture-based interaction combined with personal identification.
- the conventional hand-sign recognition system for the user 84 shown in Fig. 3 likewise assumes a fixed environment in front of the camera 81, that is, a recognizable area 83; it acquires and recognizes an image of the hand alone and uses the result, through the display 82 or the like, as an interface.
- the present invention is intended to provide an interface device that can easily identify each of a plurality of persons in an indoor space, obtain all of their action logs, and allow the indoor space to be operated by hand signs in a manner tailored to each individual.
- the present invention identifies each person in the room by face and height while tracking that person's position in the room. At the same time, a command to operate indoor equipment is issued by pointing directly at the operating device and showing a hand sign. When a change is made to the registered indoor space, that change is recorded, as are all actions of every person acting indoors.
- the present invention further provides means for registering an individual's face and the hand signs associated with that individual, and means for registering indoor objects. It also has storage means for storing in advance, per individual, the commands corresponding to each hand-sign operation, together with means for registering each command for the corresponding operating device.
- FIG. 1 is a diagram showing a conventional technique.
- FIG. 2 is a diagram showing a conventional technique.
- FIG. 3 is a diagram showing a conventional technique.
- FIG. 4 is a diagram showing a configuration of a system according to an embodiment of the present invention.
- FIG. 5 is a block diagram of the information integrated recognition device shown in FIG.
- FIG. 6 is a flowchart for explaining the operation of one embodiment of the present invention.
- FIG. 7 is a flowchart of a learning process of an unknown object, which is an operation of the embodiment of the present invention.
- FIG. 8 is a flowchart of a process for registering the operation database of an indoor operation target, which is an operation of one embodiment of the present invention.
- FIG. 9 is a personal-condition monitoring flowchart showing the operation of one embodiment of the present invention.
BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 4 is a diagram showing a system configuration of one embodiment of the present invention
- FIG. 5 is a block diagram of the information integrated recognition device shown in FIG.
- images are taken by a plurality of stereo cameras 1-1 to 1-n so that there is no blind spot in the indoor space 5 in which the user 4 moves.
- in each stereo camera, the imaging elements of two or more cameras are fixed in parallel, and the imaging outputs of the stereo cameras 1-1 to 1-n are given to the image processing device 2.
- the stereo camera itself is already known; for example, a digital stereo camera such as the Digiclops from Point Grey Research or the Acadia from the Sarnoff Corporation may be used.
- the image processing device 2 takes the images from the stereo cameras 1 as input, processes about 10 to 30 frames per second, and gives the resulting distance images and color images to the information integrated recognition device 3. That is, the x, y, z coordinate information and the color image information of all objects present in the room are acquired by the image processing device 2 and sent to the information integrated recognition device 3.
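The step above turns each stereo distance image into x, y, z coordinates. As a minimal sketch (the function name, pinhole intrinsics `fx, fy, cx, cy`, and array layout are assumptions for illustration, not the patent's implementation), a per-pixel distance image can be back-projected into camera-frame 3-D points like this:

```python
import numpy as np

def distance_image_to_points(depth, fx, fy, cx, cy):
    """Back-project a distance (depth) image into camera-frame x, y, z points.

    depth: HxW array of z distances in metres (0 where stereo matching failed).
    fx, fy, cx, cy: pinhole intrinsics of the (hypothetical) stereo rig.
    Returns an Nx3 array of the valid 3-D points.
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))  # pixel column/row grids
    valid = depth > 0
    z = depth[valid]
    x = (us[valid] - cx) * z / fx   # pinhole model: X = (u - cx) * Z / fx
    y = (vs[valid] - cy) * z / fy
    return np.column_stack([x, y, z])
```

At 10 to 30 frames per second this runs once per frame per camera; the resulting point sets are what the recognition stages downstream operate on.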
- the information integrated recognition device 3 includes a target recognition unit 31, a target database 32, an unknown target learning unit 33, an individual identification unit 34, a hand sign identification unit 35, a personal action log database 36, an indoor operation database 37, an operation data registration unit 38, an object identification unit 39, and a personal condition monitoring unit 40.
- the target recognition unit 31 cuts out a plurality of target portions from the color image, based on the three-dimensional information sent from the image processing device 2, and performs all recognition processing.
- the target database 32 stores all target data previously learned.
- the unknown target learning unit 33 learns the unknown target and stores it in the target database 32.
- the individual identification unit 34 identifies the individual from the recognized target's head and height, and records all of that individual's actions, such as indoor movement and posture, in the personal action log database 36.
- the hand sign identification unit 35, in cooperation with the individual identification unit 34, sends out the indoor operation command registered in the indoor operation database 37 when an intended hand sign is shown, and simultaneously records the event in the personal action log database 36.
- the operation data registration unit 38 registers information on the indoor operation target device 6 and information on the operable person in the indoor operation database 37.
- the object identification unit 39 recognizes objects other than the user's head and hands; if an object is moved or newly brought in, its relationship to the person who changed it is recorded in the personal action log database 36.
- the personal condition monitoring unit 40 registers in advance, in the personal action log database 36, the conditions to be monitored for each individual; when the continuously updated database matches one of those conditions, it informs the user 4 through the voice synthesizer 7 or the display 8.
- FIG. 6 is a flowchart for explaining the operation of one embodiment of the present invention, and is a partially detailed view of FIG. 5. FIGS. 4 to 6 show an embodiment applied to an indoor space, but the present invention is not limited to this embodiment; it can also be applied, for example, to grasping the state of a person for safety management. Every user 4 entering and exiting the indoor space 5 in FIG. 4 is photographed by the stereo cameras 1-1 to 1-n, and the imaging output is given to the image processing device 2. That is, a person who enters this space is photographed according to the flow shown in FIG. 6 until he or she leaves the room, and all of his or her actions are recorded.
- the image processing device 2 generates, for each camera unit, a distance image based on the color image in its field of view and the indoor coordinate system, and provides this information to the information integrated recognition device 3.
- the common coordinate system x, y, z (FIG. 4) shared by the stereo cameras 1-1 to 1-n is set in advance.
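Setting a common coordinate system means each camera's points must be mapped into the shared room frame before they can be fused. A minimal sketch, assuming a one-time extrinsic calibration has produced a rotation `R` and translation `t` per camera (names and shapes are illustrative, not from the patent):

```python
import numpy as np

def to_room_frame(points_cam, R, t):
    """Map Nx3 camera-frame points into the shared room frame.

    Applies p_room = R @ p_cam + t for every point. R (3x3) and t (3,)
    would come from calibrating each stereo camera against the common
    room coordinate system x, y, z once, in advance.
    """
    return points_cam @ np.asarray(R).T + np.asarray(t)
```

With every camera's output expressed in the same frame, blind-spot-free coverage reduces to taking the union of the per-camera point sets.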
- in step S31, the regions that are recognition targets (objects, for example people) are extracted.
- in step S32 (target recognition unit 31 in FIG. 5), all the targets extracted in step S31 are recognized by collating them against the pre-registered target database 32.
- each recognition result is output with a degree of certainty, and results below a certain threshold are treated as unknown targets.
- the targets are first classified into the broad categories of head, hand, and other; the head, in particular, can yield recognition results from multiple directions.
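The collation-with-certainty step above can be sketched as a nearest-template match with a rejection threshold (the feature representation, cosine-similarity score, and 0.8 threshold are illustrative assumptions, not the patent's actual recognizer):

```python
import numpy as np

def classify(feature, database, threshold=0.8):
    """Collate an extracted target against registered templates.

    database: {label: template feature vector}. The certainty here is
    cosine similarity; any best match below `threshold` is flagged as an
    unknown target, which the unknown-target learning step can register.
    """
    best_label, best_score = "unknown", 0.0
    f = np.asarray(feature, dtype=float)
    for label, tpl in database.items():
        tpl = np.asarray(tpl, dtype=float)
        score = f @ tpl / (np.linalg.norm(f) * np.linalg.norm(tpl))
        if score > best_score:
            best_label, best_score = label, score
    if best_score < threshold:
        return "unknown", best_score
    return best_label, best_score
```

The same scheme works per category: separate databases for heads, hand signs, and other objects keep the collation within the broad class assigned first.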
- in step S33 (target recognition unit 31 in FIG. 5), the process branches according to whether the recognition result is a head, a hand sign, an unknown target, or any other target, and processing is performed for each.
- in step S34 (individual identification unit 34 in FIG. 5), a person is collated from the recognition results of a plurality of head views together with the person's height; using several views makes the accuracy higher.
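Combining several head views with the measured height can be sketched as a vote-then-verify rule (the registry of per-person heights, the 5 cm tolerance, and all names are illustrative assumptions):

```python
def identify_person(head_votes, height_m, registry, height_tol=0.05):
    """Combine face-match results from several camera views with height.

    head_votes: list of (name, certainty) pairs, one per view that saw the head.
    registry: {name: registered height in metres}. A candidate wins only if
    the accumulated face vote agrees with the stored height within height_tol.
    Returns the identified name, or "unknown".
    """
    scores = {}
    for name, certainty in head_votes:
        scores[name] = scores.get(name, 0.0) + certainty
    # Check candidates from strongest face vote down, vetoing height mismatches.
    for name, _ in sorted(scores.items(), key=lambda kv: -kv[1]):
        if name in registry and abs(registry[name] - height_m) <= height_tol:
            return name
    return "unknown"
```

Height acts as a cheap second factor here: a strong but wrong face match from one awkward viewing angle is rejected when the body height does not fit.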
- in step S35 (individual identification unit 34 in FIG. 5), the posture of the person (standing or sitting) is recognized based on the relationship between the face and the body parts, and its changes are recorded in the personal action log database 36 in step S36 (individual identification unit 34 in FIG. 5).
- in step S37, whether the sign was intended is identified from the direction of the hand, the direction of the face or body, and the type and movement of the sign. Then, after the result is integrated with the individual identification from the same period in step S38 (hand sign identification unit 35 in FIG. 5), the indoor operation command permitted for that individual is transmitted.
- the operation target 6, such as a TV or air conditioner, is operated solely by the hand sign and its movement, according to the command transmission method registered in advance. The transmission history is also stored in the personal action log database 36.
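The permission-checked dispatch described above, with every attempt logged, can be sketched as follows (the class, method names, and log schema are illustrative assumptions about the indoor operation database, not the patent's data format):

```python
import time

class IndoorOperationDB:
    """Minimal sketch of the indoor operation database with per-person permissions."""

    def __init__(self):
        self.commands = {}  # (device, sign) -> command string
        self.allowed = {}   # device -> set of persons permitted to operate it

    def register(self, device, sign, command, persons):
        self.commands[(device, sign)] = command
        self.allowed[device] = set(persons)

def send_hand_sign(db, action_log, person, device, sign):
    """Dispatch a command only if this individual may operate the device.

    The attempt is appended to the personal action log either way, so the
    transmission history is preserved even for refused operations.
    """
    cmd = db.commands.get((device, sign))
    permitted = cmd is not None and person in db.allowed.get(device, set())
    action_log.append({"t": time.time(), "who": person, "device": device,
                       "sign": sign, "sent": permitted})
    return cmd if permitted else None
```

This is also where the per-individual restriction mentioned later takes effect: the same sign from a registered adult and from a child resolves to different outcomes.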
- in step S39 (object identification unit 39 in FIG. 5), when an object is moved, the person who moved it and the object's final destination are also saved in the personal action log database 36.
- the learning process for an unknown target in step S40 corresponds to the unknown target learning unit 33 in FIG. 5, and is described in detail in FIG. 7.
- in step S41 (unknown target learning unit 33 in FIG. 5), the user specifies the category, and registrations are made separately for the head, hand sign, and other categories.
- in step S42 (unknown target learning unit 33 in FIG. 5), the head is registered in the target database 32.
- the shape of the hand sign is registered in step S43 (unknown target learning unit 33 in FIG. 5), and is likewise stored in the target database 32.
- other objects are registered by holding them in the hand to specify the target, so that in step S43 (unknown target learning unit 33 in FIG. 5) these, too, are registered in the target database 32.
- FIG. 8 shows the registration of the operation database for indoor operation targets, corresponding to the operation data registration unit 38 in FIG. 5. Specifically, the operating devices, such as a TV and an air conditioner, are first registered in step S51 (operation data registration unit 38 in FIG. 5).
- step S52 (operation data registration unit 38 in FIG. 5)
- step S53 (operation data registration unit 38 in FIG. 5)
- step S54 (operation data registration unit 38 in FIG. 5)
- FIG. 9 is a detailed view of the personal condition monitoring unit 40 of FIG. 5.
- the content to be monitored is registered in advance in the personal action log database 36 through step S91 in FIG. 9 (personal condition monitoring unit 40 in FIG. 5). The registered conditions are compared in step S93 (personal condition monitoring unit 40 in FIG. 5) against the personal action log database 36, which is updated continuously. During normal operation, as in the example of steps S94 to S96 (personal condition monitoring unit 40 in FIG. 5), the system can, for example, keep the user from staying in the same posture for a long time, prompt hydration, or warn against watching the TV from too close.
- in this way, messages that help improve an individual's health (promoting blood circulation and hydration to prevent blood clots, stiff shoulders, back pain, and so on) can be delivered. These messages are conveyed to the individual by the voice synthesizer 7 and the display device 8 in step S97 (personal condition monitoring unit 40 in FIG. 5).
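The monitoring loop of steps S91 to S97 can be sketched as a rule check over the continuously updated action log (the log schema, per-person rule table, and message text are illustrative assumptions):

```python
def check_conditions(action_log, rules, now):
    """Scan the latest per-person state against registered monitoring rules.

    action_log: list of {"who", "posture", "since"} entries, one per person,
                where "since" is the time the current posture began.
    rules: {person: max seconds allowed in one posture}, registered in advance.
    Returns the messages to hand to the voice synthesizer or display.
    """
    messages = []
    for entry in action_log:
        limit = rules.get(entry["who"])
        if limit is not None and now - entry["since"] > limit:
            messages.append(
                f'{entry["who"]}: you have been {entry["posture"]} for a while; '
                f'please move around and drink some water'
            )
    return messages
```

Run once per update of the action log, this yields exactly the kind of posture and hydration prompts the flowchart's normal-operation branch describes.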
- as described above, a plurality of stereo cameras capture the interior of a room, all action logs of the persons in the room are acquired, and an intentional hand sign can send an operation command to a pre-registered indoor operation target.
- because people and objects in the room are registered in advance, operation can be restricted on an individual basis; a different interface can apply to each user, for example allowing a parent to operate a device while a child cannot.
- objects, including humans, can be recognized accurately without being affected by their state (posture, etc.).
- the interface device of the present invention is thus suitable as a gesture-interface device, a recording device, or a monitoring device that facilitates identifying each of a plurality of persons in an indoor space, acquiring all of their action logs, and operating the indoor space by hand signs in a manner tailored to each individual.
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/489,301 US7680295B2 (en) | 2001-09-17 | 2002-09-17 | Hand-gesture based interface apparatus |
JP2003529410A JP4304337B2 (ja) | 2001-09-17 | 2002-09-17 | インタフェース装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2001282076 | 2001-09-17 | ||
JP2001-282076 | 2001-09-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003025859A1 true WO2003025859A1 (fr) | 2003-03-27 |
Family
ID=19105777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2002/009496 WO2003025859A1 (fr) | 2001-09-17 | 2002-09-17 | Dispositif d'interface |
Country Status (3)
Country | Link |
---|---|
US (1) | US7680295B2 (ja) |
JP (1) | JP4304337B2 (ja) |
WO (1) | WO2003025859A1 (ja) |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005202653A (ja) * | 2004-01-15 | 2005-07-28 | Canon Inc | 動作認識装置及び方法、動物体認識装置及び方法、機器制御装置及び方法、並びにプログラム |
JP2008530661A (ja) * | 2005-02-08 | 2008-08-07 | オブロング・インダストリーズ・インコーポレーテッド | ジェスチャベースの制御システムのためのシステムおよび方法 |
JP2010079643A (ja) * | 2008-09-26 | 2010-04-08 | Kddi Corp | 情報端末装置 |
US7987147B2 (en) | 2006-05-18 | 2011-07-26 | Sony Corporation | Information processing apparatus, information processing method, and program based on operator probability using sensors |
JP2011528455A (ja) * | 2008-06-25 | 2011-11-17 | コリア・インスティテュート・オブ・サイエンス・アンド・テクノロジー | 手の動作によるネットワーク上の機器及び情報制御システム |
JP2012502344A (ja) * | 2008-09-04 | 2012-01-26 | エクストリーム リアリティー エルティーディー. | 画像センサベースのヒューマンマシンインタフェースを提供する方法システムおよびソフトウェア |
JP2012502364A (ja) * | 2008-09-03 | 2012-01-26 | オブロング・インダストリーズ・インコーポレーテッド | データ空間の主要次元をナビゲートするための制御システム |
JP2012058928A (ja) * | 2010-09-07 | 2012-03-22 | Sony Corp | 情報処理装置、および情報処理方法 |
US8165407B1 (en) * | 2006-10-06 | 2012-04-24 | Hrl Laboratories, Llc | Visual attention and object recognition system |
JP2012526328A (ja) * | 2009-05-04 | 2012-10-25 | オブロング・インダストリーズ・インコーポレーテッド | 三空間入力の検出、表現、および解釈:自由空間、近接、および表面接触モードを組み込むジェスチャ連続体 |
US8363939B1 (en) * | 2006-10-06 | 2013-01-29 | Hrl Laboratories, Llc | Visual attention and segmentation system |
EP2704057A2 (en) | 2012-08-31 | 2014-03-05 | Omron Corporation | Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium |
US8872899B2 (en) | 2004-07-30 | 2014-10-28 | Extreme Reality Ltd. | Method circuit and system for human to machine interfacing by hand gestures |
US8878779B2 (en) | 2009-09-21 | 2014-11-04 | Extreme Reality Ltd. | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen |
US8878896B2 (en) | 2005-10-31 | 2014-11-04 | Extreme Reality Ltd. | Apparatus method and system for imaging |
JP2014535100A (ja) * | 2011-10-12 | 2014-12-25 | クアルコム,インコーポレイテッド | 認証型ジェスチャ認識 |
US8928654B2 (en) | 2004-07-30 | 2015-01-06 | Extreme Reality Ltd. | Methods, systems, devices and associated processing logic for generating stereoscopic images and video |
JP2015095164A (ja) * | 2013-11-13 | 2015-05-18 | オムロン株式会社 | ジェスチャ認識装置およびジェスチャ認識装置の制御方法 |
US9046962B2 (en) | 2005-10-31 | 2015-06-02 | Extreme Reality Ltd. | Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region |
US9177220B2 (en) | 2004-07-30 | 2015-11-03 | Extreme Reality Ltd. | System and method for 3D space-dimension based image processing |
US9218126B2 (en) | 2009-09-21 | 2015-12-22 | Extreme Reality Ltd. | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance |
JP2016095647A (ja) * | 2014-11-13 | 2016-05-26 | セイコーエプソン株式会社 | プロジェクター、及び、プロジェクターの制御方法 |
US9778351B1 (en) | 2007-10-04 | 2017-10-03 | Hrl Laboratories, Llc | System for surveillance by integrating radar with a panoramic staring sensor |
JP2020515979A (ja) * | 2017-03-31 | 2020-05-28 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | モノのインターネットに基づく情報処理の方法およびデバイス |
US10841539B2 (en) | 2010-11-17 | 2020-11-17 | Omron Scientific Technologies, Inc. | Method and apparatus for monitoring zones |
JP7274782B1 (ja) | 2021-12-10 | 2023-05-17 | 株式会社 Sai | 建屋内構造物認識システム及び建屋内構造物認識方法 |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DK1573498T3 (da) * | 2002-11-20 | 2012-03-19 | Koninkl Philips Electronics Nv | Brugergrænsefladesystem baseret på en pegeindretning |
EP1679885A1 (en) * | 2003-08-11 | 2006-07-12 | Matsushita Electric Industrial Co., Ltd. | Photographing system and photographing method |
KR100687737B1 (ko) * | 2005-03-19 | 2007-02-27 | 한국전자통신연구원 | 양손 제스쳐에 기반한 가상 마우스 장치 및 방법 |
US8269834B2 (en) | 2007-01-12 | 2012-09-18 | International Business Machines Corporation | Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream |
US8295542B2 (en) | 2007-01-12 | 2012-10-23 | International Business Machines Corporation | Adjusting a consumer experience based on a 3D captured image stream of a consumer response |
US8588464B2 (en) * | 2007-01-12 | 2013-11-19 | International Business Machines Corporation | Assisting a vision-impaired user with navigation based on a 3D captured image stream |
US8886540B2 (en) | 2007-03-07 | 2014-11-11 | Vlingo Corporation | Using speech recognition results based on an unstructured language model in a mobile communication facility application |
US8880405B2 (en) | 2007-03-07 | 2014-11-04 | Vlingo Corporation | Application text entry in a mobile environment using a speech processing facility |
US8838457B2 (en) | 2007-03-07 | 2014-09-16 | Vlingo Corporation | Using results of unstructured language model based speech recognition to control a system-level function of a mobile communications facility |
US8949130B2 (en) | 2007-03-07 | 2015-02-03 | Vlingo Corporation | Internal and external speech recognition use with a mobile communication facility |
US8949266B2 (en) | 2007-03-07 | 2015-02-03 | Vlingo Corporation | Multiple web-based content category searching in mobile search application |
US10056077B2 (en) | 2007-03-07 | 2018-08-21 | Nuance Communications, Inc. | Using speech recognition results based on an unstructured language model with a music system |
US8886545B2 (en) * | 2007-03-07 | 2014-11-11 | Vlingo Corporation | Dealing with switch latency in speech recognition |
US9740949B1 (en) | 2007-06-14 | 2017-08-22 | Hrl Laboratories, Llc | System and method for detection of objects of interest in imagery |
US20090109036A1 (en) * | 2007-10-29 | 2009-04-30 | The Boeing Company | System and Method for Alternative Communication |
US10146320B2 (en) | 2007-10-29 | 2018-12-04 | The Boeing Company | Aircraft having gesture-based control for an onboard passenger service unit |
JP2009146333A (ja) * | 2007-12-18 | 2009-07-02 | Panasonic Corp | 空間入力動作表示装置 |
US8555207B2 (en) * | 2008-02-27 | 2013-10-08 | Qualcomm Incorporated | Enhanced input using recognized gestures |
US9772689B2 (en) * | 2008-03-04 | 2017-09-26 | Qualcomm Incorporated | Enhanced gesture-based image manipulation |
US8514251B2 (en) * | 2008-06-23 | 2013-08-20 | Qualcomm Incorporated | Enhanced character input using recognized gestures |
US8428311B2 (en) * | 2009-02-25 | 2013-04-23 | Honda Motor Co., Ltd. | Capturing and recognizing hand postures using inner distance shape contexts |
US8843857B2 (en) * | 2009-11-19 | 2014-09-23 | Microsoft Corporation | Distance scalable no touch computing |
US20110280439A1 (en) * | 2010-05-11 | 2011-11-17 | Beverly Harrison | Techniques for person detection |
US8490877B2 (en) | 2010-11-09 | 2013-07-23 | Metrologic Instruments, Inc. | Digital-imaging based code symbol reading system having finger-pointing triggered mode of operation |
US9104239B2 (en) | 2011-03-09 | 2015-08-11 | Lg Electronics Inc. | Display device and method for controlling gesture functions using different depth ranges |
WO2012121433A1 (en) * | 2011-03-09 | 2012-09-13 | Lg Electronics Inc. | Display device and method for controlling the same |
JPWO2012124252A1 (ja) * | 2011-03-14 | 2014-07-17 | 株式会社ニコン | 電子機器、電子機器の制御方法およびプログラム |
US9084001B2 (en) | 2011-07-18 | 2015-07-14 | At&T Intellectual Property I, Lp | Method and apparatus for multi-experience metadata translation of media content with metadata |
US8943396B2 (en) | 2011-07-18 | 2015-01-27 | At&T Intellectual Property I, Lp | Method and apparatus for multi-experience adaptation of media content |
US9237362B2 (en) | 2011-08-11 | 2016-01-12 | At&T Intellectual Property I, Lp | Method and apparatus for multi-experience translation of media content with sensor sharing |
US8942412B2 (en) | 2011-08-11 | 2015-01-27 | At&T Intellectual Property I, Lp | Method and apparatus for controlling multi-experience translation of media content |
US8847881B2 (en) | 2011-11-18 | 2014-09-30 | Sony Corporation | Gesture and voice recognition for control of a device |
EP2679140A4 (en) * | 2011-12-26 | 2015-06-03 | Olympus Medical Systems Corp | MEDICAL ENDOSCOPE SYSTEM |
GB201203851D0 (en) * | 2012-03-05 | 2012-04-18 | Elliptic Laboratories As | Touchless user interfaces |
JP5858850B2 (ja) * | 2012-04-02 | 2016-02-10 | 三菱電機株式会社 | 空気調和機の室内機 |
US9474130B2 (en) | 2012-10-17 | 2016-10-18 | Koninklijke Philips N.V. | Methods and apparatus for applying lighting to an object |
US20150139483A1 (en) * | 2013-11-15 | 2015-05-21 | David Shen | Interactive Controls For Operating Devices and Systems |
ES2936342T3 (es) | 2014-01-30 | 2023-03-16 | Signify Holding Bv | Control por gestos |
CN103885590A (zh) * | 2014-03-10 | 2014-06-25 | 可牛网络技术(北京)有限公司 | 获取用户指令的方法及用户设备 |
WO2015189860A2 (en) * | 2014-06-12 | 2015-12-17 | Lensbricks Technology Private Limited | Method for interaction with devices |
US20160219266A1 (en) * | 2015-01-25 | 2016-07-28 | 3dMD Technologies Ltd | Anatomical imaging system for product customization and methods of use thereof |
US10057078B2 (en) | 2015-08-21 | 2018-08-21 | Samsung Electronics Company, Ltd. | User-configurable interactive region monitoring |
WO2017034217A1 (en) * | 2015-08-21 | 2017-03-02 | Samsung Electronics Co., Ltd. | Apparatus and method for user-configurable interactive region monitoring |
US10048769B2 (en) * | 2015-11-18 | 2018-08-14 | Ted Selker | Three-dimensional computer-aided-design system user interface |
KR102462502B1 (ko) * | 2016-08-16 | 2022-11-02 | 삼성전자주식회사 | 스테레오 카메라 기반의 자율 주행 방법 및 그 장치 |
JP2019011881A (ja) * | 2017-06-29 | 2019-01-24 | 三菱重工サーマルシステムズ株式会社 | 空調制御装置、空調システム、空調制御方法及びプログラム |
JP6874136B2 (ja) * | 2017-07-18 | 2021-05-19 | 株式会社ソニー・インタラクティブエンタテインメント | 画像認識装置、画像認識方法及びプログラム |
EP3726344A1 (en) | 2019-04-16 | 2020-10-21 | InterDigital CE Patent Holdings | Method and apparatus for user control of an application and corresponding device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0919906A2 (en) * | 1997-11-27 | 1999-06-02 | Matsushita Electric Industrial Co., Ltd. | Control method |
JP2000000216A (ja) * | 1998-06-12 | 2000-01-07 | Toshiba Eng Co Ltd | 行動監視装置および行動監視・支援システム |
JP2002251235A (ja) * | 2001-02-23 | 2002-09-06 | Fujitsu Ltd | 利用者インタフェースシステム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4119900A (en) * | 1973-12-21 | 1978-10-10 | Ito Patent-Ag | Method and system for the automatic orientation and control of a robot |
US4183013A (en) * | 1976-11-29 | 1980-01-08 | Coulter Electronics, Inc. | System for extracting shape features from an image |
US5012522A (en) * | 1988-12-08 | 1991-04-30 | The United States Of America As Represented By The Secretary Of The Air Force | Autonomous face recognition machine |
US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
EP0905644A3 (en) * | 1997-09-26 | 2004-02-25 | Matsushita Electric Industrial Co., Ltd. | Hand gesture recognizing device |
US6116907A (en) * | 1998-01-13 | 2000-09-12 | Sorenson Vision, Inc. | System and method for encoding and retrieving visual signals |
US6421453B1 (en) * | 1998-05-15 | 2002-07-16 | International Business Machines Corporation | Apparatus and methods for user recognition employing behavioral passwords |
US6904182B1 (en) * | 2000-04-19 | 2005-06-07 | Microsoft Corporation | Whiteboard imaging system |
US6788809B1 (en) * | 2000-06-30 | 2004-09-07 | Intel Corporation | System and method for gesture recognition in three dimensions using stereo imaging and color vision |
US6804396B2 (en) * | 2001-03-28 | 2004-10-12 | Honda Giken Kogyo Kabushiki Kaisha | Gesture recognition system |
JP3996015B2 (ja) * | 2002-08-09 | 2007-10-24 | 本田技研工業株式会社 | 姿勢認識装置及び自律ロボット |
-
2002
- 2002-09-17 US US10/489,301 patent/US7680295B2/en not_active Expired - Fee Related
- 2002-09-17 WO PCT/JP2002/009496 patent/WO2003025859A1/ja active Application Filing
- 2002-09-17 JP JP2003529410A patent/JP4304337B2/ja not_active Expired - Lifetime
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4481663B2 (ja) * | 2004-01-15 | 2010-06-16 | キヤノン株式会社 | 動作認識装置、動作認識方法、機器制御装置及びコンピュータプログラム |
US8094881B2 (en) | 2004-01-15 | 2012-01-10 | Canon Kabushiki Kaisha | Action recognition apparatus and method, moving-object recognition apparatus and method, device control apparatus and method, and program |
JP2005202653A (ja) * | 2004-01-15 | 2005-07-28 | Canon Inc | 動作認識装置及び方法、動物体認識装置及び方法、機器制御装置及び方法、並びにプログラム |
US8928654B2 (en) | 2004-07-30 | 2015-01-06 | Extreme Reality Ltd. | Methods, systems, devices and associated processing logic for generating stereoscopic images and video |
US8872899B2 (en) | 2004-07-30 | 2014-10-28 | Extreme Reality Ltd. | Method circuit and system for human to machine interfacing by hand gestures |
US9177220B2 (en) | 2004-07-30 | 2015-11-03 | Extreme Reality Ltd. | System and method for 3D space-dimension based image processing |
JP2008530661A (ja) * | 2005-02-08 | 2008-08-07 | オブロング・インダストリーズ・インコーポレーテッド | ジェスチャベースの制御システムのためのシステムおよび方法 |
US9046962B2 (en) | 2005-10-31 | 2015-06-02 | Extreme Reality Ltd. | Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region |
US8878896B2 (en) | 2005-10-31 | 2014-11-04 | Extreme Reality Ltd. | Apparatus method and system for imaging |
US9131220B2 (en) | 2005-10-31 | 2015-09-08 | Extreme Reality Ltd. | Apparatus method and system for imaging |
US7987147B2 (en) | 2006-05-18 | 2011-07-26 | Sony Corporation | Information processing apparatus, information processing method, and program based on operator probability using sensors |
US8363939B1 (en) * | 2006-10-06 | 2013-01-29 | Hrl Laboratories, Llc | Visual attention and segmentation system |
US8165407B1 (en) * | 2006-10-06 | 2012-04-24 | Hrl Laboratories, Llc | Visual attention and object recognition system |
US9778351B1 (en) | 2007-10-04 | 2017-10-03 | Hrl Laboratories, Llc | System for surveillance by integrating radar with a panoramic staring sensor |
JP2011528455A (ja) * | 2008-06-25 | 2011-11-17 | コリア・インスティテュート・オブ・サイエンス・アンド・テクノロジー | 手の動作によるネットワーク上の機器及び情報制御システム |
JP2012502364A (ja) * | 2008-09-03 | 2012-01-26 | オブロング・インダストリーズ・インコーポレーテッド | データ空間の主要次元をナビゲートするための制御システム |
JP2012502344A (ja) * | 2008-09-04 | 2012-01-26 | エクストリーム リアリティー エルティーディー. | 画像センサベースのヒューマンマシンインタフェースを提供する方法システムおよびソフトウェア |
JP2010079643A (ja) * | 2008-09-26 | 2010-04-08 | Kddi Corp | 情報端末装置 |
JP2012526328A (ja) * | 2009-05-04 | 2012-10-25 | オブロング・インダストリーズ・インコーポレーテッド | 三空間入力の検出、表現、および解釈:自由空間、近接、および表面接触モードを組み込むジェスチャ連続体 |
US8878779B2 (en) | 2009-09-21 | 2014-11-04 | Extreme Reality Ltd. | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen |
US9218126B2 (en) | 2009-09-21 | 2015-12-22 | Extreme Reality Ltd. | Methods circuits apparatus and systems for human machine interfacing with an electronic appliance |
JP2012058928A (ja) * | 2010-09-07 | 2012-03-22 | Sony Corp | 情報処理装置、および情報処理方法 |
US8744137B2 (en) | 2010-09-07 | 2014-06-03 | Sony Corporation | Information processing device and information processing method |
US9513712B2 (en) | 2010-09-07 | 2016-12-06 | Sony Corporation | Information processing device and information processing method |
US10841539B2 (en) | 2010-11-17 | 2020-11-17 | Omron Scientific Technologies, Inc. | Method and apparatus for monitoring zones |
JP2014535100A (ja) * | 2011-10-12 | 2014-12-25 | クアルコム,インコーポレイテッド | 認証型ジェスチャ認識 |
EP2704057A2 (en) | 2012-08-31 | 2014-03-05 | Omron Corporation | Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium |
JP2015095164A (ja) * | 2013-11-13 | 2015-05-18 | オムロン株式会社 | ジェスチャ認識装置およびジェスチャ認識装置の制御方法 |
JP2016095647A (ja) * | 2014-11-13 | 2016-05-26 | セイコーエプソン株式会社 | プロジェクター、及び、プロジェクターの制御方法 |
JP2020515979A (ja) * | 2017-03-31 | 2020-05-28 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | モノのインターネットに基づく情報処理の方法およびデバイス |
US11461444B2 (en) | 2017-03-31 | 2022-10-04 | Advanced New Technologies Co., Ltd. | Information processing method and device based on internet of things |
JP7274782B1 (ja) | 2021-12-10 | 2023-05-17 | 株式会社 Sai | 建屋内構造物認識システム及び建屋内構造物認識方法 |
JP2023086519A (ja) * | 2021-12-10 | 2023-06-22 | 株式会社 Sai | 建屋内構造物認識システム及び建屋内構造物認識方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2003025859A1 (ja) | 2005-01-06 |
US20060182346A1 (en) | 2006-08-17 |
JP4304337B2 (ja) | 2009-07-29 |
US7680295B2 (en) | 2010-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003025859A1 (fr) | Dispositif d'interface | |
JP4481663B2 (ja) | 動作認識装置、動作認識方法、機器制御装置及びコンピュータプログラム | |
US6686844B2 (en) | Human interface system using a plurality of sensors | |
JP4491495B2 (ja) | ポインティングデバイスに基づくユーザインターフェイスシステム | |
CN106695810B (zh) | 一种基于视觉的迎宾服务机器人及工作方法 | |
JPH02120987A (ja) | 監視区域内で視聴する個々のメンバーを特定する画像認識装置 | |
WO2008018423A1 (fr) | Dispositif de vérification d'objets et procédé de vérification d'objets | |
JP4704174B2 (ja) | 状態識別装置、プログラムおよび方法 | |
WO2010010736A1 (ja) | 会議画像生成方法、会議システム、サーバ装置及び会議装置等 | |
CN111163906A (zh) | 能够移动的电子设备及其操作方法 | |
JP3835771B2 (ja) | コミュニケーション装置及びコミュニケーション方法 | |
JP2005131713A (ja) | コミュニケーションロボット | |
JP2020085302A (ja) | 空気調和機、空気調和システム、風向制御方法、及び、プログラム | |
JP2012037102A (ja) | 人物識別装置、人物識別方法及び人物識別装置を備えた空気調和機 | |
JP2005199373A (ja) | コミュニケーション装置及びコミュニケーション方法 | |
CN113566395B (zh) | 空调器及其控制方法及装置、计算机可读存储介质 | |
WO2020241034A1 (ja) | 監視システム及び監視方法 | |
JP2006121264A (ja) | 動画像処理装置、動画像処理方法およびプログラム | |
WO2023148800A1 (ja) | 制御装置、制御システム、制御方法及びプログラム | |
CN112433601A (zh) | 信息处理装置、存储介质及信息处理方法 | |
CN111919250A (zh) | 传达非语言提示的智能助理设备 | |
TW201942800A (zh) | 多角度人臉辨識系統及其學習方法與辨識方法 | |
JP4012872B2 (ja) | 情報管理装置、情報管理方法及び情報管理プログラム | |
EP3779777A1 (en) | A method of enrolling a new memebr to a facial image database | |
CN115118536B (zh) | 分享方法、控制设备及计算机可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NO NZ OM PH PL PT RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VC VN YU ZA ZM |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003529410 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006182346 Country of ref document: US Ref document number: 10489301 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase | ||
WWP | Wipo information: published in national office |
Ref document number: 10489301 Country of ref document: US |