WO2009021125A1 - Method and apparatus for terminating an imaging device based on a location of an object - Google Patents

Method and apparatus for terminating an imaging device based on a location of an object

Info

Publication number
WO2009021125A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
distance
imaging device
profile
detection module
Prior art date
Application number
PCT/US2008/072498
Other languages
English (en)
Inventor
Dan Phillips
Mike Haigh
Original Assignee
Sony Computer Entertainment Europe
Sony Computer Entertainment America Inc.
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe, Sony Computer Entertainment America Inc.
Publication of WO2009021125A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Definitions

  • the invention relates generally to terminating operation of an imaging device and, more particularly, to terminating operation of an imaging device based on a location of an object.
  • a method and apparatus detects an object; detects a measured distance between the object and an imaging device; matches the object with a profile; selects an allowable distance between the object and the imaging device; compares the allowable distance with the measured distance; and selectively reduces operation of the imaging device based on the allowable distance and the measured distance.
  • Figure 1 is a diagram illustrating an environment within which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented;
  • Figure 2 is a simplified block diagram illustrating one embodiment in which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented;
  • Figure 3 is a simplified block diagram illustrating a system, consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object;
  • Figure 4 illustrates an exemplary record consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object;
  • Figure 5 is a flow diagram consistent with one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object.
  • references to "electronic device” include a device such as a personal digital video recorder, digital audio player, gaming console, a set top box, a personal computer, a cellular telephone, a personal digital assistant, a specialized computer such as an electronic interface with an automobile, and the like.
  • references to "user” include an operator of electronic devices.
  • references to “content” include audio streams, images, video streams, photographs, graphical displays, text files, software applications, electronic messages, and the like.
  • “content” also refers to advertisements and programming.
  • FIG. 1 is a diagram illustrating an environment within which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented.
  • the environment includes an electronic device 110, e.g. a computing platform configured to act as a client device, such as a personal digital video recorder, digital audio player, computer, a personal digital assistant, a cellular telephone, a camera device, a set top box, a gaming console; a user interface 115, a network 120, e.g. a local area network, a home network, the Internet; and a server 130, e.g. a computing platform configured to act as a server.
  • the network 120 can be implemented via wireless or wired solutions.
  • one or more user interface 115 components are made integral with the electronic device 110, e.g. keypad and video display screen input and output interfaces in the same housing as personal digital assistant electronics, e.g. as in a Clie® manufactured by Sony Corporation.
  • one or more user interface 115 components, e.g. a keyboard, a pointing device such as a mouse and trackball, a microphone, a speaker, a display, a camera, are physically separate from, and are conventionally coupled to, an electronic device 110.
  • the user uses the interface 115 components to access and control content and applications stored in an electronic device 110, server 130, or a remote storage device (not shown) coupled via the network 120.
  • embodiments for selectively terminating an imaging device based on a location of an object as described below are executed by an electronic processor in an electronic device 110, in a server 130, or by processors in the electronic device 110 and in the server 130 acting together.
  • the server 130 is illustrated in Figure 1 as a single computing platform, but in other instances it comprises two or more interconnected computing platforms that act as a server.
  • Figure 2 is a simplified diagram illustrating an exemplary architecture in which the method and apparatus for selectively terminating an imaging device based on a location of an object is implemented.
  • the exemplary architecture includes a plurality of electronic devices 110, a server device 130, and a network 120 connecting the electronic devices 110 to a server 130, and connecting each electronic device 110 to each other.
  • the plurality of electronic devices 110 are each configured to include a computer-readable medium 209, such as random access memory, coupled to an electronic processor 208.
  • the processor 208 executes program instructions that are stored in the computer-readable medium 209.
  • a unique user operates each electronic device 110 via an interface 115, as described with reference to Figure 1.
  • the server device 130 includes a processor 211 that is coupled to a computer-readable medium 212.
  • the server device 130 is coupled to one or more additional external or internal devices, such as, without limitation, a secondary data storage element, such as a database 240.
  • the processors 208 and 211 are manufactured by Intel Corporation, of Santa Clara, California. In other instances, other microprocessors are used.
  • the plurality of client devices 110 and the server 130 include instructions for a customized application for selectively terminating operation of an imaging device based on a location of an object.
  • the plurality of computer-readable media 209 and 212 contain, in part, the customized application.
  • the plurality of client devices 110 and the server 130 are configured to receive and transmit electronic messages for use with the customized application.
  • the network 120 is configured to transmit electronic messages for use with the customized application.
  • One or more user applications are stored in memories 209, in memory 212, or a single user application is stored in part in one memory 209 and in part in memory 212.
  • a stored user application, regardless of storage location, is made customizable based on selectively terminating operation of an imaging device based on a location of an object, as determined using embodiments described below.
  • Figure 3 illustrates one embodiment of a system 300 for selectively terminating an imaging device based on a location of an object (a simplified code sketch of this system appears at the end of this description).
  • the system 300 includes an object detection module 310, a distance detection module 320, a storage module 330, an interface module 340, a control module 350, a profile module 360, an object identification module 370, and a termination module 380.
  • the control module 350 communicates with the object detection module 310, the distance detection module 320, the storage module 330, the interface module 340, the profile module 360, the object identification module 370, and the termination module 380.
  • the control module 350 coordinates tasks, requests, and communications between the object detection module 310, the distance detection module 320, the storage module 330, the interface module 340, the profile module 360, the object identification module 370, and the termination module 380.
  • the object detection module 310 detects both living and inanimate objects.
  • the distance detection module 320 detects the distance between the object and the system 300. In one embodiment, the distance detection module 320 uses a camera to detect the distance.
  • the storage module 330 stores a plurality of profiles, wherein each profile is associated with various content and other data associated with the content or a viewer.
  • the profile stores exemplary information as shown in the profiles illustrated in Figure 4.
  • the storage module 330 is located within the server device 130. In another embodiment, portions of the storage module 330 are located within the electronic device 110.
  • the interface module 340 detects the electronic device 110 when the electronic device 110 is connected to the network 120.
  • the interface module 340 detects input from the interface device 115, such as a keyboard, a mouse, a microphone, a still camera, a video camera, and the like.
  • the interface module 340 provides output to the interface device 115, such as a display, speakers, external storage devices, an external network, and the like.
  • the profile module 360 processes profile information related to the specific object. In one embodiment, exemplary profile information is shown within a record illustrated in Figure 4. In one embodiment, each profile corresponds with a particular object that is detected by the object detection module 310. In one embodiment, the object identification module 370 determines a match between the detected object and an object type. In one embodiment, a match between the detected object and the object type is determined by a match between the attributes associated with the detected object and the object type.
  • the information within the profile of the object type and detected object is used to determine the match.
  • the termination module 380 selectively shuts down an imaging device based on various parameters, such as the distance between the detected object and the imaging device, the object type, and the like.
  • the system 300 is configured to match the detected object with an object type. In another embodiment, the system 300 is also configured to detect the distance between the detected object and the imaging device. Further, the termination of the imaging device is initiated based on the object type of the detected object and the distance between the detected object and the imaging device.
  • the system 300 in Figure 3 is shown for exemplary purposes and is merely one embodiment of the methods and apparatuses for selectively terminating an imaging device based on a location of an object. Additional modules may be added to the system 300 without departing from the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object. Similarly, modules may be combined or deleted without departing from the scope of the method and apparatus for selectively terminating an imaging device based on a location of an object.
  • Figure 4 illustrates a simplified record 400 that corresponds to a profile that describes different object types (a simplified code sketch of such a record appears at the end of this description).
  • the record 400 is stored within the storage module 330 and used within the system 300.
  • the record 400 includes an object identification field 405, an object description field 410, an object image field 415, and an object classification field 420.
  • the object identification field 405 identifies a specific object associated with the record 400.
  • the object's name is used as a label for the object identification field 405.
  • the object description field 410 includes a description of the object.
  • where the object is a car, for example, the object description field 410 may include a detailed written description of the car indicating attributes of the car, such as the size of the car, the profile of the car, and the color of the car. Different levels of detail may be included within the object description field 410.
  • a narrative or summary of the object may be included within the object description field 410.
  • the object image field 415 identifies a graphic that describes the object.
  • the graphic is a representation of the object.
  • the graphic is the actual image of the object.
  • the object classification field 420 identifies an object type associated with the particular object.
  • multiple distinct objects may be included within a single object type.
  • different object types include living objects and inanimate objects. Within living objects, there may be sub-types, such as humans and animals. In one embodiment, there may be any number of object types and object sub-types to classify various objects.
  • the flow diagram in Figure 5 illustrates one embodiment of the method and apparatus for selectively terminating an imaging device based on a location of an object. The blocks within the flow diagram can be performed in a different sequence without departing from the spirit of the method and apparatus for selectively terminating an imaging device based on a location of an object. Further, blocks can be deleted, added, or combined without departing from the spirit of the method and apparatus for selectively terminating an imaging device based on a location of an object.
  • the flow diagram in Figure 5 illustrates reducing operation of an imaging device based on multiple parameters according to one embodiment of the invention (a code sketch of this flow appears at the end of this description).
  • an object is detected.
  • the object is detected through the object detection module 310.
  • the object is detected through an imaging device.
  • a distance between the detected object and the imaging device is detected.
  • the distance detection module 320 detects the distance between the detected object and the imaging device.
  • the imaging device is used to detect this distance.
  • an object type is detected based on the object profile.
  • various parameters of the object are detected and matched with a specific object profile.
  • An exemplary object profile is illustrated as a record 400 within Figure 4.
  • a match is performed between the object, as detected within the block 505, and the object profile, as identified within the block 515.
  • an image of the detected object is compared with the image contained within the object image field 415 within the record 400.
  • the parameters of the detected object are compared with the description contained within the object description field 410 within the record 400.
  • a match between the detected object and the object profile may be determined through a match threshold that satisfies a minimum level of matching to proceed.
  • if no match is found between the detected object and an object profile, a predetermined default allowable distance is used within the block 535.
  • the predetermined default allowable distance is two inches. In other embodiments, the predetermined default allowable distance can be any predetermined distance.
  • if a match is found, the allowable distance is determined through the associated object profile within the block 525, as shown within the exemplary record 400.
  • the object classification field 420 of the record 400 provides an allowable distance.
  • the object classification field 420 of the record 400 provides a range of allowable distances.
  • the distance between the detected object and the imaging device is compared with either the predetermined default allowable distance from the block 535 or the allowable distance from the block 525.
  • in block 540, if the detected object is outside the predetermined default allowable distance from the block 535 or the allowable distance from the block 525, then the imaging device continues to operate. Further, detection of the object continues within the block 505. In one embodiment, the same object continues to be detected within the block 505. In another embodiment, a different object is detected within the block 505. In block 540, if the detected object is within the predetermined default allowable distance from the block 535 or the allowable distance from the block 525, then operation of the imaging device is modified within the block 545.
  • operation of the imaging device is reduced. In one embodiment, the output or emissions from the imaging device are reduced. In another embodiment, the output or emissions from the imaging device are terminated.
  • the output or emissions from the imaging device may harm living objects nearby. In other instances, the output or emissions from the imaging device may be annoying or affect the comfort of those nearby.
  • a reduction in output or emissions from the imaging device may be sufficient to ensure safety and comfort for those that are nearby.
  • a termination in output or emissions from the imaging device may be needed to ensure safety and comfort for those that are nearby.
  • the detection of the object continues within the block 505. In one embodiment, the same object continues to be detected within the block 505. In another embodiment, a different object is detected within the block 505.
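
As a rough illustration of the Figure 3 architecture described above, the sketch below composes the modules of system 300 around the control module 350, which coordinates tasks, requests, and communications among the other modules. The class and method names are assumptions introduced for illustration only; the patent does not prescribe any particular implementation.

```python
class ControlModule:
    """Coordinates tasks, requests, and communications among the modules of system 300 (Figure 3)."""

    def __init__(self, object_detection, distance_detection, storage, interface,
                 profile, object_identification, termination):
        # Module references corresponding to modules 310-380 in Figure 3.
        self.object_detection = object_detection              # object detection module 310
        self.distance_detection = distance_detection          # distance detection module 320
        self.storage = storage                                 # storage module 330 (stores the profiles)
        self.interface = interface                             # interface module 340 (devices 110/115, network 120)
        self.profile = profile                                 # profile module 360 (profile information)
        self.object_identification = object_identification     # object identification module 370 (matching)
        self.termination = termination                         # termination module 380 (shuts down the imaging device)

    def handle_detection(self, imaging_device):
        """Route one detection event through the modules."""
        detected = self.object_detection.detect()                      # module 310 detects an object
        distance = self.distance_detection.measure(detected)           # module 320 measures its distance
        profiles = self.storage.load_profiles()                        # module 330 provides stored profiles
        match = self.object_identification.match(detected, profiles)   # module 370 matches the object to a type
        self.termination.apply(imaging_device, match, distance)        # module 380 decides whether to shut down
```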
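
The record 400 of Figure 4, described above, can be pictured as a small data structure holding the object identification, description, image, and classification fields. The following is a minimal, hypothetical sketch; the class name, field names, distance units, and example values are illustrative assumptions, not the patent's required implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ObjectProfile:
    """One profile record describing an object type (cf. record 400 in Figure 4)."""
    object_id: str                     # object identification field 405, e.g. the object's name
    description: str                   # object description field 410: attributes such as size, profile, color
    image_path: Optional[str]          # object image field 415: a representation or actual image of the object
    classification: str                # object classification field 420, e.g. "living/human" or "inanimate"
    allowable_distance_cm: Optional[float] = None  # allowable distance (a range could be stored instead)


# Example record for a living object (values are illustrative only).
person_profile = ObjectProfile(
    object_id="person",
    description="Upright human figure, approximately 1.5-2.0 m tall",
    image_path="profiles/person.png",
    classification="living/human",
    allowable_distance_cm=50.0,
)
```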
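
Below is a minimal sketch of the Figure 5 flow described above: detect an object, detect the distance between the object and the imaging device, match the object with a profile, select an allowable distance (from the matched profile, or a predetermined default when no match is found), compare the distances, and reduce or terminate the device's output when the object is within the allowable distance. The flow, the block numbers, and the two-inch default come from the description above; the device methods, the similarity scoring, the threshold value, and the centimetre units are assumptions, and the profile objects reuse the hypothetical ObjectProfile fields from the record sketch above.

```python
DEFAULT_ALLOWABLE_DISTANCE_CM = 5.08   # "two inches", the predetermined default named in the description
MATCH_THRESHOLD = 0.8                  # assumed minimum level of matching needed to accept a profile


def match_profile(detected_object, profiles):
    """Return the best-matching profile, or None when no profile clears the match threshold."""
    best_profile, best_score = None, 0.0
    for profile in profiles:
        score = detected_object.similarity_to(profile)   # assumed comparison of image/description attributes
        if score > best_score:
            best_profile, best_score = profile, score
    return best_profile if best_score >= MATCH_THRESHOLD else None


def control_imaging_device(device, profiles):
    """One pass of the Figure 5 loop for a single detected object."""
    detected_object = device.detect_object()                 # block 505: object detection
    if detected_object is None:
        return

    measured_cm = device.measure_distance(detected_object)   # distance between the object and the device

    profile = match_profile(detected_object, profiles)        # block 515: identify the object profile
    if profile is not None and profile.allowable_distance_cm is not None:
        allowable_cm = profile.allowable_distance_cm           # block 525: allowable distance from the profile
    else:
        allowable_cm = DEFAULT_ALLOWABLE_DISTANCE_CM           # block 535: predetermined default distance

    if measured_cm <= allowable_cm:                           # block 540: object is within the allowable distance
        device.reduce_output()                                 # block 545: reduce or terminate output/emissions
    # otherwise the imaging device continues to operate and object detection continues (block 505)
```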

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to one embodiment, a method and apparatus detect an object; detect a measured distance between the object and an imaging device; match the object with a profile; select an allowable distance between the object and the imaging device; compare the allowable distance with the measured distance; and selectively reduce operation of the imaging device based on the allowable distance and the measured distance.
PCT/US2008/072498 2007-08-07 2008-08-07 Method and apparatus for terminating an imaging device based on a location of an object WO2009021125A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US96386807P 2007-08-07 2007-08-07
US60/963,868 2007-08-07
US18794108A 2008-08-07 2008-08-07
US12/187,941 2008-08-07

Publications (1)

Publication Number Publication Date
WO2009021125A1 (fr) 2009-02-12

Family

ID=40341742

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/072498 WO2009021125A1 (fr) 2007-08-07 2008-08-07 Method and apparatus for terminating an imaging device based on a location of an object

Country Status (1)

Country Link
WO (1) WO2009021125A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9929357B2 (en) 2014-07-22 2018-03-27 Universal Display Corporation Organic electroluminescent materials and devices

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010040637A1 (en) * 1996-03-28 2001-11-15 Iwao Ishida Detecting device reducing detection errors caused by signals generated other than by direct light
US20030027528A1 (en) * 2001-08-06 2003-02-06 Toshiba Tec Kabushiki Kaisha. Image information input/output device and control system for the same using mobile device
US20050282530A1 (en) * 2002-06-11 2005-12-22 Adam Raff Communications device and method comprising user profiles matching between compatible devices

Similar Documents

Publication Publication Date Title
US20230274537A1 (en) Eye gaze tracking using neural networks
EP1782155B1 Methods and apparatuses for automatically selecting a profile
US20090062943A1 (en) Methods and apparatus for automatically controlling the sound level based on the content
CN110719402A Image processing method and terminal device
US20180108352A1 (en) Robot Interactive Communication System
JP2007520830A Method and apparatus for identifying an opportunity to capture content
CN108257104B Image processing method and mobile terminal
CN108616448B Route recommendation method for information sharing, and mobile terminal
CN112100431B Evaluation method, apparatus and device for an OCR system, and readable storage medium
US20110247054A1 (en) Methods and apparatuses for selecting privileges for use during a data collaboration session
US9204386B2 (en) Method for rule-based context acquisition
US20200380168A1 (en) Image Access Management Device, Image Access Management Method, and Image Access Management System
US20170289676A1 (en) Systems and methods to identify device with which to participate in communication of audio data
CN113190646A Username sample labeling method and apparatus, electronic device, and storage medium
US10984140B2 (en) Method for detecting the possible taking of screenshots
WO2009021125A1 Method and apparatus for terminating an imaging device based on a location of an object
CN113111692A Object detection method and apparatus, computer-readable storage medium, and electronic device
US8001114B2 (en) Methods and apparatuses for dynamically searching for electronic mail messages
CN107609446B Code pattern recognition method, terminal, and computer-readable storage medium
EP3937035A1 Method and apparatus for preventing data tampering, and method and apparatus for detecting data tampering
CN115171718A Method and apparatus for identifying a specific bird species, and storage medium
JP2022016527A Information providing system
US20110040778A1 (en) Methods and apparatuses for dynamically displaying search suggestions
US11599827B2 (en) Method and apparatus for improving the robustness of a machine learning system
CN112926080A Privacy object control method and apparatus, storage medium, and electronic device

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 08797390

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 08797390

Country of ref document: EP

Kind code of ref document: A1