WO2014057710A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2014057710A1
Authority
WO
WIPO (PCT)
Prior art keywords
search
unit
search range
information processing
perceptual image
Prior art date
Application number
PCT/JP2013/065667
Other languages
French (fr)
Japanese (ja)
Inventor
Kenichi Kitatani
Original Assignee
NEC CASIO Mobile Communications, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC CASIO Mobile Communications, Ltd.
Publication of WO2014057710A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4058 Detecting, measuring or recording for evaluating the nervous system for evaluating the central nervous system
    • A61B 5/4064 Evaluating the brain
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Abstract

A measurement unit (110) measures electromagnetic properties of a user's brain. An image generation unit (120) analyzes the electromagnetic properties which the measurement unit (110) has measured, and, on the basis of the result of the analysis, generates a perception image. An object specification unit (130) extracts a characteristic quantity of the perception image which the image generation unit (120) has generated, and specifies, on the basis of the extracted characteristic quantity, an object which the perception image denotes. A search range determination unit (140) determines a search range on the basis of the object which the object specification unit (130) has specified. A search unit (150), using the perception image, carries out a search for information relating to this perception image in the search range which the search range determination unit (140) has determined.

Description

Information processing device
The present invention relates to an information processing apparatus, an information processing method, and a program for processing information.
A technique has been proposed for detecting human brain waves and, based on the detection result, converting a recalled image into a perceptual pattern for output (see Patent Document 1). Using this technique, techniques for reproducing, with an electronic device, an image or the like that a person pictures in his or her mind from measured electroencephalograms and displaying it on a display have in recent years been rapidly becoming established.
With such a technique, information on the Web can be searched using the reproduced image or the like as a search key, which could make the search process dramatically more efficient. For example, by recalling the scenery of a place the user once visited, it becomes possible to search for the place name, the address, directions to the place, and so on from the recalled scenery without entering any keywords.
Patent Document 1: Japanese Patent Laid-Open No. 61-15229
In recent years, a wide variety of information is scattered across the Internet, and much of it is personal information. Therefore, if information is searched using the above-described technique, another person's personal information can also be obtained easily, which may run counter to the protection of privacy.
An object of the present invention is to provide an information processing apparatus, an information processing method, and a program that solve the above-described problem.
The information processing apparatus of the present invention comprises:
a measurement unit that measures electromagnetic characteristics of a user's brain;
an image generation unit that analyzes the electromagnetic characteristics measured by the measurement unit and generates a perceptual image based on the result of the analysis;
an object specifying unit that extracts a feature amount of the perceptual image generated by the image generation unit and specifies an object indicated by the perceptual image based on the extracted feature amount;
a search range determination unit that determines a search range based on the object specified by the object specifying unit; and
a search unit that, within the search range determined by the search range determination unit, searches for information related to the perceptual image using the perceptual image.
The information processing method of the present invention is an information processing method for processing information, comprising:
a process of measuring electromagnetic characteristics of a user's brain;
a process of analyzing the measured electromagnetic characteristics and generating a perceptual image based on the result of the analysis;
a process of extracting a feature amount of the generated perceptual image;
a process of specifying an object indicated by the perceptual image based on the extracted feature amount;
a process of determining a search range based on the specified object; and
a process of searching, within the determined search range, for information related to the perceptual image using the perceptual image.
The program of the present invention causes a computer to execute:
a procedure of measuring electromagnetic characteristics of a user's brain;
a procedure of analyzing the measured electromagnetic characteristics and generating a perceptual image based on the result of the analysis;
a procedure of extracting a feature amount of the generated perceptual image;
a procedure of specifying an object indicated by the perceptual image based on the extracted feature amount;
a procedure of determining a search range based on the specified object; and
a procedure of searching, within the determined search range, for information related to the perceptual image using the perceptual image.
As described above, according to the present invention, privacy can be protected.
FIG. 1 is a diagram showing an embodiment of an information processing apparatus according to the present invention. FIG. 2 is a diagram showing an example of the specific information stored in the object specifying unit shown in FIG. 1. FIG. 3 is a diagram showing an example of the search information stored in the search range determination unit shown in FIG. 1. FIG. 4 is a flowchart for explaining an example of an information processing method in the information processing apparatus shown in FIG. 1.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
FIG. 1 is a diagram showing an embodiment of an information processing apparatus according to the present invention.
As shown in FIG. 1, an information processing apparatus 100 according to the present embodiment includes a measurement unit 110, an image generation unit 120, an object specifying unit 130, a search range determination unit 140, a search unit 150, and an output unit 160. The information processing apparatus 100 may be a mobile terminal such as a mobile phone or smartphone, a game machine, a desktop, tablet, or notebook PC (Personal Computer), another communication device, or a medical device.
The measurement unit 110 measures electromagnetic characteristics of regions such as the visual association cortex in the user's brain. Here, the electromagnetic characteristics are quantities representing electrical characteristics in an electric field and magnetic characteristics in a magnetic field (for example, electromagnetic waves). The measurement unit 110 outputs the measurement result to the image generation unit 120.
The image generation unit 120 analyzes the measurement result output from the measurement unit 110 and, based on the result of the analysis, generates (reproduces) a perceptual image, such as characters, an image, or video that the user has perceived in the past. The image generation unit 120 outputs the generated perceptual image to the object specifying unit 130 and the search unit 150.
The processing of the measurement unit 110 and the image generation unit 120 may be performed using publicly disclosed techniques for reconstructing an image from human brain activity (for example, the technique described in Reference 1: Neuron, Volume 60, Issue 5, pp. 915-929, 10 December 2008, "Visual Image Reconstruction from Human Brain Activity using a Combination of Multiscale Local Image Decoders") or techniques for digitizing, extracting, and restoring memory information from an animal's brain (for example, the technique described in Reference 2: Theodore W. Berger, Robert E. Hampson, Dong Song, Anushka Goonawardena, Vasilis Z. Marmarelis and Sam A. Deadwyler, "A cortical neural prosthesis for restoring and enhancing memory", Journal of Neural Engineering 8 (2011) 046017 (11pp)).
The object specifying unit 130 extracts a feature amount of the perceptual image generated by the image generation unit 120. For example, when the perceptual image is a page image of a newspaper, magazine, poster, website, or the like, the feature amount is information indicating features of that image, such as the aspect ratio of the page and the layout of characters, images, and margins. When the perceptual image is a video image of a movie, television program, live performance, sports, or the like, the feature amount is information indicating features of that image, such as the persons and objects appearing in at least some scenes and their combinations, composition, arrangement, colors, shapes, and movements, the colors, shapes, and layout of characters and images, and how these change over time. When the perceptual image is an image of a whole face, part of a face, a whole body, part of a body, or a combination of their colors and sizes, or of height, build, voice (a voiceprint or the like), footsteps, works created by the person (drawings, characters, handwriting, and the like), clothing, accessories, or belongings (bags, vehicles, electronic devices, and the like), the feature amount is likewise information indicating features of that image.
The object specifying unit 130 also specifies the object indicated by the perceptual image based on the extracted feature amount. For example, the object specifying unit 130 may store feature amounts and objects in association with each other in advance as specific information, and specify the object from the specific information based on the extracted feature amount.
FIG. 2 is a diagram showing an example of the specific information stored in the object specifying unit 130 shown in FIG. 1.
As shown in FIG. 2, the object specifying unit 130 shown in FIG. 1 stores feature amounts and information indicating objects in association with each other. For example, the feature amount "A001" and the object "person's face" are stored in association with each other. This indicates that, when the extracted feature amount is "A001", the object specified by the object specifying unit 130 is "a person's face". Note that, although the feature amount is shown as "A001" in FIG. 2, this notation is used only for convenience of explanation and differs from an actual value.
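Purely as an illustrative sketch (the embodiment does not prescribe any data format, and apart from the "A001" example of FIG. 2 the entries and names below are invented), the specific information can be modeled as a lookup table from feature amounts to object labels:

```python
# Hypothetical sketch of the "specific information" of FIG. 2:
# feature amounts (here simple string codes) associated with object labels.
SPECIFIC_INFO = {
    "A001": "person's face",
    "A002": "page image (newspaper, magazine, poster, website)",   # invented entry
    "A003": "video image (movie, TV program, live, sports)",       # invented entry
}

def specify_object(feature_amount: str):
    """Return the object associated with the extracted feature amount,
    or None if the feature amount is not registered."""
    return SPECIFIC_INFO.get(feature_amount)

# As in the example of FIG. 2, a feature amount of "A001" is specified
# as "person's face".
assert specify_object("A001") == "person's face"
```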
The object specifying unit 130 notifies the search range determination unit 140 of the specified object.
The search range determination unit 140 determines the search range for the perceptual image based on the object notified by the object specifying unit 130. For example, the search range determination unit 140 may store objects and search ranges in association with each other in advance as search information, and determine the search range from the search information based on the object notified by the object specifying unit 130.
FIG. 3 is a diagram showing an example of the search information stored in the search range determination unit 140 shown in FIG. 1.
As shown in FIG. 3, the search range determination unit 140 shown in FIG. 1 stores information indicating objects and search ranges in association with each other. As the search range, a range to be searched and a range to be excluded from the search are stored. For example, the object "person's face" is stored in association with "blogs, articles" as the range to be searched and "images on SNS and the like" as the range to be excluded. This indicates that, when the object notified by the object specifying unit 130 is "a person's face", the search range determination unit 140 sets the range of "blogs, articles" as the search target and does not set the range of "images on SNS and the like" as the search target. As the search range, only one of the range to be searched and the range to be excluded may be stored.
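Likewise, the search information of FIG. 3 can be sketched as an association from an object to a range to be searched and a range to be excluded; the fallback behaviour and names below are assumptions made only for illustration:

```python
# Hypothetical sketch of the "search information" of FIG. 3.
SEARCH_INFO = {
    "person's face": {
        "include": ["blogs", "articles"],           # range to be searched
        "exclude": ["images on SNS and the like"],  # range excluded from the search
    },
}

def determine_search_range(obj: str) -> dict:
    """Return the include/exclude ranges stored for the specified object.
    An object with no entry falls back to an unrestricted range (an assumption)."""
    return SEARCH_INFO.get(obj, {"include": ["all sources"], "exclude": []})
```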
The search range determination unit 140 notifies the search unit 150 of the determined search range.
The search unit 150 performs a search using the perceptual image output from the image generation unit 120, within the search range notified by the search range determination unit 140. That is, within that search range, the search unit 150 searches for information related to the perceptual image using the perceptual image as a search key. A general image search may be used as the search method. The search unit 150 performs the search, for example, against a server connected to the information processing apparatus 100 over the Internet.
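A minimal sketch of how the search unit might pass the perceptual image and the determined range to a search server follows; the endpoint URL, parameter names, and the use of the requests library are assumptions for illustration and are not defined by the patent:

```python
import requests  # assumed HTTP client; any equivalent could be used


def search_by_perceptual_image(image_bytes: bytes, search_range: dict,
                               server_url: str = "https://example.com/image-search"):
    """Send the perceptual image as the search key, together with the
    include/exclude ranges, to a search server and return its results."""
    response = requests.post(
        server_url,
        files={"query_image": image_bytes},
        data={
            "include": ",".join(search_range["include"]),
            "exclude": ",".join(search_range["exclude"]),
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()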
The output unit 160 outputs the result of the search performed by the search unit 150. Here, the output unit 160 may be a display that displays the result of the search performed by the search unit 150.
An information processing method in the information processing apparatus 100 shown in FIG. 1 will be described below.
FIG. 4 is a flowchart for explaining an example of the information processing method in the information processing apparatus 100 shown in FIG. 1.
First, in step 1, the measurement unit 110 measures the electromagnetic characteristics of the user's brain, and in step 2, the image generation unit 120 analyzes the measured electromagnetic characteristics. In step 3, the image generation unit 120 generates a perceptual image based on the result of the analysis and outputs the generated perceptual image to the object specifying unit 130.
Subsequently, in step 4, the object specifying unit 130 extracts a feature amount from the perceptual image output from the image generation unit 120, and in step 5, specifies the object indicated by the perceptual image based on the extracted feature amount. The object specifying unit 130 notifies the search range determination unit 140 of the specified object.
Then, in step 6, the search range determination unit 140 determines the search range for the perceptual image based on the object notified by the object specifying unit 130, and notifies the search unit 150 of the determined search range.
In step 7, the search unit 150 searches for information related to the perceptual image using the perceptual image output from the image generation unit 120, within the search range notified by the search range determination unit 140. When the search is completed, in step 8, the output unit 160 outputs the search result.
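The eight steps of FIG. 4 can be summarized by the following sketch; each called function is a hypothetical placeholder standing in for the corresponding unit of FIG. 1, not an actual implementation:

```python
def information_processing(user):
    # Step 1: measurement unit 110 measures the electromagnetic characteristics.
    measurement = measure_brain_characteristics(user)                   # placeholder
    # Steps 2-3: image generation unit 120 analyzes them and generates a perceptual image.
    perceptual_image = generate_perceptual_image(analyze(measurement))  # placeholders
    # Steps 4-5: object specifying unit 130 extracts a feature amount and specifies the object.
    obj = specify_object(extract_feature_amount(perceptual_image))      # placeholder + earlier sketch
    # Step 6: search range determination unit 140 determines the search range.
    search_range = determine_search_range(obj)                          # earlier sketch
    # Step 7: search unit 150 searches within that range using the perceptual image.
    results = search_by_perceptual_image(perceptual_image, search_range)  # earlier sketch
    # Step 8: output unit 160 outputs the search result.
    return results
```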
A more specific example is described below.
For example, assume that the user memorizes the face of a person A whom the user happens to pass on the street, and a search related to person A is performed based on that visual (perceptual) image.
The measurement unit 110 measures the electromagnetic characteristics of the user's brain, and the image generation unit 120 generates a perceptual image of the face of person A. The object specifying unit 130 extracts a feature amount from the perceptual image and specifies that the object is a person's face.
Then, because the object is a person's face, the search range determination unit 140 determines that the search range is a predetermined set of information sources. For example, the search range determination unit 140 excludes images including the face of person A registered on an SNS (Social Networking Service) from the search target (even if the user is logged in to the SNS), and sets as the search target images including the face of person A posted on person A's publicly available blog and images including the face of person A appearing in published articles.
The search unit 150 performs the search related to the perceptual image in accordance with the search range (search target) determined by the search range determination unit 140.
When generating a perceptual image, the image generation unit 120 may also attach to the perceptual image tag information indicating that the perceptual image was obtained by measuring brain waves. In that case, the search range determination unit 140 may determine the search range using the tag information. That is, when the tag information is attached to the perceptual image, the search range determination unit 140 may make the search range narrower (more restricted) than when the tag information is not attached to the perceptual image. In this case, the search range determination unit 140 may also use the result of specification by the object specifying unit 130 in determining the search range. Here, the case where the tag information is not attached to the perceptual image corresponds not to the case where the object specifying unit 130 specifies the object based on the perceptual image generated by the image generation unit 120, but to the case where the object is specified based on information other than the perceptual image generated by the image generation unit 120, for example, information entered by the user's operation.
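One possible reading of this tag mechanism, sketched with invented names (the patent does not fix a data structure for the tag): the perceptual image carries a flag marking it as brain-measurement derived, and the search range is narrowed whenever that flag is set.

```python
from dataclasses import dataclass


@dataclass
class PerceptualImage:
    pixels: bytes
    from_brain_measurement: bool = False  # tag information added by the image generation unit


def determine_range_with_tag(image: PerceptualImage, obj: str) -> dict:
    """Narrow the range for tagged (brain-derived) images; otherwise use
    the ordinary object-based determination (see the earlier sketch)."""
    narrowed = dict(determine_search_range(obj))
    if image.from_brain_measurement:
        # Restrict further, e.g. by also excluding community-service sources.
        narrowed["exclude"] = narrowed["exclude"] + ["images on SNS and the like"]
    return narrowed
```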
In this way, by simply restricting searches that use a perceptual image generated from brain measurement, it is not necessary to carry out feature amount extraction or object specification, which can require sophisticated processing, so that faster processing and lower cost of the related devices can be achieved.
Alternatively, the information processing apparatus 100 may transmit the perceptual image generated by the image generation unit 120 and object information indicating the object specified by the object specifying unit 130 to a server provided outside the information processing apparatus 100 (for example, the SNS server described below), and cause that server to perform the processing of the search range determination unit 140 and the search unit 150. In such a case, for example, within a specific community service such as an SNS, if member B of the SNS searches for member C using member C's face as a clue but cannot find member C within the SNS, the SNS server may record a search history indicating that member B searched for member C. Thereafter, when member C searches for member B using member B's face as a clue, the SNS server may perform control so that images including member B's face are included in the search target, based on the recorded search history. In this case, because the search is performed within the same community service, there is no particular need to restrict the search range. When such a member search is performed, a member ID (user identification information) that can identify the member needs to be transmitted when the information used for the search (for example, the perceptual image and the object information described above) is transmitted from the information processing apparatus 100 to the server. The transmitted member ID is recorded in the server as the search history and used at the time of the search.
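The server-side behaviour described above (record who searched for whom, and lift the restriction once the interest turns out to be mutual) could look roughly like the following; this is a simplified sketch in which the searched-for member is assumed to be identifiable by a member ID, and the class and method names are invented:

```python
class SNSSearchServer:
    """Hypothetical SNS-side logic: record the search history per member ID and
    allow an unrestricted range when two members have searched for each other."""

    def __init__(self):
        self.search_history = set()  # recorded pairs (searcher_id, target_id)

    def determine_range(self, searcher_id: str, target_id: str) -> dict:
        self.search_history.add((searcher_id, target_id))
        if (target_id, searcher_id) in self.search_history:
            # Mutual interest: the target has previously searched for the searcher,
            # so images within the SNS need not be excluded.
            return {"include": ["images on SNS and the like", "blogs", "articles"],
                    "exclude": []}
        return {"include": ["blogs", "articles"],
                "exclude": ["images on SNS and the like"]}


# Example: member B searches for member C first; later, when member C searches
# for member B, the range is no longer restricted.
server = SNSSearchServer()
server.determine_range("B", "C")          # restricted; history (B, C) recorded
scope = server.determine_range("C", "B")  # unrestricted, because (B, C) is recorded
assert "images on SNS and the like" in scope["include"]
```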
In this way, when members who are interested in each other search for each other, it can be judged that there is no need to protect privacy, and even searches that use an image generated from a perceptual image can be performed mutually, which promotes the use of the community service.
In the present invention described above, by determining the search range according to the object, it is possible to prevent a person's personal information from being obtained improperly.
Note that the processing performed by each component of the information processing apparatus 100 described above may be carried out by logic circuits produced according to the purpose. Alternatively, a computer program (hereinafter referred to as a program) in which the processing contents are described as procedures may be recorded on a recording medium readable by the information processing apparatus 100, and the program recorded on the recording medium may be read into and executed by the information processing apparatus 100. The recording medium readable by the information processing apparatus 100 refers to removable recording media such as a floppy (registered trademark) disk, a magneto-optical disk, a DVD, and a CD, as well as a memory such as a ROM or a RAM and an HDD built into the information processing apparatus 100. The program recorded on the recording medium is read by a CPU (not shown) provided in the information processing apparatus 100, and processing similar to that described above is performed under the control of the CPU. Here, the CPU operates as a computer that executes the program read from the recording medium on which the program is recorded.
A part or all of the above embodiment can also be described as in the following supplementary notes, but is not limited to the following.
(Supplementary note 1) An information processing apparatus comprising:
a measurement unit that measures electromagnetic characteristics of a user's brain;
an image generation unit that analyzes the electromagnetic characteristics measured by the measurement unit and generates a perceptual image based on the result of the analysis;
an object specifying unit that extracts a feature amount of the perceptual image generated by the image generation unit and specifies an object indicated by the perceptual image based on the extracted feature amount;
a search range determination unit that determines a search range based on the object specified by the object specifying unit; and
a search unit that, within the search range determined by the search range determination unit, searches for information related to the perceptual image using the perceptual image.
(Supplementary note 2) The information processing apparatus according to supplementary note 1, wherein the object specifying unit stores feature amounts and objects in association with each other in advance as specific information, and specifies the object from the specific information based on the extracted feature amount.
(Supplementary note 3) The information processing apparatus according to supplementary note 1 or 2, wherein the search range determination unit stores objects and search ranges in association with each other in advance as search information, and determines the search range from the search information based on the object specified by the object specifying unit.
(Supplementary note 4) The information processing apparatus according to any one of supplementary notes 1 to 3, wherein the search unit performs the search against a server connected to the information processing apparatus.
(Supplementary note 5) The information processing apparatus according to any one of supplementary notes 1 to 4, further comprising an output unit that outputs the result of the search performed by the search unit.
(Supplementary note 6) The information processing apparatus according to any one of supplementary notes 1 to 5, wherein, when the object specifying unit has specified the object based on the perceptual image generated by the image generation unit, the search range determination unit restricts the search range more than when the object has been specified based on information other than the perceptual image generated by the image generation unit.
(Supplementary note 7) A communication system comprising the information processing apparatus according to any one of supplementary notes 1 to 6 and a server, wherein
the image generation unit transmits the generated perceptual image to the server,
the object specifying unit transmits object information indicating the specified object to the server, and
the server determines a search range based on the object indicated by the object information transmitted from the object specifying unit, searches, within the determined search range, for information related to the perceptual image using the perceptual image transmitted from the image generation unit, records a search history of the search, and thereafter, when a user corresponding to the searched information performs a search for the user who carried out the search recorded in the search history, does not restrict the search range.
(Supplementary note 8) An information processing method for processing information, comprising:
a process of measuring electromagnetic characteristics of a user's brain;
a process of analyzing the measured electromagnetic characteristics and generating a perceptual image based on the result of the analysis;
a process of extracting a feature amount of the generated perceptual image;
a process of specifying an object indicated by the perceptual image based on the extracted feature amount;
a process of determining a search range based on the specified object; and
a process of searching, within the determined search range, for information related to the perceptual image using the perceptual image.
(Supplementary note 9) A program for causing a computer to execute:
a procedure of measuring electromagnetic characteristics of a user's brain;
a procedure of analyzing the measured electromagnetic characteristics and generating a perceptual image based on the result of the analysis;
a procedure of extracting a feature amount of the generated perceptual image;
a procedure of specifying an object indicated by the perceptual image based on the extracted feature amount;
a procedure of determining a search range based on the specified object; and
a procedure of searching, within the determined search range, for information related to the perceptual image using the perceptual image.
Although the present invention has been described above with reference to the embodiment, the present invention is not limited to the above embodiment. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
This application claims priority based on Japanese Patent Application No. 2012-225941 filed on October 11, 2012, the entire disclosure of which is incorporated herein.

Claims (7)

  1.  An information processing apparatus comprising:
      a measurement unit that measures electromagnetic characteristics of a user's brain;
      an image generation unit that analyzes the electromagnetic characteristics measured by the measurement unit and generates a perceptual image based on the result of the analysis;
      an object specifying unit that extracts a feature amount of the perceptual image generated by the image generation unit and specifies an object indicated by the perceptual image based on the extracted feature amount;
      a search range determination unit that determines a search range based on the object specified by the object specifying unit; and
      a search unit that, within the search range determined by the search range determination unit, searches for information related to the perceptual image using the perceptual image.
  2.  The information processing apparatus according to claim 1, wherein the object specifying unit stores the feature amount and the object in association with each other in advance as specific information, and specifies the object from the specific information based on the extracted feature amount.
  3.  The information processing apparatus according to claim 1 or 2, wherein the search range determination unit stores the object and the search range in association with each other in advance as search information, and determines the search range from the search information based on the object specified by the object specifying unit.
  4.  The information processing apparatus according to any one of claims 1 to 3, wherein the search unit performs the search on a server connected to the information processing apparatus.
  5.  The information processing apparatus according to any one of claims 1 to 4, further comprising an output unit that outputs a result of the search performed by the search unit.
  6.  An information processing method for processing information, the method comprising:
     measuring electromagnetic characteristics of a user's brain;
     analyzing the measured electromagnetic characteristics and generating a perceptual image based on a result of the analysis;
     extracting a feature amount of the generated perceptual image;
     specifying an object indicated by the perceptual image based on the extracted feature amount;
     determining a search range based on the specified object; and
     searching, in the determined search range, for information related to the perceptual image using the perceptual image.
  7.  A program for causing a computer to execute:
     a procedure of measuring electromagnetic characteristics of a user's brain;
     a procedure of analyzing the measured electromagnetic characteristics and generating a perceptual image based on a result of the analysis;
     a procedure of extracting a feature amount of the generated perceptual image;
     a procedure of specifying an object indicated by the perceptual image based on the extracted feature amount;
     a procedure of determining a search range based on the specified object; and
     a procedure of searching, in the determined search range, for information related to the perceptual image using the perceptual image.
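For illustration only, claims 2 and 3 describe two pre-stored associations: feature amounts mapped to objects ("specific information") and objects mapped to search ranges ("search information"). The following is a minimal sketch under assumed table contents and an assumed nearest-feature matching rule; none of these values or names come from the application.

```python
# Hedged sketch of the lookups in claims 2 and 3; table contents and the
# nearest-neighbour matching rule are assumptions made for this example.
SPECIFIC_INFORMATION = {      # feature amount -> object (claim 2)
    (0.9, 0.1): "person",
    (0.2, 0.8): "building",
}
SEARCH_INFORMATION = {        # object -> search range (claim 3)
    "person": "SNS sites",
    "building": "map sites",
}

def specify_object(feature):
    # Choose the stored feature amount closest to the extracted one.
    nearest = min(SPECIFIC_INFORMATION,
                  key=lambda f: sum((a - b) ** 2 for a, b in zip(f, feature)))
    return SPECIFIC_INFORMATION[nearest]

def determine_search_range(obj):
    return SEARCH_INFORMATION.get(obj)   # None would mean "do not limit the range"

# Example: a feature close to (0.9, 0.1) resolves to "person" and thus to SNS sites.
print(determine_search_range(specify_object((0.85, 0.15))))
```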
PCT/JP2013/065667 2012-10-11 2013-06-06 Information processing device WO2014057710A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-225941 2012-10-11
JP2012225941 2012-10-11

Publications (1)

Publication Number Publication Date
WO2014057710A1 true WO2014057710A1 (en) 2014-04-17

Family

ID=50477188

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/065667 WO2014057710A1 (en) 2012-10-11 2013-06-06 Information processing device

Country Status (1)

Country Link
WO (1) WO2014057710A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007004521A1 (en) * 2005-06-30 2007-01-11 Olympus Corporation Marker specification device and marker specification method
JP2008102594A (en) * 2006-10-17 2008-05-01 Fujitsu Ltd Content retrieval method and retrieval device
WO2013018515A1 (en) * 2011-07-29 2013-02-07 Necカシオモバイルコミュニケーションズ株式会社 Information processing device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YOICHI MIYAWAKI ET AL.: "Visual Image Reconstruction from Human Brain Activity using a Combination of Multiscale Local Image Decoders", NEURON, vol. 60, no. 5, 11 December 2008, pages 915 - 929, Retrieved from the Internet <URL:http://download.cell.com/neuron/pdf/PIIS0896627308009586.pdf?intermediate=true> [retrieved on 20120815] *

Similar Documents

Publication Publication Date Title
US11409791B2 (en) Joint heterogeneous language-vision embeddings for video tagging and search
CA2827611C (en) Facial detection, recognition and bookmarking in videos
JP6759844B2 (en) Systems, methods, programs and equipment that associate images with facilities
US20140095308A1 (en) Advertisement distribution apparatus and advertisement distribution method
WO2018009666A1 (en) Combining faces from source images with target images based on search queries
CN103988202A (en) Image attractiveness based indexing and searching
US8965867B2 (en) Measuring and altering topic influence on edited and unedited media
Cioppa et al. Scaling up SoccerNet with multi-view spatial localization and re-identification
Caporusso Deepfakes for the good: A beneficial application of contentious artificial intelligence technology
CN111198962A (en) Information processing apparatus, system, method, similarity judging method, and medium
Liu et al. RETRACTED: Rolling bearing fault detection approach based on improved dispersion entropy and AFSA optimized SVM
CN112528049B (en) Video synthesis method, device, electronic equipment and computer readable storage medium
CN113705792A (en) Personalized recommendation method, device, equipment and medium based on deep learning model
CN112380537A (en) Method, device, storage medium and electronic equipment for detecting malicious software
WO2014057710A1 (en) Information processing device
US8515183B2 (en) Utilizing images as online identifiers to link behaviors together
JP6178480B1 (en) DATA ANALYSIS SYSTEM, ITS CONTROL METHOD, PROGRAM, AND RECORDING MEDIUM
CN117795502A (en) Evolution of topics in messaging systems
Hung et al. Smart TV face monitoring for children privacy
CN110069649B (en) Graphic file retrieval method, graphic file retrieval device, graphic file retrieval equipment and computer readable storage medium
CN113497953A (en) Music scene recognition method, device, server and storage medium
JP3985826B2 (en) Image search method and apparatus
Khan et al. A model on multiple perspectives of citizens’ trust in using social media for e-government services
CN104156417B (en) Information processing method and equipment
Guntuku et al. Deep representations to model user ‘likes’

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13845493

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13845493

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP