JP2008027348A - Electronic apparatus and control method for the same - Google Patents

Electronic apparatus and control method for the same Download PDF

Info

Publication number
JP2008027348A
JP2008027348A (application number JP2006202021A)
Authority
JP
Japan
Prior art keywords
conversation
step
sentence
keyword
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2006202021A
Other languages
Japanese (ja)
Inventor
Masumi Iida
Yoshinori Kishibe
Tsunesuke Matsubara
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp
Priority to JP2006202021A
Publication of JP2008027348A
Application status: Pending

Links

Images

Abstract

PROBLEM TO BE SOLVED: To increase the convenience of an electronic apparatus that outputs translated documents.

SOLUTION: A portable translator acquires position information, uses it to obtain information about the building at the current position, obtains information about a store at the current position from the building information, generates keywords from the store information, searches a conversation example database (DB) for conversation examples matching the keywords, and presents them as a conversation example list.

COPYRIGHT: (C)2008, JPO&INPIT

Description

  The present invention relates to an electronic device and a control method thereof, and more particularly, to an electronic device that outputs a translation document and a control method thereof.

  Conventionally, various methods have been used to help people with different native languages communicate more smoothly. For example, learning a language other than one's native language or hiring an interpreter are both ways to achieve smooth communication. However, learning a foreign language takes a relatively long time, and hiring an interpreter requires a large amount of money. Another method for smooth communication is to use a translator, and various techniques related to translators have conventionally been disclosed.

  For example, Patent Document 1 discloses a technique for an e-mail creation support system in which personal information about other people is stored in the system and, when a document included in an e-mail is translated, the document is translated according to a translation attribute calculated from an analysis of the personal information of the recipient. That is, this patent document discloses a technique for obtaining a translated document suited to the reader's attributes.

  Patent Document 2 discloses a technique for acquiring position information of the place where the translator is located and setting the language used for translation based on that position information. That is, this patent document discloses a technique for easily setting the language used for translation in a translator.

Patent Document 3 discloses a technique in which a communication support apparatus refers to event information such as a schedule and, when selecting a translation, determines which translation to select based on that event information. That is, this patent document discloses a technique for translating a language input by a user using a translation suited to the situation.
Patent Document 1: JP 2000-10975 A
Patent Document 2: JP 2003-114887 A
Patent Document 3: JP 2005-25380 A

  As described above, conventional translators incorporate various techniques that take user convenience into account, such as outputting a translation suited to the situation for a sentence input by the user, or setting an appropriate language to use without requiring the user to perform a selection operation.

  However, depending on the user's communication ability, the user may be unsure what text to input to the translator in the first place.

  The present invention has been conceived in view of such circumstances, and an object thereof is to improve convenience in an electronic device that outputs a translation document.

  An electronic apparatus according to the present invention is an electronic apparatus provided with means for outputting a translated conversation sentence, and includes: conversation sentence storage means for storing a conversation sentence and a conversation sentence keyword, which is a keyword of the conversation sentence; detailed information acquisition means for acquiring detailed information on the current position; search keyword generation means for generating a keyword for searching for a conversation sentence from the detailed information acquired by the detailed information acquisition means; conversation sentence keyword search means for searching the conversation sentence keywords for a keyword that matches the keyword generated by the search keyword generation means; conversation sentence acquisition means for acquiring, from the conversation sentence storage means, a conversation sentence having the conversation sentence keyword found by the conversation sentence keyword search means; and output means for outputting the conversation sentence acquired by the conversation sentence acquisition means.
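  The claimed processing chain (detailed information on the current position, search keyword generation, keyword matching, and conversation sentence output) can be sketched as follows. All data, names, and the in-memory storage here are illustrative assumptions, not part of the claim.

```python
# Illustrative sketch of the claimed pipeline: detailed information on the
# current position -> search keywords -> keyword match -> conversation
# sentences. All data and names here are assumptions for clarity.

# Conversation sentence storage means: each entry pairs a conversation
# sentence with its conversation sentence keywords.
CONVERSATION_DB = [
    {"sentence": "How much is this?", "keywords": {"store", "shopping"}},
    {"sentence": "Where is the fitting room?", "keywords": {"clothing", "store"}},
    {"sentence": "Is this road one-way?", "keywords": {"road"}},
]

def generate_search_keywords(detailed_info: dict) -> set:
    """Search keyword generation means: derive keywords from the detailed
    information on the current position."""
    return {v.lower() for v in detailed_info.values() if v and v != "unknown"}

def search_conversation_sentences(detailed_info: dict) -> list:
    """Conversation sentence keyword search + acquisition means: return
    sentences whose stored keywords match a generated keyword."""
    keywords = generate_search_keywords(detailed_info)
    return [e["sentence"] for e in CONVERSATION_DB if e["keywords"] & keywords]

# Example: the device is currently inside a clothing store.
matches = search_conversation_sentences({"type": "store", "category": "clothing"})
```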

  In the electronic device according to the present invention, it is preferable that the detailed information acquisition unit includes a current position acquisition unit that acquires position information of the current position, and acquires the detailed information on the current position from the position information acquired by the current position acquisition unit.

  In the electronic device according to the present invention, it is preferable that the conversation sentence storage means includes one or both of: conversation example storage means for storing typical conversation example sentences and their conversation example keywords; and conversation history storage means for storing conversation history sentences of the conversation sentences that have been used and conversation history sentence keywords, which are keywords of those conversation history sentences.

  In the electronic device according to the present invention, it is preferable that the detailed information acquisition unit acquires detailed information of a building corresponding to the current position as detailed information of the current position.

  In the electronic device according to the present invention, it is preferable that the detailed information acquisition unit acquires wide area information of a surrounding area of the current position.

  The electronic device according to the present invention preferably further includes an information reliability acquisition unit that acquires the reliability of the detailed information on the current position acquired by the detailed information acquisition unit, and the output unit preferably weights the conversation sentences acquired by the conversation sentence acquisition unit with the reliability and outputs them in order from the conversation sentence with the highest reliability.

  An electronic device control method according to the present invention is a method for controlling an electronic device provided with means for outputting a translated conversation sentence, and includes: a conversation sentence storage step of storing a conversation sentence and a conversation sentence keyword, which is a keyword of the conversation sentence; a detailed information acquisition step of acquiring detailed information on the current position; a search keyword generation step of generating a keyword for searching for a conversation sentence from the detailed information acquired in the detailed information acquisition step; a conversation sentence keyword search step of searching the conversation sentence keywords for a keyword that matches the keyword generated in the search keyword generation step; a conversation sentence acquisition step of acquiring, from the conversation sentences stored in the conversation sentence storage step, a conversation sentence having the conversation sentence keyword found in the conversation sentence keyword search step; and an output step of outputting the conversation sentence acquired in the conversation sentence acquisition step.

  In the electronic device control method according to the present invention, it is preferable that the detailed information acquisition step includes a current position acquisition step of acquiring position information of the current position, and that the detailed information on the current position is acquired from the position information acquired in the current position acquisition step.

  In the electronic device control method according to the present invention, it is preferable that the conversation sentence storage step includes one or both of: a conversation example storage step of storing standard conversation example sentences and their conversation example keywords; and a conversation history storage step of storing conversation history sentences of the conversation sentences that have been used and conversation history sentence keywords, which are keywords of those conversation history sentences.

  In the electronic device control method according to the present invention, it is preferable that the detailed information acquisition step acquires detailed information on a building corresponding to the current position as detailed information on the current position.

  In the electronic device control method according to the present invention, it is preferable that the detailed information acquisition step acquires wide area information of a surrounding area of the current position.

  The electronic device control method according to the present invention preferably further includes an information reliability acquisition step of acquiring the reliability of the detailed information on the current position acquired in the detailed information acquisition step, and in the output step, the conversation sentences acquired in the conversation sentence acquisition step are preferably weighted with the reliability and output in order from the conversation sentence with the highest reliability.
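  The reliability-weighted ordering in the output step can be sketched as a simple sort by reliability score. The sentences and scores below are invented for illustration.

```python
def order_by_reliability(sentences_with_reliability):
    """Output step sketch: weight each acquired conversation sentence with
    the reliability of its source information and output the most
    reliable sentence first."""
    return [s for s, _ in sorted(sentences_with_reliability,
                                 key=lambda pair: pair[1], reverse=True)]

# Hypothetical sentences tagged with reliability scores in [0.0, 1.0].
ordered = order_by_reliability([
    ("Do you have this in a larger size?", 0.4),
    ("How much is this?", 0.9),
    ("Where is the nearest station?", 0.6),
])
```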

  According to the present invention, the electronic device acquires detailed information on the current position, generates a keyword from the detailed information, and outputs a conversation sentence stored in the conversation sentence storage unit in association with the generated keyword.

  As a result, the electronic device can output conversation sentences suited to the current position, so even a user who is unsure what kind of sentence to input to the translator as the original sentence for translation can be supported appropriately.

  Hereinafter, a portable translator according to an embodiment of the present invention will be described with reference to the drawings. In the following description, the same parts and components are denoted by the same reference numerals. Their names and functions are also the same. Therefore, detailed description thereof will not be repeated.

FIG. 1 is a schematic diagram for explaining functions of the portable translator of the present embodiment.
Referring to FIG. 1, portable translator 1 is used by a user 901, who converses in Japanese, to communicate with a conversation partner 902 who mainly uses English (an example of a language other than Japanese). In this specification, the sentence before translation is referred to as the "original sentence" and the sentence after translation as the "translated sentence". Although English is given here as an example of a language other than Japanese (the language used by the conversation partner 902), the language in which the portable translator according to the present invention outputs a translation is not limited to English.

  The portable translator 1 includes a conversation engine 101 that receives input of information from the user 901 or the conversation partner 902 and outputs information to them, and an information processing engine 102 that controls the operation of the conversation engine 101.

  Specifically, the conversation engine 101 has a Japanese sentence recognition / correction function 101A, an English sentence recognition / correction function 101B, and a conversation example creation function 101C.

  The Japanese sentence recognition/correction function 101A recognizes Japanese character strings and sentences input by the user 901; when input is made by voice, it also recognizes the voice and corrects the recognition result. The English sentence recognition/correction function 101B recognizes English character strings and sentences input by the conversation partner 902; when input is made by voice, it likewise recognizes the voice and corrects the recognition result. The conversation example creation function 101C presents conversation examples to the user 901 based on detailed information about the current position of the portable translator 1, which is obtained from the current position determined by a GPS (Global Positioning System) 910 or the like.

  In this specification, the conversation example creation function 101C of the portable translator 1 will be described in particular. As will be described later, the conversation example creation function 101C of the portable translator 1 is realized by referring as appropriate to a conversation example database (hereinafter, "database" is abbreviated as "DB") 104, a translation/dictionary DB 105, an information table 106, a reference record 107, and a conversation history DB 108.

FIG. 2 is a diagram for explaining the function of the information processing engine 102.
Referring to FIG. 2, when a request to present a conversation example is input from the conversation engine 101, the information processing engine 102 creates a conversation example (list) to be presented based on information from a server 920 on the Internet, information 930 input by the user, the conversation history DB 108, and/or information stored in the main body of the portable translator 1, and returns it to the conversation engine 101.

FIG. 3 is a diagram for explaining the function of the information processing engine 102 in more detail.
Referring to FIG. 3, the information processing engine 102 receives as input data (download data) 920A downloaded from the server 920 on the Internet, input data 930A from the user, data 20A detected by detailed information acquisition means 20 included in the portable translator 1, and accumulated data 108A of the conversation history DB 108. The information processing engine 102 then combines these pieces of information as appropriate, selects an appropriate conversation example in response to the request from the conversation engine 101, and outputs it.

FIG. 4 is a diagram illustrating a hardware configuration of the portable translator 1.
Referring to FIG. 4, the portable translator 1 includes a control unit 10 that controls the overall operation of the portable translator 1. The control unit 10 includes a CPU (Central Processing Unit). The portable translator 1 also includes an input unit 11 for inputting information and a display unit 12 for displaying information. The input unit 11 includes a button, a touch key displayed on the display unit 12, and / or a microphone for inputting sound. The portable translator 1 also includes a D / A converter 13 for outputting voice, an output amplifier 14, and a speaker 15.

  The portable translator 1 also includes a dictionary data storage unit 16 that stores conversation example sentences in a plurality of languages and dictionary data for predetermined language pairs, a voice data storage unit 17 that stores speech data of the conversation example sentences, an information storage unit 18 that stores various information, and a communication unit 19 that communicates with servers on a network such as the Internet and with GPS satellites.

  In this embodiment, the portable translator 1 is described as acquiring its own position information by communicating with GPS satellites, but the method by which the portable translator 1 acquires its own position information is not limited to this. For example, position information may be embedded in the blinking pattern of a light or LED (Light Emitting Diode) installed in a building or the like; the portable translator 1 may then acquire its own position information by receiving the emitted light with the communication unit 19 and analyzing the blinking pattern of the received light.

  Information stored in the information storage unit 18 includes a conversation example DB, a conversation history DB, an information table, and a reference record. The information table includes a building information table, a store information table, and a keyword table, which will be described later. The reference record includes a reliability reference record and a priority conversation pattern reference record to be described later.

  FIG. 5 is a flowchart of processing executed by the control unit 10 when a conversation example is presented on the portable translator 1.

  Referring to FIG. 5, when the user performs an operation on the input unit 11 requesting presentation of a conversation example, in step S101 the control unit 10 causes the communication unit 19 to communicate with a GPS satellite and acquires the position information of the device itself. As described above, the method for acquiring the device's own position information is not limited to communication with a GPS satellite.

  Next, in step S102, the control unit 10 acquires building information at the current position. The processing content of step S102 will be described in detail with reference to FIG. 6, which is a flowchart of this processing subroutine.

  Referring to FIG. 6, in step S201 the control unit 10 connects to a network such as the Internet and acquires, from a server that provides map information on the network, map information centered on the current position acquired in step S101 together with the names of buildings included in the map, and displays the map information on the display unit 12.

  Next, in step S202, the control unit 10 has the user select on the map the place where the device is being used, and receives the input of the selected information. The information selected by the user is, for example, the name of a building on the map.

  Next, in step S203, the control unit 10 determines whether the place (building) specified by the information selected by the user in step S202 corresponds to two or more pieces of building information (information such as stores). If it is determined that they correspond, the two or more pieces of building information are displayed on the display unit 12 in step S204. On the other hand, if not, the control unit 10 advances the process to step S220.

  A predetermined database in the information storage unit 18 stores places or buildings in association with the building information (such as stores) corresponding to them. Specifically, for example, the information storage unit 18 stores a building in association with the names of all stores in that building, or stores the name of a region in association with the names of the buildings in that region. The control unit 10 then makes the determination in step S203 by referring to this predetermined database.
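  The association database described above can be sketched as a simple mapping from a place to its building information entries; the place and store names below are hypothetical examples.

```python
# Sketch of the predetermined database: a place or building is stored in
# association with its building information (stores, buildings). All
# names are hypothetical examples.
PLACE_DB = {
    "Central Tower": ["North Cafe", "Book Corner", "Gift Shop"],
    "Harbor District": ["Central Tower", "Pier Hall"],
}

def lookup_building_info(place: str) -> list:
    """Return the building information entries associated with a place."""
    return PLACE_DB.get(place, [])

def needs_user_selection(place: str) -> bool:
    """Step S203 sketch: does the selected place correspond to two or
    more pieces of building information (triggering the display of
    step S204)?"""
    return len(lookup_building_info(place)) >= 2
```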

  After displaying the building information in step S204, the control unit 10 accepts input of information for selecting one piece of building information from the plurality of building information from the user in step S205, and the process proceeds to step S220. Note that the building information displayed in step S204 may be, for example, names of a plurality of buildings, or names of a plurality of different stores in the same building.

  In step S220, the control unit 10 determines whether building information (a store name) for setting a value in the name field of the building information table, described later, has been obtained by the processing of steps S203 to S205. If it determines that it has been obtained, the process proceeds to step S221; if not, it proceeds to step S222. Specifically, in step S220 the control unit 10 determines that the building information and store name have been obtained when only one piece of building information corresponds to the location specified by the information selected in step S202 and the user inputs information confirming that this building information is correct, or when the user inputs information selecting one piece of building information in step S205.

  Here, the building information table will be described. An example of the building information table is shown in Table 1.

  In the building information table, values for each field of building type, name, and address are defined and stored in association with each other. Specific examples of values stored as building types include, for example, homes (general homes) where ordinary people live, companies, stores, public facilities, roads, and airports. Specific examples of values stored as names include names of people living in the house, company names, store names, names of public facilities, and road names. The value stored as the address is information indicating the address of the corresponding building, store, or house.
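  As a concrete sketch (Table 1 itself is not reproduced in this text), rows of the building information table described above might look like the following; all values are hypothetical.

```python
# Hypothetical rows of the building information table: building type,
# name, and address stored in association with each other.
building_info_table = [
    {"type": "store", "name": "North Cafe", "address": "1-2-3 Example-cho"},
    {"type": "road", "name": "unknown", "address": "Route 1, Example City"},
]

def set_name_unknown(row: dict) -> dict:
    """Step S222 sketch: when no name could be obtained, store the value
    'unknown' in the name field (a copy is returned, leaving the input
    row untouched)."""
    updated = dict(row)
    updated["name"] = "unknown"
    return updated
```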

  In step S221, the control unit 10 stores, as the value of the name field of the building information table, the building information and store name acquired through steps S203 to S205 when two or more pieces of building information exist at the location selected in step S202, or the single piece of building information and store name when only one exists at that location, and then advances the process to step S206.

  On the other hand, in step S222, the control unit 10 stores the value “unknown” in the “name” field of the building information table, and advances the process to step S206.

  In step S206, the control unit 10 acquires an address corresponding to the information selected in step S202 from the above-described server on the Internet, and advances the process to step S207.

  Referring to FIG. 7, in step S207 the control unit 10 determines whether an address was acquired by the processing of step S206. Specifically, in step S207 the control unit 10 determines whether address data exists that accompanies the location on the map selected by the user in step S202, on the map matching the current position information (longitude, latitude) presented in step S201 (that is, whether there is data indicating that the data specifies the address). If the control unit 10 determines that such data exists, the process proceeds to step S208; if it determines that it does not, the process proceeds to step S212.

  In step S208, the control unit 10 sets the address data determined to exist in step S207 as the value of the address field of the building information table. When the user has selected a place (building) corresponding to a plurality of pieces of building information in step S202, the control unit 10 sets, in step S208, not only the address but also information indicating the detailed location corresponding to the building information (for example, information indicating the floor number) as accompanying information. As information specifying the floor of the location in a building, the communication unit 19 can receive, in addition to the information received from GPS, signals from sensors installed in the building. Specifically, the control unit 10 can identify the distance between the portable translator 1 and each of a plurality of sensors installed in the building by analyzing the signals received from them by the communication unit 19, and can thereby identify where on which floor the device is located. In step S208, the control unit 10 can also add this information to the address field.
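  The floor identification from multiple in-building sensors can be sketched as choosing the floor of the nearest sensor; how distance is estimated from each received signal is abstracted away, and all values are hypothetical.

```python
def identify_floor(sensor_distances):
    """Floor identification sketch: given (floor, estimated distance)
    pairs derived from the signals of sensors installed in the building,
    choose the floor of the nearest sensor. How distance is estimated
    from each received signal is abstracted away here."""
    floor, _ = min(sensor_distances, key=lambda pair: pair[1])
    return floor

# Hypothetical sensors on floors 2 and 3, distances in meters.
floor = identify_floor([(2, 14.5), (3, 3.2), (3, 6.8)])
```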

  Next, in step S209, the control unit 10 checks whether information related to the address set in step S208 exists on a server on the Internet, and the process proceeds to step S211. Specifically, in step S209 the control unit 10 searches for information stored on servers on the Internet, for example by using a search engine with the address set in step S208 or the store name set in step S221 as a keyword.

  In step S211, the control unit 10 determines whether there is information regarding the address. If it determines that there is, the process proceeds to step S215; if not, the process proceeds to step S214. Specifically, in step S211 the control unit 10 determines whether there is information about the address by checking whether a link to a homepage, such as a store's homepage, is set for the address selected by the user in step S202, or whether searching a facility database (a database containing facility (store) names, addresses, and homepage addresses) with the address as a keyword yields the homepage address corresponding to the facility name as a search result.

  In step S212, the control unit 10 determines whether or not the address selected by the user in step S202 corresponds to a road in the map information provided on the Internet or the like. If so, the process proceeds to step S213. If not, the process proceeds to step S214.

  In step S213, the control unit 10 sets the address acquired in step S206 in the address field of the building information table described above, sets the value "road" in the type field corresponding to the set address, sets the value "unknown" in the name field corresponding to the address, and returns the process to step S102.

  In step S214, the control unit 10 sets a value of “unknown” in the type and name fields corresponding to the building information set in step S208 in the building information table, and returns the process to step S102.

  In step S215, the control unit 10 searches the information acquired as the search result in step S209 for information appropriate as the value of the type field of the building information table, and in step S216 determines whether such information is included. Specifically, when there is a rule that the homepage source contains text matching the contents stored in the type field of the conversation example DB and conversation history DB described later (department store / supermarket / ... etc.), or a rule that the homepage creation format includes "type: type name" in the source, the control unit 10 determines whether data conforming to those rules is included in the homepage source. The control unit 10 then advances the process to step S217 if it determines that such data is included, and to step S218 if not.
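  The two rules described (a "type: type name" marker in the source, or text matching a known type name) can be sketched with simple string matching. The page text and the list of known types are assumptions for illustration.

```python
import re

# Hypothetical list of type names also stored in the conversation DBs.
KNOWN_TYPES = {"department store", "supermarket", "restaurant"}

def extract_building_type(page_source: str):
    """Steps S215-S216 sketch: look for a 'type: type name' marker in the
    homepage source, or for text matching a known type name. Returns the
    type string, or None when nothing is found (leading to 'unknown')."""
    marker = re.search(r"type:\s*([\w ]+)", page_source, re.IGNORECASE)
    if marker:
        return marker.group(1).strip()
    lowered = page_source.lower()
    for type_name in KNOWN_TYPES:
        if type_name in lowered:
            return type_name
    return None
```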

  In step S217, the control unit 10 sets the information obtained as the search result in step S215 as the value of the type field corresponding to the address set in step S208 in the building information table, and returns the process to step S102.

  On the other hand, in step S218, the control unit 10 sets "unknown" as the value of the type field corresponding to the address set in step S208 in the building information table, and returns the process to step S102.

  Referring to FIG. 5 again, through the processing described with reference to FIGS. 6 and 7, information on the building (or store) where the portable translator 1 is currently located is obtained in step S102 as a building information table such as the one shown in Table 1.

  The control unit 10 then proceeds with subsequent processing according to the value of the type field of the building information table obtained in step S102. For example, if the value is "home" (a house where ordinary people live), the process proceeds to step S103, and if the value is "store", to step S104.

  In step S103, the control unit 10 searches the address book stored in the portable translator 1 for an address corresponding to the value of the address field of the building information table for the current position. If such an address is present, in step S103A an English conversation example is presented based on the registered contents (profile data), conversation history, and so on of the individual corresponding to that address in the address book.

  On the other hand, in step S104, the control unit 10 acquires more detailed information about the current position of the portable translator 1 using the building information table. An example of the "more detailed information" mentioned here is the store information table shown as Table 2 below.

  In the store information table, values are defined in the fields of type, name, address, major classification of handled products, and data acquisition destination.

  As the value of the type field, data indicating the type of store such as a department store, a supermarket (market), a restaurant, a clothing store, a general store, and a duty-free store is stored.

  Further, data indicating the name of the store is stored as the value of the name field, and data indicating the address of the store location is stored as the value of the address field. The values of the name and address fields are copied from the corresponding values stored in the building information table shown as Table 1.
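  The initialization of a store information table row by copying the name and address values from the building information table can be sketched as follows; all field names and values are assumptions for illustration.

```python
def init_store_info(building_row: dict) -> dict:
    """Sketch of copying into the store information table: the name and
    address values come from a building information table row, while the
    remaining fields start as 'unknown' and are filled in by the later
    steps of the subroutine."""
    return {
        "type": "unknown",              # kind of store (department store, ...)
        "name": building_row["name"],
        "address": building_row["address"],
        "product_category": "unknown",  # major classification of handled products
        "data_source": "unknown",       # data acquisition destination
    }

store_row = init_store_info(
    {"type": "store", "name": "North Cafe", "address": "1-2-3 Example-cho"})
```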

  Here, the processing content of step S104 will be described with reference to FIG. 8, which is a flowchart of this processing subroutine.

  Referring to FIG. 8, first, in steps S301 and S302, the control unit 10 copies the values of the name and address fields of the building information table (see Table 1) into the corresponding fields of the store information table (see Table 2), and the process proceeds to step S303.

  In step S303, the control unit 10 searches the information stored in the server on the Internet using the values of the two fields of the name and address in the store information table as keywords. Such a search is performed using, for example, a search engine. When any search result is obtained, the search result is displayed on the display unit 12.

  In step S304, the control unit 10 determines whether information was obtained as a result of the search in step S303; specifically, it determines whether the user has selected at least one piece of information from the search results displayed in step S303. If it determines that a selection was made, the process proceeds to step S305. If no selection was made, or no search result was obtained in step S303, the process proceeds to step S314.

  In step S314, the control unit 10 sets “unknown” as the values of the type field and the major classification of handled products field of the store information table, and returns the process to step S104.

  On the other hand, in step S305, the control unit 10 extracts, from the values obtained as the search results in step S303, information indicating the provider of the information, and sets it as the value of the data acquisition destination field of the store information table. The value setting in step S305 may be performed by any method suited to the conditions under which the present invention is implemented. One such method is to set a value determined from the address (URL) of the top page of the homepage. Specifically, for example, the control unit 10 determines whether or not text corresponding to the store name is included in the address of the top page of the homepage, and if it is included, sets “store database” as the provider information. Likewise, it determines whether or not text corresponding to the shopping street name is included in the address of the top page, and if it is included, sets “shopping street homepage” as the provider information. Another method is to set a value determined by whether or not specific text is included in the address of the top page of the homepage. For example, it is determined whether the address of the top page contains text such as “db”, meaning database, or “shop”, meaning store, and if so, “store database” is set as the provider information. As yet another method, when the source of the homepage includes specific text, and text having a certain relationship to that specific text satisfies a predetermined condition, “store database” may be set as the provider information. Specifically, the source of the homepage is searched for the text “title”; if it is included, it is determined whether the text “homepage” appears after it, and if so, whether the text immediately before “homepage” is a store name. If it is a store name, “store database” is set as the provider information. Instead of the text “title”, an attribute code representing an attribute of the store name may be detected in the source of the homepage, and whether or not to set “store database” as the provider information may be determined according to whether the name attached to that attribute code is the store name.
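As an illustration only, the URL-based heuristic described above might be sketched as follows. The function name, the rule order, and the exact matching rules are assumptions made for this sketch, not the embodiment's actual implementation.

```python
# Hypothetical sketch of the step S305 heuristic: classify the
# information provider from the URL of a search result's top page.
def classify_provider(url: str, store_name: str, street_name: str) -> str:
    url = url.lower().replace("-", "")
    # Rule 1: the store name appearing in the URL suggests a store database.
    if store_name.lower().replace(" ", "") in url:
        return "store database"
    # Rule 2: the shopping street name suggests the shopping street homepage.
    if street_name.lower().replace(" ", "") in url:
        return "shopping street homepage"
    # Rule 3: generic tokens such as "db" (database) or "shop" (store)
    # also suggest a store database, per the alternative method above.
    if "db" in url or "shop" in url:
        return "store database"
    return "unknown"
```

A fuller implementation would also inspect the homepage source for a store name near the “title” text or attached to a name attribute, as described above.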

  Next, in step S306, the control unit 10 searches the information acquired in step S303 for information corresponding to the store type, and in step S307 determines whether such information has been obtained by the search. If it is determined that it has been obtained, the process proceeds to step S308; if not, the process proceeds to step S309. Specifically, in step S307, as in step S216, the control unit 10 determines whether the source of the homepage includes text that matches the contents (department store, supermarket, and so on) stored in the “type” field of the conversation example DB or the conversation history DB, described later, as candidate values for the “type” field; or, if the format for creating the homepage provides that “type: type name” be entered in the source, whether data based on that provision is included in the source. The control unit 10 advances the process to step S308 when determining that such data is included, and to step S309 when determining that it is not.

  In step S308, the control unit 10 sets the information obtained as the search result in step S307 as the value of the type field of the store information table, and advances the process to step S310. On the other hand, in step S309, the control unit 10 sets “unknown” as the value of the type field of the store information table, and advances the process to step S310.

  In step S310, the control unit 10 searches the information acquired in step S303 for information corresponding to the products handled at the store, and in step S311 determines whether such information is included. Specifically, when the data acquisition destination determined in step S305 is a store homepage, database, or the like, this determination is made by checking whether the data acquisition server has a page or database such as a “product lineup” or “product list” and, if so, searching it for text such as “clothing”, “food”, “home appliance”, or “car”. When the data acquisition destination determined in step S305 is a shopping street homepage, the determination is made by checking whether the shopping street homepage has a store introduction page and whether the corresponding text includes “clothing”, “food”, “home appliance”, or the like. If it is determined that such information is included, the control unit 10 proceeds to step S312; if not, it proceeds to step S313.

  In step S312, the control unit 10 sets the information determined to be included in step S311 as the value of the major classification of handled products field of the store information table, and returns the process to step S104. On the other hand, in step S313, the control unit 10 sets “unknown” as the value of that field, and returns the process to step S104.
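The classification in steps S310 through S313 can be pictured with a minimal sketch; the category list and the function shape below are assumptions based on the examples given above.

```python
# Illustrative sketch of steps S310-S313: scan text gathered from the
# data acquisition destination (a product-list page or a store
# introduction page) for known product-category terms.
CATEGORIES = ["clothing", "food", "home appliance", "car"]

def classify_products(page_text: str) -> str:
    text = page_text.lower()
    for category in CATEGORIES:
        if category in text:   # step S311: term found on the page
            return category    # step S312: becomes the major classification
    return "unknown"           # step S313: no term found
```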

  Referring again to FIG. 5, after acquiring the store information in step S104 as described with reference to FIG. 8, the control unit 10 adds additional information to the store information and creates keywords in step S105.

  In the present embodiment, information on the current position of the portable translator 1 is acquired in step S103, as shown in Table 1 as the building information table, and information on the location registered in the address book or on the store where the portable translator 1 is currently located is further acquired in step S103 or S104, as shown in Table 2 as the store information table. Thus, the detailed information acquisition means, which acquires the detailed information of the current position, is constituted by the control unit 10.

  The processing contents in step S105 will be described below with reference to FIG. 9 which is a flowchart of the processing subroutine. In this process, a keyword table as shown in Table 3 is created for creating keywords.

  In the keyword table, values are defined for each field of type, name, address, classification, reliability, frequency, and priority conversation pattern.

  Referring to FIG. 9, first, in step S401, the control unit 10 sets, as the values of the type, name, address, and classification fields of the keyword table, the corresponding values of the store information table shown in Table 2 (in the classification field, the value of the major classification of handled products field of the store information table is set).

  Next, in step S402, the control unit 10 refers to the information stored in the reliability reference record, determines the reliability corresponding to the data acquisition destination in the store information table, and sets it as the value of the reliability field of the keyword table. An example of the reliability reference table is shown in Table 4.

  In Table 4, the reliability is stored in three stages of high, medium, or low in association with the type of data acquisition destination. The reliability for each source may be determined in advance in the portable translator 1 or may be set by the user of the portable translator 1.

  In Table 4, when the data is obtained from the store database (store DB) or the store homepage (store HP), the reliability is “high”; when it is obtained from the homepage of the shopping street to which the store belongs, the reliability is “medium”; in other cases, the reliability is “low”.
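A minimal sketch of the reliability lookup of step S402, assuming a mapping like the one Table 4 describes; unlisted sources fall back to “low”.

```python
# Reliability reference data modeled on Table 4 (an assumption for
# this sketch; the actual table may be preset or user-configured).
RELIABILITY = {
    "store database": "high",
    "store homepage": "high",
    "shopping street homepage": "medium",
}

def lookup_reliability(source: str) -> str:
    # "low" applies in all other cases, as described above.
    return RELIABILITY.get(source, "low")
```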

  Referring again to FIG. 9, after setting the reliability in step S402, the control unit 10 determines in step S403 whether or not there is a history in the conversation history DB, based on the address and name in the store information table. As shown in Table 5, the conversation history DB stores the date and time of conversations and the conversation sentences (conversation history) in association with the store type, name, and address. As will be described later, the conversation history DB is history information created each time conversation examples are presented at a place such as a store.

  In step S403, the conversation history DB is searched for histories whose address and name field values match those in the current store information table. In step S404, it is determined whether there is such a history. If it is determined that there is, the process proceeds to step S405; if not, the process proceeds to step S413.

  In step S413, the control unit 10 sets “first time” as the value of the frequency field and “none” as the value of the priority conversation pattern field of the keyword table, and returns the process.

  On the other hand, in step S405, the control unit 10 sorts all corresponding histories in the conversation history DB (the histories determined in step S403 to match the name and address field values) by the values of their date and time fields.

  Then, in step S406, the control unit 10 determines whether or not the sorted histories occur every day. If so, the control unit 10 proceeds to step S407; if not, it proceeds to step S408.

  In step S408, the control unit 10 determines whether or not the sorted histories occur at weekly intervals (that is, at least once a week but less than every day). If so, the process proceeds to step S409; if not, the process proceeds to step S410.

  In step S410, the control unit 10 determines whether or not the sorted histories occur at monthly intervals (that is, at least once a month but less than once a week). If so, the process proceeds to step S411; if not, the process proceeds to step S412. In steps S407, S409, and S411, the control unit 10 sets “daily”, “once a week”, and “once a month”, respectively, as the value of the frequency field of the keyword table, and advances the process to step S412. If it is determined in step S410 that the frequency is not even once a month, “unknown” may be entered as the value of the frequency field of the keyword table.
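The frequency checks of steps S405 through S411 amount to classifying the gaps between sorted history timestamps. The sketch below is one possible reading, with the thresholds (1, 7, and 31 days) assumed for illustration.

```python
from datetime import date

# Hedged sketch of steps S405-S411: sort the history dates and
# classify visit frequency by the largest gap between visits.
def classify_frequency(dates: list) -> str:
    if len(dates) < 2:
        return "first time"            # analogous to step S413
    ds = sorted(dates)                 # step S405: order by date
    max_gap = max((b - a).days for a, b in zip(ds, ds[1:]))
    if max_gap <= 1:
        return "daily"                 # step S407
    if max_gap <= 7:
        return "once a week"           # step S409
    if max_gap <= 31:
        return "once a month"          # step S411
    return "unknown"                   # optional fallback noted above
```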

  In step S412, the control unit 10 searches the data stored in the conversation history fields of all corresponding histories for phrases stored in the keyword fields of the priority conversation pattern reference record. If any such phrase is found, the control unit 10 sets the value of the priority conversation pattern field corresponding to that phrase in the keyword table, and returns the process.

  Table 6 shows an example of the priority conversation pattern reference record.

  The keyword fields of the priority conversation pattern reference record store phrases of the same kind, and, corresponding to each keyword field, a phrase classifying what kind of keywords those phrases are is stored in the priority conversation pattern field. That is, the priority conversation pattern reference record is a storage unit for converting phrases used as keywords into general-purpose phrases.

  For example, if there is “my shoes” or “my watch” in the conversation history, it is considered that the conversation was made to buy one's own thing, so the classification of those words is “My shopping” and the value is given priority. Stored in the conversation pattern field. Similarly, if there is “present” in the conversation history, it is considered that a conversation was made to buy a present, so the classification of the phrase is “Present”.

  The phrase in the priority conversation pattern field is used when related examples are extracted from the conversation examples. Specifically, after the conversation history related to the current position information is extracted, the conversation history is searched for phrases matching the keyword fields of the priority conversation pattern reference record, and the general-purpose phrase in the corresponding priority conversation pattern field is extracted. The conversation examples are then searched for examples having that general-purpose phrase, and when such examples exist, they are output preferentially, so that the user can quickly find and use related conversation examples.
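The conversion from concrete history phrases to a general-purpose pattern (Table 6 and step S412) might be sketched as follows; the table entries are the examples from the text, and the function shape is an assumption for illustration.

```python
# Priority conversation pattern reference data (illustrative, after
# the examples given for Table 6).
PATTERN_TABLE = {
    "my shoes": "My shopping",
    "my watch": "My shopping",
    "present": "Present",
}

def priority_pattern(history_sentences: list) -> str:
    # Search each history sentence for a known keyword phrase and
    # return its general-purpose classification (step S412).
    for sentence in history_sentences:
        for phrase, pattern in PATTERN_TABLE.items():
            if phrase in sentence.lower():
                return pattern
    return "none"   # matches the "none" default set in step S413
```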

  Referring again to FIG. 5, when the keyword table has been created in step S105 as described with reference to FIG. 9, in step S106 the control unit 10 searches the conversation example DB, using the keywords, for conversation examples to be presented to the user.

  The conversation example DB is information as shown in Table 7, for example.

  In the conversation example DB, values are defined for the fields of store type, classification of products handled at the store, keyword, and conversation example.

  The processing content in step S106 will be described with reference to FIG. 10 which is a flowchart of the processing.

  Referring to FIG. 10, first, in step S501, the control unit 10 searches the information stored in the conversation example DB, using the type, classification, and frequency values of the keyword table as search keys. Here, when the reliability value in the keyword table is “high”, the control unit 10 performs an AND search of the above values; when the reliability value is “low”, it performs an OR search of the above values.
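The reliability-dependent search of step S501 could be sketched as below. The record layout is a simplifying assumption, and treating “medium” reliability like “low” (an OR search) is this sketch's choice; the text only specifies the “high” and “low” cases.

```python
# Illustrative sketch of step S501: AND search for "high" reliability,
# OR search otherwise.
def search_examples(db: list, keys: dict, reliability: str) -> list:
    def matches(record):
        hits = [record.get(field) == value for field, value in keys.items()]
        return all(hits) if reliability == "high" else any(hits)
    return [record for record in db if matches(record)]
```

With reliability “high”, only records matching every key are returned; with “low”, any single matching field suffices.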

  Next, in step S502, the control unit 10 determines whether or not the conversation example DB contains examples having the same type, classification, and frequency as those currently stored in the keyword table. If it is determined that such examples exist, the control unit 10 proceeds to step S503; if not, it proceeds to step S504.

In step S503, the control unit 10 rearranges the list of examples obtained as the search result so that examples whose conversation example field contains the value stored in the priority conversation pattern field of the keyword table come first, followed by the other examples, and then advances the process to step S504. This makes it possible to present the examples the user requires more preferentially.
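The reordering of step S503 is essentially a stable partition of the result list; a minimal sketch, assuming each example record carries a pattern field:

```python
# Illustrative sketch of step S503: move examples tagged with the
# keyword table's priority conversation pattern to the front.
def reorder_by_pattern(examples: list, pattern: str) -> list:
    # sorted() is stable, so the relative order of the remaining
    # examples is preserved.
    return sorted(examples, key=lambda e: e.get("pattern") != pattern)
```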

  In step S504, the control unit 10 determines whether or not the frequency in the keyword table is other than “first time”. If it is other than “first time”, the control unit 10 proceeds to step S505; otherwise, it proceeds to step S506.

  In step S505, the control unit 10 extracts from the conversation history DB the histories having the same values as those stored in the address and name fields of the keyword table. When such histories exist, the values (conversation sentences) stored in their conversation history fields are added to the conversation example list presented to the user, and the process proceeds to step S506.

  In step S506, the control unit 10 presents the extracted conversation example list to the user, for example, by displaying it on the display unit 12, and returns the process.

  Referring again to FIG. 5, after the conversation example list is presented in step S106 as described with reference to FIG. 10, the control unit 10 causes a conversation example to be presented in step S107. The processing content of step S107 will be described with reference to FIG. 11, which is a flowchart of the subroutine of this processing.

  Referring to FIG. 11, first, in step S601, the control unit 10 accepts the user's selection from the conversation example list presented in step S506. Specifically, through the processing of step S506, n patterns of conversation examples, from conversation pattern 1 to conversation pattern n, are presented on the display unit 12, for example as shown in (A) and (B) of FIG. 12. The number of conversation patterns presented here is the sum of the number of conversation examples extracted in steps S501 and S502 and the number of conversation histories extracted in step S505. When the number of displayed conversation patterns is “1”, the processing of step S601 may be omitted and the processing of step S602 executed directly.

  Then, based on the selection received in step S601, the control unit 10 displays the conversation pattern of the selected conversation example in step S602.

  In step S603, the control unit 10 determines whether there is a replaceable portion in the conversation pattern displayed in step S602. A replaceable portion is, for example, a part that can easily be assumed to change depending on the speaking situation, such as “T-shirt” in the conversation pattern n shown as (B) in FIG. 12. Such a portion is preferably displayed on the display unit 12 in a distinct manner, for example underlined (see (B) of FIG. 12) or in a color different from the other parts of the conversation pattern. It is assumed that information specifying whether such a portion exists in the conversation pattern, and where, is included in the conversation pattern data itself.

  If the control unit 10 determines in step S603 that there is a replaceable portion, it proceeds to step S604. On the other hand, if it determines that there is no such portion, it returns the process as it is.

  In step S604, the control unit 10 replaces the replaceable portion of the conversation pattern with data designating that the user should replace it, and displays the pattern again. It is preferable that the data designating replacement of a replaceable portion also be stored in association with the conversation pattern data itself.
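The re-display of step S604 can be pictured as simple placeholder substitution. The “{...}” marker syntax below is an assumption for this sketch; the embodiment stores the replaceable-portion information with the conversation pattern data itself.

```python
# Illustrative sketch of step S604: apply the user's substitution to
# the replaceable portion of a conversation pattern and return the
# sentence to be displayed again.
def fill_pattern(pattern: str, replacements: dict) -> str:
    for slot, value in replacements.items():
        pattern = pattern.replace("{" + slot + "}", value)
    return pattern
```

For example, fill_pattern("I am looking for a {item}.", {"item": "T-shirt"}) yields “I am looking for a T-shirt.”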

  In step S605, the control unit 10 determines whether personal information of the user has been input.

  Specifically, for example, the conversation pattern n shown as (B) in FIG. 12 is followed by the conversation sentences shown as (C) and (D) in FIG. 12. (C) describes a sentence in which the clerk asks the size of the T-shirt, and (D) describes a sentence requesting the clerk to measure the size. By displaying the sentence of (D), the user recognizes the content of the conversation following (C) and is prompted to input personal information (the T-shirt size) into the portable translator 1. When the conversation pattern shown as (B) in FIG. 12 is selected, the control unit 10 displays on the display unit 12, in step S602, the conversation sentences described in (C) and (D) together with the sentence shown in (B). At this time, information (a message or the like) prompting the user to input the size of the T-shirt may be displayed so as to correspond to the content displayed in (D).

  Referring again to FIG. 11, when the control unit 10 determines in step S605 that user information (the user's personal information as described above) has been input, it stores the input information as a user profile, as appropriate, in the storage unit 18 in step S606, and returns the process. On the other hand, if it determines that no such information has been input, the control unit 10 returns the process as it is.

  In the present embodiment described above, the search keyword generation means is constituted by the control unit 10, which generates the keyword table as described above.

  In addition, the conversation sentence keyword search means is constituted by the control unit 10, which performs the search shown as step S501 in FIG. 10, and the conversation sentence acquisition means is constituted by the control unit 10, which creates the conversation example list as shown as step S505. Further, as shown as step S506, the output means is constituted by the control unit 10 and the display unit 12, which display the conversation example list.

  In the present embodiment described above, a conversation example list is presented (displayed) in step S506. When a plurality of conversation patterns are displayed as the conversation example list, it is preferable that the conversation patterns (conversation sentences) be displayed in descending order of the reliability in the corresponding keyword table. In the present embodiment, the reliability is classified into the three stages “high”, “medium”, and “low”, but the expression of reliability in the present invention is not limited to this; it may be expressed in another manner, for example as a value between 0% and 100%.

  Moreover, in the present embodiment described above, as shown in (A) and (B) of FIG. 12, the conversation example list is displayed in the language after translation, that is, English. However, the conversation example list may instead be displayed in the language before translation, that is, Japanese.

  The embodiment disclosed this time should be considered as illustrative in all points and not restrictive. The scope of the present invention is defined by the terms of the claims, rather than the description above, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.

FIG. 1 is a schematic diagram for explaining the functions of a portable translator according to one embodiment of the present invention.
FIG. 2 is a diagram for explaining the functions of the information processing engine of FIG. 1.
FIG. 3 is a diagram for explaining in detail the functions of the information processing engine of FIG. 1.
FIG. 4 is a diagram showing the hardware configuration of the portable translator of FIG. 1.
FIG. 5 is a flowchart of the processing performed by the control unit of the portable translator of FIG. 1 when presenting conversation examples.
FIG. 6 is a flowchart of a subroutine of the processing for acquiring building information on the current position in the flowchart of FIG. 5.
FIG. 7 is a flowchart of a subroutine of the processing for acquiring building information on the current position in the flowchart of FIG. 5.
FIG. 8 is a flowchart of a subroutine of the processing for acquiring store information in the flowchart of FIG. 5.
FIG. 9 is a flowchart of a subroutine of the processing for creating keywords by adding additional information to the store information in the flowchart of FIG. 5.
FIG. 10 is a flowchart of a subroutine of the processing for searching the conversation example DB, using the keywords as keys, for the conversation examples to be presented in the flowchart of FIG. 5.
FIG. 11 is a flowchart of a subroutine of the processing for displaying a conversation example in the flowchart of FIG. 5.
FIG. 12 is a diagram showing an example of display content on the display unit of the portable translator of the present embodiment.

Explanation of symbols

  DESCRIPTION OF SYMBOLS 1 Portable translator, 10 Control part, 11 Input part, 12 Display part, 13 D / A conversion part, 14 Output amplifier, 15 Speaker, 16 Dictionary data storage part, 17 Voice data storage part, 18 Information storage part, 19 Communication unit, 101 conversation engine, 102 information processing engine, 104 conversation example DB, 105 translation / dictionary DB, 106 information table, 107 reference record, 108 conversation history DB, 901 user, 902 conversation partner.

Claims (12)

  1. In an electronic device equipped with a means for outputting a translated conversation sentence,
    A conversation sentence storage means for storing a conversation sentence and a conversation sentence keyword to be a keyword of the conversation sentence;
    Detailed information acquisition means for acquiring detailed information of the current position;
    Search keyword generation means for generating a keyword for searching a conversation sentence from the detailed information acquired by the detailed information acquisition means;
    A conversation sentence keyword search means for searching a keyword that matches the keyword generated by the search keyword generation means from a conversation sentence keyword;
    A conversation sentence acquisition means for acquiring a conversation sentence having a conversation sentence keyword searched by the conversation sentence keyword search means from a conversation sentence storage means;
    An electronic device comprising: output means for outputting a conversation sentence acquired by the conversation sentence acquisition means.
  2. The electronic device according to claim 1, wherein the detailed information acquisition means includes a current position acquisition means for acquiring position information of the current position, and acquires detailed information of the current position from the position information acquired by the current position acquisition means.
  3.   The electronic device according to claim 1, wherein the conversation sentence storage means includes one or both of: a conversation example storage means for storing a standard conversation example sentence and a conversation example keyword that is a keyword of the conversation example sentence; and a conversation history storage means for storing a conversation history sentence of a used conversation sentence and a conversation history sentence keyword that is a keyword of the conversation history sentence.
  4.   The electronic device according to claim 1, wherein the detailed information acquisition unit acquires detailed information of a building corresponding to the current position as detailed information of the current position.
  5.   The electronic device according to claim 1, wherein the detailed information acquisition unit acquires wide area information of a surrounding area of the current position.
  6. The described electronic device, further comprising an information reliability acquisition means for acquiring the reliability of the detailed information of the current position acquired by the detailed information acquisition means, wherein the output means weights the conversation sentences acquired by the conversation sentence acquisition means with the reliability, and outputs them in order from the conversation sentence with the highest reliability.
  7. In a control method of an electronic device provided with means for outputting a translated conversation sentence,
    A conversation sentence storage step for storing a conversation sentence and a conversation sentence keyword as a keyword of the conversation sentence;
    A detailed information acquisition step for acquiring detailed information of the current position;
    A search keyword generation step for generating a keyword for searching for a conversation sentence from the detailed information acquired in the detailed information acquisition step;
    A conversational keyword search step for searching for a keyword that matches the keyword generated in the search keyword generation step from the conversational keyword;
    A conversation sentence acquisition step of acquiring a conversation sentence having a conversation sentence keyword searched in the conversation sentence keyword search step from a conversation sentence stored in a conversation sentence storage step;
    An output step of outputting the conversation sentence acquired in the conversation sentence acquisition step.
  8. The control method of an electronic device according to claim 7, wherein the detailed information acquisition step includes a current position acquisition step of acquiring position information of the current position, and detailed information of the current position is acquired from the position information acquired in the current position acquisition step.
  9. The control method of an electronic device according to claim 8, wherein the conversation sentence storage step includes one or both of: a conversation example storage step of storing a standard conversation example sentence and the conversation example keyword; and a conversation history storage step of storing a conversation history sentence of a used conversation sentence and a conversation history sentence keyword that is a keyword of the conversation history sentence.
  10.   The control method of an electronic device according to claim 7, wherein the detailed information acquisition step acquires detailed information of a building corresponding to the current position as the detailed information of the current position.
  11.   The method for controlling an electronic device according to any one of claims 7 to 9, wherein the detailed information acquisition step acquires wide area information of a surrounding area of the current position.
  12. The control method of an electronic device according to any one of claims 7 to 9, further comprising an information reliability acquisition step of acquiring the reliability of the detailed information of the current position acquired in the detailed information acquisition step, wherein the output step weights the conversation sentences acquired in the conversation sentence acquisition step with the reliability, and outputs them in order from the conversation sentence with the highest reliability.
JP2006202021A 2006-07-25 2006-07-25 Electronic apparatus and control method for the same Pending JP2008027348A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006202021A JP2008027348A (en) 2006-07-25 2006-07-25 Electronic apparatus and control method for the same


Publications (1)

Publication Number Publication Date
JP2008027348A true JP2008027348A (en) 2008-02-07

Family

ID=39117897

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006202021A Pending JP2008027348A (en) 2006-07-25 2006-07-25 Electronic apparatus and control method for the same

Country Status (1)

Country Link
JP (1) JP2008027348A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015087963A (en) * 2013-10-30 2015-05-07 富士ゼロックス株式会社 Document recommendation program and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000261573A (en) * 1999-03-08 2000-09-22 Matsushita Electric Ind Co Ltd Satellite portable telephone
JP2002007394A (en) * 2000-06-26 2002-01-11 Casio Comput Co Ltd Electronic conversation dictionary device and program recording medium therefor
JP2002366545A (en) * 2001-06-08 2002-12-20 Sony Corp Device for supporting conversation in foreign language
JP2004069438A (en) * 2002-08-05 2004-03-04 Sony Corp Guide system, contents server, portable device, method and program for processing information, and storage medium
JP2004070598A (en) * 2002-08-05 2004-03-04 Sony Corp Guide system, content server, portable equipment, information processing method, information processing program and storage medium
JP2005083941A (en) * 2003-09-09 2005-03-31 Sony Corp Guide information providing device and program


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015087963A (en) * 2013-10-30 2015-05-07 富士ゼロックス株式会社 Document recommendation program and device

Similar Documents

Publication Publication Date Title
KR101577493B1 (en) Intent deduction based on previous user interactions with a voice assistant
KR100798574B1 (en) Advertising campaign and business listing for a location-based services system
Seneff et al. A new restaurant guide conversational system: Issues in rapid prototyping for specialized domains
US7720674B2 (en) Systems and methods for processing natural language queries
US8073700B2 (en) Retrieval and presentation of network service results for mobile device using a multimodal browser
CN106471570B (en) Order single language input method more
JP4654776B2 (en) Question answering system, data retrieval method, and computer program
CA2458138C (en) Methods and systems for language translation
RU2494476C2 (en) Method and system for providing voice interface
US8903847B2 (en) Digital media voice tags in social networks
CN100565670C (en) System and method for user modeling to enhance named entity recognition
JP2007094086A (en) Input device, input method, and input program
JP4654780B2 (en) Question answering system, data retrieval method, and computer program
KR101712296B1 (en) Voice-based media searching
JP5324937B2 (en) Language extraction of time and location information for recommendation systems
US20050004903A1 (en) Regional information retrieving method and regional information retrieval apparatus
JP2014194827A (en) Non-standard location base text input
JP2016122183A (en) Disambiguating heteronyms in speech synthesis
US7937402B2 (en) Natural language based location query system, keyword based location query system and a natural language and keyword based location query system
KR20170001550A (en) Human-computer intelligence chatting method and device based on artificial intelligence
US20090125497A1 (en) System and method for multi-lingual information retrieval
DE60017000T2 (en) Method for goal-oriented language translation by means of extraction of meaning and dialogue
JP5526396B2 (en) Information search apparatus, information search system, and information search method
US9633004B2 (en) Better resolution when referencing to concepts
US8560301B2 (en) Apparatus and method for language expression using context and intent awareness

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080903

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110322

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110518

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20120117