WO2015125810A1 - Information processing apparatus and information processing method - Google Patents
Information processing apparatus and information processing method
- Publication number
- WO2015125810A1 (PCT/JP2015/054392)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- state
- data
- keyword
- voice
- Prior art date
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/338—Presentation of query results
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/20—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
Description
- Embodiments described herein relate generally to an information processing apparatus and an information processing method.
- In medical and nursing care settings, multiple staff members are involved in the treatment of patients and the daily-life care of care recipients.
- A plurality of staff members observe and diagnose the condition of the patient or the cared person.
- A single staff member does not observe the patient continuously; rather, multiple staff members from various occupations visit and observe patients at different times and intervals. For this reason, in order for the staff members to share information about a patient, the observation results are registered in an electronic medical record system, a nursing/nursing care recording system, or an SNS.
- An information sharing system using voice messages (hereinafter referred to as a voice tweet system) is known as a system for sharing observation results about patients.
- each staff member utters a patient's observation result to a microphone of a portable terminal such as a smartphone, and records the result with a voice tweet registration application installed in the portable terminal, thereby generating a voice message.
- the generated voice message is transmitted to the server.
- the target patient ID of the patient who is the utterance target, the staff ID of the utterer, the utterance time, the utterance location, the keyword extracted from the voice message, and the like are attached to the voice message as tags. Information including such a voice message and a tag is called a voice tweet.
- Each staff member can browse or view the voice tweets stored on the server from a portable terminal or a personal computer.
- JP 2012-226449 A; Japanese Patent No. 5414865
- Embodiments of the present invention aim to make it easy to grasp changes in the state of an observation subject from the messages accumulated by a plurality of observers.
- An information processing apparatus as an embodiment of the present invention includes a first storage unit, a second storage unit, a data processing unit, and an output unit.
- the first storage unit stores correspondence data in which a plurality of states related to mind and body are associated with keywords related to each state.
- the second storage unit stores a message group representing the contents uttered by a plurality of observers regarding the observation subject.
- The data processing unit specifies keywords related to the states existing in the message group and, for each message including a specified keyword, arranges information data including the specified keyword in association with the state, thereby generating the presentation information.
- A figure showing an example of changing the color of a cell in the presentation information display, and a figure showing another example of the presentation information according to an embodiment of the present invention.
- A figure showing an example of the occurrence distribution of keywords regarding a plurality of observation subjects.
- FIG. 1 shows an information processing system according to an embodiment of the present invention.
- The information processing system of FIG. 1 includes an information processing apparatus 101 and a plurality of user terminals 102.
- the information processing apparatus 101 and a plurality of user terminals 102 are connected via a network 103.
- the network 103 may be any network in the form of wired, wireless, or a hybrid of wired and wireless.
- the network 103 may be a local network or a wide area network such as the Internet.
- The user terminal is a terminal operated by a user of the present embodiment, such as a smartphone, a PDA, a portable terminal, or a personal computer.
- the user terminal includes a CPU, a storage device, a display unit that displays an image, a speaker, a microphone, and the like.
- The user of this embodiment is an observer who observes an observation subject such as a patient or a care recipient. Specifically, the observer is a doctor, a nurse, a caregiver, a pharmacist, or the like engaged in a medical or caregiving occupation.
- The information processing apparatus 101 includes a data processing unit 11, a correspondence table storage unit 12, a voice tweet data storage unit 13, an observation subject master storage unit 14, a user master storage unit 15, an output unit 16, and a warning unit 17.
- the correspondence table storage unit 12 stores a state-keyword correspondence table in which a plurality of states relating to the mind and body of the observation target, such as actions, statements, states, or states of the affected part, are associated with keywords related to each state.
- the correspondence table storage unit 12 is connected to the data processing unit 11.
- FIG. 2 shows an example of the state-keyword correspondence table.
- The state is a state related to the mind and body; here, “cognition”, “nutrition”, “aspiration”, “falling/tumbling”, and the like are shown.
- the keyword is a word that the observer expresses as an observation result of the mind and body of the observation subject, such as an action, a statement, a state, or an affected area state that is considered highly relevant to the corresponding state.
- the keyword “ ⁇ ” is highly related to the state of “cognition”.
- Each keyword can belong to more than one state.
- the keyword “Tabekoboshi” belongs to three states of “cognition”, “aspiration”, and “nutrition”.
- The state may have a hierarchical structure of two or more levels. For example, “cognition” may be an upper-level state, and lower-level states such as “meal”, “daily life”, and “bath” may exist under it.
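By way of illustration only (the patent does not disclose source code), the state-keyword correspondence table of FIG. 2 could be held in a program as a simple mapping like the sketch below; the state names, keyword strings, and helper function are assumptions made for the example, not the patent's actual data.

```python
# Minimal sketch of a state-keyword correspondence table (cf. FIG. 2).
# State names and keywords are illustrative placeholders, not the patent's exact entries.
STATE_KEYWORDS = {
    "cognition": ["wandering", "disturbance", "tabekoboshi"],
    "nutrition": ["tabekoboshi", "loss of appetite"],
    "aspiration": ["tabekoboshi", "choking"],
    "falls": ["flutter", "stumble"],
}

def states_for_keyword(keyword):
    """Return every state a keyword belongs to (a keyword may belong to several states)."""
    return [state for state, kws in STATE_KEYWORDS.items() if keyword in kws]

if __name__ == "__main__":
    print(states_for_keyword("tabekoboshi"))  # -> ['cognition', 'nutrition', 'aspiration']
```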
- the observation subject master storage unit 14 stores an observation subject master that stores, for a plurality of observation subjects (patients, care recipients, etc.), the observation subject ID, the names of the observation subjects, and the like.
- the observation subject master storage unit 14 is connected to the data processing unit 11.
- the observation subject master may include information other than those described here, for example, age and sex.
- the user master storage unit 15 stores a user master related to a user of a service provided by the information processing apparatus.
- the user master storage unit 15 is connected to the data processing unit 11.
- the user is an observer such as a doctor, a nurse, a caregiver, and a pharmacist who observes an observation target person (a patient, a cared person, or the like).
- the user master stores the observer ID, name, job type, and the like.
- a password or the like necessary for authentication at the time of login to a service provided by the information processing apparatus may be stored as necessary. In this case, this apparatus may provide the service only to the user who has input the correct observer ID and password.
- the voice tweet data storage unit 13 stores a voice tweet data table.
- the voice tweet data storage unit 13 is connected to the data processing unit 11.
- data related to voice tweet uttered by a plurality of observers is registered.
- FIG. 3 shows an example of a voice tweet data table.
- The voice tweet data includes fields of “voice tweet ID”, “observer ID”, “observation subject ID”, “utterance date/time”, “voice tweet content”, and “related keyword”.
- “Voice tweet ID” is an identifier of voice tweet data.
- Observer ID is an identifier (user ID) of an observer such as a doctor or a caregiver.
- Observation target person ID is an identifier of an observation target person such as a patient or a care recipient.
- “Speech date / time” is the date / time when the observer uttered the content shown in the “voice tweet content” field.
- “Voice tweet content” is the data of the voice message uttered (tweeted) by the observer, or a message obtained by converting the voice message into text. Both the voice message and the text message may be stored, or a link (URL or the like) to the voice message may be stored together with the text message. In that case, the voice message may be stored on the linked server so that the user terminal 102 can download it from the server. Here, it is assumed that at least a text message is stored in the “voice tweet content” field.
- the “related keyword” is a keyword extracted by the data processing unit 11 to be described later from the voice message or the text message by the keyword extraction process. It is assumed that the keyword to be extracted matches the value of the “keyword” field of the state-keyword correspondence table. However, this is only an example, and keywords may be extracted based on other criteria. In this example, when there are a plurality of extracted keywords, the keywords are arranged separated by commas and stored in one “related keyword” field. It is also possible to store the extracted keywords in another form. For example, a table including a voice tweet ID field and a “related keyword” field that stores only one keyword may be provided, and one keyword may be stored in the “related keyword” field one by one.
- the data processing unit 11 communicates with each user terminal 102 to perform registration processing of voice tweet data.
- Each observer observes a patient or a cared person, and utters the observation result to the microphone of the user terminal 102 by designating the observation target person ID.
- the observation result includes a result of observation with respect to a predetermined observation item, a matter (awareness) noticed when observing the observation item, and the like.
- the utterance content is registered as a voice message by the application installed in the terminal.
- the application attaches tags such as a voice tweet ID, an observer ID, an observation target person ID, and an utterance date to the voice message, and transmits the voice message to the information processing apparatus.
- the data processing unit 11 of the apparatus converts the voice message into text by voice recognition and generates a text message.
- keyword extraction is performed for voice messages or text messages.
- Keywords are extracted while absorbing differences in wording and inflection. For example, different inflected forms of the same word (such as “fluttering”, “fluttered”, and “flutters”) are all extracted as the single keyword “flutter”. This can be done by extracting the word stem by morphological analysis. Furthermore, a keyword can be converted into a different expression, such as “flutter” to “unstable”, using a hash table (in this example, an entry with key “flutter” and value “unstable” is registered in advance).
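A minimal sketch of this normalization step is shown below, assuming English placeholder words and a hand-written stem table standing in for a real morphological analyzer; all names and mappings are illustrative, not part of the original disclosure.

```python
# Sketch of keyword extraction with wording differences absorbed.
# STEMS maps inflected surface forms to a canonical stem (stand-in for morphological analysis);
# SYNONYMS maps a stem to a different registered expression (the hash table in the text).
STEMS = {"fluttering": "flutter", "fluttered": "flutter", "flutters": "flutter"}
SYNONYMS = {"flutter": "unstable"}           # key: "flutter", value: "unstable"
KNOWN_KEYWORDS = {"unstable", "tabekoboshi", "disturbance"}  # values of the correspondence table

def extract_keywords(tokens):
    keywords = []
    for token in tokens:
        stem = STEMS.get(token, token)        # absorb inflection differences
        canonical = SYNONYMS.get(stem, stem)  # convert to the registered expression
        if canonical in KNOWN_KEYWORDS:
            keywords.append(canonical)
    return keywords

print(extract_keywords(["patient", "was", "fluttering", "with", "tabekoboshi"]))
# -> ['unstable', 'tabekoboshi']
```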
- the data processing unit 11 uses the voice tweet ID, the observer ID, the observation subject ID, the utterance date and time, the voice tweet content (voice message, text message, or both), and the extracted keyword as voice tweet data. Register in the voice tweet data table. Note that the voice message may be converted into text and transmitted to the apparatus by the user terminal.
- the voice tweet data registration process may be performed by a device different from the information processing device, and the voice tweet data table may be received from the other device.
- the data processing unit 11 receives the designation of the observation subject ID from the user terminal 102, and performs processing on the voice tweet data having the designated observation subject ID.
- The data processing unit 11 can access the correspondence table storage unit 12, the voice tweet data storage unit 13, the observation subject master storage unit 14, and the user master storage unit 15, and can read the data stored in each of them.
- the data processing unit 11 specifies a keyword related to the state from the “related keyword” field for each state of the state-keyword correspondence table in each of the voice tweet data having the designated observation subject ID. For example, regarding “cognition”, there are “ ⁇ ”, “disturbance”, and “tabekoboshi” as related keywords. Therefore, each voice tweet data is inspected, and the keywords “ ⁇ ”, “disturbance”, and “Tabekoboshi” are specified.
- The data processing unit 11 generates, for each voice tweet data in which a keyword is specified, information data including the specified keyword, and arranges the generated information data in association with the corresponding state, thereby generating the presentation information to be presented to the user.
- the format of the information data includes, for example, a list of identified keywords and a voice tweet text (text message). In the following description, it is assumed that the information data has this format unless otherwise specified.
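As a rough illustration of how the data processing unit 11 might assemble such information data, the following sketch groups voice tweets under each state; the record fields follow the voice tweet data table of FIG. 3, but the data and function names are assumptions, not the patent's implementation.

```python
# Build presentation data: for each state, collect (related keywords, voice tweet text) pairs.
STATE_KEYWORDS = {
    "cognition": {"wandering", "disturbance", "tabekoboshi"},
    "nutrition": {"tabekoboshi"},
}

voice_tweets = [  # illustrative records shaped like the voice tweet data table (FIG. 3)
    {"tweet_id": 1, "text": "Noticed tabekoboshi at lunch", "related_keywords": ["tabekoboshi"]},
    {"tweet_id": 2, "text": "Seemed disturbed in the evening", "related_keywords": ["disturbance"]},
]

def build_presentation(tweets, state_keywords):
    presentation = {state: [] for state in state_keywords}
    for tweet in tweets:
        for state, keywords in state_keywords.items():
            hits = [k for k in tweet["related_keywords"] if k in keywords]
            if hits:  # one tweet may appear under several states
                presentation[state].append({"keywords": hits, "text": tweet["text"]})
    return presentation

for state, items in build_presentation(voice_tweets, STATE_KEYWORDS).items():
    print(state, items)
```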
- the output unit 16 transmits the presentation information generated by the data processing unit 11 to the user terminal 102 that has requested the processing.
- the output unit 16 is connected to the data processing unit 11.
- the user terminal 102 displays an image representing the presentation information on the application screen.
- the data processing unit 11 may include information for specifying the display method of the presentation information in the presentation information transmitted to the user terminal 102. For example, information specifying a keyword color and font size included in each information data, a background color where each information data is arranged, a voice tweet text, and the like may be included.
- the presentation information may be displayed on the screen by transmitting a link (URL or the like) to the presentation information to the user terminal 102, and the user terminal 102 downloading the presentation information from the link destination.
- When a link (URL or the like) is used, the data processing unit 11 transmits the generated presentation information to the linked server.
- the output unit 16 may transmit the presentation information to the user terminal 102 by e-mail.
- methods other than those described here, such as output to a printer and printing, may be used.
- the data processing unit 11 may further classify and arrange the information data (keyword list, voice tweet text) arranged in association with each state according to a predetermined time unit based on the utterance date and time, and may use this as presentation information. For example, in various time units, such as daily, weekly, monthly, time zone, 1 hour, and other time units (seasonal units such as spring, summer, autumn, winter, etc., or 3 months) You may classify.
- the time unit may be designated from the user terminal.
- the data processing unit 11 may count the number of classified information data, that is, the number of related voice tweets, for each classification based on the state and time unit. Further, the total number of keywords in the list may be aggregated between the information data for each classification.
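A sketch of this classification and counting step, under an assumed record layout, might look as follows; the monthly unit is used here, but a weekly, daily, or time-zone key could be substituted.

```python
# Classify information data per (state, month) cell and count tweets and keywords per cell.
from collections import defaultdict
from datetime import datetime

records = [  # illustrative: (utterance datetime, state, number of matched keywords)
    (datetime(2015, 1, 10), "cognition", 2),
    (datetime(2015, 1, 25), "cognition", 1),
    (datetime(2015, 2, 3), "nutrition", 1),
]

cells = defaultdict(lambda: {"tweets": 0, "keywords": 0})
for uttered_at, state, n_keywords in records:
    cell = (state, uttered_at.strftime("%Y-%m"))   # monthly unit; other time units also possible
    cells[cell]["tweets"] += 1                      # number of information data (voice tweets)
    cells[cell]["keywords"] += n_keywords           # total keywords in the cell

for (state, month), counts in sorted(cells.items()):
    print(state, month, counts)
```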
- FIG. 4 shows an example in which the presentation information is displayed on the screen of the user terminal when the time unit for classifying the information data is the monthly unit.
- the vertical axis is the state, and the horizontal axis is the monthly unit.
- The states are “recognition”, “falling/tumbling”, “aspiration”, and “nutrition”.
- Month items are “3 months ago”, “2 months ago”, “Last month”, and “This month”. In other words, voice tweet data issued up to the past three months is targeted.
- a cell is arranged for each set of status and month items.
- one or more pieces of information data are arranged so as to be stacked one by one.
- The information data is displayed in the format [<related keyword>, <related keyword>, ...] + <voice tweet text>.
- the information data consists of a list of keywords separated by commas and a voice tweet text.
- the number of information data (number of voice tweets) and the number of keywords in the cell are displayed.
- the format of the information data is not limited to the above, and may be only a voice tweet text, for example. In this case, in the voice tweet text, the keywords related to the classified state may be given a different color from the other parts to make them stand out.
- buttons for “month”, “week”, “day”, and “time zone” are provided.
- FIG. 4 shows an example in which the “Moon” button is selected.
- the information data may be classified and displayed in units of time corresponding to the selected button.
- One voice tweet may include multiple keywords for multiple states. For this reason, information data including a certain voice tweet can appear in a plurality of cells.
- Within each piece of information data, the keywords in the list and the subsequent voice tweet text may be displayed in different colors; for example, the keywords may be red and the voice tweet text may be black.
- When the voice tweet text cannot be displayed on a single line, it may be truncated partway. A pop-up screen displaying the full text may be shown by placing the mouse pointer over the voice tweet text, or the voice tweet text may be wrapped and displayed over a plurality of lines.
- a voice message may be reproducible via a speaker (not shown) by setting a link to the voice tweet text and clicking the link.
- the observer can use the information presented on the screen to grasp the state change of the observation target. It is possible to easily grasp the tendency of whether the voice tweet related to each state is increasing or decreasing. In the case of “recognition”, it can be seen that the number of information data (the number of voice tweets) has increased from three months ago to this month. It can also be seen that the number of keywords related to “recognition” issued by the observer is increasing. In addition, the content of the keywords that appear changes with each month. By using such information, a doctor, a caregiver, or the like can grasp the state of the observation subject regarding dementia.
- The display color (background color) of a cell can be changed according to the total number of information data (the number of voice tweets). For example, a plurality of ranges are set using reference values, such as white if the number of voice tweets in the cell is 0 to 2, light blue if it is 3 to 4, and indigo if it is 5 or more, and the display color of the cell is changed accordingly.
- the color change is expressed by the difference in hatching pitch.
- Such a reference value for the number of voice tweets for changing the display color can be set for each state. Thereby, the observer can grasp the tendency of the change for each state more intuitively than the display in which the information data is simply stacked in the cell.
- In the above, the display color of the cell is changed according to the number of information data in the cell (the number of voice tweets), but it is also possible to change the display color of the cell according to the number of keywords in the cell.
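For illustration, a threshold-based color choice such as the white / light blue / indigo example above could be sketched as follows; the threshold values and per-state table are assumptions for the example.

```python
# Decide a cell background color from the number of voice tweets, with per-state reference values.
THRESHOLDS = {
    "cognition": [(0, "white"), (3, "lightblue"), (5, "indigo")],
    "default":   [(0, "white"), (3, "lightblue"), (5, "indigo")],
}

def cell_color(state, tweet_count):
    color = "white"
    for lower_bound, candidate in THRESHOLDS.get(state, THRESHOLDS["default"]):
        if tweet_count >= lower_bound:
            color = candidate   # keep the color of the highest satisfied range
    return color

print(cell_color("cognition", 2))  # -> white
print(cell_color("cognition", 4))  # -> lightblue
print(cell_color("cognition", 6))  # -> indigo
```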
- the horizontal axis is a monthly unit, but the horizontal axis may be a week unit or a time zone unit.
- An example in which the presentation information with the time zone on the horizontal axis is displayed on the screen of the user terminal is shown in FIG. 6. In FIG. 6, the day is divided into morning (8:00-12:00), noon (12:00-16:00), evening (16:00-20:00), and other (20:00-8:00).
- the time zone may be determined according to another standard.
- the vertical axis represents the state
- the horizontal axis represents the weather item.
- Weather information such as the temperature, humidity, and weather at the time of utterance is set. The utterance position may be acquired from the user terminal together with the utterance date and time when the voice tweet data is acquired, and the weather data may be acquired from an external weather server based on the acquired utterance date and time and utterance position. The utterance position may be added to the voice tweet data.
- the position may be an utterance position attached to the voice tweet (for example, a town name obtained by a reverse geocoding technique from the latitude / longitude of GPS acquired by a smartphone at the time of utterance), or a town name of an address of a target patient.
- the vertical axis is the state and the horizontal axis is the position.
- The number of tweets for each state may be shown as a bar graph, and this bar graph may be arranged at the corresponding position on a map (for example, near the center of the corresponding town on the map).
- In the above, voice tweet data whose occurrence date is within the past six months is processed, but another period such as the past 10 days or the past 24 hours may be set.
- The information data may also be classified according to the job type of the observer (user).
- An example in which the presentation information in this case is displayed on the screen of the user terminal is shown in FIG.
- the vertical axis is the state
- the horizontal axis is the job type.
- the occupation type may be specified from the “observer ID” of the voice tweet data based on the above-described user master.
- In the above examples, the vertical axis is the state, but the vertical axis may instead be the observer (user) or the job type of the observer, and the horizontal axis may be a time unit.
- FIG. 9 shows an example in which presentation information when the vertical axis is an observer and the horizontal axis is a week is displayed on the screen of the user terminal.
- the horizontal axis is a week unit, but it may be another time unit such as a month unit or a time zone unit.
- the information data may be classified for each state in a desired time unit and job type. Other combinations may be used.
- the warning unit 17 in FIG. 1 transmits a warning message to the user terminal 102 according to the processing result of the data processing unit 11.
- the warning unit 17 is connected to the data processing unit 11.
- When the processing result satisfies a predetermined condition, a warning message is transmitted.
- the warning message may include information for identifying the corresponding cell (classification).
- the user terminal 102 that has received the warning message alerts the observer by displaying the received warning message on the screen.
- As a display method, for example, the warning message may be displayed as a pop-up message.
- a warning message can be transmitted to the user terminal 102 by e-mail.
- the warning sound output instruction data may be sent to the user terminal 102, and the user terminal 102 may reproduce the warning sound through a speaker (not shown).
- In the above description, keywords included in the voice message or the text message are extracted in advance and stored in the “related keyword” field; alternatively, keywords may be extracted from the voice message or the text message at the time the presentation information is generated.
- For each observation subject u and each state j, the number of voice tweets occurring per specified period (for example, per month), Nuj, is obtained.
- By collecting Nuj over all observation subjects, a distribution of the number of voice tweets occurring per specified period is obtained for each state.
- a histogram may be obtained, or an average and a standard deviation may be obtained assuming a normal distribution.
- the distribution obtained in this way may be stored in a file as data.
- The number of voice tweets per specified period can be calculated as follows. For each observation subject and each state, the voice tweets related to the state within a certain period of T days are counted, and the total is divided by the number of days T to obtain the number of voice tweets per day. Multiplying this by 14 gives the number of voice tweets per two weeks, and multiplying by 30 gives the number of occurrences per month. In general, the number of occurrences of related voice tweets can be calculated in an arbitrary period unit.
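The arithmetic described above can be illustrated with a short sketch; the counts and window length are invented for the example.

```python
# Convert a raw tweet count over an observation window of T days into counts per day,
# per two weeks, and per month, following the description above.
def per_period_counts(total_tweets, t_days):
    per_day = total_tweets / t_days
    return {
        "per_day": per_day,
        "per_two_weeks": per_day * 14,
        "per_month": per_day * 30,
    }

# Example: 12 cognition-related tweets observed over 90 days.
print(per_period_counts(12, 90))  # -> per_day 0.133..., per_two_weeks ~1.87, per_month 4.0
```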
- Fig. 11 (A) shows the distribution of the number of voice tweets regarding "recognition” per month.
- the horizontal axis represents the number of voice tweets related to “recognition”, and the vertical axis represents the number of observation subjects.
- An asterisk indicates the number of occurrences of voice tweets related to “recognition” of a certain observation target.
- FIG. 10 shows a flowchart of a process for obtaining the distribution of the number of voice tweets generated per specified period for all observation subjects. This processing is performed by the data processing unit.
- The number of occurrences of voice tweets related to a certain state y of a certain observation subject x in a certain month z is denoted by Mxyz.
- Mxyz can be evaluated by its position in the distribution of Nuj. For example, when the distribution is obtained as a histogram, the upper percentage point (upper percentile value) of Mxyz in the Nuj distribution can be obtained, and the display color of the cell can be changed according to that value.
- For example, when Mxyz falls in the upper 5th percentile or higher, the corresponding cell can be set to a specific color (such as red); a number of occurrences in the upper 5th percentile or higher can be said to be considerably higher than the average.
- The display color may be changed continuously or discretely according to the upper percentile value. When a normal distribution is assumed for the Nuj distribution, the display color may be changed according to the evaluation value calculated by (Mxyz - mean of Nuj) / (standard deviation of Nuj).
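A sketch of both evaluation methods, using invented counts, is shown below; the percentile and normal-distribution score follow the description above, and the color decision itself is omitted.

```python
# Evaluate one observation subject's monthly tweet count Mxyz against the distribution Nuj
# of all observation subjects, by upper percentile and by a normal-distribution score.
import statistics

def upper_percentile(value, population):
    """Fraction of the population whose count is >= value (upper percentage point)."""
    return sum(1 for n in population if n >= value) / len(population)

def z_score(value, population):
    """(Mxyz - mean of Nuj) / standard deviation, assuming an approximately normal distribution."""
    return (value - statistics.mean(population)) / statistics.stdev(population)

n_uj = [0, 1, 1, 2, 2, 3, 3, 4, 5, 9]  # illustrative monthly counts over all observation subjects
m_xyz = 9
print(upper_percentile(m_xyz, n_uj))   # -> 0.1 (within the upper 10%)
print(round(z_score(m_xyz, n_uj), 2))  # clearly above average -> the cell could be colored red
```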
- FIG. 12 shows a flowchart of processing for evaluating the number of voice tweets generated by a certain observation subject based on the distribution of the number of voice tweets generated by all observation subjects. This processing is performed by the data processing unit.
- the number of voice tweets generated by a certain observation target (x) was evaluated based on the distribution of the number of voice tweets generated by all observation targets.
- Instead of the voice tweet occurrence distribution of all observation subjects, the voice tweet occurrence distribution of observation subjects who satisfy certain conditions, for example the following, may be used.
- (1) The observation subject has the same age or age group (for example, in 10-year brackets) as the observation subject x; (2) has the same gender as the observation subject x; (3) has the same degree of required nursing care as the observation subject x (an example of the distribution in this case is shown in FIG. 11(B); the population is smaller than in FIG. 11(A)); (4) has the same underlying disease as the observation subject x; (5) has been prescribed the same drug as the observation subject x within the past L months (L is an integer of 1 or more); (6) a combination of some of the above.
- Information such as the age, gender, degree of required nursing care, and underlying disease may be obtained by connecting the information processing system to another medical or nursing care system such as a nursing care record system or a nursing record system. Alternatively, a database storing such information may be mounted in the information processing apparatus 101 of FIG. 1.
- In the above, the distribution of the number of occurrences of voice tweets for all observation subjects is used as the comparison target, but the past occurrence distribution of the observation subject x himself or herself, for example, the distribution of voice tweets of the observation subject x from five years ago to one year ago, may be used for the comparison instead.
- the evaluation is performed based on the number of occurrences of voice tweets, but the evaluation may be performed based on the number of occurrences of keywords.
- the number of occurrences of keywords can be calculated by adding 1 to Nuj for each keyword when there are a plurality of keywords k for the same state j in the process of step S105 in FIG.
- FIG. 13 shows a state-keyword correspondence table according to this embodiment.
- a “relevance” field is added to the state-keyword correspondence table according to the first embodiment.
- the relevance is a numerical value indicating the degree of relevance between the state and the keyword.
- keywords related to a certain state there may be a plurality of keywords related to a certain state, but the depth of the relationship with the state is considered to differ depending on the keyword.
- For example, the keyword “ ⁇ ” and the keyword “restless” are both related to the state of recognition, but “ ⁇ ” is considered to be more strongly related to recognition. For this reason, when five voice tweets containing the keyword “ ⁇ ” occur in a certain month and five voice tweets containing the keyword “restless” occur, it is desirable to evaluate that the former case indicates a stronger recognition-related tendency.
- a “relevance” field is provided in addition to the “status” and “keyword” fields in the status-keyword correspondence table.
- the relevance level is used to extend the method for counting the number of voice tweets or the number of keywords shown in the first embodiment.
- the relevance has a value greater than 0 and 1 or less, but is not limited thereto.
- the degree of relevance may be defined for all keywords for each state. In this case, the degree of relevance may be set to 0 when there is no relation at all.
- the following shows how to count the number of voice tweets using relevance. First, for each voice tweet, the highest degree of association is specified for each state. The flow of this process is shown in FIG.
- Steps S301 to S303 are the Rij initialization phase.
- the number of voice tweets can be obtained as the sum of the relevance levels Rij of the voice tweets.
- the value obtained in this way is particularly called a tweet occurrence rate.
- the occurrence level Vxyz of the voice tweet related to the state y in a certain month z of an observation target person x is obtained by the sum of the degree of association Riy between each voice tweet i generated in the month z and the state y.
- The tweet occurrence rate Vxyz can be written as Vxyz = Σi Riy, where the sum is taken over the voice tweets i of the observation subject x uttered in the month z (Formula 1). This occurrence rate can be used in place of the number of voice tweets described above.
- the degree of association between state j and keyword k is described as Wjk.
- The number of occurrences of keywords related to the state y in a certain month z is adjusted using the degree of relevance. Assuming that the number of occurrences of a keyword k related to the state y in the month z for the observation subject x is n_ykz, the number of occurrences of keywords can be obtained as a weighted sum of n_ykz with the weights Wyk. The value obtained in this way is particularly called the keyword occurrence rate.
- The keyword occurrence rate Mxyz related to the state y in the month z for the observation subject x can be written as Mxyz = Σk Wyk × n_ykz (Formula 2). The keyword occurrence rate can be used instead of the keyword occurrence count described so far.
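The two relevance-weighted counts (Formula 1 and Formula 2) can be illustrated with the following sketch; the relevance values and tweet contents are invented for the example.

```python
# Relevance-weighted counts: tweet occurrence rate Vxyz and keyword occurrence rate Mxyz.
RELEVANCE = {  # (state, keyword) -> Wyk, illustrative values
    ("cognition", "wandering"): 1.0,
    ("cognition", "disturbance"): 0.5,
    ("cognition", "tabekoboshi"): 0.3,
}

tweets_in_month = [  # keywords extracted from each voice tweet of subject x in month z
    ["wandering", "tabekoboshi"],
    ["disturbance"],
]

def tweet_occurrence(tweets, state):
    # For each tweet, take the highest relevance Riy among its keywords, then sum over tweets.
    total = 0.0
    for keywords in tweets:
        r_iy = max((RELEVANCE.get((state, k), 0.0) for k in keywords), default=0.0)
        total += r_iy
    return total

def keyword_occurrence(tweets, state):
    # Weighted sum of keyword counts n_ykz by the relevance Wyk.
    total = 0.0
    for keywords in tweets:
        for k in keywords:
            total += RELEVANCE.get((state, k), 0.0)
    return total

print(tweet_occurrence(tweets_in_month, "cognition"))    # -> 1.0 + 0.5 = 1.5
print(keyword_occurrence(tweets_in_month, "cognition"))  # -> 1.0 + 0.3 + 0.5 = 1.8
```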
- the display color of the related keyword included in the information data in each cell can be changed according to the degree of relevance.
- For example, the display color of a keyword may be set to RGB (R, G, B) = (255, 255 × (1 - relevance), 255 × (1 - relevance)).
- With this setting, a keyword whose relevance is close to 1 is displayed in vivid red, and a keyword with a lower relevance is displayed in a paler red closer to white.
- the font size of the keyword can be increased according to the degree of relevance. For example, the larger the relevance, the larger the font size.
- keywords can be highlighted according to the degree of relevance.
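For illustration, the RGB formula above and a relevance-dependent font size could be computed as follows; the font-size mapping is an assumption, since the text only states that a larger relevance may use a larger font.

```python
# Map a keyword's relevance to a display color and font size, per the RGB formula in the text.
def keyword_style(relevance, base_font_px=12, max_extra_px=6):
    r, g, b = 255, round(255 * (1 - relevance)), round(255 * (1 - relevance))
    font_px = base_font_px + round(max_extra_px * relevance)  # larger relevance -> larger font
    return {"color": f"rgb({r},{g},{b})", "font_size": f"{font_px}px"}

print(keyword_style(1.0))  # -> vivid red, largest font
print(keyword_style(0.2))  # -> pale red (close to white), smaller font
```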
- the degree of association is defined for the set of state and keyword.
- Alternatively, a weight for each job type may be defined for each set of state and keyword. Even for the same keyword, its importance may differ depending on the job type of the user who uttered it. In home medical and nursing care collaboration, observers of multiple occupations such as doctors, nurses, pharmacists, and caregivers observe the observation subject based on their respective specialties and register voice tweets, and the states they wish to evaluate from the accumulated voice tweets differ by occupation. Therefore, the weight for each job type is defined in the state-keyword correspondence table, and when summing Formula 1 or 2, each term is further multiplied by the weight for the job type of the observer who produced the voice tweet, so that the keyword weighting can be adjusted according to the observer's attributes.
- In this way, the evaluation can be performed in a form suited to the job type of the observer.
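One way to write this weighting down (the original text gives no explicit formula, so the notation here is an assumption) is: letting q(g, y) be the weight defined for job type g and state y, and g(i) the job type of the observer who uttered voice tweet i, the job-type-weighted tweet occurrence rate becomes V'xyz = Σi q(g(i), y) × Riy; the keyword occurrence rate of Formula 2 can be weighted analogously, multiplying each keyword term Wyk by the weight of the job type of the observer who uttered the tweet containing that keyword.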
- the job type of the observer may be acquired from the user master based on the observer ID.
- a state-keyword correspondence table for each user may be defined and held in the correspondence table storage unit 12.
- In this way, a state-keyword correspondence table can be defined and used for evaluation from the viewpoint of each user's own interests.
- a state-keyword correspondence table may be defined for each observation target and held in the correspondence table storage unit 12.
- the information processing apparatus 101 performs the processes of the first to third embodiments by applying the state-keyword correspondence table according to the job type, the user, or the observation target.
- In the first embodiment, an example of controlling the display color for each cell of the presentation information was shown; this is described more generally below.
- Consider the case where voice tweets whose occurrence date is within the past six months are classified into a table in which the vertical axis is the state and the horizontal axis is the time zone (see FIG. 6).
- the number of voice tweets generated in each state a and each time zone b is obtained.
- The number of occurrences of voice tweets related to each state per day is obtained for each observation subject.
- the number Nuab of voice tweets per six months in the time zone b can be obtained for the state a of each observation subject u.
- the number Kuab of keywords can be obtained by the same method.
- The upper percentage point (upper percentile value) of Mxab in the Nuab distribution of all observation subjects u is obtained, and the display color of the cell can be changed according to that value.
- the vertical axis is the state and the horizontal axis is the occupation table (see FIG. 8)
- the number of voice tweets and the number of keywords in the cell can be evaluated by the same calculation.
- The tweet occurrence rate or keyword occurrence rate using the relevance can also be evaluated in the same manner.
- the information processing apparatuses according to the first to fifth embodiments can be realized by using, for example, a general-purpose computer apparatus as basic hardware. That is, the processing of the blocks included in the information processing apparatus can be realized by causing a processor mounted on the computer apparatus to execute a program.
- The information processing apparatus may be realized by installing the above program in a computer device in advance, or the program may be stored in a storage medium such as a CD-ROM or distributed through a network and then installed in a computer device as appropriate.
- The storage means provided in the information processing apparatus can be realized by appropriately using a memory, a hard disk, or a storage medium such as a CD-R, CD-RW, DVD-RAM, or DVD-R, incorporated in or external to the computer device.
- 101: Information processing apparatus, 102: User terminal, 103: Network, 11: Data processing unit, 12: Correspondence table storage unit, 13: Voice tweet data storage unit, 14: Observation subject master storage unit, 15: User master storage unit, 16: Output unit, 17: Warning unit
Claims (10)
- 1. An information processing apparatus comprising: a first storage unit that stores correspondence data in which a plurality of states related to mind and body are associated with keywords related to each state; a second storage unit that stores a message group representing contents uttered by a plurality of observers regarding an observation subject; and a data processing unit that specifies keywords related to the states existing in the message group and generates presentation information by arranging, for each message including a specified keyword, information data including the specified keyword in association with the state.
- 2. The information processing apparatus according to claim 1, wherein the information data includes a list of the specified keywords and a text representing the message including the specified keywords.
- 3. The information processing apparatus according to claim 1 or 2, wherein the message group in the second storage unit is associated with time information at which each message was uttered, and the data processing unit classifies and arranges, for each state, the information data on the basis of the time information of the message related to the information data.
- 4. The information processing apparatus according to claim 1 or 2, wherein the message group stored in the second storage unit is associated with the job type of the observer who uttered each message, and the data processing unit arranges, for each state, the information data separately for each job type of the observers.
- 5. The information processing apparatus according to claim 1 or 2, wherein the data processing unit arranges, for each state, the information data separately for each observer.
- 6. The information processing apparatus according to any one of claims 1 to 5, wherein the message group stored in the second storage unit is associated with the job type of the observer who uttered each message and with time information at which the message was uttered, and the data processing unit generates presentation information different from said presentation information by classifying and arranging the information data according to the job type of the observer and the time information of the message.
- 7. The information processing apparatus according to any one of claims 1 to 6, further comprising an output unit that transmits the presentation information to a user terminal including a display unit, wherein the first storage unit stores a degree of relevance between the state and the keyword, the data processing unit determines a display method of the information data according to the degree of relevance corresponding to the keyword related to the state in the information data, and the output unit transmits information specifying the display method determined by the data processing unit to the user terminal.
- 8. The information processing apparatus according to any one of claims 1 to 7, further comprising an output unit that transmits the presentation information to a user terminal including a display unit, wherein the data processing unit determines a display method of the information data group related to the state according to the total number of pieces of information data related to the state, and the output unit transmits information specifying the display method determined by the data processing unit to the user terminal.
- 9. The information processing apparatus according to any one of claims 1 to 7, further comprising an output unit that transmits the presentation information to a user terminal including a display unit, wherein the data processing unit determines a display method of the information data group related to the state according to the total number of keywords related to the state in the information data group related to the state, and the output unit transmits information specifying the display method determined by the data processing unit to the user terminal.
- 10. An information processing method executed by a computer, the method comprising: reading correspondence data in which a plurality of states related to mind and body are associated with keywords related to each state; reading a message group representing contents uttered by a plurality of observers regarding an observation subject; and specifying keywords related to the states existing in the message group and generating presentation information by arranging, for each message including a specified keyword, information data including the specified keyword in association with the state.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15751732.7A EP3109774A4 (en) | 2014-02-19 | 2015-02-18 | Information processing device and information processing method |
US15/120,105 US11043287B2 (en) | 2014-02-19 | 2015-02-18 | Information processing apparatus and information processing method |
JP2016504125A JP6356779B2 (ja) | 2014-02-19 | 2015-02-18 | 情報処理装置および情報処理方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-029869 | 2014-02-19 | ||
JP2014029869 | 2014-02-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015125810A1 true WO2015125810A1 (ja) | 2015-08-27 |
Family
ID=53878310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/054392 WO2015125810A1 (ja) | 2014-02-19 | 2015-02-18 | 情報処理装置および情報処理方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US11043287B2 (ja) |
EP (1) | EP3109774A4 (ja) |
JP (1) | JP6356779B2 (ja) |
WO (1) | WO2015125810A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3107090B1 (en) * | 2015-06-18 | 2023-01-11 | Airbus Operations GmbH | Announcement signalling on board an aircraft |
JP2021529382A (ja) | 2018-06-19 | 2021-10-28 | エリプシス・ヘルス・インコーポレイテッド | 精神的健康評価のためのシステム及び方法 |
US20190385711A1 (en) | 2018-06-19 | 2019-12-19 | Ellipsis Health, Inc. | Systems and methods for mental health assessment |
Family Cites Families (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3586183B2 (ja) * | 2000-10-13 | 2004-11-10 | 俊忠 亀田 | 医療計画及び記録支援システム並びにプログラムを記録した機械読み取り可能な媒体 |
JP2004530982A (ja) * | 2001-05-04 | 2004-10-07 | ユニシス コーポレーション | Webサーバからの音声アプリケーション情報の動的な生成 |
WO2003077070A2 (en) * | 2002-03-06 | 2003-09-18 | Professional Pharmaceutical Index | Creating records of patients using a browser based hand-held assistant |
WO2005048833A1 (ja) * | 2003-11-20 | 2005-06-02 | Matsushita Electric Industrial Co., Ltd. | 健康データ収集装置 |
CA2572116A1 (en) | 2006-12-27 | 2008-06-27 | Ibm Canada Limited - Ibm Canada Limitee | System and method for processing multi-modal communication within a workgroup |
US20090024411A1 (en) * | 2007-04-12 | 2009-01-22 | Albro Thomas W | System and method for contextualizing patient health information in electronic health records |
US7818183B2 (en) * | 2007-10-22 | 2010-10-19 | American Well Corporation | Connecting consumers with service providers |
EP2169577A1 (en) * | 2008-09-25 | 2010-03-31 | Algotec Systems Ltd. | Method and system for medical imaging reporting |
US20100138231A1 (en) * | 2008-11-30 | 2010-06-03 | Linthicum Steven E | Systems and methods for clinical element extraction, holding, and transmission in a widget-based application |
US20110010195A1 (en) * | 2009-07-08 | 2011-01-13 | Steven Charles Cohn | Medical history system |
US20110029325A1 (en) * | 2009-07-28 | 2011-02-03 | General Electric Company, A New York Corporation | Methods and apparatus to enhance healthcare information analyses |
EP2284747A1 (en) * | 2009-08-12 | 2011-02-16 | F. Hoffmann-La Roche AG | Method of recording data for keeping diary of a medical testing or therapy |
US8311848B2 (en) * | 2009-10-05 | 2012-11-13 | Muthiah Subash | Electronic medical record creation and retrieval system |
US20120166226A1 (en) * | 2009-10-28 | 2012-06-28 | Christine Lee | Healthcare management system |
JP5257330B2 (ja) * | 2009-11-06 | 2013-08-07 | 株式会社リコー | 発言記録装置、発言記録方法、プログラム及び記録媒体 |
JP5499835B2 (ja) * | 2010-03-30 | 2014-05-21 | 富士通株式会社 | 医療介護インシデント情報管理プログラム、該装置及び該方法 |
US8630842B2 (en) * | 2010-06-30 | 2014-01-14 | Zeus Data Solutions | Computerized selection for healthcare services |
US20120158432A1 (en) * | 2010-12-15 | 2012-06-21 | Uday Jain | Patient Information Documentation And Management System |
JP5166569B2 (ja) * | 2011-04-15 | 2013-03-21 | 株式会社東芝 | 業務連携支援システムおよび業務連携支援方法 |
JP5404750B2 (ja) * | 2011-11-22 | 2014-02-05 | シャープ株式会社 | 認知症ケア支援方法、認知症情報出力装置、認知症ケア支援システム、及びコンピュータプログラム |
KR20130057338A (ko) * | 2011-11-23 | 2013-05-31 | 김용진 | 음성인식 부가 서비스 제공 방법 및 이에 적용되는 장치 |
JP5223018B1 (ja) * | 2012-05-30 | 2013-06-26 | 楽天株式会社 | 情報処理装置、情報処理方法、情報処理プログラム及び記録媒体 |
US9305140B2 (en) * | 2012-07-16 | 2016-04-05 | Georgetown University | System and method of applying state of being to health care delivery |
JP5414865B1 (ja) | 2012-09-21 | 2014-02-12 | 株式会社東芝 | 再生データ生成装置および再生データ生成方法 |
US9549295B2 (en) * | 2013-02-08 | 2017-01-17 | Audionow Ip Holdings, Llc | System and method for broadcasting audio tweets |
WO2014133993A1 (en) * | 2013-02-27 | 2014-09-04 | Interactive Intelligence, Inc. | System and method for remote management and detection of client complications |
US20160117469A1 (en) * | 2013-06-04 | 2016-04-28 | Koninklijke Philips N.V. | Healthcare support system and method |
US9420970B2 (en) * | 2013-10-22 | 2016-08-23 | Mindstrong, LLC | Method and system for assessment of cognitive function based on mobile device usage |
US20150149207A1 (en) * | 2013-11-27 | 2015-05-28 | General Electric Company | Health information prescription |
KR20150084520A (ko) * | 2014-01-14 | 2015-07-22 | 삼성전자주식회사 | 디스플레이 장치, 대화형 서버 및 응답 정보 제공 방법 |
US10231622B2 (en) * | 2014-02-05 | 2019-03-19 | Self Care Catalysts Inc. | Systems, devices, and methods for analyzing and enhancing patient health |
JP6418820B2 (ja) * | 2014-07-07 | 2018-11-07 | キヤノン株式会社 | 情報処理装置、表示制御方法、及びコンピュータプログラム |
US9824185B2 (en) * | 2014-08-08 | 2017-11-21 | Practice Fusion, Inc. | Electronic health records data management systems and methods |
US10122657B2 (en) * | 2014-10-29 | 2018-11-06 | Paypal, Inc. | Communication apparatus with in-context messaging |
US20160180023A1 (en) * | 2014-12-20 | 2016-06-23 | My Info LLC | Personal health care records aggregation |
JP5977898B1 (ja) * | 2015-01-26 | 2016-08-24 | 株式会社Ubic | 行動予測装置、行動予測装置の制御方法、および行動予測装置の制御プログラム |
US9875081B2 (en) * | 2015-09-21 | 2018-01-23 | Amazon Technologies, Inc. | Device selection for providing a response |
- 2015
  - 2015-02-18: JP JP2016504125A, patent JP6356779B2 (active)
  - 2015-02-18: WO PCT/JP2015/054392, WO2015125810A1 (application filing)
  - 2015-02-18: EP EP15751732.7A, EP3109774A4 (not active, ceased)
  - 2015-02-18: US US15/120,105, US11043287B2 (active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003036261A (ja) * | 2001-07-26 | 2003-02-07 | Kyodo News Service | 文章自動分類装置、文章自動分類プログラム、文章自動分類方法及び文章自動分類プログラムを記録したコンピュータ読取可能な記録媒体 |
JP2006215675A (ja) * | 2005-02-02 | 2006-08-17 | Sachio Hirokawa | データマップ作成サーバ、データマップ作成方法、およびデータマップ作成プログラム |
Non-Patent Citations (2)
Title |
---|
KUMI IKEGAMI: "Tsubuyaki' o Ikashite Iryo Kaigo Service o Kakushin", JST NEWS, 2 December 2013 (2013-12-02), pages 3 - 7, XP055372636 * |
See also references of EP3109774A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019016137A (ja) * | 2017-07-06 | 2019-01-31 | 富士通株式会社 | 記録プログラム、記録方法、及び記録装置 |
JP2022103155A (ja) * | 2020-12-25 | 2022-07-07 | 三菱電機Itソリューションズ株式会社 | 評価装置、評価方法、および、評価プログラム |
Also Published As
Publication number | Publication date |
---|---|
EP3109774A4 (en) | 2017-11-01 |
JPWO2015125810A1 (ja) | 2017-03-30 |
US11043287B2 (en) | 2021-06-22 |
EP3109774A1 (en) | 2016-12-28 |
JP6356779B2 (ja) | 2018-07-11 |
US20170063737A1 (en) | 2017-03-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15751732; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2016504125; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 15120105; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| REEP | Request for entry into the european phase | Ref document number: 2015751732; Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: 2015751732; Country of ref document: EP