WO2023136175A1 - Information acquisition device and information processing system

Information acquisition device and information processing system

Info

Publication number
WO2023136175A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
data
information
user
processing system
Prior art date
Application number
PCT/JP2022/048616
Other languages
French (fr)
Japanese (ja)
Inventor
孝文 柳元
律子 金野
一視 平野
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date
Filing date
Publication date
Application filed by Sony Group Corporation (ソニーグループ株式会社)
Publication of WO2023136175A1

Classifications

    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 — ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 — ICT specially adapted for the management or operation of medical equipment or devices

Definitions

  • The present disclosure relates to an information acquisition device and an information processing system.
  • Patent Document 1 discloses a system for navigating the position of a diagnostic device used on the patient side in online medical care (remote medical care).
  • One aspect of the present disclosure provides technology that enables sensing of vital sounds at an appropriate auscultation position.
  • An information acquisition device includes a plurality of sensing elements arranged two-dimensionally so that each element senses vital sounds at the position it faces.
  • An information processing system includes an information acquisition device having a plurality of sensing elements arranged two-dimensionally so that each element senses vital sounds at the position it faces, and a processing unit that generates, based on the sensing data of each element, navigation information for the auscultation position at which the plurality of sensing elements are applied to the user's body.
  • FIG. 1 is a diagram illustrating an example of a schematic configuration of an information processing system according to an embodiment.
  • FIGS. 2 and 3 are diagrams showing examples of a schematic configuration of a sensor.
  • FIG. 4 is a diagram showing an example of functional blocks of the information processing system.
  • FIG. 5 is a diagram showing an example of calculation of a navigation direction.
  • FIG. 6 is a diagram showing an example of processing.
  • FIG. 7 is a diagram showing an example of presentation of navigation information and the like.
  • FIG. 8 is a diagram showing an example of processing.
  • FIGS. 9 and 10 are diagrams showing examples of presentation of navigation information and the like.
  • FIG. 11 is a flowchart showing an example of the flow of vital sound sensing.
  • FIG. 12 is a diagram showing an example of an article provided with an information acquisition device.
  • FIG. 13 is a diagram showing an example of a schematic configuration of an information acquisition device.
  • FIG. 14 is a diagram showing an example of functional blocks of an information processing system.
  • FIG. 15 is a diagram showing an example of presentation of accumulated data.
  • FIG. 16 is a diagram illustrating an example of a schematic configuration of an information processing system.
  • FIG. 17 is a diagram showing an example of functional blocks of an information processing system.
  • FIG. 18 is a block diagram showing an example of a hardware configuration.
  • Vital sounds, such as the breath sounds and heart sounds of patients suffering from allergic respiratory diseases, heart diseases, pulmonary diseases, and the like, are usually auscultated by a doctor placing a stethoscope on an appropriate part of the patient's body.
  • In online medical care, however, the doctor is not close to the patient.
  • The stethoscope sensor may therefore deviate from a position suitable for sensing vital sounds, making accurate sensing of vital sounds difficult.
  • The user of the information acquisition device 11 is referred to as user U.
  • In this example, the user U receives online medical treatment.
  • A medical worker, such as a doctor, who treats the user U is referred to as medical staff C.
  • The medical staff C remotely diagnoses the user U using the medical staff terminal 3.
  • The information acquisition device 11 has the function of a sensor that senses the user U's vital sounds. Examples of vital sounds are breath sounds, lung sounds, heart sounds, bowel sounds, and the like.
  • The information acquisition device 11 can also be called a stethoscope sensor, or a sensing unit having the function of a stethoscope sensor.
  • FIGS. 2 and 3 are diagrams showing examples of the schematic configuration of the sensor.
  • The information acquisition device 11 includes a distal end portion 112 provided with a sensor array 111, a body portion 113, and an environmental sound sensor 114.
  • The figures also show an XYZ coordinate system.
  • The XY plane direction corresponds to the array direction of the sensor array 111.
  • The body portion 113 and the distal end portion 112 are positioned in this order in the Z-axis positive direction.
  • FIG. 2 schematically shows the external shape of the information acquisition device 11 when viewed obliquely.
  • FIG. 3 schematically shows the external shape of the information acquisition device 11 when viewed from the front (viewed in the Z-axis negative direction).
  • The sensor array 111 includes a plurality of sensing elements 111a arranged two-dimensionally (in the XY plane).
  • In this example, the multiple sensing elements 111a are arranged in a honeycomb pattern.
  • The number, shape, and the like of the sensing elements 111a are not limited to the examples shown in FIGS. 2 and 3. Note that the sensor array 111 and the plurality of sensing elements 111a may be read interchangeably where consistent.
  • The sensing elements 111a sense the vital sounds of the user U while the sensor array 111 is in contact with the user U's body.
  • Each sensing element 111a may include a diaphragm that vibrates in response to sound pressure.
  • Each sensing element 111a obtains signal data as a time waveform (time-series data).
  • The signal data obtained by the sensing element 111a is also called "sensing data". As long as there is no contradiction, sensing data may be read as signal data, signals, or the like.
  • Each of the plurality of sensing elements 111a senses vital sounds at the position of the body that it faces.
  • The sensing elements 111a are configured so that no acoustic interference occurs between adjacent sensing elements 111a.
  • For example, the diaphragm of each sensing element 111a is configured to vibrate independently of the diaphragms of the adjacent sensing elements 111a.
  • The sensing data of each of the plurality of sensing elements 111a can therefore differ from element to element; one possible in-memory representation is sketched below.
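  • To make the later processing steps concrete, the following is a minimal sketch (not part of the publication) of one possible in-memory layout for per-element sensing data. Python and numpy are assumed, and the seven-element honeycomb layout is an illustrative guess at the arrangement shown in FIGS. 2 and 3.

```python
# Minimal sketch (assumption, not from the publication): one possible in-memory
# layout for per-element sensing data of the sensor array 111.
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorArrayFrame:
    positions: np.ndarray   # shape (n_elements, 2): XY position of each element
    signals: np.ndarray     # shape (n_elements, n_samples): time series per element
    sample_rate: float      # Hz

def honeycomb_positions(pitch: float = 10.0) -> np.ndarray:
    """Seven elements in a hexagonal (honeycomb) pattern: one center, six around it."""
    angles = np.deg2rad(np.arange(0, 360, 60))
    ring = np.stack([pitch * np.cos(angles), pitch * np.sin(angles)], axis=1)
    return np.vstack([[0.0, 0.0], ring])
```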
  • The environmental sound sensor 114 includes, for example, a microphone, and senses environmental sounds. Examples of environmental sounds are music, air-conditioner sounds, conversation, and the like. Environmental sound is noise with respect to the vital sound to be sensed.
  • The environmental sound sensor 114 is provided in a portion other than the sensor array 111. In this example, it is provided on the body portion 113, more specifically on the side of the body portion 113 opposite the distal end portion 112 (the Z-axis negative direction side).
  • The user terminal 12 includes a communication unit 13, a user interface unit 14, a processing unit 15, and a storage unit 16.
  • Examples of information stored in the storage unit 16 include an application program 161 (application software), a trained model 162, and user data 163.
  • The communication unit 13 communicates with other devices. For example, the communication unit 13 receives sensing data from the information acquisition device 11. The communication unit 13 also transmits vital sound data, which will be described later, to the medical staff terminal 3.
  • The user interface unit 14 receives operations of the user terminal 12 by the user U and presents information to the user U.
  • The processing unit 15 functions as a control unit that controls each element of the user terminal 12 and executes various processes.
  • The processing unit 15 executes the application program 161.
  • An application used by the user U is provided by executing the application program 161.
  • An example of such an application is a medical care application.
  • Diagnosis includes sensing of the user U's vital sounds.
  • The information acquisition device 11 described above is used for sensing vital sounds.
  • The processing unit 15 provides various functions by processing the sensing data of each of the plurality of sensing elements 111a. Examples of such functions are navigation of the auscultation position and generation of vital sound data for auscultation, described in turn below.
  • The processing unit 15 of the user terminal 12 generates navigation information for the auscultation position based on the sensing data of each of the plurality of sensing elements 111a.
  • The navigation information is information for guiding the auscultation position to a position suitable for auscultation of the vital sound to be sensed (hereinafter also referred to as the "target auscultation position").
  • The navigation information may include a navigation direction.
  • The navigation direction is information for directing the sensor array 111 of the information acquisition device 11 toward the target auscultation position.
  • The processing unit 15 calculates the navigation direction based on the sensing data of each of the plurality of sensing elements 111a.
  • For example, the processing unit 15 calculates an index related to the vital sound to be sensed for each of the plurality of sensing elements 111a, and calculates the navigation direction based on the calculated indices.
  • The index of a sensing element 111a positioned near the target auscultation position may be calculated to be larger than the index of a sensing element 111a positioned far from it.
  • Examples of the index are the magnitude of the target vital sound included in the sensing data of the sensing element 111a, the ratio of the target vital sound to all vital sounds included in the sensing data, and the frequency component (for example, its magnitude or ratio) of the target vital sound included in the sensing data.
  • The sensing data of each of the plurality of sensing elements 111a includes not only the target vital sound (e.g., heartbeat sounds) but also other vital sounds (e.g., breathing sounds, lung sounds, bowel sounds, etc.).
  • The processing unit 15 detects (extracts, etc.) the magnitude, frequency components, and the like of each vital sound included in the sensing data for each of the plurality of sensing elements 111a.
  • The detection method is not particularly limited; each vital sound included in the sensing data may be detected by any suitable method.
  • A trained model 162 may also be used, as described below.
  • The processing unit 15 calculates the above index based on the detection result. For example, the magnitude (amplitude value or the like) of the detected target vital sound is used as the magnitude index. The magnitude of the target vital sound relative to the total magnitude of all vital sounds included in the sensing data of the sensing element 111a is used as the ratio index. The detected frequency component of the target vital sound is used directly as the frequency-component index. A minimal sketch of such an index calculation follows below.
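  • The following is a minimal sketch, under the same assumptions as above, of how a per-element index could be computed. The ratio-of-band-power definition and the 20-200 Hz "heart sound" band are illustrative assumptions, not values from the publication.

```python
# Minimal sketch (assumption): per-element index as the ratio of target-band
# power to total power, one of the index options described in the text.
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Power of `signal` within the [lo, hi] Hz band, via the FFT magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(spectrum[(freqs >= lo) & (freqs <= hi)].sum())

def element_index(signal: np.ndarray, fs: float) -> float:
    """Larger when the element's data contains relatively more of the target sound."""
    target = band_power(signal, fs, 20.0, 200.0)   # assumed heart-sound band
    total = band_power(signal, fs, 0.0, fs / 2)
    return target / total if total > 0 else 0.0
```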
  • The processing unit 15 calculates the navigation direction based on the calculated indices of the plurality of sensing elements 111a. For example, in the array direction (XY plane direction) of the sensor array 111, the processing unit 15 calculates, as the navigation direction, the direction from sensing elements 111a with smaller indices toward sensing elements 111a with larger indices. Note that the index of a sensing element 111a near the target auscultation position may instead be calculated to be smaller than that of a distant element, in which case the direction from elements with larger indices toward elements with smaller indices is calculated as the navigation direction.
  • FIG. 5 is a diagram showing an example of calculating the navigation direction.
  • Each sensing element 111a is shown with hatching according to the magnitude of its index.
  • In this example, the indices of the sensing elements 111a on the upper right side of the sensor array 111 (the X-axis positive and Y-axis positive side) are larger than the indices of the sensing elements 111a on the lower left side (the X-axis negative and Y-axis negative side).
  • Accordingly, the processing unit 15 calculates the direction from the lower left toward the upper right of the sensor array 111 as the navigation direction. In the figure, the calculated navigation direction is schematically indicated by an outline arrow. A minimal sketch of such a direction calculation follows below.
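  • The following sketch illustrates one way (not specified in the publication) to turn the per-element indices into a navigation direction: it points from the array center toward the index-weighted centroid, i.e., from smaller-index toward larger-index elements, matching the low-to-high rule described above.

```python
# Minimal sketch (assumption): navigation direction in the XY plane of the array
# as the vector from the array center toward the index-weighted centroid.
import numpy as np

def navigation_direction(positions: np.ndarray, indices: np.ndarray) -> np.ndarray:
    """positions: (n, 2) element XY positions; indices: (n,) per-element indices.
    Returns a unit XY vector, or zeros if there is no usable signal."""
    total = indices.sum()
    if total <= 0:
        return np.zeros(2)
    centroid = (indices[:, None] * positions).sum(axis=0) / total
    direction = centroid - positions.mean(axis=0)
    norm = np.linalg.norm(direction)
    return direction / norm if norm > 1e-9 else np.zeros(2)
```

A weighted centroid is only one possible choice; pairwise comparison of elements, as the text literally describes, would yield a similar direction.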
  • The trained model 162 may be used for part or all of the calculation of the navigation direction.
  • In that case, the processing unit 15 calculates the navigation direction using the sensing data of each of the plurality of sensing elements 111a and the trained model 162.
  • The trained model 162 may be generated by machine learning using training data so as to output data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input.
  • The processing unit 15 obtains data corresponding to the navigation direction by inputting data corresponding to the sensing data of each of the plurality of sensing elements 111a into the trained model 162.
  • The trained model 162 may also be used for detecting each vital sound from the sensing data, calculating the indices, calculating the navigation direction from the indices, and the like.
  • For example, the trained model 162 may output data corresponding to each vital sound included in each piece of sensing data when data corresponding to the sensing data of the plurality of sensing elements 111a is input.
  • The trained model 162 may output data corresponding to the indices of the plurality of sensing elements 111a when data corresponding to their sensing data is input.
  • The trained model 162 may output data corresponding to the navigation direction when data corresponding to the indices of the plurality of sensing elements 111a is input; this input/output contract is sketched below.
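  • The following sketch shows only the input/output contract described above for the trained model 162: per-element data in, navigation direction out. The linear least-squares model is a stand-in of my own; the publication does not specify a model architecture.

```python
# Minimal sketch (assumption): a stand-in "trained model" mapping per-element
# indices to an XY navigation direction. Any real model (neural network, etc.)
# would implement the same contract.
import numpy as np

class DirectionModel:
    def __init__(self):
        self.w = None  # (n_elements, 2) weights, fitted below

    def fit(self, element_indices: np.ndarray, directions: np.ndarray) -> None:
        """element_indices: (n_samples, n_elements); directions: (n_samples, 2)."""
        self.w, *_ = np.linalg.lstsq(element_indices, directions, rcond=None)

    def predict(self, element_indices: np.ndarray) -> np.ndarray:
        """Map one set of per-element indices to a unit XY navigation direction."""
        d = element_indices @ self.w
        n = np.linalg.norm(d)
        return d / n if n > 1e-9 else np.zeros(2)
```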
  • The training data may include data obtained from medical examinations (auscultation and the like) of many patients performed using the information acquisition device 11 or a device having a similar configuration.
  • The training data may include data obtained through the user U's own use of the information processing system 100, together with or instead of such general-purpose data.
  • Data obtained through the user U's use of the information processing system 100 is stored (accumulated) in the storage unit 16 as user data 163.
  • For example, all data from which input data and output data for the trained model 162 can be obtained may be stored in the storage unit 16 as the user data 163.
  • Machine learning using training data that includes the user data 163 yields a trained model 162 optimized for the user U's usage of the information acquisition device 11, the usage environment, and the like.
  • The trained model 162 may first be generated by machine learning using training data that includes the general-purpose data described above, and then updated by additional machine learning using training data that includes the accumulated user data 163.
  • This allows the trained model 162 to be used even before user data 163 has been accumulated.
  • Alternatively, the trained model 162 may be generated by machine learning using only training data that includes the accumulated user data 163.
  • In that case, a trained model 162 optimized specifically for the user U can be obtained, compared with using training data that includes general-purpose data.
  • This further increases the possibility of optimizing the navigation information for each user U.
  • Navigation may also be performed according to instructions from the medical staff C or the like.
  • The trained model 162 may be generated by the processing unit 15, or by a device (such as a server device) external to the user terminal 12.
  • The navigation direction is calculated as described above.
  • The processing unit 15 generates navigation information including the calculated navigation direction.
  • The navigation information is presented to the user U by the user interface unit 14, as described later.
  • The processing unit 15 may generate vital sound data by adding the sensing data of each of the plurality of sensing elements 111a. Further, the processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a so as to generate vital sound data suitable for auscultation of the vital sound to be sensed.
  • FIG. 6 is a diagram showing an example of processing.
  • Examples of the processing include a weight calculation process P1, a weighting process P2, an addition process P3, a noise reduction process P4, and a navigation information generation process P5.
  • The weight calculation process P1, the weighting process P2, the addition process P3, and the noise reduction process P4 relate in particular to the generation of vital sound data.
  • The navigation information generation process P5 generates the navigation information described above; for example, it calculates the navigation direction as described with reference to FIG. 5.
  • In the weight calculation process P1, weights (coefficients or the like) for the sensing data of each of the plurality of sensing elements 111a are calculated.
  • In the weighting process P2, each piece of sensing data is weighted by the corresponding weight calculated in the weight calculation process P1 (for example, multiplied by the coefficient).
  • In the addition process P3, the weighted sensing data are added together.
  • In the noise reduction process P4, noise contained in the data obtained by the addition in the addition process P3 is reduced. The data whose noise has been reduced by the noise reduction process P4 is obtained as the vital sound data.
  • The weights may be calculated based on the addition result of the addition process P3. For example, the weight of the sensing data of a sensing element 111a that contains relatively much of the target vital sound (for example, whose index described above is large) is calculated to be larger than the weight of the sensing data of a sensing element 111a that does not.
  • Sensing data from the environmental sound sensor 114 may also be used in the weight calculation process P1.
  • For example, the weights are calculated such that the weight of the sensing data of a sensing element 111a that contains relatively little environmental sound is larger than the weight of the sensing data of a sensing element 111a that does not.
  • In this way, the environmental sound, i.e., noise, contained in the data obtained by the addition in the addition process P3 is reduced.
  • In the noise reduction process P4, vital sound data in which environmental sound, i.e., noise, has been reduced may be generated based on the sensing data of the environmental sound sensor 114, for example using known noise cancellation techniques.
  • The trained model 162 may be used for the weight calculation process P1 described above.
  • In that case, the trained model 162 outputs data corresponding to the weight for each piece of sensing data when, for example, data corresponding to the sensing data of each of the plurality of sensing elements 111a is input.
  • The processing unit 15 obtains the weights by inputting the data corresponding to the sensing data of each of the plurality of sensing elements 111a into the trained model 162.
  • This trained model 162 may be generated by machine learning using training data that includes the user data 163, which increases the possibility of generating vital sound data optimized for each user U.
  • In this way, vital sound data suitable for auscultation of the vital sound to be sensed is generated; the weighted-addition chain is sketched below.
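  • The following is a minimal sketch of the weighted-addition chain P1-P3, reusing element_index from the earlier sketch as a stand-in for the weight calculation P1. The noise reduction P4 is left as a placeholder comment; the publication only indicates that known noise cancellation techniques may be used there.

```python
# Minimal sketch (assumption): weighted addition of per-element sensing data
# into a single vital sound signal, following the P1-P4 chain described above.
import numpy as np

def generate_vital_sound(signals: np.ndarray, fs: float) -> np.ndarray:
    """signals: (n_elements, n_samples) per-element sensing data -> 1D vital sound."""
    # P1: weights from per-element indices (see element_index in the earlier sketch)
    weights = np.array([element_index(s, fs) for s in signals])
    if weights.sum() > 0:
        weights = weights / weights.sum()
    else:
        weights = np.full(len(signals), 1.0 / len(signals))
    # P2 + P3: weight each element's data and add across elements
    mixed = (weights[:, None] * signals).sum(axis=0)
    # P4: noise reduction placeholder (e.g., spectral subtraction using the
    # environmental sound sensor 114 could go here)
    return mixed
```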
  • Part or all of the processing by the processing unit 15 may be executed by the information acquisition device 11.
  • In that case, the processing may be executed by a processing unit (such as a processor, not shown) mounted in the body portion 113 of the information acquisition device 11.
  • Where appropriate, the processing unit 15 of the user terminal 12 may be read as this processing unit within the information acquisition device 11.
  • FIG. 7 is a diagram showing an example of presentation of navigation information and the like.
  • In this example, the user interface unit 14 is a display screen (display unit or the like) of the user terminal 12.
  • The waveform of the vital sound is displayed in the upper portion of the display screen.
  • The vital sound itself may also be output.
  • Navigation information is displayed in the lower portion of the display screen.
  • The navigation information includes a depiction of the information acquisition device 11 and the navigation direction from the information acquisition device 11 toward the target auscultation position.
  • The navigation direction is displayed as an arrow G extending from the information acquisition device 11 as its base.
  • Following the arrow, the user U can bring the sensor array 111 of the information acquisition device 11 closer to the target auscultation position.
  • The user U moves the information acquisition device 11, kept in contact with his or her body, along the direction of the displayed arrow G.
  • The processing unit 15 repeatedly generates navigation information and vital sound data, and the displayed information is updated accordingly.
  • The communication unit 13 transmits the vital sound data to the medical staff terminal 3.
  • The medical staff terminal 3 includes a communication unit 31, a user interface unit 32, and a storage unit 33.
  • Medical information 331 is an example of information stored in the storage unit 33.
  • The medical information 331 includes, for example, information such as the user U's electronic medical record, and is used for the medical care of the user U by the medical staff C.
  • The communication unit 31 communicates with other devices. For example, the communication unit 31 receives the vital sound signal from the information acquisition device 1.
  • The user interface unit 32 receives operations of the medical staff terminal 3 by the medical staff C and presents information to the medical staff C. For example, the user interface unit 32 presents the vital sound (by sound output or the like). The medical staff C can treat the user U based on the presented vital sound.
  • The movement, amount of movement, and the like of the information acquisition device 11 may also be sensed, which enables generation of more detailed navigation information. This is described with reference to FIG. 8.
  • FIG. 8 is a diagram showing an example of processing.
  • The illustrated information acquisition device 11 further includes a motion sensor 115.
  • The motion sensor 115 senses the movement, amount of movement, and the like of the information acquisition device 11.
  • Examples of the motion sensor 115 are an acceleration sensor, an angle sensor (gyro sensor), and the like.
  • The position of the sensor array 111 of the information acquisition device 11 is also calculated based on the sensing data of the motion sensor 115.
  • The position is calculated, for example, from the movement direction, movement distance, and the like relative to a known reference position (a sketch of such dead reckoning follows below).
  • The calculated position of the information acquisition device 11 is also included in the navigation information and presented by the user interface unit 14.
  • For example, the amount of movement (distance) to the target auscultation position and the movement direction can be presented as specific numerical values, or together with the current position of the information acquisition device 11. Presenting detailed navigation information in this way increases the possibility of improving navigation accuracy.
  • The weights calculated in the weight calculation process P1 are also reflected in the position calculation of the navigation information generation process P5.
  • Data on the amount of change between the weights calculated in the (n-1)-th process and those calculated in the n-th process (n being a positive integer) is used to assist the calculation of the position of the information acquisition device 11.
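  • The following sketch illustrates the "reference position plus movement" idea with naive dead reckoning from acceleration samples. Real IMU tracking requires drift correction and sensor fusion; this is only an illustration, not the publication's method.

```python
# Minimal sketch (assumption): position of the array by double integration of
# XY acceleration from a motion sensor like 115, starting at a known reference.
import numpy as np

def track_position(accel_xy: np.ndarray, dt: float,
                   reference_xy: np.ndarray = np.zeros(2)) -> np.ndarray:
    """accel_xy: (n_steps, 2) accelerations in the XY plane -> (n_steps, 2) positions."""
    velocity = np.cumsum(accel_xy * dt, axis=0)                  # integrate acceleration
    position = reference_xy + np.cumsum(velocity * dt, axis=0)   # integrate velocity
    return position
```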
  • navigation information may include body images. Description will be made with reference to FIGS. 9 and 10. FIG. 9
  • FIG. 9 and 10 are diagrams showing examples of presentation of navigation information and the like.
  • an image of the body is displayed in the lower portion of the display screen.
  • the displayed image of the body also includes the target auscultation position and characteristic parts of the body such as the clavicle, sternum, and the like.
  • the target auscultation locations are indicated ordered by 1-10 circles on the body.
  • the user U applies the sensor array 111 of the information acquisition device 11 to the position of his/her own body corresponding to the characteristic part of the body shown in the image. Note that this position may be indicated by the medical staff C.
  • An arrow G (navigation direction) pointing to the first target auscultation position is displayed, and the user U moves (the sensor array 111 of) the information acquisition device 11 in that direction.
  • the medical staff C can diagnose the user U based on the vital sounds appropriately sensed at the target auscultatory position.
  • This first target auscultation position may be set, for example, to the reference position described above.
  • the position of the sensor array 111 of the information acquisition device 11 calculated based on the amount of movement from the reference position and the like is also displayed. Vital sounds are appropriately sensed at each target auscultation position.
  • In another example, the user U is a pregnant woman, and the target auscultation position for sensing fetal vital sounds is indicated by a circle on her body.
  • An arrow G pointing to the target auscultation position is displayed.
  • The user U moves the information acquisition device 11 (its sensor array 111) in that direction.
  • Vital sounds are sensed while the sensor array 111 of the information acquisition device 11 is applied to the target auscultation position.
  • The medical staff C can diagnose the user U based on the vital sounds appropriately sensed at the target auscultation position.
  • FIG. 11 is a flowchart showing an example of the flow of vital sound sensing.
  • In step S1, the initial auscultation position is set. For example, as described above, a characteristic part of the body to which the sensor array 111 of the information acquisition device 11 is to be applied is displayed as an image, or the auscultation position is indicated by the medical staff C. The user U applies the sensor array 111 of the information acquisition device 11 to the corresponding position of his or her own body.
  • In step S2, navigation information is presented.
  • The processing unit 15 of the user terminal 12 generates navigation information based on the sensing data of each of the plurality of sensing elements 111a.
  • The user interface unit 14 presents the navigation information.
  • The user U moves the sensor array 111 of the information acquisition device 11 according to the presented navigation information. As the user moves, new navigation information is repeatedly generated and presented. Generation of vital sound data, transmission to the medical staff terminal 3, and the like are also performed.
  • The operation of step S2 is repeated until the sensor array 111 of the information acquisition device 11 reaches the target auscultation position.
  • In step S3, the auscultation position is determined, and vital sounds are sensed at the target auscultation position. Sensing is thus performed at an appropriate auscultation position, and properly generated vital sound data is obtained; the overall loop is sketched below.
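  • The following sketch ties the S1-S3 flow together as a control loop, reusing the functions from the earlier sketches. The read_frame and present callables and the convergence threshold are illustrative assumptions standing in for the device I/O and the user interface.

```python
# Minimal sketch (assumption): the S1-S3 flow of FIG. 11 as a control loop.
import numpy as np

def sensing_session(read_frame, present, fs: float, max_steps: int = 100) -> np.ndarray:
    # S1: the user places the array at the initial position before the loop starts.
    for _ in range(max_steps):
        signals = read_frame()                       # (n_elements, n_samples)
        indices = np.array([element_index(s, fs) for s in signals])
        direction = navigation_direction(honeycomb_positions(), indices)
        # S2: present navigation information; stop when no clear direction remains,
        # i.e., the indices are nearly uniform across the array.
        if indices.max() - indices.min() < 0.05:     # assumed convergence threshold
            break
        present(direction)
    # S3: auscultation position reached; return the generated vital sound data.
    return generate_vital_sound(read_frame(), fs)
```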
  • The information acquisition device 11 may be provided on an article so that the sensor array 111 comes into contact with the user U's back.
  • An article provided with the information acquisition device 11 in this way may also be a component of the information processing system 100.
  • Examples of such articles are furniture such as chairs and beds.
  • FIG. 12 is a diagram showing an example of an article provided with the information acquisition device.
  • The illustrated article is a chair 8.
  • The chair 8 includes, in its backrest, a support portion 81 that supports the information acquisition device 11.
  • The support portion 81 supports the information acquisition device 11 so that the sensor array 111 of the information acquisition device 11 touches the back of the user U while the user U leans against the backrest of the chair 8.
  • The displayed navigation information includes, in addition to an image of the back, the target auscultation position and characteristic parts of the back, such as the shoulder blades and spine.
  • The target auscultation position is indicated by a circle.
  • The support portion 81 is configured to be movable in the XY plane direction.
  • The user U operates direction keys K displayed in the lower part of the display screen of the user terminal 12 to move the support portion 81 and, with it, the sensor array 111 of the information acquisition device 11 supported by it.
  • The chair 8 and the user terminal 12 may be configured to communicate with each other, for example by wire or short-range wireless communication, so that the chair 8 can be operated from the user terminal 12.
  • An arrow G (navigation direction) pointing to the target auscultation position is displayed.
  • The user U moves the support portion 81 in that direction.
  • The sensor array 111 of the information acquisition device 11 is moved together with the support portion 81 to the target auscultation position, and vital sounds are sensed in that state.
  • The medical staff C can diagnose the user U based on the appropriately sensed vital sounds.
  • The movement of the sensor array 111 of the information acquisition device 11 by the support portion 81 may also be controlled automatically by the processing unit 15 of the user terminal 12, without the user U's operation.
  • The information acquisition device 11 itself may present the navigation information. This is described with reference to FIG. 13.
  • FIG. 13 is a diagram showing an example of a schematic configuration of the information acquisition device.
  • FIG. 13 schematically shows the external shape of the information acquisition device 11 when viewed from the rear (viewed in the Z-axis positive direction).
  • The information acquisition device 11 includes a light emitting unit 116.
  • The light emitting unit 116 is provided on a portion of the distal end portion 112 or the body portion 113 other than the main surface 112a of the distal end portion 112, so as to be visible to the user U when the information acquisition device 11 is viewed from the rear.
  • In this example, the light emitting unit 116 is provided in a ring shape on the surface of the distal end portion 112 opposite the main surface 112a.
  • The light emitting unit 116 includes a plurality of light emitting elements 116a arranged in a ring.
  • Examples of the light emitting element 116a are an LED (Light Emitting Diode), an OLED (Organic Light-Emitting Diode), and the like. Lighting and extinguishing (blinking) of each of the plurality of light emitting elements 116a can be controlled individually.
  • In the illustrated example, one light emitting element 116a on the upper left side (the X-axis positive and Y-axis positive side) is lit.
  • The direction from the center of the ring toward the lit light emitting element 116a (schematically illustrated as an arrow G) is presented as the navigation direction. Besides the navigation direction, the distance to the target auscultation position can be presented, for example, by changing the blinking pattern of the light emitting element 116a.
  • The light emitting unit 116 provided in the information acquisition device 11 can thus present navigation information to the user U; a minimal sketch follows below.
  • The auscultation position can then be navigated without using the user interface unit 14 of the user terminal 12.
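  • The following sketch shows one way (not specified in the publication) to choose which light emitting element 116a to light for a given navigation direction; the LED count and the angular layout of the ring are assumptions.

```python
# Minimal sketch (assumption): map a navigation direction to the nearest LED on
# a ring like the light emitting unit 116.
import numpy as np

def led_for_direction(direction_xy: np.ndarray, n_leds: int = 12) -> int:
    """Return the index of the LED whose angular position best matches direction_xy.
    LEDs are assumed evenly spaced on the ring, with LED 0 on the X-axis positive side."""
    angle = np.arctan2(direction_xy[1], direction_xy[0]) % (2 * np.pi)
    step = 2 * np.pi / n_leds
    return int(round(angle / step)) % n_leds
```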
  • Data including the vital sound data may be saved and accumulated. The data may be accumulated in the user terminal 12 or in an external device. This is described with reference to FIG. 14.
  • FIG. 14 is a diagram showing an example of functional blocks of the information processing system.
  • An information processing device 2 is exemplified as an external device of the information acquisition device 1.
  • The information processing device 2 is configured to be able to communicate with the information acquisition device 1 and the medical staff terminal 3 via the network N (FIG. 1).
  • Data including the vital sound data generated by the processing unit 15 is stored in the storage unit 16 as accumulated data AD.
  • The vital sound data included in the accumulated data AD may be processed data.
  • The communication unit 13 of the user terminal 12 also transmits the vital sound data to the information processing device 2.
  • The information processing device 2 includes a communication unit 21, an estimation unit 22, and a storage unit 23.
  • The communication unit 21 communicates with other devices.
  • For example, the communication unit 21 receives the vital sound data from the user terminal 12.
  • Accumulated data AD including the vital sound data is also stored in the storage unit 23 of the information processing device 2.
  • The information processing device 2 performs various analyses on the accumulated data AD. An example of such analysis is estimation of the user U's medical condition, performed by the estimation unit 22.
  • The estimation unit 22 estimates an index related to the user U's medical condition based on an evaluation of feature amounts calculated from the vital sound data included in the accumulated data AD.
  • An example of a feature amount is vital stability (such as the stability of the user U's heartbeat).
  • Various algorithms may be used for the estimation.
  • For example, the estimation unit 22 calculates feature amounts from the vital sound data using various feature calculation algorithms (trained models may also be used), and estimates the index by evaluating the calculated feature amounts, as sketched below. Examples of indices are the degree of progression of a medical condition, the effectiveness of rehabilitation, and the like.
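  • The following is a minimal sketch of one possible "vital stability" feature: the coefficient of variation of intervals between detected heartbeats. The envelope-threshold beat detection is deliberately naive and is an illustrative assumption, not the publication's algorithm.

```python
# Minimal sketch (assumption): heartbeat stability as the variability of
# intervals between naively detected beat onsets.
import numpy as np

def heartbeat_stability(vital_sound: np.ndarray, fs: float) -> float:
    """Lower output = more regular beat intervals. vital_sound: 1D signal."""
    envelope = np.abs(vital_sound)
    threshold = envelope.mean() + 2 * envelope.std()
    above = envelope > threshold
    # rising edges of the thresholded envelope approximate beat onsets
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) / fs
    if len(onsets) < 3:
        return float("nan")  # not enough beats to evaluate
    intervals = np.diff(onsets)
    return float(np.std(intervals) / np.mean(intervals))  # coefficient of variation
```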
  • The accumulated data AD may also include the estimation results (analysis results) of the estimation unit 22.
  • The accumulated data AD is transmitted to the medical staff terminal 3 and reflected in the medical information 331 (electronic medical record or the like) stored in the storage unit 33.
  • The accumulated data AD may be presented to the user U by the user terminal 12, or to the medical staff C by the medical staff terminal 3. This makes it possible to visualize the history of the user U's symptoms and the like.
  • The medical staff C can check the condition of the user U at any time, rather than only at hospital visits. Not only the medical staff C but also the user U can grasp his or her own condition and its changes between visits. It is also possible to predict a tendency toward aggravation from the symptom data and its changes, and to alert the medical staff C and the user U.
  • FIG. 15 is a diagram showing an example of presentation of accumulated data.
  • An example of presentation of the accumulated data AD by the user interface unit 14 of the user terminal 12 is shown.
  • In the upper portion of the display screen, a graph of feature values for each day over a period of, for example, several months is displayed. Vital sounds of a selected day, in this example breathing sounds, heartbeat sounds, and the like, can be reproduced and checked.
  • A similar presentation may be made by the user interface unit 32 of the medical staff terminal 3.
  • The community member terminal 4 is a terminal used by a member M.
  • The member M is a member belonging to the same community as the user U, for example a family member of the user U or another patient having the same disease.
  • The life insurance/health insurance terminal 5, the product/service provider terminal 6, and the analysis terminal 7 are described later.
  • FIG. 17 is a diagram showing an example of functional blocks of the information processing system.
  • In this example, the information processing device 2 further includes a recommendation unit 24 and a processing unit 25.
  • Patient information 231, community information 232, an algorithm DB 233, a trained model 234, recommendation information 235, and anonymously processed information 236 are stored in the storage unit 23 of the information processing device 2.
  • The patient information 231 includes information about the user U, who is a patient.
  • Examples of the patient information 231 are the user U's disease information, hospital visit history information, rehabilitation history information, the accumulated data AD described above (FIG. 14), and the like.
  • The disease information includes the user U's disease name and the like.
  • The hospital visit history information is information about the user U's past hospital visits.
  • The rehabilitation history information is information about rehabilitation performed by the user U in the past. These pieces of information are provided, for example, from the medical staff terminal 3.
  • The community information 232 is information about the community to which the user U belongs, and includes information about the members M and the community member terminals 4.
  • The algorithm DB 233, the trained model 234, the recommendation information 235, and the anonymously processed information 236 are described later.
  • The communication unit 21 and the estimation unit 22 are as described above.
  • Information including the estimation results of the estimation unit 22 and the accumulated data AD is referred to as "result information".
  • The communication unit 21 transmits the result information to other devices and terminals, in this example the user terminal 12, the medical staff terminal 3, the community member terminal 4, and the life insurance/health insurance terminal 5. The recommendation unit 24 and the processing unit 25 of the information processing device 2 are described later.
  • In the user terminal 12, the result information is presented by the user interface unit 14.
  • The user U can thus know various indices regarding his or her own physical function, behavior, disease, and the like.
  • Content to be presented to the user U may be included in the result information transmitted from the information processing device 2 to the user terminal 12. Individual content can be presented based on the estimation results.
  • The result information is presented on the medical staff terminal 3 as well, by the user interface unit 32.
  • The medical staff C can thus know various indices related to the user U's physical function, behavior, disease, and the like.
  • Individual content, for example a rehabilitation menu customized by the medical staff C to suit the user U, may be generated using the user interface unit 32.
  • The content is transmitted to the user terminal 12 by the communication unit 31 and presented to the user U.
  • The community member terminal 4 includes a communication unit 41 and a user interface unit 42.
  • The communication unit 41 communicates with other devices. For example, the communication unit 41 receives the result information from the information processing device 2.
  • The user interface unit 42 receives operations of the community member terminal 4 by the member M and presents information to the member M. For example, the result information from the information processing device 2 is presented to and shared with the member M as well.
  • The result information transmitted from the information processing device 2 to the community member terminal 4 may include content to be presented to the member M. Individual content can be presented based on the estimation results.
  • The life insurance/health insurance terminal 5 is a terminal used by a life insurance company, a health insurance company, or the like.
  • The life insurance/health insurance terminal 5 includes a communication unit 51, an analysis unit 52, and a storage unit 53.
  • Customer/employee information 531 is an example of information stored in the storage unit 53.
  • The customer/employee information 531 includes information about the user U's life insurance, health insurance, and the like.
  • The communication unit 51 receives the result information from the information processing device 2.
  • The analysis unit 52 analyzes the result information and specifies (calculates or the like) insurance premiums and rewards.
  • The specification may involve the work, judgment, and the like of an employee of the life insurance company or health insurance company. A reduction in insurance premiums, a change to a limited plan, or the like may also be made.
  • The communication unit 51 transmits the specified premium/reward information to the user terminal 12, where it is presented by the user interface unit 14.
  • The product/service provider terminal 6 is a terminal used by a company that provides products or services. Examples of products include wheelchairs, walking aids, rehabilitation equipment, health foods, and health equipment. An example of a service is a health application that can be executed on the user terminal 12.
  • The product/service provider terminal 6 includes a communication unit 61 and a user interface unit 62.
  • Product/service information that associates the anonymized result information with products and services is input or generated via the user interface unit 62.
  • The communication unit 61 transmits the product/service information to the information processing device 2.
  • The communication unit 21 of the information processing device 2 receives the product/service information from the product/service provider terminal 6.
  • The recommendation unit 24 and the processing unit 25 of the information processing device 2 are now described.
  • The recommendation unit 24 generates recommendation information 235, including information on products and services to be recommended to the user U, based on the product/service information from the product/service provider terminal 6.
  • The communication unit 21 transmits the recommendation information 235 to the user terminal 12 and the community member terminal 4.
  • The recommendation information 235 is presented by the user interface unit 14 of the user terminal 12 or by the user interface unit 42 of the community member terminal 4.
  • The processing unit 25 generates anonymously processed information 236 by anonymizing the result information; a minimal sketch follows below.
  • The anonymously processed information 236 describes the anonymized personal information and the result information in association with each other.
  • The communication unit 21 of the information processing device 2 transmits the anonymously processed information 236 to the analysis terminal 7.
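  • The following sketch illustrates one simple form of such processing (not the publication's method): direct identifiers are dropped, the user ID is replaced by a salted hash, and quasi-identifiers are coarsened, while the result information stays associated with the pseudonym. Field names are illustrative assumptions.

```python
# Minimal sketch (assumption): producing an anonymously processed record in
# which the result information remains linked to a non-identifying pseudonym.
import hashlib

def anonymize(record: dict, salt: str = "example-salt") -> dict:
    pseudonym = hashlib.sha256((salt + record["user_id"]).encode()).hexdigest()[:16]
    return {
        "pseudonym": pseudonym,                   # stable, but not re-identifiable
        "age_band": record["age"] // 10 * 10,     # coarsen quasi-identifiers
        "result_info": record["result_info"],     # estimation results kept intact
    }

print(anonymize({"user_id": "U-0001", "age": 47, "result_info": {"stability": 0.12}}))
```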
  • The analysis terminal 7 is, for example, a terminal used by a company that provides the above products/services, a pharmaceutical company engaged in clinical development, or the like.
  • The analysis terminal 7 includes a communication unit 71, an analysis unit 72, and a user interface unit 73.
  • The communication unit 71 receives the anonymously processed information 236 from the information processing device 2.
  • The analysis unit 72 performs data analysis based on the anonymously processed information 236.
  • The analysis may involve the work, judgment, and the like of employees of the company.
  • The user interface unit 73 presents information related to the data analysis. Examples of analyses include analysis of user groups for products such as health foods and health equipment, data analysis for clinical development, and the like.
  • The anonymously processed information 236 can be used for various services, such as marketing analysis by manufacturers, analysis of the percentage of users with specific symptoms by age, gender, and the like, and monitoring of the symptoms of patients taking specific drugs.
  • In the example described above, the sensing data of the information acquisition device 11 is transmitted to the information processing device 2 as vital sound data via the user terminal 12.
  • Alternatively, part or all of the sensing data of the information acquisition device 11 may be transmitted directly from the information acquisition device 11 to the information processing device 2, without going through the user terminal 12, and the vital sound data may be generated from the raw sensing data in the information processing device 2.
  • The information processing device 2 may also generate the navigation information. That is, part or all of the functions of the processing unit 15 of the user terminal 12 may be implemented in the information processing device 2.
  • Some functions and information may likewise be provided in a server device or the like (a service provider's server device) managed by the user of the corresponding terminal.
  • In that case, the life insurance/health insurance terminal 5, the product/service provider terminal 6, and the analysis terminal 7 communicate with the corresponding server devices to use their functions. This makes it possible to reduce the processing load on the information processing device 2 and to reduce cost by simplifying its functions.
  • Some of the functions of each terminal may be provided in a server device or the like managed by the user of the corresponding terminal.
  • For example, the function of the analysis unit 52 of the life insurance/health insurance terminal 5 may be provided in a server device or the like managed by the insurance company or health insurance company.
  • In that case, the life insurance/health insurance terminal 5 communicates with the server device to use its functions.
  • Similarly, the function of the analysis unit 72 of the analysis terminal 7 may be provided in a server device or the like managed by the pharmaceutical company or the like.
  • The analysis terminal 7 then communicates with the server device to use its functions. This makes it possible to reduce the processing load on the life insurance/health insurance terminal 5 and the analysis terminal 7, and to reduce cost by simplifying their functions.
  • FIG. 18 is a block diagram showing an example of the hardware configuration.
  • The user terminal 12 is described below as an example. The same description applies to the information acquisition device 11, the information processing device 2, the medical staff terminal 3, the community member terminal 4, the life insurance/health insurance terminal 5, the product/service provider terminal 6, the analysis terminal 7, and the like.
  • Various types of processing are realized by cooperation between the software and the hardware described below.
  • The user terminal 12 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
  • The user terminal 12 also includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, a communication device 913, and a sensor 915.
  • The user terminal 12 may have a processing circuit such as a DSP or an ASIC in place of, or together with, the CPU 901.
  • The CPU 901 functions as an arithmetic processing device and a control device, and controls overall operations within the user terminal 12 according to various programs.
  • The CPU 901 may be a microprocessor.
  • The ROM 902 stores programs, calculation parameters, and the like used by the CPU 901.
  • The RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
  • The CPU 901 can embody, for example, the processing unit 15 of the user terminal 12.
  • The CPU 901, the ROM 902, and the RAM 903 are interconnected by a host bus 904a, which includes a CPU bus and the like.
  • The host bus 904a is connected via a bridge 904 to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus.
  • The host bus 904a, the bridge 904, and the external bus 904b need not necessarily be configured separately; their functions may be implemented in a single bus.
  • The input device 906 is implemented by a device through which the operator inputs information, such as a mouse, keyboard, touch panel, button, microphone, switch, or lever.
  • The input device 906 may also be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA that supports operation of the user terminal 12.
  • The input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the operator using the above input means and outputs the signal to the CPU 901. By operating the input device 906, the operator can input various data to the user terminal 12 and instruct processing operations.
  • The output device 907 is formed by a device capable of visually or audibly notifying the operator of acquired information.
  • Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, and lamps; audio output devices such as speakers and headphones; and printer devices.
  • The storage device 908 is a device for storing data.
  • The storage device 908 is implemented by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • The storage device 908 stores programs executed by the CPU 901, various data, and various data acquired from the outside.
  • The storage device 908 can embody, for example, the storage unit 16 of the user terminal 12.
  • The drive 909 is a reader/writer for storage media, and is built into or externally attached to the user terminal 12.
  • The drive 909 reads information recorded on a removable storage medium such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903.
  • The drive 909 can also write information to the removable storage medium.
  • The connection port 911 is an interface for connection to external devices, for example a connection port capable of data transmission by USB (Universal Serial Bus).
  • The communication device 913 is, for example, a communication interface formed of a communication device for connecting to the network 920 (corresponding, for example, to the network N in FIG. 1).
  • The communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like.
  • The communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices according to a predetermined protocol such as TCP/IP.
  • The communication device 913 can embody, for example, the communication unit 21 of the information processing device 2.
  • The sensor 915 may include at least some of the sensors included in the information acquisition device 11.
  • The network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920.
  • The network 920 may include the Internet, a telephone line network, a public line network such as a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like.
  • The network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • An example hardware configuration capable of realizing the functions of the user terminal 12 has been shown above.
  • Each of the components described above may be implemented using general-purpose members or by hardware specialized for the function of that component. The hardware configuration can therefore be changed as appropriate according to the technical level at the time the present disclosure is implemented.
  • A computer-readable recording medium storing such a computer program can also be provided. Examples of recording media include magnetic disks, optical disks, magneto-optical disks, and flash memories. The computer program may also be distributed via a network, for example, without using a recording medium.
  • The technical category embodying the above technical ideas is not limited.
  • For example, the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in a method of manufacturing or using the above apparatus.
  • The above technical ideas may also be embodied by a computer-readable non-transitory recording medium on which such a computer program is recorded.
  • The information acquisition device 11 includes a plurality of sensing elements 111a arranged two-dimensionally so as to sense vital sounds at opposing positions.
  • Navigation of the auscultation position can be performed by processing the sensing data of each of the plurality of sensing elements 111a as described above. Therefore, it is possible to sense vital sounds at an appropriate auscultatory position.
  • The information acquisition device 11 may include a tip portion 112 having a main surface 112a on which the plurality of sensing elements 111a are provided, and a body portion 113 that supports the tip portion 112 from the side opposite to the main surface 112a. The tip portion 112 and the body portion 113 may have sizes that allow the user U to hold and move them by hand (for example, about the same size as the headpiece of a stethoscope). The user U can easily handle the information acquisition device 11 as a stethoscope sensor.
  • The information processing system 100 described with reference to FIGS. 1 to 7 is also one of the disclosed technologies.
  • The information processing system 100 includes the information acquisition device 11 including the plurality of sensing elements 111a described above, and the processing unit 15 that generates, based on the sensing data of each of the plurality of sensing elements 111a, navigation information of the auscultation position where the plurality of sensing elements 111a are applied to the body of the user U.
  • The navigation information may include a navigation direction for orienting the information acquisition device 11 toward a target auscultation position suitable for auscultation of the vital sound for sensing purposes, and the processing unit 15 may compute the navigation direction. For example, such navigation information can be provided for sensing vital sounds at appropriate auscultatory positions.
  • The processing unit 15 may calculate the navigation direction based on an index related to the vital sound for sensing purposes in each of the plurality of sensing elements 111a. The index may include at least one of the volume of the vital sound for sensing purposes included in the sensing data, the ratio of the vital sound for sensing purposes to all the vital sounds included in the sensing data, and the frequency components of the vital sound for sensing purposes included in the sensing data. For example, navigation directions can be calculated based on such indices.
  • The processing unit 15 may calculate the navigation direction using the sensing data of each of the plurality of sensing elements 111a and the trained model 162.
  • The trained model 162 may output data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input.
  • A navigation direction can also be calculated using such a trained model 162.
  • The trained model 162 in that case may be a trained model generated by machine learning using training data including the user data 163 obtained by the user U using the information processing system 100.
  • The navigation direction can be calculated using the trained model 162 optimized for the usage status of the information acquisition device 11 by the user U, the usage environment, and the like.
  • The information acquisition device 11 may include the motion sensor 115, and the processing unit 15 may generate navigation information including the positions of the plurality of sensing elements 111a based on the sensing data of the motion sensor 115. This increases the possibility of improving navigation accuracy.
  • The information processing system 100 may include the user interface section 14 that presents navigation information to the user U. According to the presented navigation information, the user U can bring the plurality of sensing elements 111a of the information acquisition device 11 closer to the target auscultatory position.
  • The information processing system 100 may include the tip portion 112 described above, the body portion 113 described above, and a light-emitting unit 116 provided on a portion of the tip portion 112 and the body portion 113 other than the main surface 112a of the tip portion 112, and the light-emitting unit 116 may present the navigation information to the user U. Navigation information can also be presented by such a light-emitting unit 116.
  • The processing unit 15 may generate vital sound data based on the sensing data of each of the plurality of sensing elements 111a.
  • The processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a so as to generate the vital sound data suitable for auscultation of vital sounds for sensing purposes.
  • The information acquisition device 11 may include the environmental sound sensor 114, and the processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a based on the sensing data of the environmental sound sensor 114.
  • The processing unit 15 may generate vital sound data with reduced noise based on the sensing data of the environmental sound sensor 114. For example, by generating vital sound data in this manner, vital sounds suitable for auscultation can be obtained.
  • The processing unit 15 may calculate the weight of each of the plurality of sensing elements 111a using the sensing data of each of the plurality of sensing elements 111a and the trained model 162.
  • The trained model 162 may output data corresponding to the weight of each of the plurality of sensing elements 111a when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input.
  • A weight can also be calculated using such a trained model 162.
  • The trained model 162 in that case may also be a trained model generated by machine learning using training data including the user data 163. This increases the possibility of optimizing the vital sound data for each user U.
  • The information processing system 100 may include a storage unit for accumulating the vital sound data (the storage unit 16, the storage unit 23, etc., which store the accumulated data AD).
  • The accumulated data AD can be utilized in various ways, for example by presenting it and analyzing it.
  • The information processing system 100 may include an article (for example, the chair 8) provided with the information acquisition device 11 such that the plurality of sensing elements 111a are in contact with the back of the user U. Auscultation of the back of the user U can also be easily performed.
  • The present technology can also take the following configurations.
  • (1) An information acquisition device comprising a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions.
  • (2) The information acquisition device according to (1), comprising: a tip portion having a main surface on which the plurality of sensing elements are provided; and a body portion that supports the tip portion from a side opposite to the main surface, wherein the tip portion and the body portion have a size that allows a user to hold and move them by hand.
  • (3) An information processing system comprising: an information acquisition device including a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions; and a processing unit that generates, based on the sensing data of each of the plurality of sensing elements, navigation information of an auscultation position where the plurality of sensing elements are applied to the user's body.
  • (4) The information processing system according to (3), wherein the navigation information includes a navigation direction for directing the information acquisition device to a target auscultation position suitable for auscultation of vital sounds for sensing purposes, and the processing unit calculates the navigation direction.
  • The information processing system according to (4), wherein the processing unit calculates the navigation direction based on an index related to the vital sound for sensing purposes in each of the plurality of sensing elements, and the index includes at least one of: the volume of the vital sound for sensing purposes included in the sensing data; the ratio of the vital sound for sensing purposes to all vital sounds contained in the sensing data; and the frequency components of the vital sound for sensing purposes included in the sensing data.
  • The information processing system according to (4), wherein the processing unit calculates the navigation direction using the sensing data of each of the plurality of sensing elements and a trained model, and the trained model outputs data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements is input.
  • The processing unit generates vital sound data based on the sensing data of each of the plurality of sensing elements.
  • (12) The processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements so as to generate the vital sound data suitable for auscultation of vital sounds for sensing purposes.
  • The information acquisition device includes an environmental sound sensor, and the processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements based on the sensing data of the environmental sound sensor.
  • The information acquisition device includes an environmental sound sensor, and the processing unit generates the vital sound data with reduced noise based on the sensing data of the environmental sound sensor.
  • 100 Information Processing System 11 Information Acquisition Device 111 Sensor Array 111a Sensing Element 112 Tip Part 112a Main Surface 113 Body Part 114 Environmental Sound Sensor 115 Motion Sensor 116 Light Emitting Part 116a Light Emitting Element 12 User Terminal 13 Communication Part 14 User Interface Part 15 Processing Part 16 Storage unit 161 Application program 162 Learned model 163 User data 2 Information processing device 21 Communication unit 22 Estimation unit 23 Storage unit 231 Patient information 232 Community information 233 Algorithm DB 234 Trained model 235 Recommended information 236 Anonymously processed information 24 Recommendation unit 25 Processing unit 3 Medical staff terminal 31 Communication unit 32 User interface unit 33 Storage unit 331 Medical information 4 Community member terminal 41 Communication unit 42 User interface unit 5 Life/health insurance Terminal 51 Communication Unit 52 Analysis Unit 53 Storage Unit 531 Customer/Employee Information 6 Product/Service Provider Terminal 61 Communication Unit 62 User Interface Unit 7 Analysis Terminal 71 Communication Unit 72 Analysis Unit 73 User Interface Unit 8 chair 81 support G arrow (navigation direction) AD Accumulated data C Medical worker

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

This information acquisition device (11) is equipped with multiple sensing elements (111a) that are arranged two-dimensionally so that each element senses a vital sound at an opposing position.

Description

Information acquisition device and information processing system

The present disclosure relates to an information acquisition device and an information processing system.

For example, Patent Document 1 discloses a system for navigating the position of a diagnostic device used on the patient side in online medical care (remote medical care).
Patent Document 1: Japanese National Publication of International Patent Application No. 2014-509010
In medical care that includes auscultation, if a doctor or the like is not nearby, patients must apply a stethoscope sensor or the like to their own bodies. It is not easy to align it with an appropriate auscultation position and sense vital sounds.

One aspect of the present disclosure provides technology that enables sensing of vital sounds at an appropriate auscultatory position.

An information acquisition device according to one aspect of the present disclosure includes a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions.

An information processing system according to one aspect of the present disclosure includes: an information acquisition device including a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions; and a processing unit that generates, based on the sensing data of each of the plurality of sensing elements, navigation information of an auscultation position where the plurality of sensing elements are applied to the user's body.
FIG. 1 is a diagram illustrating an example of a schematic configuration of an information processing system according to an embodiment.
FIG. 2 is a diagram showing an example of a schematic configuration of a sensor.
FIG. 3 is a diagram showing an example of a schematic configuration of a sensor.
FIG. 4 is a diagram showing an example of functional blocks of an information processing system.
FIG. 5 is a diagram showing an example of calculation of a navigation direction.
FIG. 6 is a diagram showing an example of processing.
FIG. 7 is a diagram showing an example of presentation of navigation information and the like.
FIG. 8 is a diagram showing an example of processing.
FIG. 9 is a diagram showing an example of presentation of navigation information and the like.
FIG. 10 is a diagram showing an example of presentation of navigation information and the like.
FIG. 11 is a flowchart showing an example of the flow of vital sound sensing.
FIG. 12 is a diagram showing an example of an article provided with an information acquisition device.
FIG. 13 is a diagram showing an example of a schematic configuration of an information acquisition device.
FIG. 14 is a diagram showing an example of functional blocks of an information processing system.
FIG. 15 is a diagram showing an example of presentation of accumulated data.
FIG. 16 is a diagram showing an example of a schematic configuration of an information processing system.
FIG. 17 is a diagram showing an example of functional blocks of an information processing system.
FIG. 18 is a block diagram showing an example of a hardware configuration.
Embodiments of the present disclosure will be described in detail below based on the drawings. In each of the following embodiments, the same elements are given the same reference numerals, and redundant description is omitted.

The present disclosure will be described in the order of items shown below.
0. Introduction
1. Embodiment
 1.1 Navigation of the auscultation position
 1.2 Generation of vital sound data for auscultation
2. Application examples
3. Examples of utilization of accumulated data
4. Example of hardware configuration
5. Examples of effects
0. Introduction
Vital sounds, such as the breath sounds and heart sounds of patients with respiratory allergic diseases, heart diseases, lung diseases, and the like, are usually auscultated by a doctor placing a stethoscope on an appropriate part of the patient's body. However, in some cases, such as online medical care, the doctor is not near the patient. When patients apply a stethoscope sensor to their own bodies, it deviates from a position suitable for sensing vital sounds, making accurate sensing of vital sounds difficult.
It is difficult for patients, who are not experts, to determine an appropriate auscultation position by themselves. Since the appropriate auscultation position may differ depending on the patient's body shape, build, and the like, it is also difficult to prepare a manual for patients. Even if some vital sound can be sensed, it is difficult for the patient to judge whether the vital sound intended for sensing is being sensed correctly.

At least some of the above issues can be addressed by the disclosed technology. Although details will be described later, navigation of the auscultation position makes it easy for a user such as a patient to sense vital sounds at an appropriate auscultation position. Even in the case of online medical care, for example, optimal vital sounds can be shared with medical professionals such as doctors, enabling appropriate medical care. Appropriate vital sound data is generated, which also makes it possible, for example, to record changes in symptoms during the blank period between one consultation and the next and to share them with medical staff.
1. Embodiment
FIG. 1 is a diagram illustrating an example of a schematic configuration of an information processing system according to an embodiment. The information processing system 100 includes an information acquisition device 11, a user terminal 12, and a medical staff terminal 3. These devices and terminals are configured to be able to communicate via a network N.

The user of the information acquisition device 11 is referred to and illustrated as user U. The user U receives online medical care. A medical worker, such as a doctor, who treats the user U is referred to and illustrated as medical worker C. The medical worker C remotely examines the user U by using the medical staff terminal 3. The information acquisition device 11 has the function of a sensor that senses the vital sounds of the user U. Examples of vital sounds are breath sounds, lung sounds, heart sounds, bowel sounds, and the like. The information acquisition device 11 can also be called a stethoscope sensor, or a sensing unit having a function as a stethoscope sensor.

FIGS. 2 and 3 are diagrams showing examples of the schematic configuration of the sensor. The information acquisition device 11 includes a tip portion 112 provided with a sensor array 111, a body portion 113, and an environmental sound sensor 114. The figures also show an XYZ coordinate system. The XY plane direction corresponds to the array direction of the sensor array 111. The body portion 113 and the tip portion 112 are positioned in this order in the positive Z-axis direction. FIG. 2 schematically shows the external shape of the information acquisition device 11 viewed obliquely. FIG. 3 schematically shows the external shape of the information acquisition device 11 viewed from the front (viewed in the negative Z-axis direction).
The tip portion 112 and the body portion 113 may have a size that allows the user U to hold and move them by hand; for example, as a whole they may have approximately the same size and shape as the headpiece of a stethoscope. This makes it easy for the user U to handle the information acquisition device 11 as a stethoscope sensor. In the example shown in FIGS. 2 and 3, the tip portion 112 has a substantially disc shape extending in the XY plane direction. The tip portion 112 has a main surface 112a (the surface on the positive Z-axis side) on which the sensor array 111 is provided. The body portion 113 has a substantially cylindrical shape extending in the Z-axis direction and supports the tip portion 112 from the side opposite to the main surface 112a.

The sensor array 111 includes a plurality of sensing elements 111a arranged two-dimensionally (on the XY plane). In the example shown in FIGS. 2 and 3, the plurality of sensing elements 111a are arranged in a honeycomb pattern. However, the number, shape, and the like of the sensing elements 111a are not limited to the examples shown in FIGS. 2 and 3. To the extent that no contradiction arises, the sensor array 111 and the plurality of sensing elements 111a may be read interchangeably as appropriate.

The sensing elements 111a sense the vital sounds of the user U while the sensor array 111 is applied to the body of the user U. Various known configurations used in technical fields such as stethoscope sensors may be employed. For example, each sensing element 111a may include a diaphragm that vibrates in response to sound pressure. Signal data of a time waveform (time-series data) corresponding to the vital sounds of the user U is obtained. The signal data obtained by a sensing element 111a is also called "sensing data". To the extent that no contradiction arises, sensing data may be read as signal data, signals, or the like as appropriate.

Each of the plurality of sensing elements 111a senses the vital sound at the position it faces. Each sensing element 111a is configured such that no sound interference occurs between it and the adjacent sensing elements 111a. For example, the diaphragm of each sensing element 111a is configured to vibrate independently of the diaphragms of the adjacent sensing elements 111a. The sensing data of each of the plurality of sensing elements 111a can therefore differ from element to element.

The environmental sound sensor 114 includes, for example, a microphone and senses environmental sounds. Examples of environmental sounds are music, air-conditioner sounds, conversation, and the like. Environmental sounds are noise with respect to the vital sound for sensing purposes. The environmental sound sensor 114 is provided in a portion other than the sensor array 111. In this example, the environmental sound sensor 114 is provided on the body portion 113, more specifically, on the portion of the body portion 113 opposite to the tip portion 112 (the negative Z-axis side).
Returning to FIG. 1, the user terminal 12 is a terminal used by the user U. Examples of the user terminal 12 are smartphones, tablet terminals, and the like. The user terminal 12 receives sensing data from the information acquisition device 11 using, for example, wired communication, short-range wireless communication, or the like.

FIG. 4 is a diagram showing an example of functional blocks of the information processing system. As described above, the sensing data of each of the plurality of sensing elements 111a is transmitted from the information acquisition device 11 to the user terminal 12.

The user terminal 12 includes a communication unit 13, a user interface unit 14, a processing unit 15, and a storage unit 16. Examples of information stored in the storage unit 16 include an application program 161 (application software), a trained model 162, and user data 163.

The communication unit 13 communicates with other devices and the like. For example, the communication unit 13 receives sensing data from the information acquisition device 11. The communication unit 13 also transmits vital sound data, described later, to the medical staff terminal 3.

The user interface unit 14 accepts operations of the user terminal 12 by the user U and presents information to the user U.

The processing unit 15 functions as a control unit that controls each element of the user terminal 12 and executes various kinds of processing. For example, the processing unit 15 executes the application program 161. Execution of the application program 161 provides an application used by the user U. An example of the application is a medical care application. Medical care includes sensing of the vital sounds of the user U. The information acquisition device 11 described above is used for sensing the vital sounds.

The processing unit 15 provides various functions by processing the sensing data of each of the plurality of sensing elements 111a. Examples of the functions are navigation of the auscultation position and generation of vital sound data for auscultation, which are described below in order.
1.1 Navigation of the auscultation position
It is difficult for the user U, who is not an expert, to determine by himself or herself an appropriate auscultation position, that is, the position at which the sensor array 111 of the information acquisition device 11 is applied to the body of the user U. Therefore, in the information processing system 100, navigation of the auscultation position is performed.

The processing unit 15 of the user terminal 12 generates navigation information of the auscultation position based on the sensing data of each of the plurality of sensing elements 111a. The navigation information is information for guiding the auscultation position to a position suitable for auscultation of the vital sound for sensing purposes (hereinafter also referred to as the "target auscultation position").

The navigation information may include a navigation direction. The navigation direction is information for directing the sensor array 111 of the information acquisition device 11 toward the target auscultation position (the direction toward the target auscultation position). The processing unit 15 calculates the navigation direction based on the sensing data of each of the plurality of sensing elements 111a.

For example, the processing unit 15 calculates, for each of the plurality of sensing elements 111a, an index related to the vital sound for sensing purposes, and calculates the navigation direction based on the calculated indices. The indices may be calculated such that, among the plurality of sensing elements 111a, the index of a sensing element 111a located near the target auscultation position is larger than the index of a sensing element 111a located far from it. Examples of the index are the volume of the vital sound for sensing purposes included in the sensing data (of that sensing element 111a), the ratio of the vital sound for sensing purposes to all the vital sounds included in the sensing data, and the frequency components (for example, their magnitude or proportion) of the vital sound for sensing purposes included in the sensing data.
The sensing data of each of the plurality of sensing elements 111a includes not only the vital sound for sensing purposes (for example, heart sounds) but also other vital sounds (for example, breath sounds, lung sounds, bowel sounds, and the like). The processing unit 15 detects (extracts, etc.) the magnitude, frequency components, and the like of each vital sound included in the sensing data for each of the plurality of sensing elements 111a. The detection method is not particularly limited; for example, each vital sound included in the sensing data may be detected based on the results of comparing the sensing data of each of the plurality of sensing elements 111a with reference data prepared in advance, such as signal waveforms and frequency components of various vital sounds. A trained model 162 may also be used, as described later.

The processing unit 15 calculates the above-described index based on the detection results. For example, the magnitude (amplitude value, etc.) of the detected vital sound for sensing purposes is calculated as the volume of the vital sound for sensing purposes. The magnitude of the vital sound for sensing purposes relative to the total magnitude of all vital sounds included in the sensing data of the sensing element 111a is calculated as the ratio of the vital sound for sensing purposes. The detected frequency components of the vital sound for sensing purposes are used as-is as the frequency components of the target vital sound.

The processing unit 15 calculates the navigation direction based on the calculated indices of the plurality of sensing elements 111a. For example, in the array direction of the sensor array 111 (the XY plane direction), the processing unit 15 calculates, as the navigation direction, the direction from sensing elements 111a with small indices toward sensing elements 111a with large indices. The indices may instead be calculated such that the index of a sensing element 111a located near the target auscultation position is smaller than that of a sensing element 111a located far from it, in which case the direction from sensing elements 111a with large indices toward sensing elements 111a with small indices may be calculated as the navigation direction.

FIG. 5 is a diagram showing an example of calculation of the navigation direction. Each sensing element 111a is shown hatched according to the magnitude of its index. In this example, the sensing elements 111a located at the upper right of the sensor array 111 (the positive X-axis and positive Y-axis side) have larger indices than the sensing elements 111a located at the lower left of the sensor array 111 (the negative X-axis and negative Y-axis side). The processing unit 15 calculates the direction from the lower left of the sensor array 111 toward the upper right of the sensor array 111 as the navigation direction. In the figure, the calculated navigation direction is schematically indicated by an outlined arrow.
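For illustration of this index-based calculation, the following is a minimal sketch, not taken from the publication: it assumes the band-energy ratio as the index and derives the navigation direction as the index-weighted centroid of the element positions relative to the array center. The function name, the frequency band, and the sampling rate are all assumptions made for the example.

```python
import numpy as np

def navigation_direction(sensing_data, positions, band=(20.0, 200.0), fs=4000):
    """Estimate a navigation direction from per-element sensing data.

    sensing_data: array of shape (n_elements, n_samples), one waveform
                  per sensing element.
    positions:    array of shape (n_elements, 2), XY coordinates of each
                  element on the sensor array.
    band:         frequency band (Hz) assumed to contain the vital sound
                  of interest (e.g., heart sounds).
    """
    spectra = np.abs(np.fft.rfft(sensing_data, axis=1))
    freqs = np.fft.rfftfreq(sensing_data.shape[1], d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])

    # Index per element: energy of the target band relative to total energy
    # (the "ratio" index described above; volume or frequency components
    # could be used instead).
    target = spectra[:, in_band].sum(axis=1)
    total = spectra.sum(axis=1) + 1e-12
    index = target / total

    # Direction from low-index elements toward high-index elements: the
    # index-weighted centroid of the element positions, taken relative to
    # the array center, points toward the target auscultation position.
    center = positions.mean(axis=0)
    centroid = (index[:, None] * positions).sum(axis=0) / index.sum()
    direction = centroid - center
    norm = np.linalg.norm(direction)
    return direction / norm if norm > 1e-9 else np.zeros(2)
```

With the XY coordinates of the elements of FIGS. 2 and 3 in `positions`, the returned unit vector corresponds to the outlined arrow of FIG. 5; any of the other indices described above could be substituted for the band-energy ratio.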
Part or all of the calculation of the navigation direction may use the trained model 162. In that case, the processing unit 15 calculates the navigation direction using the sensing data of each of the plurality of sensing elements 111a and the trained model 162.

For example, the trained model 162 may be generated by machine learning using training data so as to output data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input. In this case, the processing unit 15 obtains the data corresponding to the navigation direction by inputting data corresponding to the sensing data of each of the plurality of sensing elements 111a into the trained model 162.

The trained model 162 may also be used for detecting each vital sound based on the sensing data, calculating the indices, calculating the navigation direction based on the indices, and the like. For example, the trained model 162 may output data corresponding to each vital sound included in the respective sensing data when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input. The trained model 162 may output data corresponding to the index of each of the plurality of sensing elements 111a when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input. The trained model 162 may also output data corresponding to the navigation direction when data corresponding to the indices of the plurality of sensing elements 111a is input.

The training data may include data obtained from the medical care (auscultation, etc.) of many patients performed using the information acquisition device 11 or a device having a configuration similar to that of the information acquisition device 11. Together with, or instead of, such general-purpose data, the training data may also include data obtained by the user U using the information processing system 100. Data obtained by the user U using the information processing system 100 is stored (saved and accumulated) in the storage unit 16 as user data 163. Any data from which the training data of the trained model 162, for example its input data and output data, can be obtained may be stored in the storage unit 16 as user data 163. Machine learning using training data that includes the user data 163 yields a trained model 162 optimized for the usage status and usage environment of the information acquisition device 11 by the user U.

For example, the trained model 162 may be generated by machine learning using training data including the above-described general-purpose data, and then updated by additional machine learning using training data including the accumulated user data 163. The trained model 162 can thus be used even before user data 163 has been accumulated.

The trained model 162 may also be generated by machine learning using only training data that includes the accumulated user data 163. Compared with using training data including general-purpose data, a trained model 162 specifically optimized for the user U can then be used, further increasing the possibility of optimizing the navigation information for each user U. Before user data 163 has been accumulated, navigation may be performed according to instructions from the medical worker C or the like.

The trained model 162 may be generated by the processing unit 15, or by a device external to the user terminal 12 (a server device, etc.).
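The publication does not specify the architecture or input encoding of the trained model 162, so the following sketch shows only one plausible shape: per-element band energies as the input features, and a linear map standing in for the trained model. Every name and dimension here (band_energy_features, the seven elements, the eight bands, the matrix W) is an assumption made for illustration; in practice W would be replaced by parameters learned from the general-purpose data and the user data 163.

```python
import numpy as np

def band_energy_features(sensing_data, fs=4000, n_bands=8, fmax=800.0):
    """Per-element band energies as one possible model input format."""
    spectra = np.abs(np.fft.rfft(sensing_data, axis=1)) ** 2
    freqs = np.fft.rfftfreq(sensing_data.shape[1], d=1.0 / fs)
    edges = np.linspace(0.0, fmax, n_bands + 1)
    feats = np.stack(
        [spectra[:, (freqs >= lo) & (freqs < hi)].sum(axis=1)
         for lo, hi in zip(edges[:-1], edges[1:])],
        axis=1)                       # shape: (n_elements, n_bands)
    return np.log1p(feats).ravel()    # one flat feature vector

# Placeholder for the trained model 162: any regressor mapping the feature
# vector to a 2D direction would do; random weights stand in for training.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 7 * 8))       # assumes 7 sensing elements, 8 bands

def predict_direction(features, W=W):
    d = W @ features
    n = np.linalg.norm(d)
    return d / n if n > 1e-9 else np.zeros(2)
```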
For example, the navigation direction is calculated as described above. The processing unit 15 generates navigation information including the calculated navigation direction. The navigation information is presented to the user U by the user interface unit 14, as described later.
1.2 Generation of vital sound data for auscultation
In the information processing system 100, each of the plurality of sensing elements 111a of the information acquisition device 11 senses vital sounds, so a vital sound for auscultation must be produced from their sensing data. The processing unit 15 generates vital sound data based on the sensing data of each of the plurality of sensing elements 111a.
In the simplest case, the processing unit 15 may generate the vital sound data by adding the sensing data of each of the plurality of sensing elements 111a. As a further refinement, the processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a so as to generate vital sound data suitable for auscultation of the vital sound for sensing purposes.

FIG. 6 is a diagram showing an example of processing. Examples of the processing are weight calculation processing P1, weighting processing P2, addition processing P3, noise reduction processing P4, and navigation information generation processing P5. Among these, the weight calculation processing P1, the weighting processing P2, the addition processing P3, and the noise reduction processing P4 in particular relate to the generation of the vital sound data. The navigation information generation processing P5 is the processing that generates the navigation information described above; for example, it calculates the navigation direction as described earlier with reference to FIG. 5.

In the weight calculation processing P1, a weight (coefficient, etc.) for the sensing data of each of the plurality of sensing elements 111a is calculated. In the weighting processing P2, the corresponding sensing data is weighted with the weights calculated in the weight calculation processing P1 (multiplication by the coefficients, etc., is performed). In the addition processing P3, the weighted sensing data are added together. In the noise reduction processing P4, noise contained in the data obtained by the addition in the addition processing P3 is reduced. The data whose noise has been reduced by the noise reduction processing P4 is obtained as the vital sound data.

In the weight calculation processing P1, the weights are calculated based on the addition result of the addition processing P3. For example, the weights are calculated such that the weight of the sensing data of a sensing element 111a that contains a relatively large amount of the vital sound for sensing purposes (for example, one with a large index as described above) is larger than the weight of the sensing data of a sensing element 111a that does not.

The sensing data of the environmental sound sensor 114 may also be used in the weight calculation processing P1. For example, the weights are calculated such that the weight of the sensing data of a sensing element 111a that contains relatively little environmental sound is larger than the weight of the sensing data of a sensing element 111a that does not. The environmental sound, that is, the noise, contained in the data obtained by the addition in the addition processing P3 is thereby reduced.

In the noise reduction processing P4, vital sound data with reduced environmental sound, that is, reduced noise, is generated based on the sensing data of the environmental sound sensor 114. Various known methods (noise cancellation techniques, etc.) may be used as the noise reduction method.
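As one way to picture the pipeline of P1 through P4, here is a minimal sketch assuming index-proportional weights for P1 and spectral subtraction against the environmental sound sensor's signal for P4; the publication leaves the exact techniques open, and the function and parameter names are invented for the example. The ambient reference is assumed to have the same length and sampling rate as the element signals.

```python
import numpy as np

def make_vital_sound(sensing_data, index, ambient, alpha=1.0):
    """Weighted addition (P1-P3) followed by noise reduction (P4).

    sensing_data: (n_elements, n_samples) waveforms from the elements.
    index:        (n_elements,) per-element index of the target vital
                  sound (e.g., from navigation_direction above).
    ambient:      (n_samples,) waveform from the environmental sound
                  sensor 114, treated as a noise reference.
    """
    # P1: weights proportional to how much target sound each element sees.
    weights = index / (index.sum() + 1e-12)

    # P2 + P3: weighted addition of the element signals.
    mixed = (weights[:, None] * sensing_data).sum(axis=0)

    # P4: simple spectral subtraction using the ambient reference (one
    # possible noise-reduction method among the known techniques).
    mix_f = np.fft.rfft(mixed)
    amb_f = np.fft.rfft(ambient)
    mag = np.maximum(np.abs(mix_f) - alpha * np.abs(amb_f), 0.0)
    return np.fft.irfft(mag * np.exp(1j * np.angle(mix_f)), n=len(mixed))
```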
The trained model 162 may also be used for the weight calculation processing P1 described above. In that case, the trained model 162 outputs, for example, data corresponding to the weight of each sensing data when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input. The processing unit 15 obtains the data corresponding to the weights of the respective sensing data by inputting data corresponding to the sensing data of each of the plurality of sensing elements 111a into the trained model 162. As mentioned earlier, the trained model 162 may be a trained model generated by machine learning using training data including the user data 163. This increases the possibility of generating vital sound data optimized for each user U.

For example, vital sound data suitable for auscultation of the vital sound for sensing purposes is generated as described above. Part or all of the processing by the processing unit 15 may be executed in the information acquisition device 11. In that case, the processing may be executed by, for example, a processing unit (processor, etc.), not shown, mounted in the body portion 113 of the information acquisition device 11. To the extent that no contradiction arises, the processing unit 15 of the user terminal 12 may be read as the processing unit in the information acquisition device 11 as appropriate.
Returning to FIG. 4, the user interface unit 14 presents (displays, etc.) the navigation information generated by the processing unit 15 and presents the vital sound data (waveform display, sound output, etc.). This is described with reference to FIG. 7.

FIG. 7 is a diagram showing an example of presentation of navigation information and the like. The user interface unit 14 is exemplified as the display screen (display unit, etc.) of the user terminal 12. The waveform of the vital sound is displayed in the upper portion of the display screen. Together with, or instead of, the waveform display, the vital sound may be output as sound. Navigation information is displayed in the lower portion of the display screen. In this example, the navigation information includes the appearance of the information acquisition device 11 and the navigation direction from the information acquisition device 11 toward the target auscultation position. The navigation direction is displayed as an arrow G extending from the information acquisition device 11 as its base.

For example, by following the navigation information presented as described above, the user U can bring the sensor array 111 of the information acquisition device 11 closer to the target auscultation position. In this example, the user U moves the information acquisition device 11, which is applied to his or her own body, along the direction of the displayed arrow G. Until sensing of the vital sound at the target auscultation position is completed, the processing unit 15 repeatedly generates navigation information and vital sound data, and the displayed information is updated.

Returning to FIG. 4, the communication unit 13 transmits the vital sound signal to the medical staff terminal 3.

The medical staff terminal 3 includes a communication unit 31, a user interface unit 32, and a storage unit 33. Medical information 331 is an example of information stored in the storage unit 33. The medical information 331 includes, for example, information such as the electronic medical record of the user U and is used for the medical care of the user U by the medical worker C and the like.

The communication unit 31 communicates with other devices and the like. For example, the communication unit 31 receives the vital sound signal transmitted from the user terminal 12.

The user interface unit 32 accepts operations of the medical staff terminal 3 by the medical worker C and presents information to the medical worker C. For example, the user interface unit 32 presents the vital sound (sound output, etc.). The medical worker C can examine the user U based on the presented vital sound.
2. Application examples
Various applications based on the technology described above are possible. Some application examples are described below.

In one embodiment, the movement, amount of movement, and the like of the information acquisition device 11 may be sensed, which enables generation of more detailed navigation information. This is described with reference to FIG. 8.

FIG. 8 is a diagram showing an example of processing. The illustrated information acquisition device 11 further includes a motion sensor 115. The motion sensor 115 senses the movement, amount of movement, and the like of the information acquisition device 11. Examples of the motion sensor 115 are an acceleration sensor, an angle sensor (gyro sensor), and the like.
In the navigation information generation processing P5, the position of the sensor array 111 of the information acquisition device 11 is also calculated based on the sensing data of the motion sensor 115. The position is calculated based on, for example, the direction and distance of movement from a known reference position. The calculated position of the information acquisition device 11 is also included in the navigation information and presented by the user interface unit 14. For example, the amount (distance) and direction of movement to the target auscultation position can be presented as concrete numerical values, or presented together with the current position of the information acquisition device 11. Presenting detailed navigation information increases the possibility of improving navigation accuracy.
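For the motion-sensor path, a minimal dead-reckoning sketch under strong assumptions (planar motion, gravity and sensor bias already removed) could look as follows. Double integration of acceleration drifts quickly in practice, which is presumably why the weight-change data described next is used as an auxiliary cue; all names here are illustrative.

```python
import numpy as np

def integrate_position(accel, dt, origin=(0.0, 0.0)):
    """Dead-reckon the XY position of the sensor array from accelerometer
    samples, starting from a known reference position (for example, the
    first confirmed auscultation position).

    accel: (n_samples, 2) acceleration in the body plane, with gravity
           and sensor bias assumed already removed.
    dt:    sampling interval in seconds.
    """
    accel = np.asarray(accel, dtype=float)
    velocity = np.cumsum(accel * dt, axis=0)                 # a -> v
    position = np.asarray(origin) + np.cumsum(velocity * dt, axis=0)  # v -> x
    return position  # position[-1] is the current estimate
```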
In the example shown in FIG. 8, the weights calculated by the weight calculation processing P1 are also reflected in the position calculation in the navigation information generation processing P5. Data on the amount of change between the weights calculated in the (n-1)-th processing (n being a positive integer) and the weights calculated in the n-th processing is used to assist in calculating the position of the information acquisition device 11.
In one embodiment, the navigation information may include an image of the body. This is described with reference to FIGS. 9 and 10.

FIGS. 9 and 10 are diagrams showing examples of presentation of navigation information and the like. Referring to FIG. 9, an image of the body is displayed in the lower portion of the display screen. The displayed image of the body also includes the target auscultation positions and characteristic parts of the body, for example the clavicle, the sternum, and the like. The target auscultation positions are shown in order as circles numbered 1 to 10 on the body. The user U applies the sensor array 111 of the information acquisition device 11 to the position on his or her own body corresponding to the characteristic part of the body shown in the image. This position may also be indicated by the medical worker C. An arrow G (navigation direction) pointing toward the first target auscultation position is displayed, and the user U moves (the sensor array 111 of) the information acquisition device 11 in that direction.

With the sensor array 111 of the information acquisition device 11 applied to the first target auscultation position (or its vicinity), the vital sound is sensed and vital sound data is generated. The medical worker C can examine the user U based on the vital sound appropriately sensed at the target auscultation position. This first target auscultation position may be set, for example, as the reference position described above. During navigation to the second through tenth target auscultation positions, the position of the sensor array 111 of the information acquisition device 11, calculated based on the amount of movement from the reference position and the like, is also displayed. The vital sound is appropriately sensed at each target auscultation position.

In the example shown in FIG. 10, the user U is a pregnant woman, and the target auscultation position for sensing the vital sounds of the fetus is shown as a circle on her body. When the user U applies the sensor array 111 of the information acquisition device 11 to her body, an arrow G (navigation direction) pointing toward the target auscultation position is displayed. The user U moves (the sensor array 111 of) the information acquisition device 11 in that direction. The vital sound is sensed with the sensor array 111 of the information acquisition device 11 applied to the target auscultation position. The medical worker C can examine the user U based on the vital sound appropriately sensed at the target auscultation position.
 図11は、バイタル音のセンシングの流れの例を示すフローチャートである。ステップS1において、聴診位置の初期位置が設定される。例えば上述のように、はじめに情報取得装置11のセンサアレイ111を当てる身体の特徴的部位が画像表示されたり、聴診位置が医療従事者Cによって指示されたりする。ユーザUは、対応する自身の身体の位置に、情報取得装置11のセンサアレイ111を当てる。 FIG. 11 is a flowchart showing an example of the flow of vital sound sensing. In step S1, the initial position of the auscultatory position is set. For example, as described above, a characteristic part of the body to which the sensor array 111 of the information acquisition device 11 is applied is first displayed as an image, and the auscultation position is indicated by the medical staff C. FIG. The user U applies the sensor array 111 of the information acquisition device 11 to the corresponding position of his/her own body.
 ステップS2において、ナビゲーション情報が提示される。ユーザ端末12の処理部15は、複数のセンシング素子111aそれぞれのセンシングデータに基づいて、ナビゲーション情報を生成する。ユーザインタフェース部14は、ナビゲーション情報を提示する。ユーザUは、提示されたナビゲーション情報に従って、情報取得装置11のセンサアレイ111を移動させる。移動に伴い、新たなナビゲーション情報が繰り返し生成され提示される。なお、バイタル音データの生成や医療従事者端末3への送信等も併せて行われる。ステップS2の動作は、情報取得装置11のセンサアレイ111が目的聴診位置に移動するまで繰り返される。 In step S2, navigation information is presented. The processing unit 15 of the user terminal 12 generates navigation information based on the sensing data of each of the plurality of sensing elements 111a. The user interface unit 14 presents navigation information. The user U moves the sensor array 111 of the information acquisition device 11 according to the presented navigation information. As the user moves, new navigation information is repeatedly generated and presented. In addition, generation of vital sound data, transmission to the medical staff terminal 3, and the like are also performed. The operation of step S2 is repeated until the sensor array 111 of the information acquisition device 11 moves to the target auscultation position.
In step S3, the auscultation position is fixed. Vital sounds are sensed at the target auscultation position, so sensing is performed at an appropriate auscultation position and properly generated vital sound data is obtained.
In one embodiment, the information acquisition device 11 may be provided on an article so that the sensor array 111 contacts the back of the user U. An article provided with the information acquisition device 11 in this way may also be a component of the information processing system 100. Examples of such articles include furniture such as chairs and beds.
FIG. 12 is a diagram showing an example of an article provided with the information acquisition device. The illustrated article is a chair 8. As shown in FIG. 12(A), the chair 8 includes, in its backrest, a support portion 81 that supports the information acquisition device 11. The support portion 81 supports the information acquisition device 11 so that the sensor array 111 of the information acquisition device 11 contacts the back of the user U while the user U leans against the backrest of the chair 8. As shown in FIG. 12(B), the displayed navigation information includes, in addition to an image of the back, the target auscultation position and characteristic parts of the back such as the shoulder blades and spine. The target auscultation position is indicated by a circle.
The support portion 81 is configured to be movable in the XY plane. In this example, the user U operates direction keys K displayed in the lower part of the display screen of the user terminal 12 to move the support portion 81 and, with it, the sensor array 111 of the information acquisition device 11 supported by it. So that the chair 8 can be operated from the user terminal 12, the chair 8 and the user terminal 12 may be configured to communicate with each other by, for example, a wired connection or short-range wireless communication.
When the sensor array 111 of the information acquisition device 11 contacts the back of the user U, an arrow G (navigation direction) pointing toward the target auscultation position is displayed. The user U moves the support portion 81 in that direction. The sensor array 111 of the information acquisition device 11 moves together with the support portion 81 to the target auscultation position, and vital sounds are sensed in that state. The medical staff C can examine the user U based on the appropriately sensed vital sounds.
With the above configuration, auscultation of the back of the user U can also be performed easily. Note that the movement of the sensor array 111 of the information acquisition device 11 by the support portion 81 may be automatically controlled, for example by the processing unit 15 of the user terminal 12, without any operation by the user U.
In one embodiment, the information acquisition device 11 itself may present the navigation information. This is described with reference to FIG. 13.
FIG. 13 is a diagram showing an example of a schematic configuration of the information acquisition device. FIG. 13 schematically shows the external shape of the information acquisition device 11 viewed from the back (viewed in the positive Z-axis direction). The information acquisition device 11 includes a light emitting unit 116. The light emitting unit 116 is provided on a portion of the tip portion 112 and the body portion 113 other than the main surface 112a of the tip portion 112, so that it is visible to the user U when the information acquisition device 11 is viewed from the back. In this example, the light emitting unit 116 is provided in a ring shape on the surface of the tip portion 112 opposite to the main surface 112a.
The light emitting unit 116 includes a plurality of light emitting elements 116a arranged in a ring. Examples of the light emitting element 116a include an LED (Light Emitting Diode) and an OLED (Organic Light-Emitting Diode). Each of the plurality of light emitting elements 116a can be individually turned on and off (blinked). In the example shown in FIG. 13, one light emitting element 116a on the upper left side (the positive X-axis and positive Y-axis side) is lit. The direction from the center of the ring toward the lit light emitting element 116a (schematically illustrated as an arrow G) is presented as the navigation direction. Besides the navigation direction, the distance to the target auscultation position can also be presented, for example by changing the blinking pattern of the light emitting elements 116a.
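As a sketch of how such a ring presentation might be driven, selecting the LED reduces to quantizing the navigation angle, and the distance can be mapped onto a blink period. The LED count, indexing convention, and distance encoding below are our assumptions.

```python
import math

def led_index(direction_xy, num_leds=12):
    """Return which of num_leds ring LEDs (index 0 on the +X axis,
    counted counter-clockwise) lies closest to the navigation direction."""
    angle = math.atan2(direction_xy[1], direction_xy[0])   # CCW from +X
    step = 2.0 * math.pi / num_leds
    return round(angle / step) % num_leds

def blink_period(distance_mm, near_s=0.2, far_s=1.0, range_mm=100.0):
    """Blink faster as the target auscultation position gets closer --
    one possible encoding of the remaining distance (our convention)."""
    return near_s + (far_s - near_s) * min(distance_mm / range_mm, 1.0)

print(led_index((1.0, 1.0)))   # +X/+Y direction (upper left in FIG. 13) -> LED 2
print(blink_period(25.0))      # 25 mm from the target -> 0.4 s blink period
```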
In this way, the light emitting unit 116 provided on the information acquisition device 11 can present the navigation information to the user U, so the auscultation position can be navigated without using the user interface unit 14 of the user terminal 12.
In one embodiment, data including the vital sound data may be saved and accumulated. This data may be accumulated in the user terminal 12 or in an external device. This is described with reference to FIG. 14.
FIG. 14 is a diagram showing an example of functional blocks of the information processing system. An information processing device 2 is exemplified as an external device of the information acquisition device 1. The information processing device 2 is configured to communicate with the information acquisition device 1 and the medical staff terminal 3 via the network N (FIG. 1).
In the information acquisition device 1, data including the vital sound data generated by the processing unit 15 is stored in the storage unit 16 as accumulated data AD. The vital sound data included in the accumulated data AD may be processed data.
The communication unit 13 of the user terminal 12 also transmits the vital sound data to the information processing device 2. The information processing device 2 includes a communication unit 21, an estimation unit 22, and a storage unit 23.
The communication unit 21 communicates with other devices and the like. For example, the communication unit 21 receives the vital sound data from the user terminal 12. Accumulated data AD including the vital sound data is also stored in the storage unit 23 of the information processing device 2. The information processing device 2 performs various analyses of the accumulated data AD. An example of such analysis is estimation of the medical condition of the user U, performed by the estimation unit 22.
For example, the estimation unit 22 estimates an index related to the medical condition of the user U based on the evaluation of feature values calculated from the vital sound data included in the accumulated data AD. An example of such a feature value is vital stability (the stability of the user U's heartbeat and the like). Various algorithms may be used for the estimation. The estimation unit 22 calculates feature values from the vital sound data using various feature calculation algorithms (which may be trained models), and estimates the index by evaluating the calculated feature values. Examples of the index include the degree of progression of a medical condition and the effectiveness of rehabilitation.
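As one concrete illustration of such a feature value, a heartbeat-stability score could be computed from the regularity of the inter-beat intervals extracted from heartbeat sounds. The definition below (coefficient of variation of the intervals) is our assumption, not the publication's algorithm.

```python
import numpy as np

def heartbeat_stability(beat_times_s):
    """Toy 'vital stability' feature: the coefficient of variation of the
    inter-beat intervals; lower values indicate a steadier heartbeat."""
    intervals = np.diff(np.asarray(beat_times_s, dtype=float))
    return float(np.std(intervals) / np.mean(intervals))

steady   = [0.0, 0.8, 1.6, 2.4, 3.2, 4.0]   # beats every 0.8 s
unsteady = [0.0, 0.7, 1.7, 2.2, 3.4, 4.0]   # irregular beat times
print(heartbeat_stability(steady))           # 0.0
print(heartbeat_stability(unsteady))         # ~0.33, noticeably less stable
```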
The accumulated data AD may also include the estimation results (analysis results) of the estimation unit 22. The accumulated data AD is transmitted to the medical staff terminal 3 and reflected in the medical information 331 (electronic medical records and the like) stored in the storage unit 33.
The accumulated data AD may be presented to the user U by the user terminal 12 or to the medical staff C by the medical staff terminal 3. This makes it possible, for example, to visualize the history of the user U's symptoms. The medical staff C can check at any time the condition of the user U, which previously could be checked only at hospital visits. Not only the medical staff C but also the user U can grasp the patient's condition, and changes in it, before the next hospital visit. It is also possible to predict a tendency toward aggravation from the symptom data and its changes and alert the medical staff C and the user U.
FIG. 15 is a diagram showing an example of presentation of the accumulated data, here by the user interface unit 14 of the user terminal 12. In the upper part of the display screen, the feature values for each day over a period of, for example, several months are displayed as a graph. The vital sounds of a selected day, in this example breathing sounds, heartbeat sounds, and the like, can be played back and checked. A similar presentation may be made by the user interface unit 32 of the medical staff terminal 3.
3. Examples of Utilization of Accumulated Data
The accumulated data can be utilized in various ways, as described below with reference to FIGS. 16 and 17.
FIG. 16 is a diagram showing an example of a schematic configuration of the information processing system. In addition to the information acquisition device 11, the user terminal 12, the information processing device 2, and the medical staff terminal 3 described so far, the information processing system 100 includes a community member terminal 4, a life insurance/health insurance terminal 5, a product/service provider terminal 6, and an analysis terminal 7. These devices and terminals are also configured to communicate via the network N.
The community member terminal 4 is a terminal used by a member M. The member M belongs to the same community as the user U and is, for example, a family member of the user U or another patient with a similar disease.
The life insurance/health insurance terminal 5, the product/service provider terminal 6, and the analysis terminal 7 are described later.
FIG. 17 is a diagram showing an example of functional blocks of the information processing system. The information processing device 2 further includes a recommendation unit 24 and a processing unit 25. The storage unit 23 of the information processing device 2 stores patient information 231, community information 232, an algorithm DB 233, a trained model 234, recommendation information 235, and anonymously processed information 236.
The patient information 231 includes information about the user U, who is a patient. Examples of the patient information 231 are the user U's disease information, hospital visit history information, rehabilitation history information, and the accumulated data AD described above (FIG. 14). The disease information includes the name of the user U's disease and the like. The hospital visit history information concerns the user U's past hospital visits. The rehabilitation history information concerns rehabilitation the user U has undergone in the past. These pieces of information are provided, for example, from the medical staff terminal 3.
The community information 232 is information about the community to which the user U belongs, and includes information about the members M and the community member terminals 4.
The algorithm DB 233, the trained model 234, the recommendation information 235, and the anonymously processed information 236 are described later.
The communication unit 21 and the estimation unit 22 are as described above. In FIG. 17, the information including the estimation results of the estimation unit 22 and the accumulated data AD is referred to as "result information". The communication unit 21 transmits the result information to other devices and terminals, in this example the user terminal 12, the medical staff terminal 3, the community member terminal 4, and the life insurance/health insurance terminal 5. The recommendation unit 24 and the processing unit 25 of the information processing device 2 are described later.
In the user terminal 12, the result information is presented by the user interface unit 14. The user U can thus learn various indices regarding his or her own physical function, behavior, disease, and the like. The result information transmitted from the information processing device 2 to the user terminal 12 may include content to be presented to the user U, enabling the presentation of individualized content based on the estimation results.
In the medical staff terminal 3, the result information is presented by the user interface unit 32. The medical staff C can learn various indices regarding the user U's physical function, behavior, disease, and the like. In addition, individualized content, for example a rehabilitation menu customized by the medical staff C to suit the user U, may be generated using the user interface unit 32. Such content is transmitted to the user terminal 12 by the communication unit 31 and presented to the user U.
The community member terminal 4 includes a communication unit 41 and a user interface unit 42. The communication unit 41 communicates with other devices and the like; for example, it receives the result information from the information processing device 2.
The user interface unit 42 accepts operations of the community member terminal 4 by the member M and presents information to the member M. For example, the result information from the information processing device 2 is presented and thereby shared with the member M. The result information transmitted from the information processing device 2 to the community member terminal 4 may include content to be presented to the member M, enabling the presentation of individualized content based on the estimation results.
The life insurance/health insurance terminal 5 is a terminal used by a life insurance company, health insurance provider, or the like. The life insurance/health insurance terminal 5 includes a communication unit 51, an analysis unit 52, and a storage unit 53. Customer/employee information 531 is exemplified as information stored in the storage unit 53. The customer/employee information 531 includes information about the user U's life insurance, health insurance, and the like.
The communication unit 51 receives the result information from the information processing device 2. The analysis unit 52 analyzes the result information and determines (for example, calculates) insurance premiums and rewards. The determination may involve the work and judgment of employees of the life insurance or health insurance company. Premium reductions, changes to limited plans, and the like may also be made. The communication unit 51 transmits premium/reward information, containing the determined premium and recommendations regarding it, to the user terminal 12. The premium/reward information is presented by the user interface unit 14 of the user terminal 12.
The product/service provider terminal 6 is a terminal used by a company or the like that provides products or services. Examples of products include wheelchairs, walking aids, rehabilitation equipment, health foods, and health appliances. An example of a service is a health application that can be run on the user terminal 12.
The product/service provider terminal 6 includes a communication unit 61 and a user interface unit 62. Product/service information that associates anonymized result information with products and services is input or generated, for example via the user interface unit 62. The communication unit 61 transmits the product/service information to the information processing device 2.
The communication unit 21 of the information processing device 2 receives the product/service information from the product/service provider terminal 6.
The recommendation unit 24 and the processing unit 25 of the information processing device 2 are now described. The recommendation unit 24 generates, based on the product/service information from the product/service provider terminal 6, recommendation information 235 containing information on products and services to be recommended to the user U. The communication unit 21 transmits the recommendation information 235 to the user terminal 12 and the community member terminal 4, where it is presented by the user interface unit 14 of the user terminal 12 or the user interface unit 42 of the community member terminal 4.
The processing unit 25 generates the anonymously processed information 236 by anonymizing the result information. The anonymously processed information 236 describes the anonymized personal information in association with the result information.
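The publication does not fix a specific anonymization method; as a minimal sketch, one common approach replaces direct identifiers with a salted one-way hash and coarsens quasi-identifiers while keeping the result information intact. All names below are illustrative.

```python
import hashlib

def anonymize(record, salt):
    """Produce an anonymized record: the user ID becomes a salted hash
    (irreversible without the salt) and the age is coarsened to a band,
    while the result information is carried over unchanged."""
    token = hashlib.sha256((salt + record["user_id"]).encode()).hexdigest()[:16]
    return {
        "token": token,
        "age_band": f"{(record['age'] // 10) * 10}s",
        "result_info": record["result_info"],
    }

record = {"user_id": "user-0001", "age": 47, "result_info": {"stability": 0.33}}
print(anonymize(record, salt="per-deployment-secret"))
```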
The communication unit 21 of the information processing device 2 transmits the anonymously processed information 236 to the analysis terminal 7.
The analysis terminal 7 is a terminal used, for example, by a company providing the products/services described above or by a pharmaceutical company conducting clinical development. The analysis terminal 7 includes a communication unit 71, an analysis unit 72, and a user interface unit 73.
The communication unit 71 receives the anonymously processed information 236 from the information processing device 2. The analysis unit 72 performs data analysis based on the anonymously processed information 236. The analysis may involve the work and judgment of company employees and the like. The user interface unit 73 presents information related to the data analysis. Examples of such analysis include analysis of the user base of products such as health foods and health appliances, and data analysis for clinical development. The anonymously processed information 236 can be utilized for various services, such as marketing analysis by manufacturers, analysis of the proportion, age, and gender of users with specific symptoms, and monitoring of symptoms in patients taking specific drugs.
In the above example, the sensing data of the information acquisition device 11 is transmitted to the information processing device 2 as vital sound data via the user terminal 12. However, part or all of the sensing data of the information acquisition device 11 may be transmitted directly from the information acquisition device 11 to the information processing device 2 without going through the user terminal 12, and the vital sound data may be generated in the information processing device 2. The navigation information may also be generated in the information processing device 2. That is, part or all of the functions of the processing unit 15 of the user terminal 12 may be implemented in the information processing device 2.
In the above description, the functions and information related to the various services of the life insurance/health insurance terminal 5, the product/service provider terminal 6, and the analysis terminal 7, such as the recommendation unit 24, the processing unit 25, the recommendation information 235, and the anonymously processed information 236, are provided in the information processing device 2. However, these functions and information may instead be provided in a server device or the like managed by the operator of the corresponding terminal (a service provider server device). The life insurance/health insurance terminal 5, the product/service provider terminal 6, and the analysis terminal 7 then communicate with the corresponding server devices to use those functions. This can reduce the processing load on the information processing device 2 and lower costs by simplifying its functions.
Some of the functions of each terminal may likewise be provided in a server device or the like managed by the operator of the corresponding terminal. For example, the function of the analysis unit 52 of the life insurance/health insurance terminal 5 may be provided in a server device managed by an insurance company, a health insurance provider, or the like, and the life insurance/health insurance terminal 5 communicates with that server device to use the function. The function of the analysis unit 72 of the analysis terminal 7 may be provided in a server device managed by a pharmaceutical company or the like, and the analysis terminal 7 communicates with that server device to use the function. This can reduce the processing load on the life insurance/health insurance terminal 5 and the analysis terminal 7 and lower costs by simplifying their functions.
4. Example of Hardware Configuration
FIG. 18 is a block diagram showing an example of the hardware configuration. The user terminal 12 is described below as an example; the same description applies to the information acquisition device 11, the information processing device 2, the medical staff terminal 3, the community member terminal 4, the life insurance/health insurance terminal 5, the product/service provider terminal 6, the analysis terminal 7, and the like. The various processes are realized by the cooperation of software and the hardware described below.
As shown in FIG. 18, the user terminal 12 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a. The user terminal 12 also includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, a communication device 913, and a sensor 915. The user terminal 12 may have a processing circuit such as a DSP or an ASIC in place of, or together with, the CPU 901.
The CPU 901 functions as an arithmetic processing device and a control device, and controls overall operations within the user terminal 12 according to various programs. The CPU 901 may also be a microprocessor. The ROM 902 stores programs, operation parameters, and the like used by the CPU 901. The RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during that execution, and the like. The CPU 901 can embody, for example, the processing unit 15 of the user terminal 12.
The CPU 901, the ROM 902, and the RAM 903 are interconnected by the host bus 904a, which includes a CPU bus and the like. The host bus 904a is connected via the bridge 904 to the external bus 904b, such as a PCI (Peripheral Component Interconnect/Interface) bus. Note that the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately and may be implemented as a single configuration (for example, one bus).
The input device 906 is implemented by a device through which the operator inputs information, such as a mouse, keyboard, touch panel, buttons, microphone, switches, and levers. The input device 906 may also be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA that supports operation of the user terminal 12. The input device 906 may further include, for example, an input control circuit that generates an input signal based on information input by the operator using the above input means and outputs it to the CPU 901. By operating the input device 906, the operator can input various data to the user terminal 12 and instruct it to perform processing operations.
The output device 907 is formed by a device capable of visually or audibly notifying the operator of acquired information. Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, and lamps; audio output devices such as speakers and headphones; and printer devices.
The storage device 908 is a device for storing data. The storage device 908 is implemented by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901, various data, various data acquired from outside, and the like. The storage device 908 can embody, for example, the storage unit 16 of the user terminal 12.
The drive 909 is a reader/writer for storage media, and is built into or externally attached to the user terminal 12. The drive 909 reads information recorded on an inserted removable storage medium such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs it to the RAM 903. The drive 909 can also write information to the removable storage medium.
The connection port 911 is an interface for connecting to external devices, for example a connection port for external devices capable of data transmission via USB (Universal Serial Bus) or the like.
The communication device 913 is, for example, a communication interface formed of a communication device or the like for connecting to a network 920 (corresponding, for example, to the network N in FIG. 1). The communication device 913 is, for example, a communication card for a wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like. The communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices according to a predetermined protocol such as TCP/IP. The communication device 913 can embody, for example, the communication unit 13 of the user terminal 12.
The sensor 915 may include at least some of the sensors included in the information acquisition device 11.
The network 920 is a wired or wireless transmission path for information transmitted from devices connected to it. For example, the network 920 may include public networks such as the Internet, telephone networks, and satellite communication networks; various LANs (Local Area Networks) including Ethernet (registered trademark); and WANs (Wide Area Networks). The network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol - Virtual Private Network).
The above is an example of a hardware configuration capable of realizing the functions of the user terminal 12. Each of the above components may be implemented using general-purpose members or by hardware specialized for the function of that component. The hardware configuration used can therefore be changed as appropriate according to the technical level at the time the present disclosure is implemented.
A computer program for realizing the functions of the user terminal 12 described above can be created and implemented on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. Recording media include, for example, magnetic disks, optical disks, magneto-optical disks, and flash memories. The computer program may also be distributed, for example, via a network without using a recording medium.
It should be noted that the embodiments and modifications disclosed in this specification are merely illustrative in all respects and should not be construed as limiting. The embodiments and modifications described above can be omitted, substituted, and changed in various forms without departing from the scope and spirit of the appended claims. For example, the embodiments and modifications described above may be combined in whole or in part, and embodiments other than those described above may be combined with them. The effects of the present disclosure described in this specification are merely examples, and other effects may be obtained.
The technical category embodying the above technical ideas is not limited. For example, the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in a method of manufacturing or using the above device, or by a computer-readable non-transitory recording medium on which such a computer program is recorded.
5. Examples of Effects
The technology described above can be specified, for example, as follows. One of the disclosed technologies is the information acquisition device 11. As described with reference to FIGS. 1 to 3 and elsewhere, the information acquisition device 11 includes a plurality of sensing elements 111a arranged two-dimensionally so as to sense vital sounds at opposing positions.
With the information acquisition device 11 described above, the auscultation position can be navigated by processing the sensing data of each of the plurality of sensing elements 111a, as described earlier. This makes it possible to sense vital sounds at an appropriate auscultation position.
As described with reference to FIGS. 2 and 3 and elsewhere, the information acquisition device 11 may include a tip portion 112 having a main surface 112a on which the plurality of sensing elements 111a are provided, and a body portion 113 that supports the tip portion 112 from the side opposite to the main surface 112a; the tip portion 112 and the body portion 113 may have a size that allows the user U to hold and move them by hand (for example, about the size of a stethoscope headpiece). This makes the information acquisition device 11 easy for the user U to handle as a stethoscope sensor.
The information processing system 100 described with reference to FIGS. 1 to 7 and elsewhere is also one of the disclosed technologies. The information processing system 100 includes the information acquisition device 11 with the plurality of sensing elements 111a described above, and the processing unit 15, which generates, based on the sensing data of each of the plurality of sensing elements 111a, navigation information of the auscultation position at which the plurality of sensing elements 111a are applied to the body of the user U. As described with reference to FIGS. 4 and 5 and elsewhere, the navigation information includes a navigation direction for directing the information acquisition device 11 to a target auscultation position suitable for auscultation of the vital sound to be sensed, and the processing unit 15 may calculate the navigation direction. Such navigation information can be used, for example, for sensing vital sounds at an appropriate auscultation position.
As described with reference to FIGS. 4 and 5 and elsewhere, the processing unit 15 may calculate the navigation direction based on an index related to the vital sound to be sensed at each of the plurality of sensing elements 111a, where the index includes at least one of: the loudness of the target vital sound included in the sensing data, the ratio of the target vital sound to all vital sounds included in the sensing data, and the frequency components of the target vital sound included in the sensing data. The navigation direction can be calculated based on such indices, for example.
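For illustration, all three indices can be computed per element from a short frame of sensing data in the frequency domain. The band limits below (roughly the range of heart sounds) and the function names are our assumptions.

```python
import numpy as np

def target_sound_indices(x, fs, band=(20.0, 200.0)):
    """Per-element indices for a target vital sound assumed to occupy
    `band`: its loudness (in-band energy), its share of the total vital
    sound energy, and its dominant frequency component."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs < band[1])
    loudness = float(spectrum[in_band].sum())
    total = float(spectrum.sum()) + 1e-12
    dominant = float(freqs[in_band][np.argmax(spectrum[in_band])])
    return {"loudness": loudness, "ratio": loudness / total, "dominant_hz": dominant}

fs = 2000                                    # Hz, assumed sampling rate
t = np.arange(fs) / fs                       # one second of signal
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 400 * t)
print(target_sound_indices(x, fs))           # ratio ~0.9, dominant 50 Hz
```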
As described with reference to FIG. 4 and elsewhere, the processing unit 15 may calculate the navigation direction using the sensing data of each of the plurality of sensing elements 111a and a trained model 162, where the trained model 162 outputs data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements 111a is input. The navigation direction can also be calculated using such a trained model 162. In that case, the trained model 162 may be one generated by machine learning using training data including user data 163, that is, data obtained through the user U's use of the information processing system 100. The navigation direction can then be calculated using a trained model 162 optimized for the user U's usage conditions, usage environment, and so on.
As described with reference to FIG. 8 and elsewhere, the information acquisition device 11 may include a motion sensor 115, and the processing unit 15 may generate the navigation information, including the positions of the plurality of sensing elements 111a, based on the sensing data of the motion sensor. This increases the likelihood of improving navigation accuracy.
As described with reference to FIGS. 4 and 7 and elsewhere, the information processing system 100 may include the user interface unit 14, which presents the navigation information to the user U. Following the presented navigation information, the user U can bring the plurality of sensing elements 111a of the information acquisition device 11 closer to the target auscultation position.
As described with reference to FIG. 13 and elsewhere, the information processing system 100 may include the tip portion 112 and the body portion 113 described above, and a light emitting unit 116 provided on a portion of the tip portion 112 and the body portion 113 other than the main surface 112a of the tip portion 112; the light emitting unit 116 may present the navigation information to the user U. The navigation information can thus also be presented by such a light emitting unit 116.
As described with reference to FIG. 6 and elsewhere, the processing unit 15 may generate vital sound data based on the sensing data of each of the plurality of sensing elements 111a. For example, the processing unit 15 may perform weighted addition of the sensing data of each of the plurality of sensing elements 111a so as to generate vital sound data suitable for auscultation of the vital sound to be sensed. In this case, the information acquisition device 11 may include an environmental sound sensor 114, and the processing unit 15 may perform the weighted addition of the sensing data of each of the plurality of sensing elements 111a based on the sensing data of the environmental sound sensor 114. The processing unit 15 may also generate vital sound data with reduced noise based on the sensing data of the environmental sound sensor. Generating vital sound data in this way, for example, makes it possible to obtain vital sounds suitable for auscultation.
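A minimal sketch of the weighted addition and the environmental-sound-based noise reduction follows, in the spirit of processes P1 to P4. The weights are given directly here (however P1 derives them), and the noise reduction uses simple spectral subtraction, which is our assumption rather than a method the publication specifies.

```python
import numpy as np

def weighted_mix(element_signals, weights):
    """Weighted addition (P2 + P3) of per-element signals into one
    vital sound signal; the weights come from P1, however derived."""
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * np.asarray(element_signals)).sum(axis=0) / w.sum()

def subtract_ambient(mixed, ambient, alpha=1.0):
    """Noise reduction (P4) by spectral subtraction: remove the
    environmental sound sensor's magnitude spectrum from the mix."""
    m, a = np.fft.rfft(mixed), np.fft.rfft(ambient)
    cleaned = np.maximum(np.abs(m) - alpha * np.abs(a), 0.0)
    return np.fft.irfft(cleaned * np.exp(1j * np.angle(m)), n=len(mixed))

fs = 1000
t = np.arange(fs) / fs
heart = np.sin(2 * np.pi * 40 * t)              # target vital sound
hum = 0.5 * np.sin(2 * np.pi * 60 * t)          # ambient noise
elements = [heart + hum, 0.6 * heart + hum]     # two sensing elements
out = subtract_ambient(weighted_mix(elements, [0.8, 0.2]), hum)
print(round(float(np.corrcoef(out, heart)[0, 1]), 3))  # ~1.0: noise removed
```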
As described with reference to FIGS. 4 and 6 and elsewhere, the processing unit 15 may calculate the weight of each of the plurality of sensing elements 111a using the sensing data of each of the plurality of sensing elements 111a and a trained model 162, where the trained model 162 outputs data corresponding to the weight of each of the plurality of sensing elements 111a when data corresponding to their sensing data is input. The weights can also be calculated using such a trained model 162. In that case as well, the trained model 162 may be one generated by machine learning using training data including the user data 163. This increases the likelihood that the vital sound data can be optimized for each user U.
As described with reference to FIG. 14 and elsewhere, the information processing system 100 may include storage units that accumulate the vital sound data (the storage unit 16, the storage unit 23, and the like that store the accumulated data AD). The accumulated data AD can be used in various ways, for example by presenting it or analyzing it.
As described with reference to FIG. 12 and elsewhere, the information processing system 100 may include an article (for example, the chair 8) on which the information acquisition device 11 is provided such that the plurality of sensing elements 111a contact the back of the user U. Auscultation of the back of the user U can thus also be performed easily.
Note that the effects described in the present disclosure are merely examples and are not limited to the disclosed content. Other effects may also be obtained.
Although the embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to these embodiments as they are, and various changes are possible without departing from the gist of the present disclosure. Components of different embodiments and modifications may also be combined as appropriate.
 なお、本技術は以下のような構成も取ることができる。
(1)
 それぞれが対向する位置のバイタル音をセンシングするように2次元状に配置された複数のセンシング素子を備える、
 情報取得装置。
(2)
 前記複数のセンシング素子が設けられた主面を有する先端部と、
 前記先端部を前記主面とは反対側から支持する本体部と、
 を備え、
 前記先端部及び前記本体部は、ユーザが手で持って移動させることができる大きさを有する、
 (1)に記載の情報取得装置。
(3)
 それぞれが対向する位置のバイタル音をセンシングするように2次元状に配置された複数のセンシング素子を含む情報取得装置と、
 前記複数のセンシング素子それぞれのセンシングデータに基づいて、前記複数のセンシング素子をユーザの身体に当てる聴診位置のナビゲーション情報を生成する処理部と、
 を備える、
 情報処理システム。
(4)
 前記ナビゲーション情報は、前記情報取得装置をセンシング目的のバイタル音の聴診に適した目的聴診位置に方向付けるためのナビゲーション方向を含み、
 前記処理部は、前記ナビゲーション方向を算出する、
 (3)に記載の情報処理システム。
(5)
 前記処理部は、前記複数のセンシング素子それぞれにおける前記センシング目的のバイタル音に関する指標に基づいて、前記ナビゲーション方向を算出し、
 前記指標は、
  センシングデータに含まれるセンシング目的のバイタル音の大きさ、
  センシングデータに含まれる全バイタル音中のセンシング目的のバイタル音の割合、
 及び、
  センシングデータに含まれるセンシング目的のバイタル音の周波数成分、
 の少なくとも1つを含む、
 (4)に記載の情報処理システム。
(6)
 前記処理部は、前記複数のセンシング素子それぞれのセンシングデータと、学習済みモデルとを用いて、前記ナビゲーション方向を算出し、
 前記学習済みモデルは、前記複数のセンシング素子それぞれのセンシングデータに対応するデータが入力されると、前記ナビゲーション方向に対応するデータを出力する、
 (4)に記載の情報処理システム。
(7)
 前記学習済みモデルは、前記ユーザが前記情報処理システムを利用することによって得られたデータであるユーザデータを含む訓練データを用いた機械学習によって生成された学習済みモデルである、
 (6)に記載の情報処理システム。
(8)
 前記情報取得装置は、モーションセンサを含み、
 前記処理部は、前記モーションセンサのセンシングデータに基づいて、前記複数のセンシング素子の位置を含む前記ナビゲーション情報を生成する、
 (3)~(7)のいずれかに記載の情報処理システム。
(9)
 前記ナビゲーション情報を前記ユーザに提示するユーザインタフェース部を備える、
 (3)~(8)のいずれかに記載の情報処理システム。
(10)
 前記複数のセンシング素子が設けられた主面を有する先端部と、
 前記先端部を前記主面とは反対側から支持する本体部と、
 前記先端部及び本体部のうち、前記先端部の前記主面以外の部分に設けられた発光部と、
 を備え、
 前記発光部は、前記ナビゲーション情報をユーザに提示する、
 (9)に記載の情報処理システム。
(11)
 前記処理部は、前記複数のセンシング素子それぞれのセンシングデータに基づいて、バイタル音データを生成する、
 (3)~(10)のいずれかに記載の情報処理システム。
(12)
 前記処理部は、センシング目的のバイタル音の聴診に適した前記バイタル音データを生成するように、前記複数のセンシング素子それぞれのセンシングデータを重み付け加算する、
 (11)に記載の情報処理システム。
(13)
 前記情報取得装置は、環境音センサを含み、
 前記処理部は、前記環境音センサのセンシングデータに基づいて、前記複数のセンシング素子それぞれのセンシングデータを重み付け加算する、
 (12)に記載の情報処理システム。
(14)
 前記情報取得装置は、環境音センサを含み、
 前記処理部は、前記環境音センサのセンシングデータに基づいて、雑音が低減された前記バイタル音データを生成する、
 (12)又は(13)に記載の情報処理システム。
(15)
 前記処理部は、前記複数のセンシング素子それぞれのセンシングデータと、学習済みモデルとを用いて、前記複数のセンシング素子それぞれの重みを算出し、
 前記学習済みモデルは、前記複数のセンシング素子それぞれのセンシングデータに対応するデータが入力されると、前記複数のセンシング素子それぞれの重みに対応するデータを出力する、
 (12)~(14)のいずれかに記載の情報処理システム。
(16)
 前記学習済みモデルは、前記ユーザが前記情報処理システムを利用することによって得られたデータであるユーザデータを含む訓練データを用いた機械学習によって生成された学習済みモデルである、
 (15)に記載の情報処理システム。
(17)
 前記バイタル音データを蓄積する記憶部を備える、
 (11)~(16)のいずれかに記載の情報処理システム。
(18)
 前記複数のセンシング素子が前記ユーザの背中に当たるように前記情報取得装置が設けられた物品を備える、
 (3)~(17)のいずれかに記載の情報処理システム。
Note that the present technology can also take the following configuration.
(1)
Equipped with a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions,
Information acquisition device.
(2)
a tip portion having a main surface on which the plurality of sensing elements are provided;
a body portion that supports the tip portion from a side opposite to the main surface;
with
The tip portion and the body portion have a size that allows a user to hold and move them by hand,
The information acquisition device according to (1).
(3)
an information acquisition device including a plurality of sensing elements arranged two-dimensionally so as to sense vital sounds at opposing positions;
a processing unit that generates navigation information of an auscultatory position where the plurality of sensing elements are applied to the user's body based on the sensing data of each of the plurality of sensing elements;
comprising
Information processing system.
(4)
the navigation information includes navigation directions for directing the information acquisition device to a target auscultation position suitable for auscultation of vital sounds for sensing purposes;
the processing unit calculates the navigation direction;
The information processing system according to (3).
(5)
The processing unit calculates the navigation direction based on an index related to the vital sound for the purpose of sensing in each of the plurality of sensing elements,
The indicator is
the volume of the vital sound for sensing purposes included in the sensing data,
The ratio of vital sounds for sensing purposes to all vital sounds contained in the sensing data,
as well as,
frequency components of vital sounds for sensing purposes included in the sensing data;
including at least one of
The information processing system according to (4).
(6)
The processing unit calculates the navigation direction using sensing data of each of the plurality of sensing elements and a trained model,
The trained model outputs data corresponding to the navigation direction when data corresponding to sensing data of each of the plurality of sensing elements is input.
The information processing system according to (4).
(7)
The trained model is a trained model generated by machine learning using training data including user data obtained by the user using the information processing system.
The information processing system according to (6).
(8)
The information acquisition device includes a motion sensor,
The processing unit generates the navigation information including the positions of the plurality of sensing elements based on the sensing data of the motion sensor.
The information processing system according to any one of (3) to (7).
(9)
a user interface unit that presents the navigation information to the user;
The information processing system according to any one of (3) to (8).
(10)
a tip portion having a main surface on which the plurality of sensing elements are provided;
a body portion that supports the tip portion from a side opposite to the main surface;
a light-emitting portion provided at a portion of the tip portion other than the main surface of the tip portion and the body portion;
with
the light emitting unit presents the navigation information to the user;
The information processing system according to (9).
(11)
The processing unit generates vital sound data based on the sensing data of each of the plurality of sensing elements.
The information processing system according to any one of (3) to (10).
(12)
The processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements so as to generate the vital sound data suitable for auscultation of vital sounds for sensing purposes.
The information processing system according to (11).
(13)
The information acquisition device includes an environmental sound sensor,
The processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements based on the sensing data of the environmental sound sensor.
The information processing system according to (12).
(14)
The information acquisition device includes an environmental sound sensor,
The processing unit generates the vital sound data with reduced noise based on the sensing data of the environmental sound sensor.
The information processing system according to (12) or (13).
(15)
The processing unit uses the sensing data of each of the plurality of sensing elements and a trained model to calculate the weight of each of the plurality of sensing elements,
When data corresponding to sensing data of each of the plurality of sensing elements is input to the trained model, the learned model outputs data corresponding to the weight of each of the plurality of sensing elements.
The information processing system according to any one of (12) to (14).
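As a glue-code illustration for (15), per-element features could be fed to a trained model whose raw scores are normalized into mixing weights. A scikit-learn-style predict() interface is assumed here and is not specified by the publication.

```python
import numpy as np

def weights_from_model(features: np.ndarray, model) -> np.ndarray:
    """Hypothetical sketch of (15): run per-element features through a trained
    model and turn its raw scores into normalized mixing weights.

    features: (n_elements, n_features) array derived from each element's data.
    model: assumed to expose predict() returning one score per element.
    """
    scores = np.asarray(model.predict(features), dtype=float)  # (n_elements,)
    scores -= scores.max()                 # stabilize the softmax numerically
    w = np.exp(scores)
    return w / w.sum()                     # weights usable by mix_elements()
```

The resulting weights can be passed directly to a weighted-addition step such as the mix_elements() sketch above.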
(16)
The trained model is generated by machine learning using training data that includes user data obtained through the user's use of the information processing system.
The information processing system according to (15).
(17)
a storage unit that accumulates the vital sound data,
The information processing system according to any one of (11) to (16).
(18)
An article provided with the information acquisition device such that the plurality of sensing elements contact the user's back,
The information processing system according to any one of (3) to (17).
100 Information processing system
11 Information acquisition device
111 Sensor array
111a Sensing element
112 Tip portion
112a Main surface
113 Body portion
114 Environmental sound sensor
115 Motion sensor
116 Light-emitting unit
116a Light-emitting element
12 User terminal
13 Communication unit
14 User interface unit
15 Processing unit
16 Storage unit
161 Application program
162 Trained model
163 User data
2 Information processing device
21 Communication unit
22 Estimation unit
23 Storage unit
231 Patient information
232 Community information
233 Algorithm DB
234 Trained model
235 Recommendation information
236 Anonymously processed information
24 Recommendation unit
25 Processing unit
3 Medical staff terminal
31 Communication unit
32 User interface unit
33 Storage unit
331 Medical information
4 Community member terminal
41 Communication unit
42 User interface unit
5 Life insurance/health insurance terminal
51 Communication unit
52 Analysis unit
53 Storage unit
531 Customer/employee information
6 Product/service provider terminal
61 Communication unit
62 User interface unit
7 Analysis terminal
71 Communication unit
72 Analysis unit
73 User interface unit
8 Chair
81 Support portion
G Arrow (navigation direction)
AD Accumulated data
C Medical staff
M Member
N Network
P1 Weight calculation process
P2 Weighting process
P3 Addition process
P4 Noise reduction process
P5 Navigation information generation process
U User

Claims (18)

1. An information acquisition device comprising:
a plurality of sensing elements arranged two-dimensionally so that each senses vital sounds at a position facing that element.

2. The information acquisition device according to claim 1, comprising:
a tip portion having a main surface on which the plurality of sensing elements are provided; and
a body portion that supports the tip portion from the side opposite to the main surface,
wherein the tip portion and the body portion have a size that allows a user to hold and move them by hand.

3. An information processing system comprising:
an information acquisition device including a plurality of sensing elements arranged two-dimensionally so that each senses vital sounds at a position facing that element; and
a processing unit that generates, based on the sensing data of each of the plurality of sensing elements, navigation information of an auscultation position at which the plurality of sensing elements are applied to the user's body.

4. The information processing system according to claim 3, wherein
the navigation information includes a navigation direction for directing the information acquisition device to a target auscultation position suited to auscultation of a vital sound for sensing purposes, and
the processing unit calculates the navigation direction.

5. The information processing system according to claim 4, wherein
the processing unit calculates the navigation direction based on an index relating to the vital sound for sensing purposes at each of the plurality of sensing elements, and
the index includes at least one of:
the volume of the vital sound for sensing purposes included in the sensing data,
the ratio of the vital sound for sensing purposes to all vital sounds included in the sensing data, and
the frequency components of the vital sound for sensing purposes included in the sensing data.

6. The information processing system according to claim 4, wherein
the processing unit calculates the navigation direction using the sensing data of each of the plurality of sensing elements and a trained model, and
the trained model outputs data corresponding to the navigation direction when data corresponding to the sensing data of each of the plurality of sensing elements is input.

7. The information processing system according to claim 6, wherein the trained model is generated by machine learning using training data that includes user data obtained through the user's use of the information processing system.

8. The information processing system according to claim 3, wherein
the information acquisition device includes a motion sensor, and
the processing unit generates the navigation information, including the positions of the plurality of sensing elements, based on the sensing data of the motion sensor.

9. The information processing system according to claim 3, comprising a user interface unit that presents the navigation information to the user.

10. The information processing system according to claim 9, comprising:
a tip portion having a main surface on which the plurality of sensing elements are provided;
a body portion that supports the tip portion from the side opposite to the main surface; and
a light-emitting unit provided on a portion of the tip portion and the body portion other than the main surface of the tip portion,
wherein the light-emitting unit presents the navigation information to the user.

11. The information processing system according to claim 3, wherein the processing unit generates vital sound data based on the sensing data of each of the plurality of sensing elements.

12. The information processing system according to claim 11, wherein the processing unit performs weighted addition of the sensing data of each of the plurality of sensing elements so as to generate vital sound data suited to auscultation of a vital sound for sensing purposes.

13. The information processing system according to claim 12, wherein
the information acquisition device includes an environmental sound sensor, and
the processing unit performs the weighted addition of the sensing data of each of the plurality of sensing elements based on the sensing data of the environmental sound sensor.

14. The information processing system according to claim 12, wherein
the information acquisition device includes an environmental sound sensor, and
the processing unit generates the vital sound data with reduced noise based on the sensing data of the environmental sound sensor.

15. The information processing system according to claim 12, wherein
the processing unit calculates the weight of each of the plurality of sensing elements using the sensing data of each of the plurality of sensing elements and a trained model, and
the trained model outputs data corresponding to the weight of each of the plurality of sensing elements when data corresponding to the sensing data of each of the plurality of sensing elements is input.

16. The information processing system according to claim 15, wherein the trained model is generated by machine learning using training data that includes user data obtained through the user's use of the information processing system.

17. The information processing system according to claim 11, comprising a storage unit that accumulates the vital sound data.

18. The information processing system according to claim 3, comprising an article provided with the information acquisition device such that the plurality of sensing elements contact the user's back.
PCT/JP2022/048616 2022-01-12 2022-12-28 Information acquisition device and information processing system WO2023136175A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-003065 2022-01-12
JP2022003065 2022-01-12

Publications (1)

Publication Number Publication Date
WO2023136175A1 true WO2023136175A1 (en) 2023-07-20

Family

ID=87279145

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/048616 WO2023136175A1 (en) 2022-01-12 2022-12-28 Information acquisition device and information processing system

Country Status (1)

Country Link
WO (1) WO2023136175A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002165292A (en) * 2000-11-27 2002-06-07 Nippon Telegr & Teleph Corp <Ntt> Multi-channel acoustic signal collection device
JP2009273817A (en) * 2008-05-19 2009-11-26 Shigehiro Kuroki Vital reaction recorder and vital reaction recording method
WO2011114669A1 (en) * 2010-03-18 2011-09-22 パナソニック株式会社 Biometric sound testing device
US20180160907A1 (en) * 2018-01-26 2018-06-14 Shiv Prakash Verma Digital healthcare practice system for digital citizens
CN112515698A (en) * 2020-11-24 2021-03-19 英华达(上海)科技有限公司 Auscultation system and control method thereof
JP2022006758A (en) * 2020-06-25 2022-01-13 オンキヨー株式会社 Stethoscopic system, stethoscope, and method
JP2023016317A (en) * 2021-07-21 2023-02-02 オンキヨー株式会社 Stethoscope and auscultation system


Similar Documents

Publication Publication Date Title
US10307104B2 (en) Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US9805339B2 (en) Method for monitoring and improving health and productivity of employees using a computer mouse system
US10206625B2 (en) Chair pad system and associated, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US9949640B2 (en) System for monitoring employee health
US9808156B2 (en) Systems, computer medium and computer-implemented methods for monitoring and improving biomechanical health of employees
US9693734B2 (en) Systems for monitoring and improving biometric health of employees
US9526455B2 (en) Systems, computer medium and computer-implemented methods for monitoring and improving health and productivity of employees
US20130012802A1 (en) Systems, Computer Medium and Computer-Implemented Methods For Monitoring and Improving Cognitive and Emotive Health of Employees
JP2013524332A (en) Method and apparatus for optimizing questionnaires
JP2006230679A (en) Health monitoring apparatus
TW200526174A (en) Analysis of auscultatory sounds using single value decomposition
CN100391405C (en) Body balance function detecting method and training system
JP2014524800A (en) System, computer medium and computer-implemented method for monitoring employee health using a mobile device
JP2008176816A (en) Method and apparatus of generating avatar image for representing health condition
KR20190058858A (en) Method for providing diagnostic information on cardiovascular diseases using a smart device and heart sound application for the same
CN105943080A (en) Intelligent stethophone
JP2004157941A (en) Home care system, its server, and toy device for use with home care system
JP2019003570A (en) Health care device, health care method, and health care program
WO2023136175A1 (en) Information acquisition device and information processing system
CN112740332A (en) Evaluation support system and evaluation support method for supporting evaluation of state of circulation system
WO2023058391A1 (en) Information processing method, information processing system, and information processing device
US20220151582A1 (en) System and method for assessing pulmonary health
WO2023223380A1 (en) Information processing system, information processing method, information processing device, measurement device, and computer program
de Silva et al. Telemedicine-Remote Sensory Interaction with Patients for Medical Evaluation and Diagnosis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22920690

Country of ref document: EP

Kind code of ref document: A1