WO2011055628A1 - Organization behavior analyzer and organization behavior analysis system - Google Patents

Organization behavior analyzer and organization behavior analysis system

Info

Publication number
WO2011055628A1
WO2011055628A1, PCT/JP2010/068289, JP2010068289W
Authority
WO
WIPO (PCT)
Prior art keywords
face
organization
index
persons
data
Prior art date
Application number
PCT/JP2010/068289
Other languages
French (fr)
Japanese (ja)
Inventor
Nobuo Sato
Satomi Tsuji
Kazuo Yano
Koji Ara
Tomoaki Akitomi
Original Assignee
Hitachi, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd.
Priority to JP2011539329A (patent JP5400895B2)
Publication of WO2011055628A1 publication Critical patent/WO2011055628A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management

Definitions

  • the present invention relates to technology for visualizing the state of an organization from behavior data and communication data of members in the organization.
  • a sensor network is a system in which terminals equipped with sensors and wireless communication circuits are attached to environments, objects, persons, and the like, and the various information obtained from the sensors is extracted wirelessly in order to acquire and control their state.
  • the physical quantities acquired by the sensors to detect this communication include infrared rays for detecting a face-to-face state, voice for detecting speech and the surrounding environment, and acceleration for detecting human action.
  • Patent Document 1 discloses such a sensor network system.
  • Patent Document 2 discloses performing life support by displaying, on one screen, a feature value obtained from an acceleration sensor in time series together with a stress questionnaire, so that the relationship between stress and the person's behavior can be viewed.
  • Patent Document 2 thus discloses a method of displaying stress level and behavior at the same time, but does not describe identifying the behavior that is a factor of stress, nor a countermeasure (stress-elimination) method.
  • an object of the present invention is to generate an analysis model composed of factors useful for problem solution of an organization, from behavior data and communication data in the organization.
  • an organization dynamics index is obtained from sensor data and used as an explanatory variable.
  • objective variables are obtained from objective data, such as organizational productivity and accidents/defects, and from questionnaire responses, such as leadership/teamwork indices, employees' reward/fulfillment indices, and stress/mental-upset indices.
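The setup above pairs organization dynamics indices (explanatory variables) with productivity- and questionnaire-based objective variables. A minimal sketch of correlation-based factor selection follows; it is an illustration only, not the patented algorithm, and the index names, the choice of Pearson correlation, and the threshold are all assumptions.

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def select_factors(indices, objective, threshold=0.5):
    """Keep the dynamics indices whose |correlation| with the
    objective variable exceeds the threshold (factor selection)."""
    return {name: pearson_r(vals, objective)
            for name, vals in indices.items()
            if abs(pearson_r(vals, objective)) >= threshold}

# Toy data: per-member face-to-face time and activity level vs. productivity.
indices = {
    "face_time": [1.0, 2.0, 3.0, 4.0],
    "activity":  [4.0, 1.0, 3.0, 2.0],
}
productivity = [10.0, 20.0, 30.0, 40.0]
print(select_factors(indices, productivity))
# {'face_time': 1.0}
```

In this toy data, only `face_time` clears the threshold; `activity` (r = -0.4) is dropped, mirroring the factor selection (CAF) step described later.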
  • an organization behavior analysis device which analyzes an organization constituted by a plurality of persons.
  • a receiver for receiving sensor data acquired by an infrared transmitting / receiving unit and an acceleration sensor of a terminal attached to each of a plurality of persons, and data indicating a subjective evaluation or an objective evaluation of each of the plurality of persons;
  • a control unit that analyzes data indicating a subjective evaluation or an objective evaluation, and a recording unit that records analysis conditions for the control unit to analyze and a result of analysis by the control unit.
  • the control unit calculates, from the sensor data, for each of the plurality of persons, an index indicating the relationships between persons in the organization and the actions in the organization, and records the index in the recording unit.
  • the organization behavior analysis system includes an organization behavior analysis device including a control unit that performs analysis and a recording unit that records the analysis conditions for the analysis by the control unit and the analysis results of the control unit.
  • the control unit calculates, from the sensor data, for each of the plurality of persons, an index indicating the relationships between persons in the organization and the actions in the organization, and records the index in the recording unit.
  • an organization behavior analysis device which analyzes an organization constituted by a plurality of persons.
  • a receiver for receiving data indicating a subjective evaluation of each of a plurality of persons, a control unit for analyzing the data indicating the subjective evaluation and data indicating seat positions in the organization, and
  • a recording unit that records the analysis conditions for the analysis by the control unit and the analysis results of the control unit.
  • the control unit includes an index calculation unit that calculates, for each of the plurality of persons, an index related to stress from the data indicating the subjective evaluation based on the analysis conditions, and
  • a seat arrangement determination unit that determines, from the data indicating seat positions and the stress-related indices, the arrangement in the organization of the seats of each of the plurality of persons.
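The seat arrangement determination unit above maps per-person stress indices to a seat layout. The patent does not spell out the placement rule in this excerpt, so the following is a hedged illustration only: a simple interleaving that avoids clustering highly stressed members together.

```python
def determine_seats(stress_by_person):
    """Return a seat ordering that interleaves the most and least
    stressed members. `stress_by_person` maps person -> stress index.
    (Illustrative policy, not the patented algorithm.)"""
    ranked = sorted(stress_by_person, key=stress_by_person.get)
    seats = []
    lo, hi = 0, len(ranked) - 1
    while lo <= hi:
        seats.append(ranked[lo])       # least stressed of the remainder
        if lo != hi:
            seats.append(ranked[hi])   # most stressed of the remainder
        lo, hi = lo + 1, hi - 1
    return seats

print(determine_seats({"A": 0.9, "B": 0.2, "C": 0.5, "D": 0.7}))
# ['B', 'A', 'C', 'D']
```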
  • FIGS. 1A to 1E: examples of the system configuration of Embodiment 1
  • Figure showing the overall processing flow of Embodiment 1
  • Example of the face-to-face table of Embodiment 1
  • Example of the body rhythm table of Embodiment 1
  • Example of the face-to-face matrix of Embodiment 1
  • Example of a network diagram (part 1)
  • Examples of the face-to-face index and the organization activity index of Embodiment 1
  • Example of the personality index of Embodiment 1
  • Examples of the productivity index and the accident/failure index of Embodiment 1
  • Example of the factor coefficients of Embodiment 1
  • Figure showing the overall processing flow of Embodiment 2
  • Example of the personality coefficients of Embodiment 2; example of the user-specific personality factors of Embodiment 2
  • Figure showing the process flow of job-position hierarchical network diagram coordinate identification of Embodiment 12; example of the job-position hierarchical network diagram coordinate list of Embodiment 12; example of the job-position hierarchy structure of Embodiment 12
  • in order to clarify the position and function of the analysis system of the present invention, a business microscope system is first described.
  • a business microscope is a system that senses the behavior and state of a person with a sensor node worn by that person, and helps improve the organization by visualizing, as organization activity, the relationships between persons and a picture of the current organization.
  • the data on face-to-face contact, actions, voice, and the like acquired by the sensor nodes are generically referred to as organization dynamics data.
  • FIGS. 1A, 1B, 1C, 1D and 1E are explanatory diagrams showing the components of one embodiment of a business microscope system. They are shown separately for convenience of illustration, but the respective processes are executed in cooperation with each other.
  • FIGS. 1A to 1E show a series of flows from the name tag type sensor node (TR), through the base station (GW) and the sensor net server (SS) that stores the organization dynamics data, to the application server (AS) that analyzes the organization dynamics data and the client (CL) that outputs the analysis result to a viewer.
  • SS sensor net server
  • TR name tag type sensor node
  • GW base station
  • AS application server
  • This system comprises a name tag type sensor node (TR), a base station (GW), an organization sensor network server (SS), an application server (AS), and a client (CL).
  • the sensor network server and the application server are described as separate devices, but it is also possible to realize the functions of these servers with one server.
  • the application server (AS) shown in FIG. 1A analyzes and processes organization dynamics data.
  • the analysis application is started.
  • the analysis application requests the sensor network server (SS) shown in FIG. 1C to acquire the necessary organization dynamics data.
  • the analysis application analyzes the acquired organization dynamics data, and returns the analysis result to the client (CL) shown in FIG. 1B.
  • the analysis application may record the analysis result as it is in the analysis result database (F).
  • the application used for analysis is stored in the analysis algorithm (D) and executed by the control unit (ASCO).
  • the processing executed by the present embodiment is modeling analysis (CA), personality index extraction analysis (CA1), and personality index conversion analysis (CA2).
  • the application server includes a transmission / reception unit (ASSR), a storage unit (ASME), and a control unit (ASCO).
  • the transmission/reception unit (ASSR) transmits and receives organization dynamics data between the sensor network server (SS) shown in FIG. 1C and the client (CL) shown in FIG. 1B. Specifically, the transmission/reception unit (ASSR) receives a command sent from the client (CL), and transmits an organization dynamics data acquisition request to the sensor network server (SS). Further, the transmission/reception unit (ASSR) receives organization dynamics data from the sensor network server (SS), and transmits the analysis result to the client (CL).
  • the storage unit (ASME) is configured of an external recording device such as a hard disk, a memory, or an SD card.
  • the storage unit (ASME) stores setting conditions for analysis and analysis results.
  • the storage unit (ASME) stores a user/place information table (I), an organization information table (H), a questionnaire (G), an analysis result table (F), an analysis condition period table (E), and an analysis algorithm (D).
  • the user / place information table (I) is a table in which personal information such as the user's name, job title, and user ID, and information on places are described.
  • the organization information table (H) is a table storing data necessary for organization modeling, such as productivity (HA) and accidents/defects (HB), and, as general information, data necessary for organization activities, such as climate and stock prices.
  • the questionnaire (G) is a table storing the questionnaires that the users are asked to answer and their responses.
  • the analysis result table (F) is a table storing the results of analyzing the organization dynamics data (organization dynamics indices) and the results of analyzing the questionnaire responses.
  • the analysis condition period table (E) is a table for temporarily storing analysis conditions for display requested from the client (CL).
  • the analysis algorithm (D) stores a program used for analysis. At the request of the client (CL), an appropriate program is selected, sent to the control unit (ASCO), and analysis is performed.
  • the control unit includes a central processing unit CPU (not shown) and executes control of data transmission / reception and analysis of sensing data. Specifically, the CPU (not shown) realizes various functions by reading and executing various programs stored in the storage unit (ASME). Specifically, communication control (ASCC), modeling analysis (CA), personality index extraction analysis (CA1), and personality index conversion analysis (CA2) are executed.
  • communication control (ASCC) controls the timing of wired or wireless communication with the sensor network server (SS) and the client (CL). Furthermore, communication control (ASCC) converts data formats and distributes destinations according to the data type.
  • modeling analysis (CA) is a process of modeling, from the organization dynamics data and questionnaire results, the main factors of the problems the organization has.
  • modeling analysis (CA) includes face-to-face table creation (C1A), body rhythm table creation (C1B), face-to-face matrix creation (C1C), network index extraction (CAA), body rhythm index extraction (CAB), face-to-face index extraction (CAC), organization activity index extraction (CAD), correlation analysis (CAE), and factor selection (CAF).
  • face-to-face table creation (C1A) is a process in which the organization dynamics data are rearranged in chronological order for each user to create a table on face-to-face contact.
  • body rhythm table creation (C1B) is a process in which the organization dynamics data are rearranged in chronological order for each user to create a table related to body rhythm.
  • Face-to-face matrix creation is a process of creating a table in which the faces of each user are summarized in a matrix from the result of the face-to-face table creation (C1A).
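Face-to-face matrix creation (C1C) can be sketched as summarizing pairwise meeting detections into a symmetric user-by-user matrix. This is a minimal illustration; the record fields and counting unit (detections rather than minutes) are assumptions, not the patent's format.

```python
def build_face_matrix(meetings, users):
    """meetings: list of (user_a, user_b) detections from the
    face-to-face table; returns matrix[i][j] = detection count."""
    idx = {u: i for i, u in enumerate(users)}
    n = len(users)
    matrix = [[0] * n for _ in range(n)]
    for a, b in meetings:
        i, j = idx[a], idx[b]
        matrix[i][j] += 1
        matrix[j][i] += 1   # facing is mutual, so keep the matrix symmetric
    return matrix

users = ["U1", "U2", "U3"]
meetings = [("U1", "U2"), ("U1", "U2"), ("U2", "U3")]
print(build_face_matrix(meetings, users))
# [[0, 2, 0], [2, 0, 1], [0, 1, 0]]
```

Network index extraction (CAA) could then treat this matrix as a weighted adjacency matrix of the organization's network diagram.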
  • CAA network index extraction
  • body rhythm index extraction (CAB) analyzes, from the body rhythm table, the indices related to body rhythm among the organization dynamics indices.
  • face-to-face index extraction (CAC) analyzes, from the face-to-face table and the body rhythm table, the face-to-face indices among the organization dynamics indices.
  • CAD Activity index extraction
  • CAE Correlation analysis
  • CAF Factor selection
  • the personality index extraction analysis (CA1) obtains the contribution coefficient of each organization dynamics index for each questionnaire item. This processing is performed by personality index coefficient extraction (CA1A).
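The coefficient extraction above can be sketched as a least-squares fit of each questionnaire item against each organization dynamics index. The univariate slope used here is a simplification assumed for illustration; the patent does not specify the fitting model in this excerpt.

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def contribution_coefficients(indices, questionnaire):
    """indices: index name -> per-member values; questionnaire:
    item name -> per-member scores. Returns item -> index -> slope."""
    return {item: {name: slope(vals, scores)
                   for name, vals in indices.items()}
            for item, scores in questionnaire.items()}

indices = {"face_time": [1.0, 2.0, 3.0]}
questionnaire = {"reward": [2.0, 4.0, 6.0]}
print(contribution_coefficients(indices, questionnaire))
# {'reward': {'face_time': 2.0}}
```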
  • the analysis result is transmitted from the analysis result table (F) or the transmission / reception unit (ASSR) to the display (J) of the client (CL) shown in FIG. 1B.
  • the client (CL) shown in FIG. 1B is a contact point with the user, and performs data input / output.
  • the client (CL) includes an input / output unit (CLIO), a transmission / reception unit (CLSR), a storage unit (CLME), and a control unit (CLCO).
  • the input / output unit (CLIO) is a part serving as an interface with the user.
  • the input / output unit (CLIO) includes a display (CLOD), a keyboard (CLIK), a mouse (CLIM) and the like.
  • Other input / output devices can also be connected to the external input / output (CLIU) as required.
  • the display is an image display device such as a CRT (CATHODE-RAY TUBE) or a liquid crystal display.
  • the display (CLOD) may include a printer or the like.
  • the transmission / reception unit (CLSR) transmits / receives data to / from the application server (AS) shown in FIG. 1A or the sensor network server (SS) shown in FIG. 1C. Specifically, the transmission / reception unit (CLSR) transmits the analysis condition (CLMP) to the application server (AS), and receives the analysis result.
  • the storage unit (CLME) is configured by an external recording device such as a hard disk, a memory or an SD card.
  • the storage unit (CLME) records information necessary for drawing such as analysis conditions (CLMP) and drawing setting information (CLMT).
  • the analysis condition (CLMP) records conditions such as the number of analysis target members set by the user and the selection of the analysis method.
  • the drawing setting information (CLMT) records information on drawing positions, that is, in which part of the drawing each item is plotted.
  • the storage unit (CLME) may store a program executed by a CPU (not shown) of the control unit (CLCO).
  • the control unit (CLCO) has a CPU (not shown) and controls communication, the input of analysis conditions from the client user (US), and the drawing that presents analysis results to the client user (US). Specifically, the CPU executes programs stored in the storage unit (CLME) to perform communication control (CLCC), analysis condition setting (CLIS), drawing setting (CLTS), and display (J).
  • communication control (CLCC) controls the timing of wired or wireless communication with the application server (AS) or the sensor network server (SS). Communication control (CLCC) also converts data formats and distributes destinations according to the type of data.
  • the analysis condition setting receives an analysis condition designated from the user via the input / output unit (CLIO), and records it in the analysis condition (CLMP) of the storage unit (CLME).
  • a period of data used for analysis, a member, a type of analysis, parameters for analysis, and the like are set.
  • the client (CL) sends these settings to the application server (AS) to request analysis, and in parallel with this, executes drawing settings (CLTS).
  • the drawing setting (CLTS) calculates, based on the analysis conditions (CLMP), the method of displaying the analysis results and the positions at which they are plotted.
  • the result of this process is recorded in drawing setting information (CLMT) of the storage unit (CLME).
  • the display (J) generates a display screen from the analysis result acquired from the application server (AS), based on the format described in the drawing setting information (CLMT). For example, the model drawing (JA) shown in FIG. 2C is stored in the drawing setting information (CLMT). At this time, if necessary, the display (J) also shows attributes such as the names of the persons displayed.
  • the created display result is presented to the user via an output device such as a display (CLOD).
  • the display (CLOD) displays a screen such as a scientific management knowledge model (KA) shown in FIG. 2C.
  • the user can finely adjust the display position by an operation such as drag and drop.
  • the sensor net server (SS) shown in FIG. 1C manages the data collected from the name tag type sensor nodes (TR) shown in FIG. 1E. Specifically, the sensor net server (SS) stores the data sent from the base station (GW) shown in FIG. 1D in a database, and sends sensing data to the application server (AS) shown in FIG. 1A and the client (CL) shown in FIG. 1B based on their requests. Furthermore, the sensor net server (SS) receives control commands from the base station (GW), and sends the results obtained from those commands back to the base station (GW).
  • the sensor net server (SS) includes a transmission / reception unit (SSSR), a storage unit (SSME), and a control unit (SSCO). If time synchronization management (GWCD) is performed on the sensor net server (SS), the sensor net server (SS) also needs a clock.
  • the transmission / reception unit performs data transmission and reception with the base station (GW), the application server (AS) and the client (CL). Specifically, the transmission / reception unit (SSSR) receives the sensing data sent from the base station (GW), and sends the sensing data to the application server (AS) or the client (CL).
  • the storage unit (SSME) is configured by a non-volatile storage device such as a hard disk or flash memory, and stores at least a data table (BA), a performance table (BB), data format information (SSMF), a terminal management table (SSTT), and terminal firmware (SSTF). Furthermore, the storage unit (SSME) may store programs executed by a CPU (not shown) of the control unit (SSCO).
  • the data table (BA) is a database for recording the organization dynamics data acquired by the name tag type sensor nodes (TR), information on the name tag type sensor nodes (TR), information on the base stations (GW) through which the organization dynamics data passed, and the like.
  • a column is created for each element of data, such as acceleration and temperature, and data is managed.
  • a table may instead be created for each element of data. In either case, all data are stored in the data table (BA) in association with the terminal information (TRMT), which is the ID of the name tag type sensor node (TR) that acquired the data, and with information on the acquisition time.
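The storage rule above (every record keyed by terminal information and acquisition time) can be sketched with an in-memory SQLite table. The column names and the per-element row layout are assumptions for illustration, not the patent's schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE data_table (
    trmt TEXT NOT NULL,         -- terminal information (node ID)
    acquired_at TEXT NOT NULL,  -- acquisition time
    element TEXT NOT NULL,      -- data element, e.g. acceleration, temperature
    value REAL)""")

rows = [("TR001", "2010-10-18T09:00:00", "acceleration", 0.12),
        ("TR001", "2010-10-18T09:00:00", "temperature", 24.5)]
conn.executemany("INSERT INTO data_table VALUES (?, ?, ?, ?)", rows)

# On read-out, the server sorts by time and terminal before serving data.
out = conn.execute("SELECT trmt, element, value FROM data_table "
                   "ORDER BY acquired_at, trmt").fetchall()
print(out)
```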
  • the data table (BA) is the same as the data table (BA) of FIG. 2B.
  • the performance table (BB) is a database for recording evaluations (performances) related to organizations and individuals, which are input from name tag type sensor nodes (TR) or from existing data, together with time data.
  • the performance table (BB) is the same as the performance table (BB) of FIG. 2B.
  • the data format information (SSMF) includes the data format for communication, the method of separating the sensing data tagged by the base station (GW) and recording it in the database, the method of responding to data requests, and the like. As described later, after data reception and before data transmission, this data format information (SSMF) is always referred to by the communication control unit (SSCC), which performs data format conversion and data management (SSDA).
  • the terminal management table (SSTT) is a table which records which name tag type sensor node (TR) is currently under management of which base station (GW). When a name tag type sensor node (TR) is newly added under the management of the base station (GW), the terminal management table (SSTT) is updated.
  • the terminal firmware (SSTF) temporarily stores the updated terminal firmware (GWTF) of the name tag type sensor nodes stored in the terminal firmware registration unit (TFI).
  • the control unit includes a central processing unit CPU (not shown), and controls transmission / reception of sensing data and recording / extraction to a database. Specifically, various functions are realized by the CPU reading and executing various programs stored in the storage unit (SSME). Specifically, processing such as communication control (SSCC), terminal management information correction (SSTM) and data management (SSDA) is executed.
  • the communication control unit (SSCC) controls the timing of wired or wireless communication with the base station (GW), the application server (AS), and the client (CL). As described above, the communication control unit (SSCC) converts the format of data to be transmitted and received into the internal format of the sensor network server (SS), or into a format specialized for each communication partner, based on the data format information (SSMF) recorded in the storage unit (SSME). Furthermore, communication control (SSCC) reads the header portion indicating the type of data and distributes the data to the corresponding processing unit. Specifically, received data is distributed to data management (SSDA), and commands for correcting terminal management information are distributed to terminal management information correction (SSTM). The destination of data to be transmitted is determined to be a base station (GW), the application server (AS), or the client (CL).
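The header-based distribution described above can be sketched as a small dispatcher: read the type from the header and route the payload to data management (SSDA) or terminal management information correction (SSTM). The handler names match the abbreviations in the text, but the packet format itself is an assumption.

```python
def dispatch(packet, handlers):
    """packet: dict with a 'type' header plus a payload.
    handlers: type -> callable processing unit."""
    kind = packet["type"]
    if kind not in handlers:
        raise ValueError(f"unknown packet type: {kind}")
    return handlers[kind](packet["payload"])

received = []
handlers = {
    "sensing": lambda p: received.append(("SSDA", p)),        # data management
    "terminal_mgmt": lambda p: received.append(("SSTM", p)),  # table correction
}
dispatch({"type": "sensing", "payload": {"trmt": "TR001"}}, handlers)
dispatch({"type": "terminal_mgmt", "payload": {"add": "TR002"}}, handlers)
print([tag for tag, _ in received])
# ['SSDA', 'SSTM']
```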
  • the terminal management information correction updates the terminal management table (SSTT) when a command to correct terminal management information is received from the base station (GW).
  • sensing data is recorded in the appropriate columns of the database by data element, based on the tag information. When sensing data is read out from the database, processing such as selecting the necessary data according to time information and terminal information and sorting it in time order is also performed.
  • the sensor net server (SS) organizes the data received via the base station (GW) and records it, by the data management (SSDA), in the performance table (BB) and the data table (BA). This corresponds to organization dynamics data collection (B).
  • Performance input is a process of inputting a value indicating performance.
  • the performance is a subjective or objective evaluation that is determined based on some criteria.
  • at a predetermined timing, a person wearing a name tag type sensor node (TR) enters a subjective evaluation (performance) value based on criteria such as the degree of business achievement, the degree of contribution to the organization, and the degree of satisfaction at that time. The predetermined timing may be, for example, once every several hours, once a day, or when an event such as a meeting ends.
  • a person wearing the name tag type sensor node (TR) can input a performance value by operating the name tag type sensor node (TR) or by operating a personal computer (PC) such as the client (CL).
  • handwritten values may also be entered later via a PC.
  • the name tag type sensor node can accept performance ratings for social relations (SOCIAL), actions (EXECUTIVE), mind (SPIRITUAL), body (PHYSICAL), and knowledge (INTELLECTUAL).
  • the input performance values are used in the analysis processing. The meanings of the questions are: social, "Were you able to build rich human relationships (cooperation, sympathy)?"; executive, "Were you able to do what you had to do?"; spiritual, "Did you feel reward and fulfillment in your work?"; physical, "Did you take care of your body (rest, nutrition, exercise)?"; and intellectual, "Did you gain new knowledge (awareness, learning)?".
  • the performance of the organization may be calculated from the performance of individuals. Objective data such as sales or costs, and already quantified data such as customer questionnaire results, may be input periodically as performance. When a numerical value such as the error occurrence rate in production control is obtained automatically, the obtained value may be input automatically as a performance value. Furthermore, economic indicators such as gross national product (GNP) may be input. These are stored in the organization information table (H).
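The passage notes that organization performance may be calculated from individual performance. A minimal sketch follows, assuming a simple per-axis mean; the actual aggregation method is not specified in the text.

```python
def organization_performance(individual_ratings):
    """individual_ratings: list of dicts mapping a rating axis
    (e.g. SOCIAL, PHYSICAL) -> value; returns the per-axis mean."""
    totals, counts = {}, {}
    for ratings in individual_ratings:
        for axis, value in ratings.items():
            totals[axis] = totals.get(axis, 0.0) + value
            counts[axis] = counts.get(axis, 0) + 1
    return {axis: totals[axis] / counts[axis] for axis in totals}

members = [{"SOCIAL": 4, "PHYSICAL": 2}, {"SOCIAL": 2, "PHYSICAL": 4}]
print(organization_performance(members))
# {'SOCIAL': 3.0, 'PHYSICAL': 3.0}
```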
  • the base station (GW) shown in FIG. 1D has a role of mediating the name tag type sensor node (TR) shown in FIG. 1E and the sensor network server (SS) shown in FIG. 1C.
  • a plurality of base stations (GW) are arranged to cover areas such as living rooms or work areas, in consideration of the radio range.
  • the base station (GW) includes a transceiver unit (GWSR), a storage unit (GWME), a clock (GWCK), and a control unit (GWCO).
  • the transmission/reception unit (GWSR) receives radio from the name tag type sensor node (TR), and performs wired or wireless transmission to the sensor net server (SS). Furthermore, the transmission/reception unit (GWSR) includes an antenna for receiving radio.
  • the storage unit (GWME) is configured of a non-volatile storage device such as a hard disk or a flash memory.
  • the storage unit (GWME) stores at least operation setting (GWMA), data format information (GWMF), a terminal management table (GWTT), and base station information (GWMG).
  • the operation setting (GWMA) includes information indicating the operation method of the base station (GW).
  • Data format information (GWMF) includes information indicating a data format for communication, and information necessary to tag sensing data.
  • the terminal management table (GWTT) contains the terminal information (TRMT) of the name tag type sensor nodes (TR) currently associated, and the local IDs distributed for managing those name tag type sensor nodes (TR).
  • Base station information (GWMG) includes information such as the address of the base station (GW) itself.
  • the storage unit (GWME) temporarily stores the updated terminal firmware (GWTF) of the name tag type sensor nodes.
  • the storage unit (GWME) may further store a program executed by a central processing unit CPU (not shown) in the control unit (GWCO).
  • the clock (GWCK) holds time information.
  • the time information is updated at regular intervals. Specifically, the time information of the clock (GWCK) is corrected by the time information acquired from the NTP (NETWORK TIME PROTOCOL) server (TS) at regular intervals.
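The periodic correction above can be sketched as a local clock kept as an offset from the system clock and snapped to an NTP-derived reference. The NTP query is stubbed out here; a real base station would use an actual NTP client, so this is an illustration of the correction step only.

```python
import time

class Clock:
    """Local clock (cf. GWCK) kept as an offset from the system clock."""
    def __init__(self):
        self.offset = 0.0

    def now(self):
        return time.time() + self.offset

    def correct(self, reference_time):
        # Replace the accumulated drift with the reference time.
        self.offset = reference_time - time.time()

clock = Clock()
clock.offset = 5.0          # simulated drift of 5 seconds
clock.correct(time.time())  # stubbed "NTP" reference: the true time
print(clock.now() - time.time() < 0.5)
```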
  • the control unit includes a CPU (not shown).
  • the CPU executes programs stored in the storage unit (GWME) to manage the acquisition timing of sensing data and sensor information, the processing of sensing data, the timing of transmission/reception to/from the name tag type sensor node (TR) or the sensor net server (SS), and the timing of time synchronization.
  • specifically, the CPU executes programs stored in the storage unit (GWME) to perform processing such as communication control (GWCC), associate (GWTA), time synchronization management (GWCD), and time synchronization (GWCS).
  • the communication control unit (GWCC) controls the timing of wireless or wired communication with the name tag type sensor node (TR) and the sensor network server (SS). The communication control unit (GWCC) also distinguishes the type of received data. Specifically, the communication control unit (GWCC) identifies from the header portion of the data whether the received data is general sensing data, data for association, or a time synchronization response, and passes the data to the appropriate function.
  • the communication control unit converts data into a format suitable for transmission and reception with reference to data format information (GWMF) recorded in the storage unit (GWME), and indicates the type of data. Perform data format conversion to add tag information.
  • the associate (GWTA) transmits a response (TRTAR) to the associate request (TRTAQ) sent from the name tag type sensor node (TR), and transmits a local ID assigned to the name tag type sensor node (TR).
  • TRTF terminal firmware
  • the time synchronization management controls the interval and timing for performing time synchronization, and issues an instruction to perform time synchronization.
  • alternatively, by executing time synchronization management (GWCD) on the sensor network server (SS) described later, the sensor network server (SS) may collectively send a time synchronization instruction to all base stations (GW) in the system.
  • Time synchronization connects to an NTP server (TS) on the network to request and acquire time information.
  • the time synchronization (GWCS) corrects the clock (GWCK) based on the acquired time information.
  • the time synchronization (GWCS) transmits a time synchronization instruction and time information (GWCSD) to the nameplate type sensor node (TR).
  • FIG. 1E shows the configuration of a name tag type sensor node (TR), which is an embodiment of a sensor node. The name tag type sensor node (TR) is equipped with various sensors: a plurality of infrared transmitting/receiving units (AB) for detecting a person's face-to-face situation, a three-axis acceleration sensor (AC) for detecting the wearer's movement, a microphone (AD) for detecting the wearer's speech and surrounding sounds, illuminance sensors (LS1F, LS1B) for detecting the front and back sides of the name tag type sensor node, and a temperature sensor (AE). The mounted sensors are examples, and other sensors may be used to detect the wearer's face-to-face situation and movement.
  • The infrared transmitting/receiving unit (AB) periodically and continuously transmits the terminal information (TRMT), the unique identification information of the name tag type sensor node (TR), toward the front direction.
  • When two persons wearing name tag type sensor nodes (TR) face each other, the nodes exchange their terminal information (TRMT) with each other by infrared. By doing this, it is possible to record who is facing whom.
  • Each infrared transmitting and receiving unit is generally constituted by a combination of an infrared light emitting diode for infrared transmission and an infrared phototransistor.
  • the infrared ID transmission unit (IRID) generates terminal information (TRMT), which is its own ID, and transfers it to the infrared light emitting diode of the infrared transmitting / receiving module.
  • All the infrared light-emitting diodes can be lit at the same time by transmitting the same data to the plurality of infrared transmitting/receiving modules; alternatively, separate data may be output to each module independently.
  • data received by the infrared phototransistor of the infrared transmitting / receiving unit is logically ORed by an OR circuit (IROR). That is, if at least one infrared light receiving unit receives an ID, it is recognized as an ID by the name tag type sensor node. Of course, it may be configured to have a plurality of ID receiving circuits independently. In this case, since the transmission / reception state can be grasped with respect to each infrared transmission / reception module, it is also possible to obtain additional information such as, for example, in which direction the other name tag type sensor node facing is located.
  • Sensor data (SENSD) detected by the sensor is stored in the storage unit (STRG) by the sensor data storage control unit (SDCNT).
  • the sensor data (SENSD) is processed into a transmission packet by the communication control unit (TRCC), and is transmitted to the base station (GW) by the transmission / reception unit (TRSR).
  • the communication timing control unit (TRTMG) takes out sensor data (SENSD) from the storage unit (STRG) and generates a timing for wireless transmission.
  • the communication timing control unit (TRTMG) has a plurality of time bases for generating a plurality of timings.
  • the name tag type sensor node (TR) of this embodiment detects that the external power supply (EPOW) is connected by the external power supply connection detection circuit (PDET), and generates an external power supply detection signal (PDETS).
  • A unique feature of this configuration is the time base switching unit (TMGSEL), which switches the transmission timing generated by the communication timing control unit (TRTMG) according to the external power supply detection signal (PDETS), and the data switching unit (TRDSEL), which switches the data to be transmitted wirelessly.
  • FIG. 1E shows, as an example, a configuration in which the time base switching unit (TMGSEL) switches the transmission timing between two time bases, time base 1 (TB1) and time base 2 (TB2), according to the external power supply detection signal (PDETS).
  • It also illustrates a configuration in which the data switching unit (TRDSEL) switches, according to the external power supply detection signal (PDETS), among the sensor data (SENSD) obtained from the sensors, the previously accumulated data to be collected (CMBD), and the firmware update data (FMUD).
  • the illuminance sensors (LS1F, LS1B) are mounted on the front and back of the nameplate type sensor node (TR), respectively.
  • the data acquired by the illuminance sensors (LS1F, LS1B) are stored in the storage unit (STRG) by the sensor data storage control unit (SDCNT) and simultaneously compared by the reverse detection (FBDET).
  • When the name tag type sensor node (TR) is worn correctly, the illuminance sensor (front) (LS1F) receives external light, while the illuminance sensor (back) (LS1B) faces the wearer's body because of the positional relationship between the node and the wearer, and therefore receives no external light; the illuminance detected by the illuminance sensor (front) (LS1F) thus takes a larger value than that detected by the illuminance sensor (back) (LS1B).
  • When the name tag type sensor node (TR) is turned over, the illuminance sensor (back) (LS1B) receives external light and the illuminance sensor (front) (LS1F) faces the wearer, so the illuminance detected by the illuminance sensor (back) (LS1B) becomes larger than that detected by the illuminance sensor (front) (LS1F). Therefore, by comparing the two illuminance values in the reverse detection (FBDET), it can be detected that the name tag node is worn turned over.
  • When turning over is detected, a warning sound is generated by the speaker (SP) to notify the wearer.
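The comparison performed by the reverse detection can be sketched as follows. This is a minimal sketch: the function name, the lux values, and the noise margin are assumptions for illustration, not from the patent.

```python
def fbdet(front_lux, back_lux, margin=5.0):
    """Reverse detection: True when the name tag appears worn turned over,
    i.e. the back sensor (LS1B) reads brighter than the front one (LS1F).
    The small margin avoids flapping when both readings are similar."""
    return back_lux > front_lux + margin

# Worn correctly: the front sensor sees room light, the back faces the body.
assert fbdet(front_lux=300.0, back_lux=10.0) is False
# Turned over: the back sensor now receives the external light.
assert fbdet(front_lux=10.0, back_lux=300.0) is True
```

A margin of this kind is a common debouncing choice; the actual hardware may compare raw sensor readings directly.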
  • the microphone (AD) acquires audio information.
  • Voice information can reveal the surrounding environment such as "noisy” or "quiet".
  • Voice information can also be used to evaluate face-to-face communication. Furthermore, a facing state that the infrared transmitting/receiving unit (AB) cannot detect because of the standing positions of the persons can be compensated for by the audio information and the acceleration information.
  • For the voice acquired by the microphone (AD), both the voice waveform and a signal obtained by integrating it with an integrating circuit (AVG) are acquired.
  • the integrated signal represents the energy of the acquired speech.
  • A three-axis acceleration sensor (AC) detects the acceleration of the node, that is, the movement of the node. From the acceleration data it is therefore possible to analyze the movement of the person wearing the name tag type sensor node (TR) and behaviors such as walking. Furthermore, by comparing the acceleration values detected by a plurality of name tag type sensor nodes (TR), the degree of communication activity, mutual rhythm, correlation between the persons wearing those nodes, and the like can be analyzed.
  • The data acquired by the three-axis acceleration sensor (AC) are stored in the storage unit (STRG) by the sensor data storage control unit (SDCNT), and at the same time the orientation of the name tag is detected by the up/down detection (UDDET). This is possible because the acceleration detected by the three-axis acceleration sensor (AC) is observed as two components: the dynamic acceleration change due to the wearer's movement and the static acceleration due to the earth's gravity.
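The two-component observation suggests a simple orientation check: average out the dynamic component and look at the sign of the remaining static (gravity) component. This sketch assumes a vertical-axis convention, a threshold, and function names that are all invented for illustration.

```python
def static_component(samples):
    """Estimate the static (gravity) acceleration by averaging out the
    dynamic component caused by the wearer's movement."""
    return sum(samples) / len(samples)

def uddet(vertical_samples, threshold=0.5):
    """Up/down detection: classify the node orientation from recent
    vertical-axis acceleration samples (in g)."""
    g = static_component(vertical_samples)
    return "hanging" if g < -threshold else "held_up"

# Hanging on the chest: gravity contributes about -1 g on the vertical axis,
# with small dynamic ripples from movement.
assert uddet([-1.0, -0.9, -1.1, -1.0]) == "hanging"
# Lifted and turned toward the wearer: the axis is inverted.
assert uddet([1.0, 1.1, 0.9, 1.0]) == "held_up"
```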
  • When the name tag type sensor node (TR) is attached to the chest, the display device (LCDD) displays personal information such as the affiliation and name of the wearer; in other words, it acts as a name tag.
  • When the wearer holds the name tag type sensor node (TR) in hand and turns the display device (LCDD) toward himself, the orientation of the name tag type sensor node (TR) is reversed.
  • the content displayed on the display device (LCDD) and the function of the button are switched by the vertical detection signal (UDDETS) generated by the vertical detection (UDDET).
  • As an example, the information displayed on the display device (LCDD) is switched, according to the value of the up/down detection signal (UDDETS), between the analysis result generated through the display control (DISP) by the infrared activity analysis (ANA) and the name tag display (DNM).
  • the name tag type sensor node (TR) further includes a sensor such as a three-axis acceleration sensor (AC).
  • the process of sensing in the name tag type sensor node (TR) corresponds to tissue dynamics data acquisition (A) in FIG. 2A.
  • The temperature sensor (AE) of the name tag type sensor node (TR) acquires the temperature of the place where the node is located, and the illuminance sensor (front) (LS1F) acquires its illuminance. This allows the surrounding environment to be recorded. For example, based on the temperature and the illuminance, it can be known that the name tag type sensor node (TR) has moved from one place to another.
  • Buttons 1 to 3 (BTN1 to BTN3), the display device (LCDD), the speaker (SP), and the like are provided as input/output devices for the wearer.
  • The storage unit (STRG) is configured by a non-volatile storage device such as a hard disk or flash memory, and records the terminal information (TRMT), which is the unique identification number of the name tag type sensor node (TR), and the operation settings (TRMA) such as the sensing interval and display content.
  • the storage unit (STRG) can temporarily record data, and is used to record sensed data.
  • the communication timing control unit holds time information (GWCSD), updates the time information (GWCSD) at fixed intervals, and records it as a clock (TRCK).
  • The time information is periodically corrected with the time information (GWCSD) transmitted from the base station (GW) in order to prevent the clock from drifting relative to other name tag type sensor nodes (TR).
  • the sensor data storage control unit controls the sensing interval and the like of each sensor according to the operation setting (TRMA) recorded in the storage unit (STRG), and manages the acquired data.
  • Time synchronization corrects a clock by acquiring time information from a base station (GW).
  • the time synchronization may be performed immediately after association, which will be described later, or may be performed according to a time synchronization command transmitted from the base station (GW).
  • When transmitting and receiving data, the radio communication control unit (TRCC) controls the transmission interval and converts data into a format suitable for transmission and reception.
  • the wireless communication control unit may have a wired communication function, not wireless, if necessary.
  • the radio communication control unit may perform congestion control so that the transmission timing does not overlap with another nameplate type sensor node (TR).
  • The associate (TRTA) transmits and receives an associate request (TRTAQ) and an associate response (TRTAR) in order to form a personal area network (PAN) with the base station (GW) shown in FIG. and thereby determines the base station (GW) to which data should be transmitted.
  • The associate (TRTA) is executed when the name tag type sensor node (TR) is powered on, and when transmission/reception with the base station (GW) is interrupted as a result of the node having moved.
  • As a result of the associate, the name tag type sensor node (TR) is associated with one base station (GW) within the close range that its radio signal can reach.
  • The transmission/reception unit (TRSR) includes an antenna and transmits and receives radio signals. If necessary, the transmission/reception unit (TRSR) can also perform transmission and reception using a connector for wired communication. The transmission/reception data (TRSRD) handled by the transmission/reception unit (TRSR) are transferred to and from the base station (GW) via the personal area network (PAN).
  • FIGS. 2A, 2B, and 2C show the overall flow of processing performed in one embodiment of the business microscope system. The figures are separated for convenience of illustration, but the illustrated processes are performed in cooperation with one another.
  • As shown in FIG. 2A, organization dynamics data are acquired (A) by a plurality of name tag type sensor nodes (TRa, TRb, ..., TRi, TRj) and analyzed by the modeling analysis (CA). The analysis result is visualized by the model drawing (JA), and the visualized result becomes the scientific management knowledge model (KA); together these form a series of flows.
  • Name tag type sensor node A (TRa) includes sensors such as the infrared transceiver (AB), the acceleration sensor (AC), the microphone (AD), and the temperature sensor (AE), and buttons (AF) consisting of the net (AFA), notice (AFB), and thanks (AFC) buttons.
  • It has a screen (AG) for displaying the face-to-face information obtained from the infrared transmitter / receiver, a user interface (AA) for inputting a rating, and a microcomputer and a wireless transmission function although illustration is omitted.
  • the acceleration sensor (AC) detects the acceleration of the name tag type sensor node A (TRa) (that is, the acceleration of the person A (not shown) wearing the name tag type sensor node A (TRa)).
  • the infrared transceiver (AB) detects the facing state of the name tag type sensor node A (TRa) (that is, the state where the name tag type sensor node A (TRa) is facing another name tag type sensor node).
  • Name tag type sensor node A (TRa) facing another name tag type sensor node means that person A, who wears name tag type sensor node A (TRa), is facing the person who wears the other name tag type sensor node.
  • the microphone (AD) detects the sound around the name tag type sensor node A (TRa)
  • the temperature sensor (AE) detects the temperature around the name tag type sensor node A (TRa).
  • The buttons (AF) are for input from the subjective viewpoint of person A (not shown) wearing name tag type sensor node A (TRa): person A presses the net (AFA) button when doing his main work, the notice (AFB) button when finding a new idea or the like, and the thanks (AFC) button when thanking a member.
  • the system according to the present embodiment includes a plurality of name tag type sensor nodes (name tag type sensor node A (TRa) to name tag type sensor node J (TRj) in FIG. 2A).
  • Each name tag type sensor node is attached to one person.
  • name tag type sensor node A (TRa) is attached to person A
  • name tag type sensor node B (TRb) is attached to person B (not shown).
  • the purpose is to analyze relationships between people and further illustrate the performance of the organization.
  • the name tag type sensor node B (TRb) to the name tag type sensor node J (TRj) also have sensors, a microcomputer, and a wireless transmission function, similarly to the name tag type sensor node A (TRa).
  • In the following description, when the explanation applies to any of name tag type sensor node A (TRa) through name tag type sensor node J (TRj) and it is not necessary to distinguish among them, the node is simply described as a name tag type sensor node.
  • Each name tag type sensor node performs sensing by sensors constantly (or repeatedly at short intervals). Then, each name tag type sensor node wirelessly transmits the acquired data (sensing data) at a predetermined interval.
  • the interval for transmitting data may be the same as the sensing interval or may be larger than the sensing interval.
  • the data transmitted at this time is assigned a sensing time and a unique identifier (ID) of the sensed nameplate type sensor node.
  • Data are transmitted wirelessly in batches in order to suppress the power consumed by transmission and thereby keep the name tag type sensor node (TR) usable for a long time while worn by a person.
  • Data transmitted wirelessly from the name tag type sensor nodes are collected by the organization dynamics data collection (B) shown in FIG. 2B and stored in a database.
  • the data table (BA) stores sensor data obtained from name tag type sensor nodes.
  • the user ID is the identifier of the user
  • the acquisition time is the time at which the name tag type sensor node (TR) acquired the data
  • the base station is the base station that received sensor data from the name tag type sensor node (TR)
  • Acceleration sensor is sensor data of acceleration sensor (AC)
  • IR sensor is sensor data of infrared transceiver (AB)
  • sound sensor is sensor data of microphone (AD)
  • temperature (BAG) is sensor data of the temperature sensor (AE)
  • net (BAJ) is the presence or absence of pressing of the net (AFA) button, and the terminal (BAI) is information for identifying the terminal.
  • the performance table (BB) stores performance values input at performance input (C) and rating input (AA).
  • the user ID (BBA) is the identifier of the user
  • acquisition time (BBB) is the time when the rating input (AA) is made by the name tag type sensor node (TR) or the time when the performance is input (C).
  • SOCIAL (BBC), INTELLECTUAL (BBD), SPIRITUAL (BBE), PHYSICAL (BBF), and EXECUTIVE (BBG) are information for rating
  • the terminal (BBH) is information for identifying the terminal.
  • In the organization dynamics data collection (B), an example is shown in which the data are stored in order of arrival; therefore they are not necessarily arranged in time order.
  • The data table (BA) and the performance table (BB) are an example; a table may be created for each kind of sensor data.
  • The organization dynamics data collected by the organization dynamics data collection (B) are processed by the modeling analysis (CA) shown in FIG. 2C to generate a model with beneficial factors, which is visualized by the model drawing (JA); the visualized result becomes the scientific management knowledge model (KA).
  • The modeling analysis (CA) is a process that clarifies which organization activity is a factor beneficial to outcomes such as stress and productivity.
  • Stress, productivity, and the like are used as objective variables, and the organization dynamics indices, which represent organization activities, are used as explanatory variables; correlation processing is performed, and useful factors are selected based on the correlation results. This makes it possible to identify which organization activity affects stress, productivity, and so on, and thus which organization activity should be improved.
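The correlation step can be sketched as follows, assuming Pearson correlation as the correlation measure and using invented index names and toy data; the patent does not specify the exact correlation method.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_factors(indices, objective, top=2):
    """Rank explanatory indices by |r| against the objective variable
    and keep the strongest `top` factors."""
    ranked = sorted(indices.items(),
                    key=lambda kv: abs(pearson(kv[1], objective)),
                    reverse=True)
    return [name for name, _ in ranked[:top]]

# Invented example: stress score per member (objective variable) versus two
# organization dynamics indices (explanatory variables).
stress = [3, 5, 2, 6, 4]
indices = {
    "facing_time":   [50, 20, 60, 10, 30],   # strongly (negatively) correlated
    "active_facing": [30, 25, 28, 27, 26],   # more weakly correlated
}
assert select_factors(indices, stress, top=1) == ["facing_time"]
```

In practice the factor selection (CAF) could use any measure of association; Pearson correlation is merely the simplest linear choice.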
  • The modeling analysis (CA) converts the organization dynamics data into a time-series table for each user by the facing table creation (C1A) and the body rhythm table creation (C1B). Then, based on these results, each user's facing situation is summarized in matrix form (facing matrix creation (C1C)).
  • From these data, a variety of organization dynamics indices covering organization activity are determined by the network index extraction process (CAA), the body rhythm index extraction process (CAB), the facing index extraction process (CAC), and the organization activity index extraction process (CAD).
  • Questionnaires (G) include personality questionnaires (GA), leadership/teamwork questionnaires (GB), employee satisfaction questionnaires (GC), stress/mental disorder questionnaires (GD), and organization activation questionnaires (GE).
  • The factors selected by the factor selection (CAF), together with their factor coefficients (FAC), are drawn by the model drawing (JA). This result is the scientific management knowledge model (KA).
  • The facing table creation (C1A) is a process that compiles, in time series over a certain fixed period, the facing situation between members from the infrared data of the organization dynamics data.
  • the extracted result is stored in the meeting table (FC1A) of the analysis result table (F).
  • One example of the facing table (FC1A) is shown in FIG. It stores one day (24 hours) in chronological order with a time resolution of one minute (FC1A3), with one user as one record.
  • the vertical axis is a user ID (FC1A1) for identifying a member individual
  • the horizontal axis is a resolution time (FC1A2) indicating time by time resolution.
  • To know a user's facing situation at a certain time, it suffices to read the cell at the intersection of the user ID (FC1A1) and the resolution time (FC1A2).
  • For example, at 2009/7/1 10:02 the user with user ID 001 was facing two people, and the facing partners were user IDs 002 and 003.
  • NULL is stored in the facing table (FC1A) when there is no infrared data in the organization dynamics data of the corresponding user at that time.
  • The facing table (FC1A) is generated for each day and for each time resolution, so even tables for the same date differ if the time resolution differs. For example, (FC1A4) and (FC1A5) both cover July 2, 2009, but are separate tables because their time resolutions differ. What is important is that the facing table (FC1A) stores the number of people faced and the user IDs of the facing partners; as long as this is satisfied, a table configuration different from that of the facing table (FC1A) shown here may be used.
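The facing table creation can be sketched as follows, assuming infrared detections arrive as (minute, user, detected partner) records; this event format, the function name, and the in-memory layout are illustrative assumptions, not the patent's actual data format.

```python
def build_facing_table(events, minutes):
    """events: iterable of (minute, user_id, detected_partner_id) infrared
    records. Returns {user_id: per-minute list}, where each entry is a set of
    partner IDs, or None (NULL) when no infrared data exist for that minute."""
    users = {u for _, u, _ in events} | {p for _, _, p in events}
    table = {u: [None] * minutes for u in users}
    for minute, user, partner in events:
        if table[user][minute] is None:
            table[user][minute] = set()
        table[user][minute].add(partner)
    return table

events = [
    (0, "001", "002"), (0, "001", "003"),   # user 001 faces two people
    (0, "002", "001"), (1, "002", "003"),
]
table = build_facing_table(events, minutes=3)
assert table["001"][0] == {"002", "003"}    # number faced = 2, partners 002/003
assert table["001"][1] is None              # NULL: no infrared data that minute
```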
  • The body rhythm table creation (C1B) is a process that compiles, in time series over a certain fixed period, the body rhythm status, expressing the movement of a member's body from the acceleration data of the organization dynamics data in Hz.
  • the extracted result is stored in the body rhythm table (FC1B) of the analysis result table (F).
  • One example of the body rhythm table (FC1B) is shown in FIG. It stores one day (24 hours) in chronological order with a time resolution of one minute (FC1B3), with one user as one record.
  • the vertical axis is a user ID (FC1B1) for identifying a member individual, and the horizontal axis is a resolution time (FC1B2) indicating time by time resolution .
  • To know a user's body rhythm at a certain time, it suffices to read the cell at the intersection of the user ID (FC1B1) and the resolution time (FC1B2).
  • For example, the body rhythm of user ID 001 at 2009/7/11 10:02 is 2.1 Hz.
  • NULL is stored in the body rhythm table (FC1B) when there is no acceleration data in the organization dynamics data of the corresponding user at that time.
  • Since the body rhythm table (FC1B) is generated for each day and for each time resolution, tables for the same date are separate if the time resolution differs. For example, (FC1B4) and (FC1B5) both cover July 2, 2009, but are separate tables because their time resolutions differ. What is important is that the user's body rhythm is stored; as long as this is satisfied, a table configuration different from that of the body rhythm table (FC1B) shown here may be used.
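Turning raw acceleration samples into a per-slot rhythm in Hz can be sketched as follows. The patent only states that movement is expressed in Hz, so the zero-crossing estimator used here is an assumption; an FFT peak would be an equally valid choice.

```python
import math

def body_rhythm_hz(samples, sampling_rate_hz):
    """Estimate the movement frequency (Hz) of one time slot by counting
    zero crossings of the mean-removed acceleration signal."""
    mean = sum(samples) / len(samples)
    centered = [s - mean for s in samples]
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
    duration_s = len(samples) / sampling_rate_hz
    # Two zero crossings correspond to one full oscillation.
    return crossings / (2.0 * duration_s)

# A 2 Hz oscillation sampled at 20 Hz for 3 seconds (phase-shifted so no
# sample lands exactly on zero).
wave = [math.sin(2 * math.pi * 2.0 * (t + 0.5) / 20.0) for t in range(60)]
assert abs(body_rhythm_hz(wave, sampling_rate_hz=20) - 2.0) < 0.5
assert body_rhythm_hz([0.5] * 10, sampling_rate_hz=20) == 0.0  # no movement
```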
  • The facing matrix creation (C1C) is a process that removes the time-series information from the facing table (FC1A), which is arranged in time series, and summarizes in a two-dimensional matrix how long each user has faced each other user.
  • the extracted result is stored in the facing matrix (FC1C) of the analysis result table (F).
  • One example of the facing matrix (FC1C) is shown in FIG. 5, which summarizes one month of facing results.
  • The unit of the values stored in the facing matrix (FC1C) is the time resolution of the facing table (FC1A): when 1 is stored, it means 1 minute if the time resolution is 1 minute, and 5 minutes if the time resolution is 5 minutes.
  • the vertical axis is a user ID (FC1C1) for identifying a member individual
  • the horizontal axis is a user ID (FC1C2) indicating a partner who has met.
  • For example, the facing time of user 002 with user 003 is 33 minutes.
  • Period: 2009/7 / 1-July 31 (FC1C3) is the period used for the facing matrix (FC1C).
  • Actual days: 21 days (FC1C5) is the number of business days in the period (FC1C3).
  • Temporal resolution: 1 minute (FC1C6) is the temporal resolution in the facing table (FC1A).
  • Meeting determination time: 3 minutes / 1 day (FC1C7) is a threshold value for determining that meeting has occurred.
  • If the infrared transmitting/receiving unit receives an infrared ray when two persons merely pass each other, it would be determined that they have met; the threshold is introduced because a detection lasting only a few occurrences is likely to be noise.
  • As long as the facing time between users is stored, a table configuration different from that of the facing matrix (FC1C) shown here may be used.
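The matrix creation with the per-day facing determination threshold can be sketched as follows; the data layout (per-day facing tables mapping each user to per-slot partner sets) and names are illustrative assumptions.

```python
from collections import defaultdict

def facing_matrix(daily_tables, resolution_min=1, threshold_min=3):
    """daily_tables: one facing table per day, each mapping a user to a list
    of per-slot partner sets (None = NULL). A pair's minutes on a given day
    count toward the matrix only if they reach the facing determination
    time, filtering out noise from passing encounters."""
    totals = defaultdict(int)               # (user, partner) -> minutes
    for day_table in daily_tables:
        day = defaultdict(int)
        for user, slots in day_table.items():
            for partners in slots:
                for p in (partners or ()):
                    day[(user, p)] += resolution_min
        for pair, minutes in day.items():
            if minutes >= threshold_min:    # facing determination time
                totals[pair] += minutes
    return dict(totals)

day1 = {"001": [{"002"}] * 4 + [None] * 2,  # 4 minutes facing user 002
        "002": [{"001"}] * 4 + [None] * 2}
day2 = {"001": [{"002"}] + [None] * 5,      # 1 minute only: treated as noise
        "002": [{"001"}] + [None] * 5}
m = facing_matrix([day1, day2])
assert m[("001", "002")] == 4               # only day1 passes the threshold
```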
  • The network index extraction process (CAA) is a process that obtains indices from a network diagram created from the facing matrix (FC1C). One example of a table storing the indices obtained by the network index extraction process (CAA) is the network index (FAAA) of FIG.
  • the network index (FAAA) is a table in which an index is stored for each user.
  • the network index is an index indicating the connection between each of a plurality of persons and other persons in the organization.
  • The table consists of the user ID (FAAA1) identifying the user and the network indices: degree (FAAA2), cohesion (FAAA3), two-step reach (FAAA4), betweenness centrality (FAAA5), and facing time (total) (FAAA6). Period: 2009/7/1 - July 31 shows the period used for the analysis. Time resolution: 1 minute (FAAA7) is the analysis time resolution.
  • the network diagram (ZA) of FIG. 7 is an example of a network diagram created from the facing matrix.
  • Nodes (ZA1) to (ZA5) represent persons, and lines (edges) (ZA6) to (ZA11) connect members who have faced each other.
  • The spring model (based on Hooke's law) treats each edge between two connected nodes as a spring exerting an attractive or restoring force, while every node not connected to a given node exerts a repulsive force according to the distance; by repeatedly moving node positions under these forces, an optimal layout is obtained.
  • the network indicator (FAAA) will be described with an example of this network diagram (ZA).
  • The degree (FAAA2) is the number of edges connected to a node.
  • For example, the degree of Takahashi (ZA1) is 2 because he is connected with Tanaka (ZA2) and Ito (ZA4).
  • The cohesion (FAAA3) is the density of connections among the nodes around oneself, an index indicating the degree to which the people around a person cooperate with one another.
  • For example, Ito (ZA4) in the network diagram (ZA) has three partners: Takahashi (ZA1), Yamamoto (ZA5), and Tanaka (ZA2).
  • The two-step reach (FAAA4) is the total number of nodes reachable within two steps.
  • For example, in the case of Watanabe (ZA3) in the network diagram (ZA), the nodes reachable within two steps cover (ZA1) to (ZA5), so excluding himself the value is 4.
  • The betweenness centrality is a value representing how much a node contributes to the connectivity of the entire network diagram.
  • Specifically, for every pair of persons in the organization, it counts the shortest routes on the network diagram on which the person lies; when there are n shortest routes between person A and person B, each route on which the person lies is counted as 1/n.
  • The facing time (total) (FAAA6) is the total facing time during the period. It is obtained from the facing matrix (FC1C): the sum of each user's row in the facing matrix (FC1C) is that user's facing time.
  • The indices are not limited to these; other indices may be created from the facing matrix (FC1C) and used for analysis.
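The degree, two-step reach, and betweenness centrality described above can be sketched in pure Python on a small example network. The edge list below is illustrative and only loosely based on the description of FIG. 7; it is not the patent's exact diagram.

```python
from collections import deque
from itertools import combinations

def neighbors(edges, node):
    """Nodes sharing an edge with `node` in an undirected edge list."""
    return {b for a, b in edges if a == node} | {a for a, b in edges if b == node}

def degree(edges, node):
    # Degree (FAAA2): number of edges connected to the node.
    return len(neighbors(edges, node))

def two_step_reach(edges, node):
    # Two-step reach (FAAA4): nodes reachable within two steps, excluding self.
    one = neighbors(edges, node)
    two = set().union(*(neighbors(edges, n) for n in one)) if one else set()
    return len((one | two) - {node})

def betweenness(edges, nodes):
    # Betweenness centrality (FAAA5): for every pair, each of the n shortest
    # routes contributes 1/n to every intermediate node on it. Breadth-first
    # path enumeration; fine for small diagrams, not for large networks.
    score = {v: 0.0 for v in nodes}
    for s, t in combinations(nodes, 2):
        paths, best = [], None
        queue = deque([[s]])
        while queue:
            path = queue.popleft()
            if best is not None and len(path) > best:
                continue            # longer than the shortest route: discard
            if path[-1] == t:
                best = len(path)
                paths.append(path)
                continue
            for n in neighbors(edges, path[-1]):
                if n not in path:
                    queue.append(path + [n])
        for p in paths:
            for v in p[1:-1]:
                score[v] += 1.0 / len(paths)
    return score

nodes = ["Takahashi", "Tanaka", "Watanabe", "Ito", "Yamamoto"]
edges = [("Takahashi", "Tanaka"), ("Takahashi", "Ito"), ("Tanaka", "Ito"),
         ("Ito", "Yamamoto"), ("Ito", "Watanabe")]
assert degree(edges, "Takahashi") == 2          # Tanaka and Ito
assert two_step_reach(edges, "Watanabe") == 4   # everyone else via Ito
assert betweenness(edges, nodes)["Ito"] == 5.0  # Ito bridges five pairs
```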
  • The body rhythm index extraction process (CAB) is a process that obtains indices from the body rhythm table (FC1B). One example of a table storing the indices obtained by the body rhythm index extraction process (CAB) is the body rhythm index (FAAB) of FIG.
  • the body rhythm index (FAAB) is a table in which the index is stored for each user.
  • The table consists of the user ID (FAAB1) identifying the user and the body rhythm indices: 0-1 Hz appearance frequency (FAAB2), 1-2 Hz appearance frequency (FAAB3), 2-3 Hz appearance frequency (FAAB4), 0-1 Hz continuity (FAAB5), 1-2 Hz continuity (FAAB6), and 2-3 Hz continuity (FAAB7).
  • Period July 1-July 31, 2009 (FAAB 8) indicate the period used for analysis.
  • Time resolution 1 minute (FAAB 9) is analysis time resolution.
  • Time interval 1 day (FAAB10) is a range designation when obtaining an average etc. in a period (FAAB8).
  • Since the body rhythm table (FC1B) stores a body rhythm in Hz for each time-resolution slot, a histogram with 1 Hz bins is created. The histogram value from 0 Hz to 1 Hz gives the 0-1 Hz appearance frequency (FAAB2), the value from 1 Hz to 2 Hz the 1-2 Hz appearance frequency (FAAB3), and the value from 2 Hz to 3 Hz the 2-3 Hz appearance frequency (FAAB4).
  • The continuity of each rhythm can also be obtained. Specifically, the body rhythm at one time slot is compared with the body rhythm at the next slot; the cases where both rhythms fall between 0 Hz and 1 Hz are counted, and dividing this count by the time interval (FAAB10) gives the 0-1 Hz continuity (FAAB5). Similarly, the continuity from 1 Hz to 2 Hz gives the 1-2 Hz continuity (FAAB6), and the continuity from 2 Hz to 3 Hz gives the 2-3 Hz continuity (FAAB7).
  • Since these are daily values computed per time interval (FAAB10), the value stored for each index in the body rhythm index (FAAB) is the average over the period (FAAB8).
  • The indices are not limited to these; other indices may be created from the body rhythm table (FC1B) and used for analysis. Furthermore, although the body rhythm index extraction process (CAB) stores the average over the period (FAAB8), a variance or the like may be used instead.
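The appearance-frequency and continuity computation can be sketched as follows for a single user. The handling of NULL slots and the normalization by number of days are assumptions made for illustration.

```python
def rhythm_indices(rhythms, days=1):
    """rhythms: one user's per-slot body rhythms in Hz (None = NULL).
    Returns (appearance counts per 1 Hz bin, continuity per bin / days)."""
    freq = {0: 0, 1: 0, 2: 0}               # 0-1, 1-2, 2-3 Hz bins
    cont = {0: 0, 1: 0, 2: 0}
    for hz in rhythms:
        if hz is not None and 0 <= hz < 3:
            freq[int(hz)] += 1
    for a, b in zip(rhythms, rhythms[1:]):  # consecutive slots, same bin
        if a is not None and b is not None and 0 <= a < 3 and 0 <= b < 3 \
                and int(a) == int(b):
            cont[int(a)] += 1
    return freq, {k: v / days for k, v in cont.items()}

freq, cont = rhythm_indices([2.1, 2.5, 0.4, None, 1.2, 1.8, 2.2], days=1)
assert freq == {0: 1, 1: 2, 2: 3}           # slot counts per 1 Hz bin
assert cont == {0: 0.0, 1: 1.0, 2: 1.0}     # 2.1->2.5 and 1.2->1.8 continue
```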
  • the face-to-face index extraction process is a process for obtaining an index from the face-to-face table (FC1A) and the body rhythm table (FC1B). Then, an example of a table storing the index obtained by the facing index extraction process (CAC) is the facing index (FAAC) of FIG.
  • the face-to-face indicator (FAAC) is a table in which the indicator is stored for each user.
  • The table consists of the user ID (FAAC1) identifying the user and the facing indices: facing time (FAAC2), non-facing time (FAAC3), active facing time (FAAC4), passive facing time (FAAC5), two-person facing time (FAAC6), 3-to-5-person facing time (FAAC7), and 6-or-more-person facing time (FAAC8).
  • Period July 1-July 31, 2009 (FAAC 9) indicates the period used for analysis.
  • Time resolution 1 minute (FAAC 10) is analysis time resolution.
  • Time interval 1 day (FAAC 11) is a range designation when obtaining an average or the like in the period (FAAC 9).
  • From the facing table (FC1A), the facing time and non-facing time while organization dynamics data were being acquired are determined. If the value stored in the facing table (FC1A) is 1 or more, the slot is counted as facing time; if the value is 0, it is counted as non-facing time. When the stored value is NULL, neither the facing time nor the non-facing time is counted.
  • The facing time (FAAC2) is the count of slots judged as facing, and the non-facing time (FAAC3) is the count of slots judged as non-facing. Since the analysis time resolution is one minute, the count itself equals time in minutes.
  • A facing slot whose body rhythm is 2 Hz or more is classified as active facing, and one below 2 Hz as passive facing. This is based on the finding that, focusing on the relationship between a user's behavior and movement rhythm during facing, the movement rhythm is 2 Hz or more in facing that is considered positive, such as a conversation involving not only words but gestures.
  • The active facing time (FAAC4) is the count of active facing slots, and the passive facing time (FAAC5) is the count of passive facing slots. Since the analysis time resolution is one minute, the count itself equals time in minutes.
  • The number of people in each facing is also determined from the facing table (FC1A). The analysis ranges are three: two persons, three to five persons, and six or more persons.
  • The two-person face-to-face time (FAAC6) is the count of minutes of two-person meetings.
  • The 3-to-5-person face-to-face time (FAAC7) is the count of minutes of meetings of three to five persons.
  • The 6-or-more-person face-to-face time (FAAC8) is the count of minutes of meetings of six or more persons. Since the analysis time resolution is one minute, the counted value itself is the time.
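The counting rules above can be sketched as follows. This is a minimal illustration, not the patented implementation: the data layout is assumed (one list entry per minute of the face-to-face table (FC1A) and body rhythm table (FC1B)), and the stored facing value is assumed to be the number of facing partners, so the meeting size is that value plus the user.

```python
# Hypothetical sketch of the face-to-face index extraction (CAC).
# `facing_counts`: one user's row of the face-to-face table (FC1A),
# one value per minute: number of facing partners (0 or more) or
# None where no data was acquired (NULL).
# `rhythms_hz`: the matching row of the body rhythm table (FC1B).

def facing_indices(facing_counts, rhythms_hz):
    idx = {"facing": 0, "non_facing": 0, "active": 0, "passive": 0,
           "two": 0, "three_to_five": 0, "six_plus": 0}
    for count, hz in zip(facing_counts, rhythms_hz):
        if count is None:          # NULL: neither time is counted
            continue
        if count == 0:
            idx["non_facing"] += 1
        else:
            idx["facing"] += 1
            # active facing: body rhythm of 2 Hz or more while facing
            if hz is not None and hz >= 2.0:
                idx["active"] += 1
            else:
                idx["passive"] += 1
            persons = count + 1    # assumed: the user plus the partners
            if persons == 2:
                idx["two"] += 1
            elif persons <= 5:
                idx["three_to_five"] += 1
            else:
                idx["six_plus"] += 1
    return idx  # counts of 1-minute slots, so each count is a time in minutes
```

Because the resolution is one minute, each returned count is directly a time, as the text notes.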
  • The indices are not limited to these; other indices may be created from the face-to-face table (FC1A) and the body rhythm table (FC1B) and used for analysis. Furthermore, the face-to-face index extraction process (CAC) stores the average over the period (FAAC9), but a variance or the like may be used instead.
  • The organization activity index extraction process obtains indices from the face-to-face table (FC1A) and the body rhythm table (FC1B).
  • FC1A face-to-face table
  • FC1B body rhythm table
  • FAAD organization activity index
  • The table is composed of a user ID (FAAD1) that identifies the user and the organization activity indices: working time average (FAAD2), attendance time average (FAAD3), leaving time average (FAAD4), working time standard deviation (FAAD5), attendance time standard deviation (FAAD6), and leaving time standard deviation (FAAD7).
  • Period: July 1-July 31, 2009 (FAAD8) indicates the period used for analysis.
  • Time interval: 1 day (FAAD10) is the range designation used when obtaining an average or the like over the period (FAAD8).
  • The start address is the address at which data begins to be stored (0 or more) after a stretch where no organization dynamics data could be obtained (NULL).
  • The end address is the address at which data stops being obtained (NULL) after a stretch where it was obtained (0 or more).
  • Time itself is not stored in the face-to-face table (FC1A) and the body rhythm table (FC1B); since the data are stored in chronological order, the time can be determined from the acquired address and the time resolution (FAAD9).
  • The working time is obtained by subtracting the start address from the end address.
  • The working time average (FAAD2) is the average of the working time over the time intervals (FAAD10) in the period (FAAD8).
  • The working time standard deviation (FAAD5) is the standard deviation of the working time over the time intervals (FAAD10) in the period (FAAD8).
  • The attendance time average (FAAD3) is the average of the start address over the time intervals (FAAD10) in the period (FAAD8).
  • The attendance time standard deviation (FAAD6) is the standard deviation of the start address over the time intervals (FAAD10) in the period (FAAD8).
  • The leaving time average (FAAD4) is the average of the end address over the time intervals (FAAD10) in the period (FAAD8).
  • The leaving time standard deviation (FAAD7) is the standard deviation of the end address over the time intervals (FAAD10) in the period (FAAD8).
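A minimal sketch of this address-based working-time calculation, under an assumed data layout (each day is a chronological list of per-minute values, with None marking NULL slots):

```python
# Sketch of the organization activity index extraction.
# The start address is the first non-NULL slot, the end address the last;
# working time = end address - start address (addresses are minute indices).
from statistics import mean, pstdev

def day_addresses(day):
    """Return (start_address, end_address) for one day, or None if empty."""
    valid = [i for i, v in enumerate(day) if v is not None]
    if not valid:
        return None
    return valid[0], valid[-1]

def working_time_stats(days):
    """Average and standard deviation of the working time over the period,
    split into 1-day intervals (population std dev used for illustration)."""
    addrs = [a for a in (day_addresses(d) for d in days) if a is not None]
    work = [end - start for start, end in addrs]
    return mean(work), pstdev(work)
```

The attendance-time and leaving-time statistics follow the same pattern, applied to the start and end addresses instead of their difference.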
  • From the face-to-face table (FC1A) and the body rhythm table (FC1B), it can be decided not to use organization dynamics data in an error state. For example, when the name-tag-type sensor node (TR) is taken off and left in the office, it may still react to facing with nearby nodes; no actual meeting takes place, but this cannot be judged from the infrared data alone. To improve accuracy, such misjudgments must be eliminated. As a countermeasure, whether a facing in the face-to-face table (FC1A) is genuine is determined by comparison with the body rhythm table (FC1B): if a rhythm that cannot occur while a human is correctly wearing the node (a body rhythm of 0 Hz for a long time) is detected, the values of the face-to-face table at that time are not used.
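This misjudgment filter can be sketched as follows. The run-length threshold is an assumption (the text only says "for a long time"), as is the per-minute list layout:

```python
# Sketch of the countermeasure above: discard facing values wherever the
# body rhythm table (FC1B) shows 0 Hz for a long consecutive run (the node
# is not being worn, e.g. left on a desk).

def filter_facing(facing_counts, rhythms_hz, min_still_run=30):
    """Replace facing values with None wherever the rhythm is 0 Hz for at
    least `min_still_run` consecutive minutes (threshold is hypothetical)."""
    out = list(facing_counts)
    run_start = None
    for i, hz in enumerate(list(rhythms_hz) + [None]):  # sentinel to flush
        if hz == 0.0:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_still_run:
                for j in range(run_start, i):
                    out[j] = None       # treat as NULL: not counted
            run_start = None
    return out
```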
  • The indices are not limited to these; other indices may be created from the face-to-face table (FC1A) and the body rhythm table (FC1B) and used for analysis. Furthermore, the body rhythm index extraction process (CAB) stores the mean and the standard deviation over the period (FAAD10), but a variance or the like may be used instead.
  • The indices obtained from the various questionnaires (GA to GE) and the objective organization indices (productivity index, accident/defect index) will now be described. These are obtained from the values entered via the performance input (C).
  • The personality questionnaire (GA) is a questionnaire that examines characteristics of thinking and behavior. The following document may be referred to as an example of the personality questionnaire (GA): V. Benet-Martinez and O. P. John, "Los Cinco Grandes across cultures and ethnic groups: Multitrait-multimethod analyses of the Big Five in Spanish and English," Journal of Personality and Social Psychology, 75, pp. 729-750, 1998.
  • the user answers this questionnaire and stores the result as a personality index.
  • the personality index (FAAE) table of FIG. 9 will be described as an example of the personality index.
  • Answer date: July 15, 2009 (FAAE7) contains the date of response.
  • Personality values are stored for each of extroversion (FAAE2), agreeableness (FAAE3), conscientiousness (FAAE4), neuroticism (FAAE5), and openness (FAAE6).
  • FIG. 17 is a table showing what these values represent.
  • These indices show how the user's thinking and behavior adapt to society, and another questionnaire may be used instead.
  • In that case, the table configuration used for the personality index (FAAE) may be changed accordingly.
  • the organization information table (H) will be described with reference to FIG.
  • the organization information table (H) stores indexes related to the organization and members.
  • The table consists of a user ID (HA1) that identifies the user and the productivity indices: performance (HA2), contribution (HA3), number of program steps (HA4), number of sales (HA5), and sales (HA6).
  • the period is from July 1, 2009 to July 15, 2009 (HA7).
  • The table consists of a user ID (HB1) that identifies the user and the accident/defect indices: days closed (HB2), number of bugs (HB3), number of incidents (HB4), number of defects (HB5), and number of complaints (HB6).
  • the period is from 1 July 2009 to 15 July 2009 (HB7).
  • The Leadership/Teamwork Questionnaire is a questionnaire that examines the work, cooperation, awareness, and behavior that members of a group perform to achieve a common goal.
  • The following documents may be referred to as an example of the Leadership/Teamwork Questionnaire (GB).
  • An example of the questionnaire is shown in FIG. 11. The user answers this questionnaire, and the result is stored in the leadership/teamwork index.
  • the Leadership / Teamwork Index (FABC) table of FIG. 11 will be described as an example of the Leadership / Teamwork Index.
  • The table is composed of a user ID (FABC1) for identifying the user and the indicators (team orientation (FABC2), team leadership (FABC5), team process (FABC8)). Answer date: July 15, 2009 (FABC13) contains the date of response. From the Leadership/Teamwork Questionnaire (GB), indices are obtained from three perspectives: team orientation (FABC2), team leadership (FABC5), and team process (FABC8).
  • The employee satisfaction/fulfillment questionnaire is a questionnaire that examines a person's level of health, happiness, and prosperity.
  • The following document may be referred to as an example of the employee satisfaction/fulfillment questionnaire (GC).
  • Hills, P. and Argyle, M., "The Oxford Happiness Questionnaire: a compact scale for the measurement of psychological well-being," Personality and Individual Differences, 33, pp. 1073-1082, 2002.
  • An example of the questionnaire is shown in FIG. 11. The user answers this questionnaire, and the result is stored in the employee satisfaction/fulfillment index (FABD).
  • As an example of the employee satisfaction/fulfillment index, the employee satisfaction/fulfillment index (FABD) table shown in FIG. 11 will be described.
  • The table is composed of a user ID (FABD1) for identifying the user and happiness (FABD2) as the index. Answer date: July 15, 2009 (FABD3) contains the date of response. The higher the happiness (FABD2), the higher the degree of health, happiness, and prosperity.
  • The stress/mental disorder questionnaire is a questionnaire that examines the degree of depressive psychological state.
  • The following document may be referred to as an example of the stress/mental disorder questionnaire (GD): Radloff, L. S. (1977), "The CES-D Scale: A self-report depression scale for research in the general population," Applied Psychological Measurement, 1, pp. 385-401.
  • An example of the questionnaire is shown in FIG. 1. The user answers this questionnaire, and the result is stored in the stress/mental disorder index (FABE).
  • FABE mental disorder index
  • As an example of the stress/mental disorder indicator, the stress/mental disorder indicator (FABE) table in FIG. 11 will be described.
  • The table is composed of a user ID (FABE1) for identifying the user and depression (FABE2) as the index. Date of answer: July 15, 2009 (FABE3) contains the date of response. The higher the depression (FABE2), the stronger the depressive psychological state.
  • The organization activation questionnaire is a questionnaire that examines the degree of subjective effect after an activation measure. An example of the questionnaire is shown in FIG. The user answers this questionnaire, and the result is stored in the organization activation index (FABF).
  • As an example of the organization activation indicator, the organization activation indicator (FABF) table of FIG. 11 will be described.
  • The table is made up of a user ID (FABF1) for identifying the user and indicators such as communication increase (FABE2) and feeling that it is easier to speak (FABE3). There are as many indicators as there are items in the organization activation questionnaire (GE). Answer date: July 15, 2009 (FABF5) contains the date of response.
  • FABF organization activation index
  • The correlation analysis (CAE) shown in FIG. 2C is an analysis unit in which the stress or productivity of each member of an organization is taken as the objective variable and the organization dynamics indices, which represent organization activity, are taken as the explanatory variables.
  • The feature of this correlation analysis is that not only the person's own variables but also the variables of the members around the person are analyzed.
  • The person's objective variable (CAE1) is a variable stored in the record of that user's ID in the objective variables (FAB) of the analysis result table (F) or the objective variables (HA) of the organization information table (H).
  • The person's explanatory variable (CAE2) is a variable stored in the record of that user's ID in the explanatory variables (FAA) of the analysis result table (F).
  • Here, "surrounding" means the surrounding members connected to the person by facing.
  • The surrounding explanatory variables (CAE3) are explanatory variables obtained from the surrounding members.
  • A method of processing the surrounding explanatory variables will be described.
  • Surrounding members are selected by surrounding member selection (CAE31).
  • Members connected to the person are selected from the facing matrix (FC1C).
  • FC1C facing matrix
  • A one-step member is a member directly connected to the person.
  • A two-step member is a one-step member or a member connected to a one-step member.
  • The feature value calculation (CAE32) is a process of obtaining the surrounding explanatory variables from the explanatory variables of the members selected by surrounding member selection (CAE31).
  • CAE 31 surrounding member selection
  • As the calculation for obtaining the surrounding explanatory variables from the explanatory variables of the selected one-step members, the average or variance of the explanatory variables of the selected surrounding members is computed. The same calculation is performed for the two-step case.
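The surrounding member selection (CAE31) and feature value calculation (CAE32) can be sketched as below. The adjacency representation of the facing matrix (FC1C) as a dict of neighbor sets, and the choice of mean for one-step and variance for two-step members, are illustrative assumptions:

```python
# Sketch of surrounding member selection (CAE31) and feature value
# calculation (CAE32). `facing`: facing matrix (FC1C) as adjacency sets.
# `x`: map from user ID to one explanatory variable value.
from statistics import mean, pvariance

def one_step(facing, me):
    """Members directly connected to the person."""
    return set(facing.get(me, ()))

def two_step(facing, me):
    """One-step members plus members connected to them (excluding the person)."""
    first = one_step(facing, me)
    second = set()
    for m in first:
        second |= one_step(facing, m)
    return (first | second) - {me}

def surrounding_features(facing, x, me):
    """Assumes at least one one-step and two two-step members exist."""
    m1 = one_step(facing, me)
    m2 = two_step(facing, me)
    return {"one_step_mean": mean(x[m] for m in m1),
            "two_step_var": pvariance([x[m] for m in m2])}
```

These surrounding features are then correlated with the person's objective variable, exactly as the person's own explanatory variables are.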
  • The correlation (CAE4) computes the correlations between the person's objective variable (CAE1) and both the person's explanatory variable (CAE2) and the surrounding explanatory variables (CAE3).
  • The correlation results are stored in the factor coefficient (FAC) of the analysis result table (F).
  • Period: July 1-July 31, 2009 (FAC1) is the period of the organization dynamics indices used for analysis.
  • the factor coefficient (FAC) is part of a table that stores the correlation coefficient obtained by the correlation (CAE4).
  • The person (FACA) is the result of the correlation between the person's objective variable (CAE1) and the person's explanatory variable (CAE2).
  • The one-step average (FACB) is the result of the correlation between the person's objective variable (CAE1) and the surrounding explanatory variables (CAE3) (here, the average of the explanatory variables of the one-step members).
  • The two-step variance (FACC) is the result of the correlation between the person's objective variable (CAE1) and the surrounding explanatory variables (CAE3) (here, the variance of the explanatory variables of the two-step members).
  • Since the factor coefficient (FAC) stores the correlation results between the person and the surrounding members, the results for the variance of the explanatory variables of the one-step members and the average of the explanatory variables of the two-step members may also be stored.
  • the table configuration of the person (FACA) will be described.
  • The objective variables (FACA1) on the vertical axis are variables stored as the objective variables (FAB) of the analysis result table (F) or the objective variables (HA) of the organization information table (H). Accordingly, items such as performance (FACA2) and contribution (FACA3), which form the productivity index (HA) of the organization information table (H), are entered.
  • the horizontal axis is a variable stored as an explanatory variable (FAA) of the analysis result table (F).
  • The items of degree (FACA6), cohesion (FACA7), and two-step reach (FACA8), which form the network index (FAAA) of the explanatory variables (FAA) of the analysis result table (F), are entered.
  • the personality index (FACA 9) obtained from the questionnaire is also stored as an explanatory variable.
  • the openness (FACA 10) is shown as an example of the personality index.
  • The correlation between performance (FACA2) and cohesion (FACA7) among the network indices (FACA5) is 0.47, and the 0.01 enclosed in parentheses is the test result.
  • A P value is used as the test in this factor coefficient (FAC).
  • The P value is the probability of observing a result at least as extreme as the phenomenon actually observed.
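The correlation coefficient stored in the factor coefficient (FAC) can be computed as below. The patent does not specify how the P value is derived; the usual route (an assumption here) is a t statistic with n-2 degrees of freedom:

```python
# Sketch of the correlation (CAE4): Pearson correlation between an
# objective variable and an explanatory variable across members.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """A P value can be obtained from this statistic with n-2 degrees of
    freedom (e.g. via a t-distribution survival function); assumed method."""
    return r * math.sqrt((n - 2) / (1 - r * r))
```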
  • The one-step average (FACB) and two-step variance (FACC) tables have the same configuration as the person (FACA) table.
  • CAE Correlation analysis
  • CAF Factor selection
  • In model drawing, a model is drawn using the objective variable and the explanatory variables with high coefficient values selected by factor selection (CAF).
  • CAF factor variables
  • KA scientific management knowledge model
  • Period: July 1-July 31, 2009 (KA10) is the period of the organization dynamics indices used for analysis.
  • Subjective or objective indices such as the stress or productivity of each member are used as objective variables, and exhaustive organization dynamics indices such as the body rhythm indices and face-to-face indices are used as explanatory variables.
  • Correlation analysis is performed with these organization dynamics indices as the explanatory variables. Thereby, the factors behind a subjective or objective index in the organization under analysis can be identified, and the model can specify concretely which organizational behavior should be improved.
  • Although the organization dynamics indices are used as the explanatory variables in the present embodiment, indices obtained from the questionnaires (G) may be used as explanatory variables. Furthermore, an organization dynamics index may be used as the objective variable.
  • In Example 1, the personality index (FAAE) was obtained from the personality questionnaire (GA). In this example, a model is learned from past personality questionnaires (GA), and the current personality index (FAAE) is obtained from the model and the current organization dynamics indices.
  • FIG. 13 shows personality index extraction (CA1), which learns from past personality questionnaires (GA) and obtains the personality index coefficients (FAE), i.e., the coefficients of the model, and personality index conversion (CA2), which obtains the current personality index (FAF) from the current organization dynamics indices and the personality index coefficients (FAE).
  • CA1 personality index extraction
  • GA personality questionnaire
  • FAAA organization dynamics index
  • FAAB physical rhythm index
  • FAAC face-to-face index
  • FAAD activity index
  • CAE personality index coefficient
  • The processing up to obtaining the network index (FAAA), body rhythm index (FAAB), face-to-face index (FAAC), activity index (FAAD), and personality index (FAAE) is the same as in the first embodiment.
  • the personality index coefficient extraction (CA1A) will be described.
  • Multiple regression is performed with the personality index (FAAE) as the objective variable and the network index (FAAA), body rhythm index (FAAB), face-to-face index (FAAC), and activity index (FAAD) as the explanatory variables.
  • The coefficients and constant term of the multiple regression equation are thereby determined.
  • The personality index coefficient (FAE) of the analysis result table (F) shown in FIGS. 14 and 15 is a table summarizing the coefficients and constant terms of the multiple regression equations for each personality index (FAAE).
  • FIG. 14 summarizes the coefficients and constant terms of the multiple regression equations for the organization.
  • FIG. 15 summarizes the coefficients and constant terms of the multiple regression equations for each user.
  • The vertical axis shown in FIG. 14 is the objective variable, personality (FAE1), which is composed of extroversion (FAE2), agreeableness (FAE3), conscientiousness (FAE4), neuroticism (FAE5), and openness (FAE6).
  • The horizontal axis is the explanatory variables, in which the coefficients of the multiple regression equations for indices such as the network index (FAE8) and the personality index (FAE12) are stored.
  • The intercept (FAE14) is the constant term of the multiple regression equation. The same applies to the vertical and horizontal axes shown in FIG. 15. Although multiple regression analysis is used in personality index extraction (CA1), any other learning method may be used.
  • Personality index conversion obtains the current personality index (FAG) from the personality index coefficients (FAE) obtained by personality index extraction (CA1) and the current organization dynamics indices.
  • FAE personality index coefficient
  • FAG estimated personality index
  • In personality index conversion (CA2A), the coefficients and constant terms of the multiple regression equation stored in the personality index coefficients (FAE) and the current values of the organization dynamics indices, i.e., the network index (FAAA), body rhythm index (FAAB), face-to-face index (FAAC), and activity index (FAAD), are used. By substituting them into the multiple regression equation, the personality index is obtained from the organization dynamics indices.
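The learn-then-convert flow can be sketched with ordinary least squares standing in for the unspecified learning method (the text allows any learning method; OLS and the array layout are assumptions):

```python
# Sketch of personality index extraction (CA1) and conversion (CA2A).
# Rows of X: past organization dynamics indices (network, body rhythm,
# face-to-face, activity); y: one personality index from past questionnaires.
import numpy as np

def fit_personality_coefficients(X, y):
    """Return (coefficients, intercept) of the multiple regression equation,
    as stored in the personality index coefficient (FAE) table."""
    A = np.hstack([np.asarray(X, float), np.ones((len(X), 1))])
    sol, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return sol[:-1], sol[-1]

def estimate_personality(coef, intercept, current_indices):
    """Personality index conversion: apply the learned equation to the
    current organization dynamics indices."""
    return float(np.dot(coef, current_indices) + intercept)
```

The same pattern extends to the other questionnaire-derived indices, as the text notes for Example 2.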
  • FAE personality index coefficient
  • FAAA network index
  • FAAB physical rhythm index
  • FAAC face-to-face index
  • FAAD activity indicator
  • the estimated personality index (FAG) of the analysis result table (F) is a personality index obtained by personality index conversion (CA2A).
  • An example of the table is shown in FIG.
  • The table format is the same as that of the personality index (FAAE) of the analysis result table (F), so its description is omitted.
  • Date: August 15, 2009 14:32 (FAG7) shows the date and time of the current organization dynamics indices used in the analysis.
  • Here the personality index was obtained, but the other questionnaires (Leadership/Teamwork Questionnaire (GB), Employee Satisfaction/Fulfillment Questionnaire (GC), Stress/Mental Disorder Questionnaire (GD), Organization Activation Questionnaire (GE)) can be handled in the same way.
  • GB Teamwork Questionnaire
  • GC Employee's Challenge / Enrichment Questionnaire
  • GD Stress / Mental Disorder Questionnaire
  • GE Organization Activation Questionnaire
  • In this example, a network diagram that can simultaneously display users and the sets they face is generated.
  • Conventional network diagrams cannot show facing sets.
  • This problem is solved by letting the points (nodes) of the network diagram represent both users and facing sets.
  • FIG. 23 is a diagram showing the processing procedure for simultaneously displaying users and facing sets.
  • A model is constructed by the facing set network modeling analysis (CB), drawing is performed by the facing set network diagram drawing (JB), and the drawn result is the facing set network diagram (KB).
  • This can be processed in the same framework as the first embodiment: the facing set network modeling analysis (CB) is executed by the control unit (ASCO) of the application server (AS), and the facing set network diagram drawing (JB) is executed by the display (J) of the client (CL).
  • ASCO control unit
  • AS application server
  • JB face-to-face pair network diagram drawing
  • Determining facing sets by facing time is a process for obtaining the facing sets and their facing times. Since the members facing each other are recorded in the face-to-face table (FC1A), the sets and their times are obtained from it. The result is stored in the face-to-face person facing time list (FBA) of the analysis result table (F).
  • The table is composed of a facing set (FBA1) and a facing time (FBA2).
  • Period: 2009/7 / 1-July 31 (FBA3) is the period used for the facing table (FC1A).
  • Days: 31 days (FBA 4) is the number of days in the period (FBA 3).
  • Actual days: 21 days (FBA5) is the number of business days in the period (FBA3).
  • Meeting determination time: 3 minutes/day (FC1C6) is a threshold for determining that a meeting has occurred.
  • FBA face-to-face meeting time list
  • As long as the necessary information is included, a table configuration different from that of the face-to-face person facing time list (FBA) may be used.
  • Facing set table generation is a process of organizing the face-to-face person facing time list (FBA) into facing sets of users.
  • Facing branch pruning is a process of deleting entries with a small facing time (FBA2) from the face-to-face person facing time list (FBA).
  • As the threshold, a value obtained by multiplying the meeting determination time (FBA6) by the actual number of days (FBA5) may be used, and entries larger than the threshold are kept.
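A minimal sketch of this pruning rule, using the example figures from the tables above (3 minutes/day over 21 business days); the dict layout is assumed:

```python
# Sketch of facing branch pruning (CBB1): drop facing sets whose total
# facing time does not exceed meeting-determination-time-per-day times
# the actual number of business days.

def prune_pairs(pair_times, per_day_minutes=3, actual_days=21):
    """pair_times: {facing set: total facing time in minutes}.
    Keeps only entries strictly above the threshold (3 x 21 = 63 here)."""
    threshold = per_day_minutes * actual_days
    return {pair: t for pair, t in pair_times.items() if t > threshold}
```

Pruning keeps the node count of the network diagram manageable; the same rule is reused for the place-based diagram in the next example.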
  • The facing set table (FBB) is generated from the list output by the facing branch pruning (CBB1).
  • The facing set table is composed of a facing set facing time list (FBBA) and a facing set connection matrix (FBBB).
  • FBBA facing group facing time list
  • FBBB facing group connection matrix
  • The facing set facing time list (FBBA) indicates the sizes of the nodes (points) in the network diagram. It consists of a facing set (FBBA1) and a facing time (FBBA2). Period: July 1-July 31, 2009 (FBBA3) indicates the period used in the face-to-face table (FC1A).
  • The facing set connection matrix indicates the edges (lines) in the network diagram.
  • Its vertical and horizontal axes list both the users and the facing sets.
  • Period: July 1-July 31, 2009 (FBBB3) shows the period used in the face-to-face table (FC1A).
  • 1 is entered to connect a facing set and its constituent members by an edge (line). Furthermore, for facing sets in an inclusion relation (FBBB4), 1 is entered to connect the larger facing set with the smaller facing set it includes by an edge (line) (FBBB5). The members of the larger facing set that belong to the included smaller facing set are then not connected to the larger set by an edge (line) (FBBB6).
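The inclusion-relation edge rule can be sketched as follows, under an assumed representation of facing sets as frozensets of user IDs:

```python
# Sketch of building the facing set connection matrix (FBBB) as an edge set:
# each facing set connects to its member users, except that when a smaller
# set is included in a larger one, the larger set connects to the smaller
# set instead of to the smaller set's members.

def set_network_edges(facing_sets):
    """facing_sets: list of frozensets of user IDs. Returns edges between
    node labels (a user ID string, or a frozenset for a facing set)."""
    edges = set()
    for s in facing_sets:
        # proper subsets of s among the other facing sets
        subsets = [t for t in facing_sets if t < s]
        covered = set().union(*subsets) if subsets else set()
        for t in subsets:
            edges.add((s, t))            # larger set -- smaller set
        for member in s - covered:
            edges.add((s, member))       # set -- member not in any subset
    return edges
```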
  • As long as the necessary information is included, a table configuration different from that of the facing set table (FBB) may be used.
  • the user ID table (IA) of the user / place information table (I) will be described. An example of this table is shown in FIG.
  • the user ID table (IA) is a table for associating user IDs with information such as names and team names.
  • JB face-to-face combination network diagram drawing
  • CBB face-to-face combination table generation
  • Members are indicated by square points (nodes), and facing sets by circular points (nodes).
  • Member points (nodes) are placed on the outside, and facing set points (nodes) on the inside.
  • The size of a point (node) is determined by the facing set facing time list (FBBA).
  • Points (nodes) whose entry in the facing set connection matrix (FBBB) is 1 are connected by lines (edges).
  • In this example, a network diagram that can simultaneously display users and places is generated.
  • Conventional network diagrams cannot show the relation between users and places. This problem is solved by letting the points (nodes) of the network diagram represent both users and places.
  • FIG. 26 is a diagram showing a processing procedure for simultaneously displaying the user and the place.
  • a model is constructed by field network modeling analysis (CC), and drawing is performed by field network diagram drawing (JC).
  • the result of drawing is a field network diagram (KC).
  • CC field network modeling analysis
  • AS application server
  • JC field network diagram drawing
  • In the place table process (C1D), the staying situation of members at each place is summarized in chronological order for each given period from the infrared data of the organization dynamics data.
  • An infrared terminal emitting infrared light is installed at each place, and when the name-tag-type sensor node (TR) detects the infrared light installed at a place, the user is determined to be staying at that place.
  • TR name tag type sensor node
  • the place ID table (IB) of the user / place information table (I) will be described.
  • An example of this table is shown in FIG.
  • the place ID table (IB) is a table for associating a place ID, a place name and an infrared ID. It consists of a place ID (IB1), a place name (IB2), and an infrared ID (IB3).
  • the place name (IB2) is the name of the place
  • the infrared ID (IB3) is the ID of the infrared terminal installed in the place ID (IB1).
  • A plurality of infrared terminals may be installed at a place. In that case, the plurality of infrared IDs are described in the infrared ID (IB3).
  • the extracted result is stored in the place table (FC1D) of the analysis result table (F).
  • FC1D place table
  • This is a table in which one day (24 hours) is stored in chronological order with a time resolution of one minute (FC1D3), with one place as one record.
  • The vertical axis is the place ID (FC1D1) identifying the place.
  • FC1D2 resolution time
  • To obtain the facing situation at a place at a certain time, it suffices to read the cell for that place ID (FC1D1) and resolution time (FC1D2).
  • FC1D1 place ID
  • FC1D2 resolution time
  • The value stored there is the user ID(s) of the persons staying at the place, or NULL.
  • NULL is stored in the place table (FC1D) when there is no infrared data in the organization dynamics data of the corresponding user at that time.
  • FC1D place table
  • CCA place and user stay time
  • Since the place table (FC1D) describes, for each place, the members staying there in chronological order, the sets of staying members and their times are organized into a table from it. The summarized result is the face-to-face person facing time list (FCA) of the analysis result table (F).
  • The meeting ID (FCA1) forms one record, and the facing time of each member or staying set (FCA2) is described.
  • Period: 2009/7 / 1-July 31 (FCA3) is the period used for the place table (FC1D).
  • Days: 31 days (FCA 4) is the number of days in the period (FCA 3).
  • Real Days: 21 days (FCA 5) is the number of business days in the period (FCA 3).
  • Meeting determination time: 1 minute/day (FCA6) is a threshold for determining that a user stayed at the place.
  • User/place table generation is a process of organizing the face-to-face person facing time list (FCA) into sets of users and places.
  • Place branch pruning is a process of deleting entries with a small meeting time from the face-to-face person facing time list (FCA).
  • FCA meeting time list
  • As the threshold, a value obtained by multiplying the meeting determination time (FCA6) by the actual number of days (FCA5) may be used, and entries larger than that value are kept. Place branch pruning (CCB1) is performed because otherwise there would be too many points (nodes) in the network diagram. Any other method may be used.
  • The user/place table (FCB) is generated from the list output by the place branch pruning (CCB1).
  • The user/place table (FCB) is composed of a place-by-place meeting time list (FCBA) and a user/place matrix (FCBB).
  • FCBA meeting time list by field
  • FCBB user / field matrix
  • The place-by-place meeting time list (FCBA) indicates the sizes of the nodes (points) in the network diagram. It consists of a place (FCBA1), a facing set (FCBA2), and a facing time (FCBA3).
  • Period: July 1-July 31, 2009 (FCBA 4) indicates the period used in the place table (FC 1 D).
  • The user/place matrix (FCBB) indicates the edges (lines) in the network diagram.
  • The user/place matrix (FCBB) lists the places and users on its vertical and horizontal axes.
  • Period: July 1-July 31, 2009 (FCBB3) shows the period used in the place table (FC1D).
  • 1 is entered to connect a place and a member staying there by an edge (line).
  • As long as the necessary information is included, a table configuration different from that of the user/place table (FCB) may be used.
  • JC field network diagram drawing
  • KC place network diagram (Period: July 1-July 31, 2009)
  • FC1D place table
  • Members are indicated by square points (nodes), and places by circular points (nodes).
  • Member points (nodes) are placed on the outside, and place points (nodes) on the inside.
  • The size of a point (node) is determined by the place-by-place meeting time list (FCBA).
  • Place points (nodes) are displayed as pie charts. Points (nodes) whose entry in the user/place matrix (FCBB) is 1 are connected by lines (edges).
  • The points (nodes) are displayed like Watanabe (KC2), Ito (KC1), and the conference room (KC3).
  • The point (node) of the conference room (KC3) is displayed as a pie chart divided by the ratio of facing times, showing the users and facing sets staying there (for example, Ito (KC5) and Watanabe-Ito (KC4)).
  • In this example, mapping onto a floor plan is performed to improve the sense of reality.
  • FIG. 29 is a diagram showing a processing procedure for mapping a team to a place.
  • A model is constructed by the place team modeling analysis (CD); for display by number of people, drawing is performed by the number-of-people map drawing of the place (JDA), and the drawn result is the number-of-people map of the place (KDA).
  • JDA people map drawing
• KDA people map of the place
  • JDB inside and outside team map drawing
  • KDB inside and outside team map
• This process can be executed in the same framework as the first embodiment: the place team modeling analysis (CD) runs in the control unit (ASCO) of the application server (AS), and the number-of-people map drawing (JDA) and the team inside/outside map drawing (JDB) of the place are executed on the display (J) of the client (CL).
  • CD place team modeling analysis
  • AS application server
  • JDA map drawing by number of people
  • JDB team internal and external map drawing
  • the process until the location table (FC1D) of the analysis result table (F) is obtained is the same as that of the fourth embodiment, so the description is omitted.
• Meeting time by place and number of people (CDA) is a process for obtaining, at each place, the meeting time classified by the number of persons meeting. Since the members registered at the place are listed in chronological order in the place table (FC1D), the meeting time by number of persons is calculated from this. The result is stored in the meeting time by number of persons at the place (FDA) of the analysis result table (F).
• the table configuration is composed of a place ID (FDA1) and a number of people (FDA2).
• Each place ID (FDA1) forms one record, and the face-to-face time under the number of people (FDA2) is described.
• the number of people (FDA2) is classified with a resolution of 1 person (FDA6), 2 people (FDA7), 3-5 people (FDA8), and 6 people or more (FDA9).
• Period: 2009/7/1-July 31 is the period used for the place table (FC1D).
• Days: 31 days (FDA4) is the number of days in the period (FDA3).
• Real Days: 21 days (FDA5) is the number of business days in the period (FDA3).
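A minimal sketch of the meeting time by number of persons (CDA) described above, assuming the place table (FC1D) is supplied as per-minute records of (place_id, members present); the bucket names mirror the 1 / 2 / 3-5 / 6+ resolution (FDA6-FDA9). All names here are illustrative, not from the patent.

```python
from collections import defaultdict

BUCKETS = ["1", "2", "3-5", "6+"]  # number-of-people resolution (FDA6-FDA9)

def bucket_for(n):
    if n <= 1:
        return "1"
    if n == 2:
        return "2"
    if n <= 5:
        return "3-5"
    return "6+"

def meeting_time_by_headcount(place_records):
    """place_records: iterable of (place_id, members_present), one per 1-minute slot.
    Returns {place_id: {bucket: minutes}}, the shape of the FDA table."""
    fda = defaultdict(lambda: dict.fromkeys(BUCKETS, 0))
    for place_id, members in place_records:
        if not members:
            continue  # an empty slot contributes no time
        fda[place_id][bucket_for(len(members))] += 1
    return {p: dict(b) for p, b in fda.items()}

records = [
    ("meeting_room", {"001", "002"}),                # 2 people, 1 minute
    ("meeting_room", {"001", "002"}),
    ("meeting_room", {"001", "002", "003", "004"}),  # 3-5 people
    ("desk_A", {"005"}),                             # 1 person
]
print(meeting_time_by_headcount(records))
```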
  • the field list (IC) is composed of a field map image (ICA) which is a sketch and field coordinates (ICB) in which coordinates of places are described.
• the field map image (ICA) is a sketch image.
  • the field coordinates (ICB) are composed of a place ID (ICB1) and coordinate values (ICB2).
  • the location ID (ICB1) is one record, and the X coordinate value (ICB3) and the Y coordinate value (ICB4) in the coordinate value (ICB2) are described.
  • the map according to the number of people in the place is a process of mapping from the meeting time according to the number of people in the place (FDA) to a sketch and drawing.
  • the result of drawing is a map by number of people (KDA) of the place of FIG.
• The meeting time by number of persons from the meeting time by number of persons at the place (FDA) is plotted as a pie chart at the corresponding place in the sketch.
  • the central angle of the pie chart changes according to the ratio of the facing time by the number of people.
  • the size of the pie chart is the sum of the meeting time by number of people.
  • the place name (IB2) is displayed from the place ID table (IB) near the corresponding place in the sketch.
• Period: 2009/7/1-July 31 (KDA1) is the period used for the place table (FC1D).
• The team inside/outside meeting time at the place (CDB) is a process for determining the in-team and out-of-team meeting time at each place. Since the members registered at the place are listed in chronological order in the place table (FC1D), whether a meeting is in-team or out-of-team is judged from this using the user ID table (IA), and each meeting time is obtained. The result is stored in the team inside/outside meeting time at the place (FDB) of the analysis result table (F).
• the table configuration is composed of a place ID (FDB1) and inside/outside (FDB2). Each place ID (FDB1) forms one record, and the meeting time under inside/outside (FDB2) is described.
• Inside/outside (FDB2) is classified with the resolution of in-team (FDB6) and out-of-team (FDB7).
• Period: 2009/7/1-July 31 (FDB3) is the period used for the place table (FC1D).
• Days: 31 days (FDB4) is the number of days in the period (FDB3).
• Actual days: 21 days (FDB5) is the number of business days in the period (FDB3).
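A minimal sketch of the team inside/outside meeting time (CDB), assuming the user ID table (IA) provides a user-to-team mapping and the place table (FC1D) provides per-minute records of members present; a slot counts as in-team only when all participants share one team (an assumption about the judgment rule). Names are illustrative.

```python
from collections import defaultdict

user_team = {"001": "A", "002": "A", "003": "B"}  # hypothetical IA contents

def team_in_out_time(place_records, user_team):
    """Returns {place_id: {"in": minutes, "out": minutes}} (cf. FDB6/FDB7)."""
    fdb = defaultdict(lambda: {"in": 0, "out": 0})
    for place_id, members in place_records:
        if len(members) < 2:
            continue  # a meeting needs at least two people
        teams = {user_team[m] for m in members}
        key = "in" if len(teams) == 1 else "out"
        fdb[place_id][key] += 1
    return dict(fdb)

records = [
    ("meeting_room", {"001", "002"}),         # same team A  -> in-team
    ("meeting_room", {"001", "003"}),         # A and B      -> out-of-team
    ("meeting_room", {"001", "002", "003"}),  # mixed        -> out-of-team
]
print(team_in_out_time(records, user_team))
```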
• the team inside/outside map drawing (JDB) of a place is a process which maps and draws the team inside/outside facing time of the place (FDB) onto a sketch.
  • the drawn result is a map inside and outside the team (KDB) in the place of FIG.
• The meeting time by team inside/outside from the team inside/outside meeting time of the place (FDB) is plotted as a pie chart at the corresponding place in the sketch.
• the central angle of the pie chart changes according to the ratio of the facing time inside and outside the team.
  • the size of the pie chart is the sum of face-to-face contact times inside and outside the team.
  • the place name (IB2) is displayed from the place ID table (IB) near the corresponding place in the sketch.
• Period: 2009/7/1-July 31 (KDB3) is the period used for the place table (FC1D).
• By mapping team information onto the places in a sketch, the sense of reality can be improved.
  • FIG. 32 is a diagram showing a processing procedure for mapping a team to a place.
• The model is constructed by the place facing set modeling analysis (CE). In the case of display by facing set, drawing is performed by the facing set map drawing (JEA) of the place, and the drawn result is the facing set map of the place (KEA).
• In the case of display by user, drawing is performed by the user map drawing (JEB) of the place, and the drawn result is the user map of the place (KEB).
• This process can be executed in the same framework as the first embodiment: the place facing set modeling analysis (CE) runs in the control unit (ASCO) of the application server (AS), and the facing set map drawing (JEA) and the user map drawing (JEB) of the place are executed on the display (J) of the client (CL).
  • the process until the location table (FC1D) of the analysis result table (F) is obtained is the same as that of the fourth embodiment, so the description is omitted.
• Meeting time by facing set at the place is a process for obtaining the meeting time of each facing set at a place. Since the members registered at the place are described in chronological order in the place table (FC1D), the time for which each facing set met is determined from this. The result is stored in the facing set meeting time at the place (FEA) of the analysis result table (F).
  • the table configuration is composed of a place ID (FEA1) and a user / face-to-face pair (FEA2).
  • the meeting time in the user / face-to-face pair (FEA2) is described with the place ID (FEA1) as one record.
  • the user / face-to-face pair (FEA 2) stores the registered user ID and its face-to-face time.
• Period: 2009/7/1-July 31 (FEA3) is the period used for the place table (FC1D).
• Days: 31 days (FEA4) is the number of days in the period (FEA3).
• Real Days: 21 days (FEA5) is the number of business days in the period (FEA3).
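A minimal sketch of extracting the facing set meeting time (FEA), assuming each per-minute record of the place table (FC1D) lists the members present: each distinct set of co-present members is treated as one facing set and its minutes are accumulated. Names are illustrative.

```python
from collections import defaultdict

def face_group_time(place_records):
    """place_records: iterable of (place_id, members_present) per 1-minute slot.
    Returns {place_id: {frozenset(members): minutes}}, the shape of FEA."""
    fea = defaultdict(lambda: defaultdict(int))
    for place_id, members in place_records:
        if len(members) >= 2:  # a facing set needs at least two members
            fea[place_id][frozenset(members)] += 1
    return {p: dict(groups) for p, groups in fea.items()}

records = [
    ("meeting_room", {"Ito", "Watanabe"}),
    ("meeting_room", {"Ito", "Watanabe"}),
    ("meeting_room", {"Ito"}),  # alone: not a facing set
]
print(face_group_time(records))
```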
  • the meeting group map drawing (JEA) of the place is a process of mapping and drawing from the meeting group facing time (FEA) of the place to the sketch.
  • the drawn result is a face-to-face set map (KEA) of the field of FIG.
• The facing set meeting times from the facing set meeting time at the place (FEA) are plotted as circles at the corresponding places in the floor plan.
• the diameter of the circle changes according to the facing time of the facing set. Also, if there are multiple circles at the same place, they are shifted so as not to overlap.
  • the user name (IA2) is selected from the user ID table (IA) and the names of the members of the face-to-face set are displayed.
  • the place name (IB2) is selected from the place ID table (IB) and displayed.
• Period: 2009/7/1-July 31 (KEA3) is the period used for the place table (FC1D).
• Meeting time by user at the place is a process for determining each user's meeting time at a place. Since the members registered at the place are described in chronological order in the place table (FC1D), the meeting time of each member is determined from this using the user ID table (IA). The result is stored in the user facing time at the place (FEB) of the analysis result table (F).
  • the table configuration is composed of a place ID (FEB1) and a user ID (FEB2).
  • the meeting time in the user ID (FEB2) is described with the place ID (FEB1) as one record.
  • Period: July 1-July 31, 2009 (FEB3) is the period used for the place table (FC1D).
• Days: 31 days (FEB4) is the number of days in the period (FEB3).
  • Real Days: 21 days is the number of business days in the period (FEB3).
  • the user map drawing (JEB) of the place is a process of mapping and drawing from the user facing time (FEB) of the place to the sketch.
  • the drawn result is the user map (KEB) of the field of FIG.
• For each user, a circle is plotted at the corresponding place in the floor plan from the user facing time at the place (FEB).
  • the diameter of the circle changes according to the face-to-face time of the user. Also, if there are multiple circles in the same place, shift so as not to overlap.
  • the user's name (IA2) is selected from the user ID table (IA) and the name of the member is displayed.
  • the place name (IB2) is selected from the place ID table (IB) and displayed.
• Period: 2009/7/1-July 31 (KEB1) is the period used for the place table (FC1D).
  • mapping is performed on a sketch to improve the actual feeling.
  • FIG. 34 is a diagram showing a processing procedure for mapping the use situation of a place.
• A model is constructed by the place use modeling analysis (CF). To display the number of persons using a place in time series, drawing is performed by the place attendance graph (JF1) of the place use situation map drawing (JF).
• To display the frequency of use of a place, drawing is performed by the place use frequency graph (JF2) of the place use situation map drawing (JF).
• JF3 field temperature graph drawing
  • KFA field utilization situation map
  • CF place use modeling analysis
  • AS application server
  • JF field use situation map drawing
  • the process until the location table (FC1D) of the analysis result table (F) is obtained is the same as that of the fourth embodiment, so the description is omitted.
  • the use situation by place (CFA) is a process for obtaining the average use rate and the average number of people in the place. Since the members registered in the place are listed in chronological order in the place table (FC1D), the average use rate and the average number of people in the place are determined from this.
  • This result is stored in the field attendance time (FFA) of the analysis result table (F).
  • the table configuration is composed of a place ID (FFA1) and an index (FFA2).
• the usage rate (FFA3) and the average number of people (FFA4) are described with the place ID (FFA1) as one record.
• Period: 2009/7/1-July 31 (FFA5) is the period used for the place table (FC1D).
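A minimal sketch of the use situation by place (CFA), assuming the place table (FC1D) yields, for one place, the head count in each 1-minute slot. The usage rate is the fraction of occupied slots; the average number of people is taken over occupied slots (an assumption, since the patent does not fix the denominator).

```python
def place_usage(place_slots):
    """place_slots: list of head counts per 1-minute slot for one place.
    Returns (usage_rate, average_people) — cf. FFA3 and FFA4."""
    total = len(place_slots)
    occupied = [n for n in place_slots if n > 0]
    usage_rate = len(occupied) / total if total else 0.0
    avg_people = sum(occupied) / len(occupied) if occupied else 0.0
    return usage_rate, avg_people

slots = [0, 0, 2, 4, 0, 3, 0, 0]  # 8 one-minute slots
print(place_usage(slots))  # 3 of 8 slots occupied; mean head count 3.0
```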
  • the place attendance graph (JF1) is a process of drawing the use situation of places on a time series from the place table (FC1D) and the place attendance time (FFA).
  • the result of drawing is a field attendance graph (KFA2) of FIG.
  • the number of registered persons is displayed as a line graph on the time series from the place table (FC1D).
• The usage rate (FFA3) and the average number of people (FFA4) obtained from the field attendance time (FFA) are displayed.
• The place use count is a process for determining the number of times a place is used in each time slot.
  • the table configuration is composed of a place ID (FFB1) and a time (FFB2).
• Each place ID (FFB1) forms one record, and the number of times of use by time is described.
• Period: 2009/7/1-July 31 (FFA7) indicates the period used for the place table (FC1D).
  • the site use frequency graph (JF2) is a process of drawing the number of use times at a place from the site use time (FFB).
• the drawn result is a field utilization time graph (KFA3) of FIG.
  • the number of times of use is displayed as a bar graph by time from the field use time (FFB). Furthermore, it may be a pie chart.
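A minimal sketch of the data behind the use-count-by-time bar graph, assuming each use of a place is recorded with a start timestamp; counts are bucketed by hour of day. Names are illustrative.

```python
from collections import Counter
from datetime import datetime

def use_count_by_hour(start_times):
    """start_times: iterable of datetimes marking the start of each use of a
    place. Returns {hour: count} — the data plotted as the bar graph (KFA3)."""
    return dict(Counter(t.hour for t in start_times))

starts = [
    datetime(2009, 7, 1, 10, 5),
    datetime(2009, 7, 2, 10, 40),
    datetime(2009, 7, 2, 15, 0),
]
print(use_count_by_hour(starts))  # two uses in the 10:00 hour, one at 15:00
```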
• Temperature table process (C1E)
• The temperature data of the organization dynamics data is summarized in time series for each fixed period.
  • the extracted result is stored in the temperature table (FC1E) of the analysis result table (F).
  • FC1E temperature table
• FIG. shows a table in which one day (24 hours) is stored in chronological order, with one user as one record and a time resolution of one minute (FC1E3).
  • the vertical axis is a user ID (FC1E1) for identifying a member individual, and the horizontal axis is a resolution time (FC1E2) indicating time by time resolution.
• FC1E2 resolution time
• To obtain the temperature of a user at a certain time, it is only necessary to read the cell where the user ID (FC1E1) and the resolution time (FC1E2) intersect.
• For example, the temperature of user ID 001 at 2009/7/11 10:02 is 23.5.
• NULL is stored in the temperature table (FC1E) when there is no temperature data of the organization dynamics data for the corresponding user at that time.
  • FC1E temperature table
• Although the dates of FC1E4 and FC1E5 are the same (July 2, 2009), they are held as different tables because their time resolutions differ.
• As long as this requirement is satisfied, a table configuration different from that used in the temperature table (FC1E) may be adopted.
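A minimal sketch of the temperature table process (C1E): per-user sensor readings are pivoted into a user × minute table, with None standing in for the NULL of the patent where no reading exists. Names and the minute-key format are illustrative.

```python
def build_temperature_table(readings, minutes):
    """readings: {(user_id, minute_key): temperature};
    minutes: ordered list of minute keys covering one day.
    Returns {user_id: [temperature or None per minute]} — the shape of FC1E."""
    users = sorted({u for u, _ in readings})
    return {u: [readings.get((u, m)) for m in minutes] for u in users}

minutes = ["10:01", "10:02", "10:03"]
readings = {("001", "10:02"): 23.5, ("002", "10:01"): 24.1}
table = build_temperature_table(readings, minutes)
print(table)  # missing slots become None (NULL)
```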
  • CFC Field Average Temperature
  • the table configuration is composed of a place ID (FFC1) and an index (FFC2).
  • the usage rate (FFA3) is described with the place ID (FFA1) as one record.
• Period: 2009/7/1-July 31 (FFA4) indicates the period used for the temperature table (FC1E).
• The field temperature graph drawing (JF3) is a process of drawing the temperature at a place in time series from the temperature table (FC1E) and the field average temperature (FFC).
• the drawn result is a field temperature graph (KFA4) of FIG.
• The temperature data of the corresponding user from the temperature table (FC1E) is displayed as a line graph in time series. Then, the average temperature (FFC3) obtained from the field average temperature (FFC) is displayed.
• the field utilization time graph (KFB3) and the field temperature graph (KFB4) are then mapped and drawn on a sketch.
  • the result of drawing is the field utilization situation map (KFA) of FIG.
• The icons are, for example, (KFA12).
• The icon of the place name (KFA1) is arranged at the corresponding place in the sketch.
  • KFA2 the on-site field attendance graph
  • KFA3 the field usage time graph
  • KFA4 the field temperature graph
  • the communication frequency is visualized by reflecting the meeting period on the network diagram.
  • FIG. 37 is a diagram showing a processing procedure for reflecting the communication cycle on the network diagram.
  • a meeting time and a meeting period are determined by meeting cycle analysis (CG), and a network diagram is created by a meeting cycle network diagram drawing (JGA), and the result is a meeting cycle network diagram (KGA).
  • JGA meeting cycle network diagram drawing
  • KGA meeting cycle network diagram
• A histogram showing, for each user, the meeting time classified by meeting period is drawn by the meeting period histogram drawing (JGB).
  • KGB meeting period histogram diagram
• This process can be executed in the same framework as the first embodiment: the meeting cycle analysis (CG) corresponds to the control unit (ASCO) of the application server (AS), and the meeting cycle network diagram drawing (JGA) and the meeting period histogram drawing (JGB) are executed on the display (J) of the client (CL).
  • the process until obtaining the facing matrix (FC1C) of the analysis result table (F) is the same as that of the first embodiment, and hence the description is omitted.
  • FC1C facing matrix
• The facing matrix is created from a plurality of days (period: July 1-July 31, 2009 (FC1C3)); in Example 8, both the multi-day facing matrix shown in FIG. and a facing matrix for each single day are obtained.
• The face-to-face time stored in the facing matrix (FC1C) for each day is binarized against a certain threshold: values equal to or greater than the threshold become 1, and smaller values become 0.
• the threshold is the meeting determination time of the facing matrix (FC1C): 3 minutes/day (FC1C7).
• The binarized results are stored in the face-to-face binary matrix (FGA) of the analysis result table (F). Since the file format is the same as the facing matrix (FC1C), it is omitted.
• The difference from the facing matrix (FC1C) is the stored values: the facing matrix (FC1C) holds multiple values, while the face-to-face binary matrix (FGA) holds two values.
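A minimal sketch of the binarization step above: a single-day facing matrix is thresholded by the meeting determination time (3 minutes/day, FC1C7). Treating values at or above the threshold as 1 is an assumption about the boundary; the matrix shape is illustrative.

```python
def binarize_facing_matrix(matrix, threshold=3):
    """matrix: {user: {partner: minutes}} for one day.
    threshold: the meeting determination time (FC1C7), 3 minutes/day.
    Returns the face-to-face binary matrix (FGA) with the same shape."""
    return {u: {p: (1 if t >= threshold else 0) for p, t in row.items()}
            for u, row in matrix.items()}

day = {"001": {"002": 12, "003": 1}, "002": {"001": 12, "003": 0}}
print(binarize_facing_matrix(day))
```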
• Meeting cycle extraction (CGB) is a process for obtaining a meeting cycle from the one-day facing matrices (FC1C).
  • the meeting cycle is determined from the daily meeting of members.
• As a method of determining the meeting period, the actual number of days can be divided by the number of days on which the pair met.
  • the method of determining the meeting period may be another method.
  • the result obtained by the meeting period extraction (CGB) is stored in the meeting period matrix (FGB) of the analysis result table (F).
• An example of the facing period matrix (FGB) is shown in FIG. It compiles the meeting cycle results for one month.
  • the vertical axis is a user ID (FGB1) for identifying a member individual
  • the horizontal axis is a user ID (FGB2) indicating a partner who has met.
• The meeting cycle (number of days) of user 002 with user 003 is 1.0, which means that they meet daily. As this value increases, the facing period becomes longer. For example, the value for user 001 with user 002 is 2.3, which means they meet about once every two days.
• Period: July 1-July 31, 2009 (FGB3) is the period used for the facing period matrix (FGB).
• As long as this requirement is satisfied, the facing cycle matrix (FGB) may use a table configuration different from that of the facing matrix (FC1C).
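A minimal sketch of the meeting cycle extraction (CGB) as described above: the cycle for a pair is the actual number of business days divided by the number of days on which the pair met (the text notes this is one possible method). Inputs follow the face-to-face binary matrix (FGA) shape, one matrix per day; names are illustrative.

```python
from collections import defaultdict

def meeting_cycles(daily_binary, real_days):
    """daily_binary: list of {user: {partner: 0 or 1}} matrices, one per day.
    real_days: actual number of business days (e.g. 21).
    Returns {user: {partner: cycle in days, or None if never met}} (cf. FGB)."""
    met_days = defaultdict(int)
    pairs = set()
    for day in daily_binary:
        for u, row in day.items():
            for p, v in row.items():
                pairs.add((u, p))
                if v:
                    met_days[(u, p)] += 1
    fgb = defaultdict(dict)
    for u, p in pairs:
        n = met_days[(u, p)]
        fgb[u][p] = round(real_days / n, 1) if n else None
    return dict(fgb)

day1 = {"002": {"003": 1, "001": 0}}
day2 = {"002": {"003": 1, "001": 1}}
print(meeting_cycles([day1, day2], real_days=2))  # met daily -> 1.0
```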
  • JGA face-to-face periodic network diagram drawing
  • KGA facing periodic network diagram
• Period: 2009/7/1-July 31 (KGA1) indicates the period used for the facing period matrix (FGB).
  • Members are indicated by circle points (nodes).
• lines (edges) connecting members indicate facing time and facing period.
• the thickness of the line indicates the facing time.
• the shape of the line indicates the facing period.
• The spring model assumes a spring between every pair of connected nodes (points) and computes the resulting (inward or outward) force; in addition, every node receives a repulsive force from all nodes not connected to it, according to the distance between them. Positions are moved repeatedly until an optimal arrangement is reached.
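A minimal sketch of the spring-model layout just described (a force-directed layout): connected nodes attract like springs, all pairs repel with a distance-dependent force, and positions are moved iteratively. The force laws and constants here are illustrative choices, not taken from the patent.

```python
import math
import random

def spring_layout(nodes, edges, iters=200, k=1.0, seed=0):
    """Returns {node: [x, y]} after iterative force-directed adjustment."""
    rnd = random.Random(seed)
    pos = {n: [rnd.random(), rnd.random()] for n in nodes}
    for _ in range(iters):
        force = {n: [0.0, 0.0] for n in nodes}
        for a in nodes:                      # repulsion between every pair
            for b in nodes:
                if a == b:
                    continue
                dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
                d = math.hypot(dx, dy) or 1e-6
                f = k * k / d                # stronger when nodes are close
                force[a][0] += f * dx / d
                force[a][1] += f * dy / d
        for a, b in edges:                   # spring attraction along edges
            dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
            d = math.hypot(dx, dy) or 1e-6
            f = d * d / k                    # stronger when nodes are far
            for n, s in ((a, -1), (b, 1)):
                force[n][0] += s * f * dx / d
                force[n][1] += s * f * dy / d
        for n in nodes:                      # small damped step
            pos[n][0] += 0.01 * force[n][0]
            pos[n][1] += 0.01 * force[n][1]
    return pos

pos = spring_layout(["Takahashi", "Tanaka", "Watanabe"],
                    [("Tanaka", "Watanabe")])
print(pos)
```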
  • the points (nodes) are arranged like Takahashi (KGA2), Tanaka (KGA3) or Watanabe (KGA4).
  • the user name (IA2) is obtained from the user ID (IA1) using the user ID table (IA) and displayed.
• Tanaka (KGA3) and Watanabe (KGA4) are connected by a line (edge) (KGA5), which indicates that the facing time is short but they meet daily.
• The line between Tanaka (KGA3) and Takahashi (KGA2) is (KGA6), which indicates that the meeting time is long but the meeting cycle is long (a cycle of several days).
• The meeting period histogram drawing (JGB) draws a histogram that takes the facing period into account for each member, from the facing period matrix (FGB), which holds the communication period, and the multi-day facing matrix (FC1C), which holds the amount of communication.
  • the drawn result is shown by a meeting period histogram chart (KGB).
• Period: 2009/7/1-July 31 (KGB2) indicates the period used for the facing period matrix (FGB).
  • the meeting time according to the meeting period for each member is shown.
• Takahashi (KGB2), Tanaka (KGB3), and Watanabe (KGB4) are users, and the band above each is the facing time classified by period.
• Each band consists of the facing time with members of cycle 2 or more (KGB5) and the facing time with members of cycle less than 2 (KGB6); the length of the band indicates the total facing time (KGB7).
  • the communication situation is visualized by reflecting the number of people facing each other at the time of communication on the network diagram.
  • FIG. 38 is a diagram showing a processing procedure for reflecting the number of people facing each other at the time of communication on the network diagram.
• The meeting time and the number of persons meeting are obtained by the meeting number analysis (CH), a network diagram is created by the meeting network diagram drawing by number of persons (JHA), and the result is the meeting network diagram by number of persons (KHA).
• The per-head-count network diagrams are then integrated into one: a network diagram is created by the maximum meeting time number-of-persons network diagram drawing (JHB), and the result is the maximum meeting time number-of-persons network diagram (KHB).
• This process can be executed in the same framework as the first embodiment: the meeting number analysis (CH) corresponds to the control unit (ASCO) of the application server (AS), and the meeting network diagram drawing by number of persons (JHA) and the maximum meeting time number-of-persons network diagram drawing (JHB) are executed on the display (J) of the client (CL).
  • the process until the meeting table (FC1A) of the analysis result table (F) is obtained is the same as that of the first embodiment, so the description is omitted.
  • Meeting matrix generation (CHA) according to the number of people in a meeting is a process of generating a matrix for each number of people in a meeting.
• The basic matrix generation method is the same as in the facing matrix generation (C1C). It differs in only one point: the facing matrix into which a value is stored depends on the number of persons meeting at the resolution time (FC1A2) of the facing table (FC1A).
• When the number of persons meeting at the resolution time (FC1A2) is 2, the value is substituted into the two-person facing matrix (FHAA) of the facing matrix by number of persons (FHA); when it is 3 to 5, into the three-to-five-person facing matrix (FHAB); and when it is 6 or more, into the six-or-more-person facing matrix (FHAC).
  • the meeting matrix generation (CHA) according to the number of meeting persons is a process of generating a matrix according to the predetermined number of meeting persons, and the range of the number of meeting persons in the meeting matrix can be arbitrarily determined.
• The facing matrix generation by number of persons (CHA) removes the time series information from the facing table (FC1A), which is arranged in time series, and aggregates, for each user, how long they met at each head count into two-dimensional matrices by number of persons.
  • the extracted result is stored in a meeting matrix by face number (FHA) of the analysis result table (F).
  • FHA face-to-face meeting matrix
• The facing matrix by number of persons (FHA) is composed of a plurality of facing matrices: the two-person facing matrix (FHAA), the three-to-five-person facing matrix (FHAB), and the six-or-more-person facing matrix (FHAC).
  • the vertical axis is a user ID (FHAA1) for identifying a member individual
  • the horizontal axis is a user ID (FHAA2) indicating the other party who has met.
  • the meeting time of the user 003 with the user 004 is 543 minutes.
• The formats of the three-to-five-person facing matrix (FHAB) and the six-or-more-person facing matrix (FHAC) are the same.
• Period: 2009/7/1-July 31 (FHA1) is the period used for the facing matrix by number of persons (FHA).
  • Days: 31 days (FHA2) is the number of days in the period (FHA1).
  • Real Days: 21 days (FHA3) is the number of business days in the period (FHA1).
• Temporal resolution: 1 minute (FHA4) is the temporal resolution in the facing table (FC1A).
• Meeting determination time: 3 minutes/day (FHA5) is a threshold for determining that a meeting has occurred. The infrared sensors also react when people merely pass each other, so a small number of reactions is likely to be noise; this threshold is introduced to remove it.
• As long as this requirement is satisfied, a table configuration different from that used in the facing matrix by number of persons (FHA) may be adopted.
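A minimal sketch of the facing matrix generation by number of persons (CHA), assuming the facing table (FC1A) is given as per-minute sets of users who were meeting: each minute is routed into the 2 / 3-5 / 6+ matrix according to the head count, mirroring FHAA/FHAB/FHAC. Names are illustrative.

```python
from collections import defaultdict

def class_for(n):
    """Head-count class; only called with n >= 2."""
    if n == 2:
        return "2"
    if 3 <= n <= 5:
        return "3-5"
    return "6+"

def facing_matrices_by_headcount(slots):
    """slots: iterable of sets of user IDs meeting during one 1-minute slot.
    Returns {class: {user: {partner: minutes}}} — the shape of FHA."""
    fha = {c: defaultdict(lambda: defaultdict(int)) for c in ("2", "3-5", "6+")}
    for members in slots:
        if len(members) < 2:
            continue
        c = class_for(len(members))
        for u in members:
            for p in members:
                if u != p:
                    fha[c][u][p] += 1
    return {c: {u: dict(row) for u, row in m.items()} for c, m in fha.items()}

slots = [{"003", "004"}, {"003", "004"}, {"003", "004", "005"}]
fha = facing_matrices_by_headcount(slots)
print(fha["2"]["003"]["004"], fha["3-5"]["003"]["004"])
```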
• The meeting network diagram drawing by number of persons (JHA) is a process of drawing a network diagram for each head count from the facing matrix by number of persons (FHA), which shows the meeting situation by number of persons.
  • KHA meeting network diagram
• Period: 2009/7/1-July 31 indicates the period used in the facing matrix by number of persons (FHA).
• The meeting network diagram by number of persons (KHA) is made up of three network diagrams: the two-person network diagram (KHAA) for two persons, the three-to-five-person network diagram (FHAB) for three to five persons, and the six-or-more-person network diagram (FHAC) for six or more persons.
  • the two-party network diagram (KHAA) will be described as an example. Members are indicated by circle points (nodes). Further, lines (edges) connecting members indicate facing time. In particular, the thickness of the line indicates the facing time.
  • KHAA The two-party network diagram
• The spring model (Hooke's law) assumes a spring between every pair of connected nodes (points) and computes the resulting (inward or outward) force; in addition, every node receives a repulsive force from all nodes not connected to it, according to the distance between them. Positions are moved repeatedly until an optimal arrangement is reached.
  • the points (nodes) are arranged like Ito (KHAA1), Watanabe (KHAA2) and Yamamoto (KHAA3).
  • the user name (IA2) is obtained from the user ID (IA1) using the user ID table (IA) and displayed.
• When Ito (KHAA1) and Watanabe (KHAA2) face each other, they are connected by a line (edge) (KHAA4). The case of Ito (KHAA1) and Yamamoto (KHAA3) is (KHAA5).
• As a threshold, a value obtained by multiplying the meeting determination time of the facing matrix by number of persons (FHA): 3 minutes/day (FHA5) by the real number of days: 21 days (FHA3) may be used.
  • FHAB three-party five-party network diagram
  • FHAC six-or-more network diagram
• Maximum facing time matrix generation is a process of selecting, for each pair, the maximum facing time across the head counts and storing it in a matrix. Specifically, the largest value is selected from among the matrices of the facing matrix by number of persons (FHA) and stored in the maximum facing time matrix (FHB).
  • the maximum facing time matrix (FHB) will be described with reference to FIG.
  • the vertical axis is a user ID (FHB6) for identifying a member individual
  • the horizontal axis is a user ID (FHB7) indicating a partner who has met.
  • the meeting time of the user 003 with the user 004 is 543 minutes.
• The value is obtained by comparing the entries for user 004 in user 003 of the two-person facing matrix (FHAA), the three-to-five-person facing matrix (FHAB), and the six-or-more-person facing matrix (FHAC) of the facing matrix by number of persons (FHA), and selecting the maximum.
  • FHAA two-party facing matrix
  • FHAB three-five person facing matrix
  • FHAC six or more facing matrix
  • FHB maximum facing time matrix
  • Period: July 1-July 31, 2009 (FHB1) is the period used for the maximum facing time matrix (FHB).
  • Days: 31 days (FHB2) is the number of days in the period (FHB1).
  • Real Days: 21 days (FHB3) is the number of business days in the period (FHB1).
• Temporal resolution: 1 minute (FHB4) is the temporal resolution in the facing table (FC1A).
• Meeting determination time: 3 minutes/day (FHB5) is a threshold value for determining that a meeting has occurred.
• As long as this requirement is satisfied, a table configuration different from that used in the maximum facing time matrix (FHB) may be adopted.
• Maximum meeting time number-of-persons matrix generation is a process of storing, in a matrix, the head count at which the maximum meeting time was selected. Specifically, the head count corresponding to the matrix from which the maximum value was taken in the facing matrix by number of persons (FHA) is stored in the maximum meeting time number-of-persons matrix (FHC).
  • the maximum meeting time people matrix generation (CHC) will be described.
  • the vertical axis is a user ID (FHC6) for identifying a member individual
  • the horizontal axis is a user ID (FHC7) indicating a partner who has met.
• For example, the value for user 004 in user 003 is 1.
• The stored values (FHC8) indicate that 1 means two persons, 2 means three to five persons, and 3 means six or more persons. This matches the head-count ranges of the facing matrix by number of persons (FHA).
• The value is obtained by comparing the facing times for user 004 in user 003 of the two-person facing matrix (FHAA), the three-to-five-person facing matrix (FHAB), and the six-or-more-person facing matrix (FHAC): 543, 93, and 0 are compared, the maximum value 543 is selected, and since 543 comes from the two-person facing matrix (FHAA), 1 is substituted.
• Period: 2009/7/1-July 31 (FHC1) is the period used for the maximum meeting time number-of-persons matrix (FHC).
  • Days: 31 days (FHC2) is the number of days in the period (FHC1).
  • Real Days: 21 days (FHC3) is the number of business days in the period (FHC1).
  • Temporal resolution: 1 minute (FHC4) is the temporal resolution in the facing table (FC1A).
• Meeting determination time: 3 minutes/day (FHC5) is a threshold for determining that a meeting has occurred. The infrared sensors also react when people merely pass each other, so a small number of reactions is likely to be noise; this threshold is introduced to remove it.
• As long as this requirement is satisfied, a table configuration different from that used in the maximum meeting time number-of-persons matrix (FHC) may be adopted.
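A minimal sketch of the maximum facing time matrix (FHB) and maximum meeting time number-of-persons matrix (FHC) generation described above: for each pair, the largest facing time across the head-count matrices (FHA) is taken, together with the code of the class it came from (1 = two persons, 2 = three to five, 3 = six or more). Tie-breaking toward the earlier class is an assumption; names are illustrative.

```python
CLASS_CODE = {"2": 1, "3-5": 2, "6+": 3}  # the stored values of FHC8

def max_time_and_class(fha, user, partner):
    """fha: {class: {user: {partner: minutes}}} as in the FHA matrices.
    Returns (max facing time, class code) — the FHB and FHC entries for a pair."""
    best_time, best_code = 0, None
    for cls in ("2", "3-5", "6+"):  # earlier class wins ties (an assumption)
        t = fha.get(cls, {}).get(user, {}).get(partner, 0)
        if t > best_time:
            best_time, best_code = t, CLASS_CODE[cls]
    return best_time, best_code

fha = {"2": {"003": {"004": 543}},
       "3-5": {"003": {"004": 93}},
       "6+": {"003": {"004": 0}}}
print(max_time_and_class(fha, "003", "004"))  # matches the 543 / 1 example above
```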
• The maximum meeting time number-of-persons network diagram drawing (JHB) is a process of drawing, in a network diagram, the number of persons meeting at the time of the maximum meeting time, from the maximum facing time matrix (FHB) and the maximum meeting time number-of-persons matrix (FHC).
• The drawn result is the maximum meeting time number-of-persons network diagram (KHB) of FIG. 41.
• Period: 2009/7/1-July 31 shows the period used in the maximum facing time matrix (FHB) and the maximum meeting time number-of-persons matrix (FHC).
• The spring model assumes a spring between every pair of connected nodes (points) and computes the resulting (inward or outward) force; in addition, every node receives a repulsive force from all nodes not connected to it, according to the distance between them. Positions are moved repeatedly until an optimal arrangement is reached.
  • the points (nodes) are arranged like Ito (KHB2), Tanaka (KHB3) and Watanabe (KHB4).
  • the user name (IA2) is obtained from the user ID (IA1) using the user ID table (IA) and displayed.
  • KHB2 and Watanabe KHB4 read 543 minutes from the maximum meeting time matrix (FHB) and 1 from the maximum meeting time people matrix (FHC) and read that the two parties are performing 543 minutes.
  • the line (edge) becomes like (KHB5).
  • Ito (KHB2) and Tanaka (KHB3) are 215 minutes from the Maximum Meeting Time Matrix (FHB) and 2 from the Maximum Meeting Time People Matrix (FHC). Read it.
  • a line (edge) becomes like (KHB7).
  • the thickness of the line indicates the facing time
  • the shape of the line solid line, broken line) indicates the number of facing people.
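The spring-model layout described above can be sketched as a simple force-directed iteration (an illustrative Python sketch; the spring constant, repulsion strength, step size, iteration count, and member names are assumptions for demonstration, not values from the specification):

```python
import math
import random

def spring_layout(nodes, edges, iters=200, k=0.1, rep=1.0, step=0.05):
    """Toy force-directed layout: connected nodes attract via a spring
    force (k * distance), and every pair of nodes repels inversely with
    the squared distance, as in the spring model described above."""
    random.seed(0)
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(iters):
        force = {n: [0.0, 0.0] for n in nodes}
        for a in nodes:
            for b in nodes:
                if a == b:
                    continue
                dx = pos[b][0] - pos[a][0]
                dy = pos[b][1] - pos[a][1]
                d = math.hypot(dx, dy) or 1e-9
                f = -rep / (d * d)          # repulsion between every pair
                if (a, b) in edges or (b, a) in edges:
                    f += k * d              # spring pulls connected nodes
                force[a][0] += f * dx / d
                force[a][1] += f * dy / d
        for n in nodes:
            fx, fy = force[n]
            mag = math.hypot(fx, fy)
            if mag > 1.0:                   # clamp to keep updates stable
                fx, fy = fx / mag, fy / mag
            pos[n][0] += step * fx
            pos[n][1] += step * fy
    return pos

# Connected members end up close; unconnected members end up far apart.
pos = spring_layout(["Ito", "Tanaka", "Watanabe"],
                    {("Ito", "Watanabe"), ("Ito", "Tanaka")})
```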
  • FHB: the maximum facing time matrix; FHB5: meeting determination time (3 minutes/day); FHB3: the actual number of days
  • FIG. 42 is a diagram showing a processing procedure for visualizing the climate of each organization.
  • Climate principal component extraction (CIA) obtains principal components from the behavior indices and personality indices; organization climate calculation (CIB) combines them into a time-series trend for each organization; and organization climate graph drawing (JI) generates a time-series graph for each organization, the result being the organization climate graph (KI).
  • CIA: climate principal component extraction; CIB: organization climate calculation; JI: organization climate graph drawing
  • The organization climate analysis is executed by the control unit (ASCO) of the application server (AS), and the organization climate graph drawing is executed on the display (J) of the client (CL).
  • The process up to obtaining the explanatory variables (FFA) of the analysis result table (F) is the same as in the first embodiment, so its description is omitted.
  • Climate principal component extraction (CIA) is a process for obtaining the features of an organization's activity from the per-person explanatory variables (FFA) of the analysis result table (F). Specifically, the features of the activity are clarified by performing principal component analysis on the explanatory variables (FFA) of the analysis result table (F). The result is stored in the climate principal component table (FIA) of the analysis result table (F). An example is shown in the figure.
  • The climate principal component table (FIA) will now be described.
  • The climate principal component table (FIA) stores features of organizational activity. Period: July 1, 2009 (FIA1) is the period/date used for the analysis. Organization (FIA2) is the organization to be analyzed.
  • The explanatory variable (FIA3) is an element used in the analysis; the items are the same as the explanatory variables (FFA) of the analysis result table (F).
  • The first principal component (FIA4) is the value of the first principal component.
  • The second principal component (FIA5) is the value of the second principal component.
  • Although principal component analysis is used in this example, other methods may be used as long as they clarify the features of organizational activity. Further, although only up to the second principal component is stored in the climate principal component table (FIA), subsequent components (the third and later) may also be stored.
  • Organization climate calculation (CIB) is a process of combining the indices obtained by climate principal component extraction (CIA) into a single time series for each organization. Specifically, from the climate principal component table (FIA), the first principal component (FIA4) and the second principal component (FIA5) of each organization are mapped two-dimensionally for every two days to determine the centroid, and the distance of the centroid from the origin is taken as the organization climate value.
  • The organization climate table (FIB) stores the features of organizational activity as a time series.
  • Organization (FIB1) is the organization to be analyzed.
  • The date is the date of the analysis.
  • This table is read as follows: the organization climate value of Organization A on July 2, 2009 is 1.5.
  • Here, the first and second principal components of the explanatory variables are mapped two-dimensionally for each organization to obtain the centroid, but other methods may be used as long as the features of organizational activity can be determined.
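The centroid-and-distance step described above (CIB) can be sketched as follows (a pure-Python illustration; the principal-component values for the two-day window are invented for demonstration):

```python
import math

def climate_value(components):
    """Given the (first, second) principal-component pairs for one
    organization over a window, map them in two dimensions, take the
    centroid, and return its distance from the origin as the value."""
    n = len(components)
    cx = sum(p[0] for p in components) / n
    cy = sum(p[1] for p in components) / n
    return math.hypot(cx, cy)

# Hypothetical PC values for one organization over a two-day window;
# the centroid is (1.5, 1.2), and the value is its distance from origin.
value = climate_value([(1.2, 0.9), (1.8, 1.5)])
```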
  • Organization climate graph drawing (JI) is a process of drawing, as a line graph, the climate values from the organization climate table (FIB) in time series for each organization.
  • An example is shown as the organization climate graph (KI) in the figure.
  • The horizontal axis is the date (KI1), and the vertical axis is the organization climate value (KI2).
  • The values from the organization climate table (FIB) are plotted as one line per organization. By plotting working styles and atmosphere as climate values in time series for each organization in this way, comparisons between organizations can be understood at a glance, whatever the organization.
  • The eleventh embodiment describes seat rearrangement as a measure for enhancing activation and reducing stress.
  • This can be processed in the same framework as the first embodiment: the seat change analysis (CJ) is executed by the control unit (ASCO) of the application server (AS), and the in-place seat arrangement drawing (JJ) is executed on the display (J) of the client (CL).
  • The process up to finding the facing matrix (FC1C) and the personality index (FAAE) of the analysis result table (F) is the same as in the first embodiment, so its description is omitted.
  • Seating arrangements for the members of an organization are implemented with different goals in different organizations. For example, reducing the stress of individual members or activating communication within the organization are typical purposes.
  • The flow of processing is shown in the figure.
  • A network diagram is created, and using its coordinate values, the reach distance on the network diagram between persons, i.e., the facing distance, is calculated (CJA).
  • A facing distance network diagram (CJC) is drawn from the facing distance matrix (CJB).
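The facing-distance (reach-step) computation above, which produces the facing distance matrix (CJB), can be sketched with a breadth-first search (an illustrative Python sketch; the member names and links are invented, and in the actual embodiment the links would be derived from the facing matrix (FC1C)):

```python
from collections import deque

def facing_distances(members, links):
    """Number of steps on the facing network between every pair of
    members, via breadth-first search from each member.
    Unreachable pairs are recorded as None."""
    adj = {m: set() for m in members}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    dist = {}
    for start in members:
        d = {start: 0}
        q = deque([start])
        while q:                      # standard BFS from `start`
            cur = q.popleft()
            for nxt in adj[cur]:
                if nxt not in d:
                    d[nxt] = d[cur] + 1
                    q.append(nxt)
        for m in members:
            dist[(start, m)] = d.get(m)
    return dist

# Hypothetical chain A-B-C-D: A reaches D in 3 steps on the network.
dist = facing_distances(list("ABCD"), [("A", "B"), ("B", "C"), ("C", "D")])
```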
  • The adaptability (GFB), an index related to stress, is calculated (CJD) from the organization personality index (FAAE) obtained by the personality questionnaire.
  • FAB: organization personality index
  • The present inventors investigated the relationship between members' stress and the personality indices (extroversion, agreeableness, conscientiousness, neuroticism, openness) indicating the degree of fit to society, and found that there is a strong relationship; therefore, in the present embodiment, a stress-related index is calculated based on the personality index.
  • The seat arrangement constraint is information with which the user constrains the seats arranged by the present embodiment.
  • A constraint is, for example, a function of designating that a person must be placed in a specific seat, or conversely, that a person must not be placed in a specific seat. This information is given by the user from the client (CL) using a keyboard or the like.
  • CJE: seat layout optimization; CJC: facing distance network diagram; GFB: adaptability; IC: place list
  • The seat arrangement constraints (CIJ) can be used to place constraints on the seats being arranged.
  • The network diagram (ZB) in FIG. 46 is an example of a network diagram of an organization.
  • This network diagram (ZB) is drawn from the facing matrix (FC1C) obtained by the sensors; (ZB1) to (ZB7) are nodes representing persons, and (ZB8) to (ZB15) are lines (edges) connecting members who face each other.
  • The facing members are arranged on the network diagram using, for example, the spring model. As a result, members who frequently face each other are arranged close together on the network diagram, and members who do not face each other are arranged far apart.
  • From the network diagram, indices for evaluating the activity of communication in the organization are obtained:
  • the degree (FAAA2) is the number of edges connected to a node,
  • the cohesion (FAAA2) is the density of the nodes around the node itself, and
  • the two-step reach (FAAA3) is the proportion of nodes within a total of two steps or fewer.
  • The degree (FAAA2), the cohesion (FAAA2), and the two-step reach (FAAA3) all take larger values when persons placed far apart on the network communicate directly. In other words, to activate communication within the organization, communication between persons located far apart on this network diagram should be encouraged.
  • FIG. 47 shows, for all pairs of members (person (CJA1A) to person (CJA1B)) on the network diagram of FIG. 46, the number of steps needed to reach one another on the network.
  • Edges with a facing step number of 1 are omitted for simplicity, and edges with a facing step number of 2 are represented by dashed lines.
  • The thick solid line (CJC8) shows the edge with step number 4, and the thin solid lines (CJC9), (CJC10), (CJC11), (CJC12), and (CJC13) show the edges with step number 3.
  • The adaptability takes values from 0 to 5.
  • Each person is assigned a seat such that the person is not surrounded by persons whose adaptability (GFB) is higher than the person's own.
  • This arrangement may be determined by computer simulation or calculation.
  • An example is shown in the figure.
  • A person with high adaptability (GFB), represented by (CJE1), is shown as a shaded circle, and a person with low adaptability (GFB), represented by (CJE3), is shown as a white circle.
  • Persons with high adaptability (GFB) are paired together (CJE2), and persons with low adaptability (GFB) are paired together (CJE4).
  • FIG. 51 is a table describing the adaptability (GFB) of each person on the facing network diagram (ZB).
  • The average of all the adaptability values (GFB) was 2.5, and values above this are indicated by shading.
  • The seats may be determined in accordance with the arrangement rule shown in the figure.
  • FIG. 52 shows an example determined by applying the above policy in the seat arrangement optimization (CJE) and fitting the result to the organization place list (IC) in the in-place seat arrangement drawing (JJ).
  • This seating arrangement places people with little face-to-face communication close to each other to activate communication, while ensuring that persons with low adaptability (GFB) are not surrounded by persons with high adaptability (GFB); it thus simultaneously realizes stress reduction throughout the organization.
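The arrangement policy above can be checked with a short sketch (illustrative Python; the seating row, the adaptability scores, and the interpretation of "surrounded" as having two or more adjacent neighbors, all with strictly higher adaptability, are assumptions for demonstration):

```python
def arrangement_ok(row, adapt):
    """True if no person in the seating row has two or more adjacent
    neighbors whose adaptability (GFB) is all strictly higher than
    the person's own (i.e., no one is 'surrounded')."""
    for i, person in enumerate(row):
        neighbors = [row[j] for j in (i - 1, i + 1) if 0 <= j < len(row)]
        if len(neighbors) >= 2 and all(adapt[n] > adapt[person]
                                       for n in neighbors):
            return False
    return True

# Hypothetical adaptability scores on the 0-5 scale.
adapt = {"A": 4.0, "B": 4.5, "C": 1.5, "D": 2.0}
# Pairing like members keeps low-adaptability people from being enclosed.
good = arrangement_ok(["B", "A", "D", "C"], adapt)
# Here C (1.5) sits between B (4.5) and A (4.0), violating the rule.
bad = arrangement_ok(["B", "C", "A", "D"], adapt)
```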
  • In the twelfth embodiment, a network diagram capable of simultaneously displaying face-to-face communication and the hierarchy of job positions is generated.
  • In an ordinary network diagram, the relationship with the job position hierarchy cannot be seen.
  • This problem is solved by taking the job position hierarchy into account when deciding the layout of the network diagram.
  • FIG. 53 is a diagram showing a processing procedure for simultaneously displaying face-to-face communication and the job position hierarchy.
  • The model is constructed by the job position hierarchy network analysis (CK), the drawing is carried out by the job position hierarchy network diagram drawing (JK), and the drawn result is the job position hierarchy network diagram (KK).
  • The job position hierarchy network analysis (CK) is executed by the control unit (ASCO) of the application server (AS), and the job position hierarchy network diagram drawing (JK) is executed on the display (J) of the client (CL).
  • CKA: intra-hierarchy and inter-hierarchy organization climate analysis
  • The intra-hierarchy and inter-hierarchy organization climate analysis (CKA) can be processed in the same framework as Embodiment 10, and the processing up to determining the explanatory variables (FFA) of the analysis result table (F) is the same as in Embodiment 10, so its description is omitted.
  • The user ID table (IA) is an input.
  • The intra-hierarchy climate is obtained by selecting, from the explanatory variables (FFA), the data of members with the same job position (IA4) and taking values such as the average or variance of the facing time (FAAC2) or the cohesion (FAAA3) as feature quantities.
  • The inter-hierarchy climate is obtained by selecting, from the explanatory variables (FFA), the data of members belonging to two different job positions (IA4) (for example, staff and section managers) and taking values such as the average or variance of the facing time (FAAC2) or the cohesion (FAAA3) as feature quantities.
  • The formulas for calculating the intra-hierarchy and inter-hierarchy climates may use other calculation methods besides the average and the variance. It is also possible to obtain an intra-hierarchy or inter-hierarchy climate index for each team name (IA3) or organization (IA5).
  • CKB: job position hierarchy network diagram coordinate identification
  • FIG. 55 shows the processing flow for obtaining the coordinate values of each member, from Step 1 (CKBA) to Step 4 (CKBD). Each step is described below.
  • Step 1 (CKBA) is the initial arrangement.
  • A placement area for each job position is determined in advance on the screen, and the members are placed according to the job position (IA4) in the user ID table (IA). Each member is thereby given a coordinate value.
  • The facing time between two persons is taken from the facing matrix (FC1C).
  • Because Takahashi (CKBA1) and Tanaka (CKBA2) have a facing time above a certain level, the two are connected by a line (CKBA3).
  • Step 2 (CKBB) and Step 3 (CKBC) are repeated; the process does not finish until a predetermined number of iterations has been performed or the threshold value is reached.
  • Step 2 (CKBB) is the distance calculation.
  • In Step 1 (CKBA), the arrangement is performed and coordinate values are given.
  • The length of each line is then determined, and the total sum of the line lengths in the job position hierarchy network diagram is calculated.
  • In the example, the length of the line connecting Takahashi (CKBB1) and Tanaka (CKBB2) is 8 (CKBB3), and the total of all line lengths (CKBB4) is 141.
  • Step 3 (CKBC) is the exchange of two members in the same hierarchy.
  • Two members within the same hierarchy are selected and their coordinate values are exchanged.
  • In the example, Kobayashi (CKBC1) and Yamamoto (CKBC2) are exchanged (CKBC3).
  • Step 2 then performs the distance calculation again and compares the total line length with that before the exchange; if the value is smaller, the exchange is regarded as a success and the coordinate values of the respective members are updated. If the value does not decrease, the coordinate values are reverted. These steps are repeated, and the process does not finish until a predetermined number of iterations has been performed or the total falls below a threshold.
  • Step 4 (CKBD) is the calculation of the affiliation centroid. It identifies where members of the same team name (IA3) or the same organization (IA5) are distributed. The members' coordinate values are selected according to the team name (IA3) and the organization (IA5) of the user ID table (IA), and the average of the members' coordinates is obtained. In the example, the coordinate values of the sales centroid (CKBD1) and the development centroid are determined. The calculation may use methods other than the average value. Furthermore, the job position may also be taken into account.
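Steps 1 to 3 above amount to a hill-climbing search that repeatedly swaps the coordinates of two members in the same job-position layer and keeps a swap only when the total edge length decreases. A minimal sketch (Python; the member names, layers, edges, coordinates, and iteration count are invented for illustration):

```python
import math
import random

def total_length(pos, edges):
    # Step 2: sum of the lengths of all lines in the diagram
    return sum(math.dist(pos[a], pos[b]) for a, b in edges)

def optimize(pos, edges, layers, rounds=500, seed=0):
    """Hill climbing: swap the coordinates of two members in the same
    layer (Step 3); keep the swap only if the total edge length
    decreases, otherwise revert, as described above."""
    rng = random.Random(seed)
    pos = dict(pos)
    best = total_length(pos, edges)
    for _ in range(rounds):
        layer = rng.choice(layers)
        if len(layer) < 2:
            continue
        a, b = rng.sample(layer, 2)
        pos[a], pos[b] = pos[b], pos[a]
        new = total_length(pos, edges)
        if new < best:
            best = new                        # success: keep the swap
        else:
            pos[a], pos[b] = pos[b], pos[a]   # revert the swap
    return pos, best

# Two managers in one layer, two staff in another; the edges cross
# initially, and one in-layer swap untangles them.
start = {"Takahashi": (0, 1), "Tanaka": (2, 1),
         "Sato": (0, 0), "Suzuki": (2, 0)}
edges = [("Takahashi", "Suzuki"), ("Tanaka", "Sato")]
layers = [["Takahashi", "Tanaka"], ["Sato", "Suzuki"]]
pos, length = optimize(start, edges, layers)
```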
  • The job position hierarchy network diagram coordinate identification may use any other process as long as optimal coordinate values can be obtained.
  • The job position hierarchy network diagram coordinate list (FK) stores the coordinate values from the job position hierarchy network diagram coordinate identification (CKB).
  • An example of the job position hierarchy network diagram coordinate list (FK) of the analysis result table (F) is shown in FIG. 56.
  • The period (FK1) of the job position hierarchy network diagram coordinate list (FK) indicates the period covered by the data, and is the same as the period (FC1C3) of the facing matrix (FC1C).
  • The number of days (FK2) indicates the number of days covered by the data, and is the same as the number of days (FC1C4).
  • The actual number of days (FK3) represents the number of business days in the period (FK1), and is the same as the number of business days (FC1C5).
  • The time resolution (FK4) is the time resolution of the facing table (FC1A), and is the same as the time resolution (FC1C6).
  • The facing determination time (FK5) is the threshold value for determining that a meeting has occurred, and is the same as the facing determination time (FC1C7).
  • The user ID (FK6) indicates the ID of the user and corresponds to the user ID table (IA).
  • The coordinate values (FK7) store the coordinate values of the members obtained by the job position hierarchy network diagram coordinate identification (CKB).
  • The team name (FK8) indicates the name of the team and corresponds to the user ID table (IA).
  • The coordinate values (FK9) store the coordinate values of the centroid of each team obtained by the job position hierarchy network diagram coordinate identification (CKB).
  • The job position hierarchy network diagram drawing (JK) uses the job position hierarchy network diagram coordinate list (FK) generated by the job position hierarchy network diagram coordinate identification (CKB) and the user ID table (IA) of the user/place information table (I) to draw a network diagram that can simultaneously display face-to-face communication and the job position hierarchy, as shown in FIG. 53.
  • The placement area for each job position is determined in advance on the screen; in the example, the areas such as (KKA1) and (KKA2) are arranged according to the job position (IA4) of the user ID table (IA).
  • A figure is plotted for each member based on the user ID (FK6) and its coordinate values (FK7) stored in the job position hierarchy network diagram coordinate list (FK).
  • The user name (IA2) from the user ID table (IA) is written beside the plotted figure.
  • The shape of the figure may be changed according to the team name (IA3), the job position (IA4), or the organization (IA5) described in the user ID table (IA).
  • The facing time between two persons is shown from the facing matrix (FC1C) of the analysis result table (F).
  • FC1C: facing matrix; F: analysis result table
  • KKB: intra-hierarchy and inter-hierarchy indices
  • The intra-hierarchy and inter-hierarchy climate indices determined by the intra-hierarchy and inter-hierarchy organization climate analysis (CKA) are displayed.
  • Although the intra- and inter-hierarchy indicator (KKB) in FIG. 53 is displayed in tabular form, it may also be displayed as a graph (line, bar, pie, band, scatter, or radar chart).
  • The network diagram (KKA) of the job position hierarchy network diagram (KK) in FIG. 53 shows an example displaying the members whose organization (IA5) in the user ID table (IA) is finance.
  • To create the network diagram (KKA), the corresponding members are extracted and their figures are plotted with reference to the job position hierarchy network diagram coordinate list (FK), which holds the members' coordinate values.
  • The facing time between two persons is shown from the facing matrix (FC1C) of the analysis result table (F).
  • The network diagram (KKD) of FIG. 57 shows the connections between members of an organization and external members.
  • The corresponding members are extracted, and their figures are plotted with reference to the job position hierarchy network diagram coordinate list (FK), which holds the members' coordinate values.
  • Members whose team name (IA3) is sales, and the members connected with them (external members), are selected and displayed.
  • The facing time between two persons is shown from the facing matrix (FC1C) of the analysis result table (F).
  • Reference signs: TR: name tag type sensor node; GW: base station; SS: sensor net server; AS: application server; CL: client; NW: network; ASME: storage unit; ASCO: control unit; CA: modeling analysis; ASCC: communication control; ASSR: transmission/reception unit

Abstract

An analytical model composed of the beneficial behavioral factors required to solve an organization's problems is generated from the organization's internal behavioral data and communication data. Organization dynamics indices are determined from sensor data and set as explanatory variables serving as the internal behavioral data. Objective data, such as the organization's productivity and accidents/failures, and subjective data from questionnaire responses, such as leadership/teamwork indices, employee motivation/fulfillment indices, and stress/mental-distress indices, are determined and set as objective variables. By selecting the beneficial factors of the objective variables from among the explanatory variables and creating a model, the behaviors affecting the problem that the organization has posed as an objective variable become clear.

Description

Organizational Behavior Analysis Device and Organizational Behavior Analysis System
 The present invention relates to a technique for visualizing the state of an organization from the behavior data and communication data of its members.
 Every organization has problems to a greater or lesser extent. To solve them, bookstores are lined with business books, and managers rack their brains over their problems while reading these books. Moreover, because organizational problems are universal, experienced managers have encountered the same situations many times, and solve the problems through experience and intuition.
 There are also attempts to solve organizational problems by having the members of the organization answer questionnaires. Since the members' subjective views are reflected, it becomes possible to discover the factors that concern them.
 One method of detecting the behavior of members in an organization is to use a sensor network. A sensor network is a system in which terminals equipped with sensors and wireless communication circuits are attached to environments, objects, people, and the like, and the various information obtained from the sensors is extracted wirelessly, for application to state acquisition and control. The physical quantities acquired by the sensors to detect communication include infrared rays for detecting face-to-face states, sound for detecting speech and the environment, and acceleration for detecting human motion. Using these data, the activities of members and teams in the organization can be obtained as feature quantities.
 As an example of using information obtained from sensors, Patent Document 1 discloses analyzing feature quantities obtained from infrared-based position information together with questionnaire responses, based on an analysis model prepared in advance, to find the characteristics of an organization (to extract points of interest).
 Patent Document 2 discloses providing life support by displaying, on one screen, feature quantities obtained from an acceleration sensor in time series together with a stress questionnaire, so that the relationship between stress and the person's own behavior can be seen.
Patent Document 1: JP 2008-9595 A / Patent Document 2: JP 2001-344352 A
 When trying to find a solution to an organization's problems from books or from a manager's experience and intuition, it is difficult to solve the problems because the members of the organization in question, from which the problems arose, are never interviewed. Moreover, even when the organization is analyzed, the manager is often away on business trips and the like, and cannot sufficiently carry out the observation and member interviews that the problem requires.
 Also, when trying to solve an organization's problems by having its members answer questionnaires, even if the problem becomes understood in words, it is difficult to translate it into action, because it is not known what kind of action should be taken. Unless the specific organizational behavior to be improved is identified, the solution is unlikely to take root in the organization.
 The relationships among members and team activities in an organization can also be detected from the physical quantities obtained by sensors. For example, Patent Document 1 extracts points of interest for the organization by analyzing questionnaire responses and position information based on an analysis model prepared in advance. However, unless a model appropriate to the organization is selected beforehand, it is difficult to obtain good results. Furthermore, using only the questionnaire information and the position information, it can do no more than extract points of interest, such as a group with high satisfaction despite little contact; it cannot identify the specific organizational behavior that should be improved.
 Patent Document 2 discloses a method of displaying the stress level and behavior at the same time, but does not describe identifying the behaviors that cause stress, nor countermeasures (stress relief) for them.
 In view of such circumstances, an object of the present invention is to generate, from behavior data and communication data within an organization, an analysis model composed of the factors useful for solving the organization's problems.
 Since solving an organization's problems requires finding the beneficial causal factors, the plurality of factors causing the organization's problems are modeled and presented from the intra-organization behavior data obtained from a large amount of sensor data and from questionnaire responses.
 As the behavior data within the organization, organization dynamics indices are obtained from the sensor data and used as explanatory variables. Objective data, such as the organization's productivity and accidents/defects, and subjective data from questionnaire responses, such as leadership/teamwork indices, employee motivation/fulfillment indices, and stress/mental-distress indices, are obtained and used as objective variables. By selecting the beneficial factors of an objective variable from among the explanatory variables and creating a model, the behaviors that affect the problem the organization has posed as the objective variable can be clarified.
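One simple way to realize the selection of beneficial factors described above is correlation screening of the explanatory variables against an objective variable. The following is only an illustrative sketch (the index names, data, and threshold are invented; the specification does not prescribe a particular selection algorithm):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def select_factors(explanatory, objective, threshold=0.5):
    """Keep the organization-dynamics indices whose correlation with
    the objective variable (e.g. a stress index) is strong enough."""
    return {name: round(pearson(vals, objective), 3)
            for name, vals in explanatory.items()
            if abs(pearson(vals, objective)) >= threshold}

# Invented example: the facing-time index tracks the objective variable,
# while a noise-like index does not and is screened out.
explanatory = {
    "facing_time": [10, 20, 30, 40, 50],
    "noise_index": [3, 1, 4, 1, 5],
}
objective = [1, 2, 3, 4, 5]
factors = select_factors(explanatory, objective)
```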
 The present invention is also an organization behavior analysis device that analyzes an organization composed of a plurality of persons. It comprises: a receiving unit that receives sensor data acquired by the infrared transmitting/receiving unit and the acceleration sensor of a terminal worn by each of the plurality of persons, as well as data indicating a subjective or objective evaluation of each of the plurality of persons; a control unit that analyzes the sensor data and the data indicating the subjective or objective evaluations; and a recording unit that records the analysis conditions used by the control unit and the results of its analysis. For each of the plurality of persons, the control unit calculates, from the sensor data and based on the analysis conditions, indices indicating the relationships between persons within the organization and the behavior within the organization, records them in the recording unit, correlates the data indicating the subjective or objective evaluations of the persons with these indices, and identifies the factors behind the data indicating the subjective or objective evaluations in the organization.
 The present invention is also an organization behavior analysis system comprising: terminals, each worn by one of the plurality of persons constituting the organization, each having an infrared transmitting/receiving unit that acquires data indicating face-to-face meetings, an acceleration sensor that acquires acceleration data, and a transmitting unit that transmits the meeting data and the acceleration data as sensor data; and an organization behavior analysis device having a receiving unit that receives the sensor data and data indicating a subjective or objective evaluation of each of the plurality of persons, a control unit that analyzes the sensor data and the evaluation data, and a recording unit that records the analysis conditions used by the control unit and the results of its analysis. For each of the plurality of persons, the control unit calculates, from the sensor data and based on the analysis conditions, indices indicating the relationships between persons within the organization and the behavior within the organization, records them in the recording unit, correlates the data indicating the subjective or objective evaluations with these indices, and identifies the factors behind the data indicating the subjective or objective evaluations in the organization.
 The present invention is furthermore an organization behavior analysis device that analyzes an organization composed of a plurality of persons. It comprises: a receiving unit that receives data indicating a subjective evaluation of each of the plurality of persons; a control unit that analyzes the data indicating the subjective evaluations; and a recording unit that records data indicating the seat positions within the organization, the analysis conditions used by the control unit, and the results of its analysis. The control unit has an index calculation unit that calculates, for each of the plurality of persons and based on the analysis conditions, a stress-related index from the data indicating the subjective evaluations, and a seat arrangement determination unit that determines the seating arrangement of each of the plurality of persons within the organization based on the seat position data and the stress-related indices.
This makes clear the behavioral factors that are useful for solving the organization's problems, so that measures suited to that organization can be devised.
Example of the system configuration of Embodiment 1 (FIGS. 1A to 1E)
Example of the overall processing flow of Embodiment 1 (three figures)
Example of the meeting table of Embodiment 1
Example of the body rhythm table of Embodiment 1
Example of the meeting matrix of Embodiment 1
Example of the network indices and body rhythm indices of Embodiment 1
Example of a network diagram (part 1)
Example of the meeting indices and organization activity indices of Embodiment 1
Example of the personality indices of Embodiment 1
Example of the productivity index and accident/defect index of Embodiment 1
Examples of the leadership/teamwork index, employee motivation/fulfillment index, stress/mental disorder index, and organization activation index of Embodiment 1
Example of the factor coefficients of Embodiment 1
Example of the overall processing flow of Embodiment 2
Example of the personality coefficients of Embodiment 2
Example of the per-user personality coefficients of Embodiment 2
Example of the estimated personality indices of Embodiment 2
Example of the Big Five theory of Embodiment 1
Example of the personality questionnaire of Embodiment 1
Example of the stress/mental disorder questionnaire of Embodiment 1
Example of the happiness questionnaire of Embodiment 1
Example of the teamwork questionnaire of Embodiment 1
Example of the organization activation questionnaire of Embodiment 1
Example of the overall processing flow of Embodiment 3
Example of the meeting group table of Embodiment 3
Example of the user ID table and location ID table of Embodiment 3
Example of the overall processing flow of Embodiment 4
Example of the place table of Embodiment 4
Example of the user/place table of Embodiment 4
Example of the overall processing flow of Embodiment 5
Example of the place list (IC) of Embodiment 5
Examples of the place map by number of persons and the place in-team/out-of-team map of Embodiment 5
Example of the overall processing flow of Embodiment 6
Examples of the place meeting group map and the place user map of Embodiment 6
Example of the overall processing flow of Embodiment 7
Example of the temperature table of Embodiment 7
Example of the place utilization map of Embodiment 7
Example of the overall processing flow of Embodiment 8
Example of the overall processing flow of Embodiment 9
Example of the meeting matrix by number of persons met of Embodiment 9
Examples of the maximum meeting time matrix and maximum meeting time head-count matrix of Embodiment 9
Example of the maximum meeting time head-count network diagram of Embodiment 9
Example of the overall processing flow of Embodiment 10
Examples of the frequency principal components and the organization frequency of Embodiment 10
Example of the organization frequency graph of Embodiment 10
Example of the overall processing flow of Embodiment 11
Example of a network diagram (part 2)
Example of the number of steps needed to reach each person on the network diagram in Embodiment 11
Example of the meeting distance matrix of Embodiment 11
Example of the meeting distance network diagram of Embodiment 11
An example of seat arrangement using adaptability in Embodiment 11
An example of each person's adaptability in Embodiment 11
Example of seat arrangement in a place in Embodiment 11
A diagram showing an example of the overall processing flow of Embodiment 12
A diagram showing an example of the processing flow of intra- and inter-hierarchy organization frequency analysis in Embodiment 12
A diagram showing an example of the processing flow for identifying coordinates in the job position hierarchy network diagram in Embodiment 12
A diagram showing an example of the job position hierarchy network diagram coordinate list of Embodiment 12
A diagram showing an example of the job position hierarchy network configuration of Embodiment 12
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
To clarify the position and function of the analysis system of the present invention, a business microscope system will first be described. Here, a business microscope is a system that observes a person's actions and behavior through a sensor node worn by that person, and that helps improve the organization by depicting, as organization activity, the relationships between persons and an image of the current organization. The data on face-to-face detection, actions, voice, and so on acquired by the sensor nodes are collectively referred to as organization dynamics data.
FIGS. 1A, 1B, 1C, 1D, and 1E are explanatory diagrams showing the components of a business microscope system according to one embodiment. They are divided for convenience of illustration, but the illustrated processes are executed in cooperation with one another.
FIGS. 1A to 1E show the overall flow from the name tag type sensor nodes (TR), through the base stations (GW), to the sensor net server (SS) that stores the organization dynamics data, the application server (AS) that analyzes the organization dynamics data, and the client (CL) that outputs the analysis results to a viewer.
This system comprises name tag type sensor nodes (TR), base stations (GW), a sensor net server (SS), an application server (AS), and clients (CL). In this embodiment the sensor net server and the application server are described as separate devices, but the functions of both servers can also be realized on a single server.
The application server (AS) shown in FIG. 1A analyzes and processes organization dynamics data. An analysis application is started upon a request from the client (CL) shown in FIG. 1B, or automatically at a preset time. The analysis application requests the necessary organization dynamics data from the sensor net server (SS) shown in FIG. 1C, analyzes the acquired organization dynamics data, and returns the analysis results to the client (CL) shown in FIG. 1B. Alternatively, the analysis application may record the analysis results directly in the analysis result database (F).
The applications used for analysis are stored in the analysis algorithms (D) and are executed by the control unit (ASCO). The processes executed in this embodiment are modeling analysis (CA), personality index extraction analysis (CA1), and personality index conversion analysis (CA2).
The application server (AS) comprises a transmitting/receiving unit (ASSR), a storage unit (ASME), and a control unit (ASCO).
The transmitting/receiving unit (ASSR) transmits and receives organization dynamics data to and from the sensor net server (SS) shown in FIG. 1C and the client (CL) shown in FIG. 1B. Specifically, the transmitting/receiving unit (ASSR) receives commands sent from the client (CL) and transmits organization dynamics data acquisition requests to the sensor net server (SS). It also receives organization dynamics data from the sensor net server (SS) and transmits the analysis results to the client (CL).
The storage unit (ASME) is configured as an external storage device such as a hard disk, memory, or SD card. It stores the setting conditions for analysis and the analysis results. Specifically, the storage unit (ASME) stores the user/place information table (I), the organization information table (H), the questionnaires (G), the analysis result table (F), the analysis condition period table (E), and the analysis algorithms (D).
The user/place information table (I) is a table in which personal information such as each user's name, job title, and user ID, together with information on places, is recorded.
The organization information table (H) is a table in which data needed for modeling the organization, such as productivity (HA) and accidents/defects (HB), and data needed for organization activities, such as climate and stock prices, are stored as general information.
The questionnaires (G) form a table in which the questionnaires given to users and their responses are stored.
The analysis result table (F) is a table in which the results of analyzing organization dynamics data (the organization dynamics indices) and the results of analyzing questionnaire responses are stored.
The analysis condition period table (E) is a table for temporarily storing the analysis conditions for display requested by the client (CL).
The analysis algorithms (D) store the programs used for analysis. In accordance with a request from the client (CL), an appropriate program is selected and sent to the control unit (ASCO), where the analysis is executed.
The control unit (ASCO) includes a central processing unit (CPU, not shown) and executes control of data transmission/reception and analysis of sensing data. Specifically, the CPU (not shown) realizes the various functions by reading and executing the programs stored in the storage unit (ASME): communication control (ASCC), modeling analysis (CA), personality index extraction analysis (CA1), and personality index conversion analysis (CA2).
Communication control (ASCC) controls the timing of wired or wireless communication with the sensor net server (SS) and the client (CL). Communication control (ASCC) also converts data formats and distributes data to destinations according to data type.
Modeling analysis (CA) is a process that models the main factors of the problems the organization faces, based on the organization dynamics data and the questionnaire results. Modeling analysis (CA) consists of meeting table creation (C1A), body rhythm table creation (C1B), meeting matrix creation (C1C), network index extraction (CAA), body rhythm index extraction (CAB), meeting index extraction (CAC), organization activity index extraction (CAD), correlation analysis (CAE), and factor selection (CAF).
Meeting table creation (C1A) is a process that rearranges the organization dynamics data in chronological order for each user and creates a table concerning face-to-face meetings.
Body rhythm table creation (C1B) is a process that rearranges the organization dynamics data in chronological order for each user and creates a table concerning body rhythm.
Meeting matrix creation (C1C) is a process that creates, from the result of meeting table creation (C1A), a table summarizing the face-to-face meetings between pairs of users in matrix form.
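The idea of meeting matrix creation (C1C) can be sketched as follows, assuming the meeting table has been reduced to simple (user, met user) detection records; the record layout is an illustrative assumption, not the patent's actual data format:

```python
# Sketch of meeting matrix creation (C1C): aggregate per-user meeting
# detections (as produced by meeting table creation, C1A) into a
# symmetric user-by-user matrix of meeting counts.

def build_meeting_matrix(records, users):
    """records: iterable of (user, met_user) detections; users: all user IDs."""
    matrix = {u: {v: 0 for v in users} for u in users}
    for u, v in records:
        matrix[u][v] += 1
        matrix[v][u] += 1  # an infrared detection is treated as mutual
    return matrix

records = [("u1", "u2"), ("u1", "u2"), ("u2", "u3")]
m = build_meeting_matrix(records, ["u1", "u2", "u3"])
print(m["u1"]["u2"])  # 2 detections between u1 and u2
```

The symmetric matrix produced here is the kind of structure from which the network indices described below can then be derived.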
Network index extraction (CAA) analyzes, from the meeting table, the network-related indices among the organization dynamics indices.
Body rhythm index extraction (CAB) analyzes, from the body rhythm table, the body-rhythm-related indices among the organization dynamics indices.
Meeting index extraction (CAC) analyzes, from the meeting table and the body rhythm table, the meeting-related indices among the organization dynamics indices.
Organization activity index extraction (CAD) analyzes, from the meeting table and the body rhythm table, the organization-related indices among the organization dynamics indices.
Correlation analysis (CAE) is an analysis that determines the correlations between the organization dynamics indices and the questionnaire results.
Factor selection (CAF) is a process that selects the useful factors based on the results of the correlation analysis.
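A minimal sketch of correlation analysis (CAE) followed by factor selection (CAF): each organization dynamics index is correlated (here with the Pearson coefficient) against a questionnaire score, and the indices whose correlation strength exceeds a threshold are kept as factors. The index names, example values, and the threshold are illustrative assumptions:

```python
# Sketch of correlation analysis (CAE) and factor selection (CAF):
# correlate each organization dynamics index with a questionnaire score
# and keep the indices with |r| above a threshold. Names, example data,
# and the threshold are invented for illustration.
import math

def pearson(xs, ys):
    """Pearson correlation; assumes neither series is constant."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_factors(indices, questionnaire, threshold=0.6):
    """indices: {index name: per-person values}; questionnaire: per-person scores."""
    corr = {name: pearson(vals, questionnaire) for name, vals in indices.items()}
    return {name: r for name, r in corr.items() if abs(r) >= threshold}

indices = {"meeting_time": [1.0, 2.0, 3.0, 4.0],
           "activity_rhythm": [2.0, 1.0, 2.5, 1.5]}
scores = [1.1, 2.1, 2.9, 4.2]  # e.g. one teamwork questionnaire item
print(select_factors(indices, scores))
```

With these example values only the strongly correlated index survives the selection, which mirrors how the useful behavioral factors are singled out.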
Subjective user data have conventionally been acquired through questionnaires; personality index extraction analysis (CA1) and personality index conversion analysis (CA2) are processes for obtaining personality indices from the organization dynamics data without using such questionnaires.
Personality index extraction analysis (CA1) obtains, for each questionnaire item, the contribution coefficients of the organization dynamics indices. This processing is performed by personality index coefficient extraction (CA1A).
Personality index conversion analysis (CA2) is a process that obtains indices serving as substitutes for the questionnaire from the organization dynamics indices and the contribution coefficients obtained in personality index extraction analysis (CA1). This processing is performed by personality index conversion (CA2A).
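The conversion in (CA2A) amounts to combining a person's organization dynamics indices with the contribution coefficients obtained in (CA1A); a minimal sketch as a linear weighted sum follows, where all coefficient values and index names are invented for illustration:

```python
# Sketch of personality index conversion (CA2A): estimate a questionnaire
# item as a weighted sum of a person's organization dynamics indices,
# using contribution coefficients from personality index coefficient
# extraction (CA1A). All numbers and names here are illustrative.

def estimate_personality(dynamics, coefficients, intercept=0.0):
    """dynamics: {index: value} for one person; coefficients: {index: weight}."""
    return intercept + sum(coefficients[k] * dynamics.get(k, 0.0)
                           for k in coefficients)

coeffs = {"meeting_time": 0.5, "activity_rhythm": -0.2}  # hypothetical (CA1A) output
person = {"meeting_time": 3.0, "activity_rhythm": 1.0}
print(estimate_personality(person, coeffs, intercept=1.0))  # ≈ 2.3
```

In this form, once the coefficients have been learned for a population, a personality index for any new person can be estimated from sensor data alone.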
The analysis results are stored in the analysis result table (F) or transmitted from the transmitting/receiving unit (ASSR) to the display (J) of the client (CL) shown in FIG. 1B.
The client (CL) shown in FIG. 1B is the point of contact with the user and performs data input and output. The client (CL) comprises an input/output unit (CLIO), a transmitting/receiving unit (CLSR), a storage unit (CLME), and a control unit (CLCO).
The input/output unit (CLIO) is the part that serves as the interface with the user. The input/output unit (CLIO) comprises a display (CLOD), a keyboard (CLIK), a mouse (CLIM), and so on. Other input/output devices can be connected to the external input/output (CLIU) as required.
The display (CLOD) is an image display device such as a CRT (cathode-ray tube) or a liquid crystal display. The display (CLOD) may include a printer or the like.
The transmitting/receiving unit (CLSR) transmits and receives data to and from the application server (AS) shown in FIG. 1A or the sensor net server (SS) shown in FIG. 1C. Specifically, the transmitting/receiving unit (CLSR) transmits the analysis conditions (CLMP) to the application server (AS) and receives the analysis results.
The storage unit (CLME) is configured as an external storage device such as a hard disk, memory, or SD card. The storage unit (CLME) records the information necessary for drawing, such as the analysis conditions (CLMP) and the drawing setting information (CLMT). The analysis conditions (CLMP) record conditions set by the user, such as the number of members to be analyzed and the selection of the analysis method. The drawing setting information (CLMT) records information on drawing positions, that is, what is to be plotted in which part of the figure. The storage unit (CLME) may also store programs executed by the CPU (not shown) of the control unit (CLCO).
The control unit (CLCO) includes a CPU (not shown) and executes communication control, input of analysis conditions from the client user (US), drawing for presenting the analysis results to the client user (US), and so on. Specifically, by executing the programs stored in the storage unit (CLME), the CPU performs the processes of communication control (CLCC), analysis condition setting (CLIS), drawing setting (CLTS), and display (J).
Communication control (CLCC) controls the timing of wired or wireless communication with the application server (AS) or the sensor net server (SS). Communication control (CLCC) also converts data formats and distributes data to destinations according to data type.
Analysis condition setting (CLIS) receives the analysis conditions specified by the user via the input/output unit (CLIO) and records them in the analysis conditions (CLMP) of the storage unit (CLME). Here, the period of data used for the analysis, the members, the type of analysis, the parameters for the analysis, and the like are set. The client (CL) sends these settings to the application server (AS) to request analysis, and in parallel executes drawing setting (CLTS).
Drawing setting (CLTS) calculates, based on the analysis conditions (CLMP), the method for displaying the analysis results and the positions at which the figures are to be plotted. The result of this process is recorded in the drawing setting information (CLMT) of the storage unit (CLME).
The display (J) generates a display screen from the analysis results acquired from the application server (AS), based on the format described in the drawing setting information (CLMT). For example, the drawing setting information (CLMT) stores the model drawing (JA) shown in FIG. 2C. If necessary, the display (J) also shows attributes such as the names of the persons displayed. The created display result is presented to the user via an output device such as the display (CLOD). For example, the display (CLOD) shows a screen such as the scientific management knowledge model (KA) shown in FIG. 2C. The user can also fine-tune the display positions by operations such as drag and drop.
The sensor net server (SS) shown in FIG. 1C manages the data collected from the name tag type sensor nodes (TR) shown in FIG. 1E. Specifically, the sensor net server (SS) stores the data sent from the base stations (GW) shown in FIG. 1D in a database, and transmits sensing data in response to requests from the application server (AS) shown in FIG. 1A and the client (CL) shown in FIG. 1B. Furthermore, the sensor net server (SS) receives control commands from the base stations (GW) and returns the results obtained from those control commands to the base stations (GW).
The sensor net server (SS) comprises a transmitting/receiving unit (SSSR), a storage unit (SSME), and a control unit (SSCO). When time synchronization management (GWCD) is executed on the sensor net server (SS), the sensor net server (SS) also requires a clock.
The transmitting/receiving unit (SSSR) transmits and receives data to and from the base stations (GW), the application server (AS), and the client (CL). Specifically, the transmitting/receiving unit (SSSR) receives the sensing data sent from the base stations (GW) and transmits sensing data to the application server (AS) or the client (CL).
The storage unit (SSME) is configured with a nonvolatile storage device such as a hard disk or flash memory, and stores at least the data table (BA), the performance table (BB), the data format information (SSMF), the terminal management table (SSTT), and the terminal firmware (SSTF). The storage unit (SSME) may also store programs executed by the CPU (not shown) of the control unit (SSCO).
The data table (BA) is a database for recording the organization dynamics data acquired by the name tag type sensor nodes (TR), information on the name tag type sensor nodes (TR), information on the base stations (GW) through which the organization dynamics data transmitted from the name tag type sensor nodes (TR) passed, and so on. A column is created for each data element, such as acceleration and temperature, and the data are managed accordingly. Alternatively, a table may be created for each data element. In either case, all data are stored in the data table (BA) with the terminal information (TRMT), which is the ID of the name tag type sensor node (TR) that acquired the data, associated with information on the time of acquisition. The data table (BA) is the same as the data table (BA) of FIG. 2B.
The performance table (BB) is a database for recording, together with time data, evaluations (performance) of the organization and individuals that are input from the name tag type sensor nodes (TR) or from existing data. The performance table (BB) is the same as the performance table (BB) of FIG. 2B.
The data format information (SSMF) records the data formats for communication, the method for sorting the sensing data tagged by the base stations (GW) and recording them in the database, the methods for responding to data requests, and so on. As described later, after data reception and before data transmission, this data format information (SSMF) is always referenced by the communication control unit (SSCC), and data format conversion and data management (SSDA) are performed.
The terminal management table (SSTT) is a table that records which name tag type sensor nodes (TR) are currently under the management of which base stations (GW). When a name tag type sensor node (TR) is newly added under the management of a base station (GW), the terminal management table (SSTT) is updated.
The terminal firmware (SSTF) temporarily stores the updated terminal firmware (GWTF) of the name tag type sensor nodes stored in the terminal firmware registration unit (TFI).
The control unit (SSCO) includes a central processing unit (CPU, not shown) and controls the transmission and reception of sensing data and its recording in and retrieval from the database. Specifically, the CPU realizes the various functions by reading and executing the programs stored in the storage unit (SSME): communication control (SSCC), terminal management information correction (SSTM), data management (SSDA), and the like.
The communication control unit (SSCC) controls the timing of wired or wireless communication with the base stations (GW), the application server (AS), and the client (CL). As described above, the communication control unit (SSCC) also converts the format of the data to be transmitted and received, based on the data format information (SSMF) recorded in the storage unit (SSME), into the data format used within the sensor net server (SS) or into a data format specific to each communication partner. Furthermore, communication control (SSCC) reads the header portion indicating the type of data and distributes the data to the corresponding processing unit. Specifically, received data are distributed to data management (SSDA), and commands for correcting terminal management information are distributed to terminal management information correction (SSTM). The destination of transmitted data is determined to be a base station (GW), the application server (AS), or the client (CL).
Terminal management information correction (SSTM) updates the terminal management table (SSTT) when it receives a command to correct terminal management information from a base station (GW).
Data management (SSDA) manages the correction, acquisition, and addition of data in the storage unit (SSME). For example, through data management (SSDA), sensing data are recorded in the appropriate columns of the database, element by element, based on their tag information. When sensing data are read out from the database, processing is likewise performed to select the necessary data based on the time information and terminal information and to sort them in time order.
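The read-out side of data management (SSDA), selecting records by terminal ID and time range and ordering them in time, can be sketched as follows; the record layout (`terminal_id`, `time`, `value`) is an assumption for illustration:

```python
# Sketch of the read-out side of data management (SSDA): filter sensing
# records by terminal ID and time range, then sort them in time order.
# The record layout (terminal_id, time, value) is an assumption.

def read_sensing_data(table, terminal_ids, t_start, t_end):
    """Return records for the given terminals within [t_start, t_end), time-ordered."""
    rows = [r for r in table
            if r["terminal_id"] in terminal_ids and t_start <= r["time"] < t_end]
    return sorted(rows, key=lambda r: r["time"])

table = [{"terminal_id": "TR1", "time": 30, "value": 0.2},
         {"terminal_id": "TR2", "time": 10, "value": 0.5},
         {"terminal_id": "TR1", "time": 20, "value": 0.1}]
print(read_sensing_data(table, {"TR1", "TR2"}, 0, 25))
```

In a real deployment this selection would be a database query; the sketch only shows the filtering and ordering that the description attributes to (SSDA).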
The process by which the sensor net server (SS) organizes and records the data received via the base stations (GW) into the performance table (BB) and the data table (BA) through data management (SSDA) corresponds to organization dynamics data collection (B) in FIG. 2B.
Performance input (C) is a process of inputting values indicating performance. Here, performance is a subjective or objective evaluation determined on the basis of some criterion. For example, at predetermined times, a person wearing a name tag type sensor node (TR) inputs a subjective evaluation (performance) value based on some criterion, such as the degree of task achievement, the degree of contribution to the organization, or the degree of satisfaction at that time. The predetermined timing may be, for example, once every few hours, once a day, or the point at which an event such as a meeting ends. A person wearing the name tag type sensor node (TR) can input performance values by operating the name tag type sensor node (TR) itself or by operating a personal computer (PC) such as the client (CL). Alternatively, values written by hand may later be entered collectively on a PC. In this embodiment, the name tag type sensor node allows performance ratings to be input for Social, Intellectual, Spiritual, Physical, and Executive aspects. The input performance values are used in the analysis processing. The meanings of the respective questions are: Social, "Did you build rich human relationships (cooperation, empathy)?"; Intellectual, "Were you able to do what you needed to do?"; Spiritual, "Did you find your work rewarding and fulfilling?"; Physical, "Did you take care of your body (rest, nutrition, exercise)?"; and Executive, "Did you gain new knowledge (awareness, insight)?".
 Performance for the organization may be calculated from individual performance values. Objective data such as sales or costs, and data that is already quantified such as customer questionnaire results, may be entered periodically as performance. When a numerical value is obtained automatically, such as an error rate in production management, the obtained value may be entered automatically as a performance value. Furthermore, economic indicators such as gross national product (GNP) may also be entered. These values are stored in the organization information table (H).
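As a minimal sketch of deriving organization-level performance from individual entries as described above, one could average the members' ratings (the aggregation rule shown here is an illustrative assumption, not specified by the patent; function and variable names are invented):

```python
def organization_performance(individual_scores):
    """Average the members' subjective ratings into one organization value.

    Any other aggregation rule (weighted mean, median, ...) could be
    substituted; averaging is only an example.
    """
    if not individual_scores:
        raise ValueError("no performance values entered")
    return sum(individual_scores) / len(individual_scores)

# Example: SOCIAL ratings entered by four members on a 1-5 scale
org_social = organization_performance([4, 3, 5, 4])
```
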
 The base station (GW) shown in FIG. 1D mediates between the name-tag sensor nodes (TR) shown in FIG. 1E and the sensor network server (SS) shown in FIG. 1C. Taking the wireless range into account, a plurality of base stations (GW) are arranged so as to cover areas such as offices and workplaces. The base station (GW) comprises a transceiver unit (GWSR), a storage unit (GWME), a clock (GWCK), and a control unit (GWCO).
 The transceiver unit (GWSR) receives wireless transmissions from the name-tag sensor nodes (TR) and performs wired or wireless transmission toward the sensor network server (SS). The transceiver unit (GWSR) also has an antenna for wireless reception.
 The storage unit (GWME) is composed of a nonvolatile storage device such as a hard disk or flash memory. The storage unit (GWME) stores at least operation settings (GWMA), data format information (GWMF), a terminal management table (GWTT), and base station information (GWMG). The operation settings (GWMA) contain information indicating how the base station (GW) operates. The data format information (GWMF) contains information indicating the data format used for communication and the information needed to tag sensing data. The terminal management table (GWTT) contains the terminal information (TRMT) of the name-tag sensor nodes (TR) currently associated with this base station, and the local IDs distributed to manage those nodes. The base station information (GWMG) contains information such as the address of the base station (GW) itself. The storage unit (GWME) also temporarily stores updated terminal firmware (GWTF) for the name-tag sensor nodes.
 The storage unit (GWME) may further store programs executed by the central processing unit (CPU, not shown) in the control unit (GWCO).
 The clock (GWCK) holds time information, which is updated at regular intervals. Specifically, the time information of the clock (GWCK) is corrected at regular intervals using time information obtained from an NTP (Network Time Protocol) server (TS).
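The periodic clock correction described above can be sketched as follows (an illustrative model only: the class and its methods are invented names, and the real base station would obtain the reference time over the network from the NTP server (TS)):

```python
class Clock:
    """Minimal model of the base-station clock (GWCK): it holds time
    information and is periodically corrected against a reference time."""

    def __init__(self, now):
        self.now = now              # held time, in seconds

    def tick(self, dt):
        self.now += dt              # local time advances (and may drift)

    def sync(self, reference_time):
        # Correct the held time to the reference obtained from the NTP
        # server; return the offset that was applied.
        offset = reference_time - self.now
        self.now = reference_time
        return offset

clock = Clock(now=1000.0)
clock.tick(59.7)                    # slightly slow local clock
applied = clock.sync(1060.0)        # periodic correction (offset ~ +0.3 s)
```
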
 The control unit (GWCO) includes a CPU (not shown). By executing programs stored in the storage unit (GWME), the CPU manages the timing of sensing-data acquisition, the processing of sensing data, the timing of transmission to and reception from the name-tag sensor nodes (TR) and the sensor network server (SS), and the timing of time synchronization. Specifically, by executing programs stored in the storage unit (GWME), the CPU carries out processing such as communication control (GWCC), association (GWTA), time synchronization management (GWCD), and time synchronization (GWCS).
 The communication control unit (GWCC) controls the timing of wireless or wired communication with the name-tag sensor nodes (TR) and the sensor network server (SS). The communication control unit (GWCC) also distinguishes the type of received data. Specifically, it identifies from the header of the data whether the received data is ordinary sensing data, data for association, a time-synchronization response, or the like, and passes each kind of data to the appropriate function.
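The header-based routing just described can be sketched as a dispatch table (hypothetical only: the type codes, handler names, and packet layout below are assumptions for illustration, not taken from the patent):

```python
# One handler per data type identified from the packet header.
HANDLERS = {
    "SENSING": lambda payload: ("store", payload),        # ordinary sensing data
    "ASSOCIATE": lambda payload: ("associate", payload),  # association data
    "TIMESYNC": lambda payload: ("sync", payload),        # time-sync response
}

def dispatch(packet):
    """Read the data type from the header and route the payload."""
    kind = packet["header"]["type"]
    handler = HANDLERS.get(kind)
    if handler is None:
        raise ValueError(f"unknown packet type: {kind}")
    return handler(packet["payload"])

result = dispatch({"header": {"type": "SENSING"}, "payload": b"\x01\x02"})
```
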
 The communication control unit (GWCC) also refers to the data format information (GWMF) recorded in the storage unit (GWME), converts data into a format suitable for transmission and reception, and performs data format conversion that adds tag information indicating the type of the data.
 The association function (GWTA) transmits a response (TRTAR) to an associate request (TRTAQ) received from a name-tag sensor node (TR), sending the local ID assigned to that node. Once association is established, the association function (GWTA) updates the terminal management information using the terminal management table (GWTT) and the terminal firmware (GWTF).
 Time synchronization management (GWCD) controls the interval and timing at which time synchronization is performed, and issues commands to synchronize. Alternatively, the sensor network server (SS), described later, may execute time synchronization management (GWCD) and issue commands centrally from the sensor network server (SS) to all base stations (GW) in the system.
 Time synchronization (GWCS) connects to the NTP server (TS) on the network, and requests and obtains time information. Based on the obtained time information, time synchronization (GWCS) corrects the clock (GWCK). Time synchronization (GWCS) then transmits a time-synchronization command and time information (GWCSD) to the name-tag sensor nodes (TR).
 FIG. 1E shows the configuration of a name-tag sensor node (TR), one embodiment of a sensor node. The name-tag sensor node (TR) carries several kinds of sensors: a plurality of infrared transceivers (AB) for detecting face-to-face encounters between people, a three-axis acceleration sensor (AC) for detecting the wearer's movement, a microphone (AD) for detecting the wearer's speech and surrounding sounds, illuminance sensors (LS1F, LS1B) for detecting whether the name-tag sensor node is flipped over, and a temperature sensor (AE). These sensors are an example; other sensors may be used to detect the wearer's face-to-face encounters and movement.
 In this embodiment, four sets of infrared transceivers are mounted. The infrared transceivers (AB) periodically and continuously transmit the terminal information (TRMT), which is the unique identifier of the name-tag sensor node (TR), in the forward direction. When a person wearing another name-tag sensor node (TR) is positioned roughly in front (for example, directly in front or diagonally in front), the two name-tag sensor nodes (TR) exchange their terminal information (TRMT) with each other by infrared. In this way, the system can record who is facing whom.
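One possible rule for turning the mutual ID exchange above into face-to-face records is sketched below (an illustrative assumption: the patent does not specify this exact rule, and the function, data layout, and node IDs are invented):

```python
def face_to_face_pairs(received_log):
    """received_log maps (receiver_id, time_slot) -> set of sender IDs
    decoded by infrared in that slot. A face-to-face encounter is counted
    when each of two nodes received the other's ID in the same slot."""
    pairs = set()
    for (receiver, slot), senders in received_log.items():
        for sender in senders:
            if receiver in received_log.get((sender, slot), set()):
                pairs.add((frozenset((receiver, sender)), slot))
    return pairs

# A and B saw each other; C saw A one-sidedly (e.g. from the side).
log = {("A", 0): {"B"}, ("B", 0): {"A"}, ("C", 0): {"A"}}
meetings = face_to_face_pairs(log)   # only the mutual A-B pair is recorded
```
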
 Each infrared transceiver generally consists of an infrared light-emitting diode for transmission combined with an infrared phototransistor. The infrared ID transmitter (IRID) generates the terminal information (TRMT), which is the node's own ID, and transfers it to the infrared light-emitting diodes of the infrared transceiver modules. In this embodiment, the same data is sent to all infrared transceiver modules, so all infrared light-emitting diodes light simultaneously. Of course, each module may instead output different data at independent timings.
 Data received by the infrared phototransistors of the infrared transceivers (AB) is combined by an OR circuit (IROR). That is, if at least one of the infrared receivers has received an ID, the name-tag sensor node recognizes it as an ID. Of course, the node may instead have multiple independent ID reception circuits. In that case, since the transmission/reception state can be tracked for each infrared transceiver module, additional information can be obtained, for example the direction in which the facing name-tag sensor node is located.
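The OR'ed reception logic above amounts to the following (a minimal sketch with invented names; per-module reception is modeled as an ID or `None`):

```python
def id_received(per_module_reception):
    """per_module_reception: one entry per IR module (here four),
    each either the decoded ID or None. The OR circuit (IROR) means
    a single successful module suffices for the node to see the ID."""
    return any(rx is not None for rx in per_module_reception)

def facing_direction(per_module_reception):
    """With independent receive circuits instead of an OR, the indices of
    the modules that decoded the ID hint at the partner's direction
    (illustrative extension, not the OR-circuit configuration)."""
    return [i for i, rx in enumerate(per_module_reception) if rx is not None]

# Only module 2 decoded an ID; the node still counts it as a reception.
seen = id_received([None, None, "TR-0042", None])
```
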
 Sensor data (SENSD) detected by the sensors is stored in the storage unit (STRG) by the sensor-data storage control unit (SDCNT). The sensor data (SENSD) is processed into transmission packets by the communication control unit (TRCC) and transmitted to the base station (GW) by the transceiver unit (TRSR).
 The communication timing control unit (TRTMG) generates the timing at which the sensor data (SENSD) is retrieved from the storage unit (STRG) and transmitted wirelessly. The communication timing control unit (TRTMG) has a plurality of time bases that generate a plurality of timings.
 The data stored in the storage unit includes, in addition to the sensor data (SENSD) currently detected by the sensors, batch-send data (CMBD) accumulated in the past and firmware update data (FMUD) for updating the firmware, which is the operating program of the name-tag sensor node.
 The name-tag sensor node (TR) of this embodiment detects, via the external power connection detection circuit (PDET), that an external power supply (EPOW) has been connected, and generates an external power detection signal (PDETS). Configurations specific to this embodiment are the time-base switching unit (TMGSEL), which switches the transmission timing generated by the communication timing control unit (TRTMG) according to the external power detection signal (PDETS), and the data switching unit (TRDSEL), which switches the data to be transmitted wirelessly. As an example, FIG. 1E illustrates a configuration in which the time-base switching unit (TMGSEL) switches the transmission timing between two time bases, time base 1 (TB1) and time base 2 (TB2), according to the external power detection signal (PDETS). It also illustrates a configuration in which the data switching unit (TRDSEL) switches, according to the external power detection signal (PDETS), among the sensor data (SENSD) obtained from the sensors, the batch-send data (CMBD) accumulated in the past, and the firmware update data (FMUD).
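The switching driven by the external power detection signal (PDETS) can be sketched as below. The concrete policy (drain stored data first when docked, send live data on battery) is an assumption for illustration; the patent only states that the time base and the data source are switched:

```python
def select_transmission(pdets, sensd, cmbd, fmud):
    """Choose time base and data source from the power state.

    pdets: True when the external power supply (EPOW) is connected.
    sensd: current sensor data; cmbd: backlog of batch-send data;
    fmud: pending firmware update data (may be None).
    """
    if pdets:                              # docked: power is plentiful
        timebase = "TB2"
        data = cmbd if cmbd else fmud      # flush stored data first
    else:                                  # on battery: send live data
        timebase = "TB1"
        data = sensd
    return timebase, data

choice = select_transmission(pdets=False, sensd=b"acc", cmbd=[b"old"], fmud=None)
```
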
 The illuminance sensors (LS1F, LS1B) are mounted on the front and back faces of the name-tag sensor node (TR), respectively. The data acquired by the illuminance sensors (LS1F, LS1B) is stored in the storage unit (STRG) by the sensor-data storage control unit (SDCNT) and simultaneously compared by the flip detection function (FBDET). When the name tag is worn correctly, the illuminance sensor mounted on the front face (front, LS1F) receives ambient light, while the illuminance sensor mounted on the back face (back, LS1B) is sandwiched between the node body and the wearer and therefore receives no ambient light. In this case, the illuminance detected by the front sensor (LS1F) is larger than that detected by the back sensor (LS1B). Conversely, when the name-tag sensor node (TR) is flipped over, the back sensor (LS1B) receives ambient light and the front sensor (LS1F) faces the wearer, so the illuminance detected by the back sensor (LS1B) becomes larger than that detected by the front sensor (LS1F).
 Thus, by comparing the illuminance detected by the front sensor (LS1F) with that detected by the back sensor (LS1B) in the flip detection function (FBDET), it can be detected that the name-tag node is flipped over and is not worn correctly. When flipping is detected by the flip detection function (FBDET), a warning sound is emitted from the speaker (SP) to notify the wearer.
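The flip detection (FBDET) reduces to a comparison of the two illuminance readings, as in this minimal sketch (function names and return values are illustrative assumptions):

```python
def is_flipped(lux_front, lux_back):
    """The tag is flipped when the back face (LS1B) sees more ambient
    light than the front face (LS1F)."""
    return lux_back > lux_front

def check_wearing(lux_front, lux_back):
    if is_flipped(lux_front, lux_back):
        return "beep"   # the speaker (SP) would warn the wearer
    return "ok"
```
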
 The microphone (AD) acquires audio information. From the audio information, the surrounding environment, such as "noisy" or "quiet", can be determined. Furthermore, by acquiring and analyzing people's voices, face-to-face communication can be analyzed: whether the communication is lively or stagnant, whether the participants are conversing on an equal footing or one person is talking one-sidedly, whether they are angry or laughing, and so on. In addition, face-to-face states that the infrared transceivers (AB) could not detect, because of the participants' standing positions or the like, can be compensated for using the audio information and acceleration information.
 The audio captured by the microphone (AD) is acquired both as a raw waveform and as a signal integrated by an integrating circuit (AVG). The integrated signal represents the energy of the captured audio.
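Digitally, the energy produced by the integrating circuit (AVG) can be approximated as the sum of squared waveform samples over a frame (an illustrative sketch, not the analog circuit itself; names are invented):

```python
def frame_energy(samples):
    """Approximate the integrated audio energy of one frame as the sum of
    squared waveform samples (values normalized to [-1, 1])."""
    return sum(s * s for s in samples)

quiet = frame_energy([0.0, 0.01, -0.01])   # near-silence
loud = frame_energy([0.5, -0.6, 0.4])      # speech-like amplitudes
```
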
 The three-axis acceleration sensor (AC) detects the acceleration of the node, that is, the movement of the node. From the acceleration data, the intensity of movement of the person wearing the name-tag sensor node (TR) and behaviors such as walking can therefore be analyzed. Furthermore, by comparing the acceleration values detected by multiple name-tag sensor nodes (TR), the activity level of communication between the wearers of those nodes, their mutual rhythms, their mutual correlations, and so on can be analyzed.
 In the name-tag sensor node (TR) of this embodiment, the data acquired by the three-axis acceleration sensor (AC) is stored in the storage unit (STRG) by the sensor-data storage control unit (SDCNT), and at the same time the orientation of the name tag is detected by the up/down detection function (UDDET). This exploits the fact that the acceleration detected by the three-axis acceleration sensor (AC) is observed as two components: dynamic acceleration changes due to the wearer's movement and static acceleration due to the earth's gravity.
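One common way to separate the two components above is to low-pass filter the acceleration stream: the slowly varying residue approximates gravity, and its sign on the vertical axis indicates the tag's orientation. This is a sketch under assumed conventions (filter constant, axis, and sign are not specified by the patent):

```python
def gravity_estimate(samples, alpha=0.1):
    """Exponential low-pass filter: suppresses the dynamic (motion)
    component and leaves the static (gravity) component."""
    g = samples[0]
    for a in samples[1:]:
        g = (1 - alpha) * g + alpha * a
    return g

def is_upside_down(vertical_axis_samples):
    # Sign convention assumed: worn normally, the vertical axis reads
    # about -1 g; a positive gravity estimate means the tag is inverted.
    return gravity_estimate(vertical_axis_samples) > 0

# Worn normally: roughly -1 g plus motion noise on the vertical axis.
upright_reading = is_upside_down([-1.0, -0.9, -1.1, -1.0])
```
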
 When the name-tag sensor node (TR) is worn on the chest, the display device (LCDD) shows personal information such as the wearer's affiliation and name; that is, it behaves as a name tag. On the other hand, when the wearer holds the name-tag sensor node (TR) in hand and turns the display device (LCDD) toward himself or herself, the node is turned upside down. At this point, the content shown on the display device (LCDD) and the functions of the buttons are switched according to the up/down detection signal (UDDETS) generated by the up/down detection function (UDDET). This embodiment shows an example in which, according to the value of the up/down detection signal (UDDETS), the information shown on the display device (LCDD) is switched between the analysis results of the infrared activity analysis (ANA) generated by the display control (DISP) and the name-tag display (DNM).
 By exchanging infrared signals between nodes via the infrared transceivers (AB), it is detected whether a name-tag sensor node (TR) has faced another name-tag sensor node (TR), that is, whether the person wearing one name-tag sensor node (TR) has faced the person wearing the other. For this reason, the name-tag sensor node (TR) is desirably worn on the front of the person. As described above, the name-tag sensor node (TR) also includes sensors such as the three-axis acceleration sensor (AC). The sensing process in the name-tag sensor node (TR) corresponds to organization dynamics data acquisition (A) in FIG. 2A.
 In many cases there are multiple name-tag sensor nodes (TR), each of which connects to a nearby base station (GW) to form a personal area network (PAN).
 The temperature sensor (AE) of the name-tag sensor node (TR) acquires the temperature at the node's location, and the front illuminance sensor (LS1F) acquires the illuminance in the node's forward direction and the like. These allow the surrounding environment to be recorded. For example, based on the temperature and the illuminance, it is also possible to determine that the name-tag sensor node (TR) has moved from one place to another.
 As input/output devices for the wearer, the node has buttons 1 to 3 (BTN1 to BTN3), a display device (LCDD), a speaker (SP), and the like.
 Specifically, the storage unit (STRG) is composed of a nonvolatile storage device such as a hard disk or flash memory, and records the terminal information (TRMT), which is the unique identification number of the name-tag sensor node (TR), the sensing intervals, and operation settings (TRMA) such as what to output to the display. In addition, the storage unit (STRG) can record data temporarily and is used to hold sensed data.
 The communication timing control unit (TRTMG) holds time information (GWCSD), updates it at regular intervals, and records it as the clock (TRCK). To prevent the time information (GWCSD) from drifting relative to other name-tag sensor nodes (TR), the time is periodically corrected using the time information (GWCSD) transmitted from the base station (GW).
 The sensor-data storage control unit (SDCNT) controls the sensing intervals and other parameters of each sensor according to the operation settings (TRMA) recorded in the storage unit (STRG), and manages the acquired data.
 Time synchronization obtains time information from the base station (GW) and corrects the clock. Time synchronization may be performed immediately after association, described later, or may be performed according to a time-synchronization command transmitted from the base station (GW).
 When transmitting and receiving data, the wireless communication control unit (TRCC) controls the transmission intervals and converts data to and from the formats used for transmission and reception. The wireless communication control unit (TRCC) may, if necessary, have a wired rather than wireless communication function. The wireless communication control unit (TRCC) may also perform congestion control so that its transmission timing does not collide with that of other name-tag sensor nodes (TR).
 The association function (TRTA) transmits and receives an associate request (TRTAQ) and an associate response (TRTAR) for forming a personal area network (PAN) with a base station (GW) shown in FIG. 1D, and determines the base station (GW) to which data should be transmitted. Association (TRTA) is executed when the name-tag sensor node (TR) is powered on, and when the node has moved and communication with its previous base station (GW) has been lost. As a result of association (TRTA), the name-tag sensor node (TR) is associated with one base station (GW) within the range its radio signal can reach.
 The transceiver unit (TRSR) has an antenna and transmits and receives wireless signals. If necessary, the transceiver unit (TRSR) can also transmit and receive using a connector for wired communication. The transmission/reception data (TRSRD) sent and received by the transceiver unit (TRSR) is transferred to and from the base station (GW) via the personal area network (PAN).
 FIGS. 2A, 2B, and 2C show the overall flow of the processing executed in a business microscope system, which is one embodiment. The flow is divided across the figures for convenience of illustration, but the illustrated processes are executed in coordination with one another. The figures show a single sequence: acquisition of organization dynamics data (A) by multiple name-tag sensor nodes (TRa, TRb, ..., TRi, TRj) shown in FIG. 2A; modeling analysis (CA), the analysis of the sensor data shown in FIG. 2C; visualization of the analysis results by model drawing (JA); and the visualization results forming a scientific management knowledge model (KA).
 Organization dynamics data acquisition (A) is described with reference to FIG. 2A. Name-tag sensor node A (TRa) is composed of sensors such as the infrared transceiver (AB), acceleration sensor (AC), microphone (AD), and temperature sensor (AE), and of buttons (AF): net work (AFA), awareness (AFB), and thanks (AFC).
 It also has a screen (AG) that displays the face-to-face information obtained from the infrared transceiver, a user interface (AA) for entering ratings, and, although not shown, a microcomputer and a wireless transmission function.
 The acceleration sensor (AC) detects the acceleration of name-tag sensor node A (TRa), that is, the acceleration of person A (not shown) wearing name-tag sensor node A (TRa). The infrared transceiver (AB) detects the facing state of name-tag sensor node A (TRa), that is, whether name-tag sensor node A (TRa) is facing another name-tag sensor node. Name-tag sensor node A (TRa) facing another name-tag sensor node indicates that person A, who wears name-tag sensor node A (TRa), is facing the person wearing the other name-tag sensor node. The microphone (AD) detects the sound around name-tag sensor node A (TRa), and the temperature sensor (AE) detects the temperature around name-tag sensor node A (TRa).
 The buttons (AF) are for input from the subjective viewpoint of person A (not shown) wearing name-tag sensor node A (TRa). Person A presses the net work button (AFA) when performing his or her main work, the awareness button (AFB) upon discovering a new idea or the like, and the thanks button (AFC) when feeling grateful to a member.
 The system of this embodiment includes a plurality of name-tag sensor nodes (name-tag sensor node A (TRa) through name-tag sensor node J (TRj) in FIG. 2A). Each name-tag sensor node is worn by one person. For example, name-tag sensor node A (TRa) is worn by person A, and name-tag sensor node B (TRb) by person B (not shown). This is in order to analyze the relationships between the people and, further, to depict the performance of the organization.
 Name-tag sensor node B (TRb) through name-tag sensor node J (TRj) also have sensors, a microcomputer, and a wireless transmission function, like name-tag sensor node A (TRa). In the following description, when a statement applies to all of name-tag sensor nodes A (TRa) through J (TRj), or when there is no particular need to distinguish them, they are referred to simply as name-tag sensor nodes.
 Each name-tag sensor node performs sensing with its sensors constantly (or repeatedly at short intervals). Each node then transmits the acquired data (sensing data) wirelessly at a predetermined interval. The transmission interval may be the same as the sensing interval or longer. Each transmitted datum carries the time of sensing and the unique identifier (ID) of the name-tag sensor node that sensed it. Wireless transmission of data is batched in order to suppress the power consumed by transmission and thereby keep the name-tag sensor node (TR) usable for a long time while worn. For the later analysis, it is also desirable that the same sensing interval be set on all name-tag sensor nodes.
 The data transmitted wirelessly from the name-tag sensor nodes is collected by organization dynamics data collection (B) shown in FIG. 2B and stored in a database.
 The data table (BA) stores the sensor data obtained from the name tag type sensor nodes.
 The user ID (BAA) is the user's identifier; the acquisition time (BAB) is the time at which the name tag type sensor node (TR) acquired the data; the base station (BAC) is the base station that received the sensor data from the name tag type sensor node (TR); the acceleration sensor field (BAD) holds the sensor data of the acceleration sensor (AC); the IR sensor field (BAE) holds the sensor data of the infrared transceiver (AB); the sound sensor field (BAF) holds the sensor data of the microphone (AD); the temperature field (BAG) holds the sensor data of the temperature sensor (AE); awareness (BAH) records whether the awareness (AFB) button was pressed; gratitude (BAI) records whether the gratitude (AFC) button was pressed; net (BAJ) records whether the net (AFA) button was pressed; and the terminal field (BAI) holds information for identifying the terminal.
 The performance table (BB) stores the performance values entered through the performance input (C) and the rating input (AA).
 The user ID (BBA) is the user's identifier, and the acquisition time (BBB) is the time of the rating input (AA) on the name tag type sensor node (TR) or the time of the performance input (C). SOCIAL (BBC), INTELLECTUAL (BBD), SPIRITUAL (BBE), PHYSICAL (BBF), and EXECUTIVE (BBG) hold the rating content, and the terminal field (BBH) holds information for identifying the terminal.
 In the organization dynamics data collection (B), data is stored in the order of arrival, so the records are not necessarily in chronological order. The table layout shown for the data table (BA) is only an example; a separate table may be created for each type of sensor data.
 From the organization dynamics data gathered by the organization dynamics data collection (B), the modeling analysis (CA) shown in FIG. 2C generates a model of beneficial factors; the model is visualized by model drawing (JA), and the visualization result is the scientific management knowledge model (KA).
 The modeling analysis (CA) is a process that clarifies which organizational activities are beneficial factors for stress, productivity, and the like. Specifically, stress, productivity, and similar measures are treated as objective variables, organization dynamics indices representing organizational activities are treated as explanatory variables, correlation processing between them is performed, and beneficial factors are selected on the basis of the correlation results. This makes clear which organizational activities affect stress, productivity, and so forth, and identifies the organizational activities that should be improved.
 The overall flow of the modeling analysis (CA) is as follows. The modeling analysis (CA) first converts the organization dynamics data into per-user time-series tables through face-to-face table creation (C1A) and body rhythm table creation (C1B). From these results, the users' face-to-face situations are summarized in matrix form (face-to-face matrix creation (C1C)). From these data, network index extraction (CAA), body rhythm index extraction (CAB), face-to-face index extraction (CAC), and organizational activity index extraction (CAD) are performed to obtain a variety of organization dynamics indices covering organizational activity.
 As questionnaires (G), users answer a personality questionnaire (GA), a leadership/teamwork questionnaire (GB), an employee motivation/fulfillment questionnaire (GC), a stress/mental ill-health questionnaire (GD), and an organization activation questionnaire (GE); the answers are used as subjective data. The questionnaire results are stored in the corresponding tables of the analysis result table (F). The personality index (FAAE) is used as an explanatory variable (FAA), based on the hypothesis that a user's personality is innate and does not change. The organization's productivity index (HA) and accident/defect index (HB) are used as objective variables.
 The correlation analysis (CAE) computes the correlation between the explanatory variables (FAA) and the objective variables (FAB) of the analysis result table (F), and between the explanatory variables (FAA) and the objective variables (HA) of the organization information table (H). At that time, not only may the correlation between a member's explanatory variables and that member's own objective variables be computed; the objective-variable values of the surrounding members who meet the member may also be used. That is, the surrounding members who meet a given member are identified from the face-to-face matrix (FC1C), and the mean, variance, or the like of the objective variables of those identified surrounding members is used.
 The results are stored in the factor coefficients (FAC), and only beneficial factors are selected by factor selection (CAF). In that selection it is possible to consider not only a high correlation value but also a good test statistic (for example, the p-value) or coverage of the range of organizational activities.
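As an illustration, the correlation step above can be sketched in Python. This is a hypothetical sketch, not the patent's implementation: the function names and the fixed correlation threshold are invented for the example, and a real factor selection (CAF) would also filter on test statistics such as the p-value.

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation between two equal-length value sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_factors(explanatory, objective, r_threshold=0.7):
    """Keep only the organization dynamics indices whose correlation with
    the objective variable (e.g. a productivity index) is strong."""
    return {name: round(pearson_r(values, objective), 3)
            for name, values in explanatory.items()
            if abs(pearson_r(values, objective)) >= r_threshold}
```

For example, an index that rises with the objective variable is kept, while an uncorrelated one is dropped.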
 The factors selected by factor selection (CAF) are drawn by model drawing (JA). The result is the scientific management knowledge model (KA).
 Face-to-face table creation (C1A) is a process that summarizes, from the infrared data in the organization dynamics data, the face-to-face situation between members in chronological order for each fixed period. The extracted result is stored in the face-to-face table (FC1A) of the analysis result table (F). FIG. 3 shows an example of the face-to-face table (FC1A), which stores one day (24 hours) in chronological order, with one record per user and a time resolution of one minute (FC1A3).
 In the face-to-face table (July 1, 2009), the vertical axis is the user ID (FC1A1) for identifying individual members, and the horizontal axis is the resolution time (FC1A2), i.e., the time at the chosen time resolution. The face-to-face situation of a user at a given time can be read directly from the intersection of the user ID (FC1A1) and the resolution time (FC1A2). For example, at 10:02 on 2009/7/1, user ID 001 was meeting two people, namely user IDs 002 and 003. When no infrared data exists in the organization dynamics data for a given user at a given time, NULL is stored in the face-to-face table (FC1A).
 Because a face-to-face table (FC1A) is generated per day and per time resolution, tables for the same date but different time resolutions are separate tables. For example, (FC1A4) and (FC1A5) both cover the same date (July 2, 2009) but differ in time resolution, and are therefore separate tables. What matters is that the face-to-face table (FC1A) stores the number of persons met and their user IDs; as long as this is satisfied, a table configuration different from the one shown here may be used.
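As an illustration of face-to-face table creation (C1A), the following Python sketch builds one day's FC1A-style table from raw infrared detection records. The record format and function name are hypothetical, since the patent does not prescribe a data layout. A slot with no record at all remains None (the NULL of the text), while a slot where the node was active but detected nobody holds an empty set (zero persons).

```python
def build_face_table(ir_records, user_ids, slots_per_day=1440):
    """Build a per-user, per-minute face-to-face table from infrared data.

    ir_records: list of (user_id, minute_slot, detected_user_id) tuples;
    detected_user_id is None when the node was active but met nobody.
    """
    table = {uid: [None] * slots_per_day for uid in user_ids}
    for uid, slot, other in ir_records:
        if table[uid][slot] is None:
            table[uid][slot] = set()      # node active: 0 persons so far
        if other is not None:
            table[uid][slot].add(other)   # record the partner's user ID
    return table
```

Reading a cell then gives both the number of persons met (the set's size) and who they were, as the text requires.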
 Body rhythm table creation (C1B) is a process that summarizes the body rhythm situation in chronological order for each fixed period by expressing each member's body movement, derived from the acceleration data in the organization dynamics data, in Hz.
 The extracted result is stored in the body rhythm table (FC1B) of the analysis result table (F). FIG. 4 shows an example of the body rhythm table (FC1B). One day (24 hours) is stored in chronological order, with one record per user and a time resolution of one minute (FC1B3).
 In the body rhythm table (July 1, 2009) (FC1B3), the vertical axis is the user ID (FC1B1) for identifying individual members, and the horizontal axis is the resolution time (FC1B2). The body rhythm of a user at a given time can be read directly from the intersection of the user ID (FC1B1) and the resolution time (FC1B2). For example, at 10:02 on 2009/7/1, the body rhythm of user ID 001 was 2.1 Hz. When no acceleration data exists in the organization dynamics data for a given user at a given time, NULL is stored in the body rhythm table (FC1B).
 Because a body rhythm table (FC1B) is generated per day and per time resolution, tables for the same date but different time resolutions are separate tables. For example, (FC1B4) and (FC1B5) both cover the same date (July 2, 2009) but differ in time resolution, and are therefore separate tables. What matters is that the body rhythm table (FC1B) stores the users' body rhythms; as long as this is satisfied, a table configuration different from the one shown here may be used.
 Face-to-face matrix creation (C1C) is a process that removes the time-series information from the chronologically arranged face-to-face table (FC1A) and summarizes, in a two-dimensional matrix, how much face-to-face contact occurred for each user.
 The extracted result is stored in the face-to-face matrix (FC1C) of the analysis result table (F). FIG. 5 shows an example of the face-to-face matrix (FC1C), which aggregates one month of face-to-face results. Because the unit is the time resolution of the face-to-face table (FC1A), a value of 1 stored in the face-to-face matrix (FC1C) means one minute of contact if the time resolution is one minute, or five minutes if the time resolution is five minutes.
 In the face-to-face matrix (FC1C), the vertical axis is the user ID (FC1C1) identifying individual members, and the horizontal axis is the user ID (FC1C2) of the person met. For example, the face-to-face time of user 002 with user 003 is 33 minutes.
 Because a great deal of information is aggregated into the single face-to-face matrix (FC1C), the original conditions must be recorded along with it. Period: July 1-July 31, 2009 (FC1C3) is the period used for the face-to-face matrix (FC1C). Days: 31 days (FC1C4) is the number of days in the period (FC1C3). Actual days: 21 days (FC1C5) is the number of business days in the period (FC1C3). Time resolution: 1 minute (FC1C6) is the time resolution of the face-to-face table (FC1A). Face-to-face determination time: 3 minutes/day (FC1C7) is the threshold for determining that a meeting occurred. If the infrared transceiver receives infrared light when two people merely pass each other, a meeting would otherwise be registered; because a handful of detections is likely to be noise, this threshold is introduced. Also, since what matters is that the face-to-face matrix (FC1C) stores the users' face-to-face situations, a table configuration different from the one shown here may be used as long as this is satisfied.
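A sketch of face-to-face matrix creation (C1C) under the same hypothetical data layout (per-minute slots holding sets of partner IDs, None for NULL) might look as follows; the per-day noise threshold corresponds to the face-to-face determination time (FC1C7).

```python
def build_face_matrix(face_tables, user_ids, min_minutes_per_day=3):
    """Aggregate daily FC1A-style tables into a user x user minute matrix,
    dropping pairs whose daily contact is below the noise threshold."""
    matrix = {u: {v: 0 for v in user_ids} for u in user_ids}
    for day_table in face_tables:                       # one table per day
        daily = {u: {v: 0 for v in user_ids} for u in user_ids}
        for u, slots in day_table.items():
            for others in slots:
                if others:                              # skip NULL / 0 persons
                    for v in others:
                        daily[u][v] += 1                # one slot = one minute
        for u in user_ids:
            for v in user_ids:
                if daily[u][v] >= min_minutes_per_day:  # likely a real meeting
                    matrix[u][v] += daily[u][v]
    return matrix
```

With a 1-minute resolution each count is directly a number of minutes, matching the 33-minute example in the text.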
 Network index extraction (CAA) is a process that computes indices from the network diagram created from the face-to-face matrix (FC1C). FIG. 6 shows an example of the network index table (FAAA) that stores the indices obtained by network index extraction (CAA); the table stores indices per user. A network index is an index that expresses the connections between each of a plurality of persons and the other persons in the organization.
 The table consists of a user ID (FAAA1) identifying the user and the network indices (degree (FAAA2), cohesion (FAAA3), two-step reach (FAAA4), betweenness centrality (FAAA5), and total face-to-face time (FAAA6)). Period: July 1-July 31, 2009 (FAAA6) shows the period used for the analysis. Time resolution: 1 minute (FAAA7) is the analysis time resolution.
 The network diagram (ZA) of FIG. 7 is an example of a network diagram created from the face-to-face matrix. The network diagram (ZA) consists of nodes (ZA1) to (ZA5) representing persons and edges (ZA6) to (ZA11) connecting members who have met face to face. A spring model is used for the layout. In the spring model (Hooke's law), whenever two nodes (points) are connected, a force (inward or outward) is computed as if a spring joined them; in addition, each node receives a distance-dependent repulsive force from every node not connected to it, and the positions are moved repeatedly until an optimal layout is reached. The network indices (FAAA) are explained below using this network diagram (ZA) as an example.
 The degree (FAAA2) is the number of edges connected to a node. In the network diagram (ZA), Takahashi (ZA1) is connected to Tanaka (ZA2) and Ito (ZA4), so his degree is 2.
 The cohesion (FAAA3) is the density of the nodes around a given node, an index of the degree to which the people around a person are connected to one another. Taking Ito (ZA4) in the network diagram (ZA) as an example, Ito's face-to-face partners are the three people Takahashi (ZA1), Yamamoto (ZA5), and Tanaka (ZA2). It suffices to examine the density among those three: the number of edges among the three divided by the maximum possible number of edges among three people, i.e., 2/3 = 0.67.
 The two-step reach (FAAA4) is the number of nodes in the whole diagram that can be reached within two steps. In the network diagram (ZA), the nodes Watanabe (ZA3) can cover within two steps are all of (ZA1) to (ZA5), i.e., the 4 nodes other than himself, so the value is 4.
 The betweenness centrality (FAAA5) is a value expressing how much a node contributes to the connectivity of the whole network diagram. It is the number of pairs of persons in the organization for which the given person lies on the shortest route between them on the network diagram. When there are n shortest routes between person A and person B, each occurrence is counted as 1/n.
 The total face-to-face time (FAAA6) is the sum of the face-to-face time during the period. It is obtained from the face-to-face matrix (FC1C): the sum across a user's row of the face-to-face matrix (FC1C) is that user's face-to-face time.
 The network indices (FAAA) have been described above, but the indices are not limited to these; other indices may be created from the face-to-face matrix (FC1C) and used for the analysis.
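For illustration, the degree, cohesion, and two-step reach described above can be computed from an adjacency structure derived from the face-to-face matrix. This is a hypothetical sketch, not the patent's code; betweenness centrality is omitted for brevity.

```python
def network_indices(adj):
    """adj: dict mapping each person to the set of people they met."""
    indices = {}
    for u, nbrs in adj.items():
        degree = len(nbrs)
        # cohesion: edges among u's neighbours / max possible edges among them
        pairs = [(a, b) for a in nbrs for b in nbrs if a < b]
        linked = sum(1 for a, b in pairs if b in adj[a])
        cohesion = linked / len(pairs) if pairs else 0.0
        # two-step reach: nodes reachable within two hops, excluding u itself
        reach = set(nbrs)
        for v in nbrs:
            reach |= adj[v]
        reach.discard(u)
        indices[u] = {"degree": degree,
                      "cohesion": round(cohesion, 2),
                      "two_step_reach": len(reach)}
    return indices
```

Applied to an adjacency consistent with the FIG. 7 example, this reproduces the values in the text: degree 2 for Takahashi, cohesion 0.67 for Ito, and a two-step reach of 4 for Watanabe.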
 Body rhythm index extraction (CAB) is a process that computes indices from the body rhythm table (FC1B). FIG. 6 shows an example of the body rhythm index table (FAAB) that stores the indices obtained by body rhythm index extraction (CAB); the table stores indices per user.
 The table consists of a user ID (FAAB1) identifying the user and the body rhythm indices (frequency of occurrence of 0-1 Hz (FAAB2), frequency of occurrence of 1-2 Hz (FAAB3), frequency of occurrence of 2-3 Hz (FAAB4), continuity of 0-1 Hz (FAAB5), continuity of 1-2 Hz (FAAB6), and continuity of 2-3 Hz (FAAB7)).
 Period: July 1-July 31, 2009 (FAAB8) shows the period used for the analysis. Time resolution: 1 minute (FAAB9) is the analysis time resolution. Time interval: 1 day (FAAB10) specifies the range over which averages and the like are computed within the period (FAAB8).
 Because the body rhythm table (FC1B) stores the body rhythm in Hz for each time-resolution slot, a histogram with 1 Hz bins is created from it. The histogram value from 0 Hz to 1 Hz gives the frequency of occurrence of 0-1 Hz (FAAB2), the value from 1 Hz to 2 Hz gives the frequency of occurrence of 1-2 Hz (FAAB3), and the value from 2 Hz to 3 Hz gives the frequency of occurrence of 2-3 Hz (FAAB4).
 Furthermore, because the body rhythm table (FC1B) stores the body rhythms in time series, the continuity of each rhythm can be computed. Specifically, the degree of continuation is examined: the body rhythm at one time is compared with the body rhythm at the next time, the cases in which both rhythms fall between 0 Hz and 1 Hz are counted, and the count is divided by the time interval (FAAB10) to obtain the continuity of 0-1 Hz within the time interval. In the same way, the continuity from 1 Hz to 2 Hz is obtained as the continuity of 1-2 Hz (FAAB6), and the continuity from 2 Hz to 3 Hz as the continuity of 2-3 Hz (FAAB7).
 Furthermore, because these are daily values computed per time interval (FAAB10), the value stored in each field of the body rhythm index table (FAAB) is the average over the period (FAAB8).
 The body rhythm indices (FAAB) have been described above, but the indices are not limited to these; other indices may be created from the body rhythm table (FC1B) and used for the analysis. Furthermore, although the body rhythm index extraction (CAB) stores the average over the period (FAAB8), the variance or the like may be used instead.
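The occurrence-frequency histogram and the continuity computation for one day's body rhythm row can be sketched as follows; the function name and data layout are hypothetical, and None stands for the NULL slots of the text.

```python
def rhythm_indices(rhythms, minutes=1440):
    """rhythms: one day of per-minute body rhythms in Hz (None = no data).
    Returns per-band occurrence counts and continuity per time interval."""
    bands = [(0, 1), (1, 2), (2, 3)]
    occurrence = {b: 0 for b in bands}
    continuity = {b: 0 for b in bands}
    for i, hz in enumerate(rhythms):
        if hz is None:
            continue
        for lo, hi in bands:
            if lo <= hz < hi:
                occurrence[(lo, hi)] += 1
                # continuity: this slot and the next fall in the same band
                nxt = rhythms[i + 1] if i + 1 < len(rhythms) else None
                if nxt is not None and lo <= nxt < hi:
                    continuity[(lo, hi)] += 1
    # divide by the time interval (here one day of minutes), as in the text
    continuity = {b: c / minutes for b, c in continuity.items()}
    return occurrence, continuity
```

The per-day results would then be averaged over the period (FAAB8) before being stored in the FAAB table.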
 Face-to-face index extraction (CAC) is a process that computes indices from the face-to-face table (FC1A) and the body rhythm table (FC1B). FIG. 8 shows an example of the face-to-face index table (FAAC) that stores the indices obtained by face-to-face index extraction (CAC); the table stores indices per user.
 The table consists of a user ID (FAAC1) identifying the user and the face-to-face indices (face-to-face time (FAAC2), non-face-to-face time (FAAC3), active face-to-face time (FAAC4), passive face-to-face time (FAAC5), two-person face-to-face time (FAAC6), three-to-five-person face-to-face time (FAAC7), and six-or-more-person face-to-face time (FAAC8)).
 Period: July 1-July 31, 2009 (FAAC9) shows the period used for the analysis. Time resolution: 1 minute (FAAC10) is the analysis time resolution. Time interval: 1 day (FAAC11) specifies the range over which averages and the like are computed within the period (FAAC9).
 The face-to-face time and non-face-to-face time during organization dynamics data acquisition are computed from the face-to-face table (FC1A). A slot is counted as face-to-face time if the value stored in the face-to-face table (FC1A) is one or more persons, and as non-face-to-face time if it is zero persons. NULL slots are counted as neither. The face-to-face time (FAAC2) is the count of face-to-face slots, and the non-face-to-face time (FAAC3) is the count of non-face-to-face slots. Since the analysis time resolution is one minute, the count itself is the time in minutes.
 Whether a meeting is active or passive is determined by examining, for the slots judged as face-to-face in the face-to-face table (FC1A), the body rhythm table (FC1B) of the members involved at those times. As the threshold for this determination, a body rhythm of 2 Hz or more during a meeting is judged to be active (positive) face-to-face contact, and less than 2 Hz to be passive face-to-face contact. This threshold is based on the inventors' finding, from examining the relationship between users' behavior and movement rhythm during meetings, that meetings regarded as active, i.e., meetings involving gestures and not merely words, show a movement rhythm of 2 Hz or more. The active face-to-face time (FAAC4) is the count of active face-to-face slots, and the passive face-to-face time (FAAC5) is the count of passive face-to-face slots. Since the analysis time resolution is one minute, the count itself is the time in minutes.
 The number of persons involved in each meeting is determined from the face-to-face table (FC1A). Because the face-to-face table (FC1A) records the number of persons met for each analysis-time-resolution slot, the values are obtained simply by counting. Three analysis ranges are used: two persons, three to five persons, and six or more persons. The two-person face-to-face time (FAAC6) is the count of two-person meetings; the three-to-five-person face-to-face time (FAAC7) is the count of meetings of three to five persons; and the six-or-more-person face-to-face time (FAAC8) is the count of meetings of six or more persons. Since the analysis time resolution is one minute, the count itself is the time in minutes.
 Furthermore, because these are daily values computed per time interval (FAAC11), the stored values are the averages over the period (FAAC9).
 The face-to-face indices (FAAC) have been described above, but the indices are not limited to these; other indices may be created from the face-to-face table (FC1A) and the body rhythm table (FC1B) and used for the analysis. Furthermore, although the face-to-face index extraction (CAC) stores the average over the period (FAAC9), the variance or the like may be used instead.
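A combined sketch of the face-to-face index extraction (CAC) steps above, again with hypothetical names and data layout. One assumption is made explicit in the code: the group size for the two-person / three-to-five-person / six-or-more-person split is taken to be the wearer plus the detected partners, which the patent does not state precisely.

```python
def face_indices(face_slots, rhythm_slots, active_hz=2.0):
    """face_slots: per-minute sets of partner IDs (None = NULL);
    rhythm_slots: per-minute body rhythm in Hz (None = NULL)."""
    idx = {"face": 0, "non_face": 0, "active": 0, "passive": 0,
           "2p": 0, "3to5p": 0, "6p": 0}
    for others, hz in zip(face_slots, rhythm_slots):
        if others is None:
            continue                       # NULL: counted as neither
        if len(others) == 0:
            idx["non_face"] += 1
            continue
        idx["face"] += 1
        if hz is not None:                 # 2 Hz threshold: active vs passive
            idx["active" if hz >= active_hz else "passive"] += 1
        group = len(others) + 1            # assumption: wearer + partners
        if group == 2:
            idx["2p"] += 1
        elif 3 <= group <= 5:
            idx["3to5p"] += 1
        elif group >= 6:
            idx["6p"] += 1
    return idx
```

With a 1-minute resolution each count is directly a number of minutes; the daily results would then be averaged over the period (FAAC9).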
 Organizational activity index extraction (CAD) is a process that computes indices from the face-to-face table (FC1A) and the body rhythm table (FC1B). FIG. 8 shows an example of the organizational activity index table (FAAD) that stores the indices obtained by organizational activity index extraction (CAD); the table stores indices per user.
 The table consists of a user ID (FAAD1) identifying the user and the organizational activity indices (average working hours (FAAD2), average arrival time (FAAD3), average leaving time (FAAD4), standard deviation of working hours (FAAD5), standard deviation of arrival time (FAAD6), and standard deviation of leaving time (FAAD7)). Period: July 1-July 31, 2009 (FAAD8) shows the period used for the analysis. Time resolution: 1 minute (FAAD9) is the analysis time resolution. Time interval: 1 day (FAAD10) specifies the range over which averages and the like are computed within the period (FAAD8).
 By determining, from the face-to-face table (FC1A) and the body rhythm table (FC1B), the start address and end address of organization dynamics data acquisition, the working hours, arrival time, and leaving time are computed from them. The start address means the position at which data begins to be stored (zero or more persons) after a stretch with no organization dynamics data (NULL). The end address means the position at which data ceases to be obtained (NULL) after a stretch in which organization dynamics data was being obtained (zero or more persons). In this embodiment, it is assumed that the name tag type sensor node is worn during working hours and removed when the user goes home.
 Although the face-to-face table (FC1A) and the body rhythm table (FC1B) do not store times explicitly, their entries are in chronological order, so a time can be computed from the obtained address and the time resolution (FAAD9).
 The working hours are obtained by subtracting the start address from the end address. The average working hours (FAAD2) is the average, over the period (FAAD8), of the per-time-interval (FAAD10) working hours. The standard deviation of working hours (FAAD5) is the standard deviation, over the period (FAAD8), of the per-time-interval (FAAD10) working hours.
 The average arrival time (FAAD3) is the average, over the period (FAAD8), of the per-time-interval (FAAD10) start addresses. The standard deviation of arrival time (FAAD6) is the standard deviation, over the period (FAAD8), of the per-time-interval (FAAD10) start addresses.
 The average leaving time (FAAD4) is the average, over the period (FAAD8), of the per-time-interval (FAAD10) end addresses. The standard deviation of leaving time (FAAD7) is the standard deviation, over the period (FAAD8), of the per-time-interval (FAAD10) end addresses.
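A sketch of how one day's start and end addresses yield arrival time, leaving time, and working hours; the names are hypothetical, and the slot index multiplied by the time resolution gives the time of day in minutes.

```python
def work_times(slots, minutes_per_slot=1):
    """Find the start/end addresses (first and last non-NULL slot) in one
    day's table row and derive arrival, leaving, and working minutes."""
    filled = [i for i, v in enumerate(slots) if v is not None]
    if not filled:
        return None                        # badge never worn that day
    start, end = filled[0], filled[-1]
    return {"arrival_min": start * minutes_per_slot,
            "leaving_min": end * minutes_per_slot,
            "working_min": (end - start) * minutes_per_slot}
```

The per-day values returned here would then be averaged (and their standard deviation taken) over the period (FAAD8).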
 From the face-to-face table (FC1A) and the body rhythm table (FC1B), it is possible to detect and exclude organization dynamics data in an error state. For example, suppose a name-tag sensor node (TR) is left behind when the wearer goes home, and it reacts to a nearby node as if a face-to-face meeting were taking place. No meeting actually occurs, but this cannot be determined from the infrared data alone. To improve accuracy, such misdetections must be eliminated. As a countermeasure, the face-to-face table (FC1A) is compared with the body rhythm table (FC1B) to judge whether each detected meeting is genuine: if a rhythm inconsistent with the node being worn is detected (a body rhythm of 0 Hz lasting for a long time), the face-to-face values for that time are not used.
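The countermeasure described above can be sketched as a simple masking pass. This is an assumption-laden illustration: the run-length threshold that defines "a long time" is not specified in the text and is chosen here arbitrarily.

```python
def mask_unworn(facing, rhythm, zero_run_threshold=6):
    """Return the facing data with slots set to None wherever the body
    rhythm stays at 0 Hz for at least `zero_run_threshold` consecutive
    slots, i.e. where the badge was likely not being worn."""
    n = len(facing)
    cleaned = list(facing)
    i = 0
    while i < n:
        if rhythm[i] == 0:
            j = i
            while j < n and rhythm[j] == 0:   # measure the 0 Hz run
                j += 1
            if j - i >= zero_run_threshold:   # long enough to distrust
                for k in range(i, j):
                    cleaned[k] = None
            i = j
        else:
            i += 1
    return cleaned
```

Short 0 Hz runs (momentary stillness) are kept, so only sustained inactivity discards the face-to-face counts.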
 Although the organization activity index (FAAD) has been described, the indices are not limited to these; other indices may be created from the face-to-face table (FC1A) and the body rhythm table (FC1B) and used for analysis. Furthermore, although the body rhythm index extraction process (CAB) stores the average and standard deviation over the period (FAAD10), the variance or other statistics may be used instead.
 Next, the indices obtained from the various questionnaires (GA to GE) and the objective organization indices (the productivity index and the accident/defect index) are described. These are computed from the values entered through the performance input (C). The personality questionnaire (GA) is a questionnaire that examines characteristics of thinking and behavior. As an example of such a questionnaire, the following reference may be consulted: V. Benet-Martinez and O. P. John, "Los Cinco Grandes across cultures and ethnic groups: Multitrait method analyses of the Big Five in Spanish and English," Journal of Personality and Social Psychology, 75, pp. 729-750, 1998.
 An example of the questionnaire is shown in Fig. 18. Users answer this questionnaire, and the results are stored as the personality index. As one example, the personality index (FAAE) table of Fig. 9 is described. It consists of a user ID (FAAE1) identifying the user and the personality traits: extraversion (FAAE2), agreeableness (FAAE3), conscientiousness (FAAE4), neuroticism (FAAE5), and openness (FAAE6). The answer date, July 15, 2009 (FAAE7), records when the questionnaire was answered.
 For each user, a personality value is stored for each of extraversion (FAAE2), agreeableness (FAAE3), conscientiousness (FAAE4), neuroticism (FAAE5), and openness (FAAE6).
 A higher extraversion (FAAE2) value indicates a more outgoing tendency. A higher agreeableness (FAAE3) value indicates a stronger tendency to accommodate others. A higher conscientiousness (FAAE4) value indicates a more conscientious tendency. A higher neuroticism (FAAE5) value indicates a more nervous tendency. A higher openness (FAAE6) value indicates greater openness to new knowledge and experiences. Fig. 17 is a table showing the effects associated with these values. Any questionnaire that reveals, through the user's thinking and behavior, the degree of adaptation to society may be used instead, and the table structure of the personality index (FAAE) may be changed accordingly.
 The organization information table (H) is described with reference to Fig. 10. The organization information table (H) stores indices concerning the organization and its members.
 Indices concerning productivity are stored in the productivity index (HA). The table consists of a user ID (HA1) identifying the user and the productivity indices: performance (HA2), contribution (HA3), number of program steps (HA4), number of sales cases (HA5), and sales (HA6). The period is July 1, 2009 to July 15, 2009 (HA7).
 If an index is expressed alphabetically, as the contribution (HA3) is, it is converted so that better results become larger values. If an index is defined per team, every member belonging to that team is assigned the same value. Any other index relating to productivity may also be used.
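The two normalizations just described can be sketched as follows. The grade scale itself is an assumption (the text only says better results must map to larger values), and the helper names are hypothetical.

```python
# Map letter grades so that better results become larger numbers.
GRADE_TO_SCORE = {"A": 3, "B": 2, "C": 1}

def expand_team_metric(team_of, team_value):
    """Give each user the value of the team it belongs to.
    team_of: {user_id: team_id}; team_value: {team_id: value}."""
    return {user: team_value[team] for user, team in team_of.items()}
```

A team-level sales figure, for instance, would be copied unchanged into every member's record before the correlation analysis.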
 Indices concerning accidents and defects are stored in the accident/defect index (HB). The table consists of a user ID (HB1) identifying the user and the accident/defect indices: days of absence (HB2), number of bugs (HB3), number of near misses (HB4), number of defects (HB5), and number of complaints (HB6). The period is July 1, 2009 to July 15, 2009 (HB7).
 If an index is defined per team, every member belonging to that team is assigned the same value. Any other index relating to accidents and defects may also be used.
 The leadership/teamwork questionnaire (GB) is a questionnaire that examines the work, cooperation, awareness, and behavior that members of a group undertake to achieve a common goal. As an example of such a questionnaire, the following reference may be consulted: Ryo Misawa, Kunihide Sashio, and Hiroyuki Yamaguchi, "Development of a teamwork measurement scale for nursing teams," Japanese Journal of Social Psychology, 24(3), pp. 219-232, 2009.
 An example of the questionnaire is shown in Fig. 21. Users answer this questionnaire, and the results are stored in the leadership/teamwork index. As one example, the leadership/teamwork index (FABC) table of Fig. 11 is described.
 The table consists of a user ID (FABC1) identifying the user and the indices: team orientation (FABC2), team leadership (FABC5), and team process (FABC8). The answer date, July 15, 2009 (FABC13), records when the questionnaire was answered. The leadership/teamwork questionnaire (GB) is evaluated from these three viewpoints: team orientation (FABC2), team leadership (FABC5), and team process (FABC8).
 Team orientation (FABC2) comprises job orientation (FABC3), which reflects attitudes and values toward the job, and interpersonal orientation (FABC4), which reflects the quality of interpersonal relationships within the team.
 Team leadership (FABC5) comprises job performance instructions (FABC6), which reflect accurate instructions and guidance given to members, and interpersonal consideration (FABC7), which reflects the maintenance and strengthening of interpersonal relationships.
 Team process (FABC8) comprises monitoring and mutual adjustment (FABC9), which represents mutually monitoring the progress of each member's work and making adjustments as needed; job analysis and clarification (FABC10), which represents clarifying job content by agreement among members; knowledge and information sharing (FABC11), which represents thoroughly disseminating knowledge and information; and feedback (FABC12), which represents feedback on mistakes and problems.
 Any questionnaire that reveals the work, cooperation, awareness, and behavior that group members undertake to achieve a common goal may be used instead, and the table structure of the leadership/teamwork index (FABC) may be changed accordingly.
 The employee fulfillment questionnaire (GC) is a questionnaire that examines the degree to which a person is in a healthy, happy, and flourishing state. As an example of such a questionnaire, the following reference may be consulted: Hills, P. and Argyle, M., "The Oxford Happiness Questionnaire: a compact scale for the measurement of psychological well-being," Personality and Individual Differences, 33, 1073-1082, 2002.
 An example of the questionnaire is shown in Fig. 20. Users answer this questionnaire, and the results are stored in the employee fulfillment index (FABD). As one example, the employee fulfillment index (FABD) table of Fig. 11 is described. The table consists of a user ID (FABD1) identifying the user and the index happiness (FABD2). The answer date, July 15, 2009 (FABD3), records when the questionnaire was answered. A higher happiness (FABD2) value indicates a greater degree of health, happiness, and flourishing.
 Any questionnaire that reveals the degree to which a person is healthy, happy, and flourishing may be used instead, and the table structure of the employee fulfillment index (FABD) may be changed accordingly.
 The stress/mental disorder questionnaire (GD) is a questionnaire that examines the degree of depressive psychological state. As an example of such a questionnaire, the following reference may be consulted: Radloff, L. S. (1977), "The CES-D scale: A self-report depression scale for research in the general population," Applied Psychological Measurement 1: 385-401.
 An example of the questionnaire is shown in Fig. 19. Users answer this questionnaire, and the results are stored in the stress/mental disorder index (FABE).
 As one example, the stress/mental disorder index (FABE) table of Fig. 11 is described. The table consists of a user ID (FABE1) identifying the user and the index depression (FABE2). The answer date, July 15, 2009 (FABE3), records when the questionnaire was answered. A higher depression (FABE2) value indicates a more depressive psychological state.
 Any questionnaire that reveals the degree of stress or of the depressive psychological state may be used instead, and the table structure of the stress/mental disorder index (FABE) may be changed accordingly.
 The organization activation questionnaire (GE) is a questionnaire for examining the subjective degree of effectiveness after an activation measure has been taken. An example of the questionnaire is shown in Fig. 22. Users answer this questionnaire, and the results are stored in the organization activation index (FABF). As one example, the organization activation index (FABF) table of Fig. 11 is described. The table consists of a user ID (FABF1) identifying the user and many indices, such as increased communication (FABE2) and a felt ease of speaking up (FABE3). There are as many of these indices as there are items in the organization activation questionnaire (GE). The answer date, July 15, 2009 (FABF5), records when the questionnaire was answered.
 Any questionnaire that reveals the subjective degree of effectiveness after an activation measure may be used instead, and the table structure of the organization activation index (FABF) may be changed accordingly.
 The correlation analysis (CAE) shown in Fig. 2C takes an organization as the analysis unit and correlates, for each member of that organization, objective variables such as stress and productivity with explanatory variables consisting of the organization dynamics indices that describe organizational activity. A distinctive feature of this correlation analysis is that it uses not only the member's own variables but also the variables of the members around that person.
 The member's own objective variable (CAE1) is a variable stored in the record for that member's user ID in the objective variables (FAB) of the analysis result table (F) or the objective variables (HA) of the organization information table (H).
 The member's own explanatory variable (CAE2) is a variable stored in the record for that member's user ID in the explanatory variables (FAA) of the analysis result table (F).
 "Surrounding" in the surrounding explanatory variables (CAE3) refers to the surrounding members connected to the person through face-to-face meetings. The surrounding explanatory variables (CAE3) are explanatory variables computed from those surrounding members.
 The processing of the surrounding explanatory variables (CAE3) is as follows. Surrounding members are chosen by the surrounding member selection (CAE31), which selects from the face-to-face matrix (FC1C) the members connected to the person. The present embodiment shows an example in which one-step members and two-step members are selected. Here, a one-step member is a member directly connected to the person, and the two-step members are the one-step members together with the members connected to them.
 The feature calculation (CAE32) is a process that computes the surrounding explanatory variables from the explanatory variables of the members chosen by the surrounding member selection (CAE31). For the one-step members, the surrounding explanatory variables are computed as, for example, the average or variance of the selected members' explanatory variables. The same calculation is performed for the two-step case.
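The surrounding member selection (CAE31) and feature calculation (CAE32) can be sketched as below. This is a minimal illustration that assumes the face-to-face matrix is represented as a dict mapping each user to the set of users met face to face; the function names are hypothetical.

```python
def one_step(matrix, me):
    """Members directly connected to `me`."""
    return set(matrix.get(me, ()))

def two_step(matrix, me):
    """One-step members plus the members connected to them."""
    members = one_step(matrix, me)
    for m in list(members):
        members |= one_step(matrix, m)
    members.discard(me)          # the person is not their own neighbor
    return members

def mean_and_variance(values):
    """Feature calculation over the selected members' explanatory variables."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    return mean, var
```

The one-step mean and two-step variance computed this way correspond to the FACB and FACC columns described below for the factor coefficient tables.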
 The correlation (CAE4) correlates the member's own objective variable (CAE1) with the member's own explanatory variable (CAE2) and with the surrounding explanatory variables (CAE3). The correlation results are stored in the factor coefficients (FAC) of the analysis result table (F). The period, July 1 to July 31, 2009 (FAC1), is the period of the organization dynamics indices used in the analysis.
 The factor coefficients (FAC) form part of a table storing the correlation coefficients obtained by the correlation (CAE4); an example is shown in Fig. 12. Self (FACA) holds the results of correlating the member's own objective variable (CAE1) with the member's own explanatory variable (CAE2). One-step average (FACB) holds the results of correlating the member's own objective variable (CAE1) with the surrounding explanatory variables (CAE3), specifically the average of the one-step members' explanatory variables. Two-step variance (FACC) holds the results of correlating the member's own objective variable (CAE1) with the surrounding explanatory variables (CAE3), specifically the variance of the two-step members' explanatory variables.
 Since the factor coefficients (FAC) simply hold the correlation results between the member and the surrounding members, results such as the variance of the one-step members' explanatory variables or the average of the two-step members' explanatory variables may also be stored.
 The table structure of self (FACA) is as follows. The objective variables (FACA1) on the vertical axis are the variables stored in the objective variables (FAB) of the analysis result table (F) or the objective variables (HA) of the organization information table (H); accordingly, items such as performance (FACA2) and contribution (FACA3), which belong to the productivity index (HA) of the organization information table (H), are entered there. The horizontal axis holds the variables stored in the explanatory variables (FAA) of the analysis result table (F); accordingly, items such as degree (FACA6), cohesion (FACA7), and two-step reach (FACA8), which belong to the network indices (FAAA) of the explanatory variables (FAA) of the analysis result table (F), are entered there. As described above, the personality index (FACA9) obtained from the questionnaire is also stored as an explanatory variable; Fig. 12 shows openness (FACA10) as an example of a personality index.
 The correlation between performance (FACA2) and cohesion (FACA7) among the network indices (FACA5) is 0.47, and the 0.01 enclosed in parentheses is the test result. The factor coefficients (FAC) use the P value, that is, the probability of obtaining a result at least as extreme as the one observed. The table is further hatched according to the test result so that cells can be distinguished at a glance: a dark color when P <= 0.001, a light color when 0.001 < P <= 0.01, and no color when P > 0.01. The one-step average (FACB) and two-step variance (FACC) tables have the same structure as self (FACA).
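The correlation step (CAE4) and the hatching rule can be sketched as follows. The Pearson correlation is standard; the p-value here uses the Fisher z approximation rather than the exact t test, so it is only indicative, and a statistics library would normally be used instead.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def approx_p_value(r, n):
    """Rough two-sided p-value via the Fisher z transform (approximation)."""
    z = math.atanh(r) * math.sqrt(n - 3)
    return math.erfc(abs(z) / math.sqrt(2))

def hatch_level(p):
    """Reproduce the shading rule described in the text."""
    if p <= 0.001:
        return "dark"
    if p <= 0.01:
        return "light"
    return "none"
```

The hatch level would then drive the cell color in the rendered factor coefficient table.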
 The correlation analysis (CAE) compares variables using correlation, but any other technique may be used as long as it can find useful factors. What matters for the factor coefficients (FAC) is that they store the values obtained by the correlation analysis (CAE); as long as this is satisfied, a table structure different from the one described here may be used.
 The factor selection (CAF) is a process that selects useful factors from the coefficients obtained by the correlation analysis (CAE). From the factor coefficients (FAC), those with large coefficient (correlation) values are selected. The selection criterion need not be the correlation value alone; factors with good test results (for example, P values) or factors that cover the range of organizational activities may also be selected.
 In the model drawing (JA), a model is drawn using the objective and explanatory variables whose coefficient values were found to be high by the factor selection (CAF). An example is shown as the scientific management knowledge model (KA). The period, July 1 to July 31, 2009 (KA10), is the period of the organization dynamics indices used in the analysis.
 Stress/mental disorder risk (KA1) is selected as the objective variable (KA11), and the surroundings' extraversion (KA2), the person's own extraversion (KA3), the person's own working hours average (KA4), and others (KA5) are selected as the explanatory variables (KA12). The objective variable (KA11) and the explanatory variables (KA12) are connected by lines. In the layout, the objective variable (KA11) is placed on the right and the explanatory variables (KA12) on the left, with the explanatory variables (KA12) arranged from top to bottom in descending order of factor coefficient. Others (KA5) aggregates the factors with small coefficients. For each explanatory variable (KA12), the corresponding factor coefficient is placed as shown in (KA6) to (KA9). If similar explanatory variables are selected by the factor selection (CAF), they may be merged.
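The layout rule just described — explanatory variables sorted in descending order of factor coefficient, with small coefficients merged into an "Other" entry — can be sketched as below. The cutoff that decides what counts as "small" is an assumption, since the text does not specify one.

```python
def layout_factors(coeffs, cutoff=0.1):
    """Order explanatory variables for the model drawing (JA).
    coeffs: {variable_name: factor_coefficient}.
    Returns [(name, coefficient), ...] sorted by descending magnitude,
    with sub-cutoff factors aggregated into a final ("Other", sum) entry."""
    major = sorted(
        ((name, c) for name, c in coeffs.items() if abs(c) >= cutoff),
        key=lambda kv: abs(kv[1]), reverse=True)
    minor = [c for c in coeffs.values() if abs(c) < cutoff]
    if minor:
        major.append(("Other", sum(minor)))
    return major
```

The returned list maps directly onto the top-to-bottom placement of the explanatory variables in the drawn model.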
 The modeling analysis reveals which organizational activities are influential factors for stress and productivity. Just as the problems organizations face differ from one organization to another, so do the influential factors and the factor coefficients.
 In this way, for an organization consisting of multiple members taken as the analysis unit, a correlation analysis is performed with each member's subjective or objective indices, such as stress or productivity, as objective variables, and with the comprehensive organization dynamics indices, such as the body rhythm indices and face-to-face indices, as explanatory variables. This makes it possible to identify the factors behind the subjective or objective indices in the organization under analysis, and the resulting model identifies specifically which organizational behaviors should be improved.
 Although this embodiment uses the organization dynamics indices as explanatory variables, indices obtained from the questionnaires (G) may also be used as explanatory variables. Furthermore, an organization dynamics index may be used as the objective variable.
 In Embodiment 1, the personality index (FAAE) was obtained through the personality questionnaire (GA). In the present embodiment, past personality questionnaires (GA) are used as training data to build a model, and the personality index is then obtained from the current organization dynamics indices and that model.
 Fig. 13 shows the two stages: personality index extraction (CA1), which learns from past personality questionnaires (GA) to obtain the personality index coefficients (FAE), the coefficients of the model, and personality index conversion (CA2), which obtains the current personality index (FAF) from the current organization dynamics indices and the personality index coefficients (FAE).
 First, the personality index extraction (CA1) is described. The personality index extraction (CA1) obtains the personality index coefficients (FAE) by performing personality index coefficient extraction (CA1A) using past personality questionnaires (GA) together with the organization dynamics indices, namely the network indices (FAAA), body rhythm indices (FAAB), face-to-face indices (FAAC), and activity indices (FAAD).
 The steps up to obtaining the network indices (FAAA), body rhythm indices (FAAB), face-to-face indices (FAAC), activity indices (FAAD), and personality index (FAAE) are the same as in Embodiment 1, so their description is omitted. In the personality index coefficient extraction (CA1A), a multiple regression analysis is performed with the personality index (FAAE) as the objective variable and the network indices (FAAA), body rhythm indices (FAAB), face-to-face indices (FAAC), and activity indices (FAAD) as explanatory variables, yielding the coefficients and constant term of the multiple regression equation.
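The coefficient extraction (CA1A) amounts to an ordinary least-squares fit. Below is a minimal, stdlib-only sketch that solves the normal equations by Gaussian elimination; a real deployment would use a numerical library, and the data shapes are assumptions.

```python
def fit_multiple_regression(X, y):
    """X: list of rows of explanatory variables; y: objective variable.
    Returns [b0, b1, ..., bk], where b0 is the constant term (intercept)."""
    rows = [[1.0] + list(r) for r in X]      # prepend the intercept column
    k = len(rows[0])
    # Normal equations: (A^T A) b = A^T y
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    aty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r_: abs(ata[r_][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r_ in range(col + 1, k):
            f = ata[r_][col] / ata[col][col]
            for c in range(col, k):
                ata[r_][c] -= f * ata[col][c]
            aty[r_] -= f * aty[col]
    # Back substitution
    b = [0.0] * k
    for i in range(k - 1, -1, -1):
        b[i] = (aty[i] - sum(ata[i][j] * b[j] for j in range(i + 1, k))) / ata[i][i]
    return b
```

The fitted coefficients and intercept correspond to the values stored in the personality index coefficient (FAE) tables of Figs. 14 and 15.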
 The personality coefficients (FAE) of the analysis result table (F) shown in Figs. 14 and 15 are tables summarizing the coefficients and constant term of the multiple regression equation for each personality index (FAAE). Fig. 14 summarizes the coefficients and constant term of the multiple regression equation for the organization as a whole, and Fig. 15 summarizes them for each user.
 The vertical axis in Fig. 14 lists the objective variables, the personality traits (FAE1): extraversion (FAE2), agreeableness (FAE3), conscientiousness (FAE4), neuroticism (FAE5), and openness (FAE6). The horizontal axis lists the explanatory variables, storing the multiple regression coefficients for indices such as the network indices (FAE8) and the personality index (FAE12). The intercept (FAE14) is the constant term of the multiple regression equation. The same applies to the vertical and horizontal axes of Fig. 15. Multiple regression analysis is used in the personality index extraction (CA1), but any other learning technique may be used instead.
 Next, the personality index conversion (CA2) is described. The personality index conversion (CA2) obtains the current personality index (FAG) by using the personality index coefficients (FAE) obtained by the personality index extraction (CA1) together with the current values among the organization dynamics indices.
 The personality index conversion (CA2A) uses the coefficients and constant term of the multiple regression equation stored in the personality index coefficients (FAE), together with the current values of the organization dynamics indices, namely the network indices (FAAA), body rhythm indices (FAAB), face-to-face indices (FAAC), and activity indices (FAAD). By substituting these into the multiple regression equation, the personality index is obtained from the organization dynamics indices.
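Applying the stored coefficients is a single linear evaluation. The sketch below assumes the coefficients and the current indices are both keyed by explanatory-variable name; the names themselves are illustrative only.

```python
def estimate_index(coeffs, intercept, current_indices):
    """Evaluate the multiple regression equation on the current
    organization dynamics indices.
    coeffs / current_indices: dicts keyed by explanatory-variable name."""
    return intercept + sum(coeffs[k] * current_indices[k] for k in coeffs)
```

Running this for each personality trait would fill one row of the estimated personality index (FAG) table described next.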
 解析結果テーブル(F)の推定パーソナリティ指標(FAG)はパーソナリティ指標変換(CA2A)によって求めたパーソナリティ指標である。テーブルの1例を図16に示す。テーブルの形式は解析結果テーブル(F)のパーソナリティ指標(FAAE)と同じであるため割愛する。日時:2009年8月15日 14:32(FAG7)は、分析に用いた現在の組織ダイナミクス指標の日時と時間を示している。 The estimated personality index (FAG) of the analysis result table (F) is a personality index obtained by personality index conversion (CA2A). An example of the table is shown in FIG. The table format is omitted because it is the same as the personality index (FAAE) of the analysis result table (F). Date: August 15, 2009 14:32 (FAG 7) shows the date and time of the current tissue dynamics indicator used in the analysis.
 By performing multiple regression analysis on past personality indexes and organization dynamics indexes to build a model, the personality index can be obtained from the current organization dynamics indexes and the model without administering a questionnaire. When a questionnaire is used, the frequency of administration may be limited, for example to once a month; according to this embodiment there is no such limitation, and the calculation frequency can be changed as appropriate.
 Although this embodiment computes the personality index, similar processing may be performed for the other questionnaires (the leadership/teamwork questionnaire (GB), employee motivation/fulfillment questionnaire (GC), stress/mental health questionnaire (GD), organization activation questionnaire (GE), and so on).
 In the third embodiment, a network diagram is generated that can display users and the face-to-face groups they belong to at the same time. A conventional network diagram cannot show face-to-face groups. This problem is solved by displaying both users and face-to-face groups as points (nodes) of the network diagram.
 FIG. 23 shows the processing procedure for displaying users and face-to-face groups simultaneously. A model is built by face-to-face group network modeling analysis (CB), drawing is performed by face-to-face group network diagram drawing (JB), and the drawn result is the face-to-face group network diagram (KB). This processing can be performed within the same framework as the first embodiment: face-to-face group network modeling analysis (CB) is executed by the control unit (ASCO) of the application server (AS), and face-to-face group network diagram drawing (JB) is executed by the display (J) of the client (CL).
 The processing up to obtaining the face-to-face table (FC1A) of the analysis result table (F) is the same as in the first embodiment, so its description is omitted. Face-to-face time by face-to-face group (CBA) is a process that determines each face-to-face group and its face-to-face time. Since the face-to-face table (FC1A) records which members are meeting face to face, the groups and times are determined from it. The result is stored in the face-to-face time list by partner (FBA) of the analysis result table (F). The table consists of a face-to-face group (FBA1) and a face-to-face time (FBA2).
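The aggregation from per-slot face-to-face records into per-group totals can be sketched as follows. The slot representation is a simplified stand-in for the face-to-face table (FC1A), not the patent's actual storage format.

```python
from collections import Counter

def face_time_by_group(slots, resolution_minutes=1):
    """Aggregate per-slot face-to-face records into
    {face-to-face group: total minutes}, the content of a list like (FBA).
    `slots` is a chronological list of sets of member IDs that were
    detected meeting in each time slot."""
    counter = Counter()
    for group in slots:
        if len(group) >= 2:                      # a pair or larger group
            counter[tuple(sorted(group))] += resolution_minutes
    return dict(counter)

slots = [{"002", "003"}, {"002", "003"}, {"002", "003", "005"}, {"004"}]
times = face_time_by_group(slots)
```

Single-member slots contribute no face-to-face group, so they are skipped.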
 Period: July 1-July 31, 2009 (FBA3) is the period used for the face-to-face table (FC1A). Days: 31 days (FBA4) is the number of days in the period (FBA3). Actual days: 21 days (FBA5) is the number of business days in the period (FBA3). Face-to-face determination time: 3 minutes/day (FC1C6) is the threshold for determining that a face-to-face contact occurred. What matters for the face-to-face time list by partner (FBA) is that it stores the face-to-face status of each face-to-face group; as long as this requirement is satisfied, a table configuration different from the one used for the face-to-face time list by partner (FBA) may be adopted.
 Face-to-face group table generation (CBB) is a process that combines users and face-to-face groups from the face-to-face time list by partner (FBA).
 Face-to-face edge pruning (CBB1) is a process that removes entries with small face-to-face times (FBA2) from the face-to-face time list by partner (FBA). For example, the product of the face-to-face determination time (FBA6) and the actual number of days (FBA5) may be used as a threshold, keeping only entries that exceed it. Table generation (CBB2) outputs the face-to-face group table (FBB) from the list produced by face-to-face edge pruning (CBB1).
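The pruning rule described here can be sketched in a few lines; the group tuples and minute values are illustrative.

```python
def prune_face_time_list(face_times, threshold_minutes):
    """Remove face-to-face groups whose total time does not exceed a
    threshold, as in edge pruning (CBB1). The text suggests using
    face-to-face determination time (FBA6) x actual days (FBA5)."""
    return {group: t for group, t in face_times.items()
            if t > threshold_minutes}

# 3 minutes/day (FBA6) over 21 business days (FBA5) -> 63 minutes
threshold = 3 * 21
times = {("002", "003"): 120, ("002", "005"): 40, ("003", "004"): 63}
kept = prune_face_time_list(times, threshold)
```

Note that an entry exactly at the threshold (63 minutes here) is removed, since only values larger than the threshold are kept.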
 The face-to-face group table (FBB) consists of a face-to-face time list by group (FBBA) and a connection matrix by group (FBBB). FIG. 24 shows an example. The face-to-face time list by group (FBBA) determines the sizes of the nodes (points) in the network diagram. It consists of a face-to-face group (FBBA1) and a face-to-face time (FBBA2). Period: July 1-July 31, 2009 (FBBA3) indicates the period used for the face-to-face table (FC1A).
 The connection matrix by group (FBBB) specifies the edges (lines) of the network diagram. Its vertical and horizontal axes list both users and face-to-face groups. Period: July 1-July 31, 2009 (FBBB3) indicates the period used for the face-to-face table (FC1A).
 In this matrix, 1 is entered to connect a face-to-face group and its constituent members with an edge (line). Furthermore, for face-to-face groups in an inclusion relationship (FBBB4), 1 is entered to connect the group with more members to the group with fewer members with an edge (line) (FBBB5). Members of the larger group who are also included in the smaller group are then not connected to the larger group with an edge (line) (FBBB6).
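The edge rule just described (group-to-member edges, an edge from a contained group to its containing group, and suppression of member edges already covered by the smaller group) can be sketched as a set of edges rather than a 0/1 matrix. The two groups below are hypothetical.

```python
def build_connection_edges(groups):
    """Build the edge set of a user/face-to-face-group network following
    the rule described for the connection matrix (FBBB).
    `groups` is a list of frozensets of member IDs."""
    edges = set()
    for g in groups:
        # Members of g already covered by a strictly smaller contained group.
        covered = set()
        for h in groups:
            if h < g:                            # h is included in g
                edges.add((tuple(sorted(h)), tuple(sorted(g))))
                covered |= h
        # Connect only the members not covered by a contained group.
        for member in g - covered:
            edges.add((member, tuple(sorted(g))))
    return edges

groups = [frozenset({"002", "003"}), frozenset({"002", "003", "005"})]
edges = build_connection_edges(groups)
```

Here the pair (002, 003) links to the triple instead of 002 and 003 linking to the triple individually; only 005 gets a direct member edge to the triple.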
 What matters for the face-to-face group table (FBB) is that it stores the face-to-face status of members and face-to-face groups; as long as this requirement is satisfied, a table configuration different from the one used for the face-to-face group table (FBB) may be adopted.
 The user ID table (IA) of the user/place information table (I) will now be described. FIG. 25 shows an example of this table. The user ID table (IA) associates a user ID with information such as a name and a team name. It consists of a user ID (IA1), user name (IA2), team name (IA3), job title (IA4), and organization (IA5).
 In face-to-face group network diagram drawing (JB), the member/face-to-face group data obtained by face-to-face group table generation (CBB) is drawn as a face-to-face group network diagram. An example is shown as the face-to-face group network diagram (KB) in FIG. 23. Period: July 1-July 31, 2009 (KX4) indicates the period used for the face-to-face table (FC1A).
 Members are shown as square points (nodes), and face-to-face groups as round points (nodes). Member nodes are placed on the outside and face-to-face group nodes on the inside. The size of each node is determined by the face-to-face time list by group (FBBA). Nodes whose entry in the connection matrix by group (FBBB) is 1 are connected by lines (edges). Each node is labeled, as in Watanabe (KB2), Ito (KB1), or Watanabe and Ito (KB3), by looking up the user name (IA2) from the user ID (IA1) in the user ID table (IA).
 By generating a network diagram that can display users and face-to-face groups simultaneously in this way, it becomes possible to see on the network diagram which members are actually working together.
 In the fourth embodiment, a network diagram is generated that can display users and places at the same time. A conventional network diagram cannot show the places where members meet. This problem is solved by displaying both users and places as points (nodes) of the network diagram.
 FIG. 26 shows the processing procedure for displaying users and places simultaneously. A model is built by place network modeling analysis (CC), drawing is performed by place network diagram drawing (JC), and the drawn result is the place network diagram (KC).
 This processing can be performed within the same framework as the first embodiment: place network modeling analysis (CC) is executed by the control unit (ASCO) of the application server (AS), and place network diagram drawing (JC) is executed by the display (J) of the client (CL).
 Place table processing (C1D) compiles, from the infrared data of the organization dynamics data, the members' presence at each place in chronological order for each fixed period. An infrared terminal that emits an infrared signal is installed at each place; when a name-tag sensor node (TR) detects the infrared signal installed at a place, its wearer is determined to be staying at that place.
 The place ID table (IB) of the user/place information table (I) will now be described. FIG. 25 shows an example of this table.
 The place ID table (IB) associates a place ID with a place name and an infrared ID. It consists of a place ID (IB1), place name (IB2), and infrared ID (IB3). The place name (IB2) is the name of the place, and the infrared ID (IB3) is the ID of the infrared terminal installed at the place identified by the place ID (IB1). Multiple infrared terminals may be installed at one place; in that case, multiple infrared IDs are listed under the infrared ID (IB3).
 The extracted result is stored in the place table (FC1D) of the analysis result table (F). FIG. 27 shows an example of the place table (FC1D). This table stores one day (24 hours) of data in chronological order, with one place per record and a time resolution of one minute (FC1D3). In the place table (July 1, 2009) (FC1D3), the vertical axis is the place ID (FC1D1) identifying the place, and the horizontal axis is the resolution time (FC1D2), that is, the time at the given time resolution.
 The presence at a place at a given time can be read simply by looking up the cell at the corresponding place ID (FC1D1) and resolution time (FC1D2). For example, at the place whose ID is 00A at 10:02 on 2009/7/1, two people are present, and the members present are 002 and 003. Each cell contains one of the following: the number of people present, the user IDs of those present, or NULL. NULL is stored in the place table (FC1D) when no infrared data exists in the organization dynamics data for the corresponding user at that time.
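A minimal sketch of building such a table from infrared detections follows. The detection map and the use of Python's None for NULL are assumptions for illustration, not the patent's storage format.

```python
def build_place_table(detections, place_ids, minutes_per_day=1440):
    """Build a one-day place table like (FC1D) at one-minute resolution:
    for each place and each minute slot, the set of user IDs detected
    there, or None (NULL) when no infrared data exists.
    `detections` maps (place_id, minute_of_day) -> set of user IDs."""
    return {place: [detections.get((place, m)) for m in range(minutes_per_day)]
            for place in place_ids}

# 10:02 is minute 10 * 60 + 2 = 602 of the day.
detections = {("00A", 602): {"002", "003"}}
table = build_place_table(detections, ["00A", "00B"])
cell = table["00A"][602]
```

Reading a cell is then a plain double index, mirroring the lookup by place ID (FC1D1) and resolution time (FC1D2) described above.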
 Since the place table (FC1D) is generated per day and per time resolution, data for the same date forms separate tables when the time resolution differs. For example, (FC1D4) and (FC1D5) cover the same date (July 2, 2009) but, because their time resolutions differ, they are separate tables.
 What matters for the place table (FC1D) is that it stores the number of people present and their user IDs; as long as this requirement is satisfied, a table configuration different from the one used for the place table (FC1D) may be adopted.
 Next, place/user stay time (CCA) will be described. This is a process that determines the members and groups present at each place.
 Since the place table (FC1D) records, for each place, the names of the members present in chronological order, this information is used to tabulate the groups present and their times. The tabulated result is the face-to-face time list by partner (FCA) of the analysis result table (F). With the place ID (FCA1) as one record, it lists the face-to-face time for each member/present group (FCA2). Period: July 1-July 31, 2009 (FCA3) is the period used for the place table (FC1D). Days: 31 days (FCA4) is the number of days in the period (FCA3). Actual days: 21 days (FCA5) is the number of business days in the period (FCA3). Face-to-face determination time: 1 minute/day (FCA6) is the threshold for determining that someone was present.
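Tabulating per-member stay time from such a place table can be sketched as follows; the table layout is the simplified one from this illustration (per-slot member sets, None for NULL), not the patent's.

```python
from collections import defaultdict

def stay_time_by_place(place_table, resolution_minutes=1):
    """Tabulate, per place, each member's total stay time from a place
    table like (FC1D), the core of place/user stay time (CCA).
    `place_table` maps place_id -> list of per-slot member sets (None = NULL)."""
    result = defaultdict(lambda: defaultdict(int))
    for place, slots in place_table.items():
        for members in slots:
            for m in members or ():              # skip NULL slots
                result[place][m] += resolution_minutes
    return {p: dict(d) for p, d in result.items()}

place_table = {"00A": [{"002", "003"}, {"002"}, None],
               "00B": [None, {"004"}, {"004"}]}
stay = stay_time_by_place(place_table)
```

Grouping the same slots by the whole member set instead of by individual member would yield the group entries of (FCA2) in the same way.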
 Face-to-face group table generation (CCB) is a process that combines users and face-to-face groups from the face-to-face time list by partner (FCA).
 Place-edge pruning (CCB1) is a process that removes entries with small face-to-face times from the face-to-face time list by partner (FCA). As an example of place-edge pruning, the product of the face-to-face determination time (FCA6) and the actual number of days (FCA5) may be used as a threshold, keeping only entries that exceed it. Place-edge pruning (CCB1) is performed because the network diagram would otherwise contain too many points (nodes). Any other method that achieves this may be used instead.
 Table generation (CCB2) outputs the user/place table (FCB) from the list produced by place-edge pruning (CCB1). The user/place table (FCB) consists of a face-to-face time list by place (FCBA) and a user/place matrix (FCBB). FIG. 28 shows an example.
 The face-to-face time list by place (FCBA) determines the sizes of the nodes (points) in the network diagram. It consists of a place (FCBA1), a face-to-face group (FCBA2), and a face-to-face time (FCBA3). Period: July 1-July 31, 2009 (FCBA4) indicates the period used for the place table (FC1D).
 The user/place matrix (FCBB) specifies the edges (lines) of the network diagram. Its vertical and horizontal axes list users and places. Period: July 1-July 31, 2009 (FCBB3) indicates the period used for the place table (FC1D). In this matrix, 1 is entered to connect a place and a member present there with an edge (line). What matters for the user/place table (FCB) is that it stores the face-to-face status between places and members; as long as this requirement is satisfied, a table configuration different from the one used for the user/place table (FCB) may be adopted.
 In place network diagram drawing (JC), the member/place data obtained by user/place table generation (CCB) is drawn as a place network diagram. An example is shown as the place network diagram (KC) in FIG. 26. Period: July 1-July 31, 2009 (KCA4) indicates the period used for the place table (FC1D).
 Members are shown as square points (nodes), and places as round points (nodes). Member nodes are placed on the outside and place nodes on the inside. The size of each node is determined by the face-to-face time list by place (FCBA). When a place (FCBA1) hosts multiple face-to-face groups, its node is displayed as a pie chart. Nodes whose entry in the user/place matrix (FCBB) is 1 are connected by lines (edges). Nodes are labeled as in Watanabe (KC2), Ito (KC1), or the conference room (KC3). The conference room (KC3) node is displayed as a pie chart divided by the proportion of face-to-face time, showing the users or face-to-face groups involved (for example, Ito (KC5), or Watanabe and Ito (KC4)).
 By generating a network diagram that can display users and places simultaneously in this way, it becomes possible to see on the network diagram which places are being used.
 The fifth embodiment starts from the observation that numbers such as team face-to-face time convey little sense of the actual situation, so they are unlikely to provide effective feedback to users. Therefore, to make it clear at a glance which places are actually being used, the data is mapped onto a floor plan to improve the sense of reality.
 FIG. 29 shows the processing procedure for mapping teams onto places. A model is built by place/team modeling analysis (CD). For display by number of people, drawing is performed by place map drawing by number of people (JDA), and the drawn result is the place map by number of people (KDA); for display inside/outside the team, drawing is performed by place team inside/outside map drawing (JDB), and the drawn result is the place team inside/outside map (KDB).
 This processing can be performed within the same framework as the first embodiment: place/team modeling analysis (CD) is executed by the control unit (ASCO) of the application server (AS), and place map drawing by number of people (JDA) and place team inside/outside map drawing (JDB) are executed by the display (J) of the client (CL).
 The processing up to obtaining the place table (FC1D) of the analysis result table (F) is the same as in the fourth embodiment, so its description is omitted. Face-to-face time by place and number of people (CDA) is a process that determines the face-to-face time at each place broken down by the number of people present. Since the place table (FC1D) records the members present at each place in chronological order, the time spent at each headcount is determined from it. The result is stored in the place face-to-face time by number of people (FDA) of the analysis result table (F). The table consists of a place ID (FDA1) and a number of people (FDA2).
 With the place ID (FDA1) as one record, it lists the face-to-face time for each number of people (FDA2). The number of people (FDA2) is classified into 1 person (FDA6), 2 people (FDA7), 3-5 people (FDA8), and 6 or more people (FDA9). Period: July 1-July 31, 2009 (FDA3) is the period used for the place table (FC1D). Days: 31 days (FDA4) is the number of days in the period (FDA3). Actual days: 21 days (FDA5) is the number of business days in the period (FDA3).
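The headcount binning and per-bin time totals can be sketched directly; the slot data below is invented for the example.

```python
def headcount_bin(n):
    """Classify a headcount into the bins used by (FDA2):
    1 person, 2 people, 3-5 people, 6 or more."""
    if n == 1:
        return "1"
    if n == 2:
        return "2"
    if 3 <= n <= 5:
        return "3-5"
    return "6+"

def face_time_by_headcount(slots, resolution_minutes=1):
    """Sum, per headcount bin, the time a place was occupied.
    `slots` is one place's row of a place table: per-slot member sets,
    with None standing in for NULL."""
    totals = {"1": 0, "2": 0, "3-5": 0, "6+": 0}
    for members in slots:
        if members:
            totals[headcount_bin(len(members))] += resolution_minutes
    return totals

slots = [{"a"}, {"a", "b"}, {"a", "b", "c", "d"}, None, {"a", "b"}]
totals = face_time_by_headcount(slots)
```

Running this per place yields exactly the (FDA1) record layout: one row per place, one column per headcount bin.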
 Next, the place list (IC) of the user/place information table (I) will be described with reference to FIG. 30. The place list (IC) consists of a place map image (ICA), which is a floor plan, and place coordinates (ICB), which record the coordinates of each place. The place map image (ICA) is a floor plan image. The place coordinates (ICB) consist of a place ID (ICB1) and coordinate values (ICB2). With the place ID (ICB1) as one record, it lists the X coordinate value (ICB3) and the Y coordinate value (ICB4) of the coordinate values (ICB2).
 Place map drawing by number of people (JDA) is a process that maps the place face-to-face time by number of people (FDA) onto the floor plan and draws it. The drawn result is the place map by number of people (KDA) in FIG. 31. The face-to-face time by number of people at each place is plotted as a pie chart at the corresponding location on the floor plan. The central angles of the pie chart vary with the proportions of face-to-face time by number of people, and a legend for the numbers of people is displayed. The size of each pie chart is the sum of the face-to-face times across headcounts. Near each location on the floor plan, the place name (IB2) from the place ID table (IB) is displayed. Period: July 1-July 31, 2009 (KDA1) is the period used for the place table (FC1D).
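The pie-chart geometry described above reduces to converting per-category times into central angles, with the total time setting the chart's size. A minimal sketch:

```python
def pie_angles(times):
    """Convert per-category face-to-face times into central angles
    (degrees) for a floor-plan pie chart: each category's share of the
    total time determines its wedge angle."""
    total = sum(times.values())
    if total == 0:
        return {k: 0.0 for k in times}
    return {k: 360.0 * v / total for k, v in times.items()}

# Hypothetical per-headcount minutes for one place:
angles = pie_angles({"1": 30, "2": 60, "3-5": 30, "6+": 0})
```

The same conversion serves the team inside/outside charts, with the two categories "inside" and "outside" in place of the headcount bins.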
 Place team inside/outside face-to-face time (CDB) is a process that determines, for each place, the face-to-face time spent inside and outside the team. Since the place table (FC1D) records the members present at each place in chronological order, each contact is checked against the user ID table (IA) to judge whether it occurred within the team or outside it, and the respective face-to-face times are determined. The result is stored in the place team inside/outside face-to-face time (FDB) of the analysis result table (F). The table consists of a place ID (FDB1) and inside/outside (FDB2). With the place ID (FDB1) as one record, it lists the face-to-face time for inside/outside (FDB2). Inside/outside (FDB2) is classified into within the team (FDB6) and outside the team (FDB7). Period: July 1-July 31, 2009 (FDB3) is the period used for the place table (FC1D). Days: 31 days (FDB4) is the number of days in the period (FDB3). Actual days: 21 days (FDB5) is the number of business days in the period (FDB3).
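One possible reading of the inside/outside classification is sketched below: a slot counts as within-team only when everyone present belongs to the viewer's team. The exact predicate is not spelled out in this passage, so treat it as an assumption; only the overall shape (check members against the user ID table's team column, split the time two ways) is taken from the text.

```python
def team_face_time(slots, user_team, my_team, resolution_minutes=1):
    """Split one place's occupancy time into within-team and outside-team,
    as in (CDB)/(FDB). `user_team` maps user ID -> team name, standing in
    for the team name column (IA3) of the user ID table (IA)."""
    inside = outside = 0
    for members in slots:
        if not members:
            continue                              # NULL slot: no data
        if all(user_team.get(m) == my_team for m in members):
            inside += resolution_minutes          # everyone is a teammate
        else:
            outside += resolution_minutes         # at least one outsider
    return {"inside": inside, "outside": outside}

user_team = {"002": "A", "003": "A", "004": "B"}
slots = [{"002", "003"}, {"002", "004"}, None, {"004"}]
split = team_face_time(slots, user_team, "A")
```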
 Place team inside/outside map drawing (JDB) is a process that maps the place team inside/outside face-to-face time (FDB) onto the floor plan and draws it. The drawn result is the place team inside/outside map (KDB) in FIG. 31. The face-to-face time inside and outside the team at each place is plotted as a pie chart at the corresponding location on the floor plan. The central angles of the pie chart vary with the proportions of face-to-face time inside and outside the team, and a legend for inside/outside the team is displayed. The size of each pie chart is the sum of the face-to-face times inside and outside the team. Near each location on the floor plan, the place name (IB2) from the place ID table (IB) is displayed. Period: July 1-July 31, 2009 (KDB3) is the period used for the place table (FC1D).
 By mapping team information onto places on the floor plan in this way, the sense of reality can be improved.
 Even when numbers such as face-to-face group face-to-face time are shown, they convey little sense of the actual situation, so they are unlikely to provide effective feedback to users. Therefore, in the sixth embodiment, to make it clear at a glance which places are actually being used, the data is mapped onto a floor plan to improve the sense of reality.
 FIG. 32 shows the processing procedure for mapping face-to-face groups onto places. A model is built by place/face-to-face group modeling analysis (CE). For face-to-face groups, drawing is performed by place face-to-face group map drawing (JEA), and the drawn result is the place face-to-face group map (KEA); for users, drawing is performed by place user map drawing (JEB), and the drawn result is the place user map (KEB).
 This processing can be performed within the same framework as the first embodiment: place/face-to-face group modeling analysis (CE) is executed by the control unit (ASCO) of the application server (AS), and place face-to-face group map drawing (JEA) and place user map drawing (JEB) are executed by the display (J) of the client (CL). The processing up to obtaining the place table (FC1D) of the analysis result table (F) is the same as in the fourth embodiment, so its description is omitted.
 Place/face-to-face group face-to-face time (CEA) is a process that determines the face-to-face time of each face-to-face group at each place. Since the place table (FC1D) records the members present at each place in chronological order, the time each face-to-face group spends meeting is determined from it. The result is stored in the place face-to-face group face-to-face time (FEA) of the analysis result table (F). The table consists of a place ID (FEA1) and a user/face-to-face group (FEA2). With the place ID (FEA1) as one record, it lists the face-to-face time for each user/face-to-face group (FEA2). The user/face-to-face group (FEA2) field stores the user IDs of those present and their face-to-face time. Period: July 1-July 31, 2009 (FEA3) is the period used for the place table (FC1D). Days: 31 days (FEA4) is the number of days in the period (FEA3). Actual days: 21 days (FEA5) is the number of business days in the period (FEA3).
 Place face-to-face group map drawing (JEA) is a process that maps the place face-to-face group face-to-face time (FEA) onto the floor plan and draws it. The drawn result is the place face-to-face group map (KEA) in FIG. 33. The face-to-face time of each face-to-face group at each place is plotted as a circle at the corresponding location on the floor plan. The diameter of the circle varies with the face-to-face time of the group. When multiple circles fall at the same location, they are shifted so as not to overlap. The names of the members of the face-to-face group are displayed by selecting the user names (IA2) from the user ID table (IA). Near each location on the floor plan, the place name (IB2) is selected from the place ID table (IB) and displayed. Period: July 1-July 31, 2009 (KEA3) is the period used for the place table (FC1D).
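The circle plotting rule (diameter grows with face-to-face time, co-located circles shifted apart) can be sketched without a graphics library; the square-root scaling and the fixed horizontal shift are arbitrary choices for illustration, not specified by the text.

```python
def circle_layout(entries, base_xy, scale=0.1, shift=5.0):
    """Lay out circles for face-to-face groups at one floor-plan location:
    radius grows with face-to-face time (sqrt scaling so area ~ time),
    and circles at the same spot are offset so they do not overlap.
    `entries` is a list of (label, face_to_face_minutes); the base
    coordinates would come from the place coordinates (ICB)."""
    circles = []
    x, y = base_xy
    for i, (label, minutes) in enumerate(entries):
        radius = scale * minutes ** 0.5
        circles.append({"label": label, "x": x + i * shift, "y": y,
                        "radius": radius})
    return circles

circles = circle_layout([("Ito", 100), ("Watanabe, Ito", 400)], (50.0, 80.0))
```

A real implementation would shift by at least the sum of adjacent radii; the constant offset here only illustrates the "do not overlap" requirement.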
 Place/user face-to-face time (CDB) is a process that computes each user's face-to-face time at a place. Because the place table (FC1D) records the members present at each place in chronological order, each member's face-to-face time is obtained by matching it against the user ID table (IA). The result is stored in the place user time table (FEB) of the analysis result table (F). The table consists of a place ID (FEB1) and user IDs (FEB2): each place ID (FEB1) forms one record, and the face-to-face time for each user ID (FEB2) is recorded. Period: July 1-31, 2009 (FEB3) is the period used for the place table (FC1D). Days: 31 days (FEB4) is the number of calendar days in the period (FEB3). Actual days: 21 days (FEB5) is the number of business days in the period (FEB3).
 Place user-map drawing (JEB) is a process that maps the place user times (FEB) onto the floor plan. The drawn result is the place user map (KEB) of FIG. 33. For each place, the per-user face-to-face time in (FEB) is plotted as a circle at the corresponding location on the floor plan, and the diameter of the circle varies with the user's face-to-face time. When several circles fall on the same place, they are offset so that they do not overlap. Each member's name is displayed by selecting the user name (IA2) from the user ID table (IA).
 Near the corresponding location on the floor plan, the place name (IB2) selected from the place ID table (IB) is displayed. Period: July 1-31, 2009 (KEB1) is the period used for the place table (FC1D). Mapping pair information onto places on a floor plan in this way improves the sense of the actual situation.
 Even when numbers such as place-utilization figures are shown, they convey little sense of the actual situation, so they give the user little feedback. In the seventh embodiment, therefore, the utilization is mapped onto a floor plan so that which places are actually being used can be seen at a glance, improving the sense of the actual situation.
 FIG. 34 is a diagram showing the processing procedure for mapping the utilization of places.
A model is constructed by place-utilization modeling analysis (CF). For the time-series number of people at a place, drawing is performed by the place attendance graph (JF1) of the place-utilization map drawing (JF); for the number of uses per hour at a place, by the place usage-count graph (JF2); and for the time-series temperature change at a place, by the place temperature graph (JF3). These results form the place-utilization map (KFA).
 This processing can be carried out in the same framework as in the first embodiment: place-utilization modeling analysis (CF) is executed in the control unit (ASCO) of the application server (AS), and place-utilization map drawing (JF) is executed in the display (J) of the client (CL).
 Up to obtaining the place table (FC1D) of the analysis result table (F), the processing is the same as in the fourth embodiment, so its description is omitted. First, the time-series number of people using each place is obtained. Per-place utilization (CFA) is a process that computes the average utilization rate and the average number of users of a place. Because the place table (FC1D) records the members present at a place in chronological order, the average utilization rate and average number of users are derived from it. The result is stored in the place attendance time table (FFA) of the analysis result table (F). The table consists of a place ID (FFA1) and an index (FFA2).
 Each place ID (FFA1) forms one record containing the usage rate (FFA3) and the average number of people (FFA4). Period: July 1-31, 2009 (FFA5) is the period used for the place table (FC1D).
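The usage rate (FFA3) and average number of people (FFA4) admit a simple reading: the fraction of time slots in which the place is occupied, and the mean occupancy over those occupied slots. A minimal sketch under those assumptions (names are illustrative):

```python
# Hypothetical sketch of the per-place utilization analysis (CFA):
# given the time-series occupancy of one place from the place table
# (FC1D), compute the usage rate (FFA3) as the fraction of occupied
# slots, and the average number of people (FFA4) over occupied slots.

def place_utilization(slots):
    """slots: list of sets of user IDs present in each time slot."""
    used = [s for s in slots if s]        # slots with at least one member
    usage_rate = len(used) / len(slots) if slots else 0.0
    avg_people = sum(len(s) for s in used) / len(used) if used else 0.0
    return usage_rate, avg_people

rate, avg = place_utilization([{"001", "002"}, set(), {"003"}, set()])
print(rate, avg)  # 0.5 1.5
```

Whether the average is taken over occupied slots only, or over all slots, is a design choice the specification leaves open; the sketch uses occupied slots.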
 The place attendance graph (JF1) is a process that draws the time-series utilization of a place from the place table (FC1D) and the place attendance time (FFA). The drawn result is the place attendance graph (KFA2) of FIG. 36. The number of people present is displayed as a time-series line graph from the place table (FC1D), and the usage rate (FFA3) and average number of people (FFA4) obtained from the place attendance time (FFA) are displayed.
 Next, the number of uses per hour at a place is obtained. The per-place meeting count (CFB) is a process that computes the number of uses per hour at a place.
 Because the place table (FC1D) records the members present at a place in chronological order, the number of uses per hour is derived from it. The result is stored in the place usage time table (FFB) of the analysis result table (F). The table consists of a place ID (FFB1) and time slots (FFB2): each place ID (FFB1) forms one record containing the number of uses per hour. Period: July 1-31, 2009 (FFA7) is the period used for the place table (FC1D).
 The place usage-count graph (JF2) is a process that draws the number of uses per hour from the place usage time (FFB). The drawn result is the place usage time graph (KFA3) of FIG. 36. The number of uses per hour from the place usage time (FFB) is displayed as a bar graph; a pie chart may be used instead.
 Finally, the time-series temperature change at a place is obtained. Temperature table processing (C1E) arranges the temperature data of the organization dynamics data in chronological order for each fixed period. The extracted result is stored in the temperature table (FC1E) of the analysis result table (F). An example of the temperature table (FC1E) is shown in FIG. 35. It is a table in which one day (24 hours) of data is stored in chronological order, with one user per record and a time resolution of one minute (FC1E3).
 In the temperature table (July 1, 2009) (FC1E3), the vertical axis is the user ID (FC1E1) identifying each member, and the horizontal axis is the resolution time (FC1E2), i.e., the time at the chosen time resolution. The temperature of a user at a given time is read simply by looking up the cell at that user ID (FC1E1) and resolution time (FC1E2); for example, the temperature for user ID 001 at 10:02 on 2009/7/1 is 23.5. When no temperature data exists in the organization dynamics data for that user at that time, NULL is stored in the temperature table (FC1E).
 Since a temperature table (FC1E) is generated per day and per time resolution, tables for the same date but different time resolutions are separate tables. For example, (FC1E4) and (FC1E5) both cover July 2, 2009, but are separate tables because their time resolutions differ.
 What matters for the temperature table (FC1E) is that a temperature series is stored for each user; as long as this is satisfied, a table configuration different from that used in the temperature table (FC1E) may be adopted.
 Place average temperature (CFC) is a process that computes the average temperature at a place. Because the place table (FC1D) records the members present at a place in chronological order, the average temperature at the place is obtained by cross-referencing the temperature table (FC1E).
 The result is stored in the place temperature table (FFC) of the analysis result table (F). The table consists of a place ID (FFC1) and an index (FFC2): each place ID (FFC1) forms one record containing the usage rate (FFA3). Period: July 1-31, 2009 (FFA4) is the period used for the temperature table (FC1E).
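The averaging in (CFC) must cope with the NULL entries described above. A minimal sketch, assuming the average is taken over all non-NULL readings of members present at the place (names and data shapes are illustrative):

```python
# Hypothetical sketch of the place average temperature analysis (CFC):
# for each time slot, collect the temperature readings of the members
# present at the place (from the place table FC1D), skipping missing
# (NULL, here None) readings in the temperature table FC1E, and average.

def place_average_temperature(occupants_per_slot, temp_table):
    """occupants_per_slot: list of lists of user IDs per time slot.
    temp_table: {user_id: [temperature or None per time slot]}."""
    readings = []
    for slot, users in enumerate(occupants_per_slot):
        for uid in users:
            t = temp_table[uid][slot]
            if t is not None:          # NULL readings are skipped
                readings.append(t)
    return sum(readings) / len(readings) if readings else None

temps = {"001": [23.5, None], "002": [24.5, 25.0]}
print(place_average_temperature([["001", "002"], ["001", "002"]], temps))
```

Here the None for user 001 in the second slot is excluded, so the average is taken over the three remaining readings.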
 The place temperature graph (JF3) is a process that draws the time-series utilization of a place from the temperature table (FC1E) and the place temperature (FFC). The drawn result is the place temperature graph (KFA4) of FIG. 36. The relevant users' temperature data from the temperature table (FC1E) are displayed as a time-series line graph, and the usage rate (FFC3) obtained from the place temperature (FFC) is displayed.
 The place-map integration (JF4) of the place-utilization map drawing (JF) is a process that maps onto the floor plan the images produced by the place attendance graph (JF1), the place usage-count graph (JF2) and the place temperature graph (JF3), namely the place attendance graph (KFB2), the place usage time graph (KFB3) and the place temperature graph (KFB4).
 The drawn result is the place-utilization map (KFA) of FIG. 36. As in the floor plan (KFA1), an icon bearing the place name (for example (KFA12)) is arranged at the corresponding location on the floor plan. When a place-name icon is clicked, the place attendance graph (KFA2), place usage time graph (KFA3) and place temperature graph (KFA4) for that place pop up, and the color of the clicked icon is changed.
Mapping place-utilization information onto places on a floor plan in this way improves the sense of the actual situation.
 It is important to know at what interval communication takes place. An ordinary network diagram shows only the total amount of face-to-face contact and gives no indication of how often members meet. In the eighth embodiment, therefore, the meeting interval is reflected in the network diagram so that communication frequency can be visualized.
 FIG. 37 is a diagram showing the processing procedure for reflecting the communication interval in a network diagram. Meeting-interval analysis (CG) obtains the face-to-face time and the meeting interval, and meeting-interval network drawing (JGA) creates a network diagram, yielding the meeting-interval network diagram (KGA). In addition to the network diagram, meeting-interval histogram drawing (JGB) creates a histogram showing each user's face-to-face time by interval, yielding the meeting-interval histogram (KGB).
 This processing can be carried out in the same framework as in the first embodiment: meeting-interval analysis (CG) is executed in the control unit (ASCO) of the application server (AS), while meeting-interval network drawing (JGA) and meeting-interval histogram drawing (JGB) are executed in the display (J) of the client (CL). Up to obtaining the meeting matrix (FC1C) of the analysis result table (F), the processing is the same as in the first embodiment, so its description is omitted.
 An example of the meeting matrix (FC1C) is shown in FIG. 5. There the matrix covers multiple days (period: July 1-31, 2009 (FC1C3)); in the eighth embodiment, both the multiple-day meeting matrix of FIG. 5 and single-day meeting matrices are computed.
 Meeting binarization (CGA) compares the face-to-face times stored in each single-day meeting matrix (FC1C) against a threshold, substituting 1 for larger values and 0 for smaller ones. The threshold is the meeting-determination time of the meeting matrix (FC1C): 3 minutes per day (FC1C7). The binarized result is stored in the binary meeting matrix (FGA) of the analysis result table (F). Since its file format is the same as that of the meeting matrix (FC1C), it is omitted here; the difference lies in the stored values, which are multi-valued in the meeting matrix (FC1C) and binary in the binary meeting matrix (FGA).
 Meeting-interval extraction (CGB) is a process that derives the meeting interval from the single-day meeting matrices (FC1C): the interval is computed from which days each pair of members met. One way to compute it is to divide the number of actual (business) days by the number of days on which the pair met; other methods may also be used. The result obtained by the meeting-interval extraction (CGB) is stored in the meeting-interval matrix (FGB) of the analysis result table (F).
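The binarization (CGA) and interval extraction (CGB) for one pair can be sketched as follows; the function name and data shape are illustrative, the 3 minutes/day threshold is the meeting-determination time (FC1C7), and the interval formula is the "actual days divided by days met" variant named above.

```python
# Hypothetical sketch of meeting binarization (CGA) and meeting-interval
# extraction (CGB) for one pair of users. daily_minutes holds the pair's
# face-to-face time per day; days at or above the 3 min/day threshold
# (FC1C7) count as met (CGA), and the interval is the number of actual
# (business) days divided by the number of days met (CGB).

def meeting_interval(daily_minutes, actual_days, threshold=3):
    met_flags = [1 if m >= threshold else 0 for m in daily_minutes]  # CGA
    days_met = sum(met_flags)
    return actual_days / days_met if days_met else None              # CGB

# A pair that met on 10 of 21 business days has an interval of 2.1 days.
print(meeting_interval([10, 0, 5, 2] * 5 + [0], actual_days=21))  # 2.1
```

An interval of 1.0 thus means the pair met every business day, matching the reading of the matrix example below.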
 An example of the meeting-interval matrix (FGB), summarizing one month of meeting-interval results, is shown in FIG. 37. In the meeting-interval matrix (FGB), the vertical axis is the user ID (FGB1) identifying each member, and the horizontal axis is the user ID (FGB2) of the partner met. The meeting interval (in days) of user 002 with user 003 is 1.0, meaning they meet every day. A larger value indicates a longer meeting interval: for example, the interval of user 001 with user 002 is 2.3, meaning they meet about once every two days.
 Because much information is condensed into a single matrix when creating the meeting-interval matrix (FGB), the original information must be recorded with it. Period: July 1-31, 2009 (FGB3) is the period used for the meeting-interval matrix (FGB). Days: 31 days (FGB4) is the number of calendar days in the period (FGB3). Actual days: 21 days (FGB5) is the number of business days in the period (FGB3).
 What matters for the meeting-interval matrix (FGB) is that it stores the users' meeting situation; as long as this is satisfied, a table configuration different from that used in the meeting matrix (FC1C) may be adopted.
 Meeting-interval network drawing (JGA) draws a network diagram that takes the members' meeting intervals into account, using the meeting-interval matrix (FGB), which holds the communication intervals, and the multiple-day meeting matrix (FC1C), which holds the amount of communication. An example is shown in the meeting-interval network diagram (KGA). Period: July 1-31, 2009 (KGA1) indicates the period used for the meeting-interval matrix (FGB). Members are shown as points (nodes), and the lines (edges) connecting members indicate face-to-face time and interval: the thickness of a line indicates the face-to-face time, and the style of the line (solid or broken) indicates the meeting interval. A spring model is used for layout. In the spring model (Hooke's law), when two nodes (points) are connected, a force (inward or outward) is computed as if a spring joined them, and each node additionally receives a repulsive force, depending on distance, from every node it is not connected to; repeating these position updates produces an optimal layout. Points (nodes) are placed such as Takahashi (KGA2), Tanaka (KGA3) and Watanabe (KGA4). The user name (IA2) is obtained from the user ID (IA1) via the user ID table (IA) and displayed.
The meeting situation of Tanaka (KGA3) and Watanabe (KGA4) is shown by the line (edge) connecting them: (KGA5) indicates a short face-to-face time but daily meetings, while (KGA6), between Tanaka (KGA3) and Takahashi (KGA2), indicates a long face-to-face time but a long meeting interval (meeting once every few days).
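The spring-model layout described above can be sketched in a few lines; the constants, iteration count and force clamp below are illustrative choices for stability, not values from the specification.

```python
# Hypothetical sketch of the spring-model (Hooke's law) layout used for
# the network diagrams: connected nodes attract/repel like a spring
# around a rest length, unconnected pairs only repel, and repeated small
# position updates settle into a layout. Constants are illustrative.
import math, random

def spring_layout(nodes, edges, iters=200, k=0.1, rep=0.05, rest=1.0):
    random.seed(0)                            # deterministic illustration
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(iters):
        force = {n: [0.0, 0.0] for n in nodes}
        for a in nodes:
            for b in nodes:
                if a == b:
                    continue
                dx = pos[b][0] - pos[a][0]
                dy = pos[b][1] - pos[a][1]
                d = math.hypot(dx, dy) or 1e-9
                if (a, b) in edges or (b, a) in edges:
                    f = k * (d - rest)        # spring toward rest length
                else:
                    f = -rep / (d * d)        # repulsion from unconnected nodes
                f = max(-1.0, min(1.0, f))    # cap the step for stability
                force[a][0] += f * dx / d
                force[a][1] += f * dy / d
        for n in nodes:
            pos[n][0] += force[n][0]
            pos[n][1] += force[n][1]
    return pos

layout = spring_layout(["Takahashi", "Tanaka", "Watanabe"],
                       {("Tanaka", "Watanabe")})
```

Edge thickness and style (face-to-face time and interval) would then be rendered on top of these positions from (FC1C) and (FGB).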
 Meeting-interval histogram drawing (JGB) draws, for each member, a histogram that takes the meeting interval into account, using the meeting-interval matrix (FGB), which holds the communication intervals, and the multiple-day meeting matrix (FC1C), which holds the amount of communication.
 The drawn result is shown in the meeting-interval histogram (KGB). Period: July 1-31, 2009 (KGB2) indicates the period used for the meeting-interval matrix (FGB). The histogram shows each member's face-to-face time by meeting interval: Takahashi (KGB2), Tanaka (KGB3) and Watanabe (KGB4) are users, and the bar above each is that user's face-to-face time by interval. Each bar is composed of the face-to-face time with members met at an interval of 2 or more (KGB5) and the face-to-face time with members met at an interval below 2 (KGB6), and the length of the bar indicates the total face-to-face time (KGB7).
 By reflecting in the network diagram how often communication takes place, the quality of communication is visualized and the sense of the actual situation is improved.
 It is also important to know how many people take part in a communication. An ordinary network diagram displays communication as pairwise meetings, so the number of participants cannot be seen. In the ninth embodiment, therefore, the number of participants in each communication is reflected in the network diagram so that the communication situation can be visualized.
 FIG. 38 is a diagram showing the processing procedure for reflecting the number of participants in communication in a network diagram. Participant-count analysis (CH) obtains the face-to-face time and the number of participants, and per-participant-count network drawing (JHA) creates one network diagram per participant count, yielding the per-participant-count network diagrams (KHA). The per-participant-count diagrams are also integrated into a single diagram: maximum face-to-face time network drawing (JHB) creates a network diagram, yielding the maximum face-to-face time network diagram (KHB).
 This processing can be carried out in the same framework as in the first embodiment: participant-count analysis (CH) is executed in the control unit (ASCO) of the application server (AS), while per-participant-count network drawing (JHA) and maximum face-to-face time network drawing (JHB) are executed in the display (J) of the client (CL). Up to obtaining the meeting table (FC1A) of the analysis result table (F), the processing is the same as in the first embodiment, so its description is omitted.
 Per-participant-count meeting matrix generation (CHA) is a process that generates one matrix per participant count. The basic matrix generation method is the same as meeting matrix creation (C1C), with one difference: it looks at the number of participants at each resolution time (FC1A2) of the meeting table (FC1A), and the matrix into which a meeting is stored depends on that number. Specifically, when the number of participants at a resolution time (FC1A2) is 2, the meeting is added to the two-person meeting matrix (FHAA) of the per-participant-count meeting matrices (FHA); when it is 3 to 5, to the three-to-five-person meeting matrix (FHAB); and when it is 6 or more, to the six-or-more-person meeting matrix (FHAC). Per-participant-count meeting matrix generation (CHA) generates matrices for predetermined participant-count bands, and the participant-count range of each matrix can be chosen arbitrarily.
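The banding rule of (CHA) can be sketched as follows; the band keys reuse the matrix labels (FHAA, FHAB, FHAC), while the function name, data shapes and one-minute slot width are illustrative assumptions.

```python
# Hypothetical sketch of per-participant-count meeting matrix generation
# (CHA): each time slot of the meeting table (FC1A) lists who was
# together; the group's pairwise face-to-face minutes go into the matrix
# for the 2-person band (FHAA), the 3-5-person band (FHAB) or the
# 6-or-more band (FHAC), depending on the group size in that slot.
from collections import defaultdict
from itertools import combinations

def per_count_matrices(slots, minutes_per_slot=1):
    """slots: list of sets of user IDs together in each time slot."""
    bands = {"FHAA": defaultdict(int), "FHAB": defaultdict(int),
             "FHAC": defaultdict(int)}
    for group in slots:
        n = len(group)
        if n < 2:
            continue
        band = "FHAA" if n == 2 else "FHAB" if n <= 5 else "FHAC"
        for a, b in combinations(sorted(group), 2):
            bands[band][(a, b)] += minutes_per_slot
    return bands

m = per_count_matrices([{"001", "002"}, {"001", "002", "003"}])
print(m["FHAA"][("001", "002")], m["FHAB"][("001", "002")])  # 1 1
```

Note how the same pair (001, 002) accumulates time in different matrices depending on how many people were present in each slot.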
 Per-participant-count meeting matrix generation (CHA) removes the time-series information from the chronologically ordered meeting table (FC1A) and summarizes, in one two-dimensional matrix per participant count, how much each user met each other user. The extracted results are stored in the per-participant-count meeting matrices (FHA) of the analysis result table (F). An example of the per-participant-count meeting matrices (FHA), summarizing one month of meeting results, is shown in FIG. 39.
 The per-participant-count meeting matrices (FHA) comprise several matrices: the two-person meeting matrix (FHAA), the three-to-five-person meeting matrix (FHAB) and the six-or-more-person meeting matrix (FHAC). In the two-person meeting matrix (FHAA), the vertical axis is the user ID (FHAA1) identifying each member, and the horizontal axis is the user ID (FHAA2) of the partner met; for example, the face-to-face time of user 003 with user 004 is 543 minutes. The three-to-five-person meeting matrix (FHAB) and the six-or-more-person meeting matrix (FHAC) are read in the same way.
 Because much information is condensed into a single matrix when creating the per-participant-count meeting matrices (FHA), the original information must be recorded with them. Period: July 1-31, 2009 (FHA1) is the period used for the per-participant-count meeting matrices (FHA). Days: 31 days (FHA2) is the number of calendar days in the period (FHA1). Actual days: 21 days (FHA3) is the number of business days in the period (FHA1). Time resolution: 1 minute (FHA4) is the time resolution of the meeting table (FC1A). Meeting-determination time: 3 minutes per day (FHA5) is the threshold for judging that a meeting took place. Even when members merely pass each other, the infrared sensors may react and register a meeting; since a few isolated responses are likely to be noise, this threshold is introduced.
 What matters for the per-participant-count meeting matrices (FHA) is that they store the users' meeting situation; as long as this is satisfied, a table configuration different from that used for the per-participant-count meeting matrices (FHA) may be adopted.
 Per-participant-count network drawing (JHA) is a process that draws one network diagram per participant count from the per-participant-count meeting matrices (FHA), which show the meeting situation by participant count.
 An example is shown in the per-participant-count network diagrams (KHA). Period: July 1-31, 2009 (KHA1) indicates the period used for the per-participant-count meeting matrices (FHA). The per-participant-count network diagrams (KHA) consist of three network diagrams: the two-person network diagram (KHAA) for meetings of two people, the three-to-five-person network diagram (FHAB) for meetings of three to five, and the six-or-more-person network diagram (FHAC) for meetings of six or more.
 The two-person network diagram (KHAA) is described as an example. Members are shown as points (nodes), and the lines (edges) connecting members indicate face-to-face time; in particular, the thickness of a line indicates the face-to-face time. A spring model is used for layout. In the spring model (Hooke's law), when two nodes (points) are connected, a force (inward or outward) is computed as if a spring joined them, and each node additionally receives a repulsive force, depending on distance, from every node it is not connected to; repeating these position updates produces an optimal layout.
 Points (nodes) are placed such as Ito (KHAA1), Watanabe (KHAA2) and Yamamoto (KHAA3). The user name (IA2) is obtained from the user ID (IA1) via the user ID table (IA) and displayed.
Since Ito (KHAA1) and Watanabe (KHAA2) have met as a pair, the two are connected by a line (edge) (KHAA4); likewise, the edge between Ito (KHAA1) and Yamamoto (KHAA3) is (KHAA5). When connecting nodes with edges, an entry of the face-to-face matrix by number of participants (FHA) that has only a small value may be treated as noise and left unconnected. As this threshold, the product of the face-to-face determination time of the matrix (FHA), 3 minutes per day (FHA5), and the number of business days, 21 days (FHA3), may be used. The three-to-five-person network diagram (FHAB) and the six-or-more-person network diagram (FHAC) are obtained in the same way as the two-person network diagram (KHAA).
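The noise threshold for drawing an edge can be sketched as follows. The parameter values are the ones quoted above (3 minutes per day over 21 business days, giving 63 minutes), but the function name and its use of a strict comparison are assumptions.

```python
def draw_edge(face_time_minutes, judge_minutes_per_day=3, business_days=21):
    """Connect two nodes only if their accumulated face-to-face time
    exceeds the noise threshold (determination time x business days)."""
    threshold = judge_minutes_per_day * business_days  # 3 min/day * 21 days = 63 min
    return face_time_minutes > threshold

# Ito-Watanabe: 543 minutes over the period -> the edge is drawn;
# a pair with only 30 minutes is treated as noise and left unconnected.
```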
Maximum face-to-face time matrix generation (CHB) is a process that selects the maximum face-to-face time across the participant-count ranges and stores it in a matrix. Specifically, the largest value among the corresponding entries of the matrices in the face-to-face matrix by number of participants (FHA) is selected and stored in the maximum face-to-face time matrix (FHB). The maximum face-to-face time matrix (FHB) is described with reference to FIG. 40. In the maximum face-to-face time matrix (FHB), the vertical axis is the user ID (FHB6) identifying each member and the horizontal axis is the user ID (FHB7) of the partner met. For example, the face-to-face time of user 003 with user 004 is 543 minutes. It is obtained by comparing the entries for user 003 with user 004 in the two-person face-to-face matrix (FHAA), the three-to-five-person face-to-face matrix (FHAB), and the six-or-more-person face-to-face matrix (FHAC) of the face-to-face matrix by number of participants (FHA), namely 543, 93, and 0, and selecting the maximum value, 543.
In creating the maximum face-to-face time matrix (FHB), much information is aggregated into a single matrix, so the original conditions must be recorded. Period: July 1-31, 2009 (FHB1) is the period used for the maximum face-to-face time matrix (FHB). Days: 31 days (FHB2) is the number of days in the period (FHB1). Business days: 21 days (FHB3) is the number of business days in the period (FHB1). Time resolution: 1 minute (FHB4) is the time resolution of the face-to-face table (FC1A). Face-to-face determination time: 3 minutes per day (FHB5) is the threshold for determining that a meeting took place. Even when members merely pass each other, the infrared sensors may react and a meeting would be registered; since a handful of such detections is likely to be noise, this threshold is introduced. Also, because what matters is that the matrix stores the users' face-to-face situation, a table configuration different from that of the maximum face-to-face time matrix (FHB) may be used as long as this requirement is satisfied.
Maximum face-to-face time participant-count matrix generation (CHC) is a process that stores, in a matrix, the number of participants at the time the maximum face-to-face time was selected for each participant-count range. Specifically, the participant-count range handled by the matrix that held the maximum value in the face-to-face matrix by number of participants (FHA) is stored in the maximum face-to-face time participant-count matrix (FHC).
Maximum face-to-face time participant-count matrix generation (CHC) is described. In the maximum face-to-face time participant-count matrix (FHC), the vertical axis is the user ID (FHC6) identifying each member and the horizontal axis is the user ID (FHC7) of the partner met. For example, the entry for user 003 with user 004 is 1. The stored value (FHC8) denotes 1 for two-person meetings, 2 for three-to-five-person meetings, and 3 for meetings of six or more; these ranges are the same as those of the face-to-face matrix by number of participants (FHA). It is obtained by comparing the entries for user 003 with user 004 in the two-person face-to-face matrix (FHAA), the three-to-five-person face-to-face matrix (FHAB), and the six-or-more-person face-to-face matrix (FHAC), namely 543, 93, and 0, and selecting the maximum value, 543; since 543 comes from the two-person face-to-face matrix (FHAA), the value 1, which denotes that range, is stored.
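The generation of one entry of the FHB and FHC matrices (the maximum over the per-range matrices, and the code of the range it came from) can be sketched as follows, using the user 003 / user 004 figures quoted above; the function and dictionary names are illustrative.

```python
# Codes stored in the FHC matrix: 1 = two persons, 2 = three to five, 3 = six or more.
RANGE_CODE = {"FHAA": 1, "FHAB": 2, "FHAC": 3}

def max_time_and_range(times_by_range):
    """Return (maximum face-to-face time, code of the range it came from),
    i.e. one entry each of the FHB and FHC matrices."""
    best = max(times_by_range, key=times_by_range.get)
    return times_by_range[best], RANGE_CODE[best]

# user 003 / user 004 example from the text: 543, 93, and 0 minutes
fhb_entry, fhc_entry = max_time_and_range({"FHAA": 543, "FHAB": 93, "FHAC": 0})
```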
In creating the maximum face-to-face time participant-count matrix (FHC), much information is aggregated into a single matrix, so the original conditions must be recorded. Period: July 1-31, 2009 (FHC1) is the period used for the maximum face-to-face time participant-count matrix (FHC). Days: 31 days (FHC2) is the number of days in the period (FHC1). Business days: 21 days (FHC3) is the number of business days in the period (FHC1). Time resolution: 1 minute (FHC4) is the time resolution of the face-to-face table (FC1A). Face-to-face determination time: 3 minutes per day (FHC5) is the threshold for determining that a meeting took place. Even when members merely pass each other, the infrared sensors may react and a meeting would be registered; since a handful of such detections is likely to be noise, this threshold is introduced.
Also, because what matters is that the maximum face-to-face time participant-count matrix (FHC) stores the users' face-to-face situation, a table configuration different from that of the matrix (FHC) may be used as long as this requirement is satisfied.
Maximum face-to-face time participant-count network diagram drawing (JHB) is a process that draws, from the maximum face-to-face time matrix (FHB) and the maximum face-to-face time participant-count matrix (FHC), a network diagram showing the number of participants present during the maximum face-to-face time.
An example is shown in the maximum face-to-face time participant-count network diagram of FIG. 41 (KHB). Period: July 1-31, 2009 (KHB1) indicates the period used in the maximum face-to-face time matrix (FHB) and the maximum face-to-face time participant-count matrix (FHC).
Members are shown as dots (nodes), and the lines (edges) connecting members represent face-to-face time; in particular, the thickness of a line indicates the length of face-to-face time. A spring model is used for node placement. In the spring model (based on Hooke's law), whenever two nodes (points) are connected, a spring force (attractive or repulsive) is computed between them as if a spring were present, and each node additionally receives a distance-dependent repulsive force from all nodes it is not connected to; by repeatedly moving the nodes under these forces, an optimal layout is obtained. Nodes (points) are placed for each member, such as Ito (KHB2), Tanaka (KHB3), and Watanabe (KHB4). The user name (IA2) is looked up from the user ID (IA1) using the user ID table (IA) and displayed.
For Ito (KHB2) and Watanabe (KHB4), the maximum face-to-face time matrix (FHB) gives 543 minutes and the maximum face-to-face time participant-count matrix (FHC) gives 1, which is read as 543 minutes of two-person meetings; the corresponding edge is (KHB5). For Ito (KHB2) and Tanaka (KHB3), the matrix (FHB) gives 215 minutes and the matrix (FHC) gives 2, which is read as 215 minutes of three-to-five-person meetings; the corresponding edge is (KHB7). In particular, the thickness of a line indicates the face-to-face time, and the style of the line (solid or dashed) indicates the number of participants. When connecting nodes with edges, an entry of the maximum face-to-face time matrix (FHB) that has only a small value may be treated as noise and left unconnected. As this threshold, the product of the face-to-face determination time of the matrix (FHB), 3 minutes per day (FHB5), and the number of business days, 21 days (FHB3), may be used.
In this way, by reflecting on the network diagram how many people each instance of communication involves, the quality of the communication can be visualized and the realism of the view improved.
The way work is done and the atmosphere often differ from organization to organization, and the culture of an organization usually becomes apparent once one is involved with it. There is also a desire to know the differences between organizations, but such impressions are often subjective and cannot be quantified. The tenth embodiment therefore provides a visualization that makes comparisons between organizations understandable at a glance, even without being involved with them.
FIG. 42 shows the processing procedure for visualizing the culture of each organization. Frequency principal component extraction (CIA) obtains principal components from the behavior indices and personality indices, organization frequency calculation (CIB) summarizes them into a time-series trend for each organization, and organization frequency graph drawing (JI) generates a time-series graph for each organization; the result is the organization frequency (KI).
This processing can be performed within the same framework as the first embodiment: the organization frequency analysis (CI) is executed by the control unit (ASCO) of the application server (AS), and the organization frequency graph drawing (JI) is executed by the display (J) of the client (CL). The steps up to obtaining the explanatory variables (FFA) of the analysis result table (F) are the same as in the first embodiment, so their description is omitted.
Frequency principal component extraction (CIA) is a process that derives features of the organization's activity from the explanatory variables (FFA) of the per-member analysis result table (F). Specifically, a principal component analysis of the explanatory variables (FFA) of the analysis result table (F) reveals the characteristics of the activity. The result is stored in the frequency principal components (FIA) of the analysis result table (F). An example is shown in FIG. 43.
The frequency principal components (FIA) are described. The frequency principal components (FIA) form a table that stores features of organizational activity. Period: July 1, 2009 (FIA1) records the period/date used for the analysis. Organization (FIA2) is the organization analyzed. The explanatory variables (FIA3) are the elements used in the analysis; the items are the same as the explanatory variables (FFA) of the analysis result table (F). The first principal component (FIA4) holds the values of the first principal component, and the second principal component (FIA5) holds the values of the second principal component. Principal component analysis is used in this example, but other methods may be used as long as they reveal the characteristics of organizational activity. Also, although only components up to the second principal component are stored in the frequency principal components (FIA), subsequent principal components (the third and later) may also be stored.
Organization frequency calculation (CIB) is a process that summarizes the indices obtained by frequency principal component extraction (CIA) into a single value per time step. Specifically, for each day, the first principal component (FIA4) and the second principal component (FIA5) of each organization are mapped onto a two-dimensional plane from the frequency principal components (FIA) and their centroid is computed. The distance of the centroid from the origin is taken as the organization frequency value.
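The centroid-and-distance step of the organization frequency calculation (CIB) can be sketched as follows; the per-member principal-component scores are hypothetical, and the function name is illustrative.

```python
def organization_frequency(points):
    """Centroid of the per-member (PC1, PC2) points of one organization
    on one day, then the centroid's Euclidean distance from the origin."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    return (cx * cx + cy * cy) ** 0.5

# hypothetical principal-component scores of three members; centroid is (2.0, 1.0)
freq = organization_frequency([(1.0, 2.0), (3.0, 0.0), (2.0, 1.0)])
```

Repeating this per day and per organization fills the organization frequency table (FIB).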
The organization frequency (FIB) is described. The organization frequency (FIB) is a table that stores features of organizational activity per time step. Organization (FIB1) is the organization analyzed, and date (FIB2) is the date analyzed. Reading the table, for example, the organization frequency of organization A on July 2, 2009 is 1.5.
In this example, the first and second principal components of the explanatory variables are mapped onto a two-dimensional plane for each organization and the centroid is computed, but other methods may be used as long as the characteristics of organizational activity can be obtained.
Organization frequency graph drawing (JI) is a process that draws, from the organization frequency (FIB), the time series of each organization's frequency as a line graph. An example is shown in the organization frequency graph (KI) of FIG. 44. The horizontal axis is the date (KI1) and the vertical axis is the organization frequency (KI2). The values from the organization frequency (FIB) are plotted as a line graph for each organization. By plotting each organization's way of working and atmosphere as an organization frequency on a time series in this way, comparisons between organizations become understandable at a glance, even without being involved with them.
Various measures can be derived from the analysis results, but what matters is that using them in everyday work leads to improvement without conscious effort. The eleventh embodiment describes seating rearrangement as a measure for increasing activation and reducing stress.
An embodiment that determines the actual seating arrangement within an organization using both the face-to-face network analysis results of the organization's members and the personality indices is described with reference to FIG. 45.
This processing can be performed within the same framework as the first embodiment: the seating rearrangement analysis (CJ) is executed by the control unit (ASCO) of the application server (AS), and the seating arrangement drawing for the site (JJ) is executed by the display (J) of the client (CL). The steps up to obtaining the face-to-face matrix (FC1C) and the personality indices (FAAE) of the analysis result table (F) are the same as in the first embodiment, so their description is omitted.
In general, the seating arrangement of the members of an organization is implemented with different goals in each organization; reducing the stress of individual members or activating communication within the organization are examples of such goals. Here, an example is described in which the seating arrangement of the members is determined with the aim of reducing stress within the organization and activating communication, using the face-to-face network analysis results obtained from sensor data together with the personality indices.
The flow of the processing is shown in FIG. 45. A network diagram is created from the face-to-face matrix (FC1C) obtained by the sensors, and its coordinate values are used to calculate the reachability distance between persons on the network diagram, that is, the face-to-face distance (CJA). From this face-to-face distance matrix (CJB), a face-to-face distance network diagram (CJC) is drawn. Meanwhile, from the organizational personality indices (FAAE) obtained from the personality questionnaire, the adaptability (GFB), an index related to stress, is calculated (CJD). In the course of studying the relationship between the stress borne by members and the personality indices indicating fit with society (extraversion, agreeableness, conscientiousness, neuroticism, openness), the inventors found a strong relationship between them; in this embodiment, therefore, a stress-related index is calculated from the personality indices.
The seating arrangement constraint (CJF) is information by which the user constrains the seats assigned by this embodiment. A constraint is, for example, a function that forces a particular person to be assigned to a specific seat or, conversely, prevents a person from being assigned to a specific seat. This information is provided by the user through keyboard or similar input on the client (CL).
Using the face-to-face distance network diagram (CJC) and the adaptability (GFB), the seating arrangement is optimized (CJE) and fitted to the site list (IC) indicating the seat positions within the organization; the final seating arrangement for the site is thus obtained, and the seating chart is drawn (JJ). Constraints on the assigned seats can also be imposed using the seating arrangement constraint (CJF).
The network diagram (ZB) of FIG. 46 is an example of a network diagram for an organization. This network diagram (ZB) is drawn from the face-to-face matrix (FC1C) obtained by the sensors, and consists of nodes (ZB1) to (ZB7) representing persons and lines (edges) (ZB8) to (ZB15) connecting members who have met. As in FIG. 7, members who meet are placed on the network diagram using, for example, a spring model. As a result, members who frequently meet within the organization are placed close together on the network diagram, and members who do not meet are placed far apart.
As already explained, the indices for evaluating the activity of communication within an organization from a network diagram include the network indices (FAAA): the degree (FAAA2), the cohesion (FAAA2), and the two-step reach (FAAA3). The degree (FAAA2) is the number of edges connected to a node, the cohesion (FAAA2) is the density of the nodes around a node, and the two-step reach (FAAA3) is the proportion of all nodes that lie within two steps of a node.
If persons who are placed far apart on the network communicate directly, the degree (FAAA2), the cohesion (FAAA2), and the two-step reach (FAAA3) obviously take larger values. In other words, to activate communication within the organization, it suffices to encourage communication between persons who are placed far apart on this network diagram.
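The degree and two-step reach indices named above can be sketched from an adjacency structure as follows; the four-member chain is a hypothetical example, and the function name is illustrative.

```python
def network_indices(adj):
    """Degree and two-step reach per node from an adjacency dict
    {node: set(neighbours)}; two-step reach is the share of the other
    nodes reachable within two edges."""
    out = {}
    total_others = len(adj) - 1
    for n, nbrs in adj.items():
        within2 = set(nbrs)
        for m in nbrs:                 # add the neighbours' neighbours
            within2 |= adj[m]
        within2.discard(n)             # a node does not reach itself
        out[n] = {"degree": len(nbrs),
                  "two_step_reach": len(within2) / total_others}
    return out

# hypothetical 4-member chain A-B-C-D
idx = network_indices({"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}})
```

In the chain, the end node A reaches only B and C within two steps, while the inner node B reaches everyone, which is exactly why linking distant nodes raises these indices.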
Here, the idea is to place the seats of members who are far apart on the network diagram close together, reducing the physical distance between them so that conversation becomes easy, and thereby to activate communication between those members.
FIG. 47 shows, for every pair of members on the network diagram of FIG. 46 (from person (CJA1A) to person (CJA1B)), the number of steps needed to reach one another on the network. The smaller the number of steps, the closer the communication; the larger the number of steps, the more distant the communication. This is expressed as a matrix in the face-to-face distance matrix (CJB) of FIG. 48. The face-to-face distance network diagram (CJC) shown in FIG. 49 is created from this face-to-face distance matrix (CJB), in the same way that a face-to-face network diagram is created from a face-to-face matrix.
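The step counts that fill one row of the face-to-face distance matrix (CJB) are shortest-path lengths, which a breadth-first search computes; the five-person network below is a hypothetical example, not the one in FIG. 46.

```python
from collections import deque

def step_counts(adj, start):
    """Breadth-first search: number of edges (steps) from `start` to
    every reachable node, i.e. one row of the face-to-face distance
    matrix (CJB)."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        for nxt in adj[cur]:
            if nxt not in dist:
                dist[nxt] = dist[cur] + 1
                queue.append(nxt)
    return dist

# hypothetical 5-person chain 1-2-3-4-5
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4}}
row = step_counts(adj, 1)
```

Running this once per member yields the full matrix, from which pairs with a step count of 3 or more are the candidates for closer seating.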
In FIG. 49, edges with a step count of 1 are omitted for simplicity, and edges with a step count of 2 are drawn as wavy lines. The thick solid line (CJC8) is the edge with a step count of 4, and the thin solid lines (CJC9), (CJC10), (CJC11), (CJC12), and (CJC13) are the edges with a step count of 3.
To increase the two-step reach index (FAAA3) in the face-to-face network, the persons joined by the solid edges with a step count of 3 or more in the face-to-face distance network diagram of FIG. 49 ((CJC8) to (CJC13)) should be seated close together in the actual organization so that communication between them is encouraged.
From the personality questionnaire (GA), five personality indices are calculated as the personality indices (FAAE): extraversion, agreeableness, conscientiousness, neuroticism, and openness. Each takes a value between 0 and 1. The sum of all five personality indices is called the adaptability (GFB); the adaptability (GFB) therefore takes a value between 0 and 5.
Here, in the face-to-face network within the organization, it is assumed that when many of the persons directly around a given person (persons directly connected by an edge on the network) have a higher adaptability (GFB) than that person, that person's stress tends to be high.
In that case, the person's stress can be reduced by preventing persons with high adaptability (GFB) from concentrating directly around that person on the face-to-face network. Specifically, the person's stress is relieved by preventing persons with high adaptability (GFB) from concentrating around that person's seat.
In general, there are innumerable seating arrangements in which no person is surrounded by persons of higher adaptability (GFB) than themselves; such an arrangement may also be obtained by computer simulation or calculation. As an example, FIG. 50 shows an arrangement realized by pairing two persons of high adaptability (GFB) or two persons of low adaptability (GFB) and placing these pairs alternately, two sets at a time. Persons with high adaptability (GFB), represented by (CJE1), are shown as shaded circles, and persons with low adaptability (GFB), represented by (CJE3), are shown as white circles. A pair of persons with high adaptability (GFB) is (CJE2), and a pair of persons with low adaptability (GFB) is (CJE4). In this arrangement, focusing on a person with low adaptability (GFB), even if one side is adjacent to a person with high adaptability (GFB), a different side is always adjacent to a person with low adaptability (GFB), so stress is not expected to become high.
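The pairing rule of FIG. 50 can be sketched roughly as follows. This is only an illustration under assumptions: the high/low split uses the mean adaptability, members are paired in sorted order, and the pairs here alternate one at a time rather than in the two-set groups of the figure; member names and scores are hypothetical.

```python
def pair_and_alternate(adaptability):
    """Split members into high/low groups around the mean adaptability
    (GFB), pair members within each group, and lay the pairs out
    alternating high-pair / low-pair."""
    mean = sum(adaptability.values()) / len(adaptability)
    high = sorted(n for n, a in adaptability.items() if a > mean)
    low = sorted(n for n, a in adaptability.items() if a <= mean)
    high_pairs = [high[i:i + 2] for i in range(0, len(high), 2)]
    low_pairs = [low[i:i + 2] for i in range(0, len(low), 2)]
    layout = []
    while high_pairs or low_pairs:
        if high_pairs:
            layout.append(("high", high_pairs.pop(0)))
        if low_pairs:
            layout.append(("low", low_pairs.pop(0)))
    return layout

# hypothetical adaptability (GFB) scores, each between 0 and 5; mean is 2.575
seats = pair_and_alternate({"A": 3.1, "B": 1.2, "C": 4.0, "D": 2.0})
```

The alternation ensures that a low-adaptability pair always has a low-adaptability neighbour on at least one side, which is the stress-reducing property argued for above.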
For example, FIG. 51 is a table listing the adaptability (GFB) of the persons on the face-to-face network diagram (ZB). The average adaptability (GFB) over all members is 2.5, and values above it are shaded. To realize a stress-reducing seating arrangement, the seats may be determined according to, for example, the arrangement rule shown in FIG. 50.
FIG. 52 shows an example determined by applying the above policy to optimize the seating arrangement (CJE) and fitting the result to the organization's site list (IC) by the seating arrangement drawing for the site (JJ).
This seating arrangement places persons with little face-to-face communication close together to activate communication, while ensuring that persons with low adaptability (GFB) are not surrounded by persons with high adaptability (GFB), thereby simultaneously realizing a reduction of stress across the whole organization.
This embodiment makes it possible to determine an optimal seating arrangement for an organization automatically, using the face-to-face network analysis results obtained from sensor data and the personality indices. If there are preferences for the seating and the user wishes to control it, the seating arrangement constraint (CJF) can impose constraints on the automatically assigned seats. This allows the user to compare and examine alternatives while modifying the assigned result.
By arranging the actual seats within the organization using both the face-to-face network analysis results of the members and the personality indices, a measure for increasing activation and reducing stress can be realized in an everyday-use setting.
The twelfth embodiment generates a network diagram that can display face-to-face communication and the hierarchy of job positions at the same time. A conventional network diagram does not show the relationship with the current hierarchy of job positions. This problem is solved by taking the hierarchy of job positions into account when determining the placement of the nodes in the network diagram.
FIG. 53 shows the processing procedure for displaying face-to-face communication and the hierarchy of job positions at the same time. A model is constructed by the job-position hierarchy network modeling analysis (CK), drawing is performed by the job-position hierarchy network diagram drawing (JK), and the drawn result is the job-position hierarchy network diagram (KK).
This processing can be performed within the same framework as the first embodiment: the job-position hierarchy network modeling analysis (CK) is executed by the control unit (ASCO) of the application server (AS), and the job-position hierarchy network diagram drawing (JK) is executed by the display (J) of the client (CL).
 解析結果テーブル(F)の対面マトリックス(FC1C)を求めるまでは、実施例1と同じであるため説明を省略する。 The process until obtaining the facing matrix (FC1C) of the analysis result table (F) is the same as that of the first embodiment, and hence the description is omitted.
 階層内外組織周波数解析(CKA)の説明をする。階層内外組織周波数解析(CKA)では職位の階層内や階層外における組織周波数指標を求める。処理フローは図54に示している。 The intra-/inter-hierarchy organization frequency analysis (CKA) is described next. The intra-/inter-hierarchy organization frequency analysis (CKA) obtains organization frequency indices within and across the job position hierarchies. The processing flow is shown in FIG. 54.
 階層内外組織周波数解析(CKA)は、実施例10と同じフレームワークで処理することが可能であり、解析結果テーブル(F)の説明変数(FFA)を求めるまでは、実施例10と同じであるため説明を省略する。 The intra-/inter-hierarchy organization frequency analysis (CKA) can be processed in the same framework as the tenth embodiment, and the processing up to obtaining the explanatory variables (FFA) of the analysis result table (F) is the same as in the tenth embodiment, so its description is omitted.
 階層内外周波数処理(CKA1)では、ユーザ毎に求めた指標である解析結果テーブル(F)の説明変数(FFA)とメンバの所属を示しているユーザ/場所情報テーブル(I)のユーザID表(IA)を入力とする。階層内周波数とは、説明変数(FFA)の中から職位(IA4)が同じメンバのデータを選択し、特徴量として対面時間(FAAC2)や結束度(FAAA3)の平均や分散等の値とするものである。 The intra-/inter-hierarchy frequency processing (CKA1) takes as input the explanatory variables (FFA) of the analysis result table (F), which are indices obtained for each user, and the user ID table (IA) of the user/place information table (I), which indicates the affiliation of each member. The intra-hierarchy frequency selects, from the explanatory variables (FFA), the data of members having the same job position (IA4) and uses values such as the mean or variance of the facing time (FAAC2) or cohesion degree (FAAA3) as the feature quantity.
 また、階層外周波数とは、説明変数(FFA)の中からある2つの職位(IA4)に所属しているメンバのデータ(例:担当と課長)を選択し、特徴量として対面時間(FAAC2)や結束度(FAAA3)の平均や分散等の値とするものである。階層内周波数や階層外周波数を求める計算式は、平均や分散のほかに、他の計算方法を用いてもかまわない。さらに、チーム名(IA3)や組織(IA5)毎の階層内周波数や階層外周波数の指標を求めてもかまわない。 The inter-hierarchy frequency selects, from the explanatory variables (FFA), the data of members belonging to two given job positions (IA4) (e.g., staff and section manager) and uses values such as the mean or variance of the facing time (FAAC2) or cohesion degree (FAAA3) as the feature quantity. Calculation methods other than the mean and variance may be used for the intra-hierarchy and inter-hierarchy frequencies. Furthermore, intra-hierarchy and inter-hierarchy frequency indices may be obtained for each team name (IA3) or organization (IA5).
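As a concrete illustration of the grouping just described, the following sketch computes an intra-hierarchy index (mean and variance of a per-member feature such as the facing time (FAAC2), grouped by job position) and an inter-hierarchy index for a pair of positions. The data layout and field names (`position`, `facing_time`) are illustrative assumptions, not the patent's actual table format.

```python
from statistics import mean, pvariance

def intra_hierarchy_index(members, feature="facing_time"):
    """Mean and population variance of a feature, grouped by job position."""
    by_pos = {}
    for m in members:
        by_pos.setdefault(m["position"], []).append(m[feature])
    return {pos: (mean(v), pvariance(v)) for pos, v in by_pos.items()}

def inter_hierarchy_index(members, pos_a, pos_b, feature="facing_time"):
    """Mean and population variance over members of either of two positions."""
    vals = [m[feature] for m in members if m["position"] in (pos_a, pos_b)]
    return mean(vals), pvariance(vals)

# Hypothetical per-member records (one row per user, as in the FFA variables).
members = [
    {"position": "staff", "facing_time": 120.0},
    {"position": "staff", "facing_time": 80.0},
    {"position": "manager", "facing_time": 60.0},
]
print(intra_hierarchy_index(members)["staff"])  # → (100.0, 400.0)
```

As the text notes, the mean and variance are only one choice; any other aggregate could be substituted in the two functions.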
 次に、職位階層ネットワーク図座標特定(CKB)について説明する。職位階層ネットワーク図座標特定(CKB)では、対面コミュニケーションと職位の階層を同時に表示可能なネットワーク図を生成するための座標値を求める。処理フローは図55に示している。 Next, position hierarchy network diagram coordinate identification (CKB) will be described. In the job position hierarchical network diagram coordinate specification (CKB), coordinate values for generating a network diagram capable of simultaneously displaying the face-to-face communication and the job position hierarchy are obtained. The processing flow is shown in FIG.
 図55の処理フローでは、各メンバの座標値を求めるためのステップをStep1(CKBA)からStep4(CKBD)まで示している。各ステップについて説明する。 The processing flow in FIG. 55 shows the steps for obtaining the coordinate values of each member, from Step 1 (CKBA) to Step 4 (CKBD). Each step is described below.
 Step1(CKBA)は初期配置である。画面上に予め職位別の配置エリアを決めておき、ユーザID表(IA)の職位(IA4)に従って、メンバを配置する。よって、各メンバには座標値が与えられる。さらに、対面マトリックス(FC1C)から2者間の対面時間を示している。ある一定時間以上の対面時間があるときに、配置したメンバ同士に線を結ぶ。その際に、対面時間に比例して線の太さ等を変更してもよい。例として、高橋(CKBA1)と田中(CKBA2)は一定以上の対面時間があるので、線(CKBA3)のように2者間を結んでいる。 Step 1 (CKBA) is the initial placement. Placement areas by job position are determined in advance on the screen, and members are placed according to the job position (IA4) of the user ID table (IA). Each member is thereby given a coordinate value. Furthermore, the facing time between each pair of members is obtained from the facing matrix (FC1C); when the facing time exceeds a certain threshold, a line is drawn between the placed members, and the thickness of the line may be varied in proportion to the facing time. In the example, Takahashi (CKBA1) and Tanaka (CKBA2) have a facing time above the threshold and are therefore connected by a line (CKBA3).
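The thresholding of the facing matrix described in Step 1 can be sketched as follows; the function name and the use of a plain nested list for the facing matrix (FC1C) are illustrative assumptions. Each returned tuple carries the facing time, which the drawing step can map to line thickness.

```python
def edges_from_facing_matrix(names, matrix, threshold):
    """Return (member_a, member_b, facing_time) for each pair whose
    facing time is at or above the threshold; only the upper triangle
    of the symmetric matrix is scanned, so each pair appears once."""
    edges = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if matrix[i][j] >= threshold:
                edges.append((names[i], names[j], matrix[i][j]))
    return edges

# Illustrative facing times in minutes (symmetric, zero diagonal).
names = ["Takahashi", "Tanaka", "Kobayashi"]
matrix = [[0, 45, 5],
          [45, 0, 30],
          [5, 30, 0]]
print(edges_from_facing_matrix(names, matrix, threshold=30))
# → [('Takahashi', 'Tanaka', 45), ('Tanaka', 'Kobayashi', 30)]
```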
 Step2(CKBB)とStep3(CKBC)とで最適な配置処理を行なっている。Step2(CKBB)とStep3(CKBC)を繰り返し、ある決められた回数、および、閾値以下になるまで終了しない。 Optimal placement is performed by Step 2 (CKBB) and Step 3 (CKBC). Step 2 (CKBB) and Step 3 (CKBC) are repeated until a predetermined number of iterations has been performed or the value falls below a threshold.
 Step2(CKBB)は距離計算である。Step1(CKBA)では、配置を行ない、座標値が与えられる。線で結ばれているものに対して、線の長さを求め、その職位階層ネットワーク図の線の全体合計値を計算する。例として、高橋(CKBB1)と田中(CKBB2)を結んでいる線の距離は8(CKBB3)であり、線の全体合計値(CKBB4)は141である。 Step 2 (CKBB) is the distance calculation. Step 1 (CKBA) performs the placement and assigns coordinate values. For each pair connected by a line, the length of the line is determined, and the total length of all lines in the job position hierarchy network diagram is calculated. In the example, the length of the line connecting Takahashi (CKBB1) and Tanaka (CKBB2) is 8 (CKBB3), and the total length of all lines (CKBB4) is 141.
 Step3(CKBC)はある1つの同階層メンバの交換である。層内で最適な配置にするために、層内で2名を選択し、座標値を交換させる。例として、小林(CKBC1)と山本(CKBC2)が交換(CKBC3)している。 Step 3 (CKBC) is a single exchange of members within the same hierarchy level. To optimize the placement within a layer, two members in the layer are selected and their coordinate values are exchanged. In the example, Kobayashi (CKBC1) and Yamamoto (CKBC2) are exchanged (CKBC3).
 そして、Step2(CKBB)に戻り、距離計算を行ない、交換前の線の全体合計値と比較し、値が小さくなれば成功とみなしそれぞれのメンバの座標値を更新する。もし、値が小さくならなかった場合には、座標値を元に戻す。これらを繰り返し、ある決められた回数、および、閾値以下になるまで終了しない。 The process then returns to Step 2 (CKBB), the distances are recalculated, and the total line length is compared with that before the exchange; if the value has decreased, the exchange is regarded as successful and the coordinate values of the members are updated. If the value has not decreased, the coordinate values are reverted. This is repeated until a predetermined number of iterations has been performed or the value falls below a threshold.
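Steps 2 and 3 together amount to a simple hill-climbing layout: repeatedly swap two members of the same rank and keep the swap only when the total length of the drawn lines decreases. The sketch below illustrates this under assumed data structures (a coordinate dict, an edge list, and a rank-to-members dict); a fixed iteration count stands in for the "predetermined number of times or below a threshold" stopping rule described above.

```python
import math
import random

def total_length(pos, edges):
    """Total Euclidean length of all drawn lines."""
    return sum(math.dist(pos[a], pos[b]) for a, b in edges)

def optimize_layout(pos, edges, ranks, iterations=1000, seed=0):
    """Hill-climbing over same-rank swaps (Steps 2-3, sketched)."""
    rng = random.Random(seed)
    best = total_length(pos, edges)
    for _ in range(iterations):
        members = ranks[rng.choice(list(ranks))]
        if len(members) < 2:
            continue
        a, b = rng.sample(members, 2)
        pos[a], pos[b] = pos[b], pos[a]       # trial swap within one layer
        new = total_length(pos, edges)
        if new < best:
            best = new                        # success: keep the swap
        else:
            pos[a], pos[b] = pos[b], pos[a]   # revert the coordinates
    return pos, best

# Tiny illustrative case: swapping A and B shortens the B-C line.
pos = {"A": (0.0, 0.0), "B": (10.0, 0.0), "C": (0.0, 1.0)}
edges = [("B", "C")]
ranks = {"staff": ["A", "B"]}
_, best = optimize_layout(pos, edges, ranks, iterations=50)
print(best)  # → 1.0
```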
 Step4(CKBD)は所属の重心の計算である。同じチーム名(IA3)のメンバ、もしくは、同じ組織(IA5)のメンバが何処に分布しているのかを明らかにする。ユーザID表(IA)のチーム名(IA3)や組織(IA5)からメンバの座標の平均値を求める。例では、営業の重心(CKBD1)や開発の重心の座標値を求めている。計算式は、平均値のほかに、他の計算方法を用いてもかまわない。さらに、職位を考慮してもかまわない。 Step 4 (CKBD) is the calculation of the centroid of each affiliation. It identifies where the members of the same team name (IA3), or of the same organization (IA5), are distributed. Using the team name (IA3) and organization (IA5) of the user ID table (IA), the average of the members' coordinate values is obtained. In the example, the coordinate values of the sales centroid (CKBD1) and the development centroid are obtained. Calculation methods other than the average may be used, and the job position may additionally be taken into account.
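The centroid of Step 4 is simply the mean of the member coordinates belonging to one team or organization; a minimal sketch, with illustrative member names:

```python
def team_centroid(positions, members):
    """Mean x and y over a team's member coordinates (Step 4, sketched)."""
    xs = [positions[m][0] for m in members]
    ys = [positions[m][1] for m in members]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

positions = {"Sato": (2, 4), "Suzuki": (6, 8)}
print(team_centroid(positions, ["Sato", "Suzuki"]))  # → (4.0, 6.0)
```

Any other aggregate (e.g. a job-position-weighted mean, as the text allows) could replace the plain average.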
 職位階層ネットワーク図座標特定(CKB)は最適な座標値を求めることができればよく、他の処理を用いてもかまわない。 The job position hierarchy network diagram coordinate specification (CKB) only needs to obtain optimal coordinate values, and other processing may be used.
 職位階層ネットワーク図座標特定(CKB)の座標値を格納したものが、職位階層ネットワーク図座標リスト(FK)である。解析結果テーブル(F)の職位階層ネットワーク図座標リスト(FK)の例を図56に示す。 The position hierarchy network diagram coordinate list (FK) stores the coordinate values of the position hierarchy network diagram coordinate specification (CKB). An example of the job title hierarchical network diagram coordinate list (FK) of the analysis result table (F) is shown in FIG.
 職位階層ネットワーク図座標リスト(FK)の期間(FK1)はデータに含まれている期間を示しており、対面マトリックス(FC1C)の期間(FC1C3)と同じである。日数(FK2)はデータに含まれている日数を示しており、日数(FC1C4)と同じである。実質日数(FK3)は期間(FK1)の営業日数を示したものであり、営業日数(FC1C5)と同じである。時間分解能(FK4)は対面テーブル(FC1A)における時間分解能であり、時間分解能(FC1C6)と同じである。対面判定時間(FK5)は対面したと判定するための閾値であり、対面判定時間(FC1C7)と同じである。 The period (FK1) of the job position hierarchical network diagram coordinate list (FK) indicates the period included in the data, which is the same as the period (FC1C3) of the facing matrix (FC1C). The number of days (FK2) indicates the number of days included in the data, which is the same as the number of days (FC1C4). The actual number of days (FK3) represents the number of business days of the period (FK1), which is the same as the number of business days (FC1C5). The time resolution (FK4) is the time resolution in the facing table (FC1A), which is the same as the time resolution (FC1C6). The face-to-face determination time (FK5) is a threshold value for determining that the face-to-face is met, and is the same as the face-to-face determination time (FC1C7).
 ユーザID(FK6)はユーザのIDを示しており、ユーザID表(IA)と対応している。座標値(FK7)は職位階層ネットワーク図座標特定(CKB)によって求めたメンバの座標値が格納されている。 The user ID (FK6) indicates the ID of the user and corresponds to the user ID table (IA). As coordinate values (FK7), coordinate values of members obtained by position hierarchy network diagram coordinate specification (CKB) are stored.
 チーム名(FK8)はチームの名前を示しており、ユーザID表(IA)と対応している。座標値(FK9)は職位階層ネットワーク図座標特定(CKB)によって求めたチームの重心の座標値が格納されている。 The team name (FK8) indicates the name of the team and corresponds to the user ID table (IA). The coordinate value (FK9) stores the coordinate value of the center of gravity of the team obtained by the position hierarchy network diagram coordinate specification (CKB).
 職位階層ネットワーク図描画(JK)では、職位階層ネットワーク図座標特定(CKB)によって生成した職位階層ネットワーク図座標リスト(FK)とユーザ/場所情報テーブル(I)のユーザID表(IA)を用いて、対面コミュニケーションと職位の階層を同時に表示可能なネットワーク図を、図53に示すように描画する。 The job position hierarchy network diagram drawing (JK) uses the job position hierarchy network diagram coordinate list (FK) generated by the job position hierarchy network diagram coordinate specification (CKB) and the user ID table (IA) of the user/place information table (I) to draw a network diagram that can simultaneously display face-to-face communication and the job position hierarchy, as shown in FIG. 53.
 職位階層ネットワーク図(KK)で示す、期間:2009年7月1日-7月31日(KK1)は職位階層ネットワーク図座標リスト(FK)で用いた期間(FK1)を示している。 The period July 1-July 31, 2009 (KK1) shown in the job position hierarchy network diagram (KK) indicates the period (FK1) used for the job position hierarchy network diagram coordinate list (FK).
 ネットワーク図(KKA)では、まず、画面上に予め職位別の配置エリアを決めておき、ユーザID表(IA)の職位(IA4)に従って、例に記載してある部長(KKA1)、課長(KKA2)、担当(KKA3)のような、職位のエリアを描画する。 In the network diagram (KKA), placement areas by job position are first determined in advance on the screen, and job position areas such as the department manager (KKA1), section manager (KKA2), and staff (KKA3) in the example are drawn in accordance with the job position (IA4) of the user ID table (IA).
 職位階層ネットワーク図座標リスト(FK)に格納されているユーザID(FK6)とその座標値(FK7)を基に図形をプロットする。また、プロットした図形の周囲に、ユーザID表(IA)のユーザ名(IA2)を記載する。図形の形はユーザID表(IA)に記載してある、チーム名(IA3)、職位(IA4)、組織(IA5)によって変えてもかまわない。 Figures are plotted based on the user IDs (FK6) and their coordinate values (FK7) stored in the job position hierarchy network diagram coordinate list (FK). The user name (IA2) of the user ID table (IA) is written beside each plotted figure. The shape of the figure may be varied according to the team name (IA3), job position (IA4), or organization (IA5) given in the user ID table (IA).
 また、解析結果テーブル(F)の対面マトリックス(FC1C)から2者間の対面時間を示している。ある一定時間以上の対面時間があるときに、配置したメンバ同士に線を結ぶ。その際に、対面時間に比例して線の太さ等を変更してもよい。 In addition, the facing time between each pair of members is obtained from the facing matrix (FC1C) of the analysis result table (F); when the facing time exceeds a certain threshold, a line is drawn between the placed members, and the thickness of the line may be varied in proportion to the facing time.
 さらに、職位階層ネットワーク図座標リスト(FK)に格納されているチーム名(FK8)と座標値(FK9)を基に、チームの重心座標値を中心とした、エリア図形のプロットを行なう。例に記載してある営業(KKA5)、開発(KKA6)のような、チーム名のエリアを描画する。 Furthermore, based on the team names (FK8) and coordinate values (FK9) stored in the job position hierarchy network diagram coordinate list (FK), area figures centered on each team's centroid coordinates are plotted; team name areas such as sales (KKA5) and development (KKA6) in the example are drawn.
 次に、階層内外指標(KKB)では、階層内外組織周波数解析(CKA)にて求めた階層内と階層間の組織周波数指標を記載する。図53の階層内外指標(KKB)では表形式での表示であるが、それをグラフ化(折れ線、棒線、円、帯、散布図、レーダーチャート)して表示してもかまわない。 Next, the intra-/inter-hierarchy index (KKB) presents the intra-hierarchy and inter-hierarchy organization frequency indices obtained by the intra-/inter-hierarchy organization frequency analysis (CKA). In FIG. 53 the intra-/inter-hierarchy index (KKB) is displayed in tabular form, but it may instead be displayed as a graph (line, bar, pie, band, scatter, or radar chart).
 さらに、図53の職位階層ネットワーク図(KK)のネットワーク図(KKA)では、ユーザID表(IA)における組織(IA5)が金融のメンバを表示した例であるが、これをチーム名(IA3)に限定した表示が図57のネットワーク図(KKC)である。ネットワーク図(KKA)を作成したとき同じように、該当メンバを抽出し、そのメンバの座標値が記載している職位階層ネットワーク図座標リスト(FK)をみて、図形をプロットする。ネットワーク図(KKC)の例ではチーム名(IA3)における営業のメンバのみを選択して、表示している。解析結果テーブル(F)の対面マトリックス(FC1C)から2者間の対面時間を示している。ある一定時間以上の対面時間があるときに、配置したメンバ同士に線を結ぶ。その際に、対面時間に比例して線の太さ等を変更してもよい。 Furthermore, while the network diagram (KKA) of the job position hierarchy network diagram (KK) in FIG. 53 is an example displaying members whose organization (IA5) in the user ID table (IA) is Finance, the network diagram (KKC) in FIG. 57 restricts this display to a team name (IA3). As when the network diagram (KKA) was created, the relevant members are extracted and figures are plotted by referring to the job position hierarchy network diagram coordinate list (FK) containing their coordinate values. In the example of the network diagram (KKC), only the members whose team name (IA3) is Sales are selected and displayed. The facing time between each pair of members is obtained from the facing matrix (FC1C) of the analysis result table (F); when the facing time exceeds a certain threshold, a line is drawn between the placed members, and the thickness of the line may be varied in proportion to the facing time.
 さらに、組織のメンバと外部のメンバとの繋がりを示したものが図57のネットワーク図(KKD)である。ネットワーク図(KKA)やネットワーク図(KKC)を作成したとき同じように、該当メンバを抽出し、そのメンバの座標値が記載している職位階層ネットワーク図座標リスト(FK)をみて、図形をプロットする。ネットワーク図(KKD)の例ではチーム名(IA3)における営業のメンバと営業のメンバと繋がっているメンバ(外部メンバ)を選び、表示している。解析結果テーブル(F)の対面マトリックス(FC1C)から2者間の対面時間を示している。ある一定時間以上の対面時間があるときに、配置したメンバ同士に線を結ぶ。その際に、対面時間に比例して線の太さ等を変更してもよい。また、外部メンバ同士の対面については、メンバ同士の線を結ばなくてもかまわない。 Furthermore, the network diagram (KKD) in FIG. 57 shows the connections between members of the organization and external members. As when the network diagrams (KKA) and (KKC) were created, the relevant members are extracted and figures are plotted by referring to the job position hierarchy network diagram coordinate list (FK) containing their coordinate values. In the example of the network diagram (KKD), the Sales members of team name (IA3) and the members connected to them (external members) are selected and displayed. The facing time between each pair is obtained from the facing matrix (FC1C) of the analysis result table (F); when the facing time exceeds a certain threshold, a line is drawn between the placed members, and the thickness of the line may be varied in proportion to the facing time. Lines need not be drawn for face-to-face meetings between external members.
 さらにネットワーク図(KKC)やネットワーク図(KKD)に応じて、階層内外指標(KKB)を表示してもかまわない。 Furthermore, the intra-/inter-hierarchy index (KKB) may be displayed in accordance with the network diagram (KKC) or the network diagram (KKD).
 本実施例によれば、対面コミュニケーションと職位の階層を同時に表示可能なネットワーク図を生成することで、ネットワーク図上で、実際にどのようなチーム構成で活動しているのかがわかるようになる。 According to this embodiment, by generating a network diagram that can simultaneously display face-to-face communication and the job position hierarchy, it becomes possible to see on the network diagram the team composition in which the organization actually operates.
 以上、本発明の実施例について説明したが、本発明は上記実施例に限定されるものではなく、種々変形実施可能であり、上述した各実施例を適宜組み合わせることが可能であることは、当業者に理解されよう。 While embodiments of the present invention have been described above, the present invention is not limited to these embodiments; various modifications are possible, and those skilled in the art will understand that the embodiments described above may be combined as appropriate.
 TR 名札型センサノード (name tag type sensor node)
 GW 基地局 (base station)
 SS センサネットサーバ (sensor network server)
 AS アプリケーションサーバ (application server)
 CL クライアント (client)
 NW ネットワーク (network)
 ASME 記憶部 (storage unit)
 ASCO 制御部 (control unit)
 CA モデル化解析 (modeling analysis)
 ASCC 通信制御 (communication control)
 ASSR 送受信部 (transmission/reception unit)

Claims (14)

  1.  複数の人物で構成される組織の分析を行う組織行動分析装置であって、
     上記複数の人物それぞれに装着される端末の赤外線送受信部及び加速度センサで取得されるセンサデータ、及び、上記複数の人物それぞれの主観的評価又は客観的評価を示すデータを受信する受信部と、
     上記センサデータ及び上記主観的評価又は客観的評価を示すデータを解析する制御部と、
     上記制御部が解析を行うための解析条件と上記制御部が解析した結果とを記録する記録部と、を備え、
     上記制御部は、
     上記複数の人物ごとに、上記組織内での人物間の関係及び上記組織内での行動を示す指標を、上記解析条件に基づいて上記センサデータから算出して上記記録部に記録し、
     上記複数の人物それぞれの主観的評価又は上記客観的評価を示すデータと、上記組織内での人物間の関係及び上記組織内での行動を示す指標との相関をとり、上記組織における上記主観的評価又は上記客観的評価を示すデータの要因を特定する組織行動分析装置。
    An organization behavior analysis apparatus for analyzing an organization composed of a plurality of persons, comprising:
    A receiving unit that receives sensor data acquired by the infrared transmitting / receiving unit and the acceleration sensor of the terminal attached to each of the plurality of persons, and data indicating a subjective evaluation or an objective evaluation of each of the plurality of persons;
    A control unit that analyzes the sensor data and data indicating the subjective evaluation or the objective evaluation;
    A recording unit that records analysis conditions for the control unit to analyze and a result of analysis performed by the control unit;
    The control unit
    For each of the plurality of persons, an index indicating the relationships between persons in the organization and the behavior in the organization is calculated from the sensor data based on the analysis conditions and recorded in the recording unit; and
    the data indicating the subjective evaluation or the objective evaluation of each of the plurality of persons is correlated with the index indicating the relationships between persons in the organization and the behavior in the organization, to identify factors behind the data indicating the subjective evaluation or the objective evaluation in the organization.
  2.  請求項1に記載の組織行動分析装置において、
     上記制御部は、
     上記複数の人物ごとに、行動と思考の特性を示す指標を、上記解析条件に基づいて上記主観的評価を示すデータから算出して上記記録部に記録し、
     上記複数の人物それぞれの主観的評価又は上記客観的評価を示すデータと、上記行動と思考の特性を示す指標との相関をとり、上記組織における上記主観的評価又は上記客観的評価を示すデータの要因を特定する組織行動分析装置。
    In the organization behavior analysis device according to claim 1,
    The control unit
    For each of the plurality of persons, an index indicating characteristics of behavior and thinking is calculated from the data indicating the subjective evaluation based on the analysis conditions and recorded in the recording unit; and
    the data indicating the subjective evaluation or the objective evaluation of each of the plurality of persons is correlated with the index indicating the characteristics of behavior and thinking, to identify factors behind the data indicating the subjective evaluation or the objective evaluation in the organization.
  3.  請求項1に記載の組織行動分析装置において、
     上記制御部は、
     上記赤外線送受信部で取得されるデータから上記複数の人物それぞれの対面状況を示す対面テーブルを作成する対面テーブル作成部と、
     上記加速度センサで取得されるデータから上記複数の人物それぞれの動きを示す身体リズムテーブルを作成する身体リズムテーブル作成部と、
     上記対面デーブルから作成されるネットワーク図に基づいて、上記複数の人物それぞれの他の人物との繋がりを示すネットワーク指標を抽出するネットワーク指標抽出部と、
     上記身体リズムテーブルから上記複数の人物それぞれの動きを示す周波数の出現頻度及び上記周波数の継続性を含む身体リズム指標を抽出する身体リズム指標抽出部と、
     上記対面テーブルと上記身体リズムテーブルに基づいて、上記複数の人物それぞれの対面時間及び対面積極性を示す対面指標を抽出する対面指標抽出部と、
     上記対面テーブルと上記身体リズムテーブルに基づいて、上記複数の人物それぞれの活動時間を示す組織活動指標を算出する組織活動指標抽出部と、を有し、
    上記組織内での人物間の関係及び上記組織内での行動を示す指標として、上記ネットワーク指標、上記身体リズム指標、上記対面指標、及び上記組織活動指標を用いる組織行動分析装置。
    In the organization behavior analysis device according to claim 1,
    The control unit
    A meeting table creation unit that creates a meeting table indicating the meeting situations of each of the plurality of persons from the data acquired by the infrared transmission / reception unit;
    A body rhythm table creation unit for creating a body rhythm table indicating the movement of each of the plurality of persons from the data acquired by the acceleration sensor;
    A network index extraction unit for extracting a network index indicating a connection between each of the plurality of persons with another person based on the network diagram created from the face-to-face table;
    A body rhythm index extracting unit for extracting a body rhythm index including the frequency of appearance of the frequency indicating the movement of each of the plurality of persons and the continuity of the frequency from the body rhythm table;
    A face-to-face index extraction unit that extracts a face-to-face index indicating the face-to-face time and face-to-face activeness of each of the plurality of persons based on the face-to-face table and the body rhythm table;
    An organization activity index extracting unit that calculates an organization activity index indicating an activity time of each of the plurality of persons based on the facing table and the body rhythm table;
    An organization behavior analysis apparatus using the network index, the body rhythm index, the face-to-face index, and the organization activity index as the indices indicating the relationships between persons in the organization and the behavior in the organization.
  4.  請求項1に記載の組織行動分析装置において、
     上記客観的評価を示すデータとは、上記複数の人物それぞれの生産性、事故不良の少なくとも何れか1つを示すデータである組織行動分析装置。
    In the organization behavior analysis device according to claim 1,
    The data indicating the objective evaluation is data indicating at least one of the productivity and the accident/defect record of each of the plurality of persons.
  5.  請求項1に記載の組織行動分析装置において、
     上記主観的評価を示すデータとは、上記複数の人物それぞれのリーダシップ/チームワーク指標、やりがい/充実指標、及びストレス/メンタル不調指標の少なくとも何れか1つである組織行動分析装置。
    In the organization behavior analysis device according to claim 1,
    The data indicating the subjective evaluation is at least one of a leadership/teamwork index, a motivation/fulfillment index, and a stress/mental-disorder index for each of the plurality of persons.
  6.  請求項1に記載の組織行動分析装置において、
     上記制御部は、上記複数の人物それぞれの主観的評価又は客観的評価を示すデータと、上記組織内の他の人物における上記組織内での人物間の関係及び上記組織内での行動を示す指標から算出される特徴量との相関をとる組織行動分析装置。
    In the organization behavior analysis device according to claim 1,
    The control unit correlates the data indicating the subjective evaluation or objective evaluation of each of the plurality of persons with a feature quantity calculated from the index indicating the relationships between persons in the organization and the behavior in the organization for the other persons in the organization.
  7.  組織を構成する複数の人物それぞれに装着され、対面を示すデータを取得する赤外線送受信部と、加速度データを取得する加速度センサと、上記対面を示すデータ及び上記加速度データをセンサデータとして送信する送信部と、を有する端末と、
     上記センサデータを受信し、かつ、上記複数の人物それぞれの主観的評価又は客観的評価を示すデータを受信する受信部と、上記センサデータ及び上記主観的評価又は客観的評価を示すデータを解析する制御部と、上記制御部が解析を行うための解析条件と上記制御部が解析した結果とを記録する記録部と、を有する組織行動分析装置とを備え、
     上記制御部は、
     上記複数の人物ごとに、上記組織内での人物間の関係及び上記組織内での行動を示す指標を、上記解析条件に基づいて上記センサデータから算出して上記記録部に記録し、
     上記複数の人物それぞれの主観的評価又は上記客観的評価を示すデータと、上記組織内での人物間の関係及び上記組織内での行動を示す指標との相関をとり、上記組織における上記主観的評価又は上記客観的評価を示すデータの要因を特定する組織行動分析システム。
    A terminal that is attached to each of a plurality of persons constituting an organization and has an infrared transmitting/receiving unit that acquires data indicating face-to-face meetings, an acceleration sensor that acquires acceleration data, and a transmitting unit that transmits the data indicating the face-to-face meetings and the acceleration data as sensor data; and
    an organization behavior analysis device having a receiving unit that receives the sensor data and receives data indicating a subjective evaluation or an objective evaluation of each of the plurality of persons, a control unit that analyzes the sensor data and the data indicating the subjective evaluation or the objective evaluation, and a recording unit that records analysis conditions for the control unit to perform analysis and results of the analysis performed by the control unit;
    The control unit
    For each of the plurality of persons, an index indicating the relationships between persons in the organization and the behavior in the organization is calculated from the sensor data based on the analysis conditions and recorded in the recording unit; and
    the data indicating the subjective evaluation or the objective evaluation of each of the plurality of persons is correlated with the index indicating the relationships between persons in the organization and the behavior in the organization, to identify factors behind the data indicating the subjective evaluation or the objective evaluation in the organization.
  8.  請求項7に記載の組織行動分析システムにおいて、
     上記制御部は、
     上記複数の人物ごとに、行動と思考の特性を示す指標を、上記解析条件に基づいて上記主観的評価を示すデータから算出して上記記録部に記録し、
     上記複数の人物それぞれの主観的評価又は上記客観的評価を示すデータと、上記行動と思考の特性を示す指標との相関をとり、上記組織における上記主観的評価又は上記客観的評価を示すデータの要因を特定する組織行動分析システム。
    In the organization behavior analysis system according to claim 7,
    The control unit
    For each of the plurality of persons, an index indicating characteristics of behavior and thinking is calculated from the data indicating the subjective evaluation based on the analysis conditions and recorded in the recording unit; and
    the data indicating the subjective evaluation or the objective evaluation of each of the plurality of persons is correlated with the index indicating the characteristics of behavior and thinking, to identify factors behind the data indicating the subjective evaluation or the objective evaluation in the organization.
  9.  請求項7に記載の組織行動分析システムにおいて、
     上記制御部は、
     上記赤外線送受信部で取得されるデータから上記複数の人物それぞれの対面状況を示す対面テーブルを作成する対面テーブル作成部と、
     上記加速度センサで取得されるデータから上記複数の人物それぞれの動きを示す身体リズムテーブルを作成する身体リズムテーブル作成部と、
     上記対面デーブルから作成されるネットワーク図に基づいて、上記複数の人物それぞれの他の人物との繋がりを示すネットワーク指標を抽出するネットワーク指標抽出部と、
     上記身体リズムテーブルから上記複数の人物それぞれの動きを示す周波数の出現頻度及び上記周波数の継続性を含む身体リズム指標を抽出する身体リズム指標抽出部と、
     上記対面テーブルと上記身体リズムテーブルに基づいて、上記複数の人物それぞれの対面時間及び対面積極性を示す対面指標を抽出する対面指標抽出部と、
     上記対面テーブルと上記身体リズムテーブルに基づいて、上記複数の人物それぞれの活動時間を示す組織活動指標を算出する組織活動指標抽出部と、を有し、
     上記組織内での人物間の関係及び上記組織内での行動を示す指標として、上記ネットワーク指標、上記身体リズム指標、上記対面指標、及び上記組織活動指標を用いる組織行動分析システム。
    In the organization behavior analysis system according to claim 7,
    The control unit
    A meeting table creation unit that creates a meeting table indicating the meeting situations of each of the plurality of persons from the data acquired by the infrared transmission / reception unit;
    A body rhythm table creation unit for creating a body rhythm table indicating the movement of each of the plurality of persons from the data acquired by the acceleration sensor;
    A network index extraction unit for extracting a network index indicating a connection between each of the plurality of persons with another person based on the network diagram created from the face-to-face table;
    A body rhythm index extracting unit for extracting a body rhythm index including the frequency of appearance of the frequency indicating the movement of each of the plurality of persons and the continuity of the frequency from the body rhythm table;
    A face-to-face index extraction unit that extracts a face-to-face index indicating the face-to-face time and face-to-face activeness of each of the plurality of persons based on the face-to-face table and the body rhythm table;
    An organization activity index extracting unit that calculates an organization activity index indicating an activity time of each of the plurality of persons based on the facing table and the body rhythm table;
    An organization behavior analysis system using the network indicator, the body rhythm indicator, the face-to-face indicator, and the organization activity indicator as an indicator indicating a relationship between persons in the organization and an action in the organization.
  10.  請求項7に記載の組織行動分析システムにおいて、
     上記客観的評価を示すデータとは、上記複数の人物それぞれの生産性、事故不良の少なくとも何れか1つを示すデータである組織行動分析システム。
    In the organization behavior analysis system according to claim 7,
    The data indicating the objective evaluation is data indicating at least one of the productivity and the accident/defect record of each of the plurality of persons.
  11.  請求項7に記載の組織行動分析システムにおいて、
     上記主観的評価を示すデータとは、上記複数の人物それぞれのリーダシップ/チームワーク指標、やりがい/充実指標、及びストレス/メンタル不調指標の少なくとも何れか1つである組織行動分析システム。
    In the organization behavior analysis system according to claim 7,
    The data indicating the subjective evaluation is at least one of a leadership/teamwork index, a motivation/fulfillment index, and a stress/mental-disorder index for each of the plurality of persons.
  12.  請求項7に記載の組織行動分析システムにおいて、
     上記制御部は、上記複数の人物それぞれの主観的評価又は客観的評価を示すデータと、上記組織内の他の人物における上記組織内での人物間の関係及び上記組織内での行動を示す指標から算出される特徴量との相関をとる組織行動分析システム。
    In the organization behavior analysis system according to claim 7,
    The control unit correlates the data indicating the subjective evaluation or objective evaluation of each of the plurality of persons with a feature quantity calculated from the index indicating the relationships between persons in the organization and the behavior in the organization for the other persons in the organization.
  13.  複数の人物で構成される組織の分析を行う組織行動分析装置であって、
     上記複数の人物それぞれの主観的評価を示すデータを受信する受信部と、
     上記主観的評価を示すデータを解析する制御部と、
     上記組織内の座席位置を示すデータと、制御部が解析を行うための解析条件と上記制御部が解析した結果とを記録する記録部と、を備え、
     上記制御部は、
     上記複数の人物ごとに、ストレスに関連する指標を、上記解析条件に基づいて上記主観的評価を示すデータから算出する指標計算部と、
     上記座席位置を示すデータ及び上記ストレスに関連する指標に基づいて上記複数の人物それぞれの上記組織内での座席配置を決定する座席配置決定部と、を有する組織行動分析装置。
    An organization behavior analysis apparatus for analyzing an organization composed of a plurality of persons, comprising:
    A receiving unit that receives data indicating the subjective evaluation of each of the plurality of persons;
    A control unit that analyzes data indicating the subjective evaluation;
    A recording unit that records data indicating seat positions in the organization, analysis conditions for the control unit to perform analysis, and results of the analysis performed by the control unit;
    The control unit
    An index calculation unit that calculates, for each of the plurality of persons, an index related to stress from data indicating the subjective evaluation based on the analysis condition;
    And a seat arrangement determination unit configured to determine a seat arrangement of each of the plurality of persons in the organization based on the data indicating the seat position and the index related to the stress.
  14.  請求項13に記載の組織行動分析装置において、
     上記受信部は、上記複数の人物それぞれに装着される端末の赤外線送受信部で取得されるセンサデータを受信し、
     上記制御部は、上記赤外線送受信部で取得されるデータから上記複数の人物それぞれの対面状況を示す対面テーブルを上記解析条件に基づいて作成する対面テーブル作成部と、上記対面テーブルから作成されるネットワーク図に基づいて、上記複数の人物間の対面距離を上記解析条件を用いて計算する対面距離計算部と、をさらに有し、
     上記座席配置決定部は、上記座席位置を示すデータ及び上記対面距離に基づいて上記複数の人物それぞれの上記組織内での座席配置を決定する組織行動分析装置。
    In the organization behavior analysis device according to claim 13,
    The receiving unit receives sensor data acquired by the infrared transmitting/receiving unit of the terminal attached to each of the plurality of persons;
    the control unit further has a face-to-face table creation unit that creates, based on the analysis conditions, a face-to-face table indicating the face-to-face situation of each of the plurality of persons from the data acquired by the infrared transmitting/receiving unit, and a face-to-face distance calculation unit that calculates the face-to-face distances between the plurality of persons using the analysis conditions, based on the network diagram created from the face-to-face table; and
    the seat arrangement determination unit determines the seat arrangement of each of the plurality of persons in the organization based on the data indicating the seat positions and the face-to-face distances.
PCT/JP2010/068289 2009-11-04 2010-10-18 Organization behavior analyzer and organization behavior analysis system WO2011055628A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011539329A JP5400895B2 (en) 2009-11-04 2010-10-18 Organizational behavior analysis apparatus and organizational behavior analysis system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009252576 2009-11-04
JP2009-252576 2009-11-04

Publications (1)

Publication Number Publication Date
WO2011055628A1 true WO2011055628A1 (en) 2011-05-12

Family

ID=43969868

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/068289 WO2011055628A1 (en) 2009-11-04 2010-10-18 Organization behavior analyzer and organization behavior analysis system

Country Status (2)

Country Link
JP (1) JP5400895B2 (en)
WO (1) WO2011055628A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013058128A (en) * 2011-09-09 2013-03-28 P & W Solutions Co Ltd Apparatus, method, and program for designing seating arrangement
WO2017037839A1 (en) * 2015-08-31 2017-03-09 株式会社日立製作所 Information processing device and information processing method
JP2017059111A (en) * 2015-09-18 2017-03-23 Necソリューションイノベータ株式会社 Organization improvement activity support system, information processing apparatus, method and program
JP2018088245A (en) * 2016-11-22 2018-06-07 パナソニックIpマネジメント株式会社 Cabin crew evaluation system and cabin crew evaluation method
JP2018136624A (en) * 2017-02-20 2018-08-30 ソーシャルアドバンス株式会社 Analysis apparatus for occupational stress survey and control program of analysis apparatus for occupational stress survey
US10258272B2 (en) 2015-10-08 2019-04-16 International Business Machines Corporation Identifying stress levels associated with context switches
WO2019146200A1 (en) * 2018-01-23 2019-08-01 ソニー株式会社 Information processing device, information processing method, and recording medium
CN112235564A (en) * 2020-09-10 2021-01-15 当趣网络科技(杭州)有限公司 Data processing method and device based on delivery channel
JP2021033326A (en) * 2019-08-13 2021-03-01 株式会社Marvellous Labo Organizational development support device and organizational development support program to support organizational development
JP7376155B2 (en) 2022-03-04 2023-11-08 Necプラットフォームズ株式会社 Processing equipment, processing system, processing method and program
JP7394422B1 (en) 2023-07-11 2023-12-08 Metateam株式会社 Teamwork visualization system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005128930A (en) * 2003-10-27 2005-05-19 Casio Comput Co Ltd Personnel information processing device and program
JP2008301071A (en) * 2007-05-30 2008-12-11 Hitachi Ltd Sensor node
JP2009181559A (en) * 2008-02-01 2009-08-13 Hitachi Ltd Analysis system and analysis server

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013058128A (en) * 2011-09-09 2013-03-28 P & W Solutions Co Ltd Apparatus, method, and program for designing seating arrangement
WO2017037839A1 (en) * 2015-08-31 2017-03-09 株式会社日立製作所 Information processing device and information processing method
JPWO2017037839A1 (en) * 2015-08-31 2018-08-02 株式会社日立製作所 Information processing apparatus and information processing method
JP2017059111A (en) * 2015-09-18 2017-03-23 Necソリューションイノベータ株式会社 Organization improvement activity support system, information processing apparatus, method and program
US10258272B2 (en) 2015-10-08 2019-04-16 International Business Machines Corporation Identifying stress levels associated with context switches
US10966648B2 (en) 2015-10-08 2021-04-06 International Business Machines Corporation Identifying stress levels associated with context switches
JP2018088245A (en) * 2016-11-22 2018-06-07 パナソニックIpマネジメント株式会社 Cabin crew evaluation system and cabin crew evaluation method
JP2018136624A (en) * 2017-02-20 2018-08-30 ソーシャルアドバンス株式会社 Analysis apparatus for occupational stress survey and control program of analysis apparatus for occupational stress survey
JPWO2019146200A1 (en) * 2018-01-23 2021-01-07 ソニー株式会社 Information processing equipment, information processing methods, and recording media
WO2019146200A1 (en) * 2018-01-23 2019-08-01 ソニー株式会社 Information processing device, information processing method, and recording medium
JP7367530B2 (en) 2018-01-23 2023-10-24 ソニーグループ株式会社 Information processing device, information processing method, and program
JP2021033326A (en) * 2019-08-13 2021-03-01 株式会社Marvellous Labo Organizational development support device and organizational development support program to support organizational development
CN112235564A (en) * 2020-09-10 2021-01-15 当趣网络科技(杭州)有限公司 Data processing method and device based on delivery channel
CN112235564B (en) * 2020-09-10 2023-06-27 当趣网络科技(杭州)有限公司 Data processing method and device based on delivery channel
JP7376155B2 (en) 2022-03-04 2023-11-08 Necプラットフォームズ株式会社 Processing equipment, processing system, processing method and program
JP7394422B1 (en) 2023-07-11 2023-12-08 Metateam株式会社 Teamwork visualization system

Also Published As

Publication number Publication date
JP5400895B2 (en) 2014-01-29
JPWO2011055628A1 (en) 2013-03-28

Similar Documents

Publication Publication Date Title
WO2011055628A1 (en) Organization behavior analyzer and organization behavior analysis system
JP5092020B2 (en) Information processing system and information processing apparatus
Im et al. Drivers and resources of customer co-creation: A scenario-based case in the restaurant industry
Consolvo et al. Conducting in situ evaluations for and with ubiquitous computing technologies
Olguín-Olguín et al. Sensor-based organisational design and engineering
Olguín et al. Social sensors for automatic data collection
US10381115B2 (en) Systems and methods of adaptive management of caregivers
JP2008287690A (en) Group visualization system and sensor-network system
Demir et al. A Next-Generation Augmented Reality Platform for Mass Casualty Incidents (MCI).
JP2009181559A (en) Analysis system and analysis server
JP5503719B2 (en) Performance analysis system
Klein et al. Shedding light on the usability of ecosystem services–based decision support systems: An eye-tracking study linked to the cognitive probing approach
JP2010198261A (en) Organization cooperative display system and processor
Vankipuram et al. Overlaying multiple sources of data to identify bottlenecks in clinical workflow
JP5372557B2 (en) Knowledge creation behavior analysis system and processing device
JP5591725B2 (en) Sensor information processing analysis system and analysis server
Bonaquist et al. An automated machine learning pipeline for monitoring and forecasting mobile health data
Waber et al. Sociometric badges: A new tool for IS research
JP5025800B2 (en) Group visualization system and sensor network system
Thakur et al. Human-Computer Interaction and Beyond: Advances Towards Smart and Interconnected Environments (Part II)
Ajibola et al. Development of landscape of usability evaluation methods for Mobile applications
US20220378297A1 (en) System for monitoring neurodegenerative disorders through assessments in daily life settings that combine both non-motor and motor factors in its determination of the disease state
Chen TeamDNA: Automatic Measures of Effective Teamwork Processes from Unconstrained Team Meeting Recordings
Loureiro ETdA: Ergonomic Tridimensional Analysis for common areas with circulation of people
Swain PASSIVE SENSING FRAMEWORKS FOR THE FUTURE OF INFORMATION WORKERS

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10828188

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011539329

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10828188

Country of ref document: EP

Kind code of ref document: A1