WO2014197176A2 - Social sensing and behavioral analysis system - Google Patents
- Publication number
- WO2014197176A2 (PCT/US2014/038035)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
Definitions
- Sensing and modeling social networks within an organization can be of benefit to enhance productivity and workplace happiness. Specifically, researchers look to capture phone, email, or other virtual means of communication in an effort to develop a complete model of a human network. Using a data-driven research approach, researchers can understand how communication networks function in an organization.
- the prior-art wearable computers generally rely on infrared transmission and are large and noticeable amongst others within the communication network. Additionally, certain wearable computers require the user to input subjective measures of how the user felt during certain interactions (e.g., that the interaction went well or went poorly). Naturally, these shortcomings introduce numerous potential errors for researchers studying human behavior within an organization.
- the present invention overcomes the shortcomings of the prior art by providing a method, system, and computer program product for analyzing human behavior within an organization that may be integrated into an existing personnel identification system.
- Embodiments have a significantly improved sensor system for capturing human behavior data, and an improved human behavior analysis system for processing captured sensor data.
- the present invention is a method, system, or computer program product for analyzing human behavior within an organization.
- the method, system, or computer program product comprise providing a plurality of wearable electronic badges to be worn by a plurality of persons, each wearable electronic badge comprising: (a) a unique identification code, (b) a processor, a memory unit, a battery, and (c) a plurality of sensors to capture sensor data.
- the plurality of sensors include the following or equivalents thereof: an infrared sensor, a wireless communication module, a camera, an accelerometer, a compass, a plurality of microphones, an electromagnetic data transfer module, and an inductive battery charger.
- the electromagnetic data transfer module may be a near field communication (NFC) module or a radio frequency identification (RFID) module.
- the method, system, or computer program product further comprise transmitting the sensor data from the worn wearable electronic badges to a server via a base station or directly via the wireless communication module; accepting a plurality of sensor data in the server and processing the plurality of sensor data to create a set of human behavior data; analyzing the set of human behavior data to calculate a plurality of metrics, said analyzing being implemented by one or more processors; and creating and displaying on a computer screen (or otherwise providing) a visualization of the plurality of metrics according to at least one user specification.
- the sensor data includes data about the person wearing the given badge, including at least one of: proximity to another person, direct face-to-face interaction, location, individual speech patterns, conversational turn-taking patterns, orientation, body movement patterns, and posture patterns.
- the sensor data further includes information regarding interactions with the another person.
- the another person is not wearing one of the plurality of wearable electronic badges.
- the proximity of the person to the another person is measured by a combination of at least two of the following: the infrared sensor, the wireless communication module, the compass, and the camera.
- the conversational turn-taking patterns of the person are measured by a combination of at least two of the following: the infrared sensor, the wireless communication module, the compass, the camera, and the plurality of microphones.
- the plurality of wearable electronic badges are implemented into an existing personnel identification network, and the unique identification number is a number assigned by the existing personnel identification network, or manually assigned.
- the plurality of metrics are at least one of: a plurality of social network metrics, a plurality of conversational turn-taking metrics, or a plurality of social signaling metrics.
- the plurality of social network metrics include at least one of: betweenness centrality, degree centrality, cohesion, integration, and exploration.
- the plurality of conversational turn-taking metrics include at least one of: number of turns, number of pauses, number of successful interruptions, number of unsuccessful interruptions, average turn length, average pause length, turn-taking matrix, speaking time, silence time, listening time, overlapping speech time, dominance, interactivity, participation percentage, participation balance, and turn-taking balance.
- the plurality of social signaling metrics include at least one of: activity, consistency, mirroring, and influence; and the plurality of social signaling metrics are calculated based on any one or combination of: body movement patterns, individual speech patterns, and conversational turn-taking patterns.
- the visualization of the plurality of metrics illustrates the relationship between any one of the plurality of metrics and a set of performance metrics, the set of performance metrics selected from: productivity, job satisfaction, employee engagement, and drive.
- the visualization further illustrates a change in the set of performance metrics that corresponds with a change in the sensed human behavior.
- the visualization of the plurality of metrics further includes a visualization of the plurality of metrics according to (or as a function of) location.
- the present invention is a method, system, or computer program product for analyzing human behavior within an organization.
- the method, system, or computer program product can comprise providing a plurality of wearable electronic badges to be worn by a plurality of persons, each wearable electronic badge comprising: (a) a unique identification code, (b) a processor, a memory unit, a battery, and (c) a plurality of sensors to capture sensor data.
- the plurality of sensors include: an infrared sensor, a wireless communication module, a camera, an accelerometer, a compass, a plurality of microphones, a near field communication (NFC) module, a radio frequency identification (RFID) module, and an inductive battery charger, wherein the sensor data is captured by the wireless communication module in combination with at least one other of the sensors in the plurality.
- the method, system, or computer program product further comprise transmitting the sensor data from the worn wearable electronic badges to a server (via a base station, via the wireless communication module, or otherwise); accepting a plurality of sensor data in the server and processing the plurality of sensor data to create a set of human behavior data; analyzing the set of human behavior data to calculate a plurality of metrics, said analyzing being implemented by one or more processors; and generating and displaying on a computer screen a visualization of the plurality of metrics according to at least one user specification.
- the communication module measures a radio signal strength, said measured radio signal strength being used in combination with the captured sensor data from the plurality of sensors to determine when two or more persons are having a conversation.
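The fusion described above can be sketched as follows; this is an illustrative assumption of one possible rule, not the patent's actual algorithm (the dBm threshold, function names, and the use of alternating speech as a turn-taking proxy are all assumed):

```python
# Illustrative fusion of radio signal strength and speech activity to flag a
# conversation. RSSI_NEAR_DBM and the 0.5 alternation ratio are assumed values.

RSSI_NEAR_DBM = -65  # assumed proximity threshold in dBm

def likely_conversation(rssi_dbm, a_speaking, b_speaking):
    """Flag a conversation when two badges are near each other (strong RSSI)
    and the wearers' speech alternates rather than fully overlapping."""
    if rssi_dbm < RSSI_NEAR_DBM:
        return False
    # Count time slices where exactly one of the two people is speaking:
    # alternation is a crude proxy for conversational turn-taking.
    alternating = sum(1 for a, b in zip(a_speaking, b_speaking) if a != b)
    return alternating / max(len(a_speaking), 1) > 0.5
```

A strong radio link alone would also fire for two silent co-workers at adjacent desks, which is why the speech evidence is combined with the RSSI evidence.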
- FIG. 1 is a block flow diagram of an example embodiment of the present invention.
- FIG. 2 is an illustration of a system for social sensing and behavioral analysis according to the present invention.
- FIG. 3 is a schematic of an example badge of the present invention.
- FIGs. 4A-4F are example graphical user interface views of an analysis of the metrics calculated in the present system.
- FIG. 5 illustrates a computer network or similar digital processing environment in which the present invention may be implemented.
- FIG. 6 is a diagram of the internal structure of a computer (e.g., client processor/device or server computer) in the computer system of FIG. 5.
- FIG. 1 is a block flow diagram of an example embodiment method and system 100 of the present invention.
- at step 105, a plurality of wearable electronic badges is provided to be worn by a plurality of persons.
- each badge is implemented into an existing personnel identification system within the organization.
- the badge is implemented into the identification card of a person in an organization.
- the badge is a substitute for an identification card for a person in an organization.
- Each badge at step 105 is equipped with a plurality of sensors.
- the sensor data of the badge is captured and initially stored on the badge.
- the sensor data captured by the badge includes, for example, proximity to another person, direct face-to-face interaction, location, individual speech patterns, conversational turn-taking patterns, orientation, body movement patterns, and posture patterns.
- the badge is designed to be worn by the person throughout the day as the person interacts with other people, and tracks differences in personal behaviors. Further examples of the types of sensor data are described below under the heading "Analysis.”
- the sensor data from the badge is transferred to a base station.
- the transfer of sensor data to the base station can be done through any means known to one skilled in the art, including mechanically, electrically, and wirelessly.
- Wireless data communication or transfer can be achieved, for example, through a local area network or via a wireless radio transfer, such as Bluetooth.
- each base station transmits respective collected sensor data to a server.
- the server can aggregate the data from a plurality of base stations and additionally store the aggregated data in a database.
- the server can process the aggregated sensor data to yield a set of raw human behavior data.
- each badge transmits sensor data directly to the server, for example electronically or wirelessly, through a wireless data communication.
- Other data transfer methods and techniques are suitable so that the server receives, aggregates, and stores sensor data from the badges at step 125.
- the raw human behavior data can be analyzed by a computer or processor to create (or otherwise generate) a plurality of metrics.
- the calculations to create the plurality of metrics are described briefly below under the heading "Analysis,” and in more detail in Applicant's co-pending application, U.S. Patent Application No. 14/172,872, which is incorporated herein by reference in its entirety.
- a visualization of the generated plurality of metrics is created and displayed on a computer screen according to specifications of the user.
- An example of the visualization, as well as more detail with respect to the user specifications are described below with respect to FIGs. 4A through 4F.
- FIG. 2 is an illustration of a computer-based system 200 for social sensing and behavioral analysis implementing the steps of the method 100 of FIG. 1.
- a plurality of wearable electronic badges 205a-n can be dispersed amongst a plurality of people (individuals).
- Each wearable badge 205a-n has a unique identification code, either manually entered by a person, or electronically communicated (or electromagnetically transferred) from an identification card via a near field communication (NFC) module or a radio frequency identification (RFID) module or the like.
- each badge 205a-n is equipped with a wireless transceiver.
- a plurality of base stations 210a-n are placed in various locations throughout the offices or premises of the subject organization or work site.
- the badges 205a-n transmit sensor data to the base stations 210a-n wirelessly or via a mechanical or electrical coupling, such as by a universal serial bus (USB).
- each base station 210a-n can also have a unique identification code.
- the base stations 210a-n have a plurality of sensors similar to the sensors in the badges 205a-n, and can be used for location mapping.
- the base stations 210a-n can create a spatial library of communication patterns that can be used in conjunction with the sensor data collected from the badges 205a-n. The combination of these datasets allows analysis tools to show not only with whom a user interacted, but also where on the premises said interaction occurred. Office layouts and performance metrics combined with communication and conversational patterns allow for determination as to which spaces promote higher levels of productivity, job satisfaction, employee engagement, and other organizational outcomes.
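The spatial library described above can be sketched as a simple join between interaction events and base-station locations; the station IDs, location names, and event fields below are illustrative assumptions:

```python
# Illustrative join of interaction events with base-station locations so that
# analysis can show where on the premises each interaction occurred.
# All identifiers and data here are assumed for illustration.

base_station_locations = {"bs01": "lobby", "bs02": "open office"}

def tag_interactions_with_location(interactions):
    """Attach a 'location' field to each interaction event, looked up from the
    base station that collected it; unknown stations are labeled as such."""
    return [
        {**event,
         "location": base_station_locations.get(event["base_station"], "unknown")}
        for event in interactions
    ]
```

With interactions tagged this way, the metrics can be grouped by location to compare how different spaces correlate with the performance metrics mentioned above.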
- the base stations 210a-n can transmit the collected sensor data either wirelessly, electrically, or mechanically to a plurality of central computers 215a-n.
- the central computers 215a-n are connected to a server 220 either wirelessly, mechanically, or electrically.
- the server 220 can also be connected to a database 223, which may either be housed inside the server 220, or stored externally.
- the sensor data from the base stations 210a-n is aggregated at the server 220 and processed into human behavior data.
- Badges 205 can transmit collected sensor data directly to server 220 wirelessly or via a mechanical or electrical coupling.
- the aggregated data and/or processed human behavior data from the server 220 can then be transferred to a computer 225a-n equipped with a display.
- the aggregated data and/or human behavior data is analyzed and processed to provide feedback to the user.
- the user may view on the computer 225a-n any of the user specified metrics he or she chooses.
- FIG. 3 is a schematic of an example badge 205.
- the badge 205, via a plurality of sensors, captures real-time information about the user's communication, social signaling, and interaction patterns.
- the badge 205 contains a microcontroller 301.
- the microcontroller runs an embedded operating system that is stored on the NAND Flash Memory 320.
- the data collected from the sensors is also stored on the NAND Flash Memory 320.
- the flow of data from the NAND Flash Memory 320 to the microcontroller 301 is achieved through the memory controller 302, which is further coupled to a SDRAM Memory Unit 322.
- the microcontroller 301 can be coupled to an internal read only memory 325, and an internal synchronous random access memory 327.
- the sensor data is collected via a number of peripherals connected to the microcontroller 301.
- the sensor data collected is time stamped from a real time clock 355 within the badge 205.
- the sensors listed below are by way of example and not intended to be limiting, as one of skill in the art may recognize additional or substitute sensors that may still achieve the goals of the present invention.
- the badge 205 can be equipped with a plurality of digital MEMs microphones 340.
- the microphones 340 can be oriented in any direction in order to capture speech and speech patterns not only from the person wearing the badge 205, but also from other individuals in proximity to the person.
- the microphones 340 are capable of capturing speech and speech patterns of individuals not wearing a badge interacting with the person wearing the badge 205.
- the data collected by the microphones 340 can be processed by an audio codec 342 connected to a serial synchronous controller 345 or a serial peripheral interface bus 330 within the badge 205.
- a digital 3-axis compass 332 and a digital 3-axis accelerometer 335 can also be connected to the serial peripheral interface bus 330.
- the digital compass 332 allows the badge 205 to better collect data on the orientation of the person wearing the badge 205, and can aid the analysis system in determining whether two people were interacting or facing each other.
- the accelerometer 335 can collect data regarding the movement of the person wearing the badge 205. Movement includes, for example, speed of movement, body position including how he or she stands, and/or posture.
- the data from the compass 332 and the accelerometer 335 can be collected as a single package.
- the badge 205 can also be equipped with an infrared data association transceiver (IrDA) 305 for collecting infrared (IR) data.
- IrDA 305 can be coupled to a universal asynchronous receiver/transmitter (USART) 303.
- the IR data is used, in part, to detect proximity to another person and face-to-face interactions.
- the badge 205 can be equipped with an image sensor interface (referred to herein as a camera) 357, used to detect facial expressions when two people are interacting face-to-face.
- the camera 357 is a complementary metal-oxide-semiconductor (CMOS) camera.
- a wireless communication module such as a Bluetooth™ dual mode module 310 can be used in the badge 205.
- the wireless communication module 310 is not limited to Bluetooth™, and can be any module using short-wavelength UHF radio waves in the ISM band from 2.4 to 2.485 GHz, or another suitable wireless communication method.
- the wireless communication module can collect data (such as radio signals) in conjunction with any of the other sensors in the badge 205.
- the combination of Bluetooth and IR data (from modules 310, 305) for proximity detection is a unique solution in that it allows for more accurate information about the environment and the people nearby.
- the limited range of the IR sensors 305 also allows for more accurate detection of face-to-face interactions whereby the IR sensor is used to detect actual face-to-face discussions and the Bluetooth low energy module 310 is used for proximity detection.
- the wireless communication module 310 can measure radio signal strength received from another source, such as a badge 205 worn by another person. Combining the measured radio signal strength with captured sensor data from the plurality of sensors can allow the system to determine when two or more persons are having a conversation. In other examples, long-range infrared transmissions cause the signals to bounce off walls and thus increase detections of persons out of face-to-face range. By combining Bluetooth data (or radio signal strength) and IR data with speech data from the microphones 340, it is possible to detect conversations with much higher accuracy than by using the IR signals of IR sensors 305 alone.
- the camera 357 and compass 332 are additions that may help in the detection of face-to-face interactions.
- the compass 332 detects the orientation of each user. If the received signal strength indication (RSSI) values received from the Bluetooth Dual Mode Module 310 are strong enough, then it is likely a conversation is occurring if the users are oriented towards each other. In another instance where IrDA Transceiver 305 alone may not be able to detect all users in a conversation, or if a user is interacting with a non-badge wearing participant, the camera 357 can be used for face detection. In conjunction with the IrDA Transceiver 305, the Bluetooth Dual Mode Module 310, the compass 332, and the camera 357 sensors allow for a more accurate portrayal of face-to-face interactions.
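One possible sketch of the orientation check described above, under the assumption that each badge reports a compass heading in degrees and that "oriented towards each other" means roughly opposite headings; the tolerance and RSSI threshold are illustrative values, not taken from the patent:

```python
# Illustrative check: strong RSSI plus roughly opposite compass headings
# suggests two wearers are facing each other. Threshold values are assumed.

RSSI_STRONG_DBM = -60  # assumed "strong enough" RSSI threshold in dBm

def facing_each_other(heading_a_deg, heading_b_deg, tolerance_deg=30.0):
    """True when the two wearers' compass headings are roughly opposite,
    i.e. each person is oriented toward the other."""
    diff = abs((heading_a_deg - heading_b_deg) % 360.0)
    diff = min(diff, 360.0 - diff)  # fold the difference into [0, 180]
    return abs(diff - 180.0) <= tolerance_deg

def face_to_face(rssi_dbm, heading_a_deg, heading_b_deg):
    """Combine the RSSI evidence with the orientation evidence."""
    return rssi_dbm >= RSSI_STRONG_DBM and facing_each_other(heading_a_deg,
                                                             heading_b_deg)
```

In practice this check would sit alongside the IrDA, camera, and microphone evidence described above rather than replace it.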
- the badge 205 can also be powered by a Lithium Polymer Rechargeable Battery 365.
- the badge 205 can be powered from a momentary-on tactile push button, whereby a power management system supplies the microcontroller 301, the sensors, and the memory units.
- the badge 205 can also be equipped with a battery charge monitor 352 to provide accurate information about the remaining capacity of the battery 365.
- the real time clock 355 can be powered by a low-dropout regulator so as not to draw power from the battery 365.
- the badge 205 can also be equipped with an inductive charging receiver 370, such as a Qi compliant inductive charging receiver.
- a voltage regulator 367 may be coupled to the battery 365.
- the regulator 367 includes an inductive battery charger with USB adapter.
- the badge 205 can be equipped with a USB port 307, 312 for direct connection to a base station 210, a computer 215, or a server 220.
- Bluetooth Dual Mode Module 310 in conjunction with the inductive charging receiver 370 allows for a completely wireless solution.
- the Dual Mode Bluetooth Module 310 allows for a low power proximity solution while also ensuring that the device has the capability to connect to computers with existing classic Bluetooth technology at faster data rates.
- the badge 205 can also be equipped with a near field communication (NFC) reader/writer/emulator 337 and/or a radio-frequency identification (RFID) reader/writer 350.
- the NFC module 337 and the RFID module 350 each (separately) allows the badge 205 to identify the person wearing the badge 205 by pairing with an existing identification card that utilizes either NFC or RFID for identification.
- the NFC module 337 and the RFID module 350 allow the badge 205 to be implemented into an existing personnel identification system in an organization, such as by an identification card, or can otherwise supplement existing identification cards.
- the badge 205 when activated, may pair with an identification card of a person.
- the user can pair the badge 205 with an identification card when the badge 205 is turned on and an LED indicator 317 blinks (illuminates), indicating that the device is searching for an NFC module 337 and/or RFID module 350 to pair the badge 205 to a user's identification card.
- the badge 205 can store the identity of the person in the badge 205's internal memory. If the badge 205 cannot detect an NFC module 337 or RFID module 350, the badge can instead assign a hexadecimal identification code to that particular badge, which can be stored for analysis later.
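A minimal sketch of this pairing fallback; the 8-hex-digit code length and the generation method are assumed for illustration, as the patent does not specify the code format:

```python
import secrets

def assign_badge_id(card_id=None):
    """Use the NFC/RFID card identity when one was detected; otherwise fall
    back to a randomly generated hexadecimal code, as described above.
    The 8-hex-digit length is an assumed format."""
    if card_id:
        return card_id
    return secrets.token_hex(4)  # e.g. 8 lowercase hex digits
```

Either way, every badge ends up with a unique identification code that the later analysis stages can group sensor data by.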
- the NFC module 337 and the RFID module 350 can read other badges or identification cards within proximity to the badge 205. This allows the badge 205 to identify and store indications of the people with whom the person wearing the badge 205 is interacting.
- the sensor data collected from the badges 205 can be processed and analyzed to calculate a plurality of metrics.
- the analysis can use any combination of the sensors and respective sensor data to calculate the plurality of metrics, including using sensed data from one or a combination of the IrDA transceiver 305, the Bluetooth Dual Mode Module 310, the compass 332, the plurality of microphones 340, the accelerometer 335, and the camera 357.
- the proximity of the person wearing a badge to another person is measured by sensed data from a combination of at least two of the following: the IrDA transceiver 305, the Bluetooth Dual Mode Module 310, the compass 332, and the camera 357.
- the proximity of the person wearing a badge to another person is measured by a combination of the IrDA transceiver 305 sensed data and the Bluetooth Dual Mode Module 310 sensed data.
- the conversational turn-taking patterns of the person wearing a badge are measured by a combination of at least two of the following: the IrDA transceiver 305, the Bluetooth Dual Mode Module 310, the compass 332, the camera 357, and the plurality of microphones 340.
- the conversational turn-taking patterns of the person wearing a badge are measured by a combination of sensed data from the Bluetooth Dual Mode Module 310 and the plurality of microphones 340.
- the sensor data is captured by a combination of the Bluetooth Dual Mode Module 310 and any one or more of the remaining plurality of sensors.
- Each set of sensor data downloaded from a badge 205 can be grouped such that analysis can be specific to a given project.
- the structure of each dataset can identify a list of participants, teams, and sessions to group different parameters together for analysis.
- the sensor data, in isolation or in aggregation, can be exported in any format desired by the user, such as CSV, or to an Excel Spreadsheet.
- ScalarDatasource which produces a single value for each participant per session
- MultiScalarDatasource which produces multiple values for each participant per session
- TimeSeriesDatasource where there is one value per time unit per participant
- MultiTimeSeriesDatasource where there are multiple values per time unit per participant
- NetworkDatasource which is a single network matrix per dataset per session.
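The five datasource kinds listed above might be represented as follows; the field names and types are assumptions about one plausible in-memory representation, not the system's actual classes:

```python
from dataclasses import dataclass, field

# Minimal shapes matching the five datasource kinds described above.
# The container types are illustrative assumptions.

@dataclass
class ScalarDatasource:            # one value per participant per session
    values: dict = field(default_factory=dict)   # participant -> float

@dataclass
class MultiScalarDatasource:       # several values per participant per session
    values: dict = field(default_factory=dict)   # participant -> list[float]

@dataclass
class TimeSeriesDatasource:        # one value per time unit per participant
    series: dict = field(default_factory=dict)   # participant -> list[float]

@dataclass
class MultiTimeSeriesDatasource:   # several values per time unit per participant
    series: dict = field(default_factory=dict)   # participant -> list[list[float]]

@dataclass
class NetworkDatasource:           # one network matrix per dataset per session
    matrix: list = field(default_factory=list)   # N x N interaction counts
```

Keeping the shapes distinct makes export straightforward: scalar kinds flatten to one CSV row per participant, while the network kind exports as an adjacency matrix.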
- the system 200 can also keep track of the various parameters of each badge 205, including Bluetooth MAC address, the last time a badge 205 was connected to the system 200, and the various configuration parameters of the badge 205.
- the software has the ability to update the firmware configurations of the badge 205.
- the sampling frequencies can be adjusted and extra features can be enabled. Changing the sampling frequencies of the various sensors allows the badge 205 to be configured to save power or to collect more data.
- Such additional features include 128-bit AES encryption, enabling raw audio, enabling real-time analysis of raw Mel bands, and enabling the device 205 to detect generic Bluetooth devices in addition to the other badges 205.
- Examples of the plurality of metrics that may be analyzed include a plurality of social network metrics, a plurality of conversational turn-taking metrics, and a plurality of social signaling metrics.
- the plurality of social network metrics include metrics such as betweenness centrality, degree centrality, cohesion, integration, and exploration.
- Betweenness centrality of a node is equal to the number of shortest paths between all pairs of other nodes that pass through that node, where the social network of people is represented by a graph of nodes (vertices) connected by edges. Each node in the graph represents a person in relation to other people in the social network.
- Degree centrality is defined as the number of ties that a node has.
- Network cohesion is defined as the minimum number of nodes in a social network that need to be removed in order to disconnect a group.
- Integration is the number of triads that a node is part of divided by the maximum number of possible triads. Exploration can be calculated as the percentage of time that someone from a group has spent with people outside of that group divided by the total interaction time.
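Two of the simpler definitions above, degree centrality and exploration, can be computed directly; this sketch assumes an adjacency mapping and pre-accumulated interaction times, which are illustrative representations rather than the system's actual data model:

```python
# Illustrative computation of two of the social network metrics defined above.

def degree_centrality(adjacency, node):
    """Number of ties a node has: count the neighbors it is connected to.
    adjacency maps node -> {neighbor: tie present (truthy) or absent}."""
    return sum(1 for tied in adjacency[node].values() if tied)

def exploration(outside_time, total_interaction_time):
    """Fraction of interaction time someone spent with people outside
    his or her own group, per the definition above."""
    if not total_interaction_time:
        return 0.0
    return outside_time / total_interaction_time
```

Betweenness centrality and cohesion require shortest-path and min-cut computations over the whole graph, so in practice a graph library would be used for those.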
- the plurality of conversational turn-taking metrics can include numerous metrics to analyze the quantity and quality of conversation amongst groups.
- the metrics can measure turn-taking statistics, which include the number of turns a person takes while speaking, the number of pauses, the number of successful and unsuccessful interruptions, and the average turn and pause lengths.
- the metrics measure a turn-taking matrix, identifying who talks after whom and how much each person talks within a group.
- the conversational turn-taking metrics can measure the time a person is silent, the time the person is listening, and the frequency of overlapping speech (i.e., when two or more people are speaking at the same time). The metrics can also measure the dominance of a speaker, participation and turn-taking percentages and balance, and the interactivity of the person amongst others.
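A minimal sketch of how some of these turn-taking statistics might be computed from per-person speaking activity, assuming speech has already been segmented into fixed time slices; the slice representation and rules are illustrative assumptions:

```python
# Illustrative turn-taking statistics from per-person speaking activity.
# speaking maps person -> list of 0/1 flags, one per fixed time slice.

def turn_taking_stats(speaking):
    """Return (turns per person, overlapping-speech slices, silent slices).
    A 'turn' is counted each time a person starts speaking after a slice
    in which he or she was not speaking."""
    n_slices = len(next(iter(speaking.values())))
    turns = {}
    for person, track in speaking.items():
        turns[person] = sum(
            1 for i, s in enumerate(track) if s and (i == 0 or not track[i - 1])
        )
    speakers_per_slice = [sum(track[i] for track in speaking.values())
                          for i in range(n_slices)]
    overlap = sum(1 for count in speakers_per_slice if count > 1)
    silence = sum(1 for count in speakers_per_slice if count == 0)
    return turns, overlap, silence
```

From these primitives, ratios such as participation percentage (a person's speaking slices over all slices) and dominance follow directly.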
- The plurality of social signaling metrics provides a measure of social signals given and received by a person or group of people.
- Example metrics include Activity, Consistency, Influence, and Mirroring.
- Activity quantifies the overall magnitude of low-level social signals over time.
- Body movement is generally measured by the accelerometer 335, and includes measurements of the average body movement energy, the acceleration vector magnitude, and posture.
- The posture of a person consists of two angles, a front/back and a left/right, which can be computed from a raw acceleration vector.
- Speech can be measured by the plurality of microphones 340 and the audio codec 342.
- This metric includes a measurement of the amplitude of speech and the frequency of speech, both measured by fast Fourier transform (FFT). Additionally, the raw cepstrum, mel-frequency cepstrum, and the average number of turns can be calculated from the data received from the plurality of microphones 340 and the audio codec 342.
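As an illustration of the FFT-based measurement, speech amplitude and a dominant frequency can be estimated from a frame of samples with numpy. The frame length and sample rate are assumptions, and this is a stand-in for the badge's codec-side analysis, not its actual pipeline.

```python
import numpy as np

def amplitude_and_frequency(frame, sample_rate):
    """Return (RMS amplitude, dominant frequency in Hz) of an audio frame."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    amplitude = float(np.sqrt(np.mean(frame ** 2)))     # RMS level
    peak = float(freqs[np.argmax(spectrum[1:]) + 1])    # skip the DC bin
    return amplitude, peak

# A synthetic one-second 200 Hz tone should peak at 200 Hz.
sr = 8000
t = np.arange(sr) / sr
amp, peak = amplitude_and_frequency(np.sin(2 * np.pi * 200 * t), sr)
```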
- Consistency quantifies how much change or variation there is in low-level social signals over time. This includes consistency in body movement, turn taking, volume, and fundamental voice frequency.
- The consistency metric can normalize the variation of each metric to account for outliers.
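One hedged reading of consistency is the inverse of the coefficient of variation, so that steadier signals score closer to 1. The exact normalization used by the system is not specified in this description; the scoring below is an assumption for illustration.

```python
import statistics

def consistency(samples):
    """Score 1 / (1 + coefficient of variation): 1.0 means perfectly
    steady, lower values mean more variation over time."""
    mean = statistics.fmean(samples)
    if mean == 0:
        return 0.0
    cv = statistics.pstdev(samples) / abs(mean)
    return 1.0 / (1.0 + cv)

steady = consistency([1.0, 1.0, 1.1, 0.9, 1.0])    # low variation
erratic = consistency([0.1, 2.0, 0.2, 1.8, 0.4])   # high variation
```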
- Influence quantifies how much social signals from one person cause changes in social signals across other people during an interaction.
- This metric compares sensor data amongst a plurality of persons. Data measured under this metric includes measuring a person's influence over another's body movement, influence over another's amplitude, influence over another's frequency, and influence over another's turn taking.
- Mirroring quantifies how similar social signals are across people. For example, this metric can measure mirroring of another's body movements, volume, fundamental voice frequency, and turn taking.
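One simple way to sketch mirroring is as the Pearson correlation between two people's signal traces (e.g., per-minute body-movement energy); the production metric may differ, and the traces below are made up.

```python
import numpy as np

def mirroring(signal_a, signal_b):
    """Pearson correlation between two equal-length signal traces:
    close to 1 when one person's signal tracks the other's."""
    return float(np.corrcoef(signal_a, signal_b)[0, 1])

# Two people whose movement rises and falls together mirror strongly.
high = mirroring([1, 3, 2, 5, 4], [2, 4, 3, 6, 5])
low = mirroring([1, 3, 2, 5, 4], [5, 1, 4, 0, 2])
```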
- Additional metrics that can be measured are integration, interactivity, exploration, drive, engagement, a comparison of that person to their team members, productivity, job satisfaction, and employee engagement.
- The metrics measured in the "Analysis" section, described above, can be further processed and visualized to be viewed on a display monitor of a computer 225.
- The Visualization can include information about the individual badge-wearing person, such as a comparison of that person's metrics to the average of other people, the average of a team, or the average of an organization.
- The “team” can refer to a group of people who self-identify as a team, or who are identified as a team by the organization itself or by management.
- The Visualization can also compare the metrics of various teams in an organization. Specifically, the Visualization can illustrate the relationship between any one of the plurality of metrics and a set of performance metrics.
- The performance metrics can refer to metrics such as productivity, job satisfaction, employee engagement, and drive.
- The Visualization can also illustrate how a change in human behavior can affect change in the plurality of metrics.
- The Visualization can also calculate the plurality of metrics for the plurality of persons and organize them according to location or office layout. This allows the user to determine where different types of interactions are occurring within an organization.
- The Visualization can include information about the organization as a whole. Examples of visualizations that may be generated based on the data about the organization include an overview of the organizational dynamics; how individuals within the organization interact with each other; and graphical depictions of how, where, and when users communicate face-to-face.
- The Visualization can also include data regarding email communication between users, instant messaging communications, phone communications, and face-to-face interactions.
- FIG. 4A is an example graphical user interface of a visualization generated by embodiments of the present invention.
- A visualization area 405 can display information about the measured metrics.
- The information can include graphs, tables, or other charts which allow the user to easily read and understand the data.
- The glossary area 410 can provide explanations for the various items displayed in the visualization area 405.
- The summary area 415 can summarize the data.
- The calendar (Date Selection) area 420 allows the user to select different dates or ranges of dates on a calendar and view data for that corresponding date or time period.
- The data is time-stamped through the use of a real time clock 355 in the badge 205. This allows the analysis system to identify when certain actions were performed. Additionally, the Visualization can demonstrate trends in behavior data.
- The user can select a range of dates in the calendar area 420, and the visualization area 405 can provide graphs, tables, or charts of data for the range of user-selected dates.
- The user can specify which metrics he or she wishes to view.
- Example metrics in FIG. 4A include integration 425, interactivity 430, exploration 435, drive 440, engagement 445, and team comparison 450. More or fewer metrics may be included in the user interface, and the metrics recited above are merely examples.
- FIG. 4B is an example graphical user interface screen view of the team integration 425 metric.
- Team integration 425 can be, for example, measured as the number of triads (groups of three people) that a person is a part of compared to the number of possible triads.
- The Visualization area 405 can create a visualization of how each member of a team interacts with other members of the team. Additionally, in the example in FIG. 4B, the visualization area 405 can illustrate the integration values for four teams, and, in the summary area 415, highlight which team had the highest integration score and which team had the lowest.
- The user may also select a date range in the calendar area 420.
- Each team's integration score is charted in a line graph 455 across four weeks. Next to the line graph 455, the data of the integration values over four weeks is tabulated in table 460.
- FIG. 4C is an example graphical user interface screen view generated to illustrate a team's interactivity 430.
- Team interactivity 430 is measured from the turn-taking matrix and/or participation balance over any length of time (e.g., minute-by-minute, hour-by-hour, or for an entire session).
- The visualization area 405 displays a graph with different-sized circles. As explained in the glossary area 410, the size of each circle is proportional to the respective percentage of total speaking time.
- FIG. 4D is an example graphical user interface screen view generated to illustrate an organization's exploration 435.
- An organization's exploration 435 is the amount of face- to-face interaction within each team within an organization, as well as how much interaction a team has with other teams.
- FIG. 4E is an example graphical user interface screen view generated to illustrate an individual's energy.
- The visualization area 405 illustrates a graph of a person's drive 440 score.
- Drive 440 is explained in the glossary area 410 as a measure of how energetic a person is during the day.
- Energy is proportional to the amount of body movement throughout the day, and it can be computed using the accelerometer 335 sensor data as the magnitude of the 3-axis acceleration vector (sample-by-sample, second-by-second, minute-by-minute, or at any other time resolution).
- Body posture and posture changes can also be estimated using two angles (front/back and left/right) computed from the raw acceleration vector (arccosine of the dot product).
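The energy and posture computations above can be sketched with numpy: energy as the per-sample magnitude of the 3-axis acceleration vector, and the two posture angles as the arccosine of the dot product with reference axes. The axis conventions chosen here are assumptions, not the badge's documented orientation.

```python
import numpy as np

def energy_and_posture(accel):
    """Return (per-sample magnitude, front/back angle, left/right angle)
    in degrees for an (N, 3) array of non-zero acceleration samples.
    The x axis is assumed front/back and the y axis left/right."""
    accel = np.asarray(accel, dtype=float)
    magnitude = np.linalg.norm(accel, axis=1)           # movement energy proxy
    unit = accel / magnitude[:, None]
    front_back = np.degrees(np.arccos(unit @ np.array([1.0, 0.0, 0.0])))
    left_right = np.degrees(np.arccos(unit @ np.array([0.0, 1.0, 0.0])))
    return magnitude, front_back, left_right

# Upright (gravity along z) vs. leaning fully forward (gravity along x):
mag, fb, lr = energy_and_posture([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])
```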
- FIG. 4F is an example graphical user interface screen view generated to demonstrate or otherwise illustrate an individual's speech profile.
- The visualization area 405 illustrates a pie graph of an individual's speech profile.
- The speech profile can illustrate the time an individual spends silent, the time an individual spends listening, and the time an individual spends speaking, or respective percentages thereof.
- This data can be detected by any speech detection algorithm (e.g., energy based, clustering based, etc.).
- “Listening” is tagged if someone in the individual's interaction group other than the individual is speaking at the same time the individual is present.
- “Overlap” is tagged if someone from the interaction group speaks when the individual is speaking. Individual participation is the speaking time divided by the total speaking time within a current interaction group.
- Dominance is based on the number of successful and unsuccessful interruptions, the number of turns, and the individual's participation score.
- Additional turn-taking metrics that can be measured and displayed include number of turns, number of pauses, number of successful interruptions, number of unsuccessful interruptions, average turn length, average pause length, turn-taking matrix, speaking time, silence time, listening time, overlapping speech time, dominance, interactivity, participation percentage, participation balance, and turn-taking balance.
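The speech-profile percentages above can be sketched from a per-second label stream. The label alphabet (`'S'` for the subject speaking, `'L'` for someone else in the interaction group speaking while the subject is present, `'-'` for silence) is a hypothetical encoding of the speech detector's output, not the system's actual format.

```python
def speech_profile(labels):
    """Percentages of speaking, listening, and silence for one person,
    given a string with one label per second of an interaction."""
    total = len(labels)
    pct = lambda tag: 100.0 * labels.count(tag) / total
    return {"speaking": pct("S"), "listening": pct("L"), "silent": pct("-")}

# Ten seconds: 4 s speaking, 3 s listening, 3 s silent.
profile = speech_profile("SSL-LSS-L-")
```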
- FIG. 5 illustrates a computer network or similar digital processing environment in which the present invention may be implemented.
- Client computer(s)/devices 50 and server computer(s) 60 provide processing, storage, and input/output devices executing application programs and the like. Client computer(s)/devices 50 can also be linked through communications network 70 to other computing devices, including other client devices/processes 50 and server computer(s) 60. Communications network 70 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, local area or wide area networks, and gateways that currently use respective protocols (TCP/IP, Bluetooth, etc.) to communicate with one another.
- Client devices/computers 50 implement the badges (sensors) 205 and computers 225 described above, and servers 60 implement the base stations 210, central computers 215, and servers 220 described above. Other electronic device/computer network architectures are suitable.
- FIG. 6 is a diagram of the internal structure of a computer (e.g., client processor/device 50 or server computers 60) in the computer system of FIG. 5.
- Each computer 50, 60 contains system bus 79, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system.
- Bus 79 is essentially a shared conduit that connects the different elements of a computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) and enables the transfer of information between the elements.
- Attached to system bus 79 is I/O device interface 82 for connecting various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer 50, 60.
- Network interface 86 allows the computer to connect to various other devices attached to a network (e.g., network 70 of FIG. 5).
- Memory 90 provides volatile storage for computer software instructions 92 and data 94 used to implement an embodiment of the present invention (e.g., sensor data analysis, visualization of calculated metrics, user interface of same and supporting code 100 detailed above).
- Disk storage 95 provides non-volatile storage for computer software instructions 92 and data 94 used to implement an embodiment of the present invention.
- Central processor unit 84 is also attached to system bus 79 and provides for the execution of computer instructions.
- The processor routines 92 and data 94 are a computer program product (generally referenced 92), including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROMs, CD-ROMs, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the invention system.
- Computer program product 92 can be installed by any suitable software installation procedure, as is well known in the art.
- At least a portion of the software instructions may also be downloaded over a cable, communication, and/or wireless connection.
- The invention programs are a computer program propagated signal product 107 embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other network(s)).
- Such carrier medium or signals provide at least a portion of the software instructions for the present invention routines/program 92.
- The propagated signal is an analog carrier wave or digital signal carried on the propagated medium.
- The propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or other network.
- The propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer.
- The computer readable medium of computer program product 92 is a propagation medium that the computer system 50 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for the computer program propagated signal product.
- The term "carrier medium" or "transient carrier" encompasses the foregoing transient signals, propagated signals, propagated medium, storage medium, and the like.
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Primary Health Care (AREA)
- Physics & Mathematics (AREA)
- Economics (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Computing Systems (AREA)
- Strategic Management (AREA)
- Tourism & Hospitality (AREA)
- Health & Medical Sciences (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Arrangements For Transmission Of Measured Signals (AREA)
- Telephonic Communication Services (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2913538A CA2913538C (en) | 2013-06-07 | 2014-05-15 | Social sensing and behavioral analysis system |
EP14731467.8A EP3005026A4 (en) | 2013-06-07 | 2014-05-15 | Social sensing and behavioral analysis system |
AU2014275396A AU2014275396B2 (en) | 2013-06-07 | 2014-05-15 | Social sensing and behavioral analysis system |
SG11201509487WA SG11201509487WA (en) | 2013-06-07 | 2014-05-15 | Social sensing and behavioral analysis system |
JP2016518327A JP6444390B2 (en) | 2013-06-07 | 2014-05-15 | Social sensing and behavior analysis system |
HK16110480.8A HK1222464A1 (en) | 2013-06-07 | 2016-09-02 | Social sensing and behavioral analysis system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361832580P | 2013-06-07 | 2013-06-07 | |
US61/832,580 | 2013-06-07 |
Publications (4)
Publication Number | Publication Date |
---|---|
WO2014197176A2 true WO2014197176A2 (en) | 2014-12-11 |
WO2014197176A8 WO2014197176A8 (en) | 2015-03-05 |
WO2014197176A3 WO2014197176A3 (en) | 2015-04-16 |
WO2014197176A9 WO2014197176A9 (en) | 2015-05-21 |
Family
ID=50977100
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2014/038035 WO2014197176A2 (en) | 2013-06-07 | 2014-05-15 | Social sensing and behavioral analysis system |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP3005026A4 (en) |
JP (1) | JP6444390B2 (en) |
AU (1) | AU2014275396B2 (en) |
CA (1) | CA2913538C (en) |
HK (1) | HK1222464A1 (en) |
SG (1) | SG11201509487WA (en) |
WO (1) | WO2014197176A2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9443521B1 (en) | 2013-02-14 | 2016-09-13 | Sociometric Solutions, Inc. | Methods for automatically analyzing conversational turn-taking patterns |
US10049336B2 (en) | 2013-02-14 | 2018-08-14 | Sociometric Solutions, Inc. | Social sensing and behavioral analysis system |
US20180292887A1 (en) * | 2017-04-07 | 2018-10-11 | International Business Machines Corporation | Avatar-based augmented reality engagement |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019148925A (en) * | 2018-02-26 | 2019-09-05 | 国立大学法人山口大学 | Behavior analysis system |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3846844B2 (en) * | 2000-03-14 | 2006-11-15 | 株式会社東芝 | Body-mounted life support device |
US7367497B1 (en) * | 2003-12-09 | 2008-05-06 | Jason Lester Hill | Electronic access control, tracking and paging system |
US8583139B2 (en) * | 2004-12-31 | 2013-11-12 | Nokia Corporation | Context diary application for a mobile terminal |
US7880610B2 (en) * | 2005-12-15 | 2011-02-01 | Binforma Group Limited Liability Company | System and method that provide emergency instructions |
US8271082B2 (en) * | 2007-06-07 | 2012-09-18 | Zoll Medical Corporation | Medical device configured to test for user responsiveness |
US9123022B2 (en) * | 2008-05-28 | 2015-09-01 | Aptima, Inc. | Systems and methods for analyzing entity profiles |
WO2010099488A1 (en) * | 2009-02-27 | 2010-09-02 | University Of Iowa Research Foundation | Contact tracking using wireless badges |
JP5372588B2 (en) * | 2009-04-24 | 2013-12-18 | 株式会社日立製作所 | Organization evaluation apparatus and organization evaluation system |
US8500604B2 (en) * | 2009-10-17 | 2013-08-06 | Robert Bosch Gmbh | Wearable system for monitoring strength training |
US8715178B2 (en) * | 2010-02-18 | 2014-05-06 | Bank Of America Corporation | Wearable badge with sensor |
JP2011197768A (en) * | 2010-03-17 | 2011-10-06 | Ricoh Co Ltd | Sensor network system, organizational behavior information collection method, organizational behavior analysis server and portable sensor node |
JP2013003942A (en) * | 2011-06-20 | 2013-01-07 | Konica Minolta Holdings Inc | Relationship evaluation device, relationship evaluation system, relationship evaluation program, and relationship evaluation method |
JP2013008149A (en) * | 2011-06-23 | 2013-01-10 | Hitachi Ltd | Business-related facing data generation device and system |
KR20140064969A (en) * | 2011-09-23 | 2014-05-28 | 디지맥 코포레이션 | Context-based smartphone sensor logic |
JP5751143B2 (en) * | 2011-11-15 | 2015-07-22 | コニカミノルタ株式会社 | Minutes creation support device, minutes creation support system, and minutes creation program |
-
2014
- 2014-05-15 SG SG11201509487WA patent/SG11201509487WA/en unknown
- 2014-05-15 EP EP14731467.8A patent/EP3005026A4/en not_active Ceased
- 2014-05-15 AU AU2014275396A patent/AU2014275396B2/en active Active
- 2014-05-15 JP JP2016518327A patent/JP6444390B2/en active Active
- 2014-05-15 CA CA2913538A patent/CA2913538C/en active Active
- 2014-05-15 WO PCT/US2014/038035 patent/WO2014197176A2/en active Application Filing
-
2016
- 2016-09-02 HK HK16110480.8A patent/HK1222464A1/en unknown
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9443521B1 (en) | 2013-02-14 | 2016-09-13 | Sociometric Solutions, Inc. | Methods for automatically analyzing conversational turn-taking patterns |
US10049336B2 (en) | 2013-02-14 | 2018-08-14 | Sociometric Solutions, Inc. | Social sensing and behavioral analysis system |
US20180292887A1 (en) * | 2017-04-07 | 2018-10-11 | International Business Machines Corporation | Avatar-based augmented reality engagement |
US10585470B2 (en) * | 2017-04-07 | 2020-03-10 | International Business Machines Corporation | Avatar-based augmented reality engagement |
US11150724B2 (en) | 2017-04-07 | 2021-10-19 | International Business Machines Corporation | Avatar-based augmented reality engagement |
Also Published As
Publication number | Publication date |
---|---|
EP3005026A2 (en) | 2016-04-13 |
AU2014275396B2 (en) | 2016-12-15 |
SG11201509487WA (en) | 2015-12-30 |
EP3005026A4 (en) | 2017-01-11 |
WO2014197176A8 (en) | 2015-03-05 |
WO2014197176A3 (en) | 2015-04-16 |
CA2913538A1 (en) | 2014-12-11 |
HK1222464A1 (en) | 2017-06-30 |
CA2913538C (en) | 2022-01-04 |
WO2014197176A9 (en) | 2015-05-21 |
JP6444390B2 (en) | 2018-12-26 |
AU2014275396A1 (en) | 2015-12-03 |
JP2016524230A (en) | 2016-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10049336B2 (en) | Social sensing and behavioral analysis system | |
CN105981035B (en) | Manage the display of privacy information | |
US9443521B1 (en) | Methods for automatically analyzing conversational turn-taking patterns | |
JP5092020B2 (en) | Information processing system and information processing apparatus | |
US20180107793A1 (en) | Health activity monitoring and work scheduling | |
EP2432390B1 (en) | Activity monitoring device and method | |
CN104573360B (en) | A kind of evaluating system and evaluating method based on intelligent wearable device | |
AU2014275396B2 (en) | Social sensing and behavioral analysis system | |
CN107004055A (en) | System and method for providing the annexation between wearable device | |
CN104703662A (en) | Personal wellness device | |
JP6742380B2 (en) | Electronic device | |
EP2810426A2 (en) | A system and method for identifying and analyzing personal context of a user | |
US20180276281A1 (en) | Information processing system, information processing method, and storage medium | |
Bhandari et al. | Non-invasive sensor based automated smoking activity detection | |
CN109104209A (en) | A kind of intelligent wearable device | |
Arunkumar et al. | [Retracted] A Versatile and Ubiquitous IoT‐Based Smart Metabolic and Immune Monitoring System | |
Olguın et al. | Sociometric badges: Wearable technology for measuring human behavior | |
Waber et al. | Understanding organizational behavior with wearable sensing technology | |
Yi et al. | System architecture of intelligent personal communication node for body sensor network | |
Xu et al. | Handheld computers: Smartphone-centric wireless applications | |
Wasnik et al. | Monitoring stress level parameters of frequent computer users | |
Bâce et al. | HandshakAR: wearable augmented reality system for effortless information sharing | |
JP5907549B2 (en) | Face-to-face detection method | |
US11030269B2 (en) | Analytic data collection for application navigation | |
JP6047696B1 (en) | Image generation system, image generation method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2913538 Country of ref document: CA |
|
ENP | Entry into the national phase |
Ref document number: 2014275396 Country of ref document: AU Date of ref document: 20140515 Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2016518327 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014731467 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14731467 Country of ref document: EP Kind code of ref document: A2 |