Meetings play a big role in many people’s jobs. In the U.S. alone, an estimated 11 million meetings take place in a typical day. Managers can spend up to three-quarters of their time in meetings, and approximately 97% of workers say that collaboration is essential to do their best work.
As a result, meetings are tremendously important for businesses. Yet understanding meetings — much less finding ways to increase their productivity — is challenging for researchers because it requires an understanding of many social signals and complex interpersonal dynamics. Most of the work done in this area has been from the social sciences perspective using field work and surveys.
MIT PhD student Been Kim and I recently used the data-driven approach of machine learning to study this area. Analyzing a very large amount of meeting data, we tried to address several specific questions:
Is it possible to predict if a proposal will be accepted in a meeting based on the language used?
There are many published lists of supposedly persuasive words, but they weren't created from a data-driven approach. We wanted to test whether truly persuasive words exist, using statistics computed from the data.
We found that there really are words that seem to be persuasive according to hypothesis tests. The top words most likely to result in accepted proposals were: yeah, give, start, meeting and discuss.
The word “yeah” seems surprising at first as a persuasive word, but when we looked at the way people were using it, we found they were using it to show agreement with something that someone else previously said. Perhaps if you frame a suggestion as if it were in agreement with others, it’s more likely to be accepted.
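To make the idea of a data-driven persuasiveness test concrete, here is a minimal sketch of the kind of statistic involved: a chi-square test on a 2x2 table of word use versus proposal acceptance. The counts below are illustrative only, not figures from the study, and the study's actual methodology may differ.

```python
# Hedged sketch: testing whether a word is associated with accepted
# proposals using a chi-square test on a 2x2 contingency table.
# All counts are made up for illustration.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 table:
        a = proposals containing the word, accepted
        b = proposals containing the word, rejected
        c = proposals without the word, accepted
        d = proposals without the word, rejected
    """
    n = a + b + c + d
    # Expected counts under independence of word use and acceptance.
    expected = [
        (a + b) * (a + c) / n,  # word & accepted
        (a + b) * (b + d) / n,  # word & rejected
        (c + d) * (a + c) / n,  # no word & accepted
        (c + d) * (b + d) / n,  # no word & rejected
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Illustrative counts for a candidate word like "yeah":
stat = chi_square_2x2(a=40, b=10, c=60, d=90)
# Compare against 3.84, the chi-square critical value at 1 df, p = 0.05.
print(f"chi-square = {stat:.2f}, significant: {stat > 3.84}")
```

A word passes the test only when its co-occurrence with acceptance departs from what chance would predict, which is exactly what separates this approach from hand-curated lists of "persuasive" words.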
Since everyone wants their ideas accepted, it’s worth considering word choice in proposals. After all, you don’t want to undermine your proposal by not using the right language.
Can we automatically detect when key decisions will be made during a meeting?
Looking at the data, we found predictors of when a key decision is being made.
These predictors were pretty reliable; our algorithms obtained 92% accuracy in detecting the time frame when decisions would be made. The algorithms figured out that often when key decisions are being made, there are not too many proposals still on the table, and people are mainly receiving or asking for information.
This could be very useful information if you’re listening to a previously recorded meeting and want to fast forward or rewind to that key part. Or, it might increase managers’ efficiency if they could be automatically alerted to join a meeting when a decision is about to be made, allowing them to spend the remainder of the time on other tasks.
How long will meetings last once a key decision is made?
The duration of a meeting seems to depend on how long the group takes to make its key decision. We found an interesting pattern: meetings in which the key decision is made very quickly tend to wrap up very quickly, and meetings that run a long time before the key decision is made also tend to wrap up quickly. But somehow, for meetings in the middle (in our particular dataset, centered on 14 minutes), the wrap-up time can be quite long. In fact, meetings that took around 14 minutes to reach their main decisions then took up to 18 additional minutes to conclude.
This might be useful information for time management purposes. Based on how long it takes to reach the key decision in your current meeting, you can better estimate when the meeting will end and adjust your schedule accordingly.
Are there macro-patterns of interactions within meetings?
We looked for patterns in how people conversed, particularly interactions between positive/negative social acts and work-related assessment acts. A positive social act might be complimenting a person’s tie whereas a negative social act might be complaining that someone won’t let you talk. A work-related act is more directly related to the work at hand and could be the acceptance or rejection of a proposal.
In our analysis, we found that positive social acts are almost never next to negative assessments. This makes sense because it would sound rather disingenuous to say something like, “But I thought it was completely pointless. Superb job overall by the way.” Similarly, it would sound odd to say it was a pleasure working with someone and then comment that their idea will surely fail. The two statements just don’t go together.
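The adjacency analysis above can be sketched as a count over pairs of neighboring dialogue-act labels in a transcript. The labels and the sample sequence are illustrative; the study used annotated meeting corpora with a richer tag set.

```python
# Hedged sketch: counting adjacent pairs of dialogue-act labels to
# check which act types rarely occur next to each other. The label
# names and toy transcript are assumptions for illustration.

from collections import Counter

def adjacent_pair_counts(acts):
    """Count unordered pairs of adjacent dialogue-act labels."""
    return Counter(tuple(sorted(p)) for p in zip(acts, acts[1:]))

# Toy annotated transcript: one act label per utterance.
transcript = [
    "pos_social",   # "Nice tie!"
    "pos_assess",   # "Good point about the budget."
    "neg_assess",   # "That timeline won't work."
    "neg_social",   # "You keep interrupting me."
    "pos_assess",
    "pos_social",
]

counts = adjacent_pair_counts(transcript)
# The pairing the study found to be rare in real meetings:
rare = counts[("neg_assess", "pos_social")]
print(f"pos_social next to neg_assess: {rare} of {len(transcript) - 1} pairs")
```

Run over a large annotated corpus, pair counts like these are what let you say a combination such as "positive social act next to negative assessment" occurs less often than chance would predict.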
In light of how much time people spend in meetings and how important collaboration is to their work, knowing how to be more persuasive and sensitive as well as how to increase productivity in meetings is very useful. By studying meetings, we can help improve our communication skills and become more efficient.
But this is just the beginning. Our study is one of the first data-driven studies in the scientific field of meeting analysis, and while we've made – and found support for – several hypotheses, there is much more work to be done in this arena.
Cynthia Rudin is an assistant professor of statistics whose work on this topic has been featured by the Association for the Advancement of Artificial Intelligence. She coauthored “Learning about Meetings” with MIT graduate student Been Kim.