At our last meeting, the board voted 4-3 to enter into a three-year contract with ThoughtExchange, an online platform to solicit ideas and opinions from district residents. The contract cost $106,462. I voted against it, for several reasons.
The cost alone was not my primary objection. I can imagine issues on which a well-done survey of community opinion would be worth paying for. My concerns about ThoughtExchange were:
1. Although ThoughtExchange was presented as a way to “take the temperature of the community”—the vendor even referred to it as “polling”—it was not at all vetted for that purpose and is very clearly not up to the task.
There is a difference between tech expertise and statistical expertise; the vendor provided no information about the statistical capability of ThoughtExchange to measure the opinions of the community as a whole. He probably couldn’t, even if he tried. Participation is not random; some users might make one quick visit while others might visit repeatedly and participate for long stretches; it’s fairly easy for people to have multiple accounts; and the total number of participants on any given issue is likely to be a small fraction of the total community population. As a result, the margin of error, if it could even be calculated, is likely to be so enormous that the results would tell us very little about community sentiment. (A back-of-the-envelope illustration follows at the end of this point.)
That problem is compounded by the fact that a significant chunk of our community (estimated at about six percent of households) does not have regular internet access, and that’s probably not a random chunk, but skewed toward low-income residents.
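To put a rough number on the margin-of-error point, here is a back-of-the-envelope illustration; the sample size is hypothetical, not a figure from the vendor. For a genuinely random sample of 400 residents drawn from a large community, the standard margin-of-error formula gives roughly

MOE ≈ 1.96 × √(0.5 × 0.5 / 400) ≈ ±4.9%

In other words, even the best case of 400 respondents chosen at random carries about five percentage points of uncertainty. A self-selected online exercise meets none of the assumptions behind that formula (random selection, one response per person, coverage of the whole community), so no honest error bound can be attached to its results.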
2. I’m concerned that the motivation to use ThoughtExchange is more about putting on a show of community engagement than actually engaging in a meaningful way. (I had these same concerns about ThoughtExchange’s predecessor, MindMixer.) There’s no point in asking the public for input if we’re not willing to adjust our decisions accordingly once we get it, but it often seems like the district wants to do the former and not the latter. In those instances, people just feel worse than if their input had never been solicited at all.
The district’s likely strategy is not to ask any questions that it doesn’t want to hear the answers to, and to word the questions in ways designed to nudge participants toward particular answers. Three years ago, I made fun of the district for using MindMixer to ask, “What are the school district’s biggest strengths?” Then, when the ThoughtExchange vendor made his presentation, one of his examples of a question that could be asked was, “What are some things you appreciate about your school this year?”
Unfortunately, my (1) and (2) correspond to the two things we’re actually paying for (that we wouldn’t get from engagement through, say, the district’s Facebook page): (1) the “data” analysis (which is of little value if the data is not representative of the community as a whole), and (2) the manipulability and control that come from being able to decide what questions to ask and how to ask them. We should not pay for either of those things.
It’s hard not to see ThoughtExchange as primarily a public relations campaign posing as concern for community input. The board should consider whether that might put off as many people as it attracts. As commenter Amy Charles wrote, “No, do not tax me in order to build a case for more taxes. Spend the money on the frigging schools, and do it sensibly.”