Sunday, July 31, 2011

Survey Anxiety!

Flickr CC photo courtesy of Beinecke Library
I have been working this summer on a survey for our staff on their technology usage.  The intent of this project is entirely for planning purposes.  I have put a lot of thought into this and solicited feedback from various experts in our community.  The process has created some challenges for me that I have decided to air in this blog post.  Perhaps you will have insights or suggestions.
This is a very exciting time to be in educational technology.  The rapid development of mobile technologies, touch technology, and cloud services calls for new ed tech support and solutions.  But change is anxiety-producing, so simply by raising the subject, we may surface objections.  Furthermore, since Mercy is an early adopter of 1:1 computing, there are not many road maps for us to follow.  The big companies like Apple, HP, and Microsoft are introducing less expensive computing solutions for schools that are not yet 1:1.  Consequently, the worry that we might accidentally downgrade what we have worked so hard to build is legitimate.

I wish to promote transparency and collaboration for major decisions that affect many stakeholders.  In this case, however, should all the results be shared?  I don't want to manipulate the results, but since I am not experienced in survey authoring, I'm a little concerned about dud questions producing results that muddy the waters.  Nevertheless, I'm strongly inclined to share the results within the community.  So here's a greater concern:  some responses may be easy to identify individually.  Teachers of a particular course, one-person "departments," and the like may be easy to spot even if names are withheld.  Will this affect candor?

Ulterior Motives
For me the biggest issue is dispelling the concern that the survey is actually a way to "check up" on people.  We absolutely need to know who is filling out the surveys so that we get a clear idea of how critical certain software and functions may be.  If we decide on changes, we want to be sensitive to current users, so as not to simply jerk the rug out from under them.  I'm hoping that transparency will help to alleviate this, but I am a little concerned that folks will overstate usage "just in case."

Leading Questions
Obviously, not everything is up for grabs as we develop our "tech plan."  Consequently, we are focusing more attention on specific areas and have discussed some options that we want to "air out."  I've discovered that it is challenging to develop a set of questions that are not "leading," particularly since I, of course, have my own opinions.  Nevertheless, I don't want to foreclose feedback.  This is a tough one.

From a practical point of view, I have found it challenging to find the right blend of fixed-choice questions and open-ended prompts.  Obviously, open-ended questions allow for more nuance and intensity of expression.  They also may have the desired effect of making respondents feel more engaged in the process.  On the other hand, they are unwieldy and may make it harder to discover patterns.

I don't plan on sharing the results of the survey at the Drive-thru, but will surely blog on the process.  It's already been quite a learning experience.
