Michael Herman
Opening Space for Business Agility


Evaluating Open Space Meetings

it's best to avoid evaluation altogether, if at all possible. not because open space doesn't work, but because it works so well that any attempt to squeeze the spirit and momentum of an event or meeting into those little 5-point scales will totally miss the point of the whole endeavor.

perhaps even more importantly, asking participants to evaluate the meeting usually implies that the meeting planners were somehow responsible for the quality of the participants' experiences... which is in direct opposition to the message and structure of the opening and working sessions of the meeting, in which participants were given full responsibility for the quality of their own experience, learning, work and results.


sometimes giving people a sheet of paper with 'evaluation' at the top of it absolutely cannot be avoided. in those cases try this...

evaluation


or this, from ChrisCorrigan...

in lieu of evaluation, pass out a page that requests the following:

pass them out with the proceedings on day three or at the end of the conference, so that participants can fill them out at the same time as they are ranking, voting, and action planning.

you could also ask participants to bring them to the closing circle and pass them around with the talking stick... so the pile of results grows as the stick makes its way around the circle.

in some groups or settings, you may need to emphasize that these pages are for storytelling about the event, about participants' work during it and expected progress after it... NOT an evaluation of the conference. these pages are about specific learnings and actions, not general comments and evaluations.

some other questions that may be interesting...

also:


further note on evaluation of results

ChrisCorrigan reminds me that we have to be sure to use the SAME instrument BEFORE and AFTER the ost meeting. if we are doing the open space because sponsors look around and notice that the organization 'feels bad', then we should evaluate how it 'feels' afterward. if cash on hand is 50 days and the purpose is to find more days of cash, then measure cash on hand. if we do the open space to respond to and address issues raised in a survey, then repeat the survey. we have to use the same gauge before and after the session to have a fair baseline.

if sponsors' "sense" of things is that people are "disconnected", and afterward they are saying "people got connected" but the results didn't show up, we have to ask what results they were measuring... and how. you can't get what you don't ask for, and it's not fair to change the means of measurement or scoring after the game is over.