
Learning for Change Evaluation Model

Learning for Change is the overall framework the Association for Progressive Communications Women’s Networking Support Programme (APC WNSP) uses in its evaluation tool. Grounded in advocacy for social change and in a perspective that applies gender analysis to the evaluation of ICT initiatives and projects, Learning for Change treats evaluation as a process of learning that is dynamic, evolving and interactive. It aims to examine how an ICT intervention, seen from a gender perspective, brings about change in individuals, organisations and communities, as well as in broader social contexts.

The Learning for Change Circle brings together the different aspects of the model, each of which is described in the text below.

Values and Practices of Learning for Change

 

Self and Social Change

The evaluation model pays special attention to self and social change, that is, to understanding the dynamic relationship between an ICT initiative and both self and social change. Learning for Change uses “self” to mean the individuals, organisations and communities involved in an ICT initiative. Evaluation that focuses on self-change examines the dynamic relationship between ICT initiatives and the way individuals, organisations and communities operate. At the same time, Learning for Change looks into the relationship between an ICT initiative and the broader social, political, cultural and economic contexts, seeking to understand how these factors affect the initiative and vice versa.

Gender Analysis

Gender analysis involves a systematic assessment of the different impacts of project activities on women and men. Applied in an ICT context, gender analysis asserts that power relations of class, race, ethnicity, age and geographic location interact with gender, producing complex and hidden inequalities that affect social change. A gender analysis framework also looks into how ICTs, in particular, are used to maintain the status quo or to bring about social change. Thus, a gendered approach to evaluating ICT projects and initiatives will, for example, disaggregate data by sex, analyse the sexual division of labour, and examine gender disparities in access to and control over resources.
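
As an illustration only, and not part of the GEM tool itself, the short sketch below shows one way sex-disaggregated data on access to and control over a resource might be tabulated; the survey records, field names and figures are hypothetical.

```python
# Illustrative sketch only: hypothetical survey records for a telecentre project.
# Each record notes the respondent's sex and whether they report access to,
# and control over, the shared computer (field names are invented for this example).
from collections import Counter

responses = [
    {"sex": "F", "has_access": True,  "controls_resource": False},
    {"sex": "F", "has_access": True,  "controls_resource": True},
    {"sex": "M", "has_access": True,  "controls_resource": True},
    {"sex": "M", "has_access": True,  "controls_resource": True},
    {"sex": "F", "has_access": False, "controls_resource": False},
]

# Disaggregate by sex: count respondents, access and control separately
# for women and men so that disparities become visible.
totals = Counter()
access = Counter()
control = Counter()
for r in responses:
    totals[r["sex"]] += 1
    access[r["sex"]] += r["has_access"]
    control[r["sex"]] += r["controls_resource"]

for sex in sorted(totals):
    print(f"{sex}: {access[sex]}/{totals[sex]} report access, "
          f"{control[sex]}/{totals[sex]} report control")
```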

Learning by Doing

Evaluation is not a complex undertaking performed only by technical experts. Formal qualifications are helpful and valuable, but they are not prerequisites for thorough evaluation work. More important to a meaningful evaluation are keen observation, critical reflection, and sensitivity to the effects of project activities and the contexts in which they operate. Recording observations and holding regular assessments yield insights; these real-life experiences become the lessons that build and validate knowledge about conducting evaluations.

Linked to Action

Change springs from learning by doing and from lessons learned. Evaluation exercises are not ends in themselves; they are linked to action, which underscores the importance of using what was learned. Evaluation results must be popularised, should empower women, and should propel groups and organisations to improve subsequent ICT initiatives and evaluation exercises.

Participatory

An evaluation exercise must be participatory. It needs to engage groups working at the grassroots or in the particular community served by the ICT initiative. The process should involve all stakeholders, and its findings should be shared with everyone involved to ensure accountability.

Critical Reflection

Evaluation is an opportunity to reflect thoroughly on the project or initiative, on its advances and its setbacks. It is important to review continually the information gained from an evaluation process, to examine judiciously what has been gathered, and to transform it into knowledge.

Sensitive to Bias

Evaluation is not a neutral activity. All stakeholders, including the evaluator, bring their own biases into the process, and these inevitably affect the outcome of the evaluation. Evaluators should discuss their biases with the group, especially when these begin to influence their judgement in the evaluation exercise. It is best to remember that a successful evaluation thrives in an open atmosphere of trust and sincerity.

Context Sensitive

Each ICT initiative enters a unique social, cultural, economic and political reality. An effective evaluator is sensitive to this reality and seeks to understand its dynamics and how they operate within a project. Context sensitivity also places a premium on the choice of methodologies to be used in the evaluation. Moreover, it requires the evaluation process to identify and investigate situations or realities that the ICT initiative or project was unable to reach.

Quantitative and Qualitative Aspects

Evaluations must take into account both the quantitative and the qualitative changes that emerge from an ICT initiative. Quantitative changes are those that can be measured numerically, such as the number of women taught to use email in a particular project or the number of times a website was accessed in a given period. Quantitative data should always be supported by findings on qualitative changes, however, because quantitative measurements can tell only half the story.
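
Purely as an illustration, the sketch below computes the two quantitative indicators mentioned above from hypothetical project records; the record layout, names, dates and counts are invented and do not come from any actual GEM project.

```python
# Illustrative sketch only: hypothetical records from an ICT training project
# and a website access log (all data invented for this example).
from datetime import date

# Training records: (participant name, sex, skill taught)
trainings = [
    ("Amina", "F", "email"),
    ("Rosa",  "F", "email"),
    ("Jon",   "M", "email"),
    ("Lila",  "F", "web search"),
]

# Website access log: one date per visit.
visits = [date(2004, 3, 2), date(2004, 3, 15), date(2004, 4, 1), date(2004, 4, 20)]

# Indicator 1: number of women taught to use email in the project.
women_taught_email = sum(1 for _, sex, skill in trainings
                         if sex == "F" and skill == "email")

# Indicator 2: number of times the website was accessed in a specific period.
period_start, period_end = date(2004, 3, 1), date(2004, 3, 31)
visits_in_period = sum(1 for d in visits if period_start <= d <= period_end)

print(f"Women taught to use email: {women_taught_email}")
print(f"Website visits in March 2004: {visits_in_period}")
```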

Qualitative changes are those that cannot be measured by numbers. An important qualitative change from a gender perspective, for example, is a woman’s sense of personal empowerment, greater self-confidence or higher self-esteem resulting from the use of ICTs. Another example would be changes in relationships within an organisation or household brought about by using ICTs. These qualitative changes can be gathered using methodologies such as interviews or storytelling.

 
