Thursday, 23 March 2017

Assessing creativity - where to start? (PART 2 of 2)

In part 1 of this post, we looked at whether students believed creativity was something they experienced in the curriculum, and at ways in which educators could begin to assess creativity, particularly in STEM subjects. Having covered the assessment of creative outputs in part 1, in this post we will look at ways of assessing the creative process.


By assessing the process that has led to an output, there is more room to acknowledge, reflect upon and evaluate learning. There is also room for failure and risk-taking (i.e. failure to come up with an innovative product), as long as something is learned from that failure.
The difficulty when assessing a process rather than an output is how to judge students’ work comparatively. One way of doing this is to let students define creativity for themselves, and then allow them to judge their work against their own standards, for example with a narrated portfolio (Jackson 2008). This would contain self-assessment claims supported by evidence; the teacher would scrutinise the rigour of the self-assessment rather than making their own judgements about creativity.
Other alternatives to narrated portfolios are presentations, concept maps, lab books, posters, interviews, video diaries, blogs, scrapbooks and so on (Jackson 2008).

Following a review of case studies, it is clear that there is a lack of research concentrating specifically on the assessment of creativity in traditionally ‘non-creative’ subjects. Looking at other disciplines and educational areas has, however, been helpful in identifying assessment methods and criteria that could be adapted for HE, and for STEM subjects in particular.

One case study is a lexicon developed for the assessment of National Vocational Qualifications (NVQs) by Fennell (1993). The lexicon recognised that creativity can be applied in, or demanded of, every occupation. Fennell set out four categories of creativity, also known as the RFIO model:

  1. “Materials and process are prescribed with little or no latitude”
  2. “Materials and process are well-established but latitude is permitted, and variation, within agreed limits, may be welcomed”
  3. “Materials and process are discretionary but work is within established conventions”
  4. “Materials and processes are discretionary and work is either without precedent or significantly extends beyond established conventions”

This lexicon can be considered a continuum representing a student’s creative journey. It could easily be applied to Higher Education, and in particular to traditionally ‘non-creative’ disciplines: most courses, when mapped against the lexicon, would find content to fit each of its categories. The lexicon could then be developed into an assessment rubric.

Another relevant case study could be the development of a ‘negotiated’ assessment process and criteria, adopted for all curriculum areas at Liverpool Institute of Performing Arts (Kleiman 2005). Six assessment fields were identified:
  1. Presentation/production (the product)
  2. Process
  3. Idea
  4. Technical (quality and utility of the technical features of the product)
  5. Documentation (research/ design/ planning/ evaluation etc)
  6. Interview (student’s ability to articulate understanding, utilisation, application etc)
The unique feature of this system was a ‘negotiation’ aspect – students could argue to alter the assessment weighting of the fields for their individual work. This means that in some cases more emphasis can be put on the product, and in others more emphasis on the process. This could be an adaptable model for many subjects. Perhaps the student–tutor ‘negotiation’ would not be appropriate for all levels, disciplines or cohorts, but tutors could adapt the weightings themselves to best suit the particular assessment.

The way forward

Of course, developing assessments like this would require boldness and risk-taking from educators to deliver within modules and programmes! Perhaps one way to work towards the assessment of creativity would be to pilot small activities within modules, weighted as only a small percentage of the assessment at first. A good example of this is the Organ Donation Project (see case study), which was originally a non-assessed component of a module, then 5% of the assessment, and is now 10%. Or how about introducing pass/fail elements? Or starting with an extra-curricular pilot?

So, what do you think? Do you think it is better to assess an output or a process? How do you feel about some of these assessment methods? Do you even think it is possible to assess creativity? What other ways could we use to help students engage with this enterprise capability? Let us know in the comments below.

If you're a University of Sheffield member of staff and you want support to embed innovation and creativity in your curriculum, please visit our website and get in touch.


Fennell, E. 1993. Categorising Creativity. Competence and Assessment 23: 7.

Jackson, N. 2008. Tackling the Wicked Problem of Creativity in Higher Education. Guildford: University of Surrey.

Kleiman, P. 2005. Beyond the Tingle Factor: creativity and assessment in higher education. Paper presented at: Economic and Social Research Council (ESRC) seminar, University of Strathclyde, Scotland, November 2005.
