Assessment – some notes

Edited by Derek Jones and Jolanda Morkel

Meetup 02 focused on assessment and in particular on what changes design educators are considering in response to current crises.

You can find the live notes from the session here.

If anyone from the session spots anything we’ve missed or wishes to edit anything then please let me know and we’ll update this post.

Overriding values

There was a general sense in the meetup that we are no longer in ‘business as usual’ and that this inevitably requires a response in terms of assessment: what we do; how much we do; when we do it; the conditions under which it should now take place, etc.

But it was also recognised that this might be difficult in terms of institutional policies, which may not be entirely suited to current circumstances, and that certain explicit assessment criteria may not be particularly appropriate right now.

At the same time, quality of some kind must be maintained if we are to respect our students and their achievements. The discussion tended towards recognition of individual development and achievement in students, rather than any particular extrinsic measure of success. But it was also acknowledged that this can be further complicated by professional or even legal requirements in some disciplines (e.g. architecture).

Basically, some negotiation between the absolutes of assessment, policy and regulations, and the recognition of individual learning, development and capability is required.

And on a practical note, we’re design educators – we’re used to this tension between transactional outcomes and the underlying value of a design education! (PS – See Orr and Shreeve (2018) for a really good book on exactly this!).

So this post focuses on things we might consider and can do as design educators in this space between formal structures and what’s actually happening.

Before we start, a couple of linked pieces are well worth reading for more context on how assessment is being approached: the Inside Higher Ed piece on assessment and, from that article, Laura Gibbs’ (@OnlineCrsLady) list of HE policy and operational responses to assessment.

And a mini list of assessment forms, types and modes is well worth looking through for alternatives – even in design education.

Strategy and policy in assessment

A few top-level considerations first:

  • Purpose of assessment: it’s worth doing the basics here and simply (re)stating as clearly as you can what the point of your assessment is – and more importantly what it doesn’t need to be for, or do. Things are no longer normal so with curriculum changes should come assessment changes too.
    • Try making a diagram to help prioritise assessment criteria: absolutely required; ideally required; nice to have. What can you get rid of? What can you assess in other ways?
    • Use these priorities to form the basis of summative and formative assessment returns themselves (e.g. pass/fail grade; negotiated summary sheet with learning summary)
  • Hidden Curriculum: There is a significant amount of hidden curriculum in design education – implicit behaviours; practitioner knowledge; apprenticeship learning. Some of it is hidden because (some would argue) it’s impossible to make explicit (e.g. embodied practice; qualities of aesthetics). Regardless of your position on such implicit matters, it might now be harder than ever to demonstrate these (at a distance) and make them suitable for assessment.
    • Consider the assumptions in your assessment and try to make them visible. Ask whether they are really necessary for assessing your students.
    • Try to make your implicit assessment explicit to students (or at least describe what it is about) or consider removing it if it’s not absolutely necessary.
    • If it has to stay, consider adding conditions to any grades – an asterisk* to denote the exceptional circumstances under which the assessment took place (see the importance of doing this in this article).
  • Invisible learning: A lot of learning does take place without a teacher being present! Like the hidden curriculum, a lot of this learning is often not recognised or acknowledged. Of course, much of it is an implicit, unassessed part of the curriculum (e.g. that time working with student peers leads to far more learning than just a ‘working with others’ learning outcome).
    • Another way of considering alternative assessment is to take account of what else is being learned (especially in the current context). When you look at student work and ignore what was supposed to be learned, what else is demonstrated in their work? Can you offer acknowledgement of this? For example, managing a design workload effectively at a distance is quite a useful ability (particularly when there are other pressures).

When designing and creating assessment

  • Redesign your assessment: even if you have to keep the same assessment, you should go through a redesign to understand the consequences of it in the current context. Even if you decide to make no changes, you will still be more aware of the implications. Even at the OU, where we are set up to: 1) do assessment at a distance and still support individual student needs and 2) scale this reasonably readily (not easily – a lot of people are working really hard behind the scenes to make this work), we are still having to do this to make sure we consider our students.
  • Consider your students’ circumstances: If you are unsure, design for the worst imagined case (e.g. time-deprived student; caring responsibilities; poor internet connection (low bandwidth; low connectivity; intermittent connection); higher degrees of uncertainty)
  • Test and prototype your assignments: Work through the assessment yourself (as much as you can) to prototype and test it (and see tech testing using older tech below). If you can’t do it, maybe you shouldn’t expect students to…
  • Re-test and re-prototype your assignments: If you are planning on keeping your existing assessment strategy, form and mode, then it’s still worth testing. In particular, test any tech-dependent activity on older equipment, operating systems, browsers etc.
  • Be responsive to students feeding back on instructions: if a student is having problems understanding what is required, chances are other students are also having problems.
  • Avoid high intensity, high workload, longitudinal, high stakes assessments. For example, a large-scale, long-term project with little feedback and a single point of assessment is a pretty toxic combination just now in terms of having multiple failure points.
    • If any of these conditions are a necessary assessment criterion then perhaps isolate them and support and assess them appropriately (e.g. long project – set up appropriate tuition points with good feedback points that are assessed and can be used later if circumstances change).

The object(s) of assessment

  • Flexible timings of assessment: Many institutions already support flexible deadlines to allow for unpredictable context changes to student circumstances. But also consider when (and for how long) students will carry out work to complete assessment and adjust it appropriately to contexts. How far can your institutional policies and procedures be stretched to accommodate an increase in requests?
  • Flexible modes of assessment: a few attendees noted that their modes of assessment would probably have to change (e.g. face-to-face jury crits are unlikely).
    • Consider possible online modes to assess design process and outputs. For example, synchronous (in realtime) or asynchronous (over time) modes are possible online. The webinar or online conference is an example of a synchronous mode, through which live online crits can be conducted. Asynchronous modes include, for example, online journals, video recordings and narrated presentations. 
    • The choice of mode will depend on what is appropriate for the discipline/ study area, the level of study, available tools and internet access, as well as the students’ circumstances etc. (see the list of forms, modes and types of assessment above).
  • Elective assessment: a few attendees noted that they made use of elective components of assessment. Whilst this will very often depend on the existing curriculum and how far along with this the students are, there may still be an opportunity to create elective options around assessment (modes, content, forms etc.).
  • Differentiated assessment: the idea of allowing students to elect different levels of assessment was raised, for example, a student wishing to pass only, undertaking assessment appropriate to that level of outcome. [REQUEST: Does anyone have any models of this they could share?]

Parameters of assessment

  • Consider open deadlines: Common in Sweden are fully open deadlines, where students submit their work at any point in time.
    • This can make it difficult to plan time and resources far in advance for assessment but the group did generally feel that this was inevitable and part of what we would have to deal with. But, if we are negotiating assessment, is there an opportunity to negotiate both submission and return?
    • Perhaps more important is having checks in place to ensure that students can develop their work to a suitable level ready to be assessed – that they are aware of the expectations and assessment levels (regardless of deadline openness).
    • A Cape Town (CPUT) model allows students to submit work up to the last day of the assessment cycle. Should students have no valid reason for the late submissions, a maximum of 50% can be obtained. However, if they have extenuating circumstances, a maximum of 100% will be allowed.
  • Consider extenuating and special circumstances: Again, most participants noted their institutions allowed for special and mitigating circumstances. It was acknowledged that the use of these was likely to rise.
    • Consider ways to make decisions around circumstance easier, such as deciding upfront how you will assess the most likely circumstances or scenarios that may arise.
  • Consider summative assessment forms: a few participants discussed a range of measures used in their institutions:
    • Consider a simplified pass/fail approach to summative assessment rather than ranking, banding or grading. If this aligns with good interim feedback and formative assessment, it should mean that very few students get to the assessment point and then fail.
    • Consider using weighted assessment if appropriate. For example, can work completed to date be re-weighted to still meet the quality of assessment? Can some types of assessment be (re)weighted to suit student circumstances?
    • Consider using substitution of components of work or assessment, either by lowest summative contribution or by agreement with students (qualitative).
  • Consider negotiated assessment: Assessment strategies can be co-designed by and negotiated with your students. This is a student-centred approach that may improve buy-in and student commitment, through empowering the student (and lecturer). [REQUEST: Does anyone have any models of this they could share?]

 As a final thought, the meetup was a really useful reminder of how valuable it is to share ideas and practice with colleagues: perhaps a reminder that designing assessment should not be an isolated or solitary activity.


Orr, S. and Shreeve, A. (2018) Art and Design Pedagogy in Higher Education: Knowledge, Values and Ambiguity in the Creative Curriculum. London: Routledge.

Published by Derek Jones

Derek Jones is a Senior Lecturer in Design at The Open University (UK), part of the OU Design Group, and the Convenor of the DRS Pedagogy SIG. His main research interests are: the pedagogy of design and creativity, embodied cognition in physical and virtual environments, and theories of design knowledge.
