On 4th-5th April 2002, fifteen members of staff from the Biological Sciences Department visited Myerscough College of Agriculture, near Bilsborrow, to discuss various aspects of assessment. The visit was funded by the HEDC from the Teaching Quality Enhancement Fund.
Assessment was chosen as the theme of this workshop because of its crucial role in the teaching and learning cycle. In the words of the Dearing report "it provides a principal vehicle for ensuring the standards appropriate for awards, as well as influencing the direction and form of student learning". Assessment has been perceived to be a weak point in many programmes of study and has traditionally been given less consideration than course content and delivery.
The main aims of the residential were to:
Ali is the Teaching and Learning Development Coordinator for the HEDC and is the Programme Director for the University's Certificate in Learning and Teaching in Higher Education (CiLTHE). Ali has a particular interest in assessment.
Maureen is Curriculum Design and Development Coordinator for the HEDC. One of her roles is the implementation of student progress files in the University.
Alan is the Director of Undergraduate Teaching in the Biological Sciences Department and part-time faculty tutor for the CiLTHE programme. Alan's role is to coordinate the Department's teaching, learning and assessment strategy and he organised this workshop.
As well as being Co-director of the LTSN Centre for Bioscience, Ian is Professor of Pharmacology at the University of Leeds and has many years' hands-on experience of laboratory teaching.
Ali Cooper led the first session, which began with an introduction to assessment and the reasons for its importance. We were told that assessment has a major influence on what students learn, how we teach, how students and we as teachers feel, how students organise their studies, how individuals are able to progress and how we spend our time. Ali reported that research on assessment has shown that students will change the way they learn, and what they learn, to meet the assessment demands. This strategic learning means that high performance may not be the best indicator of student learning. We then indulged in some role play to explore the reasons for assessment from both staff and student viewpoints. We were individually asked to take on the role of either a member of staff or a student and to write down as many reasons for assessment as we could think of. Pairs of "students" and pairs of "staff" then got together to compare responses. We then formed groups consisting of two "students" and two "staff" to compare the different attitudes to assessment. Ali then asked for examples of "student" reasons for assessment. These included:
"Staff" responses included:
Much of the ensuing discussion centred on feedback. It was generally felt that spending a lot of time writing feedback can be unhelpful to students if it comes at the wrong time, for example after students have moved on to a new module or have finished a major piece of work such as a dissertation. Giving feedback can also be demoralising for staff, as there is evidence that only around 20% of students actually read it.

One solution to the time spent on writing feedback may be to use feedback-generating software, which was described by Bob Lauder of the department. This allows a personalised set of comments to be put together from a pre-existing list; these are then printed out and handed back with the marked work. Bob reported that, while the software is easy to use, fewer than half his class had picked up their feedback sheets.

One solution to the problem of lack of student interest in feedback, offered by Ali, may be to involve students more actively in the assessment process. Students could negotiate the criteria for assessment. Even if the lecturer's criteria are the ones used when marking, students will at least feel more involved with the assessment process and be more likely to take note of feedback. Self-assessment could be used as a tool to encourage reflection among students: the criteria for assessment could be given to students on a cover sheet attached to the work, and they could be asked to judge themselves against those criteria and give themselves a mark. It was generally felt that this mark should not be used in the final summative assessment, although some departments do use self-assessment marks in this way. Students could also be asked which specific areas they would like feedback on. It was felt by some staff that the mark alone is often enough to motivate students (only 55… I'll need to work harder next time).
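The comment-bank approach Bob described can be sketched in a few lines. This is a hypothetical illustration only: the comment codes, wording and function names below are invented, and the actual software will differ.

```python
# Minimal sketch of a comment-bank feedback generator, in the spirit of the
# software described above. All comment codes and wording are invented for
# illustration purposes.

COMMENT_BANK = {
    "C1": "Clear statement of aims and hypothesis.",
    "C2": "Methods section lacks sufficient detail to repeat the experiment.",
    "C3": "Results are well presented, but error bars are missing from Figure 1.",
    "C4": "Discussion should relate findings back to the published literature.",
}

def build_feedback(student: str, mark: int, codes: list[str]) -> str:
    """Assemble a personalised feedback sheet from pre-written comments."""
    lines = [f"Feedback for {student} (mark: {mark}%)", ""]
    lines += [f"- {COMMENT_BANK[c]}" for c in codes]
    return "\n".join(lines)

# The marker selects the codes that apply; the sheet is printed and returned
# with the marked work.
sheet = build_feedback("A. Student", 62, ["C1", "C3", "C4"])
print(sheet)
```

The appeal of this design is that each comment is written once but reused across a large class, so the time cost of detailed feedback falls as class size grows.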
After lunch Maureen Callander led a session on transferable skills, qualities and attitudes. We were asked to consider learning styles according to the VARK (visual, aural, read/write, kinesthetic) model and to classify our own learning preferences. Perhaps not surprisingly for a group of scientists, most of us had a preference for kinesthetic learning, which involves learning by doing (for example by carrying out experiments). We then looked at the methods we prefer to use when teaching. In some cases there was agreement with our preferred learning styles, but this was not always true. Interestingly, the preferred learning approach of most current students is read/write, and yet this is not always our preferred teaching style. Student learning approaches are likely to change with widening participation, however, and the best approach would seem to be to use a variety of teaching styles.
Our next task was to look at various attributes and classify them as skills or qualities/attitudes. Some of these were straightforward, for example being able to speak a foreign language is clearly a skill and ambition is clearly a quality. Other attributes were less easy to classify, for example leadership might be thought of as a quality but it is possible to learn leadership skills.
When it comes to assessing skills, qualities and attitudes it was felt that, while we might properly assess transferable skills through formal means, qualities and attitudes are best assessed through informal means and reflected in references.
As an example piece of assessment, as well as to provide some light relief, participants were divided into three groups and sent on a treasure hunt around the College and surroundings. The academic aim of the treasure hunt was to illustrate some common faults with assessment assignments and to look at assessment from a student viewpoint. This was not, however, explained to the participants beforehand - they were simply given a set of cryptic instructions and questions to answer and were told that there would be a prize. At the end of the treasure hunt (which only two of the groups completed) participants were asked to say what was good and bad about it as a piece of assessment. The responses were as follows:
Good:
Bad:
It was interesting that the behaviour of the participants was exactly what we observe in students who are not happy about a piece of assessment!
Ian Hughes began this session by asking: What are laboratory skills? These can be divided into subject-specific skills such as isolating tissues, dissection and cannulation, and generic lab skills such as weighing, pipetting, making dilutions, centrifugation, chromatography, spectroscopy, organisation and time management. There are also other, non-lab skills such as professionalism and appropriate behaviour.
Next Ian listed some of the attributes we might look for in a skilled laboratory worker which included:
We were then reminded of the reality of laboratory teaching compared to 20 or 30 years ago. Today we have large numbers of students - the days of 6 students per class are gone - and this means that it is much more difficult to get to know students and to keep an eye on an individual's behaviour and performance in the lab. Large classes also mean that it is impossible to monitor each student continuously so there is a tendency to assess by snapshots of student practice which may not truly reflect ability. It is also normal for students to work in pairs or groups which creates difficulties in dissecting out an individual's performance. There is generally a shortage of staff and equipment and classes are timetabled for only 2-3 hours which means that laboratory activities are often dictated by what is possible rather than by educational aims and objectives. Students are much more likely to question marks and grades and, if necessary, appeal to higher authorities. Means of assessment must therefore be defensible, transparent and fair or we risk being sued.
Ian then gave us examples of some means of assessment which separate the lab skills from the write-up skills:
We were then divided into groups of 3 and allowed 30 minutes to agree an important laboratory skill for our students, 3 criteria to measure the extent of skill possession and the assessment methods to measure achievement of these criteria. We then gave short presentations detailing the skill chosen, criteria adopted, assessment measures for each criterion and practical difficulties we might envisage. Here are some selected examples of suggested activities (all assumed first-year classes of 50-70 students):
Criteria: Accuracy, Speed, Recording
Assess by: Weighing given volumes of liquid, impose time limit, recording of weights on data sheet.

Criteria: Spectrophotometer use, Accuracy, Choice of suitable wavelength
Assess by: Observation by demonstrator, linearity of absorbance vs. concentration plot, wavelength to be maximum absorbance value ±2 nm.
Problems: Availability of equipment and demonstrators.

Criteria: Accuracy, Following a method (to make a gel), Interpretation
Assess by: Quantification of PCR product bands on gel, marker bands in the right place on gel, calculation of size of unknown fragments.
Problems: Variability between gel kits.

Criteria: Accuracy, Safety, Recording
Assess by: Absorbance reading on spectrophotometer, filling in a COSHH form and observation of spills (blue), recording results in lab book.
Problems: How do you quantify safety? Faulty equipment.

Criteria: Correct pH, Correct concentration, Record keeping
Assess by: Demonstrator measures pH, student records absorbance on spectrophotometer after adding molybdate, correct recording of volumes, weights etc. in lab book.
Problems: pH meters will need to be standardised; should students be allowed multiple attempts?; lack of equipment.
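Each group's output pairs criteria with assessment methods, which lends itself to a simple tick-sheet for demonstrators. The sketch below is hypothetical (the skill name, wording and function names are invented) and merely shows how such a rubric could be turned into a printable checklist.

```python
# Hypothetical sketch: one group's rubric held as a data structure, from which
# a demonstrator's tick-sheet can be generated mechanically. All names and
# wording are invented for illustration.

rubric = {
    "skill": "Accurate pipetting",
    "criteria": [
        ("Accuracy", "weigh given volumes of liquid dispensed by the student"),
        ("Speed", "impose a time limit on the exercise"),
        ("Recording", "check weights are entered on the data sheet"),
    ],
}

def tick_sheet(rubric: dict) -> str:
    """Render a printable checklist pairing each criterion with its method."""
    lines = [f"Skill: {rubric['skill']}"]
    for criterion, method in rubric["criteria"]:
        lines.append(f"[ ] {criterion}: {method}")
    return "\n".join(lines)

print(tick_sheet(rubric))
```

Keeping criteria and methods together in one place also helps make the assessment defensible and transparent, as Ian's session emphasised.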
This session began on day one when participants were asked to consider whether unseen exams were necessarily the best way of testing our educational aims. Participants were given some examples of alternatives to unseen exams and were asked to list educational aims alongside the most appropriate means of assessment and the associated advantages and disadvantages. The table below contains a summary of the responses.
Aim | Assessment method(s) | Advantages | Disadvantages |
Obtain knowledge | Unseen exam | Easy to set and administer, cheating is difficult, students know what to expect | Encourages rote learning and regurgitation, causes stress and anxiety in students |
Obtain knowledge | MCQ | As above plus marking is easy | Doesn't assess higher level skills |
Understanding | Seen exam | Time to compose thoughts and formulate an answer | Collaboration among students possible, doesn't assess ability to think quickly |
Understanding | Open book exam | Less pressure to revise and memorise an answer | Only a limited amount of material can be brought into an exam |
Acquire practical skills | Practical exam | Tests the skill in question | Difficult to assess, requires a lot of staff time |
Numeracy | Computer-based assessment | Easy to mark | May not reveal intermediate working |
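The numeracy row above can be made concrete with a minimal sketch of computer-based marking: the student's final answer is checked against a model answer within a tolerance, which is precisely why intermediate working is not revealed. The function name, tolerance and example values below are invented for illustration.

```python
# Minimal sketch of computer-based numeracy marking: accept a final answer
# within a relative tolerance of the model answer. Because only the final
# number is checked, intermediate working is invisible to the marker.

def mark_numeric(answer: float, correct: float, tolerance: float = 0.01) -> bool:
    """Accept an answer within a relative tolerance (default 1%) of the model."""
    return abs(answer - correct) <= tolerance * abs(correct)

# A dilution calculation: 2 mL of 0.5 M stock made up to 50 mL.
model_answer = 0.5 * 2 / 50  # 0.02 M

print(mark_numeric(0.0199, model_answer))  # within 1% of the model answer
print(mark_numeric(0.025, model_answer))   # outside tolerance
```

A partial-credit scheme would need the student to submit intermediate values as well, each marked the same way, which is one route around the disadvantage noted in the table.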
On day two participants were split into three groups and asked to produce an overhead transparency that summarised their thoughts on exams and to suggest an alternative form of exam that would test students' abilities to synthesise information and make connections between different modules. Here are the suggestions:
Students would be given an edited paper lacking an abstract and discussion (and possibly also part of the introduction) and told to write the discussion in a 3-hour exam. Advantages of this form of assessment were considered to be that it tests critical analysis and is a real-life activity. Disadvantages are that it might be difficult to complete in 3 hours and there may be copyright problems (possibly solved by using our own rejected papers!).
Students would be given a set of 10 broad questions at the start of the module/course which would require material in depth and breadth from a range of lectures and further reading. Only three of the questions would be used in the exam at the end of the year. Advantages would be that students would know what to expect and could prepare material as the course progressed. A disadvantage might be that students would split the task, each group working on a different topic and exchanging information. Some felt that this would not necessarily be a bad thing.
This is a variation on the seen exam but in this case only a general area and the required approach would be revealed at the start of the year. There would be a limited choice of questions (possibly only one). It was felt that this would promote depth and synthesis but, again, that students might collaborate. It was considered by some that limiting choice too much might disadvantage certain students.
To evaluate the usefulness of the previous day and a half, blank sheets of paper were handed out to groups and comments invited. Here are some representative quotes which made us feel that we had achieved our aims and that the experience was an enjoyable one and worth repeating in future:
At the same time we acknowledged that, if the workshop was to be worthwhile, the good ideas would have to be brought forward and implemented as part of our common policy and strategy.
Report compiled by Alan Shirras on behalf of the Department of Biological Sciences.