Friday, 9 February 2018

Seeds for Solutions: Innovation Projects from 2015/16

Project Title: An e-Learning Platform to Promote Active Learning through Screencast Technology
Project Leader(s): Laura Hancock


The use of screencasts to supplement learning is becoming commonplace in higher education but there are concerns that their use may promote mostly passive learning. The aim of this project is to create an e-learning platform that provides the appropriate scaffolding to stimulate active learning in chemistry by combining screencast technology with interactive quizzes to deliver instant feedback. Users of this resource will be required to demonstrate recall and understanding of basic concepts before gaining access to higher level screencast material, allowing users to construct their own knowledge and discouraging a passive approach to learning.  
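
As a rough illustration of the gating idea only (a minimal sketch, not the platform’s actual implementation; the pass mark, topic names and function below are assumptions made for the example), each screencast is released only once the quiz on the previous level has been passed:

    # Minimal sketch of quiz-gated release of screencasts (illustrative only).
    # PASS_MARK, the topic order and the scores are invented for this example.
    PASS_MARK = 0.8

    def unlocked(screencasts, quiz_scores, pass_mark=PASS_MARK):
        """Return the screencasts a learner may view: each level is released
        only after the quiz on the previous level has been passed."""
        available = [screencasts[0]]
        for previous, following in zip(screencasts, screencasts[1:]):
            if quiz_scores.get(previous, 0.0) >= pass_mark:
                available.append(following)
            else:
                break
        return available

    print(unlocked(["basics", "intermediate", "advanced"],
                   {"basics": 0.9, "intermediate": 0.6}))
    # -> ['basics', 'intermediate']; the advanced screencast stays locked.
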
Creative Commons License
An e-Learning Platform to Promote Active Learning through Screencast Technology by Laura Hancock, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.



Project Title: Personalised Immersive Learning: Using Virtual Reality Headsets to Provide Engaging and Flexible Clinical Learning Opportunities
Project Leader(s): Mel Humphreys, Luke Bracegirdle, Pete Lonsdale, Tim Smale, Daryl Kerr and Ian Wood


New virtual reality headsets allow us to provide students with personal, immersive environments where they can take part in simulations of clinical episodes. Building on previous work between the Schools of Pharmacy and Nursing & Midwifery, we will be extending existing work on the Virtual Ward to allow access by a greater range of students from more diverse settings, and utilising a variety of scenarios. Students will be able to immerse themselves within clinical environments, interacting with patients and healthcare teams to explore the essential skills of teamwork, communication, and meaningful decision making in an authentic and safe setting.

Personalised Immersive Learning - Final Project Report

Creative Commons License
Personalised Immersive Learning: Using Virtual Reality Headsets to Provide Engaging and Flexible Clinical Learning Opportunities by Mel Humphreys, Luke Bracegirdle, Pete Lonsdale, Tim Smale, Daryl Kerr and Ian Wood, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Friday, 2 February 2018

Seeds for Solutions: Innovation Projects from 2015/16

Project Title: Technology inspired student-led interactive drug design
Project Leader(s): Mike Edwards & Tess Phillips


An interactive, technology-inspired group exercise to encapsulate the process of drug design in the teaching of Medicinal Chemistry is proposed. Students will collaborate in small groups to distil the important theoretical concepts studied into the design of new drug candidates based upon real-world examples. Built upon the intuitive visual interface of the iPad app Asteris, the exercise draws upon the benefits of student-led group work to provide a fast yet innovative approach to the teaching of medicinal chemistry, and leverages communication technology to display the results. We envisage that this process will have broader application to other disciplines.
Creative Commons License
Technology inspired student-led interactive drug design by Mike Edwards & Tess Phillips, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.



Project Title: eANT: Electronic Annotation to Enhance the Flexibility of Teaching, Assessment and Feedback
Project Leader(s): Elizabeth Symons & Reinhold Heinlein


This project will explore the use of the digital-pen facility on a Windows-based tablet to enhance student learning in two ways:

(i) Revisions to Blackboard grading allow electronically submitted assessments to be annotated as if with ‘pen and paper’. This is most suitable for symbol-rich disciplines and will greatly enhance the value of feedback for students.

(ii) Presentation of course material (lectures and computer workshops) will be improved by digital-pen annotation of statistical software output during sessions, which will help student understanding during the demonstration. It will also assist students’ review of material, as the annotated output can be saved to the KLE.


eANT - Final Project Report

Creative Commons License
eANT: Electronic Annotation to Enhance the Flexibility of Teaching, Assessment and Feedback by Elizabeth Symons & Reinhold Heinlein, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.



Project Title: Creating Active Student Learners through Team-Based Learning
Project Leader(s): Graeme Jones, Tess Phillips, Chloe Harold, Laura Hancock, Falko Drijfhout & Stuart McBain


We propose to introduce Team-Based Learning into problem sessions within Foundation Year, Chemistry and Forensic Science and into workshops that are part of the Keele MBChB programme. A comparative evaluation of student performance data will be undertaken as part of the introduction of TBL into Foundation Year, and questionnaire data will be gathered across all subjects. We will disseminate our experiences of TBL within the University and provide TBL resources so that colleagues can incorporate TBL into their own courses. We will also present our findings at national teaching events in our own subject disciplines. Those interested in team-based learning and in using the IF-AT cards are asked to contact Graeme Jones (g.r.jones@keele.ac.uk).


Creative Commons License
Creating Active Student Learners through Team-Based Learning by Graeme Jones, Tess Phillips, Chloe Harold, Laura Hancock, Falko Drijfhout & Stuart McBain, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Friday, 26 January 2018

Seeds for Solutions: Innovation Projects from 2015/16

Project Title: The Comorbidity Conundrum: an Integrated e-Learning Workspace and Mobile Application
Project Leader(s): Claire Rushton, Julie Green, Pete Lonsdale, Pauline Walsh & Umesh Kadam


Multimorbidity is an important challenge for current healthcare practice but has not yet been included in health education programmes. At SNAM a ‘6C comorbidity education framework’ has been developed to facilitate the inclusion of multimorbidity concepts into the current curricula. An integrated and interactive e-learning workspace and mobile application will activate this framework for student learning. In this workspace the framework will be applied to comorbidity patient cases with linked interactive activities and live communication opportunities for shared learning. The workspace will be fully integrated into current curricula and supported by a mobile application for transfer of skills into practice.

Creative Commons License
The Comorbidity Conundrum: an integrated e-learning workspace and mobile application by Claire Rushton, Julie Green, Pete Lonsdale, Pauline Walsh & Umesh Kadam is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.



Project Title: Factors for Consideration in Learning Analytics; Analysing Student Activity on the KLE to produce a more personalised and supportive system of education
Project Leader(s): Ed De Quincey & Theo Kyriacou


Traditionally, a student's progress and level of engagement have been measured by assessment and physical attendance. However, a student's day-to-day interactions with a University generate other real-time measures, e.g. VLE interaction. The analysis of this data has been termed Learning Analytics (LA). Following on from successful work at the University of Greenwich (de Quincey and Stoneham, 2014), this project aims to identify potential sources of data at Keele that are suitable for LA, and to explore how they can be used to produce a more personalised and supportive system of education, in the form of a Learner Dashboard.
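
As a rough illustration of the kind of measure a Learner Dashboard might build on (a minimal sketch, assuming a hypothetical CSV export of VLE click logs; the file name, column names and threshold are invented for the example, not the actual KLE schema), the snippet below simply counts each student's logged interactions per ISO week:

    # Minimal sketch: count VLE interactions per student per ISO week from a
    # hypothetical CSV export with columns student_id, timestamp, item.
    import csv
    from collections import defaultdict
    from datetime import datetime

    def weekly_engagement(path):
        """Map (student_id, iso_week) -> number of logged VLE interactions."""
        counts = defaultdict(int)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                week = datetime.fromisoformat(row["timestamp"]).isocalendar()[1]
                counts[(row["student_id"], week)] += 1
        return counts

    counts = weekly_engagement("kle_interactions.csv")  # hypothetical export
    # Example dashboard-style query: students with little activity in week 12.
    quiet = sorted({s for (s, wk), n in counts.items() if wk == 12 and n < 5})
    print("Low VLE activity in week 12:", quiet)

Simple counts like these would only be a starting point; the project's aim is to identify which Keele data sources are actually suitable and how best to present them to students and tutors.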

Factors for Consideration in Learning Analytics - Final Project Report

Creative Commons License
Factors for Consideration in Learning Analytics; Analysing Student Activity on the KLE to produce a more personalised and supportive system of education by Ed De Quincey & Theo Kyriacou, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.



Project Title: The JADE Student Learning Conference 2016: The development of a University-wide undergraduate research conference
Project Leader(s): Chris Little


This project will deliver an undergraduate research conference in June 2016, open to all UG students to deliver verbal and poster presentations. Furthermore, the JADE journal will guarantee publication for award-winning presentations and publish a special Student Learning edition detailing the conference proceedings. The editorial board will be made up of staff and students, with the long-term aim that the conference becomes entirely student-owned. The conference will give UG students the space to pursue and present research interests and learning complementary to and beyond summative assessment requirements.

Creative Commons License
The JADE Student Learning Conference 2016: The development of a university-wide undergraduate research conference by Chris Little, Keele University is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Friday, 1 December 2017

Purple Pens: Enhancing Assessment Literacy and Student Engagement with Feedback through Students Writing Their Own Feedback

Dr Dave McGarvey, School of Chemical and Physical Sciences, Keele University

From an abstract submitted to https://www.heacademy.ac.uk/stem-conference-2017-poster-abstracts

The aim of this session is to describe and illustrate our experiences of a deceptively simple but effective strategy for improving the quality and timeliness of assessment feedback in large classes through the use of a tutor-led dialogic technique that involves students writing their own feedback using distinctly coloured pens. The objectives are to stimulate discussion of the distinctions between passive receipt of tutor-written feedback and students writing their own feedback in a tutor-led dialogic environment with a view to further enhancing students’ engagement with feedback and feedback literacy.
The UK Quality Code for Higher Education (Chapter B6) articulates indicators of sound practice as a basis for effective assessment [1]. Indicator 6 (developing assessment literacy) states:

‘Staff and students engage in dialogue to promote a shared understanding of the basis on which academic judgements are made’ [1].

This is followed by a narrative that commences: ‘Engaging students, and making use of examples and/or self and peer assessment activities, where appropriate, helps students to understand the process of assessment and the expected standards, and to develop their assessment literacy’ [1], and this captures the essence of the work described here. The use of dialogic feedback cycles provides examples of alternative approaches [2].

In the physical sciences the use of regular high-value, low-stakes paper-based assessments that typically involve calculations, analysis and interpretation of scientific observations and data (e.g. problem sheets, in-class tests) is common practice. Tutor experiences of marking such assessments are invariably characterised by observations of common errors/misconceptions, so much of the feedback provided is repeated again and again, a problem that is exacerbated in large classes. The desirability of rapid turnaround times coupled with large classes presents challenges for the provision of detailed feedback, and the effort is often undermined because many students do not understand the feedback, or do not read it and only look at their mark.
We (David McGarvey, Laura Hancock, Katherine Haxton, Michael Edwards, Martin Hollamby) have recently trialled a tutor-led dialogic self-assessment method to enhance assessment literacy and feedback in selected high-value, low-stakes 1st year Chemistry assessments. The elements of the approach comprise: (i) tutors surveying (but not marking, or writing feedback on) completed assessments to inform the feedback to be provided; (ii) prompt return of the unmarked work together with a distinctly coloured pen under controlled conditions; (iii) interactive tutor-led assessment, during which students mark and write feedback on their own work with the distinctly coloured pen; (iv) collection of the scripts to review marking and feedback annotations; and (v) return of the work within a subsequent timetabled session.

From a detailed evaluation we have learned that students value this approach to provision of feedback for a variety of reasons, not least that the students have some autonomy over the feedback and can engage in dialogue with the teacher and peers.

‘I can make notes that make sense to me/explain things in the way that I understand them’ (Keele student)
It is also quite efficient and provides an insight into students’ engagement with feedback. Detailed outcomes of the final student and tutor evaluation and examples [3] will be presented and discussed. Practical advice on adapting the methodology will be provided.

1. UK Quality Code for Higher Education, Chapter B6 (2013), http://bit.ly/2acRADP.

2. Chris Beaumont, Michelle O’Doherty & Lee Shannon (2011). ‘Reconceptualising assessment feedback: a key to improving student learning?’, Studies in Higher Education, 36:6, 671-687.

3. We thank Lydia Bennett (Keele chemistry undergraduate) for helpful comments and permission to use her annotations as examples.


       
Creative Commons License
Purple Pens: Enhancing Assessment Literacy and Student Engagement with Feedback through Students Writing Their Own Feedback by Dr Dave McGarvey, Keele University is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com/2017/11/developing-academic-practice-at-level-4.html.

Thursday, 16 November 2017

The lecture is dead, long live the lecture, By Peter G Knight. School of Geography, Geology and the Environment, Keele University

When I first started teaching, back in the Stone Age, I went on a training session to develop my skills in lecturing. Yes, we had training sessions even way back then. And in that golden, wild-west age before health and safety, political correctness, or quality assurance, those sessions were real humdingers. They must have been, because in those days a single 2-hour session was all the University required for a youngster like me to make the transition from newly qualified PhD graduate (no teaching experience required) to fully-fledged lecturer. Training was a session, not a programme. And I think it was optional.

My training session involved me and half a dozen other new staff members each delivering a short example of their lecturing, and then discussing our different approaches. I think we may have been filmed. Perhaps it was set up as part of the session’s cunning design, but watching somebody nervously reading their lecture from an over-prepared script written down on index cards was possibly the best lesson in lecturing I could have had at that time. “Look how bad it can be. Don’t be this.” Even in those primitive times, and even as a youngster, I understood that a University Lecture was not supposed to be like that.

Now, thirty years later, we have recourse to a substantial literature telling us how useless the formal lecture is as a teaching tool. But I like lectures, and I think they still have their place in our teaching armoury. Not the lectures where somebody reads off a script, even if (especially if) the script is nowadays projected on PowerPoint, published on the virtual learning environment and available for replay on the PlayBack system. That’s not what I mean when I talk about lecturing.

For me, the lecture is not mainly about information delivery. Information can be delivered more effectively in other ways. If your idea of a lecture is reading out information from a script, cancel the lecture and post your script online. If you like the sound of your own voice, make a podcast. Recordings are great for students who want to listen to your pearls of wisdom while they do the washing up, walk to the park, or fall asleep at night. But where large numbers of students want face-to-face access to the individual expertise of a small number of teaching staff, the lecture remains an effective way of teaching… as long as you are careful with it! For me, the lecture is primarily about route laying, signposting, and motivation. There is some information content in my lectures, but the real aim is to show students the learning territory that lies open to them, and to motivate them to want to go and explore that territory. The lecture is a facilitating tool, not a content holder.

So what simple steps can we take to make our lectures more engaging and effective?

There are lots of different ways of doing this, depending on your own course context and teaching style, but for me it has been effective to use a blended learning approach in which the lecture is the glue holding together a range of other media. For example, a lecture might have strong online backup including a short topic summary and readings from both undergraduate textbooks and advanced research sources, so I can be sure that students have access to the core content even if I don’t go through all of it in detail in the lecture. Students can be encouraged to do pre-reading for the lecture (not just post-reading) so that they come along already clued up (and perhaps even with questions) rather than turning up saying “what are we doing today?” (or, worse still, if they say “what are you doing today?”). Preparation can be encouraged and enhanced using resources such as YouTube mini-lectures that flag up things for students to wonder about in advance. Or you could post material onto a module’s Facebook page. Students are then familiar with the key points before we start, and the lecture can operate at a higher level than if you needed to run through the basics for 15 minutes. You can see an example of these pre-lecture mini-lectures on YouTube at https://www.youtube.com/watch?v=MKahbVbo2Ec

For courses where students do preparatory work such as pre-videos or pre-reading, the lecture sessions can be improvised in response to in-class student questions/comments about what they have already done. It is usually easy to predict what students will want to learn more about (indeed you can steer them with the pre-resources that you provide), so you can still “write a lecture” in advance if you want, but then deliver it as a response to the questions they bring from their prep-work. Alternatively, you can simply give the bits of the lecture that become relevant as they ask questions about the video or the reading.

If you don’t want to set up preparatory resources, but plan to “stand and deliver” for 50 minutes, you don’t have to simply stand and deliver for 50 minutes! One effective approach is to break the session into manageable chunks, and to use each chunk to achieve a specific goal.

Here’s an example breakdown of a 50-minute 1st-year lecture, following a model that has worked well for me:

  • minutes 0-5: “establish a teachable moment” – in other words, do something that puts the students into a frame of mind where they want to learn. They won’t learn just because you force information on them. They will learn if they feel the desire or need to know something. This can be achieved in different ways. One basic approach is to ask them an interesting question (interesting to them) to which they don’t know the answer or to which you show the answer is not what they always thought. Essentially you need to make sure at the start of the lecture that the students are curious to know more about whatever it is you are covering. 
  • minutes 5-20: flag up the key issues in your topic of the day. This is the “core content” section of the lecture, and needs clear signposts and subheadings so students know exactly what to go away and read up on. Remember that you don’t have to teach them everything in class, just show them that it is there and help them to realise that it is important and interesting.
  • minutes 20-25: short break, with a reminder that students could take this opportunity to review their notes from the previous 20 mins and identify questions they might want to ask.
  • minutes 25-40: present a case-study or counterpoint example (perhaps from an important research paper) that draws together key themes from the day’s topic and perhaps illustrates them in an applied context (or from a perspective that will help shed light on what you did in minutes 5-20).
  • minutes 40-45: time to deal with student queries and comments about what you’ve done (including opportunity for them to ask questions they thought of in the mid-lecture break), and time to reiterate your key point. This might be a good point to throw in a quick Mentimeter activity to get student feedback on what they feel they have understood well or have found difficult in the session. Not heard of Mentimeter? Check it out at https://www.mentimeter.com/. It is one of a whole raft of interactive tools that are available to help make lectures more engaging. If you don’t want to use technology, then a good old-fashioned 2-minute talk-to-your-neighbour buzz group can work well at this point, too.
  • minutes 45-50: a closing activity to reinforce their learning and encourage them to do the follow-up work you may have set. One simple approach here is to give them a short self-assessed or peer-assessed mini-test on what just happened in the lecture. Alternatively, the old-fashioned “conclusions” slide still has its place! Better still, throw them a teaser to prime them for the next session.

Recently I set out to redesign an entire module using that kind of framework as a starting point. I didn’t really stick perfectly to the plan, but making a step in that direction was a big improvement on my previous style. Basically, instead of a 50-minute block covering a set of information, think of half a dozen short blocks of varied content, including student activities, designed to signpost and motivate. If you are adopting an approach like this because you think students have short attention spans (we all have short attention spans), then you can also help by making sure you switch occasionally between different modes of presentation. For example, if your core session at minutes 5-20 is delivered by PowerPoint, then perhaps try using the whiteboard, or a box of sand, or at least a Prezi instead of a PowerPoint for the case study section. Or you could use a video for the opening few minutes, then talk-and-chalk for a bit before going back into PowerPoint.

Mix it up. Stay lively.

The example above is just that: an example. I’m not pretending to be able to teach anyone how to teach. But having looked at my example, take this as your own teachable moment: I know you are thinking you could do it better, and that I’ve missed a trick, or a bit of technology, or a key pedagogic theory. Excellent, then my work here is done: now please go and develop lectures even better than I have suggested!

NB: Some of this content was previously published on Peter G Knight’s own WordPress blog.

Creative Commons License
The lecture is dead, long live the lecture by Peter G Knight, School of Geography, Geology and the Environment, Keele University is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com/.

Friday, 3 November 2017

Developing Academic Practice at Level 4 Mature (DAPL4M) Online

DAPL4M Online. What went wrong and why: learning from our mistakes.

Angela Rhead, Student Learning; Katy Lockett, SSDS; and Matthew Street, Student Learning / LPDC

Summary 
Student Learning launched the Developing Academic Practice at Level 5 (DAPL5) pilot in 2015/16. An intensive, seven-week open course in both semesters, DAPL5 supports Level 5 students to explore academic reading, thinking and writing practices in the context of their modules. Based on DAPL5’s success, and in partnership with SSDS’s mature student liaison officer, we piloted the DAPL4Mature (DAPL4M) course in October 2016 to re-engage mature students with learning confidently and increase awareness of the services available. A shorter, four-week course, DAPL4M focused on the preliminary aspects of scholarly study: making the most of handbooks; note-taking in seminars / lectures; managing reading lists.

With only a small number of the students who applied actually able to attend the Wednesday morning workshops (6/26), we launched DAPL4M as a closed online course in semester two. We envisaged that an online course, ‘attended’ at any time in the week but with a weekly ‘delivery schedule’ of content, discussion and online chat (see Table 1), would increase participation, perhaps also capturing students who had applied in semester one. We also hoped to engage a wider range of students not attracted to the face-to-face, communal workshop approach. To reflect the learning journey, we added a session on using feedback, experiences and work from semester one to shape development in semester two.

Ultimately, we attracted fewer students: thirteen applied, three of whom had applied to the first DAPL4M. Nine of those thirteen engaged in the pre-course ‘Getting to Know Each Other’ KLE discussion forum; two engaged partially in the Session 1 blog discussions on ‘Being Academic’; by Session 2, no one was participating. Additionally, no one attended any of the ‘Live Chat’ discussions, intended to explore questions created by the week’s tasks. After a silent ‘Live Chat’ session in week three of the course, we reluctantly decided to close DAPL4M Online, providing details about the Write Direction 1:1 academic coaching service should anyone want to continue to focus on their academic development.

Having closed DAPL4M Online prematurely, we wanted to reflect and explore insights gained from piloting an ostensibly ‘failed’ initiative. We assessed it (entirely subjectively) at a grade of 35% in terms of success, and then shared our thoughts on two questions:

1. Why did we not judge it less than 35%?

2. Why did we not judge it more than 35%?

We then considered how to apply our ideas to future projects or to our wider practices. Find out more here.

Creative Commons License
Developing Academic Practice at Level 4 Mature (DAPL4M) Online by Angela Rhead, Katy Lockett, Matthew Street is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.com/2017/11/developing-academic-practice-at-level-4.html.

Friday, 12 May 2017

Asking what they want won’t tell us what they need. Discuss. By Peter G Knight, School of Geography, Geology and Environment, Keele University



In one of my 1st-year tutorial exercises early in each academic year I ask the new students whether they trust their lecturers and whether they believe what they are told in lectures. The students normally say that they do trust us, and then we have the whole conversation about reliability of sources and the importance of checking everything - including what they are told in lectures - against evidence and peer-reviewed publications. It is an important exercise, but looking at this year’s student evaluations of my teaching I wonder whether I am asking the question at the wrong point in the students’ careers.

It is that time of year now when many of us are getting feedback on our teaching from students as they complete their end-of-year module evaluation forms. When I started lecturing 30 years ago there were no such forms, and I confess with some shame to having played a substantial role in developing and introducing student evaluations. The kinds of feedback that we can pick up through these anonymous forms, or, increasingly, anonymous online surveys, are different from the feedback we used to get in the old days by having conversations with our students. Perhaps the anonymity and distance of the feedback form, rather like the distance afforded by interactions on social media, change the way that people respond when asked to offer, or vent, an opinion. Even in the years since we started using student evaluations of teaching, gradual changes in the nature of the students’ comments have reflected significant changes in our learning and teaching environment.

Even in the best-case scenario, student feedback forms are to be treated with extreme caution, especially when they are read by career-young teaching staff. The level of polite professionalism that we try to maintain in our own communications with students is not always reciprocated by some individuals as they deliver their anonymous feedback to us. In my current role I see feedback addressed to a lot of different staff, including inexperienced staff who are still being mentored, and I try to warn them, before they look at their first batch of forms, to be ready for the small percentage of responses that will be either irrationally hateful or inappropriately affectionate. At each end of the spectrum, there are usually a few forms that challenge the notion that every student’s opinion is a valid contribution to our course development process. One of my personal favourites was a verdict passed on my teaching by an anonymous student asked to comment on the merits and shortcomings of one of my modules, who wrote, drawing together his or her reflections on my year’s pedagogic efforts: “Dr Knight looks like a turtle”.

One of the biggest changes revealed by looking back over years of feedback on my modules is the change in student expectations. On the oldest forms I see students congratulating me for including projected 35mm photographic slides in my lectures. I used the occasional OHP if I wanted to push out the technological boat. The students were very happy with that. Gradually the comments changed to reflect the students’ satisfaction or boredom with PowerPoints, Prezi presentations, YouTube pre-lectures, the flipped classroom and a succession of virtual learning environments from WebCT (remember that?) through to Blackboard. The technological support that students now take for granted was not even imagined by previous generations. Students now will quickly complain about tutors who don’t provide online notes, copies of the slides, very specific set readings, and, now, captured recordings of the lectures themselves. But the students are quite right to expect the latest and best technology, and their feedback (if given thoughtfully) can help us to use it effectively.

Another change in student expectations, beyond the merely technical, is an increasingly prevalent assumption that learning should come easily. Perhaps it is connected to changes in technology. Almost any kind of basic information is now just a few seconds away, a few mobile thumb-clicks away, on Google. Even for more sophisticated academic materials Google Scholar, Web of Science, or the academic search engine of your choice makes even the CD databases of a few years ago seem stone age in comparison. I was brought up on index cards. I was trained to expect learning to be hard work. When you get right into the intellectual puzzles, learning still is hard work, but some students find this to be an unacceptable surprise. Only once, so far, has a student actually told me that they believe their £9,000 fees pay for the hard work to be done by me, rather than by them, but that kind of thinking is out there in the classroom now.

Increasingly over the last few years student feedback on my modules has started to include complaints that my teaching has given the students difficult intellectual challenges, or has required them to search for literature themselves, or has expected them to come up with their own research-project designs. This year one final-year student wrote in the “what could be improved about this module” box that I raised lots of questions for long discussions instead of just telling them the answers. I am sure that a few years ago, with a different generation of students, that comment would have gone into the “what went well in this module” box. Most students don’t like clashing deadlines, and most of us may agree with them, but if one of the learning outcomes of a study-skills module is to develop time-management skills, then a deliberately clashed deadline is a learning opportunity. In a research-design module, giving students a ready-made project deprives them of project-formulation experience that will be invaluable to them in future employment. If you are learning a difficult analytical technique, skipping the difficult or boring bits is not good training. We have to recognise that sometimes, a bit like being at the gym, gain requires pain. Learning requires hard work on the part of the students, not just their teachers.

It is important to recognise and respond to student feedback on our modules even if sometimes we think the student is missing the point, or if a poorly designed questionnaire has failed to deliver our questions effectively. Galling though it is when we know that the assessment criteria are clearly set out in the easily accessible handbook, and that they were explained at length in the opening lecture, to have a student say on the feedback form that no, the assessment criteria were not made clear in advance, we can’t just shrug it off. We must consider why, despite all our efforts, this student did not believe the assessment criteria to have been made clear. That piece of student feedback should lead us to look again at the handbook, the timetable, our lecture resources, the scheduling of big nights at the Students’ Union, or whatever else might be contributing to the problem. What we should not do is just ignore the feedback. If that were our plan, we should not ask for feedback in the first place.

But this leads us to a key question. What is it that we are asking for? We sometimes talk about “student satisfaction surveys”, as if satisfaction will be the measure of our teaching quality. It won’t. This is becoming very important as we anticipate the incorporation of student feedback into the TEF. Ensuring students’ short-term satisfaction in a way that will be reflected in their feedback is a different matter from ensuring their long-term learning, which might be most effectively won by painful hard work. If I were to design a module to make students give me the best feedback, it would not look the same as a module designed for the best learning outcomes. The danger in adjusting modules in response to poorly designed “satisfaction” surveys is that a student’s satisfaction may not equate to their learning gain. I currently lead two 3rd-year modules. One I would describe as competent but dull in its design, while the other has won external recognition as one of the UK’s most challenging, exciting and innovative modules. The competent but dull module has repeatedly and consistently scored 100% positive student satisfaction over the last few years. The exciting and innovative module has so far never scored 100%.

We altered some of our student evaluation questions recently in light of changes to the questions in the National Student Survey. One of our new questions is about whether students feel that staff value, and act upon, student feedback. As part of an institution and a subject area that genuinely does value and act upon student feedback I was confident that the responses to this question would be uniformly positive, but they were not. In the feedback on one of my modules, one student wrote in the “what could be improved” box that I seemed to teach the module the way I wanted to, rather than the way the student would have liked. They threw in for good measure the observation that this was arrogant and condescending of me. An old-timer like me does not get upset by personal comments as much as some younger colleagues may do, but I do take care to think them through. Reflecting on that particular comment I do believe that, on the whole, staff are well placed to know both what students will like and how students will most effectively learn. We try to strike the best balance between those, recognising that students may learn better if they are learning in a way that they enjoy, but also that some learning has to be hard won. From the student’s perspective, sometimes, only part of that is obvious.

That discussion that I have with the 1st-years about whether they trust their lecturers usually includes me inviting them to set out their own programme for the remainder of the course. Most years, the students decide that they would prefer me to do it, because, they say, at this stage in their careers they do not know what they need to learn or what is the best way to learn it. As one of them once said, their dentist never asks them how to proceed with the treatment, and they are happy to trust his judgement. I use that example myself now in those discussions, and ask them whether attending classes and doing assessments is a bit like going to the dentist: not always what they would choose to do for fun, but worthwhile and important nevertheless. If the students can see that the pain of a nine o’clock lecture or a challenging assignment is leading to an eventual benefit, and if we can earn and maintain their trust in us to be doing what is best for them when we schedule those classes and set those assessments, perhaps we can all do our best work.

So perhaps that is the key question that is missing from our surveys: do you trust your tutor to be doing what is best for your learning? If we have not earned their trust, that is something we really need to find out about. I don’t need an evaluation to find out whether I look like a turtle or if I am popular with the students. I need it to check whether they feel they can trust me to be doing a good job.

Creative Commons License
Asking what they want won’t tell us what they need. Discuss. by Peter G Knight, School of Geography, Geology and Environment, Keele University is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Based on a work at https://lpdcsolutions.blogspot.co.uk/.