Professional Learning

Educators are the lead learners in schools. If they are to enable powerful, authentic, deep learning among their students, they need to live that kind of learning and professional culture themselves. When everyone is part of that experiential through-line, that’s when next generation learning thrives.

Too often, we aren't tapping into the power of learning assessment to maximize the impact of faculty development. Here are ten common approaches to faculty development, along with ideas for changing that.

Faculty development in higher education comes in many shapes and sizes. It ranges from short tutorials to yearlong intensive training, from focused training on a discrete aspect of teaching and learning to programming that covers the full gamut of topics, and from offerings carefully personalized for an individual or a small group to those that are mass produced and broadly disseminated.

When it comes to assessment and faculty development, that range is just as varied. Sometimes faculty growth and development is carefully monitored, providing rich and meaningful feedback on faculty accomplishments. In other instances, however, “assessment” might be little more than a badge or certificate recognizing participation or completion.

I contend that too often we are not tapping into the power of learning assessment to maximize the impact of faculty development. There are so many wonderful and promising forms of faculty development. Now just imagine the possibilities if we started adding more intentional learning assessment strategies to them.

As a way to illustrate this, below are ten common approaches to faculty development in higher education. They are certainly not all distinct; there is plenty of overlap among them, but I separate them so I can more easily explore the role of assessment in each. After each one, I offer a couple of questions or ideas on how we can incorporate an assessment plan for the benefit of faculty and others.

The Exposure Approach

Many professional development offerings have stated learning goals or objectives. Yet in practice, the design of professional development is often really just to expose people to new practices and ideas and to expand their sense of what is possible (or advisable). We leave it to participants to decide what (or if) they will take away or learn. We might suggest ways to apply the ideas, but apart from a few discussion questions or polls, we don’t provide a way for participants to determine if they learned anything new or now have the confidence or competence to apply what they learned.

This is not a harsh criticism of the exposure approach; it is just a common limitation. The approach offers a resource for personal or professional growth and development, but it doesn’t necessarily include a way to assess what faculty actually took away from it. Yet even with an exposure approach, we can easily add simple entry and exit surveys focused on goals and whether those goals were met. We can include questions about how people intend to apply one or more of the ideas. We can invite people to reflect on persistent questions or areas of confusion. We can also add simple, low-stakes “quizzes” or checks for understanding as a form of feedback to the participant (and insight for the PD organizers).

The Exemplar Approach

Think of this as ‘show and tell’ for adults. Faculty members gather and offer short or lengthy showcases of what is working, often at an event called something like a “best practices” or “promising practices” session. The sharing is done by people who have tried these practices themselves. They essentially tell the story of what they did, what worked, and what didn’t. Others can watch, listen, discuss, ask questions, and find ideas to try out in their own courses.

Many faculty members describe this as a wonderfully practical and inspiring form of professional development. Yet when you go back and try something yourself, you are often doing so with little feedback. This is where we could supplement an event like this with simple self-assessment tools, such as a checklist that helps people plan out something similar, try it, and gather feedback on how it worked. This takes the exemplar approach from show and tell to show, tell, do, and learn.

The Tutorial Approach

Faculty have different and sometimes harried schedules, so getting a group that can meet at the same time and place can be challenging. That is why many faculty report a preference for self-paced professional development. They can work through the material at their own pace, at a time that works for them, and even in different places, usually taking advantage of delivery through the web or mobile devices.

We can easily add simple ungraded checks or quizzes to help faculty monitor their own learning and progress (and to provide important insights for instructional designers and others). We can build in self-assessment checks. We can add prompts that invite people to engage in some sort of follow-up activity with a colleague. We can also add an elective option for faculty in a tutorial to create or do something with what they learned and then submit it for direct or narrative feedback. These and many other options add that rich feedback element to tutorials.

The Course Approach

Plenty also decide to stick with the well-known course motif for professional development. Often offered in a non-credit format, these have the benefit of a more structured and lengthy learning experience, even if they are just three- to five-week short courses that meet online or in person once every week or two. They give people a chance to reflect, read, create, and try things out between sessions. As such, this is one of the models that more often includes a richer array of assessment and feedback mechanisms. This can involve badges, portfolios, peer assessment, self-assessment, or one-on-one feedback from a facilitator, among other options.

The Academy Approach

The academy approach, like the course approach, tends to be a deeper and more extended experience. People might gather in a cohort over a year or longer. The program might begin and conclude with an intensive or in-person experience, with shorter connections with the group in between. Assessment through coaching and mentoring, the use of portfolios, peer feedback, and much more can easily be incorporated to add a rich assessment element to such longer-term professional development programs.

The Mentoring Approach

A mentor is someone to whom you can turn for guidance and wisdom along your journey as a faculty member. In some contexts, new faculty members are assigned a mentor. Mentors often don’t set specific learning goals with the mentee. Instead, the relationship typically consists of a set of structured meetings, plus someone to whom the mentee can turn with questions and for tips along the way. The mentee might have a challenge in a given course, and the mentor can be a resource for working through it.

Adding formal assessment to this can sometimes change the nature of the exchange, so we want to be careful about that. Yet if there is a formal mentoring program in an institution, it doesn’t hurt to make assessment, monitoring, and feedback resources available to people who want to use them. Even having the mentee keep an informal learning journal about experiences over the course of the year can be a rich source of insights for the mentee and others, provided that the person is comfortable sharing it.

The Coaching Approach

Some use the terms mentor and coach interchangeably, but others will be adamant that we distinguish between the two. In this case, I offer the following distinction:

  • A mentor tends to have a broader type of relationship with a person.
  • A coaching relationship tends to be more focused upon specific goals, tasks, or outcomes.

For example, if I am struggling with writing quality tests in my class, I could find someone very skilled in testing to serve as a test-writing coach. The coach will give me tips on how to improve, and assessment is embedded in the experience. I will design tests, and the coach will give me feedback and a critique that allows me to improve my practice in this area. The coach might give me examples to use as a guide. The coach engages in a persistent give-and-take relationship with me, but it is focused on personal or professional development in a given area.

I don’t see coaching used frequently in higher education faculty development, but it can be quite powerful and effective. That is partly because it makes great use of integrated feedback and assessment, it is highly personalized, and it focuses on skill development, not just exposure to new knowledge or information.

The Peer Approach

Similar in many ways to coaching, the peer approach shows up in plenty of examples, especially within a given department or school, where faculty share with and help one another in their professional development. This can be done on a 1:1 basis or in small groups, where those who teach the same courses are able to compare notes on curricula and teaching models. They might give each other feedback on how to teach certain concepts, how to write syllabi, how to handle certain teaching and learning challenges, and much more. Faculty might sit in on each other’s courses, observe, and give feedback afterward. This, too, tends to have many natural and integrated forms of feedback and assessment.

The Self-Directed Approach

Having spent years developing expertise in a given discipline, faculty tend to be rather curious people with an inherent love of learning. As such, we learn independently as much as or more than we do through something constructed by another person or group. Some spend more time than others on professional development around teaching and learning, but self-directed learning is a large and important aspect of any faculty member’s development. We read books. We talk to colleagues. We reflect on our practice with the intent of learning from mistakes and improving. We seek out resources online and through professional associations. We examine current and emerging research. We experiment with new practices and learn from them.

Within this form of professional development, there are wonderful opportunities to incorporate a self-assessment strategy, such as setting goals and creating simple checklists and rubrics to monitor our progress. Or we can invite feedback from colleagues, often in a narrative and/or informal format. We might also create a portfolio of our work or keep some sort of learning journal that documents our thoughts, experiments, experiences, and learning along the way. These elements of an informal “assessment” plan help us become more aware of our progress and learning over time, providing valuable structure to a self-directed approach.

The Buffet Approach

We’ve all seen it. Whether it comes from a university center for excellence in learning and teaching or from a professional association, there is a “menu” of selections. Faculty members browse and pick what seems relevant, leaving the rest for others. These menus typically include a range of the specific forms of professional development that you just read about in this article.

Some schools even use software to track faculty participation in professional development over the year. Yet it is far rarer to have any system in place that makes faculty progress visible to themselves and others across various professional development activities. Over a year, a faculty member might watch a few webinars, attend a couple of lunch-and-learn sessions, go to a conference, have a peer observe a class and give feedback, and complete a short course on a topic of interest. These experiences are wonderful, but what changed? Did they translate into progress toward one’s professional goals in teaching and learning, for example?

As one of many possibilities, imagine a rich portfolio of work that a faculty member develops over one or more years, drawing from the lessons learned in various professional development activities, but framing them as support for one or more overarching goals. Or, consider the possibility of a robust digital badging system that issues leveled badges based upon one’s application of learning over a year or more.

There are countless other possibilities, but the core questions I offer here are these: First, how do we build a formal or informal assessment plan that helps make faculty growth more visible across professional development experiences? Second, what might be the benefits and limitations of doing so?

Conclusion

There are countless ways to approach professional development—far more than what you just read in this short list. Yet, I hope this illustrates the valuable role that feedback and assessment can play in any of them. An integrated assessment plan helps move us from simple exposure to a greater focus on growth and development. It helps to make progress and learning visible, which can increase motivation, confidence, and progress toward higher levels of competence. In addition, assessment allows us to measure the impact of our professional development. Ultimately, if used well and not in some unhelpful punitive form, integrating assessment into faculty professional development helps us to achieve what most of us want out of it: growth and development that results in better outcomes for everyone.

Bernard Bull

Concordia University

Bernard Bull is Associate Vice Provost of Academics & Associate Professor of Education at Concordia University.