Day 1: Echoes of Protest - Carol Hartmann, Aashay Roop, Lato Sele, Angelina Chabalala
By Wits Blended Learning
Summary
## Key takeaways
- **Audit tool for online course quality**: An audit tool was developed to monitor and evaluate online course content across the faculty of health sciences, focusing on eight key aspects of online pedagogy. [02:53]
- **Comparing hypothetical courses using audit scores**: A comparison of two hypothetical courses revealed significant differences in audit scores, with one scoring 23.1% and the other 69%, highlighting disparities in content, assessment, and feedback. [03:56]
- **Statistics inform curriculum decisions**: Statistical analysis, including descriptive and inferential methods, helps navigate complex curricular data to inform decisions and allocate resources effectively, rather than dictating solutions. [05:34], [08:37]
- **Loop platform for curriculum mapping**: The Loop platform is used for curriculum mapping, visually representing what is taught, how it's taught, and how learning is assessed to enhance transparency and identify gaps or overlaps in programs. [10:05]
- **Challenges in curriculum mapping**: Key challenges in curriculum mapping include obtaining course packs from staff, training them on the platform, and keeping information updated, as well as managing notional hours effectively. [18:15], [22:05]
- **Iterative redevelopment reduces faculty fatigue**: A small, specialized team and a bottom-up approach, focusing on specific identified problems with data-driven insights, help reduce faculty fatigue and stabilize programs during continuous redevelopment. [27:47]
Topics Covered
- What defines a high-quality online learning experience for students?
- Statistics highlight problems but do not dictate the solutions.
- Curriculum mapping makes the entire learning journey transparent.
- Data makes subjective student workload conversations objective.
- How do you redesign curricula without burning out faculty?
Full Transcript
Good afternoon everyone. My name is Lato Sele, and together with Angelina Chabalala and Aashay, we would like to share our work, which is titled "Charting a Path Forward: Lessons from Health Sciences Curriculum Development",
which reflects our collaborative efforts
to enhance curriculum development and
delivery in the faculty of health
sciences. We grounded our work in the integration of two key frameworks: the ADDIE model and the CIPP evaluation model.
While ADDIE guided our overall instructional design process, we specifically applied the CIPP model during the analysis and design phases of our work, to ensure our curriculum improvement decisions were informed by its context, input, process and product aspects. The CIPP model, developed by Stufflebeam, includes four
branches: context, where we looked at how effectively the curriculum is being delivered to students and staff; and input, which involves selecting tools like Ulwazi, Moodle, REDCap and Loop. The process phase is
the core focus of our presentation today
where we
evaluate how the curriculum is being
delivered to improve the actual teaching
and learning. The product phase will then allow us to assess the outcomes and impact of the curriculum, such as student success rates and long-term improvements. We aim to provide an overall perspective of the process we went through to develop our curriculum: mainly the virtual learning environment pedagogy, the statistical analysis, and the curriculum mapping on Loop.
Let's begin with two questions that have
guided our inquiry into digital teaching
and learning. Firstly, what defines a
high-quality online learning experience from the student's perspective, and how consistently is this delivered across our courses? The second question: where
are the hidden strengths and silent gaps
in our digital pedagogy and how can we
use this insight to innovate?
To answer these, we developed a tool, an
audit tool designed to monitor and
evaluate online course content across
the faculty of health sciences. This
tool takes the form of a structured
survey or checklist created on REDCap
and focuses on eight key aspects of
online pedagogy, namely the
introduction, navigation, basic content,
communication, interactive or
collaborative items, assessment,
marking, evaluation, and feedback; and lastly, evidence of blended learning. The
audit is conducted on courses hosted on the learning management systems mentioned earlier, Ulwazi and Moodle, and the data is analyzed using REDCap or Microsoft Excel. The following example shows two hypothetical courses which we
compared using the audit scores of the
selected sections of the audit form
where each section was scored based on
specific criteria and the results were
averaged to produce an overall audit
score.
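The averaging described above can be sketched in a few lines of Python. The section names and per-section scores below are hypothetical illustrations, not real audit data; only the 23.1% overall figure echoes the talk:

```python
# Hypothetical sketch of the audit scoring: each section of the audit
# form is scored against its own criteria, and the section percentages
# are averaged into one overall audit score.

def overall_audit_score(section_scores):
    """Average the per-section percentages into an overall score."""
    return round(sum(section_scores.values()) / len(section_scores), 1)

# Illustrative section scores for a weak course (invented numbers)
course_a = {
    "basic_content": 20.0,
    "assessment": 25.0,
    "marking_evaluation_feedback": 30.0,
    "blended_learning": 17.4,
}

print(overall_audit_score(course_a))  # -> 23.1, a low overall score
```

A real audit would weight or score sections against the specific criteria on the REDCap checklist; the equal-weight average here is just the simplest version of the idea.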
So the first course presented a low score of 23.1%, compared to the second course, which had a higher percentage in the basic content category, indicating that fewer types of learning materials were available in the first course. In the marking, evaluation and feedback section, the second course achieved a 69% audit score, reflecting more consistent evidence of grading practices, evaluation mechanisms, and feedback provided through the LMS. Similarly, the assessment section showed stronger performance in the second course, at 58.3%, with a greater frequency of assessments uploaded to the LMS. Finally, in terms of blended learning, the second course again scored higher, demonstrating clearer integration of online activities with in-person learning environments, including spaces such as the EO.
This indicates that there is no official model course overall, and that there is room for improvement across all courses in the online spaces. These
findings help us identify areas of
strength and weakness in digital content
delivery and they directly inform the
process phase of our curriculum
development helping us answer the
questions that we began with. Thank you
very much. Aashay will follow with
insights from statistical analysis
showing how data informs curriculum
effectiveness and equity.
Yeah. So good day everyone. As Lato explained, today I'll be explaining the statistical application during our curriculum development phases. When it comes to curricular data, we are often faced with multiple variables and factors that influence the data that we receive from students or from a particular curriculum. As such, my job is to use statistics to help navigate
this complexity for both students as
well as the curriculum. The goal of this is not to dictate decisions but to help inform decisions and allocate limited resources much more effectively. Statistics supports evaluation at every stage, ensuring that curricula remain responsive and effective. Within models like ADDIE, it strengthens the analysis and evaluation phases, ensuring that design, development and implementation directly address the real challenges that students are facing within the curricula, rather than assumptions. This promotes a more equitable and effective curriculum for the diverse students at Wits. So how do we use statistics?
When we begin with statistical analysis, our first step is to gain an overall picture of what's happening within the curriculum and to identify patterns and challenges within it. This is done through historical analysis of the course or curriculum since its implementation.
We begin with descriptive statistics such as throughput and pass rates, to see how well our students are progressing through the course. This helps those involved in the process to identify bottlenecks: for example, a module with consistently high failure rates, or demographic patterns such as quintiles that are failing. Descriptive statistics
helps us to summarize complex data into
understandable outputs like averages,
central tendencies, and percentages.
Instead of working with a long list of raw marks, we can see clear patterns, such as high attrition rates, which cohorts are most at risk, and where interventions may be needed for specific courses.
This provides a solid foundation for
decision making and long-term planning.
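As a toy illustration of this descriptive step, raw marks can be summarised into per-module means and pass rates so that bottleneck modules stand out. The module names, marks, and pass threshold below are invented, not faculty data:

```python
# Summarise raw marks into per-module averages and pass rates.
# All module names and marks are hypothetical.
from statistics import mean

PASS_MARK = 50  # assumed pass threshold

marks = {
    "Anatomy I":    [62, 55, 48, 71, 44, 58],
    "Physiology I": [40, 45, 38, 52, 47, 41],  # a likely bottleneck
}

summary = {
    module: {
        "mean": round(mean(scores), 1),
        "pass_rate": round(100 * sum(s >= PASS_MARK for s in scores)
                           / len(scores)),
    }
    for module, scores in marks.items()
}

for module, stats in summary.items():
    print(module, stats)  # the low pass rate flags "Physiology I"
```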
Beyond description, we also use inferential statistics to make predictions and to test different variables within the curriculum. Inferential analysis allows us to move from what is happening to why it is happening and what might happen next. For example, predictive modeling using linear and logistic regressions can forecast student outcomes based on specific NSC subjects or combinations of them.
Similarly, chi-square tests and t-tests can help us explore the impact of specific factors. These analyses enable us to pinpoint the specific variables most strongly linked to student outcomes and guide targeted interventions grounded in statistical evidence rather than
assumptions. When descriptive statistics show, for example, that students from quintile one to three schools are failing, we use inferential statistics to dive deeper into the specific variables impacting this, such as school quintile or NSC subject marks like life sciences.
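One such test, a chi-square test of independence, can be sketched by hand on a 2x2 table of school quintile group versus pass/fail. The counts below are invented for illustration; a statistics package would also report the exact p-value:

```python
# Chi-square test of independence for a 2x2 contingency table,
# computed with the standard (observed - expected)^2 / expected formula.

def chi_square_2x2(table):
    """Return the chi-square statistic for [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    # expected cell count = (row total * column total) / grand total
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    return sum((obs - exp) ** 2 / exp
               for row, erow in zip(table, expected)
               for obs, exp in zip(row, erow))

# Hypothetical counts: rows are quintile 1-3 vs quintile 4-5 students,
# columns are fail vs pass.
stat = chi_square_2x2([[30, 70], [15, 85]])

# 3.841 is the 5% critical value for 1 degree of freedom, so this
# (invented) association between quintile group and failure would be
# statistically significant.
print(round(stat, 2), stat > 3.841)  # -> 6.45 True
```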
It's important to stress that statistics highlights our problems but does not dictate solutions. For example, if a course consistently has high failure rates, that does not mean we should remove the course entirely. Instead, we cross-check with our curriculum mapping and workload analysis, where we might find that the course is overloaded with content in a short time frame, which places excessive pressure on students and becomes a reason for the high failure rate, or that the course is out of sequence in the curriculum, disrupting systematic learning. In this way, statistics guides us towards the right questions.
So why do stats matter in curriculum development? Statistics matters because it ensures resources are used much more wisely, based on evidence that is difficult to dispute. By identifying bottlenecks through throughput, descriptive and inferential analyses, and then triangulating with our other processes of curriculum mapping and workload analysis, we can focus interventions where they have the most impact, whether that means tutoring support, curriculum sequencing adjustments, or reducing content overload. I thank you, and I hope you have increased your knowledge around statistical application during curriculum development and can apply it in your own faculties. My colleague Angelina will now explain the third process during our curriculum development phase here in the faculty of health sciences: the Loop system that we employ to map our curriculum.
Good afternoon everyone. I'll be taking
you through one of the tools that we use
for our curriculum development, which is curriculum mapping. So curriculum
mapping helps us bring together the
different parts of a curriculum into a
single usable picture. So it is defined
as a process of visually representing
what is taught, how it's taught and how
learning is assessed to make the
curriculum more transparent. Um so what
do we use for our curriculum mapping? We
use a platform called Loop. This stands for Learning Opportunities, Objectives and Outcomes Platform. It is a web-based platform that maps and aligns curricula in real time. So the
types of questions that loop can help
you answer include what is our staff and
student workload? How can we better
align the outcomes and teaching
assessments? Um what are our teaching
methods? And what are the gaps and
overlaps in our programs? So Loop supports quality assurance and curriculum changes. It reduces
unnecessary repetition and it links
learning events to learning objectives
and learning outcomes as well as
frameworks.
So how do we do our curriculum mapping? We first start by collecting course packs from lecturers, and then we download a mapping template from Loop. We then start inputting things like learning events, periods and teaching formats, as well as departments and lecturers: anyone who teaches, and also the course coordinators. Now I'll show you an overview of what Loop looks like.
So this is an overview of Loop. The first view is the learning events view. In our learning events view we have the learning events, the course code, the year of study, department, periods and teaching format. You can also download this to do things like departmental workload analysis, and you can check the teaching methods across departments, courses, as well as programs.
Then we have our objectives view. Here the learning events are linked to learning objectives, and we also have the action verbs for each learning objective. So here you can see the right verbs used for each learning objective, and we have assessment as well. This helps us understand how learning events are connected to what is taught and how it is assessed.
So the next view I wanted to show you is the course outcomes view. The outcomes view is similar to the objectives view; it links our learning events and objectives to course outcomes, and this gives us a bigger picture of how the objectives and assessments all contribute to the final competencies and outcomes that learners are expected to achieve. And the last view is the objectives by framework view. Here we have different frameworks, such as the NQF, SAQA and HPCSA.
So this shows how our curriculum aligns with broader institutional and accreditation frameworks. Linking all of this together allows us to check our curriculum against external and internal standards and requirements, such as our case and skills lists. These serve as parameters that guide how we design the current curriculum and determine what is important. In doing this, we are able to track, monitor and ensure that these requirements are constantly included in the curriculum. So essentially, Loop allows us to see the curriculum from different perspectives and how they are all interconnected.
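The linked views just described can be pictured as a tiny data structure. Everything below (event names, objective and outcome codes) is invented for illustration, and real Loop data would be far richer:

```python
# Toy model of the Loop-style links: learning events -> objectives ->
# course outcomes, plus a simple query for gaps (outcomes that no
# current learning event teaches). All names are hypothetical.

events = {
    "Cardiac histology lecture": ["OBJ1"],   # objectives per event
    "ECG practical":             ["OBJ2"],
}
objective_to_outcome = {"OBJ1": "OUT-A", "OBJ2": "OUT-A"}
course_outcomes = {"OUT-A", "OUT-B"}         # what learners must achieve

covered = {objective_to_outcome[obj]
           for objs in events.values() for obj in objs}
gaps = course_outcomes - covered

print(gaps)  # -> {'OUT-B'}: an outcome with no teaching linked to it
```

The same structure, run in reverse, answers the "what am I affecting if I remove this lecture?" question raised later in the Q&A: deleting an event and recomputing `covered` shows which outcomes lose their teaching.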
So in conclusion, the faculty of health
sciences curriculum development process
embodies a robust and ongoing endeavor
that leverages an integrated flow to
drive continuous improvement.
The cyclical system integrates three interconnected components that we have just explained: statistical analysis, using Stata to process data, uncover trends in curriculum performance and identify bottlenecks; curriculum mapping, through the Loop system, to align content with learning outcomes, identify gaps and ensure coherence; and our virtual learning environment auditing, to provide structured data collection, tracking and evaluation of digital platforms. Together, these processes foster mutual contributions and benefits, with information flowing continuously to inform evidence-based decision making.
While this model is not perfect, its iterative, adaptive nature, constantly evaluating and monitoring programs, enables proactive refinements, addresses emerging challenges and enhances educational quality in response to evolving demands. If you would like these slides, you can scan the QR code. I thank you for listening to our health sciences team, and we look forward to the questions that you might have.
>> A question: you refer to both the objectives and the outcomes. Can I just get to how you differentiate those? What, in your case, were the outcomes and what were the objectives?
>> So by learning objectives we mean: if you have a particular topic that you will be lecturing on today in a class, for that learning event or for that topic you have your objectives that you expect students to know about afterwards.
>> Like what the teacher is expecting to get out of this interaction?
>> Right. They have verbs, for example "apply": students should be able to apply. So those will be your objectives. And then we have our learning outcomes. By that we mean the exit level outcomes for the particular phase or program.
>> For the program, the exit level outcomes: by the end of this program, to be competent, the student should accomplish all of those. So your outcomes are the exit level outcomes for the program, and where you refer to objectives you mean the intended learning outcomes, basically, for the learning.
>> Thank you.
>> Thank you. Any other questions?
>> Thank you very much for sharing with us your experience of mapping courses. May you kindly share the challenges of doing an exercise such as this? Because I can see the benefit, but what challenges could there be in doing it?
So the first challenge would be getting course packs from staff, and then training: we have to train staff members how to actually map, because some fields are always changing, so people should be able to go onto Loop and make the changes so that the curriculum stays updated. So I think the challenges are getting course packs, as well as getting updated timetables and information to work from.
>> Hi guys. I wanted to know, with the virtual learning environment audits, after you find out that a blended learning course doesn't really have much blend to it, how do you approach the lecturer to say "I would like you to improve"? How do you go about giving the lecturer feedback and seeing it come to life, or do you just give it and then...
>> Okay, so part of our curriculum development is the virtual learning environment, and so far we've been monitoring and evaluating. We haven't gotten to the stage where we're starting to give exact feedback and implement changes, which would be the next step in our whole process. For now, we've just been auditing and working through as many programs as we can.
>> Maybe I can just jump in there. Sorry, I oversee the team. We only actually started doing the audits last year, I think around February, focusing on our undergraduate programs to start with, of which we have nine, and I think we've done four so far. And so we
haven't started feeding it through to
our learning design team yet, but we're kind of holding the information in reserve. We've got
four of our programs which are
undergoing curriculum review at the
moment. The lecturers are currently
looking at what is being taught and what their outcomes are, and therefore the alignment between those, reviewing that and looking at some of the bottlenecks that we've identified through the stats and through the curriculum mapping. So we haven't gotten to designing with those courses as yet.
But from the work that Lato is doing, when we get to the redesign of the course, we will be saying to them: well, you're not communicating well, you're not optimizing the LMS in this way. And we've already started chatting to our LXD team, with Renell over there, about one of the programs in particular and how we can actually improve the navigation, because what we found is that within the same course, one lecturer will structure their block according to concepts, another will structure it according to time, and a third one will find a different way to do it, and that increases the extraneous load for the students in terms of navigating the content in the course. So it's a big problem that we have.
>> A question. So I'm currently struggling with the notional hours in terms of the mapping process, and I'd like to know what your approach is regarding that. When you approach the notional hours, what we find, well, let me speak for myself, from my experience, is that the notional hours indicated come to less than 120 hours, or whatever it is according to the credits, at the end. And you find that when the student or participant is the one experiencing the course, the time allocated to an activity may be 5 hours, but when another person experiences that specific activity it can actually be 10 hours, not five. So it's quite subjective. I wanted to ask: in the mapping process and the design process, are you actually making those who are part of the process conscious of such things?
So sorry, guys. With our medical program review, we didn't get as far as putting everything into the Loop platform and mapping the objectives against our exit level outcomes, our HPCSA requirements and the other catalogues, the other things that we can map against in order to check for alignment and check that we're producing what we're supposed to produce. But what we did do is, from the timetables, we collected all the information in terms of what the timetabled time was that the students were spending. And generally in the university, people will say that for every hour of contact time at first year, the student should spend one hour at home studying. Even before we started adding that on, we were at about three quarters of the supposed notional hours. So, it
definitely made people a lot more
conscious of it. Um, and it's also a
great way to monitor and track because
particularly in our programs, we don't follow things like diagonals and that sort of thing. We've got a lot of integrated programs in some of our professional courses, and so people say, "oh no, we've got this new area which we need to teach the students about, this new genetic test, well, let's add on a lecture." And so having that living curriculum map, and as Connie rightly said, the living is the difficult part, getting the staff to buy in is very difficult,
allows you to monitor and track: if I remove or change the objectives of this lecture, what am I actually affecting in the third year of study or in a different part of the curriculum? What's the implication? It also allows you to say, okay, should we rather bring these two things together, and plan in that sort of way. And then
it does allow us to say, well, we're
over on our notional hours. What should
we cut? And one of the difficulties with content that I've always seen is that the conversation, in our faculty, usually goes around embryology, which is human development in the womb. The conversation will usually go: "there's way too much in embryology", when there are five lectures, "the students don't need to know embryology." Then the next point is: "no, they need to understand how the septum, the wall between the chambers of the heart, develops, because that's an important and common anomaly that they'll see." And then they'll say, "okay, so we need to teach the students embryology", and it'll go up to 10 lectures, whereas actually only 15 minutes of one lecture might be to do with the heart, if that makes sense. So it helps to make those sorts of general conversations a lot more concrete, because you can actually look and say: okay, so what's related to this topic? That's one of the real values of the curriculum map that we're finding, even before we map to catalogues. Did I answer your question on hours?
>> Yeah.
>> It really helps you to identify where a course is or isn't within the notional hours, but you do still need to establish within your faculty what acceptable norms are for one hour of contact to one hour of assessment preparation, self-study, or whatever that individual's study time is. The way that we've done it in our faculty, we said: okay, if you take an average student, not even an average student, if you take the borderline student, what does the borderline student need to do in order to be successful in this course? And that is, unfortunately, always going to be an estimate.
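The notional-hours arithmetic running through this exchange can be sketched as follows. The 10-notional-hours-per-credit convention is standard SAQA practice (a 12-credit course implies the 120 hours mentioned), and the one-hour-of-self-study-per-contact-hour rule of thumb comes from the talk; the course figures themselves are hypothetical:

```python
# Compare timetabled hours (and an assumed self-study allowance)
# against the notional hours implied by a course's credits.
# The course figures here are hypothetical.

CREDITS = 12
HOURS_PER_CREDIT = 10     # SAQA convention: 1 credit = 10 notional hours
SELF_STUDY_RATIO = 1.0    # assumed self-study hours per contact hour

timetabled_hours = 90     # collected from the course timetables

required = CREDITS * HOURS_PER_CREDIT                        # 120 hours
with_self_study = timetabled_hours * (1 + SELF_STUDY_RATIO)  # 180.0 hours

print(f"timetabled alone: {timetabled_hours / required:.0%} of notional")
print(f"with 1:1 self-study: {with_self_study / required:.0%} of notional")
```

With these invented numbers, timetabled time alone already sits at 75% of the notional hours, mirroring the "about three quarters" observation in the answer; adding the self-study allowance pushes the course well over its credit allocation.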
>> I have a question, if you're willing to take a final question, because we do have five more minutes left of the session. My question is, let me pull it up. Okay. So this process is iterative, and I want to know, in practical terms, you've partly answered this already: how do you manage the constant state of redevelopment without causing fatigue among faculty and destabilizing the program for students?
I think, to answer your question: we have a small team within the faculty of health sciences, and we each do our tasks differently. So, for example, I do the stats within the curriculum, Lato does the virtual learning auditing, and Angelina does the curriculum mapping on Loop. Like we said, there's no set or perfect model that we follow, but we found in our experience that gathering just enough information to make an informed decision, and spreading the workload out in the way that we have, helps. We use statistics, in my case, to help identify specific problems so that limited resources can be used effectively; that's the point I was trying to put forward most importantly. It helps to narrow down where our attention is focused, and with that, instead of focusing on the whole program, we focus on specific problems, and that helps us to be much more effective and reduces the strain on the staff themselves who are involved in this curriculum development. And when you say that it destabilizes the program for students and staff: I think we also involve stakeholder analyses and stakeholder engagement, and with that, I think it helps stabilize it much more. It's not a top-down approach that we're trying to come from; we're trying to come from a bottom-up approach. We identify challenges, and instead of jumping to a definitive answer, such as, like I said, if a course has high failure rates, that doesn't necessarily mean that we remove the course; instead, we see with our workload analyses where the course might be out of sequence, and then that helps specify the problem and helps resolve it, by identifying the specific aspects affecting the specific problem identified with the stats.
>> Maybe I'll add: one of the first things that we do when any program asks us to help them with a review is that we don't actually start with much of this. We start with: well, why are you wanting to make changes, what are your problems, what are you trying to address? And so, like the team said, they do stakeholder focus groups with staff and students to try and figure out what the issues are. And I think one of the big things that stops staff burnout is that often curriculum decisions are made on people's gut feeling rather than on a documented analysis of what the problem might be, and then you try, it's not perfect, but we try and check that against, for example, the stats or the audit or whatever it might be that we can use. And so that stops staff going around in circles quite a bit. It also means that the team collects that data. The team also does benchmarking against other programs and other information, which we can then feed to the academics who have to make the decisions. And so the academics' time is focused: usually, for any given program, we ask them for two hours a week in which they are involved, initially in deciding what the changes will be to their curriculum, which then increases their buy-in. And because they can see the data, they then understand why they're making that change, which actually makes the change management easier. There are problems with communicating it beyond the group that we're working with, yes, and that's where we come up against other issues, but the people within the team are motivated for the change. For example, we've been working with the dental program now since about June, and I think there are 15 heads of department, and we've had at least two-thirds of them, or one of their representatives, at every single meeting. Now, that's great buy-in from the departments. They're seeing and identifying their problems and coming up with the solutions, which means that they are then more motivated to carry on and implement them.