Transferring Learning into the Workplace: Evaluating a
Student-centered Learning Approach through
Computer Science Students’ Lens
Mădălina Erașcu¹ and Velibor Mladenovici²
¹Faculty of Mathematics and Informatics, West University of Timisoara, blvd. V. Parvan, Timisoara, Romania
²Center of Academic Development at the West University of Timișoara, blvd. V. Parvan, Timisoara, Romania
Keywords:
Transferring Learning into the Workplace, Higher Education, Student-centered Learning, Deep Learning
Approaches, Student Evaluation of Teaching Quality.
Abstract:
Over time, instructional training activities for academics that promote student-centered learning (SCL) in-
creased. However, few things are known about the extent to which academics’ learning is transferred into the
daily teaching practice. In this study, we investigated the impact of transferring learning into the workplace
of an Informatics teacher (first author of this paper) seeking to promote SCL within a new discipline in her
portfolio (i.e., Software Engineering). For this purpose, a quasi-experimental design with pre- and post-test
was employed. Self-reported data were collected as follows: from the experimental group, there were 52
students (28.8% female) at the pre-test and 29 students (37.9% female) at the post-test, while from the control
group, data were collected from 26 students (34.6% female) at the pre-test and 19 students (47.3% female) at
the post-test. Independent t-test analyses showed that the SCL initiative had a positive impact only on the
teaching quality as perceived by students. Concerning students' learning approaches, the SCL initiative had
no effect. Several interpretations and perspectives of the current study are discussed.
1 INTRODUCTION
Teaching quality enhancement to improve student learning is an ongoing concern for most higher education institutions worldwide. Specifically, in Europe, student-centered learning (SCL) has become the primary instructional approach since the Bologna Process, mainly because it influences student achievement (Stes et al., 2012). SCL, among other aspects, focuses on the student's needs (e.g., the curriculum and courses are more flexible, the learning process is more interactive), aiming to facilitate students' adoption of deep learning approaches (Kember, 2009). Therefore, many resources have been invested to improve staff development initiatives, develop efficient instructional development programs (IDPs), assess and enhance teaching quality, offer incentives for teaching excellence, etc. (Stes et al., 2013). Consequently, there is a strong demand for quality evidence of the impact of IDPs and staff development on daily teaching practices (De Rijdt et al., 2013).
Some studies have treated IDPs and staff development as similar concepts; hence the several related terms: academic development, instructional training, educational development, faculty development, or professional development. In this study, IDPs and staff development initiatives are treated as correlated but different constructs. Thus, we refer to IDPs as any initiative precisely planned to enhance academics' teaching (i.e., in their role as teachers) to support student learning (Stes et al., 2010b). On the other hand, we refer to staff development initiatives as the sum of informal (e.g., exchange of ideas among teachers) and formal (e.g., workshops) learning experiences of the teacher (Fullan, 1990). In staff development initiatives, academics have to translate their acquired competencies (e.g., knowledge, skills, attitudes) into changes in their thinking and educational behavior. Therefore, in the present study, we adopt Baldwin and Ford's (1988) definition of transfer: the application of the acquired competencies (i.e., learning) to the workplace (i.e., the classroom) following IDPs or staff development initiatives.
However, mainly because of limited resources, in regular practice the impact of IDPs and staff development initiatives is generally assessed at only one level
Erașcu, M. and Mladenovici, V.
Transferring Learning into the Workplace: Evaluating a Student-centered Learning Approach through Computer Science Students' Lens.
DOI: 10.5220/0010999300003182
In Proceedings of the 14th International Conference on Computer Supported Education (CSEDU 2022) - Volume 2, pages 442-449
ISBN: 978-989-758-562-3; ISSN: 2184-5026
Copyright © 2022 by SCITEPRESS Science and Technology Publications, Lda. All rights reserved
(e.g., teachers’ attitude or knowledge, students’ learn-
ing approach, or perception of teaching quality) (Stes
et al., 2010a; Stes et al., 2012). The latest reviews
in the impact assessment of staff development recom-
mended that the impact of IDPs should be measured
on several levels of outcomes (Kirkpatrick and Kirk-
patrick, 2006) by well-designed studies (e.g., with at
least a quasi-experimental or a longitudinal approach)
(Ilie et al., 2020; Stes et al., 2010b). Even though not without limitations, the present article aims to bring more evidence regarding the transfer to the workplace of a specific SCL teaching initiative (i.e., resulting from attendance of an IDP and of a staff development initiative). Using a quasi-experimental design with pre- and post-test, the current endeavor evaluated the impact at two different levels: students' perceptions of teaching quality and students' approaches to learning.
Students' perceptions of teaching quality, or student evaluations of teaching (SETs), represent one of the most voluminous bodies of research in the applied psychology field (Ginns et al., 2007). Marsh (2007) suggests that teaching evaluation in higher education institutions is important for two main reasons. First, through this evaluation, one can improve teachers' performance by offering them feedback and designing IDPs targeted at the identified training needs. Second, one can use the results from SETs in administrative decisions like promotion, rewards, and external accountability. Regarding the impact of IDPs or staff development initiatives on students' perceptions of teaching quality, most of the studies presented mixed results (e.g., positive impact (Gibbs and Coffey, 2004; Meizlish et al., 2018); negative impact (Stes et al., 2013)). Therefore, for a clearer picture, more studies are needed.
Nowadays, successful learning and studying in higher education is most often associated with students' deep approaches to learning (Asikainen, 2014). A deep learning approach is characterized by significant engagement in the learning process, independent thinking, analytic skills, and understanding of the subject matter (Asikainen and Gijbels, 2017). On the other side, there is the undesired surface learning approach, which targets short-term benefits and involves memorizing the subject matter without understanding its utility or implications (Asikainen and Gijbels, 2017). Nevertheless, helping students transition towards a deep approach to learning is not an easy task (Baeten et al., 2010). A few studies showed that students attending classes held by teachers who completed an IDP increased their deep learning approaches compared to the students from the control group (Gibbs and Coffey, 2004). Nonetheless, studies investigating changes at the students' level due to their teachers' participation in an IDP are scarce (Ilie et al., 2020).
1.1 Design and Aim of the Study and
Hypotheses
The present study used a quasi-experimental design including a pre-test and a post-test to assess the transfer into the workplace of an SCL initiative. More precisely, it evaluated the degree of transfer of an SCL approach in the context of teaching the Software Engineering subject to bachelor Computer Science students. Therefore, we evaluated the changes in students' perceptions of teaching quality and students' approaches to learning. Specifically, we advanced three research questions:
Q1. Is there any progress in students’ approaches to
learning from the experimental group due to the
learning transfer into the workplace of the SCL
initiative implemented by their teacher?
Q2. Are there any statistically significant differences
between the experimental and control group stu-
dents regarding their approaches to learning?
Q3. Is the teaching of the teacher who implemented
    the SCL initiative perceived better by her students,
    at the end of the semester, than the teaching of her
    counterpart as perceived by the students in the
    control group?
Before introducing the method and results of the
study, we present an outline of the learning transfer
into the workplace of an SCL initiative in question.
1.2 Research Context
West University of Timisoara, Romania, organizes and encourages participation in several IDPs that promote SCL. The first author of this paper participated in an IDP and a staff development initiative. The IDP (i.e., University didactics and psychopedagogy) was attended between February and March 2020 and had the following structure: 5 disciplines cumulating 150 hours, of which 40 hours of theoretical courses (10 hours/discipline, within four disciplines), 80 hours of practical applications (20 hours/discipline, in the same four disciplines: The Management of the Students Groups, Elaboration of the Didactic Materials, Modern Methods of Education, Curricular Design), and 30 hours of practical applications in the fifth discipline (i.e., Feedback and Didactic Counseling). The primary purpose of this IDP was to improve the competencies of the university teaching staff in developing educational offers with innovative and student-centered instructive-educational content and approaches. Regarding its gains, besides belonging to a learning community, at the end of the IDP each participant had a complete curricular package (e.g., syllabus, teaching strategies, activity plans, assessment tools, etc.) for a discipline they teach in their current practice.
After graduating from the aforementioned IDP, the first author applied for and won one of the twenty didactic incentives (i.e., within the Didactic Grants competition) supported by the university to further implement the SCL approach in the classroom. The staff development initiative (i.e., the Didactic Grants competition) involved a training schedule similar to the University didactics and psychopedagogy IDP (but much shorter and less complex), plus several other informal activities (e.g., informal counseling meetings via Google Meet). As a graduate of the IDP, the first author of this paper did not have to repeat the training activities. However, she had to complete all the other outputs of the Didactic Grants competition (e.g., design three activity plans and implement at least one of them; record a teaching activity, etc.). Thus, in the summer semester of 2021, she applied the competencies acquired in the IDP and the staff development initiative to the lecture and laboratory of Software Engineering, a new subject in her teaching portfolio.
1.3 Learning Transfer into the
Workplace of the SCL Initiative
Regarding the adopted SCL approach, we mention that the discipline taught to the experimental group was Software Engineering, second year, undergraduate level. Introductory topics in this field were presented based on the book (Van Vliet et al., 2008) for the course and (Seidl et al., 2015) for the laboratory. The topics were chosen to prepare the students to understand the basic notions needed when working for software companies and when writing their Bachelor thesis at the end of the third year. To this aim, the lecture was structured as follows: (1) Software Management: The Software Life Cycle and its variants (advantages and disadvantages): The Waterfall Model, Agile Methods, Prototyping, Incremental Development, Rapid Application Development and DSDM, Extreme Programming; The Rational Unified Process (RUP); Intermezzo: Maintenance or Evolution; Software Product Lines; Process Modelling; (2) The Software Life Cycle: Requirements Engineering; Modelling; Architecture; Design. The laboratory focused on UML modelling, emphasizing the following topics: Use-case diagram, Class diagram, State-machine diagram, Sequence diagram, and Activity diagram. Throughout the semester, the classes were held online due to the Covid-19 outbreak. The most challenging task was to keep the students focused and engaged. To this aim: (1) we designed the course and laboratory to be very interactive, and (2) the knowledge assessment was continuous during the whole semester. In the Romanian university system, each course and laboratory session lasts 90 minutes. We did our best to divide this time as follows: (1) clearly define the objectives of the current course and laboratory and relate them to the learning results of the discipline and to previous and future ones; (2) a session in which the teacher presented new material lasted at most 20 minutes and was always followed by a practical session (individual or in teams) and a reflection on what was taught; (3) we tried to conclude each course/laboratory with a summary of what was studied, presented by a student randomly chosen from the group (to keep students' attention). The knowledge assessment was done continuously during the semester. It was composed of: (1) quizzes during the lecture, (2) an examination in the exam session composed of short questions and synthesis subjects to prove that the students deeply understood the topics, and (3) a team project for the laboratory. These three components summed up to 10 points, the maximum grade in the Romanian grading system. There was also the possibility to choose an individual project on actual research topics in software engineering. This, together with excellent activity during the semester (at least 9 points for quizzes and team project), would have given the students the possibility to obtain the maximum grade without taking the final examination in the exam session.
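The exemption rule above can be sketched as a small decision function; note that the helper name and the even split of points between quizzes and the team project are illustrative assumptions, since the paper does not specify the per-component split:

```python
def exempt_from_final(quiz_points: float, team_project_points: float,
                      has_individual_project: bool) -> bool:
    """Sketch of the exemption rule: an individual research project plus
    at least 9 of the continuous-assessment points (quizzes and team
    project) allows the maximum grade without the final examination."""
    return has_individual_project and (quiz_points + team_project_points) >= 9

print(exempt_from_final(4.5, 4.5, True))    # → True (9 points + project)
print(exempt_from_final(4.0, 4.0, True))    # → False (only 8 points)
```
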
2 METHOD
2.1 Participant Characteristics
Students in both the experimental and control groups were similar in terms of faculty (i.e., Faculty of Mathematics and Informatics), specialization (i.e., Applied Informatics), year of studies (2nd year), degree (i.e., bachelor's degree), the academic status of their teacher (i.e., university lecturer), and teaching experience of their teacher (i.e., > 5 years). Unlike the teacher responsible for the learning transfer into the workplace of the SCL initiative, the counterpart teacher (i.e., the teacher of the control group) did not follow any IDP or staff development initiative on SCL. The distribution of students' mean age, gender, class size and type of activity, and area of residence is presented in Table 1.
2.2 Measures
For the data collection, we used two instruments: the Revised Two-Factor Study Process Questionnaire (R-SPQ-2F (Biggs et al., 2001)) and the Exemplary Teacher Course Questionnaire (ETCQ (Kember and Leung, 2008)), both previously used on the Romanian population (Smarandache et al., 2021; Ilie et al., 2021). The R-SPQ-2F measures students' preferences for study strategies (Asikainen and Gijbels, 2017). It has 20 items, assessing two learning approaches, namely the deep and the surface learning approach. Each dimension of the R-SPQ-2F is divided into two corresponding subscales (i.e., motives and strategies). The items gather answers through a 5-point Likert scale (i.e., from 1 = never/only rarely true of me to 5 = always/almost always true of me). In terms of the factorial structure of the R-SPQ-2F, we used the 2-factor one, as it proved to be superior on the Romanian population (Smarandache et al., 2021). The deep learning approach scale measures students' motives and strategies described by intrinsic motivation and maximization of their understanding of the discipline. On the other hand, the surface learning approach describes motives and strategies related to extrinsic motivation, involving memorizing the course without understanding its implications or utility. We chose the ETCQ mainly because of its validity, reliability, and diagnostic power (Kember and Leung, 2008, p. 352). The ETCQ has 49 items, assessing nine dimensions (Table 5) of the teaching process in the classroom environment as perceived by students. The responses to each of the nine dimensions of the ETCQ were gathered with a 5-point Likert scale (i.e., ranging from 1 = strongly disagree to 5 = strongly agree). Reliability coefficients for the two scales of the R-SPQ-2F and the nine scales of the ETCQ for both the control and experimental group at the two data collection time points (i.e., pre- and post-test) are presented in Table 2.
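As an illustration of the reliability analysis summarized in Table 2, Cronbach's α for a multi-item Likert scale can be computed from the per-item variances and the variance of the scale totals; the answer matrix below is synthetic, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic 5-point Likert answers: 6 respondents x 4 items
answers = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [1, 2, 1, 1],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(answers.astype(float)), 3))  # → 0.964
```

With items that track each other closely, as here, α approaches 1; values above .80, as for most scales in Table 2, are conventionally read as good reliability.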
2.3 Data Collection
We used the aforementioned instruments and assembled a quantitative pre-test (first week of the semester) and a post-test (last week). Participation in the current study was voluntary for all students, and all answers were anonymous. Before the completion of the questionnaires, a researcher read a standard procedure for filling them in. Each student had an anonymous research code to help the research team match their answers from pre-test to post-test. However, except for the students repeating the year (i.e., very few), the students who participated in the post-test were, according to the research codes, not the same as those in the pre-test. In the case of the R-SPQ-2F, data were collected at both pre-test and post-test. At the pre-test, students were instructed to report their general study approaches for the study program they followed (i.e., Applied Informatics). On the other hand, at the end of the semester (i.e., at the post-test), students were asked to report their specific learning approaches for the followed discipline (i.e., Software Engineering in the case of the experimental group and Databases Administration in the case of the control group). As students cannot accurately assess the behavior of a teacher with whom they did not study before, in the case of the ETCQ, data were gathered only at the post-test moment.
2.4 Data Analysis
First, we assessed Cronbach's α for each experimental and control group subscale at the pre-test and post-test moments. All the obtained values for Cronbach's α indicated acceptable reliability, with almost all scales having good or very good reliability (α > .80) (Table 2). Second, given the design of our study (i.e., the same teachers and courses were considered at both pre-test and post-test moments) and the fact that only a few students answered the questionnaires at both evaluation moments, we could not perform a paired-samples t-test. Therefore, we ran independent-samples t-tests, which first required inspecting the normality and homogeneity of variance assumptions: normal plots, stem-and-leaf plots, and the calculation of skewness and kurtosis were used to verify the normality of the data distribution, while the Levene statistic was calculated to test the equality of group variances. All the preliminary assumptions for the analysis were met (i.e., we have a continuous dependent variable; the independent variable has two categorical, independent groups; the observations are independent; there are no significant outliers; the data distribution is normal) for most of the ETCQ dimensions, except for the active learning dimension, where the equal variances assumption was violated. Therefore, in the case of the active learning dimension, we followed the recommendation of (Howell, 2012) and performed the Welch t-test (i.e., the version of the independent t-test that does not assume equal variances), while for the rest of the R-SPQ-2F and ETCQ dimensions we used the independent t-test.
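The decision procedure above (Levene's test first, then either a pooled-variance or a Welch t-test) can be sketched with SciPy; the group sizes below mirror the post-test samples, but the scores themselves are simulated:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated 5-point-scale scores for two independent groups
# (sizes mirror the post-test samples: 29 experimental, 19 control)
experimental = np.clip(rng.normal(loc=4.2, scale=0.6, size=29), 1, 5)
control = np.clip(rng.normal(loc=3.8, scale=0.9, size=19), 1, 5)

# Levene's test for the homogeneity-of-variance assumption
lev_stat, lev_p = stats.levene(experimental, control)

# Pooled-variance t-test if variances look equal, Welch's t-test otherwise
equal_var = lev_p > .05
t_stat, p_val = stats.ttest_ind(experimental, control, equal_var=equal_var)
print(f"Levene p = {lev_p:.3f}; t = {t_stat:.3f}, p = {p_val:.3f}")
```

Setting `equal_var=False` in `scipy.stats.ttest_ind` yields Welch's t-test, which estimates the standard error from each group's variance separately instead of pooling them.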
Table 1: Demographic characteristics of the student sample.

                                     Pre-test                  Post-test
Students' characteristic        Experimental  Control    Experimental  Control
Mean age                            20.31      20.73         21.14      20.74
Gender
  Female                               15          9            11          9
  Male                                 33         16            16         10
  Not mentioned                         4          1             2          0
Class size & type of activity
  ≤30 students (seminar)                -         26             -         19
  >30 but ≤60 students (lecture)       52          -            29          -
Table 2: Cronbach's α indices for the ETCQ and R-SPQ-2F scales for the experimental and control group at the pre-test and post-test.

Questionnaire / scale                    Moment      No. of items   Experimental   Control
Revised Two-Factor Study Process Questionnaire (Biggs et al., 2001)
  Deep Learning Approach                 pre-test         10            .810         .687
                                         post-test                      .853         .837
  Surface Learning Approach              pre-test         10            .797         .803
                                         post-test                      .847         .837
Exemplary Teacher Course Questionnaire (Kember and Leung, 2008)
  Understanding Fundamental Concepts     post-test         5            .826         .842
  Relevance                              post-test         5            .743         .859
  Challenging Beliefs                    post-test         6            .885         .823
  Active Learning                        post-test         5            .702         .873
  Teacher-Student Relationships          post-test         5            .797         .886
  Motivation                             post-test         6            .868         .883
  Organization                           post-test         7            .944         .942
  Flexibility                            post-test         5            .868         .948
  Assignments                            post-test         5            .760         .655
Table 3: Students' approaches to study at the beginning and at the end of the semester for students in the experimental group.

                  Deep Learning Approach     Surface Learning Approach
Group             Mean score  SD     N       Mean score  SD     N
Experimental
  Before             2.72     0.66   52         2.44     0.68   52
  After              2.66     0.72   29         2.49     0.78   29
  Change             -.06                       +.05
  t                  .417                       -.295
  p                  .678                       .769
3 RESULTS
Research Question 1. Regarding the scores of the students in the experimental group, Table 3 presents their approaches to studying at the beginning and at the end of the semester. No statistically significant improvements can be discerned. However, there is a slight decrease in the deep learning approach from the pre-test to the post-test moment (i.e., change = -.06, with Mpre-test = 2.72, Mpost-test = 2.66) and a slight increase (i.e., change = +.05, with Mpre-test = 2.44, Mpost-test = 2.49) in students' surface learning approach.
Research Question 2. As shown in Table 4, there are no statistically significant differences between the two groups of students regarding their approaches to learning at either of the assessment moments.
Table 4: Comparison between the experimental and control group before and after the end of the semester regarding students' learning approaches.

                            Moment      Experimental         Control              t       df    p
Variable                                N    M     SD        N    M     SD
Deep Learning Approach      pre-test    52   2.72  0.66      26   2.74  0.54    -0.089    76   0.93
                            post-test   29   2.66  0.72      19   2.89  0.66    -1.140    46   0.26
Surface Learning Approach   pre-test    52   2.44  0.68      26   2.60  0.65    -0.969    76   0.34
                            post-test   29   2.49  0.78      19   2.79  0.73    -1.327    46   0.191

Research Question 3. Independent t-test analysis of the differences in students' perception of the teaching quality revealed statistically significant differences for only three out of the nine scales of the ETCQ (Table 5). First, there is a marginally statistically significant difference regarding the active learning behaviors of the two teachers: the students in the experimental group reported more behaviors of their teacher that encouraged and facilitated their active learning than the students in the control group (t[27.76] = 1.891, p = .069, Cohen's d = 0.58). Second, students in the experimental group reported lower scores concerning their relationship with their teacher than students in the control group (t[46] = -2.065, p = .045, Cohen's d = 0.61). Third, there is a marginally statistically significant difference regarding the organization of the two courses: students from the control group perceived their classes to be better organized by their teacher (t[46] = -1.795, p = .079, Cohen's d = 0.53).
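The effect sizes above are consistent with a common conversion from an independent-samples t statistic to Cohen's d, d = |t|·sqrt(1/n1 + 1/n2); this is a plausible reconstruction for checking the reported values, not necessarily the exact formula used in the analysis:

```python
import math

def cohens_d_from_t(t: float, n1: int, n2: int) -> float:
    """Approximate Cohen's d from an independent-samples t statistic."""
    return abs(t) * math.sqrt(1 / n1 + 1 / n2)

# Teacher-Student Relationships: t(46) = -2.065, n = 29 vs. 19
print(round(cohens_d_from_t(-2.065, 29, 19), 2))  # → 0.61
# Organization: t(46) = -1.795, same group sizes
print(round(cohens_d_from_t(-1.795, 29, 19), 2))  # → 0.53
```

Both values match the effect sizes reported for these two scales.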
4 DISCUSSION
In the current study, we investigated the impact of the learning transfer into the workplace of a higher-education Applied Informatics teacher implementing a student-centered learning (SCL) initiative during one semester on a new subject in her portfolio. Hence, we evaluated the SCL initiative's impact on two levels: students' approaches to learning and students' perception of the teaching quality (i.e., which is also a measure of the teacher's teaching behaviors).
Regarding the first two research questions of the
current investigation, the SCL initiative did not have
any impact on students’ learning approaches. There
were no improvements either on the deep approaches
to learning or on the surface learning approaches
of the students in the experimental group. Also,
there were no differences regarding students’ learn-
ing approaches between the control and experimen-
tal group. At the end of the semester, students in
both groups had the same learning approaches in
the two disciplines as, in general, in their bachelor
study program. Our results differ from several other studies, which found that students of academics who participated in an IDP or a staff development initiative were more likely to adopt deep learning approaches (Gibbs and Coffey, 2004). On the other hand, a recent review by Asikainen and Gijbels (2017) concluded that most of the existing studies do not exhibit clear empirical evidence that students develop deep approaches to learning during higher education. Moreover, several other studies showed that the deep approach to learning does not necessarily develop during university studies: students' deep approach to learning could even decline during the bachelor years (Lietz and Matthews, 2010) while the surface approach develops (Geitz et al., 2016).
Concerning the third research question of the present study, there were both expected and unexpected results. The SCL initiative had a positive impact only on the active learning dimension out of the nine dimensions of teaching quality perceived by the students. The effect size points towards a medium, practically relevant impact of the SCL initiative on the Active learning scale (Cohen's d = 0.58). One of the reasons why the Active learning score is higher for the experimental group could be that individual quizzes and/or group tasks were constantly assigned during lectures. Also, a group project with different milestones was set, and feedback was given to all the teams in the experimental group. This result is in line with some other studies (Gibbs and Coffey, 2004; Meizlish et al., 2018). For example, Meizlish et al. (2018) found a positive impact of an IDP for novice academics on students' ratings, showing a statistically significant increase in the experimental group compared to the control group. On the other hand, students in the control group reported higher scores on their teacher's behaviors regarding the Teacher-Student Relationships and the Organization of the course. This result could be explained by the fact that the discipline taught by the teacher who implemented the SCL initiative was new in her teaching portfolio, which was not the case for the counterpart teacher. Another possible reason could be that students in the experimental group perceived the numerous tasks and
Table 5: ETCQ dimension scores for the experimental group in comparison to the control group at the post-test moment.

ETCQ scale (Group)                   N    Mean   SD       t       df      p      Change   Cohen's d
Understanding Fundamental Concepts                      -1.252    46    0.217    Same        -
  Experimental Grp                   29   3.68   0.75
  Control Grp                        19   3.97   0.86
Relevance                                               -0.178    46    0.859    Same        -
  Experimental Grp                   29   3.73   0.71
  Control Grp                        19   3.77   0.71
Challenging Beliefs                                      0.475    46    0.637    Same        -
  Experimental Grp                   29   3.46   0.92
  Control Grp                        19   3.34   0.70
Active Learning                                          1.891   27.76  0.069*   Better     0.58
  Experimental Grp                   29   4.21   0.56
  Control Grp                        19   3.79   0.87
Teacher-Student Relationships                           -2.065    46    0.045*   Worse      0.61
  Experimental Grp                   29   3.35   0.81
  Control Grp                        19   3.84   0.79
Motivation                                              -0.839    46    0.406    Same        -
  Experimental Grp                   29   3.41   0.92
  Control Grp                        19   3.63   0.81
Organization                                            -1.795    46    0.079*   Worse      0.53
  Experimental Grp                   29   3.45   1.01
  Control Grp                        19   3.96   0.90
Flexibility                                             -0.894    46    0.376    Same        -
  Experimental Grp                   29   3.92   0.82
  Control Grp                        19   4.15   0.95
Assignments                                              0.115    46    0.909    Same        -
  Experimental Grp                   29   3.81   0.78
  Control Grp                        19   3.79   0.61
homework during the semester, and their consistent application, as too strict. Also, in most disciplines, students are evaluated mainly in the examination session. Thus, as suggested by other studies, teachers must allocate extra time to successfully implement what is learned during instructional development in daily practice (Gibbs and Coffey, 2004; Stes et al., 2010a). Postareff et al. (2008) showed that changing the paradigm to an SCL approach is slow and progressive on the teachers' side. Hence, one semester of 14 weeks may not be sufficient for visible results. Indeed, several other studies which measured the impact of an IDP or staff development initiative reported no, limited, or even negative effects (Stes et al., 2012; Stes et al., 2010a).
Limitations and Future Directions. The main limitation of the current endeavor is the low number of students in the two groups and the impossibility of matching all the responses across the two assessment moments. As a consequence, our statistical power is very low. Second, the employed design is quasi-experimental (i.e., it lacks randomization). Third, because of limited resources, we assessed the impact of the SCL initiative only through quantitative investigation. Thus, for the aforementioned reasons, among others, we should be cautious in interpreting the present results. Future studies should consider employing an experimental design (i.e., conducting a randomized controlled trial), quantitative and qualitative measurements (e.g., classroom observations, interviews, etc.), and, most importantly, good statistical power. Also, if possible, one should obtain answers from the same students at the pre-test and post-test.
5 CONCLUSIONS
In this paper we presented the effect of an SCL initiative through a quasi-experimental design with pre-test and post-test assessments. We showed that the transfer into the workplace of the learning of a higher-education Applied Informatics teacher, through the implementation of student-centered learning (SCL), is perceived as positive by the students. However, creating an active learning environment may not be enough to convince them to change their usual learning approaches. Hence, one should strive to transfer their learning into daily practice in ways that positively influence student learning.
ACKNOWLEDGEMENTS
We thank Monica Sancira from the Department of
Computer Science, West University of Timisoara, for
agreeing to participate in this study as the control
group teacher.