STEM Education Research Exchange

A regular cross-disciplinary meeting with a focus on discipline-based education research (DBER), Scholarship of Teaching and Learning (SoTL) research, and pedagogical practices in STEM at UBC's two main campuses.

Once in full swing, we aim to meet monthly with a focus on research and evidence-based innovation and teaching practices in STEM. Sessions will be 90 minutes, broken into roughly three 30-minute segments: a scheduled presentation, open discussion, and flexible time. Sessions will feature works in progress and the exchange of ideas rather than polished presentations of published research. Individuals or groups will submit proposals as a short, structured abstract on a Google Form, along with the range of dates during which they would prefer to present. The current SERE committee will select presenters based on criteria such as broad interest and diversity in disciplines, methods, and contexts.

Click here for a document with a full description of the purpose and goals of the STEM Education Research Exchange.

To join SERE and receive email notifications about upcoming events, please contact Warren Code at Skylight: warcode@science.ubc.ca.

SERE members can sign up to present their research in progress at an upcoming session by filling out this SERE sign-up Google Form.

If you have any questions, please contact one of the committee members listed below; if you aren't sure whom to contact, email Warren Code at Skylight (warcode@science.ubc.ca).

The SERE Committee:

Trish Schulte (Biology), Jackie Stewart (Chemistry), Gabriel Potvin (Chemical & Biological Engineering), Georg Rieger (Physics & Astronomy), and Warren Code (Skylight)

Previous Sessions

July 2023 - Program-Level Assessment

Alireza Bagherzadeh (Chemical and Biological Engineering, UBCV)

Recording available by request.

When: Tuesday, July 18, 2023 - 3:00pm-4:30pm

Where: Online via Zoom and in person

  • In-Person: Chemical and Biological Engineering (CHBE) Building, Room 202
  • Zoom (registration closed)

Abstract:

MIT published a report on the global state of the art in engineering education (March 2018) and noted “measuring the actual learning” as the next big frontier of the field. Students come to class with an existing level of knowledge about the subject matter; therefore, measuring the “delta” in students’ knowledge would be a more accurate representation of the effectiveness of the course, teacher, and/or program. Exams, and in particular finals, do not measure this “delta”, and the accuracy of these tests in reflecting actual student learning is a controversial topic. They might be reasonable measures of immediate knowledge uptake, but they do not provide insight into deep learning and long-term knowledge retention.

To measure students' actual learning and retention of knowledge, a new holistic assessment was developed, herein referred to as the yearly assessment (YA). The objective is to measure the “delta” in students’ learning after they go through one year of instruction in our undergraduate program.

In this work, the results for year two of our curriculum, i.e., Y2-YA, are presented. The assessment was administered on Canvas, once at the beginning of the academic year (the “pre-test”) and once again after the same cohort of students returned a year later to start the third year of their undergraduate degree (the “post-test”). The assessment takes the form of conceptual multiple-choice questions (CMCQs) targeting the key concepts of core courses in the curriculum. These CMCQs underwent a few rounds of revision and were vetted by each course instructor before implementation in September 2021. For the Y2-YA, roughly 175 questions were developed, and 50 questions were randomly selected from this bank of CMCQs for each student. Students were given two minutes per question and were advised that the assessment would not impact their academic record and that they did not need to study for it.

The results of the pre-test provide insight into the prior knowledge and academic level of each cohort and can be used to semi-quantitatively compare different cohorts of students. Furthermore, calculating the cohort-average normalized gain, G = (post - pre) / (100 - pre), for each course and its key concepts provides an objective measure of the effectiveness of the course and valuable feedback for the instructor. This figure can also provide additional evidence for the first graduate attribute of the CEAB, i.e., the engineering knowledge base. Individual reports can also be prepared for each student, providing feedback on their performance relative to their cohort.
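As a rough illustration of this calculation, here is a minimal Python sketch (the scores are hypothetical, and the presenters' exact aggregation may differ; this version computes the gain from cohort-average scores):

    import numpy as np

    # Hypothetical cohort-average scores (percent correct) for four key concepts.
    pre = np.array([42.0, 55.0, 38.0, 61.0])   # pre-test, start of year two
    post = np.array([58.0, 70.0, 45.0, 72.0])  # post-test, start of year three

    # Normalized gain G = (post - pre) / (100 - pre), as defined above.
    G = (post - pre) / (100.0 - pre)

    print(G.round(2))          # per-concept normalized gains
    print(round(G.mean(), 2))  # cohort-average normalized gain across concepts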

The yearly assessment has been used to objectively measure students' learning and retention of knowledge as they go through one year of instruction in our program. The cohort-average normalized gain was 20%, a figure that shows there is substantial room for improvement. Analysis of the results also showed no statistically significant difference between the performance of male and female students. Additionally, these assessments serve as a tool for evaluating the impact of curriculum changes on students' learning and will be incorporated into a new evaluation system to support the continuous improvement process.

June 2023 - Modeling guessing behavior with limited test attempts

Bowen Hui (Computer Science, Mathematics, Physics, and Statistics Department, UBCO)

When: Thursday, June 29, 2023 - 3:00pm-4:30pm

Where: Online via Zoom (registration closed)

Abstract: 

Mastery learning and deliberate practice promote personalized learning, allowing the learner to improve through a repetitive and targeted approach. Since this is a new pedagogical approach to assessing student learning, we collected data to investigate test-taking behavior and evaluated potential learning gains in this new test format. The most relevant existing work is in the rapid-guessing literature, where responses to online test items are classified as guesses based mainly on response times. Although our test format does not allow for a direct application of the existing models, we implemented some of their ideas and would like to gather feedback from others.

The course involved here is COSC 341, offered by Computer Science in the Faculty of Science at the Okanagan campus. The data come from Canvas quizzes during UBC's online teaching period due to COVID, when small tests were given online and a maximum of three attempts were allowed per question. In the beginning, we put each question on a separate screen, but students wanted them all on the same screen. For this reason, we could not obtain item-level data. There are a total of 10 modules, each with a pre-test and a post-test. On average, a test had five questions.

The literature suggests that rapid response times are an indication of guessing. We developed a response-time threshold model and used it to approximate the amount of guessing that took place. Recently, we explored other models of guessing but had trouble using accuracy-based or more complicated models because we did not have item-level response data. The amount of guessing per module pointed to the quality of the tests used, as well as, potentially, the difficulty of the module content. This information helps instructors improve their course development activities.
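For discussion purposes, here is a naive Python sketch of a response-time threshold model (the threshold value and data are hypothetical; the presenter's actual model may differ):

    import numpy as np

    def guess_rate(response_times, threshold=5.0):
        """Fraction of responses faster than the threshold (seconds),
        treated as probable guesses under a simple threshold model."""
        times = np.asarray(response_times, dtype=float)
        return float((times < threshold).mean())

    # Hypothetical per-module response times (seconds per attempt).
    modules = {
        "module_1": [3.2, 12.5, 4.1, 45.0, 2.8],
        "module_2": [20.4, 18.9, 33.1, 7.5, 25.0],
    }
    for name, times in modules.items():
        print(name, round(guess_rate(times), 2))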

June 2023 - Supporting student belonging through undergraduate field experiences

Alison Jolley (University of Waikato, New Zealand)

Recording available by request.

When: Thursday, June 15, 2023 - 2:30pm-4:00pm

Where: Online via Zoom and in person

  • In-Person: Chemical and Biological Engineering (CHBE) Building, Room 202
  • Zoom (registration closed)

Abstract: 

Sense of belonging has extensive benefits for student learning and has been well studied in classroom settings, but it has had limited exploration in field education contexts. This talk will explore preliminary analyses from a mixed methods study of student sense of belonging during eight biology and geology undergraduate field experiences. We will discuss what belonging looked like for students, the teaching practices and contexts that contributed to its development, and how promising findings might best be shared with practitioners.

May 2023 - Tips and tricks for getting the most out of UBC's survey tool, Qualtrics

Joss Ives (Physics and Astronomy, UBCV)

When: Tuesday, May 30, 2023 - 3:30pm-5:00pm

Where: Online via Zoom and in person

  • In-Person: Chemical and Biological Engineering (CHBE) Building, Room 102
  • Zoom (registration closed)

Description: 

In this session, Joss Ives from Physics and Astronomy at UBC Vancouver will share a sampler of survey tool discoveries in Qualtrics. You can find the Qualtrics support page from UBC IT here: https://it.ubc.ca/services/teaching-learning-tools/survey-tool. We will spend more time on those items of greatest interest to the people in attendance:

  • Using answers from previous questions in subsequent questions, for example, to create IFAT (scratch-card multiple choice) tests
  • Using a Contact List to save information about who has previously completed parts of a survey and/or to save information that can be imported into a future survey
  • Emailing completed surveys or confirmations to students/participants
  • Seamlessly passing a student from one survey to another, optionally including embedded information; this can be used to make part of a survey completely anonymous while still retaining identifiers for participation credit
  • Embedding information into the survey URL (such as the section number for a course) and obscuring that information using encoding (a minimal encoding sketch follows this list)
  • CWL login and what data you have access to through that process
  • Other embedded data magic (e.g., survey flow based on date)
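For the URL-encoding item above, a minimal Python sketch of one common approach using Base64 (the survey URL and parameter name are hypothetical; note that Base64 obscures but does not encrypt):

    import base64
    from urllib.parse import urlencode

    # Hypothetical: obscure a course section number before embedding it in a survey URL.
    section = "101"
    token = base64.urlsafe_b64encode(section.encode()).decode()

    # Hypothetical survey URL with the obscured value as a query parameter.
    url = "https://ubc.qualtrics.com/jfe/form/SV_example?" + urlencode({"s": token})
    print(url)

    # On the analysis side, decode the embedded value back.
    print(base64.urlsafe_b64decode(token.encode()).decode())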

February 2023 - Update from the SALTISE-Research Team: Connecting the Dots! 

Elizabeth Charles (Dawson College, Co-Director of SALTISE)

When: Thursday, February 23, 2023 - 3:30pm-4:30pm

Where: Both online & in-person

  • In-Person: Biological Sciences (BIOL) Building, Room 1012 
  • Zoom (registration closed)

Abstract: 

The Montreal-based SALTISE-Research Team is a group of practitioner-researchers who have been working together for over 15 years investigating the use of student-centered pedagogies (active learning instruction) and new learning spaces to promote and increase students' conceptual learning in science. To date, our emphasis has centered heavily on physics education. Our research methodology follows a Design-Based Research (DBR) approach, which means we have incrementally improved the design of the intervention in question based on the results of early implementations. Consequently, our findings are typically ready for use in the classroom.

In this presentation I will report on our current project, which involves designing scaffolds to support learning in inquiry-based labs (IBL). IBL is a type of active learning instruction that engages students directly in making decisions along the path of the scientific process, e.g., which methods to use and which data to collect. As such, it stands in contrast to traditional labs that often rely heavily on cookbook-like procedures. The IBL study involved three iterations of a 15-week lab intervention that refined scaffolds to increasingly support students' capacity to work independently on the culminating lab, which was a design challenge: build a string- or pipe-based instrument able to play three or more fundamental frequencies.

Early iterations showed that while students could successfully engage in making decisions related to designing methods and procedures, they could not complete a lab experiment that required decisions about analysis and interpretation. We refer to this as being able to "plot the dots on a graph, but not able to connect the dots." Results of our three-iteration IBL-DBR study illustrate the challenge and the modifications made to overcome a critical shortcoming in students' ability to engage fully in a design-based IBL activity. I will describe briefly what these scaffolds look like and why we believe students are now able to "connect the dots."

Learn more about SALTISE at https://www.saltise.ca/.

October 2022 - Defining, Developing, and Measuring Student Agency in Instructional Labs

Natasha Holmes (Department of Physics, Cornell University)

Recording available by request.

When: Wednesday, October 12, 2022 - 2:00pm-3:00pm

Where: Both online & in-person

  • In-Person: Chemical and Biological Engineering (CHBE) Building, Room 202
  • Zoom (registration closed)

Abstract: 

Across science education, labs are offering more and more opportunities for students to design and conduct open-ended experiments. How do we most effectively hand over that control to the students, supporting and encouraging their agency? How do we measure whether students have taken up that agency and, if so, what are the benefits? This presentation will offer practical insight for instructors and researchers interested in fostering and evaluating student agency in instructional science labs.

May 2022 - Comparing student grade outcomes before and after a collection of course interventions

Thursday, May 26, 10:30am in person and on Zoom

Warren Code

Skylight (Science Centre for Learning and Teaching) at UBCV

Background: Comparing quantitative outcomes (e.g., grades) of groups is a popular approach to investigating effects of interventions, though in education settings it is rarely possible to use randomized controlled trials to establish comparable groups.

Setting: This project looks at grade trends in the UBC Faculty of Science before and after a major initiative that affected teaching methods in many (but not all) courses over a period of ten years. This study is observational in that students were not assigned to their courses.

Study design: To establish comparable groups before and after the initiative in each course, we use an approach called "propensity score matching," which constructs groups that have similar mixtures of covariates on average; for example, rather than requiring that each Biology student with an A average be compared directly with another Biology student with an A average, it requires that the number of students with A averages and the number of Biology students in the "before intervention" and "after intervention" groups are approximately the same. Limitations of this approach will be discussed, but the intention is that the effect size computed from such a comparison offers a firmer basis for causal inference than simpler regression models comparing outcomes. We aggregate the effect sizes (one for each course) using meta-analysis methods.
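As a minimal sketch of the matching idea in Python (the data are synthetic, and the study's actual covariates, matching specification, and meta-analysis are more involved):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)

    # Synthetic student records: covariates (incoming average, program flag)
    # and an indicator for the "after intervention" group.
    n = 500
    X = np.column_stack([rng.normal(70, 10, n), rng.integers(0, 2, n)])
    after = rng.integers(0, 2, n).astype(bool)
    grade = 0.6 * X[:, 0] + 2.0 * after + rng.normal(0, 5, n)

    # 1. Estimate propensity scores: P(after | covariates).
    ps = LogisticRegression().fit(X, after).predict_proba(X)[:, 1]

    # 2. Match each "after" student to the "before" student with the
    #    nearest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(ps[~after].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[after].reshape(-1, 1))
    matched_before = grade[~after][idx.ravel()]

    # 3. Standardized effect size on the matched sample (a rough Cohen's d,
    #    using the overall grade SD as the scale).
    d = (grade[after] - matched_before).mean() / grade.std(ddof=1)
    print(round(d, 3))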

Results: We will offer some preliminary findings from this analysis approach with differential effects based on the level of contact with the initiative's course transformation projects.

Implications for teaching and learning: This work is primarily about the evidence of impact of active learning on grade outcomes, both in the courses where interventions were implemented and in downstream courses (we can point attendees to various examples of course interventions from this initiative). One way this could be informative is in offering guidelines around the size of effect one might observe when proposing a teaching intervention and how it might be evaluated in terms of student performance; grades can show differences but provide a cruder measure than purpose-built assessments (e.g., concept inventories).

Mar 2022 - Performance differences in isomorphic test questions

March 1 from 2:00-3:30pm on Zoom

Georg Rieger

Department of Physics & Astronomy at UBCV

Background: In PHYS 100, a relatively large question bank is used for bi-weekly tests. When looking at the results for different versions of isomorphic questions, we noticed fairly large differences in student performance. What we learned from comparing the question details could potentially be useful for test question design.

Setting: The data is from Physics 100, an introductory algebra-based course that is taken by students who do not have Physics 12 credit. The examples we intend to show will be accessible to most participants.

Study design: This is a comparative study that puts the questions, their solution steps and their outcomes side-by-side. Explanations are proposed for the observed performance differences.

Results: We identified 17 items that can lead to performance differences between versions of isomorphic questions. These items are often related to student difficulties with physics and math.

Implications for teaching and learning: While the explanations are not yet proven (we will discuss this), the results have provided some insights into student learning. In particular, they highlight the importance of math literacy in physics, and probably in engineering and other physical sciences.

Dec 2021 - First meeting and sample session on equity and learning performance

December 8 from 3:30-5:00pm on Zoom

Jaclyn Stewart and Taylor Wright

Department of Chemistry at UBCV

At this first meeting, we spent the first part discussing the Exchange and then held a shortened version of our intended format, with a presentation from a project looking at student outcomes and equity in introductory courses. The study involved survey instrument development, with 11 demographic variables, and examined the survey results relative to performance data.