
Methodological Advances and Issues in Studying College Impact: New Directions for Institutional Research, Number 161

  • Publication date: December 4, 2014
  • Publisher: Jossey-Bass
eBook (ePUB)
€22.99
incl. statutory VAT
Available immediately via download

Available online

Methodological Advances and Issues in Studying College Impact

Which curricular and cocurricular practices promote student learning and persistence? While most research and assessment on college student outcomes offers limited insight into causal effects, this volume provides strong evidence of the impact of college on students. The first section discusses statistical analyses that offer more accurate estimates of the causal effect of a particular student experience, such as receiving a need-based scholarship or using academic support services. Providing an overview of the analytical framework, it also includes real-world examples to illustrate implementation for institutional researchers. The second section includes original research to enhance the value of student surveys, including:

  • aspects of questionnaire design and techniques to cope with item nonresponse,
  • variation in respondent effort,
  • interpretation of student self-reported gains, and
  • practical insights to improve survey-based research.

This is the 161st volume of this Jossey-Bass quarterly report series. Timely and comprehensive, New Directions for Institutional Research provides planners and administrators in all types of academic institutions with guidelines in such areas as resource coordination, information analysis, program evaluation, and institutional management.

Product information

    Format: ePUB
    Copy protection: Adobe DRM
    Page count: 128
    Publication date: December 4, 2014
    Language: English
    ISBN: 9781119045588
    Publisher: Jossey-Bass
    Size: 2010 KB

Methodological Advances and Issues in Studying College Impact

1

Applying Regression Discontinuity Design in Institutional Research

Allyson Flaster, Stephen L. DesJardins

The goal of this chapter is to provide a brief introduction to one of the most rigorous nonexperimental analytical methods currently employed by education researchers: regression discontinuity.

Institutional researchers are often tasked with studying the effects of a variety of postsecondary education practices, policies, and processes. Examples include the effectiveness of precollegiate outreach programs such as summer bridge programs; whether first-year experience and developmental classes affect student outcomes; whether students residing in living-learning communities have outcomes that differ from their nonparticipating colleagues; and whether financial aid provision affects student outcomes such as persistence and completion. Given scarce institutional resources and the push for accountability in postsecondary education, decision makers are increasingly interested in whether institutional policies and programs actually achieve their intended goals.

Although a large body of research about the effectiveness of institutional interventions designed to improve student and institutional outcomes exists, there have been calls to improve the rigor of our research (DesJardins & Flaster, 2013; Schneider, Carnoy, Kilpatrick, Schmidt, & Shavelson, 2007). In particular, there has been a push for education researchers to be able to make more rigorous ("causal") claims about our practices, policies, and processes.

Experiments (randomized controlled trials, or RCTs), which are characterized by the random assignment of subjects into treatment and control groups, are considered the "gold standard" for making causal claims (Schneider et al., 2007; Shadish, Cook, & Campbell, 2002). The rationale for conducting experiments is to provide an unbiased estimate of the effect of the treatment on an outcome, but RCTs are often impracticable or may even be unethical in some research contexts (see Bielby, House, Flaster, & DesJardins, 2013; DesJardins & Flaster, 2013, for details). There are, however, statistical methods that can be employed when using observational (nonexperimental) data. These quasi-experimental methods attempt to remedy the inferential problems that arise when units of observation, such as students, are not randomly assigned into treatment or control groups. Even though these methods, some of which are discussed in this volume, do not randomize units into treatment/control status, when properly applied they can substantially reduce any estimation bias due to nonrandom assignment.

Here we provide an introduction to one of these methods: the regression discontinuity (RD) design. In the next section, we discuss a framework often used as the conceptual basis in nonexperimental analyses such as RD.
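As a preview of the core intuition (the chapter develops the mechanics later), a sharp RD estimate can be sketched with simulated data. Everything below — the scholarship rule, the income cutoff, and the 0.25-point effect size — is a hypothetical assumption chosen for illustration, not a value from the chapter:

```python
import numpy as np

# Hypothetical sharp RD: a need-based scholarship (the treatment) is awarded
# only to students whose family income falls below a cutoff, and we estimate
# the jump in the outcome at that cutoff. All values are simulated.
rng = np.random.default_rng(1)
n = 5_000

income = rng.uniform(20, 80, n)        # running variable ($ thousands)
cutoff = 50.0
treated = income < cutoff              # sharp assignment rule

# Outcome: GPA varies smoothly with income, plus a 0.25 jump from treatment.
gpa = 2.0 + 0.01 * income + 0.25 * treated + rng.normal(0, 0.1, n)

# Fit a separate line on each side of the cutoff, within a bandwidth.
bw = 10.0
left = treated & (income > cutoff - bw)
right = ~treated & (income < cutoff + bw)

fit_left = np.polyfit(income[left], gpa[left], 1)
fit_right = np.polyfit(income[right], gpa[right], 1)

# RD estimate: the gap between the two fitted lines at the cutoff.
rd_effect = np.polyval(fit_left, cutoff) - np.polyval(fit_right, cutoff)
print(round(rd_effect, 2))   # close to the true jump of 0.25
```

The identifying idea is that students just below and just above the cutoff are essentially comparable, so a discontinuity in the outcome at the cutoff can be attributed to the treatment.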
Overview of the Counterfactual Framework

Many social scientists have employed the counterfactual framework in support of analyses designed to make rigorous claims about the effectiveness of institutional practices, policies, or processes ("interventions"). Collectively, these interventions are often referred to as treatments. The counterfactual framework posits that, hypothetically, each unit (individuals, classrooms, households, and so on) under study has two potential outcomes: one outcome under treatment and another outcome under nontreatment (Holland, 1986; Murnane & Willett, 2011). Ideally, to determine whether a treatment causes an effect, we would observe each unit's outcome in a world where it received the treatment and compare it with the unit's outcome in a counterfactual world where it did not receive the treatment.
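The potential-outcomes logic can be illustrated with a small simulation. The aid scenario, the 10-point effect, and the selection rule below are hypothetical, chosen only to show why a raw comparison of treated and untreated groups can mislead when assignment is nonrandom:

```python
import numpy as np

# Each student has two potential retention outcomes: one if aided (y1) and
# one if not (y0). In reality we observe only one of the two per student.
rng = np.random.default_rng(0)
n = 10_000

need = rng.uniform(0, 1, n)            # unobserved financial need
y0 = 0.70 - 0.30 * need                # retention probability without aid
y1 = y0 + 0.10                         # aid raises retention by 10 points (true effect)

# Aid is NOT randomly assigned: needier students are more likely to receive it.
aided = rng.uniform(0, 1, n) < need

# The fundamental problem of causal inference: only one outcome is observed.
observed = np.where(aided, y1, y0)

true_ate = np.mean(y1 - y0)                                  # average treatment effect
naive = observed[aided].mean() - observed[~aided].mean()     # raw group comparison

print(round(true_ate, 2))   # 0.1
# 'naive' lands well below 0.1: aided students are needier, which drags
# their retention down and masks the benefit of the aid.
```

Quasi-experimental designs such as RD are strategies for recovering something close to `true_ate` when, as here, the raw group comparison is contaminated by selection.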

For example, imagine we want to determine whether the provision of student financial aid (the treatment) improves the retention rate of students.

