A Mixed Methods Sampling Methodology for a Multisite Case Study

Julia L. Sharp¹, Catherine Mobley¹, Cathy Hammond¹, Cairen Withington¹, Sam Drew¹, Sam Stringfield², and Natalie Stipanovic²

Journal of Mixed Methods Research 6(1) 34–54
© The Author(s) 2012
Reprints and permission: http://www.sagepub.com/journalsPermissions.nav
DOI: 10.1177/1558689811417133
http://jmmr.sagepub.com

¹ Clemson University, Clemson, SC, USA
² University of Louisville, Louisville, KY, USA

Corresponding Author:
Julia L. Sharp, Department of Applied Economics and Statistics, 237 Barre Hall, Clemson University, Clemson, SC 29634-0313, USA
Email: jsharp@clemson.edu
Abstract
The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to selecting a sample for their multisite mixed methods case study of a statewide education policy initiative in the United States. The authors designed a four-stage sequential mixed methods site selection strategy to select eight sites in order to capture the broader context of the research, as well as any contextual nuances that shape policy implementation. The authors anticipate that their experience will provide guidance to other mixed methods researchers seeking to maximize the rigor of their multisite case study sampling designs.
Keywords
site selection, mixed methods sampling, mixed methods study, case study, purposive sampling
Social scientists use mixed methods research in a variety of contexts, including educational
research and policy analyses. Ultimately, the mixing decisions in these studies entail determining how and at what stages the mixing will occur (Creswell & Plano Clark, 2007). Much of the
pertinent literature focuses on selecting qualitative and quantitative data collection strategies to
explore the study’s research questions. More robust studies often involve mixing methods
throughout the research process (Tashakkori & Teddlie, 1998, 2003a), including during the
sample or site selection stage of research.
Collins, Onwuegbuzie, and Jiao (2007) described sample selection as one of the most
important stages of mixed methods studies. Mixed methods sampling techniques may be useful
when it is challenging to obtain a representative sample using only one method. Such sampling
techniques are often appropriate for mixed methods studies that incorporate both goals of
generalizability of research findings and in-depth understanding of the research context (Kemper,
Stringfield, & Teddlie, 2003; Onwuegbuzie & Leech, 2007). Specifically, if a study cannot use
random assignment or selection, then multistage, mixed methods sampling designs may be used
to select participants or sites that are more likely representative of the population studied and that
are best suited to answer the research questions.
Addressing sampling issues early in the research design helps ensure that the benefits of the mixed methodology employed are fully realized (Lieber, 2009). Yet few mixed
methods studies provide enough details about sample or site selection so researchers can learn
from the strategy employed (Teddlie & Tashakkori, 2009). Collins et al.’s (2007) extensive
literature review uncovered only four articles that specifically address sample selection. There is
a similar lack of published information about how to best select sites for multisite case studies
(Onwuegbuzie & Leech, 2007).
Our study aims to fill this gap in the literature. In this article, we describe our mixed methods
multisite study of a statewide education policy initiative and the sampling strategy that we
developed and used to select sites. We anticipate that our experience can provide guidance to
other mixed methods researchers seeking to maximize the rigor of their own multisite case study
sampling designs.
Background of Our Mixed Methods Multisite Case Study
There continues to be much concern regarding how best to prepare students for higher education and
ultimately the demands of the modern workplace. This concern is evidenced in federal, state, and
local legislation, initiatives, and programs. A major focus of the Obama administration’s education
agenda is preparing students for college and/or for high-skill, high-wage jobs (White House Web site,
2010). Across the United States, policymakers have developed innovative programs and curriculum
reform efforts to improve school-to-work and school-to-college transitions.
In 2007, the National Research Center for Career and Technical Education funded three
overlapping multiyear studies to investigate the development of Perkins IV–defined Programs of
Study (POS) and the impact of POS on student outcomes (see Table 1 for acronym definitions).
Our study, one of the three, is a longitudinal, 5-year examination of the early effects of a state-
mandated school reform policy that requires a focus on career awareness and exploration at all
school levels. This policy is based on South Carolina’s Education and Economic Development
Act (EEDA) of 2005 that mandated the creation of locally relevant career pathways or programs
of study in high schools. Among other requirements, these pathways or programs must align with
postsecondary education, pertain to local economic realities and industry, and provide work-
based learning opportunities for students. All South Carolina public schools are expected to have
fully implemented EEDA by July 2011.
Our overarching study aims to assess the extent to which a statewide reform mandate such as
EEDA facilitates the creation of POS and whether these POS affect students’ engagement,
achievement, high school completion, and successful transition to postsecondary education and/
or employment. We also explore whether or not the availability of school and community
resources and future employment opportunities influence the development of POS and the
outcomes of students enrolled in them. This study considers two main questions: (a) Can a
statewide mandate like EEDA increase the number of and level of participation in school POS?
and (b) How does the number of and level of participation in POS, in combination with various
political, economic, and social characteristics, influence selected outcomes for South Carolina’s
secondary students and the schools they attend?
Given that the state law applies to all high schools in South Carolina, random assignment of
schools to experimental and control groups was not possible. Thus, our 5-year study employs a
quasi-experimental design with a mixed methods, triangulated approach (Tashakkori & Teddlie,
2003a), following three student cohorts (the Classes of 2009, 2011, and 2014) from a sample of
eight high schools from economically and culturally diverse regions of South Carolina. Each
school’s Class of 2009 received very little to no exposure to the reforms, whereas the Class of
2011 is receiving exposure during high school and the Class of 2014 is receiving exposure from
middle through high school. Using a mixed methods research design will help us achieve a
broader understanding of the policy’s impacts on students, teachers, schools, and POS creation.
Our study most closely approximates the pragmatic parallel mixed methods research design
in which both qualitative and quantitative data are collected and analyzed to address research
questions. The mixing occurs either concurrently or after some time passes (Mertens, 2010). The
qualitative and quantitative approaches address various dimensions of the main research
questions (Teddlie & Tashakkori, 2009). Quantitative data include student outcome data (e.g.,
grades and attendance) from the three student cohorts and responses from student questionnaires.
Qualitative data include the results from content analyses of school course catalogs and career-
related materials and perspectives culled from interviews and focus groups conducted with
school administrators, counselors, and teachers from sample high schools, relevant career center
staff, and administrators at partner two-year colleges. Both qualitative and quantitative data
come from student Individual Graduation Plans (IGPs) from the state database, questionnaires of
guidance personnel and high school principals, and Class of 2009 and Class of 2011 student
transition and post-graduation information.
Justification for Using Mixed Methods Research for Our Multisite Case Study
The choice to use a mixture of qualitative and quantitative methods reflects an epistemological, or philosophical, stance toward research as much as it is a choice of data collection methodologies. Case studies can be postpositivistic, phenomenological, or both
(Amaratunga, Baldry, Sarshar, & Newton, 2002). Our study was influenced by the research
philosophy of pragmatism that served as a bridge between conflicting paradigms and across the
paradigm–methodology–method continuum (Johnson & Onwuegbuzie, 2004). Indeed, the
complex nature of the social world requires a more fluid understanding and application of the
relationship between philosophical paradigms (assumptions about the social world and nature of
knowledge), methodology (the logic of inquiry), and methods (techniques of data collection).
Various authors have argued for the need to move beyond incommensurability, whereby one
set of philosophical assumptions necessarily dictates a specific methodological approach, which
would then subsequently limit one’s choice of data collection methods (Howe, 1988; Johnson &
Onwuegbuzie, 2004). As described below, the flexibility inherent in a pragmatic approach to
research is especially important in complex case studies.
Table 1. Definition of Acronyms

Acronym    Definition
EEDA       Education and Economic Development Act
IGP        Individual Graduation Plan
NRCCTE     National Research Center for Career and Technical Education
POS        Programs of Study
PSLOI      Preliminary Site Selection Level of Policy Implementation
SDE        South Carolina Department of Education
SLOI       Site Selection Level of Policy Implementation
WIA        Workforce Investment Area

Longitudinal multisite case studies like ours combine the study of specific sites with an exploration of the various contexts in which the policy might be implemented to provide a broader basis for generalization (Simons, 1996). For our study, we faced the challenges of
selecting sites that were representative of the study area (South Carolina) while also selecting
sites that could tell us the most about the complexities of policy implementation. To meet these
challenges, our research design and site selection process relied on the epistemological approach
of pragmatism (Biesta, 2010; Greene & Hall, 2010). This mixed methods research approach is
primarily guided by a study’s research questions, is based on the needs of and contingencies
present in a particular study, and ultimately reflects a value of both subjective and objective
knowledge (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003a). This pragmatic
research philosophy results in more robust and interesting findings and thus is of greater value to
policymakers and practitioners (Sammons, 2010), although we recognize that pragmatism is not
the only, or even most appropriate, philosophical foundation for all mixed methods research
(Biesta, 2010; Greene & Hall, 2010).
Several principles of pragmatism influenced our logic of inquiry, research design, and the
methods used for our site selection process (Johnson & Onwuegbuzie, 2004):
• Pragmatism recognizes the importance of eclecticism and pluralism whereby “different,
even conflicting theories and perspectives can be useful; observation, experience and
experiments are all useful ways to gain an understanding of people and the world”
(Johnson & Onwuegbuzie, 2004, p. 18).
• Pragmatists prefer action over philosophizing.
• Pragmatism endorses practical theory or praxis (theory that informs effective practice).
• From a pragmatic perspective, “knowledge is viewed as being both constructed and
based on the reality of the world we experience and live in” (Johnson & Onwuegbuzie,
2004, p. 18).
These principles of pragmatism subsequently influenced our logic of inquiry on four different
levels described below.
First, as a study of policy implementation, our research seeks “actionable knowledge of direct
practical value in the context being studied” (Greene & Hall, 2010, p. 138). The nature of our
study made it important for us to consider how the data would help us learn about the impact of
the legislation and school-based policies related to POS. As expressed by Rorty (1999), a more
comprehensive assessment of policy implementation can be obtained using a pragmatic approach,
not necessarily for the goal of providing a more “accurate account” but rather to improve the
usefulness of the research. Thus, we had to be mindful of the consequences of our study, its
utility for informing future policy initiatives (Feilzer, 2010), and its value for a variety of
stakeholders, including practitioners and policy makers (Sammons, 2010).
In a similar vein, we felt it was important to capture the broader context of the research as well
as the subtle contextual nuances that shaped policy implementation and how these varied across
the study sites. This contextual search is a second quality of pragmatism that influenced our
research. This notion is reinforced by research that demonstrates that some schools are more
effective in achieving outcomes (such as implementing a complex policy like EEDA) than others
(Tashakkori & Teddlie, 2003b). Thus, we recognized the need to account for school effects and
other contextual variables when explaining outcomes.
Third, a pragmatic approach to sample selection shaped our conceptions of generalizability
and how the findings can be applied when reporting study results. Careful selection of sites is
no trivial matter and this choice is often governed by whether the researcher wants to generalize
results to a larger population (in which case random sampling is often used) or desires to learn
more about a specific setting or phenomenon (in which case purposive sampling strategies are
often used; Onwuegbuzie & Leech, 2007). As Tashakkori and Teddlie (1998) claimed, mixed
methods researchers often switch between different types of generalizability: generalizing results to a theoretical population and generalizing results to another specific setting. The
intermingling of quantitative and qualitative approaches and their underlying philosophies is a
central feature of mixed methods studies and of multisite case studies where there is pressure
to generalize research findings beyond a single case (Simons, 1996). In light of these factors,
our study used a purposive sampling strategy that integrated quantitative and qualitative
approaches.
Finally, from a pragmatic perspective, mixed methods studies are strengthened when research
teams are composed of individuals from a variety of disciplines (Sammons, 2010). This reflects
Greene’s (2007) notion that mixed methods researchers should engage in a “mixed methods way
of thinking,” whereby we not only mix methodologies but also “different ways of seeing,
interpreting, and knowing” about the social world (p. xi). The development of our research
design and site selection process was influenced by the make-up of the research team and a
consideration of the relationships between the team members themselves and between the team
members and the study design, a feature of mixed methods research (Sammons, 2010; Tashakkori
& Teddlie, 2003b). Our team includes a statistician, an economist, sociologists, and educational
researchers, each contributing unique methodological training and skills to the project. According
to Teddlie and Tashakkori (2009), this collaborative team approach is especially useful for mixed
methods studies such as ours that use the parallel mixed methods design.
The philosophical approach of pragmatism and the four elements comprising our logic of
inquiry shaped our choice of methods used to select study sites. In the next section, we review
the literature on case studies and sample selection.
Selecting Sample Sites for Mixed Methods Case Studies
Yin (2003) cited several reasons for engaging in case studies. Case studies are (a) relevant when
the focus of a study is on “how” and “why,” (b) used when researchers cannot manipulate the
behavior of those under study, (c) appropriate when researchers want to learn more about the
contextual conditions that are especially relevant to the phenomenon under study, and (d) used
when the boundaries between the subject of study and the context are not clear. All four of these
reasons are applicable to our study.
Yin (1994) distinguished between a variety of case study approaches ranging from a single
case study to multiple case studies such as the Type 4 design, which involves multiple sources of
data, multiple cases, multiple methods, and multiple units of analysis. Longitudinal multisite
case studies such as ours offer a combination of the study of a specific site with the need to
understand the context and provide a wider basis for generalizing findings (Simons, 1996). Such
studies are particularly valuable as they allow for comparisons within cases and across time and
contexts, thus offering a “rich harvest of lessons and insights” (Zartman, 2005, p. 8).
Despite the recognition that a mixed methods design is appropriate for case studies, the
literature offers little guidance on how to select study sites and how many sites to select.
Researchers often resort to convenience sampling, choosing sites that allow easy access. As
a result, site selection often lacks any theoretical justification and the resulting data are often not
situated within a particular theoretical context (Walford, 2001). Although we recognize the need
for convenience sampling in some cases, our work highlights the importance of rigor when
selecting sites for case studies, especially when researchers need to understand the influence of
a policy on outcomes achieved within complex settings, such as educational institutions. Such
complex studies often require that researchers use a combination of quantitative and qualitative
sampling techniques (Kemper et al., 2003).
Concerning sampling strategies, Goetz and LeCompte (1984) called for criterion-based
sampling to “establish the criteria, bases, or standards necessary for units to be included” in the
research study (p. 77). This idea of criterion-based sampling is similar to purposeful or purposive
sampling designs that are often used to select a sample to attain representativeness or comparability
in a study (Patton, 1980; Teddlie & Yu, 2007). However, both strategies offer more generalized
guidelines than those used in our study. That is, we did not select what Patton (1980) described
as the most extreme or deviant, typical cases, critical cases, or politically sensitive cases. Other
site selection criteria include choosing sites that have high experience levels of the phenomenon
under study and choosing sites that increase the chance for negotiating access (Pettigrew, 1990).
Some of these factors played an indirect role in, but did not drive, our selection process.
No universal rule exists regarding the number of sites to select for multisite case studies
(Axinn & Pearce, 2006). Yin (2009) observed that the number of cases depends on both literal
replication (the amount of certainty desired concerning the research findings) and theoretical
replication (the extent to which external, contextual factors shape research findings and how
many cases are needed to reflect this variety). Such decisions are directly related to the idea that
case study research is not meant to be generalizable in the positivist, statistical sense of the word.
Thus, the traditional concept of random sampling typically does not apply to multisite case
studies (Yin, 2009). Rather, a purposive sampling strategy generally is used to select the best
sites possible, given the research goals and questions.
Our mixed methods sampling strategy represents a unique integration of quantitative and
qualitative methods at the sampling (i.e., site selection) phase of our mixed methods study
(O’Cathain, Murphy, & Nicholl, 2007). Within the context of our overarching parallel mixed
methods study, we developed a four-stage nested mixed methods sampling strategy following the
principles of a pragmatic sequential mixed methods approach. With this strategy, one type of data
informs the collection of another type of data in a subsequent stage (Mertens, 2010). The
remainder of this article describes our mixed methods site selection strategy in more detail.
Using Mixed Methods to Select Sites for Our Multisite Case Study
As is the case in most multisite case studies, we faced the challenges of deciding how to select
study sites (i.e., high schools) and identifying relevant criteria for selecting those sites. Budgetary
and time restraints limited the number of sites that could be studied, thus making it even more
important to choose sites in a way that would allow us to learn as much as possible about state
policy implementation and POS under differing school conditions.
Collins et al. (2007) listed major sampling schemes frequently employed in mixed methods
research. A random selection of high schools might have yielded schools that were all similar on policy implementation level or on other characteristics that would best be varied in order to address the research questions. Instead, we used a multistage, mixed methods sampling
design to select a sample that characterizes the population of interest so we could better analyze
the impact of the policy on students and schools.
Similar to Wells, Hirschberg, Lipton, and Oakes’s (1995) study of school detracking efforts
and Teddlie and Stringfield’s (1993) research on school effects, we were interested in selecting a
sample of schools that exhibited variety on primary variables of interest. In particular, we wanted
to assure variation on critical variables shown in past research to influence the implementation
of school reforms together with other variables that were perceived from the outset to have
potential influence on outcomes. Following Axinn and Pearce (2006), we sought to identify “all
factors believed to produce initial characteristics or conditions in a nonrandom way. These
measures can then be used in sophisticated statistical models to simulate random assignment of
initial conditions” (p. 161). We used a dual approach to identify key variables—hypothesizing
some early on and allowing others to emerge through data collection.
In our site selection plan, we aimed for variation across sites in actual level of policy
implementation. Based on policy guidelines provided to schools,1
the study team identified the
most salient initiatives for high schools and grouped them into the following six key facets
around which to measure policy implementation: (a) identification of and assistance for students
who are at high risk for dropping out of school; (b) integration of rigorous academic and career-
focused curricula, organized into career clusters and majors; (c) increased counselor roles in
education and career planning; (d) implementation of evidence-based high school reform; (e)
facilitation of local business–education partnerships and resource dissemination; and (f)
articulation between kindergarten through 12th grade and higher education.
Coupled with a desire for obtaining variety in levels of policy implementation, we aimed to
include schools from a diversity of contexts that are important for understanding educational reform
efforts such as EEDA. Thus, our sample included high schools that varied across several policy-
relevant factors including industry-related variables, availability of community and economic
resources, and level of implementation of the statewide policy. Defining these variables before data
collection allowed us to discuss in advance how to operationalize the contextual variables relevant
for our study. Focusing on these contextual variables during our site selection process enabled us to
connect them to student and school outcomes, a consideration that has become increasingly
important in school effectiveness research (Teddlie, Stringfield, & Reynolds, 2000; Wimpelberg,
Teddlie, & Stringfield, 1989) and other kinds of educational studies (Sammons, 2010).
Our Mixed Methods Site Selection Strategy
In using a mixed methods approach for site selection, we considered several factors important for
our study of EEDA implementation and the policy’s impact on POS and student outcomes. First,
sample selection and study data collection were narrowed to high schools as the penultimate sites
of EEDA efforts, even though EEDA is a kindergarten through college initiative with implications
beyond that in terms of further education/training and community partnerships required for
successful implementation. Second, the team chose to limit the sampling frame to those high
schools considered to be “traditional” high schools that included only Grades 9 through 12.2
Among the high schools listed on the South Carolina Department of Education website (SDE;
2010), there were more than 150 schools that we defined as traditional. As a practical matter, the
team was unable to include all these high schools in our sample. We chose to include eight high
schools in our study sample so that we could conduct our study within time and budget constraints,
yet effectively answer our research questions. Our final sample size of eight schools was within
the number (4-12 sites) suggested by Teddlie and Tashakkori (2009) for mixed methods multisite
case studies.
As an alternative to selecting experimental and control schools and to provide a measure of
control over various factors that might affect the study at sample schools, our sampling design
followed the MaxMinCon strategy (Kerlinger, 1986; Tashakkori & Teddlie, 1998). South
Carolina is geographically divided into 12 Workforce Investment Area (WIA) regions (Figure 1).
The team sampled to Maximize differences among WIAs (e.g., communities offering differing
economic opportunities) and schools within WIAs (on level of EEDA implementation based on
the six key facets previously listed). Furthermore, the team chose to Minimize differences
between schools within WIAs on student background characteristics and district support for the
schools, and Control for as many extraneous variables as practical, so as to minimize error
variability. Although anchored in a postpositivist tradition, the MaxMinCon methodology is also
reflective of the pragmatic goals of generalizability, contextuality, and relevance (Tashakkori &
Teddlie, 1998). This strategy led us to a four-stage sequential school selection process that we
describe in more detail below. Figure 2 provides an illustration of the sampling strategy used for
site selection.
Stage 1: Representing Regional and Industrial Diversity
All high schools could, in theory, offer a wide range of POS options. In practice, schools may
have chosen to offer specific POS best matched to the careers most likely to be available to
students in their region. Hence, in the first stage of sampling, we introduced controls for economic
and industry conditions that might affect the availability and development of business partners
for POS and work-based learning opportunities and career-specific education and employment
opportunities. Local and regional economics would likely influence policy implementation since
the reform model is career-focused and is intended to be linked and relevant to local labor
markets and industries. We also wanted to control for a school’s local economic conditions so
that we could compare policy implementation for schools facing similar labor market and
economic conditions and contrast schools from different local conditions.
We used industry-related (private and government) information for 10 primary industries in
each of the 12 WIAs in South Carolina (South Carolina Employment Security Commission,
2008). As a part of EEDA, a Regional Education Center (REC) is being developed in each WIA
to serve as a hub for the region’s training and education resources. The Regional Education
Centers will help to facilitate business–education partnerships, coordinate workforce education
programs, and promote community involvement. Thus, we considered the WIA as an economic
entity focused on a somewhat distinctive industry mix.
Figure 1. Workforce Investment Areas (WIAs) in South Carolina
Note: From South Carolina Employment Security Commission, “Spotlights: WIA Profiles.” Retrieved from http://www.sces.org/lmi/spotlights/WIA/
Industry employment data, averaged within each WIA, were used in a quantitative chi-square analysis to explore the association between WIAs and industry employment. We used this analysis to establish the statistical justification for selecting WIAs based on concentrations of workers in major state industries. Results indicated a significant association (at a significance level of .05) between WIA and industry employment, χ²(33, N = 1,116,799) = 108,200.70, p < .01.
Three WIAs were identified in which employment for one of the top five South Carolina industries (trade, transportation, and utilities; government; manufacturing; leisure and hospitality; and professional and business services) was significantly greater than expected, and one WIA was identified where employment in two of the top five industries was significantly greater than expected. We selected these four WIAs so that we could make comparisons among, and within, WIAs. Fifty-nine high schools that met our school criteria were located in these four WIAs.

Figure 2. Sampling strategies and resulting number of schools selected at each stage of the sampling design. Stage 1, representing regional and industrial diversity: chi-square analysis to select 4 WIAs (59 schools). Stage 2, selecting school clusters based on level of available economic resources: hierarchical cluster analysis on selected economic measures (31 schools). Stage 3, ranking high schools on EEDA implementation level: quantitative and qualitative data collection to rank schools on PSLOI; 16 high- and low-ranked schools were invited to participate in the study, with 10 agreeing to participate. Stage 4, validation of policy implementation level and variation on key school characteristics: implementation validation site visits to select the final sample of 8 schools.
Note: Near the end of Stage 3, one cluster of schools declined to participate in the study. A substitute cluster of 12 schools was selected using the Stage 2 process. Stage 3 procedures were then applied to this new group to reach a revised grouping of 10 schools at the end of Stage 3.
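To make the Stage 1 computation concrete, the following minimal sketch runs a chi-square test of association and inspects standardized residuals in Python with scipy. The study itself reports using SAS, and the counts, table dimensions, and residual cutoff here are invented for illustration, not taken from the study.

```python
# Minimal sketch of the Stage 1 analysis: test the WIA x industry association,
# then flag cells with greater-than-expected employment via standardized
# residuals. All counts are invented placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: 12 WIAs; columns: 10 primary industries (employment head counts).
rng = np.random.default_rng(0)
counts = rng.integers(1_000, 40_000, size=(12, 10))

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}, N = {int(counts.sum()):,}) = {chi2:.2f}, p = {p:.4g}")

# Pearson (standardized) residuals: cells above +1.96 suggest employment
# significantly greater than expected at the .05 level.
resid = (counts - expected) / np.sqrt(expected)
for wia, industry in np.argwhere(resid > 1.96):
    print(f"WIA {wia + 1}: industry {industry + 1} employment above expectation")
```

Inspecting residuals cell by cell, rather than relying on the omnibus test alone, is what allows particular WIAs to be singled out for their concentrations in specific top industries.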
Although 59 high schools would have been a more manageable sample than the original
sampling frame of more than 150 traditional 4-year high schools in South Carolina, it still would
have been difficult for us to collect timely and appropriate data from such a large sample during
the study period. The schools in the four WIAs also varied across other study-relevant factors,
including local economic conditions, thus making it difficult to discern in our final analysis
whether the impact of the legislation was due to policy or economic conditions. For this reason,
a second quantitative sampling stage was used to select schools of varying local economic
conditions across the WIAs and with similar economic conditions within the WIAs.
Stage 2: Selecting School Clusters Based on Level of Available Economic Resources
During the second sequential stage of school selection, we used hierarchical cluster analysis to
cluster schools within each of the four selected WIA regions based on the level of selected local
economic measures (i.e., measures that were closer to the school level of analysis). Research
indicates that community resource and poverty levels can influence a school’s ability to
implement change (Bryk, Sebring, Allensworth, Luppescu, & Easton, 2009; Teddlie & Reynolds,
2000). The impact of economic resources on student and school outcomes and on successful
implementation of school reforms is also well documented. Balfanz and Legters (2004) found
that urban, low-income high schools, or so-called dropout factories, produced the highest
percentages of dropouts. Dropout rates are also higher in impoverished communities (Rumberger,
2001), and some links have been found between dropout rates and employment rates. Schools
with higher concentrations of lower income students tend to have higher dropout rates (Rumberger, 1995).

Figure 3. Clustering of 59 schools using data representing local economic conditions
Note: For each WIA (WIA 1: rural; WIA 2: rural; WIA 3: urban; WIA 4: urban), the figure shows the count of schools falling in the high-poverty cluster and the mid/low-poverty cluster, and indicates which cluster was selected.
We used the following local economic measures to cluster schools within the WIAs: per
capita income by postal codes of all students enrolled in each school, a school poverty index
based on the percentage of students eligible for Medicaid or qualified for free and/or reduced
price lunch by school, the percentage of families in poverty with children below the age of 18
years by postal code, and the percentage of civilian unemployment by postal code. Most of the 59
schools in our set of potential sites did not draw from specific postal codes, that is, the postal
delivery zones did not align with attendance zones. Therefore, for each potential site, we acquired
a data set of postal codes of all students enrolled for the most recent school year, then applied a
weight to each postal code for each school according to the proportion of students from each
postal code. These weights were then applied to the 2000 Census postal code data so that the data
were representative of the student populations at the schools (U.S. Bureau of the Census, 2000).
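As a minimal sketch of this weighting step (in Python; the postal codes, enrollment counts, and census figures below are invented placeholders, not the study's data), a school-level measure is computed as an enrollment-weighted average of postal-code-level census values:

```python
# Minimal sketch of the Stage 2 measure construction: a school-level economic
# measure is the enrollment-weighted average of postal-code-level census
# values. All names and figures are invented placeholders.

# Students enrolled at one school, by 5-digit postal code.
enrollment = {"29631": 412, "29634": 95, "29630": 193}

# 2000 Census values by postal code for two of the four measures.
census = {
    "29631": {"per_capita_income": 18_900, "pct_unemployed": 4.2},
    "29634": {"per_capita_income": 15_300, "pct_unemployed": 6.8},
    "29630": {"per_capita_income": 21_400, "pct_unemployed": 3.5},
}

total = sum(enrollment.values())
school_measures = {
    m: sum(enrollment[z] / total * census[z][m] for z in enrollment)
    for m in ("per_capita_income", "pct_unemployed")
}
print(school_measures)  # weighted averages representative of the student body
```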
A hierarchical cluster analysis was performed within each WIA using SAS v. 9.2 (SAS Institute Inc., 2008). In this analysis, each observation begins in its own cluster, and the two closest clusters (based on the squared distance between the cluster averages) are grouped together. This merging continues until only one cluster remains. Figure 3 illustrates the clustering of the 59 schools on the four local economic measures. We clustered schools in each WIA into one of two clusters: either high or low-to-moderate (mid/low) poverty. Clusters selected from two WIAs (one with more urban areas and one with little or no urban areas) included high-poverty schools, and clusters selected from the other two WIAs (one with more urban areas and one with little or no urban areas) included low-to-moderate poverty schools. Demographics for the four selected school clusters are shown in Table 2. Thirty-three eligible high schools were contained in these four clusters. Two schools were removed from the sampling frame due to excessive missing data for the third stage,3 leaving 31 schools from which to select our final sample.

Table 2. Demographic Characteristics of School Clusters

                                          High-Poverty Clusters                Low-to-Moderate Poverty Clusters
Demographic Factor                        Rural WIA          Urban WIA         Rural WIA          Urban WIA
Average per capita income (1999)a         $15,521            $19,752           $19,128            $24,268
Range in per capita income (1999)a        $13,486-$18,156    $16,305-$23,034   $18,638-$19,758    $21,505-$29,223
Average school poverty index
  (2004-05, 2005-06, 2006-07)b            74%                54%               50%                35%
Range in school poverty index
  (2004-05, 2005-06, 2006-07)b            50%-92%            36%-83%           47%-57%            13%-51%
Range in percent unemployment (1999)c     5%-12%             3%-7%             4%-5%              2%-7%

Note: WIA = Workforce Investment Area.
a. A local per capita income figure was derived for each school using weighted 5-digit postal code data (weighted by postal code residence data for students enrolled in each school) from the U.S. Census Bureau, 2000 Census of Population and Housing, Summary File 3 (SF3), Sample Data, Table P82, Per Capita Income in 1999 (Dollars), Universe: Total population. The list of postal codes used to get weighted averages of all census data for schools came from the South Carolina Department of Education, Office of Data Management and Analysis (personal communication, September 25, 2008).
b. This is school-level data published in the South Carolina Department of Education State of South Carolina Education Accountability Act report cards (South Carolina Department of Education, 2005, 2006a, 2007), available online at the South Carolina Department of Education website. The poverty index is a measure of the percentage of students at each school eligible for Medicaid or qualified for free and/or reduced lunch.
c. A local percentage of civilian unemployment figure was derived for each school using weighted 5-digit postal code data (weighted by postal code residence data for students enrolled in each school) from the U.S. Census Bureau, 2000 Census of Population and Housing, SF3, Sample Data, Table P43, Sex by Employment Status for the Population 16 Years and Over, Universe: Population 16 years and over.
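The sketch below shows how such a within-WIA clustering might look in Python with scipy rather than SAS; the data are invented, and the standardization step and centroid linkage are our assumptions based on the merging rule described above.

```python
# Minimal sketch of the Stage 2 clustering: agglomerative clustering of the
# schools in one WIA on the four local economic measures, cut into two
# clusters (high vs. mid/low poverty). All values are invented placeholders.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import zscore

# Rows: schools in one WIA; columns: per capita income, school poverty index,
# % families with children in poverty, % civilian unemployment.
X = np.array([
    [15_500, 0.78, 0.31, 0.09],
    [16_200, 0.74, 0.28, 0.08],
    [21_900, 0.45, 0.15, 0.04],
    [23_400, 0.38, 0.12, 0.03],
    [18_100, 0.60, 0.22, 0.06],
])

# Standardize so income (in dollars) does not dominate the distances.
# Centroid linkage merges the two clusters whose averages are closest,
# one reading of the squared-distance-between-averages rule above.
Z = linkage(zscore(X, axis=0), method="centroid")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # cluster membership (1 or 2) for each school
```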
Stage 3: Ranking High Schools on EEDA Implementation Within Each Selected WIA Cluster
Researchers in multisite case studies have frequently selected schools based on varying levels of
implementation or exposure to a particular policy or practice (e.g., Burgess, Pole, Evans, & Priestly, 1994). Several studies provided strong support for including prior level of implementation
as a selection variable for our study (Stallings & Kaskowitz, 1974; Stringfield, Millsap, &
Herman, 1997; Datnow, Borman, Stringfield, Overman, & Castellano, 2003). Selecting schools
that exhibit a range of implementation levels helped us avoid a challenge that frequently arises
in multisite studies: selecting only schools that are exemplary or selecting only schools that
exhibit low levels of implementation (Christ, 2007; Wolf, Borko, Elliott, & McIver, 2000).
Pettigrew (1990) recommended identifying “polar sites” for study, whereby researchers select
cases that illustrate high and low performance on the indicator of interest.
In the third sequential stage of sample selection, the research team gave schools preliminary
site selection level of policy implementation (PSLOI) scores based on available data from the
2007-2008 school year (the initial year of the study) on the six measures previously described.
Because visiting all 31 schools was not practical, we used data from the SDE and school and
district websites to assist in formulating the PSLOI scores. The PSLOI scores allowed us to
consider relevant contextual factors during sample selection.
The research team collected both quantitative and qualitative data on school EEDA
implementation to obtain the PSLOI for each school. Survey data were collected from a state-
mandated questionnaire on guidance activities and an SDE high school reform needs
questionnaire. We acquired additional SDE data on each school’s progress in policy
implementation. Schools are required to inform staff, parents, and students about EEDA, and
many choose to do so through school websites. An instrument was developed to analyze content
of materials, text, and catalogs available on school and district websites about curriculum, course
selection, registration, programs, guidance personnel, and materials for parents and students. The
content analysis consisted of conceptual analysis of materials to identify the common and most
consistent themes as they related to the research questions and the policy itself. We looked at how
closely these materials met the state’s standard policy format and how much EEDA information
schools provided on their websites.4
We used the constant comparative method of analysis
(Glaser & Strauss, 1967), comparing the various sources of data, simultaneously coding and
analyzing the data as we progressed. The content analysis instrument was tested on three schools until 100% agreement among three reviewers was achieved; it was then used to analyze the content of each school’s website. We also
contacted schools and districts to collect missing data so as not to bias PSLOI simply due to
ineffective or poor school or district websites.
From the SDE data sources and the website analyses, we initially identified 63 possible data
points that could be used to rank schools on our six identified key policy facets. After more
in-depth review of each data point, some were found to duplicate content measured by other data
points whereas others seemed unreliable (e.g., a survey question was unclear and responses
varied widely). We chose 41 of the possible 63 data points to include in our scoring. A final
coding scheme was devised for all the data used.5
Schools that had more advanced implementation
of the state policy across the six identified facets received higher PSLOI scores, and schools that
had less advanced implementation across the six facets received lower PSLOI scores.
Schools were then rank ordered within clusters on the PSLOI scores, with the goal of
identifying one school with a high level and one school with a low level of implementation from
each cluster. Figure 4 illustrates the range in PSLOI scores across the schools within WIAs. If
two or more schools had similar PSLOI scores, we considered other factors such as school size,
urbanicity of the school (as defined by the National Center for Education Statistics, 2002), and
minority enrollment to ensure that a diverse array of schools was chosen. Sixteen high schools
(two with high PSLOI and two with low PSLOI scores in each of the four clusters) were invited
to receive preliminary site data validation visits and to possibly participate in the study.
Of the initial 16 schools (schools marked by solid black lines in Figure 4, with “F” in the
school name to signify “first round” picks) contacted, nine agreed to receive visits, whereas the
remaining schools either did not respond to repeated contacts or declined to participate (Figure
4, “D” in the school name to signify “declined”). One of these nine schools, School 4FD (Figure
4), was removed from the sampling frame when the structure of the high school was modified so
that it no longer met our definition for inclusion in the study. In the WIA with the fewest schools
accepting our invitation to participate in the study (WIA3), we invited two substitute high schools
with PSLOI scores similar to those of the schools not participating to receive validation visits (Figure 4, “S” in the school name to signify “substitutes”). One of the two substitute schools agreed to participate and was included in Stage 4.

Figure 4. Preliminary site selection level of policy implementation (PSLOI) scores for the 43 high schools considered for inclusion in the sample
Note: The 16 original schools invited to participate are shown with solid black bars; WIA4 schools declined to participate, as did several other schools (labeled with “D” in school names). Substitute schools invited to participate are shown with striped bars (and labeled with “S” in school names). The eight schools selected for the study have stars above their bars. In all, 43 schools (the 31 across the original four clusters plus 12 in the new “replacement” WIA3 high-poverty cluster) were given PSLOI scores and considered for inclusion in the study.
a. Schools are numbered in order of PSLOI by WIA cluster. Letters following the numbers in school names correspond to the following codes: F = one of the first 16 schools chosen; V = visited but not selected; D = declined to participate, did not conform to criteria, or never responded to invitation; S = substitute school; N = school from the new WIA3 high-poverty cluster invited to participate.
Toward the end of Stage 3, schools in one WIA cluster declined to participate in our study
(WIA4 in Figure 3) due to their time, budgetary, and research circumstances. To compensate for
losing WIA4, the high-poverty cluster from the remaining urban group from the Stage 2 sampling
frame was added as a substitute cluster. We contacted three schools from the substitute cluster
(WIA3, urban, high-poverty). One of these three schools agreed to participate and was also
included in Stage 4. Thus, at the end of Stage 3, we had identified 10 schools to receive site visits
in Stage 4.
Stage 4: Validating Policy Implementation Level and Variation on Key School Characteristics
The fourth stage of the sampling scheme involved site visits to validate the qualitative and
quantitative data collected in the prior stages. We visited the schools to verify that the PSLOI
scores generally reflected the reality at the schools and to determine the qualifications of the
school for inclusion in the final study sample. The 10 site visits were scheduled with the assistance
of site administrators. The research team met with key school personnel including principals,
assistant principals, guidance directors, guidance counselors, and teachers to verify the scores.
Interviews with each individual or group were 30 minutes to 1 hour in length. We asked about
EEDA implementation, the stage of development of the high school’s majors and career pathways,
and the operational details of the IGP development process. We asked guidance directors and
guidance personnel to describe their specific roles in policy implementation; the ways in which
they work with students, teachers, and parents on career development; and the amount of time
they devoted to these activities.
We conducted one to three focus groups with 9th and 10th grade teachers at each school. Each
group included three to six teachers from different concentration areas, including math, English,
social studies, science, career and technical education, honors, advanced placement (courses
through which students have the opportunity to earn college credit), college preparation, and
basic and special education courses. Focus groups lasted from 45 minutes to 1 hour each.
Teachers were asked to discuss their perceptions of school implementation of the various
components of EEDA, including career-focused activities and curricula, the progress made in
implementation, and the impacts of the reform on their work generally and specifically on how
they teach their courses (Smink et al., 2010).
From the site visit interviews and observations, the team was able to substantiate the initial
implementation selection scores and revise where necessary. The PSLOI scores were updated
from verified information gathered during the validation site visits. The new scores (called site
selection level of policy implementation scores; SLOI) were used for final sample selection and
will also be revised for use at the end of the study to compare the change in level of policy
implementation over time. In addition to SLOI scores, we considered other information for our
final selection of study sites, including school staff opinions on policy implementation, the
school’s interest in participation in the study, and the school’s cooperation in providing materials.
The final eight high schools selected varied in terms of the level of EEDA implementation
(high and low-to-moderate) and levels of poverty, urbanicity, and industry characteristics (as
characterized by location within a particular WIA). As a result, in our subsequent analyses, we
will be better able to make important comparisons between and among schools, based on the
characteristics of the schools. We will also be able to obtain a wide variety of information to
understand better how these and other factors influence state policy implementation and
subsequently other outcomes of interest.
Discussion
Case studies are characterized by their multilevel, multidimensional nature. Such
research studies naturally evolve over time, as do the contexts and sites themselves. Schools are
complex and hierarchical in nature, with multiple interrelated levels, including students,
classrooms, schools, and districts. A number of factors about our study and the settings we
explored led us to a mixed methods approach not only for data collection and analysis to address
our research questions but also to select our study sites so that we could investigate these
questions. Multiple vantage points and data sources are necessary to better understand the
complexity of these educational settings. Such complexities inevitably invite reflection on how
we framed our research, designed the overall study, and developed our site selection strategy.
We assume that mixed methods have been used to select participants or cases for other mixed
methods research studies, but few authors have described the specifics of their sampling
strategies. Specifically, we used quantitative analyses to select four WIA regions (Stage 1) and to
cluster schools from these regions by selected local economic measures to select high and low-
to-moderate poverty schools (Stage 2). During the third and fourth stages of sampling, we not
only used qualitative and quantitative data but also qualitative and quantitative methods to obtain
scores to better rank and compare schools for site selection. This mixed methods sampling design
was crucial to help us address the research questions for our study, a reflection of our pragmatic
philosophical stance to the study design.
Our study was influenced by several factors common to policy implementation studies. For
example, the primary goal of Stages 3 and 4 was to assess whether schools were meeting EEDA
mandates as of the date we selected our sites. Since we aim to examine the implementation and
impact of a mandated state policy on school and student outcomes, we realized that there would
be both official reports of the implementation process, presented so that the school appears to be following mandates, and firsthand data about the actual implementation process at schools. Such
data may show that the policy is not being fully implemented or is not implemented as required.
Policy implementation research must also account for the various individuals who need to
implement the policy and the fact that implementation is filtered down through the hierarchy. At
the school level, the policy must be interpreted by administrators and implemented by counselors and teachers, and it requires student participation.
The mixing of quantitative and qualitative data sources during the sequential site selection
process allowed us to corroborate the various sources of information and to accommodate
multiple viewpoints on initial levels of policy implementation. Comparing questionnaire results
with school archival data and following up with school staff were essential for checking the
quality of various data sources used for site selection. This corroboration increased our confidence
in the combined data to address our research questions, an important consideration in mixed
methods research (Creswell & Plano Clark, 2007).
Pragmatism provided the essential framework for our research design and our site selection
methodology. Johnson, Onwuegbuzie, and Turner (2007) observed that mixed methods research
“should be used when the nexus of contingencies in a situation, in relation to one’s research
question(s), suggests that mixed methods research is likely to provide superior research findings
and outcomes” (p. 129). In the context of our complex multisite case study, we were particularly influenced by the
notions that (a) there are multiple routes to knowledge, (b) as policy researchers we should make
“warranted assertions” rather than ultimate claims of truth, and (c) theories are important for
predicting and explaining change, rather than being viewed as “true” or “false” (Johnson &
Onwuegbuzie, 2004). That is, the complex and multilevel nature of our longitudinal case study
required a philosophical stance that recognizes that research is situated and purposeful (Scott &
Briggs, 2009).
As described earlier, four pragmatic principles influenced our logic of inquiry (methodology)
and our data collection techniques (method)—utility, contextual relevance, generalization, and
the use of interdisciplinary research teams. These principles were enacted throughout our four-
stage site selection process. Because this process was ultimately guided by our desire to learn
more about a complex policy initiative, it was important for us to consider how the data would
help us learn about the impact of the legislation and school-based policies related to POS. Thus,
by using a pragmatic framework, we recognized the need to obtain a more comprehensive picture
of policy implementation (e.g., by quantifying the varying implementation levels in Stage 3) and
to learn more about differences between sites in policy implementation. In studies of policy
implementation, the researcher will not always be aware of all the contextual factors that
influence policy implementation. By conducting interviews and focus groups with school staff
during Stage 4, we were able to consider aspects of implementation that were not apparent from
the review of official policy guidelines and data. We were also able to appraise contextual
influences that challenged and/or altered policy implementation at school sites.
From a pragmatic perspective, the economic context is particularly relevant and was thus
explicitly considered during the first and second sequential stages of sampling. By accounting
for these differences in our sample selection process, we can generalize research findings more
thoughtfully across regions with varying economic circumstances. As
Collins and O’Cathain (2009) explained, “the researcher’s choice of sampling design impacts the
legitimation of the researcher’s inferences and the appropriate generalization of results” (p. 5).
In terms of generalizing results, when random sampling is not possible, researchers
should select sites that vary across policy implementation levels and should control for major
contextual variables. By combining quantitative and qualitative components in the sampling
scheme, we achieved a balance of schools across the state on initial level of policy implementation,
industry mix, local economic conditions, and incidentally on location (urbanicity) and school
size. Our mixed methods sampling scheme will allow us to draw comparisons and contrasts
across several dimensions that are important for addressing our research questions, including the
level of policy implementation and the availability of various community resources. This mixing
of sampling procedures will help us increase internal validity and trustworthiness and the
generalizability/transferability of results (Kemper et al., 2003). Our site selection strategy
increases our ability to Maximize variance, at least initially, on the issues of greatest policy
interest, Minimize differences in student background characteristics, and Control for many
extraneous variables (MaxMinCon).
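To make the MaxMinCon logic concrete, the sketch below shows one minimal way to pick sites that span the implementation range within each economic stratum. The data, field names, and selection rule are hypothetical illustrations, not the study's actual four-stage procedure, in which implementation levels came from questionnaire and archival data (Stage 3) and were then confirmed through staff interviews and focus groups (Stage 4).

```python
# Hypothetical sketch of the MaxMinCon selection idea: within each
# economic stratum, keep the lowest- and highest-implementing candidate
# school, so the sample maximizes variance on implementation while
# holding the economic context (the controlled variable) constant.
# All names and values below are illustrative, not the study's data.
import pandas as pd

candidates = pd.DataFrame({
    "school":       ["A", "B", "C", "D", "E", "F", "G", "H"],
    "econ_stratum": ["low", "low", "low", "mid", "mid", "high", "high", "high"],
    "impl_score":   [4, 11, 7, 3, 12, 5, 10, 8],  # higher = more implementation
})

def select_sites(frame: pd.DataFrame) -> pd.DataFrame:
    """Within each stratum, keep the minimum and maximum implementer."""
    picks = []
    for _, group in frame.groupby("econ_stratum"):
        ordered = group.sort_values("impl_score")
        picks.append(ordered.iloc[0])   # lowest implementer in the stratum
        picks.append(ordered.iloc[-1])  # highest implementer in the stratum
    return pd.DataFrame(picks)

print(select_sites(candidates))  # two sites per stratum span the implementation range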
EEDA focuses on developing students’ knowledge and abilities for high-skill, high-wage jobs
and preparing them for the modern workforce. Since the policy is statewide, its effectiveness
depends on ensuring benefits to students in all communities, regardless of levels of resources.
This requires a better understanding of the influence of community-level poverty on educational
outcomes. This is especially the case in South Carolina, where educational inequities could
potentially influence EEDA implementation (Kuczera, 2011). Thus, in the spirit of previous
pragmatic approaches to school effectiveness and school improvement research (Tashakkori &
Teddlie, 2003b), our site selection strategy accounted for varying levels of poverty (during Stage
2) so we could ultimately learn about the influence of community resources on study outcomes.
Our site selection process provides a strong foundation for our subsequent mixed methods
data collection and analytical procedures that capitalize on the benefits of these approaches. The
practical utility of pragmatism allowed us to incorporate both quantitative and qualitative
methods into our sampling strategy with the goal of ensuring that our research is practical,
contextual, responsive, and consequential (Datta, 1997). As our study progresses, and at the
conclusion of our study, the sharing of the practical consequences of our methodological and
methods decisions should prove beneficial to the mixed methods community (Scott & Briggs,
2009). We anticipate that the site selection process described in this article will enable other
researchers to think more purposefully about their selection of sites for mixed methods studies,
whether these sites are schools or other organizations. Specifically, if random assignment or
selection cannot be achieved or is inappropriate, multistage, mixed methods sampling designs
such as ours may be used to select participants and/or sites. The rigor associated with such
strategies can help researchers ultimately gain more valuable information about policy
implementation across a range of settings.
Authors’ Note
Julia Sharp and Catherine Mobley are the primary authors. Sam Drew and Cathy Hammond are co–principal
investigators on the project. The research study team consists of all listed authors.
Acknowledgments
We would like to thank Marty Duckenfield and Peg Chrestman for their careful review of the article. We
would also like to thank the JMMR editors and four anonymous reviewers whose comments and suggestions
have significantly strengthened this article.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or
publication of this article:
The work reported herein is being completed through the National Dropout Prevention Center, Clemson
University, and is supported under the National Research Center for Career and Technical Education,
University of Louisville, PR/Award (No. VO51A070003) as administered by the Office of Vocational and
Adult Education, U.S. Department of Education. However, the contents do not necessarily represent the
positions or policies of the Office of Vocational and Adult Education or the U.S. Department of Education
and you should not assume endorsement by the Federal Government.
Notes
1. South Carolina Technical College System series, How EEDA works for South Carolina, including: An
educator’s guide to develop and implement the EEDA curriculum framework and Individual Graduation
Plan (2006a) and An educator’s orientation guide to the Education and Economic Development Act
(2006b); and South Carolina Department of Education, South Carolina Education and Economic
Development Act guidelines (2006b).
2. The following schools were excluded from our sampling frame: schools that are kindergarten through
12th grade, 6th grade to 12th grade schools, schools with grade levels other than 9th grade to 12th
grade, vocational/career centers, magnet schools, charter or lab schools, and alternative schools.
3. Eighteen of the 33 schools had missing data, but we were able to contact all but two to obtain the data.
The two excluded schools were unresponsive to our requests for data: neither had submitted data on
enrollment and types of at-risk programs, and neither had completed a questionnaire that would have
provided guidance counselor and career specialist information and more details on at-risk efforts and
whole-school reform. One of the two schools with excessive missing data had also failed to submit
a state-mandated guidance report that would have provided details on implementation of and participation
in policy activities. Also, this school’s course catalog could not be located online.
4. SDE provided standardized EEDA materials to all schools on a statewide website and through regional
training sessions. EEDA guidelines stipulate that schools must use the standardized form for IGP
development. Additionally, all schools were required to use the 16 federally defined career clusters for
reporting to the state but were allowed to modify the clusters (the names and what types of subjects
were included under each) for school use and to choose their own majors for each cluster. Most schools
moved to the standard IGP format in their course registration materials in the first year of EEDA, but
not all were using this format at the time we were selecting schools and reviewing catalogs online.
5. Most survey data were already scaled appropriately for our purposes (e.g., yes = 1, no = 0; or a range
0-5) with higher values indicating higher implementation; for data where a higher score would indicate
lower implementation, the scale was reversed. Some raw data were in a form that would result in one
question carrying more weight than another. For example, for the percentage of 9th graders with
a complete IGP, the range was 2 to 100 with a median of 96. Giving this data point a value of 96
compared with another data point with values from 0 to 5 would give too much weight to the first data
point. In such cases, responses were categorized into three groups with scores ranging between 0 and
2. See Smink et al. (2010) for more details about this scoring process; a simplified sketch of the
rescaling logic follows these notes.
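As a rough illustration of the rescaling described in Note 5, the sketch below reverse-codes an item on which a high raw value signaled lower implementation, and bins a wide-range percentage into three 0-2 categories so that no single item dominates a summed score. The variable names, cut points, and values are hypothetical assumptions; the study's actual scoring rules are documented in Smink et al. (2010).

```python
# Hypothetical sketch of the Note 5 rescaling: reverse-code items where
# high raw values meant LOW implementation, and bin a 2-100 percentage
# into three 0-2 groups so it carries weight comparable to 0-5 items.
# Cut points and variable names are illustrative, not the study's rules.
import pandas as pd

raw = pd.DataFrame({
    "uses_standard_igp": [1, 0, 1],          # yes = 1, no = 0 (already scaled)
    "barriers_reported": [5, 1, 3],          # 0-5; higher = more barriers, so lower implementation
    "pct_9th_with_igp":  [42.0, 96.0, 99.5], # wide range would otherwise dominate a sum
})

scored = pd.DataFrame()
scored["uses_standard_igp"] = raw["uses_standard_igp"]
# Reverse the scale so that higher always means higher implementation.
scored["barriers_reported"] = 5 - raw["barriers_reported"]
# Categorize the percentage into three ordered groups scored 0, 1, 2.
scored["pct_9th_with_igp"] = pd.cut(
    raw["pct_9th_with_igp"], bins=[0, 90, 97, 100], labels=[0, 1, 2]
).astype(int)

print(scored.sum(axis=1))  # a simple per-school implementation score
```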
References
Amaratunga, D., Baldry, D., Sarshar, M., & Newton, R. (2002). Quantitative and qualitative research in the
built environment: Application of “mixed” research approach. Work Study, 51(1), 17-31.
Axinn, W. G., & Pearce, L. D. (2006). Mixed method data collection strategies. New York, NY: Cambridge
University Press.
Balfanz, R., & Legters, N. (2004). Locating the dropout crisis. Which high schools produce the nation’s
dropouts? Where are they located? Who attends them? Baltimore, MD: Johns Hopkins University
Center for Social Organization of Schools.
Biesta, G. (2010). Pragmatism and the philosophical foundations of mixed methods research. In A.
Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research
(pp. 95-117). Thousand Oaks, CA: Sage.
Bryk, A., Sebring, P. B., Allensworth, E., Luppescu, S., & Easton, J. Q. (2009). Organizing schools for
improvement: Lessons from Chicago. Chicago, IL: University of Chicago Press.
Burgess, R. G., Pole, C. J., Evans, K., & Priestley, C. (1994). Four studies from one or one study from four?
Multisite case study research. In A. Bryman & R. G. Burgess (Eds.), Analyzing qualitative data (pp.
129-145). London, England: Routledge.
Christ, T. W. (2007). A recursive approach to mixed methods research in a longitudinal study of postsecondary
education disability support services. Journal of Mixed Methods Research, 1(3), 226-241.
Collins, K. M. T., & O’Cathain, A. (2009). Ten points about mixed methods research to be considered by the
novice researcher. International Journal of Multiple Research Approaches, 3(1), 2-7.
Collins, K. M. T., Onwuegbuzie, A. J., & Jiao, Q. G. (2007). A mixed methods investigation of mixed
methods sampling designs in social and health science research. Journal of Mixed Methods Research,
1(3), 267-294.
Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand
Oaks, CA: Sage.
Datnow, A., Borman, G. D., Stringfield, S., Overman, L. T., & Castellano, M. (2003). Comprehensive
school reform in culturally and linguistically diverse contexts: Implementation and outcomes from a
four-year study. Educational Evaluation and Policy Analysis, 25(2), 143-170.
Datta, L. (1997). A pragmatic basis for mixed-method designs. In J. C. Greene & V. J. Caracelli (Eds.),
Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms
(New Directions for Evaluation, No. 74, pp. 34-36). San Francisco, CA: Jossey-Bass.
Feilzer, M. Y. (2010). Doing mixed methods research pragmatically: Implications for the rediscovery of
pragmatism as a research paradigm. Journal of Mixed Methods Research, 4(1), 6-16.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research.
Chicago, IL: Aldine.
Goetz, J. P., & LeCompte, M. D. (1984). Ethnography and qualitative design in educational research.
Orlando, FL: Academic Press.
Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.
Greene, J. C., & Hall, J. N. (2010). Dialectics and pragmatism: Being of consequence. In A. Tashakkori &
C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (pp. 119-143).
Thousand Oaks, CA: Sage.
Howe, K. R. (1988). Against the quantitative-qualitative incompatibility thesis or dogmas die hard.
Educational Researcher, 17(8), 10-16.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time
has come. Educational Researcher, 33(7), 14-26.
Johnson, R. B., Onwuegbuzie,A. J., & Turner, L.A. (2007). Toward a definition of mixed methods research.
Journal of Mixed Methods Research, 1(2), 112-133.
Kemper, E., Stringfield, S., & Teddlie, C. (2003). Mixed methods sampling strategies. In A. Tashakkori
& C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 273-296).
Thousand Oaks, CA: Sage.
Kerlinger, F. (1986). Foundations of behavioral research. New York, NY: Holt, Rinehart & Winston.
Kuczera, M. (2011). Learning for jobs: OECD reviews of vocational and technical training: United States:
South Carolina. Paris, France: Organization for Economic Cooperation and Development.
Lieber, E. (2009). Mixing qualitative and quantitative methods: Insights into design and analysis issues.
Journal of Ethnographic & Qualitative Research, 3, 218-227.
Mertens, D. M. (2010). Research and evaluation in education and psychology: Integrating diversity with
quantitative, qualitative, and mixed methods (3rd ed.). Thousand Oaks, CA: Sage.
National Center for Education Statistics. (2002). School locale codes, 1987-2000. Washington, DC: U.S.
Department of Education.
O’Cathain, A., Murphy, E., & Nicholl, J. (2007). Integration and publications as indicators of “yield” from
mixed methods studies. Journal of Mixed Methods Research, 1(2), 147-163.
Onwuegbuzie, A. J., & Leech, N. L. (2007). Sampling designs in qualitative research: Making the sampling
process more public. The Qualitative Report, 12, 238-254.
Patton, M. Q. (1980). Qualitative evaluation methods. Newbury Park, CA: Sage.
Pettigrew, A. M. (1990). Longitudinal field research on change: Theory and practice. Organization Science,
1, 267-292.
Rorty, R. (1999). Philosophy and social hope. London, England: Penguin Books.
Rumberger, R. W. (1995). Dropping out of middle school: A multilevel analysis of students and schools.
American Educational Research Journal, 32, 583-625.
Rumberger, R. W. (2001). Who drops out of school and why. Santa Barbara, CA: University of California–
Santa Barbara. Retrieved from http://education.ucsb.edu/rumberger/internet%20pages/Papers/
Rumberger--NRC%20dropout%20paper%20version%2012%20with%20figures.doc
Sammons, P. (2010). The contribution of mixed methods to recent research on educational effectiveness. In
A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research
(pp. 697-723). Thousand Oaks, CA: Sage.
SAS Institute Inc. (2008). SAS 9.2 software, help and documentation. Cary, NC: Author.
Scott, P. J., & Briggs, J. S. (2009). A pragmatist argument for mixed methodology in medical informatics.
Journal of Mixed Methods Research, 3(3), 223-241.
Simons, H. (1996). The paradox of case study research. Cambridge Journal of Education, 26, 225-240.
Smink, J., Drew, S., Hammond, C., Withington, C., Mobley, C., Sharp, J., et al. (2010). A longitudinal
study of the South Carolina Personal Pathways to Success initiative. Louisville, KY: National Research
Center for Career and Technical Education.
South Carolina Department of Education. (2005). 2005 EAA report card Excel files: High schools [Data File].
Retrieved August 22, 2008, from http://www.ed.sc.gov/topics/researchandstats/schoolreportcard/2005/data
South Carolina Department of Education. (2006a). 2006 EAA report card Excel files: Poverty indices
[Data File]. Retrieved August 22, 2008, from http://www.ed.sc.gov/topics/researchandstats/schoolreportcard/2006/data/
South Carolina Department of Education. (2006b). South Carolina Education and Economic Development
Act guidelines. Columbia, SC: Author.
South Carolina Department of Education. (2007). 2007 State of South Carolina Education Accountability
Act report cards - data files: Poverty index [Data File]. Retrieved August 22, 2008, from
http://www.ed.sc.gov/topics/researchandstats/schoolreportcard/2007/data/
South Carolina Department of Education. (2010). South Carolina High Schools [Data File]. Retrieved
February 19, 2010, from http://ed.sc.gov/schools/allschools.cfm
South Carolina Employment Security Commission. (2008). Spotlights: WIA profiles. Retrieved from
http://www.sces.org/lmi/spotlights/WIA
South Carolina Technical College System. (2006a). How EEDA works for South Carolina. An educator’s
guide to develop and implement the EEDA curriculum framework and individual graduation plan.
Columbia, SC: Author.
South Carolina Technical College System. (2006b). How EEDA works for South Carolina. An educator’s
orientation guide to the Education and Economic Development Act. Columbia, SC: Author.
Stallings, J. A., & Kaskowitz, D. (1974). Follow through classroom observation evaluation, 1972-73: A
study of implementation. Menlo Park, CA: Stanford Research Institute, Stanford University.
Stringfield, S., Millsap, M. A., & Herman, R. (1997). Special strategies for educating disadvantaged children:
Findings and implications of a longitudinal study. Washington, DC: U.S. Department of Education.
Tashakkori, A., & Teddlie, C. (Eds.). (1998). Mixed methodology: Applying qualitative and quantitative
approaches. Thousand Oaks, CA: Sage.
Tashakkori, A., & Teddlie, C. (Eds.). (2003a). Handbook of mixed methods in social and behavioral
research. Thousand Oaks, CA: Sage.
Tashakkori, A., & Teddlie, C. (2003b). The past and future of mixed methods research: From data
triangulation to mixed model designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods
in social and behavioral research (pp. 671-701). Thousand Oaks, CA: Sage.
Teddlie, C., & Reynolds, D. (2000). The international handbook on school effectiveness research. London,
England: Falmer.
Teddlie, C., & Stringfield, S. (1993). Schools make a difference. New York, NY: Teachers College Press.
Teddlie, C., Stringfield, S., & Reynolds, D. (2000). Context issues within school effectiveness research.
In C. Teddlie & D. Reynolds (Eds.), The international handbook on school effectiveness research (pp.
160-186). London, England: Falmer.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and
qualitative approaches in the social and behavioral sciences. Thousand Oaks, CA: Sage.
Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed
Methods Research, 1(1), 77-100.
U.S. Bureau of the Census. (2000). 2000 Census of Population and Housing: Summary tape file 3, Tables
P43, P82 [Data files]. Retrieved from http://factfinder.census.gov
Walford, G. (2001). Site selection within comparative case study and ethnographic research. Compare, 31,
151-163.
Wells, A. S., Hirschberg, D., Lipton, M., & Oakes, J. (1995). Bounding the case within its context: A
constructivist approach to studying detracking reform. Educational Researcher, 24(5), 18-24.
White House.gov. (2010). Issues: Education. Retrieved from http://www.whitehouse.gov/issues/education
Wimpelberg, R., Teddlie, C., & Stringfield, S. (1989). Sensitivity to context: The past and future of effective
schools research. Educational Administration Quarterly, 25, 82-105.
Wolf, S., Borko, H., Elliott, R., & McIver, M. C. (2000). “That dog won’t hunt!” Exemplary school change
efforts within the Kentucky reform. American Educational Research Journal, 37, 349-393.
Yin, R. K. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.
Zartman, I. W. (2005). Comparative case studies. International Negotiation, 10, 3-15.
More Related Content

Similar to A Mixed Methods Sampling Methodology For A Multisite Case Study

The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...
The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...
The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...journal ijrtem
 
Ambiguous Assessment Critiquing The Anthropology Graduate Admissions Process
Ambiguous Assessment  Critiquing The Anthropology Graduate Admissions ProcessAmbiguous Assessment  Critiquing The Anthropology Graduate Admissions Process
Ambiguous Assessment Critiquing The Anthropology Graduate Admissions ProcessTracy Morgan
 
University of Huddersfield Research Lesson Obs
University of Huddersfield Research Lesson ObsUniversity of Huddersfield Research Lesson Obs
University of Huddersfield Research Lesson ObsDominic Brockway
 
Running head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docx
Running head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docxRunning head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docx
Running head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docxwlynn1
 
Exploring educational and cultural adaptation through social networking
Exploring educational and cultural adaptation through social networkingExploring educational and cultural adaptation through social networking
Exploring educational and cultural adaptation through social networkingkruwanida
 
Analysing Research Methodologies A Case Study Of Masters Of Education In Edu...
Analysing Research Methodologies  A Case Study Of Masters Of Education In Edu...Analysing Research Methodologies  A Case Study Of Masters Of Education In Edu...
Analysing Research Methodologies A Case Study Of Masters Of Education In Edu...Sabrina Green
 
Career And Educational Goals Research Paper
Career And Educational Goals Research PaperCareer And Educational Goals Research Paper
Career And Educational Goals Research PaperStacey Wilson
 
Research study final_pkneduc518 (2)
Research study final_pkneduc518 (2)Research study final_pkneduc518 (2)
Research study final_pkneduc518 (2)Pamela Noble
 
METHODS1Sampling and MethodologyStuden
METHODS1Sampling and MethodologyStudenMETHODS1Sampling and MethodologyStuden
METHODS1Sampling and MethodologyStudenDioneWang844
 
An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...
An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...
An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...Stacy Taylor
 
SurveyMETHOD.pptx
SurveyMETHOD.pptxSurveyMETHOD.pptx
SurveyMETHOD.pptxBayissaBekele
 
Psyc 255 case study paper instructionsreviewed for fall d 2020
Psyc 255 case study paper instructionsreviewed for fall d 2020 Psyc 255 case study paper instructionsreviewed for fall d 2020
Psyc 255 case study paper instructionsreviewed for fall d 2020 YASHU40
 
Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...Alexander Decker
 
Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...Alexander Decker
 
THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...
THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...
THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...Sedufia Bokoh
 
Secondary Analysis Of Qualitative Data
Secondary Analysis Of Qualitative DataSecondary Analysis Of Qualitative Data
Secondary Analysis Of Qualitative DataDeborah Gastineau
 
Micropolitical Behavior of Second Graders: A qualitative study of student Res...
Micropolitical Behavior of Second Graders: A qualitative study of student Res...Micropolitical Behavior of Second Graders: A qualitative study of student Res...
Micropolitical Behavior of Second Graders: A qualitative study of student Res...Jack Frost
 
Research Paradigms.htmlThe term ‘research’ is commonly underst.docx
Research Paradigms.htmlThe term ‘research’ is commonly underst.docxResearch Paradigms.htmlThe term ‘research’ is commonly underst.docx
Research Paradigms.htmlThe term ‘research’ is commonly underst.docxaudeleypearl
 
Student perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative studyStudent perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative studymcjssfs2
 

Similar to A Mixed Methods Sampling Methodology For A Multisite Case Study (20)

The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...
The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...
The opinion of students of Arts and Science Colleges in Tamil Nadu regarding ...
 
Ambiguous Assessment Critiquing The Anthropology Graduate Admissions Process
Ambiguous Assessment  Critiquing The Anthropology Graduate Admissions ProcessAmbiguous Assessment  Critiquing The Anthropology Graduate Admissions Process
Ambiguous Assessment Critiquing The Anthropology Graduate Admissions Process
 
University of Huddersfield Research Lesson Obs
University of Huddersfield Research Lesson ObsUniversity of Huddersfield Research Lesson Obs
University of Huddersfield Research Lesson Obs
 
Running head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docx
Running head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docxRunning head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docx
Running head IMPACT OF SOCIAL MEDIA ON STUDENT’S PERFORMANCE1.docx
 
Exploring educational and cultural adaptation through social networking
Exploring educational and cultural adaptation through social networkingExploring educational and cultural adaptation through social networking
Exploring educational and cultural adaptation through social networking
 
Analysing Research Methodologies A Case Study Of Masters Of Education In Edu...
Analysing Research Methodologies  A Case Study Of Masters Of Education In Edu...Analysing Research Methodologies  A Case Study Of Masters Of Education In Edu...
Analysing Research Methodologies A Case Study Of Masters Of Education In Edu...
 
Career And Educational Goals Research Paper
Career And Educational Goals Research PaperCareer And Educational Goals Research Paper
Career And Educational Goals Research Paper
 
Research study final_pkneduc518 (2)
Research study final_pkneduc518 (2)Research study final_pkneduc518 (2)
Research study final_pkneduc518 (2)
 
METHODS1Sampling and MethodologyStuden
METHODS1Sampling and MethodologyStudenMETHODS1Sampling and MethodologyStuden
METHODS1Sampling and MethodologyStuden
 
An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...
An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...
An Analysis Of Online Courses In Research Ethics In The Fogarty-Sponsored Bio...
 
Survey Research Design
Survey Research DesignSurvey Research Design
Survey Research Design
 
SurveyMETHOD.pptx
SurveyMETHOD.pptxSurveyMETHOD.pptx
SurveyMETHOD.pptx
 
Psyc 255 case study paper instructionsreviewed for fall d 2020
Psyc 255 case study paper instructionsreviewed for fall d 2020 Psyc 255 case study paper instructionsreviewed for fall d 2020
Psyc 255 case study paper instructionsreviewed for fall d 2020
 
Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...
 
Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...Observable effects of developing mathematical skills of students through team...
Observable effects of developing mathematical skills of students through team...
 
THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...
THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...
THE INFLUENCE OF LEARNING MANAGEMENT SYSTEM ON STUDENTS' PERFORMANCE IN UNIVE...
 
Secondary Analysis Of Qualitative Data
Secondary Analysis Of Qualitative DataSecondary Analysis Of Qualitative Data
Secondary Analysis Of Qualitative Data
 
Micropolitical Behavior of Second Graders: A qualitative study of student Res...
Micropolitical Behavior of Second Graders: A qualitative study of student Res...Micropolitical Behavior of Second Graders: A qualitative study of student Res...
Micropolitical Behavior of Second Graders: A qualitative study of student Res...
 
Research Paradigms.htmlThe term ‘research’ is commonly underst.docx
Research Paradigms.htmlThe term ‘research’ is commonly underst.docxResearch Paradigms.htmlThe term ‘research’ is commonly underst.docx
Research Paradigms.htmlThe term ‘research’ is commonly underst.docx
 
Student perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative studyStudent perspectives on formative feedback: an exploratory comparative study
Student perspectives on formative feedback: an exploratory comparative study
 

More from Fiona Phillips

Remarkable Sample College Application Essays Thats
Remarkable Sample College Application Essays ThatsRemarkable Sample College Application Essays Thats
Remarkable Sample College Application Essays ThatsFiona Phillips
 
What Is The Purpose Of A Literary Analysi
What Is The Purpose Of A Literary AnalysiWhat Is The Purpose Of A Literary Analysi
What Is The Purpose Of A Literary AnalysiFiona Phillips
 
Fine Beautiful How To Write Project Proposal Report N
Fine Beautiful How To Write Project Proposal Report NFine Beautiful How To Write Project Proposal Report N
Fine Beautiful How To Write Project Proposal Report NFiona Phillips
 
How To Write A Literature Review - . Online assignment writing service.
How To Write A Literature Review - . Online assignment writing service.How To Write A Literature Review - . Online assignment writing service.
How To Write A Literature Review - . Online assignment writing service.Fiona Phillips
 
Analysis Essay Writing Logan Square Auditorium
Analysis Essay Writing Logan Square AuditoriumAnalysis Essay Writing Logan Square Auditorium
Analysis Essay Writing Logan Square AuditoriumFiona Phillips
 
Complete Analytical Ess. Online assignment writing service.
Complete Analytical Ess. Online assignment writing service.Complete Analytical Ess. Online assignment writing service.
Complete Analytical Ess. Online assignment writing service.Fiona Phillips
 
Kanji Practice Paper Kanji Paper Has Columns Of Squa
Kanji Practice Paper Kanji Paper Has Columns Of SquaKanji Practice Paper Kanji Paper Has Columns Of Squa
Kanji Practice Paper Kanji Paper Has Columns Of SquaFiona Phillips
 
Urgent Essay Writing Service. Urgent Ess
Urgent Essay Writing Service. Urgent EssUrgent Essay Writing Service. Urgent Ess
Urgent Essay Writing Service. Urgent EssFiona Phillips
 
Scholarships Without Essay. Online assignment writing service.
Scholarships Without Essay. Online assignment writing service.Scholarships Without Essay. Online assignment writing service.
Scholarships Without Essay. Online assignment writing service.Fiona Phillips
 
How To Write And Revise An Argument Essay - Virtual Writing Tutor Blog
How To Write And Revise An Argument Essay - Virtual Writing Tutor BlogHow To Write And Revise An Argument Essay - Virtual Writing Tutor Blog
How To Write And Revise An Argument Essay - Virtual Writing Tutor BlogFiona Phillips
 
006 Report Essay Ielts Sample Thatsnotus
006 Report Essay Ielts Sample Thatsnotus006 Report Essay Ielts Sample Thatsnotus
006 Report Essay Ielts Sample ThatsnotusFiona Phillips
 
Professional Research Paper Writers Online Research Paper Writing
Professional Research Paper Writers Online Research Paper WritingProfessional Research Paper Writers Online Research Paper Writing
Professional Research Paper Writers Online Research Paper WritingFiona Phillips
 
Step In Writing A Research Paper. 8. Online assignment writing service.
Step In Writing A Research Paper. 8. Online assignment writing service.Step In Writing A Research Paper. 8. Online assignment writing service.
Step In Writing A Research Paper. 8. Online assignment writing service.Fiona Phillips
 
Oxford Test Of English El Examen Oficial De Oxford U
Oxford Test Of English El Examen Oficial De Oxford UOxford Test Of English El Examen Oficial De Oxford U
Oxford Test Of English El Examen Oficial De Oxford UFiona Phillips
 
Linking Words Contrast Archives - English Study Here
Linking Words Contrast Archives - English Study HereLinking Words Contrast Archives - English Study Here
Linking Words Contrast Archives - English Study HereFiona Phillips
 
Ways To Write The Date - Pgbari.X.Fc2.Com
Ways To Write The Date - Pgbari.X.Fc2.ComWays To Write The Date - Pgbari.X.Fc2.Com
Ways To Write The Date - Pgbari.X.Fc2.ComFiona Phillips
 
Mba Essay Writing Services - TakeoffsuperstoreS Diary
Mba Essay Writing Services - TakeoffsuperstoreS DiaryMba Essay Writing Services - TakeoffsuperstoreS Diary
Mba Essay Writing Services - TakeoffsuperstoreS DiaryFiona Phillips
 
Essay Writing Prompts Glossary Of Grammatical A
Essay Writing Prompts  Glossary Of Grammatical AEssay Writing Prompts  Glossary Of Grammatical A
Essay Writing Prompts Glossary Of Grammatical AFiona Phillips
 
When I Look Back To My First Experienc. Online assignment writing service.
When I Look Back To My First Experienc. Online assignment writing service.When I Look Back To My First Experienc. Online assignment writing service.
When I Look Back To My First Experienc. Online assignment writing service.Fiona Phillips
 
How To Write College Essays For Scholarships
How To Write College Essays For ScholarshipsHow To Write College Essays For Scholarships
How To Write College Essays For ScholarshipsFiona Phillips
 

More from Fiona Phillips (20)

Remarkable Sample College Application Essays Thats
Remarkable Sample College Application Essays ThatsRemarkable Sample College Application Essays Thats
Remarkable Sample College Application Essays Thats
 
What Is The Purpose Of A Literary Analysi
What Is The Purpose Of A Literary AnalysiWhat Is The Purpose Of A Literary Analysi
What Is The Purpose Of A Literary Analysi
 
Fine Beautiful How To Write Project Proposal Report N
Fine Beautiful How To Write Project Proposal Report NFine Beautiful How To Write Project Proposal Report N
Fine Beautiful How To Write Project Proposal Report N
 
How To Write A Literature Review - . Online assignment writing service.
How To Write A Literature Review - . Online assignment writing service.How To Write A Literature Review - . Online assignment writing service.
How To Write A Literature Review - . Online assignment writing service.
 
Analysis Essay Writing Logan Square Auditorium
Analysis Essay Writing Logan Square AuditoriumAnalysis Essay Writing Logan Square Auditorium
Analysis Essay Writing Logan Square Auditorium
 
Complete Analytical Ess. Online assignment writing service.
Complete Analytical Ess. Online assignment writing service.Complete Analytical Ess. Online assignment writing service.
Complete Analytical Ess. Online assignment writing service.
 
Kanji Practice Paper Kanji Paper Has Columns Of Squa
Kanji Practice Paper Kanji Paper Has Columns Of SquaKanji Practice Paper Kanji Paper Has Columns Of Squa
Kanji Practice Paper Kanji Paper Has Columns Of Squa
 
Urgent Essay Writing Service. Urgent Ess
Urgent Essay Writing Service. Urgent EssUrgent Essay Writing Service. Urgent Ess
Urgent Essay Writing Service. Urgent Ess
 
Scholarships Without Essay. Online assignment writing service.
Scholarships Without Essay. Online assignment writing service.Scholarships Without Essay. Online assignment writing service.
Scholarships Without Essay. Online assignment writing service.
 
How To Write And Revise An Argument Essay - Virtual Writing Tutor Blog
How To Write And Revise An Argument Essay - Virtual Writing Tutor BlogHow To Write And Revise An Argument Essay - Virtual Writing Tutor Blog
How To Write And Revise An Argument Essay - Virtual Writing Tutor Blog
 
006 Report Essay Ielts Sample Thatsnotus
006 Report Essay Ielts Sample Thatsnotus006 Report Essay Ielts Sample Thatsnotus
006 Report Essay Ielts Sample Thatsnotus
 
Professional Research Paper Writers Online Research Paper Writing
Professional Research Paper Writers Online Research Paper WritingProfessional Research Paper Writers Online Research Paper Writing
Professional Research Paper Writers Online Research Paper Writing
 
Step In Writing A Research Paper. 8. Online assignment writing service.
Step In Writing A Research Paper. 8. Online assignment writing service.Step In Writing A Research Paper. 8. Online assignment writing service.
Step In Writing A Research Paper. 8. Online assignment writing service.
 
Oxford Test Of English El Examen Oficial De Oxford U
Oxford Test Of English El Examen Oficial De Oxford UOxford Test Of English El Examen Oficial De Oxford U
Oxford Test Of English El Examen Oficial De Oxford U
 
Linking Words Contrast Archives - English Study Here
Linking Words Contrast Archives - English Study HereLinking Words Contrast Archives - English Study Here
Linking Words Contrast Archives - English Study Here
 
Ways To Write The Date - Pgbari.X.Fc2.Com
Ways To Write The Date - Pgbari.X.Fc2.ComWays To Write The Date - Pgbari.X.Fc2.Com
Ways To Write The Date - Pgbari.X.Fc2.Com
 
Mba Essay Writing Services - TakeoffsuperstoreS Diary
Mba Essay Writing Services - TakeoffsuperstoreS DiaryMba Essay Writing Services - TakeoffsuperstoreS Diary
Mba Essay Writing Services - TakeoffsuperstoreS Diary
 
Essay Writing Prompts Glossary Of Grammatical A
Essay Writing Prompts  Glossary Of Grammatical AEssay Writing Prompts  Glossary Of Grammatical A
Essay Writing Prompts Glossary Of Grammatical A
 
When I Look Back To My First Experienc. Online assignment writing service.
When I Look Back To My First Experienc. Online assignment writing service.When I Look Back To My First Experienc. Online assignment writing service.
When I Look Back To My First Experienc. Online assignment writing service.
 
How To Write College Essays For Scholarships
How To Write College Essays For ScholarshipsHow To Write College Essays For Scholarships
How To Write College Essays For Scholarships
 

Recently uploaded

Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3JemimahLaneBuaron
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpinRaunakKeshri1
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991RKavithamani
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Celine George
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 

Recently uploaded (20)

INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpin
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 

A Mixed Methods Sampling Methodology For A Multisite Case Study

  • 1. http://mmr.sagepub.com/ Journal of Mixed Methods Research http://mmr.sagepub.com/content/6/1/34 The online version of this article can be found at: DOI: 10.1177/1558689811417133 2012 6: 34 originally published online 2 September 2011 Journal of Mixed Methods Research and Natalie Stipanovic Julia L. Sharp, Catherine Mobley, Cathy Hammond, Cairen Withington, Sam Drew, Sam Stringfield A Mixed Methods Sampling Methodology for a Multisite Case Study Published by: http://www.sagepublications.com On behalf of: Mixed Methods International Research Association can be found at: Journal of Mixed Methods Research Additional services and information for http://mmr.sagepub.com/cgi/alerts Email Alerts: http://mmr.sagepub.com/subscriptions Subscriptions: http://www.sagepub.com/journalsReprints.nav Reprints: http://www.sagepub.com/journalsPermissions.nav Permissions: http://mmr.sagepub.com/content/6/1/34.refs.html Citations: What is This? - Sep 2, 2011 OnlineFirst Version of Record - Mar 13, 2012 Version of Record >> by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from by guest on October 11, 2013 mmr.sagepub.com Downloaded from
  • 2. Journal of Mixed Methods Research 6(1) 34–54 Š The Author(s) 2012 Reprints and permission: http://www. sagepub.com/journalsPermissions.nav DOI: 10.1177/1558689811417133 http://jmmr.sagepub.com 1 Clemson University, Clemson, SC, USA 2 University of Louisville, Louisville, KY, USA Corresponding Author: Julia L. Sharp, Department of Applied Economics and Statistics, 237 Barre Hall, Clemson University, Clemson, SC 29634-0313, USA Email: jsharp@clemson.edu A Mixed Methods Sampling Methodology for a Multisite Case Study Julia L. Sharp1 , Catherine Mobley1 , Cathy Hammond1 , Cairen Withington1 , Sam Drew1 , Sam Stringfield2 , and Natalie Stipanovic2 Abstract The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies.Yet the utilization of mixed methods to select sites for these studies is rarely reported.The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a statewide education policy initiative in the United States.The authors designed a four-stage sequential mixed methods site selection strategy to select eight sites in order to capture the broader context of the research, as well as any contextual nuances that shape policy implementation.The authors anticipate that their experience would provide guidance to other mixed methods researchers seeking to maximize the rigor of their multisite case study sampling designs. Keywords site selection, mixed methods sampling, mixed methods study, case study, purposive sampling Social scientists use mixed methods research in a variety of contexts, including educational research and policy analyses. Ultimately, the mixing decisions in these studies entail deciding on how and at what stages the mixing will occur (Creswell & Plano Clark, 2007). Much of the pertinent literature focuses on selecting qualitative and quantitative data collection strategies to explore the study’s research questions. More robust studies often involve mixing methods throughout the research process (Tashakkori & Teddlie, 1998, 2003a), including during the sample or site selection stage of research. Collins, Onwuegbuzie, and Jiao (2007) described sample selection as one of the most important stages of mixed methods studies. Mixed methods sampling techniques may be useful when it is challenging to obtain a representative sample using only one method. Such sampling techniques are often appropriate for mixed methods studies that incorporate both goals of generalizability of research findings and in-depth understanding of the research context (Kemper, Stringfield, & Teddlie, 2003; Onwuegbuzie & Leech, 2007). Specifically, if a study cannot use
  • 3. Sharp et al. 35 random assignment or selection, then multistage, mixed methods sampling designs may be used to select participants or sites that are more likely representative of the population studied and that are best suited to answer the research questions. Addressing sampling issues early in the research design helps ensure that the mixed methodology employed is maximized to the fullest extent possible (Lieber, 2009). Yet few mixed methods studies provide enough details about sample or site selection so researchers can learn from the strategy employed (Teddlie & Tashakkori, 2009). Collins et al.’s (2007) extensive literature review uncovered only four articles that specifically address sample selection. There is a similar lack of published information about how to best select sites for multisite case studies (Onwuegbuzie & Leech, 2007). Our study aims to fill this gap in the literature. In our article, we describe the mixed methods multisite study of a statewide education policy initiative and the sampling strategy that we developed and used to select sites. We anticipate that our experience can provide guidance to other mixed methods researchers seeking to maximize the rigor of their own multisite case study sampling designs. Background of Our Mixed Methods Multisite Case Study There continues to be much concern regarding how best to prepare students for higher education and ultimately the demands of the modern workplace. This concern is evidenced in federal, state, and local legislation, initiatives, and programs. A major focus of the Obama administration’s education agenda is preparing students for college and/or for high-skill, high-wage jobs (White House Web site, 2010). Across the United States, policymakers have developed innovative programs and curriculum reform efforts to improve school-to-work and school-to-college transitions. In 2007, the National Research Center for Career and Technical Education funded three overlapping multiyear studies to investigate the development of Perkins IV–defined Programs of Study (POS) and the impact of POS on student outcomes (see Table 1 for acronym definitions). Our study, one of the three, is a longitudinal, 5-year examination of the early effects of a state- mandated school reform policy that requires a focus on career awareness and exploration at all school levels. This policy is based on South Carolina’s Education and Economic Development Act (EEDA) of 2005 that mandated the creation of locally relevant career pathways or programs of study in high schools.Among other requirements, these pathways or programs must align with postsecondary education, pertain to local economic realities and industry, and provide work- based learning opportunities for students. All South Carolina public schools are expected to have fully implemented EEDA by July 2011. Our overarching study aims to assess the extent to which a statewide reform mandate such as EEDA facilitates the creation of POS and whether these POS affect students’ engagement, achievement, high school completion, and successful transition to postsecondary education and/ or employment. We also explore whether or not the availability of school and community resources and future employment opportunities influence the development of POS and the outcomes of students enrolled in them. This study considers two main questions: (a) Can a statewide mandate like EEDA increase the number of and level of participation in school POS? 
and (b) How does the number of and level of participation in POS, in combination with various political, economic, and social characteristics, influence selected outcomes for South Carolina’s secondary students and the schools they attend? Given that the state law applies to all high schools in South Carolina, random assignment of schools to experimental and control groups was not possible. Thus, our 5-year study employs a quasi-experimental design with a mixed methods, triangulated approach (Tashakkori & Teddlie, 2003a), following three student cohorts (the Classes of 2009, 2011, and 2014) from a sample of eight high schools from economically and culturally diverse regions of South Carolina. Each
  • 4. 36 Journal of Mixed Methods Research 6(1) school’s Class of 2009 received very little to no exposure to the reforms, whereas the Class of 2011 is receiving exposure during high school and the Class of 2014 is receiving exposure from middle through high school. Using a mixed methods research design will help us achieve a broader understanding of the policy’s impacts on students, teachers, schools, and POS creation. Our study most closely approximates the pragmatic parallel mixed methods research design in which both qualitative and quantitative data are collected and analyzed to address research questions. The mixing occurs either concurrently or after some time passes (Mertens, 2010). The qualitative and quantitative approaches address various dimensions of the main research questions (Teddlie & Tashakkori, 2009). Quantitative data include student outcome data (e.g., grades and attendance) from the three student cohorts and responses from student questionnaires. Qualitative data include the results from content analyses of school course catalogs and career- related materials and perspectives culled from interviews and focus groups conducted with school administrators, counselors, and teachers from sample high schools, relevant career center staff, and administrators at partner two-year colleges. Both qualitative and quantitative data come from student Individual Graduation Plans (IGPs) from the state database, questionnaires of guidance personnel and high school principals, and Class of 2009 and Class of 2011 student transition and post-graduation information. Justification for Using Mixed Methods Research for Our Multisite Case Study The choice to use a mixture of qualitative or quantitative methods is a reflection of an epistemological, or philosophical, stance to research, as much as it is a choice in actual data collection methodologies. Case studies can be postpositivistic, phenomenological, or both (Amaratunga, Baldry, Sarshar, & Newton, 2002). Our study was influenced by the research philosophy of pragmatism that served as a bridge between conflicting paradigms and across the paradigm–methodology–method continuum (Johnson & Onwuegbuzie, 2004). Indeed, the complex nature of the social world requires a more fluid understanding and application of the relationship between philosophical paradigms (assumptions about the social world and nature of knowledge), methodology (the logic of inquiry), and methods (techniques of data collection). Various authors have argued for the need to move beyond incommensurability, whereby one set of philosophical assumptions necessarily dictates a specific methodological approach, which would then subsequently limit one’s choice of data collection methods (Howe, 1988; Johnson & Onwuegbuzie, 2004). As described below, the flexibility inherent in a pragmatic approach to research is especially important in complex case studies. Longitudinal multisite case studies like ours combine the study of specific sites with an exploration of the various contexts in which the policy might be implemented to provide a Table 1. Definition of Acronyms Acronym Definition EEDA Education and Economic Development Act IGP Individual Graduation Plan NRCCTE National Research Center for Career and Technical Education POS Programs of Study PSLOI Preliminary Site Selection Level of Policy Implementation SDE South Carolina Department of Education SLOI Site Selection Level of Policy Implementation WIA Workforce Investment Area
broader basis for generalization (Simons, 1996). For our study, we faced the challenges of selecting sites that were representative of the study area (South Carolina) while also selecting sites that could tell us the most about the complexities of policy implementation. To meet these challenges, our research design and site selection process relied on the epistemological approach of pragmatism (Biesta, 2010; Greene & Hall, 2010). This mixed methods research approach is primarily guided by a study's research questions, is based on the needs of and contingencies present in a particular study, and ultimately reflects a valuing of both subjective and objective knowledge (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003a). This pragmatic research philosophy results in more robust and interesting findings and thus is of greater value to policymakers and practitioners (Sammons, 2010), although we recognize that pragmatism is not the only, or even most appropriate, philosophical foundation for all mixed methods research (Biesta, 2010; Greene & Hall, 2010). Several principles of pragmatism influenced our logic of inquiry, research design, and the methods used for our site selection process (Johnson & Onwuegbuzie, 2004):

• Pragmatism recognizes the importance of eclecticism and pluralism, whereby "different, even conflicting theories and perspectives can be useful; observation, experience and experiments are all useful ways to gain an understanding of people and the world" (Johnson & Onwuegbuzie, 2004, p. 18).
• Pragmatists prefer action over philosophizing.
• Pragmatism endorses practical theory, or praxis (theory that informs effective practice).
• From a pragmatic perspective, "knowledge is viewed as being both constructed and based on the reality of the world we experience and live in" (Johnson & Onwuegbuzie, 2004, p. 18).

These principles of pragmatism subsequently influenced our logic of inquiry on four different levels, described below. First, as a study of policy implementation, our research seeks "actionable knowledge of direct practical value in the context being studied" (Greene & Hall, 2010, p. 138). The nature of our study made it important for us to consider how the data would help us learn about the impact of the legislation and school-based policies related to POS. As Rorty (1999) expressed, a pragmatic approach can yield a more comprehensive assessment of policy implementation, not necessarily in the service of a more "accurate account" but rather to improve the usefulness of the research. Thus, we had to be mindful of the consequences of our study, its utility for informing future policy initiatives (Feilzer, 2010), and its value for a variety of stakeholders, including practitioners and policy makers (Sammons, 2010). In a similar vein, we felt it was important to capture the broader context of the research, as well as the subtle contextual nuances that shaped policy implementation, and how these varied across the study sites. This contextual search is a second quality of pragmatism that influenced our research. The notion is reinforced by research demonstrating that some schools are more effective than others in achieving outcomes, such as implementing a complex policy like EEDA (Tashakkori & Teddlie, 2003b). Thus, we recognized the need to account for school effects and other contextual variables when explaining outcomes.
Third, a pragmatic approach to sample selection shaped our conceptions of generalizability and how the findings can be applied when reporting study results. Careful selection of sites is no trivial matter, and the choice is often governed by whether the researcher wants to generalize results to a larger population (in which case random sampling is often used) or desires to learn more about a specific setting or phenomenon (in which case purposive sampling strategies are often used; Onwuegbuzie & Leech, 2007). As Tashakkori and Teddlie (1998) claimed, mixed
methods researchers often switch between different types of generalizability: generalizing results to a theoretical population and generalizing results to another specific setting. The intermingling of quantitative and qualitative approaches and their underlying philosophies is a central feature of mixed methods studies and of multisite case studies, where there is pressure to generalize research findings beyond a single case (Simons, 1996). In light of these factors, our study used a purposive sampling strategy that integrated quantitative and qualitative approaches. Finally, from a pragmatic perspective, mixed methods studies are strengthened when research teams are composed of individuals from a variety of disciplines (Sammons, 2010). This reflects Greene's (2007) notion that mixed methods researchers should engage in a "mixed methods way of thinking," whereby we not only mix methodologies but also "different ways of seeing, interpreting, and knowing" about the social world (p. xi). The development of our research design and site selection process was influenced by the makeup of the research team and a consideration of the relationships among the team members themselves and between the team members and the study design, a feature of mixed methods research (Sammons, 2010; Tashakkori & Teddlie, 2003b). Our team includes a statistician, an economist, sociologists, and educational researchers, each contributing unique methodological training and skills to the project. According to Teddlie and Tashakkori (2009), this collaborative team approach is especially useful for mixed methods studies such as ours that use the parallel mixed methods design. The philosophical approach of pragmatism and the four elements comprising our logic of inquiry shaped our choice of methods used to select study sites. In the next section, we review the literature on case studies and sample selection.

Selecting Sample Sites for Mixed Methods Case Studies

Yin (2003) cited several reasons for engaging in case studies. Case studies are (a) relevant when the focus of a study is on "how" and "why," (b) used when researchers cannot manipulate the behavior of those under study, (c) appropriate when researchers want to learn more about the contextual conditions that are especially relevant to the phenomenon under study, and (d) used when the boundaries between the subject of study and the context are not clear. All four of these reasons are applicable to our study. Yin (1994) distinguished among a variety of case study approaches ranging from a single case study to multiple case studies such as the Type 4 design, which involves multiple sources of data, multiple cases, multiple methods, and multiple units of analysis. Longitudinal multisite case studies such as ours combine the study of a specific site with the need to understand the context and provide a wider basis for generalizing findings (Simons, 1996). Such studies are particularly valuable because they allow for comparisons within cases and across time and contexts, thus offering a "rich harvest of lessons and insights" (Zartman, 2005, p. 8). Despite the recognition that a mixed methods design is appropriate for case studies, the literature offers little guidance on how to select study sites and how many sites to select. Researchers often resort to convenience sampling, choosing sites that allow easy access.
As a result, site selection often lacks theoretical justification, and the resulting data are often not situated within a particular theoretical context (Walford, 2001). Although we recognize the need for convenience sampling in some cases, our work highlights the importance of rigor when selecting sites for case studies, especially when researchers need to understand the influence of a policy on outcomes achieved within complex settings, such as educational institutions. Such complex studies often require that researchers use a combination of quantitative and qualitative sampling techniques (Kemper et al., 2003).
Concerning sampling strategies, Goetz and LeCompte (1984) called for criterion-based sampling to "establish the criteria, bases, or standards necessary for units to be included" in the research study (p. 77). This idea of criterion-based sampling is similar to the purposeful or purposive sampling designs that are often used to select a sample to attain representativeness or comparability in a study (Patton, 1980; Teddlie & Yu, 2007). However, both strategies offer more generalized guidelines than those used in our study. That is, we did not select what Patton (1980) described as the most extreme or deviant cases, typical cases, critical cases, or politically sensitive cases. Other site selection criteria include choosing sites that have extensive experience of the phenomenon under study and choosing sites that increase the chance of negotiating access (Pettigrew, 1990). Some of these factors played an indirect role in, but did not drive, our selection process. No universal rule exists regarding the number of sites to select for multisite case studies (Axinn & Pearce, 2006). Yin (2009) observed that the number of cases depends on both literal replication (the amount of certainty desired concerning the research findings) and theoretical replication (the extent to which external, contextual factors shape research findings and how many cases are needed to reflect this variety). Such decisions are directly related to the idea that case study research is not meant to be generalizable in the positivist, statistical sense of the word. Thus, the traditional concept of random sampling typically does not apply to multisite case studies (Yin, 2009). Rather, a purposive sampling strategy generally is used to select the best sites possible, given the research goals and questions. Our mixed methods sampling strategy represents a unique integration of quantitative and qualitative methods at the sampling (i.e., site selection) phase of our mixed methods study (O'Cathain, Murphy, & Nicholl, 2007). Within the context of our overarching parallel mixed methods study, we developed a four-stage nested mixed methods sampling strategy following the principles of a pragmatic sequential mixed methods approach. With this strategy, one type of data informs the collection of another type of data in a subsequent stage (Mertens, 2010). The remainder of this article describes our mixed methods site selection strategy in more detail.

Using Mixed Methods to Select Sites for Our Multisite Case Study

As in most multisite case studies, we faced the challenges of deciding how to select study sites (i.e., high schools) and identifying relevant criteria for selecting those sites. Budgetary and time constraints limited the number of sites that could be studied, making it even more important to choose sites in a way that would allow us to learn as much as possible about state policy implementation and POS under differing school conditions. Collins, Onwuegbuzie, and Jiao (2007) listed the major sampling schemes frequently employed in mixed methods research. A random selection of high schools might have yielded schools that all had similar policy implementation levels, or that were alike on other characteristics better varied to address the research questions. Instead, we used a multistage, mixed methods sampling design to select a sample that characterizes the population of interest so that we could better analyze the impact of the policy on students and schools.
Similar to Wells, Hirschberg, Lipton, and Oakes's (1995) study of school detracking efforts and Teddlie and Stringfield's (1993) research on school effects, we were interested in selecting a sample of schools that exhibited variety on primary variables of interest. In particular, we wanted to ensure variation on critical variables shown in past research to influence the implementation of school reforms, together with other variables that were perceived from the outset to have potential influence on outcomes. Following Axinn and Pearce (2006), we sought to identify "all factors believed to produce initial characteristics or conditions in a nonrandom way. These measures can then be used in sophisticated statistical models to simulate random assignment of initial conditions" (p. 161). We used a dual approach to identify key variables: hypothesizing some early on and allowing others to emerge through data collection.
In our site selection plan, we aimed for variation across sites in the actual level of policy implementation. Based on policy guidelines provided to schools,1 the study team identified the most salient initiatives for high schools and grouped them into the following six key facets around which to measure policy implementation: (a) identification of and assistance for students who are at high risk of dropping out of school; (b) integration of rigorous academic and career-focused curricula, organized into career clusters and majors; (c) increased counselor roles in education and career planning; (d) implementation of evidence-based high school reform; (e) facilitation of local business-education partnerships and resource dissemination; and (f) articulation between kindergarten through 12th grade and higher education. Coupled with the desire to obtain variety in levels of policy implementation, we aimed to include schools from a diversity of contexts that are important for understanding educational reform efforts such as EEDA. Thus, our sample included high schools that varied across several policy-relevant factors, including industry-related variables, availability of community and economic resources, and level of implementation of the statewide policy. Defining these variables before data collection allowed us to discuss in advance how to operationalize the contextual variables relevant for our study. Focusing on these contextual variables during our site selection process enabled us to connect them to student and school outcomes, a consideration that has become increasingly important in school effectiveness research (Teddlie, Stringfield, & Reynolds, 2000; Wimpelberg, Teddlie, & Stringfield, 1989) and other kinds of educational studies (Sammons, 2010).

Our Mixed Methods Site Selection Strategy

In using a mixed methods approach for site selection, we considered several factors important for our study of EEDA implementation and the policy's impact on POS and student outcomes. First, sample selection and study data collection were narrowed to high schools as the penultimate sites of EEDA efforts, even though EEDA is a kindergarten through college initiative with implications beyond high school in terms of the further education/training and community partnerships required for successful implementation. Second, the team chose to limit the sampling frame to those high schools considered to be "traditional" high schools, that is, those including only Grades 9 through 12.2 Among the high schools listed on the South Carolina Department of Education (SDE; 2010) website, there were more than 150 schools that we defined as traditional. As a practical matter, the team was unable to include all these high schools in our sample. We chose to include eight high schools in our study sample so that we could conduct our study within time and budget constraints yet effectively answer our research questions. Our final sample size of eight schools was within the range (4-12 sites) suggested by Teddlie and Tashakkori (2009) for mixed methods multisite case studies. As an alternative to selecting experimental and control schools, and to provide a measure of control over various factors that might affect the study at sample schools, our sampling design followed the MaxMinCon strategy (Kerlinger, 1986; Tashakkori & Teddlie, 1998). South Carolina is geographically divided into 12 Workforce Investment Area (WIA) regions (Figure 1).
The team sampled to Maximize differences among WIAs (e.g., communities offering differing economic opportunities) and schools within WIAs (on level of EEDA implementation based on the six key facets previously listed). Furthermore, the team chose to Minimize differences between schools within WIAs on student background characteristics and district support for the schools, and Control for as many extraneous variables as practical, so as to minimize error variability. Although anchored in a postpositivist tradition, the MaxMinCon methodology is also reflective of the pragmatic goals of generalizability, contextuality, and relevance (Tashakkori & Teddlie, 1998). This strategy led us to a four-stage sequential school selection process that we
describe in more detail below. Figure 2 provides an illustration of the sampling strategy used for site selection.

Stage 1: Representing Regional and Industrial Diversity

All high schools could, in theory, offer a wide range of POS options. In practice, schools may have chosen to offer the specific POS best matched to the careers most likely to be available to students in their region. Hence, in the first stage of sampling, we introduced controls for economic and industry conditions that might affect the availability and development of business partners for POS and work-based learning opportunities and career-specific education and employment opportunities. Local and regional economics would likely influence policy implementation, since the reform model is career-focused and is intended to be linked and relevant to local labor markets and industries. We also wanted to control for a school's local economic conditions so that we could compare policy implementation for schools facing similar labor market and economic conditions and contrast schools from different local conditions. We used industry-related (private and government) information for 10 primary industries in each of the 12 WIAs in South Carolina (South Carolina Employment Security Commission, 2008). As a part of EEDA, a Regional Education Center (REC) is being developed in each WIA to serve as a hub for the region's training and education resources. The Regional Education Centers will help to facilitate business-education partnerships, coordinate workforce education programs, and promote community involvement. Thus, we considered the WIA as an economic entity focused on a somewhat distinctive industry mix.

Figure 1. Workforce Investment Areas (WIAs) in South Carolina
Note: From South Carolina Employment Security Commission, "Spotlights: WIA Profiles." Retrieved from http://www.sces.org/lmi/spotlights/WIA/
Industry employment data, averaged within each WIA, were used in a quantitative chi-square analysis to explore the association between WIAs and industry employment. We used this analysis to explore the statistical justification for selecting WIAs based on concentrations of workers in major state industries. Results indicated a significant association (using a significance level of .05) between the WIAs and industry employment, χ²(33, N = 1,116,799) = 108,200.70, p < .01. Three WIAs were identified in which employment for one of the top five South Carolina industries (trade, transportation, and utilities; government; manufacturing; leisure and hospitality; and professional and business services) was significantly greater than expected, and one WIA was identified in which employment in two of the top five industries was significantly greater than expected.

Figure 2. Sampling strategies and resulting number of schools selected at each stage of the sampling design
Stage 1: Representing Regional and Industrial Diversity. Chi-square analysis to select 4 WIAs (59 schools).
Stage 2: Selecting School Clusters Based on Level of Available Economic Resources. Hierarchical cluster analysis on selected economic measures (31 schools).
Stage 3: Ranking High Schools on EEDA Implementation Level. Quantitative and qualitative data collection to rank schools on PSLOI; 16 high- and low-ranked schools were invited to participate in the study, with 10 agreeing to participate.
Stage 4: Validation of Policy Implementation Level and Variation on Key School Characteristics. Implementation validation site visits to select the final sample of 8 schools.
Note: Near the end of Stage 3, one cluster of schools declined to participate in the study. A substitute cluster of 12 schools was selected using the Stage 2 process. Stage 3 procedures were then applied to this new group to reach a revised grouping of 10 schools at the end of Stage 3.
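To make the Stage 1 computation concrete, the sketch below shows how such a test could be run in Python on a WIA-by-industry employment table. The article reports only the resulting statistic; the counts, table shape, and residual threshold here are illustrative assumptions, not the authors' actual data or code.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical WIA-by-industry employment table. The shape (12 WIAs x 4
# industry groupings) is chosen only so the test has 33 degrees of
# freedom, matching the statistic reported in the text; the counts
# themselves are invented.
rng = np.random.default_rng(42)
counts = rng.integers(5_000, 60_000, size=(12, 4))

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square({dof}, N = {int(counts.sum()):,}) = {chi2:.2f}, p = {p:.4f}")

# Pearson residuals flag cells where observed employment is greater than
# expected; WIAs with large positive residuals in a major industry would
# be candidates for selection.
residuals = (counts - expected) / np.sqrt(expected)
for wia, industry in np.argwhere(residuals > 1.96):  # rough .05 cutoff per cell
    print(f"WIA {wia + 1}: employment in industry {industry + 1} greater than expected")
```

A cell-level residual check of this kind is one plausible way to operationalize "significantly greater than expected," though the article does not specify the exact post hoc procedure used.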
We selected these four WIAs so that we could make comparisons among, and within, WIAs. Fifty-nine high schools that met our school criteria were located in these four WIAs. Although 59 high schools would have been a more manageable sample than the original sampling frame of more than 150 traditional 4-year high schools in South Carolina, it still would have been difficult for us to collect timely and appropriate data from such a large sample during the study period. The schools in the four WIAs also varied across other study-relevant factors, including local economic conditions, which would have made it difficult to discern in our final analysis whether the impact of the legislation was due to policy or to economic conditions. For this reason, a second quantitative sampling stage was used to select schools of varying local economic conditions across the WIAs and with similar economic conditions within the WIAs.

Stage 2: Selecting School Clusters Based on Level of Available Economic Resources

During the second sequential stage of school selection, we used hierarchical cluster analysis to cluster schools within each of the four selected WIA regions based on the level of selected local economic measures (i.e., measures that were closer to the school level of analysis). Research indicates that community resource and poverty levels can influence a school's ability to implement change (Bryk, Sebring, Allensworth, Luppescu, & Easton, 2009; Teddlie & Reynolds, 2000). The impact of economic resources on student and school outcomes and on successful implementation of school reforms is also well documented. Balfanz and Legters (2004) found that urban, low-income high schools, or so-called dropout factories, produced the highest percentages of dropouts. Dropout rates are also higher in impoverished communities (Rumberger, 2001), and some links have been found between dropout rates and employment rates.

Figure 3. Clustering of 59 schools using data representing local economic conditions (counts of schools in the high-poverty and mid/low-poverty clusters identified within each of the four WIAs, with the selected cluster in each WIA marked)
Schools with higher concentrations of lower income students tend to have higher dropout rates (Rumberger, 1995). We used the following local economic measures to cluster schools within the WIAs: per capita income by the postal codes of all students enrolled in each school, a school poverty index based on the percentage of students eligible for Medicaid or qualified for free and/or reduced-price lunch at each school, the percentage of families in poverty with children below the age of 18 years by postal code, and the percentage of civilian unemployment by postal code. Most of the 59 schools in our set of potential sites did not draw from specific postal codes; that is, the postal delivery zones did not align with attendance zones. Therefore, for each potential site, we acquired a data set of the postal codes of all students enrolled for the most recent school year, then applied a weight to each postal code for each school according to the proportion of students from that postal code. These weights were then applied to the 2000 Census postal code data so that the data were representative of the student populations at the schools (U.S. Bureau of the Census, 2000). A hierarchical cluster analysis was performed within each WIA using SAS v. 9.2 (SAS Institute Inc., 2008). In this analysis, each observation begins in its own cluster, and the two closest clusters (based on the squared distance between the cluster averages) are merged; this merging continues until only one cluster remains. Figure 3 illustrates the clustering of the 59 schools on the four local economic measures.

Table 2. Demographic Characteristics of School Clusters

                                          High-Poverty Clusters               Low-to-Moderate Poverty Clusters
Demographic Factor                        Rural WIA        Urban WIA          Rural WIA        Urban WIA
Average per capita income (1999)a         $15,521          $19,752            $19,128          $24,268
Range in per capita income (1999)a        $13,486-$18,156  $16,305-$23,034    $18,638-$19,758  $21,505-$29,223
Average school poverty index
  (2004-05, 2005-06, 2006-07)b            74%              54%                50%              35%
Range in school poverty index
  (2004-05, 2005-06, 2006-07)b            50%-92%          36%-83%            47%-57%          13%-51%
Range in percent unemployment (1999)c     5%-12%           3%-7%              4%-5%            2%-7%

Note: WIA = Workforce Investment Area.
a. A local per capita income figure was derived for each school using weighted 5-digit postal code data (weighted by postal code residence data for students enrolled in each school) from the U.S. Census Bureau 2000 Census of Population and Housing, Summary File 3 (SF3), Sample Data, Table P82, Per Capita Income in 1999 (Dollars); Universe: Total population. The list of postal codes used to get weighted averages of all census data for schools came from the South Carolina Department of Education, Office of Data Management and Analysis (personal communication, September 25, 2008).
b. This is school-level data published in the South Carolina Department of Education State of South Carolina Education Accountability Act report cards (South Carolina Department of Education, 2005, 2006a, 2007), available online at the South Carolina Department of Education website. The poverty index is a measure of the percentage of students at each school eligible for Medicaid or qualified for free and/or reduced lunch.
c. A local percentage of civilian unemployment figure was derived for each school using weighted 5-digit postal code data (weighted by postal code residence data for students enrolled in each school) from the U.S. Census Bureau 2000 Census of Population and Housing, SF3, Sample Data, Table P43, Sex by Employment Status for the Population 16 Years and Over; Universe: Population 16 years and over.
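As an illustration of the Stage 2 mechanics, the sketch below derives school-level measures from hypothetical postal-code census data and then clusters schools hierarchically. All inputs, school names, and zip codes are invented; the study itself used SAS 9.2, and scipy's centroid linkage is used here as a rough analogue of merging on the squared distance between cluster averages.

```python
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical zip-level 2000 Census measures (a subset of the four
# measures named above; values invented for illustration).
zips = pd.DataFrame(
    {"per_capita_income": [15_200, 21_400, 18_900],
     "pct_family_poverty": [24.0, 9.5, 14.2],
     "pct_unemployment": [9.1, 3.8, 5.5]},
    index=["29001", "29002", "29003"])

# Hypothetical share of each school's enrollment living in each zip code.
enrollment_shares = {"School A": {"29001": 0.7, "29002": 0.3},
                     "School B": {"29002": 0.5, "29003": 0.5},
                     "School C": {"29001": 0.2, "29003": 0.8}}

# Weight the zip-level measures by enrollment shares so each measure
# reflects the school's actual student population.
school_measures = pd.DataFrame({
    school: sum(w * zips.loc[z] for z, w in shares.items())
    for school, shares in enrollment_shares.items()}).T

# Standardize, then cluster the schools within a WIA. Centroid linkage
# merges the two clusters whose averages are closest, and cutting the
# tree at two clusters yields a high versus mid/low poverty grouping.
standardized = (school_measures - school_measures.mean()) / school_measures.std()
tree = linkage(standardized.to_numpy(), method="centroid")
school_measures["cluster"] = fcluster(tree, t=2, criterion="maxclust")
print(school_measures)
```

The article does not report which linkage method was configured in SAS, so the centroid choice here is an assumption consistent with the "squared distance between the averages" description.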
We clustered the schools in each WIA into one of two clusters: either high or low-to-moderate (mid/low) poverty. The clusters selected from two WIAs (one with more urban areas and one with little or no urban area) included high-poverty schools, and the clusters selected from the other two WIAs (again, one with more urban areas and one with little or no urban area) included low-to-moderate poverty schools. Demographics for the four selected school clusters are shown in Table 2. Thirty-three eligible high schools were contained in these four clusters. Two schools were removed from the sampling frame due to excessive missing data for the third stage,3 leaving 31 schools from which to select our final sample.

Stage 3: Ranking High Schools on EEDA Implementation Within Each Selected WIA Cluster

Researchers in multisite case studies have frequently selected schools based on varying levels of implementation of, or exposure to, a particular policy or practice (e.g., Burgess, Pole, Evans, & Priestly, 1994). Several studies provided strong support for including prior level of implementation as a selection variable for our study (Datnow, Borman, Stringfield, Overman, & Castellano, 2003; Stallings & Kaskowitz, 1974; Stringfield, Millsap, & Herman, 1997). Selecting schools that exhibit a range of implementation levels helped us avoid a challenge that frequently arises in multisite studies: selecting only schools that are exemplary or only schools that exhibit low levels of implementation (Christ, 2007; Wolf, Borko, Elliott, & McIver, 2000). Pettigrew (1990) recommended identifying "polar sites" for study, whereby researchers select cases that illustrate high and low performance on the indicator of interest. In the third sequential stage of sample selection, the research team gave schools preliminary site selection level of policy implementation (PSLOI) scores based on available data from the 2007-2008 school year (the initial year of the study) on the six facets previously described. Because visiting all 31 schools was not practical, we used data from the SDE and from school and district websites to formulate the PSLOI scores. The PSLOI scores allowed us to consider relevant contextual factors during sample selection. The research team collected both quantitative and qualitative data on school EEDA implementation to obtain the PSLOI for each school. Survey data were collected from a state-mandated questionnaire on guidance activities and from an SDE high school reform needs questionnaire. We acquired additional SDE data on each school's progress in policy implementation. Schools are required to inform staff, parents, and students about EEDA, and many choose to do so through school websites. An instrument was developed to analyze the content of materials, text, and catalogs available on school and district websites about curriculum, course selection, registration, programs, guidance personnel, and materials for parents and students. The content analysis consisted of conceptual analysis of the materials to identify the most common and consistent themes as they related to the research questions and the policy itself. We looked at how closely these materials met the state's standard policy format and how much EEDA information schools provided on their websites.4 We used the constant comparative method of analysis (Glaser & Strauss, 1967), comparing the various sources of data and simultaneously coding and analyzing the data as we progressed.
The content analysis instrument was tested on three schools and refined until 100% agreement among the three reviewers was achieved. The instrument was then used to analyze the content of each school's website. We also contacted schools and districts to collect missing data so as not to bias PSLOI scores simply because of ineffective or poorly maintained school or district websites. From the SDE data sources and the website analyses, we initially identified 63 possible data points that could be used to rank schools on our six identified key policy facets. After a more in-depth review of each data point, some were found to duplicate content measured by other data points, whereas others seemed unreliable (e.g., a survey question was unclear and responses varied widely). We chose 41 of the possible 63 data points to include in our scoring.
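For readers who want to reproduce an agreement check of this kind, the following sketch computes pairwise percent agreement among three reviewers. The item codes are invented; the article reports only that testing continued until 100% agreement was reached.

```python
from itertools import combinations

# Hypothetical codes assigned by three reviewers to the same set of
# website items during instrument testing (1 = facet present, 0 = absent).
reviewer_codes = {
    "reviewer_1": [1, 0, 1, 1, 0, 1],
    "reviewer_2": [1, 0, 1, 1, 0, 1],
    "reviewer_3": [1, 0, 1, 0, 0, 1],
}

# Pairwise percent agreement; the instrument would be revised and retested
# until every pair agrees on every item (100% agreement).
for (name_a, a), (name_b, b) in combinations(reviewer_codes.items(), 2):
    agreement = sum(x == y for x, y in zip(a, b)) / len(a)
    print(f"{name_a} vs {name_b}: {agreement:.0%}")
```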
A final coding scheme was devised for all the data used.5 Schools that had more advanced implementation of the state policy across the six identified facets received higher PSLOI scores, and schools with less advanced implementation across the six facets received lower PSLOI scores. Schools were then rank ordered within clusters on their PSLOI scores, with the goal of identifying one school with a high level and one school with a low level of implementation in each cluster. Figure 4 illustrates the range in PSLOI scores across the schools within WIAs.

Figure 4. Preliminary site selection level of policy implementation (PSLOI) scores for the 43 high schools considered for inclusion in the sample
Note: The 16 original schools invited to participate are shown with solid black bars; WIA4 schools declined to participate, as did several other schools (labeled with "D" in school names). Substitute schools invited to participate are shown with striped bars (and labeled with "S" in school names). The eight schools selected for the study have stars above their bars. In all, 43 schools (31 across the original four clusters plus 12 in the new "replacement" WIA3 high-poverty cluster) were given PSLOI scores and considered for inclusion in the study.
a. Schools are numbered in order of PSLOI by WIA cluster. Letters following the numbers in school names correspond to the following codes: F = one of the first 16 schools chosen; V = visited but not selected; D = declined to participate, did not conform to criteria, or never responded to the invitation; S = substitute school; N = school from the new WIA3 high-poverty cluster invited to participate.
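The scoring logic described in note 5 (rescaling wide-range items so no single data point dominates, and reversing items where a high raw value indicates lower implementation) can be sketched as follows. The data points, cut points, and school names are hypothetical stand-ins for the 41 items actually scored.

```python
import pandas as pd

# Hypothetical raw data points for schools in one WIA cluster. The real
# scoring used 41 data points spanning the six EEDA facets (see note 5).
raw = pd.DataFrame(
    {"pct_9th_with_igp": [96, 42, 88, 12],   # wide range: bucket into 0-2
     "counselor_ratio_ok": [1, 0, 1, 0],     # already coded 0/1
     "barriers_reported": [4, 1, 0, 5]},     # 0-5, higher = LOWER implementation
    index=["School 1", "School 2", "School 3", "School 4"])

scored = pd.DataFrame(index=raw.index)
# Bucket wide-range percentages into three groups (scores 0-2) so this
# item does not outweigh items on narrower scales.
scored["pct_9th_with_igp"] = pd.cut(
    raw["pct_9th_with_igp"], bins=[-1, 33, 66, 100], labels=[0, 1, 2]).astype(int)
scored["counselor_ratio_ok"] = raw["counselor_ratio_ok"]
# Reverse items where a high raw value signals lower implementation.
scored["barriers_reported"] = raw["barriers_reported"].max() - raw["barriers_reported"]

# Sum item scores into a PSLOI-style total and rank within the cluster.
pslois = scored.sum(axis=1).sort_values(ascending=False)
print(pslois)
print("high-implementation pick:", pslois.index[0])
print("low-implementation pick:", pslois.index[-1])
```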
If two or more schools had similar PSLOI scores, we considered other factors, such as school size, the urbanicity of the school (as defined by the National Center for Education Statistics, 2002), and minority enrollment, to ensure that a diverse array of schools was chosen. Sixteen high schools (two with high PSLOI scores and two with low PSLOI scores in each of the four clusters) were invited to receive preliminary site data validation visits and to possibly participate in the study. Of the initial 16 schools contacted (marked by solid black bars in Figure 4, with "F" in the school name to signify "first-round" picks), nine agreed to receive visits, whereas the remaining schools either did not respond to repeated contacts or declined to participate (Figure 4, "D" in the school name to signify "declined"). One of these nine schools, School 4FD (Figure 4), was removed from the sampling frame when the structure of the high school was modified so that it no longer met our definition for inclusion in the study. In the WIA with the fewest schools accepting our invitation to participate in the study (WIA3), we invited two substitute high schools with PSLOI scores similar to those of the nonparticipating schools to receive validation visits (Figure 4, "S" in the school name to signify "substitutes"). One of the two substitute schools agreed to participate and was included in Stage 4. Toward the end of Stage 3, the schools in one WIA cluster (WIA4 in Figure 3) declined to participate in our study due to their time, budgetary, and research circumstances. To compensate for losing WIA4, the high-poverty cluster from the remaining urban group in the Stage 2 sampling frame was added as a substitute cluster. We contacted three schools from this substitute cluster (WIA3, urban, high poverty). One of these three schools agreed to participate and was also included in Stage 4. Thus, at the end of Stage 3, we had identified 10 schools to receive site visits in Stage 4.

Stage 4: Validating Policy Implementation Level and Variation on Key School Characteristics

The fourth stage of the sampling scheme involved site visits to validate the qualitative and quantitative data collected in the prior stages. We visited the schools to verify that the PSLOI scores generally reflected the reality at the schools and to determine each school's qualifications for inclusion in the final study sample. The 10 site visits were scheduled with the assistance of site administrators. The research team met with key school personnel, including principals, assistant principals, guidance directors, guidance counselors, and teachers, to verify the scores. Interviews with each individual or group were 30 minutes to 1 hour in length. We asked about EEDA implementation, the stage of development of the high school's majors and career pathways, and the operational details of the IGP development process. We asked guidance directors and guidance personnel to describe their specific roles in policy implementation; the ways in which they work with students, teachers, and parents on career development; and the amount of time they devoted to these activities. We conducted one to three focus groups with 9th- and 10th-grade teachers at each school. Each group included three to six teachers from different concentration areas, including math, English, social studies, science, career and technical education, honors, advanced placement (courses through which students have the opportunity to earn college credit), college preparation, and basic and special education courses. Focus groups lasted from 45 minutes to 1 hour each. Teachers were asked to discuss their perceptions of the school's implementation of the various components of EEDA, including career-focused activities and curricula, the progress made in implementation, and the impacts of the reform on their work generally and on how they teach their courses specifically (Smink et al., 2010). From the site visit interviews and observations, the team was able to substantiate the initial implementation selection scores and revise them where necessary. The PSLOI scores were updated with verified information gathered during the validation site visits. The new scores (site selection level of policy implementation, or SLOI, scores) were used for final sample selection and will also be revised at the end of the study to compare the change in level of policy implementation over time.
In addition to SLOI scores, we considered other information for our final selection of study sites, including school staff opinions on policy implementation, the school's interest in participating in the study, and the school's cooperation in providing materials. The final eight high schools selected varied in the level of EEDA implementation (high and low-to-moderate) and in levels of poverty, urbanicity, and industry characteristics (as characterized by location within a particular WIA). As a result, in our subsequent analyses, we will be better able to make important comparisons between and among schools based on their characteristics. We will also be able to obtain a wide variety of information to better understand how these and other factors influence state policy implementation and, subsequently, other outcomes of interest.
Discussion

Case studies are characterized by their multilevel, multidimensional nature. Such research studies naturally evolve over time, as do the contexts and sites themselves. Schools are complex and hierarchical, with multiple interrelated levels, including students, classrooms, schools, and districts. A number of factors about our study and the settings we explored led us to a mixed methods approach, not only for the data collection and analysis needed to address our research questions but also for selecting the study sites in which we could investigate those questions. Multiple vantage points and data sources are necessary to better understand the complexity of these educational settings. Such complexities inevitably invite reflection on how we framed our research, designed the overall study, and developed our site selection strategy. We assume that mixed methods have been used to select participants or cases for other mixed methods research studies, but few authors have described the specifics of their sampling strategies. Specifically, we used quantitative analyses to select four WIA regions (Stage 1) and to cluster schools from these regions on selected local economic measures in order to select high and low-to-moderate poverty schools (Stage 2). During the third and fourth stages of sampling, we used not only qualitative and quantitative data but also qualitative and quantitative methods to obtain scores with which to rank and compare schools for site selection. This mixed methods sampling design was crucial in helping us address the research questions for our study, a reflection of our pragmatic philosophical stance toward the study design. Our study was influenced by several factors common to policy implementation studies. For example, the primary goal of Stages 3 and 4 was to assess whether schools were meeting EEDA mandates as of the date we selected our sites. Because we aim to examine the implementation and impact of a mandated state policy on school and student outcomes, we realized that there would be both official reports of the implementation process, in which a school will appear to be following mandates, and firsthand data about the actual implementation process at schools. Such data may show that the policy is not being fully implemented or is not implemented as required. Policy implementation research must also account for the various individuals who need to implement the policy and for the fact that implementation is filtered down through a hierarchy. At the school level, the policy must be interpreted by administrators and implemented by counselors and teachers, and it requires student participation. The mixing of quantitative and qualitative data sources during the sequential site selection process allowed us to corroborate the various sources of information and to accommodate multiple viewpoints on initial levels of policy implementation. Comparing questionnaire results with school archival data and following up with school staff were essential for checking the quality of the various data sources used for site selection. This corroboration increased our confidence in the combined data to address our research questions, an important consideration in mixed methods research (Creswell & Plano Clark, 2007). Pragmatism provided the essential framework for our research design and for our site selection methodology.
Johnson, Onwuegbuzie, and Turner (2007) observed that mixed methods research "should be used when the nexus of contingencies in a situation, in relation to one's research question(s), suggests that mixed methods research is likely to provide superior research findings and outcomes" (p. 129). In the context of our complex multisite case study, we were particularly influenced by the notions that (a) there are multiple routes to knowledge, (b) as policy researchers we should make "warranted assertions" rather than ultimate claims of truth, and (c) theories are important for predicting and explaining change, rather than being viewed as "true" or "false" (Johnson & Onwuegbuzie, 2004). That is, the complex and multilevel nature of our longitudinal case study
required a philosophical stance that recognizes that research is situated and purposeful (Scott & Briggs, 2009). As described earlier, four pragmatic principles influenced our logic of inquiry (methodology) and our data collection techniques (method): utility, contextual relevance, generalization, and the use of interdisciplinary research teams. These principles were enacted throughout our four-stage site selection process. Because this process was ultimately guided by our desire to learn more about a complex policy initiative, it was important for us to consider how the data would help us learn about the impact of the legislation and school-based policies related to POS. Thus, by using a pragmatic framework, we recognized the need to obtain a more comprehensive picture of policy implementation (e.g., by quantifying the varying implementation levels in Stage 3) and to learn more about differences between sites in policy implementation. In studies of policy implementation, the researcher will not always be aware of all the contextual factors that influence policy implementation. By conducting interviews and focus groups with school staff during Stage 4, we were able to consider aspects of implementation that were not apparent from the review of official policy guidelines and data. We were also able to appraise contextual influences that challenged and/or altered policy implementation at school sites. From a pragmatic perspective, the economic context is particularly relevant and was thus explicitly considered during the first and second sequential stages of sampling. By accounting for these differences in our sample selection process, we can engage in more thoughtful generalization of research findings across regions with varying economic circumstances. As Collins and O'Cathain (2009) explained, "the researcher's choice of sampling design impacts the legitimation of the researcher's inferences and the appropriate generalization of results" (p. 5). In terms of generalizing results, when random samples are not possible, researchers should select sites that vary across policy implementation levels and should control for major contextual variables. By combining quantitative and qualitative components in the sampling scheme, we achieved a balance of schools across the state on initial level of policy implementation, industry mix, and local economic conditions, and incidentally on location (urbanicity) and school size. Our mixed methods sampling scheme will allow us to draw comparisons and contrasts across several dimensions that are important for addressing our research questions, including the level of policy implementation and the availability of various community resources. This mixing of sampling procedures will help us increase the internal validity, trustworthiness, and generalizability/transferability of our results (Kemper et al., 2003). Our site selection strategy increases our ability to Maximize at least the initial variance on issues of greatest policy interest, Minimize differences on student background characteristics, and Control for many extraneous variables (MaxMinCon). EEDA focuses on developing students' knowledge and abilities for high-skill, high-wage jobs and on preparing them for the modern workforce. Because the policy is statewide, its effectiveness depends on ensuring benefits to students in all communities, regardless of levels of resources. This requires a better understanding of the influence of community-level poverty on educational outcomes.
This is especially the case in South Carolina, where educational inequities could potentially influence EEDA implementation (Kuczera, 2011). Thus, in the spirit of previous pragmatic approaches to school effectiveness and school improvement research (Tashakkori & Teddlie, 2003b), our site selection strategy accounted for varying levels of poverty (during Stage 2) so we could ultimately learn about the influence of community resources on study outcomes. Our site selection process provides a strong foundation for our subsequent mixed methods data collection and analytical procedures that capitalize on the benefits of these approaches. The practical utility of pragmatism allowed us to incorporate both quantitative and qualitative methods into our sampling strategy with the goal of ensuring that our research is practical, contextual, responsive, and consequential (Datta, 1997). As our study progresses, and at the
conclusion of our study, the sharing of the practical consequences of our methodological and methods decisions should prove beneficial to the mixed methods community (Scott & Briggs, 2009). We anticipate that the site selection process described in this article will enable other researchers to think more purposefully about their selection of sites for mixed methods studies, whether these sites are schools or other organizations. Specifically, if random assignment or selection cannot be achieved or is inappropriate, multistage, mixed methods sampling designs such as ours may be used to select participants and/or sites. The rigor associated with such strategies can help researchers ultimately gain more valuable information about policy implementation across a range of settings.

Authors' Note
Julia Sharp and Catherine Mobley are the primary authors. Sam Drew and Cathy Hammond are co-principal investigators on the project. The research study team consists of all listed authors.

Acknowledgments
We would like to thank Marty Duckenfield and Peg Chrestman for their careful review of the article. We would also like to thank the JMMR editors and four anonymous reviewers whose comments and suggestions have significantly strengthened this article.

Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The work reported herein is being completed through the National Dropout Prevention Center, Clemson University, and is supported under the National Research Center for Career and Technical Education, University of Louisville, PR/Award No. VO51A070003, as administered by the Office of Vocational and Adult Education, U.S. Department of Education. However, the contents do not necessarily represent the positions or policies of the Office of Vocational and Adult Education or the U.S. Department of Education, and you should not assume endorsement by the Federal Government.

Notes
1. South Carolina Technical College System series, How EEDA works for South Carolina, including An educator's guide to develop and implement the EEDA curriculum framework and Individual Graduation Plan (2006a) and An educator's orientation guide to the Education and Economic Development Act (2006b); and South Carolina Department of Education, South Carolina Education and Economic Development Act guidelines (2006b).
2. The following schools were excluded from our sampling frame: kindergarten through 12th-grade schools, 6th- through 12th-grade schools, schools with grade levels other than 9th through 12th grade, vocational/career centers, magnet schools, charter or lab schools, and alternative schools.
3. Eighteen of the 33 schools had missing data, but we were able to contact all but two to obtain the data. The two schools that were excluded were unresponsive to our requests for data; neither school had submitted data on enrollment and types of at-risk programs or completed a questionnaire that would provide guidance counselor and career specialist information and more detail on at-risk efforts and whole-school reform.
One of the two schools with excessive missing data had also failed to submit
a state-mandated guidance report that would give details on implementation of and participation in policy activities. Also, this school's course catalog could not be located online.
4. SDE provided standardized EEDA materials to all schools on a statewide website and through regional training sessions. EEDA guidelines stipulate that schools must use the standardized form for IGP development. Additionally, all schools were required to use the 16 federally defined career clusters for reporting to the state but were allowed to modify the clusters (the names and the types of subjects included under each) for school use and to choose their own majors for each cluster. Most schools moved to the standard IGP format in their course registration materials in the first year of EEDA, but not all were using this format at the time we were selecting schools and reviewing catalogs online.
5. Most survey data were already scaled appropriately for our purposes (e.g., yes = 1, no = 0; or a range of 0-5), with higher values indicating higher implementation; for data where a higher score would indicate lower implementation, the scale was reversed. Some raw data were in a form that would result in one question carrying more weight than another. For example, for the percentage of 9th graders with a complete IGP, the range was 2 to 100 with a median of 96. Giving this data point a value of 96, compared with another data point with values from 0 to 5, would give too much weight to the first data point. In such cases, responses were categorized into three groups with scores ranging between 0 and 2. See Smink et al. (2010) for more details about this scoring process.

References
Amaratunga, D., Baldry, D., Sarshar, M., & Newton, R. (2002). Quantitative and qualitative research in the built environment: Application of "mixed" research approach. Work Study, 51(1), 17-31.
Axinn, W. G., & Pearce, L. D. (2006). Mixed method data collection strategies. New York, NY: Cambridge University Press.
Balfanz, R., & Legters, N. (2004). Locating the dropout crisis. Which high schools produce the nation's dropouts? Where are they located? Who attends them? Baltimore, MD: Johns Hopkins University Center for Social Organization of Schools.
Biesta, G. (2010). Pragmatism and the philosophical foundations of mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (pp. 95-117). Thousand Oaks, CA: Sage.
Bryk, A., Sebring, P. B., Allensworth, E., Luppescu, S., & Easton, J. Q. (2009). Organizing schools for improvement: Lessons from Chicago. Chicago, IL: University of Chicago Press.
Burgess, R. G., Pole, C. J., Evans, K., & Priestly, C. (1994). Four studies from one or one study from four? Multisite case study research. In A. Bryman & R. R. Burgess (Eds.), Analyzing qualitative data (pp. 129-145). London, England: Routledge.
Christ, T. W. (2007). A recursive approach to mixed methods research in a longitudinal study of postsecondary education disability support services. Journal of Mixed Methods Research, 1(3), 226-241.
Collins, K. M. T., & O'Cathain, A. (2009). Ten points about mixed methods research to be considered by the novice researcher. International Journal of Multiple Research Approaches, 3(1), 2-7.
Collins, K. M. T., Onwuegbuzie, A. J., & Jiao, Q. G. (2007). A mixed-methods investigation of mixed-methods sampling designs in social and health science research. Journal of Mixed Methods Research, 1(3), 267-294.
Creswell, J. W., & Plano Clark, V. L. (2007).
Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.
Datnow, A., Borman, G. D., Stringfield, S., Overman, L. T., & Castellano, M. (2003). Comprehensive school reform in culturally and linguistically diverse contexts: Implementation and outcomes from a four-year study. Educational Evaluation and Policy Analysis, 25(2), 143-170.
Datta, L. (1997). A pragmatic basis for mixed-method designs. In J. C. Greene & V. J. Caracelli (Eds.), Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (New Directions for Evaluation, No. 74, pp. 34-36). San Francisco, CA: Jossey-Bass.
Feilzer, M. Y. (2010). Doing mixed methods research pragmatically: Implications for the rediscovery of pragmatism as a research paradigm. Journal of Mixed Methods Research, 4(1), 6-16.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine.
Goetz, J. P., & LeCompte, M. D. (1984). Ethnography and qualitative design in educational research. Orlando, FL: Academic Press.
Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.
Greene, J. C., & Hall, J. N. (2010). Dialectics and pragmatism: Being of consequence. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (pp. 119-143). Thousand Oaks, CA: Sage.
Howe, K. R. (1988). Against the quantitative-qualitative incompatibility thesis or dogmas die hard. Educational Researcher, 17(8), 10-16.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.
Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112-133.
Kemper, E., Stringfield, S., & Teddlie, C. (2003). Mixed methods sampling strategies. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 273-296). Thousand Oaks, CA: Sage.
Kerlinger, F. (1986). Foundations of behavioral research. New York, NY: Holt, Rinehart & Winston.
Kuczera, M. (2011). Learning for jobs: OECD reviews of vocational and technical training: United States: South Carolina. Paris, France: Organization for Economic Cooperation and Development.
Lieber, E. (2009). Mixing qualitative and quantitative methods: Insights into design and analysis issues. Journal of Ethnographic & Qualitative Research, 3, 218-227.
Mertens, D. M. (2010). Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods (3rd ed.). Thousand Oaks, CA: Sage.
National Center for Education Statistics. (2002). School locale codes, 1987-2000. Washington, DC: U.S. Department of Education.
O'Cathain, A., Murphy, E., & Nicholl, J. (2007). Integration and publications as indicators of "yield" from mixed methods studies. Journal of Mixed Methods Research, 1(2), 147-163.
Onwuegbuzie, A. J., & Leech, N. L. (2007). Sampling designs in qualitative research: Making the sampling process more public. The Qualitative Report, 12, 238-254.
Patton, M. Q. (1980). Qualitative evaluation methods. Newbury Park, CA: Sage.
Pettigrew, A. M. (1990). Longitudinal field research on change: Theory and practice. Organization Science, 1, 267-292.
Rorty, R. (1999). Philosophy and social hope. London, England: Penguin Books.
Rumberger, R. W. (1995). Dropping out of middle school: A multilevel analysis of students and schools. American Educational Research Journal, 32, 583-625.
Rumberger, R. W. (2001). Who drops out of school and why. Santa Barbara, CA: University of California-Santa Barbara. Retrieved from http://education.ucsb.edu/rumberger/internet%20pages/Papers/Rumberger--NRC%20dropout%20paper%20version%2012%20with%20figures.doc
Sammons, P. (2010). The contribution of mixed methods to recent research on educational effectiveness. In A. Tashakkori & C. Teddlie (Eds.), Sage handbook of mixed methods in social and behavioral research (pp. 697-723). Thousand Oaks, CA: Sage.
SAS Institute Inc. (2008).
SAS 9.2 software, help and documentation. Cary, NC: Author. Scott, P. J., & Briggs, J. S. (2009). A pragmatist argument for mixed methodology in medical informatics. Journal of Mixed Methods Research, 3(3), 223-241. Simons, H. (1996). The paradox of case study research. Cambridge Journal of Education, 26, 225-240.
Smink, J., Drew, S., Hammond, C., Withington, C., Mobley, C., Sharp, J., et al. (2010). A longitudinal study of the South Carolina Personal Pathways to Success initiative. Louisville, KY: National Research Center for Career and Technical Education.
South Carolina Department of Education. (2005). 2005 EAA report card Excel files: High schools [Data file]. Retrieved August 22, 2008, from http://www.ed.sc.gov/topics/researchandstats/schoolreportcard/2005/data
South Carolina Department of Education. (2006a). 2006 EAA report card Excel files: Poverty indices [Data file]. Retrieved August 22, 2008, from http://www.ed.sc.gov/topics/researchandstats/schoolreportcard/2006/data/
South Carolina Department of Education. (2006b). South Carolina Education and Economic Development Act guidelines. Columbia, SC: Author.
South Carolina Department of Education. (2007). 2007 State of South Carolina Education Accountability Act report cards - data files: Poverty index [Data file]. Retrieved August 22, 2008, from http://www.ed.sc.gov/topics/researchandstats/schoolreportcard/2007/data/
South Carolina Department of Education. (2010). South Carolina high schools [Data file]. Retrieved February 19, 2010, from http://ed.sc.gov/schools/allschools.cfm
South Carolina Employment Security Commission. (2008). Spotlights: WIA profiles. Retrieved from http://www.sces.org/lmi/spotlights/WIA
South Carolina Technical College System. (2006a). How EEDA works for South Carolina: An educator’s guide to develop and implement the EEDA curriculum framework and individual graduation plan. Columbia, SC: Author.
South Carolina Technical College System. (2006b). How EEDA works for South Carolina: An educator’s orientation guide to the Education and Economic Development Act. Columbia, SC: Author.
Stallings, J. A., & Kaskowitz, D. (1974). Follow through classroom observation evaluation, 1972-73: A study of implementation. Menlo Park, CA: Stanford Research Institute, Stanford University.
Stringfield, S., Millsap, M. A., & Herman, R. (1997). Special strategies for educating disadvantaged children: Findings and implications of a longitudinal study. Washington, DC: U.S. Department of Education.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.
Tashakkori, A., & Teddlie, C. (Eds.). (2003a). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.
Tashakkori, A., & Teddlie, C. (2003b). The past and future of mixed methods research: From data triangulation to mixed model designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 671-701). Thousand Oaks, CA: Sage.
Teddlie, C., & Reynolds, D. (2000). The international handbook of school effectiveness research. London, England: Falmer.
Teddlie, C., & Stringfield, S. (1993). Schools make a difference. New York, NY: Teachers College Press.
Teddlie, C., Stringfield, S., & Reynolds, D. (2000). Context issues within school effectiveness research. In C. Teddlie & D. Reynolds (Eds.), The international handbook of school effectiveness research (pp. 160-186). London, England: Falmer.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Thousand Oaks, CA: Sage.
Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research, 1(1), 77-100.
U.S. Bureau of the Census. (2000). 2000 Census of Population and Housing: Summary tape file 3, Tables P43, P82 [Data files]. Retrieved from http://factfinder.census.gov
Walford, G. (2001). Site selection within comparative case study and ethnographic research. Compare, 31, 151-163.
Wells, A. S., Hirschberg, D., Lipton, M., & Oakes, J. (1995). Bounding the case within its context: A constructivist approach to studying detracking reform. Educational Researcher, 24(5), 18-24.
WhiteHouse.gov. (2010). Issues: Education. Retrieved from http://www.whitehouse.gov/issues/education
Wimpelberg, R., Teddlie, C., & Stringfield, S. (1989). Sensitivity to context: The past and future of effective schools research. Educational Administration Quarterly, 25, 82-105.
Wolf, S., Borko, H., Elliott, R., & McIver, M. C. (2000). “That dog won’t hunt!” Exemplary school change efforts within the Kentucky reform. American Educational Research Journal, 37, 349-393.
Yin, R. K. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.
Zartman, I. W. (2005). Comparative case studies. International Negotiation, 10, 3-15.