[Itech] Fwd: TP Msg. #1199 Quantitative and Qualitative Assessment Methods

Teresa Franklin franklit at ohio.edu
Tue Oct 2 13:55:53 EDT 2012


Graduate Students:

Something to think about concerning research.

Dr. Franklin

---------- Forwarded message ----------
From: Rick Reis <reis at stanford.edu>
Date: Tue, Oct 2, 2012 at 12:22 PM
Subject: TP Msg. #1199 Quantitative and Qualitative Assessment Methods
To: tomorrows-professor <tomorrows-professor at lists.stanford.edu>


Each primary type of qualitative data contributes unique and valuable
perspectives about student learning to the outcomes-based assessment
process. When used in combination, a more complete or holistic picture of
student learning is created.
-----------------------------------------------------------------------------------------------------------------------------------

TOMORROW'S PROFESSOR(sm) eMAIL NEWSLETTER
http://cgi.stanford.edu/~dept-ctl/cgi-bin/tomprof/postings.php

Archives of all past postings can be found at:
http://cgi.stanford.edu/~dept-ctl/cgi-bin/tomprof/postings.php

Sponsored by
Stanford Center for Teaching and Learning
http://ctl.stanford.edu

Check out the Tomorrow's Professor Blog at:
http://derekbruff.org/blogs/tomprof/

Folks:

The posting below describes the differences between quantitative and
qualitative research and the appropriate uses of each. It is from
Chapter 4, Assessment Methods, in the book Demonstrating Student Success: A
Practical Guide to Outcomes-Based Assessment of Learning and Development in
Student Affairs, by Marilee J. Bresciani, Megan Moore Gardner, and Jessica
Hickmott. Published by Stylus Publishing, LLC, 2283 Quicksilver Drive,
Sterling, Virginia 20166-2102 [http://www.styluspub.com]. © Copyright 2009
by Stylus Publishing, LLC. All rights reserved. Reprinted with permission.

Regards,

Rick Reis
reis at stanford.edu
UP NEXT: What Mentors Do

Tomorrow's Research

---------------------------------------------------------- 1,700 words
---------------------------------------------------------

Quantitative and Qualitative Assessment Methods

Quantitative Assessment Methods

Quantitative methods use numbers for interpreting data (Maki, 2004) and
“are distinguished by emphasis on numbers, measurement, experimental
design, and statistical analysis” (Palomba & Banta, 1999). Large numbers of
cases may be analyzed using quantitative design, and this type of design is
deductive in nature, often stemming from a preconceived hypothesis (Patton,
2002). The potential to generalize results to broader audiences and
situations makes this type of research/assessment design popular with many.
Although assessment can be carried out with the rigor of traditional
research, including a hypothesis and statistically significant results,
this is not a necessary component of programmatic outcomes-based
assessment. It is not essential to have a certain sample size unless the
scope of your assessment is at the institutional level.

A traditionally favored type of research design that has influenced
outcomes-based assessment methodology is quantitative assessment.
Quantitative assessment offers a myriad of data collection tools, including
structured interviews, questionnaires, and tests. In the higher education
setting, this type of design is found in many nationally employed
assessment tools (e.g., the National Survey of Student Engagement, the
Community College Survey of Student Engagement, and the CORE Institute
Alcohol and Drug Survey) but can also be locally developed and used to
assess more specific campus needs and student learning outcomes. When
engaging in quantitative methodological design, sampling, analysis, and
interpretation, it is important to ensure that the individuals involved are
knowledgeable about, as well as comfortable with, quantitative design
(Palomba & Banta, 1999).

At Colorado State University, two primary quantitative assessment methods
are used to examine apartment life on campus. “The Apartment Life Exit
Survey is given to residents as they begin the 'vacate' process from their
apartment. Results are tabulated twice each year, once at the end of fall
semester and once in the summer” (Bresciani et al., in press).

Administrators at Pennsylvania State University originally measured the
success of their newspaper readership program based on satisfaction and
use. The quantitative survey they were using was later revised “to include
more detailed information on students' readership behavior (e.g., how
frequently they are reading a paper, how long, and which sections),
students' engagement on campus and in the community, and their
self-reported gains in various outcomes (e.g., developing an understanding
of current issues, expanding their vocabulary, articulating their views on
issues, increasing their reading comprehension)” (Bresciani et al., 2009).
This revision allowed them to use survey methodology while still measuring
the impact of the program on student learning.

CSUS underwent a similar revision process for a locally developed
quantitative survey examining its new student orientation program.
Originally, only student and parent satisfaction were measured. The
instrument was later revised to include a true/false component in the
orientation evaluation that served as a form of indirect assessment. In the
final revision, a pre- and post-test were administered to students
attending orientation to measure the knowledge gained in the orientation
session (Bresciani et al., 2009).
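
For readers who want to see what the analysis of such a pre- and post-test
might look like in practice, the short Python sketch below computes the
average score gain across attendees. It is purely illustrative: the item
scale, the sample scores, and the `orientation_scores` structure are
invented for this example and are not drawn from the CSUS instrument.

from statistics import mean

# Illustrative sketch only: computes knowledge gain from paired pre- and
# post-test scores, as might follow an orientation session. All data here
# are invented placeholders, not CSUS results.

# Each tuple is (pre_test_score, post_test_score) for one attendee,
# both out of a hypothetical 10-point quiz.
orientation_scores = [
    (4, 8),
    (6, 9),
    (5, 7),
    (7, 10),
    (3, 6),
]

pre_scores = [pre for pre, _ in orientation_scores]
post_scores = [post for _, post in orientation_scores]
gains = [post - pre for pre, post in orientation_scores]

print(f"Average pre-test score:  {mean(pre_scores):.1f}")
print(f"Average post-test score: {mean(post_scores):.1f}")
print(f"Average gain:            {mean(gains):.1f} points")
print(f"Students who improved:   {sum(g > 0 for g in gains)} of {len(gains)}")

The same comparison could, of course, be broken out by quiz item to show
which orientation topics produced the largest gains.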

In addition, a great deal of data already contained in student
transactional systems can be used to assist in the evaluation of programs.
Data such as facility usage, service usage, adviser notations,
participation in student organizations, leadership roles held, and length
of community service can all help explain why outcomes may or may not have
been met. For instance, staff at an institution's counseling service want
all students treated for sexually transmitted diseases to be able to
identify the steps and strategies for avoiding contracting them before
leaving the 45-minute office appointment. When they evaluated this outcome,
however, they learned that only 70% of the students were able to do so.
They also examined their office appointment log and realized that, because
of the high volume of patients, they were able to spend only 27 minutes
with each student on average. This reduction in the time intended to teach
students about their well-being may explain why the counseling staff's
results were lower than they would have desired.
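
A minimal sketch of how such transactional data might be examined appears
below. It assumes a hypothetical appointment log in which each record holds
the visit length in minutes and whether the student could identify the
prevention steps afterward; the field names and figures are invented for
illustration and do not come from the counseling office described above.

# Illustrative sketch: summarizing a hypothetical counseling appointment
# log. Values are invented; they do not reproduce the 70% / 27-minute
# figures reported above.

appointment_log = [
    # (minutes_spent, outcome_met) for each student visit
    (30, True),
    (25, False),
    (28, True),
    (22, False),
    (31, True),
    (26, True),
]

total_visits = len(appointment_log)
avg_minutes = sum(minutes for minutes, _ in appointment_log) / total_visits
met_rate = sum(1 for _, met in appointment_log if met) / total_visits

print(f"Visits analyzed:          {total_visits}")
print(f"Average time per student: {avg_minutes:.0f} minutes (intended: 45)")
print(f"Outcome achievement rate: {met_rate:.0%}")

Pairing the achievement rate with the average visit length is what lets the
staff connect the shortfall in learning to the shortfall in contact time.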

Qualitative Assessment Methods

According to Denzin and Lincoln (2000), qualitative research is
“multimethod in focus, involving an interpretive, naturalistic approach to
its subject matter” (p. 2). Upcraft and Schuh (1996) expand this definition
by stating, “Qualitative methodology is the detailed description of
situations, events, people, interactions, and observed behaviors, the use
of direct quotations from people about their experiences, attitudes,
beliefs, and thoughts” (p. 21). Qualitative assessment is focused on
understanding how people make meaning of and experience their environment
or world (Patton, 2002). It is narrow in scope, applicable to specific
situations and experiences, and is not intended for generalization to broad
situations. Unlike quantitative research, qualitative research employs the
researcher as the primary means of data collection (e.g., interviews, focus
groups, and observations). Also unlike quantitative research, the
qualitative approach is inductive in nature, leading to the development or
creation of a theory rather than the testing of a preconceived theory or
hypothesis (Patton, 2002). It is important to note, then, that when
applying qualitative methodology to outcomes-based assessment, you are not
fully using an inductive approach, because you are using the methodology to
determine whether an intended outcome has been met. However, the
application of the methods themselves can yield very rich findings for
outcomes-based assessment.

Data for qualitative analysis generally result from fieldwork. According to
Patton (2002), during fieldwork a researcher spends a significant amount of
time in the setting that is being investigated or examined. Generally
multimethod in focus, qualitative fieldwork often yields three types of
findings: interviews, observations, and documents.

Each primary type of qualitative data contributes unique and valuable
perspectives about student learning to the outcomes-based assessment
process. When used in combination, a more complete or holistic picture of
student learning is created.

Interviews

Interviews comprise a number of open-ended questions that yield responses
containing information “about people's experiences, perceptions,
opinions, feelings, and knowledge” (Patton, 2002, p. 4). It is common to
engage in face-to-face verbal interviews with one individual; however,
interviews may also be conducted with a group and administered via mail,
telephone, or the Web (Upcraft & Schuh, 1996). Though questions and format
may differ, an essential component of any interview is the “trust and
rapport to be built with respondents” (Upcraft & Schuh, 1996, p. 32).
Open-ended questions can also be given to students at the conclusion of a
program or an event to receive quick and immediate feedback. At Widener
University, “questions presented before, during, and after the [student
health services] presentations allowed for an interactive experience and a
means to monitor learning progress” (Bresciani et al., in press).

Observations

Observations, on the other hand, do not require direct contact with a study
participant or group. Rather, this type of data collection involves a
researcher providing information-rich descriptions of behavior,
conversations, interactions, organizational processes, or any other type of
human experience obtained through observation. Such observation may be
either participant, in which the researcher is actually involved in the
activities, conversations, or organizational processes, or nonparticipant,
in which the researcher remains outside the activity, conversation, or
organizational process being observed (Creswell, 1998; Denzin & Lincoln,
2000; Patton, 2002). Many methods can be used to keep a record of
observations. One is to take notes during the observation; another
commonly employed method is to create a checklist or rubric to use during
the observation. The checklist or rubric not only gives the observer a set
of criteria to observe, but it also allows the observer to show student
progress over time and to correlate a number with a qualitative process. At
North Carolina State University, for example,

           a total of 259 students who were found guilty of a violation of
           the [Student] Code [of Conduct] were assigned a paper with
           questions specifically written to correspond with the criteria
           for the development of insight and impact on life issues, as
           identified in the learning outcome. A rubric was used to review
           the papers. The rubric was created based on a theory of insight
           by Mary M. Murray (1995). In her book Artwork of the Mind,
           Murray describes how to determine the development of insight
           through writing. Initially, 20 papers were drawn randomly to
           test the rubric. The rubric originally had a scale with three
           categories (beginning, developing, and achieved) and six
           dimensions based on the theory and practice. In total, 22 papers
           were drawn and reviewed based on the rubric. (Bresciani et al.,
           2009)
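
To make the mechanics of rubric-based review concrete, here is a small
Python sketch that scores a paper against a rubric with three performance
levels and six dimensions, roughly in the spirit of the NC State example.
The dimension names, level values, and sample ratings are hypothetical
placeholders, not the Murray-based rubric itself.

# Illustrative sketch of rubric scoring: three performance levels applied
# across six dimensions. Names and ratings are invented placeholders.

LEVELS = {"beginning": 1, "developing": 2, "achieved": 3}

DIMENSIONS = [
    "recognizes the violation",
    "identifies impact on self",
    "identifies impact on others",
    "connects behavior to values",
    "articulates alternative choices",
    "plans future behavior",
]

def score_paper(ratings):
    """Convert one paper's per-dimension ratings into numeric scores."""
    return {dim: LEVELS[ratings[dim]] for dim in DIMENSIONS}

# One reviewer's hypothetical ratings for a single paper.
sample_ratings = {
    "recognizes the violation": "achieved",
    "identifies impact on self": "developing",
    "identifies impact on others": "developing",
    "connects behavior to values": "beginning",
    "articulates alternative choices": "developing",
    "plans future behavior": "achieved",
}

scores = score_paper(sample_ratings)
total = sum(scores.values())
print(f"Total rubric score: {total} of {len(DIMENSIONS) * max(LEVELS.values())}")

Attaching a number to each level is what allows the qualitative judgments
to be compared across papers or tracked over time, as described above.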

Isothermal Community College (ICC) incorporated the qualitative assessment
method of using portfolios for professionals completing the assessment
process. Although this particular example focuses on staff and departments
using portfolios, this method of assessment is commonly used with students
as well. At ICC

          each year staff set aside time to reflect on what has been
          learned through assessment, compile related documents into a
          portfolio, and summarize major areas of learning into what we
          refer to as “reflective narratives.” The process is systematic
          and ongoing, with portfolios and narratives submitted for review
          by various administrators in June of each year. (Bresciani et
          al., 2009)

Documents

Finally, documents include “written materials and other documents from
organizational, clinical, or program records; memoranda and
correspondence; official publications and reports; personal diaries,
letters, artistic works, photographs, and memorabilia; and written
responses to open-ended surveys” (Patton, 2002, p. 4). Public records and
personal documents are the two primary categories of documents one might
use when doing outcomes-based assessment or research (Upcraft & Schuh,
1996). Newspaper and magazine excerpts, enrollment and retention records,
and judicial records are examples of public records. Both types of
documents can enhance the overall data collected in an assessment project.
It is important to note, however, that the authenticity of documents must
be determined prior to using them for assessment (Creswell, 1998; Patton,
2002; Upcraft & Schuh, 1996).

In addition to the aforementioned documents, many student affairs
professionals also use portfolios, student reflections, reports, or other
forms of classroom-type documents for outcomes-based assessment data
collection. Again, criteria checklists or rubrics can be used in the
analysis of documents to identify whether outcomes are met. Keep in mind
that whenever criteria are used with a qualitative method, the process of
inductive discovery is diminished, and so, therefore, is the true nature of
the qualitative methodology. Nonetheless, documents are a rich source of
information and provide a great starting point for any assessment project.

REFERENCES

Bresciani, M. J. (in press-a). Challenges in the implementation of
outcome-based assessment program review in a California Community College
District. Community College Journal of Research and Practice.

Bresciani, M. J. (in press-b). An introduction to outcomes-based
assessment: A comparison of approaches. In McClellan & J. Stringer (Eds.),
Handbook for student affairs administration (3rd ed.). San Francisco:
Jossey-Bass.

Bresciani, M. J. (in press-c). Understanding barriers to student
affairs/services professionals' engagement in outcomes-based assessment of
student learning and development. College Student Journal.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing
among five traditions. Thousand Oaks, CA: Sage.

Denzin, N., & Lincoln, Y. (Eds.). (2000). Handbook of qualitative research.
Thousand Oaks, CA: Sage.

Maki, P. L. (2004). Assessing for learning: Building a sustainable
commitment across the institution. Sterling, VA: Stylus.

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning,
implementing, and improving assessment in higher education. San Francisco:
Jossey-Bass.

Patton, M. Q. (2002). Qualitative research and evaluation methods. Thousand
Oaks, CA: Sage.

Schuh, J. H., Upcraft, M. L., & Associates. (2001). Assessment practice in
student affairs: An application manual. San Francisco: Jossey-Bass.

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A
guide for practitioners. San Francisco: Jossey-Bass.



* * * * * * *
NOTE: Anyone can SUBSCRIBE to the Tomorrows-Professor Mailing List by going
to:
https://mailman.stanford.edu/mailman/listinfo/tomorrows-professor
You can UNSUBSCRIBE by hitting "return" to this posting with the word
"unsubscribe" in the subject line.





--++**==--++**==--++**==--++**==--++**==--++**==--++**==
tomorrows-professor mailing list
tomorrows-professor at lists.stanford.edu
https://mailman.stanford.edu/mailman/listinfo/tomorrows-professor






-- 
~~~~~~The best student-centered learning experience in America~~~~

Dr. Teresa Franklin
Professor, Instructional Technology
Instructional Technology Program Coordinator
313D McCracken Hall
Dept. Educational Studies
The Gladys W. and David H. Patton College of Education
Ohio University
Athens, OH 45701
740-593-4561 (office)
740-541-8847 (cell)
740-593-0477 (fax)
also: franklinteresa at gmail.com

"A teacher affects eternity; [she]he can never tell where the influence
stops." - Henry Adams


