What is Assessment and its Place in Education? 
Assessment is one of the newer and more frequently misunderstood words
and issues in education circles. Many educators are not quite sure
what it is or whether it has a place in education. Perhaps, then, the
six-step conceptual model of assessment I first developed in 1998,
have presented nationally and internationally, and now present on
the cover will clarify the focus of this issue of Academic Exchange
Quarterly. For more details on the model, see the references,
publications, and workshops identified below.

Briefly stated, the model is constructed upon the following concepts:
- Assessment is purpose-based, and all steps of the model are purpose-dependent.
- Assessment is intended to be formative.
- Assessment is not complete until a decision has been made and action taken.
- Assessment is an iterative process.

Practically, the model explains assessment as a process fit for any 
organizational level (classroom, program, institution, etc.), in regard 
to all educational ends (academics, administration, and services) and
clarifies "assessment" in relation to purposes, outcomes, measurement, 
evaluation, and decision making.

Assessment as a process begins with the stating of a purpose in terms of 
what is valued and what the educators wish to do with pedagogy, curricula, 
and programs to enhance the meeting of that purpose. Outcomes are then 
stated so that learners and educators know what desired evidence suggests 
learners' progress toward the purpose as well as the effectiveness of the 
educators' interventions on behalf of student learners. In response to the 
purpose, interventions, and outcomes, a design to both measure and then 
to evaluate (judge relative to the purpose) the progress and interventions 
must be established. Data are then collected, analyzed, and evaluated by
placing them in juxtaposition to the value(s) in the purpose statement and
judging the progress of learners and the effectiveness of the educational 
interventions. At this point, decisions must be made and actions taken to 
improve the purpose, interventions, outcomes, and/or measurement and
evaluation design.

This is assessment, and in this issue we have positioned eighteen articles
that complement the conceptual model as well as demonstrate that assessment 
has a place in various education contexts. New pedagogy, campus-wide 
initiatives, end-of-course, and discipline-specific examples of assessment
are provided.

I am glad to see that some of the new, popular pedagogies are being assessed 
in four articles. We use them, but how do we know if they help students learn? 
Major and Palmer consider the effectiveness of problem-based learning while 
Coste and Druker similarly focus on service-learning. Portfolios are 
considered by Baume and Yorke as well as Fazal, Goldsby, Cozza, Goethals, 
and Howard.

Five articles, when read as a series, contribute much to our understanding 
and doing of campus-wide assessment. Mueller, Waters, Smeaton, and Pinciotti 
describe their campus-wide assessment model design and implementation process. 
Norton and Dudycha contribute to our understanding of identifying learning 
goals. Klassen and Watson address the assessment of general education. Adams 
and Slater introduce a web-based adaptive senior exit survey, and McLure and 
Rao look at college impact on lifetime educational aspirations.

And what about the end-of-course assessments like grading, course evaluation, 
and teaching effectiveness? Davis writes about fairness in grading and Ulmer 
provides a model of self-grading. Moskal is focused on quality course 
evaluations while Carey, Perrault, and Gregory link outcomes assessment with 
teaching effectiveness. Carey, Wallace, and Carey address the assessment of 
student academic motivation, and Cambiano, Vore, and Snow consider the assessment 
of learning preferences in distance learning which could impact the entire course.

Three field- or discipline-specific articles provide assessment insights beyond
their academic boundaries and are worth your reading. Reynolds, Brothen, and 
Wambach introduce a writing assessment tool used in psychology. Catelli and 
Carlino describe the use of collaborative action research in teacher education. 
Skillen and Trivett give an example from biology regarding the assessment of 
genre conventions.

These authors give you much to assess, including what you value about
assessment, what you hope to see students accomplish, how you assess, and
what actions you will take to improve assessment. I invite your comments and
make myself available to answer your assessment questions. Feel free to
contact me at kborland@montana.edu.

Dr. Kenneth W. Borland, Jr., Montana State University

Borland, K. (1998, August). Assessment for faculty: The brass tacks and the 
	brass ring. Paper presented at Faculty Development Conference, 
	Montana State University, Bozeman, MT.
Borland, K. (1998). The assessment of transition change: Challenged purposes 
	for seeking a college education. The Journal of College Orientation 
	and Transition, 6(1), 21-26.
Borland, K. (1999, June & 2000, June). Getting assessment from faculty: 
	Communicating the brass tacks and the brass ring. Paper presented at 
	the American Association for Higher Education Assessment Conference, 
	Denver and Charlotte, respectively.
Borland, K., Howard, R., & Baker, L. (2000, April). Assessment, institutional 
	research, and decision-support. Paper presented at the Colorado 
	Regional Higher Education Assessment Conference, Denver.
Borland, K. (2000, July). "What is Assessment?" for educators from Ukraine, 
	Uzbekistan, and Turkmenistan. Paper presented at American Studies 
	Scholars Program at Montana State University, Bozeman, Montana.
Borland, K. (2000, August). Teaching, learning, curriculum, and assessment. 
	Paper presented at Faculty Development Conference, King Fahd 
	University of Petroleum & Minerals, Dharan, Saudi Arabia.
Borland, K. (2000, August). Assessing general education outcomes in general 
	education, the major, and capstone courses. Paper presented at 
	Faculty Development Conference, Dickinson State University, 
	Dickinson, ND.
Borland, K. & Marley, R. (due 2001, April). A conceptual and strategic process 
	for engineering program assessment: a case study at Montana State 
	University. Paper presented at Best Assessment Processes IV: A Working 
	Symposium, Rose-Hulman Institute of Technology, Terre Haute, IN.
Borland, K. (due 2002, March). Assessing retention: Six steps and four 
	paradigms. Journal of College Student Retention: Research, Theory & 
	Practice, 4(3).