Evaluation essentials from A to Z, Marvin C. Alkin

Label
Evaluation essentials from A to Z
Title
Evaluation essentials from A to Z
Statement of responsibility
Marvin C. Alkin
Creator
Alkin, Marvin C
Subject
Evaluation research (Social action programs)
Language
eng
Cataloging source
DLC
Illustrations
illustrations
Index
index present
Literary form
non fiction
Nature of contents
bibliography
Label
Evaluation essentials from A to Z, Marvin C. Alkin
Instantiates
Publication
Bibliography note
Includes bibliographical references and index
Contents
  • Section B
  • What Is the Evaluation Plan (Process Measures)?
  • The Evaluation Design
  • Process Measures
  • Program Elements
  • Program Mechanisms
  • Question
  • Section P
  • What Is the Evaluation Plan (Outcome Measures)?
  • An Exercise to Assist Us
  • Toward Stronger Causal Models
  • Why Do Evaluations?
  • Descriptive Designs
  • Mixed Methods
  • Summary
  • My Advice
  • Section Q
  • What Is the Evaluation Plan (Procedures and Agreements)?
  • Evaluation Activities: Past, Present, and Upcoming
  • The Written Evaluation Plan
  • The Contract
  • My Advice
  • Making Decisions
  • Section R
  • How Are Quantitative Data Analyzed?
  • Types of Data
  • A First Acquaintance with the Data
  • Measures of Central Tendency
  • Measures of Variability
  • Getting Further Acquainted with the Data
  • Descriptive and Inferential Statistics
  • Are the Results Significant?
  • Appropriate Statistical Techniques
  • Issues for Professional Evaluation
  • My Warning
  • Section S
  • How Are Qualitative Data Analyzed?
  • Refining the Data
  • Testing the Validity of the Analysis
  • Section T
  • How Do Analyzed Data Answer Questions?
  • Difficulties in Valuing
  • Valuing in a Formative Context
  • "Valuing" Redefined
  • Time-Out
  • A Final Note
  • Section U
  • How Are Evaluation Results Reported?
  • Communication
  • Reporting
  • The Final Written Report
  • Nature and Quality of Writing
  • Section V
  • What Is the Evaluator's Role in Helping Evaluations to Be Used?
  • A Word about "Use"
  • The RUPAS Case
  • What Is Use?
  • What Can You Do?
  • Guard against Misuse
  • Section W
  • How Are Evaluations Managed?
  • Acquiring the Evaluation
  • Contract/Agreement
  • Operational Management
  • Section X
  • What Are the Evaluation Standards and Codes of Behavior?
  • The Rural Parents' Support Program (RUPAS): A Community Well-Being Case Study
  • Judging an Evaluation
  • The Program Evaluation Standards
  • American Evaluation Association Guiding Principles
  • Section Y
  • How Are Costs Analyzed?
  • Cost-Effectiveness Analysis
  • Cost-Benefit Analysis
  • Cost-Utility Analysis
  • And Now to Costs
  • How to Determine Cost
  • Nicole Eisenberg
  • Section Z
  • How Can You Embark on a Program to Learn More about Evaluation?
  • Getting Feedback on Evaluation
  • Taking Full Advantage of This Volume
  • Gaining Evaluation Expertise Beyond This Volume
  • Section C
  • Who Does Evaluations?
  • Evaluator Settings
  • Multiple Orientations to Doing Evaluation
  • My View
  • Section D
  • Who Are the Stakeholders for an Evaluation?
  • Stakeholders, Not Audience
  • Who Are the Stakeholders?
  • Focus on Primary Stakeholders
  • Differences in Stakeholder Participation
  • Section E
  • Section A
  • How Are Positive Stakeholder Relationships Maintained?
  • Gaining RTC (Respect, Trust, Credibility)
  • Concluding Note
  • Section F
  • What Is the Organizational, Social, and Political Context?
  • Organizational Context
  • Social Context
  • Political Context
  • My Advice
  • Thinking Ahead
  • What Is Evaluation?
  • Section G
  • How Do You Describe the Program?
  • Program Components
  • Program Size and Organizational Location
  • Program Differences
  • What Should We Know about Programs?
  • Learning about the Program
  • Section H
  • How Do You "Understand" the Program?
  • Theory of Action
  • Professional Program Evaluation
  • Logic Models
  • Why Is This Important?
  • What Does a Logic Model Look Like?
  • A Partial Logic Model
  • Getting Started
  • Section I
  • What Are the Questions/Issues to Be Addressed?
  • Kinds of Evaluation Questions
  • Getting Started on Defining Questions
  • Some Next Steps
  • Evaluation and Research
  • Section J
  • Who Provides Data?
  • Again, Be Clear on the Questions
  • Focus of the Data
  • Selecting Individuals
  • Section K
  • What Are Instruments for Collecting Quantitative Data?
  • Instruments for Attaining Quantitative Data
  • Acquisition of Data
  • Existing Data
  • Evaluation Definition
  • Finding Existing Instruments
  • Developing New Instruments
  • Questionnaire Construction
  • Measuring Achievement
  • Achievement Test Construction
  • Observation Protocols
  • Section L
  • What Are Instruments for Collecting Qualitative Data?
  • Developing New Instruments
  • Observations
  • A Confusion of Terms
  • Interviews and Focus Groups
  • Surveys and Questionnaires
  • Section M
  • What Are the Logistics of Data Collection?
  • Gaining Data Access
  • Collecting Data
  • Quality of Data
  • Understanding the Organization's Viewpoints
  • My Advice
  • Section N
  • Evaluation Purposes
  • Are the Questions Evaluable (Able to Be Evaluated)?
  • Stage of the Program
  • Resources
  • Nature of the Question
  • Establishing Standards
  • Technical Issues
  • Ethical Issues
  • Political Feasibility
  • My Advice
  • Section O
Control code
ocn515449497
Dimensions
23 cm
Extent
xii, 260 p.
Isbn
9781606238998
Isbn Type
(hardcover : alk. paper)
Lccn
2010022239
Other physical details
ill.
System control number
(OCoLC)515449497

Library Locations

    • Wellington Library
      Wellington- Massey University Library, Block 5, 63 Wallace Street, Wellington, 6021, NZ
      -40.385395 175.617407