
Workbook for Designing a Process Evaluation

Produced for the Georgia Department of Human Resources, Division of Public Health

By Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.

Department of
Georgia State University

July 2002

Evaluation Expert Session
July 16, 2002 Page 1

What is process evaluation?

Process evaluation uses empirical data to assess the delivery of
programs. In contrast to outcome evaluation, which assesses the
impact of the program, process evaluation verifies what the
program is and whether it is being implemented as designed. Thus,
process evaluation asks “what,” and outcome evaluation asks “so
what?”

When conducting a process evaluation, keep in mind these three
questions:

1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?

This workbook will serve as a guide for designing your own process
evaluation for a program of your choosing. There are many steps involved
in the implementation of a process evaluation, and this workbook will
attempt to direct you through some of the main stages. It will be helpful to
think of a delivery service program that you can use as your example as
you complete these activities.

Why is process evaluation important?
1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on what components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is an intervention


Stages of Process Evaluation

1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**

Also included in this workbook:

a. Logic Model Template
b. Pitfalls to avoid
c. References

Evaluation can be an exciting,
challenging, and fun experience

Enjoy!

* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to the Evaluation Framework and Evaluation Module of the FHB Best Practice Manual for more details.


Forming collaborative relationships

A strong, collaborative relationship with program delivery staff and management will
likely result in the following:

Feedback regarding evaluation design and implementation
Ease in conducting the evaluation due to increased cooperation
Participation in interviews, panel discussion, meetings, etc.
Increased utilization of findings

Seek to establish a mutually respectful relationship characterized by trust, commitment,
and flexibility.

Key points in establishing a collaborative
relationship:

Start early. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible.

Emphasize that THEY are the experts, and you will be utilizing their knowledge and information to inform your evaluation development and implementation.

Be respectful of their time, both in person and on the telephone. Set up meeting places that are geographically accessible to all parties involved in the evaluation process.

Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible and request their feedback regarding appropriate times for on-site data collection.

Involve key policy makers, managers, and staff in a series of meetings throughout the evaluation process. The evaluation should be driven by the questions that are of greatest interest to the stakeholders. Set agendas for meetings and provide an overview of the goals of the meeting before beginning. Obtain their feedback and provide them with updates regarding the evaluation process. You may wish to obtain structured feedback; sample feedback forms appear throughout the workbook.

Provide feedback regarding evaluation findings to the key policy makers, managers, and staff when and as appropriate. Use visual aids and handouts. Tabulate and summarize information. Make it as interesting as possible.

Consider establishing a resource or expert “panel” or advisory board: an official group of people willing to be contacted when you need feedback or have questions.


Determining Program Components

Program components are identified by answering the questions who, what, when, where,
and how as they pertain to your program.

Who: the program clients/recipients and staff
What: activities, behaviors, materials
When: frequency and length of the contact or intervention
Where: the community context and physical setting
How: strategies for operating the program or intervention

BRIEF EXAMPLE:

Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students’ classroom
How: group administered intervention, small group practice

1. Instruct students what to do in case of fire (stop, drop, and roll).
2. Educate students on calling 911 and have them practice on play telephones.
3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change batteries in a home fire alarm. Have students practice each of these activities.
4. Provide students with written information and have them take it home to share with their parents. Request parental signature to indicate compliance and target a 75% return rate.
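The 75% return-rate target in step 4 gives a concrete delivery indicator that a process evaluation can check directly. A minimal sketch, using hypothetical counts (a real evaluation would take them from the program's sign-in or tracking log):

```python
# Hypothetical counts; in practice these would come from the program's tracking log.
forms_sent_home = 120      # written information packets distributed (step 4)
signatures_returned = 84   # parental signatures received back

return_rate = signatures_returned / forms_sent_home
target = 0.75

print(f"Return rate: {return_rate:.0%} (target {target:.0%})")
print("Target met" if return_rate >= target else "Target not met")
```

Comparing a measured rate against the stated target is exactly the "intended vs. delivered" comparison at the heart of process evaluation.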

Points to keep in mind when determining program
components

Specify activities as behaviors that can be observed

If you have a logic model, use the “activities” column as a starting point

Ensure that each component is separate and distinguishable from others

Include all activities and materials intended for use in the intervention

Identify the aspects of the intervention that may need to be adapted, and those that should always be delivered as designed.

Consult with program staff, mission statements, and program materials as needed.


Your Program Components

After you have identified your program components, create a logic model that graphically
portrays the link between program components and outcomes expected from these
components.

Now, write out a succinct list of the components of your program.

WHO:

WHAT:

WHEN:

WHERE:

HOW:


What is a Logic Model?

A logical series of statements that link the problems your program is attempting to address (conditions), how it will address them (activities), and the expected results (immediate and intermediate outcomes, long-term goals).

Benefits of the logic model include:

helps develop clarity about a project or program,
helps to develop consensus among people,
helps to identify gaps or redundancies in a plan,
helps to identify core hypothesis,
helps to succinctly communicate what your project or program is about.

When do you use a logic model?

Use…

– During any work to clarify what is being done, why, and with what intended results

– During project or program planning to make sure that the project or program is logical and
complete

– During evaluation planning to focus the evaluation

– During project or program implementation as a template for comparing to the actual program
and as a filter to determine whether proposed changes fit or not.

This information was extracted from the Logic Models: A Multi-Purpose Tool materials developed by Wellsys
Corporation for the Evaluation Planning Workshop Training. Please see the Evaluation Planning Workshop
materials for more information. Appendix A has a sample template of the tabular format.
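To make the chain of statements concrete, a logic model row can also be sketched as a simple data structure. The entries below are hypothetical, borrowing the fire safety example used elsewhere in this workbook:

```python
# Hypothetical logic model rows: condition -> activity -> outcome -> long-term goal.
logic_model = [
    {
        "condition": "Elementary students lack fire safety knowledge",
        "activity": "Classroom fire safety sessions, twice per year",
        "immediate_outcome": "Students can demonstrate stop, drop, and roll",
        "long_term_goal": "Fewer fire-related injuries among students",
    },
]

for row in logic_model:
    print(" -> ".join([row["condition"], row["activity"],
                       row["immediate_outcome"], row["long_term_goal"]]))
```

Writing the model out in one place makes gaps and redundancies easy to spot, which is the benefit the list above describes.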


Determining Evaluation Questions

As you design your process evaluation, consider what questions you would like to answer. It is only after
your questions are specified that you can begin to develop your methodology. Considering the importance
and purpose of each question is critical.

BROADLY….

What questions do you hope to answer? You may wish to turn the program components that you have just identified
into questions assessing:

Was the component completed as indicated?
What were the strengths in implementation?
What were the barriers or challenges in implementation?
What were the apparent strengths and weaknesses of each step of the intervention?
Did the recipient understand the intervention?
Were resources available to sustain project activities?
What were staff perceptions?
What were community perceptions?
What was the nature of the interaction between staff and clients?

These are examples. Check off what is applicable to you, and use the space below to write additional broad,
overarching questions that you wish to answer.


SPECIFICALLY …

Now, make a list of all the specific questions you wish to answer, and organize your questions categorically. Your
list of questions will likely be much longer than your list of program components. This step of developing your
evaluation will inform your methodologies and instrument choice.

Remember that you must collect information on what the program is intended to be and what it is in reality, so you may need to ask some questions in two formats.

For example:

“How many people are intended to complete this intervention per week?”
“How many actually go through the intervention during an average week?”
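This intended/actual pairing lends itself to a simple gap calculation. A sketch with hypothetical numbers:

```python
# Hypothetical figures: intended throughput vs. actual weekly counts from a tracking log.
intended_per_week = 25
actual_per_week = [18, 22, 30, 17, 25, 21]  # six observed weeks

average_actual = sum(actual_per_week) / len(actual_per_week)
gap = intended_per_week - average_actual

print(f"Intended: {intended_per_week}/week; actual average: {average_actual:.1f}/week")
print(f"Gap between design and delivery: {gap:.1f} people/week")
```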

Consider what specific questions you have. The questions below are only examples! Some may not be appropriate
for your evaluation, and you will most likely need to add additional questions. Check off the questions that are
applicable to you, and add your own questions in the space provided.

WHO (regarding client):
Who is the target audience, client, or recipient?
How many people have participated?
How many people have dropped out?
How many people have declined participation?
What are the demographic characteristics of clients?

Race
Ethnicity
National Origin
Age
Gender
Sexual Orientation
Religion
Marital Status
Employment
Income Sources
Education
Socio-Economic Status

What factors do the clients have in common?
What risk factors do clients have?
Who is eligible for participation?
How are people referred to the program? How are they screened?
How satisfied are the clients?

YOUR QUESTIONS:


WHO (Regarding staff):
Who delivers the services?
How are they hired?
How supportive are staff and management of each other?
What qualifications do staff have?
How are staff trained?
How congruent are staff and recipients with one another?
What are staff demographics? (see client demographic list for specifics.)

YOUR QUESTIONS:

WHAT:

What happens during the intervention?
What is being delivered?
What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction, etc.)?
What are the standard operating procedures?
What technologies are in use?
What types of communication techniques are implemented?
What type of organization delivers the program?
How many years has the organization existed? How many years has the program been operating?
What type of reputation does the agency have in the community? What about the program?
What are the methods of service delivery?
How is the intervention structured?
How is confidentiality maintained?

YOUR QUESTIONS:

WHEN:
When is the intervention conducted?
How frequently is the intervention conducted?
At what intervals?
At what time of day, week, month, year?
What is the length and/or duration of each service?


YOUR QUESTIONS:

WHERE:
Where does the intervention occur?
What type of facility is used?
What is the age and condition of the facility?
In what part of town is the facility? Is it accessible to the target audience? Does public transportation access the facility? Is parking available?
Is child care provided on site?

YOUR QUESTIONS:

WHY:

Why are these activities or strategies implemented and why not others?
Why has the intervention varied in ability to maintain interest?
Why are clients not participating?
Why is the intervention conducted at a certain time or at a certain frequency?

YOUR QUESTIONS:


Validating Your Evaluation Questions

Even though all of your questions may be interesting, it is important to narrow your list to questions that
will be particularly helpful to the evaluation and that can be answered given your specific resources, staff,
and time.

Go through each of your questions and consider it with respect to the questions below, which may be helpful in
streamlining your final list of questions.

Revise your worksheet/list of questions until you can answer “yes” to all of these questions. If you cannot answer
“yes” to your question, consider omitting the question from your evaluation.

Validation checklist (answer Yes or No to each):

Will I use the data that will stem from these questions?

Do I know why each question is important and/or valuable?

Is someone interested in each of these questions?

Have I ensured that no questions are omitted that may be important to someone else?

Is the wording of each question sufficiently clear and unambiguous?

Do I have a hypothesis about what the “correct” answer will be for each question?

Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response?

Do the questions constitute a sufficient set to achieve the purpose(s) of the evaluation?

Is it feasible to answer each question, given what I know about the resources for evaluation?

Is each question worth the expense of answering it?

Derived from “A Design Manual” Checklist, page 51.


Determining Methodology

Process evaluation is characterized by collection of data primarily through two formats:

1) Quantitative, archival, recorded data that may be managed by a computerized tracking or management system, and

2) Qualitative data that may be obtained through a variety of formats, such as surveys or focus groups.

When considering what methods to use, it is critical to have a thorough
understanding and knowledge of the questions you want answered. Your
questions will inform your choice of methods. After this section on types of
methodologies, you will complete an exercise in which you consider what method
of data collection is most appropriate for each question.

Do you have a thorough understanding of your
questions?

Furthermore, it is essential to consider what data the organization you are
evaluating already has. Data may exist in the form of an existing computerized
management information system, records, or a tracking system of some other
sort. Using this data may provide the best reflection of what is “going on,” and it
will also save you time, money, and energy because you will not have to devise
your own data collection method! However, keep in mind that you may have to
adapt this data to meet your own needs – you may need to add or replace fields,
records, or variables.
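As a sketch of what "adapting" existing data can mean in practice, the snippet below adds a derived field that the evaluation needs but the agency's system does not store. All field names and values here are hypothetical:

```python
from datetime import date

# Records as they might exist in an agency's system (hypothetical fields).
existing_records = [
    {"client_id": 101, "enrolled": date(2002, 1, 14), "sessions_attended": 6},
    {"client_id": 102, "enrolled": date(2002, 3, 2), "sessions_attended": 2},
]

# Add a field the evaluation needs: the fraction of the planned dose received.
SESSIONS_PLANNED = 8
for record in existing_records:
    record["completion_pct"] = record["sessions_attended"] / SESSIONS_PLANNED

print(existing_records)
```

Reusing records this way keeps the burden on program staff low while still yielding the implementation measures the evaluation questions call for.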

What data does your organization already have?

Will you need to adapt it?

If the organization does not already have existing data, consider devising a
method for the organizational staff to collect their own data. This process will
ultimately be helpful for them so that they can continue to self-evaluate, track
their activities, and assess progress and change. It will be helpful for the
evaluation process because, again, it will save you time, money, and energy that
you can better devote towards other aspects of the evaluation. Management
information systems will be described more fully in a later section of this
workbook.

Do you have the capacity and resources to devise
such a system? (You may need to refer to a later
section of this workbook before answering.)


Who should collect the data?

Given all of this, what thoughts do you have on who should collect data for your
evaluation? Program staff, evaluation staff, or some combination?

Program Staff: May collect data from activities such as attendance, demographics,
participation, characteristics of participants, dispositions, etc.; may
conduct intake interviews, note changes regarding service delivery,
and monitor program implementation.

Advantages: Cost-efficient, accessible, resourceful, available, time-efficient, and increased understanding of the program.

Disadvantages: May exhibit bias and/or social desirability; may use data for critical judgment; may compromise the validity of the program; may put staff in an uncomfortable or inappropriate position. Also, if staff collect data, they may have an increased burden and responsibility placed upon them outside of their usual job responsibilities. If you utilize staff for data collection, provide frequent reminders as well as messages of gratitude.

Evaluation staff: May collect qualitative information regarding implementation,
general characteristics of program participants, and other
information that may otherwise be subject to bias or distortion.

Advantages: Data collected in a manner consistent with the overall goals and timeline of the evaluation; prevents bias and inappropriate use of information; promotes overall fidelity and validity of data.

Disadvantages: May be costly and take extensive time; may require additional training on the part of the evaluator; presence of the evaluator in the organization may be intrusive, inconvenient, or burdensome.


When should data be collected?

Conducting the evaluation according to your timeline can be challenging. Consider how
much time you have for data collection, and make decisions regarding what to collect
and how much based on your timeline.

In many cases, outcome evaluation is not considered appropriate until the program has
stabilized. However, when conducting a process evaluation, it can be important to start
the evaluation at the beginning so that a story may be told regarding how the program
was developed, information may be provided on refinements, and program growth and
progress may be noted.

If you have the luxury of collecting data from the start of the intervention to the end of
the intervention, space out data collection as appropriate. If you are evaluating an
ongoing intervention that is fairly quick (e.g., an 8-week educational group), you may
choose to evaluate one or more “cycles.”

How much time do you have to conduct your evaluation?

How much time do you have for data collection (as opposed to designing the evaluation, training, organizing and analyzing results, and writing the report)?

Is the program you are evaluating time specific?

How long does the program or intervention last?

At what stages do you think you will most likely collect data?

Soon after a program has begun

Descriptive information on program characteristics that will not change; information
requiring baseline information

During the intervention
Ongoing process information such as recruitment, program implementation

After the intervention
Demographics, attendance ratings, satisfaction ratings


Before you consider methods

A list of various methods follows this section. Before choosing what methods are
most appropriate for your evaluation, review the following questions. (Some may
already be answered in another section of this workbook.)

What questions do I want answered? (See previous section.)

Does the organization already have existing data, and if so, what kind?

Does the organization have staff to collect data?

What data can the organization staff collect?

Must I maintain anonymity (the participant is not identified at all) or confidentiality (the participant is identified but responses remain private)? This consideration pertains to existing archival data as well as original data collection.

How much time do I have to conduct the evaluation?

How much money do I have in my budget?

How many evaluation staff do I have to manage the data collection activities?

Can I (and/or members of my evaluation staff) travel on site?

What time of day is best for collecting data? For example, if you plan to conduct focus groups or interviews, remember that your population may work during the day and need evening times.
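The anonymity/confidentiality distinction has a direct analogue in how response data are stored. A sketch with hypothetical fields: anonymous data never retain identifiers at all, while confidential data keep identity in a separate, restricted record linked only by a code.

```python
# Hypothetical survey response containing both identifiers and answers.
raw_response = {"name": "J. Smith", "phone": "555-0100",
                "satisfaction": 4, "attended_weeks": 7}

IDENTIFIERS = {"name", "phone"}

def anonymize(response):
    """Anonymity: identifiers are never stored with the responses."""
    return {k: v for k, v in response.items() if k not in IDENTIFIERS}

def split_confidential(response, code):
    """Confidentiality: identity is kept, but in a separate, restricted file."""
    identity = {"code": code, **{k: response[k] for k in IDENTIFIERS}}
    answers = {"code": code, **anonymize(response)}
    return identity, answers

print(anonymize(raw_response))
identity_file, answer_file = split_confidential(raw_response, code="P-001")
```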


Types of methods

A number of different methods exist that can be used to collect process
information. Consider each of the following, and check those that you think would
be helpful in addressing the specific questions in your evaluation. When “see
sample” is indicated, refer to the pages that follow this table.

√ Method: Description

Activity, participation, or client tracking log: Brief record completed on site at frequent intervals by participant or deliverer. May use a form developed by the evaluator if none previously exists. Examples: sign-in log, daily records of food consumption, medication management.

Case studies: Collection of in-depth information regarding a small number of intervention recipients; uses multiple methods of data collection.

Ethnographic analysis: Obtain in-depth information regarding the experience of the recipient by partaking in the intervention, attending meetings, and talking with delivery staff and recipients.

Expert judgment: Convene a panel of experts or conduct individual interviews to obtain their understanding of and reaction to program delivery.

Focus groups: Small group discussion among program delivery staff or recipients. Focus on their thoughts and opinions regarding their experiences with the intervention.

Meeting minutes (see sample): Qualitative information regarding agendas, tasks assigned, and coordination and implementation of the intervention, as recorded on a consistent basis.

Observation (see sample): Observe actual delivery in vivo or on video; record findings using a check sheet or make qualitative observations.

Open-ended interviews (telephone or in person): Evaluator asks open questions (i.e., who, what, when, where, why, how) of delivery staff or recipients. Use an interview protocol without preset response options.

Questionnaire: Written survey with structured questions. May administer in individual, group, or mail format. May be anonymous or confidential.

Record review: Obtain indicators from intervention records such as patient files, time sheets, telephone logs, registration forms, student charts, sales records, or records specific to the service delivery.

Structured interviews (telephone or in person): Interviewer asks direct questions using an interview protocol with preset response options.


Sample activity log

This is a common process evaluation methodology because it systematically records exactly what is happening during
implementation. You may wish to devise a log such as the one below and alter it to meet your specific needs. Consider
computerizing such a log for efficiency. Your program may already have existing logs that you can utilize and adapt for your
evaluation purposes.

Site: _______________          Recorder: _______________

Code | Service | Date | Location | # People | # Hours | Notes
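One way to computerize a log like the one above is a plain CSV file with the same columns. A minimal sketch (the entry values are hypothetical):

```python
import csv
import io

FIELDS = ["Site", "Recorder", "Code", "Service", "Date",
          "Location", "# People", "# Hours", "Notes"]

buffer = io.StringIO()  # a real log would open a file on disk instead
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({"Site": "Clinic A", "Recorder": "MB", "Code": "FS-1",
                 "Service": "Fire safety session", "Date": "2002-07-16",
                 "Location": "Room 4", "# People": 22, "# Hours": 1.5,
                 "Notes": "Two students absent"})

print(buffer.getvalue())
```

Because each row is one delivery event, attendance counts and service hours can later be totaled directly from the file.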


Meeting Minutes

Taking notes at meetings may provide extensive and invaluable process information that
can later be organized and structured into a comprehensive report. Minutes may be taken
by program staff or by the evaluator if necessary. You may find it helpful to use a
structured form, such as the one below that is derived from Evaluating Collaboratives,
University of Wisconsin-Cooperative Extension, 1998.

Meeting Place: __________________ Start time: ____________
Date: _____________________________ End time: ____________

Attendance (names):

Agenda topic: _________________________________________________

Discussion: _____________________________________________________

Decision | Related Tasks | Who Responsible | Deadline

1.

2.

3.

Agenda topic: _________________________________________________

Discussion: _____________________________________________________

Decision | Related Tasks | Who Responsible | Deadline

1.

2.

3.

Sample observation log


Observation may occur in various ways, but one of the most common is hand-recording specific details during a small time period. The following shows several rows from an observation log utilized during an evaluation examining school classrooms.

CLASSROOM OBSERVATIONS (School Environment Scale)
Classroom 1: Grade level _________________ (Goal: 30 minutes of observation)

Time began observation: _________   Time ended observation: _________
Subjects taught during observation period: ___________________

PHYSICAL ENVIRONMENT (record an answer for each question)

1. Number of students: ____

2. Number of adults in room:
   a. Teachers: ____   b. Para-pros: ____   c. Parents: ____   Total: ____

3. Desks/tables:
   a. Number of desks: ____
   b. Number of tables for students’ use: ____
   c. Any other furniture (include number): ____
   Arrangement of desks/tables/other furniture: ____

4. Number of computers, type: ____

5. How are computers being used?

6. What is the general classroom setup? (walls, windows, mirrors, carpet, rugs, cabinets, curtains, etc.)

7. Other technology (overhead projector, PowerPoint, VCR, etc.)

8. Are books and other materials accessible for students?

9. Is there adequate space for whole-class instruction?

12. What type of lighting is used?

13. Are there animals or fish in the room?

14. Is there background music playing?

15. Rate the classroom condition: Poor / Average / Excellent

16. Are rules/discipline procedures posted? If so, where?

17. Is the classroom noisy or quiet? Very Quiet / Very Noisy

Choosing or designing measurement instruments
Consider using a resource panel, advisory panel, or focus group to offer feedback regarding your instrument. This group may be composed of any of the people listed below. You may also wish to consult with one or more of these individuals throughout the development of your overall methodology.

Who should be involved in the design of your instrument(s) and/or provide feedback?

Program service delivery staff / volunteers
Project director
Recipients of the program
Board of …
