Section 1

Cards (27)

Purpose of Data Collection

Front

- To provide the best information to answer your questions & help you make good decisions about your program
- Quantitative methods
- Qualitative methods
- Mixed methods

Back

Process Evaluation Strategies

Front

- Reviewing existing records
- Creating new forms for documentation
- Interviewing personnel and clients
- Direct observation
- Meeting notes, minutes
- Policies, procedures, reports & memos
- Newsletters, websites, & other media

Back

Program Evaluation

Front

- 3 types of program evaluation:
  - Process
  - Impact
  - Outcome
- Formative evaluation

Back

Process Evaluation

Front

- Is a continual feedback loop
- Monitoring the intervention as it takes place & making mid-course corrections

Back

Are participants satisfied with their experience? (Process Evaluation)

Front

- Driven by the organization's hope that satisfied customers will return for more
- Acquire feedback to improve the program
- Are participants fully participating in the program?
- Do satisfied participants = positive health outcomes?
- Positive rating bias: cognitive consistency, self-selection, & positive response style

Back

Experimental Design

Front

- Research design chosen determines the types of conclusions the evaluator can make about the program's effect on participants
- What are the main types of research designs? (a random-assignment sketch follows this card)
  - Experimental
  - Quasi-experimental
  - Non-experimental

Back
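The card above names the three design types but gives no example. As a minimal sketch of the feature that makes a design experimental, the snippet below randomly assigns a made-up participant roster to intervention and control groups; the roster, group sizes, and seed are all hypothetical.

```python
import random

# Hypothetical roster of eligible participants (names are invented).
participants = [f"participant_{i}" for i in range(1, 21)]

# Experimental design: random assignment decides who gets the intervention,
# so pre-existing differences average out across the two groups.
random.seed(42)  # fixed seed only so the example is reproducible
shuffled = random.sample(participants, k=len(participants))
half = len(shuffled) // 2
intervention_group = shuffled[:half]
control_group = shuffled[half:]

# A quasi-experimental design would use a non-randomized comparison group
# (e.g., a similar clinic); a non-experimental design has no comparison group.
print("Intervention:", intervention_group)
print("Control:", control_group)
```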

Program improvement

Front

- Where should we concentrate our efforts?
- What are our successes?
- Do we want to make changes?

Back

Accountability

Front

- Is expenditure of funds worthwhile?
- Should we continue to fund the program?

Back

Why be concerned about evaluation

Front

- Shortage of resources, so we must invest wisely
- Accountability
- What gets measured gets done
- If you don't measure results, you can't tell success from failure
- If you can't see success, you can't reward it
- If you can't reward success, you're probably rewarding failure
- If you can't see success, you can't learn from it
- If you can't recognize failure, you can't correct it
- If you can demonstrate results, you can win public support

Back

Experimental Design (cont'd)

Front

- Trade-offs in program evaluation design:
  - Limited time
  - Limited resources
  - Limited ability to understand the problem
  - Unanticipated outcomes

Back

Process Evaluation

Front

- Strategies include street intercept surveys, population-based surveys, targeted surveys with community groups

Back

Process Evaluation

Front

- We need to have good documentation of program design in order to evaluate program & replicate it

Back

Generalizability

Front

- Is program effective and will it work with our population?

Back

Formative Evaluation

Front

- To test-run various aspects of the program before implementation
- Assess strengths & weaknesses of intervention components while you are developing them so you can modify & improve them before implementation to increase their effectiveness with the target audience

Back

Outcome Evaluation

Front

- Contingent upon appropriate implementation of program components
- Type II error
- How do we know that the change in participants is due to our intervention? (a worked sketch follows this card)
  - Comparison group
  - Secular effects

Back
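To make the comparison-group idea concrete, here is a hedged sketch with invented pre/post scores: subtracting the comparison group's change from the program group's change removes the portion of the gain that secular effects would likely have produced anyway. The numbers are illustrative only, not real program data.

```python
# Invented pre/post scores on some outcome measure (illustrative only).
program_pre, program_post = 58.0, 74.0          # program participants
comparison_pre, comparison_post = 57.0, 63.0    # similar group, no program

program_change = program_post - program_pre            # 16-point gain
comparison_change = comparison_post - comparison_pre   # 6-point gain (secular trend)

# The comparison group's change approximates what would have happened without
# the intervention, so the difference is a rough estimate of the program effect.
estimated_program_effect = program_change - comparison_change
print(f"Estimated program effect: {estimated_program_effect:.1f} points")  # 10.0 points
```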

Program Evaluation

Front

- Evaluation is part of program planning
- What is evaluated in program evaluation is determined by the priorities identified in the needs assessment
- How effectively have the priorities identified in the needs assessment been met by the program?

Back

Impact Evaluation

Front

- Assesses immediate effects of intervention
- Changes in knowledge, attitudes, behaviors, skills
- Is the program achieving its goals?
- Can the impact be attributed to the intervention or another influence?

Back

Process Evaluation

Front

- How well is the program being implemented?
- Extent (coverage and intensity)
- Fidelity
- Quality of intervention implementation
- Actual vs. planned activities/services (a ratio sketch follows this card)
- Multiple data collection methods should be used

Back
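Extent and fidelity can be tracked as simple ratios. The sketch below uses invented monitoring counts; the variable names and figures are assumptions for illustration, not part of any standard instrument.

```python
# Invented monitoring counts for one reporting period (illustrative only).
eligible_population = 400            # people the program intends to reach
participants_reached = 260           # people who actually received services
sessions_planned = 12                # activities in the program design
sessions_delivered_to_protocol = 9   # activities delivered as designed

coverage = participants_reached / eligible_population           # extent of reach
fidelity = sessions_delivered_to_protocol / sessions_planned    # actual vs. planned

print(f"Coverage: {coverage:.0%}")   # 65% of the eligible population reached
print(f"Fidelity: {fidelity:.0%}")   # 75% of planned sessions delivered as designed
```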

Outcome Evaluation

Front

- Identifies changes in program participants as a result of participating in the program
- Short, intermediate, long-term effects
- Assesses program's effect on morbidity & mortality rates (a rate example follows this card)

Back
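For reference, morbidity and mortality are usually reported as rates per 100,000 population; the figures below are invented to show the arithmetic.

```python
# Invented community figures for one year (illustrative only).
deaths_from_condition = 42
midyear_population = 150_000

# Mortality rate expressed per 100,000 population.
mortality_rate_per_100k = deaths_from_condition / midyear_population * 100_000
print(f"{mortality_rate_per_100k:.1f} deaths per 100,000")  # 28.0 deaths per 100,000
```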

Process Evaluation

Front

- Systematically gathering information during program implementation to:
  - Monitor implementation
  - Document what was done
  - Identify needed changes to the program in progress
  - Identify internal and external factors that contributed to success or limited success

Back

Who is the program failing to reach and why? (Process Evaluation)

Front

- Assess program awareness and appeal of the program with eligible participants and community members
- Strategies include following up with drop-outs, on-the-spot interviews of those who are eligible but chose not to participate, and focus groups

Back

Program Evaluation

Front

- systematic collection, analysis, & interpretation of information about health promotion programs for the purpose of answering questions and making decisions about a program.

Back

Experimental Design (cont'd)

Front

- Can be difficult or unethical to withhold an intervention from the control group
- Contamination between groups can occur
- Too few subjects or groups for randomization
- Creates an unrealistic situation that is not generalizable to the "real world"

Back

Impact Evaluation (cont'd)

Front

- Is the impact the same across all groups?
- Are there unintended effects?
- How does it compare w/ other interventions?
- Helpful if intervention outcomes are not known for years

Back

Rationale for evaluation (Pirie)

Front

- Accountability
- Program improvement
- Generalizability

Back

Process Evaluation

Front

- Is the program reaching the target population?
  - Individuals
  - Community
- How will the intervention reach them?

Back

Evaluation Frameworks

Front

- A consistent approach, structure, or format to help you & others understand the thinking that went into the evaluation
- Types of questions you asked and how you asked them
- CDC Evaluation Framework
- Institute of Medicine Framework

Back