Evaluating the Impact of Technology on Teaching and Learning

Performing an effective technology evaluation may seem like a daunting task, but with some guidance and a step-by-step approach, it's one that most school systems can manage. Further, when you consider that the dollars you invest in determining whether your technology plan "is working" are very small compared to the vast sums you might otherwise continue to invest in non-performing systems and ideas...well, the time and money spent on evaluation is a pretty wise investment.

Sun Associates specializes in assisting school districts to develop effective evaluation and assessment procedures for their educational technology plans. The following links provide some starting information, sample tools, and sample reports. For more information, please contact us.

Sample Focus Group and Interview Questions

Sun Associates offers a variety of sample data collection and evaluation tools for districts seeking to assess the impact of technology on teaching and learning.

Research Resources for Technology Evaluation

An essential part of developing a technology evaluation is the creation of performance indicators. In Sun Associates' evaluation work, indicators are rooted in a district or project's vision for how technology is intended to support teaching and learning. In this regard, it is often useful to have a good working knowledge of what the research says about the link between technology and learning. Here's our annotated bibliography that hits the high points of this research (with particular emphasis on current issues such as 21st century learning, 1:1 technology access, and the flipped classroom). We also have a research summary that pulls many of these ideas together.

Is It Working? Designing a Technology and Assessment Plan

Presentation from NECC 2000 on designing a technology evaluation and assessment plan

A Presentation on Technology Evaluation

This PowerPoint presentation provides an overview of the basic concepts surrounding instructional technology evaluation and introduces Sun Associates' basic evaluation methodologies.

A 3-Step Evaluation Process

Information on the three basic steps of technology evaluation and assessment.

Background Resources for Technology Evaluation

It is critical that your evaluation effort be based on realistic expectations of what is possible -- both in terms of the potential impacts of educational technology and what is possible to evaluate. Here are some resources that provide a useful background in both of these areas. This online bibliography comes from one of our administrator workshops on technology evaluation.

Data Collection Strategies

Evaluation is built around data. In terms of technology evaluation, how do you collect meaningful qualitative and quantitative data showing technology impact and use? This article covers data collection basics and puts data collection strategies such as surveys into a broader assessment context.
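To make the quantitative side of this concrete, here is a minimal sketch of how Likert-style survey responses -- one common source of quantitative evaluation data -- might be summarized. The survey item, the 1-5 agreement scale, and the response values are all hypothetical illustrations, not data from any actual evaluation.

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to an item such as "I use technology to
# support student-centered instruction," rated on a 1-5 agreement
# scale (1 = strongly disagree, 5 = strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

def summarize(scores):
    """Return the mean score, the count at each response level,
    and the percentage of respondents who agree (4 or 5)."""
    return {
        "mean": round(mean(scores), 2),
        "distribution": dict(sorted(Counter(scores).items())),
        "percent_agree": round(
            100 * sum(1 for s in scores if s >= 4) / len(scores), 1
        ),
    }

print(summarize(responses))
```

A summary like this answers "how much" -- pairing it with focus groups and observations (discussed below) is what supplies the "why."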

Sample Online Teacher Surveys

Online surveys can be an efficient way of collecting data from a large number of teachers...particularly if your district has a well-developed network which offers all teachers WWW connectivity. Here are examples of online surveys developed for our evaluation clients. We've eliminated the possibility of actually clicking "submit" and submitting these sample surveys, but in all other ways, they behave like fully functioning surveys.

Focus Group Questions

As we discuss in our article on data collection, a meaningful evaluation draws on data from several different sources. Aside from surveys, focus groups are one of the most common and richest data collection strategies. This article discusses focus group basics and provides sample questions from a teacher focus group.

Observation Templates

Building and classroom observations are the third -- and often most detailed -- leg of the "triangle" of data collection. These are sample templates which we have used in several of our evaluation and data-collection projects.

A Sample Evaluation Rubric

A critical part of the Sun Associates evaluation process is the development of performance assessments for teacher or student use of instructional technology (and other aspects of technology implementation, such as teacher professional development). These assessments make use of indicators and benchmarks which are then organized into a rubric. Here's a sample.
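As an illustration of how indicators and benchmarks might be organized into a rubric, here is a minimal sketch in code form. The indicator names, the four-level scale, and every descriptor below are hypothetical examples, not the actual rubric content used in Sun Associates' evaluations.

```python
# A hypothetical performance rubric: each indicator maps performance
# levels (1-4) to benchmark descriptors.
RUBRIC = {
    "Teacher use of instructional technology": {
        1: "Technology is rarely used in instruction.",
        2: "Technology is used occasionally, mainly for presentation.",
        3: "Technology regularly supports student-centered activities.",
        4: "Technology is integral to inquiry-based, student-led learning.",
    },
    "Teacher professional development": {
        1: "No regular technology-related professional development.",
        2: "One-time workshops focused on basic tool skills.",
        3: "Ongoing training tied to curriculum integration.",
        4: "Sustained, collaborative, practice-embedded learning.",
    },
}

def benchmark(indicator, level):
    """Look up the benchmark descriptor for an indicator at a level."""
    return RUBRIC[indicator][level]

def district_profile(observed_levels):
    """Map each indicator's observed level to its descriptor."""
    return {ind: benchmark(ind, lvl) for ind, lvl in observed_levels.items()}

print(district_profile({
    "Teacher use of instructional technology": 2,
    "Teacher professional development": 3,
}))
```

The point of the structure is that evaluators rate where the district currently sits on each indicator, and the descriptors make that rating concrete and comparable from year to year.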

Final Evaluation Report

The "final report" -- even if it's just a summary of the first year of an ongoing formative evaluation -- is often a watershed event for the school or district conducting a technology evaluation. The report serves to focus community attention on specific aspects of your technology integration work and to showcase the fact that you care enough about technology to critically assess what is and is not working. Most of our clients find that the final report heightens awareness of technology in their schools and its impact on teaching and learning.

This is a sample of an evaluation project final report. This project utilized our standard formative evaluation methodology and centered on assessing the effectiveness of a district-wide technology staff development effort.

See our page on current work for more links to Sun Associates evaluation reports.

Project Proposal

Sun Associates' projects always start with a detailed project proposal. Here is a sample proposal showing how we generally outline a project to potential clients. Naturally, each proposal is unique to the client for which it is created. If you would like more information on how we could help you with a technology evaluation project, please contact us!

Back to our main Technology Evaluation page



Contact Us

Information on this site that has been produced by Sun Associates is Copyright 1997 - 2010 Sun Associates and is available for individual, one-time use by educators. Duplication is prohibited without permission. All other material is the property of its authors, and Sun Associates makes no warranty for its use or accuracy.

Last updated, August 9, 2012