September 2000

Performance Assessment Links in Science (PALS):
An On-Line Resource Library

DRAFT Final Report

 

Prepared for:

The National Science Foundation
4201 Wilson Boulevard
Arlington, VA 22230

 

INTRODUCTION

The alignment of standards and tests is essential if assessment systems are to serve the multiple purposes envisioned in support of school reform: to communicate expectations, signal priorities, model desired teaching and assessment practice, provide sound accountability data, and provide feedback to inform instruction. Understanding the central role that performance assessment plays in standards-based reform, educators are seeking ways to use this form of assessment to test student learning. Education agencies need pools of performance tasks to use as components in their student assessment programs and in evaluations of state and federally funded programs (McLaughlin & Shepard, 1995). Reform projects, too, need standards-based assessments, as do teachers who are trying to implement reforms. Experience indicates, however, that the costs and level of effort required to develop performance assessment tasks, scoring rubrics, and rater training materials, and then to ensure their technical quality, are very high (CPRE, 1995; Quellmalz, 1984).

This final report summarizes a grant completed by the Center for Technology in Learning at SRI International to develop Performance Assessment Links in Science (PALS), an innovative approach for sharing exemplary assessment resources, collaborating on the development of new ones, and understanding how the use of standards-based performance assessment can advance science education reform at all levels of the educational system.

GOALS

The PALS project has addressed two primary goals: (1) To develop a two-tiered on-line performance assessment resource library composed of performance assessment tasks for elementary, middle, and secondary levels from multiple sources, such as state assessment programs and consortia and national reference exams (NAEP, TIMSS, New Standards). One tier is intended for use by teachers and professional development organizations. The second tier is intended to be a password-protected, secure Accountability Pool of science performance assessments for use by state assessment programs and systemic reform programs (e.g., Systemic Initiatives). (2) To identify, study, and evaluate the effectiveness of policies, implementation models, and technical quality requirements for the use of the two tiers of PALS.

Our partners include the Council of Chief State School Officers (CCSSO), three states (Connecticut, Illinois, and Kentucky), and two assessment consortia: the CCSSO State Collaborative on Assessment of Students and Standards (SCASS) for science and the Partnership for Assessment of Standards-based Science (PASS). The partners have participated in the PALS project development by contributing assessment tasks, promoting implementation of the on-line assessment resources, and providing ongoing advice as members of the PALS Steering Committee.

Figure 1 portrays our vision of how a fully developed PALS could support the accessibility and use of science performance assessment. In our design for PALS, assessment programs such as reference exams (e.g. TIMSS, the National Assessment of Educational Progress, the New Standards Science Assessment), state and other mandated testing programs, and districts would contribute standards-based science assessment tasks with documented technical quality to the PALS on-line resource library. The fully developed on-line performance assessment bank would provide two sets of resources: an open pool for use by professional development organizations for capacity-building and by teachers for classroom assessment, and a secure pool for use by assessment programs for accountability testing.

The on-line Professional Development Pool contains resources that have documented technical quality and have been released for access by teachers and professional development groups. Pre-service and in-service programs can reach teachers in geographically distributed and remote locales, resulting in great savings in travel and materials expenses. On-line guidelines and templates can support classroom use of science performance assessments. Teachers can administer the science performance tasks as part of their classroom assessments, adapt them, or use them as models for developing similar investigations. Teachers can engage in on-line rating sessions and conversations about how their students' work meets local and national science standards.

A second set of resources, the Accountability Pool, will be composed of password-protected, secure tasks accessible only by approved assessment program staff. Assessment programs can thus share their resources and have access to a much larger pool of science performance assessments to use or adapt for their testing administrations. The PALS resource library provides large, continually updated collections that support efficient search, selection, and printing. Moreover, on-line rater training and scoring promise to vastly reduce assessment costs by allowing these functions to take place at geographically distributed sites.

PROJECT ACTIVITIES AND OUTCOMES

Steering Committee Meetings

The PALS Steering Committee met annually to review project progress and recommend directions for each year of the project. In Year 1, Steering Committee members included Carmen Chapman, Kathleen Comfort, Bernard Gifford, Daniel Ochs, Douglas Rindone, Edward Roeber, and Brian Stecher. In Years 2 and 3, the Steering Committee consisted of Kathleen Comfort, Daniel Ochs, Steven Weiner (for Douglas Rindone), John Olson (for Edward Roeber), Pamela Stanko (for Carmen Chapman), and Brian Stecher. Joan Herman, the external evaluator, also attended each meeting.

At the Year 1 meeting the Steering Committee addressed six issues: (1) criteria for including performance assessment resources in the pools, (2) procedures for aligning resources with science standards, (3) policies for access by partners and others to the two pools, (4) implementation models, (5) alternatives for operating, sustaining, and scaling the pools, and (6) plans for contributions to the pools by partners and others. The committee agreed that science tasks could be included if they had been developed according to a systematic test development process, including content and sensitivity reviews; had been field tested with at least 100 students; reported score distributions; and had established acceptable levels of interrater reliability. The committee felt that the project should focus on stocking the Professional Development Pool, given the concerns of several of the partners about the security of tasks in the Accountability Pool for high-stakes testing. The general discussion about strategies for supporting and growing the library after the grant period ranged from fee-for-use to licensing. Finally, the partners explained the ways in which they could contribute tasks or broker implementation studies. For each subsequent year of the grant, the PALS Steering Committee and the external evaluator have met to provide feedback and direction on the project's progress.
In Years 2 and 3, the committee was informed of the Web site's (1) new features; (2) growth in the number of tasks; (3) links to state standards; (4) links to science inquiry curricula; and (5) increasing use. In addition, project staff discussed any difficulties encountered during the implementation process.

Development of On-line Science Assessment Resources

During the three years of the project, the PALS team expanded the number and types of resources placed on the PALS Web site. The proposed resource development included a systematic process for collecting, formatting, and posting the assessment task components (administration procedures, student booklet, scoring rubrics, scored student work, and technical quality information); indexing the assessments to the National Science Education Standards (NSES); and generating customized assessment planning charts. In response to user feedback, the PALS staff considerably elaborated the presentation of the assessment task components, added functions for searching by additional science standards and curriculum frameworks, and expanded the purpose and function of the assessment guidelines. We describe the development activities below.

Performance Assessment Tasks.

As of September 30, 2000, approximately 170 science performance assessment tasks are posted on the Web site, and new tasks are continually being processed and added. An additional 50 tasks are in process. The PALS staff is making substantial edits and informational improvements to previously posted tasks and creating instructive equipment set-up illustrations to include in the administration procedures for many of the elementary tasks.

Currently, there are approximately 44 tasks posted for grade levels K-4, 88 tasks for grade levels 5-8, and 49 tasks for grade levels 9-12. (Totals are approximated to account for tasks that will be posted by the time this report is submitted for review.) We have made some progress in our efforts to identify and obtain specific tasks designed to address those science standards that were underrepresented in the collection at the end of Year 2.

The assessment collection has grown considerably in the range of task formats and development sources it represents. The performance tasks in the resource bank currently come from eleven sources (new tasks are continually being added). In addition to the organizations that had previously provided assessment materials (i.e., the Council of Chief State School Officers State Collaborative on Assessment and Student Standards; the Partnership for the Assessment of Standards-Based Science; the Third International Mathematics and Science Study; the National Assessment of Educational Progress; the Connecticut Academic Performance Test; the Kentucky Department of Education; the New York State Education Department; and the Assessment of Performance Unit, Department of Education and Science, Welsh Office, and Department of Education for Northern Ireland), we have obtained tasks from the New Standards Science Reference Exam Project, the RAND assessment research group, and the Oregon State Department of Education. As before, the vast majority of the assessment tasks in the current collection require student knowledge and understanding of science inquiry skills as well as discipline-specific scientific content. Available formats include tasks for students working individually as well as many tasks in which students work in cooperative groups. Table 1 below summarizes the distribution of assessment tasks in the PALS collection by grade range and NSES content areas. Appendix A presents the complete task list indexed to the NSES.

 

Table 1

PALS Task Collection: Distribution by National Science Education Standards and Description of Contributors

Grade Level         A    B    C    D    E    F    G   Task Totals
Elementary (K-4)   40   28    9    3    2    1    0       40
Middle (5-8)       84   50   16   23    4    4    0       84
High (9-12)        56   28   15    3    6    8    1       56

Key: A = Science as Inquiry; B = Physical Science; C = Life Science;
D = Earth & Space Science; E = Science and Technology; F = Science in
Personal & Social Perspectives; G = History & Nature of Science

*All PALS Tasks contain scientific inquiry. Many PALS tasks address more than one content area.

Task Sources: Number of Tasks Contributed and Posted
Assessment of Performance Unit (APU) 20
Connecticut Academic Performance Test (CAPT) 5
Council of Chief State School Officers (CCSSO/SCASS) 33
California Systemic Initiative Assessment Consortium (CSIAC) 1
Illinois State Teachers Association (ISTA) 6
Kentucky Department of Ed (KDOE) 25
National Assessment of Educational Progress (NAEP) 1
New Standards (NS) 2
New York Department of Education (NYDOE) 56
Oregon State Department of Education (OSDE) 19
RAND publication (RAND) 10
Third International Math and Science Study (TIMSS) 2

 

The SRI assessment team continues to find an overabundance of performance assessment tasks addressing concepts from the physical sciences and a much poorer representation of the other content areas. Because large-scale assessment programs have been the main source of field-tested performance assessments for the PALS project, the conditions and requirements for large-scale, standardized administration of the assessments tend to severely limit development of tasks in the areas of Life and Earth/Space science. The PALS project team will continue to seek out field-tested investigations for underrepresented NSES standards. Curriculum-linked tasks being developed in Oregon and Maine may provide additional tasks that are extended investigations linked to life and earth/space science standards. It is likely, however, that field-tested science performance tasks have not been developed for a number of the NSES. New development will be needed to fill the gaps for these underrepresented standards.

A prototype of the Accountability Pool "shopping mall" has been developed. Users will have to register online at PALS. The shopping mall will accommodate tenants such as test publishers and assessment programs, and will host "collaboratories" of groups wishing to co-develop secure assessments. Users will be able to "window shop" and view a sample assessment task displayed by each tenant. For example, a district assessment director could browse the sample science assessments offered by the CCSSO Science SCASS and the Partnership for the Assessment of Standards-Based Science (PASS) and then contact the publisher or program for further information. Appendix B presents screen shots of the Accountability Pool design (see Figures 11 and 12). The PALS project will be seeking partners to participate in the Accountability Pool. Because state partners remain wary of test security, we will be working with our two district implementation partners in Texas and Illinois on lower-stakes accountability testing and with science curriculum programs on program evaluation assessments.

The technical quality of tasks in the PALS collection has been assured by the contributors, all experienced, reputable assessment developers. With input from the external evaluator, the PALS staff developed an interview questionnaire to administer to past and present contributors to complete the information about the development process and technical quality indicators gathered for the PALS tasks. The PALS staff has conducted interviews with assessment program directors from Connecticut, Kentucky, and New York. Information has also been documented for tasks developed by the CCSSO Science SCASS, the New Standards Science assessment program, and the Illinois State Teachers Association. The technical quality information collected during the interviews is being posted with the PALS tasks.

PALS staff have obtained technical quality information from five of the twelve developers (the Connecticut Academic Performance Test, Illinois Learning Standards, New York State Education Department, Kentucky Department of Education, and New Standards). The technical quality of the NAEP and TIMSS tasks can be found on each assessment's Web site. Tasks from all of these developers underwent a content review, usually conducted by content experts and science teachers; curriculum and assessment experts also participated in content reviews. Tasks from two of the five developers underwent sensitivity reviews. All developers administered tasks to samples of students representative with regard to gender, ethnicity, socio-economic status, and disability, and four developers also sampled with regard to limited-English-proficient status. Developers administered each task to at least 500 students per grade level. The various developers also conducted analyses including interrater reliability analyses, cut score and internal consistency analyses, and disaggregations of scores by subgroups.

The PALS Guide.

The PALS Guide identifies the features of good assessments, provides detailed information on the three key components of performance assessment (i.e., achievement targets, tasks, and rubrics), and serves as an interactive tool to guide users in adapting performance assessments to their own purposes. For each of the three components, a suite of resources is provided that can enhance teachers' knowledge of performance assessment. These resources include: (1) an overview of the nature and role that each component plays in an assessment system; (2) procedures for designing effective targets, tasks, and rubrics; (3) common errors in designing the components; (4) illustrative cases that describe well vs. poorly designed components; and (5) a list of citations for journal articles, books, and technical reports that provide additional print resources (several available online).

To adapt a PALS task, the user can complete a multi-step, interactive exercise. Using a variety of templates, the user modifies an existing performance assessment, proceeding through a series of guided steps for each component (target, task, and rubric), and can print out the revised task for classroom use.

We plan to expand the number of task adaptation examples. Initial reactions to the Guide have been very positive. We view this feature as a major step forward in creating online professional development for performance assessments.

Assessment Charts. To help users identify assessments in the PALS collection that have been indexed to important science standards, the on-line system provides assessment planning charts (Stiggins, 1994; Stiggins, Rubel, & Quellmalz, 1986). The PALS Web site software automatically produces an assessment planning chart to display tasks that are intended to test selected standards. Brief descriptions of each candidate task are presented below the chart to help the users decide if they would like to proceed to review the task components.

Standards. The PALS project has created a relational database to permit users to search for tasks in the PALS collection that have been designed to test science standards selected by the user. Each of the 180 science assessments in the PALS collection has been indexed to the National Science Education Standards that the investigation is intended to test. We plan to add the AAAS Benchmarks to the Web site. In addition, the PALS Web site allows searching by other science standards from states or districts. In Year 3, science standards from the states of Texas and Illinois were placed on the PALS Web site to illustrate the possibility of customizing the search for science assessments via state and local standards.

Curriculum Frameworks. In response to requests from professional developers and teachers participating in PALS sessions, another search feature has been added to the PALS Web site. Users can now search for tasks in the PALS library related to curriculum materials. We have created tables indexing PALS tasks to a few widely used NSF-funded science inquiry curricula to illustrate the curriculum customization feature. The enthusiastic reception of this new feature suggests that providing teachers with cross-links among science standards, curriculum units, and performance assessments will greatly increase use of PALS and advance science education reform.

Technology Development

Besides the ongoing addition of new tasks, PALS technology development has focused on three main areas: (1) scalability, to support the site's growth; (2) usability, through a redesign of the PALS interface; and (3) new features to support user needs. Each of these areas is elaborated in more detail below.

Scalability: Supporting growth. Over the past two years, the size of the PALS Web site has increased roughly fivefold, from 10 megabytes of data across 1,000 files in 1998 to 48 megabytes of data across 5,000 files as of September 2000. The PALS Web site received over 170,000 hits in 1999 and is expected to reach 200,000 hits in 2000 (130,000 as of August 31). Requests for the home page account for approximately 10% of these hits. At least 10% of the visits came from 91 countries other than the United States: PALS has been accessed over 100 times from 37 countries, over 500 times from 11 countries, and over 1,000 times from 6 countries other than the US. Given these growth figures, we recognized a critical need to optimize our underlying technology in order to maintain or improve Web site performance.

The PALS architecture employs a relational database with four tables (task information, standards information, mappings between tasks and standards, and mappings between standards) to store PALS data. These data are used primarily for dynamic generation of assessment charts. For improved performance and scalability, we ported our database server (first mSQL, then SOLID) to a faster, more extensible server, MySQL. The migration to MySQL improved Web site performance tenfold, allowing us to better handle growth as we scale to hundreds of performance tasks. We also optimized our server-side scripting language (used to dynamically generate assessment charts) to work with the new database, and realized another twofold gain in performance as a result. Finally, we migrated to a faster Web server (from NCSA to Netscape Server, and then to Apache) to further improve performance and take advantage of advanced security and programming features, such as Secure Sockets Layer (SSL) and Java servlets.
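The four-table design can be illustrated in a few lines. The following is a sketch only, using SQLite as a stand-in for MySQL; the table names, columns, and sample tasks are our own assumptions, not the actual PALS schema:

```python
# Illustrative sketch of the four-table PALS schema (names are assumed),
# using Python's built-in sqlite3 in place of the production MySQL server.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE tasks (task_id INTEGER PRIMARY KEY, title TEXT, grade_range TEXT);
CREATE TABLE standards (std_id TEXT PRIMARY KEY, description TEXT);
CREATE TABLE task_standards (task_id INTEGER, std_id TEXT); -- task <-> standard
CREATE TABLE standard_map (local_id TEXT, nses_id TEXT);    -- standard <-> standard
""")
db.executemany("INSERT INTO tasks VALUES (?,?,?)",
               [(1, "Mystery Powders", "K-4"), (2, "Pendulum Swing", "5-8")])
db.executemany("INSERT INTO standards VALUES (?,?)",
               [("A", "Science as Inquiry"), ("B", "Physical Science")])
db.executemany("INSERT INTO task_standards VALUES (?,?)",
               [(1, "A"), (1, "B"), (2, "A")])

# An assessment chart is essentially: for each selected standard,
# which tasks are designed to test it?
selected = ("A", "B")
rows = db.execute("""
    SELECT s.std_id, t.title
    FROM standards s
    JOIN task_standards ts ON ts.std_id = s.std_id
    JOIN tasks t ON t.task_id = ts.task_id
    WHERE s.std_id IN (?, ?)
    ORDER BY s.std_id, t.title
""", selected).fetchall()
for std, title in rows:
    print(std, title)
```

The same joins, run against the full task bank, are what allow an assessment chart to be generated dynamically for any set of selected standards.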

Usability: Redesign of the PALS interface. In 1998, we conducted our first user surveys and significantly revised the PALS interface as a result (e.g., we structured pages to better facilitate scanning, added search and discussion support, and improved the overall navigation). Although the PALS Web site presented its information thoroughly and made effective use of the Web's capabilities, we felt that a stronger graphical treatment might provide an even more intuitive and successful user experience. For a functional resource, graphical information must serve a purpose rather than simply present an artistic composition; nevertheless, branding and design can contribute to a functional experience, not just an aesthetic one. When users can tell immediately that they are looking at a material (such as a performance task) rather than a guideline about materials, visual cues are expediting the process. If every page looks the same, users can get lost because these visual cues are missing.

With these design goals in mind, we collaborated with an interaction designer to give the PALS site a "facelift" including aesthetic enhancements to increase site usability (see Figure 2). Working with our designer, we addressed branding, including color treatment, text layout, icon development, and logo treatment, to help establish a sense of place and identity for PALS in the vast space of the web. Site navigation and structure remained essentially consistent with the prior design.

Figure 2. Redesign of the PALS interface.

 

New Features: Supporting user needs. Over the past two years, we developed several new features to benefit our users. These include: narrowed search with relevance feedback on matching tasks, personalized assessment charts ("My Chart"), the ability to search for tasks via a variety of standards frameworks, user ratings of tasks, and the PALS guide. Each of these features is described in more detail below.

Narrowed search with relevance feedback. Given our expanding pool of tasks, some searches were returning overly large result sets and charts. In these cases, our users reported difficulty identifying the few tasks of most interest to them. We addressed the need to narrow and highlight search results in three ways:

1. A subset of 35 curriculum/subject area tags (e.g., Heat, Utility, Atoms) was developed to cover all tasks in the database, and all tasks were tagged accordingly. When searching for tasks via standards, users can now select an area of interest from the subject area menu to restrict the results of their search to tasks that address those particular subjects.

2. The system now returns a list of matching results (see Figure 3) rather than automatically generating an assessment chart that may be too large to be of use. Users can select a subset of tasks from that list, and then generate an assessment chart of a more manageable size (see Figure 5).

3. Relevance feedback was added in the form of "stars" beside each task name to indicate how many of the selected standards the task is designed to test (see Figures 4 and 5). This feedback is intended to help users more easily identify the tasks most relevant to their particular needs.

Figure 3. Narrowing search results by curriculum/subject area.

Figure 4. Search results with relevance feedback.

My Chart. Some PALS users reported difficulties finding their previously generated assessment charts (Figure 5). In the original design, when a user created a chart and clicked on a task in the chart, a new window popped up with the task information. This new window often obscured the assessment chart window, and users did not realize that they needed to rearrange their windows or close the task window to access the chart again. During the redesign of the PALS interface, we added a "My Chart" button to each page that recalls the last assessment chart upon request (see Figures 2-5, top right). This is similar to how "shopping carts" work on various e-commerce Web sites (the contents of your cart are always one click away). To enable this feature, we used "cookies" to track chart information (selected tasks and standards). When the user generates an assessment chart, the standards or tasks they selected are saved on the user's computer as a browser cookie. If the user returns to the chart page from any page on the site (other than by submitting new tasks/standards for a new chart), the last-generated chart is recreated from the information stored in the cookie.
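The cookie mechanism can be sketched as follows. The cookie name and serialization format here are illustrative assumptions, not the actual PALS implementation:

```python
# Sketch of "My Chart": selected standards are serialized into a browser
# cookie when a chart is generated, and read back to recreate the chart.
from http.cookies import SimpleCookie

def save_chart_cookie(selected_standards):
    """Build a Set-Cookie header storing the chart selections."""
    cookie = SimpleCookie()
    cookie["pals_chart"] = "|".join(sorted(selected_standards))
    cookie["pals_chart"]["path"] = "/"
    return cookie.output(header="Set-Cookie:")

def load_chart_cookie(cookie_header):
    """Recover the selections from the Cookie request header."""
    cookie = SimpleCookie(cookie_header)
    if "pals_chart" not in cookie:
        return []
    return cookie["pals_chart"].value.split("|")

print(save_chart_cookie({"A", "B", "D"}))
print(load_chart_cookie("pals_chart=A|B|D"))
```

Because the selections live in the browser rather than on the server, the chart survives navigation anywhere on the site without any server-side session state.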

Figure 5. PALS assessment chart.

Alignment with multiple standards frameworks. Many of our users have asked if PALS tasks can be indexed by local standards frameworks, in addition to the National Science Education Standards (NSES). In response, we have aligned selected state and curriculum frameworks to the tasks so that they can also be used to search for tasks in PALS. This was done by using the NSES as a "reference" target: mapping all tasks to the NSES, mapping the new standards framework to the NSES, and writing a script to automatically translate from tasks to frameworks via the NSES in response to a search request. As a result, any framework that is mapped to the NSES can be entered in PALS and automatically used to search for tasks. We have had several contacts requesting that their local state, district, or curriculum standards be linked to PALS, and have completed this linking for K-12 standards for the states of Illinois and Texas and for the FOSS curriculum (see Figure 6).
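The translation step can be sketched as two mappings joined through the NSES reference target. All codes below are made-up placeholders, not real PALS or Illinois standard identifiers:

```python
# Sketch of NSES-as-reference translation: tasks map to NSES codes, a
# local framework maps to NSES codes, and a local-standard search is
# answered by going through the shared NSES codes.
tasks_to_nses = {
    "Mystery Powders": {"NSES-B1"},
    "Rock Classification": {"NSES-D2"},
}
# Local (state or curriculum) standard -> corresponding NSES codes.
framework_to_nses = {
    "IL-12A": {"NSES-B1"},
    "IL-12E": {"NSES-D2"},
}

def search_by_local_standard(local_code):
    """Find tasks that test a local standard, translating via NSES."""
    nses_codes = framework_to_nses.get(local_code, set())
    return sorted(task for task, codes in tasks_to_nses.items()
                  if codes & nses_codes)

print(search_by_local_standard("IL-12E"))  # -> ['Rock Classification']
```

The benefit of this design is that each new framework only needs one mapping (to the NSES), rather than a separate mapping to every task in the collection.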

Figure 6. Alignment with other standards.

Figure 7. Task homepage with rating feedback area.

Collecting user ratings of tasks. To further help users identify useful tasks, and to provide feedback to PALS developers, we have added the ability for users to rate each task along two dimensions: "How likely are you to use this task?" and "How likely are you to adapt this task?" A rating form is available on the home page of each task (see Figure 7), where users can submit their ratings as well as text comments. Once users submit their ratings for a task, they see the mean rating for the task based on all user ratings submitted so far. Web site research has found that such "rewards" (e.g., seeing collective user ratings) entice users to submit their opinions more often and increase feedback. The PALS task rating feature was developed in collaboration with another SRI project called URLex (URL exchange). Once a large base of task ratings has been collected, we could use the ratings to provide recommendation services.
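The submit-then-see-the-mean behavior can be sketched simply. The data structure and field names are illustrative assumptions:

```python
# Sketch of the rating feature: each submission is recorded, and the
# running means on both questions are returned to show the user.
from collections import defaultdict

ratings = defaultdict(list)  # task -> list of (use_rating, adapt_rating)

def submit_rating(task, use_rating, adapt_rating):
    """Record one user's ratings and return the task's current means."""
    ratings[task].append((use_rating, adapt_rating))
    submitted = ratings[task]
    mean_use = sum(r[0] for r in submitted) / len(submitted)
    mean_adapt = sum(r[1] for r in submitted) / len(submitted)
    return mean_use, mean_adapt

submit_rating("Mystery Powders", 5, 3)
print(submit_rating("Mystery Powders", 4, 4))  # -> (4.5, 3.5)
```

Showing the collective mean immediately after submission is the "reward" that the text above describes as encouraging further feedback.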

PALS Guide. We envision that the tasks on the PALS Web site can be used as is or adapted for teachers' specific needs. However, adapting tasks is not trivial, and several of our users have requested examples of how adaptation might be done. The purpose of the PALS Guide is to walk users through the three main areas of performance assessment (targets or standards; tasks and task design; and rubrics and scoring) and give them opportunities to learn about these elements as well as methods for adapting specific tasks for their needs (see Figure 8). The Guide also includes a glossary and a set of descriptions outlining features of good performance assessment. Like "My Chart," access to the PALS Guide is available from every page on the PALS Web site (and, in turn, users are one click away from the PALS resources once they are in the Guide).

Figure 8. New PALS Guide.

Future Issues

At present, all of the tasks in the PALS resource bank are public tasks. As the accountability pool of tasks is developed, we will add password-protection for secure tasks in this pool. We have already put the PALS server behind a firewall, and may also consider employing Secure Sockets Layer (SSL, the industry-wide standard for encrypting data transferred across the internet) to encrypt performance events requested by approved users.

Outreach and Implementation Activities

A major effort of the PALS project has been to disseminate information widely about the Web site and to promote use of its resources. Our activities have included participation in conferences, presentations to educators and professional associations, participation in professional development activities, meetings with state and local policymakers, networking with professional development groups, and outreach through on-line and other dissemination vehicles. Table 2 summarizes the PALS outreach and implementation activities. PALS project staff have participated in 14 national, regional, state, and local conferences or meetings to make presentations about the PALS resources and how they might be used to enhance assessment activities. Project staff have also participated in one state and two local professional development training sessions to describe how the PALS resources could be incorporated into these activities. Additionally, we have entered into discussions with three groups that conduct preservice and inservice training for science teachers; these discussions centered on how we might collaborate. Finally, information about PALS was distributed to curriculum development staff from the National Science Resources Center and to state science supervisors at a leadership institute organized by the National Academies Center for Science, Mathematics, and Engineering Education.

Table 2

PALS Outreach and Implementation Activities

Conferences/ presentations
  • Presentation at the national meeting of the state science coordinators in Boston at the annual NSTA conference
  • Presentation to U.S. Department of Education staff
  • Presentation to DoDEA staff
  • Presentation to National Academy of Science staff
  • Presentation to North Central Regional Education Lab staff to facilitate dissemination activities through the regional resource network
  • Participation in science conference for 300 teachers from the Los Angeles and Fresno Systemic Initiatives
  • Conducted PALS demonstration for the assessment work group at the CILT conference in Utah
  • Presented PALS resources to two sessions of the annual Assessment Training Institute in Oregon
  • Presentation on science assessment and PALS to participants at the Exploratorium Assessment/Inquiry Conference in San Francisco
  • Presentation made to participants of the Integrated/Coordinated Science Conference in Los Angeles
  • Presentation to participants of the Oregon State Teacher Association education summit
  • Presentation and discussion of implementation models with 20 regional education service center directors for science in Texas
  • Presentation to state technology and assessment staff of the Illinois State Board of Education
  • PALS included in two proposed symposia for the 2000 AERA meeting
  • Presented PALS to science coordinators and graduate students at Georgia State University
Professional development activities
  • Discussion with the Chicago Systemic Initiative staff on the development of a college course for elementary teachers on the development of science performance tasks incorporating PALS resources
  • Provided information on PALS to 300 K-12 teachers as part of the Los Angeles SI professional development meeting
  • Discussion with staff from the Elementary Science Center at Cal Tech about how PALS resources could be incorporated into their curriculum development workshops
  • Information on PALS disseminated to curriculum development staff from the National Science Resources Center
  • Provided additional professional development training on the development of science performance tasks for teachers in the SUPER program
  • Made contacts with program staff from the Collaborative for Teacher Preparation program who provide innovative preservice training for K-12 teachers to explore possible collaboration
  • Conducted a PALS demonstration for the state-sponsored summer teacher professional development institute for trainer-of-trainers in Oregon
  • PALS materials disseminated at the State Leadership Institute on Standards-Based Mathematics and Science Education organized by the National Academies Center for Science, Mathematics, and Engineering Education and sponsored by the Eisenhower National Clearinghouse and National Institutes of Health's Office of Science Education
  • Conducted a PALS demonstration for Sandia, a summer institute for middle school and high school science teachers at the Lawrence Livermore National Laboratory
  • Conducted a PALS demonstration for the San Mateo County Science and Math Convention
  • Presented PALS to Texas science administrators in San Antonio
  • Conducted PALS demonstrations in Dallas and Spring Independent School District in Texas
  • Conducted a PALS demonstration to science teachers and teachers-in-residence at Tennessee State University in Nashville
Other
  • Conducted online discussion sessions in the PALS room in TAPPEDIN with participants interested in learning about PALS
  • Added PALS assessment materials to accompany the Full Option Science System (FOSS) materials developed by the Lawrence Hall of Science
  • Meeting with NEA staff to discuss possible collaboration activities

Table 3 presents the Web Sites providing links to the PALS Web site.

Table 3

Web Site Links to PALS

Web Site URL

SciLinks

Web-based source of key education web sites. Correlates Web sites with science topics presented in textbooks

http://www.scilinks.org

ERIC: Search Assessment and Evaluation on the Internet

Comprehensive assessment and evaluation site. Currently averages 5,000 users a day.

http://ericae.net/sintbod.htm

MiddleWeb

PALS featured as "Site of the Week." This site serves as an Internet resource guide for middle school teachers.

http://www.middleweb.com/Contents.html

Education World

Search engine for education Web sites. A place where educators can find information without searching the entire Internet.

http://www.education-world.com

North Central Regional Education Laboratory (NCREL)

NCREL is a not-for-profit organization dedicated to helping schools

http://www.ncrel.org

National Science Resources Center (NSRC)

NSRC collects and disseminates information about exemplary teaching resources

http://www.si.edu/nsrc/start.htm

Education Connection

Extensive directory of educational resources

http://www.asd.com/asd/edconn
ExplorAsource

http://www.explorasource.com/educator/

American Educational Research Association (AERA)

AERA is the most prominent international professional organization with the primary goal of advancing educational research and its practical application.

http://www.aera.net/anews/resource/wr99-003.htm

Regional Alliance

The Alliance provides professional development and technical assistance to schools, districts, and other reform efforts in the region.

http://ra.terc.edu/alliance/TEMPLATE/about_ra/about_ra.cfm

PALS IMPACTS

The PALS project has employed numerous methods to gather information about the quality and navigability of the online resources. These methods include: (1) usage statistics, (2) interviews with individual users, (3) studies of online rater training and scoring, (4) online ratings of individual assessment tasks and comments, (5) comments and reactions from participants in PALS sessions held during TAPPEDIN After School Online, (6) surveys following presentations and professional development sessions, (7) telephone interviews, (8) district professional development programs, and (9) curriculum program evaluation. We summarize the findings from these data sources below.

(1) Usage Statistics. In Year 3, usage of the PALS resources tripled. The home page was accessed 32,157 times from 2/97 to the present (10/01) (see Figure 9). The total number of successful page requests (any page in PALS) is 407,979 (10/01) (see Figure 10). From January to September 1999, the PALS home page was viewed approximately 10,000 times, and over 103,000 page views were recorded across all PALS Web pages in the same period. Analyses of monthly usage statistics revealed high and steadily increasing viewing of the PALS resources during the summer months. Records of users' domain addresses reveal users logging in from over 40 countries, including the United States, Canada, Australia, the United Kingdom, Singapore, New Zealand, South Korea, Malaysia, Japan, South Africa, the Netherlands, and Germany.

Figure 9. Cumulative Number of PALS Home Page Hits 1998-2000

Figure 10. Cumulative Number of PALS Pages Viewed 1998-2000

(2) Interviews with Individual Users. During the Planning Grant, the PALS project studied the search and select function with state assessment personnel and our Advisory Panel. The feedback was that the navigation was easy to use.

(3) Studies of Online Rater Training and Scoring. During the Planning Grant, the PALS project also studied the effectiveness of an online rater training and scoring feature. In collaboration with the CCSSO Science SCASS and the California Science Assessment Consortium (CSIAC), the project created online training packets and online scoring and calibration features. Raters were initially calibrated at an 80% exact agreement level and then scored 100 validation papers simulating "live scoring," achieving over 80% exact agreement with the scores from the developers' field tests. The study documented the capability of the PALS site to host online training and scoring.
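The exact-agreement statistic described above can be sketched as a simple percentage of matching scores. The function and score values below are illustrative assumptions for exposition only; they are not the project's actual scoring code or data.

```python
def exact_agreement(rater_scores, reference_scores):
    """Percentage of papers on which the rater's score exactly
    matches the reference score (e.g., developers' field-test scores)."""
    if len(rater_scores) != len(reference_scores):
        raise ValueError("score lists must be the same length")
    matches = sum(r == ref for r, ref in zip(rater_scores, reference_scores))
    return 100.0 * matches / len(rater_scores)

# Hypothetical rubric scores (1-4 scale) for ten papers; a rater meeting
# the 80% criterion described above would look like this:
rater = [3, 2, 4, 3, 1, 2, 4, 4, 3, 2]
reference = [3, 2, 4, 2, 1, 2, 4, 3, 3, 2]
print(exact_agreement(rater, reference))  # 80.0
```

In a "live scoring" simulation like the one described, the same calculation would simply be run over the 100 validation papers rather than this ten-paper illustration.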

(4) Online Ratings of Individual Assessment Tasks and Comments. As described in the Technology Development section above, the PALS Web site has recently offered a task rating and comment feature. In response to the question, "How likely are you to use this task?" the average rating of forty-six users on a scale of 1 (low) to 4 (high) was 3.2. Their average rating on the question, "How likely are you to adapt this task?" was 3.5. Users could also submit comments about the specific task. Selected comments included: "My fourth graders would enjoy this task." "Great task for 8th graders." "(Task is) too easy/hard for (my particular) grade." "…slight adaptation to be used with second graders." "Too simple for fourth grade." "Special education students need to be paired with a buddy…" In addition, teachers commented on the usefulness of the rubrics and the examples of student work, and on how easy it was to incorporate or adapt the tasks into their existing curriculum.

(5) Participant Comments and Reactions to PALS in TAPPEDIN After School Online Sessions. Over the last 12 months we have provided regular online discussion sessions about PALS through SRI's TAPPEDIN virtual environment. The real-time PALS sessions were part of the spectrum of offerings, referred to as "After School Online" sessions, designed specifically for the interests and needs of practicing teachers. Email notifications of each month's offerings are sent to TAPPEDIN users and other interested individuals and educational organizations. Users typically log on individually and participate in or "sit in" on the sessions in an informal manner. PALS project Co-PI Tom Hinojosa typically hosted each session with the assistance of one of the TAPPEDIN online staff. Each discussion was organized around a specific topic relating to science performance assessment, but deviated at times in response to the expressed interests and questions of the participants. Eight online sessions were conducted involving 40 participants, ranging from preservice and experienced teachers to teacher professional developers and university instructors. All participants expressed a positive reaction to PALS and the intention to return individually to the site for further exploration. Sample comments from the TAPPEDIN transcripts were:

(6) Professional Development Surveys. Over the course of the project, we surveyed eight groups of teachers and administrators about the PALS Web site. Survey forms were developed by PALS staff and reviewed by the external evaluator. Educators received their surveys after participating in professional development activities focused on Web site features and use. Across the eight survey administrations, 142 teachers and 63 administrators completed surveys. Survey administrations were conducted in Sandia, California; Bend River, Oregon; San Antonio, Texas; San Mateo, California; Dallas, Texas; Atlanta, Georgia; the Spring Independent School District in Houston, Texas; and Tennessee State University in Nashville, Tennessee. A summary of survey results and selected anecdotes from workshop participants are presented below. (Appendix C contains a table summarizing teacher and administrator ratings and a summary of their comments.) Most teachers and administrators believed that PALS is easy to use and is a good resource for teachers. One teacher wrote, "I have just spent about two hours on the PALS page-What a wonderful resource! I like how they have included the exemplars of student work…The whole site was easy to navigate." Another wrote, "I think the site will be very useful to me when I am looking for performance assessments which relate to topics of study which I am already teaching." In addition, all of the teachers and administrators surveyed stated that they could find ways to use the PALS resources during this school year. The most common ways that teachers planned to use PALS resources with their classes or other staff were: to assess students in their classrooms; to work with other teachers to develop their own science assessments; to use PALS as a professional development tool to learn more about performance assessments, either alone or collaboratively with other teachers; and to work with colleagues to redesign or change their science courses.
Administrators were most likely to use PALS to help teachers in their region select tasks to assess students in their classrooms and to help teachers collaborate on the development of their own science assessments. The findings from our survey also indicate that teachers are willing to share information with SRI. In particular, they are most willing to send us copies of PALS tasks that they have adapted, copies of rubrics they use, adapt or develop, and copies of student responses to the performance tasks they use or create.

(7) Telephone Interviews. To document the impact of PALS in classrooms, schools, and districts, we are conducting follow-up telephone interviews and e-mail correspondence with those who have received direct PALS training over the past 18 months. The preliminary data gathered from these first 10 interviews are encouraging and suggest that teachers are either using or planning to use PALS tasks in their classrooms. Teachers and administrators have generally come into contact with PALS in two ways: through professional development seminars given by PALS staff, and by finding the PALS resource independently on the Internet. Since many teachers have been introduced to PALS very recently, most interviewees are still in the planning stages of using PALS as a resource for science assessment. The Springfield, Illinois, school district introduced its elementary science teachers to PALS and plans to have its middle school science teachers use PALS tasks in the 2000-01 school year. All interviewees from Texas and Tennessee, who received their training in June 2000, have indicated they are planning to use PALS tasks in the coming school year. At one middle school in the Spring Independent School District in Texas, all science teachers are required to use at least one PALS task within the first six-week grading period. The Spring ISD is also conducting an additional three-hour training session on PALS for all secondary science teachers (grades 6-12). Individual teachers in Tennessee and Texas have also indicated they will use PALS tasks in the 2000-01 school year. Toni Foster, who teaches science in grades 4-8 at Black Butte School in Oregon, has been using PALS for over a year after finding the Web site independently. She has used PALS assessments on eight different occasions and has been able to select tasks that align with the Oregon state science standards. PALS is also having an important impact in areas outside the classroom.
The Drug Abuse Research Teams (DART), a group of educators and scientists promoting drug abuse research in schools, is developing performance assessment tasks after receiving training from PALS researchers. In the research setting, Hickey and Holbrook (2000) have modified, extended, and refined PALS tasks to assess the Learning By Design (LBD) curriculum, supported by the National Science Foundation, relative to comparison classrooms. As the authors put it, "When referenced to established science education standards and embedded in a learning environment such as LBD, the PALS assessment tasks provide most of what is needed to create a systemically valid assessment system."

(8) District Professional Development Programs. Springfield, Illinois. A case study was recently conducted of one midwestern urban district's use of the PALS Web site during the 1999-2000 school year. The district serves approximately 15,000 students in 35 schools. During the summer of 1999, training was held for both elementary and high school teachers in the district. The district's focus was to develop performance tasks as part of an assessment strategy to evaluate student progress. Given this curriculum approach, PALS was deemed an excellent fit; the philosophy of the site matched the district's philosophy of instruction, especially since the development and use of rubrics had always been part of the summer training. Teachers liked the PALS site because it emulates the classic approach of gathering data, making observations, solving a problem, and writing a conclusion. The teachers' immediate reaction was that the site gave them access to a variety and breadth of performance assessment tasks, scoring rubrics, and other materials that they had never seen before. The study concluded that the teachers' experience with the PALS site could make it an integral part of their classroom assessment. The participating teaching staff believe that PALS is a valuable site because it "…provided examples of good and bad rubrics as tools to guide teachers in creating their own rubrics…that teachers can gain a variety of skills while exploring PALS' databases, including illustrative examples of how to use a rubric to improve their own consistency in scoring performance assessments." The district has developed a strategy to provide follow-up training in its schools and will incorporate PALS as an integral component of the training for the district's grade 4 through grade 8 teachers.
Spring Independent School District, Houston, Texas. The Spring Independent School District has recently begun implementation of its 2000-2001 science professional development plan, in which PALS plays a prominent role. Most notable in this plan is the central role played by the primary stakeholders, represented by the science department chairs, who were responsible for developing the expectations for PALS use. Initial reports indicate that the majority of teachers have already developed plans to implement PALS activities. (See letter from Spring Independent School District in Appendix D.)

(9) Curriculum Program Evaluation. Learning By Design Evaluation. We are particularly pleased with the use of PALS tasks for one accountability purpose with the Georgia Tech Learning By Design (LBD) curriculum being developed by Janet Kolodner of Georgia Tech and her colleagues, with NSF support. The LBD materials are designed to support constructivist, inquiry-based learning around collaborative engineering design problems, and reflect recent cognitive science research on analogical and case-based reasoning (e.g., Kolodner, 1997). The evaluator, Dan Hickey, has used two PALS performance assessments as well as more conventional multiple-choice items in 12 classrooms that implemented the LBD Physical Sciences curriculum and in 2 comparison classrooms (Hickey & Holbrook, 2000). All 234 students worked in groups to complete a moderately revised version of Speeding in a School Zone. These same students also completed a substantially revised version of Where the Rubber Meets the Road as a collaborative performance assessment. Videotapes of 47 groups' collaborative problem-solving activities are being scored on 10 dimensions using a scale adapted from Pomplum (1996). The evaluators expect that groups in LBD classrooms will show more optimal collaboration and solutions, because the LBD activities are designed to foster effective learning-oriented participation structures. A subset of the videotapes, as well as videotapes of the corresponding LBD classrooms, is being analyzed using interpretive ethnographic methods within a framework described by Hickey, Wolfe, and Kindfield (2000). This analysis will consider evidential validity by documenting the constraints and affordances of both the learning environment and the assessment environment. The analysis also addresses systemic validity and the use of assessment practices to directly enhance learning (Hickey & Holbrook, 2000).

Future Work

The long-range vision for PALS is to design and build a full suite of services and tools to support continuous science assessment reform. Such a fully developed system would provide:

Sustainability

Several strategies for the long-term sustainability of the PALS resources are currently being explored. Based on market research and expert consulting advice, production of a PALS CD-ROM seems promising. In addition, the PALS online resources may be enhanced and made commercially available to professional development programs and individual science educators. A variety of face-to-face presentation models, offered either alone or in conjunction with other professional development programs, are also being studied. Finally, technical assistance services built on the custom curriculum indexing feature already available on PALS offer a potential revenue stream to support and maintain the project.

Publications

Herman, J. (2000) PALS Project Evaluation Report.

Hinojosa, T., Quellmalz, E.S., Padilla, C., & Schank, P. (2000). PALS: A Technology-Based Tool to Support Best Practices in Inquiry-Based Science. Presented at the annual meeting of the American Educational Research Association, New Orleans, LA. http://pals.sri.com

Kerins, T. (2000, April). Performance Assessment on a Platter - But Will Teachers Partake? Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Quellmalz, E., Hinojosa, T., Hinojosa, L., & Schank, P. (2000). PALS Final Project Report.

Quellmalz, E., & Haertel, G. (2000). Breaking the Mold: Technology-Based Science Assessment in the 21st Century. Menlo Park, CA: SRI International.

Quellmalz, E.S. (1999). The role of technology in advancing performance standards in science and mathematics learning. In K. Comfort (Ed.), How Good Is Good Enough? Setting Performance Standards for Science and Mathematics Learning. Washington, DC: American Association for the Advancement of Science.

Quellmalz, E.S. (1999). Performance Assessment Links in Science (PALS). Classroom Assessment Connections. Portland, Oregon: Assessment Training Institute.

Quellmalz, E., Schank, P., Hinojosa, T., & Padilla, C. (September, 1999). Performance Assessment Links in Science (PALS). ERIC/AE Digest Series EDO-TM-99-04, University of Maryland, College Park. (http://ericae.net/disgests/m9904.pdf).

Quellmalz, E., & Schank, P. (1998, April). Performance Assessment Links in Science (PALS): On-line, Interactive Resources. Presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Products

Performance Assessment Links in Science (PALS) Web site. http://pals.sri.com

Performance Assessment Links in Science (PALS) Demonstration CD.

Performance Assessment Links in Science (PALS) Web site Sampler (in press).

 


 

References

Consortium for Policy Research in Education (CPRE) (1995). Tracking Student Achievement in Science and Math: The Promise of State Assessment Programs. New Brunswick, NJ: CPRE Policy Briefs.

Hickey, D. T. & Holbrook, J. (2000). PALS-Supported Performance Assessments in the Learning by Design Project. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Hickey, D. T., Wolfe, E. W., & Kindfield, A. C. H. (2000). Assessing learning in a technology-supported genetics environment: Evidential and consequential validity issues. Educational Assessment, 6 (3), 155-196.

Kolodner, J. L. (1997). Educational implications of analogy: A view from case-based reasoning. American Psychologist, 52, 57-66.

McLaughlin, M.W. & Shepard, L.A. (1995). Improving Education through Standards-Based Reform: A Report by the National Academy of Education Panel on Standards-Based Education Reform. National Academy of Education, Stanford University.

Pomplum, M. (1996). Cooperative groups: Alternative assessment for students with disabilities. The Journal of Special Education, 30 (1), 1-17.

Quellmalz, E.S. (1984). Designing writing assessments: Balancing fairness, utility, and cost. Educational Evaluation and Policy Analysis, 6, 63-72.

Stiggins, R.J. (1994). Student-Centered Classroom Assessment. New York: Macmillan College Publishing Company.

Stiggins, R.J., Rubel, E., & Quellmalz, E.S. (1986). Measuring Thinking Skills in the Classroom. Washington, D.C.: NEA Professional Library.

 


 

Appendices

 

Appendix A

Complete PALS Task List Indexed to the National Science Education Standards

TASK Source A B C D E F G Sample Curriculum Topics
Elementary                  
1. Balls and Ramp NYDOE X X           Motions/Forces
2. Bird Eggs NS X X X         Motions/Forces//Adaptations
3. Classification ISTA X X           Classify
4. Classifying Candy I NYDOE X X           Classify
5. Classifying Candy II NYDOE X X           Classify
6. Classifying Candy III NYDOE X X           Classify
7. Classifying Rocks CCSSO X X   X       Classify//Earth Materials
8. Colored Dots 1 NYDOE X X           Reactions
9. Colored Dots 2 NYDOE X X           Reactions
10. Colored Dots 3 NYDOE X X           Reactions
11. Containers TIMSS X X           Heat
12. Critter Museum RAND X   X         Traits
13. Floating Pencil NAEP X X           Mass/Density
14. Follow those Tracks NYDOE X   X         Ecosystems
15. Food We Eat CCSSO X         X   Health
16. Fossil Guide CCSSO X     X       Fossils
17. Identifying Creatures NYDOE X   X         Traits
18. Incline RAND X X           Motion/Forces
19. Insects and Spiders ISTA X   X         Traits
20. Keep It Cool CCSSO X X           Heat
21. Magnifiers NYDOE X X           Applied Science
22. Magnetic Mapping CCSSO X X           Magnets
23. Magnetic Testing ISTA X X           Magnets
24. Making Marbles CCSSO X X           Measurement
25. Measurement: balance ISTA X X           Measurement
26. Measurement: length, vol, t ISTA X X           Measurement
27. Minerals CCSSO X     X       Earth Materials
28. Mystery Boxes ISTA X X           Motion/Forces
29. Packing Materials NYDOE X X     X     Applied Science //Evaluation
30. Pathfinder KDOE X X           Electricity
31. People Interacting Environ. KDOE X   X         Behavior
32. Plastic Wrapped KDOE X X           Applied Science
33. Pulse TIMSS X   X         Physiology
34. Rising Waters KDOE X X           Mass/Density
35. Run For Your Life NYDOE X   X         Behavior
36. Some of Its Parts KDOE X       X     Evaluation
37. Swings CCSSO X X           Motion/Forces
38. Temperature School KDOE X X           Heat
39. Testing Foods NYDOE X X           Classify
40. Tree Study NS X   X         Traits
Middle School                  
1. Acid & Base Alien I RAND X X X         Acid/Base//Physiology
2. Acid & Base Alien II RAND X X X         Acid/Base//Physiology
3. Acid & Base Alien III RAND X X X         Acid/Base//Physiology
4. Acid & Base Test 1 NYDOE X X           Acid/Base
5. Acid & Base Test 2 NYDOE X X           Acid/Base
6. Acid & Base Test 1--Micro NYDOE X X           Acid/Base
7. Acid & Base Test 2--Micro NYDOE X X           Acid/Base
8. Acid Base Indicators CCSSO X X           Acid/Base
9. Acid Rain & Its Effects CCSSO X     X       Weather
10. Acid Precipitation--Micro NYDOE X X X         Acid/Base//Ecosystems
11. Acid Precipitation NYDOE X X X         Acid/Base//Ecosystems
12. Air in Soils NYDOE X     X       Soil
13. Barometer OSDE X     X       Weather
14. Blizzard of 1993 NYDOE X     X       Weather
15. Blue APU X X           Reactions
16. Building Materials RAND X     X       Resources
17. Changing Ramp Heights NYDOE X X           Motion
18. Changing Rocks NYDOE X     X       Earth Processes
19. Chemical Changes NYDOE X X           Reactions
20. Chemical Weathering NYDOE X     X       Weather
21. Circular Motion CCSSO X X           Motion/Forces
22. Classification of Animals RAND X   X         Traits
23. Classifying Material CCSSO X X           Magnets
24. Classification CCSSO X   X         Traits
25. Creeping NYDOE X     X       Earth Processes
26. Crustal Sinking NYDOE X     X       Earth Processes
27. Density CCSSO X X           Mass/Density
28. Density and Buoyancy OSDE X X           Mass/Density
29. Density of a Sinker NYDOE X X           Mass/Density
30. Density of Minerals NYDOE X X           Mass/Density
31. Dichotomous Key 2 NYDOE X   X         Traits
32. Earthquake Epicenter NYDOE X     X       Earth Processes
33. Electrical Circuits & Switches CCSSO X X     X     Electricity//Innovation
34. Erosion OSDE X     X       Earth Processes
35. Estimating APU X X           Measurement
36. Fault Line CSIAC X     X       Earth Processes
37. Formation of Wind NYDOE X     X       Weather
38. Formation of Rain NYDOE X     X       Weather
39. Friction RAND X X           Motion/Forces
40. Growth of Yeast CCSSO X   X         Growth
41. Half-Ball KDOE X X           Motion/Forces
42. Heat Retention CCSSO X X           Heat
43. Heating Crystals APU X X           Reactions
44. Height of Bounce NYDOE X X           Motion/Forces
45. Ice Melting OSDE X X           Heat
46. Identifying Elements CCSSO X X           Reactions
47. Insulators OSDE X X           Heat
48. Magnets OSDE X X           Magnets
49. More Power to You CCSSO X X           Electricity
50. Ocean Bottom Profile NYDOE X     X       Land Forms
51. Oil Spill KDOE X       X X   Innovation//Hazards
52. Paper Chromatography CCSSO X X           Reactions
53. Peat Pots NYDOE X     X       Soils
54. Pendulum OSDE X X           Motion
55. Pond Water OSDE X   X         Ecosystems
56. Powder KDOE X X       X   Reactions//Innovation
57. Predator-Prey CCSSO X   X         Ecosystems
58. Probing Under the Surface NYDOE X     X       Land Forms
59. Pulse TIMSS X   X         Physiology
60. Puddles & Soils NYDOE X     X       Soils
61. Rain Drops APU X X           Applied Science
62. Rate of Solution NYDOE X X           Reactions
63. Reaction Rates CCSSO X X           Reactions
64. River Planning KDOE X     X   X   Water//Development
65. Sand in Bottles NYDOE X X           Motions/Forces
66. Seed Growth OSDE X   X         Growth
67. Scale Model of the Solar System CCSSO X     X       Solar System
68. Soap, Wood and Water NYDOE X X           Mass/Density
69. Sound Box APU X X           Sound
70. Stimulus Response OSDE X   X         Physiology
71. Sugar/Starch Test 1 NYDOE X X           Reactions
72. Sugar/Starch Test 2 NYDOE X X           Reactions
73. Sun and Temperature NYDOE X X           Heat
74. Survival APU X X           Heat
75. Swinging APU X X           Motions/Forces
76. Tadpoles APU X   X         Traits
77. Temperature and Enzymes CCSSO X X           Reactions
78. Threads APU X X           Reactions
79. Unknown Liquids NYDOE X X           Mass/Density
80. Velocity CCSSO X X           Motions/Forces
81. Water Holding Capacity OSDE X     X       Soils
82. Water Pollution KDOE X     X   X   Water//Hazards
83. Wet/Dry Hygrometers CCSSO X     X       Weather
84. Where Rubber Meets Road KDOE X X     X     Motions/Forces//Innovation
High School                  
1. Ajax Seed Company NYDOE X   X         Traits
2. Anofasp APU X   X         Traits
3. Are Enzymes Specific NYDOE X   X         Cells
4. Are Fruits/Veg. Made of Cells NYDOE X   X         Cells
5. Best Place to Live KDOE X         X   Hazards
6. Boat Building KDOE X X     X     Mass/Density//Innovation
7. The Captain and Lake Wilmar RAND X X X X X     Heat//Ecosystems//Weather//Hazards
8. Car Wash CAPT X     X   X   Soil//Hazards
9. Cardiovascular Homeostasis OSDE X   X         Physiology
10. Catalase Enzyme OSDE X X           Reactions
11. Classification in Action KDOE X X           Classify
12. Coat Caper KDOE X           X Scientific Knowledge
13. Coffee Cooling OSDE X X           Heat
14. Cooling of a Liquid OSDE X X           Heat
15. Cut Above the Rest KDOE X X     X     Motion/Forces//Innovation
16. Deliver APU X X           Measurement
17. Developing a Nutritional Snack CCSSO X       X X   Innovation//Health
18. Effectiveness of Antacids KDOE X X           Reactions
19. Electrical Energy CCSSO X X     X     Heat//Innovation
20. Fish Kill NYDOE X         X   Hazards
21. Friction Force OSDE X X           Motion/Forces
22. Heating Crystals APU X X           Reactions
23. Here's Looking at You KDOE X   X         Traits
24. How Effective is Perspiration NYDOE X X X         Heat//Physiology
25. Human Inheritance NYDOE X   X         Traits
26. Ice Cold CAPT X X           Reactions
27. Instruments APU X X           Measurement
28. Keep It Hot CAPT X X           Heat
29. Let the Sunshine In KDOE X X           Heat
30. Nutritional Content of Food CCSSO X         X   Health
31. Old Problems/New Solutions KDOE X         X   Development
32. Paper Chromatography CCSSO X X           Reactions
33. Radioactive Decay CCSSO X X       X   Atoms//Hazards
34. Radiation RAND X X           Heat
35. Rain Drops APU X X           Applied Science
36. Rate of Solvation CCSSO X X           Reactions
37. Seasonal Changes NYDOE X   X         Ecosystems
38. Speed & Collisions CCSSO X X           Motions/Forces
39. Soapy Water CAPT X X           Reactions
40. Soiled Again CAPT X     X       Soils
41. Sow Bug Habitats NYDOE X   X         Behaviors
42. Suburban Ecosystems OSDE X   X         Ecosystems
43. Survival APU X X           Heat
44. Swinging APU X X           Motions/Forces
45. Tadpoles APU X   X         Traits
46. Testing a New Drug NYDOE X   X     X   Physiology
47. That's the Way the Ball Bounces KDOE X       X     Innovation
48. Vitamin C Testing NYDOE X X           Reactions
49. Water Regulation NYDOE X   X         Physiology
50. Wig Wag APU X X           Motion/Forces

 

Appendix B:

Screenshots of the Accountability Pool

Figure 11. Introduction to the Accountability Pool

Figure 12. Types of Assessment Publishers and Projects

 

 

Appendix C:

Summary of Survey Results and Comments

Table 4. Summary of PALS Survey Results

  Sandia 2/27/99 Oregon 10/8/99 Texas 1/19/00 San Mateo 3/4/00 Dallas 3/8/00 Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total
n 23 6 54 12 19 9 26 56 205

Question 1 (Easy to Use) Sandia 2/27/99 Oregon 10/8/99 Texas 1/19/00 San Mateo 3/4/00 Dallas 3/8/00 Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
Very easy 15 3 32 7 17 8 24 38 144 70.24%
Somewhat easy 8 2 19 4 2 1 2 12 50 24.39%
Not at all easy               3 3 1.46%
Did not use   1             1 0.49%
No Response     3 1       3 7 3.41%

Question 2 (Useful) Sandia 2/27/99 Oregon 10/8/99 Texas 1/19/00 San Mateo 3/4/00 Dallas 3/8/00 Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
Very useful 16 5 33 9 11 8 23 46 151 73.66%
Somewhat useful 7 1 13 1 7 1 2 6 38 18.54%
Not at all useful         1       1 0.49%
Did not use     5 2       4 11 5.37%
No Response     3       1   4 1.95%

Question 3 (Plans for Use) Sandia 2/27/99 Oregon 10/8/99 Texas 1/19/00 San Mateo 3/4/00 Dallas 3/8/00 Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
3a: assess students 22 5 10 11 13 3 26 52 142 69.27%
3b: assess science curriculum 4 3 26 3 8 5 13 22 84 40.98%
3c: develop own assessments 13 2 33 5 11 3 16 35 118 57.56%
3d: redesign science courses 7 1 22 4 3 2 14 11 64 31.22%
3e: align courses with NSES 11 2 21 4 8 5 8 16 75 36.59%
3f: professional development tool 15 4 41 8 12 5 19 40 144 70.24%
3g: conduct professional development sessions (Georgia, Spring and TSU only)           4 13 16 33 16.10%
Other 2 1 6 3 1 5     20 9.76%
No Response     1           1 0.49%

Question 4 (Previous Experience) Sandia 2/27/99 Oregon 10/8/99 Texas 1/19/00 San Mateo 3/4/00 Dallas 3/8/00 Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
4a: participated in training/scoring sessions 17 3 33 5 8 6 14 39 125 60.98%
4b: participated in other professional development sessions 11 3 36 2 7 6 11 41 117 57.07%
4c: trained others 1 1 23   5 3 5 3 41 20.00%
4d: administered assessments 17 2 35 6 11 3 20 39 133 64.88%
4e: developed assessments 7   19 1 4 5 13 18 67 32.68%
No Response     5 2 5   3 2 17 8.29%

Question 5 (Willing to Share) Sandia 2/27/99 Oregon 10/8/99 Texas 1/19/00 San Mateo 3/4/00 Dallas 3/8/00 Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
5a: adapted PALS tasks 20 4 18 6 8 4 24 23 107 52.20%
5b: assessments developed using PALS 12 3 8 6 5 6 20 10 70 34.15%
5c: rubrics 15 4 12 6 6 6 19 18 86 41.95%
5d: student responses 15 3 10 8 8 6 19 25 94 45.85%
5e: experiences via phone interview 8 3 17 6 9 5 16 11 75 36.59%
No Response   1 24 2 8 2 1 21 59 28.78%

Question 6 (Access to Computers)
(Georgia, Spring, and TSU only)

Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
Yes in Classroom 8 23 44 75 82.42%
No in Classroom   2 7 9 9.89%
Yes in Lab Setting 5 22 25 52 57.14%
No in Lab Setting   2 11 13 14.29%

Question 7 (Experience Level with Computers)
(Georgia, Spring, and TSU only)

Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
Never used before         0.00%
Occasional User   3 7 10 10.99%
Several times a week   8 24 32 35.16%
Many times a day 9 14 22 45 49.45%

Question 8 (Subjects for PALS-like Resources)
(Georgia, Spring, and TSU only)

Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
Math 7 13 40 60 65.93%
Language Arts 7 8 41 56 61.54%
Social Studies 5 12 40 57 62.64%
Other 2 3 11 16 17.58%

Question 9 (Interest in convening teachers to:)
(Georgia, Spring, and TSU only)

Georgia 5/15/00 Spring 6/15/00 TSU 6/22/00 Total %
9a: develop new assessments 3 15 8 26 28.57%
9b: try out new assessments 5 18 34 57 62.64%
9c: link assessments to standards 4 12 10 26 28.57%
9d: find existing assessments 2 9 7 18 19.78%
9e: co-development of assessment resources 2 6 7 15 16.48%

 

Table 5. Selected Illustrative Comments and Suggestions for Improvement from Teachers and Administrators

Comments
Organization of Web site
"I like the way PALS is organized and cross-references skills and interest."

"The administration procedures were helpful. Actually the entire set-up was-with the various hot buttons allowing for a variety of ways of obtaining information."

Easy to Navigate and Useful
"I have just spent about two hours on the PALS page-What a wonderful resource! I like how they have included the exemplars of student work…The whole site was easy to navigate."

"I think the site will be very useful to me when I am looking for performance assessments which relate to topics of study which I am already teaching."

Suggestions for Improvement
More Elementary Tasks

"I would like to see more elementary tasks in common topics such as life cycles and weather."

Clearly State Grade Levels Specified for Each Task
"I wish PALS had K stuff and had all grade level material listed with a click."

More Life Sciences at Intermediate Level
"I logged on first to find a way to assess life cycles-no luck. I'd like to see more life sciences at the intermediate level."

Teachers Request Assessments to Follow Teaching Units
"I think that it would be nice if teachers could submit requests for performance assessments that they could use in their classrooms to follow their teaching units."

 

Appendix D:

Letter from Spring Independent School District, Houston, Texas

P.A.L.S. Website

PLANS FOR USE IN THE SPRING INDEPENDENT SCHOOL DISTRICT FOR SCHOOL YEAR 2000-2001

One focus of the science program in Spring ISD is the use of investigatory tasks, or performance tasks, as a regular part of instruction to help students to develop the full range of science process skills. We have worked for a number of years at increasing the number of such tasks that science teachers include in instruction, as well as the quality of the tasks. One obstacle to increased use has been the time needed to develop well-designed tasks and an accompanying scoring procedure. The P.A.L.S. website has provided a great tool to overcome this obstacle.

Spring ISD is utilizing the P.A.L.S. tasks this year in the following way:

The first six-weeks plan was carried out with the majority of teachers either implementing a P.A.L.S. activity or developing plans for implementation. (Several seventh-grade teachers encountered difficulty obtaining some of the supplies needed to carry out their activity and had to postpone implementation until after the inservice date.)

Based on discussions at the September 25th inservice, Spring ISD teachers are excited about the possibilities that P.A.L.S. provides and have had sufficient exposure to the site to use it effectively for the remainder of the school year. They look forward to the expansion of the site and the possibility of additional activities.

P.A.L.S. has provided a much-needed tool for fulfilling an instructional need in Spring I.S.D.

Cynthia Lueckemeyer, Ed.D.
Program Director for Science
Spring Independent School District
16717 Ella Blvd. Houston, TX 77090
(281) 586-1185
cynthial@hiway.spring.isd.tenet.edu