Heritage Co-authors Book Chapter
CRESST Assistant Director for Professional Development Margaret Heritage recently co-authored a chapter focusing on the assessment of young language learners. The chapter appears in a new book, The Companion to Language Assessment, edited by Antony John Kunnan and published by John Wiley & Sons.
CRESST colleagues Alison Bailey and Frances Butler are co-authors of the chapter. A table of contents is available.
CRESST Shares Work at California Research Conference
CRESST staff will be sharing research and information at this year's California Educational Research Association Conference at the Disneyland Hotel in Anaheim, December 4-6.
Joan Herman Presents at National Education Summit
Dr. Herman's presentation, "Assessment of 21st Century Skills: What Now, What Next?" is available, as is a related research report, "On the Road to Assessing Deeper Learning: The Status of Smarter Balanced and PARCC Assessment Consortia."
New Resource to Help with CCSS Implementation
The Center on Standards and Assessment Implementation, a partnership between WestEd and CRESST, recently launched a new website with extensive resources to help states, school districts, and schools implement the Common Core State Standards and prepare for consortia-based assessments.
The comprehensive website contains an extensive library of reviewed and approved resources and tools, center-developed products, a "state of the states" interactive mapping guide, upcoming events and news, and assessment development updates.
New CRESST Resource Paper
A new CRESST resource paper, "Formative Assessment and Next Generation Science Standards: A Proposed Model," illustrates an innovative formative assessment model for measuring student progress in meeting the Next Generation Science Standards.
CRESST Senior Research Scientist Joan Herman recently presented this paper at the K-12 Center at ETS Invitational Research Symposium on Science Assessment.
Student Errors Help Researchers to Improve Computer Games
CRESST researchers Deirdre Kerr and Greg Chung are using the types of errors that students make on computer learning games to develop better games.
In a new report, Kerr and Chung found that different types of errors tend to have different effects on learning. They also found that algebra students were twice as likely to learn if they felt positively about the game and that students with a low self-belief in math were almost three times as likely to be confused by the game compared to students with higher self-beliefs.
They are using their results to improve the design of the game, particularly its feedback. Their report is available for download.
Julia Phelan Attends Big Data in Education Meeting in Helsinki
Senior Researcher Julia Phelan recently represented CRESST at a meeting to improve data science in learning and teaching at the U.S. Embassy in Helsinki, Finland.
Among the purposes of the international gathering were improving student achievement measures and providing teachers with actionable data. Researchers also discussed methods for motivating students and better methods for developing useful student profiles.
Additional information is available at the World Educational Portals website.
Herman Shares CRESST Work in Vancouver
In Vancouver, Dr. Herman recently shared CRESST's assessment work. Her key topics included a brief history of assessment in the United States, the purposes of assessment, evidence-centered assessment design and validation, depth-of-knowledge issues, and examples of innovative performance assessment.
A complete copy of Dr. Herman's presentation is available.
Journal Features CRESST Research
Featured articles include an Introduction, co-authored by Harold F. O'Neil, and "Potential Applications of Latent Variable Modeling for the Psychometrics of Medical Simulation," by Li Cai.
Focus Areas
CRESST's unique expertise makes it an ideal partner in military and medical simulation-based training and assessment. Our military and medical clients and partners include the Office of Naval Research (ONR), the Defense Advanced Research Projects Agency (DARPA), the Naval Education and Training Command (NETC), and the Surface Warfare Officer's School Command (SWOS), among others.
Since its inception, CRESST has conducted research, development, and evaluation that improves Pre-K–12 public education across the United States. Our innovative methods and indicators for evaluating educational quality are in broad use, including comprehensive approaches for monitoring and improving schools and their programs.
Many CRESST projects overlap with the field of higher education, especially those that support teacher capacity building programs as well as adult learning. This area of our work will continue to grow as the need to measure postsecondary instructional quality and services expands.
For more than 20 years, CRESST has conducted research and development supporting improvements in adult learning. Our current projects range from assessments and tools in military marksmanship to research-based guidelines for "what works" in distance learning.
During the past few years, CRESST has applied its evaluation, assessment, and instructional expertise to the field of medical research, development, and training. For example, we have recently assisted medical organizations in the design, development, analysis, and reporting of medical simulations while assisting other agencies in the evaluation of medical training interventions.