Workshops

The workshop fee (€100, for one workshop only) includes one coffee break and lunch.

All pre-conference workshops will take place on November 2, 09:30 - 16:30. Please select only one workshop during registration.


Workshop 1

Applications of Item Response Theory


Presenters
Theo Eggen Twente University, the Netherlands; Dutch National Institute for Educational Assessment, Cito, The Netherlands
Frans Kleintjes Dutch National Institute for Educational Assessment, Cito, The Netherlands
Marieke van Onna Dutch National Institute for Educational Assessment, Cito, The Netherlands

Biographies
Dr. Ir. Theo Eggen (1953) is a member of the Psychometric Research Centre of the Dutch National Institute for Educational Measurement (Cito), and has been with Cito since 1985. He has extensive experience in advising on the methodological aspects (research design and data analysis) of educational research and test development, in conducting data analysis and in multidisciplinary cooperation projects. Theo has expert knowledge of statistical tools and packages, of specialized psychometric computer programs and of computer programming. He has worked as a consultant in educational measurement at university level, at Cito and internationally. He teaches introductory and specialized courses and has presented many papers at national and international conferences. He is the author of research articles, syllabi and textbooks. His specializations are item response theory, national assessment, missing data problems and adaptive testing. Theo holds a doctorate (Dr) in educational measurement and an MSc (Ir) in Statistics from Twente University of Technology. In 2008 he was accredited as a Fellow of the Association of Educational Assessment Europe.

Ir. Frans Kleintjes (1953) has been Director of Cito International since 2015. Before that he was a Senior Research Scientist in the Psychometric Research Centre of Cito, the Dutch National Institute for Educational Assessment; he has been with Cito since 1988. His major task is providing methodological and psychometric consultation in research and educational test development projects. Since 2009 he has also been a senior consultant in Cito International, consulting on psychometric issues in international projects. He has vast experience in consulting, training and performance on issues related to many psychometric aspects of the development of educational tests and testing procedures. He conducts training courses and consultancies, both at Cito and abroad, in psychometrics, test construction, and assessment and quality control in education. Frans holds an MSc (Ir) in Applied Mathematics from Twente University of Technology, with a specialization in educational measurement, obtained in 1981. In 2011 he was accredited as a Fellow of the Association of Educational Assessment Europe. Frans was a member of the Professional Development Committee and co-chair of the Professional Affairs Board (PAB) from 2004 to 2014; the PAB assesses applications for Fellows, Practitioners and Associates of the association. He is secretary of the Dutch Association for Exams (NVE), an association for all those interested in educational testing, such as examining bodies and test developers in the Netherlands.

Dr. Marieke van Onna (1973) is a member of the Psychometric Research Centre at Cito and has been with Cito since 2007. Her main task is coordinating all methodological consultation on the national exams in secondary and vocational education. Her expertise is in translating IRT and methodology to test situations in practice. This involves everything from the design of tests, to CBT, to data handling, to item banking, to choosing an appropriate IRT model and subsequent analysis, to standard setting, up to the design of reports, school evaluation and political considerations. She has experience with the development of tests in primary, secondary, vocational and higher education, as well as computer-based tests of driving competency. Marieke holds a PhD in Methodology of the Social Sciences and an MA in Philosophy from the University of Groningen, the Netherlands.

Why AEA members should attend this workshop
The workshop will offer an introduction to IRT and applications from a practical point of view. IRT is used for many measurement applications including item banking, test construction, adaptive test administration, scaling, linking and equating, standard setting, test scoring and score reporting. Main features of these applications will be addressed in the workshop. Participants will be able to understand and assess the usefulness of IRT in their own work.

Who this Workshop is for
The workshop is aimed at those who want to know more about IRT, with a focus on applications. Participants may be novice or more experienced users; no prior knowledge is required to attend the workshop. Participants will practice using the software on some examples and are invited to bring their own laptops (Windows) for practicing.

Overview
Item Response Theory (IRT) is used to analyse response data at item level. Unlike classical test theory, it is not limited to the construction and analysis of fixed test forms administered on a single occasion. IRT is a theoretical framework suited to modern practice, where assessments are often based on different tests administered on different occasions. IRT involves estimating characteristics of items and examinees and defining how these characteristics interact in describing item and test performance. When used properly, IRT can increase the efficiency of the testing process, enhance the information provided by that process, and make detailed predictions about unobserved testing situations.
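The kind of model at the heart of IRT can be sketched in a few lines. The following is a minimal illustration, not the workshop's own software, of the two-parameter logistic (2PL) item response function, one of the standard IRT models; the parameter values are made up for the example:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic (2PL) item response function.

    Returns the probability of a correct response for an examinee
    with ability `theta` on an item with discrimination `a` and
    difficulty `b`.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee whose ability equals the item difficulty has a 50%
# chance of a correct response, whatever the discrimination.
p = irt_2pl(theta=0.0, a=1.2, b=0.0)  # -> 0.5
```

Setting `a = 1` for every item gives the Rasch model, the simplest member of this family.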

The first session of the workshop starts with some theory, including a comparison of IRT with classical test theory. Major properties of IRT will be highlighted using illustrative examples. IRT output will be explained, discussed and interpreted based on materials that will be provided. Several concepts used in IRT will be explained using examples from the test construction experience of the presenters and, when available, from participants.

In the second session, participants will discover the main features of IRT hands-on, by working through a set of exercises.

The third session is a short theoretical introduction to computerized adaptive testing (CAT). The goals and usefulness of simulations for constructing CATs will be discussed. The measurement characteristics of a CAT can be studied and set before it is published: information can be collected through simulation studies that use the available IRT-calibrated item bank and the proposed target population, and the performance of proposed selection algorithms and constraints can be studied. Customized software (Windows-based) will be distributed and used by participants to determine the measurement characteristics of a CAT.
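To give a rough idea of what such a simulation study involves (this is a hypothetical sketch, not the customized software distributed in the workshop), a fixed-length CAT under the Rasch model with maximum-information item selection might be simulated like this:

```python
import math
import random

def prob(theta, b):
    """Rasch model: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def information(theta, b):
    """Fisher information of a Rasch item at ability theta."""
    p = prob(theta, b)
    return p * (1.0 - p)

def simulate_cat(true_theta, bank, test_length=20, rng=None):
    """Simulate one examinee taking a fixed-length CAT.

    Items (given by their difficulty) are selected from the bank by
    maximum information at the current ability estimate; the estimate
    is updated after each response by a simple grid-search maximum
    likelihood step.
    """
    rng = rng or random.Random(1)  # fixed seed for a reproducible run
    available = list(bank)
    theta_hat = 0.0
    responses = []  # (difficulty, score) pairs
    grid = [g / 10.0 for g in range(-40, 41)]  # ability grid -4.0..4.0
    for _ in range(test_length):
        # administer the most informative remaining item
        item = max(available, key=lambda b: information(theta_hat, b))
        available.remove(item)
        score = 1 if rng.random() < prob(true_theta, item) else 0
        responses.append((item, score))
        # maximum likelihood estimate of ability over the grid
        def loglik(t):
            return sum(s * math.log(prob(t, b)) +
                       (1 - s) * math.log(1.0 - prob(t, b))
                       for b, s in responses)
        theta_hat = max(grid, key=loglik)
    return theta_hat

bank = [i / 10.0 for i in range(-30, 31)]  # difficulties -3.0..3.0
estimate = simulate_cat(true_theta=1.0, bank=bank)
```

Repeating such runs for many simulated examinees drawn from the target population is what allows the bias, precision and item exposure of a proposed CAT design to be checked before it goes live.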

In the last workshop session, the use of IRT features will be discussed. We intend to deal with applications of IRT such as item banking; standard setting; maintaining standards and grading; reporting using the ability scale; student monitoring systems; and national and international assessment.

Preparation for the workshop
No special preparation is required; the workshop format will be interactive, allowing participants to discuss their own experience and/or problems. If available, participants are encouraged to bring their own data and analyses for discussion. The workshop leaders believe that sharing experience in applications will stimulate and enable participants in solving educational measurement problems that they encounter, or anticipate encountering, in their practice.

Schedule

Time Session Presenter
0900 Coffee and registration  
0930 Welcome and introductions; outline of the workshop Theo Eggen, Frans Kleintjes and Marieke van Onna
0945 Introduction to Item Response Theory Marieke van Onna
1100 Break  
1130 Main features of IRT - hands on session Frans Kleintjes and Marieke van Onna
1300 Lunch  
1400 Computerized Adaptive Testing Theory and hands on simulations Theo Eggen
1530 Break  
1545 Using IRT features Theo Eggen, Frans Kleintjes and Marieke van Onna
1630 Workshop close  

Workshop 2

Developing Constructed Response Test Items


Presenter
Ezekiel Sweiry

Biography
Ezekiel Sweiry is a Senior Assessment Researcher at AQA. He has 16 years of experience in test development and assessment research. During this time, he has worked for the Department for Education and the three largest UK awarding bodies. As a test developer, he has been involved in the development of a range of high stakes tests in England at both primary and secondary level. His particular research interests include the factors that affect the difficulty and accessibility of test items, the item and mark scheme features that affect marking reliability, and the comparability of paper-based and computer-based assessments.

Why AEA members should attend this workshop
While considerable research and guidance on writing selected response (SR) test items exists in the literature, guidance on writing constructed response (CR) items is scarce. This is despite the fact that there is far greater potential for examinees to misunderstand the requirements of CR items. In addition, the development of CR mark schemes that show both intrinsic validity and high levels of marker agreement presents serious challenges that are all but absent for SR items. The purpose of this workshop is to present and discuss guidance on developing CR items and their mark schemes.

Who this Workshop is for
The workshop is aimed at anyone with an interest in constructed response item and mark scheme design, including test developers, educational assessment researchers and those involved in the scoring of responses. No specific prior knowledge is needed.

Overview
The guidance in this workshop is based on a synthesis of the available research literature on CR item writing, relevant aspects of cognitive psychology (including models of language comprehension, working memory capacity and cognitive load theory), and the presenter's own experience of high stakes test development across primary (ages 7-11) and secondary (ages 11-18) education in the UK.

A range of CR item writing issues will be explored, including the appropriate use of language, real-world contexts and diagrams. The workshop will also consider the features of items, mark schemes and examinee responses that can affect marking reliability, and ultimately, how mark schemes can be designed to maximise marking reliability. Mark scheme validity (the extent to which the mark scheme gives credit for, and only for, responses which match expert notions of what an item should be measuring), a concept almost entirely ignored in the literature, will also be explored. Finally, the workshop will consider how qualitative and quantitative evidence from item trialling can be used to identify problematic items and mark schemes. All of these issues will be investigated through an amalgamation of theory and research evidence with practical activities and example test questions, and participants will have the opportunity to review and discuss potential revisions to a variety of sample questions and mark schemes.

The focus of this workshop will be primarily on short constructed-response items (typically worth up to 4 or 5 marks) and their mark schemes, though some aspects of the design of more open-ended items and their mark schemes will also be covered. The guidance will centre on ensuring that items and mark schemes are, as far as possible, free of construct irrelevant variance, which occurs when scores are influenced by factors irrelevant to the construct. These factors can make items unintentionally easy (construct irrelevant easiness) or difficult (construct irrelevant difficulty).

Preparation for the workshop
No specific workshop preparation is required.

Schedule

Time Session Presenter
0900 Coffee and registration  
0930 Welcome and introductions; outline of the workshop Ezekiel Sweiry
0945 Language accessibility in test items Ezekiel Sweiry
1100 Break  
1130 Item features and construct irrelevant variance Ezekiel Sweiry
1300 Lunch  
1400 Mark scheme design, marking reliability and mark scheme validity Ezekiel Sweiry
1530 Break  
1545 Response analysis: interpreting the evidence from item trialling Ezekiel Sweiry
1630 Workshop close  

Workshop 3

Discussing and sharing experiences of the social, political and cultural drivers of assessment practices and policies


Presenters
Lecturer Egil Hartberg Lillehammer University College, Norway
Project Manager Vegard Meland Lillehammer University College, Norway
Professor Stephen Dobson University of South Australia and Lillehammer University College, Norway

Biographies
Lecturer Egil Hartberg is from Lillehammer University College, Norway. He is a project leader and assistant professor focusing on skills development in the school sector.
He worked as a teacher and headmaster for ten years in high school and middle school. Egil works on assessment and feedback. He has published the book Feedback in School with Stephen Dobson and Lillian Gran and has written several articles on assessment and learning, including on self-assessment, tests and homework.
Egil teaches in the college's postgraduate programmes in learning management and assessment, and also in various professional development programs. Under Egil's leadership, Lillehammer University College was responsible for teaching about assessment that promotes learning in the Ministry's comprehensive national project on educational transitions, New GIV.

Vegard Meland is employed as a project manager focusing on skills development in schools and kindergartens. He also works with online studies in education using innovative ICT and assessment solutions.
Vegard has teacher training from Volda University College, Norway, with majors in arts and crafts, social studies and special education. In addition, he has studied human resources management at Lillehammer. He worked for 11 years in primary school, the last six of them as a deputy head in Lillehammer.

Professor Stephen Dobson is at the University of South Australia and Lillehammer University College. Prior to entering higher education he worked for thirteen years with refugees as a community worker. His research and teaching interests include assessment, professional development, refugee studies, bildung, inclusion and classroom studies. He is Dean and Head of the School of Education in Australia and guest professor at the Centre for Life Long Learning, Lillehammer University College, Norway. Dobson is fluent in the Scandinavian languages and a member of the Teacher Education Expert Standing Committee of the Australian Institute for Teaching and School Leadership (AITSL). He is a fellow of the Association for Educational Assessment.

Why AEA members should attend this workshop
Workshop participants will have the opportunity to discuss and share their varied experiences of working with different assessment resources.

Who this Workshop is for
Educational professionals working in the tertiary and schooling sectors with responsibility/experience of assessment. The workshop will also be relevant for assessment developers in the corporate and state sector who would like to discuss experiences of building educator capacity through innovative assessment resources and how this might impact upon reducing social inequality amongst stakeholder groups.

Overview
How can we, in different countries, facilitate school and teacher professional development programs about assessment with the goal of promoting learning? Which components of assessment practice can be labelled global, and for which are local and national contexts especially important?

In this workshop we will take up these questions as we explore the role of the social, political, digital and cultural drivers of assessment practices, and how important it is to take account of local and national contexts in professional development programs. How do these drivers interact with professional development programs, and with the different stakeholders and actors in policy?

These drivers exert an impact on the trickle-down effect as assessment policy is implemented in the classroom. This is exemplified in the manner in which international programs of assessment, such as PISA, gain their national versions and are implemented at classroom level. However, it is by no means a one-way top-down process. As Goodlad pointed out as early as the 1970s, an intended education policy takes on many unintended forms as it is distanced from the policy makers and enters the realm of classroom practice. Feedback and evidence are collected and communicated both ways, and there are many intervening activities, such as programmes devised to raise teacher professional competence and increase curriculum understanding. In this workshop, we will also explore the role played by such evidence and feedback, and ask whether changes in assessment policy, supported by professional development, have led to improved outcomes, equal opportunities and social justice.

This workshop therefore focuses in particular on the drivers which might influence school and teacher professional development programs about assessment, and their success or lack thereof:

  • Cultural drivers, such as changes in the manner in which assessment practice in the classroom increasingly requires the skill to collect, use and inform practice through large sets of school- and classroom-generated data (i.e. learning analytics)
  • Political drivers, such as nationally funded programs to raise teacher assessment competence through professional development.
  • Social drivers, such as the parentally driven expectation that all teachers should have, or acquire through professional development, up-to-date numeracy/mathematics and literacy skills.
  • Digital drivers, such as using MOOCs in teacher professional development, either in fully online/facilitator independent mode or in blended mode.

In the workshop we will explore a number of cases, including MOOCs and teacher professional development in mathematics. Participants are invited to bring their own cases for discussion. They will have the opportunity throughout the day to make reference to and present their cases in a supportive and responsive setting, so that they obtain feedback from all present. They are encouraged to bring one- to two-page written summaries of their cases for distribution to other participants and conveners on the day.

Preparation for the workshop
Develop a short 5-10 minute verbal/visual presentation of your experiences in this topic for use during the workshop.

As background reading, the facilitators of the workshop suggest the following two articles as pre-reading. Both will be distributed directly to the named participants by the conveners of this session. The first reading covers assessment developments in Norway; the second covers a country further afield, namely Singapore, with a policy designed to introduce alternatives to the exam-based culture:

  • Therese N. Hopfenbeck, María Teresa Flórez Petour & Astrid Tolo (2015) Balancing tensions in educational policy reforms: large-scale implementation of Assessment for Learning in Norway, Assessment in Education: Principles, Policy & Practice, 22:1, 44-60
  • Christina Tong Li Ratnam-Lim & Kelvin Heng Kiat Tan (2015) Large-scale implementation of formative assessment practices in an examination-oriented culture, Assessment in Education: Principles, Policy & Practice, 22:1, 61-78

Schedule

Time Session Presenter
0900 Coffee and registration  
0930 Welcome and introductions; outline of the workshop Dobson
0945 How can we facilitate school and teacher professional development programs about assessment with the goal of promoting learning? Hartberg and Meland
1100 Break  
1130 Presentations of cases for discussion. Participants
1300 Lunch  
1400 Cultural, political, social and digital drivers that influence school and teacher professional development programs about assessment Introduction by Dobson, followed by discussion
1530 Break  
1545 Which components of the assessment practice can be labelled as global? In which parts are local and national contexts especially important? Discussion
1630 Workshop close  

Workshop 4

Using data collected in IEA studies for informing policy and practice


Presenters
Sabine Meinck IEA Data Processing and Research Center, Hamburg, Germany
David Rutkowski University of Oslo, Oslo, Norway

Biographies
Dr. Sabine Meinck heads the Research, Analysis and Sampling Unit at the IEA Data Processing and Research Center. Her specific research interest and expertise is in the methodology of large-scale assessments, with a special focus on sampling, weighting and variance estimation, as well as on advanced statistical analysis methods. She was involved in or responsible for developing the sampling design in all IEA studies of the last decade and in various national assessments. Dr. Meinck regularly conducts workshops on the topics of her expertise and consults with national research coordinators of IEA studies to adjust internationally defined study designs to country-specific needs. Recently she prepared an expert report on the possibilities and limits of using large-scale assessment data to give feedback to participating schools. Dr. Meinck holds a lectureship at Hamburg University (master's course "Complex analysis methods in educational research").

Dr. David Rutkowski is a Professor of Education at the Center for Educational Measurement (CEMO) at the University of Oslo, Norway. David joined CEMO in 2015; before that he was a faculty member at Indiana University and also worked as a researcher at the International Association for the Evaluation of Educational Achievement (IEA) Data Processing and Research Center in Hamburg, Germany. He earned his PhD in educational policy, with a research specialization in evaluation, from the University of Illinois at Urbana-Champaign. David's research focuses on educational policy and technical topics within international large-scale assessment and program evaluation. His interests include how large-scale assessments are used within policy debates, the impact of background questionnaire quality on achievement results, and topics concerning immigrant students at the international level. David has collaborated with or consulted for national and international organizations including the US State Department, USAID, the IEA and the OECD. He has worked on evaluations and projects in over 20 countries, including Afghanistan, South Sudan, Trinidad and Tobago and the US. He is a member of the IEA Publications Committee and co-editor of the IEA policy brief series.

Why AEA members should attend this workshop
The workshop will illustrate the possibilities and limitations of large-scale assessment data for informing policy and practice, given the complexities of the designs of such studies. Participants will have the opportunity to learn about the study specifics and to develop and exchange ideas on how results arising from these data can best be "translated" to inform politicians, school staff and the public.

Who this Workshop is for
This workshop is aimed at researchers and education specialists who use ILSA data to inform policy and practice.

Overview
As a leading entity in the field of education for nearly 60 years, IEA promotes capacity building and knowledge sharing to facilitate innovation and foster quality in education. IEA's manifold empirical studies inspire fruitful dialogue on critical educational issues, informing the development of evidence-based policies and practices across the globe.

Within the past decade alone, IEA conducted nine international large-scale assessments in education (ILSAs), each with up to 59 participating countries, studying various topics and target populations. IEA studies approach educational reality in all its complexity, collecting not only achievement data but also a wide range of information about the contexts within which teaching and learning occur.

One major objective of these studies is to provide policy‐makers with high‐quality data to understand key factors that influence the outcomes of education and serve as evidence when driving the course of actions and evaluation of educational reforms (Wagemaker, 2014).

All data arising from IEA studies are publicly available and excellently documented; they provide a tremendously valuable and rich source for secondary analysis in many fields of educational research and for answering research questions addressing specific national interests. Due to the complexity of the data, however, thorough methodological skills and contextual knowledge are needed to analyse and interpret these data correctly. When presenting the results of these studies to the public, a careful "translation" of complex contents into language that is accessible to the respective audience is needed to convey the core messages while retaining correct interpretation.
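One example of that complexity is that ILSA data come with sampling weights and replicate weights, which analyses must apply to obtain unbiased estimates and correct standard errors. As a small illustrative sketch (with made-up numbers, not an official IEA tool), a weighted mean and a jackknife repeated replication (JRR) standard error might be computed like this:

```python
# Hypothetical example: estimating a population mean from complex
# survey data using full sample weights and jackknife replicate
# weights, in the style used for IEA large-scale assessments.

def weighted_mean(values, weights):
    """Estimate the population mean using the sampling weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def jackknife_se(values, full_weights, replicate_weights):
    """Jackknife repeated replication (JRR) standard error.

    `replicate_weights` is a list of weight vectors, one per
    replicate. The sampling variance is taken here as the sum of
    squared deviations of the replicate estimates from the full
    sample estimate (some designs apply an extra scaling factor).
    """
    full = weighted_mean(values, full_weights)
    variance = sum((weighted_mean(values, rw) - full) ** 2
                   for rw in replicate_weights)
    return variance ** 0.5

# Made-up scores, weights and two replicates, for illustration only.
scores = [480.0, 510.0, 530.0, 495.0]
weights = [1.0, 2.0, 1.5, 1.0]
replicates = [[0.0, 4.0, 1.5, 1.0], [1.0, 2.0, 0.0, 2.5]]
mean = weighted_mean(scores, weights)
se = jackknife_se(scores, weights, replicates)
```

An unweighted mean of the same scores would give a visibly different answer, which is exactly the kind of pitfall the workshop's tools are designed to handle automatically.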

This workshop provides information on the contemporary IEA studies and how results may be used to develop policy briefs. More specifically, we will discuss the utility and limitations of ILSAs for informing policy-makers and education practitioners. The workshop will start with an introduction to the studies of the International Association for the Evaluation of Educational Achievement (IEA), discuss some of the methodological issues related to the analysis of ILSA data and show their utility for policy-making and educational practice. We will then provide an overview of how policy briefs can be structured, along with illustrative examples of how the IEA has developed its policy brief series. Time will also be provided for participants to work together and discuss possible outlines for briefs that would be relevant to their specific context. Finally, participants will get the opportunity to work with IEA data using simple tools that handle the methodological issues automatically.

The workshop will focus on the following key topics:

  1. General information about the studies: their goals, purposes and intent, theoretical frameworks, target populations and respondents, achievement domains and background information collected from the different respondents.
  2. Introduction to the statistical complexities of the studies.
  3. Data sources and tools for analyzing IEA datasets.
  4. The construction and development of educational policy briefs that utilize ILSA data.

Preparation for the workshop
The instructors will provide all necessary data and documentation to the participants. Participants are expected to have some knowledge of basic statistics, although this is not essential. Participants should bring their own laptops for the hands-on training session.

Schedule

Time Session Presenter
0900 Coffee and registration  
0930 Welcome and introductions; outline of the workshop Sabine Meinck & David Rutkowski
0945 Overview of IEA studies; introduction to methodological issues Sabine Meinck
1100 Break  
1130 Policy Brief Overview David Rutkowski
1300 Lunch  
1400 Asking and Answering Relevant Questions with ILSAs Sabine Meinck & David Rutkowski
1530 Break  
1545 Policy Brief Outline David Rutkowski
1630 Workshop close  