ARCHIVE - these pages are no longer maintained. The project finished in 2008


Funded by the ESRC (Grant Number: RES-043-25-0007)



What is SIMPLE?

Teaching statistics is often a challenge, but perhaps never more so than when teaching the subject to social science undergraduates. Many social science students avoid the post-GCSE study and practice of quantitative skills and are surprised (and often dismayed) to discover that these skills play an important role in the social sciences. These students may fail to accept that quantitative skills are an essential part of their social science education, and they are likely to be frustrated in their encounters with statistical terms and concepts. Many students report that the material doesn’t seem to ‘stick’ the way that information about other aspects of their social science disciplines does: when they return to the material after days, weeks or months, it seems unfamiliar and they feel they have made little progress. These students lack confidence with statistical concepts and skills and frequently aim merely to avoid failure, rather than striving for mastery. They ask for, and often seem to benefit from, individual tutorials, but they are part of an educational system that typically cannot invest the resources necessary to provide extensive one-on-one teaching.

 

SIMPLE (Statistics Instruction Modules with Purposeful Learning Emphasis) is one of five curriculum development projects recently funded by the ESRC to encourage quantitative skills development among social science undergraduates.

The project is being developed at Lancaster University. Catherine Fritz (Psychology in Education, Educational Research), Brian Francis (Maths & Stats), Peter Morris (Psychology) and Moira Peelo (Centre for the Enhancement of Teaching and Learning) are the principal investigators; the steering group also includes David Denver (Politics), Andrew Folkard (Geography) and Julie-Ann Sime (Psychology in Education and Centre for Studies in Advanced Learning Technology, Educational Research). The project employs a full-time learning technologist/programmer, Alberto Ramirez Martinell, who is also working towards a PhD related to the use of technology to enhance learning. The combined experience of the steering group members includes well over 100 years of statistics teaching and over 60 years of research into influences on human learning. The project will produce two main components:

  • Software that will organize, schedule and track performance on hierarchical modules, with user-friendly interfaces for both tutors and students, and
  • A small set of fully defined hierarchical modules, including explanatory materials and embedded formative and diagnostic assessment.

There are many good reasons for developing better quantitative skills among undergraduates. The ESRC are properly concerned with the skills, and the commensurate employment and research opportunities, of UK-trained social scientists. There is also a broader issue of social equity: if current students are denied supportive opportunities to develop these skills, they are also denied the opportunity to evaluate research and to consider when quantitative methods are appropriate in their own and others' research. Without a broad base of research skills, including quantitative analysis skills, new researchers are handicapped in their ability to understand and conduct research in any discipline. The training of social scientists, and the broader education of effective members of society, are damaged at least as much by limited numeracy as by limited literacy. General employment prospects appear to suffer more from poor numeracy skills than from poor literacy skills (Bynner & Parsons, 2000).

So, what will the system and modules achieve? How will they support the development of quantitative skills? The modules will supplement quantitative skills courses. They might represent a substantial component of the laboratory/workshop part of a course, or they might substitute for some lectures and homework. Initially, while they are being developed at Lancaster University, they will replace some lecture, lab and homework elements. They could be used strictly at an individual level, or to support group work. They will provide instructional material, examples, diagnostic and formative assessment, and opportunities for spaced practice of new concepts and skills. During the project we will develop 4 or 5 modules addressing basic topic areas – ones that would be appropriate in introductory courses. The final selections have not yet been made, but likely candidates include the following (a brief worked sketch follows the list):

  • Measuring people and situations – Variables, populations and sampling, operational definitions, reliability and validity, categorical and continuous data
  • Being normal - Introducing normal distributions and z-scores
  • Handling and describing data - Using Excel, describing data with numbers, describing data with graphs (histograms, bar, lines, scatterplots, error bars)
  • Odds are . . . - Introducing probabilities and significance testing
  • From counting to concluding – Frequency data, percentages and using the chi-squared statistic
  • Data handling, part 2 - Using SPSS
  • What’s the difference? - t test and Cohen’s d, between and within groups
  • Going together - Correlation and a brief introduction to linear regression
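
To give a concrete flavour of the statistics named above, here is a small worked sketch. It is purely illustrative: the data are invented, and the modules themselves are built around Excel and SPSS rather than Python; the code simply shows a z-score, an independent-samples t test and Cohen's d computed for two made-up groups.

    # Purely illustrative: invented data; SIMPLE modules use Excel and SPSS, not Python.
    from statistics import mean, stdev
    from scipy import stats

    group_a = [12, 15, 14, 10, 13, 16, 11, 14]   # e.g. scores under condition A
    group_b = [9, 11, 10, 12, 8, 10, 11, 9]      # e.g. scores under condition B

    # z-score: how far one observation lies from its group mean, in standard-deviation units
    z = (group_a[0] - mean(group_a)) / stdev(group_a)

    # Independent-samples t test
    t, p = stats.ttest_ind(group_a, group_b)

    # Cohen's d: standardized difference between the two means, using the pooled SD
    pooled_sd = (((len(group_a) - 1) * stdev(group_a) ** 2 +
                  (len(group_b) - 1) * stdev(group_b) ** 2) /
                 (len(group_a) + len(group_b) - 2)) ** 0.5
    d = (mean(group_a) - mean(group_b)) / pooled_sd

    print(f"z = {z:.2f}, t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")

In a module, the equivalent steps would be carried out in Excel or SPSS rather than in code.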

The SIMPLE system and modules will give both tutors and students a tool for more effective quantitative skills tuition.

For tutors, the modules offer elements of individualized instruction for students, with minimal intervention from the tutor. The modules developed in this project can be used as they stand, or modified by tutors to add, remove or change the examples, specific subtopics, individual activities or explanations. Although designing a module will require careful planning, the process will not require programming skills or the use of specialized software; modules are built from familiar types of files, including PowerPoint slides, Word documents and Excel spreadsheets. Tutors specify how the elements are to be used in a Module Definition Spreadsheet, which defines what options will be available to students under what conditions. Students who perform well on embedded assessments can be fast-tracked through a lesson; those who need more support can be guided to more explanations, examples and opportunities to apply their new knowledge. Tutors will be able to monitor and adjust students' progress through Excel files that record students' activities and performance. The pre-developed modules can provide substantial support to tutors who are new to teaching the subject, and could substantially ease the burden of new course development. Because any element of a module can easily be changed, modules can readily be developed jointly for use across multiple disciplines: the discipline-linked examples can be customized for each programme, while the activities and the explanations of quantitative concepts share a common structure.
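
The Module Definition Spreadsheet format itself is not described in this article, so the following Python sketch is only hypothetical: the field names (extra_support, fast_track_threshold) and the threshold rule are invented to illustrate the kind of branching the spreadsheet encodes, in which students who do well on an embedded assessment are fast-tracked while others see the additional support material.

    # Hypothetical sketch only: field names and the threshold rule are invented;
    # the real Module Definition Spreadsheet format is not documented here.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Element:
        name: str            # e.g. "z-scores: worked example 2"
        kind: str            # "explanation", "example", "activity" or "assessment"
        extra_support: bool  # shown only to students who need more help

    @dataclass
    class ModuleDefinition:
        topic: str
        fast_track_threshold: float   # proportion correct needed to skip extra-support elements
        elements: List[Element] = field(default_factory=list)

    def next_elements(module: ModuleDefinition, last_score: float) -> List[Element]:
        """Return the elements a student should see next, given the proportion
        correct (0.0-1.0) on their latest embedded assessment."""
        if last_score >= module.fast_track_threshold:
            # Fast track: skip the additional explanations, examples and activities.
            return [e for e in module.elements if not e.extra_support]
        # Otherwise present everything, including the extra support material.
        return list(module.elements)

Under these assumptions, a tutor could change a module's behaviour by editing spreadsheet rows rather than code, which matches the claim above that no programming skills are needed.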

For students, the system is designed to achieve two goals: changing students' beliefs, and improving their knowledge and skills. Our approach is to challenge their beliefs directly, as one step towards improving the efficacy of their tuition, and at the same time to ensure that the learning activities and materials are structured to develop fluent knowledge and skills.

Some students strongly express the belief that they "can't do maths/stats". Because formative assessment and practice activities are incorporated at an early stage as well as at later stages, students are likely to succeed. The scheduling of formative assessment is based on research showing that success on early tests leads to success on later tests (e.g., Fritz, Morris, Acton, Voelkel & Etkind, 2007; Fritz, Morris, Nolan & Singleton, 2007; Morris & Fritz, 2006; Morris, Fritz, Jackson, Nichol & Roberts, 2005). It is important to assess performance early, when students will almost certainly succeed: this early success demonstrates to students that they can succeed, and it contributes substantially to their later success. Our original modules will include preliminary assessment of new material quite early (following just one or two slides or activities), after a short while (following a few slides or activities) and again after a longer interval, providing feedback and clarification if needed at each stage. These formative assessments are embedded in the presentation of new material and the development of new activities. Further formative assessment on a topic will follow after longer intervals, especially for material that students had difficulty grasping in the first instance.
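
The exact scheduling rules are not given in this article; the sketch below, with invented gap sizes and an invented expansion factor, only illustrates the general idea of expanding retrieval practice described above: each successful test is followed by a longer gap before the next one, while a concept that was missed comes back sooner, with feedback in between.

    # Illustrative only: the initial gap and expansion factor are invented,
    # not taken from the SIMPLE system.
    def expanding_schedule(first_gap: int = 1, factor: int = 3, tests: int = 4) -> list:
        """Positions (counted in slides/activities after first exposure) at which
        to re-test a concept, with each gap longer than the last."""
        positions, gap = [], first_gap
        for _ in range(tests):
            positions.append(gap)
            gap *= factor
        return positions

    def next_gap(current_gap: int, succeeded: bool, factor: int = 3) -> int:
        """Expand the gap after a successful test; shorten it after an
        unsuccessful one so the concept returns sooner."""
        return current_gap * factor if succeeded else max(1, current_gap // factor)

    print(expanding_schedule())   # [1, 3, 9, 27] with the illustrative defaults

The expanding pattern reflects the findings cited above: early, almost-certain successes are followed by progressively more widely spaced retrieval attempts.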

Another belief that interferes with students’ learning is that the material is not relevant to their interests. To challenge this belief we will build into the materials a scaffolded version of a somewhat personalized, problem-based learning approach (e.g., Arts, Gijselaers, & Segers, 2006; Kirschner, Sweller, & Clark, 2006). The instructional materials will develop research questions that are relevant to the students themselves, or to engaging parts of their discipline. Data collection methods and research designs will be considered, and one or more choices will be pursued, each with an associated dataset. Although students are reasonably confident when taking the steps from a research question to a data collection method and research design, they are often lost when faced with data and the need to relate them to the research question. The concrete, personalized examples will lead students to see that data analysis concepts and tools are useful, and working through multiple, varied examples will demonstrate the broader relevance of these skills to students’ interests.

The same features that challenge interfering beliefs also benefit the learning process directly. Early assessment and practice strengthen learning, so that students are more likely to succeed on later assessment and practice. Varied examples provide practice in varied contexts, improving the likelihood that the principles and tools will be applied appropriately in novel contexts. Because the system selects activities based on the tutor’s criteria and the individual student’s performance, every student will engage in an appropriate amount of practice, at appropriate times: students who master a concept quickly and securely will require less practice before moving on to a new concept; those who are less successful will encounter more explanation, examples and practice. Another advantage of individualized, online tuition is that students can work privately or in small groups of their own choosing, avoiding potential embarrassment from needing to work more slowly or through more examples.

Over the next 17 months we will develop the basic software, 4 or 5 modules, and guidance for modifying the modules and developing new ones. By this time next year we will demonstrate the system and modules to interested parties, with plans to make the software and prototype modules available by late summer. Interested parties should contact Catherine Fritz.

 

References

Arts, J. A. R., Gijselaers, W. H., & Segers, M. S. R. (2006). Enhancing problem-solving expertise by means of an authentic, collaborative, computer supported and problem-based course. European Journal of Psychology of Education, 21, 71-90.
Bynner, J., & Parsons, S. (2000). The impact of poor numeracy on employment and career progression. In C. Tikly & A. Wolf (Eds.), The maths we need now (pp. 26-51). London: University of London Institute of Education (Bedford Way Papers).
Fritz, C. O., Morris, P. E., Acton, M., Voelkel, A. R., & Etkind, R. (2007). Comparing and combining expanding retrieval practice and the keyword mnemonic for foreign vocabulary learning. Applied Cognitive Psychology, 21, 499-526.
Fritz, C. O., Morris, P. E., Nolan, D., & Singleton, J. (2007). Expanding retrieval practice: An effective aid to preschool children’s learning. Quarterly Journal of Experimental Psychology. (Available online as a preview from the journal.)
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.
Morris, P. E., & Fritz, C. O. (2006). How to . . . improve your memory. The Psychologist, 19, 608-611.
Morris, P. E., Fritz, C. O., Jackson, L., Nichol, E., & Roberts, E. (2005). Strategies for learning proper names: Expanding retrieval practice, meaning and imagery. Applied Cognitive Psychology, 19, 779-798.
