Frema (Framework Reference Model for Assessment) is one of five e-Learning Framework (ELF) Reference Model projects in the JISC e-Learning Programme. JISC's e-Learning Programme aims to identify how e-learning can facilitate learning and to advise on how such e-learning systems might be implemented. A core part of this second aim is the e-Learning Framework (ELF), a service-oriented view of the core system modules required to support e-learning applications.
For this to occur it is necessary to develop ELF Reference Models that describe how areas of the e-learning domain map to the ELF and which can act as a driver for implementation and evaluation. Once complete, such reference models will ease the development of further services and promote the re-use of existing ones.
Figure 1: e-Learning Framework
This project will develop the reference model for the Assessment domain area. The JISC Circular 10/04 identifies six domain areas, of which five are the subject of the circular: assessment, learning content, enterprise, personal development planning, and personal learning environments. The learning content domain area reference model involves the specification of services for the design, construction and execution of learning activities that can be used and shared by multiple institutions and the lifelong learner.
The Assessment services will focus upon the creation, execution and recording of electronic assessments which are accessible across institutions and to the lifelong learner.
Frema is a 1 year collaboration between the Universities of Southampton, Strathclyde and Hull and is funded by a JISC grant of £148,095.
Assessment is a large and complex portion of the e-Learning Framework, as shown in Figure 2. It interacts with VLEs, Portals and Marking Tools at the User Agent layer; with multiple Learning Domain Services, including Sequencing tools, Grading, Marking, Reporting, Competency and Tracking; and it relies on most of the common services.
Figure 2: A Summary of the Assessment Domain (ELF Conference, November 2004; http://www.elearning.ac.uk/resources/Assessment.ppt)
The project is a collaboration between the Universities of Southampton, Strathclyde and Hull, and the team includes many of the key players in this domain. Members of the Frema project team have been working on defining and implementing the components of an assessment framework for some years through initiatives such as:
- TOIA (Technologies for Interoperable Assessment - University of Strathclyde, (X4L Strand dB) http://www.toia.ac.uk/ ),
- APIS (Assessment Provision through Interoperable Segments - University of Strathclyde, eLearning Framework and Tools strand),
- ASSIS (Assessment Sequencing - University of Hull (eLearning Tools) http://www.jisc.ac.uk/deletassis.html),
- QTI SIG (CETIS Question and Test Interoperability Special Interest Group) http://www.cetis.ac.uk/groups/20010801132906/viewGroup.
The Frema team are in regular communication with other active teams, such as the TIP project (Authentication and Authorisation, University of Oxford, http://www.jisc.ac.uk/delettip.html) and Serving Maths (University of York, http://www.jisc.ac.uk/deletsm.html).
The ultimate aim of the ELF is for each service to reference an open specification or standard and for open-source implementations to be available. The stated intention of the ELF effort is to facilitate the integration of commercial, home-grown and open-source components by agreeing common service definitions, data models and protocols.
The e-Learning Framework Reference Model for Assessment will be composed of several distinct parts that:
- Define its domain scope in terms of existing practice
- Relate that scope explicitly to the ELF in the form of a service profile and service descriptions
- Provide prototype services that fulfil the profile.
The development of an ELF Reference model for Assessment will also be expected to contribute to the ELF itself, in terms of new service definitions and implementations.
Wilson, Blinco and Rehak (2004) describe a reference model as "... a selection of Services defined in one or more Frameworks together with rules or constraints on how those Services should be combined to realize a particular functional, or organisational goal. A Reference Model constrains the number of unique organisational infrastructures". The Assessment reference model will include:
- A domain definition that will describe the scope of the Assessment domain, including evidence that gives an overview of current practices, processes and systems.
- A set of use cases that describe common solution patterns in the Assessment domain area.
- Service profile definitions for both existing services and those that need to be developed for the Assessment domain area, their scope, behaviour, and data.
- Assessment reference model prototype implementations.
The Assessment reference model project will build upon existing specifications and standards from JISC, IMS, and other projects. In particular, it is expected to reference: agreed standards such as SOAP and WSDL of the W3C; Learning Design, Content Packaging, Simple Sequencing, and Question and Test Interoperability of IMS; the OSIDs from OKI and the current SAKAI initiative of the US Universities consortium; and the current ELF and associated specifications from JISC and CETIS.
The Assessment reference model project is likely to identify additional work, and may identify new developments, that could be usefully introduced into these standards. Such material will be discussed with CETIS and the relevant SIGs.
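As a concrete (and deliberately simplified) illustration of what referencing IMS QTI means in practice, the sketch below parses a QTI-flavoured multiple-choice item using only the Python standard library. The element names and structure here are an illustrative assumption, not conformant IMS QTI markup, which uses <questestinterop>, namespaced elements and much richer response processing.

```python
# Illustrative only: a much-simplified, QTI-flavoured item, NOT a
# conformant IMS QTI document (real QTI items use <questestinterop>,
# <response_lid>, namespaces, and full response-processing rules).
import xml.etree.ElementTree as ET

ITEM_XML = """
<assessmentItem identifier="q1" title="Capital cities">
  <prompt>Which city is the capital of Scotland?</prompt>
  <choice identifier="a">Glasgow</choice>
  <choice identifier="b">Edinburgh</choice>
  <correctResponse>b</correctResponse>
</assessmentItem>
"""

def parse_item(xml_text):
    """Extract the identifier, prompt, choices and key from a simplified item."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("identifier"),
        "prompt": root.findtext("prompt"),
        "choices": {c.get("identifier"): c.text for c in root.findall("choice")},
        "key": root.findtext("correctResponse"),
    }

item = parse_item(ITEM_XML)
print(item["prompt"], "->", item["choices"][item["key"]])
```

A real assessment service would of course validate against the QTI schema and handle many item types; the point here is only that an agreed XML interchange format is what lets independently developed authoring, delivery and marking tools share the same item bank.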
- Open Middleware Infrastructure Institute (OMII) This project will use OMII to provide the infrastructure on which to build the proof of concept implementations. OMII_1 is a collection of tested, documented and integrated software components that provides a standard platform for integrating e-Science middleware, as well as a simple, secure web service-based Grid infrastructure for new e-Science users. It provides an infrastructure that allows collaborative working between users (clients) and providers of grid resources and applications in a trusted and secure environment. OMII_1 is centred around two sets of standards: web services and grid services. Web services provide service-level server-agent connectivity, utilising SOAP for communication, WSDL for web service specification and publication, and XML Security. The web services are provided by Apache Axis and hosted by Apache Tomcat.
The Open Middleware Infrastructure Institute is funded as part of the UK e-Science programme, with initial funding of £6.5M over 3 years starting from 1 January 2004. OMII engineers reliable, resilient, robust distributions of grid infrastructure with generic services. The UK OMII vision is to be the source for reliable, interoperable, open-source Grid middleware.
- CETIS The team is closely connected with a number of CETIS groups, especially the Assessment SIG which is managed by members of the team. The team has a close interest in the development of IMS QTI and other IMS standards.
- DeL regional pilot projects Substantial provision has been made over the duration of the Assessment reference model project for support of, and co-operation with, DeL projects. In addition to the intention to publish and disseminate material via the Assessment reference model project Web site, some 70 person-days have been budgeted for support in work package 4. A further six full-day meetings at Southampton and six full-day-and-evening regional meetings have been budgeted across all the work packages.
- Other ELF Assessment projects Key parts of the project (such as the creation of a domain definition and the development of a gap analysis and service profile) will relate the Assessment Reference Model to other projects working with the ELF. Such projects may be developing service requirements or definitions themselves, be interested in services developed as part of the Reference Model, or may have implementations that can be evaluated as part of the Reference Model prototype. In the Assessment domain area, such projects include:
- TOIA (Technologies for Interoperable Assessment) - University of Strathclyde (X4L Strand dB)
- APIS (Assessment Provision through Interoperable Segments) - University of Strathclyde (eLearning Framework and Tools Strand)
- ASSIS (Assessment Sequencing) - University of Hull - (eLearning Tools)
- TIP (Tools Integration -Authentication and Authorisation) - University of Oxford - (eLearning Tools)
The project will begin by providing a more detailed and in-depth analysis and specification of the Assessment reference model requirements, and by specifying the standards for the Assessment reference model.
To do this a domain definition will be produced that describes the scope of the Assessment domain, and a portfolio will be assembled as evidence that gives an overview of current practices, processes and systems. Standards relevant to the Assessment domain area will be reviewed (eg IMS QTI), as will the relevant DeL regional pilot projects, relevant JISC projects (eg TOIA, ASSIS, APIS) and other projects.
In the light of these reviews a support plan for the relevant DeL regional pilot projects will be developed.
Work Package 2 represents a first iteration of the design of the reference model and implementation of a prototype. In this first iteration, a particular portion of the Assessment domain area will be addressed (expected to be around half of the domain) and use cases and scenarios will be identified and documented. The particular portion is expected to be the more straightforward aspects of the Assessment domain area, where relevant Assessment services can be readily identified and related to the reference model.
Use cases will be developed that describe common solution patterns in the Assessment domain area. These will be coupled with narrative descriptions of the elements/actors and processes involved in the domain area and the ways in which they interact.
Based on these use cases a gap analysis will be undertaken. The current state and status of the e-Learning Framework with respect to the Assessment domain area will be established and existing ELF services relevant to the Assessment domain area will be identified. This analysis will provide an initial table of service gaps in the e-Learning Framework, as well as an identification of the Assessment domain area services to be implemented.
A Service Profile will be produced for the Assessment domain area that describes the existing and new services in context and gives details of each service in terms of its functional scope, data and behaviour, and API. If new services are required for the Assessment domain area then they will be scoped, and their behaviour and data defined.
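One way to picture what a Service Profile entry might capture, purely as an illustrative sketch and not the FREMA profile itself, is a small record of a service's functional scope, the data it exchanges, and its operations (the "behaviour and API" above). The Grading service and every field name below are invented for illustration.

```python
# Hypothetical sketch of one Service Profile entry. "Grading" and its
# operations are assumptions made for illustration, not definitions
# taken from the FREMA Service Profile.
from dataclasses import dataclass, field

@dataclass
class Operation:
    name: str          # API operation, e.g. "recordGrade"
    inputs: list       # named input messages
    output: str        # named output message

@dataclass
class ServiceProfileEntry:
    service: str                                   # service name
    scope: str                                     # functional scope
    data: list = field(default_factory=list)       # data objects handled
    operations: list = field(default_factory=list) # behaviour / API

grading = ServiceProfileEntry(
    service="Grading",
    scope="Record and retrieve grades for completed assessments",
    data=["GradeRecord", "LearnerId", "AssessmentId"],
    operations=[
        Operation("recordGrade", ["LearnerId", "AssessmentId", "GradeRecord"], "Ack"),
        Operation("getGrades", ["LearnerId"], "GradeRecord[]"),
    ],
)
print(grading.service, [op.name for op in grading.operations])
```

In the actual profile each such entry would be expressed as a WSDL description and accompanying data model rather than program code; the record form above simply shows the kind of information each service description must pin down.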
A proof of concept prototype will be designed and implemented based around the Assessment domain area use cases and the Service Profile.
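To make the shape of such a prototype concrete, the sketch below shows how a proof-of-concept client might frame a call to a hypothetical Assessment service as a SOAP 1.1 envelope, using only the Python standard library. The service namespace and operation name are invented for illustration and are not part of any agreed service definition.

```python
# Minimal sketch (not the project's actual prototype) of building a
# SOAP 1.1 request for a hypothetical Assessment service operation.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope
SVC_NS = "urn:example:assessment"                      # assumed service namespace

def build_envelope(operation, params):
    """Wrap an operation call and its parameters in a SOAP Body."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

msg = build_envelope("submitResponse", {"itemId": "q1", "response": "b"})
print(msg)
```

In practice the prototype would generate such messages from the WSDL in the Service Profile via a toolkit such as Axis rather than by hand; the point is only that each use case ultimately reduces to exchanges of messages like this one.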
In work package 3, the other portion of the Assessment domain area will be identified (expected to be the remaining half of the domain) and use cases and scenarios will be identified and documented. This particular portion is expected to be the more awkward aspects of the Assessment domain area, which will benefit from the work of iteration 1.
Experience gained from the first iteration of the reference model design will be used to revise the use cases described in WP2 and also to inform the development of the remaining use cases planned for iteration 2.
The analysis of the use cases will augment the table of service gaps in the e-Learning Framework. The gap analysis from the first iteration will be revised and extended to cover the new portion of the Assessment domain area where necessary.
The Service Profile will be updated to describe the existing and new services in context, and the service descriptions will be updated or augmented. If new services are required for the second iteration of Assessment then they will be scoped, and their behaviour and data defined.
The proof of concept prototype from the first iteration will be extended to support the new use cases and service profile.
At the end of iteration 2 the proof of concept prototypes will be evaluated and the Assessment reference model and service definitions further revised.
Work package 4 is a substantial project activity that runs throughout the project. Its components include: project dissemination (Web site construction and maintenance, developing and running workshops, providing conference posters, presenting conference papers, and authoring and publishing journal articles); support for DeL regional pilot projects in the Assessment domain area; collaboration and coordination with the other e-Learning Framework reference model projects; collaboration and validation activities with CETIS and the CETIS Assessment SIG; and collaboration with other relevant Assessment domain area bodies and authorities (eg IMS).