Digital library of educational resources and services: evaluation of components

The notions of a digital library of educational resources and services (DLE) and its main components are presented in the article. A DLE is considered here to be an aggregate of knowledge repositories and services organized as a complex information system. The requirements and the framework for such a system's architecture design, aimed at implementing an approach based on the reusability of the main DLE components and learning customisation possibilities for its users, are briefly presented in the article. The main components of such DLEs are learning objects (LOs, i.e. learning assets (LAs) and units of learning (UoLs)), their repositories, and appropriate services such as virtual learning environments (VLEs). The article aims to describe an original LOs evaluation instrument based on the presented approach to DLE design, as well as an original method for the complex evaluation of VLEs combining pedagogical, organizational and technical evaluation criteria. A homogeneous, simple and clear criteria rating system is suggested for the evaluation of all main DLE components.


DLE: Concept, Requirements and Design Framework
DLEs are considered here to be aggregates of "knowledge repositories, and services, organized as complex information systems" (Digital Libraries…, 2003). The notion 'knowledge' is used here as a synonym of 'digital learning resources' (LRs). Further, the notion 'digital learning resources' is used as an 'umbrella' notion for different kinds of digital learning content such as 'learning objects', 'learning assets', 'units of learning' and 'learning courses'. The following LO notion is considered here the most suitable basic component for the creation of a pedagogically and organizationally flexible, cost-effective DLE: "LO is any digital resource that can be reused to support learning" (Wiley, 2000). Learning assets (LAs) are considered here as the smaller, pedagogically decontextualised parts (pieces) that LOs can be combined of (Jevsikova, Kurilovas, 2006). A Unit of Learning (UoL) itself and all its components are considered here as embedded LOs, including learning objectives, prerequisites, learners' or trainers' roles, activity assignments, information objects, communication objects, tools and questionnaire objects (Paquette, 2004). LO repositories are considered here as properly constituted systems (i.e. organised LO collections) consisting of LOs, their metadata and tools / services to manage them.
The presented approach to the DLE model is based on the idea that such a system should rest mainly on 'ultimately reusable' LOs, their repositories and appropriate services such as VLEs. This kind of DLE should be technologically stable and effective from the pedagogical, organisational and socio-economic points of view. The ultimate reusability of LOs should be ensured by their partition into two main separate parts (LAs and UoLs) which work independently and have clearly different functions. LAs are considered not to be directly interconnected with particular pedagogical processes / scenarios / designs, and therefore it should be possible to reuse the same LAs to implement different learning designs. UoLs, conversely, are considered to be LOs containing learning designs reusable for different subjects and different LOs / LAs. This kind of 'reusable' DLE design seems to be one of the best possible e-learning solutions from the technological, educational, organizational and socio-economic points of view. Detailed evidence for this statement is beyond the scope of the article; in short, the reusability of the main components ensures the system's pedagogical and organizational flexibility as well as better financial and economic efficiency indicators, such as less investment in LRs per probable user, greater financial benefit, a shorter payback period, etc.
This could be achieved because: (1) greater reusability of the main DLE components is achieved; (2) more users can benefit from such a system; (3) content and learning design creators have the possibility not to reinvent the wheel but to use and improve already created LRs; (4) better conditions are created for various content / design creators to improve the quality of existing LRs by their permanent (collaborative) modification.
The main scientific and technological decisions to provide the ultimate reusability of DLE content and services are enumerated at the end of the article, before Figure 1.

Evaluation of Learning Objects
A LR truly becomes a LO (a resource reusable within another learning context) when it is associated with self-describing information: metadata. Metadata is used to implement LO repositories, to search for LOs in a repository, to share LOs, to import LOs into or export them from VLEs, and to combine them with other LOs (using them as building blocks to build lessons, courses, etc.) (Jevsikova, Kurilovas, 2006). The various approaches to LOs attempt to meet two common objectives: (1) to reduce the overall costs of LOs, and (2) to obtain better LOs. The need for reusability of a LO has at least three elements: firstly, that it is interoperable and can be used on different platforms; secondly, that it can fit into a variety of pedagogic situations; thirdly, that it can be made more appropriate to a pedagogic situation by modifying it to suit a particular teacher's or student's needs (McCormick et al., 2004).
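As an illustration of metadata-driven search, the sketch below stores a few LO records as simple metadata dictionaries and filters them by field values. The field names and records are hypothetical simplifications; real repositories use full IEEE LOM or Dublin Core structures.

```python
# A minimal sketch of metadata-driven LO search. Field names and records
# are hypothetical; real repositories use IEEE LOM / Dublin Core records.
from typing import Dict, List

# Each LO is described by a small flat metadata record.
repository: List[Dict[str, str]] = [
    {"title": "Fractions intro", "language": "en", "format": "text/html",
     "keywords": "mathematics, fractions"},
    {"title": "Ivadas i trupmenas", "language": "lt", "format": "text/html",
     "keywords": "mathematics, fractions"},
    {"title": "Photosynthesis quiz", "language": "en",
     "format": "application/x-quiz", "keywords": "biology"},
]

def search(repo: List[Dict[str, str]], **criteria: str) -> List[Dict[str, str]]:
    """Return LOs whose metadata contains every requested criterion value."""
    return [lo for lo in repo
            if all(value in lo.get(field, "") for field, value in criteria.items())]

# Find English-language LOs about fractions.
print([lo["title"] for lo in search(repository, language="en", keywords="fractions")])
```

The same filtering principle underlies repository search services; a production system would of course index structured LOM fields rather than scan flat strings.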
The evaluation of LOs is a comparatively new concern, as the quantity of LOs has grown and LO repositories have been developed to allow for greater ease in finding and using LOs for both classroom and online instruction. The need to evaluate LOs requires the development of criteria to be used in judging them (Haughey, Muirhead, 2005). In one such set of ten criteria, ending with (10) accessibility, each measure was weighted equally and was rated on a four-point scale from "weak" to "moderate" to "strong" to "perfect".

Foreign Approaches to Learning Object Evaluation
The criteria used by Merlot (Merlot) to review LOs for acceptance into its repository fall into three broad areas: (1) quality of content: including consideration of the quality of the specific information in the LO and how well the content models fit the skills of the discipline; (2) potential effectiveness as a teaching-learning tool: including the "actual effectiveness" of the object through personal use, or judgments by faculty and students about its potential effectiveness for improving instruction and learning; (3) ease of use: including consideration of the general layout of the LO, the computer interface, and attention to the buttons, menus, text and types of user-object navigation. Merlot peer reviewers used a five-star scale: from one star denoting "material not worthy of use" to a five-star rating representing "excellence all around".
The Collaborative Learning Object Exchange (CLOE), based at the University of Waterloo (Ontario, Canada), has developed a peer review process for material in its LO repository. Peer reviewers (instructional designers and subject matter experts) were asked to evaluate the LOs on the merits of the quality of the content, its effectiveness as a teaching tool and its ease of use. The 17 CLOE criteria are: (1) the content of the LO is accurate; (2) the use of technology is appropriate for this content; (3) the content is presented clearly and professionally (spelling / grammar, etc.); (4) appropriate academic references are provided; (5) credits to creators are provided; (6) there are clear learning objectives; (7) the LO meets the stated learning objectives; (8) the target learners are clearly identified; (9) there are clear instructions for using the LO; (10) the technology helps learners to engage effectively with the concept / skill / idea; (11) the LO provides an opportunity for learners to obtain feedback within or outside the LO; (12) the author provides evidence that the LO enhances student learning; (13) pre-requisite knowledge / skills, if needed, are identified; (14) the LO stands alone and could be used in other learning environments; (15) the LO is easy to use (i.e. navigation, user control); (16) the author indicates whether the LO is accessible to learners with diverse needs; (17) technical requirements for the LO are provided (Haughey, Muirhead, 2005).
More recently, the LO Evaluation Instrument (LOEI) was developed to examine school-level content. The 15 LOEI criteria are listed below. Integrity: (1) the content of the LO is accurate and reflects the ways in which knowledge is conceptualized within the domain. Usability: (2) clear instructions for using the LO are provided; (3) the LO is easy to use (i.e. navigation, user control, visibility of system status). Learning: (4) learning objectives are made explicit to learners and teachers; (5) the target learners are clearly identified and addressed; (6) pre-requisite knowledge / skills are clear, with connections to prior and future learning. Design: (7) the technology helps learners to engage effectively with the concept / skill / ideas; (8) the LO structures information content in order to scaffold student learning; (9) the LO provides an opportunity for learners to obtain feedback either within or outside the LO; (10) the LO stands alone and reflects an awareness of the varying educational environments in which learning sequences and objects may be used by the learner. Values: (11) the LO is appropriate for community and cultural affiliations, including language, dialect, reading and writing; (12) help and documentation files are provided for students and teachers, including contextual assistance; (13) the design of visual and auditory information enhances learning and mental processes; (14) the LO is accessible to learners with diverse needs; (15) the LO does not require instructor intervention to be used effectively in a mixture of learning environments and learning sequences.

Approved Lithuanian Learning Objects Evaluation Instrument
Educational content and software purchased for Lithuanian schools have to be approved by a special IT expert group and subject expert groups. A special 'Method of Schools Provision with Computer Teaching Aids' for the certification of educational software and content was approved in June 2005. The evaluation criteria established by this Method are: (1) quality of educational material; (2) psychological and pedagogical aspects; (3) learning management and interactivity; (4) user interface; (5) users' management possibilities; (6) tools (design possibilities); (7) communication and collaboration possibilities and tools; (8) technical features; (9) documentation and additional tools; (10) economic efficiency (Kurilovas, 2005b).

Original Learning Objects Evaluation Instrument
The article proposes an original LOs evaluation instrument based on the proposed approach to DLE architecture design as well as on the above-mentioned LOs evaluation criteria. In conformity with this instrument, the LOs evaluation criteria are: (1) reusability; (2) quality of content; (3) design and usability; (4) economic efficiency (their sub-criteria are detailed before Figure 1). Each selected criterion is proposed to be given an importance rating to be used when evaluating LOs. Major criteria have to be broken down into sub-criteria, with each sub-criterion also having an importance rating. The importance rating range is 0-4, with 0 being the lowest and 4 the highest importance. Each sub-criterion then has to be rated using a range of 0-4, these ratings defined as: 0 - failed, or the feature does not exist; 1 - poor support, and / or it can be done but with significant effort; 2 - fair support, but needs modification to reach the desired level of support; 3 - good support, needing a minimal amount of effort; 4 - excellent support, meeting the criteria out of the box with minimal effort (Technical Evaluation…, 2006).
The article proposes to weight all LOs evaluation criteria equally and to use this simple and clear criteria rating system for the evaluation of all components of the DLE: LOs, LO repositories and VLEs.
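The suggested rating system can be sketched as follows. The article defines only the two 0-4 scales, so the aggregation into a single score via a normalised weighted sum, as well as the sample sub-criteria values, are assumptions for illustration.

```python
# A minimal sketch of the suggested criteria rating system. The article
# defines the 0-4 importance and 0-4 rating scales; the normalised
# weighted-sum aggregation and the sample values below are assumptions.
from typing import Dict, Tuple

# sub-criterion name -> (importance 0-4, rating 0-4)
Scores = Dict[str, Tuple[int, int]]

def weighted_score(sub_criteria: Scores) -> float:
    """Normalised weighted sum: 1.0 means every sub-criterion was rated 4."""
    total = sum(importance * rating for importance, rating in sub_criteria.values())
    maximum = sum(importance * 4 for importance, _ in sub_criteria.values())
    return total / maximum if maximum else 0.0

reusability: Scores = {            # hypothetical sample evaluation
    "interoperability": (4, 3),
    "decontextualisation": (3, 2),
    "accessibility": (2, 4),
    "cultural appropriateness": (1, 1),
}
print(weighted_score(reusability))
```

Because the score is normalised, evaluations of LOs, LO repositories and VLEs with different numbers of sub-criteria remain directly comparable, which is the point of the homogeneous rating system the article suggests.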
It would be purposeful not to incorporate pedagogically contextualised aspects (that is, everything dealing with the pedagogical processes / methods / scenarios of LO usage) into LOs, but to describe them in separate LD-compliant UoLs, and to evaluate these pedagogical criteria while evaluating UoLs.

Technical Evaluation of Learning Object Repositories
One of the largest repository technical evaluation projects ('Open Access Repositories') was implemented in New Zealand in 2006.
Looking particularly for assurances that the selected repository (or repositories) had a secure future, the criteria selected for this evaluation were: (1) scalability; (2) ease of working on the code-base, extensibility; (3) security; (4) interoperability (the ability to integrate with other repositories, i.e. OAI-PMH compliance, and ease of integration with systems such as VLEs); (5) ease of deployment, ability to support multiple installations on a single platform (required for a hosting facility); (6) ease of system administration (ability to configure for different uses); (7) internationalisation: multiple language interfaces; (8) open source (type of license); (9) quality and configurability of workflow tools; (10) strength of community (Technical Evaluation…, 2006).
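Criterion (4) mentions OAI-PMH compliance. A basic compliance probe issues an Identify request (an HTTP GET of the repository base URL with `verb=Identify`) and inspects the XML response. The sketch below parses a canned, abridged response offline; the repository name and values are hypothetical.

```python
# A sketch of checking OAI-PMH compliance (criterion 4). A live check would
# GET <base-url>?verb=Identify; here we parse a canned, abridged response.
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

# Abridged Identify response; the repository name is hypothetical.
sample_response = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <Identify>
    <repositoryName>Example LO Repository</repositoryName>
    <protocolVersion>2.0</protocolVersion>
  </Identify>
</OAI-PMH>"""

def identify(xml_text: str) -> dict:
    """Extract the fields of an OAI-PMH Identify response as a plain dict."""
    root = ET.fromstring(xml_text)
    info = root.find(OAI_NS + "Identify")
    return {child.tag.replace(OAI_NS, ""): child.text for child in info}

print(identify(sample_response))
```

A repository that answers Identify (and the related ListMetadataFormats and ListRecords verbs) correctly can be harvested by other repositories, which is exactly the integration ability this criterion scores.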
Each selected criterion was given an importance rating to be used when evaluating the different repository systems. Major criteria were also broken down into sub-criteria, with each sub-criterion also having an importance rating, and the above-mentioned criteria rating system was used to evaluate the repository systems.

Evaluation of Virtual Learning Environments
There are different kinds of ICT tools and systems to support various pedagogies: so-called e-learning platforms, VLEs, Learning Management Systems, Content Management Systems, etc. The term VLE is used here as "a single piece of software, accessed via a standard Web browser, which provides an integrated online learning environment". VLEs usually include the following functions: (1) controlled access; (2) student tracking; (3) resources and materials; (4) communications; (5) links; (6) customisation (Kurilovas, 2006).

Technical Evaluation of VLEs
The suggested framework for the technical evaluation of VLEs is based on the Methodology of Technical Evaluation of LMSs (Technical Evaluation…, 2004). One of its goals was to select a best-of-breed LMS for development and large-scale deployment among a number of short-listed systems. Findings of the short-listed VLEs' technical evaluation show that the systems differ significantly in their design, architecture and implementation. In the overall evaluation, Moodle shows a clear advantage, particularly in criteria that are critical to the long-term viability of the system. The primary differentiating advantages of Moodle are: (1) System architecture: Moodle's main strength is its simple but solid design and architecture. Moodle's architecture sets an excellent foundation, following the good practices of low coupling and high cohesion, which the other LMSs fail to achieve. This yields a system that is simple, flexible and effective, and easily accessible to developers. The Moodle approach is pragmatic, using intelligent strategies. Authentication is modular and separate from the rest of the modules. This allows easier integration with a portal framework, and an interface to student management systems.

(2) Community: There is a lively developer community built around Moodle, with programmers other than the main maintainer contributing sizeable modules and fixes; this is a second criterion the other systems fail to meet.
Moodle does have limitations: notably, it currently lacks IMS support, and its roles and permissions system is limited. ATutor, while strong in features and usability, has serious architectural problems. Ilias, while promising, has a complex, tightly coupled architecture that is hard to work with and debug. Its code is new and lacks maturity, and the developer community for Ilias is very small outside the core team.
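The "modular authentication" praised above can be illustrated in generic form as a plugin interface behind which concrete backends are interchangeable. This is an illustrative sketch only: Moodle's actual plugin API is written in PHP and differs in detail, and the class and method names here are invented.

```python
# A generic sketch of modular authentication (low coupling): the system
# depends only on an interface, not on any concrete backend. Illustrative
# only; Moodle's real plugin API is PHP and differs in detail.
from abc import ABC, abstractmethod

class AuthPlugin(ABC):
    """Any backend implementing this interface can be swapped in unchanged."""
    @abstractmethod
    def authenticate(self, username: str, password: str) -> bool: ...

class ManualAuth(AuthPlugin):
    """In-memory accounts, for internal users."""
    def __init__(self, users: dict):
        self.users = users
    def authenticate(self, username: str, password: str) -> bool:
        return self.users.get(username) == password

class LdapAuth(AuthPlugin):
    """Placeholder for a directory backend (no real LDAP call is made)."""
    def __init__(self, server_url: str):
        self.server_url = server_url
    def authenticate(self, username: str, password: str) -> bool:
        raise NotImplementedError("would query " + self.server_url)

def login(plugin: AuthPlugin, username: str, password: str) -> str:
    # The rest of the system sees only AuthPlugin, never a concrete backend.
    return "ok" if plugin.authenticate(username, password) else "denied"

print(login(ManualAuth({"alice": "secret"}), "alice", "secret"))
```

Separating authentication behind such an interface is what makes integration with portal frameworks or student management systems a matter of writing one new plugin rather than modifying the core.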

Pedagogical and Organizational Evaluation of VLEs
For a more complex evaluation of VLEs it is suggested to additionally use the Framework for the Pedagogical Evaluation of VLEs (Britain & Liber, 2004), which aims to help analyse e-learning tools without being distracted by the details of user interface objects and components. The Framework suggests answering a number of key questions, derived from the Viable System Model (VSM) and the Conversational Framework, when evaluating VLEs: (1) Programme level: Can you obtain a view at programme level? Does the system permit or provide a space for negotiation between programme managers and module tutors on resource questions? Can the performance of a module be monitored by the programme manager? Does the system provide tools for new modules to go through design, development and validation and then be added to a programme? How does the system support teachers working on different modules in coordinating their activities and assisting each other, etc.? (2) Module level: What tools does the system provide for teachers to present / express their ideas to students? What tools does the system provide for students to articulate their ideas to teachers and other students? What facilities are there to organise learners in a variety of ways in the module? What types of learning activity are supported by the system? What underlying pedagogical model(s) or approach(es) does the system encourage? What facilities are there to monitor how well learning is progressing in the module? Can learners find and manage resources, i.e. do they have their own file stores or repositories? Can they talk to other students, create their own discussions, create their own learning activities involving peers? To what extent is it possible for the teacher to adapt the module structure once teaching is underway: can the teacher add / change / delete resources or fragments of module structure? Can he / she add / remove people or split them into different groups? Can he / she create and assign resources or learning activities to individuals, etc.?
(3) Student level: How is the system student-centred? Does the system provide time management / planning / organisation tools for the individual student to organise their work? Can a student monitor his / her own activity? Can students provide feedback on the quality of the module? Can a student do Personal Development Planning within the system, etc.?

Conclusions
The article proposes an original LOs evaluation instrument based on the proposed approach to DLE architecture design and on international experience. The main LOs evaluation criteria are: (1) reusability; (2) quality of content; (3) design and usability; (4) economic efficiency. It is suggested not to incorporate pedagogically contextualised aspects into LOs, but to describe them in separate LD-compliant UoLs, and to evaluate these pedagogical criteria while evaluating UoLs. An original method for the complex evaluation of VLEs, combining pedagogical, organizational and technical evaluation criteria, is also proposed by the article. A homogeneous, simple and clear criteria rating system is suggested for the evaluation of all components of the DLE.
Full implementation of the following could provide the ultimate reusability of DLE content and services: (1) LO metadata interoperability standards such as the EUN Learning Resource Exchange (LRE) Metadata Application Profile (AP) version 3.0 of the IEEE LOM standard (LRE AP, 2007) and specifications such as IMS Common Cartridge (an IMS Content Package AP integrating QTI and LOM) and Learning Design (LD) (IMS LD, 2003); (2) a repository of LD-compliant Units of Learning (UoLs) and tools (e.g. RELOAD, LAMS v.2.0.3 together with Moodle v.1.8, EduSource, etc.) to create and reuse UoLs; (3) a LOM repository containing LOs' and UoLs' metadata created in conformity with the newest LRE AP and thesaurus; (4) a LOs digital rights management (DRM) system; (5) the CALIBRATE project's Topic - Goal - Learning Activities (TGA) ontology-based curriculum mapping in the main subjects to search for LOs in the repositories and VLEs. The framework of DLE architecture design based on the introduced DLE approach and European LRE experience is presented in Figure 1.

Figure 1. Proposed framework of DLE architecture design

The sub-criteria of the proposed LOs evaluation criteria are the following. (1) Reusability: (a) interoperability (metadata, compliance with the main standards; can the LO be used in different learning platforms / VLEs?); (b) decontextualisation level (LO granularity level; can the LO be reused a number of times in different learning contexts?); (c) accessibility (is the LO designed for all?); (d) appropriateness for different cultural and learning systems (LO internationalisation level; is the LO suitable for localisation?). (2) Quality of content: (a) content accuracy; (b) compliance with national curricula; (c) clear and professional presentation (spelling / grammar, whether appropriate academic references are provided, etc.); (d) interactivity. (3) Design and usability: (a) aesthetics; (b) ease of use (i.e. navigation, user control, etc.); (c) user-friendly interface. (4) Economic efficiency (taking into account the number of probable users, based on the LO reusability level) (Kurilovas, 2007).