BS 7988:2002 — Code of practice for the use of information technology (IT) in the delivery of assessments.
f) the confidentiality and integrity of candidate data should be maintained throughout;
NOTE Attention is drawn to the Data Protection Act 1998 [3].
g) an audit trail should be maintained, so that any queries or irregularities can be investigated;
h) back-up facilities and fall-back procedures should be in place to minimize disruptions so that the candidate is not disadvantaged, especially for high-stakes assessment.
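As an illustration of g), an audit trail can be made tamper-evident by chaining each logged event to a hash of the previous entry, so that any later alteration is detectable during an investigation. The sketch below is a minimal Python example; the class and field names are illustrative assumptions, not defined by the standard.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only event log; each entry carries a hash chained to the
    previous entry's hash, so tampering with any entry is detectable.
    (Illustrative sketch only; not specified by BS 7988.)"""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the chain

    def record(self, candidate_id, event, detail=""):
        entry = {
            "time": datetime.now(timezone.utc).isoformat(),
            "candidate": candidate_id,
            "event": event,
            "detail": detail,
            "prev_hash": self._last_hash,
        }
        # Hash the entry body (entry does not yet contain its own hash).
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = self._last_hash
        self.entries.append(entry)

    def verify(self):
        """Recompute the chain; return True only if no entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "hash"}
            prev = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["hash"] != prev:
                return False
        return True
```

In practice such a log would also be written to durable storage, which ties in with the back-up requirement in h).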
5 Interface between assessment content and IT delivery
NOTE This clause concerns the inter-relationship between assessment content and assessment software and is relevant both to assessment sponsors, who define the pedagogical requirements of the assessment, and to assessment distributors, who are responsible for developing or customizing the software. Assessment sponsors should comply with 5.1 and assessment distributors with 5.2.
This clause is applicable to both high-stakes and low-stakes assessments and to the use of both generic assessment software (capable of running a range of different assessments) and software which is specific to a single assessment or group of assessments.
5.1 Responsibilities of assessment sponsors
5.1.1 Assessment sponsors should ensure that they have sufficient familiarity with the software which is being considered or which is to be used and the associated delivery platform to be able to:
a) understand their advantages and limitations;
b) appreciate the likely effect of IT delivery on the validity and reliability of the assessment and of individual items;
c) use software features (e.g. item types, multimedia elements) relevant to the intended assessment;
d) identify assessments for which IT delivery is unsuitable or should be supplemented by other assessment methods.
5.1.2 Assessment sponsors should specify clearly the parameters required for each assessment, including:
a) number and type of items to be used;
b) how the items are selected for each assessment session (e.g. fixed assessment form, computer selection from a bank, any constraints on selection);
c) any time limit;
d) any restraints on navigation between items (see 7.1);
e) assessment regulations, including permitted and prohibited resources (see 6.4.3 and 7.3.3);
f) scoring rules for individual items and for the calculation of the overall result (if applicable), including rules for the scoring of open-ended items (see 8.1 and 8.2);
g) feedback to be provided (see 8.3).
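The parameters listed in 5.1.2 a) to g) can be captured in a structured specification that the sponsor hands to the distributor. A minimal sketch in Python follows; all field names are illustrative assumptions, not terminology from the standard.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AssessmentParameters:
    """One possible structured record of the sponsor-specified
    parameters in 5.1.2 a)-g). Field names are illustrative."""
    item_count: int                                  # a) number of items
    item_types: list                                 # a) permitted item types
    selection: str                                   # b) "fixed_form" or "from_bank"
    selection_constraints: dict = field(default_factory=dict)  # b) constraints
    time_limit_minutes: Optional[int] = None         # c) None means untimed
    free_navigation: bool = True                     # d) may candidates revisit items?
    permitted_resources: list = field(default_factory=list)    # e) e.g. calculator
    prohibited_resources: list = field(default_factory=list)   # e)
    scoring_rules: dict = field(default_factory=dict)          # f) per-item and overall
    feedback: str = "score_only"                     # g) e.g. "none", "per_item"
```

A record like this doubles as the checklist the distributor is expected to provide under 5.2 b).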
5.1.3 Assessment sponsors should try out the assessment (i.e. the combination of software and content) during development and again before operational use to verify that all aspects of the delivery, scoring and feedback operate as intended and in accordance with pedagogical requirements.
5.1.4 In developing assessment content, assessment sponsors should consider issues relating to candidates with disabilities, including:
a) whether a non-IT alternative should be provided for such candidates;
b) the effect of using assistive technology on the validity of the items (for example where the wording of a text alternative to a graphic changes the nature of the item).
5.2 Responsibilities of assessment distributors
Assessment distributors should provide assessment sponsors with:
a) full information about the capabilities, limitations and features of the intended software and the associated delivery platform that is relevant to the pedagogical aspects of the assessment;
b) a checklist of the parameters to be specified by the assessment sponsor (see 5.1.2).
6 IT delivery of assessments — general
NOTE This clause is applicable to assessment distributors who develop, specify, purchase or adapt assessment software. It will also be of interest to software designers and developers. All subclauses are applicable to high-stakes assessments; 6.1, 6.2, 6.3 and 6.5 are also applicable to low-stakes assessments. This clause is applicable both to generic assessment software (capable of running a range of different assessments) and to software which is specific to a single assessment or group of assessments.
6.1 Interoperability
6.1.1 In the design of assessment software, consideration should be given to the need to facilitate exchange of item and response data with other assessment users (for example using IMS standards for question and test interoperability).
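By way of illustration, a multiple-choice item might be serialized to XML for exchange between assessment systems. The sketch below is loosely modelled on the IMS QTI layout mentioned in 6.1.1 but is not a conformant QTI document; the function and element names are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

def export_choice_item(identifier, prompt, choices, correct):
    """Serialize a multiple-choice item to an XML string for exchange.
    The structure is loosely modelled on IMS QTI; element names here
    are illustrative and do not claim QTI conformance."""
    item = ET.Element("assessmentItem", identifier=identifier)
    ET.SubElement(item, "correctResponse").text = correct
    body = ET.SubElement(item, "itemBody")
    ET.SubElement(body, "prompt").text = prompt
    for choice_id, text in choices.items():
        ET.SubElement(body, "simpleChoice", identifier=choice_id).text = text
    return ET.tostring(item, encoding="unicode")
```

A receiving organization would parse such a document and, as 6.1.2 requires, verify that the imported items behave correctly on its own hardware and software.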
6.1.2 Where items are imported, the receiving organization should verify that they operate correctly on the hardware and software used by their candidates.
6.2 Hardware, software and communication considerations
6.2.1 Taking account of ICT facilities available to candidates
Design of software for IT delivery of assessments should take account of the ICT facilities likely to be available to the intended candidates and at the assessment centres. Implications which should be considered include the following:
a) the effect on access to the assessment if the assessment software requires a higher ICT specification than is available at most assessment centres;
b) the effect of the assessment centre’s delivery platform on the speed of operation of the assessment software.
