Completion Report

Project Background

1. This project supported the Assessment strand of the Student Systems Roadmap (http://www.studentsystems.ed.ac.uk/ed/aboutus/whatareweworkingon/Documents/Student_Systems_Roadmap_2.0.pdf).

2. Assessment & Progression was identified as the number 1 priority by the Student Systems Board. Prior to this project, the University administered assessment & progression through a variety of tools including SMART, EUCLID, local systems and VLEs.

3. CSPC approved the new 'Informing Taught Students of their Final Programme, Course and Progression Results' policy for 2014/15 (http://www.docs.sasg.ed.ac.uk/AcademicServices/Policies/Informing_Taught_Students_of_their_Final_Programme_Course_and_Progression_Results.pdf).

This clarifies the role of the student record in relation to the administration of students' assessment and progression.

4. The SMART assessment system was robust, but further changes and enhancements to the system were no longer recommended. The system was due to reach the end of its life within the next couple of years.

This project had two strands over 2014/15 and 2015/16:

1. Deliver tools to support the implementation of the 'Informing Taught Students of their Final Programme, Course and Progression Results' policy, with a focus on the 'communication of all final progression decisions'

2. Develop Assessment and Progression capabilities within EUCLID to ultimately replace SMART and other School assessment systems.

 

Were the project goals met?   

The project implemented processes and software for the communication of progression decisions, and piloted assessment and progression tools in EUCLID.

The goals of the project were:

  1. Phase 1 - Develop tools to enable Schools to record progression decisions onto the Student Record and publish to the Student EUCLID view so that students have a clear communication of their progression status.
  2. Phase 2 - Develop tools and run a pilot to: 
  • support the in-year administration of assessments
  • establish business processes for the administration of assessment and progression
  • support course and programme level boards
  • present assessment and progression results to students and relevant members of staff (e.g. Personal Tutors) in an effective and timely manner
  • establish a new system that could wholly replace SMART for 2016/17
  • share information on student results across Schools on one system
  • produce exam board papers

 

Were the project deliverables fully or partially accomplished?  

 

The following deliverables (from the project brief) were achieved:

  1. Informing students of their Progression results:
    • Input of progression decisions to EUCLID (including notes), similar to the approach used for awards;
    • Automated notification to students of progression outcomes;
    • Display of progression outcomes through the EUCLID student view;
    • Wholesale adoption of the policy to formally notify taught students of their progression through EUCLID.
  2. Development of EUCLID assessment and progression tools for use in pilot schools in 2015/16 for a number of courses and programmes. We piloted in eight schools.

For courses, functionality to support:

  • Set-up of course components of summative assessment (down to question level)
  • Recording of marks against each element of assessment
  • Calculation of the total course marks based on assessment component definition and weighting (an illustrative sketch follows this list)
  • Production of exam board papers
  • Actual and agreed course components and marks
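
To make the mark calculation concrete, here is a minimal sketch of the kind of weighted combination the tools perform. The function name, data shapes and example weightings are illustrative assumptions only; the actual SITS/EUCLID component definitions are not reproduced in this report.

    # Illustrative sketch only: weighted course-mark calculation of the kind
    # described above. Names and structures are hypothetical, not the real
    # SITS/EUCLID implementation.
    def total_course_mark(components):
        """Combine (mark, weight) pairs into an overall course mark.

        Marks are percentages; weights must sum to 1.0
        (e.g. exam 60%, coursework 40%).
        """
        total_weight = sum(weight for _, weight in components)
        if abs(total_weight - 1.0) > 1e-9:
            raise ValueError(f"weights sum to {total_weight}, expected 1.0")
        return round(sum(mark * weight for mark, weight in components), 1)

    # Example: exam mark 72 weighted 60%, coursework 65 weighted 40% -> 69.2
    print(total_course_mark([(72, 0.6), (65, 0.4)]))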

For progression and awards, functionality to support: 

  • The definition of progression rules for a programme;
  • Production of exam board papers;
  • Recommendation of progression and award/classification based on course marks and progression rules (see the sketch after this list);
  • Recording of agreed progression and award/classification with accompanying notes (as required);
  • Publication of progression and awards;
  • Display of progression status and accompanying notes to students through the EUCLID student view.
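
As a hedged illustration of rule-based recommendation, the sketch below derives a progression recommendation from course marks using a hypothetical pass mark and credit threshold. The rule shape, threshold values and decision labels are assumptions for illustration only, not the programme rules actually configured in EUCLID.

    # Illustrative sketch only: recommending progression from course marks and
    # simple programme rules. Thresholds and decision labels are hypothetical.
    def recommend_progression(course_marks, pass_mark=40, credits_required=120):
        """Recommend a decision from (mark, credits) pairs for one student."""
        credits_passed = sum(credits for mark, credits in course_marks
                             if mark >= pass_mark)
        if credits_passed >= credits_required:
            return "Progress"
        return "Refer to exam board"

    # Example: 100 of the required 120 credits passed at or above the pass mark
    marks = [(55, 20), (62, 40), (48, 40), (35, 20)]
    print(recommend_progression(marks))  # Refer to exam board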

 

The following deliverables were not described in the Brief but were delivered:

  1. Development of a Personal Tutor (PT) view of awards for completed tutees and progression for existing tutees
  2. Improvement to navigation/presentation of EUCLID student view

Not Delivered:

  1. Automated notification to students when final course results are entered into EUCLID. It was agreed to defer this until there was a notification mechanism in MyEd as it could have resulted in multiple emails about the same assessment.
    • This will be revisited when Service Excellence/Digital Transformation delivers a notification mechanism.
  2. Release and display of course component marks (in-course assessment) to students through the EUCLID student view and to Personal Tutors. Pilot schools were reluctant to do this during the pilot.
    • This was therefore deferred to the full rollout in 2016/17 (SAC057).

 

Did the project deliver a solution to the problems being addressed? 

Yes. Feedback from users was very positive (noted for each strand below).

 

 

What are the benefits for the business? How did it impact the business?

Strand 1 

Students:

  • were told clearly when their progression decisions were due
  • have clear communication of their progression decision and any actions required through EUCLID student view. 
  • have links to clear information about what to do in the event of queries regarding their decision in EUCLID student view
  • have a consistent method of receiving progression decisions 

Staff:

  • Schools have a consistent process for recording and publishing progression decisions. 
  • Personal Tutors and Professional Services staff can easily access a history of progression decisions to help them provide support to the student

Feedback on strand 1

Colleagues in schools reported that the progression tools worked well. The main area of concern was the additional time required to process pre-honours progression. This was a new step for many schools. 

  • ‘Overall idea and implementation of the project have been great’ (Education UG)
  • 'From the PG side at SPS I can confirm we never experienced any problems, the system worked well and the process was really straightforward.' (PG SPS)
  • ‘The progression tool itself was fine though there are a few niggles.’ (SPS UG)
  • 'In general, we felt that the progression tools worked really well and there weren’t any issues with delivering the decisions to the students prior to the deadline of 30th June.' (Economics)
  • 'The upload in Euclid worked well, though took a while with the number of students we had to process.' (HCA UG)
  • 'We found the progression tools easy to use and we were able to quickly process the progression decisions without any major issues.'  (Engineering)

 

Strand 2 

For the pilot schools:

  • There is a centralised system that supports the set-up of course assessment structures, entry of marks and calculation of overall course marks
  • Ratified (but unpublished) marks can be shared between schools
  • Schools have control over the timing of publication of overall course marks
  • Recommended progression decisions and award classifications can be calculated by the system
  • Exit awards can be awarded within the same process as progression

 

Feedback on strand 2

  • “Your project has been the most organised and complimentary to School processes that I have experienced in my 7 years at the University, so I am happy to promote it, and your team’s approach to working with Schools has been excellent, second to none.” (Sam Rice, Business School)
  • “The new system is far better than SMART in terms of reliability and system speed/access, and the output requires minimum tinkering, so that we have cut out processes, where things can go wrong with spreadsheets due to human error.” (Sam Rice, Business School)
  • “The Classifications all match perfectly. Screens are easy to use, no problems at all.” (Amanda Campbell, HCA)
  • “We are feeling quite happy about the roll out for next year - the system is, in general, a big improvement on what we have now (thank you!). It'll be particularly good when all Schools are using it but even next year should be easier.” (Karen Howie, HCA)
  • “We had administrators who were looking after courses not in the pilot wishing that they had been included” (Sabine Rolle, LLC)
  • “We were parallel running with our existing system. One of the administrators said it took 2 hours to do in APT what took 2 days in their system” (Sabine Rolle, LLC)
  • “Just wanted to pass on some positive feedback from the Practical Physics course organiser who was impressed with the BI report for the course. They said it provided a lot of useful information.” (Rosie Edwards, Physics)
  • “I was thinking whilst doing this [processing programmes outside the pilot] - can we not just put all our programmes through the pilot progression software?” (Rosie Edwards, Physics)

 

Does the Project Sponsor agree that this project can be closed at this time? 

Yes. The rollout and further development of the tools continues in 2016/17 under SAC057. The steering group for SAC034 continues for SAC057.

 

Cost Summary

IS Apps Staff Resources Estimated at Project Brief (Days): 600* days IS Apps resource

Actual IS Apps Staff Resources Used (% Variance): 862* days IS Apps resource (+43%)

SSP non-IS Staff Resources Estimated on Project Brief (Days): 550 days

Actual SSP non-IS Staff Resources (Days) (% Variance): 808 days (+47%)

Other Resources Estimated on Project Brief: n/a

Planned Delivery Date (Go Live) from Project Brief: Multiple deliveries

Actual Delivery Date (Go Live): Multiple deliveries

* These figures reflect the combined budgets allocated at the beginning of 2014/15 and 2015/16.

Explanation for variance:

IS Resource

The project was originally allocated a budget of 250 IS days for 2014/15. This was not an estimate, but an initial allocation of days. There were two checkpoints in 2014/15 when extra budget was requested.

Issue 4: Checkpoint and new scope: request for increase in project budget (increase from 250 to 400 days)

The first checkpoint identified the following reasons:

  • the work required for the Strand 2 pilot had been scoped, and we were going to be able to use less 'out of the box' SITS functionality than originally hoped for
  • a higher level of mentoring and support was required for a less experienced developer on the project than expected

Additional scope was identified: 

  • Development of a Personal Tutor view of awards for completed tutees and progression for existing tutees (est. 10 days)
  • Improvement to navigation/presentation of the EUCLID student view (navigation work was required as the current model is not sustainable in the long term, and further progress had been made in other student/applicant-facing functions using new tools and techniques) (est. 10 days)

Issue 16: Checkpoint; request for increase in project budget (from 400 to 440 days)

The second checkpoint identified the following reasons:

  • Although the deployment of Strand 1 was in good time, the delivery took more effort and a longer elapsed time than anticipated.
  • Further work revealed that even less 'out of the box' functionality would be used than planned, so extra work was required for assessment set-up.
  • Small (3 day) impact of infrastructure upgrade/TEST refresh.
  • Improvement to navigation/presentation of the EUCLID student view took approx. 3 days longer than estimated.

 

A budget of 350 days was agreed for 2015/16 (total of 790 days):

Issue 24: Project budget for 2015/16

There were 2 further requests for extra budget during 2015/16:

Issue 27: Request to increase the budget by 20 days for the impact of a developer leaving the project team (from 790 to 810 days)

Issue 30: Request to increase the budget by 24 days (from 810 to 834 days)

Contributory factors for this were identified as:

  • Dev Tech and Apps Man resource was higher than anticipated for the set-up of courses/SMART migration and fortnightly releases
  • Higher than anticipated development effort for EUGEX and BI universe changes

Non-IS Resource

  • The project team took a hands-on role in supporting the pilot schools using the software. Although this came at a cost, it contributed to colleagues' sense that they were being listened to and supported
  • BI reports were developed by one of the BAs
  • We were unable to use much of the 'out of the box' functionality that we expected. This led to further development, with BA input.
  • The effort required for the change aspect of the project was underestimated: e.g. BAs attended and presented at numerous school and college briefings.

Key Learning Points:

Background: the project was run on an agile basis, and it was the first time that most of the team members had worked in this way. The team felt that this was ultimately successful, but it was challenging and the approach is still evolving.

What went well?

  • We came to understand processes by visiting exam boards and similar events. Analysis was done in an inclusive way and the team gained the trust of the user community
  • Users appreciated our knowledge of their processes, using their language
  • The team worked increasingly collaboratively
  • Dedicated support from Student Systems Operations worked really well, though it took a while to get into the rhythm of working with the team. It enabled good pilot support and has aided the spread of knowledge across Student Systems support and training teams.
  • Regular small releases to LIVE meant that the users were not overwhelmed by a big bang implementation and we could improve software in later releases as a result of feedback
  • Pilot approach worked well for gaining user confidence 
  • Having more than 1 BA on the project team allowed for greater collaboration and development of ideas
  • The developers were able to explore some new technologies, benefiting the team and wider IS department
  • The team embraced change, trying to do things in different ways.
  • Fortnightly project review meetings were valuable in keeping everyone up to speed, particularly the lead developer

 

What didn’t go so well?

  • Agile is hard, especially within SSP and using SITS, but it is also definitely the right thing to do, and we were willing to try different things.
  • It has been very difficult to change/standardise business processes from the bottom up, but it has highlighted that in fact there is a lot of standard work going on. This resulted in the development of more flexible functionality, at a cost.
  • Testing is challenging given the way SSP is set up for iterative development
  • One developer was also responsible for the SITS upgrade. This was disruptive to the project and impacted our delivery for the 2016 S2 exam boards.
  • From the project team perspective there was a disconnect between the way we are working (agile) and the project governance (6-weekly steering group meeting). The steering group meetings tend to be retrospective rather than proactive.
  • There were a lot of last-minute releases, which impacted on support; users were not given enough opportunity to become familiar with released software
  • The (large) size of the pilot meant that the team spent more time on support than we would have liked
  • Introduction/replacement of team members was difficult.

If you had a project like this again, what would you improve?

  • Embed product owners in the build team for such a big institution-wide project
  • Pilot users and colleagues inputting to the build need time allocated for their involvement in the project, rather than doing it on top of their day job - the institutional expense of user engagement needs to be recognised
  • Ensure that SSP developers are only working on one project
  • Adding developers to try to speed up development does not work unless the skillset matches perfectly

Outstanding issues:

This project continued as SAC057 for further rollout to schools in 2017/18, and outstanding issues have been carried forward into SAC057.

Support will be provided by the project team, so Apps Man have not yet been required to take responsibility for supporting the software.

Project Info

Project: Assessment & Progression in SITS
Code: SAC034
Programme: Student Systems Partnership (SSP)
Management Office: ISG PMO
Project Manager: Chris Giles
Project Sponsor: Susan Rhind
Current Stage: Close
Status: Closed
Start Date: 12-Aug-2014
Planning Date: n/a
Delivery Date: n/a
Close Date: 21-Apr-2017
Programme Priority: 1
Overall Priority: Normal
Category: Discretionary