Completion Report
Project Summary:
Background
A new IT Tool is required to allow the collection and management of External Examiner reports. It will also support thematic analysis of the responses, satisfying the requirements set by External Stakeholders.
Were the project goals met? Yes. The External Examiners Online Reporting System was delivered, along with a new BI Suite, to be used first by pilot schools for the September to December 2014 PGT Exam Boards.
Were the project deliverables fully or partially accomplished? Yes, fully accomplished
The goals of the project were to have the IT tools in place for 6 pilot schools to conduct the following process from September 2014:
- External Examiners can submit reports online and on time, and view responses.
- School Key Contacts can distribute and track the External Examiners' reports.
- University Academic Response Coordinators can respond to the External Examiner report and create a single response.
- School and College Key Contacts and Academic Response Coordinators can view and analyse a summary of the External Examiner issues and the responses.
- The University can demonstrate compliance to External Stakeholders by scrupulously using the External Examiners' reports to extract strategic themes.
The following deliverables were achieved (from the Project Brief):
- A simple and intuitive online reporting tool, with user authentication. The report is pre-populated with the External Examiner, programme and course details held in EUCLID. The External Examiner can go to a portal to draft the report, provide feedback, raise issues, and submit the report.
- The report is distributed to the appropriate University Academic Response Coordinators for review and response. School Key Contacts can monitor the status and key dates of report receipt, distribution and response, and take action if appropriate.
- Key Contacts and Academic Response Coordinators can view an online repository of reports, issues raised and the responses to these issues. Multiple reports/issues can be reviewed; summary reports can be created and distributed.
- Users can search the responses per category and drill down through College, School, Programme and Course trends to view individual reports' issues and comments.
The following deliverables described in the Brief were not delivered. Users agreed they were not required:
- Schools/colleges and institution can define and allocate themes to issues raised in reports.
- Facilitating the tracking and closure of issues, thereby demonstrating the closing of the feedback loop.
Did the project deliver a solution to the problems being addressed? Yes
Does the Project Sponsor agree that this project can be closed at this time? Yes
Cost Summary:
Analysis of Resource Usage:
Staff Usage Estimate: 235 days
Staff Usage Actual: 295 days
Staff Usage Variance: 26%
Other Resource Estimate: 1 day
Other Resource Actual: 1 day
Other Resource Variance: 0%
Explanation for variance:
1. Budget:
The original 190d budget did not include the 45d contingency for large projects, held at Programme level.
At the end of the foundation stage, the development team estimated the effort at 287d (https://www.projects.ed.ac.uk/project/stu235/issues/6).
A further re-estimation, with the descoping of requirements considered non-essential for the pilot (https://www.projects.ed.ac.uk/project/stu235/issues/7), brought the project total to 295d.
The 190d budget was set at annual planning. As per the Agile approach, a relative estimate of each requirement could only be made once the users and developers had agreed the prioritised list of requirements. A further re-estimation took place once a few build iterations had been completed, providing feedback on the correct estimates to use.
2. Go Live milestone
The Go Live for the IT Tool of the External Examiners Reporting System was delivered at the end of July, as planned.
The delay to the BI Suite was due to the BI framework being unavailable at times over the June/July period, including the TEST BI InfoSpace feature. This unavailability caused delays in the build and in testing sign-off.
Key Learning Points:
What went well?
- 10 schools from the 3 colleges signed up for the pilot (against the 6 expected).
- The Agile approach produced an enhanced product after each iteration, so the project team was able to demonstrate the product to Board members and pilot schools early on. This enabled early engagement with users.
- The project team worked well together, with the developers and BA located in the same room. People came from central areas (Registry, Student Systems and IS Apps) and from Schools and Colleges. The four user representatives, one for each of the four project strands, supported the project team from initial analysis (foundation stage) to user testing (final iteration). Jeremy Bradshaw, our MVM Senior User, acknowledged this project as a good example of working across teams (centrally and with Schools and Colleges) to manage policy, process and system changes.
- After a couple of iterations, the relative estimation gave some reassurance about what could be achieved in the next iterations, enabling the project to be planned accordingly.
- Early design reviews received positive feedback from user groups and helped achieve sign-off. It was very useful to have mock-ups to show.
- The communication of the IT Tool progress was made available to Board members after each iteration as per links on https://www.projects.ed.ac.uk/project/stu235/page-4
- Slimming down the governance and refocusing the information given to the Board.
- The IT Tool was deployed on time at the end of July, allowing Student Systems to start the transition, including demos and training for schools, user set-up, and the import of External Examiner appointment data into EUCLID. The follow-up project (STU244) started in time to address some of the late usability issues.
What didn’t go so well?
- Budget: the discrepancy between the set budget of 235d and its re-estimation after the foundation stage could have been communicated better to Board members, as could the use of relative estimates. This created negative headlines, even though the build was progressing well. It is also unusual for an Agile project to have Board governance.
- Resource: Our product owner and business analyst left the project with 6 weeks to go before final deployment. While the business analysis hand over went relatively smoothly, the loss of the product owner was a blow to the project.
- The BI Suite framework was unstable and often unavailable, resulting in significant unexpected effort during the build and the rescheduling of testing and user sign-off.
- A dependency on another high-priority project (the Upload and Communication of Awards project) meant that the EE integration with EUCLID was delayed. We then found that the streams in Test, which update the tables used to feed the EERS, were broken.
- A large amount of effort had already been spent on user story creation before the formation of the project team, and before agile coaching could guide user story creation. As a result, stories tended to a) have a large and unclear scope, b) express a process instead of a feature, and c) lack clear conditions of satisfaction. Our initial backlog was a set of business requirements captured as user stories, which took us a long time to overcome.
- Testing was often late and rushed due to the scope of each story, and rework was always greater than expected due to unclear conditions of satisfaction.
If you had a project like this again, what would you improve?
- Communication with the Board:
- Budget: manage budget expectations better for Agile projects. Emphasise that the budget cannot be set until the foundation stage is complete.
- Relative estimation: the team will not deliver the same velocity consistently. The Board should note that, especially at the beginning of a project, estimates may not be precise. Accuracy improves as the project progresses, but velocity may still vary, and non-delivery of the expected story points (the Agile measure of value for effort) in an iteration should not be seen as a failure.
- Product usability: the project team feels this should be the remit of the User Representatives, with the Board looking at governance. The prioritised requirements have been detailed and signed off by the Product Owner/User Representatives. Any deviation from the requirements must constitute a new requirement, to be agreed and prioritised accordingly.
- Be more proactive at the planning stage when we know that a story is bigger than expected.
- Agile coaches should work with business analysts to help them learn the skill of writing good stories, and of achieving the right level of detail at the right time. The business analyst needs to work in collaboration with the lead developer during the user story creation process. If this does not happen, the foundation stage will likely need to be extended to ensure the backlog is in a fit state for development, and that is a decision we should be strong enough to take instead of simply going ahead with a backlog that is not ready.
- Developers wrote scripts to set up test data for the EE reports at different stages and transitions. This put the project in a much stronger position for scenario and end-to-end testing, as well as for setting up demos. The cost of writing such scripts should be factored into the estimates.
- There were a number of times when the "blank sheet" nature of the project made it difficult for user reps to visualise and define useful user stories. We should have been more proactive with things like mock-ups (by doing a 'spike' design user story) and more aware of how this type of work impacts the backlog, e.g. if an area of the site needs such investigative work, the team cannot also deliver working features for that area during the same iteration. Scheduling across sprints needs to be better so that the output of R&D work can be planned and prioritised into subsequent sprints.
- User stories should limit themselves to expressing the feature and the reason for having the feature (following the typical as a...I want..., so that... format). Conditions of satisfaction should also be expressed in business terms. Stories should not attempt to do detailed design a priori.
- The new product owner would have benefited from a proper induction to the role. That didn't happen and, as a result, we lost some of the drive and ownership in the latter stage of the project. Hopefully this scenario is very much an exception, but if it ever occurs again we need to approach the handover process far more carefully.
- Training and product demos should only be given by people who are familiar with the product. This does not mean that the product team always has to give the demos (although it is normal etiquette for demos to be given by the team), but where external trainers are involved they must be given an adequate induction before demonstrating the product. Even a minor slip-up can cause quite a big adverse reaction.
- Data gathering from users for live implementation should happen much earlier, especially when the system is rolled out to UG External Examiners. The roles and responsibilities of the BA and Academic Services need to be clearer; the BA should be able to communicate with Schools' users directly, keeping the project team in the loop. Key academic champions within Schools need to be identified.
- Plan the communication for the transition with users earlier; input from the project team is required so that the goodwill from users is carried over.
Outstanding issues:
1. Following User Testing feedback at the end of July, 5 user stories were identified as required for the pilot and are being delivered under project STU244, with the aim of having them delivered in September:
2. The import of External Examiner and Appointments data into EUCLID is being done in August, once data have been received from the pilot schools. This is being managed by Student Systems. DONE
3. DPA check final test (https://www.jira.is.ed.ac.uk/browse/STU235-47): this was delivered; however, some final nice-to-have formatting changes to the DPA check will be held back to the next scheduled release after the pilot (the pilot will run from September to November). Managed by Brenda, SSP.
4. Move Phase 2 requirements and Enhancement requirements from Agile Jira STU235 to STU243 & 244: Franck
