Completion Report
Project Summary:
SAC044 published a dashboard focusing on undergraduate admissions, outcomes and student feedback, available to all UoE staff via the Data Matters website.
The project broke new ground in a number of technical and data areas, delivering innovations that will benefit other projects, outputs and contexts.
Background
The project covered creation of a dashboard to support Student Systems’ objectives to:
- Develop our use of student data to support ways to enhance learning & teaching, the student experience and operational effectiveness;
- Focus activity on what will make a difference at School level – provide support, help develop insights and share practice;
- Focus on the accessibility, visualisation and transparency of data, helping to simplify and manage complexity;
- Examine the use of dashboards to support these objectives.
Six key areas were identified:
- Understand applications & admissions over a period of time and in-year, to help plan for the following year
- Understand my student cohort(s), their characteristics, trends, progression and outcomes
- Learning & teaching survey data, linked to the student record and other sources, with some local-level internal and external benchmarking (Analytics/Predictive)
- Standard reports for annual course and programme reviews and TPRs, providing one consistent set of data so staff spend less time looking for it and more time using it
- Understand my students on an individual level and what is happening in-year (Analytics/Predictive)
- Effective/efficient assessment: volumes, class sizes, feedback and assessment turnaround times, contact hours, internal and external comparison (Analytics/Predictive)
For the dashboards element, key areas of focus were identified as:
- Value: are we reporting on data which will enable schools to enhance learning & teaching and the student experience: clear link to strategy and key indicators
- Use: reporting and dashboards, yes, but also greater analytics, insights and potentially predictive qualities; accessibility, visualisation and simplicity
- Resources: capacity as well as capabilities; systems as well as people
- Alignment: plan, scope, clarity on roles and responsibilities; aligning central team(s), colleges and schools: one point of distribution
- Agreement: data definitions, consistency in use, one source
- Ethics and security: clarity on use of data, access and security; avoid unintended consequences
- Operation: fitting in with operation rather than organisational structure; focussed at the level of need
- Culture: moving to active engagement and use, greater access, visibility and transparency
Were the project goals met?
Yes: student data dashboards were delivered to all staff, in an accessible, visible, robust and reliable format. There is appetite amongst stakeholders for publishing further content via dashboards, and to improve accessibility (eg on mobile phones & Apple products, removing dependence on flash).
Were the project deliverables fully or partially accomplished?
All objectives and deliverables defined as ‘Must’ or ‘Should’ on the MoSCoW scale were delivered; some but not all defined as ‘Could’ were also delivered.
The following deliverables were fully achieved (from Project Brief):
Ref | Details | Priority
O1 | Key Objective |
D1 | Deliver a service to all staff which: […] and is delivered with good Training & Engagement with end users | Must
O2 | Dashboards data architecture |
D2 | Application and Data Architecture (ADA) and System Design | Must
D3 | Develop a new Student Data Hub to house new data sources required for the dashboards | Must
O3 | Data Capture |
D4 | Create a new data mart (Dev, Test and Live) for survey data from EvaSys; implement a new nightly data import process to import the data from EvaSys and load it into the new data mart | Must
D8 | Maintain/update course organisers against EUCLID courses and current/future course instances via eVision (CCAM) | Must
D9 | Amend the EUCLID course roll-forward process to include the course organiser (required for the January 2017 roll-forward) | Should
D10 | Record/maintain EvaSys course survey flags and teaching staff against course instances in EUCLID via e:Vision | Must
O4 | Data transfer |
D13 | Transfer course subject areas, organiser codes, marking scheme codes and names, and the full complement of WP markers to a schema capable of feeding the BIS STUDMI universes | Must
D14 | Transfer the course instance organiser code to a schema capable of feeding the BIS STUDMI universes | Must
O5 | BIS universe developments |
D15 | Create and implement a new Course Enhancement Surveys BIS universe, reading from the new EvaSys datamart, with data access restrictions based on user roles/permissions and the principle of student anonymity | Must
D16 | New STUDMI Monthly and STUDMI Daily objects for courses (subject areas, marking schemes), course instances (course delivery key), students (1 overarching and 9 subsidiary WP markers), and student programmes (5-week filter objects: exited within 5 weeks of commencing programme; exited/interrupted within 5 weeks of the start of semester 1) | Must
D17 | Archive existing STUDMI Monthly, STUDMI Daily and EUCLID Admissions session-year objects with non-standard formatting and replace them with new objects in standard format | Must
D18 | Create new objects in STUDMI Monthly and STUDMI Daily to hold UUN/Instance in the standard format S1234567/1 | Must
D19 | Populate subject areas, course organisers, new WP markers, marking schemes and revised session years in existing historical STUDMI Monthly snapshots | Should
D21 | Any potential changes required to support NSS reporting of the KIS universe and reconciliation with student/EUCLID data | Must
O6 | Dashboards software implementation |
D22 | Confirm whether Apple-, tablet- and phone-compatible SAP Dashboards can be published; establish the expected lifetime of SAP Dashboards and flash-based outputs; confirm the viability of flash output with regard to Apple products and security issues | Must
D23 | Proof of concept: nightly refresh of EASE-protected dashboards from source Webi reports; further restriction on access based on UUNs authorised by local line managers via Student Systems Operations | Must
D24 | Go/no-go decision on SAP Dashboards: if yes, implement the software including the LiveOffice add-on without compromising developers’ desktops; if no, seek alternative software that does not require licensing of end users | Must
O7 | Dashboards community |
D25 | Agree data requirements, definitions and visualisations with stakeholders | Must
D26 | Establish a permanent dashboards user group | Must
D27 | Agree new EUCLID roles and responsibilities with stakeholders in Schools and other key units; agree central, college and school reporting/support remits and responsibilities | Must
D28 | Develop/maintain a web presence to publicise the dashboards, provide associated documentation, and inform users of project developments/progress and future iterations | Must
O8 | Webi report development |
D29 | Develop Webi reports to support University- and School-level dashboards and, potentially, associated reports/infospaces | Must
O9 | Dashboard development |
D30 | Develop University- and School-level dashboards | Must
O10 | Dashboard publication |
D31 | Publish dashboards outside the BI Suite (i.e. users will not log in via BIS), refreshing nightly, and openly accessible to EASE-authenticated staff (where authorised by local line managers/Student Systems Operations) using laptops and desktops | Must
The following deliverables were not described in the Brief but were delivered:
O3 | Data Capture | Alongside EvaSys course survey flags and teaching staff, EUCLID now also records school question sets against Course Instances; school question sets are held in EUCLID as a referential dataset
O4 | Data transfer | An extract function was delivered in EUCLID to deliver course evaluation survey data in a format that can be imported into EvaSys directly, without manual preparation or manipulation
O5 | BIS universe developments | Additional WP-related objects were created in STUDMI to display, for each student: the count of WP markers; the count of Scotland-specific WP markers; the count of RUK-specific WP markers
O6 | Dashboards software implementation | A virtual machine was developed to host SAP Dashboards remotely from developers’ machines; Alteryx software was procured and implemented to improve dashboard design and facilitate efficient build processes
O10 | Dashboard publication | A new data feed from EUCLID to Apache was developed, listing all users authorised to view individual student data via either EUCLID or BI Suite
O10 | Dashboard publication | Two versions of each dashboard screen were published, one with detailed counts and the other with rounded counts; users are automatically presented with the version appropriate to their authorisation level by virtue of the data feed described above
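The two-version publication mechanism described above can be sketched as follows. This is a minimal illustration only: the rounding base of 5 and all function and variable names are assumptions, not details taken from the project.

```python
def round_count(n: int, base: int = 5) -> int:
    """Round a student count to the nearest multiple of `base`
    (the base of 5 is illustrative) so that exact counts for
    small cohorts are not exposed to non-authorised viewers."""
    return base * round(n / base)

def dashboard_version(uun: str, authorised_uuns: set[str]) -> str:
    """Choose the dashboard variant for a user, based on a
    hypothetical set of UUNs from the authorisation feed."""
    return "detailed" if uun in authorised_uuns else "rounded"

# Example: only the authorised user sees exact counts.
feed = {"s1234567"}
print(dashboard_version("s1234567", feed))  # detailed
print(dashboard_version("s7654321", feed))  # rounded
print(round_count(23))                      # 25
```

In the published dashboards this selection happens automatically at login; the sketch simply isolates the rounding and routing logic.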
The following deliverables were partially achieved (from Project Brief):
Reference | Details | Priority
O3 | Data Capture |
D5 | Record/maintain subject areas, their JACS equivalents, comparator institutions, and student programmes in EUCLID | Must
O4 | Data transfer |
D12 | Transfer programme subject areas, NSS comparators and annual admissions targets to a schema capable of feeding the BIS Programmes universe | Must
O5 | BIS universe developments |
D20 | New Programmes universe objects for programme subject area; new Programmes universe class for annual admissions targets | Must
O10 | Dashboard publication |
D33 | Publish dashboards compatible with tablets and mobile phones | Could
Notes:
D5/D12/D20: Only subject areas and their child programmes were delivered:
- JACS equivalent and comparator institution data were not required for the finalised dashboard design agreed with stakeholders, and therefore effectively fell out of scope. There are no plans to cover these data in future projects.
- Annual admissions targets were formally de-scoped from the project’s objectives. They have since been raised again through Service Excellence workshops on SRA MI, and may be covered in Service Excellence projects, in a further dashboard project if one is initiated, or in other future Student Systems developments.
In both cases, the removal of the deliverable from the project was agreed by the Steering Group and the project team, and did not impact the overall objectives.
D33: While the published dashboards are accessible via tablets, mobile phone compatibility is significantly restricted in SAP Dashboards, and as a result mobile-compatible dashboards were not delivered. They will only be pursued should alternative dashboard software become available and be made the focus of a further dashboard project.
Not delivered (from the Project Brief):
Reference | Details | Priority
O3 | Data Capture |
D6 | De-scoped: record/maintain/view Taught Programme Reviews (TPRs) and Postgraduate Programme Reviews (PPRs), their subject areas and school administrators, and the programmes (and courses?) they include, in EUCLID via e:Vision | Could
D7 | Not required (Home Subject fit for purpose, maintained in client): record/maintain subject areas against EUCLID courses via e:Vision (if different from existing ‘TL subject areas’ and ‘Home/Other subjects’); investigate whether this is already available | Could
D11 | De-scoped: record/maintain annual targets against admissions/entry programmes in EUCLID via client | Could
O10 | Dashboard publication |
D32 | Publish dashboards compatible with Apple products | Could
Notes:
D6: De-scoped: not required for finalised dashboard design agreed with stakeholders. Support for TPR/PPR administration is being pursued by the Student Systems Reporting team with Academic Services.
D7: Removed: Existing ‘Home Subjects’ already held against courses in EUCLID proved fit for purpose as ‘subject area’ categories in dashboards, so no alternative development was required.
D11: De-scoped: not required for finalised dashboard design agreed with stakeholders. Targets have since been raised again through Service Excellence workshops on SRA MI, and may be covered in Service Excellence projects, or if a further dashboard project is initiated, or in other Student Systems developments.
D32: The wider decision to deliver dashboards using SAP software reliant on flash meant Apple compatibility was not possible. Apple-compatible dashboards will only be pursued should alternative dashboard software become available and be made the focus of a further dashboard project.
What were the business benefits? How did this impact the business?
Initial use and feedback demonstrates that the project has significantly improved schools’ access to and analysis of student data to gain insights into enhancing learning and teaching, the student experience, and operational effectiveness.
Examples of feedback to date:
- “student dashboard looks like good tool -- but could the comparison tool have an option to allow different scalings on the two plots. This would allow a programme/school/college to see how it is doing relative to the University as a whole.” (GeoSciences)
- “Really excellent. Ability to display and analyse trends is very useful.” (CMVM)
- “I've shown it to some of my colleagues here in Informatics and we're all really impressed at how user friendly it is and the detail of the information.” (Informatics)
This was achieved by identifying the most valuable data, the most effective visualisations and the most useful methods of analysis through stakeholder consultation and feedback. These benefits extend to the related reporting areas of Quality Assurance, Athena Swan and Widening Participation. The functionality in the published dashboard exceeds initial expectations in many ways, particularly accessibility and usability. Although the project focused on delivering reporting benefits through dashboards, it also delivered significant enhancements and benefits in other areas of BI Suite reporting.
- New data available (surfaced in dashboards, and available across student reporting)
- Subject area introduced as a new analysis level between programme and school: teaching and various other areas of operations are organised at subject-area level in many schools, making this a key level of analysis for Quality Assurance and other areas of reporting; it has not previously been enabled but has been keenly sought by schools and colleges.
- WP data integrated into a robust dataset comprising both individual WP attributes and MI measures. This enables reporting on an institutional KPI which has previously not been possible in a robust manner.
- Applicants’ destination institutions (declined, insurance and accepted offers). This provides access to valuable market intelligence in the area of recruitment and admissions in a manner far more adaptable and accessible than previously available to planners in schools, colleges and support units.
- Course Evaluation data. Course evaluation has been mainstreamed in 2016/17 and is now a key resource for monitoring and enhancing learning and teaching; the dashboards enable significantly greater analysis of core question results across the University, as existing standard reports only provide schools and colleges with local data in PDF format.
- UK-wide NSS benchmarking data. Student satisfaction is a key institutional priority, with NSS surveys providing the principal data for benchmarking across the UK. The dashboards provide a level of flexible and accessible analysis not previously available through the existing ‘raw’ datasets currently published via the Student Surveys wiki.
- Improved data accessibility via dashboards
- No BI Suite login required: a key requirement of senior managers and planners is to have access to key data without requiring expertise in unfamiliar software.
- All key student data and indicators available in one place: another key stakeholder requirement was to remove the need to access several different reports/locations in order to access the data required for any activity, particularly Quality Assurance.
- Automatic access to data at the appropriate security level. Detailed student data are available to authorised users, while all non-authorised users can access exactly the same data in a rounded format that ensures data protection rules are met without compromising data accessibility.
- Standardised dashboard design features offering key strategic data analyses and insights that were previously either not possible at all, or possible only by accessing several reports or by being skilled in manipulating BI Suite reports:
- Organisational levels: University; College; School; Subject; Programme/Course
- Key demographics: Fee Status; Ethnicity; Gender; Widening Participation
- Comparative analysis: multiple selections/filters side by side
- Trend analysis: normally across 5 years/sessions
- Consistent data definitions
- Support, documentation and data definitions
- The new Data Matters website, hosting the dashboards alongside Course Evaluation materials, features help, support and data definitions for users of student data.
- A new dedicated email address student.dashboards@ed.ac.uk is available for direct help, support and feedback to the Student Systems Operations’ reporting team.
Additionally, the project delivered data and software solutions to support Course Evaluation operational processes and reporting.
- Survey preparation and data capture (EUCLID): reduced time, effort and errors compared with the former manual processes in both schools and the surveys team. Surveys can now be prepared by various staff within a school instead of only one, while still delivering improved data integrity and consistency. This is a key element in successfully and robustly managing course evaluation: this year’s mainstreaming has resulted in a three-fold increase in the number of courses evaluated and a significant rise in response rates compared to previous years, and the process is under particular scrutiny as its role in staff development grows.
- Data validation (EUCLID, SQL): further improvements in data quality, and resulting saving in time and effort required from school and survey team staff.
- Exception reporting (SQL, BI Suite): greater oversight of and confidence in the data quality of surveys and their results.
- Operational reporting and management information (BI Suite): enabling vital oversight of course evaluation activity and results, and enabling a greatly increased and sustainable range of reporting outputs for schools and colleges despite the increase in activity from previous years.
Further, the project delivered a number of technical innovations and developments which can potentially deliver additional benefits to stakeholders through future projects and initiatives:
- Student Data Hub: the platform for housing student data within the new data architecture that will enable better, more consistent reporting for schools and colleges, including reporting across student data and other areas of data.
- Course Evaluation database and BI Suite universe: supporting new and expanded reports as the role and impact of course evaluation grows and develops.
- Greater compatibility between different student datasets: in particular, reporting across admissions and students, keenly sought by schools, colleges, and strategic planners.
- Stable VM-based SAP Dashboards implementation: providing a robust and reliable platform for further dashboard developments, which are key to the development of BI and MI provision across the University.
- Improved data analysis and transformation via Alteryx: enabling manipulation of vastly larger datasets than would otherwise have been sustainable, and thus enabling levels of analysis and detail far greater than originally envisaged in the dashboards.
- Mechanism for validating student data/EUCLID permissions during EASE authentication and other online activity. This could help develop group-based management systems that can be used University-wide.
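The permissions-validation mechanism noted in the last bullet can be illustrated with a short sketch. The feed format, field names and matching rules below are assumptions for illustration only, not the project’s actual feed specification.

```python
import csv
import io

# Hypothetical nightly feed of UUNs authorised to view
# individual student data (format assumed for illustration).
FEED = """uun,source
s1234567,EUCLID
s2345678,BI Suite
"""

def load_authorised(feed_text: str) -> set[str]:
    """Parse the feed into a set of lower-cased UUNs."""
    reader = csv.DictReader(io.StringIO(feed_text))
    return {row["uun"].strip().lower() for row in reader}

def is_authorised(uun: str, authorised: set[str]) -> bool:
    """Check a UUN against the feed at authentication time."""
    return uun.strip().lower() in authorised

authorised = load_authorised(FEED)
print(is_authorised("S1234567", authorised))  # True
print(is_authorised("s9999999", authorised))  # False
```

A group-based management system would extend this lookup with group membership rather than a flat list of individual UUNs.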
Analysis of Resource Usage:
Original Project Budget: 180 days
Staff Usage Estimate (original scope) (Information Services): 194 days
Staff Usage Actual (Information Services): 263 days
Staff Usage Variance (Information Services): +35.6%
Other Resource Estimate (Student Systems Operations): not estimated
Other Resource Actual (Student Systems Operations): around 215 days, plus £7000 Alteryx licences
Other Resource Variance: N/A
Explanation for variance:
Variance was largely due to additional scope introduced once the project was already in progress, and learning curves associated with the many new technologies and techniques used.
The project broke new ground in a number of technical and service areas, some of which were not apparent at the project outset:
- Student Data Hub development in line with emerging data architecture design principles
- Importing external datasets (Evasys)
- Dashboard development using new Alteryx and SAP Dashboards software
- Publication of BI Suite outputs outwith the BI Suite platform (dashboards via UoE website, Course Evaluation reports via email/network drives)
- Validation of data access permissions outside of EUCLID/BI Suite
Estimating the associated resource was challenging due to the lack of precedents. Significant issues also arose (summarised below), often incurring unexpectedly high costs for IS; likewise, additional effort expended in these areas by Student Systems Operations staff reduced their availability for developing the dashboards themselves, which affected the overall breadth and depth of the published outputs.
Much of the new scope introduced to the project appeared relatively late (August 2016 or later: publication and security solutions; Evasys data capture and extraction; course evaluations reporting requirements and universe design). While the project team and steering group managed this additional scope and secured additional budget for it, it had a direct impact on the project’s overall effort and timescales and caused much of the resource variance.
A significant amount of IS effort was also undertaken by contractors, requiring additional effort from IS staff to mentor and support those contractors.
The following resource totals are derived from ASTA, where effort is not categorised under the specific ‘project areas’ used here: therefore, the resource noted below against each project area should not be considered fully accurate or robust.
Project Area | IS Resource (days) | Increased cost factors
Project Initiation & Management | 31 | Increased duration of project
Software installation | 3 | N/A
Data Architecture and Infrastructure | 54 | New infrastructure, Student Data Hub and alignment of data structures to COM025 were not anticipated at project outset: significant cost of designing and prototyping the new datamart and build processes
EvaSys/DAL data import | 27 | Lack of EvaSys/DAL documentation; issues with EvaSys/DAL data structure and standards; DAL ‘long field’ datatype incompatible with standard data import processes: additional effort to develop an in-house solution
Course Evaluation EUCLID screens and extract | 31 | Course Evaluation processes not fully defined at outset: amended requirements and additional scope as processes evolved
STUDMI & Programmes BIS universes | 31 | N/A
Course Evaluation BI Suite universe | 49 | Lack of EvaSys/DAL documentation; issues with EvaSys/DAL data structure and standards; Course Evaluation processes not fully defined at outset: amended requirements/additional scope as processes evolved
Dashboard publication and security | 37 | Publication mechanism unclear at project outset: additional scope to develop a feed of user data from EUCLID to Apache and integrate it into dashboard publication/login processes
Key Learning Points:
1 What went well?
- Colleagues across Student Systems and IS collaborated successfully to overcome significant technical challenges and deliver innovative solutions which met key objectives and stakeholder expectations. Staff across a wide range of teams provided valuable insights and input, and senior managers generally helped resolve resource issues in a timely way.
- Availability/provision of short-term contractors in Student Systems and in IS to fill emerging resource and skills gaps.
- Regular meetings of a steering group comprising key stakeholders in IS, Student Systems and business areas.
- Collaboration between two project managers, one coordinating technical development efforts, the other business requirements and communications, helped avoid bottlenecks.
- Development and documentation of a prototype dashboard prior to the project commencing was valuable for informing the project’s analysis phase.
- Extensive engagement with stakeholders before and during all stages of the project led to outputs that focused on the key data of interest to stakeholders, and presented it in useful and easily-accessible formats, as evidenced in initial feedback from users.
- The dashboards build was expedited by the VM solution for the SAP software and by the procurement of Alteryx.
2 What didn’t go so well?
- Initial deadlines for the project were very tight given the number of new technologies involved and the limited availability of key staff.
- Data architecture developments took longer than anticipated because expected supporting work from COM025 was not available. Delays designing and documenting the new architecture significantly impacted the rest of the project.
- EvaSys-related developments took longer than expected to resolve, largely due to the unfamiliar nature of the datasets and the limited documentation and support available from Electric Paper.
- The dashboard was developed in a much shorter period than originally planned, because the architecture, software and datamart developments had to be completed first.
- The UG dashboard was delivered later than originally planned.
- Additional UG dashboard content was not delivered.
- No PG content was delivered.
- The ‘long field’ datatype used to store course evaluation text comments proved particularly difficult to import into University datamarts.
- Support and information from the software suppliers was challenging to secure and often required intervention from senior staff; even the roles and relationships of the various companies supporting the Evasys software are unclear.
- While the suppliers recognised the long field issue as a bug, no fix has been developed; instead an in-house solution has been developed at significant cost and without any guarantee that this will remain effective if/when the suppliers make software changes or developments.
- The dashboard was developed using SAP Dashboards software largely because it was the only software licensed and available across the institution.
- Other software in use locally elsewhere around the University (eg Tableau) may have delivered similar functionality at a significantly reduced cost.
- There has been considerable discussion in the University’s BI community about potential adoption of other dashboard software across the institution: until concrete steps are taken to address this, different areas continue to develop dashboards using different software packages, risking a confusing and inconsistent landscape of dashboard provision.
- NB: in addition to the other benefits noted above, using Alteryx rather than BIS for data transformation has delivered outputs that could be re-used and re-pointed towards alternative dashboard software should this be adopted in future.
3 If you had a project like this again, what would you improve?
- Managing and delivering SAC044 might have been easier and smoother had it covered only the technical developments required for the dashboards, with the dashboard development itself and associated business-facing activities being undertaken separately.
- Ideally, software required to deliver a project would be fully implemented at the point the project initiates: the VM platform on which SAP Dashboards were eventually delivered only became available about half-way through the project, which was another factor contributing to delays in progressing the development of the dashboard outputs themselves.
- It would have been better to close the project once the dashboards had been delivered. The additional scope related to course evaluation could then have been handled as a separate project, with better estimation and planning and a proper budget allocation.
4 Lessons learned
- Where projects depend on external suppliers, documentation, knowledge transfer and support should be secured from those suppliers as early as possible.
- Early engagement with key stakeholders and other users is an effective way to avoid major changes to user requirements during a project. This was apparent in the relative stability of the planned dashboard outputs through the life of the project, whereas developments associated with the parallel EvaSys project, where processes and requirements continued to evolve throughout SAC044, caused significantly more changes to scope.
- The impact of dependent projects should be carefully monitored and mitigated: later-than-expected challenges in defining and implementing course evaluation policy had a knock-on effect on the timescales for defining the related SAC044 requirements.
- A more cautious approach to adopting innovative software and data architecture solutions (new Student Data Hub, new Evasys datamart, SAP Dashboards, publishing and controlling access to data outside of BIS) might avoid unplanned additional scope and resource requirements during the course of the project, particularly where there is relatively little existing practical experience or knowledge available within the institution, and availability of key staff is limited. The costs and benefits of such new technical initiatives should be clearly established before being adopted. Equally, identifying new software that could benefit the project (eg Alteryx) ought to be undertaken as early as possible to maximise the benefits gained.
Outstanding issues:
There are no outstanding issues.
Additional minor dashboard developments and any bug fixes will be progressed through support. A three-month review of the dashboards’ content and reception is scheduled for late March: this should consider whether potential major improvements to the UG dashboard (as identified through stakeholder feedback) and any PG-level dashboards should be progressed.
Feedback continues to be gathered through the Data Matters website and through presentation to a variety of bodies such as College Quality Assurance Committees, the School Directors of Quality network, Academic Services, the BI Community, and key stakeholders.
Google Analytics currently provides evidence of basic hits on the dashboards; deeper analysis of user engagement and activity is being explored by Student Systems and Service Management.
This project has introduced a new component, the Student Data Hub, to simplify the data flow for MI, BI and other downstream systems. However, the existing BI universes have not been migrated into the new architecture. A clear roadmap, or set of principles, needs to be developed for transitioning student data from the existing to the new data architecture, in order to reduce the cost of supporting a mixed data architecture and the risks of duplicating effort and failing to realise the full potential and benefits of the new architecture. This should be done together with the work on the next generation of EUGEX.