Overview

Background

Student Systems has responsibilities relating to (in addition to other areas) admissions and student data held on EUCLID and related systems, key student survey data including NSS and EvaSys course evaluation data, and the statutory Key Information Set (KIS) return.

Early in 2015, Student Systems received a steer from senior management to:

  • Develop our use of student data to support ways to enhance learning & teaching, the student experience and operational effectiveness;
  • Focus activity on what will make a difference at School level – provide support, help develop insights and share practice;
  • Focus on the accessibility, visualisation and transparency of data, helping to simplify and manage complexity;
  • Examine the use of dashboards to support these objectives.

 

An initial working group was established, led by Barry Neilson and comprising representatives of Student Systems, BIMI Programme, ISG, GaSP, CHSS and Academic Services; by the end of 2015 Barry and the group had undertaken:

  • Workshops with a wide range of staff in May 2015;
  • Papers and presentations to Senate, the BIMI Board, the Learning & Teaching Committee and the Knowledge Strategy Committee;
  • Meetings with Heads of School, Directors of Professional Services, and colleagues in other roles;
  • Meetings with other institutions already producing dashboards: the University of Warwick, the University of Sheffield and Oxford Brookes University;
  • Development of prototype dashboards using both SAP Dashboards and QlikView software.

Scope

Six key areas have been identified:

  • Understand applications and admissions over a period of time and in-year, to help plan for the following year.
  • Understand my student cohort(s): their characteristics, trends, progression and outcomes.
  • Learning & teaching – survey data, linked to the student record and other sources, with some local-level internal and external benchmarking (Analytics/Predictive).
  • Standard reports for annual course and programme reviews and TPRs, so there is one consistent set of data and staff spend less time looking for it and more time using it.
  • Understand my students on an individual level and what is happening in-year (Analytics/Predictive).
  • Effective/Efficient – assessment volumes, class sizes, feedback and assessment turnaround times, contact hours, and internal and external comparison (Analytics/Predictive).

For the dashboards element, the key areas of focus are identified as:

  • Value – are we reporting on data which will enable Schools to enhance learning & teaching and the student experience? Clear link to strategy and key indicators.
  • Use – reporting and dashboards, yes, but also greater analytics, insights and potentially predictive qualities; accessibility, visualisation and simplicity.
  • Resources – capacity as well as capabilities; systems as well as people.
  • Alignment – plan, scope and clarity on roles and responsibilities; aligning central team(s), Colleges and Schools; one point of distribution.
  • Agreement – data definitions, consistency in use, one source.
  • Ethics and security – clarity on the use of data, access and security; avoiding unintended consequences.
  • Operation – fitting in with operation rather than organisational structure; focussed at the level of need.
  • Culture – moving to active engagement and use; greater access, visibility and transparency.

Objectives and Deliverables

Each entry below is listed as Reference (Priority): Details.

O1 (Must): Key objective.

  D1 (Must): Deliver a service to all staff which gives accessibility and visibility, is robust and reliable, and is delivered with good training and engagement with end users.

O2 (Must): Dashboards data architecture.

  D2 (Must): Application and Data Architecture (ADA) and system design.

  D3 (Must): Develop a new Student Data Hub to house the new data sources required for the dashboards.

O3 (Must): Data capture.

  D4 (Must): Create a new data mart (Dev, Test and Live) for survey data from EvaSys; implement a new nightly import process to import the data from EvaSys and load it into the new data mart (see the sketch following this list).

  D5 (Must): Record/maintain subject areas and their JACS equivalents, comparator institutions, and student programmes in EUCLID.

  D6 (Could; descoped): Record/maintain/view Taught Programme Reviews (TPRs) and Postgraduate Programme Reviews (PPRs), their subject areas and school administrators, and the programmes (and courses?) they include, in EUCLID via e:Vision.

  D7 (Could; not required – Home Subject is fit for purpose and maintained in the client): Record/maintain subject areas against EUCLID courses via e:Vision (if different from the existing ‘TL subject areas’ and ‘Home/Other subjects’); investigate whether this is already available.

  D8 (Must): Maintain/update course organisers against EUCLID courses and current/future course instances via e:Vision (CCAM).

  D9 (Should): Amend the EUCLID course roll-forward process to include the course organiser; required for the January 2017 roll-forward.

  D10 (Must): Record/maintain EvaSys course survey flags and teaching staff against course instances in EUCLID via e:Vision.

  D11 (Could; descoped): Record/maintain annual targets against admissions/entry programmes in EUCLID via the client.

O4 (Must): Data transfer.

  D12 (Must): Transfer programme subject areas, NSS comparators and annual admissions targets to a schema capable of feeding the BIS Programmes universes.

  D13 (Must): Transfer course subject areas, organiser codes, marking scheme codes and names, and the full complement of WP markers to a schema capable of feeding the BIS STUDMI universes.

  D14 (Must): Transfer the course instance organiser code to a schema capable of feeding the BIS STUDMI universes.

O5 (Must): BIS universe developments.

  D15 (Must): Create and implement a new Course Enhancement Surveys BIS universe, reading from the new EvaSys data mart, with data access restrictions based on user roles/permissions and the principle of student anonymity.

  D16 (Must): New STUDMI Monthly and STUDMI Daily objects for courses (subject areas, marking schemes), course instances (course delivery key), students (1 overarching and 9 subsidiary WP markers), and student programmes (5-week filter objects: exited within 5 weeks of commencing the programme; exited/interrupted within 5 weeks of the start of semester 1).

  D17 (Must): Archive existing STUDMI Monthly, STUDMI Daily and EUCLID Admissions session year objects with non-standard formatting and replace them with new objects in the standard format.

  D18 (Must): Create new objects in STUDMI Monthly and STUDMI Daily to hold UUN/Instance in the standard format S1234567/1.

  D19 (Should): Populate subject areas, course organisers, new WP markers, marking schemes and revised session years in existing historical STUDMI Monthly snapshots.

  D20 (Must): New Programmes universe objects for programme subject area; new Programmes universe class for annual admissions targets.

  D21 (Must): Any changes required to support NSS reporting from the KIS universe and reconciliation with student/EUCLID data.

O6 (Must): Dashboards software implementation.

  D22 (Must): Confirm whether Apple-, tablet- and phone-compatible SAP Dashboards can be published; establish the expected lifetime of SAP Dashboards and Flash-based outputs; confirm the viability of Flash output with regard to Apple products and security issues.

  D23 (Must): Proof of concept: nightly refresh of EASE-protected dashboards from source Webi reports, with further restriction on access based on UUNs authorised by local line managers via Student Systems Operations.

  D24 (Must): Go/no-go decision on SAP Dashboards: if yes, implement the software, including the LiveOffice add-on, without compromising developers’ desktops; if no, seek alternative software that does not require licensing of end users.

O7 (Must): Dashboards community.

  D25 (Must): Agree data requirements, definitions and visualisations with stakeholders.

  D26 (Must): Establish a permanent dashboards user group.

  D27 (Must): Agree new EUCLID roles and responsibilities with stakeholders in Schools and other key units; agree central, college and school reporting/support remits and responsibilities.

  D28 (Must): Develop/maintain a web presence to publicise the dashboards, provide associated documentation, and inform users of project developments/progress and future iterations.

O8 (Must): Webi report development.

  D29 (Must): Develop Webi reports to support University- and School-level dashboards and, potentially, associated reports/infospaces.

O9 (Must): Dashboard development.

  D30 (Must): Develop University- and School-level dashboards.

O10 (Must): Dashboard publication.

  D31 (Must): Publish dashboards outside the BI Suite (i.e. users will not log in via BIS), refreshing nightly and openly accessible to EASE-authenticated staff (where authorised by local line managers/Student Systems Operations) using laptops and desktops.

  D32 (Could): Publish dashboards compatible with Apple products.

  D33 (Could): Publish dashboards compatible with tablets and mobile phones.
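
D4 above amounts to a scheduled extract-and-load step into the new data mart. The sketch below is a minimal illustration only: the table and column names (SURVEY_RESPONSE, EVASYS_MART.SURVEY_RESPONSE), the Oracle-style bind parameters and the previous-day cut-off are assumptions made for the example, not decisions recorded in this brief; the real EvaSys interface, mart schema and scheduler fall out of the ADA/system design work (D2/D3).

```python
"""Illustrative nightly EvaSys import (D4) -- a sketch, not the project's actual ETL."""

from datetime import date, timedelta


def load_previous_days_responses(source_conn, mart_conn, run_date: date) -> int:
    """Copy responses for surveys that closed the previous day into the data mart.

    Both connections are assumed to be PEP 249 (DB-API) objects, e.g. cx_Oracle;
    scheduling (a nightly cron/enterprise job) sits outside this function.
    """
    closed_on = run_date - timedelta(days=1)

    src = source_conn.cursor()
    src.execute(
        "SELECT survey_id, course_code, question_id, answer_value, closed_date "
        "FROM SURVEY_RESPONSE WHERE closed_date = :closed_on",
        {"closed_on": closed_on},
    )
    rows = src.fetchall()

    tgt = mart_conn.cursor()
    # Delete-then-insert keeps the job safe to re-run for the same date.
    tgt.execute(
        "DELETE FROM EVASYS_MART.SURVEY_RESPONSE WHERE closed_date = :closed_on",
        {"closed_on": closed_on},
    )
    tgt.executemany(
        "INSERT INTO EVASYS_MART.SURVEY_RESPONSE "
        "(survey_id, course_code, question_id, answer_value, closed_date) "
        "VALUES (:1, :2, :3, :4, :5)",
        rows,
    )
    mart_conn.commit()
    return len(rows)
```

The Dev, Test and Live marts in D4 would then simply be three deployments of the same job pointed at different connections.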

Out of scope/Assumptions

  • Initial dashboards, focusing on both undergraduate and postgraduate applicants and taught students, will be published in late 2016. Dashboard requirements for research students differ significantly from those for taught students, so research student dashboards will be published later, via a future iteration or continuous improvement.
  • Dashboards will not be published for non-graduating student programmes.

Benefits

Enhancing MI/BI and strategic planning

  • Better, more consistent data definitions across different areas of reporting
  • New MI available on surveys, admissions targets, external benchmarks
  • Key MI available together in one place
  • Easily accessible data, on mobile devices as well as computers; no BIS log-in
  • Better, quicker insights through improved visualisation and analysis
  • Internal benchmarking between/across schools, subject areas and courses/programmes

Enhancing Learning and Teaching

  • Cross-referencing student survey results with course results and programme outcomes
  • Analysis at subject level as well as individual courses/programmes
  • Identifying high and low performance, best practice and areas meriting attention
  • Tracking cohort progression, attrition and completion
  • Demographic analyses (gender, ethnicity, fee status, WP)
  • Identifying trends over time
  • External benchmarking via KIS/NSS

Improving data architecture and data quality

  • Initiating a new Student Data Hub consistent with the emerging framework for BI Architecture (COM025). Implementing the ETL tool will benefit from a standardised approach which can be reused when new data sources are added to the new data hub, reducing manual Oracle development work (see the sketch after this list).
  • Improving consistency and compatibility of datasets: the ability to report on students, programmes and courses across different universes.
  • Better understood data definitions
  • Developing a sustainable, scalable data architecture for future growth and development.
  • Reducing overhead to develop and maintain new reports in Student Systems, Colleges and Schools.
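
The "standardised approach" above can be made concrete by describing each feed declaratively and driving one generic load routine from that description, so that adding a source to the hub means adding configuration rather than writing new bespoke Oracle code. The sketch below only illustrates the shape of that idea and generalises the nightly load sketched under the objectives and deliverables; the source names, queries and target tables are placeholders invented for the example, and the actual ETL tool and its configuration are project design decisions not stated here.

```python
"""Sketch of a declarative source registry for Student Data Hub loads (illustrative only)."""

from dataclasses import dataclass


@dataclass(frozen=True)
class SourceDefinition:
    name: str          # readable name for logging/monitoring
    extract_sql: str   # query run against the source system
    target_table: str  # Student Data Hub table to load


# Adding a new data source becomes one new entry here, not new bespoke code.
# (Names below are placeholders, not agreed schema objects.)
SOURCES = [
    SourceDefinition(
        name="evasys_course_surveys",
        extract_sql="SELECT * FROM SURVEY_RESPONSE",
        target_table="HUB.EVASYS_SURVEY_RESPONSE",
    ),
    SourceDefinition(
        name="nss_results",
        extract_sql="SELECT * FROM NSS_RESULT",
        target_table="HUB.NSS_RESULT",
    ),
]


def refresh_hub(source_conn, hub_conn) -> None:
    """One generic delete-and-reload pass over every registered source.

    DB-API connections assumed; a single source connection is a simplification.
    """
    for source in SOURCES:
        src_cur = source_conn.cursor()
        src_cur.execute(source.extract_sql)
        rows = src_cur.fetchall()

        hub_cur = hub_conn.cursor()
        hub_cur.execute(f"DELETE FROM {source.target_table}")
        if rows:
            binds = ", ".join(f":{i + 1}" for i in range(len(rows[0])))
            hub_cur.executemany(
                f"INSERT INTO {source.target_table} VALUES ({binds})", rows
            )
    hub_conn.commit()
```

In practice the ETL tool itself would provide this kind of source registration; the point of the sketch is only that one standardised, reusable pattern, rather than per-source development, is what delivers the benefit described above.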

Success Criteria

Each criterion below is listed as Ref (deliverable reference): Description, followed by its Measure.

1 (D1): Delivery of the key objective O1. Measure: the combined measures below will together demonstrate the success of delivery of our key objective O1.

2 (D3): Data architecture does not impact data availability. Measure: dashboards combine data from all data sources, whether flowing through data marts on the existing architecture and/or the new Student Data Hub.

3 (D6): EvaSys course evaluation data is available downstream within 1 day of surveys closing. Measure: dashboards reflect all course surveys completed up to the previous day.

4 (D5, D12, D29, D30): External benchmarking. Measure: dashboards support benchmarking against comparator programmes at other institutions.

5 (D5, D12, D13, D16, D29, D30): Multi-level analysis. Measure: dashboards support analysis and comparisons at 5 levels: University, College, School, Subject Area, and Programme/Course.

6 (D5, D7, D8, D9, D10, D11): Data captured at source. Measure: School/College roles maintain relevant new EUCLID data locally via e:Vision: course/programme subject areas, KIS/NSS comparators, course organisers, course survey flags and teaching staff, and admissions targets.

7 (D8, D9, D14): History of course organisers is maintained against courses/course instances. Measure: historic data on course performance and evaluation under previous course organisers is not associated with the current course organiser.

8 (D11, D12, D20, D29, D30): Admissions targets measured against actual entrants; forward planning enabled. Measure: dashboards cross-reference entrant numbers against targets, application counts, offer counts and UF counts.

9 (D13, D19): WP reporting fully enabled via STUDMI. Measure: all WP markers are available in STUDMI snapshots for the previous three sessions.

10 (D15, D31): Appropriate and effective security around EvaSys data surfaced via the BI Suite. Measure: access to EvaSys data via dashboards or the published universe meets the role-based criteria defined in the Course Evaluation Mainstreaming policy; the EvaSys universe is anonymised to ensure individuals cannot be identified through linking to the student/applicant universes (see the sketch after these success criteria).

11 (D15, D23, D24, D31): Appropriate and effective controls around access to personal/sensitive data. Measure: dashboards are not accessible outside EASE; demographics meet the relevant conditions set out in Schedules 2 & 3 of the Data Protection Act 1998 c. 29; demographics are not applied to EvaSys data; staff-specific data is excluded from dashboards; access to dashboards is controlled via Student Systems Operations and local line managers as for all other BIS content?

12 (D16, D17, D18, D21, D29): Improved data consistency/reporting across universes. Measure: Webi reports combine a) course instance datasets from STUDMI and from EvaSys; b) student instance datasets from STUDMI and EUCLID Admissions; and c) session year data from STUDMI, EUCLID Admissions, the Programmes universe, EvaSys and the NSS/KIS universe.

13 (D17): No impact on existing STUDMI reports through archiving of existing objects. Measure: regression testing of existing reports shows archived objects continue to perform correctly.

14 (D19, D20, D29, D30): Trend analysis. Measure: admissions trends analysed over the 5 most recent entry sessions; student, programme and course trends analysed over three sessions.

15 (D22, D24, D30, D31): Accessibility and durability. Measure: dashboards are accessible on desktops and laptops; SAP Dashboards can be supported until any alternative dashboards software is procured and implemented.

16 (D23, D24, D31): Data currency. Measure: dashboards refresh on a nightly basis.

17 (D23, D30): Sustainable/supportable dashboards. Measure: dashboards refresh automatically.

18 (D25, D26, D27, D28, D29, D30, D31): Improved support, analysis and insight for School and College strategic planning. Measure: positive feedback from users/stakeholders.

19 (D25, D26, D27, D28, D29, D30, D31, D32): Improved support, analysis and insight for School and College learning and teaching, and quality assurance processes. Measure: positive feedback from users/stakeholders.
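
Criteria 10 and 11 come down to two controls: restricting who can see the dashboards (EASE plus a UUN allow-list authorised by local line managers via Student Systems Operations), and ensuring nothing in the published EvaSys data can be joined back to an identifiable student. The sketch below only shows the shape of both checks; the allow-list content, the keyed-hash pseudonymisation and the function names are assumptions made for the example, and the real controls are those defined by the Course Evaluation Mainstreaming policy and enforced in the BI Suite's own security layer rather than in application code.

```python
"""Illustrative access check and pseudonymisation for EvaSys data (sketch only)."""

import hashlib
import hmac

# UUNs authorised by local line managers via Student Systems Operations.
# (Placeholder values -- in practice this list is maintained operationally.)
AUTHORISED_UUNS = {"jblogg01", "asmith02"}


def may_view_dashboard(uun: str, ease_authenticated: bool) -> bool:
    """Dashboards sit behind EASE, with a further allow-list check (D23/D31)."""
    return ease_authenticated and uun.lower() in AUTHORISED_UUNS


def pseudonymise_respondent(respondent_id: str, secret: bytes) -> str:
    """Replace any respondent identifier with a keyed hash before it reaches the
    reporting layer, so EvaSys rows cannot be linked back to the student or
    applicant universes without the secret (which stays in the ETL tier)."""
    return hmac.new(secret, respondent_id.encode("utf-8"), hashlib.sha256).hexdigest()
```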

Project Info

Not available.

Documentation

Not available.