
Below are sample topics and questions that could be used in a survey intended to evaluate the architectural maturity of the learning ecosystem. The concept of maturity follows the stages of the Capability Maturity Model (CMM):

  1. Level 1 - Ad hoc: Processes are undocumented and tend to be driven in an ad hoc, uncontrolled, and reactive manner by users or events.
  2. Level 2 - Repeatable: Some processes are repeatable. Process discipline is unlikely to be rigorous.
  3. Level 3 - Defined: Standard processes are defined, documented, and subject to some degree of improvement over time. These standard processes are in place (i.e., they are the AS-IS processes) and are used to establish consistency of process performance across the organization.
  4. Level 4 - Managed: Using process metrics, management can effectively control the AS-IS process (e.g., for software development). In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Process capability is established at this level.
  5. Level 5 - Adaptive: The focus is on continually improving process performance through both incremental and innovative technological changes and improvements. Multiple feedback loops based on metrics drive the continuous improvement process.

The purpose of the survey is to measure maturity (as defined in CMM) in these core areas:

  1. Curriculum development
  2. Instructional design
  3. Learning plan and advising
  4. Evaluation and feedback
  5. Measuring success
  6. Effective management and delivery of content
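As an illustration only, responses in each core area could be scored on the CMM scale and aggregated into a per-area maturity level. The scoring rule below (rounded mean of per-question scores) and all names are hypothetical choices, not part of the survey design:

```python
from statistics import mean

# Hypothetical: each survey question is answered on the CMM scale, 1 (Ad hoc)
# through 5 (Adaptive); an area's maturity is the rounded mean of its answers.
LEVELS = {1: "Ad hoc", 2: "Repeatable", 3: "Defined", 4: "Managed", 5: "Adaptive"}

def area_maturity(responses):
    """Return (level, label) for one core area from its question scores."""
    level = round(mean(responses))
    return level, LEVELS[level]

# Example: "Curriculum development" scored across five questions.
level, label = area_maturity([2, 3, 3, 2, 3])
```

Other aggregation rules are equally defensible; for instance, taking the minimum score would reflect the view that an area is only as mature as its weakest process.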

Curriculum development

Questions in this section are designed to measure the maturity of tools and processes used in developing curricular materials.

Curriculum development is the institutional process of creating and managing learning units: courses, programs, and specializations. Ideally the process is workflow-enabled and allows members of the community (students and faculty) to comment on and evaluate changes to the curriculum. When new courses are published, they should automatically be registered in the LMS, and any changes to programs should notify students who reference those programs in their Learning Plan.

  1. Is the curriculum development process workflow-enabled?
  2. Does the curriculum development process enable feedback from students and other faculty?
  3. Are new curricular materials published directly to the LMS?
  4. Are changes to programs communicated to students via their Learning Plans?
  5. Does the curriculum development process capture learning objectives?
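The publish-and-notify behaviour described above can be sketched as a simple event-driven integration. The event bus, event names, and payloads below are illustrative assumptions, not an actual implementation:

```python
from collections import defaultdict

# Hypothetical sketch: publishing a course emits an event; subscribers
# (an LMS registrar, a Learning Plan notifier) react to it.
class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event, handler):
        self._subscribers[event].append(handler)

    def publish(self, event, payload):
        for handler in self._subscribers[event]:
            handler(payload)

bus = EventBus()
lms_catalog = []
notifications = []

# The LMS automatically registers any newly published course.
bus.subscribe("course.published", lambda c: lms_catalog.append(c["code"]))
# Students whose Learning Plans reference a changed program are notified.
bus.subscribe("program.changed",
              lambda p: notifications.extend(p["affected_students"]))

bus.publish("course.published", {"code": "CS101"})
bus.publish("program.changed", {"program": "BSc CS",
                                "affected_students": ["s123", "s456"]})
```

In a real ecosystem the bus would be a message broker or integration platform rather than an in-process object, but the pattern is the same: the curriculum system publishes, and downstream systems subscribe.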

Instructional design

Instructional design covers all the processes involved in creating an actual learning unit (course, lecture, lab). Various technologies are involved in this process: video capture and processing, and the integration of simulation and testing tools.

Learning plan and advising

Questions in this section are designed to measure the maturity of tools and processes that support the learner's personal planning activities.

The Learning Plan is the student’s tool for navigating through x years of study. Ideally it is connected to Degree Audit, career opportunities and on-line advising. It is a vehicle for registration and program enrollment. The student should be able to move seamlessly from the Learning Plan to actual course content.

  1. Is the Learning Plan connected to Degree Audit?
  2. Does the Learning Plan include a recommendation engine?
  3. Is the Learning Plan connected to on-line advising?
  4. Is the Learning Plan connected to career opportunities?
  5. Are data from Learning Plans used in institutional analysis?
  6. Is the Learning Plan a vehicle for course registration and program enrollment?
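To make the connections above concrete, here is a minimal, hypothetical data model in which the Learning Plan is the hub for registration and a toy degree-audit check. All class, field, and course names are illustrative:

```python
from dataclasses import dataclass, field

# Hypothetical model: the Learning Plan links the student to a program,
# planned courses, and advising notes.
@dataclass
class LearningPlan:
    student_id: str
    program: str  # enrolled program; a degree audit would run against this
    planned_courses: list = field(default_factory=list)
    advising_notes: list = field(default_factory=list)

    def register(self, course_code):
        """Course registration flows through the plan (question 6)."""
        self.planned_courses.append(course_code)

    def unmet_requirements(self, required):
        """Toy degree-audit check (question 1): required courses not yet planned."""
        return [c for c in required if c not in self.planned_courses]

plan = LearningPlan("s123", "BSc CS")
plan.register("CS101")
```

A mature implementation would draw the `required` list from the curriculum system and feed de-identified plan data into institutional analysis (question 5).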

Evaluation and feedback

Questions in this section are designed to measure maturity in the use of analytics to improve learning processes.

Are there continuous improvement feedback loops:

  1. To improve the curriculum?
  2. To improve instructional design?
  3. To improve the quality of instruction?
  4. To improve student success?

Measuring success

Questions in this section are designed to ascertain the degree of clarity around KPIs for teaching and learning.

There needs to be institutional clarity around KPIs and how they are to be measured (with respect to the success of the learning enterprise). Typically these include:

  1. Retention
  2. Engagement
  3. Future success

Learning technologies capture large volumes of data on how students interact with course materials. Is there a comprehensive system for capturing and analyzing these data?

Effective management and delivery of content

This section is designed to measure the maturity of the "Learning Technology Ecosystem". It is essentially about measuring the level of integration among the different components.

Learning tools include the LMS and all the tools that are plugged into the LMS.

  1. How sophisticated are the integration strategies?
  2. How are the LMS and SIS integrated?
  3. To what extent do integration strategies follow IMS standards (LIS and LTI)?
  4. Is there a comprehensive Identity and Access Management plan for learning tools integration?
  5. What proportion of courses are in the LMS?
  6. What proportion of students use the LMS?
  7. How "deep" is the use of the LMS (course outlines only = shallow; all instructional materials, including tests = deep)?
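As background for question 3: an LTI 1.1 tool launch is an OAuth 1.0a-signed form POST from the LMS to the external tool. The sketch below shows the signature computation; the URL, keys, and parameter values are illustrative, and the parameter set is abridged (a real launch also carries a timestamp, nonce, and user/context fields):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def oauth_signature(method, url, params, consumer_secret):
    """OAuth 1.0a HMAC-SHA1 signature, as used by LTI 1.1 launches."""
    enc = lambda s: quote(str(s), safe="")
    # 1. Normalize: percent-encode, then sort the key/value pairs.
    pairs = sorted((enc(k), enc(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    # 2. Build the signature base string.
    base = "&".join([method.upper(), enc(url), enc(param_str)])
    # 3. Sign with HMAC-SHA1; the key is "<consumer_secret>&<token_secret>",
    #    and the token secret is empty for an LTI launch.
    key = f"{enc(consumer_secret)}&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Illustrative (abridged) launch parameters.
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-outline-42",  # illustrative
    "oauth_consumer_key": "demo-key",         # illustrative
    "oauth_signature_method": "HMAC-SHA1",
}
sig = oauth_signature("POST", "https://tool.example.edu/launch",
                      launch, "demo-secret")
```

Newer deployments use LTI 1.3, which replaces this OAuth 1.0a signing with OpenID Connect and signed JWTs; the survey question applies to either generation of the standard.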