Topic: Managing EA Work and Communicating Up at UW
Presenter: Jim Phelps (and UW team).
Day/Time: Friday, April 16, 2021 | 11AM PDT, Noon Mountain, 1PM Central, 2PM Eastern (6PM GMT)
Jim Phelps (UW), Piet Niederhausen (UW), and Rupert Berk (UW). Ashish Pandit, Beth Schaefer, Bethany Gordy, Betsy Draper, Bill Allison, Brian DeMeulle, Christopher Eagle, Dana Miller, Dave Goldhammer, Garrett King, Henry, Jared Logan, Jeff Kennedy, Keith Hazelton, Lonnie Smetana, Louis King, Michael Davis, Mona Zarei, Nathan Eatherton, and Raoul Sevier.
This is the broader story of how the EA Team at UW does its work and how it communicates priorities, decisions, progress, and trade-offs. UW maintains a longer-term what-is-possible view of its backlog using a visual board created in Mural, and this is then cast down and reflected into shorter-timeframe views in other tools, such as Google Sheets and then Jira:
Jim Phelps led this session and facilitated contributions from the other members of the UW team, Piet Niederhausen and Rupert Berk.
Using the visual collaboration tool Mural, the team has defined an "opportunity matrix" that maps potential engagements/investments based on where they fall in the space bounded by impact and organizational readiness. Items are placed into this space without a formal scoring rubric, just an informed sense of where things sit. That "quick and dirty" approach is a feature rather than a shortcoming: it's fast, and when EA sits with stakeholders to talk about these initiatives, other observers can contribute and provide their thoughts too:
The resulting placements are plotted in a matrix (like a TIME chart) or a Forrester-Wave-style space (the latter is what the EA team uses), like this:
The resulting population looks like this, with green-coding applied to signify the status/category:
This is the core planning device used with a senior "EA Board" group, which assesses the work program and is provided with this enterprise scan. The group includes the CIO, the senior leadership for technology, the CISO, and various AVPs (such as Identity and Access Management). The resulting conversations enable the positioning of the initiatives to be adjusted and interventions to be created where required. Note that the scope of what goes on this board includes everything the EA team is involved in. The board doesn't show or imply the magnitude of the work required to progress or deliver these initiatives (though big items/stories can be broken down into smaller work packages). Early in this process there were some mappings between the work items on the Mural board's sticky notes and the strategies etc. to which they are connected; behind each in-progress sticky note, pointers are available to connect the work with its related strategies.
During the review with the EA board there are also "dots" added in an overlay that communicates conditions such as: important to the EA Board, important to the CISO, blocked elsewhere but able to be progressed by EA, like this:
During these sessions, members of the EA Board are "forced" to take a longer-term view, to step out of their day-to-day operational and firefighting activities, and the conversations that happen here are enormously valuable.
The time horizon is typically within a one-year timeframe, with some items one to two years out (though it would also be possible to place initiatives on this canvas that would one day be a really great investment for the university, principally to signal that these things exist and to maintain a longer-term watching brief).
Initiatives from the wider organization are identified through various formal channels and also identified through casual encounters, discussions during meetings, work and plans that are being shared in various communities of practice, and so forth. The EA team follows these up (but doesn't have a problem finding initiatives requiring their services!).
The focus of the team tends to be upon assembling the people, process, and information required to create the conditions in which a good solution will be established and enacted. The UW team tends to work at a fairly high level, rather than a detailed technical level, but is very hands-on throughout these engagements, shepherding decisions towards consistency, reuse, and quality/sensible outcomes. Note that #ymmv = EA teams at different organizations operate in vastly different ways and have vastly different mandates and responsibilities.
There's a Google Sheet where quarterly planning of the work undertaken by the EA team is managed and tracked, including sizing and who is on-point/responsible. Sizing uses "points" based on hourly units, planning to 85% of available capacity to ensure there is just enough headroom for unexpected events; the expected velocity is that each team member can commit about fifty points over a two-week sprint, and a "quarter" is six two-week sprints. The sizing/hours estimates are also cast by a simple formula into tee-shirt sizes. Individual rows can become epics in Jira.
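The capacity arithmetic described above can be sketched out roughly as follows. The 85% planning factor, the fifty-point sprint velocity, and the six-sprint quarter come from the session; the tee-shirt size thresholds are illustrative assumptions, not UW's actual formula.

```python
# Sketch of the quarterly capacity/sizing arithmetic described in the notes.
# Thresholds in tee_shirt() are hypothetical; UW uses "a simple formula"
# whose exact breakpoints weren't shared.

POINTS_PER_SPRINT = 50      # one point is roughly one hour
SPRINTS_PER_QUARTER = 6     # a "quarter" is six two-week sprints
PLANNING_FACTOR = 0.85      # plan to 85% capacity, leaving headroom

def quarterly_capacity(team_size: int) -> float:
    """Plannable points for the whole team over one quarter."""
    return team_size * POINTS_PER_SPRINT * SPRINTS_PER_QUARTER * PLANNING_FACTOR

def tee_shirt(points: float) -> str:
    """Cast an hours/points estimate into a tee-shirt size (assumed breakpoints)."""
    if points <= 8:
        return "S"
    if points <= 24:
        return "M"
    if points <= 80:
        return "L"
    return "XL"

print(quarterly_capacity(4))   # 4 * 50 * 6 * 0.85 = 1020.0 points
print(tee_shirt(30))           # "L"
```

For a four-person team this yields 1,020 plannable points per quarter, which is the budget against which the quarterly rows in the Google Sheet would be committed.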
This gives the EA Board transparent visibility of the work being done and the trade-offs being made (e.g., we're not doing this piece of work next quarter because we're full doing other higher-priority work), which enables conversations about (re)prioritization. There is also deliberate allowance made for unplanned work, which is factored into the estimates and the planning (rated as "could" in the must/should/could scheme). This quarterly plan lets the EA team and its members:
As above, some of the rows from the quarterly planning sheet become epics in Jira, used to plan work two weeks into the future:
...and that's where additional information, details, and work-tracking live, in Jira, which is also where the actual time spent is recorded in hours. That can be mapped against the story points (also expressed in hours), and the two are reported and compared periodically in a nice reporting tool. The resulting pie charts are also shared back with the stakeholders to show how their investment was spent across activities, and used by the EA team in quarterly retrospectives.
Once established, the overheads of running this scheme are minor (and totally worthwhile!). The benefits of this approach (e.g., better decision-making) aren't established, tracked, or measured formally (beyond the quarterly retrospectives and the continuous improvement they create), but there is regular checking with the EA Board about whether or not the artefacts and the process are helpful, and they are reported to be engaging and useful.
A quick evaluation poll was run at the end of the session, and 100% of attendees rated the session as being 4 or 5 out of 5 — this was an excellent session!