Digital Discoveries Part 1: Reviewing projects to proceed with

From: Defence Digital
Published: Wed Nov 23 2022


The Digital Discovery Assessment Branch (DDAB) is part of D Info and accepts project proposals submitted through the Digital Discovery Gateway.

Anyone can submit a project through the Digital Discovery Gateway. Submitters are asked to summarise their project, give key information about it, and explain why they want DDAB's assistance.

Deciding which proposals to proceed with

In many ways the team became a victim of its own success once other teams heard about the projects we had helped with. With limited capacity, we needed a way to prioritise projects to prevent the team becoming overstretched, or important or urgent projects being overlooked.

We needed a way to vet proposed Discovery applications to establish whether each proposal actually needed a Discovery. This is not a new problem within defence, as Silvia Grant's post from earlier this year explains.

Some organisations run these reviews as pre-Discoveries or assessments, but we used the term "First Look", as that's what the Navy used in a (since deleted) blog post that gave us the inspiration.

We reviewed these methods but couldn't find one that solved all of our problems, so we assembled a team to decide what we needed. Fortunately, we were at a firebreak before starting new projects, so we had time to plan First Looks, test them on new projects, and set aside time to review and refine them as we continued with our other work.

Creating First Looks

A First Look lets the team quickly gain a broad enough understanding of a potential Discovery project's problems, users, potential outcomes and current priorities to decide on the best next steps.

These next steps could be to proceed to a full Discovery, hand the project back to the submitting team, or pass it on to a development team.

First Look creation team

The team that worked on the First Look had similar skill sets to those who would carry out a Discovery: a Delivery Manager, User Researchers and a Business Analyst.

As we were based across the country, we used an online whiteboard to discuss ideas, alongside several in-person workshops to debate them.

We treated this like any other Agile project, with a Product Owner, regular show and tells for the wider team, and regular stand-ups to track progress.

Defining our needs and users

Our sessions focused on the question "What do we want to achieve by the end of a First Look?". Early on we decided the answer had to be that "we have enough information for the team to make a decision on the best next step".

For users, we decided that the First Look team would be the primary users, the wider DDAB team secondary users, and those submitting through the Gateway a third group.

We then looked at how we would go about answering this with the least amount of work:

  • What questions did we need to answer?
  • What methods could we use to answer these questions?
  • How would it fit into existing processes?
  • How would we inform and engage stakeholders, particularly if the Discovery process were to be extended?
  • Who should be on a First Look team? What skills are required?
  • What changes would we need to make to the DDAB Gateway submission form?
  • How would a First Look process act within Agile?

We decided that we needed good enough answers to high-level questions. To help us focus these questions into more specific areas, we looked at what other user researchers had done and used the 8 Pillars of User Research as one framework.

The 8 Pillars of User Research and forming interview questions

The global ResearchOps community researched the main areas that user research operates within. As Discovery is a user research-led process, it made sense to frame our questions within a user research framework.

The 8 pillars are:

  1. Environment
  2. Scope
  3. Recruitment and Admin
  4. Data and Knowledge Management
  5. People
  6. Organisational context
  7. Governance
  8. Tools and infrastructure

We took these 8 pillars and created top-level questions or outcomes around them:

1. Environment

  • Have we explained that Discovery is collaborative and research-led (ie it may not give them what they expected going in, such as an app)?
  • Do we know their expected outcomes?

2. Scope

  • Do we know the key deadlines and likely methods required (eg site visits, or whether remote interviews are ok)?
  • Can we summarise the problem?

3. Recruitment and Admin

  • Have we agreed how we will keep the product owner and stakeholders updated?
  • Do we know how we are going to recruit users? (they can give us a list, or know someone who can)

4. Data and Knowledge Management

  • Do we know how to store and share information [team agreed this already, not for the interview]

5. People

  • Have we got an idea of potential blockers?
  • Have we got a rough idea of RACI?
  • Are there similar projects we should contact?

6. Organisational context

  • What support is there for the submitting team?

7. Governance

  • Have we got our data governance in place? Can we protect interview data on SharePoint?

8. Tools and infrastructure

  • Are there any tools/sites the team will need to access?
  • Do we know who to contact and how to get access to these sites?

Once we had decided what we needed to achieve, we set about defining how we would answer these questions and validating our assumption that this was the right approach.

Read more in Part 2: our principles and methods for a First Look.
