"What happens when our agency uses three procurement methods in one project?"

The officer asking was an hour into the second day of training. The portal we were running her through assumed each project would behave like a textbook example: one method, one contract pathway, one clean lifecycle. Her actual project did not. Her question landed in the room, and within twenty minutes I was taking notes faster than I was teaching. That single edge case exposed a design assumption I had carried for over a year, and it did so in the room where it was supposed to be defended.

The 8th and 9th of April, Kampala. 143 procurement officers from 59 Ugandan public bodies sat through two days of training on the Government Procurement Portal. The portal had been live since February. The training was not the launch. It was the test of whether the launch had landed.

I had been working toward those two days for over a year. April was the training. July is the test, when the next quarterly reporting cycle hits and I find out which of the 143 are still using what we built. What I learned in the two days changed how I will design the next rollout.

1. Training is the moment ownership transfers

What I assumed: training is a knowledge transfer event. Walk officers through the portal, show them the workflows, hand over documentation, validate they can complete the core tasks. Done.

What actually happened: training is the moment when ownership transfers from the build team to the institution, or it does not. Ownership does not transfer because someone watches a demo. It transfers when officers find the system's edges, push against them, and discover the system holds.

The ownership distinction matters for how I write about this. The Government Procurement Portal is not mine. It belongs to the Public Procurement and Disposal of Public Assets Authority and, through it, to the Government of Uganda. I was contracted through CoST, the Infrastructure Transparency Initiative, to help build it. The transfer the training tested is not from me to PPDA; it is from the build team (CoST plus our technical partners) to PPDA's officers, the agencies that publish through them, and the public the disclosure is meant to serve. Calling the system mine would misdescribe the chain of accountability the disclosure regime depends on.

Of those 143 officers, 51% were women. They came from 59 organisations including the Office of the Prime Minister, Kampala Capital City Authority, the Ministry of Finance, the Uganda Electricity Transmission Company, the Uganda Revenue Authority, and the National Environment Management Authority. That mix matters. A roomful of officers from a single agency is a closed loop. A roomful from 59 is a stress test. Every workflow assumption the build team had baked in over the previous year got challenged within the first morning, by people who would actually have to use the portal.

The questions I expected: where is field X, how do I publish, what happens if I miss a deadline. The questions I did not expect: what happens when our agency uses three procurement methods in one project, why does the contract value field reject our internal currency code, who gets notified when I make a correction after publication. None of these were obvious. Each one was a real workflow that the portal needed to handle and did not handle gracefully.
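The three-methods question traces back to a single schema decision: where the procurement method lives. A minimal sketch of the fix, with illustrative names that are not the portal's actual schema, is to attach the method to each contracting process rather than to the project:

```python
from dataclasses import dataclass, field
from enum import Enum

class ProcurementMethod(Enum):
    OPEN = "open"
    SELECTIVE = "selective"
    DIRECT = "direct"

@dataclass
class ContractingProcess:
    title: str
    method: ProcurementMethod  # the method lives here, not on the project

@dataclass
class Project:
    name: str
    processes: list[ContractingProcess] = field(default_factory=list)

    def methods_used(self) -> set[ProcurementMethod]:
        return {p.method for p in self.processes}

# One project can legitimately span three methods:
road = Project("Ring road upgrade", [
    ContractingProcess("Design study", ProcurementMethod.DIRECT),
    ContractingProcess("Civil works", ProcurementMethod.OPEN),
    ContractingProcess("Supervision", ProcurementMethod.SELECTIVE),
])
assert len(road.methods_used()) == 3
```

A portal that stores one method per project forces the officer to lie or to split one project into three; a portal that stores it per process absorbs the edge case without either.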

By the second afternoon I had stopped teaching and started reverse-engineering the product backlog from the room.

2. The cohort mix matters more than the headcount

The 51% women statistic is what donors and oversight bodies look at first. It is also the easiest number to misread.

What I observed in the room: officers closer to daily data entry asked questions that the senior decision-makers in the same agency had stopped noticing. They flagged friction in field validation, audit-trail visibility for junior staff, workflow problems that only show up when you do not have political cover to skip steps. That observation is real. What I cannot tell you, because I did not measure it, is whether the active variable was gender, role, seniority, or central-versus-regional posting. These almost certainly correlate. A controlled comparison is what would isolate them, and I do not have one.

So the honest claim is narrower than the headline. Diversity of role and agency is the variable I can defend. Gender plausibly correlates with role and seniority in Ugandan public-sector procurement, which is its own well-documented pattern, but I am not in a position to claim a direct causal relationship from a single training cohort.

The practical consequence is the same in both readings. Train only the senior officers the system was designed for and the system never gets the feedback it needs. Train the people the system was not designed for and you find out what it actually does when it is not being held up by its designers. Whether you optimise for that diversity through gender targets, role mix, or agency spread, the active design choice is the same. Do not staff training cohorts from the audience the launch event was performed for.

3. The boring questions are the survival questions

The interesting training questions, the ones that demo well, are about novel features. Reporting, analytics, cross-project search, the things that distinguish a modern disclosure portal from a 2010s PDF dump.

The questions that predict survival are the boring ones. What happens when a contract gets cancelled mid-procurement and a new one replaces it. How do I correct a typo three months after publication. What happens when an officer leaves and their drafts need to transfer. Who can see this before I publish, who can see it after, and how do I tell the difference.

None of these are exciting. None of them get into a launch press release. All of them are the questions an officer asks when they are imagining themselves using the system on a Tuesday afternoon when nothing is going right. If you cannot answer them on day one of training, you have built a system for the launch event, not for the work.
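Most of the boring questions have the same structural answer: corrections append, they never overwrite. A sketch of that pattern, with hypothetical names rather than the portal's real data model, shows how "fix a typo three months later" and "who changed what" become the same feature:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Revision:
    data: dict
    author: str
    note: str
    at: datetime

class DisclosedRecord:
    """Append-only history: corrections add revisions, never overwrite."""

    def __init__(self, data: dict, author: str):
        self._revisions = [Revision(data, author, "initial entry",
                                    datetime.now(timezone.utc))]
        self.published = False

    def correct(self, changes: dict, author: str, note: str) -> None:
        # Each correction copies the latest state and applies the change.
        latest = {**self._revisions[-1].data, **changes}
        self._revisions.append(Revision(latest, author, note,
                                        datetime.now(timezone.utc)))

    @property
    def current(self) -> dict:
        return self._revisions[-1].data

    def audit_trail(self) -> list[str]:
        return [f"{r.at:%Y-%m-%d} {r.author}: {r.note}"
                for r in self._revisions]

rec = DisclosedRecord({"title": "Water mains", "value": 120_000}, "officer_a")
rec.published = True
rec.correct({"value": 125_000}, "officer_b", "typo in contract value")
assert rec.current["value"] == 125_000
assert len(rec.audit_trail()) == 2
```

The design choice is that publication is a flag on the record, not the end of its history, so a regional officer can correct a live entry without phoning anyone.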

The 395 projects that were live in the portal as of training day were not the test. The test was whether an officer in a regional office could correct one of those projects three weeks later without phoning Kampala. We had built that capability. We had not built the documentation that explained how to use it.

By the end of the second day, the documentation existed because the officers had written it. Not metaphorically. Two officers from KCCA started a shared draft mid-session, capturing the steps for a workflow we had not formally documented because we had assumed it would be obvious. By lunchtime on day two, four other agency representatives had added their variants, and one had photographed the resulting whiteboard sketches. We took that whiteboard back to the office on the second night and rebuilt three pages of the official portal handbook around it. The pages now in the live training pack come from those two days. The author credit on those pages is not mine. That is what ownership transfer looks like, on the day it actually happens.

4. Three validations that do different jobs

Most teams I have worked with treat regulator sign-off as launch. It is not. Regulator sign-off is one of three validations a transparency portal needs, and conflating them is what turns a working system into a digital monument.

PPDA validated the portal on the 13th of February. That cleared the political path for adoption. Regulator validation says the system meets the standards it was meant to meet. It does not say anyone will use it.

The 143 officers in April performed the second validation. User validation says officers can operate the system under real workflow pressure, including pressure from edge cases the design did not anticipate. This is the validation that determines adoption, and it cannot be done by the team that built the system, because the team that built it has internalised what the system is meant to do.

The OC4IDS quality assurance round that closed on the 21st and 22nd of April was the third validation. Data validation says the data inside the portal can be trusted to comparison standards across CoST member countries. This is the validation that determines whether the disclosed data is interoperable, which determines whether downstream tools and oversight bodies can use it.
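In practice a QA round like that automates structural checks against the published schema. The sketch below shows the shape of such a check; the field names are illustrative and simplified, not the authoritative OC4IDS schema, which defines far more than this:

```python
# Minimal structural check in the spirit of an OC4IDS QA pass.
# REQUIRED is a toy subset; the real standard defines the full field set.
REQUIRED = {"id": str, "title": str, "status": str}

def qa_errors(project: dict) -> list[str]:
    """Return a list of human-readable validation errors (empty = clean)."""
    errors = []
    for name, expected_type in REQUIRED.items():
        if name not in project:
            errors.append(f"missing required field: {name}")
        elif not isinstance(project[name], expected_type):
            errors.append(f"{name}: expected {expected_type.__name__}")
    for proc in project.get("contractingProcesses", []):
        if "id" not in proc:
            errors.append("contracting process missing id")
    return errors

ok = {"id": "ug-001", "title": "Ring road", "status": "implementation",
      "contractingProcesses": [{"id": "cp-1"}]}
bad = {"title": 42}

assert qa_errors(ok) == []
assert "missing required field: id" in qa_errors(bad)
assert "title: expected str" in qa_errors(bad)
```

The point of running this per member country is that a clean pass means downstream comparison tools can consume the data without country-specific patching.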

Three different validations. Three different audiences. None of them substitutes for the others. The temptation in every rollout I have worked on is to declare victory after the first validation and let the other two slide. That is how systems become monuments. If you ship a transparency portal and only one of the three has happened, you have shipped a draft. If only two have happened, you have shipped a beta. Three validations, in any order, is what shipping looks like.

5. What I would do differently next time

The biggest design failure I caught during the two days was assuming officers would learn the portal in the order I had structured the training. They did not. They learned it in the order their actual workflows demanded. Some skipped to reporting before they had published a single project, because their boss wanted a portfolio summary by Friday. Some refused to touch reporting until they were confident their entries were clean.

The next training will start by asking each officer what they need to produce in the next 30 days. The training will then route through whatever portal features serve that production. The system has to meet the user where they are, not march them through a logical sequence the designer found pleasing.

I would also build training as a recurring rhythm, not a single event. The 143 officers who came in April will face new questions in July when their first quarterly reporting cycle hits. Most rollouts I have seen treat training as a budget line that ends at launch. Sustained adoption requires sustained training, in shrinking doses, indefinitely. Budget for it on day zero or accept that adoption will degrade.

And I would document everything that broke. The 59 organisations that came through brought 59 versions of how Ugandan procurement actually works. Every collision with my workflow assumptions is a future feature, a future training module, or a future redesign. None of that is captured if you treat training as a checkbox.

6. Where this lesson does and does not transfer

Uganda's 59-agency rollout is one shape of this problem. The user base is institutionally fragmented. Each agency has its own procurement culture, its own internal sign-off chain, its own relationship to the central regulator. The training problem is to reach all 59 with a system that bends to each without losing comparability. The 143-officer cohort approach worked here because the institutional shape demanded it. A different shape would have demanded something else.

Other CoST member implementations sit on different shapes. Some run through a single anchor institution that publishes most of the disclosed projects, with a smaller set of contributing entities. Some are sector-specific, building from a road authority or a water utility outward. Some are subnational, with one state or city programme as the unit. I have worked on more than one of these. I am not going to invent confident generalisations about other countries' institutional shapes from the outside, because the difference between knowing a country's procurement culture and knowing of it is exactly the kind of imported assumption this article is arguing against.

What does travel is the diagnostic question. Before you decide what training looks like, you have to name the institutional shape the system actually has to land in. A 143-officer cross-agency cohort works for fragmented user bases. A workflow-integration sprint works for single-anchor implementations. A sector-deep training programme works for sector-specific portals. The format follows the shape. The shape is a fact about the country, not a choice the donor or the build team gets to make.

If you are scoping a transparency portal in 2026 anywhere on the CoST member registry, the first design question is not which OC4IDS fields to publish. It is what shape your user base is, and what training would have to look like to transfer ownership to that specific shape.

7. The July test

The 143 officers I worked with in April are now back in their agencies. Some are uploading regularly. Some have not logged in since.

The next reporting cycle will tell me which is which. I will know in July whether what I built can survive without me in the room. The training was the bet that it could. The next three months are the result. If I find that fewer than half of the 143 are still active by the July cycle, the rollout failed regardless of the launch metrics. If three quarters are active and producing, the bet paid. If the answer is somewhere in between, the lesson is in the gap, and I will publish what I find.

That is the only metric worth measuring.

Cengkuru Michael is a data specialist at CoST, the Infrastructure Transparency Initiative. He has implemented infrastructure transparency portals in Uganda, Kaduna State, Malawi, Mozambique, and elsewhere across Africa. Project counts and validation dates referenced in this article are as of the training week, the 8th and 9th of April 2026.

