I've been thinking a lot lately about how organizations choose and evaluate software. The Iowa Democratic Party chose Shadow Inc. to work on the IowaReporterApp, and it is clear that there were missteps in the testing and delivery of the product.
First, the timeline was too aggressive for the given budget. Even a simple application requires adequate time for testing and operationalizing. For a one-and-done event like this, load testing during a mock session was an absolute requirement.
Second, there was inadequate training on how to use the app. The caucus chairs should have installed the app and been trained on its usage weeks before the caucus vote. In the case of the IowaReporterApp, the clumsy install process would have been a red flag to decision makers, warning of the impending disaster.
This is why it is so important to keep options open after choosing a software vendor. Insist on prototypes, and get them into the hands of end users. Be on the lookout for warning signs, and be prepared to pivot, if necessary.
You won’t find the IowaReporterApp in the App Store, and even if you could, its rating would likely be too low for anyone to want to download it. The software played a key role in this week’s Iowa caucus debacle, and the more we learn about it, the more it seems like it never really had a chance in the first place.
It was meant to make reporting results from the caucus simpler, quicker, and more transparent, but it did the opposite on each count. Shadow, the company responsible for the app, reportedly had roughly two months to build it and charged roughly $60,000 for the work. Those numbers throw up an immediate warning signal for those who make this kind of software for a living.
“One of the problems with software in the past decade is that many clients don’t place enough importance on discovery and planning of an app,” says Saptarshi Halder, COO at the app development firm Unified Infotech. Even with a relatively simple app like IowaReporterApp, which mostly included text fields and a simple media upload functionality to transmit photos of the completed result sheets, he says the planning process—that is, before the developers even start actually building the app—should take a month, minimum. “You have to consider your screens, the fields, the necessary output—it takes time,” he adds.
David Cohen, the CEO of the New York-based mobile development firm Utility, agrees that even the planning process could span months for something this critical.
The actual coding portion of an app like the IowaReporterApp isn’t very complex since it seemingly doesn’t require much advanced functionality. Halder suggests two months could be a feasible amount of time for putting the actual code together on something like this. But, unfortunately, that wouldn’t leave much time at all for testing.
No matter how good the performance seems in an app, variables invariably pop up. Chad Wolf, Homeland Security Secretary, told Fox that the department offered to help test the app, but they declined (although the DNC denies it). Keeping the developer’s identity secret was reportedly part of the plan to keep hackers from finding and trying to break the app, so bringing in outside firms—even governmental organizations—was unlikely.
Testing an app isn’t as simple as making sure it works as it should in a closed environment. As Cohen points out, IowaReporterApp didn’t necessarily fail outright, but reports claim it only provided partial data in its results, which is a clear sign that something related to the data transmission was being throttled, either on the app side or in the cloud on the server side of the infrastructure. “We don’t know if it captured all the data and only partially transmitted it,” he says. It’s the kind of thing that can show up in a real-world scenario after clearing the tests.
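A common guard against exactly this failure mode, silently reporting partial data, is per-record acknowledgement with retries: the client keeps resending any record the server never confirmed, and anything still unconfirmed is surfaced as missing rather than quietly dropped. The sketch below is purely illustrative; the function and record names are invented and nothing here comes from Shadow's actual code.

```python
def transmit_results(records, send, max_retries=3):
    """Send each record and collect per-record acknowledgements.

    `send` is a callable that returns True when the server acknowledges
    receipt. Records that never get an ack are returned so the caller
    can flag them as NOT reported instead of assuming success.
    """
    unacked = list(records)
    for _ in range(max_retries):
        still_missing = [r for r in unacked if not send(r)]
        if not still_missing:
            return []          # every record confirmed by the server
        unacked = still_missing
    return unacked             # caller must treat these as missing data

# Simulated flaky transport: drops every third call.
calls = {"n": 0}
def flaky_send(record):
    calls["n"] += 1
    return calls["n"] % 3 != 0

missing = transmit_results(["precinct-1", "precinct-2", "precinct-3"], flaky_send)
```

With this structure, a throttled connection produces a visible list of unreported precincts rather than a results page that merely looks complete.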
Some reports claim that volunteers encountered frequent error messages while trying to accomplish the last step of the process, which required them to upload a photo of the completed sheets. “Even a simple media upload needs extensive testing,” says Cohen. “You have to decide how and how much you’re going to prioritize compressing the media files on the phone before transmitting them to the server, especially when considering everyone submitting results at the same time. Those raw photo files we take with our phones are much larger than you think.”
Halder had similar concerns about the photo upload functionality. “Apps can crash because of excessive memory usage,” he says. “Depending on the device or the configuration, it can cause an error.” He says that the testing phase would add at least another month to the total build time of the app before it could meet its acceptance criteria.
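Cohen’s compression point and Halder’s memory point both come down to bounding data size on the device before it ever hits the network. A minimal sketch of that idea, with all numbers and names invented for illustration rather than taken from IowaReporterApp:

```python
import math

def downscale_factor(width, height, bytes_per_pixel, max_bytes):
    """Scale factor (<= 1.0) so a decoded image fits a memory budget."""
    raw = width * height * bytes_per_pixel
    if raw <= max_bytes:
        return 1.0
    # Pixel count scales with the square of the side length.
    return math.sqrt(max_bytes / raw)

def upload_in_chunks(data, send_chunk, chunk_size=256 * 1024):
    """Send bytes in fixed-size chunks so peak memory stays bounded."""
    for offset in range(0, len(data), chunk_size):
        send_chunk(data[offset:offset + chunk_size])

# A 12-megapixel result-sheet photo at 4 bytes per pixel is ~48 MB decoded;
# a 12 MB budget means scaling each side to roughly half.
factor = downscale_factor(4000, 3000, 4, 12 * 1024 * 1024)

chunks = []
upload_in_chunks(b"x" * 1_000_000, chunks.append)
```

Real mobile apps would lean on platform image APIs for the resizing itself, but the policy decisions, how much to shrink and how to stream the result, are exactly what a testing phase is supposed to exercise under load.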
With just around 1,700 total users at its peak, the IowaReporterApp had a relatively small user pool, but testing its ability to perform under the load of operation at scale is crucial. External applications like Apache JMeter, LoadRunner, and IBM’s Rational Performance Tester simulate real-world operating conditions to see if an app will function as it should with actual users inputting data. They’re capable of simulating hundreds of thousands, or even millions, of users. It’s unclear to what extent Shadow did this kind of testing, but both Cohen and Halder suggested a month of testing would only satisfy the bare minimum.
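Tools like JMeter do this at industrial scale, but the core idea can be shown with a toy simulation: fire many concurrent submissions at a backend with a fixed capacity and count how many get rejected. The capacity model below is a hypothetical stand-in, not a description of Shadow's infrastructure.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

class CapacityLimitedServer:
    """Toy backend that rejects requests once `capacity` are in flight."""
    def __init__(self, capacity):
        self._slots = threading.Semaphore(capacity)

    def handle(self, payload):
        if not self._slots.acquire(blocking=False):
            return "error"      # throttled: no free slot
        try:
            time.sleep(0.002)   # pretend to do some work
            return "ok"
        finally:
            self._slots.release()

def failed_submissions(server, n_users, concurrency=50):
    """Fire n_users submissions with `concurrency` workers; count rejections."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return sum(r == "error" for r in pool.map(server.handle, range(n_users)))

# With headroom (capacity >= concurrency), nothing should be rejected;
# shrink the server's capacity and rejections start to appear.
clean = failed_submissions(CapacityLimitedServer(50), 200)
```

A real load test would also vary network conditions and payload sizes, which is part of why the experts peg a month of testing as the bare minimum.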
Typically, once the testing is over, companies submit their apps through official channels like the App Store or the Google Play Store for approval and official distribution. It’s a process that can take days, or in some cases weeks, and it makes distributing updates time-consuming as developers wait for approval.
In the case of the IowaReporterApp, Shadow eschewed this process by using enterprise environments that are typically meant for testing and definitely shouldn’t serve as the operating platform for a crucial, real-world app. According to a Vice report, Shadow used TestFairy to distribute the Android app. Similar to TestFlight for iOS, it’s designed to allow users to run preproduction versions of an application in a controlled environment on their devices.
Using platforms like TestFairy or TestFlight adds a layer of complexity and variability to the entire process. Reports widely claim that users had a hard time even downloading the app, with some estimates claiming only roughly a quarter of the volunteers successfully downloaded IowaReporterApp in the first place.
As for the budget for this kind of project, it’s hard to estimate exactly how much it should have cost because we don’t know the scope of the work or really who did it. According to Halder, the $60,000 figure doesn’t make much sense at all if the work was completed domestically within the United States. He says the budget makes more sense if the work was outsourced to a country like India, where the average hourly rate can be much lower, at $30 or $40 an hour. FEC filings show that Nevada spent $58,000 in anticipation of using Shadow’s app, but it’s unclear how much revenue different versions of the app would have brought in total if everything had gone according to plan.
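Halder's reasoning can be sanity-checked with back-of-the-envelope arithmetic. The hourly rates below are illustrative assumptions, not quotes from either firm:

```python
# Back-of-the-envelope check on the $60,000 figure.
budget = 60_000

us_rate = 150        # assumed blended US agency rate, dollars per hour
offshore_rate = 35   # assumed offshore rate, within the $30-40 Halder cites

us_hours = budget / us_rate              # 400 hours
offshore_hours = budget / offshore_rate  # about 1,714 hours

# At roughly 160 working hours per person-month:
us_person_months = us_hours / 160              # 2.5 person-months
offshore_person_months = offshore_hours / 160  # about 10.7 person-months
```

At the assumed US rate, $60,000 buys about two and a half person-months, well short of the month of planning, two months of building, and month of testing the experts describe; at the offshore rates Halder mentions, the same budget could roughly cover that timeline.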
Even without knowing much about the backend operation of the app, Cohen similarly suggested that a budget for something like this should be well into the six figures to accommodate the kind of planning and very thorough testing something like this requires.
All of this lives within the planning and development process, which doesn’t even consider the added effort and expense involved with training people to actually use the app in the first place, or supporting users who have problems after it rolls out. Reports claim that support lines for IowaReporterApp became “overwhelmed” when things started to go wrong because there was insufficient support available to help.
While we may never know exactly what was happening within and behind the IowaReporterApp, we do know that the upcoming Nevada caucus won’t be using Shadow’s app as it had originally planned. It’s unclear what the state plans to use instead, but with the event happening on February 22nd, that doesn’t leave much time to spin up and test another option. If that means recording the results the old-fashioned way, it may seem like a small price to pay in light of the IowaReporterApp’s meltdown and the ensuing wave of bad press.
This article originally appeared on Popular Science