Finding a solution is the easy part

by A M Howcroft, SWARM CEO

Part 3 of 3

You’ve defined the problem and your data is clean and ready, so matching it to an appropriate solution should be easy, right? You may have the traditional ‘build or buy’ decision to make, but either approach should theoretically be straightforward. Yet statistics show many projects falter at this stage – in 2022 the Standish Group reported that only 4% of new software application projects are successful (with 49% failing outright), and only 30% of software package deployment projects succeed. All of which suggests there is a flaw in the process, and raises the question of whether there might be a better way to deliver solutions, with a much higher success rate.

I would argue that many projects fail because they don’t have a clear definition of the challenge at the outset, which we believe is critical – and was covered in the first part of this series. Let’s assume, though, that we have successfully navigated the challenge definition and gathering of appropriate, clean data - as we discussed in part two. What are the primary factors that still cause failure, and how could we avoid them with a different process?

Reasons to fail

There are so many possible ways that internal IT development projects can go wrong, even after we have a clear definition of the goals. Here are some of the most common:

  • Inadequate project management

  • Scope creep

  • Lack of end-user involvement

  • Skills gap

  • Poor QA and/or testing strategies

  • Resource constraints

  • Technology issues

  • Organizational culture

Many of these problems are exacerbated when a new and potentially complex technology is involved – such as AI and machine learning, in which the internal development team may lack deep experience.

Selecting a package may seem like a better approach (and appears to have a higher success rate, based on the Standish Group analysis), but it often brings its own frustrations: a lengthy and expensive selection process, additional purchase, onboarding, and training costs, and no guarantee of success. Many organizations are frustrated that they must adapt their process to match that of the application, rather than the other way around.

In both cases, there is another issue: timing. Internal IT departments are often overwhelmed, whether by development or package selection, and significant backlogs can form, frequently amplified by project delays. We spoke to a major agrifood organization in 2022 that told us they were not looking at any new initiatives until they had cleared a backlog of 225 projects!

You can, of course, hire a Systems Integrator to do some of the work for you. This can be effective, but also expensive, and too regularly there is the question of who will maintain the application once the consultants have left the building.

A New Approach

These are not new problems, I hear you say. With the evolution of several key technologies, though, we believe a new approach is emerging which promises a much better success rate. Think through the next three questions. What if:

  • Your challenge definition was specific enough to find a precise solution?

  • These solutions were easily accessible in a public market?

  • The solutions could be assembled and delivered without any coding?

If that were the case, then the hard part would be defining the challenge correctly, not building the solution. Markets today are either very specialized (e.g. DataRobot, delivering algorithms for data scientists) or very generic (e.g. the Apple App Store). If I search the Apple App Store for project management tools, I find over a hundred. But if I need a project management tool to help build a tunnel under the Atlantic, one that supports real-time collaboration by 200+ users and can integrate with a specific CAD package, how do I find it?

We believe those qualification criteria should be built into the challenge specification, so that its metadata can be matched against the metadata of solutions in a ‘smart’ market – surfacing the 1-3 solutions that fit, instead of forcing you to wade through a hundred irrelevant options. There is a compelling opportunity for a market that offers solution components by doing a better job of mapping metadata from problem definitions to an effectively organized catalogue of solution metadata.
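To make the idea concrete, here is a minimal sketch of tag-based metadata matching. All names, tags, and catalogue entries are hypothetical – a real smart market would use a far richer challenge specification – but the principle is the same: rank candidates by how well their metadata covers the challenge’s requirements.

```python
# Hypothetical sketch: rank catalogue entries by the fraction of the
# challenge's metadata tags that each solution's tags cover.

def match_solutions(challenge_tags, catalogue, top_n=3):
    """Return up to top_n catalogue entries, best coverage first."""
    required = set(challenge_tags)
    scored = []
    for name, tags in catalogue.items():
        coverage = len(required & set(tags)) / len(required)
        if coverage > 0:
            scored.append((coverage, name))
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_n]]

# The tunnel-under-the-Atlantic example from above, as metadata tags:
challenge = ["project-management", "real-time", "200-plus-users", "cad-integration"]

catalogue = {  # illustrative entries, not real products
    "GenericTaskApp": ["project-management"],
    "MegaBuildPM": ["project-management", "real-time",
                    "200-plus-users", "cad-integration"],
    "SoloPlanner": ["personal-todo"],
}

print(match_solutions(challenge, catalogue))
```

Only the two solutions with any overlap are returned, with the full match ranked first – the dozens of irrelevant options never reach the user.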

These solutions would also need to be structured so that they could interoperate, perhaps by sharing a common cloud infrastructure such as Microsoft Azure or AWS, along with common interfaces such as RESTful APIs.

When will the new approach arrive?

At SWARM Engineering, we’ve already built a tool that captures business users’ requirements in a standard way and makes the metadata available for a machine learning layer to use. We are passionate about helping people solve their operational business problems and have made the SWARM Challenge Modeler available at no charge. Of course, this has the added benefit of letting us understand the types of challenges that organizations in agrifood most commonly face (NB: we see the metadata, but the detailed information remains private to each user).

We also have a SWARM Solution Engine that makes use of this challenge metadata and can already deliver many typical agrifood supply chain solutions without any coding required, in areas such as labor planning, transport management, supply/demand allocation, and optimization of inbound and outbound logistics.

Which means that, in a small way, the new approach is already here – but there is much more that could be achieved. We are doing this for agrifood, and even within that market we only solve a percentage of the potential challenges any organization will face. There is still more work to be done on capturing knowledge metadata for industries and on making more application solutions ‘composable’ as smaller pieces of a broader solution.

The good news is that we are working with other vendors towards industry wide solutions and are certainly not the only people that share a compelling vision of a smart market. Watch this space for more information on that in future blog posts.

The future is now

Given the rapid advancement of large language models such as ChatGPT, the rise of no-code, and the proliferation of powerful SaaS platforms, it seems inevitable that software development will soon take a new path. For some industries, that path has already opened up and the future is here now.

As building (or ‘composing’) applications becomes simpler, the focus will shift back to accurately and clearly defining the problem. Which takes us full circle to Einstein, where we started this three-part blog post. Einstein’s advice to anyone wanting to save the world in an hour was to spend the first 55 minutes defining the problem. It was good advice. If you haven’t yet set up a center of excellence that can get better at identifying and defining challenges, now might be a good time to start. In an upcoming blog post, I’ll give you a few hints on how to do that successfully. Remember, in the words of the sci-fi novelist William Gibson:

“The future is already here, it’s just not evenly distributed.”
