Analysis of Xojo’s Bug Tracking System

Recently, we invested a significant amount of VC funding to conduct a thorough analysis of our core business operations. We engaged a well-regarded national consulting firm with expertise in software development to assist in reviewing our efforts. Since we had some “use it or lose it” funds available in the budget, we asked them to also analyze the current “bug” situation in Xojo.

The only reliable data they could access was the issue database. To my knowledge, there are no additional internal tools Xojo uses to track the broader bug landscape. For this analysis, we specifically focused on open issues and did not include feature requests.

Below is a summary of their findings. I share this with the genuine intention of being helpful, as Xojo’s success directly contributes to ours.

=========

This report examines Xojo’s bug tracking system, focusing exclusively on non-feature request issues, and provides actionable insights to enhance software quality. While it’s often said, “All software has bugs,” treating this notion as an excuse for inefficiency is naive and detrimental. Bugs are an expected part of software development, but a mature process and a disciplined approach can minimize their impact and maintain customer trust.

Key Findings

  1. Volume of Non-Feature Request Bugs

    • Xojo: 2,377 open “non-feature request” bugs.
    • Industry Benchmark: Medium-sized software companies typically have 1,000-1,500 open bugs.
    • Assessment: Xojo’s backlog is well above average, signaling higher-than-acceptable technical debt. Bugs are an industry reality, but this volume suggests inefficiency, not inevitability.

  2. Aging of Bugs

    • Xojo:
      • Over 2 Years Old: 56.6% (1,346 bugs).
      • 1-2 Years Old: 21.8% (518 bugs).
      • Less than 90 Days Old: 7.3% (174 bugs).
    • Industry Benchmark:
      • Over 2 Years Old: 20-30%.
      • Less than 90 Days Old: 20-30%.
    • Assessment: Bugs lingering for over two years reflect poor prioritization. While all software may have bugs, sustained neglect only erodes product quality. (A sketch for recomputing these aging figures from the public tracker follows these findings.)

  3. Average Bug Age

    • Xojo: 1,021 days (2.8 years).
    • Industry Benchmark: 400-600 days (1.1-1.6 years).
    • Assessment: At nearly double the benchmark, Xojo’s average bug age highlights significant room for improvement.

  4. Bug Update Frequency

    • Xojo: Average time between updates is 732 days (2 years), with 1,563 bugs receiving multiple updates without resolution.
    • Industry Benchmark: Updates typically occur every 60-90 days with steady progress.
    • Assessment: The extended update intervals and lack of closure suggest systemic inefficiencies.

  5. Oldest Non-Feature Bug

    • Xojo: 5,791 days (15.86 years, created January 24, 2009).
    • Industry Benchmark: Rarely exceeds 5 years.
    • Assessment: Allowing a bug to persist for over 15 years is not reflective of the claim that “all software has bugs” but rather of inadequate resource allocation.
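
For reproducibility, the aging figures in these findings can be recomputed directly from the public tracker. The sketch below is illustrative only, not the firm’s actual methodology: it assumes tracker.xojo.com exposes the standard GitLab issues API (its issue URLs follow GitLab’s scheme), that the project path is xojoinc/xojo with anonymous read access, and that feature requests are excluded via a label whose exact name is a guess.

```python
# Recompute the age distribution of open, non-feature-request issues.
# Assumptions (unverified): standard GitLab API, project path
# xojoinc/xojo, anonymous reads allowed, and a "Feature Request" label
# (the real label name may differ).
from datetime import datetime, timezone
import requests

API = "https://tracker.xojo.com/api/v4/projects/xojoinc%2Fxojo/issues"

def fetch_open_issues():
    issues, page = [], 1
    while True:
        resp = requests.get(API, params={
            "state": "opened",
            "not[labels]": "Feature Request",  # hypothetical label name
            "per_page": 100,
            "page": page,
        })
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return issues
        issues.extend(batch)
        page += 1

def age_in_days(issue):
    created = datetime.fromisoformat(issue["created_at"].replace("Z", "+00:00"))
    return (datetime.now(timezone.utc) - created).days

ages = [age_in_days(i) for i in fetch_open_issues()]
total = len(ages)
assert total, "no issues returned"
print(f"open non-feature-request issues: {total}")
print(f"over 2 years old:  {sum(a > 730 for a in ages) / total:.1%}")
print(f"1-2 years old:     {sum(365 < a <= 730 for a in ages) / total:.1%}")
print(f"under 90 days old: {sum(a < 90 for a in ages) / total:.1%}")
print(f"average age: {sum(ages) / total:.0f} days")
```

Buckets here use 365-day years, so small rounding differences against the percentages above are expected.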

Debunking the Myth: “All Software Has Bugs”

While true in a literal sense, this phrase often becomes a shield for complacency. The presence of bugs is not an excuse for tolerating inefficiencies. Industry leaders demonstrate that with the right processes, bugs can be managed proactively and resolved efficiently. Xojo’s current practices can be optimized to ensure that bugs are not just identified but systematically addressed in a timely manner.

Strategic Recommendations

  1. Immediate Actions

    • Prioritize Older Bugs:
    • Dedicate resources to resolving the 1,346 bugs over two years old. Focus first on those with the greatest customer impact.
    • Establish a Triage Process:
    • Categorize bugs by severity and customer impact for targeted action (an illustrative scoring sketch follows these recommendations).

  2. Mid-Term Optimization

    • Create a Dedicated Resolution Team:
    • Assign a team to systematically address long-standing issues using agile principles.
    • Enhance Accountability:
    • Ensure all bugs are assigned with clear ownership and deadlines for resolution.

  3. Long-Term Process Improvements

    • Automate Tracking:
    • Adopt real-time bug tracking tools to provide visibility and accountability.
    • Set KPIs for Bug Resolution:
    • Target an average resolution time of 400-600 days and resolve 70% of new bugs within 90 days.
    • Balance Resources:
    • Allocate sufficient focus to bug resolution alongside new feature development.
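
As an illustration of the triage and prioritization ideas above, the scoring rule can be as simple as the toy sketch below. The severity categories and weights are arbitrary placeholders, not recommended values; any real scheme would be tuned to Xojo’s own classifications.

```python
# Toy triage score: rank open bugs by a blend of severity, customer
# impact (upvotes as a proxy), and age. All weights are placeholders.
from dataclasses import dataclass

SEVERITY_WEIGHT = {"crash": 5, "data-loss": 5, "regression": 4,
                   "incorrect-behavior": 3, "cosmetic": 1}

@dataclass
class Bug:
    title: str
    severity: str   # one of the SEVERITY_WEIGHT keys
    upvotes: int    # proxy for customer impact
    age_days: int

def triage_score(bug: Bug) -> float:
    # More severe, more upvoted, and older bugs float to the top.
    return (SEVERITY_WEIGHT.get(bug.severity, 2) * (1 + bug.upvotes)
            + bug.age_days / 365)

# Hypothetical backlog entries for demonstration.
backlog = [
    Bug("window fails to redraw", "cosmetic", 0, 2100),
    Bug("IDE crash on save", "crash", 3, 40),
    Bug("listbox regression", "regression", 12, 800),
]
for bug in sorted(backlog, key=triage_score, reverse=True):
    print(f"{triage_score(bug):6.1f}  {bug.title}")
```

The specific numbers matter less than the fact that a written-down, mechanical ranking makes prioritization consistent and auditable.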

Projected Outcomes

Through disciplined execution, Xojo can transform its bug management approach:
• Reduce the backlog of aged bugs, restoring product stability.
• Shorten resolution times to align with industry standards.
• Build customer trust by demonstrating a commitment to quality.

Conclusion

The sentiment that “All software has bugs” is not a justification for inaction but a call to excel in managing them. By addressing inefficiencies, Xojo can exceed industry benchmarks, deliver a more reliable product, and elevate customer satisfaction.

17 Likes

@Underwriters_Technologies Nice work and good use of funds to help improve our favorite development tool.

6 Likes

This please.

Setting a specific KPI as a target is far more effective than a vague statement like “All software has bugs.”

1 Like

There’s a LOT being claimed here and not a lot of data or explanation of their methodology. For example, I would like to know that they are comparing like products; it would not be fair, for example, to be comparing the bug load of Xojo to, say, Photoshop or Chrome. They serve radically different userbases and the bug reporting patterns are going to be wildly divergent.

Secondly, this report suffers from Overgeneral Corporatespeak Syndrome - “industry leaders” and “industry benchmarks” and the like. It’s designed in such a way to imply that all industries work the same, have the same problem, and can be fixed in the same way. This isn’t even true of two cars of the same year, make, and model - why would it be the same between any two unrelated organizations?

Thirdly, it’s very difficult to take this kind of analysis seriously when they don’t have access to Xojo’s internal data or processes. I’m chuckling at the very serious sounding “Strategic Recommendations” section, which dutifully regurgitates a good portion of what we publicly know about Xojo’s quality control process. I guess they must be doing something right.

Fourthly, the report deploys a fantastic example of a strawman argument in the repeated debunking of the “myth” that “all software has bugs”. Those of us in the software industry know that this is not a justification for inaction, as the report snidely implies; it is an acknowledgement of the complexity of the work of creating software, and a gentle reminder that perfection is not the goal.

Frankly, I think you could have better spent your use-it-or-lose-it funds enjoying a nice pizza lunch (or possibly a trip to somewhere warm) than paying these consultants. The report they produced is meant to make you feel good for having spent the money paying them to create it; it contains no insight that your organization can use, no actions your organization can take.

13 Likes

The analysis is undoubtedly incomplete. I would welcome Xojo opening up and seeking some sort of outside help on this.

Who among us has not heard Xojo’s answer to this concern: “all software has bugs”?

2 Likes

Statistics are of course interesting though as @Eric_Williams pointed out, they can be misleading and sometimes not relevant to any given situation.

I blogged about how we decide what goes into any given Xojo release. That post continues to reflect the logic we apply to setting our priorities.

4 Likes

We agree on this at least; perhaps not the magnitude, though.

Citation required.

3 Likes

The blog post lacks any meaningful measurement of success. This isn’t about whether bugs are being chosen correctly; it’s about the process itself being a mess.

The volume of duplicates, old bugs, and multiple reclassifications clearly shows that the process—if it exists at all—is fundamentally broken. Without basic KPIs, like the percentage of bugs closed in under 90 days or how that compares to last quarter, there’s no way to track or measure progress in addressing the issues.

Without a way to quantify progress, there’s no basis for judging success or failure.
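
To make that concrete, here is a rough sketch of the kind of KPI I mean: of the bugs created in a given quarter, what fraction were closed within 90 days, and how does that compare quarter over quarter? It assumes the tracker is a standard GitLab instance with the xojoinc/xojo project path and anonymous read access; the quarter boundaries are only examples, and filters for excluding feature requests are omitted.

```python
# Closure-rate KPI sketch: % of issues from a quarter's cohort that
# were closed within 90 days of creation. Assumes a standard GitLab
# API at tracker.xojo.com.
from datetime import datetime, timedelta, timezone
import requests

API = "https://tracker.xojo.com/api/v4/projects/xojoinc%2Fxojo/issues"

def parse(ts):
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def closed_within_90_days(created_after, created_before):
    """Return (closed_in_90, total) for issues created in the window."""
    closed_fast, total, page = 0, 0, 1
    while True:
        resp = requests.get(API, params={
            "created_after": created_after.isoformat(),
            "created_before": created_before.isoformat(),
            "per_page": 100, "page": page,
        })
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            return closed_fast, total
        for issue in batch:
            total += 1
            if issue["state"] == "closed" and issue.get("closed_at"):
                age = parse(issue["closed_at"]) - parse(issue["created_at"])
                if age <= timedelta(days=90):
                    closed_fast += 1
        page += 1

quarters = {  # example boundaries; use your own reporting periods
    "previous quarter": (datetime(2024, 7, 1, tzinfo=timezone.utc),
                         datetime(2024, 10, 1, tzinfo=timezone.utc)),
    "last quarter":     (datetime(2024, 10, 1, tzinfo=timezone.utc),
                         datetime(2025, 1, 1, tzinfo=timezone.utc)),
}
for name, (start, end) in quarters.items():
    fast, total = closed_within_90_days(start, end)
    if total:
        print(f"{name}: {fast}/{total} closed within 90 days ({fast / total:.0%})")
```

One caveat: the most recent cohort needs at least 90 days of history before its number means anything.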

Over 2 Years Old: 56.6% (1,346 bugs) is from your data, and it tells a pretty strong story.

1 Like

I think the problem here is the assumption that you can statistically determine success from bug reports. I am unconvinced that is possible. For example, let’s say we decided that having a very small number of open bug reports was our measure of success. The best way to achieve that would have been to only support desktop development, and better yet, only macOS development, since that’s where we started. That would keep the product surface area small and thus the number of bug reports low. The obvious problem with that is that our users want us to support Windows, Linux, the web, iOS and Android. So while we might have been able to achieve success when measured by the number of bugs reported, our success in terms of overall customer satisfaction would be low because we wouldn’t be responding to their requests.

When you sort open issues by popularity, you see clearly the message being sent by the users who care enough to send it. As my blog post points out, we don’t rely entirely on that criterion, but it certainly is an important one.

When any of us find a situation frustrating, what we wish for is to reduce it down to some simple explanation that, by its simplicity, can be easily addressed. In this case, it can’t be reduced down to a bug report statistic. We measure it instead by our customer satisfaction surveys, the number of new customers, and the number of renewing customers. Because at the end of the day, people vote with their wallets.

10 Likes

I always get a kick out of statements like these, because they attempt to reduce customer satisfaction down to a number.

Success is what you say it is; maybe you can reduce it to a number for your organization, but at my company we certainly do not. We look at things like: are we happy at our jobs? Do we meet our customers’ needs? Do we have good relationships with our customers, suppliers, and employees? How do we feel about the whole enterprise? Yes, we can look at financial performance as one measure of success that can be expressed in absolute numeric form, but it’s hardly the first place I look when I’m considering whether we are successful.

4 Likes

“I think the problem here is the assumption that you can statistically determine success from bug reports. I am unconvinced that is possible”

1 Like

I don’t think that statement says what you think it does.

I’m very happy with the current Xojo Issues system - so much better than in the past. Xojo should be commended for this. It’s modern and “industry standard” or even better, as far as I’m concerned. Compare, say, to Claris/FileMaker, which has a (poorly engineered) forum and no public bug tracker.

I’m less happy with the ongoing and repeated regressions we see, which I assume are due to a lack of unit testing. New features are developed - Good! They break existing features - Bad! Those regressions are sometimes caught in beta testing - Good! But sometimes are released anyway - Bad!

5 Likes

Eric, I want to clarify that I’m not trying to dismiss your perspective, but this is a discussion Geoff and I have been engaged in for years. Over that time, Geoff has consistently maintained that no measurement of success matters beyond “what our users want.” I mean no disrespect, but your input is out of sync with where this conversation has evolved. I’d like to focus on the core issues at hand without diverting back to points we already examined thoroughly years ago.

Over 2 Years Old: 56.6% (1,346 bugs); 1,563 bugs receiving multiple updates without resolution… this is telling you something, even if you are “unconvinced”.

What does a 15 year old bug tell you? It tells me that there is no one seriously trying to manage the bug data, or it would have been cleaned up 10 years ago.

1 Like

If this is important to you, perhaps your ongoing conversation should be conducted in private. You cannot expect everybody in this public forum to be conversant with your posts from years ago.

2 Likes

No, I think you have a fundamental misunderstanding of what he is saying here. He’s saying he doubts you can determine overall success from bug reports. If you go on to read his entire post, he clearly defines success in a much larger context - which includes bug statistics but is not defined by them. Their definition includes more direct measurements of customer satisfaction, such as renewals and survey results.

You are drilling into the success of the bug reporting and resolution process itself – a very different question. Or, at least I hope you don’t define success solely by this metric.

1 Like

I don’t think anyone is disagreeing that bugs are bad or that they should be addressed. But we should probably be clear that bug reports don’t necessarily translate one-to-one to bugs, as some are likely duplicates or aren’t bugs at all (I know I’ve put in my own fair share of reports on things that I thought were bugs but weren’t).

As was noted by Eric, the consultant using terms such as “industry standard” isn’t really helpful. And as many large vendors might not have a public bug database, it can be really hard to tell what is standard or not.

Additionally, I agree that having bugs that are years or decades old isn’t ideal, but I bet Apple has bugs in Radar that are even older. Microsoft, Google and others are potentially in the same situation as well.

At this point I personally think Robin does a great job validating, reproducing and tending to bugs in an extremely timely manner when they come in. Then there’s Xojobot, which is already attempting to optimize things by auto-closing certain bugs.

Maybe another direction this conversation can take is towards what is actionable and what can feasibly be done. If the idea is to optimize and clean up Xojo’s Bug Tracker, what are folks proposing to make this happen?

These aren’t my own proposals, but I know many folks have floated these ideas for quite some time now…

  1. Xojo Needs More Developers to Fix Bugs
    I think many folks would agree with this. But Xojo is a small team, so should Geoff substantially raise prices to fund more heads? And even if he does, my guess is that it might take years for someone to understand the code base well enough to squash many of the bugs in the tracker. Also, what cost is the user base willing to bear for this? FileMaker, LiveCode and 4D all recently increased their pricing substantially. Are folks willing to pay 1.5X or even 2X or more for some of these bugs to go away?

  2. Xojo Should Optimize Their Bug Tracking Practices
    Sort of tied into the prior bullet: should they have a PM who manages this directly? How would they fund that headcount? Or maybe optimizing could mean that bug reports need to meet more stringent criteria before they go into the system? Or should Xojobot be tightened up even more, so that if no one responds to a bug every quarter or so, it gets closed out? (See the sketch after this list.)
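
To illustrate what “tightened up” might look like, here is a rough sketch of a stale-bug policy. This is purely hypothetical (I have no idea how Xojobot actually works), and it assumes the tracker is a standard GitLab instance and that a bot account has an API token:

```python
# Hypothetical stale-bug policy (NOT how Xojobot actually works): warn
# on issues with no activity for 90 days, then close if nobody responds
# within 30 more days. Assumes a standard GitLab API and a bot token.
from datetime import datetime, timedelta, timezone
import requests

API = "https://tracker.xojo.com/api/v4/projects/xojoinc%2Fxojo"
HEADERS = {"PRIVATE-TOKEN": "bot-token-goes-here"}  # hypothetical bot account
WARN_AFTER = timedelta(days=90)
GRACE = timedelta(days=30)
STALE_LABEL = "stale"  # hypothetical label

def parse(ts):
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

now = datetime.now(timezone.utc)
page = 1
while True:
    resp = requests.get(f"{API}/issues", headers=HEADERS,
                        params={"state": "opened", "per_page": 100, "page": page})
    resp.raise_for_status()
    batch = resp.json()
    if not batch:
        break
    for issue in batch:
        idle = now - parse(issue["updated_at"])
        iid = issue["iid"]
        if STALE_LABEL in issue["labels"]:
            # A real bot would also clear the label when someone
            # responds; omitted here for brevity.
            if idle > GRACE:
                requests.put(f"{API}/issues/{iid}", headers=HEADERS,
                             params={"state_event": "close"}).raise_for_status()
        elif idle > WARN_AFTER:
            # First strike: ask the reporter to confirm, then label it.
            msg = ("Is this still reproducible in the current release? "
                   "If there is no response in 30 days, this issue will be closed.")
            requests.post(f"{API}/issues/{iid}/notes", headers=HEADERS,
                          json={"body": msg}).raise_for_status()
            requests.put(f"{API}/issues/{iid}", headers=HEADERS,
                         params={"add_labels": STALE_LABEL}).raise_for_status()
    page += 1
```

Whether that trade-off (fewer stale reports versus annoyed reporters) is worth it is exactly the kind of thing folks here would need to weigh in on.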

Looking at the consultant’s recommendations…

  1. Prioritize Older Bugs: Dedicate resources to resolving the 1,346 bugs over two years old. Focus first on those with the greatest customer impact.
    Yes, this could be doable, but Xojo is a small team with only so many folks. Doing just this chunk of work could take quarters or maybe even years. Should progress on the product simply stall so that they can work on just a segment of the overall bugs? Heck, for bugs this old, they might not even be worth the effort, as the product has evolved, technologies have changed and developers’ needs are potentially different today than at the time the bugs were created.
  2. Create a Dedicated Resolution Team: Assign a team to systematically address long-standing issues using agile principles.
    This is similar to the prior point and implies that the development team is large and has resources. Let’s say Xojo dedicates just 1 developer to this task. Which developer wants to be tagged to do nothing but bugs for the rest of their Xojo career? And I would imagine platform and product expertise will come into play as my suspicion is there isn’t a single developer who knows it all, so now you’re likely pulling cycles from other parts of the team even if you had one of the senior developers tied to this role.
  3. Automate Tracking: Adopt real-time bug tracking tools to provide visibility and accountability.
    This could be useful but only to the extent that the data is actionable. Sort of back to the prior points above: if the trend degrades, does Geoff raise prices and hire more folks? What happens if folks are simply burnt out after quarters/years of doing nothing but bugs? Effectively, what actions should be taken if KPIs aren’t met?

I’m not saying any of this because I entirely disagree with the consultant’s assessment or that more can’t be done. But I’m not completely sure the consultant fully understood the dynamics at play here and what success looks like.

Personally I’d propose that one release a year be dedicated to the Bug Bash that the entire Xojo team did a few years ago. Make it a standard tradition within a particular set quarter every year (e.g. Q2). Spend the prior quarter getting the word out and having folks rank their most important bugs. Effectively make this a once-a-year Snow Leopard effort. Or alternatively, make every release a Bug Bash release but with the effort tied to a single platform (e.g. 2025r1 is a “normal” release but the Bug Bash is tied to Desktop, 2025r2 is a “normal” release but the Bug Bash is tied to Web, etc.)

What are folks proposing should be done in realistic and tangible terms?

7 Likes

The report is largely irrelevant. “Well-regarded national consulting firms” are known for producing meaningless reports, and this one contains no actionable insights.

I did reports for a problem tracker for cars for about 13 years. At the time it was the second-largest application at GM, with over 10k users and 1k new issues per week.

Any metric you define can be gamed. The GM app had a 5-step process, and the engineers were responsible for the first 2 steps, so the metric only measured those steps. But that didn’t make fixes appear on the car any faster. When the metric was replaced by a more complex one, the engineers whined constantly.

Metrics for software like Xojo don’t make any sense whatsoever.

A lot of the Xojo bugs probably are obsolete. There is not enough manpower to check and close those out.

I’ve found quite a few odd bugs, like this one:

https://tracker.xojo.com/xojoinc/xojo/-/issues/70169
“autho” not allowed when converting string to constant

This is not affecting anyone’s work.

Then there are the bugs which are likely obsolete:
https://tracker.xojo.com/xojoinc/xojo/-/issues/69248
Worker choking on result

Workers are not a priority anymore and only the spaghetti monster knows if this will be fixed or not.

But then I have a bug that makes my blood boil because it hasn’t been fixed in over 10 effing years:
https://tracker.xojo.com/xojoinc/xojo/-/issues/71193
DragItem.Destination still nil after 10 years

2 Likes

@Beatrix_Willius Any recommendations on what the Xojo team could do to improve things? Or maybe just carry on the way things are functioning today?

P.S. I feel for you on the drag and drop bug. The last time I looked there were a handful of related drag and drop issues that have been dying on the vine for quite some time now that also impact my own code.