Instead of throwing more machines at the problem, Launchable utilizes AI and automation to decrease testing time and increase code quality.
I came across the following post today on Twitter:
What was interesting was the reaction. It really showed the wide diversity in the kinds of software development we are involved in, and how unaware of it people seem to be!
At one end of the spectrum are people whose CI time can be measured in minutes. Those are the ones who complain about the Docker image cache or suggest a faster build machine. "My AWS CodeBuild job spends a few minutes downloading the right Docker image from ECR. Can you believe that!?!? So annoying!"
… except if those are your problems, you don't know how lucky you are. Because of my line of work, I've had the pleasure of seeing lots of software development projects, and things look very, very different in other parts of the industry.
As your product becomes successful and the project runs longer with a larger crew, it is testing that starts to take the lion's share of your delivery process, both in computer time and human time. The number of tests usually grows O(N²): you accumulate more tests as time passes, and you write more tests as you add more engineers. The test workload grows a whopping O(N³), because those O(N²) tests get run in proportion to the number of engineers.
So sooner or later you can no longer just throw more machines at the problem, and by the time you get to that point, your software is so big and so critical to your business that you have no good options left. I once met a DevOps architect of a billion-dollar SaaS product, built and maintained over 20 years: a massively monolithic application written in Java, spread over thousands of modules. You become a victim of your own success.
I didn't know what to tell those people. With Launchable, now I do.
This article was originally posted on LinkedIn
Want to learn more? Book a demo today to find out how we can help you achieve your engineering and product goals in 2022 and beyond.