Software engineering management platform Jellyfish has launched what it’s calling the industry’s “first comparative benchmarking tool,” one that enables engineering leaders to gauge how well they’re performing against other companies.

Jellyfish Benchmarks, as the product is called, is based on the company’s own internal data, which it gathers and collates when engineering teams opt in to share their anonymized data with the broader pool.

Aligning goals

Founded in 2017, Jellyfish aims to align the activities of engineering teams with companies’ business objectives. It does this by analyzing myriad engineering “signals” gleaned from developer tools such as issue trackers and source code management platforms, as well as project management tools. It’s all about establishing what teams are working on, tracking the progress they’re making, and gauging how individual teams and workers are performing.

Aggregating pan-industry engineering data brings more context to the mix, allowing companies to compare and contrast internal figures with those of their peers across sectors.

So, what kind of benchmarks does Jellyfish now serve up? Users have access to more than 50 metrics, including time invested in growth, issues resolved, deployment frequency, pull requests merged, coding days, incident rate and mean time to repair (MTTR), among many others.
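
To make a couple of these metrics concrete, here is a minimal sketch of how mean time to repair and deployment frequency might be computed from raw timestamps. This is an illustration only, not Jellyfish’s code; the incident and deployment records are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical incident records: (opened, resolved) timestamp pairs.
incidents = [
    (datetime(2022, 6, 1, 9, 0), datetime(2022, 6, 1, 11, 30)),
    (datetime(2022, 6, 3, 14, 0), datetime(2022, 6, 3, 15, 0)),
]

# Hypothetical deployment timestamps within a 30-day window.
deployments = [datetime(2022, 6, d) for d in (1, 2, 5, 9, 12, 20, 28)]

# Mean time to repair (MTTR): average of resolved-minus-opened durations.
repair_times = [resolved - opened for opened, resolved in incidents]
mttr = sum(repair_times, timedelta()) / len(repair_times)

# Deployment frequency: deployments per week over the window.
window_days = 30
deploys_per_week = len(deployments) / (window_days / 7)

print(f"MTTR: {mttr}")                          # MTTR: 1:45:00
print(f"Deploys/week: {deploys_per_week:.1f}")  # Deploys/week: 1.6
```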

“Importantly, Jellyfish includes benchmarking for how teams are allocating or investing their time and resources — this helps teams understand how they compare on their time investments into innovation, support work, or keeping the lights on, for example,” Jellyfish product head, Krishna Kannan, told VentureBeat.
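
Kannan’s allocation point is straightforward to picture: given work items tagged by investment category, the comparison is over how effort splits across those categories. A minimal sketch, assuming hypothetical category labels rather than Jellyfish’s actual schema:

```python
from collections import Counter

# Hypothetical work items, each tagged with an investment category.
tickets = ["innovation", "support", "ktlo", "innovation", "innovation", "support"]

# Share of total work going to each category.
counts = Counter(tickets)
total = sum(counts.values())
allocation = {category: count / total for category, count in counts.items()}

print(allocation)  # {'innovation': 0.5, 'support': 0.33..., 'ktlo': 0.16...}
```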

Jellyfish Benchmarks

At the time of writing, some 80% of Jellyfish customers opt in to sharing their anonymized data with the benchmarking datasets, and only those customers will be able to benefit from the new product. To get a little, you have to give a little, is the general idea.

“When Jellyfish customers onboard, they are offered the opportunity to leverage industry benchmarks built upon anonymized datasets from other Jellyfish customers — customers who opt-in will have their data anonymized and added to the benchmarking Jellyfish customer pool,” Kannan said. “In the rare instances where customers opt out of this opportunity, their dataset will not be added, but neither will they be able to leverage benchmarking as a feature.”

Insights

While software development teams arguably have access to more engineering data than ever, it’s not always possible to know from this data how well teams are actually performing on an ongoing basis — maybe they are doing well compared to historical figures, but are still hugely underperforming compared to companies elsewhere. This is the ultimate problem that Jellyfish Benchmarks seeks to address.
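
The underlying comparison is essentially a percentile lookup against the anonymized pool. The sketch below illustrates the idea with made-up numbers; it is not Jellyfish’s actual methodology.

```python
from bisect import bisect_left

def percentile_rank(pool: list[float], value: float) -> float:
    """Fraction of pooled values strictly below `value`."""
    ordered = sorted(pool)
    return bisect_left(ordered, value) / len(ordered)

# Hypothetical anonymized pool of weekly deployment frequencies.
pool = [0.5, 1.0, 1.2, 1.6, 2.0, 3.5, 4.0, 5.5]

# A team deploying 1.6 times per week sits below the median of this pool.
print(f"{percentile_rank(pool, 1.6):.0%}")  # 38%
```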

It’s also worth noting that Jellyfish rival LinearB offers something similar in the form of Engineering Benchmarks, spanning nine metrics. However, Jellyfish says that it caters to dozens of metrics, which could open the utility to a wider array of use cases. *

“The reality we’ve found is that different teams are looking to optimize different metrics depending on their product, stage, business goals and so on,” Kannan said. “That’s why we’ve included benchmarking for whichever metrics our customers care most about.”

*Updated to correct a previous statement that suggested LinearB’s benchmarks product wasn’t fully integrated into its platform.
