Assessing Total vs. Annualized Equity Award Values

Pave Data Lab
February 13, 2025
2 min read

Following the launch of Calculated Benchmarks, an exciting new feature in Pave’s Market Data product, we’re taking a fresh look at equity compensation benchmarks across our dataset.

In case you missed the launch, Calculated Benchmarks applies a machine learning algorithm to our real-time equity grant database to give customers access to more relevant, timely, and accurate information.

To illustrate what we’ve been up to, we completed a new analysis of equity benchmarks for software engineers, covering both new hire and ongoing (or refresh) grants. Importantly, our analysis also compares the total and annualized value of equity awards, a distinction Pave’s platform can make because we access line-by-line data from equity management systems. Let’s see what the data shows us!

Exploring Our Latest Benchmarks

In the charts below, we examine total vs. annualized gross intended equity award values for software engineers based in the San Francisco Bay Area at public companies in the US with a market capitalization of $5B+. This dataset includes ~4.7K new hires with new hire grants and ~8.1K tenured employees who received ongoing grants.

While it is commonplace for compensation professionals to benchmark equity using total equity award values, reliable benchmarks for annualized equity award values have proved elusive. In many cases, people simply divide total equity award values by four to estimate annualized equity grant values.

However, as more companies adopt shorter vesting periods, this 4:1 ratio no longer paints an accurate picture. For this population, our dataset, which includes every grant and its specific vesting schedule, indicates that dividing total award values by roughly 3.3 is more accurate.
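To make the idea concrete, here is a minimal sketch in Python of how an annualized value can be derived from each grant’s vesting period rather than assumed to be one quarter of the total. The grant figures and vesting mix below are hypothetical and for illustration only; this is not Pave’s methodology or data.

```python
# Minimal sketch: annualizing equity grant values from vesting schedules.
# All numbers below are hypothetical and for illustration only.

grants = [
    # (total intended value in USD, vesting period in years)
    (400_000, 4),   # classic 4-year vest: annualized = total / 4
    (300_000, 3),   # shorter 3-year vest
    (250_000, 2),   # 2-year vest
]

total_value = sum(value for value, _ in grants)
annualized_value = sum(value / years for value, years in grants)

# Effective divisor across the pool: total / annualized.
# With shorter vesting periods in the mix, it falls below 4.
effective_divisor = total_value / annualized_value

print(f"Total value:       ${total_value:,.0f}")
print(f"Annualized value:  ${annualized_value:,.0f}")
print(f"Effective divisor: {effective_divisor:.2f}")
```

With this assumed mix of vesting schedules, the effective divisor comes out below four, which illustrates why a blanket divide-by-four estimate overstates the divisor once shorter vesting periods enter the picture.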

Total vs. Annualized New Hire Equity Grant Values for Software Engineers

Total vs. Annualized Ongoing Equity Grant Values for Software Engineers

Fortunately, with Calculated Benchmarks and Pave’s ability to instantly toggle between total and annualized equity award benchmarks, there’s no extra guesswork required to understand the market. You can now conduct an analysis like the one above in seconds for any job anywhere in the US. Even better, Calculated Benchmarks outside the US will be launching soon.

Again, these results focus on software engineers based in the San Francisco Bay Area at public companies in the US with a market capitalization of $5B+. Equity award values vary significantly by location, company size, and company stage of development, so when using Calculated Benchmarks, we recommend applying additional filters to maximize accuracy and consistency.

To explore Calculated Benchmarks, sign up for Pave’s free Market Data product, or request a demo to access Premium Market Data features.

