I am not going to tell you GTM engineering is a fad. It is not. Job postings grew 205% year-over-year in 2025. Average salaries are sitting around $182K. Intercom, Canva, Rippling, Ramp, and others are all building dedicated GTM engineering teams. The work is impressive and the growth is earned.
But I have spent the last decade building the business systems that revenue teams actually run on. And the more I watch this space evolve, the more I notice teams racing to stand up AI-powered pipelines without the foundation to support them. The tooling is new. The discipline underneath it is not. And the companies skipping that part are quietly accumulating risks they have not yet had to reckon with.
The New Stuff Is Real
I want to be clear about this. The capabilities that define GTM engineering are not just hype.
Signal-based selling actually works. Instead of building a list and blasting it, teams are monitoring real-time signals like funding rounds, job postings, and tech stack changes to find accounts at the right moment. The numbers back it up: 67% higher response rates, 41% shorter sales cycles.
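To make the idea concrete, here is a minimal sketch of signal-based prioritization: weight recent signals, ignore stale ones, and score accounts by what is happening right now. The signal names, weights, and 90-day freshness window are hypothetical placeholders, not from any particular vendor; every team tunes these to its own ICP.

```python
from datetime import date, timedelta

# Hypothetical signal weights -- tune to your own ICP.
SIGNAL_WEIGHTS = {
    "funding_round": 40,
    "relevant_job_posting": 25,
    "tech_stack_change": 20,
}

# Signals older than this no longer indicate "the right moment."
FRESHNESS_WINDOW = timedelta(days=90)

def score_account(signals: list[dict], today: date) -> int:
    """Sum weighted signals observed within the freshness window."""
    score = 0
    for s in signals:
        if today - s["observed_on"] <= FRESHNESS_WINDOW:
            score += SIGNAL_WEIGHTS.get(s["type"], 0)
    return score

signals = [
    {"type": "funding_round", "observed_on": date(2025, 5, 1)},
    {"type": "relevant_job_posting", "observed_on": date(2025, 6, 10)},
    {"type": "tech_stack_change", "observed_on": date(2024, 1, 1)},  # stale, dropped
]

print(score_account(signals, today=date(2025, 6, 15)))  # 65: stale signal excluded
```

The point is the timing logic, not the arithmetic: the same account scores differently in different weeks, which is exactly what list-and-blast workflows cannot express.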
Tools like Clay pull data from over 100 sources, generate personalized outreach at scale, and wire your data warehouse to your CRM to your sequencing platform. None of this existed five years ago. And the bar to do this work is legitimately technical. 38% of GTM engineering job postings call for SQL and Python. This is not a rebrand of the SDR role.
So yes, it is real. That is not where my concern is.
But What Are the Actual Outcomes?
This is the part that bothers me. When I ask GTM teams what has improved since they spun up these new workflows, I almost always get the same answer: more pipeline.
More leads. More sequences. More volume. Okay. But are those leads converting? Are deal sizes moving? Is the cycle getting shorter? Or are we just pushing more into the top of the funnel and patting ourselves on the back?
I have watched teams build beautiful AI-powered outbound machines that generate a flood of leads their AEs then spend weeks qualifying out. Nobody saved any time. The bottleneck just moved downstream. The pipeline report looks great on Monday morning. The close rate tells a different story by end of quarter.
When you optimize for speed without thinking about signal quality, this is what you get. The tools can generate volume. That was never the hard part. The hard part is making sure what comes in is actually worth working.
That takes clean data, real feedback loops between marketing and sales, and someone who understands how the system connects end to end. Not just the outbound slice of it.
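What a feedback loop looks like at its simplest: rank lead sources by what actually converts downstream, not by how many leads they produce. The source names and numbers below are invented for illustration.

```python
from collections import Counter

# Hypothetical lead log: each lead tagged with its source and its outcome.
leads = [
    {"source": "funding_signal", "converted": True},
    {"source": "funding_signal", "converted": False},
    {"source": "cold_list", "converted": True},
    {"source": "cold_list", "converted": False},
    {"source": "cold_list", "converted": False},
    {"source": "cold_list", "converted": False},
]

volume = Counter(lead["source"] for lead in leads)
wins = Counter(lead["source"] for lead in leads if lead["converted"])
conversion = {src: wins[src] / volume[src] for src in volume}

# cold_list generates twice the volume but converts at half the rate.
for src in sorted(conversion, key=conversion.get, reverse=True):
    print(f"{src}: {volume[src]} leads, {conversion[src]:.0%} conversion")
```

A volume report would rank cold_list first. A conversion report reverses the ranking, which is the whole argument of this section in six lines.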
The Systems Questions Nobody Is Asking
GTM engineers build fast. They wire up enrichment pipelines, connect APIs, ship automations. I respect it. But the conversation almost always stops at the revenue team boundary.
Who owns the data running through those pipelines? What happens when the enrichment vendor changes their API or jacks up pricing overnight? How does any of this connect to finance? To customer success? To the rest of your stack?
These are business systems questions. I have been asking them for years. In my experience, they do not get asked in GTM engineering conversations until something breaks.
And the landscape keeps getting more complex. The average company runs 275+ SaaS applications. Martech alone hit 15,384 tools in 2025. Every new AI workflow adds more connections, more data flows, more things that can go wrong. Without someone thinking about the full architecture, you end up with a stack where every team has its own pipeline and nobody trusts the numbers in anyone else's.
Oh, and 76% of organizations say less than half their CRM data is accurate. AI automation on top of bad data does not fix the data. It just makes the problems show up faster.
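Before layering automation on a CRM, it is worth auditing how much of the data is actually safe to act on. A rough sketch of that gate, with hypothetical field names and a one-year staleness threshold as the assumption:

```python
from datetime import date, timedelta

STALE_AFTER = timedelta(days=365)  # hypothetical staleness threshold
REQUIRED = ("email", "company", "last_verified")

def is_trustworthy(record: dict, today: date) -> bool:
    """Usable only if required fields are present and verification is recent."""
    if any(not record.get(field) for field in REQUIRED):
        return False
    return today - record["last_verified"] <= STALE_AFTER

crm = [
    {"email": "a@acme.com", "company": "Acme", "last_verified": date(2025, 3, 1)},
    {"email": "", "company": "Globex", "last_verified": date(2025, 2, 1)},          # missing email
    {"email": "c@initech.com", "company": "Initech", "last_verified": date(2022, 1, 1)},  # stale
]

today = date(2025, 6, 1)
usable = [r for r in crm if is_trustworthy(r, today)]
print(f"{len(usable)}/{len(crm)} records safe to automate against")  # 1/3
```

If that ratio comes back looking like the 76% statistic above, the fix is the audit and the verification process, not a faster pipeline on top.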
The Governance Conversation We Are Not Having
This one keeps me up at night.
AI agents are sitting inside your CRM, your enrichment tools, your outbound platform. They are pulling personal data from dozens of sources, writing emails on your behalf, deciding who to contact and when. And in most setups I have seen, nobody has thought seriously about who controls what.
90% of enterprise AI agents are over-permissioned. They move 16x more data than humans touching the same applications. OWASP ranked prompt injection as the number one LLM vulnerability and put out a full top 10 specifically for agentic AI last year. This is not a hypothetical risk anymore.
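The remedy for over-permissioning is old and boring: deny by default, with an explicit per-agent allowlist. A minimal sketch of that gate; the agent names and scope strings are hypothetical, and a real deployment would enforce this at the platform or gateway layer rather than in application code.

```python
# Hypothetical least-privilege gate: an agent may only perform actions on its
# explicit allowlist. Unknown agents and unlisted actions are both refused.
AGENT_SCOPES = {
    "enrichment_agent": {"crm:read_contacts", "enrichment:lookup"},
    "outbound_agent": {"crm:read_contacts", "sequencer:send_email"},
}

def authorize(agent: str, action: str) -> bool:
    """Deny by default: no entry in AGENT_SCOPES means no access at all."""
    return action in AGENT_SCOPES.get(agent, set())

assert authorize("enrichment_agent", "enrichment:lookup")
assert not authorize("enrichment_agent", "sequencer:send_email")  # no send rights
assert not authorize("unknown_agent", "crm:read_contacts")        # unknown agent
```

Most of the over-permissioned setups in that 90% statistic invert this: broad access granted up front, restrictions bolted on after something leaks.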
And on the compliance side, enforcement is getting real. France's CNIL hit Orange with a 50 million euro fine in 2024 for mixing ads into regular emails without consent. If your enrichment pipeline is pulling personal data and triggering automated outreach in the EU, a terms-of-service checkbox is not going to protect you.
I am not saying slow down. I am saying build it right so you can actually keep moving fast without creating a mess you will spend the next two years cleaning up.
These Two Disciplines Need Each Other
The best GTM engineering work I have seen has strong systems architecture behind it. And the best business systems teams I know are picking up AI-native tools and running with them. This is not a turf war. They are two halves of the same thing.
Ramp puts GTM engineers inside a Growth Platform squad that partners with engineering and ships in structured sprints. Rippling pairs them with growth teams running constant experiments. One of their playbooks uses hiring data as an outbound trigger. When the foundation is solid, the results speak for themselves: 60% open rates, 10% reply rates, consistent high-quality meetings.
The companies getting this right have a clear pattern. RevOps owns strategy and systems. GTM engineers build the automation layer on top. And somebody, always, is watching governance, data quality, and whether the pipeline is actually converting into revenue.
You need both. Without the systems side, you move fast and break things. Without the GTM engineering side, you build clean foundations that nobody uses.
So What Should You Actually Do?
If you are standing up a GTM engineering function, make sure it is connected to someone who thinks in whole systems. Someone who cares about data ownership, integration architecture, and what happens when things go wrong at 3am. And stop measuring pipeline volume like it means something on its own. Conversion rate, deal size, cycle time. Those are the numbers that tell you if the machine works.
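Those three numbers are cheap to compute once deals close. A sketch against a hypothetical deal log; the field names and figures are invented, but the shape is any CRM export:

```python
from datetime import date
from statistics import mean

# Hypothetical closed-period deal log: the part the volume report never shows.
deals = [
    {"created": date(2025, 1, 5),  "closed": date(2025, 2, 20), "won": True,  "amount": 24_000},
    {"created": date(2025, 1, 12), "closed": date(2025, 3, 1),  "won": False, "amount": 0},
    {"created": date(2025, 2, 2),  "closed": date(2025, 3, 15), "won": True,  "amount": 36_000},
    {"created": date(2025, 2, 10), "closed": date(2025, 2, 28), "won": False, "amount": 0},
]

wins = [d for d in deals if d["won"]]
win_rate = len(wins) / len(deals)
avg_deal = mean(d["amount"] for d in wins)
avg_cycle = mean((d["closed"] - d["created"]).days for d in wins)

print(f"win rate {win_rate:.0%}, avg deal ${avg_deal:,.0f}, avg cycle {avg_cycle:.0f} days")
```

Track these per lead source and per quarter and the "more pipeline" conversation changes fast: a workflow that doubles volume while these three numbers degrade is a cost, not a win.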
If you already have a business systems team, push them to get their hands on the new tools. The discipline they bring is exactly what this space needs, but only if they are building with AI-native workflows, not just maintaining the old stack.
The winners here are not going to be the fastest teams or the most buttoned-up teams. They are going to be the ones that figure out how to be both.
Working through how AI, GTM engineering, and business systems fit together at your company? Get in touch. I have spent over a decade building and integrating the systems that power revenue teams, and I would love to hear what you are working on.
Sources
- The State of GTM Engineering Talent in 2025 - FullFunnel
- I Analyzed 1,000 GTM Engineering Jobs - Bloomberry
- The Rise of the GTM Engineer - Clay
- Signal-Based GTM: Turn Signals Into Pipeline - Databar
- ZoomInfo API Review 2026 - Generect
- State of CRM Data Management 2025 - Validity
- 2025 SaaS Statistics - BetterCloud
- Martech Landscape: 15,384 Tools - MarTech.org
- AI Agent Security Landscape - Obsidian Security
- OWASP LLM01:2025 Prompt Injection - OWASP
- OWASP Top 10 for Agentic AI - OWASP
- GDPR and Cold Emails - Salesforge
- GTM Engineering: Process-Driven Growth Systems - Cremanski
- Clay GTM Case Studies - GTM Engineering Blog
- Why GTM Engineers Are in High Demand - Ideaware