Listen now on YouTube | Spotify | Apple Podcasts
David Bryson had big dreams growing up in North Carolina. Like most kids there, he wanted to be a race car driver. Not NASCAR exactly, but something with Porsches and Ferraris on tracks with actual turns.
Fast forward to today, and Dave found his way into a different kind of competition. He's the Principal Competitive Intelligence Manager at Splunk, where he helps his company outflank competitors in the cybersecurity and observability markets. The racing dream didn't pan out, but the competitive spirit stuck around.
Dave learned the most important lesson of his competitive intelligence career from his CI mentor, Sophie. Whenever he'd rush into her office with the latest research, excited about some new competitor announcement or market development, she would listen patiently. Then she'd look at him and ask a question that changed how he thinks about intelligence work forever.
"David, so what?"
"I was trained in CI...by this wonderful woman who was a longtime CI practitioner, and she sort of mentored me in the practice. And she used to always tell me, I would go and I would get all this information...She would look at me, and she'd say, David, so what." — David Bryson, Principal Competitive Intelligence Manager at Splunk
At first, he didn't know how to answer. He had all this information about what competitors were doing, but Sophie was pushing him to think deeper. What did it actually mean? Why should anyone care? What should Splunk do about it?
"The way that you get to intelligence is you look at the information and you say, okay, so what?" Dave explains. That question forces you to move past surface-level facts and dig into the real implications.
This matters more than ever in our AI-driven world. We can gather more information faster than ever before, but turning that deluge of data into actionable intelligence still requires human insight. The "so what?" question is your filter for separating what's important from what's just noise.
What you'll learn in this post
Why traditional feature comparison matrices are dead in an AI-first world
David's actual "page-long prompt" technique for superhuman competitive research
How to shift from reactive to proactive competitive intelligence using AI-freed time
Why AI creates an information abundance problem that requires new verification skills
What skills future CI professionals need beyond traditional credentials
About David Bryson
David Bryson is the Principal Competitive Intelligence Manager at Splunk (now part of Cisco) and has been practicing competitive intelligence for seven years. Before transitioning to CI, he spent most of his career as a sales engineer selling enterprise software. He previously helped build the CI practice at Alteryx and now focuses on cybersecurity and observability markets at Splunk.
Traditional competitive matrices just died. Here's what killed them.
Dave hit a wall when he started analyzing AI capabilities across Splunk's competitive landscape. The old playbook wasn't working anymore.
"I struggled with this one," he admits. "It was difficult to differentiate between one vendor's copilot versus another. They were both copilots. They both used OpenAI, maybe, or maybe they use Gemini as an API."
This is the death of the traditional feature matrix. You know the one where your company miraculously gets all green checkmarks while competitors get a suspicious number of yellow and red marks. When every vendor has "AI-powered" something and they're all using similar underlying models, those charts become meaningless.
Dave realized he was asking the wrong questions. Instead of "What AI features do they have?" he started asking, "What problems does their AI solve for users?"
Picture this: Company A has 50 cybersecurity analysts drowning in threat alerts. Company B has one analyst doing the same job. Most vendors can claim their AI detects threats. But what happens after detection? How does each vendor's AI help you work through all those alerts and figure out what's real? How does it help you respond to legitimate threats?
The AI use case for helping 50 analysts prioritize threats is entirely different from the one that lets a single analyst manage everything alone.
"In AI, it's like, well, okay, so they have a copilot. So do we. They use OpenAI. So are we, where's the differentiation? And the differentiation is the outcome. So what does the user want to do? What problem do they want to solve? And does the AI help them solve that problem or not?" — David Bryson, Principal Competitive Intelligence Manager at Splunk
This outcome-focused approach works across industries. Every organization has different problems, different team sizes, and different processes. The AI that solves problems for a Fortune 500 company might be overkill for a startup, or vice versa.
Dave's team has flipped their entire competitive analysis framework. They no longer compare AI features. They compare AI outcomes. When a salesperson asks about a competitor's AI capabilities, Dave pushes them to first understand what the customer wants to accomplish.
The shift from "What model do they use?" to "What problem does the AI solve for the user?" changes everything. It's the difference between playing feature bingo and understanding competitive positioning.
The research technique that makes AI superhuman
Dave treats AI like the world's most capable research intern. One that never gets tired, never complains about tedious tasks, and can work through hundreds of sources while you sleep.
There's a catch, though. This intern is only as good as the instructions you give it.
"The way I think about it is like, what would I tell an intern to do if I wanted to research this problem?" Dave explains. "What perspectives would I tell it to take? How would I guide it to be critical of what it gets back?"
"The way I think about it is like, what would I tell an intern to do if I wanted to research this problem? What perspectives would I suggest it consider? How would I guide it to be critical of what it gets back...When you guide it in the right way, it's superhuman, right?" — David Bryson, Principal Competitive Intelligence Manager at Splunk
Dave's prompts aren't quick one-liners. They're often a full page long, sometimes more. He's learned that generic prompts produce generic results. If you just ask "Compare Splunk to CrowdStrike," you'll get a bland, marketing-friendly comparison that doesn't help anyone.
The magic happens when you give AI detailed context about your problem, specify the sources you want it to use, tell it how to think about the information, and even provide it with follow-up questions to ask when it gets results back. Dave has found that responses can be either fantastic or terrible, with no middle ground. The difference comes down to how well you prompt. Here’s a snippet of one of Dave’s prompts:
You are a competitive intelligence analyst working with a <your stakeholder’s role> for <your stakeholder's team>. This team focuses on <the outcome/goal of the team you are working with>. You are trying to <state the goal of what you have been asked to do>. The outcome is <what you want the outcome to be for the stakeholder>.
You have the problem and the outcome; now, you must research competitors and see how well they accomplish that outcome. You'll need to look at competitor documentation, watch video demos, search peer review sites, review competitor press releases and SEC filings, read industry analyst reports, <insert other relevant sources for your project/research> and see how well that competitor accomplishes that outcome.
For each source you find that answers the question or problem, validate the source. Look for things like the date published, the role of the publisher, the relationship between the source publisher and the competitor, <insert other criteria specific to your industry for validation> etc.
In the conclusions you come to, based on the authoritative sources you find, ask ‘so what’? Dig deeper to find the meaning and implication behind the statements you read, or information you uncover. Be critical, analytical, and dispassionate about what you read. Look at all sources as potentially misleading, especially if the competitor authors them. Look for alternative sources to validate claims made by competitor sources, either in reputable sources or within technical documentation. In the end, come back with impeccably sourced, validated conclusions and recommendations.
Present findings in a way that is compelling to a <insert the stakeholder role you are working for>. Take on the role of <insert stakeholder role> and critique the results, modifying them to ensure the <stakeholder> gets the answers they need based on <the outcome/goal of the team you are working with>.
Take this problem: <some challenge a user might be having>, which achieves this outcome: <outcome achieved if problem is solved>. Now see how well <insert competitor> accomplishes this outcome with <functionality you are looking at>.
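Dave's fill-in-the-blank structure lends itself to simple templating, so the same prompt skeleton can be reused across stakeholders and competitors. Below is a minimal sketch in Python using the standard library's string templating; the field names and example values are illustrative assumptions, not Dave's actual workflow or Splunk's tooling.

```python
from string import Template

# Skeleton of a research prompt in the spirit of Dave's template.
# Placeholders mirror his fill-in-the-blank fields.
CI_PROMPT = Template(
    "You are a competitive intelligence analyst working with a "
    "$stakeholder_role for $stakeholder_team. This team focuses on "
    "$team_goal. You are trying to $task_goal. The outcome is "
    "$desired_outcome.\n\n"
    "Research how well $competitor accomplishes that outcome with "
    "$functionality. Consult: $sources.\n\n"
    "For each source, validate it: check the date published, the role of "
    "the publisher, and the publisher's relationship to the competitor. "
    "For every conclusion, ask 'so what?' and return impeccably sourced, "
    "validated conclusions and recommendations."
)

def build_prompt(**fields: str) -> str:
    """Fill the template; substitute() raises KeyError if a field is missing."""
    return CI_PROMPT.substitute(**fields)

# Hypothetical example values:
prompt = build_prompt(
    stakeholder_role="VP of Product Marketing",
    stakeholder_team="the security product marketing team",
    team_goal="positioning against SIEM competitors",
    task_goal="assess a competitor's AI-assisted alert triage",
    desired_outcome="a battle-card-ready comparison of triage outcomes",
    competitor="ExampleVendor",  # hypothetical competitor name
    functionality="its copilot for alert investigation",
    sources="documentation, demo videos, peer review sites, SEC filings",
)
print(prompt)
```

Keeping the skeleton in version control means the team iterates on one shared prompt instead of everyone writing generic one-liners from scratch.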
Beyond research, Dave discovered AI works brilliantly as a personal tutor. When he joined Splunk, he had to learn cybersecurity and observability markets from scratch. These are massive, technical industries with tons of use cases and acronyms.
"I have found AI to be a wonderful way to learn something I didn't know before, because it's like your personal tutor," he says. "You can ask it to explain this to me in simple terms, or I didn't understand that. What does this acronym mean? And it's infinitely patient with you."
This learning capability enables CI analysts to quickly get up to speed on new markets. Dave can apply his CI training to industries he's never worked in, thanks to AI, which helps him build the necessary background knowledge.
But remember that verification challenge? Dave advises taking AI-generated competitive insights with "a gigantic boulder of salt, not a grain of salt." Always verify sources, check for bias, and apply critical thinking to everything AI produces. The speed and volume are superhuman, but the judgment still needs to be human.
When information abundance creates strategic opportunity
Plot twist: AI solving the grunt work of competitive intelligence created an unexpected opportunity.
Dave's team at Splunk has automated most of their low-value research tasks. They use tools like Clue for news aggregation, and some teammates have built custom AI crawlers that scan websites and documentation, then filter and summarize everything automatically. This stuff used to eat up hours of analyst time every week.
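The crawler setup Dave describes boils down to a fetch, filter, summarize loop. Here is a minimal sketch of the filtering stage, assuming pages have already been fetched; the keyword scoring, data shapes, and example URLs are my own illustration, not Splunk's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    text: str

def relevance_score(page: Page, keywords: list[str]) -> int:
    """Count how many tracked keywords appear in the page text."""
    text = page.text.lower()
    return sum(1 for kw in keywords if kw.lower() in text)

def filter_pages(pages: list[Page], keywords: list[str],
                 min_hits: int = 2) -> list[Page]:
    """Keep only pages that mention enough tracked topics to be worth summarizing."""
    return [p for p in pages if relevance_score(p, keywords) >= min_hits]

# Hypothetical crawl results:
pages = [
    Page("https://example.com/release-notes",
         "New AI copilot for alert triage and threat detection."),
    Page("https://example.com/careers",
         "We are hiring sales engineers."),
]
keywords = ["copilot", "alert triage", "threat detection"]
hits = filter_pages(pages, keywords)
print([p.url for p in hits])  # only the release-notes page survives
```

Only the pages that clear the relevance bar would then be passed to an LLM for summarization, which is what keeps the weekly reading pile manageable.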
The real game-changer for Dave has been using Gemini for deep research. "It gives you hundreds of articles, hundreds of sources that would have taken me months to go through, and it summarizes that," he explains. What used to require weeks of manual searching now happens in minutes.
"I've used Gemini, which is the one I kind of like for deep research. It gives you hundreds of articles, hundreds of sources that would have taken me months to go through, and it summarizes that. And so I don't think it's too much information, but I do think the information that we're getting back as a CI analyst, you've got to...apply that critical eye." — David Bryson, Principal Competitive Intelligence Manager at Splunk
But this information flood created a verification challenge. You still need to check each source. Is it reputable? What perspective is the author writing from? Do they have an agenda? Sometimes AI will cherry-pick a sentence or two that match your question, but the full context tells a different story.
Then there's what Dave calls the "sycophancy problem." AI systems tend to give you neutral, positive answers about everything. If you ask AI to compare your company to a competitor, remember that most of its training data comes from marketing messages. You're basically getting a summary of how each company wants to be perceived, not necessarily how they perform.
The information abundance is real, but the solution isn't to use less AI. It's to get better at verification while using the freed-up time for strategic work.
This is proactive CI in action
Here's what Dave's team did with their newfound bandwidth after implementing AI tools. Instead of just cranking out more battle cards and newsletters, they could finally tackle the big strategic questions that had been sitting on the back burner.
"What's really made us more proactive, though, is not necessarily that AI is helping us be more proactive," Dave explains. "It's that AI is helping us be more strategic in the work that we do, in the research that we do, and because we can be more strategic, we can be more proactive."
"What's really made us more proactive, though, is not necessarily that AI is helping us be more proactive. It's that AI is helping us be more strategic in the work that we do, in the research that we do, and because we can be more strategic, we can be more proactive." — David Bryson, Principal Competitive Intelligence Manager at Splunk
Dave can't share all the details, but his team used this strategic capacity to identify something Splunk was missing in the market. Something that everyone could see but was somehow overlooked.
They went deep on competitive analysis, looking at what competitors were doing and, more importantly, what would happen to Splunk's customers if they adopted these competing solutions. They painted a picture of the future consequences if Splunk ignored this gap.
The result? "We were able to get ahead of this product area that we were kind of neglecting and bring it to the forefront."
Every CI team faces significant strategic challenges that everyone can see but no one ever has time to address properly. AI doesn't solve these problems for you, but it gives you the bandwidth to think them through. That's where the real competitive advantage lives.
The fun part of CI was never making battle cards anyway. It's when you get to be strategic and think through what competitors will do next, then figure out how to respond before they even make their move.
Building CI teams that thrive with AI
If you think AI will automate away competitive analysts, Dave has news for you. "If you think that AI can automate away an analyst, then you probably don't have very good analysts," he says bluntly.
"If you think that AI can automate away an analyst, then you probably don't have very good analysts. If all you're doing is just sort of regurgitating what you're reading out in the world...those are things that AI is going to automate." — David Bryson, Principal Competitive Intelligence Manager at Splunk
Good analysts were never just regurgitating information anyway. They were synthesizing, questioning, and connecting dots that others missed. AI makes them better at this work, not replaceable.
Dave thinks future CI professionals need three key capabilities:
First, understand human behavior. Companies are still run by humans who have personalities, weaknesses, and make illogical choices. Until AI runs entire companies, understanding human decision-making remains crucial for predicting competitive moves.
Second, study historical patterns. Business cycles repeat themselves. Companies make the same mistakes over and over. Historical perspective helps analysts spot patterns and anticipate what comes next.
Third, and this might surprise you, bring outside experience.
"I think the biggest advice and skill set for future CI professionals...is that I think they should do something else first...The best CI analysts come from other backgrounds, I think. They've been sales engineers, product managers, product marketers, sales people." — David Bryson, Principal Competitive Intelligence Manager at Splunk
Dave's met CI professionals from wildly different backgrounds: sales engineers, product managers, salespeople, and product marketers. This diversity of experience is what makes them effective analysts. They bring unique perspectives that fresh MBA graduates simply don't have.
"Don't be afraid if you don't have 10 years of CI experience to apply for a CI job," Dave advises. The techniques and frameworks can be learned through certification courses. What can't be easily taught is curiosity, critical thinking, and the kind of perspective that comes from working in different roles.
For marketing leaders building CI capabilities, this means looking beyond traditional CI credentials. That sales engineer who understands customer objections? That product manager who tracked feature adoption? They make excellent competitive analysts with the right training.
Your next move
The future of competitive intelligence isn't about choosing between human insight and AI capability. It's about combining them effectively.
Start by applying Dave's mentor's test to your current CI approach. When your team brings you competitive intelligence, ask "So what?" Push them to move past surface-level information and explain the real implications for your business.
Invest in prompt engineering training for your CI team. The analysts who master detailed, strategic prompting will produce superhuman research results. Those who stick to generic questions will get generic answers.
Build verification processes into your AI-enhanced workflows. Set up systems to check sources, identify bias, and apply critical thinking to AI outputs. Speed is valuable, but accuracy is essential.
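One lightweight way to operationalize that verification step is a red-flag pass over each AI-cited source before it enters a deliverable. The checks below mirror the criteria Dave names (publication date, who published it, their relationship to the competitor, independent corroboration); the function name, field names, and freshness threshold are assumptions for illustration.

```python
from datetime import date

def verify_source(published: date,
                  publisher_is_competitor: bool,
                  independently_corroborated: bool,
                  max_age_days: int = 365) -> list[str]:
    """Return a list of red flags for a source; an empty list means it passes."""
    flags = []
    if (date.today() - published).days > max_age_days:
        flags.append("stale: older than the freshness window")
    if publisher_is_competitor:
        flags.append("biased: published by the competitor itself")
    if not independently_corroborated:
        flags.append("unverified: no second source confirms the claim")
    return flags

# A dated competitor blog post with no corroboration trips every check:
flags = verify_source(date(2020, 1, 1),
                      publisher_is_competitor=True,
                      independently_corroborated=False)
print(flags)
```

Even a checklist this simple forces the "gigantic boulder of salt" habit into the workflow instead of leaving it to individual memory.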
Most importantly, remember that competitive intelligence has always been about understanding what competitors will do next, not just what they're doing now. AI gives you better tools for gathering information, but the strategic thinking that turns information into actionable intelligence still comes from experienced human analysts asking the right questions.
Dave never became a race car driver, but he learned something every racer knows: winning isn't about having the fastest car. It's about taking the right turns at the right time. In competitive intelligence, AI gives you speed. The "so what?" test helps you navigate the turns.
About David Sweenor
David Sweenor is an AI, Generative AI, and Product Marketing Expert. He brings this expertise to the forefront as the founder of TinyTechGuides and host of the Data Faces podcast. A recognized top 25 analytics thought leader and international speaker, David specializes in practical business applications of artificial intelligence and advanced analytics.
Books
Artificial Intelligence: An Executive Guide to Make AI Work for Your Business
Generative AI Business Applications: An Executive Guide with Real-Life Examples and Case Studies
The Generative AI Practitioner's Guide: How to Apply LLM Patterns for Enterprise Applications
The CIO's Guide to Adopting Generative AI: Five Keys to Success
Modern B2B Marketing: A Practitioner's Guide to Marketing Excellence
The PMM's Prompt Playbook: Mastering Generative AI for B2B Marketing Success
With over 25 years of hands-on experience implementing AI and analytics solutions, David has supported organizations including Alation, Alteryx, TIBCO, SAS, IBM, Dell, and Quest. His work spans marketing leadership, analytics implementation, and specialized expertise in AI, machine learning, data science, IoT, and business intelligence.
David holds several patents and consistently delivers insights that bridge technical capabilities with business value.
Follow David on Twitter @DavidSweenor and connect with him on LinkedIn.
Podcast Highlights - Key Takeaways from the Conversation
[0:05] Introduction Host David Sweenor introduces David Bryson, an expert in competitive intelligence (CI) with a background as a sales engineer. Bryson shares how he fell in love with CI after joining a startup and leveraging his knowledge of a former employer.
[3:15] How AI is Changing Intelligence Gathering Bryson distinguishes between "information gathering" and "intelligence gathering." He notes that while AI is excellent at automating the low-value, time-consuming task of collecting information, the real work begins after.
"AI just allows us to get a lot more information in a lot faster, which then allows us to look at it a little bit deeper and get to that intelligence, get to that 'so what'."
[5:45] Handling Information Overload and Verifying AI Output The conversation turns to the challenge of sifting through the vast amount of data AI can provide. Bryson emphasizes the need for a critical eye.
"You still have to apply that critical eye... I advise a lot of the analysts on our team to really take it with a gigantic boulder of salt... All of that is trained on the marketing message of the competitor."
[10:01] AI as an Enhancement, Not a Replacement Bryson argues that AI enhances, rather than replaces, the human analyst. He describes using AI as a "superhuman" research intern that requires expert guidance through detailed, page-long prompts.
"If you think that AI can automate away an analyst, then you probably don't have very good analysts." "The other [way AI has enhanced our work] is learning... I have found AI to be a wonderful way to learn something I didn't know before, because it's like your personal tutor."
[17:10] Competing on Outcomes, Not Features With AI being integrated into every product, Bryson explains that traditional feature-function comparisons are becoming obsolete. The new differentiator is the specific outcome the AI capability delivers for the user.
"The differentiation is the outcome. So what does the user want to do? What problem do they want to solve? And does the AI help them solve that problem or not? That, I think, is the new sort of feature matrix."
[23:23] Shifting from Reactive to Proactive CI By automating research, AI frees up analysts to focus on higher-value strategic work. This allows teams to move from a reactive posture to proactively identifying market gaps and predicting competitor moves.
"AI is helping us be more strategic in the work that we do... and because we can be more strategic, we can be more proactive."
[27:12] The "So What?" Framework for True Intelligence Bryson shares the single most important question for turning raw information into actionable intelligence: "So what?"
"She would look at me, and she'd say, 'David, so what?'... The way that you get to intelligence is you look at the information that's presented to you... and you say, 'okay, so what?' And what that does is it starts to make you ask additional questions."
[30:29] Skills for the Next Generation of CI Professionals Looking to the future, Bryson outlines the essential skills for CI professionals in an AI-driven world:
Be an expert in human behavior.
Have a good understanding of history.
Do something else first. He argues that the best analysts bring diverse experiences from other roles like sales, product management, or marketing.
[34:29] Advice for Aspiring CI Professionals Bryson encourages those interested in the field not to be intimidated if they lack a formal CI title. He stresses that curiosity and a unique perspective are the most valuable assets.
"Some of the best CI people I've ever worked with, they never did CI a day in their life. And then they come into the job, and they're magnificent, because they just think differently."
[36:18] Conclusion The podcast wraps up, with Sweenor thanking Bryson for his insightful perspective on the evolving world of competitive intelligence.
Listen to the full conversation with David Bryson on the Data Faces Podcast.