Thad Eby reveals why bolt-on AI isn’t enough and how proposal tech must be rebuilt from the ground up. He breaks down the three pillars of modern proposal success—living knowledge bases, context engines, and seamless integrations—and why human curation still matters. Hear how a Fortune 50 SaaS company achieved 90% content reuse and cut proposal cycles from 3 weeks to 1, plus what CROs and proposal leaders get wrong. Thad also shares what’s next: intent-driven AI, proactive proposals, and buyer-facing self-service tools.
Christina Carter (00:09): Hey Thad, welcome to the Stargazy Brief. Thank you for being here.
Thad @ Ombud (00:13): Yeah, Chris, I'm super excited to be here. Thanks for having me and looking forward to connecting with the Stargazy community.
Christina Carter (00:20): Yes, and I'm really glad you're here. What a lot of people probably don't know is that I've known you for over a decade, probably, which is insane because Ombud was probably the third proposal tech that I used as a proposal manager and was by far, in a way, my favorite. And I have learned so much about how to manage content from Ombud and the people that you have at Ombud.
And just how to manage a team. I've learned a ton from you guys. So it was more than just a tool; it was also making me better at my job. So I do want to say thank you. But on top of that, I know it's changed a ton since I used it in the past. And I know you've been building Ombud for well over a decade, but you've also rebuilt it really recently. So I'm curious if you could paint a picture for us of how it's changed, what the difference is between now and before, and why you made those changes.
Thad @ Ombud (01:22): Yeah, well, so first off, Chris, I've got to say, you are a legend in the halls of Ombud, going back a decade. The experiences that we've had together have really been a reciprocal knowledge sharing. So thank you for being a partner and a friend along the way. But I will say that we learn from every one of our clients, and you've definitely been a legend when it comes to the opportunities that we try to bring to people.
That being said, when we look at like Ombud and Ombud in general, you're right. We've been at it for a decade. And our original thesis was, how do you leverage natural language processing to really augment the human workflows that people were trying to accomplish? So when we started this journey, we really did take it under the context of how do we understand the information and more importantly, attribute that in a way that can be used in a sales cycle that would be, you know, fortuitous to the opportunities that you would engage on.
That became the thesis we built on, and it's what allowed us, when we went to rethink things in the age of generative AI and large language models, to reconsider how we approach the solution. So when we took this journey on, we actually took a step back and said, let's look at what we've done well from a process standpoint, but let's completely revisit the platform as a whole.
And we pulled together a small team. We actually went and started from ground zero. We had no infrastructure in place, no solution. We started with evaluating what's out there. And I'm a big believer that large language models are interesting, but it's really the small language models that make a difference and being able to critique and create those. So we wanted to be able to take that context engine that we'd already built with Ombud and apply it to a more granular approach to the workflows that you'd be using inside of the proposal tech.
And for us to really rebuild that, it meant being able to understand the library. It meant understanding people's workflows. And it really was looking at the end to end, from intake to delivery: what were all the steps that we would need to re-engage in? So it wasn't just ripping one piece out. It was reimagining the entire workflow so that we could take advantage of the lessons we've learned over the last 10 years and then revisit them.
Our goal was not to create a bolt-on or just a simple little add-in, or even just a simple API call out to ChatGPT. We wanted to actually build our own models. We wanted to foundationally train on the workflows that we needed to get to and then revisit that.
Now, as far as the stack itself, it meant re-envisioning the library component of how we go about organizing and engaging with knowledge. It meant reinvesting in how we think about intake and the intelligence we wanted to bring to it. And it also allowed us to really bridge the workflow engine so it understands where the focus needs to be.
There are still some components that we're using that we are able to leverage. So some of the publishing areas that we took advantage of, and we see that kind of be in the next generation of where we get to the delivery side as it relates to the prospect being able to engage almost in a self-service type fashion.
So for us, this was a journey that meant re-envisioning everything from the ground up and then being able to deliver it in a unique way to our customer base.
Christina Carter (05:10): Yeah, I find that really exciting, because I really do see proposal tech being split into two parts. There are the pre-generative-AI tools, and they're good at certain things, but they oftentimes have those bolt-ons, right? They do have those add-ons. Then you have the new ones, the post-AI tech, that are just trying to catch up to some of the functionality the legacy tools had.
And so yours is really unique in that you are building it from the ground up. I'm wondering, what would you tell people who are looking at tools like this to purchase, or deciding whether to switch from what they're using to a new one? What should they look out for? What questions should they be asking the salespeople or the product people to see if it's a bolt-on or a true change in the platform?
Thad @ Ombud (06:12): Yeah, well, so first off, I would say really quickly, when it comes to how you get started, or how you start to have the conversation, the thing I would really try to understand is: what's that two-week RFP reality check? And really what I mean is, what can the team get done in two weeks with what you're able to do today? Not to say that you should be taking two weeks to get a proposal done. That's not what I'm saying.
It's: what is the sprint that you're realistically able to achieve in a two-week timeline? And you may find today that the legacy solutions out there meet what you need to do in a two-week sprint and align to your business. But what we find is it really comes down to where we can actually start to change that workflow and shrink that from two weeks to even a three-day type of scenario.
And it's all about the time efficiency that we start to look at, the effectiveness that the individual is able to take on in their work. And then what's the level of empowerment that you can engage into the job function itself.
So if I were to kind of like be, you know, looking at this and somebody were to approach me and say, you know, how do I actually start to look at the technology and more importantly, what should I be looking for? I would say it's not a technology decision and that's probably the key component to it. It is about the process and the three core components that you need to have in place to drive that process.
You have to have a living knowledge base. You can't just rely on static content. It needs to be dynamic and self-updating, and that's something technology can do right now. We're at the point where we can run through review cycles, provide regulatory checks, and verify that the content is current and accurate, and we can do all of that through technology.
The second is context engine. It's not just about understanding keywords and kind of putting together the pieces, it's do we understand the deal itself, the partner, the ecosystem, and the solution. So the context engine is really kind of that second phase.
And then the third is all about bringing it to your team and the tools they're working with. And it's the level of integration that you put in place. So you're not having to learn net new software.
So if you were to come to me and say, well, where would I start? I'd say, well, what can you get done in two weeks today with your current process? What do you want to get done? And a lot of times it might be a volume. It might be an efficiency. It might be an effectiveness. And I argue that you need to really solve all of those, but to do it, it's got to be knowledge, context, and integration to your solution suite.
Christina Carter (09:01): That made me very happy to hear that. Yeah, I completely agree with you. I think a lot of us, it's really easy because there are all these new tools out there. They're shiny, they're fun. And so it's really easy for us to go into and be like, okay, well, I know I want these functionalities, this checkbox. And of course we're used to that. We respond to RFPs all day talking about functionality. But it does come back to what you actually want to achieve and the goals that you have.
Yeah, I really appreciate that answer. I'm also wondering, because you have built Ombud again from scratch, do you think there's a minimum viable proposal stack that you would put in place if you had to start from zero again? What's the baseline you think all useful proposal tech has, whether in its functionality or in its effectiveness?
Thad @ Ombud (09:55): Yeah. So first off, and I hope this doesn't come off wrong to your community, but I would not start with proposal software, right? I think that is the thing where we get really caught up of like proposal software. It is knowledge management. And more importantly, when you come to the knowledge management, that has to be kind of the piece that you care about. When you think about your business, you have assets, right? Your people should be your number one asset.
Your customers should be your number two asset. And the third is the information that drives your business. And so if you're not treating that as knowledge and expertise, then you have a big challenge. The second thing is you kind of have that 80/20 rule and a lot of people try to basically boil the ocean. And what they don't realize is that if you solve the 20% of the knowledge, you're going to get to 80% of the effectiveness that you need across the workflow.
And so the big thing I always look for, when it comes to what you've got to have as a minimum viable product, is: what does that corpus look like, and how does the tooling and the solution allow the workflow to auto-manage that corpus of content and give you visibility into what needs to be done?
The second piece, then, is: how do you bring in the collaboration aspect with the subject matter experts who need to be involved? If you can't continue to feed it and keep it up to date, then you have a big challenge when it comes to your ability to deliver, because it's the fundamental rule of computer science: garbage in, garbage out.
So that's obviously the second piece, that collaboration access. And then finally, the thing I like to get to is: what level of solution tooling are you providing that complements the human workflow and can automate the mundane tasks that are part of it?
So when I look at this from a thesis, it would be that knowledge piece put in place from the very beginning, the ability to collaborate within that knowledge, and then the ability to automate or drive the workflows that you need to do from your end-to-end process.
Christina Carter (12:06): Yeah, that makes so much sense, because of course none of that matters if you don't have the content there to respond with. The collaboration doesn't matter so much, and it takes so much longer, and automating isn't going to matter either without that content. So I really appreciate that framework. I'm wondering, then, is that the suggestion you would have for people responding to RFPs: to first focus on the content that they are building?
And also the content that they are connecting to. But on top of that, and I think this is a more difficult question, is it worth connecting with all the internal content you have within your organization and all the relevant websites? Or should you be more, I guess, discerning about what content you're using to respond to RFPs?
Thad @ Ombud (12:54): Yeah, so it's interesting you say that because early on we used to believe that you needed the content 100% perfect before you really got started in this. And I mean, even when we worked together, right, we would come in and we'd really kind of get to this kind of content discovery step in our lifecycle.
What we found was you actually only need an initial seed to verify that the process of the workflow is working. You do need automated ways, though, to ingest the content. Whether that's historical RFPs, product documentation, any marketing content that's been put out, any technical architecture documentation, it all should be ingested and consumed to build you that knowledge graph automatically, right?
What we never want to do is we never want to disturb the source of truth. And that, I talk about integration being one of those components that you have to have in place, Chris, is all about making sure that you keep the original source of content, but in a way that you can factor it up. So I don't think that you need to have it fully articulated and built out. But as you go through your workflow, you need to be able to understand what are the gaps that are being created in that workflow.
And then do you have a measurable way to acquire or to author or collaborate on that content that needs to be part of the knowledge that you're using to deliver your ultimate proposal to that customer or prospect.
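(For readers who want to see the shape of what Thad is describing, here's a minimal sketch of that ingest-and-find-the-gaps loop. The hash-based embedding, the function names, and the thresholds are illustrative stand-ins, not Ombud's actual pipeline; note how each chunk keeps a pointer back to its source of truth rather than replacing it.)

```typescript
// Toy stand-in for a real embedding model: hash tokens into a fixed-size
// bag-of-words vector. Swap in whatever embedding provider you actually use.
function embed(text: string, dims = 256): number[] {
  const vector = new Array(dims).fill(0);
  for (const token of text.toLowerCase().match(/[a-z0-9]+/g) ?? []) {
    let h = 0;
    for (const ch of token) h = (h * 31 + ch.charCodeAt(0)) % dims;
    vector[h] += 1;
  }
  return vector;
}

interface SourceDoc {
  id: string;     // e.g. a past RFP, a product doc, a marketing one-pager
  source: string; // pointer back to the source of truth, which is never modified
  text: string;
}

interface KnowledgeChunk {
  docId: string;
  source: string;
  text: string;
  vector: number[];
}

// Split a document into paragraph-sized chunks so matches stay granular.
function chunkText(text: string, maxChars = 1200): string[] {
  const chunks: string[] = [];
  let current = "";
  for (const para of text.split(/\n{2,}/)) {
    if (current && (current + para).length > maxChars) {
      chunks.push(current.trim());
      current = "";
    }
    current += para + "\n\n";
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}

// Ingest everything into an in-memory index, keeping the reference to the original source.
function ingest(docs: SourceDoc[]): KnowledgeChunk[] {
  return docs.flatMap(doc =>
    chunkText(doc.text).map(text => ({ docId: doc.id, source: doc.source, text, vector: embed(text) }))
  );
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; normA += a[i] ** 2; normB += b[i] ** 2; }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB) || 1);
}

// A "gap" is any incoming question whose best library match falls below a threshold --
// that is the queue of content that still needs to be authored or curated by an SME.
function findGaps(questions: string[], index: KnowledgeChunk[], threshold = 0.6): string[] {
  return questions.filter(q => {
    const qv = embed(q);
    const best = Math.max(0, ...index.map(c => cosine(qv, c.vector)));
    return best < threshold;
  });
}
```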
Christina Carter (14:27): Yes, and that is such a difference from how it used to be. And I think this is something we're having to relearn, because back in the day, one of the times I was using Ombud, I did have to have fairly good content, and it had to be customer-ready, because I knew it was basically going to be copied and pasted, maybe fixed up, before it was sent to a customer. So it did have to be perfect.
But now, of course, like that is no longer true, like what you're saying. And so I'm almost wondering, like, what would your suggestion be to the proposal or salespeople who are responding to RFPs that maybe just, they don't have the resources or time to keep up a content library? Obviously not to the same extent that they did before, but maybe there's a lot of good sources already internally, like the technical documentation, like the marketing, like the product documentation.
If all of that exists, and because AI can then rewrite, you know, take all the information and rewrite a fairly good answer, do we then need an internal library of content or maybe do we not? Like, what is your suggestion here?
Thad @ Ombud (15:36): Yeah, well, let's just come out and call it, right? Human curation is why AI works. If we don't have the initial knowledge, we can't pull it together. And I do think we're getting to a point where AI removes a lot of those manual workflows and processes that used to come into it.
Christina Carter (15:44): Yeah.
Thad @ Ombud (16:03): I think you need to have a knowledge content strategy in place. And what works for one organization might not work for another. You may have source content that comes in, and it's gold standard, and it provides exactly what you need. You may have other organizations where that gold-standard content is outdated, it's mundane, and it has a lot of issues.
I feel that you need to augment any sort of your knowledge with some level of human curation. And that human curation may be to a level of augmentation. It may be authoring, but there needs to be some level of a review into the knowledge management.
Now, I believe you can run that in two different fashions. One is centralized, right? A single point of ownership. We have lots of examples and clients that have that single point. I've also seen it work very successfully in a distributed fashion.
But the traditional library manager that you had even two to three years ago, I think that position has evolved. You become more of a librarian, pulling things together, but it shouldn't have to be a full-time function in the business. And then it comes back to: do you run that in a distributed role, or do you have it as a centralized function? And that really is more of a process and internal-workflow conversation than a question of how we solve it with technology.
Christina Carter (17:34): Yeah, I mean, I think it is a change because of the technology though. Before, it just had to be different. And I'm seeing quite a few companies, especially within software, they are, especially if they're a large organization, they still tend to hire content managers to keep up their proposal content library.
And I'm also seeing other companies who are kind of evolving the role to then hire somebody to manage the proposal library, but then all the other libraries too, in the distributed fashion you're talking about, to make sure those are updated. So when it does get pulled into the response, that it's updated. And I'm wondering if you have any suggestions for people who are content managers right now, like what they should be working on right now to kind of get ready for kind of how the role is probably going to look fairly soon.
Thad @ Ombud (18:31): Yeah. So first off, I think anytime you're going to do like a content audit, I think you need to lay out the steps that you're currently doing today. And then I think you need to look at what are kind of the best practices and then try to do kind of a gap analysis between where you are today and what we're seeing.
I think absolutely you need to be getting up to speed on a lot of the marvels that we're seeing in AI in general. And so when you think about content in AI in general, I can definitely provide you with a couple of resources that I think do a great job of kind of bringing you up to speed.
But you absolutely need to be upgrading your tool set. If that is not a conversation you're having today, and you're not familiar or comfortable with it, whether it's engaging with some level of prompting or some level of content revision, if you're not using that tooling today, you're going to be in a position where it's really hard to compete with the people who are using those solutions, because the level of efficiency and effectiveness is just off the charts.
So I think you absolutely need to look at your process. You need to be looking at what are the best practices and then conduct a gap analysis of where you need to be. You also need to understand what is your reasonable tool set and be able to go through your understanding.
Now, off-the-shelf solutions out there provide a lot of that process, and that's what we always try to do here at Ombud: give you a marriage of what the technology can do, where your process is, and the gaps in between, so that you have a way to augment that as a content manager.
But absolutely, the content manager role of historical days is not there anymore. You now have to be at a point where you can not only leverage the content you're working from, but also work across those other sources, or handle them inside the workflow you're currently using. So if you haven't already thought about these things, you'll be facing a challenge going forward, for sure.
Christina Carter (20:32): Yeah, I'm also wondering from the hiring manager's point of view, who are hiring for these content library or content manager roles, what would you say to them? How do they need to be thinking about this differently as they are making their hiring plans in the future or hiring right now?
Thad @ Ombud (20:50): Yeah, well, so first off, the one thing I would come back to is the hiring manager's own overall insight into what's going on. That's the first thing I would start with: the hiring manager needs to self-reflect and make sure there is a vision that they're carrying to the organization.
And there's a level of understanding. They don't need to be an expert, but they need to understand what the capabilities are, or be seeking out advice to help them with that conversation. If you're a hiring manager and you're saying, I just need an AI content manager, it's got to be more than that, Chris. You really do need to understand the role.
So it's: what is the art of the possible, what can we obtain, and where do we need to get to? Then it becomes the skill set we're looking for in that individual. But I would say, above anything, if a hiring manager is going down this journey, I would start with that hiring manager and make sure they really are in tune with what this change means, and then make sure they go find the right person who can adhere to the process we want to get to when it comes to next-generation AI content.
Christina Carter (22:01): Yeah, and what would you say is kind of the hidden tax of just kind of keeping with what we've just always been doing? Like maybe our proposal software just hasn't really kept up. Maybe it just has some add-ons, some bolt-ons of AI. Maybe our team doesn't really know a lot about AI or doesn't feel like they can use it in a work setting. What is the tax for that team of not really upgrading how they're doing things?
Thad @ Ombud (22:30): Yeah, well, I think the one thing is, and if I can take just a step back here: you have the tax, but then you also have the mistakes people make when trying to modernize their proposal stack. So let me first break down what the mistakes are that people make going into it, and then what taxes come out of that. Let me take that first.
Christina Carter (22:46): Yeah.
Thad @ Ombud (22:58): The thing I would say is that when people look at this, they start with kind of a tool-first approach, right? So it is really about, I'm going to get the technology. I want an auto responder. I want a tool that's basically going to make my RFP process super easy or my proposal process in there.
And again, I will hit on that workflow. Not understanding what that workflow means becomes a challenge. The second thing that we see, which is a pretty common area, is this big bang approach. I'm going to replace everything at once versus kind of iteratively going through that. And then it's a failure to understand the change management. So if I don't understand that people will have to adopt new tools and incentives, and that's where I go back to that hiring manager not having that thought process, it becomes hard.
The other thing that we see, which is very common, is over-engineering. I was with a prospect two weeks ago, and they're literally using a really antiquated, outdated solution from a company we're all probably very familiar with, and they're trying to go straight to the Rolls Royce of solutions overnight. That is way too complex, and the organization doesn't understand the change management.
And then finally, you miss the subject matter experts, so you can't scale the knowledge.
So what happens from a tax perspective when you take that approach? First off, it's talent retention, right? You absolutely are going to lose the best people because you're going to be still doing that manual work. They're going to get frustrated. They're not going to be able to get what's in place.
The second is that competitive advantage, right? I have clients that have gone through this transformation. What we talked about taking two weeks earlier is now down to a two-day workflow, and they are literally outpacing their competitors three to one on volume, activity, and effectiveness.
And then the third area is you start to have this attrition of knowledge. People start walking out the door, and that creates a really big challenge. When you look at scaling, if you can't grow proportionately at that point, you've now put yourself in a position where you can't get out from under your own weight to where you need to be.
And then finally, the customer loses, right? You have these inconsistent proposals, it creates buying confusion, and you no longer have a real, centralized customer experience that you can deliver. And that's one question I always ask people: is your proposal customer-driven and customer-facing to the point that your customer views it almost as a work of art?
Christina Carter (25:21): Yeah, that is so true. And that is one thing that, correct me if I'm wrong, but I feel like Ombud is one of those where you can use it just about wherever you live. And so you're not dragging your SMEs into Ombud, like you're working in Word or Excel, like they're in there too. And so to me, like that just makes it so much easier for everybody to use. Am I right about that? Is that something Ombud does?
Thad @ Ombud (25:58): Yeah, so it's a great point. Early on we used to have this concept we called the single pane of glass, right? Let's get you into one environment. And that works well. I will say, if you want to standardize what you're driving, that single pane of glass makes it really centralized, and that capability is there.
Over time, we've evolved, and the capabilities have gotten better. Go back to the early days: a Word plugin really was a challenge. Microsoft has obviously opened up the add-in technology since then. You can literally have a full React app running inside of Word or Excel or PowerPoint and extend it from there.
So we created this initiative a few years ago. We called it O2 or Ombud Outside. And the whole thought process was: how do we bring the capabilities of Ombud into your native tooling so you can use it at point of presence and you can bring people along without requiring them to go into a single system and have to do everything in one location.
So you're absolutely right.
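(If you're curious what "point of presence" tooling looks like under the hood, here's a minimal sketch of a Word task-pane add-in that inserts a library answer at the cursor. The Office.js calls are the standard Microsoft add-in API; fetchAnswer() and its endpoint are hypothetical placeholders for whatever content service you'd connect, not Ombud's O2 implementation.)

```typescript
/* global Office, Word */

// Hypothetical knowledge-base endpoint; replace with your own service.
async function fetchAnswer(question: string): Promise<string> {
  const res = await fetch(`https://example.invalid/answers?q=${encodeURIComponent(question)}`);
  const body = await res.json();
  return body.answer as string;
}

// Pull a drafted answer and drop it into the document at the current selection.
async function insertAnswerAtSelection(question: string): Promise<void> {
  const answer = await fetchAnswer(question);
  await Word.run(async (context) => {
    const selection = context.document.getSelection();
    // Replace the highlighted question (or empty cursor position) with the drafted answer.
    selection.insertText(answer, Word.InsertLocation.replace);
    await context.sync();
  });
}

Office.onReady((info) => {
  if (info.host === Office.HostType.Word) {
    // Wire the task pane's button to the insert action.
    document.getElementById("insert-answer")?.addEventListener("click", () => {
      const question = (document.getElementById("question") as HTMLInputElement).value;
      insertAnswerAtSelection(question).catch(console.error);
    });
  }
});
```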
Christina Carter (26:57): Yeah, that's just been such a huge issue for me in the past: getting people onto something that they use maybe once a month or sometimes once a year. So no, I think that's a huge, huge change, and not a lot of tools do that, which is why I wanted to call it out.
But that kind of brings me to my next question, which is really about a success story. I'm wondering if there is, you know, a firm that you've worked with that's kind of rebuilt what they've done and it's worked really well. And I'm curious like what they did differently so we can all copy them and also have the same success.
Thad @ Ombud (27:31): Yeah, so I'm going to take a scenario from our Ombud Next Generation, right, our Ombuddy infrastructure with the agentic AI capabilities we've built in, and the AI-first approach to the solution. So first off, we hope to do this case study with everyone, right? When we look at the different clients we've worked with, our goal is really to be able to bring that transformation to everyone. So I hope the story I share with you today is not just a single story, but one that we can bring to anyone.
Christina Carter (28:01): Yeah.
Thad @ Ombud (28:01): First off, I think it really comes down to that knowledge audit. Being able to audit the knowledge that goes into the problem space is the first step that we start with. And I cannot tell you the number of prospects that have been like, no, we just need to get this in. And so we pause and we go through it.
So this client that I'm talking to, they're a very large Fortune 50 company. They sit in the SaaS market. And what we've been able to do with them is we started with understanding what was the different components of knowledge that lived in the business and then really being able to audit what level of automation can we bring, what level of human curation needs to be, and how do those process workflows come underway. That was step one.
The second thing was then looking at the highest value workflows that we could incrementally scale and incrementally achieve. So again, it wasn't about boiling the ocean overnight. It was starting with one thing. So we started with intake and specifically looking at how intake would occur.
So it was: how can I do my research up front? How can I get my win-loss analysis ahead of time, so I understand from a qualification perspective where I am? How can I ensure I've identified where the gaps in my content and knowledge base exist? That was workflow one — just really getting to the point that the front of the process was solved.
And then the second thing that we looked at was how do we choreograph the orchestration of all the people that are going to be involved. This particular client in general will have anywhere from five to fifteen people that will work on an individual proposal because there's so many components. These are 4,000, 5,000, 6,000 unique requirements in a single response, multiple pages.
A lot of public sector work. So a lot kind of went into that collaboration component.
And then finally, we solved against the review: how do we orchestrate it so we have the best version, the best iteration, and solve it from the front to the middle to the final component of what we're delivering? That was the big piece.
What we saw was that focusing on user adoption, and embedding it into the existing tooling you alluded to earlier, really transformed the outcome. In that particular case, we're seeing upwards of 90% content reuse — 90%. In the old days we would see 45, maybe 50%. And 90% is really impressive.
And we've been able to literally cut the response time to a third of what it used to take. There's still a review, right? There's still a process that needs to be there. But going from three weeks to one on that type of workflow was a really big opportunity for us to invest in.
Christina Carter (31:00): Yeah, that's huge. So what I'm hearing is essentially: don't just dive in head first, take it step by step, really look at the outcome you actually want, and also really focus on the change management. It's not just about the tech, it's getting everybody along on the journey with you, especially with these new tools that are different from the ones we're used to. I know that's a really fast, high-level overview, but is that...
Thad @ Ombud (31:29): No, I mean, the way I would describe it is what they did differently was really three things.
Change management first, technology second, right? And that is one thing that, you know, if you are thinking about this as technology first, you already are behind. So change management is kind of that first piece that made them successful.
The second one is measuring success in user behavior, not speed metrics. It's user behavior. And what you're doing is you're changing the behavior of how people operate and function.
And the third is when you're bringing AI to the conversation, it is about making it helpful for the individual, not threatening to their function. And so what you really think about is how do I augment that individual and give them a tool set versus trying to replace functions that they should be doing.
And so those were really the three things: change management, measure success by behavior, and then make the AI as assistive as possible, so that you're providing a function that allows that person to do their job better.
Christina Carter (32:31): Yeah, I can see that being incredibly powerful, but also not obvious to most people. So I'm also kind of wondering, I know that you work with a bunch of CROs and a bunch of proposal leaders. And so I'm kind of wondering what you see them over-invest in that doesn't actually make a big enough difference when they are rebuilding.
Thad @ Ombud (32:53): Yeah, so first off, I think when you look at a CRO, you have to understand what's the frame of reference or the experience that the CRO has. I always like the question, hey, does the CRO even get involved in this process or is it so far removed from where they are? And it's okay, right, if they're not involved, right? But do they have the experience to understand what really goes into this?
I think so many times, Chris, people look at proposals and RFPs and responses as an afterthought or a burden. You really have to make this a strategic advantage to your business. And if you don't look at it that way, then again, you're already behind, on your back foot. So that's something that's really important to me, making sure that's there.
Here's a couple of things that I've seen when it comes to like, you know, where CROs kind of get it wrong, or their organizations in particular, because I think, again, it comes back to is the CRO really the responsible person, or are they just delegating the capability.
First off, it's kind of these fancy collaboration features. I see so many over-articulations of these complex approval workflows, complex review cycles. It's like, really, no, you don't need to have six different people go through and review.
That's one piece where I see it's like over-articulating on that collaboration.
We talked about this a little bit in content management, but trying to get the perfect content library, I think, is again a mistake. It's about making sure you have templates, you have a way to capture what works, and you can start with a very small piece of it.
The third is this hype cycle. We've seen the MIT study that claims 90% of these AI projects fail. So taking the attitude that AI is going to write everything and solve it all... again, it's not there to replace humans. It's there to eliminate the mundane tasks. You need to understand there are a lot of limitations with AI. It gets better every day, but you can't try to do everything with it.
Then the other piece I would hit on is integration overload. We had one prospect we were working with that literally wanted us to integrate with like 15 different systems. When we actually got out there, we had to limit it to one: their CRM solution. It's like, why do we need to have all these pieces? Does it really make you better?
And then finally, I would say what really matters and what you need to be caring about is user adoption, the knowledge capture, and then are you reducing the time to value? And if you're doing that, it gives you more opportunities at bat, creates more effectiveness in your sales cycle. And then more importantly, you have a more focused customer vision and mission that you can deliver to them.
Christina Carter (35:39): Yeah, that's the goal. I mean, the output we're trying to get to is to win. So that makes perfect sense. Yeah, I feel like so many CROs I know responded to RFPs forever ago, back when they were AEs or maybe solution consultants. But of course they don't quite get it at the same level that we do, because why would they?
So what would you suggest... I mean, why would they know this, and why would they get into the weeds of it? I'm just wondering, what would you suggest to proposal leaders who may be part of the decision to purchase new proposal management software or new revenue software, or who would want to bring that up to the CRO or sales leaders? How would you suggest they bring it up to show the value of doing that?
Thad @ Ombud (36:31): Yeah. So when I think about evaluation, I think you really do need to be thoughtful in this, right? So many times we go to the CRO or the decision-maker like, this is going to save me a ton of time. And this is going to give me all these great things. Or here's my ROI study of all the money we're going to save the business. And those are great, right?
But here's the kind of the tangible, tactical way I would go about this.
First off, I have something called the 48-hour test. I like to go in and say, can we track how long it takes to respond to an unexpected RFP question? What's that look like in a 48-hour window? Not weeks, but in 48 hours, what's that look like for us to understand?
The second thing is, have you done an SME time audit? Do you know what percentage of your expert time goes into those repetitive questions and being able to drill into it? Do you even know how much time your SME spends inside of these workflows?
Then the third area is content reuse. How often are you creating content that already exists or already lives inside of the organization? I'm a big proponent of this concept called first-time-use knowledge, and that's really all you should be using an SME for. First-time-use knowledge is the intersection of a question coming in that we don't have an answer to and the curation required to drive one.
And then finally, what is our quality control process? How many errors and inconsistencies slipped through our current evaluation process that we don't even know about? I had one customer that had a couple of profane words because somebody put comments in and they never even knew that was there until we ran through a content control review.
So it's what can we do in the 48 hours? What's that SME time audit look like? What does our content reuse really drive into? And then what level of quality control can we provide so that we are providing the best articulation to the customer?
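(As a rough illustration of that audit math, here's how the 48-hour rate, SME time, and content reuse might be computed from a simple response log. The field names and thresholds below are invented for the example, not pulled from any specific tool.)

```typescript
// One record per answered RFP question, however your team happens to log it.
interface AnsweredQuestion {
  reusedFromLibrary: boolean;   // answered from existing content vs. written fresh
  smeMinutes: number;           // expert time spent on this question
  turnaroundHours: number;      // request-to-answer time
}

function auditMetrics(log: AnsweredQuestion[]) {
  const total = log.length;
  const reused = log.filter(q => q.reusedFromLibrary).length;
  const smeHours = log.reduce((sum, q) => sum + q.smeMinutes, 0) / 60;
  const within48h = log.filter(q => q.turnaroundHours <= 48).length;
  return {
    contentReuseRate: total ? reused / total : 0,       // how much already existed
    smeHoursSpent: smeHours,                            // expert time eaten by repetitive questions
    fortyEightHourRate: total ? within48h / total : 0,  // the "48-hour test"
  };
}

// Example: 3 questions, 2 answered from the library, 1 needing fresh SME work.
const example = auditMetrics([
  { reusedFromLibrary: true, smeMinutes: 5, turnaroundHours: 2 },
  { reusedFromLibrary: true, smeMinutes: 0, turnaroundHours: 1 },
  { reusedFromLibrary: false, smeMinutes: 90, turnaroundHours: 72 },
]);
console.log(example); // contentReuseRate ≈ 0.67, smeHoursSpent ≈ 1.58, fortyEightHourRate ≈ 0.67
```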
But you have to start with proposals being a strategic asset, not a burden in the sales cycle that just costs and takes time away from me.
Christina Carter (38:32): Yeah, I feel like that's so important because, I mean, we know that RFPs contribute to almost 50% of revenue in sales cycles. So you think it'd be obviously important. One last question for you. What do you think the proposal tech world is going to look like in a couple of years from now? What should we be looking out for?
Thad @ Ombud (38:56): Yeah, well, I wish I knew the future because I would obviously be in a much different position. But if I see like what the trends are of what we're seeing and what kind of is taking place, I think that first off, this is probably going to be the most amount of change that we've seen in this industry in the last 20 years. I mean, it really is.
I mean, this is a period of time. I started my career during the dot com boom and bust. And so that was one of the most explosive times in my career of seeing just transformation. And we're on a very similar trajectory here with what we're seeing for AI.
So I think, first off, it's all about moving away from templating the conversations. It is going to come back to AI that really understands intent, not keywords. And that is something that's really important, that we get that intent right.
The second thing I think is really important here is that we're going to move from a reactive approach to a proactive one. We want to be able to anticipate the questions before they arrive. I should be at a point where I can say: here's my customer, here's where they're at, let's go through and proactively define and deliver what the response should be, and provide all the research that went into it.
So you're providing more of that consultative, proactive response than a reactive feature-function exercise.
I'm a big believer that self-service is coming and that is going to lead to customer-facing AI. I'm already seeing it. We're already delivering it for a few of our, what I call very innovative customers, but buyers are interacting directly. They want to be able to have a self-service, almost a Shopify experience to getting at the information that they need to make their evaluation with you.
And then finally, real-time knowledge. We're getting to a point where content is going to update automatically as the business changes. And being able to have that real-time context and knowledge available to you is a gap that we will compress and turn to zero.
Christina Carter (40:55): Amazing. And Thad, where can people learn more about the new, very cool Ombud and yourself?
Thad @ Ombud (41:02): Well, first off, there's this awesome community called Stargazy that has this wonderful community-driven approach to information. So I would definitely start with there. I think it's a community resource that is very underutilized and is just waiting to get itself out there.
The second thing, in full truthfulness: we've not invested heavily in marketing ourselves. That has never been my focus point for the business. It's about how we build a solution that customers want to use and how we deliver on their mission. That's what fuels our team.
But you can go to ombud.com and get more information. We will be launching a new website before the end of the year, so we're really excited about some of the gains we've been making. That's the other piece.
And then finally, I would say to learn more, have a chance to speak to our users or our customers. We're very much about being able to provide you with that, but you can start with the community resources such as Stargazy. You can look at our website, or I would recommend definitely spending time with community members who have used our solutions.
Christina Carter (42:10): Very cool. Well, thank you so much, Thad. This has been incredibly enlightening and I have 500 things I need to go look up now after this conversation. So thank you.
Thad @ Ombud (42:20): Yeah, of course. Absolutely. Thank you so much, Chris. And obviously, if there's anything we can ever do to help you and your community, just let us know.