In this week’s AI Is the Easy Part, a series designed to help business leaders move from GenAI experimentation to enterprise transformation, we’re talking about how to vet AI suppliers.

Leaders are feeling pressure to “do something with AI.” That pressure often turns into a scramble: vendor meetings, rushed pilots, and external solutions introduced before the actual problem is defined. It may look like progress, but without the right foundation, it leads to failure. Here’s why: without a clear use case, a grounded understanding of your current capabilities, and alignment on what success looks like, even the most advanced tool won’t get you where you need to go.

Before your next demo meeting, take a moment to slow down. These are the criteria I share with clients before they engage external solutions:

BEFORE ASSESSING SOLUTIONS

- Map your current flow: Understand how the process works today: what’s manual, what’s slow, who’s involved, and where improvement is needed.
- Keep it simple: This space evolves fast, and you do not want a cluttered stack of tools. Before seeking something new, confirm your existing solutions can’t address the problem.
- Assess your current technology: Review platforms already in place (ChatGPT, Salesforce, HubSpot). Understand their AI capabilities, current usage, and future roadmap. It’s almost always more efficient to deepen an existing partnership than to add a new one.
- Be clear and choiceful: If your use case truly can’t be solved with current tools, define it clearly. What is the problem, and what is the specific need?

WHEN ASSESSING SOLUTIONS

- Do your research: Leverage tools like ChatGPT’s deep research to understand and do an initial vetting of potential partners.
- Ask for a step-by-step walkthrough: Vendors should explain exactly how the tech works. If they default to “proprietary,” proceed with caution.
- Clarify what makes it different: Ask how it differs from models like ChatGPT. Many tools are wrappers. Understand what real value is being added versus great prompting in ChatGPT.
- Define business outcomes: What cost will it reduce, what revenue could it unlock, and what’s needed from your team to get there?
- Understand training and support: Who leads training? What’s the time commitment? What happens when questions arise?
- Ask how the tool improves: How does it learn? What feedback loops exist? Review the roadmap: what is coming next, and why does it matter?
- Request case studies with results: Ask for specific examples of value delivered. What worked, for whom, and in what conditions?
- Plan for validation: Are you ready to outsource this capability? How will you measure its value versus your current approach? A side-by-side test may be needed; see the sketch after this list.
- Always test before buying: Run a real pilot, with your data and your team. A demo is not evidence.

If you’ve seen this dynamic inside your organization, I’d be curious what you’ve learned.
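On the side-by-side test mentioned above, here is a minimal pilot-scoring sketch in Python. It assumes you run the same real tasks through both your current approach and the vendor tool, have the same reviewers rate both outputs, and track time spent; the `PilotResult` fields, the 1-5 rating scale, and the sample numbers are all hypothetical.

```python
# Minimal side-by-side pilot harness: the same real tasks are handled by your
# current approach and by the vendor tool, then rated on one shared rubric.
# Field names, the 1-5 rating scale, and the sample numbers are illustrative.

from dataclasses import dataclass
from statistics import mean

@dataclass
class PilotResult:
    task_id: str
    baseline_score: float    # reviewer rating of current approach (1-5)
    vendor_score: float      # reviewer rating of vendor output (1-5)
    baseline_minutes: float  # time the current approach took
    vendor_minutes: float    # time the vendor path took

def summarize(results: list[PilotResult]) -> dict[str, float]:
    """Aggregate quality and time deltas across all pilot tasks."""
    return {
        "baseline_quality": mean(r.baseline_score for r in results),
        "vendor_quality": mean(r.vendor_score for r in results),
        "quality_lift": mean(r.vendor_score - r.baseline_score for r in results),
        "minutes_saved_per_task": mean(
            r.baseline_minutes - r.vendor_minutes for r in results
        ),
    }

# Example: three real tasks, rated blind by the same reviewers.
results = [
    PilotResult("T-001", baseline_score=3.0, vendor_score=4.0,
                baseline_minutes=45, vendor_minutes=12),
    PilotResult("T-002", baseline_score=4.0, vendor_score=3.5,
                baseline_minutes=40, vendor_minutes=15),
    PilotResult("T-003", baseline_score=3.5, vendor_score=4.5,
                baseline_minutes=50, vendor_minutes=10),
]
print(summarize(results))
```

The key design choice is agreeing on the tasks and the rubric before the pilot starts, so the vendor is measured against your current approach on your data rather than against a polished demo.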
How to Evaluate Technology Acquisitions
Explore top LinkedIn content from expert professionals.
Summary
Understanding how to evaluate technology acquisitions is critical for organizations aiming to align new tools with their business goals. This involves a careful assessment of a technology's compatibility, value, and potential to address specific organizational needs.
- Define clear objectives: Identify the specific problem your organization aims to solve and align the technology acquisition with these goals before engaging with vendors.
- Analyze current resources: Audit your existing tools and systems to determine whether they can meet your needs before considering external solutions.
- Evaluate vendor alignment: Review the vendor's track record, support model, and strategic alignment with your organization’s long-term goals to ensure a strong partnership.
-
How We Evaluate Technology at SOCPAC: A New Standard

At SOCPAC, we’ve reached an inflection point in how we engage with technology companies. The days of buzzwords and slide decks are over. Moving forward, our evaluation process is guided by four criteria, each rooted in our operational needs and foundational architecture:

1. Production-Proven: Your technology must work in real-world environments, not just in a lab, demo, or wargame. If your product doesn’t already run at scale, on-network, and under pressure, it’s not ready for our missions.

2. User-Validated: We don’t just ask what your platform does. We ask: do our operators want to use it? If an end user on our team says your tool gives them an edge, that carries more weight than any technical spec.

3. Architecture-Integrated: Every capability must connect to the platforms we’ve already deployed: a platform for strategic workflows and data fusion, a platform for tactical autonomy and sensor-to-shooter control, and a platform for AI tuning, feedback, and agent deployment. If your system can’t plug into this triad, it will create friction, not an advantage, for us.

4. Culturally Aligned: We look for companies that embody intellectual honesty, speed of iteration, and a bias for solving problems over selling products. We want partners who thrive in ambiguity and innovate under constraint.

This isn’t about shutting the door. It’s about raising the bar. We’re building a digital warfighting ecosystem, not a tech museum. If your team can plug into our architecture, align with our culture, and deliver capabilities that actually matter to the mission, we’re ready to work with you. Let’s move fast together.
-
For anyone exploring the learning and people tech market: don’t start with an RFP, start with Art of the Possible. Over the last few years I’ve worked with a number of organizations exploring the tech market, and we’ve been successful with the following approach:

1) Don’t start with requirements for a tech solution (LXP, talent intelligence, microlearning, LMS, etc.) and a list of vendors in that category. Start with the problems you are trying to solve.

2) Art of the Possible. Book exploratory demos with vendors in different categories of technology, especially outside the category you think holds your solution. Ask them how they might approach solving your problem(s). Keep this informal: give the vendors a one-pager with a list of problems to solve and some context, and let them figure out what to demo and how much additional slideware and other material to provide. You will learn a lot about available features, functionality, and approaches that you wouldn’t have thought to articulate as requirements. You’ll also get a preview of how collaborative and business-focused (or not) the vendors are.

3) Now write your requirements, issue your RFI/RFP, assemble your vendor list, and go through your selection and contract negotiation process.

This leads to less wasted effort for you and the vendors, a better-informed perspective on the solution you are really looking for, and fewer (but better) options. Vendors invited to respond after an Art of the Possible tend to put in more effort, because they understand your needs better and know exactly how their solution fits.
-
How do I recommend you evaluate a potential technology provider? Over the years, I’ve focused on six key dimensions when completing my due diligence on any tech provider (a minimal scoring sketch follows this post):

1. Requirements vs. Vendor Capabilities
- I typically recommend a customized RFP based on customer/user requirements, followed by a limited-scope POC. If the vendor is unwilling to support a POC, I recommend disqualifying them.
- Engagement with end users during the RFP process is critical, including involving them in some form of vendor scoring exercise.
- Ensure you also engage all the necessary IT stakeholders from an architecture and security perspective; engaging them too late could throw a big wrench into your process.
- Criticality of this dimension: 30% of total vendor score

2. Business Value and Pricing/License Options
- Will the vendor help you build a business case to justify the spend? If not, this is a huge red flag.
- Ensure your financial evaluation method aligns with your CFO’s preferred method (TCO vs. ROI vs. NPV vs. IRR, etc.).
- License restrictions, exit options, and pricing levers.
- Is the vendor’s pricing roughly aligned with how others price?
- Does the pricing/licensing model support flexibility for future growth?
- Criticality of this dimension: 25% of total vendor score

3. Roadmap & Strategic Alignment
- Does the vendor have a well-articulated roadmap, and does it align with how you see your requirements evolving?
- Does the vendor roadmap align with where you see the market heading?
- Do the vendor’s solution and roadmap align with your long-term data and IT strategies?
- Criticality of this dimension: 10% of total vendor score

4. Market Feedback
- Customer testimonials and references
- Analyst reviews
- Peer insights, reviews, social media
- Criticality of this dimension: 15% of total vendor score

5. Ongoing Support
- What is the vendor support model?
- Do you have an assigned customer success manager?
- How dedicated is the vendor to your success?
- Criticality of this dimension: 10% of total vendor score

6. The overall vendor ‘vibe’
- Difficult to quantify, and beware of biases, but also listen to your gut.
- Does your experience with the vendor feel like a partnership or a transaction?
- How important is your success to the vendor?
- Criticality of this dimension: 10% of total vendor score

What have I missed? What else would you add? #cdo #chiefdataofficer #rfp
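For readers who want to operationalize the weighting above, here is a minimal weighted-scorecard sketch in Python. The weights mirror the six dimensions in this post; the dimension keys, the 0-10 scale, and Vendor A’s scores are illustrative assumptions.

```python
# Minimal weighted vendor scorecard. Weights mirror the six dimensions above;
# the dimension names, 0-10 scale, and sample scores are illustrative only.

WEIGHTS = {
    "requirements_fit": 0.30,        # 1. Requirements vs. Vendor Capabilities
    "business_value_pricing": 0.25,  # 2. Business Value and Pricing/License Options
    "roadmap_alignment": 0.10,       # 3. Roadmap & Strategic Alignment
    "market_feedback": 0.15,         # 4. Market Feedback
    "ongoing_support": 0.10,         # 5. Ongoing Support
    "vendor_vibe": 0.10,             # 6. The overall vendor 'vibe'
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-dimension scores (0-10) into a single weighted total."""
    missing = WEIGHTS.keys() - scores.keys()
    if missing:
        raise ValueError(f"Missing dimension scores: {sorted(missing)}")
    return sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)

# Hypothetical scores for one vendor, as rated by the evaluation team.
vendor_a = {
    "requirements_fit": 8.0,
    "business_value_pricing": 7.0,
    "roadmap_alignment": 6.0,
    "market_feedback": 9.0,
    "ongoing_support": 7.5,
    "vendor_vibe": 8.0,
}
print(f"Vendor A: {weighted_score(vendor_a):.2f} / 10")
```

Scoring every shortlisted vendor with the same function keeps the comparison honest, and nudging a weight up or down quickly shows how sensitive the ranking is to any one dimension.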
-
I have a dear friend who is the CIO of a PE-backed firm. She shared that she’s “drowning in AI salespeople” and needs to know how to vet their solutions. Her words echo the challenge that I hear from many executives and board directors.

🗨 One recently said to me, “I’m so sick of AI. I can’t tell what’s real and what’s hype. The risk is high if I do nothing. And if I go too fast or make bad choices, the risk is even higher. I’ve got to figure this out.”

I hear you. Your concerns and frustration are warranted. To help, I hammered out three guides, covering business value, risk, and technical considerations, with questions to help you identify the AI solutions that best fit YOUR organization. These guides are designed to help you create business value with AI, avoid risks, and sustainably deploy and scale your AI solutions.

📊 Business Value Questions: 24 questions designed to ensure that AI solutions align with your strategic objectives and deliver tangible business outcomes.

🔍 Risk-Based Questions: 33 questions focused on identifying and assessing the potential risks associated with AI solutions, helping you make informed decisions that mitigate risk.

🔧 Technical Questions: 48 questions to ensure the AI solutions under evaluation have the technical robustness necessary to support your business objectives.

👉 Click below, share your email address, and you’ll receive an email with links to all 3 documents.

#AI #AIEvaluation #BusinessValue #RiskManagement #Innovation

Disclaimer: While these questions provide a solid foundation for evaluating AI solutions, it’s not possible to cover every needed question in a concise format. As always, I encourage you to apply your own expertise and judgment.

https://lnkd.in/ghG4RdP4