Host Dave Sobel engages in a compelling conversation with Mark Haas, the CEO of the Helmsman Group, a company specializing in the consumer packaged goods (CPG) sector. With over 30 years of experience in the industry, Mark shares insights into how AI, particularly natural language processing (NLP), is transforming food regulation and safety. The discussion delves into the origins of their innovative platform, Regulate, which was born out of a need for a sophisticated data warehouse to manage the extensive documentation required in the highly regulated food and beverage industry.
Mark explains that Regulate initially served as a static database to archive information but has evolved into a powerful tool for analyzing consumer sentiment. By leveraging open API connections from social media platforms like X, Reddit, and Amazon, the platform captures real-time discussions about products. This allows brands to gain organic insights into consumer opinions, preferences, and trends, moving beyond traditional focus groups that often suffer from confirmation bias. The episode highlights the importance of understanding consumer sentiment in a more authentic way, enabling companies to make informed decisions about product development and marketing strategies.
One of the standout features of Regulate is its Consumer Sentiment Analysis module, which provides various reports to help brands understand consumer perceptions. Mark discusses how the platform generates word clouds, attribution reports, and centrality analyses to identify key themes and connections in consumer conversations. This data-driven approach not only aids in product innovation but also helps brands identify flavor trends and other market opportunities that resonate with consumers. By analyzing the interconnections of thoughts, companies can better align their products with consumer desires, ultimately leading to more successful market entries.
The conversation also touches on the compliance aspect of the platform, emphasizing how Regulate assists brands in navigating FDA regulations. Mark explains that the platform can parse vast amounts of regulatory data to ensure that products meet compliance standards, thus reducing the risk of costly errors. The episode concludes with a discussion on measuring success, where Mark shares that the primary deliverable for businesses using Regulate is the ability to accelerate time to market while improving profitability. By streamlining the product development process, the Helmsman Group is helping brands achieve significant financial outcomes, demonstrating the transformative power of AI in the CPG industry.
All our Sponsors: https://businessof.tech/sponsors/
Do you want the show on your podcast app or the written versions of the stories? Subscribe to the Business of Tech: https://www.businessof.tech/subscribe/
Looking for a link from the stories? The entire script of the show, with links to articles, are posted in each story on https://www.businessof.tech/
Support the show on Patreon: https://patreon.com/mspradio/
Want to be a guest on Business of Tech: Daily 10-Minute IT Services Insights? Send Dave Sobel a message on PodMatch, here: https://www.podmatch.com/hostdetailpreview/businessoftech
Want our stuff? Cool Merch? Wear “Why Do We Care?” - Visit https://mspradio.myspreadshop.com
Follow us on:
LinkedIn: https://www.linkedin.com/company/28908079/
YouTube: https://youtube.com/mspradio/
Facebook: https://www.facebook.com/mspradionews/
Instagram: https://www.instagram.com/mspradio/
TikTok: https://www.tiktok.com/@businessoftech
Bluesky: https://bsky.app/profile/businessof.tech
I get asked a lot: what are customers doing with AI? How do they measure it? I had a chance to talk to Mark Haas, the CEO of the Helmsman Group, and they focus on the consumer packaged goods space. Now, he comes at it as a consumer packaged goods expert, a subject matter expert, and we learn how they're using natural language processing to get better outcomes, all on this bonus episode of the Business of Tech. Well, Mark, thanks for joining me today. Yeah, thank you for having me, Dave. Now, I'm super interested to dive into this, but you came with this really interesting premise of injecting AI into food regulation and food safety, and I feel like I need to start with: tell my audience a little bit about the backstory of what you've created and how you got here, so that we can dive into some of that thought of how it works. Sure. I've been in the consumer packaged goods industry for 30 years, and we've had a development agency for the last 13. Four years ago, to create more efficiency in our agency, we set upon building a data warehouse. We looked at off-the-shelf solutions, and none really fit the bill in terms of the sophistication we were looking for. Then came a kismet moment. We were subletting one of the offices in our facility in Emeryville, which is where we started this business, and it just so happened that the tenant renting space from us came by a common space one day while I was mapping out the architecture of a database. He asked me, "Is that a database?" I said, "Yes," and the conversation grew. It turns out that he is an NLP expert, and we joined forces and built this platform that we now call Regulate. Gotcha. Tell me a little bit about what Regulate does. Initially, it was just to archive information. It was a static database to warehouse different supporting pieces of information for food and beverage.
Food and beverage is a highly regulated industry. We often have more than 100 pages of documentation per ingredient, and there are regulations, of course, governing how you produce different products. The initial project was simply a data warehouse to hold this information. When I learned more about the background of Ixon, my business partner and our CTO, I dove pretty deep. It pulled on some things from my college years: I was involved in the fractal project at the University of Washington in the '80s and early '90s, and I did code in C++ back then, so I had some solid knowledge. I dove pretty deep with him on database architecture and current programming languages in his area of expertise, but I didn't really understand natural language processing at the time. This was 2019, so I really immersed myself in the potential utility of NLP, and it occurred to me, as a domain expert, given the complexity of regulations, that how we would go about effectively vectorizing that data to interpret it for purpose-built outcomes was something nobody was really looking at. This was before OpenAI was really putting anything into the market, and so I felt there was something here we needed to explore and dive a little deeper on. The truth is, now, with where AI/ML is, specifically NLP and LLMs, the power of what we're able to accomplish is pretty remarkable. We have some amazing tools. I'm dying now to know, so give me a great use case of what you're doing with it.
The most recent module we built, we call consumer sentiment analysis. We go to open API connections on social media channels, X, Reddit, and Amazon, and we look at what people are talking about. We create a framework around a product, say a snack bar, and people will comment and discuss that snack bar and share opinions, facts, and whatever is on their mind. Reddit appears to be the best channel when we're dealing with open API connections, although if you're working with a brand that has a curated channel with better analytics, we can extract a lot more information from that. Parsing the data from commenters allows us to understand, in a more organic fashion, what large bodies of people truly think. When you invite audience members to be part of a focus group or something like that, they tend to bring some sort of confirmation bias. We're looking at real-life conversations between people and asking what the commonalities are. In food and beverage, we tend to focus on product and recipe a little too heavily at times. When we analyze conversations between people who are anonymous to us, we get greater insights: price points, merchandising locations, packaging information, font, typeset, script, things that relate to the other aspects of the packaging that are really important for marketing and consumer engagement. We analyze this in three different ways. Among the typical reports, we have a word cloud report that helps us understand volume, frequency, and what's being discussed. We've created our own GPT prompt to help us understand the context and the syntax of the information. We're not looking at the individual commenters; we're looking at the amalgamation, the output in a word cloud report. It's busy, there's a lot of information, and at times it seems incongruent.
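The word cloud report described here, counting volume and frequency across comments, could be sketched roughly as follows. This is a minimal, hypothetical illustration in standard-library Python with made-up comments, not the Regulate platform's actual code:

```python
from collections import Counter
import re

# Hypothetical sample comments; in practice these would come from an
# open API connection such as Reddit's (all data here is illustrative).
comments = [
    "Love the crunch of this snack bar, but the price is too high",
    "Great price at my local store, and the packaging stands out",
    "The packaging font is hard to read, though the flavor is great",
]

# A tiny illustrative stopword list; real pipelines use much larger ones.
STOPWORDS = {"the", "of", "is", "and", "at", "my", "but", "too", "this",
             "a", "to", "though"}

def word_frequencies(texts):
    """Tokenize comments and count non-stopword terms for a word-cloud report."""
    counts = Counter()
    for text in texts:
        for token in re.findall(r"[a-z']+", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return counts

freq = word_frequencies(comments)
print(freq.most_common(3))
```

In practice, the frequency table would feed a word-cloud renderer rather than a printout, and the context and syntax of the amalgamated output would be interpreted by the GPT prompt described above.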
When we analyze it through our GPT prompt, we get very clear insights as to what consumers are really thinking and what their drive is, and we have other filters. We can look at it through the lens of "I wish it was different," or what is positive or negative, so we can change the perspective with which we look at and output the reports. We have attribution reports, for quality of ingredients, or packaging, or "what do I want to see different?" Those are basic histograms and volume metrics. Then we have another report we call a centrality analysis, and it's pretty unique. We can analyze the data and the interconnection of thoughts. This is really, I think, the advantage of deploying NLP for this purpose. We can dive in and fully understand and analyze the connection points: what a central concept is, and why it's necessary in a particular product. If you're designing from scratch, you have an idea, we call that white space innovation, and you don't know where to start. We can run this consumer sentiment analysis and look for the central nodes, and from those, the supporting nodes that really make that concept shine in consumers' eyes, so that when we formulate, we can incorporate those elements or attributes into the final product. Gotcha. That includes things like flavor trends. You can actually get a sense of what people are interested in. How does that change, and how are you tracking it? Yes, we can do that with flavor trends, though we haven't really modeled it yet. We think we're going to build another module on the architecture of this consumer sentiment work. Consumer sentiment in the CPG world is a pretty big nut to crack. Nobody really knows what consumers are thinking until after you go to market and you see transactions: what do people actually buy? It's a risk mitigation strategy to use our platform to leap ahead, to project, to foreshadow what probable outcomes are likely to be.
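The centrality analysis described above, finding the central node in the interconnection of thoughts, can be approximated with a simple co-occurrence graph. A hypothetical sketch (the terms, data, and scoring are illustrative, not the platform's actual model):

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical keyword sets extracted from individual comments
# (in practice these would come from NLP over real conversations).
comment_terms = [
    {"chocolate", "protein"},
    {"protein", "texture"},
    {"protein", "price"},
    {"chocolate", "texture"},
]

def degree_centrality(term_sets):
    """Count how many distinct terms each term co-occurs with.
    The highest-degree term approximates the 'central node' of the conversation."""
    neighbors = defaultdict(set)
    for terms in term_sets:
        for a, b in combinations(sorted(terms), 2):
            neighbors[a].add(b)
            neighbors[b].add(a)
    return {term: len(links) for term, links in neighbors.items()}

centrality = degree_centrality(comment_terms)
central_node = max(centrality, key=centrality.get)
print(central_node, centrality[central_node])
```

In this toy data, "protein" emerges as the central concept, with "chocolate" and "texture" as supporting nodes. A production system would likely use weighted edges and richer centrality measures (betweenness, eigenvector) over far larger graphs.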
In our internal testing over the last year, we've proven through the use of this data model, our module, that never-before-conceived fusion concepts are now hitting the market and resonating with consumers, fulfilling that particular need state. They're getting what they're talking about and asking for, instead of what has been modeled through quantitative analysis, which is the way it's been done for the last 30 years. In CPG, we go to syndicated data, we do a market size opportunity, we do a basic mathematical movement analysis: what's moving, how frequently it's moving, what is the velocity of the product, what's the total dollar value of a particular category or product. Then we try to find white space between what's moving and where there's another future potential opportunity, and that's just somebody like me, as a subject matter expert, opining on what that could be, and then recruiting a focus group audience to validate that concept, which to me is nothing but confirmation bias. If you're experienced, maybe you'll hit a second-base outcome, but home runs are pretty infrequent. What we're striving for is a home run at every at-bat. I would think that you've got so much data at this point. What have you learned through this process about what works really well with managing this much data to be effective, and what doesn't? Well, as my peer engineers out there know, we're all on the data quest. We're all looking for more data: we learn something, and then we need more data, and we want to keep going. What we've learned is that there are so many business applications depending on how you analyze the data and what problems you're solving. Look at traditional business problems: what is my budget going to be next year? Could I increase my price? Am I likely to incur inflation? These are just basic business metrics, and we can incorporate that. There's plenty of data in the public domain.
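The basic business metrics mentioned here, projecting cost and pricing from public commodity data, can be sketched as a weighted market-basket calculation. All ingredients, cost shares, and price moves below are made-up illustrations:

```python
# Hypothetical recipe basket: ingredient -> (share of recipe cost,
# projected fractional price change of its underlying commodity).
basket = {
    "sunflower oil": (0.20, 0.10),   # assumed to track the seed-oil complex
    "oats":          (0.50, -0.02),
    "honey":         (0.30, 0.05),
}

def projected_cost_change(basket):
    """Weight each commodity's projected move by its share of recipe cost."""
    return sum(share * change for share, change in basket.values())

change = projected_cost_change(basket)
print(f"projected recipe cost change: {change:+.1%}")  # → +2.5%
```

A move this small might be absorbable; a larger one would prompt the key-account pricing conversation described below. A real model would pull the commodity projections from futures markets rather than hard-coding them.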
We have futures indices, ICE or whatever, all these different markets where we look at soybean futures, corn, it doesn't really matter. Every ingredient is either derived from a commodity or a derivative of a commodity. Take canola: canola has its own market now, but sunflower seed oil is a derivative of the soybean oil market, because it's a seed oil and an alternative to it. You can create metrics to analyze what sunflower pricing might be, and if that's in your recipe deck, we can build a market basket analysis and project what's likely to occur based on costs, which ultimately translate to price, so businesses can make informed decisions. Do I need to have a conversation with my key accounts about a price increase? Is it nominal enough that I can absorb some of this cost and compete strategically on price with my competitor without having a price conversation? The degree of insight we can glean and supply to business solutions is pretty powerful. We're learning something new every day. We've been asked, can you do sustainability outputs? Yes, it just depends on how we design the architecture of the database, what we can pull from the data, and ultimately what we can connect to. I like to connect to public domain sources; there's a lot of effectively free data out there that allows us to understand consumers, weather patterns, all the things that influence the purchase behavior of consumers. Gotcha, and there's a compliance element to this too, right? You can actually use all of this data to help on the FDA compliance side. Tell me a little bit about how that works.
So federal regulations are public domain data; everything's written in the Code of Federal Regulations, the CFR. The FDA's food regulations are in 21 CFR 100: what you can call a product, which we call standard of identity, what has to be put on a label, what the nutrition facts are, what claims you can or can't make. All of that is gigabytes of data. We can use our engines to parse that data and query that data set with specific questions. Is this recipe in compliance? Is this nutrition facts label that we generated in compliance? Are there alternatives? Is there a different way of naming this? What's right or wrong with a proposed label? Whatever that label may be, we have an optical character recognition engine, so we can look at anything, including our own outputs, and compare it to the data set. And regulations are ambiguous; they're not always super clear, by design, so NLP is another powerful tool that allows us to understand probabilistic outcomes based on that ambiguity. Gotcha, okay. So I guess the final piece I want to get a little insight into is how you are measuring success. Is it profit improvement alone? Is it production-specific? Give me a sense of the KPIs and success metrics you're looking for on these kinds of data efforts. Number one, I would say, is not the principal purpose, but it is a major differentiator: the compliance and the assurance that everything is going according to plan. The business case, the primary deliverable, the reason people would subscribe to the product, is go-to-market. We can reduce time to market and improve profitability. In some of the tests we've done, take a product, for example, with a $5 million launch target in year one. Food products typically take years to put together for strategics, and we've been able to accelerate that launch timing.
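The regulatory querying described above, asking specific compliance questions against a large body of CFR text, can be illustrated with a toy retrieval step. This stand-in scores paraphrased (not actual) regulation snippets by token overlap; the platform itself reportedly uses vectorized NLP rather than anything this simple:

```python
import re
from collections import Counter

# Hypothetical, paraphrased regulation snippets keyed by section number
# (illustrative only; not actual 21 CFR text).
regulations = {
    "101.9":  "nutrition facts labeling requirements for food products",
    "101.22": "spices flavorings colorings and chemical preservatives labeling",
    "130.10": "standard of identity requirements for modified foods",
}

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def best_match(query, corpus):
    """Return the section whose text shares the most tokens with the query,
    a toy stand-in for vector-similarity retrieval over regulation text."""
    q = Counter(tokenize(query))
    def overlap(text):
        return sum(min(q[t], c) for t, c in Counter(tokenize(text)).items())
    return max(corpus, key=lambda section: overlap(corpus[section]))

print(best_match("is this nutrition facts label in compliance", regulations))
```

A production pipeline would embed both query and regulation text as vectors and rank by semantic similarity, which is what lets it cope with the ambiguity of regulatory language.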
We reduced that time to market by 22 months, which translated to $13 million of revenue based on the business's compounded annual growth rate. And that's just one product; there are typically more than 20,000 new products brought to market every year. Well, right there is the exact definition of a business outcome, and that's what we talk about all the time on the show. Mark Haas is the CEO of the Helmsman Group, where he leverages a deep background in consumer packaged goods consulting to pioneer AI-driven solutions for the food and beverage industry. Under his leadership, the company integrates cutting-edge technology to streamline compliance, drive product innovation, and support brands in navigating FDA regulations. Mark, I've learned a ton today. Thanks for joining me. Thanks for having me, David. I really appreciate it. Are you ready to get your brand in front of the tech leaders shaping the future of managed services? Here at the Business of Tech, we offer flexible sponsorship opportunities to meet your needs, whether it's live show sponsorship, podcast advertising, event promotion, or custom webinars. From affordable exposure options to exclusive sponsorships, our offerings are designed to fit businesses and vendors of all sizes looking to make an impact. Packages start at just $500 per month, a fraction of typical event sponsorship costs. Be a part of the conversation that matters to IT service providers worldwide. Join us at MSP Radio to amplify your message where it counts. Visit mspradio.com/engage today to explore all the ways we can help you grow. The Business of Tech is written and produced by me, Dave Sobel, under ethics guidelines posted at businessof.tech. If you like the content, please make sure to hit that like button and follow or subscribe. It's free and easy, and it's the best way to support the show and help us grow.
You can also check out our Patreon, where you can join the Business of Tech community at patreon.com/mspradio, or buy our Why Do We Care merch at businessof.tech. Finally, if you're interested in advertising on this show, visit mspradio.com/engage. Once again, thanks for listening, and I will talk to you again on our next episode of The Business of Tech. Part of the MSP Radio Network.