The interview subject is the Head of Product at a Series C B2B SaaS company that ships an AI-augmented platform to mid-market revenue teams. The company shipped its first AI features in 2024 — primarily chat surfaces, in line with the dominant pattern at the time. By Q4 2025 the chat surfaces were producing measurable engagement decay, and the product team made a decision the rest of the company called bold and the Head of Product called overdue: remove every chat surface and replace each one with embedded action panels inside the workflow surfaces. The retention numbers are now in. We sat down with her for thirty-four minutes. She asked to remain unnamed for procurement reasons.
On the moment she decided.
Knyte: What was the meeting.
Head of Product: It was a retention review where the head of analytics presented the chat-surface engagement decay curves. We had four chat surfaces in the product — one for the deal record, one for the contact record, one in the dashboard, and one in the inbox. All four had identical decay shapes. Healthy engagement in week one, halving by week four, asymptotic to about a quarter of week one by week eight. We had been treating each one as a separate optimization problem. The four identical curves told us we had a category problem, not a feature problem.
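The shape she describes — halving by week four, flattening near a quarter of week one — is consistent with exponential decay toward a floor. A minimal sketch of that curve, with every parameter back-solved from the numbers in the interview rather than taken from the company's actual data:

```python
import math

# Decay-to-floor model: E(t) = floor + (1 - floor) * exp(-k * (t - 1))
# Parameters inferred from the interview's description, not real data:
#   E(1) = 1.0  (week-one engagement, normalized)
#   E(4) = 0.5  (halved by week four)  =>  k = ln(3) / 3
#   floor = 0.25 (asymptote near a quarter of week one)
FLOOR = 0.25
K = math.log(3) / 3

def engagement(week: int) -> float:
    """Normalized engagement at a given week under the assumed model."""
    return FLOOR + (1 - FLOOR) * math.exp(-K * (week - 1))

if __name__ == "__main__":
    for week in (1, 4, 8):
        print(f"week {week}: {engagement(week):.2f}")
    # -> week 1: 1.00, week 4: 0.50, week 8: 0.31
```

Under this model the week-eight value sits at roughly 0.31 and keeps sliding toward the 0.25 floor, matching the "asymptotic to about a quarter" reading; the point of the identical shape across all four surfaces is that one set of parameters fits them all.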
Knyte: And you decided to remove all four.
Head of Product: I did not decide to remove all four in that meeting. I decided to test removing one. We picked the dashboard chat surface — the one with the lowest engagement of the four — and replaced it with three embedded action panels in the dashboard itself. The action panels surfaced the same model capabilities the chat had, but as specific actions tied to the dashboard's existing widgets, with outputs that landed in the dashboard's existing artifacts. The engagement curve on the embedded panels did not decay. It grew through week eight and was still growing in week sixteen when we last measured. That was the data that justified removing the other three.
On what the team's reaction was.
Knyte: How did the engineering team react.
Head of Product: Mixed. The engineering team had built the chat surfaces. They were proud of them. The chat surfaces were also genuinely interesting engineering work — streaming, tool use, multi-turn handling. Telling the team that their good work was the wrong work for the customer was uncomfortable. I tried to make the conversation as honest as possible. The chat surfaces were good. The customer needed something else. The engineering work that produced the chat surfaces would now be applied to a different surface. None of the team's effort was wasted; it was being redirected.
Knyte: Did the redirection actually use the prior engineering work.
Head of Product: A surprising amount of it. The streaming infrastructure, the tool-use infrastructure, the audit-trail infrastructure — all of it transferred to the embedded panels. The chat-specific code — turn management, conversation history, suggested-prompt generation — was deprecated. The deprecation was about a third of the chat-surface codebase. The other two-thirds was the underlying AI infrastructure, which is the same regardless of whether the surface is a chat or an action panel.
On the customer reaction.
Knyte: What did customers say.
Head of Product: Almost nothing in the first week. We had braced for complaints — "where did the chat go" — and they did not come. About fifteen support tickets across our entire customer base in the first month, mostly from a small cohort of power users who had built workflows around specific chat affordances. We worked with them to find equivalent paths in the new surfaces. By month two the support tickets stopped.
Head of Product: What we did get, around month two, was a lot of qualitative feedback that the product felt faster. Customers were not noticing that the chat had been removed. They were noticing that they were getting more done. The cognitive cost of the chat — composing prompts, managing context, deciding what to ask — had been a constant tax that customers had absorbed without complaining. Removing the tax was visible in the qualitative feedback even though the customers did not articulate the chat itself as the cause.
On the metrics.
Knyte: What does the QBR slide say.
Head of Product: Three numbers. AI feature engagement, by which I mean any interaction with an AI-generated output, was up roughly four times against the chat-surface baseline at the same week-eight measurement point. AI feature retention, by which I mean week-twelve return engagement, was up about three times. And NPS for the AI features specifically — a question we ask in the in-product survey — moved from neutral to materially positive.
Head of Product: The number that surprised the board most was AI-attributable revenue retention. We had been tracking it because we wanted to know whether the AI features were contributing to the platform's overall stickiness. With the chat surfaces, the contribution had been slightly positive but inside noise. With the embedded panels, the contribution is clearly positive — about a four-point lift on net revenue retention for accounts that use the AI features regularly. That is the number that justifies the next round of investment.
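Net revenue retention of the kind she cites is conventionally computed per cohort: ending recurring revenue from a fixed starting set of accounts (including expansion, contraction, and churn) divided by that cohort's starting revenue. A hedged sketch of the calculation — the account figures and the roughly four-point gap between cohorts are invented for illustration, not the company's numbers:

```python
# Hypothetical per-account monthly recurring revenue, in dollars:
# (start_mrr, end_mrr, uses_ai_regularly) -- illustrative values only.
accounts = [
    (1000, 1150, True),   # AI-regular account, expanded
    (1000, 1000, True),   # AI-regular account, flat
    (1000, 950,  True),   # AI-regular account, contracted
    (1000, 1050, False),  # other account, expanded
    (1000, 900,  False),  # other account, contracted
    (1000, 1030, False),  # other account, expanded
]

def nrr(rows):
    """Net revenue retention: ending MRR of the starting cohort / starting MRR."""
    start = sum(r[0] for r in rows)
    end = sum(r[1] for r in rows)
    return end / start

ai_cohort = [r for r in accounts if r[2]]
other     = [r for r in accounts if not r[2]]
print(f"AI-regular NRR: {nrr(ai_cohort):.1%}")  # 103.3% with these made-up rows
print(f"Other NRR:      {nrr(other):.1%}")      # 99.3%  -> ~4-point lift
```

The design point is that NRR is a cohort ratio, not an average of per-account ratios, so a single large expansion or churn moves the number; splitting the cohort by regular AI usage is what isolates the lift she attributes to the embedded panels.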
On what she would not do again.
Knyte: What would you do differently.
Head of Product: I would have shipped one chat surface as a test, not four. The decision to ship four was driven by the assumption that chat would work because the demos were impressive. We did not have evidence at the four-surface ship date. We had hopes. If we had shipped one, we would have learned the decay curve in eight weeks, and the cost of changing direction would have been a quarter of what it ended up being.
Head of Product: I would also have started with the embedded-panel pattern from the beginning if we had been thinking carefully. The pattern is not new. We covered the chat sidebar's structural problems in our internal product design discussions in early 2024 and did not act on the insight. The lag between recognizing the pattern and acting on it cost us a year of customer engagement.
On what is next.
Knyte: What is the year ahead.
Head of Product: Depth on the embedded panels. We are not adding new AI surface area for the next two quarters. We are concentrating the engineering attention on the panels we have, watching the engagement curves, and adding panel-level features the customer feedback is asking for. The discipline is similar to the three-pillars deployment shape you have written about. Concentrate. Deepen. Avoid the urge to expand surface area before the existing surface has fully matured.
Head of Product: And we are exploring, carefully, whether any new surface should be a panel from the start or whether some surfaces genuinely warrant chat. The honest answer is that we have not found a customer workflow yet where chat is materially better than embedded panels. We are looking. If we find one we will ship a chat surface for it. We will not ship four.