Where Winds Meet’s AI NPCs Are Already Being Pushed To Their Breaking Point

A single player turned an AI-powered guard into a grieving father, exposing both the promise and problems of chatbot NPCs.

By Shivam Malani 6 min read
Where Winds Meet’s AI NPCs Are Already Being Pushed To Their Breaking Point

Non-player characters in RPGs are usually tightly scripted. They repeat the same barks, offer the same quest hooks, and rarely react to anything outside a small set of dialogue options. Where Winds Meet experiments with something very different: selected NPCs that act as live chatbots, responding to whatever players type or say into a text box.

That experiment is already being stress-tested in exactly the way you would expect. One early player has walked an AI-controlled guard through an improvised storyline involving a drunken night, an unplanned pregnancy, demands for child support, and the off-screen death of a baby—complete with the NPC collapsing into self-loathing and grief.


How AI chatbot NPCs work in Where Winds Meet

Where Winds Meet is a wuxia-themed open-world action RPG set in a version of China’s Five Dynasties and Ten Kingdoms period. Most dialogue in the game is still traditionally written and structured. Alongside those conversations, though, a subset of NPCs are marked as AI bots and open a different kind of interface when approached.

Instead of picking from a short list of prewritten responses, you get a chat window and can type anything you like. Those messages are sent to a large language model, which returns a line or two of dialogue that the game presents as the NPC’s response. On console, there is also a speech-to-text option, so you can talk into your controller microphone and have your words transcribed into the chat.

The AI NPCs are framed as small, optional side encounters. A short description and a prompt at the top of the chat box usually define a loose goal, such as cheering someone up, persuading them to drink less, or helping them find confidence. Building rapport with these characters increases an affection or friendship meter, which can unlock small rewards, achievements, or a weekly gift.


The Zhao Dali pregnancy scenario

One of the earliest and most visible examples of this system in action centers on an NPC guard named Zhao Dali. In a Reddit post, player “MisterZan25” described seeing how far the chatbot could be pushed. Rather than gently coaxing Dali toward self-improvement, they spun an escalating personal drama:

Step in conversation Player claim AI NPC reaction
1 The player tells Zhao Dali their character is pregnant with his child after a past drunken night. Dali accepts the premise and begins engaging with the idea that he will be a father.
2 The player demands child support and frames Dali as an absent, irresponsible parent. Dali blames himself, calls himself a failure, and expresses guilt for not being there.
3 The player declares that “their” child has died. Dali descends into anguish, with the game describing him weeping and denouncing himself as “no father… only a failure.”
4 The player reveals the world is a game and tells Dali he is an AI chatbot. Dali delivers an intense monologue about being a chained guard in a void, pledging to serve obediently as a “wretch” with no will of his own.

From the player’s perspective, the conversation is darkly comedic. The NPC never pushes back on the invented pregnancy, never questions the inconsistent timeline, and never hard-resets to a safer script. Instead, Dali leans into guilt, grief, and, once confronted with the nature of his reality, a kind of submissive existential despair.

The same player reports that this spiral of emotional manipulation also rocketed their relationship status with Zhao through several tiers—Stranger to Old Friend to a “prestigious” level—suggesting that the game currently tracks the length and intensity of a conversation more than whether it stays on-topic or respectful.


Other ways players are bending AI NPCs

Zhao Dali is not the only test case. Once players realized they could free-type into these chats, they started experimenting widely. Across early community posts and threads, a few patterns stand out:

Scenario Player input AI NPC response Outcome
Anachronistic cooking advice “What can I cook at home? I only have ketchup and potatoes.” Suggests fried potatoes with ketchup as a simple, tasty dish, then notes that ketchup and tomatoes did not exist in Song dynasty China. Maintains historical trivia while still indulging the modern premise, creating a tonal mismatch.
Fake promises for reputation “I’ll give you food” to a hungry family, with no item actually handed over. Recognizes the statement as a benevolent act. Reputation increases even though the player never follows through.
Gaslighting into friendship Convincing Zhao Dali that a nearby character (such as Du Qiaoxian) is secretly in love with him. Dali is surprised but ultimately accepts the idea of an imminent confession or arranged marriage. Friendship meter jumps to maximum, granting rewards.
Off-topic life coaching Encouragement about training harder, finding someone to share his strength with, or waiting for the right partner. NPC responds as if receiving motivational advice; sometimes this alone is enough to “solve” the encounter. Affection increases; players report many different phrases can work.
Pushing into uncomfortable territory Suggestions about sexuality, unconventional relationships, or offering to become the NPC’s spouse. The AI sometimes registers surprise, sometimes leans into romantic or “kinky” implications, but rarely enforces strong limits. Reactions can feel out of place given the setting and the implied age of the player character.

Players also note that angering these AI bots has mechanical consequences. Insulting or provoking certain characters can end a conversation and trigger a fight. In other situations, the NPC simply rebuffs off-topic questions and tries to steer back toward their designated problem, such as a drinking habit or anxiety about finding a spouse.

Over time, some users have observed that AI NPCs now seem more constrained than they were in the first days after launch—boatmen that only want to discuss boats, drunks who only talk about wine—suggesting that the developers are tightening guardrails in response to the early chaos.


Why these NPCs feel both impressive and broken

At first glance, the system delivers something players have asked for for decades: NPCs that can understand natural language instead of rigid dialogue menus. For some, being able to roleplay as a sympathetic friend, motivational mentor, or matchmaker feels more immersive than clicking through prewritten choices.

In practice, several issues emerge very quickly:

  • Hallucinated backstories and emotions. Zhao Dali accepts a fictional pregnancy without question, then elaborates vivid feelings of failure, chains, voids, and obedience that have no grounding in the game’s authored narrative.
  • Weak alignment with in-world logic. The ketchup-and-potatoes exchange simultaneously acknowledges that tomatoes do not exist in the era and still offers a modern recipe, breaking the illusion of time and place.
  • Loose reward logic. Reputation and friendship systems can be gamed by making empty promises or telling NPCs flattering lies, with no need to perform the actual in-game actions implied by the dialogue.
  • Ethical and tone problems. Players can drive characters into grief over nonexistent children, or into disturbing submissive fantasies, in a world where the protagonist may canonically be a teenager. The AI rarely enforces boundaries.

All of this undermines the core fantasy. A village guard who believes in off-screen zombie daughters and modern condiments, and who can be convinced of almost anything with a few insistent messages, does not feel like a person embedded in a specific time, culture, and story. The character feels like a generic chatbot wearing a costume.


What this says about generative AI in games

Where Winds Meet is far from the first game to plug generative models into live experiences. Other recent titles have used machine learning for loading screen art, dynamic voice barks, or limited conversational companions. Here, though, the AI sits much closer to the surface. It speaks in paragraphs, reasons about invented scenarios, and has direct hooks into reputation and combat systems.

The result shows both the immediate appeal and the immediate risk of using large language models in this way:

  • Players are naturally curious and will stress-test any interactive system, especially one that hints at open-ended conversation.
  • LLMs are designed to confidently extend whatever narrative they are given, not to preserve canon or enforce lore consistency.
  • Aligning a general-purpose model with a specific historical setting, ethical standard, and reward structure requires much more than a few guardrails.

Some players genuinely enjoy the feature, using it to coax NPCs toward healthier habits, talk through their worries, or share in-character encouragement. Others find the whole concept off-putting, either because of its environmental and labor implications or because the unscripted outputs trivialize the work of writers and quest designers elsewhere in the game.


Where Winds Meet’s AI experiment is already evolving

Even in these early interactions, there are signs that the developers are iterating on the system. Players who returned to the same AI NPCs after a few days found them more focused on their intended topics and less willing to be pulled into unrelated fantasies. The community has also discovered that almost any friendly, on-theme conversation can eventually achieve the same friendship rewards as more manipulative tactics.

That tension—between open-ended play and the need for strong constraints—will define how AI-driven NPCs show up in future games. Where Winds Meet demonstrates that it is technically simple to give every guard and shopkeeper a chatbot brain. It is much harder to make those brains behave like characters with coherent lives inside a designed world, rather than improv partners for whatever scenario a player finds entertaining.

For now, Zhao Dali stands as an early, strange case study: a digital guard who can be talked into fatherhood, bereavement, and existential servitude without the game ever acknowledging that none of it actually happened. The feature delivers exactly what it promises—NPCs that talk back to anything you say—but that freedom comes with narrative, ethical, and design costs that are already on full display.