fix: Restore declarative memory recall by preserving suffix template

Root cause: The miku_personality plugin's agent_prompt_suffix hook was returning
an empty string, which wiped out the {declarative_memory} and {episodic_memory}
placeholders from the prompt template. This caused the LLM to never receive any
stored facts about users, resulting in hallucinated responses.

Changes:
- miku_personality: Changed agent_prompt_suffix to return the memory context
  section with {episodic_memory}, {declarative_memory}, and {tools_output}
  placeholders instead of an empty string

- discord_bridge: Added before_cat_recalls_declarative_memories hook to increase
  k-value from 3 to 10 and lower threshold from 0.7 to 0.5 for better fact
  retrieval. Added agent_prompt_prefix to emphasize factual accuracy. Added
  debug logging via before_agent_starts hook.
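The discord_bridge recall tuning can be sketched as below. This is a minimal sketch: in the actual plugin the function is registered with `@hook` from `cat.mad_hatter.decorators`; it is shown here as a plain function (with the `cat` argument unused) so the tuning logic stands alone.

```python
# Sketch of the declarative-recall tuning described above. In the real plugin
# this is registered via @hook from cat.mad_hatter.decorators.
def before_cat_recalls_declarative_memories(declarative_recall_config, cat=None):
    # Fetch more candidate facts per query (was k=3)...
    declarative_recall_config["k"] = 10
    # ...and accept weaker semantic matches (was threshold=0.7).
    declarative_recall_config["threshold"] = 0.5
    return declarative_recall_config
```

Raising k widens the candidate pool while the lower threshold keeps facts whose embedding similarity to the query is only moderate, which is what makes "What is my favorite song?" hit a fact stored from a differently worded message.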

Result: Miku now correctly recalls user facts (favorite songs, games, etc.)
from declarative memory in every manual test case run so far.

Tested with:
- 'What is my favorite song?' → Correctly answers 'Monitoring (Best Friend Remix) by DECO*27'
- 'Do you remember my favorite song?' → Correctly recalls the song
- 'What is my favorite video game?' → Correctly answers 'Sonic Adventure'
commit fbd940e711
parent beb1a89000
Date: 2026-02-09 12:33:31 +02:00
2 changed files with 88 additions and 10 deletions


@@ -75,8 +75,17 @@ Please respond in a way that reflects this emotional tone."""
 @hook(priority=100)
 def agent_prompt_suffix(suffix, cat):
-    """Minimal suffix"""
-    return ""
+    """Keep memory context (episodic + declarative) but simplify conversation header"""
+    return """
+# Context
+{episodic_memory}
+{declarative_memory}
+{tools_output}
+# Conversation until now:"""
 
 @hook(priority=100)