CRITICAL: File Operations Without Context Managers #3

Closed
opened 2026-02-16 22:00:02 +02:00 by Koko210 · 1 comment
Owner

What the Problem Is

When you open a file with open() but do not use a with statement (context manager), the file is only closed when the object is garbage-collected. CPython usually closes it promptly via reference counting, but that is an implementation detail, not a guarantee. If an exception occurs before f.close() runs, the explicit close is skipped entirely and the handle leaks, leading to resource exhaustion.
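The difference is easy to demonstrate with a minimal sketch (a scratch temp file, not the bot's actual code):

```python
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as f:
    f.write("hello")

# Without "with": close() only runs if no exception intervenes
# between open() and the explicit close() call.
f2 = open(path)
data_manual = f2.read()  # an exception here would leak the handle
f2.close()

# With a context manager the file is closed on ANY exit path,
# including exceptions raised inside the block.
with open(path) as f3:
    data_ctx = f3.read()

os.remove(path)
```

After the `with` block exits, `f3.closed` is guaranteed to be True, even if the body had raised.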

Where It Occurs

  • bot/utils/llm.py#L24-L29 - GPU state file
  • bot/server_manager.py#L83 - Config file
  • bot/utils/scheduled.py#L64 - Bedtime tracking file
  • bot/utils/moods.py#L50 - Mood file
  • bot/utils/autonomous_persistence.py#L44-L45 - Context file
  • bot/utils/bipolar_mode.py#L81 - State file
  • bot/utils/evil_mode.py#L28-L32 - State file

Why This Is a Problem

  1. Resource Leak: Each open file consumes a file descriptor (typically limited to 1024 per process)
  2. Data Corruption: If file is not closed properly, write buffers might not flush
  3. OS Limit: After ~1000 leaked files, OSError: [Errno 24] Too many open files
  4. Silent Failure: File operations can appear to work while actually failing to persist data
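The per-process limit mentioned in point 1 can be inspected with the standard-library resource module (Unix only; the exact numbers vary by system):

```python
import resource

# Soft limit is what the process actually hits; hard limit is the
# ceiling that a privileged user could raise the soft limit to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"file descriptor limit: soft={soft}, hard={hard}")
```

On many Linux distributions the soft limit defaults to 1024, which is where the "~1000 leaked files" figure comes from.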

What Can Go Wrong

Scenario 1: Bot Runs Out of File Descriptors

  1. Bot processes 1000 messages with GPU switches
  2. Each message opens gpu_state.json but never closes it
  3. Operating system reaches file descriptor limit
  4. Next attempt to open any file fails with OSError: [Errno 24] Too many open files
  5. Bot cannot read config, write logs, or load models
  6. Bot crashes or becomes unresponsive
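Scenario 1 can be reproduced in isolation. The sketch below (Linux/Unix only, and deliberately lowering the soft limit so the failure arrives quickly) leaks handles in a loop until the OS refuses to open another file:

```python
import errno
import os
import resource
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
# Lower the soft limit so we do not need ~1000 iterations to hit it.
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

leaked = []
hit = None
try:
    while True:
        leaked.append(open(path))  # handles kept alive, never closed
except OSError as e:
    hit = e.errno  # EMFILE: [Errno 24] Too many open files
finally:
    for f in leaked:
        f.close()
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))
    os.remove(path)

print(f"hit errno {hit} after {len(leaked)} leaked handles")
```

Once the limit is reached, *every* subsequent open() in the process fails the same way, which is why the bot can no longer read config or write logs.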

Scenario 2: GPU State Corruption

  1. User switches from NVIDIA to AMD GPU via web UI
  2. Code opens gpu_state.json, writes {"current_gpu": "amd"}
  3. Before file is closed, an exception occurs (e.g., disk full)
  4. File handle is never closed, buffer might not flush
  5. File is left in corrupted or incomplete state
  6. Bot tries to read it, gets malformed JSON
  7. Falls back to NVIDIA GPU (wrong user preference)
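A context manager alone does not fully prevent scenario 2: a crash mid-write can still leave a truncated JSON file. A common mitigation is the write-then-rename pattern. The helper below is a hypothetical sketch (write_json_atomic is not in the bot's codebase), relying on os.replace being atomic on the same filesystem:

```python
import json
import os
import tempfile

def write_json_atomic(path, data):
    """Write JSON so readers never observe a half-written file."""
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())  # push OS buffers to disk
        os.replace(tmp, path)  # atomic: readers see old or new, never partial
    except BaseException:
        os.unlink(tmp)  # clean up the temp file on any failure
        raise

write_json_atomic("gpu_state.json", {"current_gpu": "amd"})
with open("gpu_state.json") as f:
    state_read = json.load(f)
os.remove("gpu_state.json")
```

If the process dies before os.replace, the old gpu_state.json is untouched and only a stray .tmp file is left behind.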

Proposed Fix

Replace all open() calls with with open() context managers:

# bot/utils/llm.py - CORRECT VERSION
# (os and json are imported at module level; logger and globals come from the module)
gpu_state_file = os.path.join(os.path.dirname(__file__), "..", "memory", "gpu_state.json")
try:
    with open(gpu_state_file, "r") as f:  # ← "with" ensures the file is closed
        state = json.load(f)
        current_gpu = state.get("current_gpu", "nvidia")
        if current_gpu == "amd":
            return globals.LLAMA_AMD_URL
        else:
            return globals.LLAMA_URL
except FileNotFoundError:
    logger.warning("GPU state file not found, defaulting to NVIDIA")
    return globals.LLAMA_URL
except json.JSONDecodeError as e:
    logger.error(f"GPU state file corrupted: {e}, defaulting to NVIDIA")
    return globals.LLAMA_URL
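For illustration, the same pattern condensed into a standalone pair of helpers (save_gpu_state and load_gpu_state are hypothetical names, not the bot's actual functions), exercising both error branches of the proposed fix:

```python
import json
import os

def save_gpu_state(path, gpu):
    # "with" flushes and closes even if json.dump raises partway through
    with open(path, "w") as f:
        json.dump({"current_gpu": gpu}, f)

def load_gpu_state(path, default="nvidia"):
    try:
        with open(path) as f:
            return json.load(f).get("current_gpu", default)
    except (FileNotFoundError, json.JSONDecodeError):
        return default  # missing or corrupted file falls back safely

save_gpu_state("gpu_state.json", "amd")
current = load_gpu_state("gpu_state.json")
missing = load_gpu_state("no_such_file.json")
os.remove("gpu_state.json")
```

Here `current` is "amd" and `missing` falls back to "nvidia", matching the fallback behavior in the fix above.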

Severity

CRITICAL - Resource leaks can crash the bot after extended runtime; data corruption causes state loss.

Files Affected

7 files including llm.py, server_manager.py, scheduled.py, moods.py, autonomous_persistence.py, bipolar_mode.py, evil_mode.py

Koko210 reopened this issue 2026-02-16 22:17:02 +02:00
Author
Owner

Closing as Invalid

I've thoroughly reviewed the codebase and verified every file mentioned in this issue. All 7 files cited already use with statement context managers for file operations:

  • bot/utils/llm.py#L24-29: Uses with open(gpu_state_file, "r") as f:
  • bot/server_manager.py#L83: Uses with open(self.config_file, "r", encoding="utf-8") as f:
  • bot/utils/scheduled.py#L64: Uses with open(BEDTIME_TRACKING_FILE, "r") as f:
  • bot/utils/moods.py#L50: Uses with open(path, "r", encoding="utf-8") as f:
  • bot/utils/autonomous_persistence.py#L52: Uses with open(CONTEXT_FILE, 'w') as f:
  • bot/utils/bipolar_mode.py#L81: Uses with open(BIPOLAR_STATE_FILE, "r", encoding="utf-8") as f:
  • bot/utils/evil_mode.py#L33,L44: Uses with open(...)

The codebase actually follows best practices here. This issue appears to be based on stale or incorrect information.


Reference: Koko210/miku-discord#3