fix: raise size limit and make MAX_PART_SIZE configurable

- Raise WORKER_MAX_ZIP_SIZE_MB from 4GB to 200GB (production .env)
- Make MAX_PART_SIZE configurable via MAX_PART_SIZE_MB env var
  (default 1950 MiB, set to 3900 for Premium accounts)
- Remove hardcoded 1950 MiB constants in split.ts and worker.ts
- Add grouping system audit report with real-world failure cases

10 archives (up to 70.5GB each) were blocked by the old 4GB limit.
They will be retried on the next ingestion cycle.
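The diff imports `config` from `../util/config.js` but that file is not part of this commit view. A minimal sketch of what the `maxPartSizeMB` lookup might look like — the function name and fallback behavior here are assumptions, not the repo's actual code:

```typescript
// Hypothetical helper: derive the part-size cap (in MiB) from the
// MAX_PART_SIZE_MB environment variable, falling back to the 1950 MiB
// non-Premium default on missing or invalid input.
export function parseMaxPartSizeMB(raw: string | undefined): number {
  const parsed = raw === undefined ? NaN : Number(raw);
  return Number.isInteger(parsed) && parsed > 0 ? parsed : 1950;
}

// Assumed wiring in config.ts:
// export const config = {
//   maxPartSizeMB: parseMaxPartSizeMB(process.env.MAX_PART_SIZE_MB),
// };
```

With this shape, `MAX_PART_SIZE_MB=3900` selects the Premium limit, while an unset or garbage value silently keeps the safe default.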

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 12:41:37 +02:00
parent 718007446f
commit 194c87a256
4 changed files with 483 additions and 4 deletions


@@ -3,15 +3,19 @@ import { stat } from "fs/promises";
 import path from "path";
 import { pipeline } from "stream/promises";
 import { childLogger } from "../util/logger.js";
+import { config } from "../util/config.js";
 const log = childLogger("split");
 /**
- * 1950 MiB — safely under Telegram's 2GB upload limit.
- * At exactly 2GiB, TDLib's internal 512KB chunking can exceed Telegram's
+ * Maximum part size for Telegram upload. Configurable via MAX_PART_SIZE_MB env var.
+ * Default: 1950 MiB (safely under 2GB non-Premium limit).
+ * Premium: set to 3900 MiB (safely under 4GB Premium limit).
+ *
+ * At exactly 2/4 GiB, TDLib's internal 512KB chunking can exceed Telegram's
  * 4000-part threshold, causing FILE_PARTS_INVALID errors.
  */
-const MAX_PART_SIZE = 1950n * 1024n * 1024n;
+const MAX_PART_SIZE = BigInt(config.maxPartSizeMB) * 1024n * 1024n;
 /**
  * Split a file into ≤2GB parts using byte-level splitting.
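The comment's 4000-part rationale can be checked with quick arithmetic, assuming 512 KiB upload chunks as the comment states (the exact Premium part budget is not given in this diff; doubling to 8000 for the 4GB tier is an assumption):

```typescript
// Ceiling division: number of 512 KiB upload parts for a given byte size.
const CHUNK = 512n * 1024n;
const MiB = 1024n * 1024n;

function partCount(bytes: bigint): bigint {
  return (bytes + CHUNK - 1n) / CHUNK;
}

partCount(1950n * MiB); // 3900 parts — safely under the 4000-part threshold
partCount(2048n * MiB); // 4096 parts — over it, hence FILE_PARTS_INVALID at exactly 2 GiB
partCount(3900n * MiB); // 7800 parts — under an assumed doubled (8000) Premium budget
```

This is why the defaults land at 1950/3900 MiB rather than the round 2048/4096 MiB: the headroom keeps the chunk count below the threshold.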