mirror of
https://github.com/xCyanGrizzly/DragonsStash.git
synced 2026-05-10 22:01:16 +00:00
Update tg issues
5  .claude/settings.json  Normal file
@@ -0,0 +1,5 @@
{
  "enabledPlugins": {
    "superpowers@superpowers-marketplace": true
  }
}
@@ -83,7 +83,10 @@
     "Bash(git -C /mnt/c/Users/A00963355/OneDrive - Amaris Zorggroep/Documents/VScodeProjects/DragonsStash log --oneline -10)",
     "Bash(git -C \"C:/Users/A00963355/OneDrive - Amaris Zorggroep/Documents/VScodeProjects/DragonsStash\" status --short)",
     "Bash(timeout:*)",
-    "mcp__Claude_Preview__preview_start"
+    "mcp__Claude_Preview__preview_start",
+    "Bash(cat:*)",
+    "Bash(grep:*)",
+    "Bash(wait:*)"
   ]
 }
}
102  CLAUDE.md  Normal file
@@ -0,0 +1,102 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

Dragon's Stash is a self-hosted inventory management system for 3D-printing filament, SLA resin, miniature paints, and supplies. It includes an integrated Telegram archive worker that scans channels for ZIP/RAR archives and indexes their contents, plus a bot that lets users search for and receive packages via Telegram.

## Tech Stack

- **App**: Next.js 16 (App Router), TypeScript 5.9 (strict), Tailwind CSS 4, shadcn/ui
- **Database**: PostgreSQL 16+ via Prisma v7.4 with `@prisma/adapter-pg`
- **Auth**: Auth.js v5 (NextAuth) with credentials + optional GitHub OAuth
- **Worker**: TypeScript + TDLib (via `tdl`) for Telegram channel scanning
- **Bot**: TypeScript + TDLib for Telegram bot interface
- **Forms**: React Hook Form + Zod v4

## Commands

### App (root package.json)
```bash
npm run dev    # Next.js dev server with hot reload
npm run build  # Production build (standalone output)
npm run start  # Production server
npm run lint   # ESLint (next/core-web-vitals + TypeScript)
```

### Database
```bash
npm run db:generate  # Generate Prisma client
npm run db:migrate   # Run migrations (dev mode)
npm run db:push      # Push schema without migrations
npm run db:seed      # Seed database with test data
npm run db:studio    # Prisma Studio UI
npx prisma migrate dev --name <description>  # Create new migration
```

### Worker & Bot (each in its own directory)
```bash
cd worker && npm run dev    # Dev mode with tsx watch
cd worker && npm run build  # TypeScript compile to dist/
cd bot && npm run dev       # Dev mode with tsx watch
cd bot && npm run build     # TypeScript compile to dist/
```

### Dev Environment Setup
```bash
docker compose -f docker-compose.dev.yml up -d  # Start PostgreSQL + worker
npm run dev                                     # Run app locally
```

## Architecture

### Three-Service Design
The project is split into three independent services sharing one PostgreSQL database:
1. **App** (root `src/`): Next.js web UI for inventory management and Telegram admin
2. **Worker** (`worker/`): Scans Telegram source channels, processes archives, uploads to destination channel
3. **Bot** (`bot/`): Telegram bot for user search, package delivery, keyword subscriptions

Services communicate asynchronously via `pg_notify` (e.g., on-demand channel fetches, bot send requests).

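A dependency-free sketch of the notify/dispatch pattern this describes. On the Postgres side a sender runs `SELECT pg_notify('channel', 'payload')`; the listening service routes the payload to a handler. The registry names here are illustrative, not the project's actual identifiers:

```typescript
// Sketch of a channel-to-handler registry that a pg "notification" event
// could feed. Channel names other than 'channel_sync' (which appears
// elsewhere in this commit) are hypothetical.
type Handler = (payload: string) => void;

const handlers = new Map<string, Handler>();

function onChannel(channel: string, handler: Handler): void {
  handlers.set(channel, handler);
}

// A pg client's "notification" event would call this with msg.channel/msg.payload
function dispatch(channel: string, payload: string): boolean {
  const handler = handlers.get(channel);
  if (handler === undefined) return false;
  handler(payload);
  return true;
}
```

Because `pg_notify` delivers only a string payload, services that need structured data typically JSON-encode it before notifying.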
### App Source Layout (`src/`)
- `app/(auth)/` — Login/register pages (public)
- `app/(app)/` — Protected routes behind auth middleware (dashboard, filaments, resins, paints, supplies, vendors, locations, settings, stls, telegram, usage)
- `app/api/` — API routes (NextAuth, health check, bot endpoints)
- `data/` — Server-side Prisma query functions (`*.queries.ts`), one file per domain model
- `schemas/` — Zod validation schemas, one file per domain model
- `components/ui/` — shadcn/ui primitives
- `components/shared/` — Reusable business components (data-table, status-badge, color-swatch, stat-card, page-header)
- `components/layout/` — Sidebar and header
- `lib/` — Auth config, Prisma singleton, constants, utilities, Telegram query helpers
- `hooks/` — Custom React hooks (use-modal, use-debounce, use-current-user)
- `types/` — Shared TypeScript types

### Key Patterns
- **Server Components by default** — pages are async server components that fetch data directly. Only interactive components use `"use client"`.
- **Server Actions for mutations** — each page directory has an `actions.ts` file with create/update/delete actions.
- **Data queries centralized** — all Prisma reads go through `src/data/*.queries.ts`, not inline in components.
- **Modal-based CRUD** — add/edit forms use dialog modals, not separate pages.
- **TanStack Table** with server-side pagination for all inventory tables.
- **All Prisma PKs use `cuid()`** string IDs.

### Worker Pipeline
1. Authenticate Telegram account via TDLib (SMS code flow, managed via admin UI)
2. Scan source channels for messages since `lastProcessedMessageId`
3. Detect archives (ZIP/RAR), group multipart sets, extract file listings
4. Hash for dedup, match preview images, extract creator from filename
5. Split files >2GB, upload to destination channel, track progress

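Step 3's multipart grouping can be sketched as a base-name bucketing pass. The worker's real naming rules are not shown in this commit, so the `.partN.rar` convention below is an assumption:

```typescript
// Hypothetical multipart grouping: "model.part1.rar" and "model.part2.rar"
// share the base "model"; any other file is its own group. The naming
// pattern is an assumption, not the worker's actual rule.
function groupMultipart(fileNames: string[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const name of fileNames) {
    const m = name.match(/^(.*)\.part\d+\.rar$/i);
    const base = m ? m[1] : name;
    const list = groups.get(base) ?? [];
    list.push(name);
    groups.set(base, list);
  }
  return groups;
}
```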
### ESLint Scope
ESLint covers `src/` only. The `worker/`, `bot/`, and `scripts/` directories and `prisma/seed.ts` are excluded from linting.

## Docker Deployment

- `docker-compose.yml` — Production: app + worker + bot + db
- `docker-compose.dev.yml` — Dev: db + worker only (app runs locally)
- `docker-entrypoint.sh` — Runs migrations, optional seeding, then starts the app
- Bot service uses Docker Compose profiles (`bot` or `full`) — not started by default

## Testing

No test framework is configured. Testing is manual.
@@ -1,7 +1,7 @@
 import { config } from "./util/config.js";
 import { logger } from "./util/logger.js";
 import { db, pool } from "./db/client.js";
-import { createBotClient, closeBotClient, onBotUpdate } from "./tdlib/client.js";
+import { createBotClient, closeBotClient, onBotUpdate, getUser } from "./tdlib/client.js";
 import { startSendListener, stopSendListener } from "./send-listener.js";
 import { handleMessage } from "./commands.js";
 import { mkdir } from "fs/promises";
@@ -49,14 +49,27 @@ async function main(): Promise<void> {
       const userId = senderId.user_id as number;

       if (text && userId) {
-        handleMessage({
+        // Get user info for display name (async but fire-and-forget for perf)
+        (async () => {
+          let firstName = "User";
+          let lastName: string | undefined;
+          let username: string | undefined;
+          try {
+            const userInfo = await getUser(userId);
+            firstName = userInfo.firstName;
+            lastName = userInfo.lastName;
+            username = userInfo.username;
+          } catch {
+            // Fall back to defaults if getUser fails
+          }
+          await handleMessage({
             chatId: BigInt(chatId),
             userId: BigInt(userId),
             text,
-          firstName: "User", // TDLib provides this via a separate getUser call
-          username: undefined,
-        }).catch((err) => {
+            firstName,
+            lastName,
+            username,
+          });
+        })().catch((err) => {
           log.error({ err, chatId, userId }, "Failed to handle message");
         });
       }
@@ -182,7 +182,7 @@ async function handleNewPackage(payload: string): Promise<void> {
     userSubs.set(key, patterns);
   }

-  const creator = data.creator ? ` by ${data.creator}` : "";
+  const creator = data.creator ? ` by ${escapeHtml(data.creator)}` : "";
   for (const [telegramUserId, patterns] of userSubs) {
     const msg = [
       `🔔 <b>New package matching your subscriptions:</b>`,
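The fix above routes the user-controlled creator name through `escapeHtml` before it is embedded in an HTML-formatted Telegram message. The project's own helper is not shown in this diff; a minimal sketch of what such a function typically does, assuming Telegram's HTML parse mode rules:

```typescript
// Sketch of an escapeHtml helper; the project's actual implementation is not
// in this diff, so treat the exact behavior as an assumption.
// Telegram's HTML parse mode requires &, <, and > to be escaped in text.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;") // escape & first so entities are not double-escaped
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}
```

Without this, a creator name like `<b>` would either break the message markup or be rejected by Telegram's parser.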
@@ -143,6 +143,28 @@ export async function sendPhotoMessage(
   }
 }

+/**
+ * Get basic info about a Telegram user (name, username).
+ */
+export async function getUser(
+  userId: number
+): Promise<{ firstName: string; lastName?: string; username?: string }> {
+  if (!client) throw new Error("Bot client not initialized");
+  const user = (await client.invoke({
+    _: "getUser",
+    user_id: userId,
+  })) as {
+    first_name?: string;
+    last_name?: string;
+    usernames?: { editable_username?: string };
+  };
+  return {
+    firstName: user.first_name ?? "User",
+    lastName: user.last_name || undefined,
+    username: user.usernames?.editable_username || undefined,
+  };
+}
+
 /**
  * Get updates from TDLib. The bot listens for new messages this way.
  */
221  install.cmd  Normal file
@@ -0,0 +1,221 @@
@echo off
setlocal enabledelayedexpansion

REM Claude Code Windows CMD Bootstrap Script
REM Installs Claude Code for environments where PowerShell is not available

REM Parse command line argument
set "TARGET=%~1"
if "!TARGET!"=="" set "TARGET=latest"

REM Validate target parameter
if /i "!TARGET!"=="stable" goto :target_valid
if /i "!TARGET!"=="latest" goto :target_valid
echo !TARGET! | findstr /r "^[0-9][0-9]*\.[0-9][0-9]*\.[0-9][0-9]*" >nul
if !ERRORLEVEL! equ 0 goto :target_valid

echo Usage: %0 [stable^|latest^|VERSION] >&2
echo Example: %0 1.0.58 >&2
exit /b 1

:target_valid

REM Check for 64-bit Windows
if /i "%PROCESSOR_ARCHITECTURE%"=="AMD64" goto :arch_valid
if /i "%PROCESSOR_ARCHITECTURE%"=="ARM64" goto :arch_valid
if /i "%PROCESSOR_ARCHITEW6432%"=="AMD64" goto :arch_valid
if /i "%PROCESSOR_ARCHITEW6432%"=="ARM64" goto :arch_valid

echo Claude Code does not support 32-bit Windows. Please use a 64-bit version of Windows. >&2
exit /b 1

:arch_valid
REM Set constants
set "GCS_BUCKET=https://storage.googleapis.com/claude-code-dist-86c565f3-f756-42ad-8dfa-d59b1c096819/claude-code-releases"
set "DOWNLOAD_DIR=%USERPROFILE%\.claude\downloads"
REM Use native ARM64 binary on ARM64 Windows, x64 otherwise
if /i "%PROCESSOR_ARCHITECTURE%"=="ARM64" (
    set "PLATFORM=win32-arm64"
) else (
    set "PLATFORM=win32-x64"
)

REM Create download directory
if not exist "!DOWNLOAD_DIR!" mkdir "!DOWNLOAD_DIR!"

REM Check for curl availability
curl --version >nul 2>&1
if !ERRORLEVEL! neq 0 (
    echo curl is required but not available. Please install curl or use the PowerShell installer. >&2
    exit /b 1
)

REM Always download latest version (which has the most up-to-date installer)
call :download_file "!GCS_BUCKET!/latest" "!DOWNLOAD_DIR!\latest"
if !ERRORLEVEL! neq 0 (
    echo Failed to get latest version >&2
    exit /b 1
)

REM Read version from file
set /p VERSION=<"!DOWNLOAD_DIR!\latest"
del "!DOWNLOAD_DIR!\latest"

REM Download manifest
call :download_file "!GCS_BUCKET!/!VERSION!/manifest.json" "!DOWNLOAD_DIR!\manifest.json"
if !ERRORLEVEL! neq 0 (
    echo Failed to get manifest >&2
    exit /b 1
)

REM Extract checksum from manifest
call :parse_manifest "!DOWNLOAD_DIR!\manifest.json" "!PLATFORM!"
if !ERRORLEVEL! neq 0 (
    echo Platform !PLATFORM! not found in manifest >&2
    del "!DOWNLOAD_DIR!\manifest.json" 2>nul
    exit /b 1
)
del "!DOWNLOAD_DIR!\manifest.json"
REM Download binary
set "BINARY_PATH=!DOWNLOAD_DIR!\claude-!VERSION!-!PLATFORM!.exe"
call :download_file "!GCS_BUCKET!/!VERSION!/!PLATFORM!/claude.exe" "!BINARY_PATH!"
if !ERRORLEVEL! neq 0 (
    echo Failed to download binary >&2
    if exist "!BINARY_PATH!" del "!BINARY_PATH!"
    exit /b 1
)

REM Verify checksum
call :verify_checksum "!BINARY_PATH!" "!EXPECTED_CHECKSUM!"
if !ERRORLEVEL! neq 0 (
    echo Checksum verification failed >&2
    del "!BINARY_PATH!"
    exit /b 1
)

REM Run claude install to set up launcher and shell integration
echo Setting up Claude Code...
"!BINARY_PATH!" install "!TARGET!"
set "INSTALL_RESULT=!ERRORLEVEL!"

REM Clean up downloaded file
REM Wait a moment for any file handles to be released
timeout /t 1 /nobreak >nul 2>&1
del /f "!BINARY_PATH!" >nul 2>&1
if exist "!BINARY_PATH!" (
    echo Warning: Could not remove temporary file: !BINARY_PATH!
)

if !INSTALL_RESULT! neq 0 (
    echo Installation failed >&2
    exit /b 1
)

echo.
echo Installation complete^^!
echo.
exit /b 0
REM ============================================================================
REM SUBROUTINES
REM ============================================================================

:download_file
REM Downloads a file using curl
REM Args: %1=URL, %2=OutputPath
set "URL=%~1"
set "OUTPUT=%~2"

curl -fsSL "!URL!" -o "!OUTPUT!"
exit /b !ERRORLEVEL!

:parse_manifest
REM Parse JSON manifest to extract checksum for platform
REM Args: %1=ManifestPath, %2=Platform
set "MANIFEST_PATH=%~1"
set "PLATFORM_NAME=%~2"
set "EXPECTED_CHECKSUM="

REM Use findstr to find platform section, then look for checksum
set "FOUND_PLATFORM="
set "IN_PLATFORM_SECTION="

REM Read the manifest line by line
for /f "usebackq tokens=*" %%i in ("!MANIFEST_PATH!") do (
    set "LINE=%%i"

    REM Check if this line contains our platform
    echo !LINE! | findstr /c:"\"%PLATFORM_NAME%\":" >nul
    if !ERRORLEVEL! equ 0 (
        set "IN_PLATFORM_SECTION=1"
    )

    REM If we're in the platform section, look for checksum
    if defined IN_PLATFORM_SECTION (
        echo !LINE! | findstr /c:"\"checksum\":" >nul
        if !ERRORLEVEL! equ 0 (
            REM Extract checksum value
            for /f "tokens=2 delims=:" %%j in ("!LINE!") do (
                set "CHECKSUM_PART=%%j"
                REM Remove quotes, whitespace, and comma
                set "CHECKSUM_PART=!CHECKSUM_PART: =!"
                set "CHECKSUM_PART=!CHECKSUM_PART:"=!"
                set "CHECKSUM_PART=!CHECKSUM_PART:,=!"

                REM Check if it looks like a SHA256 (64 hex chars)
                if not "!CHECKSUM_PART!"=="" (
                    call :check_length "!CHECKSUM_PART!" 64
                    if !ERRORLEVEL! equ 0 (
                        set "EXPECTED_CHECKSUM=!CHECKSUM_PART!"
                        exit /b 0
                    )
                )
            )
        )

        REM Check if we've left the platform section (closing brace)
        echo !LINE! | findstr /c:"}" >nul
        if !ERRORLEVEL! equ 0 set "IN_PLATFORM_SECTION="
    )
)

if "!EXPECTED_CHECKSUM!"=="" exit /b 1
exit /b 0
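For comparison, the lookup `:parse_manifest` performs with `findstr` is a few lines in a language with a JSON parser. The manifest layout below is an assumption reconstructed from the fields the script reads (a platform key and its `checksum`):

```typescript
// Assumed manifest shape: { "platforms": { "win32-x64": { "checksum": "<sha256>" } } }
// — this nesting is inferred, not documented in the script.
interface Manifest {
  platforms: Record<string, { checksum: string }>;
}

function checksumFor(manifestJson: string, platform: string): string | null {
  const manifest = JSON.parse(manifestJson) as Manifest;
  const entry = manifest.platforms?.[platform];
  if (!entry) return null;
  // Mirror the script's sanity check: a SHA-256 hex digest is 64 characters
  return /^[0-9a-f]{64}$/i.test(entry.checksum) ? entry.checksum : null;
}
```

The batch version has to approximate this with line-by-line `findstr` matching because `cmd.exe` has no JSON parser.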
:check_length
REM Check if string length equals expected length
REM Args: %1=String, %2=ExpectedLength
set "STR=%~1"
set "EXPECTED_LEN=%~2"
set "LEN=0"
:count_loop
if "!STR:~%LEN%,1!"=="" goto :count_done
set /a LEN+=1
goto :count_loop
:count_done
if %LEN%==%EXPECTED_LEN% exit /b 0
exit /b 1

:verify_checksum
REM Verify file checksum using certutil
REM Args: %1=FilePath, %2=ExpectedChecksum
set "FILE_PATH=%~1"
set "EXPECTED=%~2"

for /f "skip=1 tokens=*" %%i in ('certutil -hashfile "!FILE_PATH!" SHA256') do (
    set "ACTUAL=%%i"
    set "ACTUAL=!ACTUAL: =!"
    if "!ACTUAL!"=="CertUtil:Thecommandcompletedsuccessfully." goto :verify_done
    if "!ACTUAL!" neq "" (
        if /i "!ACTUAL!"=="!EXPECTED!" (
            exit /b 0
        ) else (
            exit /b 1
        )
    )
)

:verify_done
exit /b 1
@@ -288,22 +288,6 @@ export async function setChannelType(
   }
 }

-export async function triggerChannelSync(): Promise<ActionResult> {
-  const admin = await requireAdmin();
-  if (!admin.success) return admin;
-
-  try {
-    // Signal the worker to do a channel sync via pg_notify
-    await prisma.$queryRawUnsafe(
-      `SELECT pg_notify('channel_sync', 'requested')`
-    );
-    revalidatePath(REVALIDATE_PATH);
-    return { success: true, data: undefined };
-  } catch {
-    return { success: false, error: "Failed to trigger channel sync" };
-  }
-}
-
 /**
  * Reset all scan progress for a channel so the worker will re-process it
  * from the very beginning on the next ingestion cycle.
@@ -36,11 +36,13 @@ async function main(): Promise<void> {
   // Graceful shutdown
   function shutdown(signal: string): void {
     log.info({ signal }, "Shutdown signal received");
-    stopScheduler();

     // Stop accepting new work
     stopFetchListener();

-    // Close DB connections
-    Promise.all([db.$disconnect(), pool.end()])
+    // Wait for any active cycle to finish before closing DB
+    stopScheduler()
+      .then(() => Promise.all([db.$disconnect(), pool.end()]))
       .then(() => {
         log.info("Shutdown complete");
         process.exit(0);
@@ -9,6 +9,7 @@ const log = childLogger("scheduler");
 let running = false;
 let timer: ReturnType<typeof setTimeout> | null = null;
 let cycleCount = 0;
+let activeCyclePromise: Promise<void> | null = null;

 /**
  * Maximum time for a single ingestion cycle (ms).
@@ -107,7 +108,9 @@ function scheduleNext(): void {
   );

   timer = setTimeout(async () => {
-    await runCycle();
+    activeCyclePromise = runCycle();
+    await activeCyclePromise;
+    activeCyclePromise = null;
     scheduleNext();
   }, delay);
 }
@@ -125,7 +128,9 @@ export async function startScheduler(): Promise<void> {
   );

   // Run immediately on start
-  await runCycle();
+  activeCyclePromise = runCycle();
+  await activeCyclePromise;
+  activeCyclePromise = null;

   // Then schedule recurring cycles
   scheduleNext();
@@ -146,11 +151,21 @@ export async function triggerImmediateCycle(): Promise<void> {

 /**
  * Stop the scheduler gracefully.
+ * Returns a promise that resolves when any active cycle finishes,
+ * so callers can wait before closing DB connections.
  */
-export function stopScheduler(): void {
+export function stopScheduler(): Promise<void> {
   if (timer) {
     clearTimeout(timer);
     timer = null;
   }
+  if (activeCyclePromise) {
+    log.info("Scheduler stopping — waiting for active cycle to finish");
+    return activeCyclePromise.finally(() => {
+      activeCyclePromise = null;
+      log.info("Scheduler stopped");
+    });
+  }
   log.info("Scheduler stopped");
+  return Promise.resolve();
 }
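The contract change above (returning `Promise<void>` even when no cycle is active) keeps the caller's teardown chain uniform. A dependency-free sketch of that pattern, with illustrative names rather than the project's actual identifiers:

```typescript
// "Always return a promise" stop pattern, as in stopScheduler above.
// Names here are illustrative, not the project's actual identifiers.
let activeWork: Promise<void> | null = null;

function stop(): Promise<void> {
  if (activeWork) {
    // Wait for in-flight work, then clear the handle
    return activeWork.finally(() => {
      activeWork = null;
    });
  }
  // Nothing running: resolve immediately so callers can always chain .then()
  return Promise.resolve();
}
```

Returning `Promise.resolve()` on the idle path is what lets the shutdown handler write `stop().then(closeDb)` without branching on scheduler state.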
@@ -107,12 +107,10 @@ export async function getForumTopicList(

   for (const t of result.topics) {
     if (!t.info?.message_thread_id) continue;
-    // Skip the "General" topic — it's not creator-specific
-    if (t.info.is_general) continue;

     topics.push({
       topicId: BigInt(t.info.message_thread_id),
-      name: t.info.name ?? "Unnamed",
+      name: t.info.is_general ? "General" : (t.info.name ?? "Unnamed"),
     });
   }
@@ -76,6 +76,10 @@ export async function uploadToChannel(
 /**
  * Send a single file message and wait for Telegram to confirm the upload.
  * Returns the final server-assigned message ID.
+ *
+ * IMPORTANT: The update listener is attached BEFORE sending the message to
+ * avoid a race where fast uploads (cached files) complete before the listener
+ * is registered, which would cause the promise to hang forever.
+ */
 async function sendAndWaitForUpload(
   client: Client,
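A toy demonstration of the race the comment above describes, with Node's `EventEmitter` standing in for the TDLib client (names are illustrative). Because `emit` is synchronous, a "send" that completes immediately would be missed by any listener attached afterwards:

```typescript
import { EventEmitter } from "node:events";

// If send() completes synchronously (e.g. a cached file), a listener
// attached only after sending would never observe the "done" event.
const bus = new EventEmitter();
const seen: string[] = [];

function send(): void {
  bus.emit("done", "msg-1"); // confirmation arrives immediately
}

// Attach the listener BEFORE sending, as the fixed code does
bus.on("done", (id: string) => seen.push(id));
send();
```

Swapping the last two statements reproduces the hang: `seen` stays empty and a promise waiting on the event would never settle.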
@@ -85,41 +89,10 @@ async function sendAndWaitForUpload(
   fileName: string,
   fileSizeMB: number
 ): Promise<bigint> {
-  // Send the message — this returns a temporary message immediately.
-  // Wrapped in withFloodWait to handle Telegram rate limits on upload.
-  const tempMsg = (await withFloodWait(
-    () =>
-      client.invoke({
-        _: "sendMessage",
-        chat_id: Number(chatId),
-        input_message_content: {
-          _: "inputMessageDocument",
-          document: {
-            _: "inputFileLocal",
-            path: filePath,
-          },
-          caption: caption
-            ? {
-                _: "formattedText",
-                text: caption,
-              }
-            : undefined,
-        },
-      }),
-    "sendMessage:upload"
-  )) as { id: number };
-
-  const tempMsgId = tempMsg.id;
-
-  log.debug(
-    { fileName, tempMsgId },
-    "Message queued, waiting for upload confirmation"
-  );
-
   // Wait for the actual upload to complete
   return new Promise<bigint>((resolve, reject) => {
     let settled = false;
     let lastLoggedPercent = 0;
+    let tempMsgId: number | null = null;

     // Timeout: 10 minutes per GB, minimum 10 minutes
     const timeoutMs = Math.max(
@@ -162,7 +135,7 @@ async function sendAndWaitForUpload(
       if (update?._ === "updateMessageSendSucceeded") {
         const msg = update.message;
         const oldMsgId = update.old_message_id;
-        if (oldMsgId === tempMsgId) {
+        if (tempMsgId !== null && oldMsgId === tempMsgId) {
           if (!settled) {
             settled = true;
             cleanup();
@@ -179,7 +152,7 @@ async function sendAndWaitForUpload(
       // Upload failed
       if (update?._ === "updateMessageSendFailed") {
         const oldMsgId = update.old_message_id;
-        if (oldMsgId === tempMsgId) {
+        if (tempMsgId !== null && oldMsgId === tempMsgId) {
           if (!settled) {
             settled = true;
             cleanup();
@@ -195,7 +168,47 @@ async function sendAndWaitForUpload(
       client.off("update", handleUpdate);
     };

+    // Attach listener BEFORE sending to avoid missing fast completions
     client.on("update", handleUpdate);
+
+    // Send the message — this returns a temporary message immediately.
+    // Wrapped in withFloodWait to handle Telegram rate limits on upload.
+    withFloodWait(
+      () =>
+        client.invoke({
+          _: "sendMessage",
+          chat_id: Number(chatId),
+          input_message_content: {
+            _: "inputMessageDocument",
+            document: {
+              _: "inputFileLocal",
+              path: filePath,
+            },
+            caption: caption
+              ? {
+                  _: "formattedText",
+                  text: caption,
+                }
+              : undefined,
+          },
+        }),
+      "sendMessage:upload"
+    )
+      .then((result) => {
+        const tempMsg = result as { id: number };
+        tempMsgId = tempMsg.id;
+        log.debug(
+          { fileName, tempMsgId },
+          "Message queued, waiting for upload confirmation"
+        );
+      })
+      .catch((err) => {
+        if (!settled) {
+          settled = true;
+          cleanup();
+          reject(err);
+        }
+      });
   });
 }
@@ -559,9 +559,11 @@ export async function runWorkerForAccount(
       }

       // ── Done ──
+      await throttled.flush();
       await completeIngestionRun(activeRunId, counters);
       accountLog.info({ counters }, "Ingestion run completed");
     } finally {
+      await throttled.flush();
       await closeTdlibClient(client);
     }
   } catch (err) {