14 Commits

Author SHA1 Message Date
527aca7c25 feat: add package grouping for Telegram album files
Groups related packages posted together in Telegram channels.
Auto-detects albums via media_album_id, supports manual grouping
from UI. Groups appear as collapsible rows in STL files table.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:46:52 +01:00
a4156b2ac6 fix: add race condition guard and null check in group queries
- createOrFindPackageGroup: catch unique constraint violation from
  concurrent creates and fall back to findFirst
- createManualGroup: guard against empty package results before
  accessing first element

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:45:29 +01:00
d50c68f67c feat: add package grouping UI with expand/collapse, selection, and manual grouping
- Update STL page to use listDisplayItems query for mixed package/group display
- Rewrite package-columns to handle StlTableRow union type (group headers + packages)
- Add group expand/collapse with chevron toggle and indented member rows
- Add checkbox selection with "Group N Selected" toolbar button and dialog
- Add inline group actions: rename, dissolve, send all, remove member
- Add clickable group preview thumbnail with file upload for preview images
- Extend DataTable with optional rowClassName prop for group row styling

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:39:23 +01:00
f6e7f5ed3c feat: add server actions for group management
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:34:29 +01:00
e7f213eec4 feat: add group preview image API endpoint
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:33:29 +01:00
20b7d28fdf feat: add listDisplayItems query, group CRUD, and search by group name
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:32:47 +01:00
21663fc29e feat: add PackageGroupRow and DisplayItem types
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:31:02 +01:00
218ccb9282 feat: add album grouping post-processing to worker pipeline
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:28:19 +01:00
b632533f54 feat: add createOrFindPackageGroup and linkPackagesToGroup worker queries
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:24:31 +01:00
4baf5aad83 feat: capture media_album_id from TDLib messages during scanning
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:23:47 +01:00
ad7790c07b feat: add mediaAlbumId to TelegramMessage and TelegramPhoto interfaces
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:23:11 +01:00
e4398caebe feat: add PackageGroup schema for album-based file grouping
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:21:52 +01:00
6eb7129637 docs: add package grouping design spec and implementation plan
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 21:40:13 +01:00
d6386209be fix: improve download/upload reliability and fix FILE_PARTS_INVALID
- Add downloadStarted flag to prevent false "stopped unexpectedly" errors
  when TDLib emits initial updateFile before download is active
- Add 5-minute stall detection for both downloads and uploads
- Reduce max split part size from 2GiB to 1950MiB to stay under
  Telegram's internal upload part count limits
- Increase timeouts from max(10min, 15min/GB) to max(15min, 20min/GB)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 21:40:00 +01:00
23 changed files with 4859 additions and 105 deletions



@@ -0,0 +1,241 @@
# Design: Search Match Indicators, Size Limit Increase, Skipped/Failed Files Overview
**Date:** 2026-03-24
**Status:** Approved
## Overview
Three related improvements to the STL packages system:
1. **Search match indicators** — Show which internal files matched a search query, with highlighted files in the drawer
2. **Size limit increase** — Raise the ingestion limit from 4 GB to 200 GB so large multipart archives aren't skipped
3. **Skipped/failed files overview** — Track and display archives that were skipped or failed, with retry capability
---
## Feature 1: Size Limit Increase
### Change
`worker/src/util/config.ts` line 6 — change default from `"4096"` to `"204800"`.
One-line change. The split/upload pipeline already handles arbitrary sizes. The 2 GB per-part Telegram API limit is a separate hard-coded constant and stays as-is.
### Impact
- Archives up to 200 GB will now be attempted
- Multipart archives where individual parts are under 2 GB (but total exceeds 4 GB) will no longer be skipped — these upload directly without any splitting
- Single files over 2 GB are automatically split into 2 GB parts (existing behavior)
- Temp disk usage during processing can now reach up to ~200 GB per archive
---
## Feature 2: Search Match Indicators
### Backend Changes
**File:** `src/lib/telegram/queries.ts` — `searchPackages()`
When `searchIn` is `"files"` or `"both"`, change the PackageFile query from `distinct` to a **grouped count**:
```typescript
// Current: findMany with select: { packageId }, distinct: ["packageId"]
// New: groupBy packageId with _count
const fileMatches = await prisma.packageFile.groupBy({
  by: ["packageId"],
  where: {
    OR: [
      { fileName: { contains: q, mode: "insensitive" } },
      { path: { contains: q, mode: "insensitive" } },
    ],
  },
  _count: { _all: true },
});
```
This returns `{ packageId: string, _count: { _all: number } }[]`.
Note: `PackageRow` in `package-columns.tsx` mirrors `PackageListItem` and must also receive the two new fields added below.
**File:** `src/lib/telegram/types.ts` — `PackageListItem`
Add two fields:
- `matchedFileCount: number` — how many files inside matched (0 if matched by package name only)
- `matchedByContent: boolean` — true if any files inside matched
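The mapping from the grouped count to these two fields can be sketched as a pure fold over the `groupBy` results (shapes and helper names here are illustrative, not the actual implementation):

```typescript
// Fold the groupBy results into a per-package lookup, then derive the
// two new PackageListItem fields from it.
interface FileMatch {
  packageId: string;
  _count: { _all: number };
}

function buildMatchLookup(fileMatches: FileMatch[]): Map<string, number> {
  return new Map(fileMatches.map((m) => [m.packageId, m._count._all]));
}

function withMatchFields<T extends { id: string }>(
  pkg: T,
  lookup: Map<string, number>,
): T & { matchedFileCount: number; matchedByContent: boolean } {
  const matchedFileCount = lookup.get(pkg.id) ?? 0;
  // matchedByContent is true exactly when at least one internal file matched
  return { ...pkg, matchedFileCount, matchedByContent: matchedFileCount > 0 };
}
```

A package matched only by name simply has no entry in the lookup and gets `0` / `false`.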
### Frontend Changes
**File:** `src/app/(app)/stls/page.tsx`
Pass the search term to `StlTable` as a new prop.
**File:** `src/app/(app)/stls/_components/stl-table.tsx`
Pass search term to columns via TanStack Table column meta.
**File:** `src/app/(app)/stls/_components/package-columns.tsx`
When search is active and `matchedByContent` is true, render a clickable badge below the filename: e.g., "3 file matches". Clicking opens the `PackageFilesDrawer` with a `highlightTerm` prop set to the search term.
**File:** `src/app/(app)/stls/_components/package-files-drawer.tsx`
- Accept optional `highlightTerm: string` prop
- Render full file tree as normal (all files visible)
- Files whose `fileName` or `path` case-insensitively contains `highlightTerm` get a subtle highlight (amber/yellow background on the row)
- Auto-expand folders that contain highlighted files
- The drawer's own search input remains independent
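The drawer-side matching and auto-expand rules above can be sketched as two small pure helpers (function names and the `DrawerFile` shape are assumptions for illustration):

```typescript
// A file is highlighted when its name or path contains the term
// case-insensitively; every ancestor folder of a highlighted file
// is auto-expanded.
interface DrawerFile {
  fileName: string;
  path: string; // e.g. "models/dragons/head.stl"
}

function isHighlighted(file: DrawerFile, highlightTerm: string): boolean {
  const term = highlightTerm.toLowerCase();
  return (
    file.fileName.toLowerCase().includes(term) ||
    file.path.toLowerCase().includes(term)
  );
}

function foldersToExpand(files: DrawerFile[], highlightTerm: string): Set<string> {
  const expanded = new Set<string>();
  for (const file of files) {
    if (!isHighlighted(file, highlightTerm)) continue;
    const segments = file.path.split("/").slice(0, -1); // drop the file name
    let prefix = "";
    for (const segment of segments) {
      prefix = prefix ? `${prefix}/${segment}` : segment;
      expanded.add(prefix); // add every ancestor folder path
    }
  }
  return expanded;
}
```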
### Data Flow
1. User types search term in STL table search input
2. URL updates with `?search=value`, page reloads
3. `page.tsx` calls `searchPackages()` with `searchIn: "both"`
4. Query returns packages with `matchedFileCount` and `matchedByContent`
5. Table renders "N file matches" badge on content-matched rows
6. User clicks badge -> drawer opens with full tree, matching files highlighted
7. Folders containing matches auto-expanded
---
## Feature 3: Skipped/Failed Files Overview
### Database Schema
New model in `prisma/schema.prisma`:
```prisma
enum SkipReason {
  SIZE_LIMIT
  DOWNLOAD_FAILED
  EXTRACT_FAILED
  UPLOAD_FAILED
}

model SkippedPackage {
  id              String          @id @default(cuid())
  fileName        String
  fileSize        BigInt
  reason          SkipReason
  errorMessage    String?
  sourceChannelId String
  sourceChannel   TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
  sourceMessageId BigInt
  sourceTopicId   BigInt?
  isMultipart     Boolean         @default(false)
  partCount       Int             @default(1)
  accountId       String
  account         TelegramAccount @relation(fields: [accountId], references: [id], onDelete: Cascade)
  createdAt       DateTime        @default(now())

  @@unique([sourceChannelId, sourceMessageId])
  @@index([reason])
  @@index([accountId])
  @@map("skipped_packages")
}
```
Reverse relations must be added to `TelegramChannel` and `TelegramAccount` models:
```prisma
// In TelegramChannel:
skippedPackages SkippedPackage[]
// In TelegramAccount:
skippedPackages SkippedPackage[]
```
### Worker Changes
**File:** `worker/src/worker.ts`
Extend `PipelineContext` interface to include `accountId` (derived from the ingestion run's account).
At each skip/failure point, upsert a `SkippedPackage` record:
- **Size limit skip** (line 784): reason `SIZE_LIMIT`, no error message
- **Download failure** (catch in download loop): reason `DOWNLOAD_FAILED` + error text
- **Extract/metadata failure** (catch in extract): reason `EXTRACT_FAILED` + error text
- **Upload failure** (catch in upload): reason `UPLOAD_FAILED` + error text
On **successful ingestion** of a package, delete any existing `SkippedPackage` with the same `(sourceChannelId, sourceMessageId)` — so successful retries clean up after themselves.
**File:** `worker/src/db/queries.ts`
Add functions:
- `upsertSkippedPackage(data)` — create or update skip record
- `deleteSkippedPackage(sourceChannelId, sourceMessageId)` — remove on success
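As a sketch of `upsertSkippedPackage`, the arguments could be assembled as below, with the compound unique key doubling as the upsert selector (field names follow the schema above; the actual Prisma call is omitted and `buildSkipUpsertArgs` is a hypothetical helper):

```typescript
type SkipReason = "SIZE_LIMIT" | "DOWNLOAD_FAILED" | "EXTRACT_FAILED" | "UPLOAD_FAILED";

interface SkipEvent {
  fileName: string;
  fileSize: bigint;
  reason: SkipReason;
  errorMessage?: string;
  sourceChannelId: string;
  sourceMessageId: bigint;
  sourceTopicId?: bigint;
  isMultipart?: boolean;
  partCount?: number;
  accountId: string;
}

function buildSkipUpsertArgs(e: SkipEvent) {
  // Shared payload for both the create and update branches, so a repeat
  // failure refreshes the reason/error text on the existing record.
  const data = {
    fileName: e.fileName,
    fileSize: e.fileSize,
    reason: e.reason,
    errorMessage: e.errorMessage ?? null,
    sourceTopicId: e.sourceTopicId ?? null,
    isMultipart: e.isMultipart ?? false,
    partCount: e.partCount ?? 1,
    accountId: e.accountId,
  };
  return {
    // Prisma names the compound selector after the @@unique fields.
    where: {
      sourceChannelId_sourceMessageId: {
        sourceChannelId: e.sourceChannelId,
        sourceMessageId: e.sourceMessageId,
      },
    },
    create: { ...data, sourceChannelId: e.sourceChannelId, sourceMessageId: e.sourceMessageId },
    update: data,
  };
}
```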
### Retry Mechanism
Retrying a skipped package:
1. Delete the `SkippedPackage` record
2. Find the `AccountChannelMap` record using both `accountId` and `sourceChannelId`, then reset its `lastProcessedMessageId` to `sourceMessageId - 1` (only if less than current watermark)
3. If `sourceTopicId` is non-null, also reset the corresponding `TopicProgress.lastProcessedMessageId` for that topic
4. The next ingestion cycle picks up the message and re-attempts processing
For "Retry All" (e.g., all `SIZE_LIMIT` skips after raising the limit):
- Delete all matching `SkippedPackage` records
- For each affected (account, channel) pair, reset `AccountChannelMap` watermark to the minimum `sourceMessageId - 1` among deleted records
- For each affected (account, channel, topic) triple, reset `TopicProgress` watermark similarly
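The per-pair watermark computation for "Retry All" can be sketched as a pure pass over the deleted records (shapes are illustrative):

```typescript
// For each affected (account, channel) pair, the reset target is the
// minimum sourceMessageId - 1 among the deleted skip records.
interface SkippedRecord {
  accountId: string;
  sourceChannelId: string;
  sourceMessageId: bigint;
}

function computeWatermarkResets(records: SkippedRecord[]): Map<string, bigint> {
  const resets = new Map<string, bigint>();
  for (const r of records) {
    const key = `${r.accountId}:${r.sourceChannelId}`;
    const target = r.sourceMessageId - 1n;
    const current = resets.get(key);
    if (current === undefined || target < current) resets.set(key, target);
  }
  return resets;
}
```

The same shape works for the (account, channel, topic) triples by extending the key.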
**Note on behavioral distinction:** `DOWNLOAD_FAILED`, `EXTRACT_FAILED`, and `UPLOAD_FAILED` archives already naturally retry because the worker does not advance the watermark past failed sets. The `SkippedPackage` record provides visibility into these failures. The explicit retry/watermark reset is only strictly needed for `SIZE_LIMIT` skips (where the watermark does advance past the skipped message). The UI should present both types but the retry button is most impactful for `SIZE_LIMIT` skips.
**Performance note:** "Retry All" can cause the worker to re-scan large message ranges. The existing dedup logic (`packageExistsBySourceMessage`) ensures already-ingested packages are skipped quickly, but there is a scanning cost proportional to the number of messages between the reset watermark and the current position.
### Frontend Changes
**File:** `src/app/(app)/stls/_components/stl-table.tsx`
Add a "Skipped / Failed" tab alongside the main packages table.
**New file:** `src/app/(app)/stls/_components/skipped-packages-tab.tsx`
Table columns:
- **fileName** — archive name
- **fileSize** — formatted size
- **reason** — color-coded badge: `SIZE_LIMIT` (yellow), `DOWNLOAD_FAILED` (red), `EXTRACT_FAILED` (red), `UPLOAD_FAILED` (red)
- **errorMessage** — truncated with expandable tooltip/popover for full text
- **channel** — source channel title
- **createdAt** — when the skip/failure was recorded
Actions:
- **Retry** button per row — server action that deletes record + resets watermark
- **Retry All** button in the header — bulk retry, filterable by reason
**File:** `src/app/(app)/stls/page.tsx`
Fetch skipped packages count (for tab badge) alongside existing queries.
**File:** `src/data/` or `src/lib/telegram/queries.ts`
Add query functions:
- `listSkippedPackages(options)` — paginated list with reason filter
- `countSkippedPackages()` — for tab badge
- `retrySkippedPackage(id)` — delete record + reset watermark
- `retryAllSkippedPackages(reason?)` — bulk retry
**File:** `src/app/(app)/stls/actions.ts`
Add server actions:
- `retrySkippedPackageAction(id)`
- `retryAllSkippedPackagesAction(reason?)`
---
## Files to Create/Modify
### Create
- `src/app/(app)/stls/_components/skipped-packages-tab.tsx` — skipped packages table UI
- Prisma migration for `SkippedPackage` model
### Modify
- `worker/src/util/config.ts` — raise default max size
- `worker/src/worker.ts` — record skips/failures, clean up on success
- `worker/src/db/queries.ts` — add skip record CRUD functions
- `prisma/schema.prisma` — add `SkippedPackage` model and `SkipReason` enum
- `src/lib/telegram/queries.ts` — modify `searchPackages()` for match counts, add skipped package queries
- `src/lib/telegram/types.ts` — add `matchedFileCount`/`matchedByContent` to `PackageListItem`, add skipped package types
- `src/app/(app)/stls/page.tsx` — pass search term, fetch skipped count, add tab
- `src/app/(app)/stls/_components/stl-table.tsx` — accept search prop, render tabs
- `src/app/(app)/stls/_components/package-columns.tsx` — render match badge
- `src/app/(app)/stls/_components/package-files-drawer.tsx` — accept highlightTerm, highlight matching files, auto-expand matched folders
- `src/app/(app)/stls/actions.ts` — add retry server actions


@@ -0,0 +1,246 @@
# Package Grouping Design
## Overview
Add the ability to group related packages that were posted together in a Telegram channel (e.g., "DUNGEON BLOCKS - Colossal Dungeon" with 6 separate archive files). Groups appear as collapsible rows in the STL files table, with support for both automatic detection via Telegram album IDs and manual grouping through the UI.
## Goals
- Automatically detect and group files posted together in Telegram (same `media_album_id`)
- Display groups as collapsed rows in the STL table with aggregated metadata
- Allow manual grouping/ungrouping of packages via the UI
- Support editable group names and preview images
- Enable "Send All" to deliver every package in a group via the bot
## Non-Goals
- Merging grouped packages into a single Package record (each stays independent)
- Time-proximity heuristics for grouping (too error-prone)
- Grouping across different source channels
---
## Data Model
### New `PackageGroup` Table
```prisma
model PackageGroup {
  id              String   @id @default(cuid())
  name            String
  mediaAlbumId    String?
  sourceChannelId String
  previewData     Bytes?
  createdAt       DateTime @default(now())
  updatedAt       DateTime @updatedAt

  packages      Package[]
  sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)

  @@unique([mediaAlbumId, sourceChannelId])
  @@index([sourceChannelId])
  @@map("package_groups")
}
```
### Package Model Changes
Add optional group membership:
```prisma
model Package {
  // ... existing fields ...
  packageGroupId String?
  packageGroup   PackageGroup? @relation(fields: [packageGroupId], references: [id], onDelete: SetNull)

  @@index([packageGroupId])
}
```
### TelegramChannel Model Changes
Add back-relation for the new `PackageGroup` model:
```prisma
model TelegramChannel {
  // ... existing fields and relations ...
  packageGroups PackageGroup[]
}
```
### Key Decisions
- `mediaAlbumId` is `String?` (TDLib int64 stringified) — only used for dedup lookups, avoids BigInt complexity
- `@@unique([mediaAlbumId, sourceChannelId])` prevents duplicate album-derived groups when re-scanning. PostgreSQL treats NULLs as distinct in unique constraints, so manually-created groups (with `mediaAlbumId = null`) are not constrained by this — which is correct behavior
- Idempotency for album groups uses `findFirst({ where: { mediaAlbumId, sourceChannelId } })` + conditional `create`, not `upsert`, because Prisma does not support `upsert` on compound unique keys with nullable fields
- `onDelete: SetNull` on `Package.packageGroup` means dissolving a group automatically unlinks all members
- `onDelete: Cascade` on `PackageGroup.sourceChannel` means deleting a channel cleans up its groups
- `sourceTopicId` is omitted from `PackageGroup` — it can be inferred from member packages, and manual groups may span topics
- `@@map("package_groups")` follows the project's snake_case table naming convention
- `previewData` stores JPEG thumbnail bytes directly on the group (same pattern as Package)
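The find-then-create idempotency pattern above is racy under concurrent scans; per the follow-up fix commit, the losing create catches the unique-constraint error and re-runs the lookup. A minimal sketch of that guard, with the find/create/error-check dependencies injected so the pattern is shown without a live Prisma client:

```typescript
// Generic create-or-find with a race-condition guard: if our create loses
// to a concurrent one, the unique violation is caught and the winner's
// row is fetched instead.
async function createOrFind<T>(
  find: () => Promise<T | null>,
  create: () => Promise<T>,
  isUniqueViolation: (err: unknown) => boolean,
): Promise<T> {
  const existing = await find();
  if (existing) return existing;
  try {
    return await create();
  } catch (err) {
    // Another worker created the row between our find and our create.
    if (isUniqueViolation(err)) {
      const winner = await find();
      if (winner) return winner;
    }
    throw err;
  }
}
```

With Prisma, `isUniqueViolation` would check for error code `P2002`.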
---
## Worker Changes
### TelegramMessage Interface
Add optional `mediaAlbumId` field:
```typescript
export interface TelegramMessage {
  id: bigint;
  fileName: string;
  fileId: string;
  fileSize: bigint;
  date: Date;
  mediaAlbumId?: string; // Absent or "0" when not part of an album
}
```
The field is optional to minimize call-site changes. The grouping step treats `undefined` and `"0"` equivalently as "not part of an album."
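The equivalence rule implies a small normalization helper along these lines (name is illustrative):

```typescript
// Normalize TDLib's media_album_id: undefined, empty, and "0" all mean
// "not part of an album".
function normalizeAlbumId(mediaAlbumId?: string): string | null {
  if (!mediaAlbumId || mediaAlbumId === "0") return null;
  return mediaAlbumId;
}
```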
### TelegramPhoto Interface
Add optional `mediaAlbumId` field:
```typescript
export interface TelegramPhoto {
  id: bigint;
  date: Date;
  caption: string;
  fileId: string;
  fileSize: number;
  mediaAlbumId?: string; // For album-to-preview correlation
}
```
### Channel Scanning
In `getChannelMessages()`, read `media_album_id` from the TDLib message object (already present in TDLib responses, just not captured today). Add `media_album_id?: string` to the `TdMessage` interface and pass through to both `TelegramMessage` and `TelegramPhoto`.
The document pass and photo pass already run as separate loops over `searchChatMessages`. Both loops capture `media_album_id` independently. Correlation happens at grouping time: album photos are matched to album documents by comparing their `mediaAlbumId` values, not at scan time.
### Group Creation (Post-Processing)
After each scan cycle's packages are individually processed (downloaded, hashed, uploaded, indexed), a post-processing step handles grouping:
1. Collect all packages from the current scan batch that share the same non-zero `mediaAlbumId`
2. For each distinct `mediaAlbumId`, check if a `PackageGroup` already exists via `findFirst({ where: { mediaAlbumId, sourceChannelId } })`
3. If no group exists, create one:
- **Name:** caption of the first message in the album (falls back to first file's base name)
- **Preview:** find a `TelegramPhoto` from the scan's `photos[]` array with the same `mediaAlbumId`. If found, download via `downloadPhotoThumbnail`. If not, the group starts with no preview (can be added in UI later)
4. Link all member packages via an idempotent `updateMany` — sets `packageGroupId` on all packages whose `sourceMessageId` is in the album's message set. This handles both newly-indexed packages and previously-indexed ones that were created in an earlier partial scan (e.g., if one package failed and was retried later)
The per-package pipeline is unchanged — each file is still downloaded, hashed, deduped, split, uploaded, and indexed independently. Grouping is a layer on top.
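The collection step (1) can be sketched as a pure grouping pass over the scan batch (the `ScannedPackage` shape is illustrative):

```typescript
interface ScannedPackage {
  id: string;
  sourceMessageId: bigint;
  mediaAlbumId?: string;
}

// Bucket the batch by album id, skipping packages that are not part of
// an album (undefined or "0" media_album_id).
function collectAlbums(batch: ScannedPackage[]): Map<string, ScannedPackage[]> {
  const albums = new Map<string, ScannedPackage[]>();
  for (const pkg of batch) {
    if (!pkg.mediaAlbumId || pkg.mediaAlbumId === "0") continue;
    const members = albums.get(pkg.mediaAlbumId) ?? [];
    members.push(pkg);
    albums.set(pkg.mediaAlbumId, members);
  }
  return albums;
}
```

Each resulting bucket then drives the find-or-create and `updateMany` linking steps.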
---
## Query Layer
### Paginated Listing with Groups
The STL table shows "display items" — either a group (collapsed) or a standalone package. Pagination operates on display items so that a group occupies exactly one slot regardless of member count.
**Two-step query approach** (handles filters correctly):
**Step 1 — Find matching display item IDs:**
```sql
-- Find all group IDs and standalone package IDs where at least one member matches filters
SELECT DISTINCT
       COALESCE(p."packageGroupId", p.id) AS display_id,
       CASE WHEN p."packageGroupId" IS NOT NULL THEN 'group' ELSE 'package' END AS display_type,
       MAX(p."indexedAt") AS sort_date
FROM packages p
LEFT JOIN package_groups pg ON pg.id = p."packageGroupId"
WHERE 1=1
  -- Optional filters applied here (creator, tags, search text, channelId)
GROUP BY COALESCE(p."packageGroupId", p.id),
         CASE WHEN p."packageGroupId" IS NOT NULL THEN 'group' ELSE 'package' END
ORDER BY sort_date DESC
LIMIT $1 OFFSET $2
```
**Step 2 — Fetch full data:**
For groups on the current page, fetch all member packages (including those that didn't match filters — the group appears because at least one member matched, but the expanded view shows all members). For standalone packages, fetch the full package data.
**Count query** (for pagination total):
```sql
SELECT COUNT(*) FROM (
  SELECT DISTINCT COALESCE(p."packageGroupId", p.id)
  FROM packages p
  WHERE 1=1
  -- Same filters as step 1
) AS display_items
```
### Group Row Aggregates
Computed in the step 2 fetch: total file size (sum), total file count (sum), combined tags (array union), member package count per group. These populate the collapsed group row.
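A sketch of that aggregate computation for a collapsed group row (the `MemberPackage` shape is illustrative; members are assumed non-empty, which holds since a group appears only when at least one member matched):

```typescript
interface MemberPackage {
  fileSize: bigint;
  fileCount: number;
  tags: string[];
  indexedAt: Date;
}

function computeGroupAggregates(members: MemberPackage[]) {
  return {
    totalFileSize: members.reduce((sum, m) => sum + m.fileSize, 0n), // BigInt sum
    totalFileCount: members.reduce((sum, m) => sum + m.fileCount, 0),
    combinedTags: [...new Set(members.flatMap((m) => m.tags))], // array union
    latestIndexedAt: new Date(Math.max(...members.map((m) => m.indexedAt.getTime()))),
    packageCount: members.length,
  };
}
```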
### Search
`searchPackages` adds `PackageGroup.name` to search targets via a `LEFT JOIN` to `package_groups`. If any package in a group matches by name/file content, or the group name matches, the whole group appears.
### Filtering
Creator/tag filters apply to member packages. A group appears if any member matches the filter. The group row shows aggregates of all members (not just matching ones).
### New Query Functions
| Function | Purpose |
|----------|---------|
| `listDisplayItems(page, limit, filters)` | Two-step paginated query returning groups + standalone packages |
| `getDisplayItemCount(filters)` | Count of display items for pagination total |
| `getPackageGroup(groupId)` | Group metadata + all member packages |
| `updatePackageGroupName(groupId, name)` | Rename group |
| `updatePackageGroupPreview(groupId, previewData)` | Replace group preview |
| `addPackagesToGroup(packageIds, groupId)` | Manual grouping — add to existing group |
| `removePackageFromGroup(packageId)` | Ungroup single package |
| `createManualGroup(name, packageIds)` | Create new group from UI |
| `dissolveGroup(groupId)` | Ungroup all members, delete group record |
For manual grouping of packages that already belong to different groups: the selected packages are linked to the target group first, and any source group left with no members is then dissolved. Non-selected members of source groups remain in their original group.
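The "which source groups became empty" check can be sketched as a pure function (shapes are illustrative):

```typescript
// A source group should be dissolved when every one of its members was
// selected for the move to the target group.
function groupsToDissolve(
  groupMembers: Map<string, string[]>, // groupId -> member package IDs
  movedPackageIds: Set<string>,
): string[] {
  const empty: string[] = [];
  for (const [groupId, members] of groupMembers) {
    if (members.every((id) => movedPackageIds.has(id))) empty.push(groupId);
  }
  return empty;
}
```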
---
## UI Changes
### STL Table — Group Rows
- **Collapsed (default):** Single row showing preview thumbnail, group name (editable inline), archive type badge ("Mixed" if heterogeneous), combined size, combined file count, combined tags (editable), source channel, latest `indexedAt`, actions
- **Expanded:** Chevron toggle reveals member packages as indented sub-rows with their existing columns and per-package actions
- Chevron icon on the left of the row toggles expand/collapse
**Loading strategy:** Member packages for all groups on the current page are prefetched in a single batched query during the step 2 fetch. This means expand/collapse is instant (no on-demand loading) and avoids per-row loading states.
### Group Row Actions
- **Send All** — Queues bot send requests for every package in the group. Checks for existing PENDING/SENDING requests per package to avoid duplicates.
- **View Files** — Opens file drawer showing all member packages' files, separated by package name headers
- **Dissolve Group** — Ungroups all members (confirmation required)
### Individual Package Actions (Within a Group)
- Existing: Send, View Files
- New: "Remove from group" in dropdown menu
### Manual Grouping
- Checkbox selection column on package rows
- When 2+ packages selected, a "Group Selected" button appears in the table toolbar
- Prompts for a group name, creates the group
- If selected packages belong to existing groups, those packages are moved to the new group. Source groups that become empty are automatically dissolved.
### Preview Editing
- Click the group's preview thumbnail to upload a replacement image
- Same upload flow as individual packages (existing component reuse)
### No Changes To
- Skipped/failed packages tab
- Package detail drawer internals
- Search UI (just broader matching behind the scenes)


@@ -0,0 +1,30 @@
-- AlterTable
ALTER TABLE "packages" ADD COLUMN "packageGroupId" TEXT;
-- CreateTable
CREATE TABLE "package_groups" (
    "id" TEXT NOT NULL,
    "name" TEXT NOT NULL,
    "mediaAlbumId" TEXT,
    "sourceChannelId" TEXT NOT NULL,
    "previewData" BYTEA,
    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updatedAt" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "package_groups_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "package_groups_sourceChannelId_idx" ON "package_groups"("sourceChannelId");
-- CreateIndex
CREATE UNIQUE INDEX "package_groups_mediaAlbumId_sourceChannelId_key" ON "package_groups"("mediaAlbumId", "sourceChannelId");
-- CreateIndex
CREATE INDEX "packages_packageGroupId_idx" ON "packages"("packageGroupId");
-- AddForeignKey
ALTER TABLE "packages" ADD CONSTRAINT "packages_packageGroupId_fkey" FOREIGN KEY ("packageGroupId") REFERENCES "package_groups"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "package_groups" ADD CONSTRAINT "package_groups_sourceChannelId_fkey" FOREIGN KEY ("sourceChannelId") REFERENCES "telegram_channels"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -432,6 +432,7 @@ model TelegramChannel {
accountMaps AccountChannelMap[]
packages Package[]
skippedPackages SkippedPackage[]
packageGroups PackageGroup[]
@@index([type, isActive])
@@index([category])
@@ -474,10 +475,12 @@ model Package {
tags String[] @default([])
previewData Bytes? // JPEG thumbnail from nearby Telegram photo (stored as raw bytes)
previewMsgId BigInt? // Telegram message ID of the matched photo
packageGroupId String?
indexedAt DateTime @default(now())
createdAt DateTime @default(now())
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id])
packageGroup PackageGroup? @relation(fields: [packageGroupId], references: [id], onDelete: SetNull)
files PackageFile[]
ingestionRun IngestionRun? @relation(fields: [ingestionRunId], references: [id])
ingestionRunId String?
@@ -491,6 +494,7 @@ model Package {
@@index([indexedAt])
@@index([archiveType])
@@index([creator])
@@index([packageGroupId])
@@map("packages")
}
@@ -512,6 +516,23 @@ model PackageFile {
@@map("package_files")
}
model PackageGroup {
id String @id @default(cuid())
name String
mediaAlbumId String?
sourceChannelId String
previewData Bytes?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
packages Package[]
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
@@unique([mediaAlbumId, sourceChannelId])
@@index([sourceChannelId])
@@map("package_groups")
}
model IngestionRun {
id String @id @default(cuid())
accountId String


@@ -1,10 +1,11 @@
"use client";
import { type ColumnDef } from "@tanstack/react-table";
import { FileArchive, Eye } from "lucide-react";
import { FileArchive, Eye, ChevronRight, Layers, Ungroup, Send, ImagePlus } from "lucide-react";
import { DataTableColumnHeader } from "@/components/shared/data-table-column-header";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { Checkbox } from "@/components/ui/checkbox";
import { SendToTelegramButton } from "./send-to-telegram-button";
export interface PackageRow {
@@ -25,6 +26,34 @@ export interface PackageRow {
};
matchedFileCount: number;
matchedByContent: boolean;
packageGroupId?: string | null;
}
export interface GroupHeaderRow {
_rowType: "group";
id: string;
name: string;
hasPreview: boolean;
totalFileSize: string;
totalFileCount: number;
packageCount: number;
combinedTags: string[];
archiveTypes: ("ZIP" | "RAR" | "SEVEN_Z" | "DOCUMENT")[];
latestIndexedAt: string;
sourceChannel: { id: string; title: string };
_expanded: boolean;
}
export interface PackageTableRow extends PackageRow {
_rowType: "package";
_groupId: string | null;
_isGroupMember: boolean;
}
export type StlTableRow = GroupHeaderRow | PackageTableRow;
function isGroupRow(row: StlTableRow): row is GroupHeaderRow {
return row._rowType === "group";
}
interface PackageColumnsProps {
@@ -32,9 +61,17 @@ interface PackageColumnsProps {
onSetCreator: (pkg: PackageRow) => void;
onSetTags: (pkg: PackageRow) => void;
searchTerm: string;
onToggleGroup: (groupId: string) => void;
onRenameGroup: (groupId: string, currentName: string) => void;
onDissolveGroup: (groupId: string) => void;
onSendAllInGroup: (groupId: string) => void;
onRemoveFromGroup: (packageId: string) => void;
onGroupPreviewUpload: (groupId: string) => void;
selectedPackages: Set<string>;
onToggleSelect: (packageId: string) => void;
}
function formatBytes(bytesStr: string): string {
export function formatBytes(bytesStr: string): string {
const bytes = Number(bytesStr);
if (bytes === 0) return "0 B";
const k = 1024;
@@ -61,107 +98,254 @@ function PreviewCell({ pkg }: { pkg: PackageRow }) {
);
}
function GroupPreviewCell({
group,
onUpload,
}: {
group: GroupHeaderRow;
onUpload: (groupId: string) => void;
}) {
if (group.hasPreview) {
return (
<button
className="relative group/preview cursor-pointer"
onClick={() => onUpload(group.id)}
title="Click to change preview image"
>
<img
src={`/api/groups/${group.id}/preview`}
alt=""
className="h-9 w-9 rounded-md object-cover bg-muted"
loading="lazy"
/>
<div className="absolute inset-0 flex items-center justify-center rounded-md bg-black/50 opacity-0 group-hover/preview:opacity-100 transition-opacity">
<ImagePlus className="h-3.5 w-3.5 text-white" />
</div>
</button>
);
}
return (
<button
className="flex h-9 w-9 items-center justify-center rounded-md bg-muted hover:bg-muted/80 transition-colors cursor-pointer"
onClick={() => onUpload(group.id)}
title="Click to add preview image"
>
<Layers className="h-4 w-4 text-muted-foreground" />
</button>
);
}
export function getPackageColumns({
onViewFiles,
onSetCreator,
onSetTags,
searchTerm,
}: PackageColumnsProps): ColumnDef<PackageRow, unknown>[] {
onToggleGroup,
onRenameGroup,
onDissolveGroup,
onSendAllInGroup,
onRemoveFromGroup,
onGroupPreviewUpload,
selectedPackages,
onToggleSelect,
}: PackageColumnsProps): ColumnDef<StlTableRow, unknown>[] {
return [
{
id: "select",
header: "",
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) return null;
return (
<Checkbox
checked={selectedPackages.has(data.id)}
onCheckedChange={() => onToggleSelect(data.id)}
aria-label="Select package"
className="translate-y-[2px]"
/>
);
},
enableHiding: false,
enableSorting: false,
size: 32,
},
{
id: "preview",
header: "",
cell: ({ row }) => <PreviewCell pkg={row.original} />,
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
return (
<div className="flex items-center gap-1">
<button
className="shrink-0 p-0.5 cursor-pointer"
onClick={() => onToggleGroup(data.id)}
aria-label={data._expanded ? "Collapse group" : "Expand group"}
>
<ChevronRight
className={`h-4 w-4 text-muted-foreground transition-transform ${
data._expanded ? "rotate-90" : ""
}`}
/>
</button>
<GroupPreviewCell group={data} onUpload={onGroupPreviewUpload} />
</div>
);
}
return (
<div className={data._isGroupMember ? "pl-5" : ""}>
<PreviewCell pkg={data} />
</div>
);
},
enableHiding: false,
enableSorting: false,
size: 52,
size: 72,
},
{
accessorKey: "fileName",
header: ({ column }) => <DataTableColumnHeader column={column} title="File Name" />,
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
return (
<div className="min-w-0">
<div className="flex items-center gap-2">
<button
className="font-semibold truncate max-w-[300px] cursor-pointer hover:underline text-left"
onClick={() => onRenameGroup(data.id, data.name)}
title="Click to rename group"
>
{data.name}
</button>
<Badge variant="secondary" className="text-[10px] shrink-0">
{data.packageCount} pkg{data.packageCount !== 1 ? "s" : ""}
</Badge>
</div>
</div>
);
}
return (
<div className="min-w-0">
<div className="flex items-center gap-2">
<span className="font-medium truncate max-w-[300px]">{data.fileName}</span>
{data.isMultipart && (
<Badge variant="outline" className="text-[10px] shrink-0">
Multi
</Badge>
)}
</div>
{searchTerm && data.matchedByContent && (
<button
className="text-[11px] text-amber-500 hover:text-amber-400 hover:underline cursor-pointer mt-0.5"
onClick={() => onViewFiles(data)}
>
{data.matchedFileCount.toLocaleString()} file match{data.matchedFileCount !== 1 ? "es" : ""}
</button>
)}
</div>
);
},
enableHiding: false,
},
{
accessorKey: "archiveType",
header: ({ column }) => <DataTableColumnHeader column={column} title="Type" />,
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
const types = data.archiveTypes;
if (types.length === 1) {
return (
<Badge variant="secondary" className="text-[10px]">
{types[0]}
</Badge>
);
}
return (
<Badge variant="secondary" className="text-[10px]">
Mixed
</Badge>
);
}
return (
<Badge variant="secondary" className="text-[10px]">
{data.archiveType}
</Badge>
);
},
},
{
accessorKey: "fileSize",
header: ({ column }) => <DataTableColumnHeader column={column} title="Size" />,
cell: ({ row }) => {
const data = row.original;
const size = isGroupRow(data) ? data.totalFileSize : data.fileSize;
return (
<span className="text-sm text-muted-foreground">
{formatBytes(size)}
</span>
);
},
},
{
accessorKey: "fileCount",
header: ({ column }) => <DataTableColumnHeader column={column} title="Files" />,
cell: ({ row }) => {
const data = row.original;
const count = isGroupRow(data) ? data.totalFileCount : data.fileCount;
return (
<span className="text-sm">
{count.toLocaleString()}
</span>
);
},
},
{
accessorKey: "creator",
header: ({ column }) => <DataTableColumnHeader column={column} title="Creator" />,
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
return <span className="text-sm text-muted-foreground">{"\u2014"}</span>;
}
return (
<button
className="text-sm text-muted-foreground truncate max-w-[160px] block hover:text-foreground hover:underline cursor-pointer text-left"
onClick={() => onSetCreator(data)}
title="Click to edit creator"
>
{data.creator || "\u2014"}
</button>
);
},
},
{
id: "tags",
header: ({ column }) => <DataTableColumnHeader column={column} title="Tags" />,
cell: ({ row }) => {
const data = row.original;
const tags = isGroupRow(data) ? data.combinedTags : data.tags;
if (tags.length === 0) {
if (isGroupRow(data)) {
return <span className="text-sm text-muted-foreground">{"\u2014"}</span>;
}
return (
<button
className="text-sm text-muted-foreground hover:text-foreground cursor-pointer"
onClick={() => onSetTags(data)}
title="Click to add tags"
>
{"\u2014"}
</button>
);
}
const clickHandler = isGroupRow(data) ? undefined : () => onSetTags(data as PackageTableRow);
return (
<button
className={`flex flex-wrap gap-1 ${clickHandler ? "cursor-pointer" : "cursor-default"}`}
onClick={clickHandler}
title={clickHandler ? "Click to edit tags" : undefined}
>
{tags.map((tag) => (
<Badge
@@ -175,7 +359,10 @@ export function getPackageColumns({
</button>
);
},
accessorFn: (row) => {
if (isGroupRow(row)) return row.combinedTags.join(", ");
return row.tags.join(", ");
},
},
{
id: "channel",
@@ -190,31 +377,73 @@ export function getPackageColumns({
{
accessorKey: "indexedAt",
header: ({ column }) => <DataTableColumnHeader column={column} title="Indexed" />,
cell: ({ row }) => {
const data = row.original;
const date = isGroupRow(data) ? data.latestIndexedAt : data.indexedAt;
return (
<span className="text-sm text-muted-foreground">
{new Date(date).toLocaleDateString()}
</span>
);
},
},
{
id: "actions",
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
return (
<div className="flex items-center gap-0.5">
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onSendAllInGroup(data.id)}
title="Send all packages in group"
>
<Send className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onDissolveGroup(data.id)}
title="Dissolve group"
>
<Ungroup className="h-4 w-4" />
</Button>
</div>
);
}
return (
<div className="flex items-center gap-0.5">
<SendToTelegramButton
packageId={data.id}
packageName={data.fileName}
variant="icon"
/>
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onViewFiles(data)}
>
<Eye className="h-4 w-4" />
</Button>
{data._isGroupMember && (
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onRemoveFromGroup(data.id)}
title="Remove from group"
>
<Ungroup className="h-3.5 w-3.5" />
</Button>
)}
</div>
);
},
enableHiding: false,
},
];
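The cell renderers above all branch on an `isGroupRow` helper and a `StlTableRow` union that this hunk does not show. A minimal sketch of what those shapes would look like, inferred from the fields the columns read (field names mirror the diff; the real definitions live in `package-columns`):

```typescript
// Hypothetical reconstruction of the row union assumed by the columns above.
// Inferred from usage in the diff, not copied from the source file.
interface GroupHeaderRow {
  _rowType: "group";
  id: string;
  name: string;
  packageCount: number;
  combinedTags: string[];
  archiveTypes: string[];
  totalFileSize: string;
  totalFileCount: number;
  latestIndexedAt: string;
  hasPreview: boolean;
  _expanded: boolean;
}

interface PackageTableRow {
  _rowType: "package";
  id: string;
  fileName: string;
  fileSize: string;
  fileCount: number;
  archiveType: string;
  isMultipart: boolean;
  creator: string | null;
  tags: string[];
  indexedAt: string;
  matchedByContent: boolean;
  matchedFileCount: number;
  _groupId: string | null;
  _isGroupMember: boolean;
}

type StlTableRow = GroupHeaderRow | PackageTableRow;

// Type guard used throughout the cell renderers: narrowing on the
// _rowType discriminant lets each branch access its own fields safely.
function isGroupRow(row: StlTableRow): row is GroupHeaderRow {
  return row._rowType === "group";
}
```

Because `_rowType` is a discriminant, TypeScript narrows `data` inside each `if (isGroupRow(data))` branch without casts.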

View File

@@ -1,11 +1,17 @@
"use client";
import { useState, useCallback, useTransition, useMemo, useRef } from "react";
import { useRouter, usePathname, useSearchParams } from "next/navigation";
import { toast } from "sonner";
import { Search, Layers } from "lucide-react";
import { useDataTable } from "@/hooks/use-data-table";
import {
getPackageColumns,
type PackageRow,
type StlTableRow,
type PackageTableRow,
type GroupHeaderRow,
} from "./package-columns";
import { PackageFilesDrawer } from "./package-files-drawer";
import { IngestionStatus } from "./ingestion-status";
import { SkippedPackagesTab } from "./skipped-packages-tab";
@@ -14,6 +20,7 @@ import { DataTablePagination } from "@/components/shared/data-table-pagination";
import { DataTableViewOptions } from "@/components/shared/data-table-view-options";
import { PageHeader } from "@/components/shared/page-header";
import { Input } from "@/components/ui/input";
import { Button } from "@/components/ui/button";
import {
Select,
SelectContent,
@@ -21,14 +28,31 @@ import {
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { Badge } from "@/components/ui/badge";
import type { DisplayItem, IngestionAccountStatus } from "@/lib/telegram/types";
import type { SkippedRow } from "./skipped-columns";
import {
updatePackageCreator,
updatePackageTags,
renameGroupAction,
dissolveGroupAction,
createGroupAction,
removeFromGroupAction,
sendAllInGroupAction,
updateGroupPreviewAction,
} from "../actions";
interface StlTableProps {
data: DisplayItem[];
pageCount: number;
totalCount: number;
ingestionStatus: IngestionAccountStatus[];
@@ -58,6 +82,88 @@ export function StlTable({
const [viewPkg, setViewPkg] = useState<PackageRow | null>(null);
const [, startTransition] = useTransition();
// Group expansion state
const [expandedGroups, setExpandedGroups] = useState<Set<string>>(new Set());
// Package selection state (for manual grouping)
const [selectedPackages, setSelectedPackages] = useState<Set<string>>(new Set());
// Create group dialog state
const [createGroupOpen, setCreateGroupOpen] = useState(false);
const [groupName, setGroupName] = useState("");
// Group preview upload ref
const previewInputRef = useRef<HTMLInputElement>(null);
const [uploadGroupId, setUploadGroupId] = useState<string | null>(null);
const toggleGroup = useCallback((groupId: string) => {
setExpandedGroups((prev) => {
const next = new Set(prev);
if (next.has(groupId)) {
next.delete(groupId);
} else {
next.add(groupId);
}
return next;
});
}, []);
const toggleSelect = useCallback((packageId: string) => {
setSelectedPackages((prev) => {
const next = new Set(prev);
if (next.has(packageId)) {
next.delete(packageId);
} else {
next.add(packageId);
}
return next;
});
}, []);
// Flatten DisplayItem[] into StlTableRow[] based on expansion state
const tableRows: StlTableRow[] = useMemo(() => {
const rows: StlTableRow[] = [];
for (const item of data) {
if (item.type === "package") {
rows.push({
...item.data,
_rowType: "package" as const,
_groupId: null,
_isGroupMember: false,
});
} else {
const group = item.data;
const isExpanded = expandedGroups.has(group.id);
rows.push({
_rowType: "group" as const,
id: group.id,
name: group.name,
hasPreview: group.hasPreview,
totalFileSize: group.totalFileSize,
totalFileCount: group.totalFileCount,
packageCount: group.packageCount,
combinedTags: group.combinedTags,
archiveTypes: group.archiveTypes,
latestIndexedAt: group.latestIndexedAt,
sourceChannel: group.sourceChannel,
_expanded: isExpanded,
});
if (isExpanded) {
for (const pkg of group.packages) {
rows.push({
...pkg,
_rowType: "package" as const,
_groupId: group.id,
_isGroupMember: true,
packageGroupId: group.id,
});
}
}
}
}
return rows;
}, [data, expandedGroups]);
const updateSearch = useCallback(
(value: string) => {
setSearchValue(value);
@@ -103,6 +209,131 @@ export function StlTable({
[router, pathname, searchParams]
);
const handleRenameGroup = useCallback(
(groupId: string, currentName: string) => {
const value = prompt("Enter group name:", currentName);
if (value === null) return;
const trimmed = value.trim();
if (!trimmed || trimmed === currentName) return;
startTransition(async () => {
const result = await renameGroupAction(groupId, value);
if (result.success) {
toast.success(`Group renamed to "${value.trim()}"`);
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleDissolveGroup = useCallback(
(groupId: string) => {
if (!confirm("Dissolve this group? Packages will become standalone items.")) return;
startTransition(async () => {
const result = await dissolveGroupAction(groupId);
if (result.success) {
toast.success("Group dissolved");
setExpandedGroups((prev) => {
const next = new Set(prev);
next.delete(groupId);
return next;
});
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleSendAllInGroup = useCallback(
(groupId: string) => {
if (!confirm("Send all packages in this group to your Telegram?")) return;
startTransition(async () => {
const result = await sendAllInGroupAction(groupId);
if (result.success) {
toast.success("Group packages queued for sending");
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleRemoveFromGroup = useCallback(
(packageId: string) => {
startTransition(async () => {
const result = await removeFromGroupAction(packageId);
if (result.success) {
toast.success("Package removed from group");
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleCreateGroup = useCallback(() => {
if (selectedPackages.size < 2) return;
setGroupName("");
setCreateGroupOpen(true);
}, [selectedPackages.size]);
const submitCreateGroup = useCallback(() => {
if (!groupName.trim() || selectedPackages.size < 2) return;
const ids = Array.from(selectedPackages);
startTransition(async () => {
const result = await createGroupAction(groupName, ids);
if (result.success) {
toast.success(`Group "${groupName.trim()}" created`);
setSelectedPackages(new Set());
setCreateGroupOpen(false);
router.refresh();
} else {
toast.error(result.error);
}
});
}, [groupName, selectedPackages, router]);
// Group preview upload handler (Task 12)
const handleGroupPreviewUpload = useCallback((groupId: string) => {
setUploadGroupId(groupId);
// Trigger file input after state update
setTimeout(() => {
previewInputRef.current?.click();
}, 0);
}, []);
const handlePreviewFileChange = useCallback(
(e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file || !uploadGroupId) return;
const formData = new FormData();
formData.append("file", file);
startTransition(async () => {
const result = await updateGroupPreviewAction(uploadGroupId, formData);
if (result.success) {
toast.success("Group preview updated");
router.refresh();
} else {
toast.error(result.error);
}
setUploadGroupId(null);
});
// Reset input so the same file can be selected again
e.target.value = "";
},
[uploadGroupId, router]
);
const columns = getPackageColumns({
onViewFiles: (pkg) => setViewPkg(pkg),
searchTerm,
@@ -136,9 +367,17 @@ export function StlTable({
}
});
},
onToggleGroup: toggleGroup,
onRenameGroup: handleRenameGroup,
onDissolveGroup: handleDissolveGroup,
onSendAllInGroup: handleSendAllInGroup,
onRemoveFromGroup: handleRemoveFromGroup,
onGroupPreviewUpload: handleGroupPreviewUpload,
selectedPackages,
onToggleSelect: toggleSelect,
});
const { table } = useDataTable({ data: tableRows, columns, pageCount });
const activeTag = searchParams.get("tag") ?? "";
@@ -191,11 +430,37 @@ export function StlTable({
</Select>
)}
<DataTableViewOptions table={table} />
{selectedPackages.size >= 2 && (
<Button
variant="outline"
size="sm"
className="h-9 gap-1.5"
onClick={handleCreateGroup}
>
<Layers className="h-3.5 w-3.5" />
Group {selectedPackages.size} Selected
</Button>
)}
{selectedPackages.size > 0 && selectedPackages.size < 2 && (
<span className="text-xs text-muted-foreground">
Select at least 2 packages to group
</span>
)}
</div>
<DataTable
table={table}
emptyMessage="No packages found. Archives will appear here after ingestion."
rowClassName={(row) => {
const data = row.original as StlTableRow;
if (data._rowType === "group") {
return "bg-muted/30 border-border";
}
if (data._rowType === "package" && (data as PackageTableRow)._isGroupMember) {
return "bg-muted/10";
}
return "";
}}
/>
<DataTablePagination table={table} totalCount={totalCount} />
</TabsContent>
@@ -217,6 +482,47 @@ export function StlTable({
}}
highlightTerm={searchTerm}
/>
{/* Create Group Dialog */}
<Dialog open={createGroupOpen} onOpenChange={setCreateGroupOpen}>
<DialogContent className="sm:max-w-md">
<DialogHeader>
<DialogTitle>Create Package Group</DialogTitle>
<DialogDescription>
Group {selectedPackages.size} selected packages together. Enter a name for the group.
</DialogDescription>
</DialogHeader>
<div className="py-4">
<Input
placeholder="Group name..."
value={groupName}
onChange={(e) => setGroupName(e.target.value)}
onKeyDown={(e) => {
if (e.key === "Enter") submitCreateGroup();
}}
autoFocus
/>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => setCreateGroupOpen(false)}>
Cancel
</Button>
<Button onClick={submitCreateGroup} disabled={!groupName.trim()}>
<Layers className="h-4 w-4 mr-1" />
Create Group
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
{/* Hidden file input for group preview upload (Task 12) */}
<input
ref={previewInputRef}
type="file"
accept="image/jpeg,image/png,image/webp"
className="hidden"
onChange={handlePreviewFileChange}
/>
</div>
);
}
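The `tableRows` memo above interleaves group headers with their member rows only when the group's id is in `expandedGroups`. The flattening rule, stripped of the row-shape details, can be sketched like this (types here are hypothetical simplifications, not the component's real ones):

```typescript
// Simplified sketch of the expand/collapse flattening in the tableRows memo:
// every item contributes one row; an expanded group also contributes one row
// per member, in order, directly after its header.
interface Item {
  type: "package" | "group";
  id: string;
  memberIds?: string[];
}

function flattenRows(items: Item[], expanded: Set<string>): string[] {
  const rows: string[] = [];
  for (const item of items) {
    rows.push(item.id);
    if (item.type === "group" && expanded.has(item.id)) {
      for (const m of item.memberIds ?? []) rows.push(m);
    }
  }
  return rows;
}
```

Recomputing the flat list from `(data, expandedGroups)` in a memo keeps expansion purely client-side: toggling a chevron never refetches, it just re-derives the rows.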

View File

@@ -4,6 +4,13 @@ import { auth } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
import type { ActionResult } from "@/types/api.types";
import { revalidatePath } from "next/cache";
import {
updatePackageGroupName,
updatePackageGroupPreview,
createManualGroup,
removePackageFromGroup,
dissolveGroup,
} from "@/lib/telegram/queries";
const ALLOWED_IMAGE_TYPES = [
"image/jpeg",
@@ -322,3 +329,186 @@ export async function retryAllSkippedPackagesAction(
return { success: false, error: "Failed to retry skipped packages" };
}
}
export async function renameGroupAction(
groupId: string,
name: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
if (!name.trim()) {
return { success: false, error: "Group name cannot be empty" };
}
try {
await updatePackageGroupName(groupId, name);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to rename group" };
}
}
export async function dissolveGroupAction(
groupId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
await dissolveGroup(groupId);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to dissolve group" };
}
}
export async function createGroupAction(
name: string,
packageIds: string[]
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
if (!name.trim()) {
return { success: false, error: "Group name cannot be empty" };
}
if (packageIds.length < 2) {
return { success: false, error: "At least 2 packages are required to create a group" };
}
try {
await createManualGroup(name, packageIds);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch (err) {
const message = err instanceof Error ? err.message : "Failed to create group";
return { success: false, error: message };
}
}
export async function removeFromGroupAction(
packageId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
await removePackageFromGroup(packageId);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to remove package from group" };
}
}
export async function updateGroupPreviewAction(
groupId: string,
formData: FormData
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const file = formData.get("file");
if (!(file instanceof File)) {
return { success: false, error: "No file provided" };
}
if (!ALLOWED_IMAGE_TYPES.includes(file.type as (typeof ALLOWED_IMAGE_TYPES)[number])) {
return { success: false, error: "Only JPG, PNG, and WebP images are accepted" };
}
if (file.size > MAX_IMAGE_SIZE) {
return { success: false, error: "Image must be smaller than 2 MB" };
}
try {
const arrayBuffer = await file.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
await updatePackageGroupPreview(groupId, buffer);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to upload group preview image" };
}
}
export async function sendAllInGroupAction(
groupId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const telegramLink = await prisma.telegramLink.findUnique({
where: { userId: session.user.id },
});
if (!telegramLink) {
return { success: false, error: "No linked Telegram account. Link one in Settings." };
}
const group = await prisma.packageGroup.findUnique({
where: { id: groupId },
select: {
packages: {
select: { id: true, destChannelId: true, destMessageId: true, fileName: true },
},
},
});
if (!group) {
return { success: false, error: "Group not found" };
}
const sendablePackages = group.packages.filter(
(p) => p.destChannelId && p.destMessageId
);
if (sendablePackages.length === 0) {
return { success: false, error: "No packages in this group have been uploaded to a destination channel" };
}
let queued = 0;
for (const pkg of sendablePackages) {
// Only create if no existing PENDING/SENDING request for this package+link combo
const existing = await prisma.botSendRequest.findFirst({
where: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
status: { in: ["PENDING", "SENDING"] },
},
});
if (!existing) {
const sendRequest = await prisma.botSendRequest.create({
data: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
requestedByUserId: session.user.id,
status: "PENDING",
},
});
// Notify the bot via pg_notify
try {
await prisma.$queryRawUnsafe(
`SELECT pg_notify('bot_send', $1)`,
sendRequest.id
);
} catch {
// Best-effort — the bot also polls periodically
}
queued++;
}
}
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to send group packages" };
}
}

View File

@@ -1,7 +1,8 @@
import { auth } from "@/lib/auth";
import { redirect } from "next/navigation";
import { listDisplayItems, searchPackages, getIngestionStatus, getAllPackageTags, listSkippedPackages, countSkippedPackages } from "@/lib/telegram/queries";
import { StlTable } from "./_components/stl-table";
import type { DisplayItem, PackageListItem } from "@/lib/telegram/types";
interface Props {
searchParams: Promise<Record<string, string | string[] | undefined>>;
@@ -31,7 +32,7 @@ export default async function StlFilesPage({ searchParams }: Props) {
limit: perPage,
searchIn: "both",
})
: listDisplayItems({
page,
limit: perPage,
creator,
@@ -44,6 +45,11 @@ export default async function StlFilesPage({ searchParams }: Props) {
countSkippedPackages(),
]);
// For search results, wrap as DisplayItem[]; for non-search, already DisplayItem[]
const displayItems: DisplayItem[] = search
? (result as { items: PackageListItem[] }).items.map((item) => ({ type: "package" as const, data: item }))
: (result as { items: DisplayItem[] }).items;
// Fetch skipped packages only if on that tab
const skippedResult = tab === "skipped"
? await listSkippedPackages({ page, limit: perPage })
@@ -51,7 +57,7 @@ export default async function StlFilesPage({ searchParams }: Props) {
return (
<StlTable
data={displayItems}
pageCount={result.pagination.totalPages}
totalCount={result.pagination.total}
ingestionStatus={ingestionStatus}

View File

@@ -0,0 +1,36 @@
import { NextResponse } from "next/server";
import { prisma } from "@/lib/prisma";
import { authenticateApiRequest } from "@/lib/telegram/api-auth";
export async function GET(
request: Request,
{ params }: { params: Promise<{ id: string }> }
) {
const authResult = await authenticateApiRequest(request);
if ("error" in authResult) return authResult.error;
const { id } = await params;
const group = await prisma.packageGroup.findUnique({
where: { id },
select: { previewData: true },
});
if (!group || !group.previewData) {
return new NextResponse(null, { status: 404 });
}
const buffer =
group.previewData instanceof Buffer
? group.previewData
: Buffer.from(group.previewData);
return new NextResponse(buffer, {
status: 200,
headers: {
"Content-Type": "image/jpeg",
"Content-Length": String(buffer.length),
"Cache-Control": "public, max-age=3600, immutable",
},
});
}

View File

@@ -1,6 +1,6 @@
"use client";
import { type Table as TanStackTable, type Row, flexRender } from "@tanstack/react-table";
import {
Table,
TableBody,
@@ -10,13 +10,15 @@ import {
TableRow,
} from "@/components/ui/table";
import { EmptyState } from "./empty-state";
import { cn } from "@/lib/utils";
interface DataTableProps<TData> {
table: TanStackTable<TData>;
emptyMessage?: string;
rowClassName?: (row: Row<TData>) => string;
}
export function DataTable<TData>({ table, emptyMessage, rowClassName }: DataTableProps<TData>) {
return (
<div className="rounded-md border border-border">
<Table>
@@ -36,7 +38,10 @@ export function DataTable<TData>({ table, emptyMessage }: DataTableProps<TData>)
<TableBody>
{table.getRowModel().rows?.length ? (
table.getRowModel().rows.map((row) => (
<TableRow
key={row.id}
className={cn("h-10 border-border hover:bg-muted/50", rowClassName?.(row))}
>
{row.getVisibleCells().map((cell) => (
<TableCell key={cell.id} className="py-1.5 text-sm">
{flexRender(cell.column.columnDef.cell, cell.getContext())}

View File

@@ -5,6 +5,8 @@ import type {
PackageFileItem,
IngestionAccountStatus,
SkippedPackageItem,
DisplayItem,
PackageGroupRow,
} from "./types";
export async function listPackages(options: {
@@ -73,6 +75,177 @@ export async function listPackages(options: {
};
}
export async function listDisplayItems(options: {
page: number;
limit: number;
channelId?: string;
creator?: string;
tag?: string;
sortBy: "indexedAt" | "fileName" | "fileSize";
order: "asc" | "desc";
}): Promise<{ items: DisplayItem[]; pagination: { page: number; limit: number; total: number; totalPages: number } }> {
const { page, limit, channelId, creator, tag, sortBy, order } = options;
// Build WHERE clause fragments for raw SQL
const conditions: string[] = [];
const params: unknown[] = [];
let paramIdx = 1;
if (channelId) {
conditions.push(`p."sourceChannelId" = $${paramIdx++}`);
params.push(channelId);
}
if (creator) {
conditions.push(`p."creator" = $${paramIdx++}`);
params.push(creator);
}
if (tag) {
conditions.push(`$${paramIdx++} = ANY(p."tags")`);
params.push(tag);
}
const whereClause = conditions.length > 0 ? `WHERE ${conditions.join(" AND ")}` : "";
const sortCol = sortBy === "fileName" ? `"fileName"` : sortBy === "fileSize" ? `"fileSize"` : `"indexedAt"`;
const sortDir = order === "asc" ? "ASC" : "DESC";
// Step 1: Count display items
const countResult = await prisma.$queryRawUnsafe<[{ count: bigint }]>(
`SELECT COUNT(*) AS count FROM (
SELECT DISTINCT COALESCE(p."packageGroupId", p."id") AS display_id
FROM packages p
${whereClause}
) AS display_items`,
...params
);
const total = Number(countResult[0].count);
// Step 2: Get display item IDs for this page
const limitParam = paramIdx++;
const offsetParam = paramIdx++;
const displayRows = await prisma.$queryRawUnsafe<
{ display_id: string; display_type: string }[]
>(
`SELECT
COALESCE(p."packageGroupId", p."id") AS display_id,
CASE WHEN p."packageGroupId" IS NOT NULL THEN 'group' ELSE 'package' END AS display_type,
MAX(p.${sortCol}) AS sort_value
FROM packages p
${whereClause}
GROUP BY COALESCE(p."packageGroupId", p."id"),
CASE WHEN p."packageGroupId" IS NOT NULL THEN 'group' ELSE 'package' END
ORDER BY sort_value ${sortDir}
LIMIT $${limitParam} OFFSET $${offsetParam}`,
...params, limit, (page - 1) * limit
);
// Step 3: Fetch full data
const groupIds = displayRows.filter((r) => r.display_type === "group").map((r) => r.display_id);
const packageIds = displayRows.filter((r) => r.display_type === "package").map((r) => r.display_id);
const standalonePackages = packageIds.length > 0
? await prisma.package.findMany({
where: { id: { in: packageIds } },
select: {
id: true, fileName: true, fileSize: true, contentHash: true,
archiveType: true, fileCount: true, isMultipart: true,
indexedAt: true, creator: true, tags: true, previewData: true,
sourceChannel: { select: { id: true, title: true } },
},
})
: [];
const groups = groupIds.length > 0
? await prisma.packageGroup.findMany({
where: { id: { in: groupIds } },
select: {
id: true, name: true, previewData: true,
sourceChannel: { select: { id: true, title: true } },
packages: {
select: {
id: true, fileName: true, fileSize: true, contentHash: true,
archiveType: true, fileCount: true, isMultipart: true,
indexedAt: true, creator: true, tags: true, previewData: true,
sourceChannel: { select: { id: true, title: true } },
},
orderBy: { indexedAt: "desc" },
},
},
})
: [];
// Build DisplayItem array in the original sort order
const packageMap = new Map(standalonePackages.map((p) => [p.id, p]));
const groupMap = new Map(groups.map((g) => [g.id, g]));
const items: DisplayItem[] = displayRows.map((row) => {
if (row.display_type === "package") {
const pkg = packageMap.get(row.display_id)!;
return {
type: "package" as const,
data: {
id: pkg.id,
fileName: pkg.fileName,
fileSize: pkg.fileSize.toString(),
contentHash: pkg.contentHash,
archiveType: pkg.archiveType,
fileCount: pkg.fileCount,
isMultipart: pkg.isMultipart,
hasPreview: pkg.previewData !== null,
creator: pkg.creator,
tags: pkg.tags,
indexedAt: pkg.indexedAt.toISOString(),
sourceChannel: pkg.sourceChannel,
matchedFileCount: 0,
matchedByContent: false,
},
};
} else {
const grp = groupMap.get(row.display_id)!;
const allTags = [...new Set(grp.packages.flatMap((p) => p.tags))];
const archiveTypes = [...new Set(grp.packages.map((p) => p.archiveType))] as PackageGroupRow["archiveTypes"];
return {
type: "group" as const,
data: {
id: grp.id,
name: grp.name,
hasPreview: grp.previewData !== null,
totalFileSize: grp.packages.reduce((sum, p) => sum + p.fileSize, BigInt(0)).toString(),
totalFileCount: grp.packages.reduce((sum, p) => sum + p.fileCount, 0),
packageCount: grp.packages.length,
combinedTags: allTags,
archiveTypes,
latestIndexedAt: grp.packages.length > 0
? grp.packages[0].indexedAt.toISOString()
: new Date().toISOString(),
sourceChannel: grp.sourceChannel,
packages: grp.packages.map((pkg) => ({
id: pkg.id,
fileName: pkg.fileName,
fileSize: pkg.fileSize.toString(),
contentHash: pkg.contentHash,
archiveType: pkg.archiveType,
fileCount: pkg.fileCount,
isMultipart: pkg.isMultipart,
hasPreview: pkg.previewData !== null,
creator: pkg.creator,
tags: pkg.tags,
indexedAt: pkg.indexedAt.toISOString(),
sourceChannel: pkg.sourceChannel,
matchedFileCount: 0,
matchedByContent: false,
})),
},
};
}
});
return {
items,
pagination: { page, limit, total, totalPages: Math.ceil(total / limit) },
};
}
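The two-step SQL above collapses every package onto a display id of `COALESCE(p."packageGroupId", p."id")`, so a group of N packages counts as one display item while a standalone package counts as itself. The same collapse in plain TypeScript, to make the counting rule concrete (names here are illustrative, not from the schema):

```typescript
// In-memory equivalent of the COALESCE grouping in listDisplayItems' count
// query: grouped packages share their group's id, standalone packages use
// their own, and distinct display ids are what pagination counts.
interface Pkg {
  id: string;
  packageGroupId: string | null;
}

function countDisplayItems(pkgs: Pkg[]): number {
  const displayIds = new Set(pkgs.map((p) => p.packageGroupId ?? p.id));
  return displayIds.size;
}
```

This is why `total` and the `LIMIT/OFFSET` page both operate on display items rather than raw packages: a page of 20 rows can cover far more than 20 packages once groups are expanded.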
export async function getPackageById(
id: string
): Promise<PackageDetail | null> {
@@ -203,7 +376,16 @@ export async function searchPackages(options: {
).map((p) => p.id)
: [];
// Also match by group name
const groupNameMatches = await prisma.package.findMany({
where: {
packageGroup: { name: { contains: q, mode: "insensitive" } },
},
select: { id: true },
});
const groupMatchedIds = groupNameMatches.map((p) => p.id);
const allIds = [...new Set([...fileMatchedIds, ...packageNameIds, ...groupMatchedIds])];
const [items, total] = await Promise.all([
prisma.package.findMany({
@@ -388,3 +570,103 @@ export async function listSkippedPackages(options: {
export async function countSkippedPackages(): Promise<number> {
return prisma.skippedPackage.count();
}
export async function getPackageGroup(groupId: string) {
return prisma.packageGroup.findUnique({
where: { id: groupId },
select: {
id: true, name: true, previewData: true, mediaAlbumId: true,
sourceChannelId: true, createdAt: true,
sourceChannel: { select: { id: true, title: true } },
packages: {
select: {
id: true, fileName: true, fileSize: true, archiveType: true,
fileCount: true, creator: true, tags: true,
},
orderBy: { indexedAt: "desc" },
},
},
});
}
export async function updatePackageGroupName(groupId: string, name: string) {
return prisma.packageGroup.update({
where: { id: groupId },
data: { name: name.trim() },
});
}
export async function updatePackageGroupPreview(groupId: string, previewData: Buffer) {
return prisma.packageGroup.update({
where: { id: groupId },
data: { previewData: new Uint8Array(previewData) },
});
}
export async function createManualGroup(name: string, packageIds: string[]) {
// Verify all packages belong to the same channel
const pkgs = await prisma.package.findMany({
where: { id: { in: packageIds } },
select: { sourceChannelId: true },
});
if (pkgs.length === 0) {
throw new Error("No matching packages found");
}
const channelIds = new Set(pkgs.map((p) => p.sourceChannelId));
if (channelIds.size > 1) {
throw new Error("Cannot group packages from different channels");
}
const firstPkg = pkgs[0];
const group = await prisma.packageGroup.create({
data: {
name: name.trim(),
sourceChannelId: firstPkg.sourceChannelId,
},
});
await prisma.package.updateMany({
where: { id: { in: packageIds } },
data: { packageGroupId: group.id },
});
// Clean up empty groups left behind
await prisma.packageGroup.deleteMany({
where: { packages: { none: {} }, id: { not: group.id } },
});
return group;
}
export async function addPackagesToGroup(packageIds: string[], groupId: string) {
await prisma.package.updateMany({
where: { id: { in: packageIds } },
data: { packageGroupId: groupId },
});
await prisma.packageGroup.deleteMany({
where: { packages: { none: {} } },
});
}
export async function removePackageFromGroup(packageId: string) {
const pkg = await prisma.package.findUniqueOrThrow({
where: { id: packageId },
select: { packageGroupId: true },
});
if (!pkg.packageGroupId) return;
await prisma.package.update({
where: { id: packageId },
data: { packageGroupId: null },
});
await prisma.packageGroup.deleteMany({
where: { id: pkg.packageGroupId, packages: { none: {} } },
});
}
export async function dissolveGroup(groupId: string) {
await prisma.package.updateMany({
where: { packageGroupId: groupId },
data: { packageGroupId: null },
});
await prisma.packageGroup.delete({ where: { id: groupId } });
}


@@ -68,6 +68,24 @@ export interface PaginatedResponse<T> {
};
}
export interface PackageGroupRow {
id: string;
name: string;
hasPreview: boolean;
totalFileSize: string;
totalFileCount: number;
packageCount: number;
combinedTags: string[];
archiveTypes: ("ZIP" | "RAR" | "SEVEN_Z" | "DOCUMENT")[];
latestIndexedAt: string;
sourceChannel: { id: string; title: string };
packages: PackageListItem[];
}
export type DisplayItem =
| { type: "package"; data: PackageListItem }
| { type: "group"; data: PackageGroupRow };
export interface IngestionAccountStatus {
id: string;
displayName: string | null;


@@ -10,6 +10,7 @@ export interface TelegramMessage {
fileId: string;
fileSize: bigint;
date: Date;
mediaAlbumId?: string;
}
export interface ArchiveSet {


@@ -6,8 +6,12 @@ import { childLogger } from "../util/logger.js";
const log = childLogger("split");
/**
* 1950 MiB — safely under Telegram's 2GB upload limit.
* At exactly 2GiB, TDLib's internal 512KB chunking can exceed Telegram's
* 4000-part threshold, causing FILE_PARTS_INVALID errors.
*/
const MAX_PART_SIZE = 1950n * 1024n * 1024n;
/**
* Split a file into parts of at most MAX_PART_SIZE bytes using byte-level splitting.
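The part-count arithmetic behind the new limit can be sanity-checked in isolation (`CHUNK` is an illustrative constant, not from the codebase; values taken from the comment above):

```typescript
// TDLib uploads in 512 KiB chunks; Telegram rejects files that would need
// more than ~4000 parts (FILE_PARTS_INVALID).
const CHUNK = 512 * 1024;
const partsAt2GiB = (2 * 1024 ** 3) / CHUNK;       // 4096 — over the 4000-part threshold
const partsAt1950MiB = (1950 * 1024 ** 2) / CHUNK; // 3900 — safely under it
console.log(partsAt2GiB, partsAt1950MiB);
```

So 1950 MiB keeps a 100-part margin below the threshold while staying close to the 2 GB file-size limit.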


@@ -535,3 +535,53 @@ export async function deleteSkippedPackage(
where: { sourceChannelId, sourceMessageId },
});
}
export async function createOrFindPackageGroup(input: {
mediaAlbumId: string;
sourceChannelId: string;
name: string;
previewData?: Buffer | null;
}): Promise<string> {
// findFirst + conditional create (Prisma doesn't support upsert on nullable compound unique)
const existing = await db.packageGroup.findFirst({
where: {
mediaAlbumId: input.mediaAlbumId,
sourceChannelId: input.sourceChannelId,
},
select: { id: true },
});
if (existing) return existing.id;
try {
const group = await db.packageGroup.create({
data: {
mediaAlbumId: input.mediaAlbumId,
sourceChannelId: input.sourceChannelId,
name: input.name,
previewData: input.previewData ? new Uint8Array(input.previewData) : undefined,
},
});
return group.id;
} catch (err) {
// Race: another writer created the group between our findFirst and create.
// Prisma reports unique-constraint violations as error code P2002.
if (
err instanceof Error &&
((err as { code?: string }).code === "P2002" ||
err.message.includes("Unique constraint"))
) {
const raced = await db.packageGroup.findFirst({
where: { mediaAlbumId: input.mediaAlbumId, sourceChannelId: input.sourceChannelId },
select: { id: true },
});
if (raced) return raced.id;
}
throw err;
}
}
export async function linkPackagesToGroup(
packageIds: string[],
groupId: string
): Promise<void> {
await db.package.updateMany({
where: { id: { in: packageIds } },
data: { packageGroupId: groupId },
});
}
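`createOrFindPackageGroup` above follows a general find → create → re-find-on-conflict pattern. A minimal sketch of that control flow, with an in-memory `Map` standing in for the database's unique index (all names illustrative):

```typescript
// Sketch of the find → create → re-find pattern, database replaced by a Map.
type Store = Map<string, string>; // unique key → record id

async function findOrCreate(
  store: Store,
  key: string,
  makeId: () => string
): Promise<string> {
  const existing = store.get(key);
  if (existing) return existing; // fast path: already created

  try {
    // A concurrent writer may insert between the lookup and this create;
    // the unique index rejects the duplicate, simulated here by has().
    if (store.has(key)) throw new Error("Unique constraint failed");
    const id = makeId();
    store.set(key, id);
    return id;
  } catch (err) {
    if (err instanceof Error && err.message.includes("Unique constraint")) {
      const raced = store.get(key);
      if (raced) return raced; // lost the race — reuse the winner's record
    }
    throw err;
  }
}
```

Two calls with the same key always resolve to the id minted by the first, which is what makes the album-group creation safe to re-run across scans.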

worker/src/grouping.ts (new file, 79 lines)

@@ -0,0 +1,79 @@
import type { Client } from "tdl";
import type { TelegramPhoto } from "./preview/match.js";
import { downloadPhotoThumbnail } from "./tdlib/download.js";
import { createOrFindPackageGroup, linkPackagesToGroup } from "./db/queries.js";
import { childLogger } from "./util/logger.js";
import { db } from "./db/client.js";
const log = childLogger("grouping");
export interface IndexedPackageRef {
packageId: string;
sourceMessageId: bigint;
mediaAlbumId?: string;
}
/**
* After a scan cycle's packages are individually indexed, detect album groups
* and create PackageGroup records linking the members.
*/
export async function processAlbumGroups(
client: Client,
sourceChannelId: string,
indexedPackages: IndexedPackageRef[],
photos: TelegramPhoto[]
): Promise<void> {
// Group indexed packages by mediaAlbumId
const albumMap = new Map<string, IndexedPackageRef[]>();
for (const pkg of indexedPackages) {
if (!pkg.mediaAlbumId || pkg.mediaAlbumId === "0") continue;
const group = albumMap.get(pkg.mediaAlbumId) ?? [];
group.push(pkg);
albumMap.set(pkg.mediaAlbumId, group);
}
if (albumMap.size === 0) return;
log.info({ albumCount: albumMap.size }, "Detected album groups to process");
for (const [albumId, members] of albumMap) {
if (members.length < 2) continue;
try {
// Find the first package's fileName for the group name fallback
const firstPkg = await db.package.findFirst({
where: { id: { in: members.map((m) => m.packageId) } },
orderBy: { sourceMessageId: "asc" },
select: { id: true, fileName: true },
});
// Try to find a caption from the album's photo message
const albumPhoto = photos.find((p) => p.mediaAlbumId === albumId);
const groupName = albumPhoto?.caption || firstPkg?.fileName || "Unnamed Group";
// Download preview from album photo if available
let previewData: Buffer | null = null;
if (albumPhoto) {
previewData = await downloadPhotoThumbnail(client, albumPhoto.fileId);
}
const groupId = await createOrFindPackageGroup({
mediaAlbumId: albumId,
sourceChannelId,
name: groupName,
previewData,
});
// Idempotent link — safe to re-run if some packages were indexed in prior scans
const packageIds = members.map((m) => m.packageId);
await linkPackagesToGroup(packageIds, groupId);
log.info(
{ albumId, groupId, groupName, memberCount: packageIds.length },
"Linked packages to album group"
);
} catch (err) {
log.warn({ albumId, err }, "Failed to create album group — packages still indexed individually");
}
}
}
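The bucketing step at the top of `processAlbumGroups` can be exercised on its own. A self-contained sketch (the `Ref` type and sample ids are made up for illustration):

```typescript
// Packages without an album id — or with TDLib's "0" sentinel — are left out;
// the rest are keyed by their mediaAlbumId.
interface Ref { packageId: string; mediaAlbumId?: string }

function bucketByAlbum(refs: Ref[]): Map<string, Ref[]> {
  const albumMap = new Map<string, Ref[]>();
  for (const pkg of refs) {
    if (!pkg.mediaAlbumId || pkg.mediaAlbumId === "0") continue;
    const group = albumMap.get(pkg.mediaAlbumId) ?? [];
    group.push(pkg);
    albumMap.set(pkg.mediaAlbumId, group);
  }
  return albumMap;
}

const buckets = bucketByAlbum([
  { packageId: "a", mediaAlbumId: "42" },
  { packageId: "b", mediaAlbumId: "42" },
  { packageId: "c", mediaAlbumId: "0" }, // sentinel — ignored
  { packageId: "d" },                    // no album — ignored
]);
// buckets: one entry, "42" → [a, b]
```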


@@ -10,6 +10,7 @@ export interface TelegramPhoto {
/** The smallest photo size available — used as thumbnail. */
fileId: string;
fileSize: number;
mediaAlbumId?: string;
}
export interface ArchiveRef {


@@ -35,6 +35,7 @@ interface TdPhotoSize {
interface TdMessage {
id: number;
date: number;
media_album_id?: string;
content: {
_: string;
document?: {
@@ -211,6 +212,7 @@ export async function getChannelMessages(
fileId: String(doc.document.id),
fileSize: BigInt(doc.document.size),
date: new Date(msg.date * 1000),
mediaAlbumId: msg.media_album_id && msg.media_album_id !== "0" ? msg.media_album_id : undefined,
});
continue;
}
@@ -227,6 +229,7 @@ export async function getChannelMessages(
caption,
fileId: String(smallest.photo.id),
fileSize: smallest.photo.size || smallest.photo.expected_size,
mediaAlbumId: msg.media_album_id && msg.media_album_id !== "0" ? msg.media_album_id : undefined,
});
}
}
@@ -353,11 +356,14 @@ export async function downloadFile(
return new Promise<void>((resolve, reject) => {
let lastLoggedPercent = 0;
let settled = false;
let downloadStarted = false; // True once TDLib reports is_downloading_active
let lastProgressBytes = 0;
let lastProgressTime = Date.now();
// Timeout: 20 minutes per GB, minimum 15 minutes
const timeoutMs = Math.max(
15 * 60_000,
(totalBytes / (1024 * 1024 * 1024)) * 20 * 60_000
);
const timer = setTimeout(() => {
if (!settled) {
@@ -371,6 +377,23 @@ export async function downloadFile(
}
}, timeoutMs);
// Stall detection: no progress for 5 minutes after download started → reject
const STALL_TIMEOUT_MS = 5 * 60_000;
const stallChecker = setInterval(() => {
if (settled || !downloadStarted) return;
const stallMs = Date.now() - lastProgressTime;
if (stallMs >= STALL_TIMEOUT_MS) {
settled = true;
cleanup();
reject(
new Error(
`Download stalled for ${fileName} — no progress for ${Math.round(stallMs / 60_000)}min ` +
`(${lastProgressBytes}/${totalBytes} bytes)`
)
);
}
}, 30_000);
// Listen for file update events to track progress
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const handleUpdate = (update: any) => {
@@ -382,6 +405,17 @@ export async function downloadFile(
const percent =
totalBytes > 0 ? Math.round((downloaded / totalBytes) * 100) : 0;
// Track whether the download has actually started
if (file.local.is_downloading_active) {
downloadStarted = true;
}
// Reset stall timer when bytes advance
if (downloaded > lastProgressBytes) {
lastProgressBytes = downloaded;
lastProgressTime = Date.now();
}
// Log at every 10% increment
if (percent >= lastLoggedPercent + 10) {
lastLoggedPercent = percent - (percent % 10);
@@ -412,8 +446,11 @@ export async function downloadFile(
}
}
// Download stopped without completing — only if it had actually started.
// TDLib may emit an initial updateFile with is_downloading_active=false
// before the download begins; ignoring that prevents false positives.
if (
downloadStarted &&
!file.local.is_downloading_active &&
!file.local.is_downloading_completed
) {
@@ -432,6 +469,7 @@ export async function downloadFile(
const cleanup = () => {
clearTimeout(timer);
clearInterval(stallChecker);
client.off("update", handleUpdate);
};
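The revised timeout shared by the download and upload paths can be written as one helper to see the numbers it produces (`timeoutMs` is an illustrative name, not a function in the codebase):

```typescript
// 20 minutes per GB of payload, with a 15-minute floor for small files.
function timeoutMs(totalBytes: number): number {
  return Math.max(15 * 60_000, (totalBytes / (1024 ** 3)) * 20 * 60_000);
}

const GiB = 1024 ** 3;
timeoutMs(100 * 1024 * 1024); // computed ≈117s → floored to 900000 ms (15 min)
timeoutMs(3 * GiB);           // 3600000 ms (60 min)
```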


@@ -201,6 +201,7 @@ export async function getTopicMessages(
messages?: {
id: number;
date: number;
media_album_id?: string;
content: {
_: string;
document?: {
@@ -248,6 +249,7 @@ export async function getTopicMessages(
fileId: String(doc.document.id),
fileSize: BigInt(doc.document.size),
date: new Date(msg.date * 1000),
mediaAlbumId: msg.media_album_id && msg.media_album_id !== "0" ? msg.media_album_id : undefined,
});
continue;
}
@@ -263,6 +265,7 @@ export async function getTopicMessages(
caption,
fileId: String(smallest.photo.id),
fileSize: smallest.photo.size || smallest.photo.expected_size,
mediaAlbumId: msg.media_album_id && msg.media_album_id !== "0" ? msg.media_album_id : undefined,
});
}
}


@@ -93,11 +93,13 @@ async function sendAndWaitForUpload(
let settled = false;
let lastLoggedPercent = 0;
let tempMsgId: number | null = null;
let uploadStarted = false;
let lastProgressTime = Date.now();
// Timeout: 20 minutes per GB, minimum 15 minutes
const timeoutMs = Math.max(
15 * 60_000,
(fileSizeMB / 1024) * 20 * 60_000
);
const timer = setTimeout(() => {
@@ -112,12 +114,31 @@ async function sendAndWaitForUpload(
}
}, timeoutMs);
// Stall detection: no progress for 5 minutes after upload started → reject
const STALL_TIMEOUT_MS = 5 * 60_000;
const stallChecker = setInterval(() => {
if (settled || !uploadStarted) return;
const stallMs = Date.now() - lastProgressTime;
if (stallMs >= STALL_TIMEOUT_MS) {
settled = true;
cleanup();
reject(
new Error(
`Upload stalled for ${fileName} — no progress for ${Math.round(stallMs / 60_000)}min`
)
);
}
}, 30_000);
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const handleUpdate = (update: any) => {
// Track upload progress via updateFile events
if (update?._ === "updateFile") {
const file = update.file;
if (file?.remote?.is_uploading_active && file.expected_size > 0) {
uploadStarted = true;
lastProgressTime = Date.now();
const uploaded = file.remote.uploaded_size ?? 0;
const total = file.expected_size;
const percent = Math.round((uploaded / total) * 100);
@@ -165,6 +186,7 @@ async function sendAndWaitForUpload(
const cleanup = () => {
clearTimeout(timer);
clearInterval(stallChecker);
client.off("update", handleUpdate);
};


@@ -47,6 +47,7 @@ import { readRarContents } from "./archive/rar-reader.js";
import { read7zContents } from "./archive/sevenz-reader.js";
import { byteLevelSplit, concatenateFiles } from "./archive/split.js";
import { uploadToChannel } from "./upload/channel.js";
import { processAlbumGroups, type IndexedPackageRef } from "./grouping.js";
import type { TelegramAccount, TelegramChannel } from "@prisma/client";
import type { Client } from "tdl";
@@ -722,10 +723,11 @@ async function processArchiveSets(
// Track the highest message ID that was successfully processed
let maxProcessedId: bigint | null = null;
const indexedPackageRefs: IndexedPackageRef[] = [];
for (let setIdx = 0; setIdx < archiveSets.length; setIdx++) {
try {
const packageId = await processOneArchiveSet(
ctx,
archiveSets[setIdx],
setIdx,
@@ -734,6 +736,15 @@ async function processArchiveSets(
ingestionRunId
);
if (packageId) {
const firstPart = archiveSets[setIdx].parts[0];
indexedPackageRefs.push({
packageId,
sourceMessageId: firstPart.id,
mediaAlbumId: firstPart.mediaAlbumId,
});
}
// Set completed (ingested or confirmed duplicate) — advance watermark
const setMaxId = archiveSets[setIdx].parts.reduce(
(max, p) => (p.id > max ? p.id : max),
@@ -771,6 +782,16 @@ async function processArchiveSets(
}
}
// Post-processing: group packages by Telegram album ID
if (indexedPackageRefs.length > 0) {
await processAlbumGroups(
ctx.client,
channel.id,
indexedPackageRefs,
scanResult.photos
);
}
return maxProcessedId;
}
@@ -784,7 +805,7 @@ async function processOneArchiveSet(
totalSets: number,
previewMatches: Map<string, { id: bigint; fileId: string }>,
ingestionRunId: string
): Promise<string | null> {
const {
client, runId, channelTitle, channel,
destChannelTelegramId, destChannelId,
@@ -814,7 +835,7 @@ async function processOneArchiveSet(
totalFiles: totalSets,
zipsDuplicate: counters.zipsDuplicate,
});
return null;
}
// ── Size guard: skip archives that exceed WORKER_MAX_ZIP_SIZE_MB ──
@@ -848,7 +869,7 @@ async function processOneArchiveSet(
partCount: archiveSet.parts.length,
accountId: ctx.accountId,
});
return null;
}
const tempPaths: string[] = [];
@@ -954,7 +975,7 @@ async function processOneArchiveSet(
totalFiles: totalSets,
zipsDuplicate: counters.zipsDuplicate,
});
return null;
}
// ── Reading metadata ──
@@ -999,7 +1020,7 @@ async function processOneArchiveSet(
(sum, p) => sum + p.fileSize,
0n
);
const MAX_UPLOAD_SIZE = 1950n * 1024n * 1024n; // Match split.ts MAX_PART_SIZE
const hasOversizedPart = archiveSet.parts.some((p) => p.fileSize > MAX_UPLOAD_SIZE);
if (hasOversizedPart) {
@@ -1127,7 +1148,7 @@ async function processOneArchiveSet(
tags.push(channel.category);
}
const pkg = await createPackageWithFiles({
contentHash,
fileName: archiveName,
fileSize: totalSize,
@@ -1166,6 +1187,8 @@ async function processOneArchiveSet(
{ fileName: archiveName, contentHash, fileCount: entries.length, creator },
"Archive ingested"
);
return pkg.id;
} finally {
// ALWAYS delete temp files and the set directory
await deleteFiles([...tempPaths, ...splitPaths]);