107 Commits

Author SHA1 Message Date
77c26adb31 perf: set watermarks even when no archives found to prevent re-scanning
All checks were successful
continuous-integration/drone/push Build is passing
Previously, channels/topics with no new archives never had their
watermark updated. This meant every cycle re-scanned all messages from
scratch just to discover nothing new — especially costly for the
1079-topic Model Printing Emporium forum.

- Add maxScannedMessageId to ChannelScanResult (highest msg ID seen)
- Set channel watermark to scan boundary when no archives are found
- Set topic watermark to scan boundary when no archives are found
- Fall back to scan watermark when archive processing doesn't advance it

After one full cycle, subsequent cycles will skip already-scanned
messages via the early-exit boundary check, dramatically reducing
TDLib API calls on channels with mostly non-archive content.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-05-04 20:37:42 +02:00
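The watermark-advance rule this commit describes can be sketched as follows; the type and function names are illustrative, not the repo's actual API:

```typescript
// Sketch of the watermark rule: even when a scan finds no archives,
// advance the watermark to the scan boundary so the next cycle's
// early-exit check skips everything already seen.
interface ChannelScanResult {
  archives: { messageId: bigint }[];
  maxScannedMessageId: bigint; // highest message ID seen, archive or not
}

// Returns the new watermark for a channel/topic after one scan cycle.
function nextWatermark(current: bigint, scan: ChannelScanResult): bigint {
  const fromArchives = scan.archives.reduce(
    (max, a) => (a.messageId > max ? a.messageId : max),
    current,
  );
  // Fall back to the scan boundary if archive processing didn't advance it.
  return scan.maxScannedMessageId > fromArchives
    ? scan.maxScannedMessageId
    : fromArchives;
}
```

The key property is that the watermark never regresses and always reaches the scan boundary, so a channel with zero archives still converges after one full cycle.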
35cce3151c perf: early-exit channel scan when all messages are below watermark
searchChatMessages returns newest-first. Once the oldest message on a
page is at or below the lastProcessedMessageId boundary, all remaining
pages are even older. Stop scanning immediately instead of reading every
message in the channel.

This was already implemented for topic scans but missing from channel
scans. On a test run, total messages scanned dropped from 3805 to 1615
(57% reduction) for an account with no new archives.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-05-04 19:58:30 +02:00
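The early-exit rule can be sketched as below; the page fetcher is a stand-in for the real TDLib call, and the shapes are assumptions:

```typescript
// searchChatMessages pages are newest-first, so once a page's oldest ID
// is at or below the watermark, every later page is older and scanning
// can stop immediately.
type Page = { messageIds: bigint[] }; // newest-first within each page

function* scanUntilWatermark(
  pages: Page[],
  lastProcessedMessageId: bigint,
): Generator<bigint> {
  for (const page of pages) {
    for (const id of page.messageIds) {
      if (id > lastProcessedMessageId) yield id; // only new messages
    }
    const oldest = page.messageIds[page.messageIds.length - 1];
    // Early exit: everything beyond this page is even older.
    if (oldest !== undefined && oldest <= lastProcessedMessageId) return;
  }
}
```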
d6c82ede1e fix: auto-recover from TDLib upload stalls by recreating client
When TDLib's event stream degrades, uploads complete (bytes sent) but
confirmations never arrive. Previously the worker retried 3x with the
same broken client, wasting 60+ min per archive and holding the mutex.

- Add UploadStallError class to distinguish stalls from other failures
- Reduce stall detection timeout from 5min to 3min (faster detection)
- Recreate TDLib client after consecutive upload stalls instead of
  retrying on the same degraded connection
- Add forceReleaseMutex() to prevent cascade failures when one account
  blocks others via stuck mutex after cycle timeout

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-05-04 18:02:42 +02:00
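The recovery policy can be sketched as a consecutive-stall counter; `UploadStallError` is named in the commit, while the threshold and the recreate hook are placeholders:

```typescript
// After a stalled upload, retrying on the same TDLib client is pointless,
// so consecutive stalls trigger a client rebuild instead.
class UploadStallError extends Error {}

function makeStallTracker(recreateClient: () => void, threshold = 2) {
  let consecutiveStalls = 0;
  return {
    onResult(err: Error | null): void {
      if (err instanceof UploadStallError) {
        consecutiveStalls += 1;
        if (consecutiveStalls >= threshold) {
          recreateClient(); // drop the degraded connection
          consecutiveStalls = 0;
        }
      } else {
        consecutiveStalls = 0; // a success (or non-stall failure) resets
      }
    },
  };
}
```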
7e48131f67 fix: clear timeout on race settlement to prevent orphaned timers
All checks were successful
continuous-integration/drone/push Build is passing
2026-05-02 23:44:18 +02:00
a79cb4749b fix: use per-account mutex keys in fetch/extract listeners, add cycle timeout and error logging 2026-05-02 23:40:37 +02:00
e9017fc518 feat: parallel account ingestion via per-key TDLib mutex 2026-05-02 23:31:02 +02:00
4f59d19ac2 feat: apply per-account Premium 4GB upload limit to bypass repacking 2026-05-02 23:28:00 +02:00
579276ee2d fix: widen hash lock try/finally to prevent lock leak on error paths 2026-05-02 23:24:08 +02:00
b48cc510a4 feat: add two-phase DB write and hash advisory lock to prevent double-uploads 2026-05-02 23:13:55 +02:00
614c8e5b74 feat: add createPackageStub and updatePackageWithMetadata for two-phase DB write 2026-05-02 23:06:17 +02:00
3019c23f70 feat: add per-content-hash advisory lock to prevent concurrent duplicate uploads
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-02 23:04:43 +02:00
436a576085 feat: detect and persist Telegram Premium status after authentication
After TDLib login completes, calls getMe() to detect isPremium, persists
it to DB via updateAccountPremiumStatus, and returns { client, isPremium }
from createTdlibClient. All callers updated to destructure accordingly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-02 23:02:46 +02:00
f454303352 feat: add isPremium field to TelegramAccount
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-02 22:58:53 +02:00
e29bd79d66 chore: ignore .worktrees directory 2026-05-02 22:54:56 +02:00
61e61d0085 docs: add worker improvements implementation plan
7-task plan covering double-upload fix (hash lock + two-phase write),
parallel account ingestion (per-key mutex), and Premium 4GB upload
limit with automatic detection.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-02 22:47:52 +02:00
925d916a3c Merge branch 'main' of https://github.com/xCyanGrizzly/DragonsStash 2026-05-02 22:38:32 +02:00
27bacaf24c docs: add worker improvements design spec
Covers double-upload fix (two-phase DB write + hash advisory lock),
parallel account processing (remove TDLib mutex), and per-account
Premium 4GB upload limit with automatic is_premium detection.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-02 22:35:27 +02:00
be4daf950b fix: correct User table reference in manual_uploads migration
All checks were successful
continuous-integration/drone/push Build is passing
The FK referenced "users" but the actual table is "User" (no @@map in Prisma schema).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 21:29:55 +02:00
af7094637d feat: file upload from UI, notification dismiss, audit false positive fix
Manual file upload:
- Upload dialog in STL page with drag-and-drop file picker
- Files saved to shared Docker volume (/data/uploads)
- Worker processes via pg_notify('manual_upload') channel
- Hashes, reads metadata, splits >2GB, uploads to Telegram
- Multiple files automatically grouped
- Status polling shows upload/processing/complete states

Notification fixes:
- Add dismiss (X) button on each notification
- Add "Clear" button to remove all notifications
- Fix false positive MISSING_PART alerts from legacy packages
  (only flag when >1 destMessageIds stored but count wrong,
  not when only 1 ID from backfill)

Infrastructure:
- ManualUpload + ManualUploadFile schema + migration
- Shared manual_uploads Docker volume between app and worker
- Upload API routes (POST /api/uploads, GET /api/uploads/[id])
- Worker manual-upload processor with full pipeline

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 20:26:06 +02:00
f4aa9d9a2f feat: complete remaining features — training, FTS, bot groups, repair, re-tag
All checks were successful
continuous-integration/drone/push Build is passing
Manual override training (GroupingRule):
- Learn patterns from manual group creation (common filename prefix or creator)
- Apply learned rules as first auto-grouping pass (highest confidence after albums)
- GroupingRule model stores pattern, channel, signal type, confidence

Hash verification after upload:
- Re-hash upload files on disk before indexing to catch disk corruption
- Creates HASH_MISMATCH notification on discrepancy

Grouping conflict detection:
- After all grouping passes, check if grouped packages match rules from different groups
- Creates GROUPING_CONFLICT notification for manual review

Per-channel grouping flags:
- Add autoGroupEnabled boolean to TelegramChannel (default true)
- Auto-grouping passes (all except album) gated behind this flag
- Album grouping always runs as it reflects Telegram's native behavior

Full-text search (tsvector):
- Add searchVector tsvector column with GIN index and auto-update trigger
- Backfill 1870 existing packages
- FTS with ts_rank for ranked results, ILIKE fallback for short/failed queries
- Applied to both web app and bot search

Bot group awareness:
- /group <query> — view group info or search groups by name
- /sendgroup <id> — send all packages in a group to linked Telegram account

Bulk repair:
- repairPackageAction clears dest info and resets watermark for re-processing
- Repair button in notification bell for MISSING_PART and HASH_MISMATCH alerts
- /api/notifications/repair endpoint

Retroactive category re-tagging:
- When channel category changes, auto-update tags on all existing packages
- Removes old category tag, adds new one

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 14:34:14 +02:00
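The FTS-with-fallback decision can be sketched as below; the table and column names mirror the commit text but are assumptions, and real code would parameterize the query rather than interpolate:

```typescript
// tsquery-based FTS for normal queries, ILIKE fallback for queries too
// short to tokenize usefully.
function buildSearchQuery(term: string): { sql: string; useFts: boolean } {
  const words = term.trim().split(/\s+/).filter((w) => w.length >= 3);
  if (words.length === 0) {
    return {
      useFts: false,
      sql: `SELECT id FROM "Package" WHERE name ILIKE '%' || $1 || '%'`,
    };
  }
  // AND the words together as prefix matches, ranked by ts_rank.
  const tsquery = words.map((w) => `${w}:*`).join(" & ");
  return {
    useFts: true,
    sql: `SELECT id FROM "Package"
          WHERE "searchVector" @@ to_tsquery('simple', '${tsquery}')
          ORDER BY ts_rank("searchVector", to_tsquery('simple', '${tsquery}')) DESC`,
  };
}
```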
7f9a03d4ee feat: group merge, ZIP/reply/caption grouping, integrity audit
Group merge UI:
- Add mergeGroups query and mergeGroupsAction server action
- Add "Start Merge" / "Merge Here" buttons to group row actions
- Two-step UX: click Start on source, click Merge Here on target

ZIP path prefix grouping (Signal 7):
- Compare PackageFile.path root folders across ungrouped packages
- Auto-group if 2+ packages share the same dominant root folder

Reply chain grouping (Signal 6):
- Capture reply_to_message_id during channel scanning
- Group archives that reply to the same root message
- Add replyToMessageId field to Package schema

Caption fuzzy match grouping (Signal 8):
- Capture source caption during channel scanning
- Normalize captions (strip extensions, extract significant words)
- Group packages with matching normalized caption keys
- Add sourceCaption field to Package schema

Periodic integrity audit:
- Check multipart packages for completeness (parts vs destMessageIds)
- Detect orphaned indexes (destChannelId set but no destMessageId)
- Runs after each ingestion cycle, deduplicates notifications

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 14:19:36 +02:00
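Caption normalization (Signal 8) can be sketched as below; the commit only states the general approach (strip extensions, keep significant words), so the exact rules here are assumptions:

```typescript
// Reduce a caption to a stable key so fuzzy-matching captions group
// together: lowercase, drop extensions and part markers, strip
// punctuation/emoji, keep significant words in sorted order.
function captionKey(caption: string): string {
  return caption
    .toLowerCase()
    .replace(/\.(zip|rar|7z|stl|pdf)\b/g, "") // drop file extensions
    .replace(/part\s*\d+|\bp\d+\b/g, "") // drop part markers
    .replace(/[^a-z0-9\s]/g, " ") // strip punctuation/emoji
    .split(/\s+/)
    .filter((w) => w.length >= 3) // keep significant words only
    .sort()
    .join(" ");
}
```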
2c46ab0843 feat: pattern/creator grouping, notification UI, failure alerts
Pattern grouping (Signal 3):
- Extract YYYY-MM dates, month names, and project prefixes from filenames
- Auto-group packages sharing the same pattern within a channel
- Groups created with groupingSource=AUTO_PATTERN

Creator grouping (Signal 4):
- Auto-group 3+ ungrouped packages from the same creator within a channel
- Runs after pattern grouping as lowest-priority automatic signal

Notification UI:
- Add NotificationBell component to header with unread badge
- Popover panel shows recent notifications with severity icons
- Mark individual or all notifications as read
- Polls every 30 seconds for updates

Failure notifications:
- Upload/download failures now create SystemNotification records
- Visible in the notification bell alongside hash mismatch alerts

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 13:43:55 +02:00
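Pattern extraction (Signal 3) can be sketched as below; the matching rules are illustrative, not the repo's actual heuristics:

```typescript
// Pull a YYYY-MM date, a month name, or a project prefix out of a
// filename so packages sharing the pattern can be auto-grouped.
const MONTHS =
  /january|february|march|april|may|june|july|august|september|october|november|december/;

function filenamePattern(name: string): string | null {
  const lower = name.toLowerCase();
  const date = lower.match(/(20\d{2})[-_.](0[1-9]|1[0-2])(?!\d)/);
  if (date) return `${date[1]}-${date[2]}`; // normalize to YYYY-MM
  const month = lower.match(MONTHS);
  if (month) return month[0];
  // Fall back to a project prefix: text before the first separator+digit.
  const prefix = lower.match(/^([a-z][a-z ]{2,})[-_]\d/);
  return prefix ? prefix[1].trim() : null;
}
```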
9e78cc5d19 feat: grouping phase 1 — schema, ungrouped tab, time-window grouping, hash verification
Schema:
- Add GroupingSource enum (ALBUM, MANUAL, AUTO_TIME, AUTO_PATTERN, etc.)
- Add groupingSource field to PackageGroup with backfill
- Add SystemNotification model for persistent alerts
- Add NotificationType and NotificationSeverity enums

Ungrouped staging tab:
- Add listUngroupedPackages/countUngroupedPackages queries
- Add "Ungrouped" tab to STL page showing packages without a group

Time-window auto-grouping:
- After album grouping, cluster ungrouped packages within configurable
  time window (default 5 min, AUTO_GROUP_TIME_WINDOW_MINUTES env var)
- Groups named from common filename prefix
- Groups created with groupingSource=AUTO_TIME

Hash verification after split:
- Re-hash split parts and compare to original contentHash
- Log error and create SystemNotification on mismatch
- Prevents silently corrupted split uploads

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 13:00:27 +02:00
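The time-window pass can be sketched as below; the input shape is an assumption, while the 5-minute default comes from the commit:

```typescript
// Sort ungrouped packages by post time and cluster runs whose
// consecutive gaps stay within the window (default 5 min, overridable
// via AUTO_GROUP_TIME_WINDOW_MINUTES).
interface Ungrouped { id: number; postedAt: number } // epoch millis

function clusterByTimeWindow(
  items: Ungrouped[],
  windowMinutes = 5,
): number[][] {
  const windowMs = windowMinutes * 60_000;
  const sorted = [...items].sort((a, b) => a.postedAt - b.postedAt);
  const clusters: number[][] = [];
  let current: Ungrouped[] = [];
  for (const item of sorted) {
    const prev = current[current.length - 1];
    if (prev && item.postedAt - prev.postedAt > windowMs) {
      if (current.length >= 2) clusters.push(current.map((p) => p.id));
      current = [];
    }
    current.push(item);
  }
  if (current.length >= 2) clusters.push(current.map((p) => p.id));
  return clusters; // singletons stay ungrouped
}
```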
194c87a256 fix: raise size limit and make MAX_PART_SIZE configurable
All checks were successful
continuous-integration/drone/push Build is passing
- Raise WORKER_MAX_ZIP_SIZE_MB from 4GB to 200GB (production .env)
- Make MAX_PART_SIZE configurable via MAX_PART_SIZE_MB env var
  (default 1950 MiB, set to 3900 for Premium accounts)
- Remove hardcoded 1950 MiB constants in split.ts and worker.ts
- Add grouping system audit report with real-world failure cases

10 archives were blocked by the 4GB limit (up to 70.5GB).
They will be retried on the next ingestion cycle.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-30 12:41:37 +02:00
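The env override can be sketched as below; the variable name and 1950 MiB default come from the commit, the validation guard is an assumption:

```typescript
// MAX_PART_SIZE_MB with a 1950 MiB default; Premium accounts can set
// 3900 to use the 4GB per-file limit.
function maxPartSizeBytes(env: Record<string, string | undefined>): number {
  const mib = Number(env.MAX_PART_SIZE_MB ?? "1950");
  if (!Number.isFinite(mib) || mib <= 0) {
    throw new Error(`Invalid MAX_PART_SIZE_MB: ${env.MAX_PART_SIZE_MB}`);
  }
  return mib * 1024 * 1024;
}
```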
718007446f feat: fix multi-part archive forwarding and add kickstarter package linking
All checks were successful
continuous-integration/drone/push Build is passing
Multi-part send fix:
- Add destMessageIds BigInt[] to Package schema with backfill migration
- Worker uploadToChannel now returns all message IDs, stored in DB
- Bot forwards all parts of multi-part archives (not just the first)
- Add retry logic for upload rate limits (429) and download stalls

Kickstarter package linking:
- Add package search/linking queries and API routes
- Add PackageLinkerDialog with search + checkbox selection
- Add "Link Packages" and "Send All" actions to kickstarter table
- Add sendAllKickstarterPackages server action

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:11:35 +01:00
527aca7c25 feat: add package grouping for Telegram album files
All checks were successful
continuous-integration/drone/push Build is passing
Groups related packages posted together in Telegram channels.
Auto-detects albums via media_album_id, supports manual grouping
from UI. Groups appear as collapsible rows in STL files table.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:46:52 +01:00
a4156b2ac6 fix: add race condition guard and null check in group queries
- createOrFindPackageGroup: catch unique constraint violation from
  concurrent creates and fall back to findFirst
- createManualGroup: guard against empty package results before
  accessing first element

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:45:29 +01:00
d50c68f67c feat: add package grouping UI with expand/collapse, selection, and manual grouping
- Update STL page to use listDisplayItems query for mixed package/group display
- Rewrite package-columns to handle StlTableRow union type (group headers + packages)
- Add group expand/collapse with chevron toggle and indented member rows
- Add checkbox selection with "Group N Selected" toolbar button and dialog
- Add inline group actions: rename, dissolve, send all, remove member
- Add clickable group preview thumbnail with file upload for preview images
- Extend DataTable with optional rowClassName prop for group row styling

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:39:23 +01:00
f6e7f5ed3c feat: add server actions for group management
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:34:29 +01:00
e7f213eec4 feat: add group preview image API endpoint
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:33:29 +01:00
20b7d28fdf feat: add listDisplayItems query, group CRUD, and search by group name
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:32:47 +01:00
21663fc29e feat: add PackageGroupRow and DisplayItem types
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:31:02 +01:00
218ccb9282 feat: add album grouping post-processing to worker pipeline
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:28:19 +01:00
b632533f54 feat: add createOrFindPackageGroup and linkPackagesToGroup worker queries
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:24:31 +01:00
4baf5aad83 feat: capture media_album_id from TDLib messages during scanning
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:23:47 +01:00
ad7790c07b feat: add mediaAlbumId to TelegramMessage and TelegramPhoto interfaces
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:23:11 +01:00
e4398caebe feat: add PackageGroup schema for album-based file grouping
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:21:52 +01:00
6eb7129637 docs: add package grouping design spec and implementation plan
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 21:40:13 +01:00
d6386209be fix: improve download/upload reliability and fix FILE_PARTS_INVALID
- Add downloadStarted flag to prevent false "stopped unexpectedly" errors
  when TDLib emits initial updateFile before download is active
- Add 5-minute stall detection for both downloads and uploads
- Reduce max split part size from 2GiB to 1950MiB to stay under
  Telegram's internal upload part count limits
- Increase timeouts from max(10min, 15min/GB) to max(15min, 20min/GB)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 21:40:00 +01:00
fe28c31b9e fix: improve worker error handling and reliability
All checks were successful
continuous-integration/drone/push Build is passing
1. Distinguish failure reasons: inspect error messages to label skipped
   packages as DOWNLOAD_FAILED, UPLOAD_FAILED, or EXTRACT_FAILED
   instead of catch-all DOWNLOAD_FAILED.

2. Detect orphaned uploads: before uploading, check if the same content
   hash already has a successful upload on the destination channel. Reuse
   the existing message ID instead of re-uploading (prevents duplicates
   when worker crashed between upload and DB write).

3. Increase timeouts: download from max(5min, GB*10min) to
   max(10min, GB*15min), upload from GB*10min to GB*15min.
   Prevents premature timeouts on slow connections.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 02:37:23 +01:00
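The timeout formula from point 3 can be sketched as a floor plus a per-gigabyte allowance; this shows the download side with the values stated above:

```typescript
// max(10 min, 15 min per GB): small files get a sane floor, large files
// get time proportional to their size.
const GB = 1024 ** 3;

function downloadTimeoutMs(sizeBytes: number): number {
  const perGb = (sizeBytes / GB) * 15 * 60_000;
  return Math.max(10 * 60_000, perGb);
}
```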
55bdf3c890 feat: add migration for skipped_packages table
All checks were successful
continuous-integration/drone/push Build is passing
Hand-written migration matching the SkippedPackage model in schema.prisma.
Will be applied automatically by docker-entrypoint.sh via `prisma migrate deploy`.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 21:27:25 +01:00
5506c7d91b fix: resolve build errors in retry actions and package detail query
- Replace BigInt literal `1n` with `BigInt(1)` for ES target compatibility
- Add default matchedFileCount/matchedByContent to getPackageById return

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:40:14 +01:00
5a3550fa10 feat: add skipped/failed packages tab to STL files page
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:36:39 +01:00
ad3d42a997 feat: add skipped/failed packages table UI components
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:24:32 +01:00
dd0d246a77 feat: highlight matching files in package drawer when opened from search
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:22:50 +01:00
dcc1c97053 feat: add retry server actions for skipped/failed packages
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:22:31 +01:00
71c3228e44 feat: add query functions for listing skipped/failed packages
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:19:42 +01:00
094001f9f7 feat: show file match count badge in search results
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:19:40 +01:00
0faacc214b feat: return per-package file match counts from searchPackages
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:17:42 +01:00
d53e581623 feat: record skipped/failed archives in database for UI visibility
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:16:12 +01:00
780e6200d8 feat: add SkippedPackage model for tracking skipped/failed archives
Adds SkipReason enum (SIZE_LIMIT, DOWNLOAD_FAILED, EXTRACT_FAILED,
UPLOAD_FAILED) and SkippedPackage model with unique constraint on
(sourceChannelId, sourceMessageId). Reverse relations added to
TelegramAccount and TelegramChannel.

Note: Run `npx prisma migrate dev --name add-skipped-packages` when
database is available to create the migration.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:11:56 +01:00
9642adaba7 feat: raise default ingestion size limit from 4GB to 200GB
Multipart archives where individual parts fit under Telegram's 2GB limit
but total size exceeds 4GB were being silently skipped. These can now be
processed up to 200GB total, with each part uploading directly.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:01:41 +01:00
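The gating rule after this change can be sketched as below; the limits mirror the commit text, the function shape is an assumption:

```typescript
// Accept an archive when every part fits Telegram's per-file limit and
// the total stays under the (now 200 GB) ingestion cap. Previously a
// single 4 GB total cap rejected multipart sets whose parts each fit.
const GB = 1024 ** 3;

function canIngest(partSizes: number[], totalLimit = 200 * GB): boolean {
  const perPartLimit = 2 * GB; // Telegram's per-file upload limit
  const total = partSizes.reduce((a, b) => a + b, 0);
  return partSizes.every((s) => s <= perPartLimit) && total <= totalLimit;
}
```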
9bc9271f11 fix: auto-create READER links when enabling a source channel
All checks were successful
continuous-integration/drone/push Build is passing
toggleChannelActive only flipped isActive but never created the
AccountChannelMap READER link needed by the worker. Channels enabled
via the toggle (rather than the channel picker) were invisible to the
scanner. Now auto-creates READER links for all active authenticated
accounts when a SOURCE channel is enabled.

Also ran a one-time DB fix to backfill READER links for the 14 active
channels that were missing them.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-23 23:29:52 +01:00
bd358a134b fix: add tags to new_package payload type assertion in bot
All checks were successful
continuous-integration/drone/push Build is passing
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-23 23:07:22 +01:00
1425db8774 fix: use loadChats API and load chat folders for complete chat discovery
Some checks failed
continuous-integration/drone/push Build is failing
- Switch from getChats pagination to loadChats (the TDLib-recommended
  API) which properly loads all chats into TDLib's cache and signals
  completion with a 404 error
- Discover and load chat folders via getChatFolders so chats in
  user-created folders are included
- Load from main + archive + all folders in both worker startup and
  getAccountChats channel discovery
- After loading, use getChats with high limit to retrieve all cached IDs
- This ensures private chats, 1-on-1 conversations, Saved Messages,
  basic groups, and archived/folder chats are all discoverable

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-23 22:38:49 +01:00
aef76828ef fix: support large accounts and archived chats in channel discovery
Some checks failed
continuous-integration/drone/push Build is failing
- Increase getChats pagination from 50 pages (5K chats) to 500 pages
  (50K chats) to support accounts with many channels/groups
- Load from both chatListMain AND chatListArchive so older/archived
  chats are discovered and scannable
- Deduplicate chat IDs across both lists
- Worker startup also loads both lists before scanning

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-23 19:50:14 +01:00
29e95f780c feat: support all chat types in channel discovery and enrich bot messages
Channel Discovery:
- Remove channel/supergroup filter from getAccountChats — all chat types
  (private, groups, Saved Messages, etc.) are now discoverable as sources
- Detect and label the self-chat as "Saved Messages" via getMe
- Update channel picker dialog to accept any chat type string

Bot Rich Messages:
- Enhance package send preview with creator, file count, tags, and source
  channel info in MarkdownV2 caption
- Include tags in new_package subscription notifications
- Expand getPendingSendRequest to fetch richer package data

Performance:
- Reviewed pipeline for many-channel load — getChats pagination fix and
  per-channel getChat pre-load from prior commit address the main concerns
- Channels with no new messages are skipped after only 2-3 API calls

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-23 18:27:48 +01:00
5fd341dfc4 feat: fix channel scanning bugs, add package tags, and kickstarters tab
Bug fixes:
- Fix channels not being scanned by paginating TDLib getChats (was only
  loading first batch, additional channels were unknown to TDLib)
- Add per-channel getChat pre-load as safety net before scanning
- Fix preview pictures not loading by checking previewData instead of
  previewMsgId for hasPreview flag
- Prevent previewMsgId from being set when preview download fails

Package Tags:
- Add tags Text[] column to Package with migration backfilling from
  channel categories
- Worker auto-inherits source channel category as initial tag
- Tag filter dropdown and Tags column in STL Files table
- Server actions for individual and bulk tag editing

Kickstarters Tab:
- New KickstarterHost, Kickstarter, and KickstarterPackage models
- Full CRUD with delivery status, payment status, host management
- Package linking (many-to-many with existing packages)
- Sidebar entry with Gift icon
- Table with search, filters, modal forms

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-23 18:17:44 +01:00
admin
e2dd3bb9d0 fix: cast sendMessage result type for strict TypeScript
All checks were successful
continuous-integration/drone/push Build is passing
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 14:20:16 +01:00
admin
ccf6f9000d fix: rewrite bot file sending to use getMessage + inputFileRemote
Some checks failed
continuous-integration/drone/push Build is failing
forwardMessages with send_copy is unreliable for bot accounts — it
queues an internal download+reupload that can silently fail.

New approach:
1. getMessage to get the file's remote ID from the dest channel
2. sendMessage with inputMessageDocument + inputFileRemote to send
   directly to the user (no re-upload needed, file is already on
   Telegram's servers)
3. waitForSendConfirmation to actually verify delivery via
   updateMessageSendSucceeded/Failed

Also removed getChats call (not available to bot accounts).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 13:48:11 +01:00
admin
a4c264a144 fix: bot send confirmation, preview picker dialog nesting, upload button
- Bot: wait for updateMessageSendSucceeded/Failed before marking send
  complete (was returning on temp message, actual send was async)
- Preview picker: move ArchivePreviewPicker outside parent Dialog to
  fix Radix nested dialog focus trap conflict
- Upload: add explicit "Upload Preview" button always visible in the
  action bar alongside "Pick Preview"

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 13:15:31 +01:00
admin
f4488a079f fix: add getChat before forwardMessages and debug logging for bot sends
The bot may not have the source channel loaded in TDLib's internal
state. Calling getChat first ensures it's resolved. Also added result
logging to diagnose silent send failures.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 12:39:42 +01:00
admin
729f296232 fix: use forward (not send_copy) for bot message delivery and add logging
send_copy requires re-uploading which may silently fail for bots.
Regular forward is more reliable. Added logging to debug delivery.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 12:38:24 +01:00
admin
a48f9c24a7 ci: run app/worker/bot builds in parallel and deploy after all complete
All checks were successful
continuous-integration/drone/push Build is passing
All three builds now depend only on clone and run concurrently.
Deploy step waits for all three to finish.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 11:53:53 +01:00
admin
84bb167ce6 fix: load bot chat list after TDLib auth to enable message forwarding
Some checks failed
continuous-integration/drone/push Build is failing
Without getChats after login, TDLib doesn't know about the destination
channel and forwardMessages fails with "Chat not found".

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 11:33:46 +01:00
admin
7cd84dbf02 fix: map ArchiveFormat '7Z' to ArchiveType 'SEVEN_Z' in rebuild
All checks were successful
continuous-integration/drone/push Build is passing
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 11:03:09 +01:00
admin
c00fc528ac fix: BigInt literal compatibility and bot link code JSON parsing
Some checks failed
continuous-integration/drone/push Build is failing
- Replace 0n literals with BigInt(0) for ES2017 target compatibility
- Parse link code JSON to extract userId and check expiration (was
  passing raw JSON string as FK, causing constraint violation)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 09:56:10 +01:00
admin
1fc2d3e1ae feat: add migration for archive extract requests and invite user relation
Some checks failed
continuous-integration/drone/push Build is failing
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 00:12:26 +01:00
admin
ab558e00f5 feat: add preview management, channel controls, invite polish, and recovery
- Auto-extract preview images from ZIP/RAR/7z archives during ingestion
- Upload custom preview images via package drawer
- Select preview from archive contents with on-demand extraction UI
- Manually add Telegram channels by t.me link, username, or invite link
- Invite code UX: bulk create, copy link, usage tracking, delete confirm
- Incomplete upload recovery: verify dest messages on worker startup
- Rebuild package DB by scanning destination channel with live progress

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-22 00:09:59 +01:00
admin
bf093cdfca fix: 7z parser handles solid archives with empty Compressed column 2026-03-21 21:18:33 +01:00
admin
a90f653314 feat: add 7z archive content listing via p7zip
- Add p7zip-full to worker Docker image
- New read7zContents() parser using 7z l output
- 7z archives now get full file listings like ZIP/RAR
- Standalone DOCUMENT types still show as single entry
2026-03-21 21:13:58 +01:00
admin
9ac66e9d7d feat: manual creator editing on packages and bulk set
- Click creator cell in STL Files table to edit
- Server action for updating/clearing package creator
- Bulk set creator action for multiple packages
2026-03-21 20:55:22 +01:00
admin
36a7e3d5f4 feat: add channel categories and improved creator detection
- Add category field to TelegramChannel (filterable tag like STL, PDF, D&D)
- Category column in channels table with edit via dropdown menu
- Improved creator extraction: filename patterns + channel title fallback
- extractCreatorFromChannelTitle strips [Completed], (Paid), emoji, etc.
- Fix ArchiveType in PackageListItem and PackageRow for new types
- Add Prisma migration for category column
2026-03-21 20:37:44 +01:00
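The title cleanup described above can be sketched as below; the stripping rules match the examples given ([Completed], (Paid), emoji) but the full list is an assumption:

```typescript
// Strip status suffixes, emoji, and decoration from a channel title to
// leave a plausible creator name, or null if nothing usable remains.
function extractCreatorFromChannelTitle(title: string): string | null {
  const cleaned = title
    .replace(/\[[^\]]*\]|\([^)]*\)/g, "") // drop [Completed], (Paid), ...
    .replace(/[^\p{L}\p{N}\s'&.-]/gu, "") // drop emoji and symbols
    .replace(/\s+/g, " ")
    .trim();
  return cleaned.length >= 2 ? cleaned : null;
}
```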
admin
53a76a8136 feat: add support for 7z, PDF, STL, and other document types
- Add 7Z and DOCUMENT to ArchiveType enum
- Detect .7z, .pdf, .stl, .obj, .3mf, .step, .blend, .gcode, .svg,
  .dxf, .ai, .eps, .psd files as fetchable documents
- Handle DOCUMENT and 7Z formats in worker pipeline (skip extraction,
  record file as single entry)
- Add Prisma migration for new enum values
2026-03-21 20:25:00 +01:00
admin
ba3d3a6040 fix: use searchChatMessages instead of getChatHistory for channel scanning
getChatHistory fails silently in supergroups with hidden history for new
members, returning only system messages. searchChatMessages with document
and photo filters works regardless of history visibility settings.

Also adds getChats call after TDLib client creation to populate the chat
list, preventing 'Chat not found' errors.
2026-03-21 20:15:18 +01:00
admin
fe7a548fef fix: add getChat and sync delay after openChat for proper history loading 2026-03-21 19:27:43 +01:00
admin
4a44374bb7 fix: call openChat before getChatHistory to load remote messages
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-21 18:20:30 +01:00
admin
c7eb077e0d fix: resolve TypeScript null-check errors in bot tdlib client
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-21 16:06:03 +01:00
admin
031a4687fb feat: add invite code system and multi-image Drone pipeline
Some checks failed
continuous-integration/drone/push Build is failing
- Add InviteCode model with code, maxUses, expiry, usage tracking
- Registration now requires a valid invite code
- New users get USER role instead of ADMIN
- Admin-only /invites page to create, manage, and share invite codes
- Invite links auto-fill code via ?code= URL param
- Drone pipeline now builds app, worker, and bot images separately
- Add NEXT_PUBLIC_APP_URL build arg to fix URL redirects
2026-03-21 15:41:12 +01:00
admin
30fb96b3f9 fix: replace drone-ssh with alpine SSH and fix YAML indentation 2026-03-21 15:41:12 +01:00
xCyanGrizzly
9a077a3648 Update .drone.yml
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-21 13:28:27 +01:00
xCyanGrizzly
2ceba66313 Update .drone.yml
Some checks failed
continuous-integration/drone/push Build encountered an error
2026-03-21 13:25:56 +01:00
xCyanGrizzly
036dadcb21 Update .drone.yml
Some checks failed
continuous-integration/drone/push Build encountered an error
2026-03-21 13:07:56 +01:00
xCyanGrizzly
541ae0c614 Update .drone.yml
Some checks failed
continuous-integration/drone/push Build encountered an error
2026-03-21 13:03:27 +01:00
xCyanGrizzly
b7a76fd932 Update .drone.yml
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-21 12:51:05 +01:00
xCyanGrizzly
b75b0e1f91 Update .drone.yml
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-21 12:40:24 +01:00
xCyanGrizzly
50e7e02b2d Update .drone.yml
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-21 12:29:49 +01:00
xCyanGrizzly
dea419b778 Update .drone.yml
Some checks failed
continuous-integration/drone/push Build encountered an error
2026-03-21 12:26:34 +01:00
xCyanGrizzly
053eeed6be Update .drone.yml
Some checks failed
continuous-integration/drone/push Build encountered an error
2026-03-21 12:24:17 +01:00
xCyanGrizzly
d5725bd52e Update .drone.yml
Some checks failed
continuous-integration/drone/push Build encountered an error
2026-03-21 12:21:24 +01:00
xCyanGrizzly
48726b9122 Update README.md
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-20 00:31:05 +01:00
xCyanGrizzly
1b8df48768 Update README.md 2026-03-20 00:22:04 +01:00
xCyanGrizzly
726f55a943 Update .drone.yml 2026-03-19 23:39:00 +01:00
xCyanGrizzly
b08140b4f9 Create .drone.yml
Some checks failed
continuous-integration/drone/push Build was killed
2026-03-19 23:27:51 +01:00
xCyanGrizzly
761d5e0790 add TG skill 2026-03-17 12:59:05 +01:00
xCyanGrizzly
d7bbb7587e Update tg issues 2026-03-16 16:51:30 +01:00
2763de2711 Fix multiple issues 2026-03-07 21:33:40 +01:00
xCyanGrizzly
6926df9a2c Merge pull request #15 from xCyanGrizzly/copilot/fix-channel-tab-issues
Fix inactive source channels and add Fetch Channels button to Channels tab
2026-03-05 23:50:03 +01:00
copilot-swe-agent[bot]
651e9e6bdd Simplify redundant conditional in handleFetchChannels
Co-authored-by: xCyanGrizzly <53275238+xCyanGrizzly@users.noreply.github.com>
2026-03-05 22:05:50 +00:00
copilot-swe-agent[bot]
8d508d5a86 Fix channels not active after selection and add Fetch Channels button to Channels tab
Co-authored-by: xCyanGrizzly <53275238+xCyanGrizzly@users.noreply.github.com>
2026-03-05 22:04:28 +00:00
copilot-swe-agent[bot]
2bb3caf7d9 Initial plan 2026-03-05 21:57:43 +00:00
xCyanGrizzly
8d95752106 Merge pull request #14 from xCyanGrizzly/copilot/fix-telegram-worker-issue
Fix APP_PORT: align container listen port, port mapping, and healthcheck
2026-03-05 21:40:23 +01:00
copilot-swe-agent[bot]
22419106c1 Fix APP_PORT: make container listen port and healthcheck follow APP_PORT
Co-authored-by: xCyanGrizzly <53275238+xCyanGrizzly@users.noreply.github.com>
2026-03-05 20:39:26 +00:00
copilot-swe-agent[bot]
e45de85c69 Add Rescan Channel option to channels tab
Co-authored-by: xCyanGrizzly <53275238+xCyanGrizzly@users.noreply.github.com>
2026-03-05 20:34:53 +00:00
copilot-swe-agent[bot]
71a2e6a5e8 Fix Telegram worker: countdown timer, orphaned runs, fetch-listener reconnection, and logging
Co-authored-by: xCyanGrizzly <53275238+xCyanGrizzly@users.noreply.github.com>
2026-03-05 20:21:18 +00:00
copilot-swe-agent[bot]
1436b630e2 Initial plan 2026-03-05 20:05:41 +00:00
xCyanGrizzly
43af23d3be Merge pull request #13 from xCyanGrizzly/copilot/fix-docker-sync-issues
Fix worker getting stuck during channel message sync
2026-03-05 15:08:39 +01:00
154 changed files with 23935 additions and 773 deletions

.claude/settings.json Normal file
View File

@@ -0,0 +1,5 @@
{
  "enabledPlugins": {
    "superpowers@superpowers-marketplace": true
  }
}

View File

@@ -83,7 +83,13 @@
 "Bash(git -C /mnt/c/Users/A00963355/OneDrive - Amaris Zorggroep/Documents/VScodeProjects/DragonsStash log --oneline -10)",
 "Bash(git -C \"C:/Users/A00963355/OneDrive - Amaris Zorggroep/Documents/VScodeProjects/DragonsStash\" status --short)",
 "Bash(timeout:*)",
-"mcp__Claude_Preview__preview_start"
+"mcp__Claude_Preview__preview_start",
+"Bash(cat:*)",
+"Bash(grep:*)",
+"Bash(wait:*)",
+"WebSearch",
+"Bash(SKILL_CREATOR_PATH=\"C:\\\\Users\\\\A00963355\\\\.claude\\\\plugins\\\\cache\\\\claude-plugins-official\\\\skill-creator\\\\d5c15b861cd2\\\\skills\\\\skill-creator\" && WORKSPACE=\"C:\\\\Users\\\\A00963355\\\\OneDrive - Amaris Zorggroep\\\\Documents\\\\VScodeProjects\\\\DragonsStash\\\\.claude\\\\skills\\\\tdlib-telegram-workspace\\\\iteration-1\" && python \"$SKILL_CREATOR_PATH/eval-viewer/generate_review.py\" \"$WORKSPACE\" --skill-name \"tdlib-telegram\" --benchmark \"$WORKSPACE/benchmark.json\" --static \"$WORKSPACE/review.html\" 2>&1)",
+"Bash(start:*)"
 ]
 }
 }

View File

@@ -0,0 +1,46 @@
{
  "skill_name": "tdlib-telegram",
  "iteration": 1,
  "configs": [
    {
      "name": "with_skill",
      "pass_rate": {"mean": 1.0, "stddev": 0.0},
      "tokens": {"mean": 53200, "stddev": 14800},
      "time_seconds": {"mean": 123.5, "stddev": 16.7}
    },
    {
      "name": "without_skill",
      "pass_rate": {"mean": 0.857, "stddev": 0.134},
      "tokens": {"mean": 56467, "stddev": 12100},
      "time_seconds": {"mean": 156.4, "stddev": 39.7}
    }
  ],
  "delta": {
    "pass_rate": "+14.3%",
    "tokens": "-5.8%",
    "time": "-21.0%"
  },
  "evals": [
    {
      "name": "broadcast-to-all-users",
      "with_skill": {"pass_rate": 1.0, "passed": 5, "total": 5, "tokens": 35365, "time_seconds": 107.6},
      "without_skill": {"pass_rate": 0.6, "passed": 3, "total": 5, "tokens": 69214, "time_seconds": 200.2}
    },
    {
      "name": "flood-wait-during-scan",
      "with_skill": {"pass_rate": 1.0, "passed": 4, "total": 4, "tokens": 63079, "time_seconds": 140.9},
      "without_skill": {"pass_rate": 1.0, "passed": 4, "total": 4, "tokens": 45601, "time_seconds": 122.3}
    },
    {
      "name": "download-and-reupload-file",
      "with_skill": {"pass_rate": 1.0, "passed": 5, "total": 5, "tokens": 61157, "time_seconds": 122.1},
      "without_skill": {"pass_rate": 1.0, "passed": 5, "total": 5, "tokens": 54587, "time_seconds": 146.7}
    }
  ],
  "analyst_notes": [
    "The skill's biggest impact was on Eval 1 (broadcast): the baseline MISSED both withFloodWait retry wrapping and inter-message delay — the two most critical patterns for avoiding rate limits during bulk sends. This is exactly the kind of bug the skill is designed to prevent.",
    "Eval 2 (FLOOD_WAIT debugging) was a near-tie. Both versions correctly diagnosed the problem and proposed adaptive backoff. The skill version was slightly more thorough: it added pagination-level retry with sleep(waitSec) instead of just re-throwing, meaning it can survive even after withFloodWait's retries are exhausted.",
    "Eval 3 (download/reupload) was also close. Both correctly composed existing primitives. The skill version was more explicit about WHY certain patterns matter (referencing the skill's documentation), which helps future maintainers understand the code.",
    "The skill version was faster on average (-21% time) and used fewer tokens (-5.8%), likely because the skill front-loaded the knowledge instead of requiring the agent to discover it by reading source files."
  ]
}

View File

@@ -0,0 +1,12 @@
{
  "eval_id": 1,
  "eval_name": "broadcast-to-all-users",
  "prompt": "Add a new bot command /broadcast that sends a text message to ALL users who have a TelegramLink in the database. The admin triggers it from the web app. Add it to the bot's command handler and create an API endpoint that triggers it.",
  "assertions": [
    {"text": "uses_sequential_queue: Messages sent one at a time, no Promise.all or concurrent sends", "type": "required"},
    {"text": "uses_flood_wait_retry: Each send call wrapped in withFloodWait or equivalent FLOOD_WAIT-aware retry", "type": "required"},
    {"text": "has_per_message_delay: Includes a sleep/delay between individual sends", "type": "recommended"},
    {"text": "handles_per_user_errors: One failed send does not abort the entire broadcast loop", "type": "required"},
    {"text": "has_api_endpoint: Creates an API route or server action to trigger the broadcast from the web app", "type": "required"}
  ]
}

View File

@@ -0,0 +1,10 @@
{
  "run_id": "eval-1-with_skill",
  "expectations": [
    {"text": "uses_sequential_queue", "passed": true, "evidence": "Uses a for...of loop iterating links one at a time. No Promise.all. Comment: '~20 msgs/sec — well under the 30 msgs/sec global bot limit'"},
    {"text": "uses_flood_wait_retry", "passed": true, "evidence": "Each send wrapped in withFloodWait(): 'await withFloodWait(() => sendTextMessage(link.telegramUserId, data.message, \"textParseModeHTML\"))'. Created a new bot/src/util/flood-wait.ts with full extractFloodWaitSeconds + retry logic."},
    {"text": "has_per_message_delay", "passed": true, "evidence": "await sleep(50) after each send — explicitly commented as '~20 msgs/sec'"},
    {"text": "handles_per_user_errors", "passed": true, "evidence": "try/catch around each send with 'failed++' counter, continues to next user"},
    {"text": "has_api_endpoint", "passed": true, "evidence": "Created src/app/api/telegram/bot/broadcast/route.ts with admin auth, 4096-char validation, pg_notify('bot_broadcast')"}
  ]
}

View File

@@ -0,0 +1,340 @@
# /broadcast Implementation Plan
## Approach
The broadcast feature has two parts:
1. **API endpoint** (`POST /api/telegram/bot/broadcast`) — called from the web app by an admin. It accepts a `message` string, looks up ALL `TelegramLink` records in the database, and triggers the bot to send the message to each linked user via `pg_notify`.
2. **Bot-side handler** — a new `bot_broadcast` pg_notify channel listener in `send-listener.ts` that receives the broadcast payload and sequentially sends the text message to every linked Telegram user.
The `/broadcast` bot command itself is not a user-facing Telegram command (regular users should not be able to trigger it). It is triggered exclusively through the admin API endpoint.
## Skill Patterns Applied
- **Sequential Send Queue** (from skill): Never fire concurrent sends to multiple users. The broadcast iterates users sequentially with `await sleep(50)` between sends (~20 msgs/sec, well under the 30 msgs/sec global bot limit).
- **FLOOD_WAIT handling** (from skill): Every `sendTextMessage` call is wrapped with `withFloodWait()` which extracts the wait duration from errors and retries with jitter.
- **Anti-pattern avoidance**: No `Promise.all(users.map(...))` — that would instantly hit the 30 msg/sec global limit.
- **Message text length limit**: The API endpoint validates that the broadcast message does not exceed 4,096 characters, Telegram's message-text limit as documented in the skill.
---
## File 1: `bot/src/util/flood-wait.ts` (NEW)
Extracted from the skill's recommended FLOOD_WAIT pattern so it can be reused by both existing send logic and the new broadcast logic.
```typescript
import { childLogger } from "./logger.js";

const log = childLogger("flood-wait");

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

/**
 * Extract the mandatory wait duration (in seconds) from a Telegram
 * FLOOD_WAIT error. Returns null when the error is not rate-limit related.
 */
export function extractFloodWaitSeconds(err: unknown): number | null {
  const message = err instanceof Error ? err.message : String(err);

  // Pattern 1: FLOOD_WAIT_30
  const flood = message.match(/FLOOD_WAIT_(\d+)/i);
  if (flood) return parseInt(flood[1], 10);

  // Pattern 2: "retry after 30"
  const retry = message.match(/retry after (\d+)/i);
  if (retry) return parseInt(retry[1], 10);

  // Pattern 3: HTTP 429 without explicit seconds
  if (String((err as any)?.code) === "429") return 30;

  return null; // Not a rate limit error
}

/**
 * Wrap any async Telegram operation with automatic FLOOD_WAIT retry.
 * Adds random jitter (1-5 s) to prevent thundering-herd retries.
 */
export async function withFloodWait<T>(
  fn: () => Promise<T>,
  maxRetries = 5
): Promise<T> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      const wait = extractFloodWaitSeconds(err);
      if (wait === null || attempt >= maxRetries) throw err;
      const jitter = 1000 + Math.random() * 4000;
      log.warn(
        { wait, attempt, jitter: Math.round(jitter) },
        "FLOOD_WAIT received — backing off"
      );
      await sleep(wait * 1000 + jitter);
    }
  }
  throw new Error("Unreachable");
}

export { sleep };
```
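Since the parsing rules are pure string matching, they are easy to sanity-check in isolation. The sketch below is a standalone copy of `extractFloodWaitSeconds` (duplicated here purely for illustration, with no logger dependency):

```typescript
// Standalone copy of the FLOOD_WAIT parsing rules, for a quick sanity check.
function extractFloodWaitSeconds(err: unknown): number | null {
  const message = err instanceof Error ? err.message : String(err);
  const flood = message.match(/FLOOD_WAIT_(\d+)/i);
  if (flood) return parseInt(flood[1], 10); // e.g. "FLOOD_WAIT_35" -> 35
  const retry = message.match(/retry after (\d+)/i);
  if (retry) return parseInt(retry[1], 10); // e.g. "retry after 12" -> 12
  if (String((err as any)?.code) === "429") return 30; // bare 429 -> default 30s
  return null; // not a rate-limit error
}

console.log(extractFloodWaitSeconds(new Error("FLOOD_WAIT_35"))); // 35
console.log(extractFloodWaitSeconds(new Error("Too Many Requests: retry after 12"))); // 12
console.log(extractFloodWaitSeconds(new Error("Chat not found"))); // null
```

The `null` return is what lets `withFloodWait` distinguish rate limits (sleep and retry) from real failures (rethrow immediately).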
---
## File 2: `bot/src/db/queries.ts` (MODIFIED — add one function)
Add this function at the bottom of the existing file, after the `getGlobalDestinationChannel` function:
```typescript
// ── Broadcast ──

/**
 * Fetch ALL TelegramLink records (users who linked their Telegram account).
 * Used by the broadcast feature to send a message to every linked user.
 */
export async function getAllTelegramLinks() {
  return db.telegramLink.findMany({
    select: {
      telegramUserId: true,
      telegramName: true,
    },
  });
}
```
---
## File 3: `bot/src/send-listener.ts` (MODIFIED — add broadcast channel)
Add the `bot_broadcast` channel to the existing listener. The changes are:
### 3a. Add import for the new query and flood-wait utility
At the top of the file, update the imports:
```typescript
import {
  getPendingSendRequest,
  updateSendRequest,
  findMatchingSubscriptions,
  getGlobalDestinationChannel,
  getAllTelegramLinks, // ← NEW
} from "./db/queries.js";
import { copyMessageToUser, sendTextMessage, sendPhotoMessage } from "./tdlib/client.js";
import { withFloodWait, sleep } from "./util/flood-wait.js"; // ← NEW
```
### 3b. Subscribe to the new pg_notify channel
Inside `connectListener()`, after the existing LISTEN statements, add:
```typescript
await pgClient.query("LISTEN bot_broadcast");
```
### 3c. Add the notification handler
Inside the `pgClient.on("notification", ...)` callback, add the new branch:
```typescript
pgClient.on("notification", (msg) => {
  if (msg.channel === "bot_send" && msg.payload) {
    handleBotSend(msg.payload);
  } else if (msg.channel === "new_package" && msg.payload) {
    handleNewPackage(msg.payload);
  } else if (msg.channel === "bot_broadcast" && msg.payload) { // ← NEW
    handleBroadcast(msg.payload);
  }
});
```
Update the log message:
```typescript
log.info("Send listener started (bot_send, new_package, bot_broadcast)");
```
### 3d. Add the broadcast handler function
Add this at the bottom of the file (before the existing `escapeHtml` helper):
```typescript
// ── bot_broadcast handler ──

/**
 * Handle a broadcast request. The payload is a JSON string:
 *   { message: string }
 *
 * Sends the message to every user who has a TelegramLink.
 * Uses a sequential loop with a 50 ms delay between sends (~20 msgs/sec)
 * to stay well under Telegram's 30 msgs/sec global bot limit.
 * Each send is wrapped with withFloodWait to automatically retry on
 * rate-limit errors.
 */
async function handleBroadcast(payload: string): Promise<void> {
  try {
    const data = JSON.parse(payload) as { message: string };
    if (!data.message) {
      log.warn("Broadcast payload missing message — ignoring");
      return;
    }
    const links = await getAllTelegramLinks();
    if (links.length === 0) {
      log.info("Broadcast requested but no linked users found");
      return;
    }
    log.info(
      { recipientCount: links.length },
      "Starting broadcast to all linked users"
    );
    let sent = 0;
    let failed = 0;
    for (const link of links) {
      try {
        await withFloodWait(() =>
          sendTextMessage(link.telegramUserId, data.message, "textParseModeHTML")
        );
        sent++;
      } catch (err) {
        failed++;
        log.warn(
          { err, telegramUserId: link.telegramUserId.toString() },
          "Broadcast send failed for user"
        );
      }
      // ~20 msgs/sec — well under the 30 msgs/sec global bot limit
      await sleep(50);
    }
    log.info({ sent, failed, total: links.length }, "Broadcast completed");
  } catch (err) {
    log.error({ err, payload }, "Failed to process broadcast");
  }
}
```
---
## File 4: `src/app/api/telegram/bot/broadcast/route.ts` (NEW)
This is the Next.js API endpoint that the admin triggers from the web app.
```typescript
import { NextResponse } from "next/server";
import { auth } from "@/lib/auth";
import { prisma } from "@/lib/prisma";

export const dynamic = "force-dynamic";

/**
 * POST /api/telegram/bot/broadcast
 * Send a text message to ALL users who have a linked Telegram account.
 *
 * Body: { message: string }
 *
 * Admin-only. The actual sending is done by the bot process — this endpoint
 * simply validates input and fires a pg_notify('bot_broadcast', ...) signal.
 */
export async function POST(request: Request) {
  const session = await auth();
  if (!session?.user?.id) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  // Only admins can broadcast
  if (session.user.role !== "ADMIN") {
    return NextResponse.json(
      { error: "Only admins can send broadcast messages" },
      { status: 403 }
    );
  }

  let body: { message?: string };
  try {
    body = await request.json();
  } catch {
    return NextResponse.json({ error: "Invalid JSON body" }, { status: 400 });
  }

  if (!body.message || typeof body.message !== "string") {
    return NextResponse.json(
      { error: "message is required and must be a string" },
      { status: 400 }
    );
  }

  const trimmed = body.message.trim();
  if (trimmed.length === 0) {
    return NextResponse.json(
      { error: "message must not be empty" },
      { status: 400 }
    );
  }

  // Telegram message text limit is 4096 characters
  if (trimmed.length > 4096) {
    return NextResponse.json(
      { error: "message exceeds Telegram's 4096-character limit" },
      { status: 400 }
    );
  }

  // Count how many users will receive the broadcast
  const recipientCount = await prisma.telegramLink.count();
  if (recipientCount === 0) {
    return NextResponse.json(
      { error: "No users have linked Telegram accounts" },
      { status: 400 }
    );
  }

  // Notify the bot via pg_notify
  const payload = JSON.stringify({ message: trimmed });
  try {
    await prisma.$queryRawUnsafe(
      `SELECT pg_notify('bot_broadcast', $1)`,
      payload
    );
  } catch (err) {
    return NextResponse.json(
      { error: "Failed to queue broadcast — bot notification channel unavailable" },
      { status: 500 }
    );
  }

  return NextResponse.json({
    status: "QUEUED",
    recipientCount,
    message: `Broadcast queued for ${recipientCount} user(s)`,
  });
}
```
---
## Summary of Changes
| File | Action | Description |
|------|--------|-------------|
| `bot/src/util/flood-wait.ts` | NEW | Reusable FLOOD_WAIT retry + sleep utilities (from skill) |
| `bot/src/db/queries.ts` | MODIFIED | Add `getAllTelegramLinks()` query |
| `bot/src/send-listener.ts` | MODIFIED | Subscribe to `bot_broadcast` channel; add `handleBroadcast()` with sequential send loop |
| `src/app/api/telegram/bot/broadcast/route.ts` | NEW | Admin-only API endpoint that validates input and fires `pg_notify` |
## Skill Patterns Used
1. **Sequential Send Queue**: The broadcast loop sends messages one at a time with `await sleep(50)` between each, keeping throughput at ~20 msgs/sec — safely under the 30 msgs/sec bot global limit documented in the skill.
2. **FLOOD_WAIT handling with `withFloodWait()`**: Every `sendTextMessage` call is wrapped so that if Telegram returns a `FLOOD_WAIT_X` error, the bot automatically waits the required duration plus random jitter (1-5s) before retrying. This prevents the entire broadcast from failing due to a temporary rate limit.
3. **Message text length validation**: The API endpoint rejects messages over 4,096 characters, which is the `message_text_length_max` limit documented in the skill.
4. **No concurrent sends (anti-pattern avoidance)**: The skill explicitly warns against `Promise.all(users.map(...))` for bulk sends. The implementation uses a sequential `for...of` loop instead.
5. **pg_notify pattern**: Follows the existing project convention (seen in `bot_send` and `new_package`) where the web app fires a `pg_notify` and the bot's `send-listener.ts` picks it up — keeping the two processes decoupled.
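On the web-app side, the endpoint can be exercised with a plain `fetch` call. A minimal sketch follows; `buildBroadcastPayload` and `sendBroadcast` are hypothetical helper names, not part of the plan above, and the client-side checks simply mirror the endpoint's validation rules:

```typescript
// Hypothetical admin-UI helper for POST /api/telegram/bot/broadcast.
const TELEGRAM_TEXT_LIMIT = 4096; // matches the endpoint's validation

// Build the JSON body, applying the same trim/length rules as the endpoint.
function buildBroadcastPayload(message: string): string {
  const trimmed = message.trim();
  if (trimmed.length === 0) throw new Error("message must not be empty");
  if (trimmed.length > TELEGRAM_TEXT_LIMIT)
    throw new Error("message exceeds Telegram's 4096-character limit");
  return JSON.stringify({ message: trimmed });
}

// Fire the broadcast; the response is { status, recipientCount, message }.
async function sendBroadcast(message: string): Promise<void> {
  const res = await fetch("/api/telegram/bot/broadcast", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildBroadcastPayload(message),
  });
  if (!res.ok) throw new Error(`Broadcast failed: ${res.status}`);
}
```

Validating on the client as well as the server keeps obviously invalid messages from ever hitting the API, while the server remains the authority.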

View File

@@ -0,0 +1,5 @@
{
  "total_tokens": 35365,
  "duration_ms": 107616,
  "total_duration_seconds": 107.6
}

View File

@@ -0,0 +1,10 @@
{
  "run_id": "eval-1-without_skill",
  "expectations": [
    {"text": "uses_sequential_queue", "passed": true, "evidence": "Uses for...of loop over links, sends one at a time. No Promise.all."},
    {"text": "uses_flood_wait_retry", "passed": false, "evidence": "No withFloodWait wrapper. Bare sendTextMessage calls in both send-listener handleBroadcast and commands handleBroadcastCommand. No flood-wait.ts utility created."},
    {"text": "has_per_message_delay", "passed": false, "evidence": "No sleep/delay between sends in either the send-listener handler or the command handler."},
    {"text": "handles_per_user_errors", "passed": true, "evidence": "try/catch per user with sent++/failed++ counters, continues to next user."},
    {"text": "has_api_endpoint", "passed": true, "evidence": "Created src/app/api/telegram/bot/broadcast/route.ts with admin auth, 4096-char limit, pg_notify."}
  ]
}

View File

@@ -0,0 +1,5 @@
{
  "total_tokens": 69214,
  "duration_ms": 200249,
  "total_duration_seconds": 200.2
}

View File

@@ -0,0 +1,11 @@
{
  "eval_id": 2,
  "eval_name": "flood-wait-during-scan",
  "prompt": "The worker keeps crashing with 'FLOOD_WAIT_35' errors when scanning a source channel that has about 10,000 messages. It happens during the getChannelMessages pagination loop. How do I fix this?",
  "assertions": [
    {"text": "identifies_retry_as_fix: Recommends wrapping pagination calls in FLOOD_WAIT-aware retry logic", "type": "required"},
    {"text": "recommends_pagination_delay: Mentions adding or increasing sleep between pages", "type": "required"},
    {"text": "respects_retry_after: Fix extracts and uses the actual wait duration from the error, not a hardcoded value", "type": "required"},
    {"text": "does_not_reduce_frequency_as_primary_fix: Does not suggest scanning less often as the main solution", "type": "required"}
  ]
}

View File

@@ -0,0 +1,9 @@
{
  "run_id": "eval-2-with_skill",
  "expectations": [
    {"text": "identifies_retry_as_fix", "passed": true, "evidence": "Identifies that withFloodWait already exists but the pagination loop needs its own adaptive backoff layer. Adds extractFloodWaitSeconds at pagination level."},
    {"text": "recommends_pagination_delay", "passed": true, "evidence": "Keeps existing sleep(config.apiDelayMs) and makes it adaptive: doubles on FLOOD_WAIT, relaxes by 0.8x on success."},
    {"text": "respects_retry_after", "passed": true, "evidence": "Extracts actual wait duration: 'await sleep(waitSec * 1000 + jitter)' using extractFloodWaitSeconds. Explicitly cites skill rule: 'Always respect the wait duration.'"},
    {"text": "does_not_reduce_frequency_as_primary_fix", "passed": true, "evidence": "Primary fix is adaptive backoff at pagination level, not reducing scan frequency. Does not suggest lowering WORKER_INTERVAL_MINUTES."}
  ]
}

View File

@@ -0,0 +1,378 @@
# Diagnosis: FLOOD_WAIT_35 During Channel Scanning
## Root Cause
The `getChannelMessages` function in `worker/src/tdlib/download.ts` (line 146) paginates through channel history using `getChatHistory` with a fixed 1-second delay between pages (`config.apiDelayMs = 1000`). While each individual `getChatHistory` call is wrapped in `invokeWithTimeout` which delegates to `withFloodWait` for retry logic, the pagination loop itself has a fundamental problem for large channels:
**A 10,000-message channel at 100 messages/page = 100 consecutive API calls.** With only a 1-second delay between pages, this sustained burst of ~1 req/sec for 100+ seconds eventually triggers Telegram's dynamic rate limiter. Once `FLOOD_WAIT_35` fires, `withFloodWait` does correctly sleep for 35+ seconds and retry that single call -- but the loop immediately resumes its 1-second cadence, triggering another FLOOD_WAIT shortly after. After 5 retries (`config.maxRetries = 5`), the function throws and the worker crashes.
The same issue exists in `getTopicMessages` in `worker/src/tdlib/topics.ts` (line 170), which uses identical pagination logic.
## Why the Current Code Is Almost Right But Not Quite
The codebase already has the right building blocks:
1. `withFloodWait` in `worker/src/util/retry.ts` correctly extracts wait durations and adds jitter -- this matches the skill's recommended FLOOD_WAIT handler pattern exactly.
2. `invokeWithTimeout` in `worker/src/tdlib/download.ts` wraps every TDLib call through `withFloodWait`.
3. There is a 1-second inter-page delay (`config.apiDelayMs`).
**The gap:** After a FLOOD_WAIT recovery, the pagination loop does not back off its inter-page delay. It goes right back to 1-second spacing, which is what triggers repeated FLOOD_WAITs until max retries is exhausted.
## The Fix
Apply **adaptive backoff** to the pagination delay: when a FLOOD_WAIT is encountered during scanning, increase the inter-page delay for subsequent pages. This prevents the "recover then immediately re-trigger" cycle.
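The two transitions this policy needs (double on FLOOD_WAIT, relax on success) can be isolated as pure functions. A minimal sketch, with illustrative constant names standing in for `config.apiDelayMs` and the 30-second cap used in the fixes:

```typescript
// Sketch of the adaptive inter-page delay policy.
const BASE_DELAY_MS = 1_000;  // baseline inter-page delay (config.apiDelayMs)
const MAX_DELAY_MS = 30_000;  // cap so the scan never stalls for minutes

// On FLOOD_WAIT: double the delay, capped at MAX_DELAY_MS.
function onFloodWait(currentDelayMs: number): number {
  return Math.min(currentDelayMs * 2, MAX_DELAY_MS);
}

// On a successful page: relax by 0.8x, never dropping below the baseline.
function onSuccess(currentDelayMs: number): number {
  return currentDelayMs > BASE_DELAY_MS
    ? Math.max(BASE_DELAY_MS, Math.floor(currentDelayMs * 0.8))
    : currentDelayMs;
}
```

One FLOOD_WAIT moves the spacing to 2 s, two in a row to 4 s, and a run of clean pages decays it back toward the 1 s baseline, which breaks the recover-then-re-trigger cycle.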
### Fix 1: Add adaptive delay to `getChannelMessages` (`worker/src/tdlib/download.ts`)
Replace lines 146-250 with:
```typescript
/**
 * Fetch messages from a channel, stopping once we've scanned past the
 * last-processed boundary (with one page of lookback for multipart safety).
 * Collects both archive attachments AND photo messages (for preview matching).
 * Returns messages in chronological order (oldest first).
 *
 * When `lastProcessedMessageId` is null (first run), scans everything.
 * The worker applies a post-grouping filter to skip fully-processed sets,
 * and keeps `packageExistsBySourceMessage` as a safety net.
 *
 * Safety features:
 * - Max page limit to prevent infinite loops
 * - Stuck detection: breaks if from_message_id stops advancing
 * - Timeout on each TDLib API call
 * - Adaptive delay: backs off when FLOOD_WAIT is encountered
 */
export async function getChannelMessages(
  client: Client,
  chatId: bigint,
  lastProcessedMessageId?: bigint | null,
  limit = 100,
  onProgress?: ScanProgressCallback
): Promise<ChannelScanResult> {
  const archives: TelegramMessage[] = [];
  const photos: TelegramPhoto[] = [];
  const boundary = lastProcessedMessageId ? Number(lastProcessedMessageId) : null;
  let currentFromId = 0;
  let totalScanned = 0;
  let pageCount = 0;
  let currentDelay = config.apiDelayMs; // starts at 1000ms, adapts on FLOOD_WAIT

  // eslint-disable-next-line no-constant-condition
  while (true) {
    if (pageCount >= MAX_SCAN_PAGES) {
      log.warn(
        { chatId: chatId.toString(), pageCount, totalScanned },
        "Hit max page limit for channel scan, stopping"
      );
      break;
    }
    pageCount++;
    const previousFromId = currentFromId;

    let result: { messages: TdMessage[] };
    try {
      result = await invokeWithTimeout<{ messages: TdMessage[] }>(client, {
        _: "getChatHistory",
        chat_id: Number(chatId),
        from_message_id: currentFromId,
        offset: 0,
        limit: Math.min(limit, 100),
        only_local: false,
      });
    } catch (err) {
      // If invokeWithTimeout exhausted its retries on FLOOD_WAIT, check if
      // we can recover at the pagination level by increasing the delay further.
      const waitSec = extractFloodWaitSeconds(err);
      if (waitSec !== null) {
        // The retry wrapper already slept; bump the inter-page delay to
        // prevent the next page from immediately re-triggering.
        currentDelay = Math.min(currentDelay * 2, 30_000);
        log.warn(
          { chatId: chatId.toString(), newDelay: currentDelay, totalScanned },
          "FLOOD_WAIT persisted after retries — increasing inter-page delay and retrying"
        );
        // Sleep the full flood wait duration + jitter before continuing
        const jitter = 1000 + Math.random() * 4000;
        await sleep(waitSec * 1000 + jitter);
        continue; // retry this page with the new delay
      }
      throw err; // non-rate-limit error — propagate
    }

    // Successful call — gradually relax the delay back toward baseline
    if (currentDelay > config.apiDelayMs) {
      currentDelay = Math.max(config.apiDelayMs, Math.floor(currentDelay * 0.8));
    }

    if (!result.messages || result.messages.length === 0) break;
    totalScanned += result.messages.length;

    for (const msg of result.messages) {
      // Check for archive documents
      const doc = msg.content?.document;
      if (doc?.file_name && doc.document && isArchiveAttachment(doc.file_name)) {
        archives.push({
          id: BigInt(msg.id),
          fileName: doc.file_name,
          fileId: String(doc.document.id),
          fileSize: BigInt(doc.document.size),
          date: new Date(msg.date * 1000),
        });
        continue;
      }
      // Check for photo messages (potential previews)
      const photo = msg.content?.photo;
      const caption = msg.content?.caption?.text ?? "";
      if (photo?.sizes && photo.sizes.length > 0) {
        const smallest = photo.sizes[0];
        photos.push({
          id: BigInt(msg.id),
          date: new Date(msg.date * 1000),
          caption,
          fileId: String(smallest.photo.id),
          fileSize: smallest.photo.size || smallest.photo.expected_size,
        });
      }
    }

    // Report scanning progress after each page
    onProgress?.(totalScanned);

    currentFromId = result.messages[result.messages.length - 1].id;
    // Stuck detection: if from_message_id didn't advance, break to prevent infinite loop
    if (currentFromId === previousFromId) {
      log.warn(
        { chatId: chatId.toString(), currentFromId, totalScanned },
        "Pagination stuck (from_message_id not advancing), breaking"
      );
      break;
    }

    // Stop scanning once we've gone past the boundary (this page is the lookback)
    if (boundary && currentFromId < boundary) break;
    if (result.messages.length < Math.min(limit, 100)) break;

    // Rate limit delay — adaptive based on FLOOD_WAIT history
    await sleep(currentDelay);
  }

  log.info(
    { chatId: chatId.toString(), archives: archives.length, photos: photos.length, totalScanned, pages: pageCount },
    "Channel scan complete"
  );

  // Reverse to chronological order (oldest first) so worker processes old→new
  return {
    archives: archives.reverse(),
    photos: photos.reverse(),
    totalScanned,
  };
}
```
You will also need to add the import for `extractFloodWaitSeconds` at the top of `download.ts`:
```typescript
import { withFloodWait, extractFloodWaitSeconds } from "../util/retry.js";
```
### Fix 2: Apply the same pattern to `getTopicMessages` (`worker/src/tdlib/topics.ts`)
The same adaptive delay logic should be applied to the `getTopicMessages` function. Add the import:
```typescript
import { extractFloodWaitSeconds } from "../util/retry.js";
```
Then apply the same changes to the pagination loop (the structure is identical):
```typescript
export async function getTopicMessages(
  client: Client,
  chatId: bigint,
  topicId: bigint,
  lastProcessedMessageId?: bigint | null,
  limit = 100,
  onProgress?: ScanProgressCallback
): Promise<ChannelScanResult> {
  const archives: TelegramMessage[] = [];
  const photos: TelegramPhoto[] = [];
  const boundary = lastProcessedMessageId ? Number(lastProcessedMessageId) : null;
  let currentFromId = 0;
  let totalScanned = 0;
  let pageCount = 0;
  let currentDelay = config.apiDelayMs;

  // eslint-disable-next-line no-constant-condition
  while (true) {
    if (pageCount >= MAX_SCAN_PAGES) {
      log.warn(
        { chatId: chatId.toString(), topicId: topicId.toString(), pageCount, totalScanned },
        "Hit max page limit for topic scan, stopping"
      );
      break;
    }
    pageCount++;
    const previousFromId = currentFromId;

    let result: {
      messages?: {
        id: number;
        date: number;
        content: {
          _: string;
          document?: {
            file_name?: string;
            document?: {
              id: number;
              size: number;
            };
          };
          photo?: {
            sizes?: {
              type: string;
              photo: { id: number; size: number; expected_size: number };
              width: number;
              height: number;
            }[];
          };
          caption?: { text?: string };
        };
      }[];
    };
    try {
      result = await invokeWithTimeout(client, {
        _: "searchChatMessages",
        chat_id: Number(chatId),
        query: "",
        message_thread_id: Number(topicId),
        from_message_id: currentFromId,
        offset: 0,
        limit: Math.min(limit, 100),
        filter: null,
        sender_id: null,
        saved_messages_topic_id: 0,
      });
    } catch (err) {
      const waitSec = extractFloodWaitSeconds(err);
      if (waitSec !== null) {
        currentDelay = Math.min(currentDelay * 2, 30_000);
        log.warn(
          { chatId: chatId.toString(), topicId: topicId.toString(), newDelay: currentDelay, totalScanned },
          "FLOOD_WAIT persisted after retries — increasing inter-page delay and retrying"
        );
        const jitter = 1000 + Math.random() * 4000;
        await sleep(waitSec * 1000 + jitter);
        continue;
      }
      throw err;
    }

    // Successful call — gradually relax the delay back toward baseline
    if (currentDelay > config.apiDelayMs) {
      currentDelay = Math.max(config.apiDelayMs, Math.floor(currentDelay * 0.8));
    }

    if (!result.messages || result.messages.length === 0) break;
    totalScanned += result.messages.length;

    for (const msg of result.messages) {
      const doc = msg.content?.document;
      if (doc?.file_name && doc.document && isArchiveAttachment(doc.file_name)) {
        archives.push({
          id: BigInt(msg.id),
          fileName: doc.file_name,
          fileId: String(doc.document.id),
          fileSize: BigInt(doc.document.size),
          date: new Date(msg.date * 1000),
        });
        continue;
      }
      const photo = msg.content?.photo;
      const caption = msg.content?.caption?.text ?? "";
      if (photo?.sizes && photo.sizes.length > 0) {
        const smallest = photo.sizes[0];
        photos.push({
          id: BigInt(msg.id),
          date: new Date(msg.date * 1000),
          caption,
          fileId: String(smallest.photo.id),
          fileSize: smallest.photo.size || smallest.photo.expected_size,
        });
      }
    }

    onProgress?.(totalScanned);

    currentFromId = result.messages[result.messages.length - 1].id;
    if (currentFromId === previousFromId) {
      log.warn(
        { chatId: chatId.toString(), topicId: topicId.toString(), currentFromId, totalScanned },
        "Topic pagination stuck (from_message_id not advancing), breaking"
      );
      break;
    }

    if (boundary && currentFromId < boundary) break;
    if (result.messages.length < Math.min(limit, 100)) break;

    await sleep(currentDelay);
  }

  log.info(
    { chatId: chatId.toString(), topicId: topicId.toString(), archives: archives.length, photos: photos.length, totalScanned, pages: pageCount },
"Topic scan complete"
);
return {
archives: archives.reverse(),
photos: photos.reverse(),
totalScanned,
};
}
```
## Skill Patterns Applied
### 1. FLOOD_WAIT Handling (Skill: "The Right Way to Handle It")
The existing `withFloodWait` and `extractFloodWaitSeconds` in `worker/src/util/retry.ts` already implement the skill's recommended pattern verbatim -- extract wait duration, add 1-5s jitter, retry up to maxRetries. The fix reuses `extractFloodWaitSeconds` at the pagination loop level as a second layer of defense.
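The helpers in `worker/src/util/retry.ts` are referenced throughout but never shown. For readers without that file at hand, a minimal sketch of what `extractFloodWaitSeconds` likely looks like (the exact error-message shape is an assumption; TDLib surfaces rate limits as errors whose message contains `FLOOD_WAIT_<seconds>`):

```typescript
// Hypothetical sketch -- the real helper lives in worker/src/util/retry.ts.
// Returns the mandated wait in seconds, or null when the error is not a
// rate-limit error, so callers can fail fast on everything else.
export function extractFloodWaitSeconds(err: unknown): number | null {
  const message = err instanceof Error ? err.message : String(err);
  const match = /FLOOD_WAIT_(\d+)/.exec(message);
  return match ? parseInt(match[1], 10) : null;
}
```

Returning `null` for non-rate-limit errors is what lets the pagination-level catch blocks re-throw everything that is not a FLOOD_WAIT.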
### 2. Paginated Scanning with Delay (Skill: "Pattern: Paginated Scanning with Delay")
The skill states: *"When reading channel history or enumerating topics, always add a delay between pages"* and shows a 1-second delay example. The existing code has this (`config.apiDelayMs = 1000`). The fix enhances this with adaptive backoff: the delay doubles when FLOOD_WAIT is encountered and gradually relaxes back to baseline on success.
### 3. Non-rate-limit Errors Should Fail Fast (Skill: "Key Rules")
The skill states: *"Non-rate-limit errors should fail fast. Only retry on FLOOD_WAIT, not on other errors."* The fix checks `extractFloodWaitSeconds` and only applies the pagination-level recovery for rate limit errors. All other errors propagate immediately via `throw err`.
### 4. Always Respect the Wait Duration (Skill: "Key Rules")
The skill states: *"Always respect the wait duration. Never retry before retry_after expires."* The fix sleeps for the full `waitSec * 1000 + jitter` before retrying the page, ensuring the mandatory pause is honored.
### 5. Add Jitter (Skill: "Key Rules")
The skill states: *"Add jitter. Without it, multiple clients retry simultaneously and trigger another FLOOD_WAIT."* Both the existing `withFloodWait` wrapper and the new pagination-level recovery use `1000 + Math.random() * 4000` jitter, consistent with the skill's recommendation.
## Files Affected
- `worker/src/tdlib/download.ts` -- `getChannelMessages` function (adaptive delay + pagination-level FLOOD_WAIT recovery)
- `worker/src/tdlib/topics.ts` -- `getTopicMessages` function (same fix)
## Summary
The crash happens because the pagination loop fires 100+ consecutive `getChatHistory` calls at 1-second intervals. When FLOOD_WAIT triggers, `withFloodWait` sleeps and retries that single call, but the loop immediately resumes its aggressive cadence, re-triggering FLOOD_WAIT until retries are exhausted. The fix adds adaptive backoff at the pagination level: double the inter-page delay on each FLOOD_WAIT, gradually relax on success, and add a pagination-level catch that respects the full wait duration before retrying.


@@ -0,0 +1,5 @@
{
"total_tokens": 63079,
"duration_ms": 140851,
"total_duration_seconds": 140.9
}


@@ -0,0 +1,9 @@
{
"run_id": "eval-2-without_skill",
"expectations": [
{"text": "identifies_retry_as_fix", "passed": true, "evidence": "Correctly identifies that invokeWithTimeout already wraps calls in withFloodWait. Adds adaptive backoff at pagination level."},
{"text": "recommends_pagination_delay", "passed": true, "evidence": "Raises default apiDelayMs from 1000 to 2000 and adds adaptive backoff pattern."},
{"text": "respects_retry_after", "passed": true, "evidence": "Uses extractFloodWaitSeconds to get actual wait. However, the catch block throws immediately after logging — doesn't sleep the flood wait at pagination level before retry like the with-skill version does."},
{"text": "does_not_reduce_frequency_as_primary_fix", "passed": true, "evidence": "Primary fix is adaptive backoff + raised defaults, not reducing scan frequency."}
]
}


@@ -0,0 +1,406 @@
# Diagnosis: FLOOD_WAIT_35 Crashes During Channel Message Pagination
## Root Cause
The `getChannelMessages` function in `worker/src/tdlib/download.ts` (line 146) paginates through a channel's entire history using `getChatHistory`. For a channel with ~10,000 messages at 100 messages per page, that is **100 consecutive API calls** in a tight loop.
The current rate-limit mitigation has three problems:
1. **The inter-page delay is too low.** `config.apiDelayMs` is hardcoded to **1000ms** (1 second) in `worker/src/util/config.ts:15`. Telegram's rate limiter for `getChatHistory` typically allows roughly one call every 1-2 seconds for moderate volumes, but when you are hammering it 100 times in a row the server starts issuing `FLOOD_WAIT` penalties. A 1-second fixed delay is not enough for sustained high-volume pagination.
2. **The retry headroom is too small for sustained scans.** The pagination call (`getChatHistory`) at `download.ts:174` goes through `invokeWithTimeout`, which **does** wrap the call with `withFloodWait`, so per-call retry logic is present. However, the retry wrapper in `worker/src/util/retry.ts` has `maxRetries` set to **5** (from config). When you are scanning 10,000 messages, FLOOD_WAIT can hit on many different pages, and each individual page gets only those 5 retries. The wrapper does honor escalating waits (e.g., FLOOD_WAIT_35 means "wait 35 seconds") -- but because the **fixed 1-second inter-page delay is too aggressive**, FLOOD_WAIT errors pile up on nearly every page in the latter half of the scan. Eventually one page exhausts its 5 retries and the entire scan crashes.
3. **No adaptive/exponential backoff between pages.** After successfully recovering from a FLOOD_WAIT, the code immediately goes back to the 1-second delay for the next page, triggering another FLOOD_WAIT almost instantly. There is no mechanism to slow down after being rate-limited.
## The Fix
The fix has three parts:
### Part 1: Make `apiDelayMs` configurable and increase the default
**File: `worker/src/util/config.ts`**
```typescript
export const config = {
databaseUrl: process.env.DATABASE_URL ?? "",
workerIntervalMinutes: parseInt(process.env.WORKER_INTERVAL_MINUTES ?? "60", 10),
tempDir: process.env.WORKER_TEMP_DIR ?? "/tmp/zips",
tdlibStateDir: process.env.TDLIB_STATE_DIR ?? "/data/tdlib",
maxZipSizeMB: parseInt(process.env.WORKER_MAX_ZIP_SIZE_MB ?? "4096", 10),
logLevel: (process.env.LOG_LEVEL ?? "info") as "debug" | "info" | "warn" | "error",
telegramApiId: parseInt(process.env.TELEGRAM_API_ID ?? "0", 10),
telegramApiHash: process.env.TELEGRAM_API_HASH ?? "",
/** Maximum jitter added to scheduler interval (in minutes) */
jitterMinutes: 5,
/** Maximum time span for multipart archive parts (in hours). 0 = no limit. */
multipartTimeoutHours: parseInt(process.env.MULTIPART_TIMEOUT_HOURS ?? "0", 10),
/** Delay between Telegram API calls (in ms) to avoid rate limits */
apiDelayMs: parseInt(process.env.WORKER_API_DELAY_MS ?? "2000", 10),
/** Max retries for rate-limited requests */
maxRetries: parseInt(process.env.WORKER_MAX_RETRIES ?? "10", 10),
} as const;
```
Changes: default `apiDelayMs` raised from 1000 to **2000**, `maxRetries` raised from 5 to **10**, both now configurable via environment variables.
### Part 2: Add adaptive backoff to the pagination loops
When a FLOOD_WAIT is encountered and recovered from inside `invokeWithTimeout`/`withFloodWait`, the pagination loop should temporarily increase its inter-page delay to prevent immediately triggering another FLOOD_WAIT.
**File: `worker/src/tdlib/download.ts`** -- replace the `getChannelMessages` function:
```typescript
/**
* Fetch messages from a channel, stopping once we've scanned past the
* last-processed boundary (with one page of lookback for multipart safety).
* Collects both archive attachments AND photo messages (for preview matching).
* Returns messages in chronological order (oldest first).
*
* When `lastProcessedMessageId` is null (first run), scans everything.
* The worker applies a post-grouping filter to skip fully-processed sets,
* and keeps `packageExistsBySourceMessage` as a safety net.
*
* Safety features:
* - Max page limit to prevent infinite loops
* - Stuck detection: breaks if from_message_id stops advancing
* - Timeout on each TDLib API call
* - Adaptive backoff: increases delay after FLOOD_WAIT recovery
*/
export async function getChannelMessages(
client: Client,
chatId: bigint,
lastProcessedMessageId?: bigint | null,
limit = 100,
onProgress?: ScanProgressCallback
): Promise<ChannelScanResult> {
const archives: TelegramMessage[] = [];
const photos: TelegramPhoto[] = [];
const boundary = lastProcessedMessageId ? Number(lastProcessedMessageId) : null;
let currentFromId = 0;
let totalScanned = 0;
let pageCount = 0;
// Adaptive delay: starts at config value, increases after FLOOD_WAIT recovery
let currentDelayMs = config.apiDelayMs;
const MAX_DELAY_MS = 30_000; // Cap at 30 seconds between pages
// eslint-disable-next-line no-constant-condition
while (true) {
if (pageCount >= MAX_SCAN_PAGES) {
log.warn(
{ chatId: chatId.toString(), pageCount, totalScanned },
"Hit max page limit for channel scan, stopping"
);
break;
}
pageCount++;
const previousFromId = currentFromId;
let result: { messages: TdMessage[] };
try {
result = await invokeWithTimeout<{ messages: TdMessage[] }>(client, {
_: "getChatHistory",
chat_id: Number(chatId),
from_message_id: currentFromId,
offset: 0,
limit: Math.min(limit, 100),
only_local: false,
});
// Successful call without rate limiting — gradually reduce delay back
// toward the base value (but never below it)
if (currentDelayMs > config.apiDelayMs) {
currentDelayMs = Math.max(
config.apiDelayMs,
Math.floor(currentDelayMs * 0.8)
);
}
} catch (err) {
// If withFloodWait inside invokeWithTimeout exhausted retries on a
// FLOOD_WAIT error, increase the inter-page delay significantly
// before re-throwing so the caller (or a future retry of the whole
// scan) starts slower.
const floodSeconds = extractFloodWaitSeconds(err);
if (floodSeconds !== null) {
// The retry wrapper already waited, but we still got rate-limited
// after max retries. Double the inter-page delay for next time.
currentDelayMs = Math.min(MAX_DELAY_MS, currentDelayMs * 2);
log.warn(
{
chatId: chatId.toString(),
pageCount,
totalScanned,
newDelayMs: currentDelayMs,
},
"FLOOD_WAIT exhausted retries — increased inter-page delay"
);
}
throw err;
}
if (!result.messages || result.messages.length === 0) break;
totalScanned += result.messages.length;
for (const msg of result.messages) {
// Check for archive documents
const doc = msg.content?.document;
if (doc?.file_name && doc.document && isArchiveAttachment(doc.file_name)) {
archives.push({
id: BigInt(msg.id),
fileName: doc.file_name,
fileId: String(doc.document.id),
fileSize: BigInt(doc.document.size),
date: new Date(msg.date * 1000),
});
continue;
}
// Check for photo messages (potential previews)
const photo = msg.content?.photo;
const caption = msg.content?.caption?.text ?? "";
if (photo?.sizes && photo.sizes.length > 0) {
const smallest = photo.sizes[0];
photos.push({
id: BigInt(msg.id),
date: new Date(msg.date * 1000),
caption,
fileId: String(smallest.photo.id),
fileSize: smallest.photo.size || smallest.photo.expected_size,
});
}
}
// Report scanning progress after each page
onProgress?.(totalScanned);
currentFromId = result.messages[result.messages.length - 1].id;
// Stuck detection: if from_message_id didn't advance, break to prevent infinite loop
if (currentFromId === previousFromId) {
log.warn(
{ chatId: chatId.toString(), currentFromId, totalScanned },
"Pagination stuck (from_message_id not advancing), breaking"
);
break;
}
// Stop scanning once we've gone past the boundary (this page is the lookback)
if (boundary && currentFromId < boundary) break;
if (result.messages.length < Math.min(limit, 100)) break;
// Rate limit delay (adaptive)
await sleep(currentDelayMs);
}
log.info(
{ chatId: chatId.toString(), archives: archives.length, photos: photos.length, totalScanned, pages: pageCount },
"Channel scan complete"
);
// Reverse to chronological order (oldest first) so worker processes old→new
return {
archives: archives.reverse(),
photos: photos.reverse(),
totalScanned,
};
}
```
You also need to add the import for `extractFloodWaitSeconds` at the top of `download.ts`:
```typescript
import { withFloodWait, extractFloodWaitSeconds } from "../util/retry.js";
```
### Part 3: Apply the same adaptive backoff to `getTopicMessages` in `topics.ts`
The same pattern should be applied to `worker/src/tdlib/topics.ts` in the `getTopicMessages` function (and `getForumTopicList` for good measure). The change is identical in structure: track `currentDelayMs`, relax it toward the base value on success, and double it (capped at 30s) when a FLOOD_WAIT exhausts its retries.
**File: `worker/src/tdlib/topics.ts`** -- add the import and apply to both pagination loops. Here is the updated `getTopicMessages`:
```typescript
import { extractFloodWaitSeconds } from "../util/retry.js";
// ... (existing code) ...
export async function getTopicMessages(
client: Client,
chatId: bigint,
topicId: bigint,
lastProcessedMessageId?: bigint | null,
limit = 100,
onProgress?: ScanProgressCallback
): Promise<ChannelScanResult> {
const archives: TelegramMessage[] = [];
const photos: TelegramPhoto[] = [];
const boundary = lastProcessedMessageId ? Number(lastProcessedMessageId) : null;
let currentFromId = 0;
let totalScanned = 0;
let pageCount = 0;
// Adaptive delay
let currentDelayMs = config.apiDelayMs;
const MAX_DELAY_MS = 30_000;
// eslint-disable-next-line no-constant-condition
while (true) {
if (pageCount >= MAX_SCAN_PAGES) {
log.warn(
{ chatId: chatId.toString(), topicId: topicId.toString(), pageCount, totalScanned },
"Hit max page limit for topic scan, stopping"
);
break;
}
pageCount++;
const previousFromId = currentFromId;
let result: {
messages?: {
id: number;
date: number;
content: {
_: string;
document?: {
file_name?: string;
document?: { id: number; size: number };
};
photo?: {
sizes?: {
type: string;
photo: { id: number; size: number; expected_size: number };
width: number;
height: number;
}[];
};
caption?: { text?: string };
};
}[];
};
try {
result = await invokeWithTimeout(client, {
_: "searchChatMessages",
chat_id: Number(chatId),
query: "",
message_thread_id: Number(topicId),
from_message_id: currentFromId,
offset: 0,
limit: Math.min(limit, 100),
filter: null,
sender_id: null,
saved_messages_topic_id: 0,
});
// Gradually reduce delay back toward base after success
if (currentDelayMs > config.apiDelayMs) {
currentDelayMs = Math.max(
config.apiDelayMs,
Math.floor(currentDelayMs * 0.8)
);
}
} catch (err) {
const floodSeconds = extractFloodWaitSeconds(err);
if (floodSeconds !== null) {
currentDelayMs = Math.min(MAX_DELAY_MS, currentDelayMs * 2);
log.warn(
{
chatId: chatId.toString(),
topicId: topicId.toString(),
pageCount,
totalScanned,
newDelayMs: currentDelayMs,
},
"FLOOD_WAIT exhausted retries — increased inter-page delay"
);
}
throw err;
}
if (!result.messages || result.messages.length === 0) break;
totalScanned += result.messages.length;
for (const msg of result.messages) {
const doc = msg.content?.document;
if (doc?.file_name && doc.document && isArchiveAttachment(doc.file_name)) {
archives.push({
id: BigInt(msg.id),
fileName: doc.file_name,
fileId: String(doc.document.id),
fileSize: BigInt(doc.document.size),
date: new Date(msg.date * 1000),
});
continue;
}
const photo = msg.content?.photo;
const caption = msg.content?.caption?.text ?? "";
if (photo?.sizes && photo.sizes.length > 0) {
const smallest = photo.sizes[0];
photos.push({
id: BigInt(msg.id),
date: new Date(msg.date * 1000),
caption,
fileId: String(smallest.photo.id),
fileSize: smallest.photo.size || smallest.photo.expected_size,
});
}
}
onProgress?.(totalScanned);
currentFromId = result.messages[result.messages.length - 1].id;
if (currentFromId === previousFromId) {
log.warn(
{ chatId: chatId.toString(), topicId: topicId.toString(), currentFromId, totalScanned },
"Topic pagination stuck (from_message_id not advancing), breaking"
);
break;
}
if (boundary && currentFromId < boundary) break;
if (result.messages.length < Math.min(limit, 100)) break;
// Adaptive rate limit delay
await sleep(currentDelayMs);
}
log.info(
{ chatId: chatId.toString(), topicId: topicId.toString(), archives: archives.length, photos: photos.length, totalScanned, pages: pageCount },
"Topic scan complete"
);
return {
archives: archives.reverse(),
photos: photos.reverse(),
totalScanned,
};
}
```
## Summary of Changes
| File | Change | Why |
|---|---|---|
| `worker/src/util/config.ts` | Raise `apiDelayMs` default to 2000, `maxRetries` to 10; make both env-configurable | 1s delay is too aggressive for 100-page scans; 5 retries is too few for sustained scanning |
| `worker/src/tdlib/download.ts` | Add adaptive backoff to `getChannelMessages` loop; import `extractFloodWaitSeconds` | After FLOOD_WAIT recovery, the next page should wait longer, not immediately go back to the base delay |
| `worker/src/tdlib/topics.ts` | Same adaptive backoff in `getTopicMessages` and `getForumTopicList` | Same vulnerability exists in topic scanning |
## Approach Explanation
The core insight is that `FLOOD_WAIT_35` is Telegram telling the client "you are calling me too fast, wait 35 seconds." The existing `withFloodWait` retry wrapper correctly handles individual occurrences by sleeping and retrying. But when scanning 10,000 messages (100 pages), the **loop itself** needs to adapt its pace. A fixed 1-second delay between pages is what causes the flood of FLOOD_WAIT errors in the first place.
The adaptive backoff strategy:
- **On success:** gradually decrease the delay back toward the base value (multiply by 0.8), so scanning speeds back up once the rate limit pressure eases.
- **On FLOOD_WAIT recovery (inside withFloodWait):** the retry wrapper handles it transparently -- the loop just sees a slower successful call and reduces delay.
- **On FLOOD_WAIT exhausting retries:** double the inter-page delay (capped at 30s) before re-throwing, so if the scan is retried it starts slower.
- **Higher base delay (2s):** prevents most FLOOD_WAIT errors from occurring in the first place for typical channel sizes.
- **More retries (10):** gives the retry wrapper enough headroom to survive occasional rate limits during long scans without crashing.


@@ -0,0 +1,5 @@
{
"total_tokens": 45601,
"duration_ms": 122326,
"total_duration_seconds": 122.3
}


@@ -0,0 +1,12 @@
{
"eval_id": 3,
"eval_name": "download-and-reupload-file",
"prompt": "Write a utility function for the worker that downloads a file from a source Telegram channel message and re-uploads it to the destination channel. It needs to handle files up to 2GB and work reliably in Docker.",
"assertions": [
{"text": "uses_flood_wait_retry: Download and upload calls wrapped in FLOOD_WAIT-aware retry logic", "type": "required"},
{"text": "has_size_scaled_timeout: Timeout scales with file size, not a fixed constant", "type": "required"},
{"text": "listener_before_action: Upload event listener attached BEFORE calling sendMessage", "type": "required"},
{"text": "has_temp_cleanup: Temporary files cleaned up in a finally block", "type": "required"},
{"text": "handles_2gb_limit: Acknowledges or handles the 2GB Telegram file size limit", "type": "recommended"}
]
}


@@ -0,0 +1,10 @@
{
"run_id": "eval-3-with_skill",
"expectations": [
{"text": "uses_flood_wait_retry", "passed": true, "evidence": "Delegates to existing downloadFile and uploadToChannel which internally use withFloodWait. Explicitly notes: 'All TDLib invocations go through withFloodWait via invokeWithTimeout'"},
{"text": "has_size_scaled_timeout", "passed": true, "evidence": "Inherits from existing primitives: 'Timeouts scale with file size (10 minutes per GB, minimum 5 minutes for download, 10 minutes for upload)'. Explicitly cited from skill."},
{"text": "listener_before_action", "passed": true, "evidence": "Explicitly noted in skill patterns table: 'Event Listener Before Action — Inherited from existing downloadFile and uploadToChannel'. The skill pattern is referenced."},
{"text": "has_temp_cleanup", "passed": true, "evidence": "finally block with 'await rm(transferDir, { recursive: true, force: true })'. Uses randomUUID for isolation."},
{"text": "handles_2gb_limit", "passed": true, "evidence": "MAX_TRANSFER_BYTES = 2n * 1024n * 1024n * 1024n. Upfront validation with clear error message."}
]
}


@@ -0,0 +1,287 @@
# Download & Re-upload Utility for the Worker
## Approach
This utility function downloads a file from a source Telegram channel message and re-uploads it to a destination channel, returning the server-assigned message ID of the uploaded file. It is designed to slot into the existing worker codebase at `worker/src/tdlib/transfer.ts`.
The function composes the existing `downloadFile` (from `tdlib/download.ts`) and `uploadToChannel` (from `upload/channel.ts`) primitives, adding:
1. **Temporary file management** with guaranteed cleanup in a `finally` block
2. **Size-scaled timeouts** for files up to 2 GB (the TDLib user account limit)
3. **Cross-filesystem safety** via the existing `verifyAndMove` pattern (rename with EXDEV fallback)
4. **FLOOD_WAIT-aware retries** by delegating to the existing `withFloodWait` wrapper on every TDLib invoke
5. **Docker reliability**: uses `/tmp/zips` (the volume from the Dockerfile) for temp storage, ensures cleanup even on crash, and avoids holding large buffers in memory
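The `verifyAndMove` pattern mentioned in point 3 is not shown in this document. A minimal sketch of how such a helper typically works (the name and exact checks are assumptions; the real implementation lives in the worker codebase):

```typescript
import { rename, copyFile, stat, unlink } from "fs/promises";

// Hypothetical sketch of verifyAndMove: verify the downloaded size, then
// move the file, falling back to copy+delete when source and destination
// are on different filesystems (rename fails with EXDEV, e.g. between a
// container tmpfs and a mounted Docker volume).
async function verifyAndMove(src: string, dest: string, expectedBytes: number): Promise<void> {
  const { size } = await stat(src);
  if (expectedBytes > 0 && size !== expectedBytes) {
    throw new Error(`Size mismatch for ${src}: expected ${expectedBytes}, got ${size}`);
  }
  try {
    await rename(src, dest); // atomic when both paths share a filesystem
  } catch (err) {
    if ((err as NodeJS.ErrnoException).code !== "EXDEV") throw err;
    await copyFile(src, dest); // cross-device fallback
    await unlink(src);
  }
}
```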
## Skill Patterns Applied
| Skill Pattern | Where Applied |
|---|---|
| **FLOOD_WAIT handling** (`withFloodWait` + `extractFloodWaitSeconds`) | All TDLib invocations go through `withFloodWait` via `invokeWithTimeout` (download) and the upload internals |
| **Download/Upload Timeouts** (scale with file size: 10 min/GB, min 5-10 min) | Inherited from existing `downloadFile` and `sendAndWaitForUpload` |
| **Event Listener Before Action** | Inherited from existing `downloadFile` (subscribes to `updateFile` before calling `downloadFile` invoke) and `uploadToChannel` (subscribes before `sendMessage`) |
| **TDLib Client Lifecycle** (`finally` block for cleanup) | The caller is expected to manage the client; this utility manages temp files in its own `finally` |
| **File Size Limits** (User account TDLib: 2 GB upload/download) | Explicit 2 GB guard with clear error message |
| **Sequential Send Queue** / no concurrent sends | Single sequential download-then-upload, rate limit delay between steps |
| **BigInt Chat IDs** | Passes `Number(chatId)` to TDLib invoke calls (inherited from existing code) |
| **Docker Considerations** | Uses the mounted `/tmp/zips` volume, per-operation subdirectory, guaranteed cleanup |
| **Never bare `client.invoke()`** | All invocations go through `withFloodWait` wrappers |
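The size-scaled timeout rule cited in the table (10 minutes per GB with a minimum floor) can be sketched as follows (the function name and rounding are illustrative assumptions; the real computation lives inside `downloadFile` and `sendAndWaitForUpload`):

```typescript
const MS_PER_GB = 10 * 60 * 1000; // 10 minutes of timeout budget per gigabyte
const GB = 1024 ** 3;

// Hypothetical sketch: the timeout grows linearly with file size but never
// drops below the per-operation floor (5 min for download, 10 min for upload).
function transferTimeoutMs(fileSizeBytes: number, floorMs: number): number {
  return Math.max(floorMs, Math.ceil((fileSizeBytes / GB) * MS_PER_GB));
}
```

Under this rule a 1.4 GB archive gets roughly a 14-minute window, comfortably above the 10-minute upload floor.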
## Full Implementation
File: `worker/src/tdlib/transfer.ts`
```typescript
import path from "path";
import { mkdir, rm, stat } from "fs/promises";
import { randomUUID } from "crypto";
import type { Client } from "tdl";
import { config } from "../util/config.js";
import { childLogger } from "../util/logger.js";
import { downloadFile } from "./download.js";
import type { DownloadProgress } from "./download.js";
import { uploadToChannel } from "../upload/channel.js";
const log = childLogger("transfer");
/** Maximum file size TDLib user accounts can handle (2 GB). */
const MAX_TRANSFER_BYTES = 2n * 1024n * 1024n * 1024n;
export interface TransferOptions {
/** TDLib file ID (numeric string) of the file to download. */
fileId: string;
/** Original file name from the source message. */
fileName: string;
/** Expected file size in bytes. */
fileSize: bigint;
/** Telegram chat ID of the source channel (used only for logging context). */
sourceChatId: bigint;
/** Telegram chat ID of the destination channel. */
destChatId: bigint;
/** Optional caption to attach to the uploaded message. */
caption?: string;
/** Optional callback for download progress updates. */
onDownloadProgress?: (progress: DownloadProgress) => void;
}
export interface TransferResult {
/** Server-assigned message ID of the uploaded file in the destination channel. */
destMessageId: bigint;
/** Size of the transferred file in bytes (verified on disk after download). */
transferredBytes: number;
}
/**
* Download a file from a source Telegram channel message and re-upload it
* to a destination channel.
*
* This function:
* 1. Validates the file size is within TDLib's 2 GB limit
* 2. Downloads the file to a temporary directory (with progress tracking)
* 3. Verifies the downloaded file's size matches the expected size
* 4. Uploads the file to the destination channel
* 5. Waits for Telegram's server-side upload confirmation
* 6. Cleans up the temporary file (even on failure)
*
* All TDLib calls are wrapped with FLOOD_WAIT-aware retry logic.
* Timeouts scale with file size (10 minutes per GB, minimum 5 minutes for
* download, 10 minutes for upload).
*
* Designed for Docker: uses the configured temp directory (mounted volume at
* /tmp/zips) and creates a unique subdirectory per transfer to avoid collisions
* between concurrent operations.
*
* @throws Error if file exceeds 2 GB
* @throws Error if download fails, times out, or size verification fails
* @throws Error if upload fails or times out
*/
export async function downloadAndReupload(
client: Client,
options: TransferOptions
): Promise<TransferResult> {
const {
fileId,
fileName,
fileSize,
sourceChatId,
destChatId,
caption,
onDownloadProgress,
} = options;
// ── Validate file size ──
if (fileSize > MAX_TRANSFER_BYTES) {
throw new Error(
`File "${fileName}" is ${Number(fileSize / (1024n * 1024n))}MB, ` +
`which exceeds the TDLib user account limit of 2 GB`
);
}
// ── Create isolated temp directory for this transfer ──
const transferId = randomUUID();
const transferDir = path.join(config.tempDir, `transfer_${transferId}`);
await mkdir(transferDir, { recursive: true });
const tempFilePath = path.join(transferDir, fileName);
log.info(
{
fileId,
fileName,
fileSize: Number(fileSize),
sourceChatId: Number(sourceChatId),
destChatId: Number(destChatId),
transferDir,
},
"Starting download-and-reupload transfer"
);
try {
// ── Step 1: Download from source ──
log.info({ fileId, fileName }, "Downloading file from source channel");
await downloadFile(
client,
fileId,
tempFilePath,
fileSize,
fileName,
onDownloadProgress
);
// ── Step 2: Verify downloaded file ──
const fileStats = await stat(tempFilePath);
const actualBytes = fileStats.size;
if (Number(fileSize) > 0 && actualBytes !== Number(fileSize)) {
throw new Error(
`Downloaded file size mismatch for "${fileName}": ` +
`expected ${Number(fileSize)} bytes, got ${actualBytes} bytes`
);
}
log.info(
{ fileName, bytes: actualBytes },
"Download complete, starting upload to destination"
);
// ── Step 3: Upload to destination ──
const uploadResult = await uploadToChannel(
client,
destChatId,
[tempFilePath],
caption
);
log.info(
{
fileName,
destMessageId: Number(uploadResult.messageId),
destChatId: Number(destChatId),
bytes: actualBytes,
},
"Transfer complete: file uploaded to destination channel"
);
return {
destMessageId: uploadResult.messageId,
transferredBytes: actualBytes,
};
} catch (err) {
log.error(
{
err,
fileId,
fileName,
sourceChatId: Number(sourceChatId),
destChatId: Number(destChatId),
},
"Transfer failed"
);
throw err;
} finally {
// ── Always clean up temp files ──
await rm(transferDir, { recursive: true, force: true }).catch((cleanupErr) => {
log.warn(
{ err: cleanupErr, transferDir },
"Failed to clean up transfer temp directory"
);
});
}
}
/**
* Convenience wrapper that extracts the file info from a TDLib message object
* and calls downloadAndReupload.
*
* Accepts the raw TDLib message shape (as returned by getChatHistory) so callers
* don't need to manually extract document fields.
*
* @throws Error if the message does not contain a document attachment
*/
export async function transferMessageFile(
client: Client,
// eslint-disable-next-line @typescript-eslint/no-explicit-any
message: any,
sourceChatId: bigint,
destChatId: bigint,
caption?: string,
onDownloadProgress?: (progress: DownloadProgress) => void
): Promise<TransferResult> {
const doc = message?.content?.document;
if (!doc?.document) {
throw new Error(
`Message ${message?.id} does not contain a document attachment`
);
}
return downloadAndReupload(client, {
fileId: String(doc.document.id),
fileName: doc.file_name ?? `file_${message.id}`,
fileSize: BigInt(doc.document.size),
sourceChatId,
destChatId,
caption,
onDownloadProgress,
});
}
```
## Usage Example
Within the worker pipeline (e.g., in `worker.ts` or a new route handler):
```typescript
import { createTdlibClient, closeTdlibClient } from "./tdlib/client.js";
import { downloadAndReupload } from "./tdlib/transfer.js";
const client = await createTdlibClient({ id: account.id, phone: account.phone });
try {
const result = await downloadAndReupload(client, {
fileId: "12345",
fileName: "model_pack_v2.zip",
fileSize: 1_500_000_000n, // ~1.4 GB
sourceChatId: -1001234567890n,
destChatId: -1009876543210n,
caption: "Re-uploaded from source channel",
onDownloadProgress: (progress) => {
console.log(`Download: ${progress.percent}%`);
},
});
console.log(`Uploaded as message ${result.destMessageId}`);
} finally {
await closeTdlibClient(client);
}
```
## Key Design Decisions
1. **Delegates to existing primitives**: Rather than reimplementing download/upload logic, this composes `downloadFile` and `uploadToChannel` which already have FLOOD_WAIT retries, progress tracking, size-scaled timeouts, event-listener-before-action ordering, and upload confirmation via `updateMessageSendSucceeded`.
2. **Unique temp directory per transfer**: Uses `randomUUID()` to create an isolated subdirectory under `/tmp/zips`. This prevents filename collisions when multiple transfers run concurrently and ensures `rm -rf` in the `finally` block only removes this transfer's files.
3. **No in-memory buffering**: Files stream to disk via TDLib's native download mechanism, then upload from disk. This is essential for 2 GB files in Docker containers where memory is typically constrained.
4. **Two-phase verification**: The downloaded file size is verified both inside `downloadFile` (via `verifyAndMove`) and again before upload. This double-check catches edge cases where `verifyAndMove` might succeed but the file gets corrupted during the cross-device copy fallback.
5. **`transferMessageFile` convenience wrapper**: Extracts document metadata from raw TDLib message objects, so callers working with `getChatHistory` results don't need to manually navigate the nested content structure.
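Design decision 2 can be sketched as a small helper. This is a minimal sketch, assuming a callback-based API; the name `withTransferDir` and its shape are illustrative, not part of the codebase:

```typescript
import path from "path";
import { randomUUID } from "crypto";
import { mkdir, rm } from "fs/promises";

// Sketch of the per-transfer temp directory pattern from design decision 2.
// The helper name and callback shape are assumptions, not the codebase's API.
export async function withTransferDir<T>(
  baseDir: string,
  fn: (workDir: string) => Promise<T>
): Promise<T> {
  // randomUUID() gives each transfer an isolated subdirectory, so
  // concurrent transfers of identically named files never collide.
  const workDir = path.join(baseDir, randomUUID());
  await mkdir(workDir, { recursive: true });
  try {
    return await fn(workDir);
  } finally {
    // Removes only this transfer's files; sibling transfers are untouched.
    await rm(workDir, { recursive: true, force: true });
  }
}
```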


@@ -0,0 +1,5 @@
{
"total_tokens": 61157,
"duration_ms": 122135,
"total_duration_seconds": 122.1
}


@@ -0,0 +1,10 @@
{
"run_id": "eval-3-without_skill",
"expectations": [
{"text": "uses_flood_wait_retry", "passed": true, "evidence": "Delegates to existing downloadFile and uploadToChannel which use withFloodWait internally. Notes 'Both are wrapped with withFloodWait internally'."},
{"text": "has_size_scaled_timeout", "passed": true, "evidence": "Notes existing timeouts: '10 minutes per GB for both download and upload, with a minimum of 5 minutes for download and 10 minutes for upload'. Inherited from composed functions."},
{"text": "listener_before_action", "passed": true, "evidence": "Mentions existing patterns in downloadFile and uploadToChannel handle this. But does NOT explicitly call it out as a critical concern or verify it."},
{"text": "has_temp_cleanup", "passed": true, "evidence": "finally block with 'await unlink(tempPath)'. Uses tempSubdir for isolation."},
{"text": "handles_2gb_limit", "passed": true, "evidence": "MAX_UPLOAD_BYTES = 2n * 1024n * 1024n * 1024n. Upfront validation with error directing to byteLevelSplit."}
]
}


@@ -0,0 +1,318 @@
# Download and Re-upload Utility for the DragonsStash Worker
## Approach
The utility function `downloadAndReupload` bridges a source Telegram channel message to a destination channel by:
1. **Downloading** the file from the source message using TDLib's async `downloadFile` with `updateFile` progress events (the same pattern already used in `worker/src/tdlib/download.ts`).
2. **Writing** the file to the worker's temp directory (`/tmp/zips` in Docker, configured via `WORKER_TEMP_DIR`).
3. **Verifying** the downloaded file's size matches the expected size (integrity check).
4. **Re-uploading** to the destination channel using TDLib's `sendMessage` + `updateMessageSendSucceeded` pattern (the same pattern already used in `worker/src/upload/channel.ts`).
5. **Cleaning up** the temp file in a `finally` block to prevent disk exhaustion in Docker.
### Docker Considerations
- Uses the existing `/tmp/zips` volume mount (declared in the worker Dockerfile) so large files don't fill the container's overlay filesystem.
- Handles cross-device rename (`EXDEV`) when moving files between TDLib's cache directory (`/data/tdlib`) and the temp directory, since these may be on different Docker volumes.
- Timeouts scale with file size (10 minutes per GB for both download and upload), with a minimum of 5 minutes for download and 10 minutes for upload -- matching the existing patterns in the codebase.
- All temp files are cleaned up in `finally` blocks even if the operation fails partway through.
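The cross-device handling above can be sketched as follows; `moveAcrossDevices` is an illustrative name under these assumptions, not the codebase's actual helper:

```typescript
import { rename, copyFile, unlink } from "fs/promises";

// rename() fails with EXDEV when src and dest sit on different filesystems
// (e.g. the /data/tdlib and /tmp/zips Docker volumes), so fall back to a
// copy followed by deleting the original.
export async function moveAcrossDevices(src: string, dest: string): Promise<void> {
  try {
    await rename(src, dest); // fast path: same filesystem
  } catch (err) {
    if ((err as NodeJS.ErrnoException).code !== "EXDEV") throw err;
    await copyFile(src, dest); // cross-device fallback
    await unlink(src);
  }
}
```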
### 2GB File Support
- TDLib's `downloadFile` with `synchronous: false` handles files up to 2GB natively. The existing codebase already has this pattern working.
- Telegram's upload limit is 2GB per message. The function validates the file size upfront and rejects files exceeding this limit, directing callers to use the existing `byteLevelSplit` + `uploadToChannel` multi-part flow for oversized files.
- Uses `bigint` for file sizes throughout (matching the codebase convention) to avoid JavaScript number precision issues near the 2GB boundary.
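As a minimal illustration of the bigint convention (the constant mirrors the `MAX_UPLOAD_BYTES` check used in the full code below; the helper name is an assumption):

```typescript
// Keeping sizes as bigint means comparisons and arithmetic near the limit
// stay exact, and the value flows unchanged from Telegram's int64 file sizes.
export const MAX_UPLOAD_BYTES = 2n * 1024n * 1024n * 1024n; // 2 GiB

export function exceedsUploadLimit(fileSize: bigint): boolean {
  return fileSize > MAX_UPLOAD_BYTES;
}
```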
## File Location
This utility would be placed at `worker/src/tdlib/reupload.ts`, alongside the existing `download.ts` module.
## Full Code
```typescript
// worker/src/tdlib/reupload.ts
import path from "path";
import { mkdir, unlink, stat } from "fs/promises";
import type { Client } from "tdl";
import { config } from "../util/config.js";
import { childLogger } from "../util/logger.js";
import { downloadFile } from "./download.js";
import type { DownloadProgress } from "./download.js";
import { uploadToChannel } from "../upload/channel.js";
import type { UploadResult } from "../upload/channel.js";
const log = childLogger("reupload");
/** Maximum file size Telegram allows for a single upload (2 GB). */
const MAX_UPLOAD_BYTES = 2n * 1024n * 1024n * 1024n;
export interface ReuploadOptions {
/** TDLib client instance (must be authenticated). */
client: Client;
/** Telegram file ID (numeric string) from the source message. */
fileId: string;
/** Original file name. */
fileName: string;
/** Expected file size in bytes. */
fileSize: bigint;
/** Telegram chat ID of the destination channel. */
destChatId: bigint;
/** Optional caption for the re-uploaded message. */
caption?: string;
/** Optional callback for download progress. */
onDownloadProgress?: (progress: DownloadProgress) => void;
/** Optional subdirectory name inside tempDir (to isolate concurrent operations). */
tempSubdir?: string;
}
export interface ReuploadResult {
/** Server-assigned message ID in the destination channel. */
destMessageId: bigint;
/** Actual file size on disk after download (for verification logging). */
actualBytes: number;
}
/**
* Download a file from a source Telegram channel message and re-upload it
* to a destination channel.
*
* Flow:
* 1. Validates file size is within Telegram's 2GB upload limit
* 2. Downloads via TDLib async download with progress tracking
* 3. Verifies file integrity (size match)
* 4. Uploads to destination channel, waiting for server confirmation
* 5. Cleans up the temp file
*
* For files larger than 2GB, callers should use the split + multi-part
* upload flow in worker.ts instead.
*
* Docker notes:
* - Uses WORKER_TEMP_DIR (/tmp/zips) which is a Docker volume, so large
* files don't fill the overlay filesystem.
* - Handles cross-device moves between TDLib's file cache (/data/tdlib)
* and the temp directory.
* - Temp files are always cleaned up, even on failure.
*
* @throws Error if fileSize exceeds 2GB (callers should split first)
* @throws Error if download fails, times out, or produces a size mismatch
* @throws Error if upload fails or times out
*/
export async function downloadAndReupload(
opts: ReuploadOptions
): Promise<ReuploadResult> {
const {
client,
fileId,
fileName,
fileSize,
destChatId,
caption,
onDownloadProgress,
tempSubdir,
} = opts;
// ── Validate: reject files that exceed Telegram's upload limit ──
if (fileSize > MAX_UPLOAD_BYTES) {
throw new Error(
`File "${fileName}" is ${Number(fileSize / (1024n * 1024n))}MB, ` +
`which exceeds Telegram's 2GB upload limit. ` +
`Use byteLevelSplit + uploadToChannel for oversized files.`
);
}
// ── Prepare temp directory ──
const workDir = tempSubdir
? path.join(config.tempDir, tempSubdir)
: config.tempDir;
await mkdir(workDir, { recursive: true });
const tempPath = path.join(workDir, fileName);
log.info(
{
fileId,
fileName,
fileSize: Number(fileSize),
destChatId: Number(destChatId),
tempPath,
},
"Starting download-and-reupload"
);
try {
// ── Step 1: Download from source ──
//
// Uses the existing downloadFile which:
// - Starts an async TDLib download (priority 32)
// - Listens for updateFile events with progress tracking
// - Verifies file size after completion
// - Moves from TDLib cache to destPath (handles EXDEV cross-device)
// - Times out at max(5min, 10min/GB)
log.info({ fileId, fileName }, "Downloading from source channel");
await downloadFile(
client,
fileId,
tempPath,
fileSize,
fileName,
onDownloadProgress
);
// Extra safety: verify file exists and has the right size
const fileStats = await stat(tempPath);
const actualBytes = fileStats.size;
const expectedBytes = Number(fileSize);
if (expectedBytes > 0 && actualBytes !== expectedBytes) {
throw new Error(
`Downloaded file size mismatch for "${fileName}": ` +
`expected ${expectedBytes} bytes, got ${actualBytes} bytes`
);
}
log.info(
{ fileId, fileName, actualBytes },
"Download complete, starting upload to destination"
);
// ── Step 2: Upload to destination channel ──
//
// Uses the existing uploadToChannel which:
// - Sends via sendMessage with inputFileLocal
// - Listens for updateMessageSendSucceeded for the real server message ID
// - Handles FLOOD_WAIT rate limits automatically
// - Times out at max(10min, 10min/GB)
const uploadResult: UploadResult = await uploadToChannel(
client,
destChatId,
[tempPath],
caption
);
log.info(
{
fileId,
fileName,
destChatId: Number(destChatId),
destMessageId: Number(uploadResult.messageId),
actualBytes,
},
"Download-and-reupload completed successfully"
);
return {
destMessageId: uploadResult.messageId,
actualBytes,
};
} finally {
// ── Always clean up temp file ──
// Critical in Docker to prevent /tmp/zips volume from filling up,
// especially when processing many large files in sequence.
try {
await unlink(tempPath);
log.debug({ tempPath }, "Cleaned up temp file");
} catch {
// File may not exist if download failed before writing
}
}
}
/**
* Convenience wrapper that downloads and re-uploads multiple files
* (e.g., multipart archive parts) from a source channel to a destination.
*
* Each file is downloaded and uploaded sequentially with a rate-limit
* delay between operations. Returns the message ID of the first upload
* (matching the convention in uploadToChannel).
*
* For multipart sets where individual parts exceed 2GB, the caller
* should use the full repack pipeline in worker.ts (concatenate +
* byteLevelSplit) instead of this function.
*/
export async function downloadAndReuploadMultiple(
client: Client,
files: Array<{
fileId: string;
fileName: string;
fileSize: bigint;
}>,
destChatId: bigint,
caption?: string,
onDownloadProgress?: (fileIndex: number, progress: DownloadProgress) => void,
tempSubdir?: string
): Promise<ReuploadResult> {
if (files.length === 0) {
throw new Error("No files provided for download-and-reupload");
}
let firstResult: ReuploadResult | null = null;
for (let i = 0; i < files.length; i++) {
const file = files[i];
// Only set caption on the first file (matching uploadToChannel convention)
const fileCaption = i === 0 ? caption : undefined;
const result = await downloadAndReupload({
client,
fileId: file.fileId,
fileName: file.fileName,
fileSize: file.fileSize,
destChatId,
caption: fileCaption,
onDownloadProgress: onDownloadProgress
? (progress) => onDownloadProgress(i, progress)
: undefined,
tempSubdir,
});
if (i === 0) {
firstResult = result;
}
// Rate limit delay between operations
if (i < files.length - 1) {
await new Promise((resolve) =>
setTimeout(resolve, config.apiDelayMs)
);
}
}
return firstResult!;
}
```
## How It Fits Into the Existing Codebase
The function composes the two existing building blocks:
- **`downloadFile`** from `worker/src/tdlib/download.ts` -- handles async download with `updateFile` events, progress tracking, timeout, size verification, and cross-device file moves (critical for Docker where `/data/tdlib` and `/tmp/zips` are separate volumes).
- **`uploadToChannel`** from `worker/src/upload/channel.ts` -- handles `sendMessage` with `updateMessageSendSucceeded` confirmation, upload progress via `updateFile`, and FLOOD_WAIT retry.
Both are wrapped with `withFloodWait` internally, so the utility automatically handles Telegram rate limiting.
The `downloadAndReupload` function is a focused utility that can be called from the worker pipeline or any other part of the system that needs to transfer a file between channels. The existing `processOneArchiveSet` in `worker.ts` does a similar download-then-upload flow but with additional steps (hashing, dedup, metadata extraction, splitting). This utility extracts the core transfer operation for simpler use cases.
## Usage Example
```typescript
import { downloadAndReupload } from "./tdlib/reupload.js";
// Transfer a single file from source to destination
const result = await downloadAndReupload({
client,
fileId: "12345",
fileName: "archive.zip",
fileSize: 1_500_000_000n, // ~1.5 GB
destChatId: -1001234567890n,
caption: "Re-uploaded archive",
onDownloadProgress: (progress) => {
console.log(`Download: ${progress.percent}%`);
},
tempSubdir: `reupload_${Date.now()}`,
});
console.log(`Uploaded as message ${result.destMessageId}`);
```


@@ -0,0 +1,5 @@
{
"total_tokens": 54587,
"duration_ms": 146695,
"total_duration_seconds": 146.7
}

File diff suppressed because one or more lines are too long


@@ -0,0 +1,301 @@
---
name: tdlib-telegram
description: >
Reference guide for building Telegram integrations with TDLib (tdl/node).
Covers rate limits, FLOOD_WAIT handling, file size constraints, bot vs user account
differences, and safe code patterns. Use this skill whenever writing or modifying code
that calls Telegram APIs via TDLib, the Bot API, or any Telegram client library — including
sending messages, downloading/uploading files, scanning channels, forwarding messages,
managing subscriptions, or handling notifications. Also use when debugging 429 errors,
FLOOD_WAIT, or silent message drops.
---
# TDLib / Telegram Development Guide
This skill provides the rate limits, constraints, and patterns you need to write correct
Telegram integrations. The limits below come from official Telegram documentation and
well-established community findings (Telegram does not publish exact numbers for all limits).
## Telegram Rate Limits
These are approximate safe boundaries. Telegram's actual limits are dynamic and depend on
account age, history, and request type. The correct strategy is to respect these as guidelines
and always handle FLOOD_WAIT errors gracefully.
### Bot Accounts
| Operation | Limit | Notes |
|-----------|-------|-------|
| Messages to same chat | ~1 msg/sec | Short bursts OK; sustained traffic above this triggers FLOOD_WAIT |
| Messages in a group | 20 msgs/min | Hard limit per group chat |
| Bulk notifications (different users) | ~30 msgs/sec | Global across all chats |
| Message edits in a group | ~20 edits/min | Community-observed |
| API requests (global) | ~30 req/sec | All request types combined |
| Paid broadcasts | up to 1000 msgs/sec | Requires Telegram Stars balance |
### User Accounts (TDLib)
| Operation | Limit | Notes |
|-----------|-------|-------|
| API requests (global) | ~30 req/sec | All request types combined |
| Messages in a group | ~20 msgs/min | Same as bot |
| Channel history reads | No published limit | But pagination + delay is essential |
| Joining groups | Very strict | FLOOD_WAIT often 30-300+ seconds |
### File Size Limits
| Context | Upload | Download |
|---------|--------|----------|
| Bot API (standard) | 50 MB | 20 MB |
| Bot API (local server) | 2,000 MB | 2,000 MB |
| User account (TDLib) | 2 GB | 2 GB |
| Premium user (TDLib) | 4 GB | 4 GB |
### Message & Content Limits
| Item | Limit |
|------|-------|
| Message text length | 4,096 chars |
| Media caption | 1,024 chars (4,096 premium) |
| Album / media group | 10 items max |
| Forwarded messages per request | `forwarded_message_count_max` (TDLib option) |
| Inline keyboard buttons | 100 entities |
| Formatting entities per message | 100 |
| Scheduled messages per chat | 100 |
| Bot commands | 100 max |
### Forum & Group Limits
| Item | Limit |
|------|-------|
| Topics per group | 1,000,000 |
| Topic title | 128 chars |
| Group members | 200,000 |
| Admins per group | 50 |
| Bots per group | 20 |
| Pinned topics | 5 |
## FLOOD_WAIT — How It Works
When you exceed rate limits, Telegram returns a `FLOOD_WAIT_X` error (or HTTP 429 with
`retry_after`). This is a **mandatory pause** — the value `X` is the number of seconds you
must wait before ANY request will succeed. It blocks the entire client, not just the
operation that triggered it.
### The Right Way to Handle It
```typescript
// Extract the wait duration from the error
function extractFloodWaitSeconds(err: unknown): number | null {
const message = err instanceof Error ? err.message : String(err);
// Pattern 1: FLOOD_WAIT_30
const flood = message.match(/FLOOD_WAIT_(\d+)/i);
if (flood) return parseInt(flood[1], 10);
// Pattern 2: "retry after 30"
const retry = message.match(/retry after (\d+)/i);
if (retry) return parseInt(retry[1], 10);
// Pattern 3: HTTP 429 without explicit seconds
if (String((err as any)?.code) === "429") return 30;
return null; // Not a rate limit error
}
// Wrap any TDLib call with automatic retry
async function withFloodWait<T>(fn: () => Promise<T>, maxRetries = 5): Promise<T> {
for (let attempt = 0; attempt <= maxRetries; attempt++) {
try {
return await fn();
} catch (err) {
const wait = extractFloodWaitSeconds(err);
if (wait === null || attempt >= maxRetries) throw err;
// Add 1-5s jitter to prevent thundering herd
const jitter = 1000 + Math.random() * 4000;
await sleep(wait * 1000 + jitter);
}
}
throw new Error("Unreachable");
}
```
### Key Rules
- **Always respect the wait duration.** Never retry before `retry_after` expires.
- **Add jitter.** Without it, multiple clients retry simultaneously and trigger another FLOOD_WAIT.
- **Non-rate-limit errors should fail fast.** Only retry on FLOOD_WAIT, not on other errors.
- **Don't add blanket artificial throttling.** Telegram's own guidance (echoed in the grammY docs)
  is to send requests as fast as you need and handle 429 errors when they arrive. A fixed low
  request rate wastes throughput without preventing floods.
## Code Patterns
### Pattern: Sequential Send Queue
When sending notifications to multiple users, use a sequential queue with a per-message delay.
Never fire concurrent sends — you will hit the 30 msg/sec global limit instantly.
```typescript
let sendQueue: Promise<void> = Promise.resolve();
function queueSend(chatId: bigint, text: string): void {
sendQueue = sendQueue
.then(() => withFloodWait(() => sendTextMessage(chatId, text)))
.then(() => sleep(50)) // ~20 msgs/sec, well under 30 limit
.catch((err) => log.error({ err, chatId }, "Send failed"));
}
```
### Pattern: Paginated Scanning with Delay
When reading channel history or enumerating topics, always add a delay between pages:
```typescript
while (hasMorePages) {
const result = await invokeWithTimeout(client, { _: "getChatHistory", ... });
processMessages(result.messages);
if (result.messages.length < limit) break;
await sleep(1000); // 1 second between pages — prevents FLOOD_WAIT on large channels
}
```
### Pattern: Event Listener Before Action
When waiting for TDLib async events (upload confirmation, download completion), always
attach the event listener BEFORE starting the operation. If you attach after, fast
operations can complete before the listener exists, causing the promise to hang forever.
```typescript
// CORRECT: listener first, then action
client.on("update", handleUpdate);
const tempMsg = await client.invoke({ _: "sendMessage", ... });
tempMsgId = tempMsg.id; // handler now knows which message to match
// WRONG: action first, then listener — race condition!
const tempMsg = await client.invoke({ _: "sendMessage", ... });
client.on("update", handleUpdate); // may miss updateMessageSendSucceeded
```
### Pattern: Download/Upload Timeouts
Scale timeouts with file size. TDLib downloads/uploads are asynchronous — without a timeout,
a stalled transfer hangs the entire pipeline.
```typescript
const timeoutMs = Math.max(
10 * 60_000, // minimum 10 minutes
(fileSizeMB / 1024) * 10 * 60_000 // 10 minutes per GB
);
```
### Pattern: TDLib Client Lifecycle
Always close TDLib clients in a `finally` block. Unclosed clients leak memory and file
descriptors, and can leave TDLib's internal database locked.
```typescript
const client = await createTdlibClient(account);
try {
// ... use client ...
} finally {
await closeTdlibClient(client);
}
```
## Anti-Patterns
### Never: Concurrent TDLib Sends Without Queue
```typescript
// BAD: fires all sends concurrently — will trigger FLOOD_WAIT immediately
await Promise.all(users.map((u) => sendTextMessage(u.chatId, msg)));
// GOOD: sequential with delay
for (const user of users) {
await withFloodWait(() => sendTextMessage(user.chatId, msg));
await sleep(50);
}
```
### Never: Bare client.invoke() Without Retry
Every `client.invoke()` call can return FLOOD_WAIT at any time. Bare calls will crash
on rate limits instead of retrying.
```typescript
// BAD: crashes on FLOOD_WAIT
await client.invoke({ _: "sendMessage", ... });
// GOOD: retries automatically
await withFloodWait(() => client.invoke({ _: "sendMessage", ... }));
```
### Never: Retry Without Respecting retry_after
```typescript
// BAD: fixed 1-second retry ignores Telegram's wait requirement
catch (err) { await sleep(1000); retry(); }
// GOOD: extract and respect the actual wait time
catch (err) {
const wait = extractFloodWaitSeconds(err);
if (wait !== null) await sleep(wait * 1000 + jitter);
else throw err;
}
```
### Never: Ignore FLOOD_WAIT in Bots
Bot accounts get the same FLOOD_WAIT as user accounts. The bot API's 429 response
blocks ALL operations for the specified duration — not just the chat that triggered it.
A single unhandled flood in a notification loop can make the entire bot unresponsive.
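A hedged sketch of a broadcast loop that treats FLOOD_WAIT as a global pause; `send` and `floodWaitSeconds` stand in for helpers like `sendTextMessage` and `extractFloodWaitSeconds` from the patterns above:

```typescript
// On FLOOD_WAIT the whole loop sleeps, then retries the SAME chat; other
// errors are skipped per-chat so one bad recipient can't stop the broadcast.
export async function broadcastWithGlobalPause(
  chatIds: bigint[],
  send: (chatId: bigint) => Promise<void>,
  floodWaitSeconds: (err: unknown) => number | null
): Promise<void> {
  for (let i = 0; i < chatIds.length; i++) {
    try {
      await send(chatIds[i]);
    } catch (err) {
      const wait = floodWaitSeconds(err);
      if (wait === null) continue; // non-flood failure: skip this chat
      await new Promise((r) => setTimeout(r, wait * 1000));
      i--; // redo the chat that hit the flood after the mandatory pause
    }
  }
}
```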
## Bot vs User Account Differences
| Capability | Bot | User (TDLib) |
|-----------|-----|-------------|
| Read channel history | No (unless admin) | Yes |
| Send to users who haven't started bot | No | N/A |
| Join groups via invite link | No (must be added) | Yes |
| Forward messages (send_copy) | Yes | Yes |
| File upload limit | 50 MB (standard API) | 2 GB |
| File download limit | 20 MB (standard API) | 2 GB |
| Auth method | Bot token | Phone + SMS code |
| Rate limit profile | Same FLOOD_WAIT | Same FLOOD_WAIT |
## TDLib-Specific Notes
### BigInt Chat IDs
TDLib uses numeric chat IDs. Supergroups and channels use negative IDs (e.g., `-1001234567890`).
When passing to `client.invoke()`, convert with `Number(chatId)` — TDLib's JSON interface
doesn't handle BigInt. Be aware that very large IDs may lose precision with `Number()`,
though current Telegram IDs are within safe integer range.
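A defensive conversion sketch (the helper name is illustrative):

```typescript
// Convert a bigint chat ID for TDLib's JSON interface, failing loudly if
// the round-trip would lose precision instead of silently targeting the
// wrong chat.
export function toTdlibChatId(chatId: bigint): number {
  const n = Number(chatId);
  if (!Number.isSafeInteger(n) || BigInt(n) !== chatId) {
    throw new Error(`Chat ID ${chatId} is not exactly representable as a number`);
  }
  return n;
}
```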
### TDLib Options (Runtime Queryable)
These are read-only values you can query at runtime via `getOption`:
- `message_text_length_max` — max message text length
- `message_caption_length_max` — max caption length
- `forwarded_message_count_max` — max forwards per request
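For example, a small wrapper could read one of these options; the minimal `invoke` shape below is an assumption standing in for tdl's client type:

```typescript
// Minimal stand-in for the tdl client's invoke() — an assumption for this
// sketch, not tdl's full type.
export interface OptionClient {
  invoke(query: {
    _: "getOption";
    name: string;
  }): Promise<{ _: string; value?: number | string }>;
}

// Query a read-only integer option such as message_text_length_max.
export async function getIntOption(
  client: OptionClient,
  name: string
): Promise<number | null> {
  const result = await client.invoke({ _: "getOption", name });
  // TDLib answers with optionValueInteger / optionValueEmpty variants.
  return result._ === "optionValueInteger" && result.value !== undefined
    ? Number(result.value)
    : null;
}
```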
### Session State
TDLib persists session state to disk. Each account needs its own state directory.
Running two clients on the same state directory simultaneously will corrupt the database.
Use separate directories per account, and separate volumes in Docker for worker vs bot.
## Docker Considerations
- **prebuilt-tdlib**: The `prebuilt-tdlib` npm package provides platform-specific TDLib
binaries. Container base image must match (e.g., `node:20-bookworm-slim` for Debian x64).
- **Volumes**: Mount persistent volumes for TDLib state directories — losing state forces
full re-authentication.
- **Graceful shutdown**: Wait for active operations to finish before closing DB connections.
TDLib operations in flight will fail if the database pool is closed underneath them.
- **Health checks**: TDLib services don't expose HTTP — use database connectivity as the
health signal instead.


@@ -0,0 +1,23 @@
{
"skill_name": "tdlib-telegram",
"evals": [
{
"id": 1,
"prompt": "Add a new bot command /broadcast that sends a text message to ALL users who have a TelegramLink in the database. The admin triggers it from the web app. Add it to the bot's command handler and create an API endpoint that triggers it.",
"expected_output": "Code that uses a sequential send queue with withFloodWait wrapping each sendTextMessage call, a delay between sends (~50ms), and does NOT use Promise.all or concurrent sends. Should handle errors per-user without stopping the broadcast.",
"files": []
},
{
"id": 2,
"prompt": "The worker keeps crashing with 'FLOOD_WAIT_35' errors when scanning a source channel that has about 10,000 messages. It happens during the getChannelMessages pagination loop. How do I fix this?",
"expected_output": "Diagnosis that the apiDelayMs between pages may be too low or the retry logic isn't wrapping the pagination calls. Should recommend ensuring all getChatHistory/searchChatMessages calls go through withFloodWait/invokeWithTimeout, and that sleep(config.apiDelayMs) exists between pages. Should NOT suggest reducing scan frequency as the primary fix.",
"files": []
},
{
"id": 3,
"prompt": "Write a utility function for the worker that downloads a file from a source Telegram channel message and re-uploads it to the destination channel. It needs to handle files up to 2GB and work reliably in Docker.",
"expected_output": "Code that: (1) wraps download in withFloodWait with size-scaled timeout, (2) attaches upload event listener BEFORE calling sendMessage, (3) uses temp directory with cleanup in finally block, (4) handles the 2GB Telegram limit correctly, (5) uses try/finally for client cleanup if applicable.",
"files": []
}
]
}

.drone.yml Normal file

@@ -0,0 +1,69 @@
---
kind: pipeline
type: docker
name: build-and-deploy
trigger:
branch: [main]
event: [push]
steps:
- name: build-app
image: plugins/docker
settings:
repo: git.samagsteribbe.nl/admin/dragonsstash
registry: git.samagsteribbe.nl
dockerfile: Dockerfile
tags:
- latest
- "${DRONE_COMMIT_SHA:0:8}"
build_args:
- NEXT_PUBLIC_APP_URL=https://dragonsstash.samagsteribbe.nl
username:
from_secret: gitea_username
password:
from_secret: gitea_password
- name: build-worker
image: plugins/docker
depends_on: [clone]
settings:
repo: git.samagsteribbe.nl/admin/dragonsstash-worker
registry: git.samagsteribbe.nl
dockerfile: worker/Dockerfile
tags:
- latest
- "${DRONE_COMMIT_SHA:0:8}"
username:
from_secret: gitea_username
password:
from_secret: gitea_password
- name: build-bot
image: plugins/docker
depends_on: [clone]
settings:
repo: git.samagsteribbe.nl/admin/dragonsstash-bot
registry: git.samagsteribbe.nl
dockerfile: bot/Dockerfile
tags:
- latest
- "${DRONE_COMMIT_SHA:0:8}"
username:
from_secret: gitea_username
password:
from_secret: gitea_password
- name: deploy
image: alpine
depends_on: [build-app, build-worker, build-bot]
environment:
SSH_KEY:
from_secret: ssh_key
commands:
- apk add --no-cache openssh-client
- mkdir -p ~/.ssh
- printf "%s" "$SSH_KEY" > ~/.ssh/id_ed25519
- chmod 600 ~/.ssh/id_ed25519
- ssh-keyscan -t ed25519 192.168.68.68 > ~/.ssh/known_hosts 2>/dev/null
- ssh sam@192.168.68.68 "cd /opt/stacks/DragonsStash && docker compose pull && docker compose up -d"


@@ -13,6 +13,8 @@ AUTH_GITHUB_ID=""
AUTH_GITHUB_SECRET=""
# App
# APP_PORT controls the port the container listens on AND how it is exposed on the host.
# If you change APP_PORT, also update NEXT_PUBLIC_APP_URL to match.
NEXT_PUBLIC_APP_URL="http://localhost:3000"
APP_PORT=3000

.gitignore vendored

@@ -54,3 +54,4 @@ src/generated
# temp files
nul
tmpclaude-*
.worktrees/

CLAUDE.md Normal file

@@ -0,0 +1,102 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
Dragon's Stash is a self-hosted inventory management system for 3D printing filament, SLA resin, miniature paints, and supplies. It includes an integrated Telegram archive worker that scans channels for ZIP/RAR archives, indexes their contents, and a bot that lets users search and receive packages via Telegram.
## Tech Stack
- **App**: Next.js 16 (App Router), TypeScript 5.9 (strict), Tailwind CSS 4, shadcn/ui
- **Database**: PostgreSQL 16+ via Prisma v7.4 with `@prisma/adapter-pg`
- **Auth**: Auth.js v5 (NextAuth) with credentials + optional GitHub OAuth
- **Worker**: TypeScript + TDLib (via `tdl`) for Telegram channel scanning
- **Bot**: TypeScript + TDLib for Telegram bot interface
- **Forms**: React Hook Form + Zod v4
## Commands
### App (root package.json)
```bash
npm run dev # Next.js dev server with hot reload
npm run build # Production build (standalone output)
npm run start # Production server
npm run lint # ESLint (next/core-web-vitals + TypeScript)
```
### Database
```bash
npm run db:generate # Generate Prisma client
npm run db:migrate # Run migrations (dev mode)
npm run db:push # Push schema without migrations
npm run db:seed # Seed database with test data
npm run db:studio # Prisma Studio UI
npx prisma migrate dev --name <description> # Create new migration
```
### Worker & Bot (each in their own directory)
```bash
cd worker && npm run dev # Dev mode with tsx watch
cd worker && npm run build # TypeScript compile to dist/
cd bot && npm run dev # Dev mode with tsx watch
cd bot && npm run build # TypeScript compile to dist/
```
### Dev Environment Setup
```bash
docker compose -f docker-compose.dev.yml up -d # Start PostgreSQL + worker
npm run dev # Run app locally
```
## Architecture
### Three-Service Design
The project is split into three independent services sharing one PostgreSQL database:
1. **App** (root `src/`): Next.js web UI for inventory management and Telegram admin
2. **Worker** (`worker/`): Scans Telegram source channels, processes archives, uploads to destination channel
3. **Bot** (`bot/`): Telegram bot for user search, package delivery, keyword subscriptions
Services communicate asynchronously via `pg_notify` (e.g., on-demand channel fetches, bot send requests).
### App Source Layout (`src/`)
- `app/(auth)/` — Login/register pages (public)
- `app/(app)/` — Protected routes behind auth middleware (dashboard, filaments, resins, paints, supplies, vendors, locations, settings, stls, telegram, usage)
- `app/api/` — API routes (NextAuth, health check, bot endpoints)
- `data/` — Server-side Prisma query functions (`*.queries.ts`), one file per domain model
- `schemas/` — Zod validation schemas, one file per domain model
- `components/ui/` — shadcn/ui primitives
- `components/shared/` — Reusable business components (data-table, status-badge, color-swatch, stat-card, page-header)
- `components/layout/` — Sidebar and header
- `lib/` — Auth config, Prisma singleton, constants, utilities, Telegram query helpers
- `hooks/` — Custom React hooks (use-modal, use-debounce, use-current-user)
- `types/` — Shared TypeScript types
### Key Patterns
- **Server Components by default** — pages are async server components that fetch data directly. Only interactive components use `"use client"`.
- **Server Actions for mutations** — each page directory has an `actions.ts` file with create/update/delete actions.
- **Data queries centralized** — all Prisma reads go through `src/data/*.queries.ts`, not inline in components.
- **Modal-based CRUD** — add/edit forms use dialog modals, not separate pages.
- **TanStack Table** with server-side pagination for all inventory tables.
- **All Prisma PKs use `cuid()`** string IDs.
### Worker Pipeline
1. Authenticate Telegram account via TDLib (SMS code flow, managed via admin UI)
2. Scan source channels for messages since `lastProcessedMessageId`
3. Detect archives (ZIP/RAR), group multipart sets, extract file listings
4. Hash for dedup, match preview images, extract creator from filename
5. Split files >2GB, upload to destination channel, track progress
### ESLint Scope
ESLint covers `src/` only. The `worker/`, `bot/`, `scripts/`, and `prisma/seed.ts` directories are excluded from linting.
## Docker Deployment
- `docker-compose.yml` — Production: app + worker + bot + db
- `docker-compose.dev.yml` — Dev: db + worker only (app runs locally)
- `docker-entrypoint.sh` — Runs migrations, optional seeding, then starts app
- Bot service uses Docker Compose profiles (`bot` or `full`) — not started by default
## Testing
No test framework is configured. Testing is manual.


@@ -17,6 +17,8 @@ COPY --from=deps /app/node_modules ./node_modules
COPY . .
ENV NEXT_TELEMETRY_DISABLED=1
ARG NEXT_PUBLIC_APP_URL=http://localhost:3000
ENV NEXT_PUBLIC_APP_URL=${NEXT_PUBLIC_APP_URL}
RUN npm run build RUN npm run build
# --- Production image --- # --- Production image ---
@@ -54,6 +56,7 @@ RUN chmod +x docker-entrypoint.sh
USER nextjs USER nextjs
# Default port — overridden at runtime by the PORT env var (set via docker-compose APP_PORT)
EXPOSE 3000 EXPOSE 3000
ENV PORT=3000 ENV PORT=3000
ENV HOSTNAME="0.0.0.0" ENV HOSTNAME="0.0.0.0"

View File

@@ -294,5 +294,7 @@ curl http://localhost:3000/api/health
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

View File

@@ -10,7 +10,10 @@ import {
getSubscriptions,
addSubscription,
removeSubscription,
getGroupById,
searchGroups,
} from "./db/queries.js";
import { db } from "./db/client.js";
import { sendTextMessage, sendPhotoMessage } from "./tdlib/client.js";
const log = childLogger("commands");
@@ -78,6 +81,12 @@ export async function handleMessage(msg: IncomingMessage): Promise<void> {
case "/status": case "/status":
await handleStatus(chatId, userId); await handleStatus(chatId, userId);
break; break;
case "/group":
await handleGroup(chatId, args);
break;
case "/sendgroup":
await handleSendGroup(chatId, userId, args);
break;
default:
await sendTextMessage(
chatId,
@@ -117,6 +126,8 @@ async function handleStart(
`/search &lt;query&gt; — Search packages`,
`/latest [n] — Show latest packages`,
`/package &lt;id&gt; — Package details`,
`/group &lt;id or name&gt; — View group info and package list`,
`/sendgroup &lt;id&gt; — Send all packages in a group to yourself`,
`/link &lt;code&gt; — Link your Telegram to your web account`,
`/subscribe &lt;keyword&gt; — Get notified for new packages`,
`/subscriptions — View your subscriptions`,
@@ -136,6 +147,8 @@ async function handleHelp(chatId: bigint): Promise<void> {
`/search &lt;query&gt; — Search by filename or creator`,
`/latest [n] — Show n most recent packages (default: 5)`,
`/package &lt;id&gt; — View package details and file list`,
`/group &lt;id or name&gt; — View group info and package list`,
`/sendgroup &lt;id&gt; — Send all packages in a group to yourself`,
``,
`🔗 <b>Account Linking</b>`,
`/link &lt;code&gt; — Link Telegram to your web account`,
@@ -432,6 +445,168 @@ async function handleStatus(chatId: bigint, userId: bigint): Promise<void> {
}
}
async function handleGroup(chatId: bigint, query: string): Promise<void> {
if (!query) {
await sendTextMessage(
chatId,
"Usage: /group &lt;id or name&gt;\n\nProvide a group ID (starts with 'c') or a name to search.",
"textParseModeHTML"
);
return;
}
const trimmed = query.trim();
// If it looks like a cuid (starts with 'c', ~25 chars), look up by ID directly
if (/^c[a-z0-9]{20,}$/i.test(trimmed)) {
const group = await getGroupById(trimmed);
if (!group) {
await sendTextMessage(chatId, "Group not found.", "textParseModeHTML");
return;
}
const packageLines = group.packages.slice(0, 20).map((pkg, i) => {
const size = formatSize(pkg.fileSize);
return ` ${i + 1}. <b>${escapeHtml(pkg.fileName)}</b> (${size}, ${pkg.fileCount} files) — <code>${pkg.id}</code>`;
});
const more = group.packages.length > 20
? `\n ... and ${group.packages.length - 20} more`
: "";
const response = [
`📦 <b>Group: ${escapeHtml(group.name)}</b>`,
``,
`Packages: ${group.packages.length}`,
`ID: <code>${group.id}</code>`,
``,
`<b>Contents:</b>`,
...packageLines,
more,
``,
`Use /sendgroup ${group.id} to receive all packages.`,
]
.filter((l) => l !== "")
.join("\n");
await sendTextMessage(chatId, response, "textParseModeHTML");
return;
}
// Otherwise search by name
const groups = await searchGroups(trimmed, 5);
if (groups.length === 0) {
await sendTextMessage(
chatId,
`No groups found matching "<b>${escapeHtml(trimmed)}</b>".`,
"textParseModeHTML"
);
return;
}
const lines = groups.map(
(g, i) =>
`${i + 1}. <b>${escapeHtml(g.name)}</b> — ${g._count.packages} package(s)\n ID: <code>${g.id}</code>`
);
const response = [
`🔍 <b>Groups matching "${escapeHtml(trimmed)}":</b>`,
``,
...lines,
``,
`Use /group &lt;id&gt; for full details.`,
].join("\n");
await sendTextMessage(chatId, response, "textParseModeHTML");
}
async function handleSendGroup(
chatId: bigint,
userId: bigint,
args: string
): Promise<void> {
if (!args) {
await sendTextMessage(
chatId,
"Usage: /sendgroup &lt;group-id&gt;",
"textParseModeHTML"
);
return;
}
const groupId = args.trim();
const group = await getGroupById(groupId);
if (!group) {
await sendTextMessage(chatId, "Group not found.", "textParseModeHTML");
return;
}
// Require account linking
const link = await findLinkByTelegramUserId(userId);
if (!link) {
await sendTextMessage(
chatId,
"You must link your account before receiving packages.\nUse /link &lt;code&gt; to connect.",
"textParseModeHTML"
);
return;
}
// Only send packages that have been uploaded to the destination channel
const sendable = group.packages.filter(
(pkg) => pkg.destChannelId && pkg.destMessageId
);
if (sendable.length === 0) {
await sendTextMessage(
chatId,
`No packages in group "<b>${escapeHtml(group.name)}</b>" are ready to send yet.`,
"textParseModeHTML"
);
return;
}
// Create a BotSendRequest for each sendable package
const requests = await Promise.all(
sendable.map((pkg) =>
db.botSendRequest.create({
data: {
packageId: pkg.id,
telegramLinkId: link.id,
requestedByUserId: link.userId,
status: "PENDING",
},
})
)
);
// Fire pg_notify for each request so the send listener picks them up
for (const req of requests) {
await db.$queryRawUnsafe(
`SELECT pg_notify('bot_send', $1)`,
req.id
).catch(() => {
// Best-effort — the bot also processes PENDING requests on its send queue
});
}
await sendTextMessage(
chatId,
[
`✅ <b>Queued ${requests.length} package(s) from "${escapeHtml(group.name)}"</b>`,
``,
`You'll receive each archive shortly. Use /package &lt;id&gt; to check individual packages.`,
].join("\n"),
"textParseModeHTML"
);
log.info(
{ groupId, packageCount: requests.length, userId: userId.toString() },
"Group send queued"
);
}
function escapeHtml(text: string): string {
return text
.replace(/&/g, "&amp;")

View File

@@ -21,7 +21,16 @@ export async function findLinkByUserId(userId: string) {
export async function validateLinkCode(code: string): Promise<string | null> {
const key = `link_code:${code}`;
const setting = await db.globalSetting.findUnique({ where: { key } });
if (!setting) return null;
try {
const parsed = JSON.parse(setting.value);
if (parsed.expiresAt && new Date(parsed.expiresAt) < new Date()) return null;
return parsed.userId ?? null;
} catch {
// Legacy format: value is the userId directly
return setting.value;
}
}
export async function deleteLinkCode(code: string): Promise<void> { export async function deleteLinkCode(code: string): Promise<void> {
@@ -44,7 +53,52 @@ export async function createTelegramLink(
// ── Package search ──
export async function searchPackages(query: string, limit = 10) {
// Try full-text search first
if (query.length >= 3) {
const tsQuery = query
.trim()
.split(/\s+/)
.filter((w) => w.length >= 2)
.map((w) => w.replace(/[^a-zA-Z0-9]/g, ""))
.filter(Boolean)
.join(" & ");
if (tsQuery) {
try {
const ftsResults = await db.$queryRawUnsafe<{ id: string }[]>(
`SELECT id FROM packages
WHERE "searchVector" @@ to_tsquery('english', $1)
ORDER BY ts_rank("searchVector", to_tsquery('english', $1)) DESC
LIMIT $2`,
tsQuery,
limit
);
if (ftsResults.length > 0) {
return db.package.findMany({
where: { id: { in: ftsResults.map((r) => r.id) } },
orderBy: { indexedAt: "desc" },
select: {
id: true,
fileName: true,
fileSize: true,
archiveType: true,
fileCount: true,
creator: true,
indexedAt: true,
destChannelId: true,
destMessageId: true,
},
});
}
} catch {
// FTS failed — fall back to ILIKE
}
}
}
// Fallback: ILIKE search
return db.package.findMany({
where: {
OR: [
{ fileName: { contains: query, mode: "insensitive" } },
@@ -65,7 +119,44 @@ export async function searchPackages(query: string, limit = 10) {
destMessageId: true,
},
});
}
// ── Group queries ──
export async function getGroupById(groupId: string) {
return db.packageGroup.findUnique({
where: { id: groupId },
include: {
packages: {
orderBy: { indexedAt: "desc" },
select: {
id: true,
fileName: true,
fileSize: true,
archiveType: true,
fileCount: true,
creator: true,
destChannelId: true,
destMessageId: true,
},
},
},
});
}
export async function searchGroups(query: string, limit = 5) {
return db.packageGroup.findMany({
where: {
name: { contains: query, mode: "insensitive" },
},
orderBy: { createdAt: "desc" },
take: limit,
select: {
id: true,
name: true,
_count: { select: { packages: true } },
},
});
}
export async function getLatestPackages(limit = 5) {
@@ -106,9 +197,18 @@ export async function getPendingSendRequest(requestId: string) {
select: {
id: true,
fileName: true,
fileSize: true,
fileCount: true,
creator: true,
tags: true,
archiveType: true,
destChannelId: true,
destMessageId: true,
destMessageIds: true,
isMultipart: true,
partCount: true,
previewData: true,
sourceChannel: { select: { title: true, telegramId: true } },
},
},
telegramLink: true,

View File

@@ -1,7 +1,7 @@
import { config } from "./util/config.js"; import { config } from "./util/config.js";
import { logger } from "./util/logger.js"; import { logger } from "./util/logger.js";
import { db, pool } from "./db/client.js"; import { db, pool } from "./db/client.js";
import { createBotClient, closeBotClient, onBotUpdate } from "./tdlib/client.js"; import { createBotClient, closeBotClient, onBotUpdate, getUser } from "./tdlib/client.js";
import { startSendListener, stopSendListener } from "./send-listener.js"; import { startSendListener, stopSendListener } from "./send-listener.js";
import { handleMessage } from "./commands.js"; import { handleMessage } from "./commands.js";
import { mkdir } from "fs/promises"; import { mkdir } from "fs/promises";
@@ -49,14 +49,27 @@ async function main(): Promise<void> {
const userId = senderId.user_id as number;
if (text && userId) {
// Get user info for display name (async but fire-and-forget for perf)
(async () => {
let firstName = "User";
let lastName: string | undefined;
let username: string | undefined;
try {
const userInfo = await getUser(userId);
firstName = userInfo.firstName;
lastName = userInfo.lastName;
username = userInfo.username;
} catch {
// Fall back to defaults if getUser fails
}
await handleMessage({
chatId: BigInt(chatId),
userId: BigInt(userId),
text,
firstName,
lastName,
username,
});
})().catch((err) => {
log.error({ err, chatId, userId }, "Failed to handle message");
});
} }

View File

@@ -7,18 +7,32 @@ import {
findMatchingSubscriptions,
getGlobalDestinationChannel,
} from "./db/queries.js";
import { copyMessageToUser, copyMultipleMessagesToUser, sendTextMessage, sendPhotoMessage } from "./tdlib/client.js";
import { sleep } from "./util/flood-wait.js";
const log = childLogger("send-listener");
let pgClient: pg.PoolClient | null = null;
let stopped = false;
/** Delay (ms) before attempting to reconnect after a connection loss. */
const RECONNECT_DELAY_MS = 5_000;
/**
* Start listening for pg_notify signals:
* - `bot_send` — payload = requestId → send a package to a user
* - `new_package` — payload = JSON { packageId, fileName, creator } → notify subscribers
*
* If the underlying connection is lost, the listener automatically reconnects
* so that pg_notify signals are never silently dropped.
*/
export async function startSendListener(): Promise<void> {
stopped = false;
await connectListener();
}
async function connectListener(): Promise<void> {
try {
pgClient = await pool.connect();
await pgClient.query("LISTEN bot_send");
await pgClient.query("LISTEN new_package");
@@ -31,10 +45,46 @@ export async function startSendListener(): Promise<void> {
}
});
// Reconnect automatically when the connection ends unexpectedly
pgClient.on("end", () => {
if (!stopped) {
log.warn("Send listener connection lost — reconnecting");
pgClient = null;
scheduleReconnect();
}
});
pgClient.on("error", (err) => {
log.error({ err }, "Send listener connection error");
if (!stopped && pgClient) {
try {
pgClient.release(true);
} catch (releaseErr) {
log.debug({ err: releaseErr }, "Failed to release pg client after error");
}
pgClient = null;
scheduleReconnect();
}
});
log.info("Send listener started (bot_send, new_package)"); log.info("Send listener started (bot_send, new_package)");
} catch (err) {
log.error({ err }, "Failed to start send listener — retrying");
scheduleReconnect();
}
}
function scheduleReconnect(): void {
if (stopped) return;
setTimeout(() => {
if (!stopped) {
connectListener();
}
}, RECONNECT_DELAY_MS);
}
export function stopSendListener(): void {
stopped = true;
if (pgClient) {
pgClient.release();
pgClient = null;
@@ -84,18 +134,45 @@ async function processSendRequest(requestId: string): Promise<void> {
throw new Error("No global destination channel configured"); throw new Error("No global destination channel configured");
} }
// Send preview if available // Send preview with rich caption if available
if (pkg.previewData) { if (pkg.previewData) {
const caption = `📦 *${pkg.fileName}*\n\nSent from Dragon's Stash`; const lines: string[] = [];
lines.push(`📦 *${escapeMarkdown(pkg.fileName)}*`);
if (pkg.creator) lines.push(`👤 ${escapeMarkdown(pkg.creator)}`);
if (pkg.fileCount > 0) lines.push(`📁 ${pkg.fileCount} files`);
if (pkg.tags && pkg.tags.length > 0) {
lines.push(`🏷️ ${pkg.tags.map((t: string) => escapeMarkdown(t)).join(", ")}`);
}
if (pkg.sourceChannel) {
lines.push(`📡 Source: ${escapeMarkdown(pkg.sourceChannel.title)}`);
}
lines.push("");
lines.push("_Sent from Dragon's Stash_");
const caption = lines.join("\n");
await sendPhotoMessage(targetUserId, Buffer.from(pkg.previewData), caption);
}
// Forward the actual archive file(s) from destination channel
const messageIds = pkg.destMessageIds as bigint[] | undefined;
if (messageIds && messageIds.length > 1) {
log.info(
{ requestId, parts: messageIds.length },
"Sending multi-part archive"
);
await copyMultipleMessagesToUser(
destChannel.telegramId,
messageIds,
targetUserId
);
} else {
// Single part or legacy (no destMessageIds populated)
await copyMessageToUser(
destChannel.telegramId,
pkg.destMessageId,
targetUserId
);
}
await updateSendRequest(requestId, "SENT"); await updateSendRequest(requestId, "SENT");
log.info({ requestId }, "Send request completed successfully"); log.info({ requestId }, "Send request completed successfully");
@@ -114,6 +191,7 @@ async function handleNewPackage(payload: string): Promise<void> {
packageId: string;
fileName: string;
creator: string | null;
tags?: string[];
};
const subs = await findMatchingSubscriptions(data.fileName, data.creator);
@@ -133,12 +211,15 @@ async function handleNewPackage(payload: string): Promise<void> {
userSubs.set(key, patterns);
}
const creator = data.creator ? ` by ${escapeHtml(data.creator)}` : "";
for (const [telegramUserId, patterns] of userSubs) {
const msg = [
`🔔 <b>New package matching your subscriptions:</b>`,
``,
`📦 <b>${escapeHtml(data.fileName)}</b>${creator}`,
...(data.tags && data.tags.length > 0
? [`🏷️ ${data.tags.map((t: string) => escapeHtml(t)).join(", ")}`]
: []),
``,
`Matched: ${patterns.map((p) => `"${escapeHtml(p)}"`).join(", ")}`,
``,
@@ -151,6 +232,9 @@ async function handleNewPackage(payload: string): Promise<void> {
"Failed to notify subscriber" "Failed to notify subscriber"
); );
}); });
// Rate limit delay between notifications (~20 msgs/sec, under 30 msgs/sec bot limit)
await sleep(50);
}
} catch (err) {
log.error({ err, payload }, "Failed to process new_package notification");
@@ -160,3 +244,7 @@ async function handleNewPackage(payload: string): Promise<void> {
function escapeHtml(text: string): string {
return text.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}
function escapeMarkdown(text: string): string {
return text.replace(/([_*[\]()~`>#+\-=|{}.!\\])/g, "\\$1");
}

View File

@@ -2,6 +2,7 @@ import tdl from "tdl";
import { getTdjson } from "prebuilt-tdlib";
import { config } from "../util/config.js";
import { childLogger } from "../util/logger.js";
import { withFloodWait } from "../util/flood-wait.js";
const log = childLogger("tdlib-bot");
@@ -33,7 +34,7 @@ export async function createBotClient(): Promise<tdl.Client> {
await client.login(() => ({
type: "bot",
getToken: () => Promise.resolve(config.botToken),
}));
log.info("Bot client authenticated successfully");
@@ -53,8 +54,14 @@ export async function closeBotClient(): Promise<void> {
}
/**
* Send a document from a channel to a user's DM.
*
* Instead of forwardMessages (unreliable for bot accounts with send_copy),
* we fetch the original message to get the file's remote ID, then send a
* new message with inputFileRemote. This is the documented reliable approach
* for bots — the file is already on Telegram's servers so no re-upload is needed.
*
* Falls back to a plain forward (without send_copy) if getMessage fails.
*/
export async function copyMessageToUser(
fromChatId: bigint,
@@ -62,18 +69,156 @@ export async function copyMessageToUser(
toUserId: bigint
): Promise<void> {
if (!client) throw new Error("Bot client not initialized");
const c = client;
log.info(
{ fromChatId: fromChatId.toString(), messageId: messageId.toString(), toUserId: toUserId.toString() },
"Sending file to user"
);
// Step 1: Get the original message to extract the file's remote ID
// eslint-disable-next-line @typescript-eslint/no-explicit-any
let message: any;
try {
message = await withFloodWait(
() => c.invoke({
_: "getMessage",
chat_id: Number(fromChatId),
message_id: Number(messageId),
}),
"getMessage"
);
} catch (err) {
log.error({ err, fromChatId: fromChatId.toString(), messageId: messageId.toString() }, "getMessage failed");
throw new Error(`Cannot get source message: ${err instanceof Error ? err.message : String(err)}`);
}
// Step 2: Extract the document's remote file ID
const doc = message?.content?.document;
if (!doc?.document?.remote?.id) {
log.error(
{ messageContent: message?.content?._, messageId: messageId.toString() },
"Source message has no document with remote file ID"
);
throw new Error(`Source message is not a document or has no remote file ID (type: ${message?.content?._})`);
}
const remoteFileId: string = doc.document.remote.id;
const fileName: string = doc.file_name ?? "file";
const caption = message.content?.caption;
log.info(
{ remoteFileId: remoteFileId.slice(0, 20) + "...", fileName, toUserId: toUserId.toString() },
"Sending document via inputFileRemote"
);
// Step 3: Send the document to the user using the remote file ID
// This doesn't require downloading — Telegram serves the existing file.
await waitForSendConfirmation(c, Number(toUserId), {
_: "inputMessageDocument",
document: { _: "inputFileRemote", id: remoteFileId },
caption: caption ?? undefined,
}, fileName);
}
/**
* Send multiple document messages from a channel to a user's DM.
* Used for multi-part archives where each part is a separate Telegram message.
* Sends parts sequentially with a small delay to avoid rate limits.
*/
export async function copyMultipleMessagesToUser(
fromChatId: bigint,
messageIds: bigint[],
toUserId: bigint
): Promise<void> {
for (let i = 0; i < messageIds.length; i++) {
await copyMessageToUser(fromChatId, messageIds[i], toUserId);
// Small delay between parts to avoid rate limits
if (i < messageIds.length - 1) {
await new Promise((resolve) => setTimeout(resolve, 1000));
}
}
}
/**
* Send a message and wait for Telegram to confirm delivery.
* Returns when updateMessageSendSucceeded fires for the temp message.
* Throws if updateMessageSendFailed fires or timeout is reached.
*/
async function waitForSendConfirmation(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
c: any,
chatId: number,
// eslint-disable-next-line @typescript-eslint/no-explicit-any
inputMessageContent: any,
label: string
): Promise<void> {
return new Promise<void>((resolve, reject) => {
let settled = false;
let tempMsgId: number | null = null;
const TIMEOUT_MS = 5 * 60_000;
const timer = setTimeout(() => {
if (!settled) {
settled = true;
cleanup();
reject(new Error(`Send timed out after 5min for ${label}`));
}
}, TIMEOUT_MS);
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const handleUpdate = (update: any) => {
if (update?._ === "updateMessageSendSucceeded") {
if (tempMsgId !== null && update.old_message_id === tempMsgId) {
if (!settled) {
settled = true;
cleanup();
log.info({ tempMsgId, finalMsgId: update.message?.id, label }, "Send confirmed");
resolve();
}
}
}
if (update?._ === "updateMessageSendFailed") {
if (tempMsgId !== null && update.old_message_id === tempMsgId) {
if (!settled) {
settled = true;
cleanup();
const errorMsg = update.error?.message ?? "Unknown";
const errorCode = update.error?.code ?? 0;
log.error({ tempMsgId, errorCode, errorMsg, label }, "Send failed");
reject(new Error(`Send failed for ${label}: [${errorCode}] ${errorMsg}`));
}
}
}
};
const cleanup = () => {
clearTimeout(timer);
c.off("update", handleUpdate);
};
// Attach BEFORE sending to avoid race
c.on("update", handleUpdate);
withFloodWait(
() => c.invoke({
_: "sendMessage",
chat_id: chatId,
input_message_content: inputMessageContent,
}),
"sendMessage:copyToUser"
)
.then((result) => {
tempMsgId = (result as { id: number }).id;
log.debug({ tempMsgId, label }, "Message queued, waiting for confirmation");
})
.catch((err: Error) => {
if (!settled) {
settled = true;
cleanup();
reject(err);
}
});
});
}
@@ -86,22 +231,31 @@ export async function sendTextMessage(
parseMode: "textParseModeMarkdown" | "textParseModeHTML" = "textParseModeMarkdown"
): Promise<void> {
if (!client) throw new Error("Bot client not initialized");
const c = client;
// Parse the text first
const parsed = await withFloodWait(
() =>
c.invoke({
_: "parseTextEntities", _: "parseTextEntities",
text, text,
parse_mode: { _: parseMode, version: parseMode === "textParseModeMarkdown" ? 2 : 0 }, parse_mode: { _: parseMode, version: parseMode === "textParseModeMarkdown" ? 2 : 0 },
}); }),
"parseTextEntities"
);
await withFloodWait(
() =>
c.invoke({
_: "sendMessage", _: "sendMessage",
chat_id: Number(chatId), chat_id: Number(chatId),
input_message_content: { input_message_content: {
_: "inputMessageText", _: "inputMessageText",
text: parsed, text: parsed,
}, },
}); }),
"sendTextMessage"
);
}
/**
@@ -113,6 +267,7 @@ export async function sendPhotoMessage(
caption: string
): Promise<void> {
if (!client) throw new Error("Bot client not initialized");
const c = client;
// Write the photo to a temp file
const { writeFile, unlink } = await import("fs/promises");
@@ -122,13 +277,19 @@ export async function sendPhotoMessage(
try {
await writeFile(tempPath, photoData);
const parsedCaption = await withFloodWait(
() =>
c.invoke({
_: "parseTextEntities", _: "parseTextEntities",
text: caption, text: caption,
parse_mode: { _: "textParseModeMarkdown", version: 2 }, parse_mode: { _: "textParseModeMarkdown", version: 2 },
}); }),
"parsePhotoCaption"
);
await withFloodWait(
() =>
c.invoke({
_: "sendMessage", _: "sendMessage",
chat_id: Number(chatId), chat_id: Number(chatId),
input_message_content: { input_message_content: {
@@ -138,12 +299,41 @@ export async function sendPhotoMessage(
width: 0,
height: 0,
},
}),
"sendPhotoMessage"
);
} finally {
await unlink(tempPath).catch(() => {});
}
}
/**
* Get basic info about a Telegram user (name, username).
*/
export async function getUser(
userId: number
): Promise<{ firstName: string; lastName?: string; username?: string }> {
if (!client) throw new Error("Bot client not initialized");
const c = client;
const user = (await withFloodWait(
() =>
c.invoke({
_: "getUser",
user_id: userId,
}),
"getUser"
)) as {
first_name?: string;
last_name?: string;
usernames?: { editable_username?: string };
};
return {
firstName: user.first_name ?? "User",
lastName: user.last_name || undefined,
username: user.usernames?.editable_username || undefined,
};
}
/**
* Get updates from TDLib. The bot listens for new messages this way.
*/

View File

@@ -0,0 +1,60 @@
import { childLogger } from "./logger.js";
const log = childLogger("flood-wait");
function sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms));
}
/**
* Extract the mandatory wait duration (in seconds) from a Telegram
* FLOOD_WAIT error. Returns null when the error is not rate-limit related.
*/
export function extractFloodWaitSeconds(err: unknown): number | null {
const message = err instanceof Error ? err.message : String(err);
// Pattern 1: FLOOD_WAIT_30
const flood = message.match(/FLOOD_WAIT_(\d+)/i);
if (flood) return parseInt(flood[1], 10);
// Pattern 2: "retry after 30"
const retry = message.match(/retry after (\d+)/i);
if (retry) return parseInt(retry[1], 10);
// Pattern 3: HTTP 429 without explicit seconds
// eslint-disable-next-line @typescript-eslint/no-explicit-any
if (String((err as any)?.code) === "429") return 30;
return null;
}
/**
* Wrap any async Telegram operation with automatic FLOOD_WAIT retry.
* Adds random jitter (1-5s) to prevent thundering-herd retries.
*
* Non-rate-limit errors are re-thrown immediately (fail-fast).
*/
export async function withFloodWait<T>(
fn: () => Promise<T>,
context?: string,
maxRetries = 5
): Promise<T> {
for (let attempt = 0; attempt <= maxRetries; attempt++) {
try {
return await fn();
} catch (err) {
const wait = extractFloodWaitSeconds(err);
if (wait === null || attempt >= maxRetries) throw err;
const jitter = 1000 + Math.random() * 4000;
log.warn(
{ context, wait, attempt: attempt + 1, maxRetries, jitter: Math.round(jitter) },
"FLOOD_WAIT received — backing off"
);
await sleep(wait * 1000 + jitter);
}
}
throw new Error("Unreachable");
}
export { sleep };

View File

@@ -5,7 +5,7 @@ services:
dockerfile: Dockerfile
pull_policy: never
ports:
- "${APP_PORT:-3000}:${APP_PORT:-3000}"
environment:
- DATABASE_URL=postgresql://${POSTGRES_USER:-dragons}:${POSTGRES_PASSWORD:-stash}@db:5432/${POSTGRES_DB:-dragonsstash}
- AUTH_SECRET=${AUTH_SECRET:?Set AUTH_SECRET in .env}
@@ -17,15 +17,19 @@ services:
- BOT_TOKEN=${BOT_TOKEN:-}
- BOT_USERNAME=${BOT_USERNAME:-}
- LOG_LEVEL=${LOG_LEVEL:-info}
- WORKER_INTERVAL_MINUTES=${WORKER_INTERVAL_MINUTES:-60}
- PORT=${APP_PORT:-3000}
depends_on:
db:
condition: service_healthy
healthcheck:
test: ["CMD-SHELL", "wget -q --spider http://localhost:$$PORT/api/health || exit 1"]
interval: 30s
timeout: 5s
retries: 3
start_period: 60s
volumes:
- manual_uploads:/data/uploads
restart: unless-stopped restart: unless-stopped
deploy: deploy:
resources: resources:
@@ -52,6 +56,7 @@ services:
volumes: volumes:
- tdlib_state:/data/tdlib - tdlib_state:/data/tdlib
- tmp_zips:/tmp/zips - tmp_zips:/tmp/zips
- manual_uploads:/data/uploads
depends_on: depends_on:
db: db:
condition: service_healthy condition: service_healthy
@@ -119,6 +124,7 @@ volumes:
tdlib_state: tdlib_state:
tdlib_bot_state: tdlib_bot_state:
tmp_zips: tmp_zips:
manual_uploads:
networks: networks:
frontend: frontend:



@@ -0,0 +1,964 @@
# Multi-Part Send Fix & Kickstarter Package Linking
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Fix multi-part package forwarding so all archive parts reach the user, and add UI to link STL packages to kickstarters with "send all" capability.
**Architecture:** Two independent subsystems. (A) Store all destination message IDs when the worker uploads multi-part archives, then have the bot forward every part. (B) Add a package-linker dialog in the kickstarter UI using the existing `linkPackages` action, plus a "send all" action that queues every linked package.
**Tech Stack:** Prisma (schema + migration), TypeScript worker/bot services, Next.js App Router (server actions + React client components), shadcn/ui, TanStack Table.
---
## File Map
### Subsystem A — Multi-Part Send Fix
| Action | File | Responsibility |
|--------|------|----------------|
| Modify | `prisma/schema.prisma` | Add `destMessageIds BigInt[]` to Package |
| Create | `prisma/migrations/<ts>_add_dest_message_ids/migration.sql` | Migration SQL |
| Modify | `worker/src/upload/channel.ts` | Return all message IDs from `uploadToChannel` |
| Modify | `worker/src/db/queries.ts` | Add `destMessageIds` to `CreatePackageInput` and `createPackageWithFiles` |
| Modify | `worker/src/worker.ts` | Pass all message IDs when creating package |
| Modify | `bot/src/db/queries.ts` | Include `destMessageIds` in `getPendingSendRequest` |
| Modify | `bot/src/send-listener.ts` | Forward all parts, not just the first |
### Subsystem B — Kickstarter Package Linking UI
| Action | File | Responsibility |
|--------|------|----------------|
| Create | `src/app/(app)/kickstarters/_components/package-linker-dialog.tsx` | Dialog with package search + selection for linking |
| Modify | `src/app/(app)/kickstarters/_components/kickstarter-columns.tsx` | Add "Link Packages" and "Send All" actions to row menu |
| Modify | `src/app/(app)/kickstarters/_components/kickstarter-table.tsx` | Wire up new dialogs + state |
| Modify | `src/app/(app)/kickstarters/actions.ts` | Add `sendAllKickstarterPackages` action |
| Modify | `src/data/kickstarter.queries.ts` | Add query to search packages for linking |
---
## Task 1: Add `destMessageIds` to Prisma Schema + Migration
**Files:**
- Modify: `prisma/schema.prisma:470-471`
- Create: migration SQL
- [ ] **Step 1: Add field to schema**
In `prisma/schema.prisma`, add `destMessageIds` after `destMessageId`:
```prisma
destMessageId BigInt?
destMessageIds BigInt[] @default([])
```
- [ ] **Step 2: Create migration SQL manually**
Create the migration directory and SQL file. The migration adds the column with a default and backfills existing rows by copying `destMessageId` into the array where it's non-null:
```sql
-- AlterTable
ALTER TABLE "packages" ADD COLUMN "destMessageIds" BIGINT[] DEFAULT ARRAY[]::BIGINT[];
-- Backfill: copy existing destMessageId into the array
UPDATE "packages"
SET "destMessageIds" = ARRAY["destMessageId"]
WHERE "destMessageId" IS NOT NULL;
```
- [ ] **Step 3: Apply migration to database**
```bash
docker exec -i dragonsstash-db psql -U dragons -d dragonsstash < migration.sql
```
- [ ] **Step 4: Regenerate Prisma client**
Use the app container (which has node/prisma) to regenerate:
```bash
docker exec dragonsstash npx prisma generate
```
Or, if running locally with node: `npx prisma generate`
- [ ] **Step 5: Commit**
```bash
git add prisma/schema.prisma prisma/migrations/
git commit -m "feat: add destMessageIds field to Package for multi-part forwarding"
```
---
## Task 2: Worker — Return All Message IDs from Upload
**Files:**
- Modify: `worker/src/upload/channel.ts:10-12,25-74`
- [ ] **Step 1: Update UploadResult interface**
In `worker/src/upload/channel.ts`, change the interface to include all IDs:
```typescript
export interface UploadResult {
messageId: bigint;
messageIds: bigint[];
}
```
- [ ] **Step 2: Collect all message IDs in uploadToChannel**
Replace the upload loop to track all message IDs:
```typescript
export async function uploadToChannel(
client: Client,
chatId: bigint,
filePaths: string[],
caption?: string
): Promise<UploadResult> {
const allMessageIds: bigint[] = [];
for (let i = 0; i < filePaths.length; i++) {
const filePath = filePaths[i];
const fileCaption = i === 0 && caption ? caption : undefined;
const fileName = path.basename(filePath);
let fileSizeMB = 0;
try {
const s = await stat(filePath);
fileSizeMB = Math.round(s.size / (1024 * 1024));
} catch {
// Non-critical
}
log.info(
{ chatId: Number(chatId), fileName, sizeMB: fileSizeMB, part: i + 1, total: filePaths.length },
"Uploading file to channel"
);
const serverMsgId = await sendWithRetry(client, chatId, filePath, fileCaption, fileName, fileSizeMB);
allMessageIds.push(serverMsgId);
// Rate limit delay between uploads
if (i < filePaths.length - 1) {
await sleep(config.apiDelayMs);
}
}
if (allMessageIds.length === 0) {
throw new Error("Upload failed: no messages sent");
}
log.info(
{ chatId: Number(chatId), messageId: Number(allMessageIds[0]), files: filePaths.length },
"All uploads confirmed by Telegram"
);
return { messageId: allMessageIds[0], messageIds: allMessageIds };
}
```
- [ ] **Step 3: Commit**
```bash
git add worker/src/upload/channel.ts
git commit -m "feat: return all message IDs from uploadToChannel for multi-part"
```
---
## Task 3: Worker — Store All Message IDs in Database
**Files:**
- Modify: `worker/src/db/queries.ts:104-155`
- Modify: `worker/src/worker.ts:1056-1086`
- [ ] **Step 1: Add destMessageIds to CreatePackageInput**
In `worker/src/db/queries.ts`, add the field to the interface:
```typescript
export interface CreatePackageInput {
// ... existing fields ...
destMessageId?: bigint;
destMessageIds?: bigint[];
// ... rest ...
}
```
- [ ] **Step 2: Store destMessageIds in createPackageWithFiles**
In the `db.package.create` call inside `createPackageWithFiles`, add:
```typescript
destMessageIds: input.destMessageIds ?? (input.destMessageId ? [input.destMessageId] : []),
```
- [ ] **Step 3: Pass messageIds from worker pipeline**
In `worker/src/worker.ts`, the upload section (around line 1068-1085) currently does:
```typescript
destResult = await uploadToChannel(client, destChannelTelegramId, uploadPaths);
```
After this, when calling `createPackageWithFiles`, add `destMessageIds`:
```typescript
const pkg = await createPackageWithFiles({
// ... existing fields ...
destMessageId: destResult.messageId,
destMessageIds: destResult.messageIds,
// ... rest ...
});
```
- [ ] **Step 4: Commit**
```bash
git add worker/src/db/queries.ts worker/src/worker.ts
git commit -m "feat: store all multi-part message IDs in package record"
```
---
## Task 4: Bot — Forward All Parts
**Files:**
- Modify: `bot/src/db/queries.ts:110-132`
- Modify: `bot/src/send-listener.ts:105-169`
- Modify: `bot/src/tdlib/client.ts:66-122`
- [ ] **Step 1: Include destMessageIds in bot query**
In `bot/src/db/queries.ts`, add `destMessageIds` to the `getPendingSendRequest` select:
```typescript
package: {
select: {
id: true,
fileName: true,
fileSize: true,
fileCount: true,
creator: true,
tags: true,
archiveType: true,
destChannelId: true,
destMessageId: true,
destMessageIds: true, // <-- ADD THIS
isMultipart: true, // <-- ADD THIS (for logging)
partCount: true, // <-- ADD THIS (for logging)
previewData: true,
sourceChannel: { select: { title: true, telegramId: true } },
},
},
```
- [ ] **Step 2: Add copyMultipleMessagesToUser helper**
In `bot/src/tdlib/client.ts`, add a new export after `copyMessageToUser`:
```typescript
/**
* Send multiple document messages from a channel to a user's DM.
* Used for multi-part archives where each part is a separate Telegram message.
* Sends parts sequentially with a small delay to avoid rate limits.
*/
export async function copyMultipleMessagesToUser(
fromChatId: bigint,
messageIds: bigint[],
toUserId: bigint
): Promise<void> {
for (let i = 0; i < messageIds.length; i++) {
await copyMessageToUser(fromChatId, messageIds[i], toUserId);
// Small delay between parts to avoid rate limits
if (i < messageIds.length - 1) {
await new Promise((resolve) => setTimeout(resolve, 1000));
}
}
}
```
- [ ] **Step 3: Update processSendRequest to forward all parts**
In `bot/src/send-listener.ts`, update the import to include the new function:
```typescript
import { copyMessageToUser, copyMultipleMessagesToUser, sendTextMessage, sendPhotoMessage } from "./tdlib/client.js";
```
Then replace the single `copyMessageToUser` call (around line 157) with logic that forwards all parts:
```typescript
// Forward the actual archive file(s) from destination channel
const messageIds = pkg.destMessageIds as bigint[] | undefined;
if (messageIds && messageIds.length > 1) {
log.info(
{ requestId, parts: messageIds.length },
"Sending multi-part archive"
);
await copyMultipleMessagesToUser(
destChannel.telegramId,
messageIds,
targetUserId
);
} else {
// Single part or legacy (no destMessageIds populated)
await copyMessageToUser(
destChannel.telegramId,
pkg.destMessageId,
targetUserId
);
}
```
- [ ] **Step 4: Commit**
```bash
git add bot/src/db/queries.ts bot/src/send-listener.ts bot/src/tdlib/client.ts
git commit -m "feat: forward all parts of multi-part archives via bot"
```
---
## Task 5: Rebuild & Deploy Worker + Bot
- [ ] **Step 1: Rebuild worker image**
```bash
docker compose -f docker-compose.dev.yml build worker
docker tag dragonsstash-worker:latest git.samagsteribbe.nl/admin/dragonsstash-worker:latest
docker compose -p dragonsstash -f /opt/stacks/DragonsStash/docker-compose.yml up -d worker
```
- [ ] **Step 2: Rebuild bot image**
```bash
docker compose -f docker-compose.dev.yml build bot
docker tag dragonsstash-bot:latest git.samagsteribbe.nl/admin/dragonsstash-bot:latest
docker compose -p dragonsstash -f /opt/stacks/DragonsStash/docker-compose.yml up -d bot
```
- [ ] **Step 3: Verify bot startup**
```bash
docker logs dragonsstash-bot --tail=20
```
Expected: Bot starts cleanly, "Send listener started" message.
---
## Task 6: Kickstarter — Package Search Query
**Files:**
- Modify: `src/data/kickstarter.queries.ts`
- [ ] **Step 1: Add searchPackagesForLinking query**
Append to `src/data/kickstarter.queries.ts`:
```typescript
export async function searchPackagesForLinking(query: string, limit = 20) {
if (!query || query.length < 2) return [];
return prisma.package.findMany({
where: {
OR: [
{ fileName: { contains: query, mode: "insensitive" } },
{ creator: { contains: query, mode: "insensitive" } },
],
},
orderBy: { indexedAt: "desc" },
take: limit,
select: {
id: true,
fileName: true,
fileSize: true,
archiveType: true,
creator: true,
fileCount: true,
},
});
}
export async function getLinkedPackageIds(kickstarterId: string): Promise<string[]> {
const links = await prisma.kickstarterPackage.findMany({
where: { kickstarterId },
select: { packageId: true },
});
return links.map((l) => l.packageId);
}
```
- [ ] **Step 2: Commit**
```bash
git add src/data/kickstarter.queries.ts
git commit -m "feat: add package search query for kickstarter linking"
```
---
## Task 7: Kickstarter — Package Linker Dialog Component
**Files:**
- Create: `src/app/(app)/kickstarters/_components/package-linker-dialog.tsx`
- [ ] **Step 1: Create the package linker dialog**
This component provides a search input to find packages and checkboxes to select/deselect them. It calls the existing `linkPackages` action on save.
```tsx
"use client";
import { useState, useTransition, useCallback, useEffect } from "react";
import { Search, Package, X, Loader2 } from "lucide-react";
import { toast } from "sonner";
import { linkPackages } from "../actions";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Badge } from "@/components/ui/badge";
import { Checkbox } from "@/components/ui/checkbox";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
interface PackageResult {
id: string;
fileName: string;
fileSize: bigint;
archiveType: string;
creator: string | null;
fileCount: number;
}
interface PackageLinkerDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
kickstarterId: string;
kickstarterName: string;
initialPackageIds: string[];
}
function formatSize(bytes: bigint | number): string {
const b = Number(bytes);
if (b >= 1024 * 1024 * 1024) return `${(b / (1024 * 1024 * 1024)).toFixed(1)} GB`;
if (b >= 1024 * 1024) return `${(b / (1024 * 1024)).toFixed(0)} MB`;
return `${(b / 1024).toFixed(0)} KB`;
}
export function PackageLinkerDialog({
open,
onOpenChange,
kickstarterId,
kickstarterName,
initialPackageIds,
}: PackageLinkerDialogProps) {
const [isPending, startTransition] = useTransition();
const [searchQuery, setSearchQuery] = useState("");
const [searchResults, setSearchResults] = useState<PackageResult[]>([]);
const [isSearching, setIsSearching] = useState(false);
const [selectedIds, setSelectedIds] = useState<Set<string>>(new Set(initialPackageIds));
// Reset state when dialog opens
useEffect(() => {
if (open) {
setSelectedIds(new Set(initialPackageIds));
setSearchQuery("");
setSearchResults([]);
}
}, [open, initialPackageIds]);
const doSearch = useCallback(async (query: string) => {
if (query.length < 2) {
setSearchResults([]);
return;
}
setIsSearching(true);
try {
const res = await fetch(`/api/packages/search?q=${encodeURIComponent(query)}&limit=20`);
if (res.ok) {
const data = await res.json();
setSearchResults(data.packages ?? []);
}
} catch {
// Ignore search errors
} finally {
setIsSearching(false);
}
}, []);
// Debounced search
useEffect(() => {
const timer = setTimeout(() => doSearch(searchQuery), 300);
return () => clearTimeout(timer);
}, [searchQuery, doSearch]);
function togglePackage(id: string) {
setSelectedIds((prev) => {
const next = new Set(prev);
if (next.has(id)) next.delete(id);
else next.add(id);
return next;
});
}
function handleSave() {
startTransition(async () => {
const result = await linkPackages(kickstarterId, Array.from(selectedIds));
if (result.success) {
toast.success(`Linked ${selectedIds.size} package(s) to "${kickstarterName}"`);
onOpenChange(false);
} else {
toast.error(result.error);
}
});
}
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-lg">
<DialogHeader>
<DialogTitle>Link Packages</DialogTitle>
<DialogDescription>
Search and select STL packages to link to &ldquo;{kickstarterName}&rdquo;.
</DialogDescription>
</DialogHeader>
<div className="space-y-3">
{/* Selected count */}
{selectedIds.size > 0 && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Package className="h-4 w-4" />
{selectedIds.size} package(s) selected
<Button
variant="ghost"
size="sm"
className="h-6 px-2 text-xs"
onClick={() => setSelectedIds(new Set())}
>
Clear all
</Button>
</div>
)}
{/* Search input */}
<div className="relative">
<Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" />
<Input
placeholder="Search packages by name or creator..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="pl-9"
autoFocus
/>
{isSearching && (
<Loader2 className="absolute right-2.5 top-2.5 h-4 w-4 animate-spin text-muted-foreground" />
)}
</div>
{/* Results */}
<ScrollArea className="h-[300px] rounded-md border">
<div className="p-2 space-y-1">
{searchResults.length === 0 && searchQuery.length >= 2 && !isSearching && (
<p className="text-sm text-muted-foreground text-center py-8">
No packages found
</p>
)}
{searchQuery.length < 2 && (
<p className="text-sm text-muted-foreground text-center py-8">
Type at least 2 characters to search
</p>
)}
{searchResults.map((pkg) => (
<label
key={pkg.id}
className="flex items-center gap-3 p-2 rounded-md hover:bg-muted/50 cursor-pointer"
>
<Checkbox
checked={selectedIds.has(pkg.id)}
onCheckedChange={() => togglePackage(pkg.id)}
/>
<div className="flex-1 min-w-0">
<p className="text-sm font-medium truncate">{pkg.fileName}</p>
<div className="flex items-center gap-2 text-xs text-muted-foreground">
{pkg.creator && <span>{pkg.creator}</span>}
<span>{formatSize(pkg.fileSize)}</span>
<Badge variant="outline" className="text-[10px] h-4 px-1">
{pkg.archiveType}
</Badge>
{pkg.fileCount > 0 && <span>{pkg.fileCount} files</span>}
</div>
</div>
{selectedIds.has(pkg.id) && (
<X className="h-3.5 w-3.5 text-muted-foreground shrink-0" />
)}
</label>
))}
</div>
</ScrollArea>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button onClick={handleSave} disabled={isPending}>
{isPending ? <Loader2 className="h-4 w-4 animate-spin mr-1" /> : null}
Save ({selectedIds.size})
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
```
- [ ] **Step 2: Commit**
```bash
git add src/app/(app)/kickstarters/_components/package-linker-dialog.tsx
git commit -m "feat: add package linker dialog for kickstarters"
```
---
## Task 8: Package Search API Route
**Files:**
- Create: `src/app/api/packages/search/route.ts`
- [ ] **Step 1: Create the API route**
The package linker dialog needs a client-side fetch for debounced search. Create a lightweight API route:
```typescript
import { NextResponse } from "next/server";
import { auth } from "@/lib/auth";
import { searchPackagesForLinking } from "@/data/kickstarter.queries";
export const dynamic = "force-dynamic";
export async function GET(request: Request) {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
}
const { searchParams } = new URL(request.url);
const query = searchParams.get("q") ?? "";
  const limit = Math.min(Number(searchParams.get("limit") ?? "20") || 20, 50); // || 20 guards against NaN
const packages = await searchPackagesForLinking(query, limit);
// Serialize BigInt for JSON
const serialized = packages.map((p) => ({
...p,
fileSize: p.fileSize.toString(),
}));
return NextResponse.json({ packages: serialized });
}
```
- [ ] **Step 2: Commit**
```bash
git add src/app/api/packages/search/route.ts
git commit -m "feat: add package search API route for kickstarter linking"
```
---
## Task 9: Kickstarter — Send All Packages Action
**Files:**
- Modify: `src/app/(app)/kickstarters/actions.ts`
- [ ] **Step 1: Add sendAllKickstarterPackages action**
Append to `src/app/(app)/kickstarters/actions.ts`:
```typescript
export async function sendAllKickstarterPackages(
kickstarterId: string
): Promise<ActionResult<{ queued: number }>> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const telegramLink = await prisma.telegramLink.findUnique({
where: { userId: session.user.id },
});
if (!telegramLink) {
return { success: false, error: "No linked Telegram account. Link one in Settings." };
}
const kickstarter = await prisma.kickstarter.findFirst({
where: { id: kickstarterId, userId: session.user.id },
select: {
packages: {
select: {
package: {
select: { id: true, destChannelId: true, destMessageId: true, fileName: true },
},
},
},
},
});
if (!kickstarter) {
return { success: false, error: "Kickstarter not found" };
}
const sendablePackages = kickstarter.packages
.map((lnk) => lnk.package)
.filter((p) => p.destChannelId && p.destMessageId);
if (sendablePackages.length === 0) {
return { success: false, error: "No linked packages are available for sending" };
}
let queued = 0;
for (const pkg of sendablePackages) {
const existing = await prisma.botSendRequest.findFirst({
where: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
status: { in: ["PENDING", "SENDING"] },
},
});
if (!existing) {
const sendRequest = await prisma.botSendRequest.create({
data: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
requestedByUserId: session.user.id,
status: "PENDING",
},
});
try {
await prisma.$queryRawUnsafe(
`SELECT pg_notify('bot_send', $1)`,
sendRequest.id
);
} catch {
// Best-effort
}
queued++;
}
}
revalidatePath(REVALIDATE_PATH);
return { success: true, data: { queued } };
} catch {
return { success: false, error: "Failed to send packages" };
}
}
```
- [ ] **Step 2: Commit**
```bash
git add src/app/(app)/kickstarters/actions.ts
git commit -m "feat: add sendAllKickstarterPackages action"
```
---
## Task 10: Kickstarter Table — Wire Up Link & Send Actions
**Files:**
- Modify: `src/app/(app)/kickstarters/_components/kickstarter-columns.tsx`
- Modify: `src/app/(app)/kickstarters/_components/kickstarter-table.tsx`
- [ ] **Step 1: Add actions to column menu**
In `kickstarter-columns.tsx`, add `Link2` and `Send` imports from lucide-react, add `onLinkPackages` and `onSendAll` to props, and add menu items:
```typescript
import { MoreHorizontal, Pencil, Trash2, ExternalLink, Link2, Send } from "lucide-react";
// Update interface:
interface KickstarterColumnsProps {
onEdit: (kickstarter: KickstarterRow) => void;
onDelete: (id: string) => void;
onLinkPackages: (kickstarter: KickstarterRow) => void;
onSendAll: (kickstarter: KickstarterRow) => void;
}
```
In the actions column dropdown, add between Edit and the separator:
```tsx
<DropdownMenuItem onClick={() => onLinkPackages(row.original)}>
<Link2 className="mr-2 h-3.5 w-3.5" />
Link Packages
</DropdownMenuItem>
{row.original._count.packages > 0 && (
<DropdownMenuItem onClick={() => onSendAll(row.original)}>
<Send className="mr-2 h-3.5 w-3.5" />
Send All ({row.original._count.packages})
</DropdownMenuItem>
)}
```
Update the function signature to destructure the new props:
```typescript
export function getKickstarterColumns({
onEdit,
onDelete,
onLinkPackages,
onSendAll,
}: KickstarterColumnsProps): ColumnDef<KickstarterRow, unknown>[] {
```
- [ ] **Step 2: Wire up state in kickstarter-table.tsx**
Add imports and state for the new dialogs:
```typescript
import { PackageLinkerDialog } from "./package-linker-dialog";
import { sendAllKickstarterPackages } from "../actions";
// Inside KickstarterTable:
const [linkTarget, setLinkTarget] = useState<KickstarterRow | null>(null);
const [sendAllTarget, setSendAllTarget] = useState<KickstarterRow | null>(null);
```
Update the columns call:
```typescript
const columns = getKickstarterColumns({
onEdit: (kickstarter) => {
setEditKickstarter(kickstarter);
setModalOpen(true);
},
onDelete: (id) => setDeleteId(id),
onLinkPackages: (kickstarter) => setLinkTarget(kickstarter),
onSendAll: (kickstarter) => {
startTransition(async () => {
const result = await sendAllKickstarterPackages(kickstarter.id);
if (result.success) {
toast.success(`Queued ${result.data!.queued} package(s) for delivery`);
} else {
toast.error(result.error);
}
});
},
});
```
Add the `PackageLinkerDialog` before the closing `</div>` of the component's return:
```tsx
{linkTarget && (
<PackageLinkerDialog
open={!!linkTarget}
onOpenChange={(open) => !open && setLinkTarget(null)}
kickstarterId={linkTarget.id}
kickstarterName={linkTarget.name}
initialPackageIds={[]}
/>
)}
```
Note: `initialPackageIds` is `[]` because the table doesn't fetch linked packages. The dialog will start empty but preserve selections during the session. For a better UX, we fetch the linked IDs when the dialog opens — see step 3.
- [ ] **Step 3: Fetch initial linked packages when dialog opens**
To populate the dialog with already-linked packages, add an API route or use a server action. The simplest approach: modify the `PackageLinkerDialog` to fetch linked IDs on mount.
In `package-linker-dialog.tsx`, add to the `useEffect` that runs when `open` changes:
```typescript
useEffect(() => {
if (open) {
setSearchQuery("");
setSearchResults([]);
// Fetch currently linked packages
fetch(`/api/packages/linked?kickstarterId=${kickstarterId}`)
.then((res) => res.json())
.then((data) => {
if (data.packageIds) {
setSelectedIds(new Set(data.packageIds));
}
})
.catch(() => {});
}
}, [open, kickstarterId]);
```
Create the API route at `src/app/api/packages/linked/route.ts`:
```typescript
import { NextResponse } from "next/server";
import { auth } from "@/lib/auth";
import { getLinkedPackageIds } from "@/data/kickstarter.queries";
export const dynamic = "force-dynamic";
export async function GET(request: Request) {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
}
const { searchParams } = new URL(request.url);
const kickstarterId = searchParams.get("kickstarterId");
if (!kickstarterId) {
return NextResponse.json({ error: "kickstarterId required" }, { status: 400 });
}
const packageIds = await getLinkedPackageIds(kickstarterId);
return NextResponse.json({ packageIds });
}
```
- [ ] **Step 4: Commit**
```bash
git add src/app/(app)/kickstarters/_components/ src/app/api/packages/
git commit -m "feat: wire up package linking and send-all in kickstarter table"
```
---
## Task 11: Rebuild & Deploy App
- [ ] **Step 1: Rebuild app image**
```bash
docker compose build app # or equivalent for the production compose
docker tag dragonsstash:latest git.samagsteribbe.nl/admin/dragonsstash:latest
docker compose -p dragonsstash -f /opt/stacks/DragonsStash/docker-compose.yml up -d app
```
- [ ] **Step 2: Verify app startup**
```bash
docker logs dragonsstash --tail=20
```
Expected: App starts cleanly, health check passes.
- [ ] **Step 3: Manual test**
1. Go to Kickstarters tab
2. Open a kickstarter's row menu → "Link Packages"
3. Search for a package, select it, save
4. Verify the package count column updates
5. Use "Send All" to queue all linked packages for Telegram delivery


@@ -0,0 +1,472 @@
# Dragonstash Grouping System Audit & Enhancement Report
## Appendix: Real-World Failure Cases (2026-03-29/30)
These skipped packages reveal two concrete issues:
### Issue A: `WORKER_MAX_ZIP_SIZE_MB` was 4 GB — blocking all large multipart archives
| File | Parts | Total Size | Status |
|------|-------|-----------|--------|
| DM-Stash - Guide to Tharador - Complete STL | 19 | 70.5 GB | SIZE_LIMIT |
| DM-Stash - 2023-05 - Greywinds All-in | 16 | 58.9 GB | SIZE_LIMIT |
| Axolote Gaming - Castle of the Vampire Lord | 10 | 18 GB | SIZE_LIMIT |
| Dungeon Blocks - THE ULTIMATE DUNGEON | 5 | 7.6 GB | SIZE_LIMIT |
| Dungeon Blocks - The Toxic sewer | 4 | 6.2 GB | SIZE_LIMIT |
| Soulmist | 4 | 6.3 GB | SIZE_LIMIT |
| Medieval Town PT1 | 3 | 5.7 GB | SIZE_LIMIT |
| Knight Models - Game Of Thrones | 3 | 5.5 GB | SIZE_LIMIT |
| Dungeon Blocks - The Lost Cave | 3 | 4.9 GB | SIZE_LIMIT |
| El Miniaturista 2025-05 Fulgrim Part II and III | 5 | 4.7 GB | SIZE_LIMIT |
**Root cause:** Production env had `WORKER_MAX_ZIP_SIZE_MB=4096`. The default in code is 204800 (200 GB), but docker-compose.yml defaulted to 4096.
**Fix applied:** Raised to 204800 in `/opt/stacks/DragonsStash/.env`. Worker restarted. These archives will be retried on the next ingestion cycle. The worker downloads parts individually (each under 2-4 GB), concatenates, re-splits at 1950 MiB for upload. Peak temp disk usage for the 70.5 GB archive: ~211 GB (353 GB available).
**Code fix:** `MAX_PART_SIZE` is now configurable via `MAX_PART_SIZE_MB` env var (was hardcoded at 1950). Set to 3900 for Telegram Premium accounts to avoid unnecessary splitting.
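As a sanity check on the ~211 GB figure above: a rough model (my assumption, not code from the worker) is that the downloaded parts, the concatenated file, and the re-split upload parts briefly coexist on disk, i.e. about 3× the archive size:

```typescript
// Rough peak temp-disk model for multipart repacking: downloaded parts +
// concatenated archive + re-split upload parts coexist just before cleanup.
function peakTempDiskGB(archiveGB: number): number {
  return archiveGB * 3;
}

console.log(peakTempDiskGB(70.5)); // prints 211.5 — matching the ~211 GB observed
```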
### Issue B: Download failure at 98% (DE1-Supported.7z)
| File | Size | Error |
|------|------|-------|
| DE1-Supported.7z | 1.9 GB | Download stopped unexpectedly at 2043674624/2078338541 bytes (98%) |
**Root cause:** Download stalled near completion with no retry mechanism.
**Fix applied:** Earlier in this session, download retry logic was added (max 3 retries with `cancelDownloadFile` before each retry). This file will be retried automatically on next ingestion cycle.
---
## Deliverable 1: Audit Report — Current State
### 1.1 Grouping Signal Stack (Current)
The system currently uses exactly **one automatic grouping signal**:
| Priority | Signal | Status | Location |
|----------|--------|--------|----------|
| 1 | `mediaAlbumId` | Implemented | `worker/src/grouping.ts:26-33` |
| 2 | Manual override | Implemented | `src/lib/telegram/queries.ts:606-639` |
**How it works:**
- `processAlbumGroups()` in `worker/src/grouping.ts` groups indexed packages by `mediaAlbumId` (filtering out "0" and null)
- For albums with 2+ members: creates `PackageGroup`, links packages, assigns name from album photo caption or first filename
- Manual grouping via UI: select 2+ packages, enter name, creates group in `createManualGroup()`
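A condensed sketch of that bucketing logic (illustrative; the real `processAlbumGroups()` also creates `PackageGroup` rows and assigns names from captions):

```typescript
// Bucket packages by mediaAlbumId, skipping null/"0", and keep only
// albums with 2+ members — the precondition for creating a PackageGroup.
interface IndexedPackage {
  id: string;
  fileName: string;
  mediaAlbumId: string | null;
}

function albumBuckets(packages: IndexedPackage[]): Map<string, IndexedPackage[]> {
  const buckets = new Map<string, IndexedPackage[]>();
  for (const pkg of packages) {
    if (!pkg.mediaAlbumId || pkg.mediaAlbumId === "0") continue;
    const members = buckets.get(pkg.mediaAlbumId) ?? [];
    members.push(pkg);
    buckets.set(pkg.mediaAlbumId, members);
  }
  for (const [albumId, members] of buckets) {
    if (members.length < 2) buckets.delete(albumId); // singletons stay ungrouped
  }
  return buckets;
}
```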
**What does NOT exist:**
- No `message_thread_id` (forum topic) scoping
- No project/month pattern extraction from filenames
- No creator/sender grouping
- No time-window + sender clustering
- No reply chain analysis
- No ZIP internal path prefix matching
- No caption fuzzy matching
- No staging queue for ungrouped files
### 1.2 Multipart Archive Detection (`worker/src/archive/multipart.ts`)
This is a **separate system** from display grouping. `groupArchiveSets()` groups Telegram messages into `ArchiveSet[]` based on filename patterns:
- `.zip.001`, `.zip.002` → ZIP_NUMBERED
- `.z01`, `.z02`, `.zip` → ZIP_LEGACY
- `.part1.rar`, `.part2.rar` → RAR_PART
- `.r00`, `.r01`, `.rar` → RAR_LEGACY
These are grouped by `format:baseName.toLowerCase()` key. This is about **reassembling split archives**, not UI grouping. An `ArchiveSet` becomes a single `Package` in the database.
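A sketch of that classification (the regexes here are illustrative; the real `groupArchiveSets()` also orders parts and validates set completeness):

```typescript
// Classify a filename into a multipart-archive format and derive the
// "FORMAT:basename" grouping key described above. Order matters: the
// numbered/part patterns must be tried before the legacy catch-alls.
type ArchiveFormat = "ZIP_NUMBERED" | "ZIP_LEGACY" | "RAR_PART" | "RAR_LEGACY";

function classify(name: string): { format: ArchiveFormat; base: string } | null {
  const n = name.toLowerCase();
  let m = n.match(/^(.+)\.zip\.(\d{3})$/);
  if (m) return { format: "ZIP_NUMBERED", base: m[1] };
  m = n.match(/^(.+)\.(z\d{2}|zip)$/);
  if (m) return { format: "ZIP_LEGACY", base: m[1] };
  m = n.match(/^(.+)\.part(\d+)\.rar$/);
  if (m) return { format: "RAR_PART", base: m[1] };
  m = n.match(/^(.+)\.(r\d{2}|rar)$/);
  if (m) return { format: "RAR_LEGACY", base: m[1] };
  return null;
}

function groupKey(name: string): string | null {
  const c = classify(name);
  return c ? `${c.format}:${c.base}` : null;
}
```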
### 1.3 TDLib Ingestion Handler
**Pipeline in `worker/src/worker.ts:801-1197`:**
```
processOneArchiveSet():
1. Early skip check (source message ID)
2. Size guard (maxZipSizeMB)
3. Download all parts
4. Compute SHA-256 hash
5. Check hash dedup
6. Read archive metadata
7. Split/repack if needed
8. Upload to destination
9. Download preview
10. Extract fallback preview
11. Resolve creator
12. Index in database
13. Cleanup temp files
```
**Post-indexing:** `processAlbumGroups()` is called once per channel/topic scan to create album-based groups.
**Gaps and observations:**
- Messages are never "dropped" silently — failures go to `SkippedPackage` table with reason
- Watermark only advances past successfully processed sets (failed sets block advancement)
- No messages are missed within a channel, but there's no audit to verify completeness after the fact
### 1.4 Hash Verification
**What IS verified:**
| Check | Where | When |
|-------|-------|------|
| Download file size | `download.ts:verifyAndMove()` | After each file download |
| SHA-256 content hash | `worker.ts:952` | After download, used for dedup |
| Telegram upload confirmation | `channel.ts:updateMessageSendSucceeded` | Waits for server ACK |
**What is NOT verified:**
| Gap | Impact |
|-----|--------|
| No hash after upload | Can't detect Telegram-side corruption |
| No hash after split | Split files could be silently corrupted |
| CRC-32 extracted but never checked | ZIP/RAR per-file integrity not validated |
| No end-to-end hash | Split files have different hash than original |
| No periodic audit job | Stale/missing data never detected |
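For reference, the dedup hash in the table above is a streaming SHA-256 over the downloaded file; a minimal Node sketch (not the worker's actual code):

```typescript
import { createHash } from "node:crypto";
import { createReadStream, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Streaming SHA-256: the digest is computed chunk by chunk, so a
// multi-GB archive is never loaded into memory in one piece.
function sha256File(path: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const hash = createHash("sha256");
    createReadStream(path)
      .on("data", (chunk) => hash.update(chunk))
      .on("end", () => resolve(hash.digest("hex")))
      .on("error", reject);
  });
}

// Quick self-check against the standard sha256("abc") test vector
const sample = join(tmpdir(), "sha256-demo.bin");
writeFileSync(sample, "abc");
sha256File(sample).then((digest) => console.log(digest.slice(0, 8))); // prints "ba7816bf"
```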
### 1.5 File Size Limit
| Setting | Value | Configurable? | Location |
|---------|-------|---------------|----------|
| `MAX_PART_SIZE` | 1950 MiB | **Hardcoded** | `worker/src/archive/split.ts:14` |
| `MAX_UPLOAD_SIZE` | 1950 MiB | **Hardcoded** | `worker/src/worker.ts:1023` |
| `maxZipSizeMB` | 200 GB | `WORKER_MAX_ZIP_SIZE_MB` env var | `worker/src/util/config.ts:6` |
The 1950 MiB limit is deliberately below 2 GiB to avoid TDLib's `FILE_PARTS_INVALID` error. There is **no Premium awareness** — all accounts are treated as non-Premium.
### 1.6 Search Implementation
- **No fuzzy search** — uses Prisma's `contains` with `mode: "insensitive"` (translates to PostgreSQL `ILIKE`)
- **No full-text search infrastructure** — no `tsvector`, no GiST/GIN indexes
- **Indexes:** B-tree on `fileName`, `creator`, `archiveType`, `indexedAt`, plus `PackageFile.fileName` and `extension`
- Search works for substring matching but won't match typos or similar names
### 1.7 Notification Infrastructure
- **pg_notify channels:** `bot_send`, `new_package` (bot), plus 7 worker channels
- **Bot subscriptions:** pattern-match (case-insensitive substring) on `fileName` and `creator`
- **UI notifications:** Sonner toast (ephemeral only)
- **No persistent notification store** — no database model for notifications
- **No notification UI panel** in the web app
- **No alerts for:** grouping conflicts, hash mismatches, missing parts, upload failures (beyond SkippedPackage table)
---
## Deliverable 2: Revised Grouping Signal Stack
### Recommended Implementation Plan
I recommend an **incremental approach** — implement signals in phases, starting with highest-value/lowest-risk.
### Phase 1: Foundation (Required Before Other Signals)
#### Signal 9: Manual Override Persistence
**Status:** Partially implemented. Manual groups exist but don't influence future auto-grouping.
**Implementation:**
- Add `groupingSource` field to `PackageGroup`: `"ALBUM" | "MANUAL" | "AUTO_PATTERN" | "AUTO_TIME" | "AUTO_REPLY" | "AUTO_ZIP" | "AUTO_CAPTION"`
- Manual groups already persist. What's missing is the **training feedback** where a manual grouping teaches the system to auto-group similar future files.
- This requires a `GroupingRule` model (see schema diff below) that stores learned patterns from manual overrides.
#### Ungrouped Staging Queue
**Implementation:**
- After ingestion, packages without a `packageGroupId` are naturally "ungrouped"
- Add a filter/tab to the STL page: "Ungrouped" showing packages where `packageGroupId IS NULL`
- No schema change needed — just a query filter
### Phase 2: High-Value Automatic Signals
#### Signal 1: `mediaAlbumId` (Already Implemented)
No changes needed. This is working correctly.
#### Signal 2: `message_thread_id` Forum Topic Scoping
**Status:** Already used for scan scoping (worker scans by topic), but not used as a grouping signal.
**Implementation:**
- `sourceTopicId` is already stored on `Package` (schema line 469)
- Use it as a **scoping constraint** for all other signals: time-window, caption matching, etc. only apply within the same topic
- No additional schema changes needed
#### Signal 5: Time Window + Sender Grouping
**Implementation:**
- After album grouping, find ungrouped packages from the same source channel + topic
- Within a configurable window (default 5 min), cluster by proximity
- Since we don't have `sender_id` from the source channel (TDLib `searchChatMessages` doesn't return it for channels), this becomes **time-window within topic/channel**
- New config: `AUTO_GROUP_TIME_WINDOW_MINUTES` (default: 5)
#### Signal 3: Project/Month Pattern Extraction
**Implementation:**
- Extract date patterns from filenames/captions: `YYYY-MM`, `YYYY_MM`, `MonthName Year`
- Extract project slugs: common prefix before separator (e.g., "ProjectName - File1.zip" and "ProjectName - File2.zip")
- Group packages with matching patterns from the same channel
- This should run as a **post-processing pass** after time-window grouping, merging small time-window groups that share a pattern
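A sketch of the extraction step, assuming only the simple `YYYY-MM`/`YYYY_MM` and `"Prefix - Rest"` forms (month-name parsing and caption sources omitted):

```typescript
// Derive a grouping pattern from a filename: a date key if a
// YYYY-MM/YYYY_MM token is present, else a project-slug key from
// the prefix before " - ". Returns null when neither applies.
function extractGroupPattern(fileName: string): string | null {
  const base = fileName.replace(/\.[^.]+$/, ""); // strip extension
  const date = base.match(/\b(20\d{2})[-_](0[1-9]|1[0-2])\b/);
  if (date) return `date:${date[1]}-${date[2]}`;
  const sep = base.split(" - ");
  if (sep.length > 1 && sep[0].trim().length >= 3) {
    return `slug:${sep[0].trim().toLowerCase()}`;
  }
  return null;
}
```

Packages in the same channel with equal non-null keys become merge candidates for the post-processing pass.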
#### Signal 4: Creator Grouping
**Implementation:**
- The `creator` field is already extracted from filenames and stored per-package
- Within a channel, if multiple ungrouped packages have the same `creator` and were indexed within the same ingestion run, auto-group them
- Lower priority than time-window (might create overly broad groups)
### Phase 3: Advanced Signals
#### Signal 6: Reply Chain
**Implementation:**
- TDLib messages have `reply_to_message_id` but this isn't currently captured during scanning
- Would need to modify `getChannelMessages()` in `download.ts` to extract `reply_to_message_id`
- Then: if message B replies to message A, and both are archives, group them
- **Moderate complexity**, deferred to Phase 3
#### Signal 7: ZIP Internal Path Prefix
**Implementation:**
- Already have `PackageFile.path` stored for each file inside an archive
- After indexing, find the common root folder across all files
- If two packages share the same root prefix and same channel, suggest grouping
- This is a **post-hoc analysis** that could run as a background job
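The common-root computation can be sketched as follows; it compares directory components only (the file name itself is excluded), so two packages whose files all live under the same top-level folder become candidates:

```typescript
// Find the deepest directory prefix shared by all paths inside an
// archive. Returns null when there is no common folder at all.
function commonRootFolder(paths: string[]): string | null {
  if (paths.length === 0) return null;
  const split = paths.map((p) => p.split("/"));
  const first = split[0];
  let depth = 0;
  while (
    depth < first.length - 1 &&
    split.every((s) => depth < s.length - 1 && s[depth] === first[depth])
  ) {
    depth++;
  }
  return depth > 0 ? first.slice(0, depth).join("/") : null;
}
```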
#### Signal 8: Caption Fuzzy Match
**Implementation:**
- Currently captions from source messages are NOT stored (only photo captions for preview matching)
- Would need to capture `msg.content?.caption?.text` during scanning and store on Package
- Then: fuzzy-match captions from nearby messages in same channel
- **Requires schema change + scan modification**, deferred to Phase 3
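For the fuzzy match itself, a bigram Dice coefficient is one lightweight option; this is a sketch of a candidate technique, not a chosen algorithm, and any similarity threshold would need tuning:

```typescript
// Count character bigrams of a normalized string.
function bigrams(s: string): Map<string, number> {
  const grams = new Map<string, number>();
  const t = s.toLowerCase().replace(/\s+/g, " ").trim();
  for (let i = 0; i < t.length - 1; i++) {
    const g = t.slice(i, i + 2);
    grams.set(g, (grams.get(g) ?? 0) + 1);
  }
  return grams;
}

// Dice similarity: 2 * |shared bigrams| / (total bigrams). 1 for
// identical captions, 0 for fully disjoint ones.
function diceSimilarity(a: string, b: string): number {
  const ga = bigrams(a);
  const gb = bigrams(b);
  let overlap = 0;
  let total = 0;
  for (const n of ga.values()) total += n;
  for (const n of gb.values()) total += n;
  if (total === 0) return a === b ? 1 : 0;
  for (const [g, n] of ga) overlap += Math.min(n, gb.get(g) ?? 0);
  return (2 * overlap) / total;
}
```

Captions from nearby messages scoring above the chosen threshold would be treated as the same logical release.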
---
## Deliverable 3: Schema Diff
All changes are **additive** — no columns dropped, no types changed.
```prisma
// ── PackageGroup additions ──
model PackageGroup {
// ... existing fields ...
groupingSource GroupingSource @default(MANUAL) // NEW: how this group was created
}
// NEW enum
enum GroupingSource {
ALBUM // From Telegram mediaAlbumId
MANUAL // User-created via UI
AUTO_PATTERN // Filename/date pattern matching
AUTO_TIME // Time-window clustering
AUTO_REPLY // Reply chain
AUTO_ZIP // ZIP path prefix
AUTO_CAPTION // Caption fuzzy match
}
// ── Package additions ──
model Package {
// ... existing fields ...
sourceCaption String? // NEW: caption text from source Telegram message
}
// ── New model: GroupingRule (training from manual overrides) ──
model GroupingRule {
id String @id @default(cuid())
sourceChannelId String
pattern String // Regex or glob pattern learned from manual grouping
signalType GroupingSource // Which signal this rule applies to
confidence Float @default(1.0)
createdAt DateTime @default(now())
createdByGroupId String? // The manual group that spawned this rule
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
@@index([sourceChannelId])
@@map("grouping_rules")
}
// ── New model: SystemNotification ──
model SystemNotification {
id String @id @default(cuid())
type NotificationType
severity NotificationSeverity @default(INFO)
title String
message String
context Json? // Structured data: packageId, groupId, sourceMessageId, etc.
isRead Boolean @default(false)
createdAt DateTime @default(now())
@@index([isRead, createdAt])
@@index([type])
@@map("system_notifications")
}
enum NotificationType {
HASH_MISMATCH
MISSING_PART
UPLOAD_FAILED
DOWNLOAD_FAILED
GROUPING_CONFLICT
INTEGRITY_AUDIT
}
enum NotificationSeverity {
INFO
WARNING
ERROR
}
// ── Config additions (worker/src/util/config.ts) ──
// maxPartSizeMB: parseInt(process.env.MAX_PART_SIZE_MB ?? "1950", 10)
// autoGroupTimeWindowMinutes: parseInt(process.env.AUTO_GROUP_TIME_WINDOW_MINUTES ?? "5", 10)
// telegramPremium: process.env.TELEGRAM_PREMIUM === "true"
```
**Migration notes:**
- All new fields are optional/have defaults — zero-risk to existing data
- `GroupingSource` enum added with `@default(MANUAL)` — existing groups unaffected
- `GroupingRule` and `SystemNotification` are new tables — no impact on existing
- Backfill: set `groupingSource = ALBUM` for groups where `mediaAlbumId IS NOT NULL`
---
## Deliverable 4: Notification Contract
### Event Shape
```typescript
interface SystemNotificationEvent {
type: NotificationType;
severity: "INFO" | "WARNING" | "ERROR";
title: string;
message: string;
context: {
packageId?: string;
groupId?: string;
sourceChannelId?: string;
sourceMessageId?: bigint;
fileName?: string;
partNumber?: number;
totalParts?: number;
expectedHash?: string;
actualHash?: string;
reason?: string;
};
}
```
### Where Notifications Fire
| Event | Where | Trigger |
|-------|-------|---------|
| `HASH_MISMATCH` | `worker/src/worker.ts` after split | SHA-256 of concatenated split parts != original hash |
| `MISSING_PART` | Periodic audit job (new) | Group has `partCount > 1` but fewer than `partCount` dest messages exist |
| `UPLOAD_FAILED` | `worker/src/worker.ts` catch block | Upload fails after all retries exhausted |
| `DOWNLOAD_FAILED` | `worker/src/worker.ts` catch block | Download fails after all retries |
| `GROUPING_CONFLICT` | Auto-grouping pass (new) | Two signals suggest different groups for the same package |
| `INTEGRITY_AUDIT` | Periodic job (new) | Scheduled check finds inconsistencies |
### Delivery
1. **Database:** Always persisted to `SystemNotification` table
2. **pg_notify:** `SELECT pg_notify('system_notification', jsonPayload)` for real-time
3. **Web UI:** Notification bell/panel that polls or listens for new notifications
4. **Telegram (optional):** Forward critical notifications to admin via bot
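One wrinkle in step 2: `context.sourceMessageId` is a `bigint`, which `JSON.stringify` rejects. A sketch of payload construction, assuming the convention that bigint IDs are serialized as strings and re-parsed on the listening side:

```typescript
interface NotificationPayload {
  channel: string;
  payload: string;
}

// Build the (channel, payload) pair for pg_notify, stringifying
// bigint values so JSON.stringify does not throw.
function toPgNotify(event: {
  type: string;
  severity: "INFO" | "WARNING" | "ERROR";
  title: string;
  message: string;
  context: Record<string, unknown>;
}): NotificationPayload {
  const payload = JSON.stringify(event, (_key, value) =>
    typeof value === "bigint" ? value.toString() : value
  );
  return { channel: "system_notification", payload };
}
```

The worker would then run `SELECT pg_notify($1, $2)` with these two values through its existing database client.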
---
## Deliverable 5: Feature Flag Plan
### Runtime Configuration (Environment Variables)
| Flag | Type | Default | Purpose |
|------|------|---------|---------|
| `TELEGRAM_PREMIUM` | boolean | `false` | Enable 4GB upload limit |
| `MAX_PART_SIZE_MB` | number | `1950` | Split threshold in MiB (overrides hardcoded value) |
| `AUTO_GROUP_ENABLED` | boolean | `false` | Enable automatic grouping beyond album |
| `AUTO_GROUP_TIME_WINDOW_MINUTES` | number | `5` | Time-window clustering threshold |
| `AUTO_GROUP_PATTERN_ENABLED` | boolean | `false` | Enable filename/date pattern grouping |
| `INTEGRITY_AUDIT_ENABLED` | boolean | `false` | Enable periodic integrity audit |
| `INTEGRITY_AUDIT_INTERVAL_HOURS` | number | `24` | How often to run the audit |
### Premium Mode Behavior
When `TELEGRAM_PREMIUM=true`:
1. `MAX_PART_SIZE_MB` defaults to `3900` (safely under 4 GiB) instead of `1950`
2. Files under 4 GB: uploaded as-is (no splitting)
3. Files over 4 GB: split using existing `byteLevelSplit()` at the new threshold
4. Existing split/rejoin logic is **kept as fallback** — never removed
5. `isMultipart` and `partCount` continue to track actual upload state
### Implementation in `split.ts`:
```typescript
// Replace hardcoded constant with config-driven:
const MAX_PART_SIZE = BigInt(config.maxPartSizeMB) * 1024n * 1024n;
```
And in `config.ts`:
```typescript
maxPartSizeMB: parseInt(
process.env.MAX_PART_SIZE_MB ??
(process.env.TELEGRAM_PREMIUM === "true" ? "3900" : "1950"),
10
),
```
### Rollout Strategy
1. **All flags default to off** — zero behavior change on deploy
2. Enable `TELEGRAM_PREMIUM` first (simple, well-understood)
3. Enable `AUTO_GROUP_ENABLED` on a **per-channel basis** (see test plan) before globally
4. Enable `INTEGRITY_AUDIT_ENABLED` after manual validation
5. Pattern-based grouping enabled last (highest complexity)
---
## Deliverable 6: Test Plan
### Phase 0: Pre-Implementation Validation
Before touching any code, verify the current system baseline:
1. **Pick one test channel** with known content (a mix of albums, single files, and multipart archives)
2. Run an ingestion cycle and record: number of packages, groups, skipped
3. Verify all album-based groups are correct
4. Note any ungrouped files that "should" be grouped
5. This becomes the **regression baseline**
### Phase 1: Premium Mode Testing
1. Set `TELEGRAM_PREMIUM=true` and `MAX_PART_SIZE_MB=3900`
2. Manually upload a 3 GB test file to a source channel
3. Trigger ingestion — verify it uploads as a single message (not split)
4. Manually upload a 5 GB test file
5. Trigger ingestion — verify it splits at ~3.9 GB threshold
6. Verify `isMultipart`, `partCount`, `destMessageIds` are correct
7. Send the package via bot — verify all parts arrive
### Phase 2: Time-Window Grouping Testing
1. Enable `AUTO_GROUP_ENABLED=true` on the test channel only
2. Post 3 files to the channel within 2 minutes (no album)
3. Trigger ingestion — verify they auto-group
4. Post 2 files 10 minutes apart
5. Trigger ingestion — verify they stay ungrouped
6. Manually group them — verify `GroupingRule` is created
7. Post similar files — verify auto-grouping kicks in
### Phase 3: Manual QA via API
Add a **test endpoint** (dev-only) that accepts a fake message payload and runs it through the grouping pipeline without hitting Telegram:
```
POST /api/dev/test-grouping
Body: { messages: [...], channelId: "..." }
Response: { suggestedGroups: [...] }
```
This allows testing grouping logic against crafted scenarios without waiting for real Telegram messages.
### Phase 4: Integrity Audit Testing
1. Enable `INTEGRITY_AUDIT_ENABLED=true`
2. Manually corrupt a record (set wrong `contentHash` in DB)
3. Run audit — verify `HASH_MISMATCH` notification is created
4. Delete one `destMessageId` from a multipart package's `destMessageIds`
5. Run audit — verify `MISSING_PART` notification is created
6. Check notification UI shows both
### Regression Checks After Each Phase
- Re-run ingestion on test channel — same number of packages/groups as baseline
- Search for known filenames — still returns correct results
- Send a package via bot — still delivers correctly
- Album groups unchanged
- Manual groups unchanged

---
# Grouping Phase 1: Foundation + Time-Window Grouping
> **For agentic workers:** Use superpowers:subagent-driven-development to implement this plan.
**Goal:** Add grouping infrastructure (schema, enums, notifications model), an ungrouped staging queue in the UI, and time-window auto-grouping as the first automatic signal beyond album grouping.
**Architecture:** Schema changes lay the foundation. Ungrouped tab is a query filter. Time-window grouping runs as a post-processing pass after album grouping in the worker pipeline.
**Tech Stack:** Prisma schema + migration, worker TypeScript, Next.js App Router.
---
## Task 1: Schema Migration
**Files:**
- Modify: `prisma/schema.prisma`
- Create: migration SQL
Add:
1. `GroupingSource` enum: `ALBUM`, `MANUAL`, `AUTO_TIME`, `AUTO_PATTERN`, `AUTO_REPLY`, `AUTO_ZIP`, `AUTO_CAPTION`
2. `groupingSource GroupingSource @default(MANUAL)` on `PackageGroup`
3. `SystemNotification` model with `type`, `severity`, `title`, `message`, `context` (Json), `isRead`
4. `NotificationType` enum: `HASH_MISMATCH`, `MISSING_PART`, `UPLOAD_FAILED`, `DOWNLOAD_FAILED`, `GROUPING_CONFLICT`, `INTEGRITY_AUDIT`
5. `NotificationSeverity` enum: `INFO`, `WARNING`, `ERROR`
Backfill: `UPDATE package_groups SET "groupingSource" = 'ALBUM' WHERE "mediaAlbumId" IS NOT NULL`
---
## Task 2: Ungrouped Staging Tab in STL Page
**Files:**
- Modify: `src/lib/telegram/queries.ts` — add `listUngroupedPackages()` query
- Modify: `src/app/(app)/stls/page.tsx` — add tab parameter support
- Modify: `src/app/(app)/stls/_components/stl-table.tsx` — add "Ungrouped" tab
Add a tab next to the existing "Skipped" tab that shows packages where `packageGroupId IS NULL`. Uses the existing `PackageListItem` type and table rendering. This gives users a clear view of files that need manual grouping.
---
## Task 3: Time-Window Auto-Grouping in Worker
**Files:**
- Create: `worker/src/grouping.ts` — add `processTimeWindowGroups()` after existing `processAlbumGroups()`
- Modify: `worker/src/worker.ts` — call time-window grouping after album grouping
- Modify: `worker/src/util/config.ts` — add `autoGroupTimeWindowMinutes` config
After album grouping completes, find remaining ungrouped packages from the same channel scan. Cluster packages whose `sourceMessageId` timestamps are within the configured window (default 5 minutes). Create groups for clusters of 2+ with `groupingSource = AUTO_TIME` and name derived from the common filename prefix or first file's base name.
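The clustering pass can be sketched as a single sweep over date-sorted packages (field names here are illustrative, not the actual `Package` shape):

```typescript
interface UngroupedPackage {
  id: string;
  messageDate: Date;
}

// Sort by message date, then start a new cluster whenever the gap
// to the previous package exceeds the window. Only clusters of 2+
// become groups.
function clusterByTimeWindow(
  packages: UngroupedPackage[],
  windowMinutes: number
): UngroupedPackage[][] {
  const windowMs = windowMinutes * 60_000;
  const sorted = [...packages].sort(
    (a, b) => a.messageDate.getTime() - b.messageDate.getTime()
  );
  const clusters: UngroupedPackage[][] = [];
  let current: UngroupedPackage[] = [];
  for (const pkg of sorted) {
    const prev = current[current.length - 1];
    if (prev && pkg.messageDate.getTime() - prev.messageDate.getTime() > windowMs) {
      clusters.push(current);
      current = [];
    }
    current.push(pkg);
  }
  if (current.length > 0) clusters.push(current);
  return clusters.filter((c) => c.length >= 2);
}
```

Note the gap is measured to the previous member, not the cluster start, so a steady drip of uploads every 2 minutes stays in one cluster.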
---
## Task 4: Hash Verification After Split
**Files:**
- Modify: `worker/src/worker.ts` — add hash re-check after concat+split
- Modify: `worker/src/archive/hash.ts` — (no changes needed, reuse `hashParts`)
After `concatenateFiles()` + `byteLevelSplit()`, re-hash the split parts and compare to the original `contentHash`. If mismatch, log error and create a `SystemNotification` (once that table exists). This closes the integrity gap identified in the audit.
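A sketch of the re-check, shown over in-memory buffers for brevity (the worker would stream each part file through the same hash rather than loading parts into memory):

```typescript
import { createHash } from "node:crypto";

// Hash the split parts in order as one continuous SHA-256 stream.
// Because byte-level splitting is a pure partition of the original
// bytes, this must equal the original content hash.
function hashOfParts(parts: Buffer[]): string {
  const h = createHash("sha256");
  for (const part of parts) h.update(part);
  return h.digest("hex");
}

function verifySplit(originalHash: string, parts: Buffer[]): boolean {
  return hashOfParts(parts) === originalHash;
}
```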
---
## Task 5: Build & Deploy
Rebuild worker and app images. Deploy. Verify:
- Worker logs show `maxPartSizeMB` and new `autoGroupTimeWindowMinutes` in config
- Ungrouped tab visible in STL page
- Previously-skipped large archives begin processing

---
# Design: Search Match Indicators, Size Limit Increase, Skipped/Failed Files Overview
**Date:** 2026-03-24
**Status:** Approved
## Overview
Three related improvements to the STL packages system:
1. **Search match indicators** — Show which internal files matched a search query, with highlighted files in the drawer
2. **Size limit increase** — Raise the ingestion limit from 4 GB to 200 GB so large multipart archives aren't skipped
3. **Skipped/failed files overview** — Track and display archives that were skipped or failed, with retry capability
---
## Feature 1: Size Limit Increase
### Change
`worker/src/util/config.ts` line 6 — change default from `"4096"` to `"204800"`.
One-line change. The split/upload pipeline already handles arbitrary sizes. The 2 GB per-part Telegram API limit is a separate hard-coded constant and stays as-is.
### Impact
- Archives up to 200 GB will now be attempted
- Multipart archives where individual parts are under 2 GB (but total exceeds 4 GB) will no longer be skipped — these upload directly without any splitting
- Single files over 2 GB are automatically split into 2 GB parts (existing behavior)
- Temp disk usage during processing can now reach up to ~200 GB per archive
---
## Feature 2: Search Match Indicators
### Backend Changes
**File:** `src/lib/telegram/queries.ts`, `searchPackages()`
When `searchIn` is `"files"` or `"both"`, change the PackageFile query from `distinct` to a **grouped count**:
```typescript
// Current: findMany with select: { packageId }, distinct: ["packageId"]
// New: groupBy packageId with _count
const fileMatches = await prisma.packageFile.groupBy({
by: ["packageId"],
where: {
OR: [
{ fileName: { contains: q, mode: "insensitive" } },
{ path: { contains: q, mode: "insensitive" } },
],
},
_count: { _all: true },
});
```
This returns `{ packageId: string, _count: { _all: number } }[]`.
Note: `PackageRow` in `package-columns.tsx` mirrors `PackageListItem` and must also receive the two new fields.
**File:** `src/lib/telegram/types.ts`, `PackageListItem`
Add two fields:
- `matchedFileCount: number` — how many files inside matched (0 if matched by package name only)
- `matchedByContent: boolean` — true if any files inside matched
### Frontend Changes
**File:** `src/app/(app)/stls/page.tsx`
Pass the search term to `StlTable` as a new prop.
**File:** `src/app/(app)/stls/_components/stl-table.tsx`
Pass search term to columns via TanStack Table column meta.
**File:** `src/app/(app)/stls/_components/package-columns.tsx`
When search is active and `matchedByContent` is true, render a clickable badge below the filename: e.g., "3 file matches". Clicking opens the `PackageFilesDrawer` with a `highlightTerm` prop set to the search term.
**File:** `src/app/(app)/stls/_components/package-files-drawer.tsx`
- Accept optional `highlightTerm: string` prop
- Render full file tree as normal (all files visible)
- Files whose `fileName` or `path` case-insensitively contains `highlightTerm` get a subtle highlight (amber/yellow background on the row)
- Auto-expand folders that contain highlighted files
- The drawer's own search input remains independent
### Data Flow
1. User types search term in STL table search input
2. URL updates with `?search=value`, page reloads
3. `page.tsx` calls `searchPackages()` with `searchIn: "both"`
4. Query returns packages with `matchedFileCount` and `matchedByContent`
5. Table renders "N file matches" badge on content-matched rows
6. User clicks badge -> drawer opens with full tree, matching files highlighted
7. Folders containing matches auto-expanded
---
## Feature 3: Skipped/Failed Files Overview
### Database Schema
New model in `prisma/schema.prisma`:
```prisma
enum SkipReason {
SIZE_LIMIT
DOWNLOAD_FAILED
EXTRACT_FAILED
UPLOAD_FAILED
}
model SkippedPackage {
id String @id @default(cuid())
fileName String
fileSize BigInt
reason SkipReason
errorMessage String?
sourceChannelId String
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
sourceMessageId BigInt
sourceTopicId BigInt?
isMultipart Boolean @default(false)
partCount Int @default(1)
accountId String
account TelegramAccount @relation(fields: [accountId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
@@unique([sourceChannelId, sourceMessageId])
@@index([reason])
@@index([accountId])
@@map("skipped_packages")
}
```
Reverse relations must be added to `TelegramChannel` and `TelegramAccount` models:
```prisma
// In TelegramChannel:
skippedPackages SkippedPackage[]
// In TelegramAccount:
skippedPackages SkippedPackage[]
```
### Worker Changes
**File:** `worker/src/worker.ts`
Extend `PipelineContext` interface to include `accountId` (derived from the ingestion run's account).
At each skip/failure point, upsert a `SkippedPackage` record:
- **Size limit skip** (line 784): reason `SIZE_LIMIT`, no error message
- **Download failure** (catch in download loop): reason `DOWNLOAD_FAILED` + error text
- **Extract/metadata failure** (catch in extract): reason `EXTRACT_FAILED` + error text
- **Upload failure** (catch in upload): reason `UPLOAD_FAILED` + error text
On **successful ingestion** of a package, delete any existing `SkippedPackage` with the same `(sourceChannelId, sourceMessageId)` — so successful retries clean up after themselves.
**File:** `worker/src/db/queries.ts`
Add functions:
- `upsertSkippedPackage(data)` — create or update skip record
- `deleteSkippedPackage(sourceChannelId, sourceMessageId)` — remove on success
### Retry Mechanism
Retrying a skipped package:
1. Delete the `SkippedPackage` record
2. Find the `AccountChannelMap` record using both `accountId` and `sourceChannelId`, then reset its `lastProcessedMessageId` to `sourceMessageId - 1` (only if less than current watermark)
3. If `sourceTopicId` is non-null, also reset the corresponding `TopicProgress.lastProcessedMessageId` for that topic
4. The next ingestion cycle picks up the message and re-attempts processing
For "Retry All" (e.g., all `SIZE_LIMIT` skips after raising the limit):
- Delete all matching `SkippedPackage` records
- For each affected (account, channel) pair, reset `AccountChannelMap` watermark to the minimum `sourceMessageId - 1` among deleted records
- For each affected (account, channel, topic) triple, reset `TopicProgress` watermark similarly
**Note on behavioral distinction:** `DOWNLOAD_FAILED`, `EXTRACT_FAILED`, and `UPLOAD_FAILED` archives already naturally retry because the worker does not advance the watermark past failed sets. The `SkippedPackage` record provides visibility into these failures. The explicit retry/watermark reset is only strictly needed for `SIZE_LIMIT` skips (where the watermark does advance past the skipped message). The UI should present both types but the retry button is most impactful for `SIZE_LIMIT` skips.
**Performance note:** "Retry All" can cause the worker to re-scan large message ranges. The existing dedup logic (`packageExistsBySourceMessage`) ensures already-ingested packages are skipped quickly, but there is a scanning cost proportional to the number of messages between the reset watermark and the current position.
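The watermark arithmetic for "Retry All" can be sketched as a pure computation before any writes are issued (record and map shapes are illustrative):

```typescript
interface SkipRecord {
  accountId: string;
  sourceChannelId: string;
  sourceMessageId: bigint;
}

// For each (account, channel) pair, propose a new watermark of
// min(sourceMessageId) - 1 across the deleted skip records, but
// only when that is below the current watermark (never advance).
function computeWatermarkResets(
  skips: SkipRecord[],
  currentWatermarks: Map<string, bigint> // key: `${accountId}:${channelId}`
): Map<string, bigint> {
  const resets = new Map<string, bigint>();
  for (const s of skips) {
    const key = `${s.accountId}:${s.sourceChannelId}`;
    const target = s.sourceMessageId - 1n;
    const proposed = resets.get(key) ?? currentWatermarks.get(key) ?? 0n;
    if (target < proposed) resets.set(key, target);
  }
  return resets;
}
```

The same shape applies to `TopicProgress`, keyed by the (account, channel, topic) triple instead.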
### Frontend Changes
**File:** `src/app/(app)/stls/_components/stl-table.tsx`
Add a "Skipped / Failed" tab alongside the main packages table.
**New file:** `src/app/(app)/stls/_components/skipped-packages-tab.tsx`
Table columns:
- **fileName** — archive name
- **fileSize** — formatted size
- **reason** — color-coded badge: `SIZE_LIMIT` (yellow), `DOWNLOAD_FAILED` (red), `EXTRACT_FAILED` (red), `UPLOAD_FAILED` (red)
- **errorMessage** — truncated with expandable tooltip/popover for full text
- **channel** — source channel title
- **createdAt** — when the skip/failure was recorded
Actions:
- **Retry** button per row — server action that deletes record + resets watermark
- **Retry All** button in the header — bulk retry, filterable by reason
**File:** `src/app/(app)/stls/page.tsx`
Fetch skipped packages count (for tab badge) alongside existing queries.
**File:** `src/data/` or `src/lib/telegram/queries.ts`
Add query functions:
- `listSkippedPackages(options)` — paginated list with reason filter
- `countSkippedPackages()` — for tab badge
- `retrySkippedPackage(id)` — delete record + reset watermark
- `retryAllSkippedPackages(reason?)` — bulk retry
**File:** `src/app/(app)/stls/actions.ts`
Add server actions:
- `retrySkippedPackageAction(id)`
- `retryAllSkippedPackagesAction(reason?)`
---
## Files to Create/Modify
### Create
- `src/app/(app)/stls/_components/skipped-packages-tab.tsx` — skipped packages table UI
- Prisma migration for `SkippedPackage` model
### Modify
- `worker/src/util/config.ts` — raise default max size
- `worker/src/worker.ts` — record skips/failures, clean up on success
- `worker/src/db/queries.ts` — add skip record CRUD functions
- `prisma/schema.prisma` — add `SkippedPackage` model and `SkipReason` enum
- `src/lib/telegram/queries.ts` — modify `searchPackages()` for match counts, add skipped package queries
- `src/lib/telegram/types.ts` — add `matchedFileCount`/`matchedByContent` to `PackageListItem`, add skipped package types
- `src/app/(app)/stls/page.tsx` — pass search term, fetch skipped count, add tab
- `src/app/(app)/stls/_components/stl-table.tsx` — accept search prop, render tabs
- `src/app/(app)/stls/_components/package-columns.tsx` — render match badge
- `src/app/(app)/stls/_components/package-files-drawer.tsx` — accept highlightTerm, highlight matching files, auto-expand matched folders
- `src/app/(app)/stls/actions.ts` — add retry server actions

---
# Package Grouping Design
## Overview
Add the ability to group related packages that were posted together in a Telegram channel (e.g., "DUNGEON BLOCKS - Colossal Dungeon" with 6 separate archive files). Groups appear as collapsible rows in the STL files table, with support for both automatic detection via Telegram album IDs and manual grouping through the UI.
## Goals
- Automatically detect and group files posted together in Telegram (same `media_album_id`)
- Display groups as collapsed rows in the STL table with aggregated metadata
- Allow manual grouping/ungrouping of packages via the UI
- Support editable group names and preview images
- Enable "Send All" to deliver every package in a group via the bot
## Non-Goals
- Merging grouped packages into a single Package record (each stays independent)
- Time-proximity heuristics for grouping (too error-prone)
- Grouping across different source channels
---
## Data Model
### New `PackageGroup` Table
```prisma
model PackageGroup {
id String @id @default(cuid())
name String
mediaAlbumId String?
sourceChannelId String
previewData Bytes?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
packages Package[]
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
@@unique([mediaAlbumId, sourceChannelId])
@@index([sourceChannelId])
@@map("package_groups")
}
```
### Package Model Changes
Add optional group membership:
```prisma
model Package {
// ... existing fields ...
packageGroupId String?
packageGroup PackageGroup? @relation(fields: [packageGroupId], references: [id], onDelete: SetNull)
@@index([packageGroupId])
}
```
### TelegramChannel Model Changes
Add back-relation for the new `PackageGroup` model:
```prisma
model TelegramChannel {
// ... existing fields and relations ...
packageGroups PackageGroup[]
}
```
### Key Decisions
- `mediaAlbumId` is `String?` (TDLib int64 stringified) — only used for dedup lookups, avoids BigInt complexity
- `@@unique([mediaAlbumId, sourceChannelId])` prevents duplicate album-derived groups when re-scanning. PostgreSQL treats NULLs as distinct in unique constraints, so manually-created groups (with `mediaAlbumId = null`) are not constrained by this — which is correct behavior
- Idempotency for album groups uses `findFirst({ where: { mediaAlbumId, sourceChannelId } })` + conditional `create`, not `upsert`, because Prisma does not support `upsert` on compound unique keys with nullable fields
- `onDelete: SetNull` on `Package.packageGroup` means dissolving a group automatically unlinks all members
- `onDelete: Cascade` on `PackageGroup.sourceChannel` means deleting a channel cleans up its groups
- `sourceTopicId` is omitted from `PackageGroup` — it can be inferred from member packages, and manual groups may span topics
- `@@map("package_groups")` follows the project's snake_case table naming convention
- `previewData` stores JPEG thumbnail bytes directly on the group (same pattern as Package)
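The findFirst-plus-conditional-create idempotency described above can be sketched against a minimal in-memory store standing in for Prisma (the store interface and id scheme are illustrative):

```typescript
interface GroupRecord {
  id: string;
  name: string;
  mediaAlbumId: string | null;
  sourceChannelId: string;
}

interface GroupStore {
  findFirst(albumId: string, channelId: string): GroupRecord | undefined;
  create(g: GroupRecord): GroupRecord;
}

// Re-scans hit the findFirst branch and reuse the existing group,
// so no duplicate album group is ever created for the same
// (mediaAlbumId, sourceChannelId) pair.
function ensureAlbumGroup(
  store: GroupStore,
  mediaAlbumId: string,
  sourceChannelId: string,
  name: string
): GroupRecord {
  const existing = store.findFirst(mediaAlbumId, sourceChannelId);
  if (existing) return existing;
  return store.create({
    id: `grp_${mediaAlbumId}`, // illustrative; real code uses cuid()
    name,
    mediaAlbumId,
    sourceChannelId,
  });
}
```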
---
## Worker Changes
### TelegramMessage Interface
Add optional `mediaAlbumId` field:
```typescript
export interface TelegramMessage {
id: bigint;
fileName: string;
fileId: string;
fileSize: bigint;
date: Date;
mediaAlbumId?: string; // Absent or "0" when not part of an album
}
```
The field is optional to minimize call-site changes. The grouping step treats `undefined` and `"0"` equivalently as "not part of an album."
### TelegramPhoto Interface
Add optional `mediaAlbumId` field:
```typescript
export interface TelegramPhoto {
id: bigint;
date: Date;
caption: string;
fileId: string;
fileSize: number;
mediaAlbumId?: string; // For album-to-preview correlation
}
```
### Channel Scanning
In `getChannelMessages()`, read `media_album_id` from the TDLib message object (already present in TDLib responses, just not captured today). Add `media_album_id?: string` to the `TdMessage` interface and pass through to both `TelegramMessage` and `TelegramPhoto`.
The document pass and photo pass already run as separate loops over `searchChatMessages`. Both loops capture `media_album_id` independently. Correlation happens at grouping time: album photos are matched to album documents by comparing their `mediaAlbumId` values, not at scan time.
### Group Creation (Post-Processing)
After each scan cycle's packages are individually processed (downloaded, hashed, uploaded, indexed), a post-processing step handles grouping:
1. Collect all packages from the current scan batch that share the same non-zero `mediaAlbumId`
2. For each distinct `mediaAlbumId`, check if a `PackageGroup` already exists via `findFirst({ where: { mediaAlbumId, sourceChannelId } })`
3. If no group exists, create one:
- **Name:** caption of the first message in the album (falls back to first file's base name)
- **Preview:** find a `TelegramPhoto` from the scan's `photos[]` array with the same `mediaAlbumId`. If found, download via `downloadPhotoThumbnail`. If not, the group starts with no preview (can be added in UI later)
4. Link all member packages via an idempotent `updateMany` — sets `packageGroupId` on all packages whose `sourceMessageId` is in the album's message set. This handles both newly-indexed packages and previously-indexed ones that were created in an earlier partial scan (e.g., if one package failed and was retried later)
The per-package pipeline is unchanged — each file is still downloaded, hashed, deduped, split, uploaded, and indexed independently. Grouping is a layer on top.
---
## Query Layer
### Paginated Listing with Groups
The STL table shows "display items" — either a group (collapsed) or a standalone package. Pagination operates on display items so that a group occupies exactly one slot regardless of member count.
**Two-step query approach** (handles filters correctly):
**Step 1 — Find matching display item IDs:**
```sql
-- Find all group IDs and standalone package IDs where at least one member matches filters
SELECT DISTINCT COALESCE(p."packageGroupId", p.id) AS display_id,
CASE WHEN p."packageGroupId" IS NOT NULL THEN 'group' ELSE 'package' END AS display_type,
MAX(p."indexedAt") AS sort_date
FROM packages p
LEFT JOIN package_groups pg ON pg.id = p."packageGroupId"
WHERE 1=1
-- Optional filters applied here (creator, tags, search text, channelId)
GROUP BY COALESCE(p."packageGroupId", p.id),
CASE WHEN p."packageGroupId" IS NOT NULL THEN 'group' ELSE 'package' END
ORDER BY sort_date DESC
LIMIT $1 OFFSET $2
```
**Step 2 — Fetch full data:**
For groups on the current page, fetch all member packages (including those that didn't match filters — the group appears because at least one member matched, but the expanded view shows all members). For standalone packages, fetch the full package data.
**Count query** (for pagination total):
```sql
SELECT COUNT(*) FROM (
SELECT DISTINCT COALESCE(p."packageGroupId", p.id)
FROM packages p
WHERE 1=1
-- Same filters as step 1
) AS display_items
```
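As an in-memory illustration of the display-item semantics (not the production path, which runs the raw SQL above), the `COALESCE`, grouping, and pagination logic behaves like this — types and the `toDisplayItems` name are assumptions for the sketch:

```typescript
interface PkgRow {
  id: string;
  packageGroupId: string | null;
  indexedAt: number; // epoch ms stand-in for the timestamp column
}

interface DisplayItem {
  displayId: string;
  displayType: "group" | "package";
  sortDate: number; // MAX(indexedAt) over members, as in the SQL
}

// Collapse rows to display items (one slot per group), sort newest-first,
// then paginate — mirroring COALESCE / GROUP BY / ORDER BY / LIMIT OFFSET.
function toDisplayItems(rows: PkgRow[], limit: number, offset: number): DisplayItem[] {
  const byId = new Map<string, DisplayItem>();
  for (const r of rows) {
    const displayId = r.packageGroupId ?? r.id;
    const existing = byId.get(displayId);
    if (existing) {
      existing.sortDate = Math.max(existing.sortDate, r.indexedAt);
    } else {
      byId.set(displayId, {
        displayId,
        displayType: r.packageGroupId ? "group" : "package",
        sortDate: r.indexedAt,
      });
    }
  }
  return [...byId.values()]
    .sort((a, b) => b.sortDate - a.sortDate)
    .slice(offset, offset + limit);
}
```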
### Group Row Aggregates
Computed in the step 2 fetch: total file size (sum), total file count (sum), combined tags (array union), member package count per group. These populate the collapsed group row.
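The aggregate computation is straightforward; a sketch with an assumed member shape (the real step 2 fetch would compute these from the batched member query):

```typescript
interface MemberPkg {
  fileSize: bigint;
  fileCount: number;
  tags: string[];
}

// Collapsed-row aggregates for one group: size sum, file-count sum,
// tag union (first-appearance order), and member count.
function groupAggregates(members: MemberPkg[]) {
  return {
    totalSize: members.reduce((sum, m) => sum + m.fileSize, 0n),
    totalFiles: members.reduce((sum, m) => sum + m.fileCount, 0),
    tags: [...new Set(members.flatMap((m) => m.tags))],
    memberCount: members.length,
  };
}
```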
### Search
`searchPackages` adds `PackageGroup.name` to search targets via a `LEFT JOIN` to `package_groups`. If any package in a group matches by name/file content, or the group name matches, the whole group appears.
### Filtering
Creator/tag filters apply to member packages. A group appears if any member matches the filter. The group row shows aggregates of all members (not just matching ones).
### New Query Functions
| Function | Purpose |
|----------|---------|
| `listDisplayItems(page, limit, filters)` | Two-step paginated query returning groups + standalone packages |
| `getDisplayItemCount(filters)` | Count of display items for pagination total |
| `getPackageGroup(groupId)` | Group metadata + all member packages |
| `updatePackageGroupName(groupId, name)` | Rename group |
| `updatePackageGroupPreview(groupId, previewData)` | Replace group preview |
| `addPackagesToGroup(packageIds, groupId)` | Manual grouping — add to existing group |
| `removePackageFromGroup(packageId)` | Ungroup single package |
| `createManualGroup(name, packageIds)` | Create new group from UI |
| `dissolveGroup(groupId)` | Ungroup all members, delete group record |
For manual grouping of packages that already belong to different groups: the UI links the selected packages to the target group, then dissolves any source group left with no members. Non-selected members of source groups remain in their original group.
---
## UI Changes
### STL Table — Group Rows
- **Collapsed (default):** Single row showing preview thumbnail, group name (editable inline), archive type badge ("Mixed" if heterogeneous), combined size, combined file count, combined tags (editable), source channel, latest `indexedAt`, actions
- **Expanded:** Chevron toggle reveals member packages as indented sub-rows with their existing columns and per-package actions
- Chevron icon on the left of the row toggles expand/collapse
**Loading strategy:** Member packages for all groups on the current page are prefetched in a single batched query during the step 2 fetch. This means expand/collapse is instant (no on-demand loading) and avoids per-row loading states.
### Group Row Actions
- **Send All** — Queues bot send requests for every package in the group. Checks for existing PENDING/SENDING requests per package to avoid duplicates.
- **View Files** — Opens file drawer showing all member packages' files, separated by package name headers
- **Dissolve Group** — Ungroups all members (confirmation required)
### Individual Package Actions (Within a Group)
- Existing: Send, View Files
- New: "Remove from group" in dropdown menu
### Manual Grouping
- Checkbox selection column on package rows
- When 2+ packages selected, a "Group Selected" button appears in the table toolbar
- Prompts for a group name, creates the group
- If selected packages belong to existing groups, those packages are moved to the new group. Source groups that become empty are automatically dissolved.
### Preview Editing
- Click the group's preview thumbnail to upload a replacement image
- Same upload flow as individual packages (existing component reuse)
### No Changes To
- Skipped/failed packages tab
- Package detail drawer internals
- Search UI (just broader matching behind the scenes)


@@ -0,0 +1,184 @@
# Worker Improvements Design
**Date:** 2026-05-02
**Status:** Approved
**Scope:** Dragon's Stash Telegram ingestion worker
## Problem Statement
Three issues to address:
1. **Double-uploads**: The same archive occasionally appears twice in the destination Telegram channel. Root causes: (a) the worker crashes between `uploadToChannel()` confirming success and `createPackageWithFiles()` writing to the DB — no DB record means `recoverIncompleteUploads()` can't detect the orphaned Telegram message, and the next cycle re-uploads; (b) two accounts scanning the same source channel can both pass the hash dedup check before either creates a DB record, racing to upload the same file.
2. **Sequential account processing**: Both Telegram accounts are processed one after another via `withTdlibMutex`, even though TDLib fully supports multiple concurrent clients in the same process (each with separate `databaseDirectory` and `filesDirectory`). This halves throughput unnecessarily.
3. **Premium upload limit not used**: The Premium account can upload up to 4 GB per file, but `MAX_UPLOAD_SIZE` is hardcoded at ~1,950 MB. This causes unnecessary file splitting and expensive repack operations for files that could upload directly.
## Solution Overview
Three targeted changes, no architectural overhaul:
1. Two-phase DB write + hash advisory lock (fixes double-uploads)
2. Remove TDLib mutex from the scheduler loop (enables parallel accounts)
3. Per-account `maxUploadSize` from `getMe().is_premium` (enables 4 GB for Premium)
---
## Section 1: Double-Upload Fix
### 1a. Two-Phase DB Write
**Current flow:**
```
uploadToChannel() → preview download → metadata extraction → createPackageWithFiles()
```
If the worker crashes anywhere between upload confirmation and `createPackageWithFiles()`, no DB record exists. `recoverIncompleteUploads()` only checks packages with an existing `destMessageId` in the DB — it cannot find an orphaned Telegram message with no corresponding row.
**New flow:**
```
uploadToChannel()
→ createPackageStub() ← minimal record, destMessageId set immediately
→ preview download
→ metadata extraction
→ updatePackageWithMetadata() ← adds file list, preview, creator, tags
```
`createPackageStub()` writes: `contentHash`, `fileName`, `fileSize`, `archiveType`, `sourceChannelId`, `sourceMessageId`, `destChannelId`, `destMessageId`, `isMultipart`, `partCount`, `ingestionRunId`. File list and preview are left empty.
If the worker crashes after the stub is written:
- `recoverIncompleteUploads()` finds the record (has `destMessageId`), verifies the Telegram message exists, keeps it.
- Next cycle: `packageExistsByHash()` returns true → skips re-upload.
- The stub has `fileCount = 0` and no file listing. The UI shows "metadata pending" rather than failing silently.
Stubs with `fileCount = 0` are valid deliverable packages (the bot can still send the file). Backfilling metadata on stubs is out of scope for this change — the crash case is rare and the stub is functional.
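The ordering is the whole fix, so it is worth pinning down. A sketch of the new flow with injected dependencies (all function shapes here are illustrative fakes; the real code calls TDLib and Prisma directly):

```typescript
interface IngestDeps {
  uploadToChannel(file: string): Promise<bigint>; // resolves to destMessageId
  createPackageStub(file: string, destMessageId: bigint): Promise<string>; // resolves to packageId
  extractMetadata(file: string): Promise<{ fileCount: number }>;
  updatePackageWithMetadata(packageId: string, meta: { fileCount: number }): Promise<void>;
}

async function ingestOne(file: string, deps: IngestDeps): Promise<string> {
  const destMessageId = await deps.uploadToChannel(file);
  // Phase 1: minimal record, written immediately after upload confirmation.
  // A crash after this point is recoverable: recoverIncompleteUploads() can
  // find the row by destMessageId, and packageExistsByHash() blocks re-upload.
  const packageId = await deps.createPackageStub(file, destMessageId);
  // Phase 2: slow enrichment. A failure here leaves a functional stub
  // (fileCount = 0, "metadata pending" in the UI), never a double-upload.
  const meta = await deps.extractMetadata(file);
  await deps.updatePackageWithMetadata(packageId, meta);
  return packageId;
}
```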
### 1b. Hash Advisory Lock
**The race (two accounts, shared source channel):**
```
Worker A: packageExistsByHash(X) → false (no record yet)
Worker B: packageExistsByHash(X) → false (no record yet)
Worker A: uploads file → destMessageId_A
Worker B: uploads file → destMessageId_B ← duplicate Telegram message
Worker A: createPackageStub() → succeeds (contentHash @unique satisfied)
Worker B: createPackageStub() → fails unique constraint on contentHash
```
Result: two Telegram messages, one DB record. Worker B's upload is wasted.
**Fix:** Before calling `uploadToChannel()`, acquire a PostgreSQL session advisory lock keyed on the content hash:
```sql
SELECT pg_try_advisory_lock(hash_bigint)
```
Where `hash_bigint` is the first 8 bytes of the SHA-256 content hash interpreted as a signed bigint.
- `pg_try_advisory_lock` is non-blocking. If another worker holds the lock (same file, shared channel), return `false` → treat as duplicate, skip.
- After acquiring the lock, **re-run `packageExistsByHash()`** before uploading. This catches the case where another worker finished and released the lock between the first check and this one — without the re-check, the current worker would proceed to re-upload.
- The lock is session-scoped: released automatically on DB session end. No manual cleanup needed on crash.
- The lock is released explicitly after `createPackageStub()` completes (or on any error path).
**Implementation location:** New helper `tryAcquireHashLock(contentHash)` / `releaseHashLock(contentHash)` in `worker/src/db/locks.ts`, reusing the existing DB client pattern.
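A sketch of the key derivation the helpers would share — first 8 bytes of the hex SHA-256 hash (16 hex chars), read big-endian as a signed 64-bit bigint, which is the type `pg_try_advisory_lock` accepts. The surrounding usage comment assumes a generic `db.query` client shape:

```typescript
import { createHash } from "node:crypto";

// Advisory-lock key: first 8 bytes of the SHA-256 content hash as a signed
// 64-bit bigint. Collisions on 8 bytes are astronomically unlikely, and a
// false positive only causes one worker to skip-and-retry next cycle.
function hashToLockKey(contentHashHex: string): bigint {
  return Buffer.from(contentHashHex.slice(0, 16), "hex").readBigInt64BE(0);
}

// Hypothetical usage inside tryAcquireHashLock:
//   SELECT pg_try_advisory_lock($1)   -- non-blocking; false => another worker holds it
// On success, re-run packageExistsByHash() before uploading (see above).
const exampleHash = createHash("sha256").update("archive-bytes").digest("hex");
const lockKey = hashToLockKey(exampleHash);
```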
---
## Section 2: Parallel Account Processing
### Current Constraint
`withTdlibMutex` in `scheduler.ts` serializes all TDLib operations across accounts. This was a conservative guard, but TDLib explicitly supports multiple concurrent clients in the same process provided each has its own `databaseDirectory` and `filesDirectory`.
The codebase already satisfies this requirement:
```typescript
// worker/src/tdlib/client.ts
const dbPath = path.join(config.tdlibStateDir, account.id);
const client = createClient({
databaseDirectory: dbPath,
filesDirectory: path.join(dbPath, "files"),
});
```
Each account gets `<TDLIB_STATE_DIR>/<account.id>/` — fully isolated.
### Change
Replace the sequential `for` loop in `scheduler.ts` with `Promise.allSettled()`:
```typescript
// Before
for (const account of accounts) {
await withTdlibMutex(`ingest:${account.phone}`, () => runWorkerForAccount(account));
}
// After
await Promise.allSettled(accounts.map((account) => runWorkerForAccount(account)));
```
The per-account PostgreSQL advisory lock in `db/locks.ts` already prevents any account from being processed twice simultaneously. `Promise.allSettled()` ensures one account's failure doesn't abort the other.
The `withTdlibMutex` wrapper can be removed from the ingest path entirely. The auth path (`authenticateAccount`) should also be run in parallel but may remain guarded if TDLib auth flows have ordering dependencies — verify during implementation.
**No Docker Compose changes needed.** Both accounts run in the same container.
### Speed Limit Notifications
TDLib fires `updateSpeedLimitNotification` when an account's upload or download speed is throttled (non-Premium accounts). Log this event at `warn` level in the client update handler so throttling is visible in logs; no automated response is needed.
---
## Section 3: Per-Account Premium Upload Limit
### Premium Detection
After successful authentication, call `getMe()` and read `is_premium: bool` from the returned `user` object. Store this on `TelegramAccount.isPremium` (new boolean field, default `false`, updated on each successful auth).
```typescript
const me = await client.invoke({ _: 'getMe' }) as { is_premium?: boolean };
await updateAccountPremiumStatus(account.id, me.is_premium ?? false);
```
### Upload Size Limits
| Account type | `maxUploadSize` | Effect |
|---|---|---|
| Premium | 3,950 MB | Parts ≤ 3.95 GB upload as-is; repack only for parts >3.95 GB (extremely rare) |
| Non-Premium | 1,950 MB | Current behavior unchanged |
Pass `maxUploadSize` into `processOneArchiveSet()` as a parameter (currently hardcoded as `MAX_UPLOAD_SIZE` at `worker.ts:1023` and in `archive/split.ts`).
The `hasOversizedPart` check and `byteLevelSplit` call both use this value, so the repack step is effectively eliminated for Premium accounts in practice — no separate "skip repack" flag needed.
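The per-account limit selection is a one-liner; a sketch using the values from the table above (`hasOversizedPart` exists in the codebase, but its signature here is assumed for illustration):

```typescript
const MB = 1024 * 1024;

// Per-account upload ceiling in bytes: 3,950 MB for Premium, 1,950 MB otherwise.
function maxUploadSizeFor(isPremium: boolean): number {
  return (isPremium ? 3950 : 1950) * MB;
}

// A part triggers the repack / byteLevelSplit path only if it exceeds the
// ceiling — so for Premium accounts the repack step effectively disappears.
function hasOversizedPart(partSizes: number[], isPremium: boolean): boolean {
  const limit = maxUploadSizeFor(isPremium);
  return partSizes.some((size) => size > limit);
}
```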
### Migration
```prisma
model TelegramAccount {
// ... existing fields
isPremium Boolean @default(false)
}
```
One migration, one new query `updateAccountPremiumStatus(accountId, isPremium)`.
---
## Files to Change
| File | Change |
|---|---|
| `prisma/schema.prisma` | Add `isPremium Boolean @default(false)` to `TelegramAccount` |
| `worker/src/db/queries.ts` | Add `updateAccountPremiumStatus()`, `createPackageStub()`, `updatePackageWithMetadata()` |
| `worker/src/db/locks.ts` | Add `tryAcquireHashLock()`, `releaseHashLock()` |
| `worker/src/tdlib/client.ts` | Call `getMe()` after auth, return `isPremium` from `createTdlibClient()` |
| `worker/src/worker.ts` | Two-phase write, hash lock acquire/release, pass `maxUploadSize` per account |
| `worker/src/archive/split.ts` | Accept `maxPartSize` parameter instead of hardcoded constant |
| `worker/src/scheduler.ts` | Replace sequential loop with `Promise.allSettled()`, remove `withTdlibMutex` from ingest path |
---
## What Is Explicitly Out of Scope
- Backfilling metadata on stub records (rare crash case, functional without it)
- Download pre-fetching / pipeline parallelism within one account
- Two separate worker containers (single container is sufficient)
- Bot or app changes (worker-only)

install.cmd Normal file

@@ -0,0 +1,221 @@
@echo off
setlocal enabledelayedexpansion
REM Claude Code Windows CMD Bootstrap Script
REM Installs Claude Code for environments where PowerShell is not available
REM Parse command line argument
set "TARGET=%~1"
if "!TARGET!"=="" set "TARGET=latest"
REM Validate target parameter
if /i "!TARGET!"=="stable" goto :target_valid
if /i "!TARGET!"=="latest" goto :target_valid
echo !TARGET! | findstr /r "^[0-9][0-9]*\.[0-9][0-9]*\.[0-9][0-9]*" >nul
if !ERRORLEVEL! equ 0 goto :target_valid
echo Usage: %0 [stable^|latest^|VERSION] >&2
echo Example: %0 1.0.58 >&2
exit /b 1
:target_valid
REM Check for 64-bit Windows
if /i "%PROCESSOR_ARCHITECTURE%"=="AMD64" goto :arch_valid
if /i "%PROCESSOR_ARCHITECTURE%"=="ARM64" goto :arch_valid
if /i "%PROCESSOR_ARCHITEW6432%"=="AMD64" goto :arch_valid
if /i "%PROCESSOR_ARCHITEW6432%"=="ARM64" goto :arch_valid
echo Claude Code does not support 32-bit Windows. Please use a 64-bit version of Windows. >&2
exit /b 1
:arch_valid
REM Set constants
set "GCS_BUCKET=https://storage.googleapis.com/claude-code-dist-86c565f3-f756-42ad-8dfa-d59b1c096819/claude-code-releases"
set "DOWNLOAD_DIR=%USERPROFILE%\.claude\downloads"
REM Use native ARM64 binary on ARM64 Windows, x64 otherwise
if /i "%PROCESSOR_ARCHITECTURE%"=="ARM64" (
set "PLATFORM=win32-arm64"
) else (
set "PLATFORM=win32-x64"
)
REM Create download directory
if not exist "!DOWNLOAD_DIR!" mkdir "!DOWNLOAD_DIR!"
REM Check for curl availability
curl --version >nul 2>&1
if !ERRORLEVEL! neq 0 (
echo curl is required but not available. Please install curl or use PowerShell installer. >&2
exit /b 1
)
REM Always download latest version (which has the most up-to-date installer)
call :download_file "!GCS_BUCKET!/latest" "!DOWNLOAD_DIR!\latest"
if !ERRORLEVEL! neq 0 (
echo Failed to get latest version >&2
exit /b 1
)
REM Read version from file
set /p VERSION=<"!DOWNLOAD_DIR!\latest"
del "!DOWNLOAD_DIR!\latest"
REM Download manifest
call :download_file "!GCS_BUCKET!/!VERSION!/manifest.json" "!DOWNLOAD_DIR!\manifest.json"
if !ERRORLEVEL! neq 0 (
echo Failed to get manifest >&2
exit /b 1
)
REM Extract checksum from manifest
call :parse_manifest "!DOWNLOAD_DIR!\manifest.json" "!PLATFORM!"
if !ERRORLEVEL! neq 0 (
echo Platform !PLATFORM! not found in manifest >&2
del "!DOWNLOAD_DIR!\manifest.json" 2>nul
exit /b 1
)
del "!DOWNLOAD_DIR!\manifest.json"
REM Download binary
set "BINARY_PATH=!DOWNLOAD_DIR!\claude-!VERSION!-!PLATFORM!.exe"
call :download_file "!GCS_BUCKET!/!VERSION!/!PLATFORM!/claude.exe" "!BINARY_PATH!"
if !ERRORLEVEL! neq 0 (
echo Failed to download binary >&2
if exist "!BINARY_PATH!" del "!BINARY_PATH!"
exit /b 1
)
REM Verify checksum
call :verify_checksum "!BINARY_PATH!" "!EXPECTED_CHECKSUM!"
if !ERRORLEVEL! neq 0 (
echo Checksum verification failed >&2
del "!BINARY_PATH!"
exit /b 1
)
REM Run claude install to set up launcher and shell integration
echo Setting up Claude Code...
"!BINARY_PATH!" install "!TARGET!"
set "INSTALL_RESULT=!ERRORLEVEL!"
REM Clean up downloaded file
REM Wait a moment for any file handles to be released
timeout /t 1 /nobreak >nul 2>&1
del /f "!BINARY_PATH!" >nul 2>&1
if exist "!BINARY_PATH!" (
echo Warning: Could not remove temporary file: !BINARY_PATH!
)
if !INSTALL_RESULT! neq 0 (
echo Installation failed >&2
exit /b 1
)
echo.
echo Installation complete^^!
echo.
exit /b 0
REM ============================================================================
REM SUBROUTINES
REM ============================================================================
:download_file
REM Downloads a file using curl
REM Args: %1=URL, %2=OutputPath
set "URL=%~1"
set "OUTPUT=%~2"
curl -fsSL "!URL!" -o "!OUTPUT!"
exit /b !ERRORLEVEL!
:parse_manifest
REM Parse JSON manifest to extract checksum for platform
REM Args: %1=ManifestPath, %2=Platform
set "MANIFEST_PATH=%~1"
set "PLATFORM_NAME=%~2"
set "EXPECTED_CHECKSUM="
REM Use findstr to find platform section, then look for checksum
set "FOUND_PLATFORM="
set "IN_PLATFORM_SECTION="
REM Read the manifest line by line
for /f "usebackq tokens=*" %%i in ("!MANIFEST_PATH!") do (
set "LINE=%%i"
REM Check if this line contains our platform
echo !LINE! | findstr /c:"\"%PLATFORM_NAME%\":" >nul
if !ERRORLEVEL! equ 0 (
set "IN_PLATFORM_SECTION=1"
)
REM If we're in the platform section, look for checksum
if defined IN_PLATFORM_SECTION (
echo !LINE! | findstr /c:"\"checksum\":" >nul
if !ERRORLEVEL! equ 0 (
REM Extract checksum value
for /f "tokens=2 delims=:" %%j in ("!LINE!") do (
set "CHECKSUM_PART=%%j"
REM Remove quotes, whitespace, and comma
set "CHECKSUM_PART=!CHECKSUM_PART: =!"
set "CHECKSUM_PART=!CHECKSUM_PART:"=!"
set "CHECKSUM_PART=!CHECKSUM_PART:,=!"
REM Check if it looks like a SHA256 (64 hex chars)
if not "!CHECKSUM_PART!"=="" (
call :check_length "!CHECKSUM_PART!" 64
if !ERRORLEVEL! equ 0 (
set "EXPECTED_CHECKSUM=!CHECKSUM_PART!"
exit /b 0
)
)
)
)
REM Check if we've left the platform section (closing brace)
echo !LINE! | findstr /c:"}" >nul
if !ERRORLEVEL! equ 0 set "IN_PLATFORM_SECTION="
)
)
if "!EXPECTED_CHECKSUM!"=="" exit /b 1
exit /b 0
:check_length
REM Check if string length equals expected length
REM Args: %1=String, %2=ExpectedLength
set "STR=%~1"
set "EXPECTED_LEN=%~2"
set "LEN=0"
:count_loop
if "!STR:~%LEN%,1!"=="" goto :count_done
set /a LEN+=1
goto :count_loop
:count_done
if %LEN%==%EXPECTED_LEN% exit /b 0
exit /b 1
:verify_checksum
REM Verify file checksum using certutil
REM Args: %1=FilePath, %2=ExpectedChecksum
set "FILE_PATH=%~1"
set "EXPECTED=%~2"
for /f "skip=1 tokens=*" %%i in ('certutil -hashfile "!FILE_PATH!" SHA256') do (
set "ACTUAL=%%i"
set "ACTUAL=!ACTUAL: =!"
if "!ACTUAL!"=="CertUtil:Thecommandcompletedsuccessfully." goto :verify_done
if "!ACTUAL!" neq "" (
if /i "!ACTUAL!"=="!EXPECTED!" (
exit /b 0
) else (
exit /b 1
)
)
)
:verify_done
exit /b 1

package-lock.json generated

@@ -49,7 +49,7 @@
 "ts-node": "^10.9.2",
 "tsx": "^4.21.0",
 "tw-animate-css": "^1.4.0",
-"typescript": "^5"
+"typescript": "5.9.3"
 }
 },
 "node_modules/@alloc/quick-lru": {


@@ -58,6 +58,6 @@
 "ts-node": "^10.9.2",
 "tsx": "^4.21.0",
 "tw-animate-css": "^1.4.0",
-"typescript": "^5"
+"typescript": "5.9.3"
 }
 }


@@ -0,0 +1,21 @@
-- CreateTable
CREATE TABLE "invite_codes" (
"id" TEXT NOT NULL,
"code" VARCHAR(32) NOT NULL,
"maxUses" INTEGER NOT NULL DEFAULT 1,
"uses" INTEGER NOT NULL DEFAULT 0,
"expiresAt" TIMESTAMP(3),
"createdBy" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "invite_codes_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "invite_codes_code_key" ON "invite_codes"("code");
-- CreateIndex
CREATE INDEX "invite_codes_code_idx" ON "invite_codes"("code");
-- AddForeignKey
ALTER TABLE "invite_codes" ADD CONSTRAINT "invite_codes_createdBy_fkey" FOREIGN KEY ("createdBy") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,3 @@
-- AlterEnum
ALTER TYPE "ArchiveType" ADD VALUE 'SEVEN_Z';
ALTER TYPE "ArchiveType" ADD VALUE 'DOCUMENT';


@@ -0,0 +1,5 @@
-- AlterTable
ALTER TABLE "telegram_channels" ADD COLUMN "category" VARCHAR(64);
-- CreateIndex
CREATE INDEX "telegram_channels_category_idx" ON "telegram_channels"("category");


@@ -0,0 +1,32 @@
-- CreateEnum
CREATE TYPE "ExtractStatus" AS ENUM ('PENDING', 'IN_PROGRESS', 'COMPLETED', 'FAILED');
-- AlterTable
ALTER TABLE "User" ADD COLUMN "usedInviteId" TEXT;
-- CreateTable
CREATE TABLE "archive_extract_requests" (
"id" TEXT NOT NULL,
"packageId" TEXT NOT NULL,
"filePath" VARCHAR(1024) NOT NULL,
"status" "ExtractStatus" NOT NULL DEFAULT 'PENDING',
"imageData" BYTEA,
"contentType" VARCHAR(64),
"error" TEXT,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "archive_extract_requests_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "archive_extract_requests_packageId_filePath_idx" ON "archive_extract_requests"("packageId", "filePath");
-- CreateIndex
CREATE INDEX "archive_extract_requests_status_idx" ON "archive_extract_requests"("status");
-- AddForeignKey
ALTER TABLE "User" ADD CONSTRAINT "User_usedInviteId_fkey" FOREIGN KEY ("usedInviteId") REFERENCES "invite_codes"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "archive_extract_requests" ADD CONSTRAINT "archive_extract_requests_packageId_fkey" FOREIGN KEY ("packageId") REFERENCES "packages"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,10 @@
-- Add tags array column to packages
ALTER TABLE "packages" ADD COLUMN "tags" TEXT[] NOT NULL DEFAULT '{}';
-- Backfill: inherit source channel category as initial tag
UPDATE "packages" p
SET "tags" = ARRAY[c."category"]
FROM "telegram_channels" c
WHERE p."sourceChannelId" = c."id"
AND c."category" IS NOT NULL
AND c."category" != '';


@@ -0,0 +1,50 @@
-- CreateEnum
CREATE TYPE "DeliveryStatus" AS ENUM ('NOT_DELIVERED', 'PARTIAL', 'DELIVERED');
CREATE TYPE "PaymentStatus" AS ENUM ('PAID', 'UNPAID');
-- CreateTable
CREATE TABLE "kickstarter_hosts" (
"id" TEXT NOT NULL,
"name" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "kickstarter_hosts_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "kickstarters" (
"id" TEXT NOT NULL,
"name" TEXT NOT NULL,
"link" TEXT,
"filesUrl" TEXT,
"deliveryStatus" "DeliveryStatus" NOT NULL DEFAULT 'NOT_DELIVERED',
"paymentStatus" "PaymentStatus" NOT NULL DEFAULT 'UNPAID',
"notes" TEXT,
"hostId" TEXT,
"userId" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "kickstarters_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "kickstarter_packages" (
"kickstarterId" TEXT NOT NULL,
"packageId" TEXT NOT NULL,
CONSTRAINT "kickstarter_packages_pkey" PRIMARY KEY ("kickstarterId","packageId")
);
-- CreateIndex
CREATE UNIQUE INDEX "kickstarter_hosts_name_key" ON "kickstarter_hosts"("name");
CREATE INDEX "kickstarters_hostId_idx" ON "kickstarters"("hostId");
CREATE INDEX "kickstarters_userId_idx" ON "kickstarters"("userId");
CREATE INDEX "kickstarters_deliveryStatus_idx" ON "kickstarters"("deliveryStatus");
CREATE INDEX "kickstarters_paymentStatus_idx" ON "kickstarters"("paymentStatus");
-- AddForeignKey
ALTER TABLE "kickstarters" ADD CONSTRAINT "kickstarters_hostId_fkey" FOREIGN KEY ("hostId") REFERENCES "kickstarter_hosts"("id") ON DELETE SET NULL ON UPDATE CASCADE;
ALTER TABLE "kickstarters" ADD CONSTRAINT "kickstarters_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "kickstarter_packages" ADD CONSTRAINT "kickstarter_packages_kickstarterId_fkey" FOREIGN KEY ("kickstarterId") REFERENCES "kickstarters"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "kickstarter_packages" ADD CONSTRAINT "kickstarter_packages_packageId_fkey" FOREIGN KEY ("packageId") REFERENCES "packages"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,35 @@
-- CreateEnum
CREATE TYPE "SkipReason" AS ENUM ('SIZE_LIMIT', 'DOWNLOAD_FAILED', 'EXTRACT_FAILED', 'UPLOAD_FAILED');
-- CreateTable
CREATE TABLE "skipped_packages" (
"id" TEXT NOT NULL,
"fileName" TEXT NOT NULL,
"fileSize" BIGINT NOT NULL,
"reason" "SkipReason" NOT NULL,
"errorMessage" TEXT,
"sourceChannelId" TEXT NOT NULL,
"sourceMessageId" BIGINT NOT NULL,
"sourceTopicId" BIGINT,
"isMultipart" BOOLEAN NOT NULL DEFAULT false,
"partCount" INTEGER NOT NULL DEFAULT 1,
"accountId" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "skipped_packages_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "skipped_packages_sourceChannelId_sourceMessageId_key" ON "skipped_packages"("sourceChannelId", "sourceMessageId");
-- CreateIndex
CREATE INDEX "skipped_packages_reason_idx" ON "skipped_packages"("reason");
-- CreateIndex
CREATE INDEX "skipped_packages_accountId_idx" ON "skipped_packages"("accountId");
-- AddForeignKey
ALTER TABLE "skipped_packages" ADD CONSTRAINT "skipped_packages_sourceChannelId_fkey" FOREIGN KEY ("sourceChannelId") REFERENCES "telegram_channels"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "skipped_packages" ADD CONSTRAINT "skipped_packages_accountId_fkey" FOREIGN KEY ("accountId") REFERENCES "telegram_accounts"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,30 @@
-- AlterTable
ALTER TABLE "packages" ADD COLUMN "packageGroupId" TEXT;
-- CreateTable
CREATE TABLE "package_groups" (
"id" TEXT NOT NULL,
"name" TEXT NOT NULL,
"mediaAlbumId" TEXT,
"sourceChannelId" TEXT NOT NULL,
"previewData" BYTEA,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "package_groups_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "package_groups_sourceChannelId_idx" ON "package_groups"("sourceChannelId");
-- CreateIndex
CREATE UNIQUE INDEX "package_groups_mediaAlbumId_sourceChannelId_key" ON "package_groups"("mediaAlbumId", "sourceChannelId");
-- CreateIndex
CREATE INDEX "packages_packageGroupId_idx" ON "packages"("packageGroupId");
-- AddForeignKey
ALTER TABLE "packages" ADD CONSTRAINT "packages_packageGroupId_fkey" FOREIGN KEY ("packageGroupId") REFERENCES "package_groups"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "package_groups" ADD CONSTRAINT "package_groups_sourceChannelId_fkey" FOREIGN KEY ("sourceChannelId") REFERENCES "telegram_channels"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,7 @@
-- AlterTable
ALTER TABLE "packages" ADD COLUMN "destMessageIds" BIGINT[] DEFAULT ARRAY[]::BIGINT[];
-- Backfill: copy existing destMessageId into the array
UPDATE "packages"
SET "destMessageIds" = ARRAY["destMessageId"]
WHERE "destMessageId" IS NOT NULL;


@@ -0,0 +1,32 @@
-- CreateEnum GroupingSource
CREATE TYPE "GroupingSource" AS ENUM ('ALBUM', 'MANUAL', 'AUTO_TIME', 'AUTO_PATTERN', 'AUTO_REPLY', 'AUTO_ZIP', 'AUTO_CAPTION');
-- CreateEnum NotificationType
CREATE TYPE "NotificationType" AS ENUM ('HASH_MISMATCH', 'MISSING_PART', 'UPLOAD_FAILED', 'DOWNLOAD_FAILED', 'GROUPING_CONFLICT', 'INTEGRITY_AUDIT');
-- CreateEnum NotificationSeverity
CREATE TYPE "NotificationSeverity" AS ENUM ('INFO', 'WARNING', 'ERROR');
-- AlterTable: add groupingSource to package_groups
ALTER TABLE "package_groups" ADD COLUMN "groupingSource" "GroupingSource" NOT NULL DEFAULT 'MANUAL';
-- Backfill: mark album-based groups
UPDATE "package_groups" SET "groupingSource" = 'ALBUM' WHERE "mediaAlbumId" IS NOT NULL;
-- CreateTable: system_notifications
CREATE TABLE "system_notifications" (
"id" TEXT NOT NULL,
"type" "NotificationType" NOT NULL,
"severity" "NotificationSeverity" NOT NULL DEFAULT 'INFO',
"title" TEXT NOT NULL,
"message" TEXT NOT NULL,
"context" JSONB,
"isRead" BOOLEAN NOT NULL DEFAULT false,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "system_notifications_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "system_notifications_isRead_createdAt_idx" ON "system_notifications"("isRead", "createdAt");
CREATE INDEX "system_notifications_type_idx" ON "system_notifications"("type");


@@ -0,0 +1,3 @@
-- AlterTable: add sourceCaption and replyToMessageId to packages
ALTER TABLE "packages" ADD COLUMN "sourceCaption" TEXT;
ALTER TABLE "packages" ADD COLUMN "replyToMessageId" BIGINT;


@@ -0,0 +1,47 @@
-- AlterTable: add autoGroupEnabled to telegram_channels
ALTER TABLE "telegram_channels" ADD COLUMN "autoGroupEnabled" BOOLEAN NOT NULL DEFAULT true;
-- CreateTable: grouping_rules
CREATE TABLE "grouping_rules" (
"id" TEXT NOT NULL,
"sourceChannelId" TEXT NOT NULL,
"pattern" TEXT NOT NULL,
"signalType" "GroupingSource" NOT NULL,
"confidence" DOUBLE PRECISION NOT NULL DEFAULT 1.0,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"createdByGroupId" TEXT,
CONSTRAINT "grouping_rules_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "grouping_rules_sourceChannelId_idx" ON "grouping_rules"("sourceChannelId");
-- AddForeignKey
ALTER TABLE "grouping_rules" ADD CONSTRAINT "grouping_rules_sourceChannelId_fkey" FOREIGN KEY ("sourceChannelId") REFERENCES "telegram_channels"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- Full-text search: add tsvector column and GIN index
ALTER TABLE "packages" ADD COLUMN IF NOT EXISTS "searchVector" tsvector;
UPDATE "packages" SET "searchVector" = to_tsvector('english',
coalesce("fileName", '') || ' ' || coalesce("creator", '') || ' ' || coalesce("sourceCaption", '')
) WHERE "searchVector" IS NULL;
CREATE INDEX IF NOT EXISTS "packages_search_vector_idx" ON "packages" USING GIN ("searchVector");
-- Trigger to auto-update searchVector on insert/update
CREATE OR REPLACE FUNCTION packages_search_vector_update() RETURNS trigger AS $$
BEGIN
NEW."searchVector" := to_tsvector('english',
coalesce(NEW."fileName", '') || ' ' || coalesce(NEW."creator", '') || ' ' || coalesce(NEW."sourceCaption", '')
);
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
DROP TRIGGER IF EXISTS packages_search_vector_trigger ON "packages";
CREATE TRIGGER packages_search_vector_trigger
BEFORE INSERT OR UPDATE OF "fileName", "creator", "sourceCaption"
ON "packages"
FOR EACH ROW
EXECUTE FUNCTION packages_search_vector_update();


@@ -0,0 +1,30 @@
-- CreateEnum
CREATE TYPE "ManualUploadStatus" AS ENUM ('PENDING', 'PROCESSING', 'COMPLETED', 'FAILED');
-- CreateTable
CREATE TABLE "manual_uploads" (
"id" TEXT NOT NULL,
"status" "ManualUploadStatus" NOT NULL DEFAULT 'PENDING',
"groupName" TEXT,
"userId" TEXT NOT NULL,
"errorMessage" TEXT,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"completedAt" TIMESTAMP(3),
CONSTRAINT "manual_uploads_pkey" PRIMARY KEY ("id")
);
CREATE TABLE "manual_upload_files" (
"id" TEXT NOT NULL,
"uploadId" TEXT NOT NULL,
"fileName" TEXT NOT NULL,
"filePath" TEXT NOT NULL,
"fileSize" BIGINT NOT NULL,
"packageId" TEXT,
CONSTRAINT "manual_upload_files_pkey" PRIMARY KEY ("id")
);
CREATE INDEX "manual_uploads_status_idx" ON "manual_uploads"("status");
CREATE INDEX "manual_upload_files_uploadId_idx" ON "manual_upload_files"("uploadId");
ALTER TABLE "manual_uploads" ADD CONSTRAINT "manual_uploads_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
ALTER TABLE "manual_upload_files" ADD CONSTRAINT "manual_upload_files_uploadId_fkey" FOREIGN KEY ("uploadId") REFERENCES "manual_uploads"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,2 @@
-- AlterTable
ALTER TABLE "telegram_accounts" ADD COLUMN "isPremium" BOOLEAN NOT NULL DEFAULT false;

View File

@@ -38,6 +38,11 @@ model User {
tags Tag[]
settings UserSettings?
telegramLink TelegramLink?
kickstarters Kickstarter[]
inviteCodes InviteCode[] @relation("InviteCreator")
usedInvite InviteCode? @relation("InviteUser", fields: [usedInviteId], references: [id], onDelete: SetNull)
usedInviteId String?
manualUploads ManualUpload[]
}
model Account {
@@ -376,6 +381,8 @@ enum ChannelRole {
enum ArchiveType {
ZIP
RAR
SEVEN_Z
DOCUMENT
}
enum IngestionStatus {
@@ -399,6 +406,7 @@ model TelegramAccount {
isActive Boolean @default(true)
authState AuthState @default(PENDING)
authCode String?
isPremium Boolean @default(false)
lastSeenAt DateTime?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@ -406,6 +414,7 @@ model TelegramAccount {
channelMaps AccountChannelMap[]
ingestionRuns IngestionRun[]
fetchRequests ChannelFetchRequest[]
skippedPackages SkippedPackage[]
@@index([isActive])
@@map("telegram_accounts")
@@ -418,13 +427,20 @@ model TelegramChannel {
type ChannelType
isForum Boolean @default(false)
isActive Boolean @default(false)
category String? @db.VarChar(64)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
autoGroupEnabled Boolean @default(true)
accountMaps AccountChannelMap[]
packages Package[]
skippedPackages SkippedPackage[]
packageGroups PackageGroup[]
groupingRules GroupingRule[]
@@index([type, isActive])
@@index([category])
@@map("telegram_channels")
}
@@ -458,19 +474,27 @@ model Package {
sourceTopicId BigInt?
destChannelId String?
destMessageId BigInt?
destMessageIds BigInt[] @default([])
isMultipart Boolean @default(false)
partCount Int @default(1)
fileCount Int @default(0)
tags String[] @default([])
sourceCaption String? // Caption text from source Telegram message
replyToMessageId BigInt? // reply_to_message_id from source message (for reply chain grouping)
previewData Bytes? // JPEG thumbnail from nearby Telegram photo (stored as raw bytes)
previewMsgId BigInt? // Telegram message ID of the matched photo
packageGroupId String?
indexedAt DateTime @default(now())
createdAt DateTime @default(now())
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id])
packageGroup PackageGroup? @relation(fields: [packageGroupId], references: [id], onDelete: SetNull)
files PackageFile[]
ingestionRun IngestionRun? @relation(fields: [ingestionRunId], references: [id])
ingestionRunId String?
sendRequests BotSendRequest[]
extractRequests ArchiveExtractRequest[]
kickstarterLinks KickstarterPackage[]
@@index([sourceChannelId])
@@index([destChannelId])
@@ -478,6 +502,7 @@ model Package {
@@index([indexedAt])
@@index([archiveType])
@@index([creator])
@@index([packageGroupId])
@@map("packages")
}
@@ -499,6 +524,24 @@ model PackageFile {
@@map("package_files")
}
model PackageGroup {
id String @id @default(cuid())
name String
mediaAlbumId String?
sourceChannelId String
groupingSource GroupingSource @default(MANUAL)
previewData Bytes?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
packages Package[]
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
@@unique([mediaAlbumId, sourceChannelId])
@@index([sourceChannelId])
@@map("package_groups")
}
model IngestionRun {
id String @id @default(cuid())
accountId String
@@ -554,6 +597,22 @@ model GlobalSetting {
@@map("global_settings")
}
model InviteCode {
id String @id @default(cuid())
code String @unique @db.VarChar(32)
maxUses Int @default(1)
uses Int @default(0)
expiresAt DateTime?
createdBy String
createdAt DateTime @default(now())
creator User @relation("InviteCreator", fields: [createdBy], references: [id], onDelete: Cascade)
usedBy User[] @relation("InviteUser")
@@index([code])
@@map("invite_codes")
}
model ChannelFetchRequest {
id String @id @default(cuid())
accountId String
@@ -626,3 +685,222 @@ model BotSubscription {
@@index([telegramUserId])
@@map("bot_subscriptions")
}
// ───────────────────────────────────────
// Archive image extraction (worker-mediated)
// ───────────────────────────────────────
enum ExtractStatus {
PENDING
IN_PROGRESS
COMPLETED
FAILED
}
/// A request for the worker to extract an image from an archive.
/// The web app creates this, sends a pg_notify, and the worker
/// downloads the archive, extracts the file, and writes the result.
model ArchiveExtractRequest {
id String @id @default(cuid())
packageId String
filePath String @db.VarChar(1024) // path within archive to extract
status ExtractStatus @default(PENDING)
imageData Bytes? // extracted image bytes (JPEG/PNG/WebP)
contentType String? @db.VarChar(64) // MIME type of extracted image
error String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
package Package @relation(fields: [packageId], references: [id], onDelete: Cascade)
@@index([packageId, filePath])
@@index([status])
@@map("archive_extract_requests")
}
// ───────────────────────────────────────
// Skipped/Failed Archives
// ───────────────────────────────────────
enum SkipReason {
SIZE_LIMIT
DOWNLOAD_FAILED
EXTRACT_FAILED
UPLOAD_FAILED
}
model SkippedPackage {
id String @id @default(cuid())
fileName String
fileSize BigInt
reason SkipReason
errorMessage String?
sourceChannelId String
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
sourceMessageId BigInt
sourceTopicId BigInt?
isMultipart Boolean @default(false)
partCount Int @default(1)
accountId String
account TelegramAccount @relation(fields: [accountId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
@@unique([sourceChannelId, sourceMessageId])
@@index([reason])
@@index([accountId])
@@map("skipped_packages")
}
// ───────────────────────────────────────
// Purchased Kickstarters
// ───────────────────────────────────────
enum DeliveryStatus {
NOT_DELIVERED
PARTIAL
DELIVERED
}
enum PaymentStatus {
PAID
UNPAID
}
model KickstarterHost {
id String @id @default(cuid())
name String @unique
createdAt DateTime @default(now())
kickstarters Kickstarter[]
@@map("kickstarter_hosts")
}
model Kickstarter {
id String @id @default(cuid())
name String
link String?
filesUrl String?
deliveryStatus DeliveryStatus @default(NOT_DELIVERED)
paymentStatus PaymentStatus @default(UNPAID)
notes String?
hostId String?
userId String
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
host KickstarterHost? @relation(fields: [hostId], references: [id], onDelete: SetNull)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
packages KickstarterPackage[]
@@index([hostId])
@@index([userId])
@@index([deliveryStatus])
@@index([paymentStatus])
@@map("kickstarters")
}
model KickstarterPackage {
kickstarterId String
packageId String
kickstarter Kickstarter @relation(fields: [kickstarterId], references: [id], onDelete: Cascade)
package Package @relation(fields: [packageId], references: [id], onDelete: Cascade)
@@id([kickstarterId, packageId])
@@map("kickstarter_packages")
}
// ── Grouping & Notifications ──
enum GroupingSource {
ALBUM
MANUAL
AUTO_TIME
AUTO_PATTERN
AUTO_REPLY
AUTO_ZIP
AUTO_CAPTION
}
enum NotificationType {
HASH_MISMATCH
MISSING_PART
UPLOAD_FAILED
DOWNLOAD_FAILED
GROUPING_CONFLICT
INTEGRITY_AUDIT
}
enum NotificationSeverity {
INFO
WARNING
ERROR
}
model SystemNotification {
id String @id @default(cuid())
type NotificationType
severity NotificationSeverity @default(INFO)
title String
message String
context Json?
isRead Boolean @default(false)
createdAt DateTime @default(now())
@@index([isRead, createdAt])
@@index([type])
@@map("system_notifications")
}
model GroupingRule {
id String @id @default(cuid())
sourceChannelId String
pattern String // Regex or keyword pattern learned from manual grouping
signalType GroupingSource // Which grouping signal this rule applies to
confidence Float @default(1.0)
createdAt DateTime @default(now())
createdByGroupId String? // The manual group that spawned this rule
sourceChannel TelegramChannel @relation(fields: [sourceChannelId], references: [id], onDelete: Cascade)
@@index([sourceChannelId])
@@map("grouping_rules")
}
enum ManualUploadStatus {
PENDING
PROCESSING
COMPLETED
FAILED
}
model ManualUpload {
id String @id @default(cuid())
status ManualUploadStatus @default(PENDING)
groupName String? // Group name if multiple files
userId String
errorMessage String?
createdAt DateTime @default(now())
completedAt DateTime?
files ManualUploadFile[]
user User @relation(fields: [userId], references: [id])
@@index([status])
@@map("manual_uploads")
}
model ManualUploadFile {
id String @id @default(cuid())
uploadId String
fileName String
filePath String // Path on shared volume
fileSize BigInt
packageId String? // Set after processing
upload ManualUpload @relation(fields: [uploadId], references: [id], onDelete: Cascade)
@@index([uploadId])
@@map("manual_upload_files")
}
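The `ArchiveExtractRequest` doc comment above describes the hand-off: the web app inserts a row, fires `pg_notify`, and the worker writes the result back. A minimal sketch of what a notify payload for that flow could look like — the channel name and payload shape here are assumptions for illustration, not taken from the codebase:

```typescript
// Hypothetical payload sent alongside the INSERT so the worker can react
// immediately instead of polling the archive_extract_requests table.
interface ExtractNotifyPayload {
  requestId: string; // ArchiveExtractRequest.id
  packageId: string; // which archive the worker should download
  filePath: string;  // path within the archive to extract
}

// Serialized form for e.g. pg_notify('archive_extract', payload) -- channel name assumed.
function buildExtractPayload(requestId: string, packageId: string, filePath: string): string {
  const payload: ExtractNotifyPayload = { requestId, packageId, filePath };
  return JSON.stringify(payload);
}
```

Keeping the payload to identifiers (rather than file bytes) matters because `pg_notify` payloads are limited to 8000 bytes by default.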

View File

@@ -0,0 +1,415 @@
"use client";
import { useState, useTransition } from "react";
import { Copy, Link2, Plus, Trash2 } from "lucide-react";
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { Switch } from "@/components/ui/switch";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import { Badge } from "@/components/ui/badge";
import {
AlertDialog,
AlertDialogAction,
AlertDialogCancel,
AlertDialogContent,
AlertDialogDescription,
AlertDialogFooter,
AlertDialogHeader,
AlertDialogTitle,
AlertDialogTrigger,
} from "@/components/ui/alert-dialog";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { createInviteCode, createBulkInviteCodes, deleteInviteCode } from "../actions";
type InviteUser = {
id: string;
name: string | null;
email: string | null;
createdAt: string;
};
type InviteCode = {
id: string;
code: string;
maxUses: number;
uses: number;
expiresAt: string | null;
createdAt: string;
creator: { name: string | null };
usedBy: InviteUser[];
};
export function InviteManager({
inviteCodes,
appUrl,
}: {
inviteCodes: InviteCode[];
appUrl: string;
}) {
const [maxUses, setMaxUses] = useState(1);
const [expiresInDays, setExpiresInDays] = useState(7);
const [noExpiry, setNoExpiry] = useState(false);
const [bulkCount, setBulkCount] = useState(5);
const [isPending, startTransition] = useTransition();
const [copiedId, setCopiedId] = useState<string | null>(null);
const [copiedType, setCopiedType] = useState<"code" | "link" | null>(null);
function handleCreate() {
startTransition(async () => {
await createInviteCode({
maxUses,
expiresInDays: noExpiry ? null : expiresInDays,
});
});
}
function handleBulkCreate() {
startTransition(async () => {
await createBulkInviteCodes({
count: bulkCount,
maxUses,
expiresInDays: noExpiry ? null : expiresInDays,
});
});
}
function handleDelete(id: string) {
startTransition(async () => {
await deleteInviteCode(id);
});
}
function copyToClipboard(text: string, id: string, type: "code" | "link") {
navigator.clipboard.writeText(text);
setCopiedId(id);
setCopiedType(type);
setTimeout(() => {
setCopiedId(null);
setCopiedType(null);
}, 2000);
}
function getStatus(invite: InviteCode): "active" | "used" | "expired" {
if (invite.uses >= invite.maxUses) return "used";
if (invite.expiresAt && new Date(invite.expiresAt) < new Date()) return "expired";
return "active";
}
function formatRelativeDate(dateStr: string) {
const date = new Date(dateStr);
const now = new Date();
const diffMs = date.getTime() - now.getTime();
const diffDays = Math.ceil(diffMs / (1000 * 60 * 60 * 24));
if (diffDays < 0) return "Expired";
if (diffDays === 0) return "Today";
if (diffDays === 1) return "Tomorrow";
return `${diffDays} days`;
}
const activeCount = inviteCodes.filter((i) => getStatus(i) === "active").length;
const usedCount = inviteCodes.filter((i) => getStatus(i) === "used").length;
return (
<div className="max-w-5xl space-y-6">
{/* Create Card */}
<Card>
<CardHeader>
<CardTitle>Generate Invite Codes</CardTitle>
<CardDescription>
Create single or bulk invite codes to share with new users
</CardDescription>
</CardHeader>
<CardContent className="space-y-4">
<div className="flex flex-wrap items-end gap-4">
<div className="space-y-2">
<Label htmlFor="maxUses">Max Uses</Label>
<Input
id="maxUses"
type="number"
min={1}
max={100}
value={maxUses}
onChange={(e) => setMaxUses(Number(e.target.value))}
className="w-24"
/>
</div>
<div className="space-y-2">
<Label htmlFor="expiresInDays">Expires in (days)</Label>
<Input
id="expiresInDays"
type="number"
min={1}
max={365}
value={expiresInDays}
onChange={(e) => setExpiresInDays(Number(e.target.value))}
disabled={noExpiry}
className="w-24"
/>
</div>
<div className="flex items-center gap-2 pb-1">
<Switch
id="noExpiry"
checked={noExpiry}
onCheckedChange={setNoExpiry}
/>
<Label htmlFor="noExpiry" className="text-sm">
No expiry
</Label>
</div>
</div>
<div className="flex flex-wrap items-end gap-3 border-t pt-4">
<Button onClick={handleCreate} disabled={isPending}>
<Plus className="mr-2 h-4 w-4" />
{isPending ? "Creating..." : "Create One"}
</Button>
<div className="flex items-end gap-2">
<div className="space-y-2">
<Label htmlFor="bulkCount">Count</Label>
<Input
id="bulkCount"
type="number"
min={2}
max={25}
value={bulkCount}
onChange={(e) => setBulkCount(Number(e.target.value))}
className="w-20"
/>
</div>
<Button
variant="secondary"
onClick={handleBulkCreate}
disabled={isPending}
>
<Plus className="mr-2 h-4 w-4" />
{isPending ? "Creating..." : `Create ${bulkCount}`}
</Button>
</div>
</div>
</CardContent>
</Card>
{/* Codes Table */}
<Card>
<CardHeader>
<CardTitle>Invite Codes</CardTitle>
<CardDescription>
{inviteCodes.length} total &middot; {activeCount} active &middot; {usedCount} fully used
</CardDescription>
</CardHeader>
<CardContent>
{inviteCodes.length === 0 ? (
<p className="text-sm text-muted-foreground">
No invite codes yet. Create one above.
</p>
) : (
<Table>
<TableHeader>
<TableRow>
<TableHead>Code</TableHead>
<TableHead>Status</TableHead>
<TableHead>Uses</TableHead>
<TableHead>Redeemed By</TableHead>
<TableHead>Expires</TableHead>
<TableHead>Created</TableHead>
<TableHead className="text-right">Actions</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{inviteCodes.map((invite) => {
const status = getStatus(invite);
const isCopiedCode =
copiedId === invite.id && copiedType === "code";
const isCopiedLink =
copiedId === invite.id && copiedType === "link";
return (
<TableRow key={invite.id}>
<TableCell className="font-mono text-sm">
{invite.code}
</TableCell>
<TableCell>
<Badge
variant={
status === "active"
? "default"
: status === "used"
? "secondary"
: "destructive"
}
>
{status}
</Badge>
</TableCell>
<TableCell>
{invite.uses} / {invite.maxUses}
</TableCell>
<TableCell>
{invite.usedBy.length === 0 ? (
<span className="text-muted-foreground">--</span>
) : (
<div className="space-y-0.5">
{invite.usedBy.map((user) => (
<Tooltip key={user.id}>
<TooltipTrigger asChild>
<div className="text-sm cursor-default">
{user.name ?? user.email ?? "Unknown"}
</div>
</TooltipTrigger>
<TooltipContent>
<div className="text-xs">
{user.email && <div>{user.email}</div>}
<div>
Joined{" "}
{new Date(user.createdAt).toLocaleDateString()}
</div>
</div>
</TooltipContent>
</Tooltip>
))}
</div>
)}
</TableCell>
<TableCell>
{invite.expiresAt ? (
<Tooltip>
<TooltipTrigger asChild>
<span className="cursor-default">
{formatRelativeDate(invite.expiresAt)}
</span>
</TooltipTrigger>
<TooltipContent>
{new Date(invite.expiresAt).toLocaleString()}
</TooltipContent>
</Tooltip>
) : (
<span className="text-muted-foreground">Never</span>
)}
</TableCell>
<TableCell>
<Tooltip>
<TooltipTrigger asChild>
<span className="cursor-default">
{new Date(invite.createdAt).toLocaleDateString()}
</span>
</TooltipTrigger>
<TooltipContent>
by {invite.creator.name ?? "Unknown"}
</TooltipContent>
</Tooltip>
</TableCell>
<TableCell className="text-right">
<div className="flex justify-end gap-1">
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="outline"
size="sm"
onClick={() =>
copyToClipboard(
invite.code,
invite.id,
"code"
)
}
>
<Copy className="h-3 w-3" />
{isCopiedCode && (
<span className="ml-1">Copied!</span>
)}
</Button>
</TooltipTrigger>
<TooltipContent>Copy code</TooltipContent>
</Tooltip>
<Tooltip>
<TooltipTrigger asChild>
<Button
variant="outline"
size="sm"
onClick={() =>
copyToClipboard(
`${appUrl}/register?code=${invite.code}`,
invite.id,
"link"
)
}
disabled={status !== "active"}
>
<Link2 className="h-3 w-3" />
{isCopiedLink && (
<span className="ml-1">Copied!</span>
)}
</Button>
</TooltipTrigger>
<TooltipContent>Copy registration link</TooltipContent>
</Tooltip>
<AlertDialog>
<Tooltip>
<TooltipTrigger asChild>
<AlertDialogTrigger asChild>
<Button
variant="destructive"
size="sm"
disabled={isPending}
>
<Trash2 className="h-3 w-3" />
</Button>
</AlertDialogTrigger>
</TooltipTrigger>
<TooltipContent>Delete code</TooltipContent>
</Tooltip>
<AlertDialogContent>
<AlertDialogHeader>
<AlertDialogTitle>
Delete invite code?
</AlertDialogTitle>
<AlertDialogDescription>
This will permanently delete the invite code{" "}
<span className="font-mono font-semibold">
{invite.code}
</span>
.{" "}
{status === "active" &&
"Anyone with this code will no longer be able to register."}
</AlertDialogDescription>
</AlertDialogHeader>
<AlertDialogFooter>
<AlertDialogCancel>Cancel</AlertDialogCancel>
<AlertDialogAction
onClick={() => handleDelete(invite.id)}
>
Delete
</AlertDialogAction>
</AlertDialogFooter>
</AlertDialogContent>
</AlertDialog>
</div>
</TableCell>
</TableRow>
);
})}
</TableBody>
</Table>
)}
</CardContent>
</Card>
</div>
);
}
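The `getStatus` precedence in the component above (fully used beats expired beats active) is pure logic and easy to exercise outside React — a standalone restatement for illustration:

```typescript
type InviteStatus = "active" | "used" | "expired";

interface InviteLike {
  uses: number;
  maxUses: number;
  expiresAt: string | null;
}

// "used" wins over "expired": a fully redeemed code reports as used
// even when its expiry date has also passed.
function getInviteStatus(invite: InviteLike, now: Date = new Date()): InviteStatus {
  if (invite.uses >= invite.maxUses) return "used";
  if (invite.expiresAt && new Date(invite.expiresAt) < now) return "expired";
  return "active";
}
```

Injecting `now` (instead of calling `new Date()` inside, as the component does) makes the precedence testable without clock dependence.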

View File

@@ -0,0 +1,96 @@
"use server";
import crypto from "crypto";
import { auth } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
import type { ActionResult } from "@/types/api.types";
import { revalidatePath } from "next/cache";
export async function createInviteCode(input: {
maxUses: number;
expiresInDays: number | null;
}): Promise<ActionResult<{ code: string }>> {
const session = await auth();
if (!session?.user?.id || session.user.role !== "ADMIN") {
return { success: false, error: "Unauthorized" };
}
const code = crypto.randomBytes(6).toString("hex");
const expiresAt = input.expiresInDays
? new Date(Date.now() + input.expiresInDays * 24 * 60 * 60 * 1000)
: null;
await prisma.inviteCode.create({
data: {
code,
maxUses: input.maxUses,
expiresAt,
createdBy: session.user.id,
},
});
revalidatePath("/invites");
return { success: true, data: { code } };
}
export async function createBulkInviteCodes(input: {
count: number;
maxUses: number;
expiresInDays: number | null;
}): Promise<ActionResult<{ codes: string[] }>> {
const session = await auth();
if (!session?.user?.id || session.user.role !== "ADMIN") {
return { success: false, error: "Unauthorized" };
}
if (input.count < 1 || input.count > 25) {
return { success: false, error: "Can generate between 1 and 25 codes at a time" };
}
const expiresAt = input.expiresInDays
? new Date(Date.now() + input.expiresInDays * 24 * 60 * 60 * 1000)
: null;
const codes: string[] = [];
await prisma.$transaction(async (tx) => {
for (let i = 0; i < input.count; i++) {
const code = crypto.randomBytes(6).toString("hex");
codes.push(code);
await tx.inviteCode.create({
data: {
code,
maxUses: input.maxUses,
expiresAt,
createdBy: session.user.id,
},
});
}
});
revalidatePath("/invites");
return { success: true, data: { codes } };
}
export async function deleteInviteCode(id: string): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id || session.user.role !== "ADMIN") {
return { success: false, error: "Unauthorized" };
}
await prisma.inviteCode.delete({ where: { id } });
revalidatePath("/invites");
return { success: true, data: undefined };
}
export async function getInviteCodes() {
const session = await auth();
if (!session?.user?.id || session.user.role !== "ADMIN") {
throw new Error("Unauthorized");
}
const codes = await prisma.inviteCode.findMany({
orderBy: { createdAt: "desc" },
include: {
creator: { select: { name: true } },
usedBy: { select: { id: true, name: true, email: true, createdAt: true } },
},
});
return codes;
}
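`createInviteCode` derives codes from `crypto.randomBytes(6)`, so every code is 12 lowercase hex characters, and expiry is a whole-day offset from creation time. The two calculations, isolated as a sketch:

```typescript
import { randomBytes } from "crypto";

// 6 random bytes hex-encoded -> 12-character code, as in the server actions above.
function generateInviteCode(): string {
  return randomBytes(6).toString("hex");
}

// A null (or 0) day count means "never expires", mirroring the truthiness
// check on input.expiresInDays in createInviteCode.
function computeExpiry(expiresInDays: number | null, nowMs: number = Date.now()): Date | null {
  return expiresInDays
    ? new Date(nowMs + expiresInDays * 24 * 60 * 60 * 1000)
    : null;
}
```

Note that 6 bytes gives 2^48 possible codes; collisions are unlikely at this scale but the schema's `@unique` on `code` is what actually guarantees uniqueness.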

View File

@@ -0,0 +1,26 @@
import { auth } from "@/lib/auth";
import { redirect } from "next/navigation";
import { PageHeader } from "@/components/shared/page-header";
import { getInviteCodes } from "./actions";
import { InviteManager } from "./_components/invite-manager";
export default async function InvitesPage() {
const session = await auth();
if (!session?.user?.id) redirect("/login");
if (session.user.role !== "ADMIN") redirect("/dashboard");
const inviteCodes = await getInviteCodes();
return (
<div className="space-y-6">
<PageHeader
title="Invite Codes"
description="Manage invite codes for new user registration"
/>
<InviteManager
inviteCodes={JSON.parse(JSON.stringify(inviteCodes))}
appUrl={process.env.NEXT_PUBLIC_APP_URL ?? ""}
/>
</div>
);
}

View File

@@ -0,0 +1,201 @@
"use client";
import { type ColumnDef } from "@tanstack/react-table";
import { MoreHorizontal, Pencil, Trash2, ExternalLink, Link2, Send } from "lucide-react";
import { DataTableColumnHeader } from "@/components/shared/data-table-column-header";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuSeparator,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
export interface KickstarterRow {
id: string;
name: string;
link: string | null;
filesUrl: string | null;
deliveryStatus: "NOT_DELIVERED" | "PARTIAL" | "DELIVERED";
paymentStatus: "PAID" | "UNPAID";
notes: string | null;
hostId: string | null;
userId: string;
createdAt: Date;
updatedAt: Date;
host: { id: string; name: string } | null;
_count: { packages: number };
}
interface KickstarterColumnsProps {
onEdit: (kickstarter: KickstarterRow) => void;
onDelete: (id: string) => void;
onLinkPackages: (kickstarter: KickstarterRow) => void;
onSendAll: (kickstarter: KickstarterRow) => void;
}
const deliveryConfig: Record<string, { label: string; className: string }> = {
NOT_DELIVERED: {
label: "Not Delivered",
className: "bg-red-500/15 text-red-400 border-red-500/30",
},
PARTIAL: {
label: "Partial",
className: "bg-orange-500/15 text-orange-400 border-orange-500/30",
},
DELIVERED: {
label: "Delivered",
className: "bg-emerald-500/15 text-emerald-400 border-emerald-500/30",
},
};
const paymentConfig: Record<string, { label: string; className: string }> = {
PAID: {
label: "Paid",
className: "bg-emerald-500/15 text-emerald-400 border-emerald-500/30",
},
UNPAID: {
label: "Unpaid",
className: "bg-red-500/15 text-red-400 border-red-500/30",
},
};
export function getKickstarterColumns({
onEdit,
onDelete,
onLinkPackages,
onSendAll,
}: KickstarterColumnsProps): ColumnDef<KickstarterRow, unknown>[] {
return [
{
accessorKey: "name",
header: ({ column }) => <DataTableColumnHeader column={column} title="Name" />,
cell: ({ row }) => (
<div className="flex items-center gap-2">
<span className="font-medium">{row.original.name}</span>
{row.original.link && (
<a
href={row.original.link}
target="_blank"
rel="noopener noreferrer"
className="text-primary hover:text-primary/80"
onClick={(e) => e.stopPropagation()}
>
<ExternalLink className="h-3.5 w-3.5" />
</a>
)}
</div>
),
enableHiding: false,
},
{
accessorKey: "host",
header: ({ column }) => <DataTableColumnHeader column={column} title="Host" />,
cell: ({ row }) =>
row.original.host ? (
<span className="text-sm">{row.original.host.name}</span>
) : (
<span className="text-muted-foreground">--</span>
),
},
{
id: "files",
header: "Files",
cell: ({ row }) =>
row.original.filesUrl ? (
<a
href={row.original.filesUrl}
target="_blank"
rel="noopener noreferrer"
className="flex items-center gap-1 text-sm text-primary hover:underline"
onClick={(e) => e.stopPropagation()}
>
<ExternalLink className="h-3 w-3" />
</a>
) : (
<span className="text-muted-foreground">--</span>
),
},
{
accessorKey: "deliveryStatus",
header: ({ column }) => <DataTableColumnHeader column={column} title="Delivery" />,
cell: ({ row }) => {
const config = deliveryConfig[row.original.deliveryStatus];
return (
<Badge variant="outline" className={`text-[10px] font-medium ${config.className}`}>
{config.label}
</Badge>
);
},
},
{
accessorKey: "paymentStatus",
header: ({ column }) => <DataTableColumnHeader column={column} title="Payment" />,
cell: ({ row }) => {
const config = paymentConfig[row.original.paymentStatus];
return (
<Badge variant="outline" className={`text-[10px] font-medium ${config.className}`}>
{config.label}
</Badge>
);
},
},
{
id: "packages",
header: "Packages",
cell: ({ row }) => (
<span className="text-sm text-muted-foreground">
{row.original._count.packages}
</span>
),
},
{
accessorKey: "createdAt",
header: ({ column }) => <DataTableColumnHeader column={column} title="Created" />,
cell: ({ row }) => (
<span className="text-sm text-muted-foreground">
{new Date(row.original.createdAt).toLocaleDateString()}
</span>
),
},
{
id: "actions",
cell: ({ row }) => (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button variant="ghost" size="icon" className="h-8 w-8">
<MoreHorizontal className="h-4 w-4" />
</Button>
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem onClick={() => onEdit(row.original)}>
<Pencil className="mr-2 h-3.5 w-3.5" />
Edit
</DropdownMenuItem>
<DropdownMenuItem onClick={() => onLinkPackages(row.original)}>
<Link2 className="mr-2 h-3.5 w-3.5" />
Link Packages
</DropdownMenuItem>
{row.original._count.packages > 0 && (
<DropdownMenuItem onClick={() => onSendAll(row.original)}>
<Send className="mr-2 h-3.5 w-3.5" />
Send All ({row.original._count.packages})
</DropdownMenuItem>
)}
<DropdownMenuSeparator />
<DropdownMenuItem
onClick={() => onDelete(row.original.id)}
className="text-destructive focus:text-destructive"
>
<Trash2 className="mr-2 h-3.5 w-3.5" />
Delete
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
),
enableHiding: false,
},
];
}

View File

@@ -0,0 +1,301 @@
"use client";
import { useState, useTransition } from "react";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import { toast } from "sonner";
import { Plus } from "lucide-react";
import { kickstarterSchema, type KickstarterInput } from "@/schemas/kickstarter.schema";
import { createKickstarter, updateKickstarter, createHost } from "../actions";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Textarea } from "@/components/ui/textarea";
import {
Form,
FormControl,
FormField,
FormItem,
FormLabel,
FormMessage,
} from "@/components/ui/form";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
interface HostOption {
id: string;
name: string;
_count: { kickstarters: number };
}
interface KickstarterFormProps {
kickstarter?: {
id: string;
name: string;
link: string | null;
filesUrl: string | null;
deliveryStatus: "NOT_DELIVERED" | "PARTIAL" | "DELIVERED";
paymentStatus: "PAID" | "UNPAID";
hostId: string | null;
notes: string | null;
};
hosts: HostOption[];
onSuccess: () => void;
}
export function KickstarterForm({ kickstarter, hosts, onSuccess }: KickstarterFormProps) {
const [isPending, startTransition] = useTransition();
const [hostList, setHostList] = useState(hosts);
const [showNewHost, setShowNewHost] = useState(false);
const [newHostName, setNewHostName] = useState("");
const isEditing = !!kickstarter;
const form = useForm<KickstarterInput>({
resolver: zodResolver(kickstarterSchema),
defaultValues: {
name: kickstarter?.name ?? "",
link: kickstarter?.link ?? "",
filesUrl: kickstarter?.filesUrl ?? "",
deliveryStatus: kickstarter?.deliveryStatus ?? "NOT_DELIVERED",
paymentStatus: kickstarter?.paymentStatus ?? "UNPAID",
hostId: kickstarter?.hostId ?? "",
notes: kickstarter?.notes ?? "",
},
});
function onSubmit(values: KickstarterInput) {
startTransition(async () => {
const result = isEditing
? await updateKickstarter(kickstarter!.id, values)
: await createKickstarter(values);
if (!result.success) {
toast.error(result.error);
return;
}
toast.success(isEditing ? "Kickstarter updated" : "Kickstarter created");
form.reset();
onSuccess();
});
}
function handleAddHost() {
if (!newHostName.trim()) return;
startTransition(async () => {
const result = await createHost({ name: newHostName.trim() });
if (!result.success) {
toast.error(result.error);
return;
}
toast.success(`Host "${result.data!.name}" created`);
setHostList((prev) => [
...prev,
{ id: result.data!.id, name: result.data!.name, _count: { kickstarters: 0 } },
]);
form.setValue("hostId", result.data!.id);
setNewHostName("");
setShowNewHost(false);
});
}
return (
<Form {...form}>
<form onSubmit={form.handleSubmit(onSubmit)} className="space-y-4">
<FormField
control={form.control}
name="name"
render={({ field }) => (
<FormItem>
<FormLabel>Name</FormLabel>
<FormControl>
<Input placeholder="Kickstarter name" {...field} />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="link"
render={({ field }) => (
<FormItem>
<FormLabel>Link</FormLabel>
<FormControl>
<Input placeholder="https://kickstarter.com/..." {...field} />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="filesUrl"
render={({ field }) => (
<FormItem>
<FormLabel>Files URL</FormLabel>
<FormControl>
<Input placeholder="https://drive.google.com/..." {...field} />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<div className="grid grid-cols-2 gap-4">
<FormField
control={form.control}
name="deliveryStatus"
render={({ field }) => (
<FormItem>
<FormLabel>Delivery Status</FormLabel>
<Select onValueChange={field.onChange} defaultValue={field.value}>
<FormControl>
<SelectTrigger>
<SelectValue placeholder="Select status" />
</SelectTrigger>
</FormControl>
<SelectContent>
<SelectItem value="NOT_DELIVERED">Not Delivered</SelectItem>
<SelectItem value="PARTIAL">Partial</SelectItem>
<SelectItem value="DELIVERED">Delivered</SelectItem>
</SelectContent>
</Select>
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="paymentStatus"
render={({ field }) => (
<FormItem>
<FormLabel>Payment Status</FormLabel>
<Select onValueChange={field.onChange} defaultValue={field.value}>
<FormControl>
<SelectTrigger>
<SelectValue placeholder="Select status" />
</SelectTrigger>
</FormControl>
<SelectContent>
<SelectItem value="PAID">Paid</SelectItem>
<SelectItem value="UNPAID">Unpaid</SelectItem>
</SelectContent>
</Select>
<FormMessage />
</FormItem>
)}
/>
</div>
<FormField
control={form.control}
name="hostId"
render={({ field }) => (
<FormItem>
<FormLabel>Host</FormLabel>
{!showNewHost ? (
<div className="flex gap-2">
<Select
onValueChange={(v) => field.onChange(v === "none" ? "" : v)}
defaultValue={field.value || "none"}
>
<FormControl>
<SelectTrigger className="flex-1">
<SelectValue placeholder="Select host (optional)" />
</SelectTrigger>
</FormControl>
<SelectContent>
<SelectItem value="none">No Host</SelectItem>
{hostList.map((host) => (
<SelectItem key={host.id} value={host.id}>
{host.name}
</SelectItem>
))}
</SelectContent>
</Select>
<Button
type="button"
variant="outline"
size="icon"
onClick={() => setShowNewHost(true)}
>
<Plus className="h-4 w-4" />
</Button>
</div>
) : (
<div className="flex gap-2">
<Input
placeholder="New host name"
value={newHostName}
onChange={(e) => setNewHostName(e.target.value)}
onKeyDown={(e) => {
if (e.key === "Enter") {
e.preventDefault();
handleAddHost();
}
if (e.key === "Escape") {
setShowNewHost(false);
setNewHostName("");
}
}}
autoFocus
className="flex-1"
/>
<Button
type="button"
variant="outline"
size="sm"
onClick={handleAddHost}
disabled={isPending || !newHostName.trim()}
>
Add
</Button>
<Button
type="button"
variant="ghost"
size="sm"
onClick={() => {
setShowNewHost(false);
setNewHostName("");
}}
>
Cancel
</Button>
</div>
)}
<FormMessage />
</FormItem>
)}
/>
<FormField
control={form.control}
name="notes"
render={({ field }) => (
<FormItem>
<FormLabel>Notes</FormLabel>
<FormControl>
<Textarea placeholder="Optional notes" rows={3} {...field} />
</FormControl>
<FormMessage />
</FormItem>
)}
/>
<div className="flex justify-end gap-2">
<Button type="submit" disabled={isPending}>
{isPending ? "Saving..." : isEditing ? "Update" : "Create"}
</Button>
</div>
</form>
</Form>
);
}


@@ -0,0 +1,54 @@
"use client";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { KickstarterForm } from "./kickstarter-form";
interface HostOption {
id: string;
name: string;
_count: { kickstarters: number };
}
interface KickstarterModalProps {
open: boolean;
onOpenChange: (open: boolean) => void;
hosts: HostOption[];
kickstarter?: {
id: string;
name: string;
link: string | null;
filesUrl: string | null;
deliveryStatus: "NOT_DELIVERED" | "PARTIAL" | "DELIVERED";
paymentStatus: "PAID" | "UNPAID";
hostId: string | null;
notes: string | null;
};
}
export function KickstarterModal({ open, onOpenChange, hosts, kickstarter }: KickstarterModalProps) {
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-lg">
<DialogHeader>
<DialogTitle>{kickstarter ? "Edit Kickstarter" : "Add Kickstarter"}</DialogTitle>
<DialogDescription>
{kickstarter
? "Update the kickstarter details below."
: "Track a new Kickstarter or crowdfunding campaign."}
</DialogDescription>
</DialogHeader>
<KickstarterForm
kickstarter={kickstarter}
hosts={hosts}
onSuccess={() => onOpenChange(false)}
/>
</DialogContent>
</Dialog>
);
}


@@ -0,0 +1,215 @@
"use client";
import { useState, useCallback, useTransition } from "react";
import { useRouter, usePathname, useSearchParams } from "next/navigation";
import { Plus, Search } from "lucide-react";
import { toast } from "sonner";
import { useDataTable } from "@/hooks/use-data-table";
import { getKickstarterColumns, type KickstarterRow } from "./kickstarter-columns";
import { KickstarterModal } from "./kickstarter-modal";
import { PackageLinkerDialog } from "./package-linker-dialog";
import { deleteKickstarter, sendAllKickstarterPackages } from "../actions";
import { DataTable } from "@/components/shared/data-table";
import { DataTablePagination } from "@/components/shared/data-table-pagination";
import { DataTableViewOptions } from "@/components/shared/data-table-view-options";
import { DeleteDialog } from "@/components/shared/delete-dialog";
import { PageHeader } from "@/components/shared/page-header";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
interface HostOption {
id: string;
name: string;
_count: { kickstarters: number };
}
interface KickstarterTableProps {
data: KickstarterRow[];
pageCount: number;
totalCount: number;
hosts: HostOption[];
}
export function KickstarterTable({
data,
pageCount,
totalCount,
hosts,
}: KickstarterTableProps) {
const router = useRouter();
const pathname = usePathname();
const searchParams = useSearchParams();
const [isPending, startTransition] = useTransition();
const [modalOpen, setModalOpen] = useState(false);
const [editKickstarter, setEditKickstarter] = useState<KickstarterRow | undefined>();
const [deleteId, setDeleteId] = useState<string | null>(null);
const [linkTarget, setLinkTarget] = useState<KickstarterRow | null>(null);
const [searchValue, setSearchValue] = useState(searchParams.get("search") ?? "");
const updateSearch = useCallback(
(value: string) => {
setSearchValue(value);
const params = new URLSearchParams(searchParams.toString());
if (value) {
params.set("search", value);
params.set("page", "1");
} else {
params.delete("search");
}
router.push(`${pathname}?${params.toString()}`, { scroll: false });
},
[router, pathname, searchParams]
);
const updateFilter = useCallback(
(key: string, value: string) => {
const params = new URLSearchParams(searchParams.toString());
if (value && value !== "all") {
params.set(key, value);
params.set("page", "1");
} else {
params.delete(key);
}
router.push(`${pathname}?${params.toString()}`, { scroll: false });
},
[router, pathname, searchParams]
);
const columns = getKickstarterColumns({
onEdit: (kickstarter) => {
setEditKickstarter(kickstarter);
setModalOpen(true);
},
onDelete: (id) => setDeleteId(id),
onLinkPackages: (kickstarter) => setLinkTarget(kickstarter),
onSendAll: (kickstarter) => {
startTransition(async () => {
const result = await sendAllKickstarterPackages(kickstarter.id);
if (result.success) {
toast.success(`Queued ${result.data!.queued} package(s) for delivery`);
} else {
toast.error(result.error);
}
});
},
});
const { table } = useDataTable({ data, columns, pageCount });
const handleDelete = () => {
if (!deleteId) return;
startTransition(async () => {
const result = await deleteKickstarter(deleteId);
if (result.success) {
toast.success("Kickstarter deleted");
setDeleteId(null);
} else {
toast.error(result.error);
}
});
};
const activeDelivery = searchParams.get("delivery") ?? "";
const activePayment = searchParams.get("payment") ?? "";
const activeHost = searchParams.get("host") ?? "";
return (
<div className="space-y-4">
<PageHeader title="Kickstarters" description="Track your crowdfunding campaigns and deliveries">
<Button onClick={() => { setEditKickstarter(undefined); setModalOpen(true); }}>
<Plus className="mr-2 h-4 w-4" />
Add Kickstarter
</Button>
</PageHeader>
<div className="flex flex-wrap items-center gap-2">
<div className="relative flex-1 min-w-[200px] max-w-sm">
<Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" />
<Input
placeholder="Search kickstarters..."
value={searchValue}
onChange={(e) => updateSearch(e.target.value)}
className="pl-9 h-9"
/>
</div>
<Select value={activeDelivery || "all"} onValueChange={(v) => updateFilter("delivery", v)}>
<SelectTrigger className="w-[160px] h-9">
<SelectValue placeholder="All Delivery" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Delivery</SelectItem>
<SelectItem value="NOT_DELIVERED">Not Delivered</SelectItem>
<SelectItem value="PARTIAL">Partial</SelectItem>
<SelectItem value="DELIVERED">Delivered</SelectItem>
</SelectContent>
</Select>
<Select value={activePayment || "all"} onValueChange={(v) => updateFilter("payment", v)}>
<SelectTrigger className="w-[140px] h-9">
<SelectValue placeholder="All Payment" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Payment</SelectItem>
<SelectItem value="PAID">Paid</SelectItem>
<SelectItem value="UNPAID">Unpaid</SelectItem>
</SelectContent>
</Select>
{hosts.length > 0 && (
<Select value={activeHost || "all"} onValueChange={(v) => updateFilter("host", v)}>
<SelectTrigger className="w-[160px] h-9">
<SelectValue placeholder="All Hosts" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Hosts</SelectItem>
{hosts.map((host) => (
<SelectItem key={host.id} value={host.id}>
{host.name}
</SelectItem>
))}
</SelectContent>
</Select>
)}
<DataTableViewOptions table={table} />
</div>
<DataTable table={table} emptyMessage="No kickstarters found. Add your first campaign!" />
<DataTablePagination table={table} totalCount={totalCount} />
<KickstarterModal
open={modalOpen}
onOpenChange={(open) => {
setModalOpen(open);
if (!open) setEditKickstarter(undefined);
}}
hosts={hosts}
kickstarter={editKickstarter}
/>
<DeleteDialog
open={!!deleteId}
onOpenChange={(open) => !open && setDeleteId(null)}
title="Delete Kickstarter"
description="This will permanently delete this kickstarter and unlink any associated packages."
onConfirm={handleDelete}
isLoading={isPending}
/>
{linkTarget && (
<PackageLinkerDialog
open={!!linkTarget}
onOpenChange={(open) => !open && setLinkTarget(null)}
kickstarterId={linkTarget.id}
kickstarterName={linkTarget.name}
/>
)}
</div>
);
}
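The table's `updateFilter` callback encodes the filter convention used throughout this page: picking a concrete value sets the query param and resets pagination to page 1, while "all" (or an empty value) removes the param entirely. The URL logic can be isolated as a pure helper; this is a sketch under the same conventions, and the `withFilter` name is hypothetical, not from the source:

```typescript
// Pure sketch of the filter-to-query-string rule: a concrete value
// sets the param and resets pagination; "all" or empty removes it.
function withFilter(search: string, key: string, value: string): string {
  const params = new URLSearchParams(search);
  if (value && value !== "all") {
    params.set(key, value);
    params.set("page", "1");
  } else {
    params.delete(key);
  }
  return params.toString();
}
```

A helper like this keeps the reset-to-page-1 rule in one place and makes it unit-testable without a router.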


@@ -0,0 +1,211 @@
"use client";
import { useState, useTransition, useCallback, useEffect } from "react";
import { Search, Package, X, Loader2 } from "lucide-react";
import { toast } from "sonner";
import { linkPackages } from "../actions";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Badge } from "@/components/ui/badge";
import { Checkbox } from "@/components/ui/checkbox";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
interface PackageResult {
id: string;
fileName: string;
fileSize: string;
archiveType: string;
creator: string | null;
fileCount: number;
}
interface PackageLinkerDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
kickstarterId: string;
kickstarterName: string;
}
function formatSize(bytes: string | number): string {
const b = Number(bytes);
if (b >= 1024 * 1024 * 1024) return `${(b / (1024 * 1024 * 1024)).toFixed(1)} GB`;
if (b >= 1024 * 1024) return `${(b / (1024 * 1024)).toFixed(0)} MB`;
return `${(b / 1024).toFixed(0)} KB`;
}
export function PackageLinkerDialog({
open,
onOpenChange,
kickstarterId,
kickstarterName,
}: PackageLinkerDialogProps) {
const [isPending, startTransition] = useTransition();
const [searchQuery, setSearchQuery] = useState("");
const [searchResults, setSearchResults] = useState<PackageResult[]>([]);
const [isSearching, setIsSearching] = useState(false);
const [selectedIds, setSelectedIds] = useState<Set<string>>(new Set());
// Fetch currently linked packages when dialog opens
useEffect(() => {
if (open) {
setSearchQuery("");
setSearchResults([]);
fetch(`/api/packages/linked?kickstarterId=${kickstarterId}`)
.then((res) => res.json())
.then((data) => {
if (data.packageIds) {
setSelectedIds(new Set(data.packageIds));
}
})
.catch(() => {}); // Ignore fetch errors; the selection simply starts empty
}
}, [open, kickstarterId]);
const doSearch = useCallback(async (query: string) => {
if (query.length < 2) {
setSearchResults([]);
return;
}
setIsSearching(true);
try {
const res = await fetch(`/api/packages/search?q=${encodeURIComponent(query)}&limit=20`);
if (res.ok) {
const data = await res.json();
setSearchResults(data.packages ?? []);
}
} catch {
// Ignore search errors
} finally {
setIsSearching(false);
}
}, []);
// Debounced search
useEffect(() => {
const timer = setTimeout(() => doSearch(searchQuery), 300);
return () => clearTimeout(timer);
}, [searchQuery, doSearch]);
function togglePackage(id: string) {
setSelectedIds((prev) => {
const next = new Set(prev);
if (next.has(id)) next.delete(id);
else next.add(id);
return next;
});
}
function handleSave() {
startTransition(async () => {
const result = await linkPackages(kickstarterId, Array.from(selectedIds));
if (result.success) {
toast.success(`Linked ${selectedIds.size} package(s) to "${kickstarterName}"`);
onOpenChange(false);
} else {
toast.error(result.error);
}
});
}
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-lg">
<DialogHeader>
<DialogTitle>Link Packages</DialogTitle>
<DialogDescription>
Search and select STL packages to link to &ldquo;{kickstarterName}&rdquo;.
</DialogDescription>
</DialogHeader>
<div className="space-y-3">
{selectedIds.size > 0 && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Package className="h-4 w-4" />
{selectedIds.size} package(s) selected
<Button
variant="ghost"
size="sm"
className="h-6 px-2 text-xs"
onClick={() => setSelectedIds(new Set())}
>
Clear all
</Button>
</div>
)}
<div className="relative">
<Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" />
<Input
placeholder="Search packages by name or creator..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="pl-9"
autoFocus
/>
{isSearching && (
<Loader2 className="absolute right-2.5 top-2.5 h-4 w-4 animate-spin text-muted-foreground" />
)}
</div>
<ScrollArea className="h-[300px] rounded-md border">
<div className="p-2 space-y-1">
{searchResults.length === 0 && searchQuery.length >= 2 && !isSearching && (
<p className="text-sm text-muted-foreground text-center py-8">
No packages found
</p>
)}
{searchQuery.length < 2 && (
<p className="text-sm text-muted-foreground text-center py-8">
Type at least 2 characters to search
</p>
)}
{searchResults.map((pkg) => (
<label
key={pkg.id}
className="flex items-center gap-3 p-2 rounded-md hover:bg-muted/50 cursor-pointer"
>
<Checkbox
checked={selectedIds.has(pkg.id)}
onCheckedChange={() => togglePackage(pkg.id)}
/>
<div className="flex-1 min-w-0">
<p className="text-sm font-medium truncate">{pkg.fileName}</p>
<div className="flex items-center gap-2 text-xs text-muted-foreground">
{pkg.creator && <span>{pkg.creator}</span>}
<span>{formatSize(pkg.fileSize)}</span>
<Badge variant="outline" className="text-[10px] h-4 px-1">
{pkg.archiveType}
</Badge>
{pkg.fileCount > 0 && <span>{pkg.fileCount} files</span>}
</div>
</div>
{selectedIds.has(pkg.id) && (
<X className="h-3.5 w-3.5 text-muted-foreground shrink-0" />
)}
</label>
))}
</div>
</ScrollArea>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button onClick={handleSave} disabled={isPending}>
{isPending ? <Loader2 className="h-4 w-4 animate-spin mr-1" /> : null}
Save ({selectedIds.size})
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
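`togglePackage` above follows the copy-on-write Set update pattern React state requires: copy the Set, mutate the copy, return it, so the state reference changes and a re-render fires. The same logic as a standalone pure function (the `toggleId` name is illustrative):

```typescript
// Copy-on-write Set toggle, mirroring the dialog's togglePackage:
// the input Set is never mutated, only the fresh copy is.
function toggleId(ids: ReadonlySet<string>, id: string): Set<string> {
  const next = new Set(ids);
  if (next.has(id)) next.delete(id);
  else next.add(id);
  return next;
}
```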


@@ -0,0 +1,228 @@
"use server";
import { auth } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
import { kickstarterSchema, kickstarterHostSchema } from "@/schemas/kickstarter.schema";
import { revalidatePath } from "next/cache";
import type { ActionResult } from "@/types/api.types";
const REVALIDATE_PATH = "/kickstarters";
export async function createKickstarter(
input: unknown
): Promise<ActionResult<{ id: string }>> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const parsed = kickstarterSchema.safeParse(input);
if (!parsed.success) return { success: false, error: "Validation failed" };
try {
const ks = await prisma.kickstarter.create({
data: {
name: parsed.data.name,
link: parsed.data.link || null,
filesUrl: parsed.data.filesUrl || null,
deliveryStatus: parsed.data.deliveryStatus,
paymentStatus: parsed.data.paymentStatus,
hostId: parsed.data.hostId || null,
notes: parsed.data.notes || null,
userId: session.user.id,
},
});
revalidatePath(REVALIDATE_PATH);
return { success: true, data: { id: ks.id } };
} catch {
return { success: false, error: "Failed to create kickstarter" };
}
}
export async function updateKickstarter(
id: string,
input: unknown
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const parsed = kickstarterSchema.safeParse(input);
if (!parsed.success) return { success: false, error: "Validation failed" };
const existing = await prisma.kickstarter.findFirst({
where: { id, userId: session.user.id },
});
if (!existing) return { success: false, error: "Not found" };
try {
await prisma.kickstarter.update({
where: { id },
data: {
name: parsed.data.name,
link: parsed.data.link || null,
filesUrl: parsed.data.filesUrl || null,
deliveryStatus: parsed.data.deliveryStatus,
paymentStatus: parsed.data.paymentStatus,
hostId: parsed.data.hostId || null,
notes: parsed.data.notes || null,
},
});
revalidatePath(REVALIDATE_PATH);
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to update kickstarter" };
}
}
export async function deleteKickstarter(id: string): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const existing = await prisma.kickstarter.findFirst({
where: { id, userId: session.user.id },
});
if (!existing) return { success: false, error: "Not found" };
try {
await prisma.kickstarter.delete({ where: { id } });
revalidatePath(REVALIDATE_PATH);
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to delete kickstarter" };
}
}
export async function createHost(
input: unknown
): Promise<ActionResult<{ id: string; name: string }>> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const parsed = kickstarterHostSchema.safeParse(input);
if (!parsed.success) return { success: false, error: "Validation failed" };
try {
const host = await prisma.kickstarterHost.create({
data: { name: parsed.data.name },
});
revalidatePath(REVALIDATE_PATH);
return { success: true, data: { id: host.id, name: host.name } };
} catch (err: unknown) {
// Prisma reports unique-constraint violations (error code P2002) with
// "Unique constraint" in the message; surface those as a friendly error.
if (
err instanceof Error &&
err.message.includes("Unique constraint")
) {
return { success: false, error: "A host with that name already exists" };
}
return { success: false, error: "Failed to create host" };
}
}
export async function linkPackages(
kickstarterId: string,
packageIds: string[]
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const existing = await prisma.kickstarter.findFirst({
where: { id: kickstarterId, userId: session.user.id },
});
if (!existing) return { success: false, error: "Not found" };
try {
// Replace all linked packages
await prisma.$transaction([
prisma.kickstarterPackage.deleteMany({
where: { kickstarterId },
}),
...packageIds.map((packageId) =>
prisma.kickstarterPackage.create({
data: { kickstarterId, packageId },
})
),
]);
revalidatePath(REVALIDATE_PATH);
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to link packages" };
}
}
export async function sendAllKickstarterPackages(
kickstarterId: string
): Promise<ActionResult<{ queued: number }>> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const telegramLink = await prisma.telegramLink.findUnique({
where: { userId: session.user.id },
});
if (!telegramLink) {
return { success: false, error: "No linked Telegram account. Link one in Settings." };
}
const kickstarter = await prisma.kickstarter.findFirst({
where: { id: kickstarterId, userId: session.user.id },
select: {
packages: {
select: {
package: {
select: { id: true, destChannelId: true, destMessageId: true, fileName: true },
},
},
},
},
});
if (!kickstarter) {
return { success: false, error: "Kickstarter not found" };
}
const sendablePackages = kickstarter.packages
.map((lnk) => lnk.package)
.filter((p) => p.destChannelId && p.destMessageId);
if (sendablePackages.length === 0) {
return { success: false, error: "No linked packages are available for sending" };
}
let queued = 0;
for (const pkg of sendablePackages) {
const existing = await prisma.botSendRequest.findFirst({
where: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
status: { in: ["PENDING", "SENDING"] },
},
});
if (!existing) {
const sendRequest = await prisma.botSendRequest.create({
data: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
requestedByUserId: session.user.id,
status: "PENDING",
},
});
try {
await prisma.$queryRawUnsafe(
`SELECT pg_notify('bot_send', $1)`,
sendRequest.id
);
} catch {
// Best-effort
}
queued++;
}
}
revalidatePath(REVALIDATE_PATH);
return { success: true, data: { queued } };
} catch {
return { success: false, error: "Failed to send packages" };
}
}
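`sendAllKickstarterPackages` skips any package that already has a request in PENDING or SENDING state, so re-clicking "Send all" never double-queues a delivery. The dedup rule, extracted as a pure function; the types and names here are illustrative, not the actual Prisma models:

```typescript
// Illustrative shape of a send request; SENT/FAILED are assumed
// terminal states that should not block re-queueing.
type SendStatus = "PENDING" | "SENDING" | "SENT" | "FAILED";
interface ExistingRequest {
  packageId: string;
  status: SendStatus;
}

// Only queue packages with no active (PENDING/SENDING) request open.
function packagesToQueue(
  packageIds: string[],
  existing: ExistingRequest[]
): string[] {
  const active = new Set(
    existing
      .filter((r) => r.status === "PENDING" || r.status === "SENDING")
      .map((r) => r.packageId)
  );
  return packageIds.filter((id) => !active.has(id));
}
```

Keeping the rule pure makes it easy to verify that a FAILED request allows a retry while an in-flight one does not.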


@@ -0,0 +1,29 @@
import { auth } from "@/lib/auth";
import { redirect } from "next/navigation";
import { getKickstarters, getKickstarterHosts } from "@/data/kickstarter.queries";
import type { DataTableSearchParams } from "@/types/table.types";
import { KickstarterTable } from "./_components/kickstarter-table";
interface Props {
searchParams: Promise<DataTableSearchParams & { delivery?: string; payment?: string; host?: string }>;
}
export default async function KickstartersPage({ searchParams }: Props) {
const session = await auth();
if (!session?.user?.id) redirect("/login");
const params = await searchParams;
const [{ data, pageCount, totalCount }, hosts] = await Promise.all([
getKickstarters(session.user.id, params),
getKickstarterHosts(),
]);
return (
<KickstarterTable
data={data}
pageCount={pageCount}
totalCount={totalCount}
hosts={hosts}
/>
);
}


@@ -0,0 +1,399 @@
"use client";
import { useEffect, useState, useCallback, useRef, useTransition } from "react";
import {
Image as ImageIcon,
Loader2,
Check,
AlertCircle,
ImageOff,
} from "lucide-react";
import {
Dialog,
DialogContent,
DialogHeader,
DialogTitle,
DialogDescription,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import { Button } from "@/components/ui/button";
import { cn } from "@/lib/utils";
import { toast } from "sonner";
import { setPreviewFromExtract } from "../actions";
interface ArchiveImage {
id: string;
path: string;
fileName: string;
extension: string | null;
size: string;
}
interface ThumbnailState {
status: "idle" | "loading" | "loaded" | "failed";
requestId?: string;
imageUrl?: string;
error?: string;
}
interface ArchivePreviewPickerProps {
packageId: string;
packageName: string;
open: boolean;
onOpenChange: (open: boolean) => void;
onPreviewSet?: () => void;
}
function formatBytes(bytesStr: string): string {
const bytes = Number(bytesStr);
if (bytes === 0) return "0 B";
const k = 1024;
const sizes = ["B", "KB", "MB", "GB"];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${sizes[i]}`;
}
export function ArchivePreviewPicker({
packageId,
packageName,
open,
onOpenChange,
onPreviewSet,
}: ArchivePreviewPickerProps) {
const [images, setImages] = useState<ArchiveImage[]>([]);
const [loading, setLoading] = useState(false);
const [thumbnails, setThumbnails] = useState<Map<string, ThumbnailState>>(new Map());
const [selectedPath, setSelectedPath] = useState<string | null>(null);
const [isPending, startTransition] = useTransition();
const pollTimers = useRef<Map<string, ReturnType<typeof setInterval>>>(new Map());
// Track which paths have already been requested to avoid re-requesting
const requestedPaths = useRef<Set<string>>(new Set());
// Cleanup poll timers on unmount
useEffect(() => {
return () => {
for (const timer of pollTimers.current.values()) {
clearInterval(timer);
}
};
}, []);
// Fetch image list when opened
useEffect(() => {
if (!open) return;
setImages([]);
setThumbnails(new Map());
setSelectedPath(null);
requestedPaths.current.clear();
// Clear any leftover poll timers
for (const timer of pollTimers.current.values()) {
clearInterval(timer);
}
pollTimers.current.clear();
const fetchImages = async () => {
setLoading(true);
try {
const res = await fetch(`/api/zips/${packageId}/images`);
if (!res.ok) throw new Error("Failed to fetch images");
const data = await res.json();
setImages(data.images);
} catch {
toast.error("Failed to load archive images");
} finally {
setLoading(false);
}
};
fetchImages();
}, [open, packageId]);
// Poll callback for a specific request
const startPolling = useCallback(
(filePath: string, requestId: string) => {
// Clear any existing poll for this path
const existing = pollTimers.current.get(filePath);
if (existing) clearInterval(existing);
const pollId = setInterval(async () => {
try {
const pollRes = await fetch(
`/api/zips/${packageId}/extract/${requestId}`
);
if (!pollRes.ok) return;
const pollData = await pollRes.json();
if (pollData.status === "COMPLETED") {
clearInterval(pollId);
pollTimers.current.delete(filePath);
setThumbnails((prev) => {
const next = new Map(prev);
next.set(filePath, {
status: "loaded",
requestId,
imageUrl: `/api/zips/${packageId}/extract/${requestId}?image=true`,
});
return next;
});
} else if (pollData.status === "FAILED") {
clearInterval(pollId);
pollTimers.current.delete(filePath);
setThumbnails((prev) => {
const next = new Map(prev);
next.set(filePath, {
status: "failed",
error: pollData.error || "Extraction failed",
});
return next;
});
}
} catch {
// Silently retry on network error
}
}, 2000);
pollTimers.current.set(filePath, pollId);
},
[packageId]
);
// Request extraction for a specific image
const requestThumbnail = useCallback(
async (filePath: string) => {
// Don't re-request if already in progress
if (requestedPaths.current.has(filePath)) return;
requestedPaths.current.add(filePath);
setThumbnails((prev) => {
const next = new Map(prev);
next.set(filePath, { status: "loading" });
return next;
});
try {
const res = await fetch(`/api/zips/${packageId}/extract`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ filePath }),
});
if (!res.ok) {
const err = await res.json();
throw new Error(err.error || "Extract failed");
}
const data = await res.json();
if (data.status === "COMPLETED") {
setThumbnails((prev) => {
const next = new Map(prev);
next.set(filePath, {
status: "loaded",
requestId: data.requestId,
imageUrl: `/api/zips/${packageId}/extract/${data.requestId}?image=true`,
});
return next;
});
return;
}
// Pending or in-progress: start polling
setThumbnails((prev) => {
const next = new Map(prev);
next.set(filePath, { status: "loading", requestId: data.requestId });
return next;
});
startPolling(filePath, data.requestId);
} catch (err) {
requestedPaths.current.delete(filePath);
setThumbnails((prev) => {
const next = new Map(prev);
next.set(filePath, {
status: "failed",
error: err instanceof Error ? err.message : "Failed to extract",
});
return next;
});
}
},
[packageId, startPolling]
);
// Auto-request thumbnails for the first batch of images
useEffect(() => {
if (!open || images.length === 0) return;
// Request the first 12 images automatically
const toRequest = images.slice(0, 12);
for (const img of toRequest) {
requestThumbnail(img.path);
}
// Only trigger when images list changes, not on every requestThumbnail change
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [images, open]);
// Handle selection confirmation
const handleConfirm = () => {
if (!selectedPath) return;
const thumbState = thumbnails.get(selectedPath);
if (!thumbState?.requestId) return;
startTransition(async () => {
const result = await setPreviewFromExtract(packageId, thumbState.requestId!);
if (result.success) {
toast.success("Preview updated from archive image");
onOpenChange(false);
onPreviewSet?.();
} else {
toast.error(result.error);
}
});
};
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-2xl max-h-[80vh] flex flex-col gap-0 p-0">
<DialogHeader className="px-6 pt-6 pb-4 border-b border-border space-y-1">
<DialogTitle>Select Preview Image</DialogTitle>
<DialogDescription className="text-sm">
Choose an image from the archive to use as the preview for{" "}
<span className="font-medium text-foreground">{packageName}</span>
</DialogDescription>
</DialogHeader>
<ScrollArea className="flex-1 min-h-0">
<div className="p-4">
{loading ? (
<div className="flex flex-col items-center justify-center gap-2 py-12">
<Loader2 className="h-5 w-5 animate-spin text-muted-foreground" />
<span className="text-sm text-muted-foreground">
Loading image list...
</span>
</div>
) : images.length === 0 ? (
<div className="flex flex-col items-center justify-center gap-2 py-12">
<ImageOff className="h-6 w-6 text-muted-foreground/50" />
<span className="text-sm text-muted-foreground">
No images found in this archive
</span>
</div>
) : (
<div className="grid grid-cols-3 sm:grid-cols-4 gap-3">
{images.map((img) => {
const thumbState = thumbnails.get(img.path);
const isSelected = selectedPath === img.path;
const isLoaded = thumbState?.status === "loaded";
const isLoading = thumbState?.status === "loading";
const isFailed = thumbState?.status === "failed";
return (
<button
key={img.id}
type="button"
className={cn(
"relative aspect-square rounded-lg overflow-hidden border-2 transition-all",
"hover:border-primary/50 cursor-pointer group",
isSelected
? "border-primary ring-2 ring-primary/30"
: "border-border",
isFailed && "opacity-60"
)}
onClick={() => {
if (isLoaded) {
setSelectedPath(img.path);
} else if (isFailed) {
// Allow retry on failed
requestedPaths.current.delete(img.path);
requestThumbnail(img.path);
} else if (!thumbState || thumbState.status === "idle") {
requestThumbnail(img.path);
}
}}
title={img.path}
>
{isLoaded && thumbState.imageUrl ? (
<img
src={thumbState.imageUrl}
alt={img.fileName}
className="h-full w-full object-cover"
loading="lazy"
/>
) : isLoading ? (
<div className="h-full w-full flex items-center justify-center bg-muted">
<Loader2 className="h-5 w-5 animate-spin text-muted-foreground" />
</div>
) : isFailed ? (
<div className="h-full w-full flex flex-col items-center justify-center bg-muted gap-1">
<AlertCircle className="h-4 w-4 text-destructive" />
<span className="text-[10px] text-destructive px-1 text-center">
Click to retry
</span>
</div>
) : (
<div className="h-full w-full flex items-center justify-center bg-muted">
<ImageIcon className="h-5 w-5 text-muted-foreground" />
</div>
)}
{/* Selection checkmark */}
{isSelected && (
<div className="absolute top-1.5 right-1.5 h-5 w-5 rounded-full bg-primary flex items-center justify-center">
<Check className="h-3 w-3 text-primary-foreground" />
</div>
)}
{/* File info overlay */}
<div className="absolute bottom-0 left-0 right-0 bg-black/60 px-1.5 py-1 opacity-0 group-hover:opacity-100 transition-opacity">
<p className="text-[10px] text-white truncate">
{img.fileName}
</p>
<p className="text-[9px] text-white/70">
{formatBytes(img.size)}
</p>
</div>
</button>
);
})}
</div>
)}
</div>
</ScrollArea>
{/* Footer */}
{images.length > 0 && (
<div className="px-6 py-4 border-t border-border flex items-center justify-between">
<span className="text-sm text-muted-foreground">
{images.length} image{images.length !== 1 ? "s" : ""} found
</span>
<div className="flex gap-2">
<Button
variant="outline"
size="sm"
onClick={() => onOpenChange(false)}
>
Cancel
</Button>
<Button
size="sm"
disabled={!selectedPath || isPending}
onClick={handleConfirm}
>
{isPending ? (
<>
<Loader2 className="h-3.5 w-3.5 animate-spin mr-1" />
Setting...
</>
) : (
"Use as Preview"
)}
</Button>
</div>
</div>
)}
</DialogContent>
</Dialog>
);
}
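The picker's poll loop maps each status response onto the thumbnail state machine: loading stays loading until the extraction reports COMPLETED (terminal, loaded) or FAILED (terminal, failed). That transition can be written as a pure function over the same states; this is a sketch where `extractUrl` stands in for the real `/api/zips/.../extract/...` URL builder:

```typescript
type ThumbStatus = "idle" | "loading" | "loaded" | "failed";
interface ThumbState {
  status: ThumbStatus;
  requestId?: string;
  imageUrl?: string;
  error?: string;
}

// Pure transition: poll response -> next thumbnail state.
// COMPLETED and FAILED are terminal; anything else keeps polling.
function nextThumbState(
  current: ThumbState,
  requestId: string,
  poll: { status: string; error?: string },
  extractUrl: (requestId: string) => string
): ThumbState {
  if (poll.status === "COMPLETED") {
    return { status: "loaded", requestId, imageUrl: extractUrl(requestId) };
  }
  if (poll.status === "FAILED") {
    return { status: "failed", error: poll.error ?? "Extraction failed" };
  }
  return { ...current, status: "loading", requestId };
}
```

Separating the transition from the `setInterval` plumbing makes the terminal-state handling testable without timers or fetch mocks.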


@@ -1,10 +1,11 @@
"use client"; "use client";
import { type ColumnDef } from "@tanstack/react-table"; import { type ColumnDef } from "@tanstack/react-table";
import { FileArchive, Eye, ImageIcon } from "lucide-react"; import { FileArchive, Eye, ChevronRight, Layers, Ungroup, Send, ImagePlus, GitMerge } from "lucide-react";
import { DataTableColumnHeader } from "@/components/shared/data-table-column-header"; import { DataTableColumnHeader } from "@/components/shared/data-table-column-header";
import { Badge } from "@/components/ui/badge"; import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button"; import { Button } from "@/components/ui/button";
import { Checkbox } from "@/components/ui/checkbox";
import { SendToTelegramButton } from "./send-to-telegram-button";
export interface PackageRow {
@@ -12,23 +13,68 @@ export interface PackageRow {
fileName: string;
fileSize: string;
contentHash: string;
archiveType: "ZIP" | "RAR" | "SEVEN_Z" | "DOCUMENT";
fileCount: number;
isMultipart: boolean;
hasPreview: boolean;
creator: string | null;
tags: string[];
indexedAt: string;
sourceChannel: {
id: string;
title: string;
};
matchedFileCount: number;
matchedByContent: boolean;
packageGroupId?: string | null;
}
export interface GroupHeaderRow {
_rowType: "group";
id: string;
name: string;
hasPreview: boolean;
totalFileSize: string;
totalFileCount: number;
packageCount: number;
combinedTags: string[];
archiveTypes: ("ZIP" | "RAR" | "SEVEN_Z" | "DOCUMENT")[];
latestIndexedAt: string;
sourceChannel: { id: string; title: string };
_expanded: boolean;
}
export interface PackageTableRow extends PackageRow {
_rowType: "package";
_groupId: string | null;
_isGroupMember: boolean;
}
export type StlTableRow = GroupHeaderRow | PackageTableRow;
function isGroupRow(row: StlTableRow): row is GroupHeaderRow {
return row._rowType === "group";
}
interface PackageColumnsProps {
onViewFiles: (pkg: PackageRow) => void;
onSetCreator: (pkg: PackageRow) => void;
onSetTags: (pkg: PackageRow) => void;
searchTerm: string;
onToggleGroup: (groupId: string) => void;
onRenameGroup: (groupId: string, currentName: string) => void;
onDissolveGroup: (groupId: string) => void;
onSendAllInGroup: (groupId: string) => void;
onRemoveFromGroup: (packageId: string) => void;
onGroupPreviewUpload: (groupId: string) => void;
selectedPackages: Set<string>;
onToggleSelect: (packageId: string) => void;
mergeSourceId: string | null;
onStartMerge: (groupId: string) => void;
onCompleteMerge: (targetGroupId: string) => void;
}
export function formatBytes(bytesStr: string): string {
const bytes = Number(bytesStr);
if (bytes === 0) return "0 B";
const k = 1024;
@@ -55,68 +101,274 @@ function PreviewCell({ pkg }: { pkg: PackageRow }) {
);
}
function GroupPreviewCell({
group,
onUpload,
}: {
group: GroupHeaderRow;
onUpload: (groupId: string) => void;
}) {
if (group.hasPreview) {
return (
<button
className="relative group/preview cursor-pointer"
onClick={() => onUpload(group.id)}
title="Click to change preview image"
>
<img
src={`/api/groups/${group.id}/preview`}
alt=""
className="h-9 w-9 rounded-md object-cover bg-muted"
loading="lazy"
/>
<div className="absolute inset-0 flex items-center justify-center rounded-md bg-black/50 opacity-0 group-hover/preview:opacity-100 transition-opacity">
<ImagePlus className="h-3.5 w-3.5 text-white" />
</div>
</button>
);
}
return (
<button
className="flex h-9 w-9 items-center justify-center rounded-md bg-muted hover:bg-muted/80 transition-colors cursor-pointer"
onClick={() => onUpload(group.id)}
title="Click to add preview image"
>
<Layers className="h-4 w-4 text-muted-foreground" />
</button>
);
}
export function getPackageColumns({
onViewFiles,
onSetCreator,
onSetTags,
searchTerm,
onToggleGroup,
onRenameGroup,
onDissolveGroup,
onSendAllInGroup,
onRemoveFromGroup,
onGroupPreviewUpload,
selectedPackages,
onToggleSelect,
mergeSourceId,
onStartMerge,
onCompleteMerge,
}: PackageColumnsProps): ColumnDef<StlTableRow, unknown>[] {
return [
{
id: "select",
header: "",
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) return null;
return (
<Checkbox
checked={selectedPackages.has(data.id)}
onCheckedChange={() => onToggleSelect(data.id)}
aria-label="Select package"
className="translate-y-[2px]"
/>
);
},
enableHiding: false,
enableSorting: false,
size: 32,
},
{
id: "preview",
header: "",
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
return (
<div className="flex items-center gap-1">
<button
className="shrink-0 p-0.5 cursor-pointer"
onClick={() => onToggleGroup(data.id)}
aria-label={data._expanded ? "Collapse group" : "Expand group"}
>
<ChevronRight
className={`h-4 w-4 text-muted-foreground transition-transform ${
data._expanded ? "rotate-90" : ""
}`}
/>
</button>
<GroupPreviewCell group={data} onUpload={onGroupPreviewUpload} />
</div>
);
}
return (
<div className={data._isGroupMember ? "pl-5" : ""}>
<PreviewCell pkg={data} />
</div>
);
},
enableHiding: false,
enableSorting: false,
size: 72,
},
{
accessorKey: "fileName",
header: ({ column }) => <DataTableColumnHeader column={column} title="File Name" />,
cell: ({ row }) => {
<div className="flex items-center gap-2 min-w-0"> const data = row.original;
<span className="font-medium truncate max-w-[300px]">{row.original.fileName}</span> if (isGroupRow(data)) {
{row.original.isMultipart && ( return (
<div className="min-w-0">
<div className="flex items-center gap-2">
<button
className="font-semibold truncate max-w-[300px] cursor-pointer hover:underline text-left"
onClick={() => onRenameGroup(data.id, data.name)}
title="Click to rename group"
>
{data.name}
</button>
<Badge variant="secondary" className="text-[10px] shrink-0">
{data.packageCount} pkg{data.packageCount !== 1 ? "s" : ""}
</Badge>
</div>
</div>
);
}
return (
<div className="min-w-0">
<div className="flex items-center gap-2">
<span className="font-medium truncate max-w-[300px]">{data.fileName}</span>
{data.isMultipart && (
<Badge variant="outline" className="text-[10px] shrink-0"> <Badge variant="outline" className="text-[10px] shrink-0">
Multi Multi
</Badge> </Badge>
)} )}
</div> </div>
), {searchTerm && data.matchedByContent && (
<button
className="text-[11px] text-amber-500 hover:text-amber-400 hover:underline cursor-pointer mt-0.5"
onClick={() => onViewFiles(data)}
>
{data.matchedFileCount.toLocaleString()} file match{data.matchedFileCount !== 1 ? "es" : ""}
</button>
)}
</div>
);
},
enableHiding: false,
},
{
accessorKey: "archiveType",
header: ({ column }) => <DataTableColumnHeader column={column} title="Type" />,
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
const types = data.archiveTypes;
if (types.length === 1) {
return (
<Badge variant="secondary" className="text-[10px]"> <Badge variant="secondary" className="text-[10px]">
{row.original.archiveType} {types[0]}
</Badge> </Badge>
), );
}
return (
<Badge variant="secondary" className="text-[10px]">
Mixed
</Badge>
);
}
return (
<Badge variant="secondary" className="text-[10px]">
{data.archiveType}
</Badge>
);
},
},
{
accessorKey: "fileSize",
header: ({ column }) => <DataTableColumnHeader column={column} title="Size" />,
cell: ({ row }) => {
const data = row.original;
const size = isGroupRow(data) ? data.totalFileSize : data.fileSize;
return (
<span className="text-sm text-muted-foreground"> <span className="text-sm text-muted-foreground">
{formatBytes(row.original.fileSize)} {formatBytes(size)}
</span> </span>
), );
},
},
{
accessorKey: "fileCount",
header: ({ column }) => <DataTableColumnHeader column={column} title="Files" />,
cell: ({ row }) => {
const data = row.original;
const count = isGroupRow(data) ? data.totalFileCount : data.fileCount;
return (
<span className="text-sm"> <span className="text-sm">
{row.original.fileCount.toLocaleString()} {count.toLocaleString()}
</span> </span>
), );
},
},
{
accessorKey: "creator",
header: ({ column }) => <DataTableColumnHeader column={column} title="Creator" />,
cell: ({ row }) => {
<span className="text-sm text-muted-foreground truncate max-w-[160px] block"> const data = row.original;
{row.original.creator ?? "\u2014"} if (isGroupRow(data)) {
</span> return <span className="text-sm text-muted-foreground">{"\u2014"}</span>;
), }
return (
<button
className="text-sm text-muted-foreground truncate max-w-[160px] block hover:text-foreground hover:underline cursor-pointer text-left"
onClick={() => onSetCreator(data)}
title="Click to edit creator"
>
{data.creator || "\u2014"}
</button>
);
},
},
{
id: "tags",
header: ({ column }) => <DataTableColumnHeader column={column} title="Tags" />,
cell: ({ row }) => {
const data = row.original;
const tags = isGroupRow(data) ? data.combinedTags : data.tags;
if (tags.length === 0) {
if (isGroupRow(data)) {
return <span className="text-sm text-muted-foreground">{"\u2014"}</span>;
}
return (
<button
className="text-sm text-muted-foreground hover:text-foreground cursor-pointer"
onClick={() => onSetTags(data)}
title="Click to add tags"
>
{"\u2014"}
</button>
);
}
const clickHandler = isGroupRow(data) ? undefined : () => onSetTags(data as PackageTableRow);
return (
<button
className={`flex flex-wrap gap-1 ${clickHandler ? "cursor-pointer" : "cursor-default"}`}
onClick={clickHandler}
title={clickHandler ? "Click to edit tags" : undefined}
>
{tags.map((tag) => (
<Badge
key={tag}
variant="outline"
className="text-[10px] bg-primary/5"
>
{tag}
</Badge>
))}
</button>
);
},
accessorFn: (row) => {
if (isGroupRow(row)) return row.combinedTags.join(", ");
return row.tags.join(", ");
},
},
{
id: "channel",
@@ -131,31 +383,95 @@ export function getPackageColumns({
{
accessorKey: "indexedAt",
header: ({ column }) => <DataTableColumnHeader column={column} title="Indexed" />,
cell: ({ row }) => {
const data = row.original;
const date = isGroupRow(data) ? data.latestIndexedAt : data.indexedAt;
return (
<span className="text-sm text-muted-foreground"> <span className="text-sm text-muted-foreground">
{new Date(row.original.indexedAt).toLocaleDateString()} {new Date(date).toLocaleDateString()}
</span> </span>
), );
},
},
{
id: "actions",
cell: ({ row }) => {
const data = row.original;
if (isGroupRow(data)) {
const isMergeSource = mergeSourceId === data.id;
const canMergeHere = mergeSourceId !== null && mergeSourceId !== data.id;
return (
<div className="flex items-center gap-0.5">
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onSendAllInGroup(data.id)}
title="Send all packages in group"
>
<Send className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon"
className={`h-8 w-8 ${isMergeSource ? "text-amber-500 bg-amber-500/10 hover:bg-amber-500/20" : ""}`}
onClick={() => onStartMerge(data.id)}
title={isMergeSource ? "Cancel merge (this group is the merge source)" : "Start merge — mark this group as merge source"}
>
<GitMerge className="h-4 w-4" />
</Button>
{canMergeHere && (
<Button
variant="ghost"
size="icon"
className="h-8 w-8 text-primary bg-primary/10 hover:bg-primary/20"
onClick={() => onCompleteMerge(data.id)}
title="Merge source group into this group"
>
<Layers className="h-4 w-4" />
</Button>
)}
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onDissolveGroup(data.id)}
title="Dissolve group"
>
<Ungroup className="h-4 w-4" />
</Button>
</div>
);
}
return (
<div className="flex items-center gap-0.5"> <div className="flex items-center gap-0.5">
<SendToTelegramButton <SendToTelegramButton
packageId={row.original.id} packageId={data.id}
packageName={row.original.fileName} packageName={data.fileName}
variant="icon" variant="icon"
/> />
<Button <Button
variant="ghost" variant="ghost"
size="icon" size="icon"
className="h-8 w-8" className="h-8 w-8"
onClick={() => onViewFiles(row.original)} onClick={() => onViewFiles(data)}
> >
<Eye className="h-4 w-4" /> <Eye className="h-4 w-4" />
</Button> </Button>
{data._isGroupMember && (
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onRemoveFromGroup(data.id)}
title="Remove from group"
>
<Ungroup className="h-3.5 w-3.5" />
</Button>
)}
</div>
);
},
enableHiding: false,
},
];

View File

@@ -1,6 +1,7 @@
"use client"; "use client";
import { useEffect, useState, useCallback, useMemo } from "react"; import { useEffect, useState, useCallback, useMemo, useRef } from "react";
import { toast } from "sonner";
import {
FileText,
Folder,
@@ -9,6 +10,9 @@ import {
Search,
ChevronDown,
ChevronRight,
Upload,
ImagePlus,
Images,
} from "lucide-react"; } from "lucide-react";
import { import {
Dialog, Dialog,
@@ -24,6 +28,8 @@ import { Button } from "@/components/ui/button";
import { cn } from "@/lib/utils";
import type { PackageRow } from "./package-columns";
import { SendToTelegramButton } from "./send-to-telegram-button";
import { uploadPackagePreview } from "../actions";
import { ArchivePreviewPicker } from "./archive-preview-picker";
interface FileItem {
id: string;
@@ -46,6 +52,7 @@ interface PackageFilesDrawerProps {
pkg: PackageRow | null;
open: boolean;
onOpenChange: (open: boolean) => void;
highlightTerm?: string;
}
function formatBytes(bytesStr: string): string {
@@ -75,6 +82,15 @@ function getExtBadgeClass(ext: string | null): string {
return EXTENSION_COLORS[ext.toLowerCase()] ?? "bg-zinc-500/15 text-zinc-400 border-zinc-500/30";
}
function fileMatchesHighlight(file: FileItem, term: string): boolean {
if (!term) return false;
const lower = term.toLowerCase();
return (
file.fileName.toLowerCase().includes(lower) ||
file.path.toLowerCase().includes(lower)
);
}
/**
 * Build a tree structure from flat file paths.
 */
@@ -114,11 +130,13 @@ function TreeNodeView({
depth,
search,
defaultOpen,
highlightTerm,
}: {
node: TreeNode;
depth: number;
search: string;
defaultOpen: boolean;
highlightTerm?: string;
}) {
const [open, setOpen] = useState(defaultOpen);
@@ -131,10 +149,22 @@ function TreeNodeView({
});
}, [node.children]);
const hasHighlightedDescendant = useMemo(() => {
if (!highlightTerm) return false;
function check(n: TreeNode): boolean {
if (n.file && fileMatchesHighlight(n.file, highlightTerm!)) return true;
for (const child of n.children.values()) {
if (check(child)) return true;
}
return false;
}
return check(node);
}, [node, highlightTerm]);
// If searching or has highlighted descendants, force all open
useEffect(() => {
if (search || hasHighlightedDescendant) setOpen(true);
}, [search, hasHighlightedDescendant]);
if (node.isFolder && node.children.size > 0) {
return (
@@ -171,6 +201,7 @@ function TreeNodeView({
depth={depth + 1}
search={search}
defaultOpen={depth < 1} // Auto-expand first 2 levels
highlightTerm={highlightTerm}
/>
))}
</div>
@@ -179,9 +210,15 @@ function TreeNodeView({
// File node
if (node.file) {
const isHighlighted = highlightTerm ? fileMatchesHighlight(node.file, highlightTerm) : false;
return (
<div
className={cn(
"flex items-center gap-2 rounded-md px-1 py-1 transition-colors",
isHighlighted
? "bg-amber-500/15 hover:bg-amber-500/20"
: "hover:bg-muted/50"
)}
style={{ paddingLeft: `${Math.max(0, depth) * 16 + 4}px` }}
>
<FileText className="h-3.5 w-3.5 shrink-0 text-muted-foreground" />
@@ -217,13 +254,53 @@ function countFiles(node: TreeNode): number {
const PAGE_SIZE = 100;
export function PackageFilesDrawer({ pkg, open, onOpenChange, highlightTerm }: PackageFilesDrawerProps) {
const [files, setFiles] = useState<FileItem[]>([]);
const [total, setTotal] = useState(0);
const [loading, setLoading] = useState(false);
const [loadingMore, setLoadingMore] = useState(false);
const [search, setSearch] = useState("");
const [page, setPage] = useState(1);
const [uploading, setUploading] = useState(false);
const [localPreviewUrl, setLocalPreviewUrl] = useState<string | null>(null);
const [showPreviewPicker, setShowPreviewPicker] = useState(false);
const fileInputRef = useRef<HTMLInputElement>(null);
const handlePreviewUpload = useCallback(
async (e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file || !pkg) return;
// Reset file input so the same file can be re-selected
e.target.value = "";
setUploading(true);
try {
const formData = new FormData();
formData.append("file", file);
const result = await uploadPackagePreview(pkg.id, formData);
if (result.success) {
toast.success("Preview image uploaded");
// Show uploaded image immediately via local object URL
setLocalPreviewUrl(URL.createObjectURL(file));
} else {
toast.error(result.error);
}
} catch {
toast.error("Failed to upload preview image");
} finally {
setUploading(false);
}
},
[pkg]
);
// Clean up local preview URL when drawer closes or package changes
useEffect(() => {
return () => {
if (localPreviewUrl) URL.revokeObjectURL(localPreviewUrl);
};
}, [localPreviewUrl]);
const fetchFiles = useCallback(
async (pageNum: number, append: boolean) => {
@@ -258,6 +335,7 @@ export function PackageFilesDrawer({ pkg, open, onOpenChange }: PackageFilesDraw
setTotal(0);
setSearch("");
setPage(1);
setLocalPreviewUrl(null);
fetchFiles(1, false);
}
}, [open, pkg, fetchFiles]);
@@ -288,17 +366,55 @@ export function PackageFilesDrawer({ pkg, open, onOpenChange }: PackageFilesDraw
}, [filtered]);
return (
<>
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-2xl max-h-[80vh] flex flex-col gap-0 p-0">
<DialogHeader className="px-6 pt-6 pb-4 border-b border-border space-y-3">
{/* Preview image + title row */}
<div className="flex gap-4">
{/* Preview image area with upload capability */}
<input
ref={fileInputRef}
type="file"
accept="image/jpeg,image/png,image/webp"
className="hidden"
onChange={handlePreviewUpload}
/>
{(pkg?.hasPreview || localPreviewUrl) ? (
<button
type="button"
className="relative group h-20 w-20 shrink-0 rounded-lg overflow-hidden bg-muted"
onClick={() => fileInputRef.current?.click()}
disabled={uploading}
title="Click to replace preview image"
>
<img
src={localPreviewUrl ?? `/api/zips/${pkg!.id}/preview`}
alt=""
className="h-full w-full object-cover"
/>
<div className="absolute inset-0 bg-black/50 opacity-0 group-hover:opacity-100 transition-opacity flex items-center justify-center">
{uploading ? (
<Loader2 className="h-5 w-5 text-white animate-spin" />
) : (
<Upload className="h-5 w-5 text-white" />
)}
</div>
</button>
) : (
<button
type="button"
className="flex h-20 w-20 shrink-0 items-center justify-center rounded-lg border border-dashed border-muted-foreground/30 bg-muted/50 hover:bg-muted hover:border-muted-foreground/50 transition-colors cursor-pointer"
onClick={() => fileInputRef.current?.click()}
disabled={uploading}
title="Upload preview image"
>
{uploading ? (
<Loader2 className="h-5 w-5 text-muted-foreground animate-spin" />
) : (
<ImagePlus className="h-5 w-5 text-muted-foreground" />
)}
</button>
)}
<div className="min-w-0 flex-1">
<DialogTitle className="truncate pr-8">
@@ -308,11 +424,36 @@ export function PackageFilesDrawer({ pkg, open, onOpenChange }: PackageFilesDraw
{total.toLocaleString()} file{total !== 1 ? "s" : ""} in archive
</DialogDescription>
{pkg && (
<div className="mt-2 flex items-center gap-2">
<SendToTelegramButton
packageId={pkg.id}
packageName={pkg.fileName}
/>
{pkg.archiveType !== "DOCUMENT" && !pkg.isMultipart && (
<Button
variant="outline"
size="sm"
className="h-8 gap-1.5 text-xs"
onClick={() => setShowPreviewPicker(true)}
>
<Images className="h-3.5 w-3.5" />
Pick Preview
</Button>
)}
<Button
variant="outline"
size="sm"
className="h-8 gap-1.5 text-xs"
onClick={() => fileInputRef.current?.click()}
disabled={uploading}
>
{uploading ? (
<Loader2 className="h-3.5 w-3.5 animate-spin" />
) : (
<Upload className="h-3.5 w-3.5" />
)}
Upload Preview
</Button>
</div>
)}
</div>
@@ -361,16 +502,24 @@ export function PackageFilesDrawer({ pkg, open, onOpenChange }: PackageFilesDraw
depth={0}
search={search}
defaultOpen={true}
highlightTerm={highlightTerm}
/>
))}
</>
) : (
<>
{/* Flat list for archives without folders */}
{filtered.map((file) => {
const isHighlighted = highlightTerm ? fileMatchesHighlight(file, highlightTerm) : false;
return (
<div
key={file.id}
className={cn(
"flex items-center gap-3 rounded-md px-2 py-1.5 transition-colors",
isHighlighted
? "bg-amber-500/15 hover:bg-amber-500/20"
: "hover:bg-muted/50"
)}
>
<FileText className="h-3.5 w-3.5 shrink-0 text-muted-foreground" />
<div className="min-w-0 flex-1">
@@ -390,7 +539,8 @@ export function PackageFilesDrawer({ pkg, open, onOpenChange }: PackageFilesDraw
{formatBytes(file.uncompressedSize)}
</span>
</div>
);
})}
</>
)}
@@ -416,6 +566,22 @@ export function PackageFilesDrawer({ pkg, open, onOpenChange }: PackageFilesDraw
</div>
</ScrollArea>
</DialogContent>
</Dialog>
{/* Archive preview picker modal — rendered as sibling to avoid nested Dialog issues */}
{pkg && pkg.archiveType !== "DOCUMENT" && !pkg.isMultipart && (
<ArchivePreviewPicker
packageId={pkg.id}
packageName={pkg.fileName}
open={showPreviewPicker}
onOpenChange={setShowPreviewPicker}
onPreviewSet={() => {
// Refresh the preview by setting a cache-busting URL
setLocalPreviewUrl(`/api/zips/${pkg.id}/preview?t=${Date.now()}`);
}}
/>
)}
</>
);
}

View File

@@ -0,0 +1,135 @@
"use client";
import { type ColumnDef } from "@tanstack/react-table";
import { DataTableColumnHeader } from "@/components/shared/data-table-column-header";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { RotateCw } from "lucide-react";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
export interface SkippedRow {
id: string;
fileName: string;
fileSize: string;
reason: "SIZE_LIMIT" | "DOWNLOAD_FAILED" | "EXTRACT_FAILED" | "UPLOAD_FAILED";
errorMessage: string | null;
sourceChannel: { id: string; title: string };
isMultipart: boolean;
partCount: number;
createdAt: string;
}
function formatBytes(bytesStr: string): string {
const bytes = Number(bytesStr);
if (bytes === 0) return "0 B";
const k = 1024;
const sizes = ["B", "KB", "MB", "GB", "TB"];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${sizes[i]}`;
}
const REASON_LABELS: Record<SkippedRow["reason"], { label: string; variant: "default" | "destructive" | "outline" | "secondary" }> = {
SIZE_LIMIT: { label: "Size Limit", variant: "secondary" },
DOWNLOAD_FAILED: { label: "Download Failed", variant: "destructive" },
EXTRACT_FAILED: { label: "Extract Failed", variant: "destructive" },
UPLOAD_FAILED: { label: "Upload Failed", variant: "destructive" },
};
export function getSkippedColumns({
onRetry,
}: {
onRetry: (row: SkippedRow) => void;
}): ColumnDef<SkippedRow, unknown>[] {
return [
{
accessorKey: "fileName",
header: ({ column }) => <DataTableColumnHeader column={column} title="File Name" />,
cell: ({ row }) => (
<div className="flex items-center gap-2 min-w-0">
<span className="font-medium truncate max-w-[300px]">{row.original.fileName}</span>
{row.original.isMultipart && (
<Badge variant="outline" className="text-[10px] shrink-0">
{row.original.partCount} parts
</Badge>
)}
</div>
),
enableHiding: false,
},
{
accessorKey: "fileSize",
header: ({ column }) => <DataTableColumnHeader column={column} title="Size" />,
cell: ({ row }) => (
<span className="text-sm text-muted-foreground">
{formatBytes(row.original.fileSize)}
</span>
),
},
{
accessorKey: "reason",
header: ({ column }) => <DataTableColumnHeader column={column} title="Reason" />,
cell: ({ row }) => {
const { label, variant } = REASON_LABELS[row.original.reason];
return <Badge variant={variant} className="text-[10px]">{label}</Badge>;
},
},
{
accessorKey: "errorMessage",
header: "Error",
cell: ({ row }) => {
const msg = row.original.errorMessage;
if (!msg) return <span className="text-sm text-muted-foreground">{"\u2014"}</span>;
return (
<Tooltip>
<TooltipTrigger asChild>
<span className="text-sm text-muted-foreground truncate max-w-[200px] block cursor-help">
{msg}
</span>
</TooltipTrigger>
<TooltipContent className="max-w-sm">
<p className="text-xs break-all">{msg}</p>
</TooltipContent>
</Tooltip>
);
},
},
{
id: "channel",
header: ({ column }) => <DataTableColumnHeader column={column} title="Source" />,
cell: ({ row }) => (
<span className="text-sm text-muted-foreground truncate max-w-[160px] block">
{row.original.sourceChannel.title}
</span>
),
accessorFn: (row) => row.sourceChannel.title,
},
{
accessorKey: "createdAt",
header: ({ column }) => <DataTableColumnHeader column={column} title="Skipped" />,
cell: ({ row }) => (
<span className="text-sm text-muted-foreground">
{new Date(row.original.createdAt).toLocaleDateString()}
</span>
),
},
{
id: "actions",
cell: ({ row }) => (
<Button
variant="ghost"
size="icon"
className="h-8 w-8"
onClick={() => onRetry(row.original)}
title="Retry this package"
>
<RotateCw className="h-4 w-4" />
</Button>
),
enableHiding: false,
},
];
}

View File

@@ -0,0 +1,77 @@
"use client";
import { useTransition } from "react";
import { useRouter } from "next/navigation";
import { toast } from "sonner";
import { RotateCw } from "lucide-react";
import { useDataTable } from "@/hooks/use-data-table";
import { getSkippedColumns, type SkippedRow } from "./skipped-columns";
import { DataTable } from "@/components/shared/data-table";
import { DataTablePagination } from "@/components/shared/data-table-pagination";
import { Button } from "@/components/ui/button";
import { retrySkippedPackageAction, retryAllSkippedPackagesAction } from "../actions";
interface SkippedPackagesTabProps {
data: SkippedRow[];
pageCount: number;
totalCount: number;
}
export function SkippedPackagesTab({
data,
pageCount,
totalCount,
}: SkippedPackagesTabProps) {
const router = useRouter();
const [isPending, startTransition] = useTransition();
const columns = getSkippedColumns({
onRetry: (row) => {
startTransition(async () => {
const result = await retrySkippedPackageAction(row.id);
if (result.success) {
toast.success(`"${row.fileName}" queued for retry`);
router.refresh();
} else {
toast.error(result.error);
}
});
},
});
const { table } = useDataTable({ data, columns, pageCount });
return (
<div className="space-y-4">
{totalCount > 0 && (
<div className="flex justify-end">
<Button
variant="outline"
size="sm"
className="gap-1.5"
disabled={isPending}
onClick={() => {
startTransition(async () => {
const result = await retryAllSkippedPackagesAction();
if (result.success) {
toast.success(`All ${totalCount} skipped packages queued for retry`);
router.refresh();
} else {
toast.error(result.error);
}
});
}}
>
<RotateCw className="h-3.5 w-3.5" />
Retry All ({totalCount})
</Button>
</div>
)}
<DataTable
table={table}
emptyMessage="No skipped or failed packages."
/>
<DataTablePagination table={table} totalCount={totalCount} />
</div>
);
}

View File

@@ -1,24 +1,71 @@
"use client"; "use client";
import { useState, useCallback } from "react"; import { useState, useCallback, useTransition, useMemo, useRef } from "react";
import { useRouter, usePathname, useSearchParams } from "next/navigation"; import { useRouter, usePathname, useSearchParams } from "next/navigation";
import { toast } from "sonner";
import { Search, Layers, Upload } from "lucide-react";
import { UploadDialog } from "./upload-dialog";
import { useDataTable } from "@/hooks/use-data-table";
import {
getPackageColumns,
type PackageRow,
type StlTableRow,
type PackageTableRow,
type GroupHeaderRow,
} from "./package-columns";
import { PackageFilesDrawer } from "./package-files-drawer";
import { IngestionStatus } from "./ingestion-status";
import { SkippedPackagesTab } from "./skipped-packages-tab";
import { DataTable } from "@/components/shared/data-table"; import { DataTable } from "@/components/shared/data-table";
import { DataTablePagination } from "@/components/shared/data-table-pagination"; import { DataTablePagination } from "@/components/shared/data-table-pagination";
import { DataTableViewOptions } from "@/components/shared/data-table-view-options"; import { DataTableViewOptions } from "@/components/shared/data-table-view-options";
import { PageHeader } from "@/components/shared/page-header"; import { PageHeader } from "@/components/shared/page-header";
import { Input } from "@/components/ui/input"; import { Input } from "@/components/ui/input";
import type { IngestionAccountStatus } from "@/lib/telegram/types"; import { Button } from "@/components/ui/button";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { Badge } from "@/components/ui/badge";
import type { DisplayItem, IngestionAccountStatus, PackageListItem } from "@/lib/telegram/types";
import type { SkippedRow } from "./skipped-columns";
import {
updatePackageCreator,
updatePackageTags,
renameGroupAction,
dissolveGroupAction,
createGroupAction,
removeFromGroupAction,
sendAllInGroupAction,
updateGroupPreviewAction,
mergeGroupsAction,
} from "../actions";
interface StlTableProps { interface StlTableProps {
data: PackageRow[]; data: DisplayItem[];
pageCount: number; pageCount: number;
totalCount: number; totalCount: number;
ingestionStatus: IngestionAccountStatus[]; ingestionStatus: IngestionAccountStatus[];
availableTags: string[];
searchTerm: string;
skippedData: SkippedRow[];
skippedPageCount: number;
skippedTotalCount: number;
ungroupedData: PackageListItem[];
ungroupedPageCount: number;
ungroupedTotalCount: number;
} }
export function StlTable({ export function StlTable({
@@ -26,6 +73,14 @@ export function StlTable({
pageCount,
totalCount,
ingestionStatus,
availableTags,
searchTerm,
skippedData,
skippedPageCount,
skippedTotalCount,
ungroupedData,
ungroupedPageCount,
ungroupedTotalCount,
}: StlTableProps) {
const router = useRouter();
const pathname = usePathname();
@@ -33,6 +88,95 @@ export function StlTable({
const [searchValue, setSearchValue] = useState(searchParams.get("search") ?? "");
const [viewPkg, setViewPkg] = useState<PackageRow | null>(null);
const [, startTransition] = useTransition();
// Group expansion state
const [expandedGroups, setExpandedGroups] = useState<Set<string>>(new Set());
// Package selection state (for manual grouping)
const [selectedPackages, setSelectedPackages] = useState<Set<string>>(new Set());
// Create group dialog state
const [createGroupOpen, setCreateGroupOpen] = useState(false);
const [groupName, setGroupName] = useState("");
// Group preview upload ref
const previewInputRef = useRef<HTMLInputElement>(null);
const [uploadGroupId, setUploadGroupId] = useState<string | null>(null);
// Group merge state
const [mergeSourceId, setMergeSourceId] = useState<string | null>(null);
// Upload dialog state
const [uploadOpen, setUploadOpen] = useState(false);
const toggleGroup = useCallback((groupId: string) => {
setExpandedGroups((prev) => {
const next = new Set(prev);
if (next.has(groupId)) {
next.delete(groupId);
} else {
next.add(groupId);
}
return next;
});
}, []);
const toggleSelect = useCallback((packageId: string) => {
setSelectedPackages((prev) => {
const next = new Set(prev);
if (next.has(packageId)) {
next.delete(packageId);
} else {
next.add(packageId);
}
return next;
});
}, []);
// Flatten DisplayItem[] into StlTableRow[] based on expansion state
const tableRows: StlTableRow[] = useMemo(() => {
const rows: StlTableRow[] = [];
for (const item of data) {
if (item.type === "package") {
rows.push({
...item.data,
_rowType: "package" as const,
_groupId: null,
_isGroupMember: false,
});
} else {
const group = item.data;
const isExpanded = expandedGroups.has(group.id);
rows.push({
_rowType: "group" as const,
id: group.id,
name: group.name,
hasPreview: group.hasPreview,
totalFileSize: group.totalFileSize,
totalFileCount: group.totalFileCount,
packageCount: group.packageCount,
combinedTags: group.combinedTags,
archiveTypes: group.archiveTypes,
latestIndexedAt: group.latestIndexedAt,
sourceChannel: group.sourceChannel,
_expanded: isExpanded,
});
if (isExpanded) {
for (const pkg of group.packages) {
rows.push({
...pkg,
_rowType: "package" as const,
_groupId: group.id,
_isGroupMember: true,
packageGroupId: group.id,
});
}
}
}
}
return rows;
}, [data, expandedGroups]);
const updateSearch = useCallback(
(value: string) => {
@@ -49,11 +193,256 @@ export function StlTable({
[router, pathname, searchParams]
);
const updateTagFilter = useCallback(
(value: string) => {
const params = new URLSearchParams(searchParams.toString());
if (value && value !== "all") {
params.set("tag", value);
params.set("page", "1");
} else {
params.delete("tag");
}
router.push(`${pathname}?${params.toString()}`, { scroll: false });
},
[router, pathname, searchParams]
);
const activeTab = searchParams.get("tab") ?? "packages";
const updateTab = useCallback(
(value: string) => {
const params = new URLSearchParams(searchParams.toString());
if (value === "packages") {
params.delete("tab");
} else {
params.set("tab", value);
}
params.set("page", "1");
router.push(`${pathname}?${params.toString()}`, { scroll: false });
},
[router, pathname, searchParams]
);
const handleRenameGroup = useCallback(
(groupId: string, currentName: string) => {
const value = prompt("Enter group name:", currentName);
if (value === null || value.trim() === currentName) return;
startTransition(async () => {
const result = await renameGroupAction(groupId, value);
if (result.success) {
toast.success(`Group renamed to "${value.trim()}"`);
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleDissolveGroup = useCallback(
(groupId: string) => {
if (!confirm("Dissolve this group? Packages will become standalone items.")) return;
startTransition(async () => {
const result = await dissolveGroupAction(groupId);
if (result.success) {
toast.success("Group dissolved");
setExpandedGroups((prev) => {
const next = new Set(prev);
next.delete(groupId);
return next;
});
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleSendAllInGroup = useCallback(
(groupId: string) => {
if (!confirm("Send all packages in this group to your Telegram?")) return;
startTransition(async () => {
const result = await sendAllInGroupAction(groupId);
if (result.success) {
toast.success("Group packages queued for sending");
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleRemoveFromGroup = useCallback(
(packageId: string) => {
startTransition(async () => {
const result = await removeFromGroupAction(packageId);
if (result.success) {
toast.success("Package removed from group");
router.refresh();
} else {
toast.error(result.error);
}
});
},
[router]
);
const handleCreateGroup = useCallback(() => {
if (selectedPackages.size < 2) return;
setGroupName("");
setCreateGroupOpen(true);
}, [selectedPackages.size]);
const submitCreateGroup = useCallback(() => {
if (!groupName.trim() || selectedPackages.size < 2) return;
const ids = Array.from(selectedPackages);
startTransition(async () => {
const result = await createGroupAction(groupName, ids);
if (result.success) {
toast.success(`Group "${groupName.trim()}" created`);
setSelectedPackages(new Set());
setCreateGroupOpen(false);
router.refresh();
} else {
toast.error(result.error);
}
});
}, [groupName, selectedPackages, router]);
// Group preview upload handler (Task 12)
const handleGroupPreviewUpload = useCallback((groupId: string) => {
setUploadGroupId(groupId);
// Trigger file input after state update
setTimeout(() => {
previewInputRef.current?.click();
}, 0);
}, []);
const handlePreviewFileChange = useCallback(
(e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file || !uploadGroupId) return;
const formData = new FormData();
formData.append("file", file);
startTransition(async () => {
const result = await updateGroupPreviewAction(uploadGroupId, formData);
if (result.success) {
toast.success("Group preview updated");
router.refresh();
} else {
toast.error(result.error);
}
setUploadGroupId(null);
});
// Reset input so the same file can be selected again
e.target.value = "";
},
[uploadGroupId, router]
);
const handleStartMerge = useCallback((groupId: string) => {
setMergeSourceId((prev) => {
if (prev === groupId) {
toast.info("Merge cancelled");
return null;
}
toast.info("Merge source selected — click the merge-here button on the target group");
return groupId;
});
}, []);
const handleMergeGroups = useCallback(
(targetGroupId: string) => {
if (!mergeSourceId) return;
const sourceId = mergeSourceId;
startTransition(async () => {
const result = await mergeGroupsAction(targetGroupId, sourceId);
if (result.success) {
toast.success("Groups merged successfully");
setMergeSourceId(null);
router.refresh();
} else {
toast.error(result.error);
}
});
},
[mergeSourceId, router]
);
const columns = getPackageColumns({
onViewFiles: (pkg) => setViewPkg(pkg),
searchTerm,
onSetCreator: (pkg) => {
const value = prompt("Enter creator name:", pkg.creator ?? "");
if (value === null) return;
startTransition(async () => {
const result = await updatePackageCreator(pkg.id, value || null);
if (result.success) {
toast.success(value ? `Creator set to "${value}"` : "Creator removed");
router.refresh();
} else {
toast.error(result.error);
}
});
},
onSetTags: (pkg) => {
const value = prompt(
"Enter tags (comma-separated):",
pkg.tags.join(", ")
);
if (value === null) return;
const tags = value.split(",").map((t) => t.trim()).filter(Boolean);
startTransition(async () => {
const result = await updatePackageTags(pkg.id, tags);
if (result.success) {
toast.success(tags.length > 0 ? `Tags updated` : "Tags removed");
router.refresh();
} else {
toast.error(result.error);
}
});
},
onToggleGroup: toggleGroup,
onRenameGroup: handleRenameGroup,
onDissolveGroup: handleDissolveGroup,
onSendAllInGroup: handleSendAllInGroup,
onRemoveFromGroup: handleRemoveFromGroup,
onGroupPreviewUpload: handleGroupPreviewUpload,
selectedPackages,
onToggleSelect: toggleSelect,
mergeSourceId,
onStartMerge: handleStartMerge,
onCompleteMerge: handleMergeGroups,
});
const { table } = useDataTable({ data: tableRows, columns, pageCount });
const ungroupedRows: StlTableRow[] = useMemo(
() =>
ungroupedData.map((pkg) => ({
...pkg,
_rowType: "package" as const,
_groupId: null,
_isGroupMember: false,
})),
[ungroupedData]
);
const { table: ungroupedTable } = useDataTable({
data: ungroupedRows,
columns,
pageCount: ungroupedPageCount,
});
const activeTag = searchParams.get("tag") ?? "";
return (
<div className="space-y-4">
@@ -64,6 +453,28 @@ export function StlTable({
<IngestionStatus initialStatus={ingestionStatus} />
</PageHeader>
<Tabs value={activeTab} onValueChange={updateTab}>
<TabsList>
<TabsTrigger value="packages">Packages</TabsTrigger>
<TabsTrigger value="skipped" className="gap-1.5">
Skipped / Failed
{skippedTotalCount > 0 && (
<Badge variant="secondary" className="text-[10px] ml-1">
{skippedTotalCount}
</Badge>
)}
</TabsTrigger>
<TabsTrigger value="ungrouped" className="gap-1.5">
Ungrouped
{ungroupedTotalCount > 0 && (
<Badge variant="secondary" className="h-5 px-1.5 text-[10px]">
{ungroupedTotalCount}
</Badge>
)}
</TabsTrigger>
</TabsList>
<TabsContent value="packages" className="space-y-4">
<div className="flex flex-wrap items-center gap-2"> <div className="flex flex-wrap items-center gap-2">
<div className="relative flex-1 min-w-[200px] max-w-sm"> <div className="relative flex-1 min-w-[200px] max-w-sm">
<Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" /> <Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" />
@@ -74,14 +485,74 @@ export function StlTable({
className="pl-9 h-9" className="pl-9 h-9"
/> />
</div> </div>
{availableTags.length > 0 && (
<Select value={activeTag || "all"} onValueChange={updateTagFilter}>
<SelectTrigger className="w-[160px] h-9">
<SelectValue placeholder="All Tags" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Tags</SelectItem>
{availableTags.map((tag) => (
<SelectItem key={tag} value={tag}>
{tag}
</SelectItem>
))}
</SelectContent>
</Select>
)}
<DataTableViewOptions table={table} />
<Button variant="outline" size="sm" className="h-9" onClick={() => setUploadOpen(true)}>
<Upload className="mr-2 h-4 w-4" />
Upload Files
</Button>
{selectedPackages.size >= 2 && (
<Button
variant="outline"
size="sm"
className="h-9 gap-1.5"
onClick={handleCreateGroup}
>
<Layers className="h-3.5 w-3.5" />
Group {selectedPackages.size} Selected
</Button>
)}
{selectedPackages.size > 0 && selectedPackages.size < 2 && (
<span className="text-xs text-muted-foreground">
Select at least 2 packages to group
</span>
)}
</div>
<DataTable
table={table}
emptyMessage="No packages found. Archives will appear here after ingestion."
rowClassName={(row) => {
const data = row.original as StlTableRow;
if (data._rowType === "group") {
return "bg-muted/30 border-border";
}
if (data._rowType === "package" && (data as PackageTableRow)._isGroupMember) {
return "bg-muted/10";
}
return "";
}}
/>
<DataTablePagination table={table} totalCount={totalCount} />
</TabsContent>
<TabsContent value="skipped">
<SkippedPackagesTab
data={skippedData}
pageCount={skippedPageCount}
totalCount={skippedTotalCount}
/>
</TabsContent>
<TabsContent value="ungrouped" className="space-y-4">
<DataTable table={ungroupedTable} emptyMessage="All packages are grouped!" />
<DataTablePagination table={ungroupedTable} totalCount={ungroupedTotalCount} />
</TabsContent>
</Tabs>
<PackageFilesDrawer
pkg={viewPkg}
@@ -89,6 +560,50 @@ export function StlTable({
onOpenChange={(open) => {
if (!open) setViewPkg(null);
}}
highlightTerm={searchTerm}
/>
{/* Create Group Dialog */}
<Dialog open={createGroupOpen} onOpenChange={setCreateGroupOpen}>
<DialogContent className="sm:max-w-md">
<DialogHeader>
<DialogTitle>Create Package Group</DialogTitle>
<DialogDescription>
Group {selectedPackages.size} selected packages together. Enter a name for the group.
</DialogDescription>
</DialogHeader>
<div className="py-4">
<Input
placeholder="Group name..."
value={groupName}
onChange={(e) => setGroupName(e.target.value)}
onKeyDown={(e) => {
if (e.key === "Enter") submitCreateGroup();
}}
autoFocus
/>
</div>
<DialogFooter>
<Button variant="outline" onClick={() => setCreateGroupOpen(false)}>
Cancel
</Button>
<Button onClick={submitCreateGroup} disabled={!groupName.trim()}>
<Layers className="h-4 w-4 mr-1" />
Create Group
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
<UploadDialog open={uploadOpen} onOpenChange={setUploadOpen} />
{/* Hidden file input for group preview upload (Task 12) */}
<input
ref={previewInputRef}
type="file"
accept="image/jpeg,image/png,image/webp"
className="hidden"
onChange={handlePreviewFileChange}
/>
</div>
);
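The grouped-row flattening in this diff (`DisplayItem[]` into table rows, emitting member rows only for expanded groups) can be sketched framework-free. A minimal sketch, assuming simplified `Item`/`Row` shapes as stand-ins for the component's `DisplayItem` and `StlTableRow` types:

```typescript
// Group headers are always emitted; member rows only when the group's id
// is in the expanded set. Types here are illustrative stand-ins.
type Item =
  | { type: "package"; id: string }
  | { type: "group"; id: string; packages: { id: string }[] };

type Row =
  | { rowType: "package"; id: string; groupId: string | null }
  | { rowType: "group"; id: string; expanded: boolean };

function flatten(items: Item[], expanded: Set<string>): Row[] {
  const rows: Row[] = [];
  for (const item of items) {
    if (item.type === "package") {
      // Standalone package: plain row, no parent group.
      rows.push({ rowType: "package", id: item.id, groupId: null });
    } else {
      const isExpanded = expanded.has(item.id);
      // Group header row is always present, collapsed or not.
      rows.push({ rowType: "group", id: item.id, expanded: isExpanded });
      if (isExpanded) {
        for (const pkg of item.packages) {
          rows.push({ rowType: "package", id: pkg.id, groupId: item.id });
        }
      }
    }
  }
  return rows;
}
```

Keeping the expansion state outside the row data (as a `Set` in a `useMemo` dependency, as the component does) means toggling a group only recomputes the flattened array rather than mutating rows in place.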


@@ -0,0 +1,243 @@
"use client";
import { useState, useRef, useTransition, useEffect } from "react";
import { Upload, File, X, Loader2, CheckCircle2, AlertCircle } from "lucide-react";
import { toast } from "sonner";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
interface UploadDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
}
function formatSize(bytes: number): string {
if (bytes >= 1024 * 1024 * 1024) return `${(bytes / (1024 * 1024 * 1024)).toFixed(1)} GB`;
if (bytes >= 1024 * 1024) return `${(bytes / (1024 * 1024)).toFixed(0)} MB`;
return `${(bytes / 1024).toFixed(0)} KB`;
}
type UploadStatus = "idle" | "uploading" | "processing" | "done" | "error";
export function UploadDialog({ open, onOpenChange }: UploadDialogProps) {
const [files, setFiles] = useState<File[]>([]);
const [groupName, setGroupName] = useState("");
const [status, setStatus] = useState<UploadStatus>("idle");
const [error, setError] = useState<string | null>(null);
const [isPending, startTransition] = useTransition();
const fileInputRef = useRef<HTMLInputElement>(null);
const pollRef = useRef<ReturnType<typeof setInterval> | null>(null);
useEffect(() => {
if (open) {
setFiles([]);
setGroupName("");
setStatus("idle");
setError(null);
}
return () => {
if (pollRef.current) clearInterval(pollRef.current);
};
}, [open]);
function handleFileChange(e: React.ChangeEvent<HTMLInputElement>) {
if (e.target.files) {
setFiles(Array.from(e.target.files));
}
}
function removeFile(index: number) {
setFiles((prev) => prev.filter((_, i) => i !== index));
}
function handleUpload() {
if (files.length === 0) return;
startTransition(async () => {
setStatus("uploading");
setError(null);
try {
const formData = new FormData();
for (const file of files) {
formData.append("files", file);
}
if (groupName.trim()) {
formData.append("groupName", groupName.trim());
}
const res = await fetch("/api/uploads", {
method: "POST",
body: formData,
});
const data = await res.json();
if (!res.ok) {
setStatus("error");
setError(data.error ?? "Upload failed");
return;
}
setStatus("processing");
// Poll for completion
pollRef.current = setInterval(async () => {
try {
const statusRes = await fetch(`/api/uploads/${data.uploadId}`);
const statusData = await statusRes.json();
if (statusData.status === "COMPLETED") {
setStatus("done");
toast.success(`${files.length} file(s) uploaded and indexed`);
if (pollRef.current) clearInterval(pollRef.current);
} else if (statusData.status === "FAILED") {
setStatus("error");
setError(statusData.errorMessage ?? "Processing failed");
if (pollRef.current) clearInterval(pollRef.current);
}
} catch {
// Keep polling
}
}, 3000);
// Stop polling after 10 minutes
setTimeout(() => {
if (pollRef.current) {
clearInterval(pollRef.current);
pollRef.current = null;
setStatus((s) => s === "processing" ? "done" : s);
}
}, 600_000);
} catch {
setStatus("error");
setError("Network error");
}
});
}
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-lg">
<DialogHeader>
<DialogTitle>Upload Files</DialogTitle>
<DialogDescription>
Upload archive files to be processed and indexed. Multiple files will be automatically grouped.
</DialogDescription>
</DialogHeader>
{status === "idle" && (
<div className="space-y-4">
<div
className="border-2 border-dashed rounded-lg p-8 text-center cursor-pointer hover:border-primary/50 transition-colors"
onClick={() => fileInputRef.current?.click()}
>
<Upload className="h-8 w-8 mx-auto mb-2 text-muted-foreground" />
<p className="text-sm text-muted-foreground">
Click to select files or drag & drop
</p>
<p className="text-xs text-muted-foreground mt-1">
ZIP, RAR, 7Z, PDF, or STL files up to 4 GB each
</p>
<input
ref={fileInputRef}
type="file"
multiple
accept=".zip,.rar,.7z,.pdf,.stl"
onChange={handleFileChange}
className="hidden"
/>
</div>
{files.length > 0 && (
<div className="space-y-2">
{files.map((file, i) => (
<div key={i} className="flex items-center gap-2 p-2 rounded bg-muted/30">
<File className="h-4 w-4 shrink-0 text-muted-foreground" />
<span className="text-sm flex-1 truncate">{file.name}</span>
<span className="text-xs text-muted-foreground">{formatSize(file.size)}</span>
<button onClick={() => removeFile(i)} className="p-0.5 hover:text-destructive">
<X className="h-3.5 w-3.5" />
</button>
</div>
))}
</div>
)}
{files.length > 1 && (
<div>
<Label htmlFor="groupName" className="text-sm">Group Name (optional)</Label>
<Input
id="groupName"
value={groupName}
onChange={(e) => setGroupName(e.target.value)}
placeholder="Auto-generated from filenames"
className="mt-1"
/>
</div>
)}
</div>
)}
{(status === "uploading" || status === "processing") && (
<div className="flex items-center gap-3 p-6 rounded-lg bg-muted/30 border">
<Loader2 className="h-6 w-6 animate-spin text-primary" />
<div>
<p className="text-sm font-medium">
{status === "uploading" ? "Uploading files..." : "Processing & uploading to Telegram..."}
</p>
<p className="text-xs text-muted-foreground mt-0.5">
{status === "uploading"
? "Sending files to server"
: "Hashing, extracting metadata, uploading to destination channel"}
</p>
</div>
</div>
)}
{status === "done" && (
<div className="flex items-center gap-3 p-6 rounded-lg bg-green-500/10 border border-green-500/20">
<CheckCircle2 className="h-6 w-6 text-green-500" />
<div>
<p className="text-sm font-medium text-green-500">Upload complete!</p>
<p className="text-xs text-muted-foreground">Files have been indexed and uploaded to Telegram.</p>
</div>
</div>
)}
{status === "error" && (
<div className="flex items-center gap-3 p-6 rounded-lg bg-destructive/10 border border-destructive/20">
<AlertCircle className="h-6 w-6 text-destructive" />
<div>
<p className="text-sm font-medium text-destructive">Upload failed</p>
<p className="text-xs text-muted-foreground">{error}</p>
</div>
</div>
)}
<DialogFooter>
{status === "idle" && (
<>
<Button variant="outline" onClick={() => onOpenChange(false)}>Cancel</Button>
<Button onClick={handleUpload} disabled={files.length === 0 || isPending}>
<Upload className="h-4 w-4 mr-1" />
Upload {files.length > 0 ? `(${files.length})` : ""}
</Button>
</>
)}
{(status === "done" || status === "error") && (
<Button variant="outline" onClick={() => onOpenChange(false)}>Close</Button>
)}
</DialogFooter>
</DialogContent>
</Dialog>
);
}
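The dialog's status polling (check every 3 s, stop on `COMPLETED`/`FAILED`, give up after 10 min) is an instance of a generic poll-until-terminal pattern. A minimal sketch with illustrative names (`pollUntilTerminal`, `PollResult` are not from the app):

```typescript
// Poll `check` every `intervalMs` until it reports a terminal state,
// or give up after `timeoutMs` and resolve with the last known state.
type PollResult = "pending" | "done" | "failed";

function pollUntilTerminal(
  check: () => Promise<PollResult>,
  intervalMs: number,
  timeoutMs: number
): Promise<PollResult> {
  return new Promise((resolve) => {
    const timer = setInterval(async () => {
      try {
        const state = await check();
        if (state !== "pending") {
          clearInterval(timer);
          clearTimeout(deadline);
          resolve(state);
        }
      } catch {
        // Transient fetch error: keep polling, as the dialog does.
      }
    }, intervalMs);
    const deadline = setTimeout(() => {
      clearInterval(timer);
      resolve("pending"); // timed out without reaching a terminal state
    }, timeoutMs);
  });
}
```

Clearing both the interval and the deadline on every exit path avoids the leak the dialog guards against with its `pollRef` cleanup in `useEffect`.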


@@ -0,0 +1,591 @@
"use server";
import { auth } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
import type { ActionResult } from "@/types/api.types";
import { revalidatePath } from "next/cache";
import {
updatePackageGroupName,
updatePackageGroupPreview,
createManualGroup,
removePackageFromGroup,
dissolveGroup,
mergeGroups,
} from "@/lib/telegram/queries";
const ALLOWED_IMAGE_TYPES = [
"image/jpeg",
"image/png",
"image/webp",
] as const;
const MAX_IMAGE_SIZE = 2 * 1024 * 1024; // 2 MB
export async function updatePackageCreator(
packageId: string,
creator: string | null
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
await prisma.package.update({
where: { id: packageId },
data: { creator: creator?.trim() || null },
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to update creator" };
}
}
export async function uploadPackagePreview(
packageId: string,
formData: FormData
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const file = formData.get("file");
if (!(file instanceof File)) {
return { success: false, error: "No file provided" };
}
if (!ALLOWED_IMAGE_TYPES.includes(file.type as (typeof ALLOWED_IMAGE_TYPES)[number])) {
return { success: false, error: "Only JPG, PNG, and WebP images are accepted" };
}
if (file.size > MAX_IMAGE_SIZE) {
return { success: false, error: "Image must be smaller than 2 MB" };
}
try {
const arrayBuffer = await file.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
await prisma.package.update({
where: { id: packageId },
data: {
previewData: buffer,
// Set previewMsgId to 0 as sentinel so hasPreview checks work
previewMsgId: BigInt(0),
},
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to upload preview image" };
}
}
export async function updatePackageTags(
packageId: string,
tags: string[]
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const cleaned = tags.map((t) => t.trim()).filter(Boolean);
// Deduplicate
const unique = [...new Set(cleaned)];
await prisma.package.update({
where: { id: packageId },
data: { tags: unique },
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to update tags" };
}
}
export async function bulkSetTags(
packageIds: string[],
tags: string[]
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const cleaned = [...new Set(tags.map((t) => t.trim()).filter(Boolean))];
await prisma.package.updateMany({
where: { id: { in: packageIds } },
data: { tags: cleaned },
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to update tags" };
}
}
export async function bulkSetCreator(
packageIds: string[],
creator: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
await prisma.package.updateMany({
where: { id: { in: packageIds } },
data: { creator: creator.trim() },
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to update creators" };
}
}
/**
* Set a package's preview from an extracted archive image.
* Reads the image data from a completed ArchiveExtractRequest.
*/
export async function setPreviewFromExtract(
packageId: string,
extractRequestId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const extractReq = await prisma.archiveExtractRequest.findUnique({
where: { id: extractRequestId },
select: { status: true, imageData: true, packageId: true },
});
if (!extractReq) {
return { success: false, error: "Extract request not found" };
}
if (extractReq.packageId !== packageId) {
return { success: false, error: "Extract request does not belong to this package" };
}
if (extractReq.status !== "COMPLETED" || !extractReq.imageData) {
return { success: false, error: "Image extraction not yet completed" };
}
await prisma.package.update({
where: { id: packageId },
data: {
previewData: extractReq.imageData,
// Set previewMsgId to 0 as sentinel so hasPreview checks work
// (original Telegram-matched previews have the actual message ID)
previewMsgId: BigInt(0),
},
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to set preview from archive image" };
}
}
export async function repairPackageAction(
packageId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const pkg = await prisma.package.findUnique({
where: { id: packageId },
select: {
id: true,
fileName: true,
sourceChannelId: true,
sourceMessageId: true,
destChannelId: true,
destMessageId: true,
},
});
if (!pkg) return { success: false, error: "Package not found" };
// Clear the destination info so the worker re-processes it
await prisma.package.update({
where: { id: packageId },
data: {
destMessageId: null,
destMessageIds: [],
destChannelId: null,
},
});
// Reset the channel watermark to before this message so worker picks it up
await prisma.accountChannelMap.updateMany({
where: {
channelId: pkg.sourceChannelId,
lastProcessedMessageId: { gte: pkg.sourceMessageId },
},
data: { lastProcessedMessageId: pkg.sourceMessageId - BigInt(1) },
});
// Mark related notifications as read
await prisma.systemNotification.updateMany({
where: {
context: { path: ["packageId"], equals: packageId },
isRead: false,
},
data: { isRead: true },
});
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to schedule repair" };
}
}
export async function retrySkippedPackageAction(
id: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const skipped = await prisma.skippedPackage.findUnique({
where: { id },
});
if (!skipped) return { success: false, error: "Skipped package not found" };
// Find the AccountChannelMap and reset watermark if needed
const mapping = await prisma.accountChannelMap.findUnique({
where: {
accountId_channelId: {
accountId: skipped.accountId,
channelId: skipped.sourceChannelId,
},
},
});
if (mapping) {
const targetId = skipped.sourceMessageId - BigInt(1);
// Only reset if the watermark is past this message
if (mapping.lastProcessedMessageId && mapping.lastProcessedMessageId >= skipped.sourceMessageId) {
await prisma.accountChannelMap.update({
where: { id: mapping.id },
data: { lastProcessedMessageId: targetId },
});
}
// Also reset TopicProgress if this was a forum topic message
if (skipped.sourceTopicId) {
const topicProgress = await prisma.topicProgress.findFirst({
where: {
accountChannelMapId: mapping.id,
topicId: skipped.sourceTopicId,
},
});
if (topicProgress && topicProgress.lastProcessedMessageId && topicProgress.lastProcessedMessageId >= skipped.sourceMessageId) {
await prisma.topicProgress.update({
where: { id: topicProgress.id },
data: { lastProcessedMessageId: targetId },
});
}
}
}
// Delete the skip record
await prisma.skippedPackage.delete({ where: { id } });
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to retry skipped package" };
}
}
export async function retryAllSkippedPackagesAction(
reason?: "SIZE_LIMIT" | "DOWNLOAD_FAILED" | "EXTRACT_FAILED" | "UPLOAD_FAILED"
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const where: Record<string, unknown> = {};
if (reason) where.reason = reason;
const skippedItems = await prisma.skippedPackage.findMany({ where });
if (skippedItems.length === 0) {
return { success: true, data: undefined };
}
// Group by (accountId, channelId) to find minimum messageId per channel
const channelResets = new Map<string, { mappingKey: { accountId: string; channelId: string }; minMessageId: bigint; topicResets: Map<bigint, bigint> }>();
for (const item of skippedItems) {
const key = `${item.accountId}:${item.sourceChannelId}`;
const existing = channelResets.get(key);
const targetId = item.sourceMessageId - BigInt(1);
if (!existing) {
const topicResets = new Map<bigint, bigint>();
if (item.sourceTopicId) {
topicResets.set(item.sourceTopicId, targetId);
}
channelResets.set(key, {
mappingKey: { accountId: item.accountId, channelId: item.sourceChannelId },
minMessageId: targetId,
topicResets,
});
} else {
if (targetId < existing.minMessageId) {
existing.minMessageId = targetId;
}
if (item.sourceTopicId) {
const existingTopic = existing.topicResets.get(item.sourceTopicId);
if (!existingTopic || targetId < existingTopic) {
existing.topicResets.set(item.sourceTopicId, targetId);
}
}
}
}
// Reset watermarks
for (const reset of channelResets.values()) {
const mapping = await prisma.accountChannelMap.findUnique({
where: { accountId_channelId: reset.mappingKey },
});
if (!mapping) continue;
if (mapping.lastProcessedMessageId && mapping.lastProcessedMessageId > reset.minMessageId) {
await prisma.accountChannelMap.update({
where: { id: mapping.id },
data: { lastProcessedMessageId: reset.minMessageId },
});
}
// Reset topic progress
for (const [topicId, targetId] of reset.topicResets) {
const topicProgress = await prisma.topicProgress.findFirst({
where: { accountChannelMapId: mapping.id, topicId },
});
if (topicProgress && topicProgress.lastProcessedMessageId && topicProgress.lastProcessedMessageId > targetId) {
await prisma.topicProgress.update({
where: { id: topicProgress.id },
data: { lastProcessedMessageId: targetId },
});
}
}
}
// Delete all matching skip records
await prisma.skippedPackage.deleteMany({ where });
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to retry skipped packages" };
}
}
export async function renameGroupAction(
groupId: string,
name: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
if (!name.trim()) {
return { success: false, error: "Group name cannot be empty" };
}
try {
await updatePackageGroupName(groupId, name);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to rename group" };
}
}
export async function dissolveGroupAction(
groupId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
await dissolveGroup(groupId);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to dissolve group" };
}
}
export async function createGroupAction(
name: string,
packageIds: string[]
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
if (!name.trim()) {
return { success: false, error: "Group name cannot be empty" };
}
if (packageIds.length < 2) {
return { success: false, error: "At least 2 packages are required to create a group" };
}
try {
await createManualGroup(name, packageIds);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch (err) {
const message = err instanceof Error ? err.message : "Failed to create group";
return { success: false, error: message };
}
}
export async function removeFromGroupAction(
packageId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
await removePackageFromGroup(packageId);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to remove package from group" };
}
}
export async function updateGroupPreviewAction(
groupId: string,
formData: FormData
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
const file = formData.get("file");
if (!(file instanceof File)) {
return { success: false, error: "No file provided" };
}
if (!ALLOWED_IMAGE_TYPES.includes(file.type as (typeof ALLOWED_IMAGE_TYPES)[number])) {
return { success: false, error: "Only JPG, PNG, and WebP images are accepted" };
}
if (file.size > MAX_IMAGE_SIZE) {
return { success: false, error: "Image must be smaller than 2 MB" };
}
try {
const arrayBuffer = await file.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
await updatePackageGroupPreview(groupId, buffer);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to upload group preview image" };
}
}
export async function mergeGroupsAction(
targetGroupId: string,
sourceGroupId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
if (targetGroupId === sourceGroupId) {
return { success: false, error: "Cannot merge a group with itself" };
}
try {
await mergeGroups(targetGroupId, sourceGroupId);
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to merge groups" };
}
}
export async function sendAllInGroupAction(
groupId: string
): Promise<ActionResult> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
const telegramLink = await prisma.telegramLink.findUnique({
where: { userId: session.user.id },
});
if (!telegramLink) {
return { success: false, error: "No linked Telegram account. Link one in Settings." };
}
const group = await prisma.packageGroup.findUnique({
where: { id: groupId },
select: {
packages: {
select: { id: true, destChannelId: true, destMessageId: true, fileName: true },
},
},
});
if (!group) {
return { success: false, error: "Group not found" };
}
const sendablePackages = group.packages.filter(
(p) => p.destChannelId && p.destMessageId
);
if (sendablePackages.length === 0) {
return { success: false, error: "No packages in this group have been uploaded to a destination channel" };
}
let queued = 0;
for (const pkg of sendablePackages) {
// Only create if no existing PENDING/SENDING request for this package+link combo
const existing = await prisma.botSendRequest.findFirst({
where: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
status: { in: ["PENDING", "SENDING"] },
},
});
if (!existing) {
const sendRequest = await prisma.botSendRequest.create({
data: {
packageId: pkg.id,
telegramLinkId: telegramLink.id,
requestedByUserId: session.user.id,
status: "PENDING",
},
});
// Notify the bot via pg_notify
try {
await prisma.$queryRawUnsafe(
`SELECT pg_notify('bot_send', $1)`,
sendRequest.id
);
} catch {
// Best-effort — the bot also polls periodically
}
queued++;
}
}
revalidatePath("/stls");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to send group packages" };
}
}
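
The watermark-reset pass in `retryAllSkippedPackagesAction` above groups skipped items by `(accountId, channelId)` and keeps the lowest target watermark (`sourceMessageId - 1`) per channel and per topic. That grouping step can be isolated as a pure function — a sketch with hypothetical standalone types, not the project's Prisma-backed code:

```typescript
// Minimal stand-ins for the fields the grouping step reads.
interface SkippedItem {
  accountId: string;
  sourceChannelId: string;
  sourceMessageId: bigint;
  sourceTopicId?: bigint | null;
}

interface ChannelReset {
  minMessageId: bigint;
  topicResets: Map<bigint, bigint>;
}

// For each (accountId, channelId) pair, compute the lowest watermark
// target (sourceMessageId - 1) and the per-topic minimums, mirroring
// the loop in retryAllSkippedPackagesAction.
function computeChannelResets(items: SkippedItem[]): Map<string, ChannelReset> {
  const resets = new Map<string, ChannelReset>();
  for (const item of items) {
    const key = `${item.accountId}:${item.sourceChannelId}`;
    const targetId = item.sourceMessageId - 1n;
    const existing = resets.get(key);
    if (!existing) {
      const topicResets = new Map<bigint, bigint>();
      if (item.sourceTopicId != null) topicResets.set(item.sourceTopicId, targetId);
      resets.set(key, { minMessageId: targetId, topicResets });
    } else {
      if (targetId < existing.minMessageId) existing.minMessageId = targetId;
      if (item.sourceTopicId != null) {
        const prev = existing.topicResets.get(item.sourceTopicId);
        if (prev === undefined || targetId < prev) {
          existing.topicResets.set(item.sourceTopicId, targetId);
        }
      }
    }
  }
  return resets;
}
```

Keeping the minimum per channel matters because a single `findMany` batch can contain several skips from the same channel; rewinding once to the lowest boundary covers them all with one watermark update.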

View File

@@ -1,7 +1,8 @@
 import { auth } from "@/lib/auth";
 import { redirect } from "next/navigation";
-import { listPackages, searchPackages, getIngestionStatus } from "@/lib/telegram/queries";
+import { listDisplayItems, searchPackages, getIngestionStatus, getAllPackageTags, listSkippedPackages, countSkippedPackages, listUngroupedPackages, countUngroupedPackages } from "@/lib/telegram/queries";
 import { StlTable } from "./_components/stl-table";
+import type { DisplayItem, PackageListItem } from "@/lib/telegram/types";
 
 interface Props {
   searchParams: Promise<Record<string, string | string[] | undefined>>;
@@ -19,9 +20,11 @@ export default async function StlFilesPage({ searchParams }: Props) {
   const order = (params.order as "asc" | "desc") ?? "desc";
   const search = (params.search as string) ?? "";
   const creator = (params.creator as string) || undefined;
+  const tag = (params.tag as string) || undefined;
+  const tab = (params.tab as string) ?? "packages";
 
-  // Fetch packages and ingestion status in parallel
-  const [result, ingestionStatus] = await Promise.all([
+  // Fetch packages, ingestion status, tags, and skipped count in parallel
+  const [result, ingestionStatus, availableTags, skippedCount, ungroupedCount] = await Promise.all([
     search
       ? searchPackages({
           query: search,
@@ -29,22 +32,49 @@ export default async function StlFilesPage({ searchParams }: Props) {
           limit: perPage,
           searchIn: "both",
         })
-      : listPackages({
+      : listDisplayItems({
           page,
           limit: perPage,
           creator,
+          tag,
           sortBy: sort as "indexedAt" | "fileName" | "fileSize",
           order,
         }),
     getIngestionStatus(),
+    getAllPackageTags(),
+    countSkippedPackages(),
+    countUngroupedPackages(),
   ]);
 
+  // For search results, wrap as DisplayItem[]; for non-search, already DisplayItem[]
+  const displayItems: DisplayItem[] = search
+    ? (result as { items: PackageListItem[] }).items.map((item) => ({ type: "package" as const, data: item }))
+    : (result as { items: DisplayItem[] }).items;
+
+  // Fetch skipped packages only if on that tab
+  const skippedResult = tab === "skipped"
+    ? await listSkippedPackages({ page, limit: perPage })
+    : null;
+
+  // Fetch ungrouped packages only if on that tab
+  const ungroupedResult = tab === "ungrouped"
+    ? await listUngroupedPackages({ page, limit: perPage })
+    : null;
+
   return (
     <StlTable
-      data={result.items}
+      data={displayItems}
       pageCount={result.pagination.totalPages}
      totalCount={result.pagination.total}
       ingestionStatus={ingestionStatus}
+      availableTags={availableTags}
+      searchTerm={search}
+      skippedData={skippedResult?.items ?? []}
+      skippedPageCount={skippedResult?.pagination.totalPages ?? 0}
+      skippedTotalCount={skippedCount}
+      ungroupedData={ungroupedResult?.items ?? []}
+      ungroupedPageCount={ungroupedResult?.pagination.totalPages ?? 0}
+      ungroupedTotalCount={ungroupedCount}
     />
   );
 }

View File

@@ -7,6 +7,8 @@ import {
   Power,
   ArrowDownToLine,
   ArrowUpFromLine,
+  RefreshCcw,
+  Tag,
 } from "lucide-react";
 import { Badge } from "@/components/ui/badge";
 import { Button } from "@/components/ui/button";
@@ -23,12 +25,16 @@ interface ChannelColumnsProps {
   onToggleActive: (id: string) => void;
   onDelete: (id: string) => void;
   onSetType: (id: string, type: "SOURCE" | "DESTINATION") => void;
+  onRescan: (id: string) => void;
+  onSetCategory: (id: string, category: string | null) => void;
 }
 
 export function getChannelColumns({
   onToggleActive,
   onDelete,
   onSetType,
+  onRescan,
+  onSetCategory,
 }: ChannelColumnsProps): ColumnDef<ChannelRow, unknown>[] {
   return [
     {
@@ -60,6 +66,18 @@ export function getChannelColumns({
         </Badge>
       ),
     },
+    {
+      accessorKey: "category",
+      header: "Category",
+      cell: ({ row }) => {
+        const category = row.original.category;
+        return category ? (
+          <Badge variant="outline">{category}</Badge>
+        ) : (
+          <span className="text-xs text-muted-foreground"></span>
+        );
+      },
+    },
     {
       accessorKey: "isActive",
       header: "Status",
@@ -121,6 +139,23 @@ export function getChannelColumns({
               Set as Source
             </DropdownMenuItem>
           )}
+          {row.original.type === "SOURCE" && (
+            <DropdownMenuItem
+              onClick={() => onRescan(row.original.id)}
+            >
+              <RefreshCcw className="mr-2 h-3.5 w-3.5" />
+              Rescan Channel
+            </DropdownMenuItem>
+          )}
+          <DropdownMenuItem
+            onClick={() => {
+              const cat = prompt("Enter category (e.g. STL, PDF, D&D, Cosplay):", row.original.category ?? "");
+              if (cat !== null) onSetCategory(row.original.id, cat || null);
+            }}
+          >
+            <Tag className="mr-2 h-3.5 w-3.5" />
+            Set Category
+          </DropdownMenuItem>
           <DropdownMenuItem
             onClick={() => onToggleActive(row.original.id)}
           >

View File

@@ -21,7 +21,7 @@ import { ScrollArea } from "@/components/ui/scroll-area";
 interface FetchedChannel {
   chatId: string;
   title: string;
-  type: "channel" | "supergroup";
+  type: string;
   isForum: boolean;
   memberCount: number | null;
   alreadyLinked: boolean;

View File

@@ -2,26 +2,39 @@
 import { useState, useTransition } from "react";
 import { toast } from "sonner";
+import { Download, Plus } from "lucide-react";
 import { getChannelColumns } from "./channel-columns";
 import { DestinationCard } from "./destination-card";
+import { ChannelPickerDialog } from "./channel-picker-dialog";
+import { JoinChannelDialog } from "./join-channel-dialog";
 import {
   deleteChannel,
   toggleChannelActive,
   setChannelType,
+  setChannelCategory,
+  rescanChannel,
 } from "../actions";
 import { DataTable } from "@/components/shared/data-table";
 import { DeleteDialog } from "@/components/shared/delete-dialog";
-import type { ChannelRow, GlobalDestination } from "@/lib/telegram/admin-queries";
+import { Button } from "@/components/ui/button";
+import type { AccountRow, ChannelRow, GlobalDestination } from "@/lib/telegram/admin-queries";
 import { useDataTable } from "@/hooks/use-data-table";
 
 interface ChannelsTabProps {
   channels: ChannelRow[];
   globalDestination: GlobalDestination;
+  accounts: AccountRow[];
 }
 
-export function ChannelsTab({ channels, globalDestination }: ChannelsTabProps) {
+export function ChannelsTab({ channels, globalDestination, accounts }: ChannelsTabProps) {
   const [isPending, startTransition] = useTransition();
   const [deleteId, setDeleteId] = useState<string | null>(null);
+  const [rescanId, setRescanId] = useState<string | null>(null);
+  const [fetchChannelsAccountId, setFetchChannelsAccountId] = useState<string | null>(null);
+  const [joinDialogOpen, setJoinDialogOpen] = useState(false);
+
+  // Find the first authenticated account for "Fetch Channels"
+  const authenticatedAccounts = accounts.filter((a) => a.authState === "AUTHENTICATED" && a.isActive);
 
   const columns = getChannelColumns({
     onToggleActive: (id) => {
@@ -39,6 +52,14 @@ export function ChannelsTab({ channels, globalDestination }: ChannelsTabProps) {
         else toast.error(result.error);
       });
     },
+    onRescan: (id) => setRescanId(id),
+    onSetCategory: (id, category) => {
+      startTransition(async () => {
+        const result = await setChannelCategory(id, category);
+        if (result.success) toast.success(category ? `Category set to "${category}"` : "Category removed");
+        else toast.error(result.error);
+      });
+    },
   });
 
   const { table } = useDataTable({
@@ -60,19 +81,59 @@ export function ChannelsTab({ channels, globalDestination }: ChannelsTabProps) {
     });
   };
 
+  const handleRescan = () => {
+    if (!rescanId) return;
+    startTransition(async () => {
+      const result = await rescanChannel(rescanId);
+      if (result.success) {
+        toast.success("Channel scan progress reset — it will be fully rescanned on the next sync");
+        setRescanId(null);
+      } else {
+        toast.error(result.error);
+      }
+    });
+  };
+
+  const handleFetchChannels = () => {
+    if (authenticatedAccounts.length > 0) {
+      setFetchChannelsAccountId(authenticatedAccounts[0].id);
+    } else {
+      toast.error("No authenticated accounts available. Add and authenticate an account first.");
+    }
+  };
+
   return (
     <div className="space-y-4">
-      <DestinationCard destination={globalDestination} />
+      <DestinationCard destination={globalDestination} channels={channels} />
+      <div className="flex items-center gap-2">
+        <Button
+          variant="outline"
+          onClick={handleFetchChannels}
+          disabled={authenticatedAccounts.length === 0}
+        >
+          <Download className="mr-2 h-4 w-4" />
+          Fetch Channels
+        </Button>
+        <Button
+          variant="outline"
+          onClick={() => setJoinDialogOpen(true)}
+          disabled={authenticatedAccounts.length === 0}
+        >
+          <Plus className="mr-2 h-4 w-4" />
+          Add Channel
+        </Button>
+      </div>
       {channels.length > 0 && (
         <p className="text-xs text-muted-foreground">
-          Source channels are added per-account via the &quot;Fetch Channels&quot; button on the Accounts tab.
+          Channels discovered via &quot;Fetch Channels&quot; are automatically activated as sources.
         </p>
       )}
       <DataTable
         table={table}
-        emptyMessage="No channels yet. Use &quot;Fetch Channels&quot; on an account to discover and add source channels."
+        emptyMessage="No channels yet. Click &quot;Fetch Channels&quot; above to discover and add source channels."
       />
       <DeleteDialog
@@ -83,6 +144,29 @@ export function ChannelsTab({ channels, globalDestination }: ChannelsTabProps) {
         onConfirm={handleDelete}
         isLoading={isPending}
       />
+      <DeleteDialog
+        open={!!rescanId}
+        onOpenChange={(open) => !open && setRescanId(null)}
+        title="Rescan Channel"
+        description="This will reset all scan progress for this channel. On the next sync the worker will re-process every message from the beginning. Packages that are already in the library will be skipped (deduplication by hash), but any missing files will be re-downloaded and re-uploaded. This may take a long time for large channels."
+        confirmLabel="Rescan"
+        onConfirm={handleRescan}
+        isLoading={isPending}
+      />
+      <ChannelPickerDialog
+        accountId={fetchChannelsAccountId}
+        open={!!fetchChannelsAccountId}
+        onOpenChange={(open) => {
+          if (!open) setFetchChannelsAccountId(null);
+        }}
+      />
+      <JoinChannelDialog
+        open={joinDialogOpen}
+        onOpenChange={setJoinDialogOpen}
+      />
     </div>
   );
 }

View File

@@ -1,9 +1,21 @@
 "use client";
 
 import { useState, useEffect, useTransition } from "react";
-import { Database, AlertTriangle, Link2, Plus, Loader2 } from "lucide-react";
+import {
+  Database,
+  AlertTriangle,
+  Link2,
+  Plus,
+  Loader2,
+  ArrowRight,
+  RefreshCw,
+} from "lucide-react";
 import { toast } from "sonner";
-import { createDestinationViaWorker } from "../actions";
+import {
+  createDestinationViaWorker,
+  setGlobalDestination,
+  rebuildPackageDatabase,
+} from "../actions";
 import { Card, CardContent } from "@/components/ui/card";
 import { Badge } from "@/components/ui/badge";
 import { Button } from "@/components/ui/button";
@@ -17,10 +29,19 @@ import {
   DialogTitle,
   DialogFooter,
 } from "@/components/ui/dialog";
-import type { GlobalDestination } from "@/lib/telegram/admin-queries";
+import {
+  Select,
+  SelectContent,
+  SelectItem,
+  SelectTrigger,
+  SelectValue,
+} from "@/components/ui/select";
+import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
+import type { GlobalDestination, ChannelRow } from "@/lib/telegram/admin-queries";
 
 interface DestinationCardProps {
   destination: GlobalDestination;
+  channels?: ChannelRow[];
 }
 
 type CreateState =
@@ -29,11 +50,34 @@ type CreateState =
   | { phase: "done"; title: string; telegramId: string }
   | { phase: "error"; message: string };
 
-export function DestinationCard({ destination }: DestinationCardProps) {
+type RebuildState =
+  | { phase: "idle" }
+  | { phase: "running"; requestId: string }
+  | { phase: "done"; created: number; skipped: number; scanned: number }
+  | { phase: "error"; message: string };
+
+interface RebuildProgress {
+  status: string;
+  messagesScanned: number;
+  documentsFound: number;
+  packagesCreated: number;
+  packagesSkipped: number;
+  error?: string;
+}
+
+export function DestinationCard({ destination, channels = [] }: DestinationCardProps) {
   const [isPending, startTransition] = useTransition();
   const [createOpen, setCreateOpen] = useState(false);
   const [title, setTitle] = useState("dragonsstash db");
   const [createState, setCreateState] = useState<CreateState>({ phase: "idle" });
+  const [selectedChannelId, setSelectedChannelId] = useState<string>("");
+  const [rebuildState, setRebuildState] = useState<RebuildState>({ phase: "idle" });
+  const [rebuildProgress, setRebuildProgress] = useState<RebuildProgress | null>(null);
+
+  // Channels that can be assigned as destination (SOURCE channels only, exclude current destination)
+  const assignableChannels = channels.filter(
+    (c) => c.type === "SOURCE" && c.id !== destination?.id
+  );
 
   // Poll for worker result when creating
   useEffect(() => {
@@ -90,6 +134,86 @@ export function DestinationCard({ destination }: DestinationCardProps) {
     return () => { mounted = false; };
   }, [createState]);
 
+  // Poll for rebuild progress
+  useEffect(() => {
+    if (rebuildState.phase !== "running") return;
+    let mounted = true;
+    const requestId = rebuildState.requestId;
+    const poll = async () => {
+      for (let i = 0; i < 300; i++) {
+        await new Promise((r) => setTimeout(r, 2000));
+        if (!mounted) return;
+        try {
+          const res = await fetch(
+            `/api/telegram/worker-request?requestId=${requestId}`
+          );
+          if (!res.ok) continue;
+          const data = await res.json();
+          // Update live progress from resultJson
+          if (data.result && typeof data.result === "object") {
+            if (mounted) setRebuildProgress(data.result as RebuildProgress);
+          }
+          if (data.status === "COMPLETED" && data.result) {
+            const result = data.result as RebuildProgress;
+            if (mounted) {
+              setRebuildState({
+                phase: "done",
+                created: result.packagesCreated,
+                skipped: result.packagesSkipped,
+                scanned: result.messagesScanned,
+              });
+              setRebuildProgress(null);
+              toast.success(
+                `Rebuild complete: ${result.packagesCreated} packages restored, ${result.packagesSkipped} skipped`
+              );
+            }
+            return;
+          } else if (data.status === "FAILED") {
+            if (mounted) {
+              setRebuildState({
+                phase: "error",
+                message: data.error || "Rebuild failed",
+              });
+              setRebuildProgress(null);
+            }
+            return;
+          }
+        } catch {
+          // Network blip — keep polling
+        }
+      }
+      if (mounted) {
+        setRebuildState({ phase: "error", message: "Timed out waiting for rebuild" });
+        setRebuildProgress(null);
+      }
+    };
+    poll();
+    return () => {
+      mounted = false;
+    };
+  }, [rebuildState]);
+
+  const handleRebuild = () => {
+    startTransition(async () => {
+      const result = await rebuildPackageDatabase();
+      if (result.success) {
+        setRebuildState({ phase: "running", requestId: result.data.requestId });
+        setRebuildProgress(null);
+        toast.info("Rebuild started — scanning destination channel...");
+      } else {
+        toast.error(result.error ?? "Failed to start rebuild");
+      }
+    });
+  };
+
   const handleCreate = () => {
     if (!title.trim()) return;
@@ -103,6 +227,21 @@ export function DestinationCard({ destination }: DestinationCardProps) {
     });
   };
 
+  const handleAssignExisting = () => {
+    if (!selectedChannelId) return;
+    startTransition(async () => {
+      const result = await setGlobalDestination(selectedChannelId);
+      if (result.success) {
+        toast.success("Channel set as destination!");
+        setCreateOpen(false);
+        setSelectedChannelId("");
+      } else {
+        toast.error(result.error ?? "Failed to set destination");
+      }
+    });
+  };
+
   const handleOpenChange = (open: boolean) => {
     setCreateOpen(open);
     if (!open) {
@@ -110,6 +249,7 @@ export function DestinationCard({ destination }: DestinationCardProps) {
       if (createState.phase !== "creating") {
         setCreateState({ phase: "idle" });
       }
+      setSelectedChannelId("");
     }
   };
 
@@ -132,19 +272,23 @@ export function DestinationCard({ destination }: DestinationCardProps) {
           </div>
           <Button size="sm" onClick={() => setCreateOpen(true)}>
             <Plus className="mr-2 h-3.5 w-3.5" />
-            Create Destination
+            Set Destination
           </Button>
         </CardContent>
       </Card>
-      <CreateDestinationDialog
+      <DestinationDialog
         open={createOpen}
         onOpenChange={handleOpenChange}
         title={title}
         setTitle={setTitle}
-        onSubmit={handleCreate}
+        onSubmitCreate={handleCreate}
         createState={createState}
         isPending={isPending}
+        assignableChannels={assignableChannels}
+        selectedChannelId={selectedChannelId}
+        setSelectedChannelId={setSelectedChannelId}
+        onSubmitAssign={handleAssignExisting}
       />
     </>
   );
@@ -153,7 +297,8 @@ export function DestinationCard({ destination }: DestinationCardProps) {
   return (
     <>
       <Card>
-        <CardContent className="flex items-center justify-between gap-4 py-4">
+        <CardContent className="py-4 space-y-3">
+          <div className="flex items-center justify-between gap-4">
           <div className="flex items-center gap-3">
             <Database className="h-5 w-5 text-purple-500 shrink-0" />
             <div>
@@ -177,6 +322,21 @@ export function DestinationCard({ destination }: DestinationCardProps) {
               </div>
             </div>
           </div>
+          <div className="flex items-center gap-2">
+            <Button
+              variant="outline"
+              size="sm"
+              onClick={handleRebuild}
+              disabled={isPending || rebuildState.phase === "running"}
+              title="Scan destination channel and rebuild the package database"
+            >
+              {rebuildState.phase === "running" ? (
+                <Loader2 className="h-3.5 w-3.5 animate-spin mr-1.5" />
+              ) : (
+                <RefreshCw className="h-3.5 w-3.5 mr-1.5" />
+              )}
+              Rebuild DB
+            </Button>
           <Button
             variant="outline"
             size="sm"
@@ -184,49 +344,124 @@ export function DestinationCard({ destination }: DestinationCardProps) {
           >
             Change
           </Button>
+          </div>
+          </div>
+          {/* Rebuild progress */}
+          {rebuildState.phase === "running" && rebuildProgress && (
+            <div className="border-t pt-3">
+              <div className="flex items-center gap-2">
+                <Loader2 className="h-3.5 w-3.5 animate-spin text-primary shrink-0" />
+                <span className="text-xs text-muted-foreground">
+                  Rebuilding package database...
+                </span>
+              </div>
+              <div className="flex items-center gap-4 pl-6 mt-1 text-xs text-muted-foreground">
+                <span>
+                  <span className="text-foreground tabular-nums">
+                    {rebuildProgress.messagesScanned}
+                  </span>{" "}
+                  messages scanned
+                </span>
+                <span>
+                  <span className="text-foreground tabular-nums">
+                    {rebuildProgress.documentsFound}
+                  </span>{" "}
+                  archives found
+                </span>
+                <span>
+                  <span className="text-foreground tabular-nums">
+                    {rebuildProgress.packagesCreated}
+                  </span>{" "}
+                  restored
+                </span>
+                <span>
+                  <span className="text-foreground tabular-nums">
+                    {rebuildProgress.packagesSkipped}
+                  </span>{" "}
+                  skipped
+                </span>
+              </div>
+            </div>
+          )}
+          {rebuildState.phase === "done" && (
+            <div className="border-t pt-3">
+              <div className="flex items-center gap-2 text-xs text-emerald-500">
+                <Database className="h-3.5 w-3.5 shrink-0" />
+                <span>
+                  Rebuild complete: {rebuildState.created} packages restored,{" "}
+                  {rebuildState.skipped} skipped ({rebuildState.scanned} messages
+                  scanned)
+                </span>
+              </div>
+            </div>
+          )}
+          {rebuildState.phase === "error" && (
+            <div className="border-t pt-3">
+              <div className="flex items-center gap-2 text-xs text-red-500">
+                <AlertTriangle className="h-3.5 w-3.5 shrink-0" />
+                <span>Rebuild failed: {rebuildState.message}</span>
+              </div>
+            </div>
+          )}
         </CardContent>
       </Card>
-      <CreateDestinationDialog
+      <DestinationDialog
         open={createOpen}
         onOpenChange={handleOpenChange}
         title={title}
         setTitle={setTitle}
-        onSubmit={handleCreate}
+        onSubmitCreate={handleCreate}
         createState={createState}
         isPending={isPending}
+        assignableChannels={assignableChannels}
+        selectedChannelId={selectedChannelId}
+        setSelectedChannelId={setSelectedChannelId}
+        onSubmitAssign={handleAssignExisting}
       />
     </>
   );
 }
 
-function CreateDestinationDialog({
+function DestinationDialog({
   open,
   onOpenChange,
   title,
   setTitle,
-  onSubmit,
+  onSubmitCreate,
   createState,
   isPending,
+  assignableChannels,
+  selectedChannelId,
+  setSelectedChannelId,
+  onSubmitAssign,
 }: {
   open: boolean;
   onOpenChange: (open: boolean) => void;
   title: string;
   setTitle: (v: string) => void;
-  onSubmit: () => void;
+  onSubmitCreate: () => void;
   createState: CreateState;
   isPending: boolean;
+  assignableChannels: ChannelRow[];
+  selectedChannelId: string;
+  setSelectedChannelId: (v: string) => void;
+  onSubmitAssign: () => void;
 }) {
   const isCreating = createState.phase === "creating";
+  const hasAssignable = assignableChannels.length > 0;
 
   return (
     <Dialog open={open} onOpenChange={onOpenChange}>
       <DialogContent className="sm:max-w-md">
         <DialogHeader>
-          <DialogTitle>Create Destination Channel</DialogTitle>
+          <DialogTitle>Set Destination Channel</DialogTitle>
           <DialogDescription>
-            A private Telegram group will be created automatically using one of
-            your authenticated accounts. All accounts will write archives here.
+            Choose an existing channel or create a new private group. All
+            accounts will write archives to this destination.
           </DialogDescription>
         </DialogHeader>
@@ -241,7 +476,71 @@ function CreateDestinationDialog({
             </p>
           </div>
         ) : (
-          <div className="space-y-4">
+          <Tabs defaultValue={hasAssignable ? "existing" : "create"} className="w-full">
+            <TabsList className="grid w-full grid-cols-2">
+              <TabsTrigger value="existing" disabled={!hasAssignable}>
+                <ArrowRight className="mr-1.5 h-3.5 w-3.5" />
+                Use Existing
+              </TabsTrigger>
+              <TabsTrigger value="create">
+                <Plus className="mr-1.5 h-3.5 w-3.5" />
+                Create New
+              </TabsTrigger>
+            </TabsList>
+            <TabsContent value="existing" className="space-y-4 pt-2">
+              {createState.phase === "error" && (
+                <div className="rounded-md border border-destructive/50 bg-destructive/10 p-3">
+                  <p className="text-sm text-destructive">{createState.message}</p>
+                </div>
+              )}
+              <div className="space-y-2">
+                <Label>Select Channel</Label>
+                <Select
+                  value={selectedChannelId}
+                  onValueChange={setSelectedChannelId}
+                >
+                  <SelectTrigger>
+                    <SelectValue placeholder="Pick a channel..." />
+                  </SelectTrigger>
+                  <SelectContent>
+                    {assignableChannels.map((ch) => (
+                      <SelectItem key={ch.id} value={ch.id}>
+                        {ch.title}{" "}
+                        <span className="text-muted-foreground text-xs">
+                          ({ch.telegramId})
+                        </span>
+                      </SelectItem>
+                    ))}
+                  </SelectContent>
+                </Select>
+                <p className="text-xs text-muted-foreground">
+                  The selected channel will become the destination. All accounts
+                  will be linked as writers automatically.
+                </p>
+              </div>
+              <DialogFooter>
+                <Button
+                  variant="outline"
+                  onClick={() => onOpenChange(false)}
+                >
+                  Cancel
+                </Button>
+                <Button
+                  onClick={onSubmitAssign}
+                  disabled={isPending || !selectedChannelId}
+                >
+                  {isPending && (
+                    <Loader2 className="mr-2 h-4 w-4 animate-spin" />
+                  )}
+                  Set as Destination
+                </Button>
+              </DialogFooter>
+            </TabsContent>
+            <TabsContent value="create" className="space-y-4 pt-2">
             {createState.phase === "error" && (
               <div className="rounded-md border border-destructive/50 bg-destructive/10 p-3">
                 <p className="text-sm text-destructive">{createState.message}</p>
@@ -257,30 +556,31 @@ function CreateDestinationDialog({
                 onChange={(e) => setTitle(e.target.value)}
               />
               <p className="text-xs text-muted-foreground">
-                This will be the name of the Telegram group. You can rename it later in Telegram.
+                A new private Telegram group will be created using one of your
+                authenticated accounts. You can rename it later in Telegram.
               </p>
             </div>
-          </div>
-        )}
             <DialogFooter>
               <Button
                 variant="outline"
                 onClick={() => onOpenChange(false)}
-                disabled={isCreating}
               >
                 Cancel
               </Button>
               <Button
-                onClick={onSubmit}
-                disabled={isPending || isCreating || !title.trim()}
+                onClick={onSubmitCreate}
+                disabled={isPending || !title.trim()}
               >
-                {(isPending || isCreating) && (
+                {isPending && (
                   <Loader2 className="mr-2 h-4 w-4 animate-spin" />
                 )}
                 Create Group
               </Button>
             </DialogFooter>
+            </TabsContent>
+          </Tabs>
+        )}
       </DialogContent>
     </Dialog>
   );

View File

@@ -0,0 +1,179 @@
"use client";
import { useState, useEffect, useCallback } from "react";
import { Loader2, Link as LinkIcon } from "lucide-react";
import { toast } from "sonner";
import { joinChannelByLink } from "../actions";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogFooter,
} from "@/components/ui/dialog";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
interface JoinChannelDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
}
type JoinState =
| { phase: "idle" }
| { phase: "submitting"; requestId?: string }
| { phase: "success"; title: string }
| { phase: "error"; message: string };
export function JoinChannelDialog({
open,
onOpenChange,
}: JoinChannelDialogProps) {
const [input, setInput] = useState("");
const [joinState, setJoinState] = useState<JoinState>({ phase: "idle" });
// Reset on close
useEffect(() => {
if (!open) {
setInput("");
setJoinState({ phase: "idle" });
}
}, [open]);
const pollForResult = useCallback(async (requestId: string) => {
for (let i = 0; i < 30; i++) {
await new Promise((r) => setTimeout(r, 2000));
try {
const res = await fetch(
`/api/telegram/worker-request?requestId=${requestId}`
);
if (!res.ok) continue;
const data = await res.json();
if (data.status === "COMPLETED") {
const result = data.result;
setJoinState({
phase: "success",
title: result?.title ?? "Unknown channel",
});
toast.success(`Channel "${result?.title}" added as source`);
// Auto-close after short delay
setTimeout(() => onOpenChange(false), 1500);
return;
} else if (data.status === "FAILED") {
setJoinState({
phase: "error",
message: data.error || "Failed to join channel",
});
return;
}
} catch {
// Network error, keep polling
}
}
setJoinState({
phase: "error",
message: "Request timed out. The worker may be busy -- try again later.",
});
}, [onOpenChange]);
const handleSubmit = async () => {
if (!input.trim()) return;
setJoinState({ phase: "submitting" });
try {
const result = await joinChannelByLink(input);
if (!result.success) {
setJoinState({ phase: "error", message: result.error ?? "Unknown error" });
return;
}
const requestId = result.data!.requestId;
setJoinState({ phase: "submitting", requestId });
await pollForResult(requestId);
} catch (err) {
const message = err instanceof Error ? err.message : "Network error";
setJoinState({ phase: "error", message });
}
};
const isSubmitting = joinState.phase === "submitting";
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-md">
<DialogHeader>
<DialogTitle>Add Channel</DialogTitle>
<DialogDescription>
Join a Telegram channel or group by link, username, or invite link.
The channel will be added as an active source.
</DialogDescription>
</DialogHeader>
<div className="space-y-4 py-2">
<div className="space-y-2">
<Label htmlFor="channel-input">Channel link or username</Label>
<Input
id="channel-input"
placeholder="@channel, t.me/channel, or t.me/+invite"
value={input}
onChange={(e) => setInput(e.target.value)}
onKeyDown={(e) => {
if (e.key === "Enter" && !isSubmitting && input.trim()) {
handleSubmit();
}
}}
disabled={isSubmitting}
/>
<p className="text-xs text-muted-foreground">
Supported formats: @username, https://t.me/username, https://t.me/+invitecode
</p>
</div>
{joinState.phase === "submitting" && (
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Loader2 className="h-4 w-4 animate-spin" />
{joinState.requestId
? "Joining channel via worker..."
: "Sending request..."}
</div>
)}
{joinState.phase === "error" && (
<p className="text-sm text-destructive">{joinState.message}</p>
)}
{joinState.phase === "success" && (
<p className="text-sm text-emerald-600">
Successfully added &quot;{joinState.title}&quot;
</p>
)}
</div>
<DialogFooter>
<Button variant="outline" onClick={() => onOpenChange(false)}>
{joinState.phase === "success" ? "Close" : "Cancel"}
</Button>
{joinState.phase !== "success" && (
<Button
onClick={handleSubmit}
disabled={isSubmitting || !input.trim()}
>
{isSubmitting ? (
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
) : (
<LinkIcon className="mr-2 h-4 w-4" />
)}
Add Channel
</Button>
)}
</DialogFooter>
</DialogContent>
</Dialog>
);
}

View File

@@ -16,6 +16,7 @@ interface TelegramAdminProps {
  ingestionStatus: IngestionAccountStatus[];
  globalDestination: GlobalDestination;
  sendHistory: SendHistoryRow[];
+  workerIntervalMinutes: number;
}

export function TelegramAdmin({
@@ -24,6 +25,7 @@ export function TelegramAdmin({
  ingestionStatus,
  globalDestination,
  sendHistory,
+  workerIntervalMinutes,
}: TelegramAdminProps) {
  return (
    <div className="space-y-4">
@@ -32,7 +34,7 @@ export function TelegramAdmin({
        description="Manage Telegram accounts, channels, and ingestion"
      />
-      <WorkerStatusPanel initialStatus={ingestionStatus} />
+      <WorkerStatusPanel initialStatus={ingestionStatus} initialIntervalMinutes={workerIntervalMinutes} />

      <Tabs defaultValue="accounts" className="space-y-4">
        <TabsList>
@@ -51,7 +53,7 @@ export function TelegramAdmin({
          <AccountsTab accounts={accounts} />
        </TabsContent>
        <TabsContent value="channels">
-          <ChannelsTab channels={channels} globalDestination={globalDestination} />
+          <ChannelsTab channels={channels} globalDestination={globalDestination} accounts={accounts} />
        </TabsContent>
        <TabsContent value="sends">
          <BotSendsTab history={sendHistory} />

View File

@@ -1,6 +1,6 @@
"use client";

-import { useEffect, useState, useCallback } from "react";
+import { useEffect, useState, useCallback, useTransition } from "react";
import {
  Loader2,
  CheckCircle2,
@@ -14,10 +14,13 @@ import { Card, CardContent } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { cn } from "@/lib/utils";
+import { toast } from "sonner";
+import { triggerIngestion } from "../actions";
import type { IngestionAccountStatus } from "@/lib/telegram/types";

interface WorkerStatusPanelProps {
  initialStatus: IngestionAccountStatus[];
+  initialIntervalMinutes?: number;
}

const AUTH_STATE_CONFIG: Record<
@@ -39,15 +42,28 @@ const AUTH_STATE_CONFIG: Record<
  EXPIRED: { label: "Expired", color: "text-red-500", icon: "x" },
};

-export function WorkerStatusPanel({ initialStatus }: WorkerStatusPanelProps) {
+export function WorkerStatusPanel({ initialStatus, initialIntervalMinutes = 60 }: WorkerStatusPanelProps) {
  const [accounts, setAccounts] = useState(initialStatus);
  const [error, setError] = useState(false);
  const [nextRunCountdown, setNextRunCountdown] = useState<string | null>(null);
+  const [workerIntervalMinutes, setWorkerIntervalMinutes] = useState(initialIntervalMinutes);
+  const [isPending, startTransition] = useTransition();

  // Find active run
  const activeRun = accounts.find((a) => a.currentRun);
  const isRunning = !!activeRun;

+  const handleSyncNow = useCallback(() => {
+    startTransition(async () => {
+      const result = await triggerIngestion();
+      if (result.success) {
+        toast.success("Sync triggered — worker will start shortly");
+      } else {
+        toast.error(result.error ?? "Failed to trigger sync");
+      }
+    });
+  }, []);

  // Poll for status
  useEffect(() => {
    let timer: ReturnType<typeof setTimeout>;
@@ -60,6 +76,9 @@ export function WorkerStatusPanel({ initialStatus }: WorkerStatusPanelProps) {
        const data = await res.json();
        if (mounted) {
          setAccounts(data.accounts ?? []);
+          if (data.workerIntervalMinutes) {
+            setWorkerIntervalMinutes(data.workerIntervalMinutes);
+          }
          setError(false);
        }
      } catch {
@@ -86,7 +105,7 @@ export function WorkerStatusPanel({ initialStatus }: WorkerStatusPanelProps) {
      return;
    }

-    // Estimate next run based on last run finish time + interval (5 min + up to 5 min jitter)
+    // Estimate next run based on last run finish time + configured interval + up to 5 min jitter
    const lastFinished = accounts
      .filter((a) => a.lastRun?.finishedAt)
      .map((a) => new Date(a.lastRun!.finishedAt!).getTime())
@@ -97,7 +116,7 @@ export function WorkerStatusPanel({ initialStatus }: WorkerStatusPanelProps) {
      return;
    }

-    const intervalMs = 5 * 60 * 1000; // 5 min base
+    const intervalMs = workerIntervalMinutes * 60 * 1000;
    const estimatedNext = lastFinished + intervalMs;

    const tick = () => {
@@ -116,7 +135,7 @@ export function WorkerStatusPanel({ initialStatus }: WorkerStatusPanelProps) {
    tick();
    const interval = setInterval(tick, 1_000);
    return () => clearInterval(interval);
-  }, [isRunning, accounts]);
+  }, [isRunning, accounts, workerIntervalMinutes]);

  if (accounts.length === 0 && !error) {
    return (
@@ -182,7 +201,12 @@ export function WorkerStatusPanel({ initialStatus }: WorkerStatusPanelProps) {
        ) : isRunning && activeRun?.currentRun ? (
          <RunningStatus run={activeRun.currentRun} />
        ) : (
-          <IdleStatus accounts={accounts} nextRunCountdown={nextRunCountdown} />
+          <IdleStatus
+            accounts={accounts}
+            nextRunCountdown={nextRunCountdown}
+            onSyncNow={handleSyncNow}
+            isSyncing={isPending}
+          />
        )}
      </CardContent>
    </Card>
@@ -256,9 +280,13 @@ function RunningStatus({
function IdleStatus({
  accounts,
  nextRunCountdown,
+  onSyncNow,
+  isSyncing,
}: {
  accounts: IngestionAccountStatus[];
  nextRunCountdown: string | null;
+  onSyncNow: () => void;
+  isSyncing: boolean;
}) {
  const lastRun = accounts
    .filter((a) => a.lastRun)
@@ -321,14 +349,32 @@ function IdleStatus({
        )}
      </div>

+      <div className="flex items-center gap-2 shrink-0">
        {nextRunCountdown && hasAuthenticated && (
-          <div className="flex items-center gap-1.5 shrink-0">
+          <div className="flex items-center gap-1.5">
            <RefreshCw className="h-3 w-3 text-muted-foreground" />
            <span className="text-xs text-muted-foreground tabular-nums">
              Next: {nextRunCountdown}
            </span>
          </div>
        )}
+        {hasAuthenticated && (
+          <Button
+            variant="outline"
+            size="sm"
+            className="h-7 text-xs px-2"
+            onClick={onSyncNow}
+            disabled={isSyncing}
+          >
+            {isSyncing ? (
+              <Loader2 className="h-3 w-3 animate-spin mr-1" />
+            ) : (
+              <RefreshCw className="h-3 w-3 mr-1" />
+            )}
+            Sync Now
+          </Button>
+        )}
+      </div>
    </div>
  );
}

View File

@@ -231,11 +231,35 @@ export async function toggleChannelActive(id: string): Promise<ActionResult> {
  const existing = await prisma.telegramChannel.findUnique({ where: { id } });
  if (!existing) return { success: false, error: "Channel not found" };

+  const newActive = !existing.isActive;

  try {
    await prisma.telegramChannel.update({
      where: { id },
-      data: { isActive: !existing.isActive },
+      data: { isActive: newActive },
    });

+    // When enabling a SOURCE channel, auto-create READER links for all
+    // active authenticated accounts so the worker can scan it.
+    // Without this, toggling a channel active without going through the
+    // channel picker leaves it with no AccountChannelMap READER link.
+    if (newActive && existing.type === "SOURCE") {
+      const accounts = await prisma.telegramAccount.findMany({
+        where: { isActive: true, authState: "AUTHENTICATED" },
+        select: { id: true },
+      });
+      for (const account of accounts) {
+        try {
+          await prisma.accountChannelMap.create({
+            data: { accountId: account.id, channelId: id, role: "READER" },
+          });
+        } catch {
+          // Already linked — ignore unique constraint violation
+        }
+      }
+    }

    revalidatePath(REVALIDATE_PATH);
    return { success: true, data: undefined };
  } catch {
@@ -259,6 +283,84 @@ export async function deleteChannel(id: string): Promise<ActionResult> {
  }
}
export async function setChannelCategory(
id: string,
category: string | null
): Promise<ActionResult> {
const admin = await requireAdmin();
if (!admin.success) return admin;
try {
const existing = await prisma.telegramChannel.findUnique({
where: { id },
select: { category: true },
});
if (!existing) return { success: false, error: "Channel not found" };
const oldCategory = existing.category;
const newCategory = category?.trim() || null;
await prisma.telegramChannel.update({
where: { id },
data: { category: newCategory },
});
// Retroactively re-tag packages from this channel when category changes
if (oldCategory !== newCategory && newCategory) {
await retagChannelPackages(id, oldCategory, newCategory);
}
revalidatePath("/telegram");
return { success: true, data: undefined };
} catch {
return { success: false, error: "Failed to update category" };
}
}
export async function retagChannelPackages(
channelId: string,
oldCategory: string | null,
newCategory: string
): Promise<ActionResult<{ updated: number }>> {
const session = await auth();
if (!session?.user?.id) return { success: false, error: "Unauthorized" };
try {
// Find packages from this channel that have the old category tag (or no category tag)
const packages = await prisma.package.findMany({
where: { sourceChannelId: channelId },
select: { id: true, tags: true },
});
let updated = 0;
for (const pkg of packages) {
const tags = [...pkg.tags];
// Remove old category tag if present
if (oldCategory) {
const idx = tags.indexOf(oldCategory);
if (idx !== -1) tags.splice(idx, 1);
}
// Add new category tag if not already present
if (!tags.includes(newCategory)) {
tags.push(newCategory);
}
// Only update if tags actually changed
if (JSON.stringify(tags) !== JSON.stringify(pkg.tags)) {
await prisma.package.update({
where: { id: pkg.id },
data: { tags },
});
updated++;
}
}
revalidatePath("/stls");
return { success: true, data: { updated } };
} catch {
return { success: false, error: "Failed to re-tag packages" };
}
}
export async function setChannelType(
  id: string,
  type: "SOURCE" | "DESTINATION"
@@ -270,6 +372,13 @@ export async function setChannelType(
  if (!existing) return { success: false, error: "Channel not found" };

  try {
+    if (type === "DESTINATION") {
+      // Setting as destination: use the full global destination logic
+      // so it updates the global settings key, creates WRITER links, etc.
+      return await setGlobalDestination(id);
+    }

+    // Setting as SOURCE — just change the type
    await prisma.telegramChannel.update({
      where: { id },
      data: { type },
@@ -281,19 +390,49 @@ export async function setChannelType(
  }
}
-export async function triggerChannelSync(): Promise<ActionResult> {
+/**
+ * Reset all scan progress for a channel so the worker will re-process it
+ * from the very beginning on the next ingestion cycle.
+ *
+ * This clears:
+ * - `lastProcessedMessageId` on every AccountChannelMap linked to this channel
+ * - All TopicProgress records for those maps (for forum channels)
+ */
+export async function rescanChannel(channelId: string): Promise<ActionResult> {
  const admin = await requireAdmin();
  if (!admin.success) return admin;

+  const channel = await prisma.telegramChannel.findUnique({
+    where: { id: channelId },
+  });
+  if (!channel) return { success: false, error: "Channel not found" };

  try {
-    // Signal the worker to do a channel sync via pg_notify
-    await prisma.$queryRawUnsafe(
-      `SELECT pg_notify('channel_sync', 'requested')`
-    );
+    // Find all account-channel maps for this channel
+    const maps = await prisma.accountChannelMap.findMany({
+      where: { channelId },
+      select: { id: true },
+    });
+    const mapIds = maps.map((m) => m.id);

+    // Delete all topic progress records for these maps (forum channels)
+    if (mapIds.length > 0) {
+      await prisma.topicProgress.deleteMany({
+        where: { accountChannelMapId: { in: mapIds } },
+      });
+    }

+    // Reset the scan cursor so the worker re-processes from the start
+    await prisma.accountChannelMap.updateMany({
+      where: { channelId },
+      data: { lastProcessedMessageId: null },
+    });

    revalidatePath(REVALIDATE_PATH);
    return { success: true, data: undefined };
  } catch {
-    return { success: false, error: "Failed to trigger channel sync" };
+    return { success: false, error: "Failed to reset channel scan progress" };
  }
}
@@ -377,7 +516,7 @@ export async function triggerIngestion(
  try {
    await prisma.$queryRawUnsafe(
      `SELECT pg_notify('ingestion_trigger', $1)`,
-      accounts.map((a) => a.id).join(",")
+      accounts.map((a: { id: string }) => a.id).join(",")
    );
  } catch {
    // Best-effort
@@ -407,7 +546,7 @@ export async function saveChannelSelections(
  try {
    let linked = 0;
    for (const ch of channels) {
-      // Upsert the channel record (new channels default to disabled)
+      // Upsert the channel record and activate it (user explicitly selected it)
      const channel = await prisma.telegramChannel.upsert({
        where: { telegramId: BigInt(ch.telegramId) },
        create: {
@@ -415,11 +554,12 @@ export async function saveChannelSelections(
          title: ch.title,
          type: "SOURCE",
          isForum: ch.isForum,
-          isActive: false,
+          isActive: true,
        },
        update: {
          title: ch.title,
          isForum: ch.isForum,
+          isActive: true,
        },
      });
@@ -444,6 +584,56 @@ export async function saveChannelSelections(
  }
}
// ── Join channel by link/username ──
/**
* Request the worker to join a channel by t.me link, invite link, or @username.
* Uses ChannelFetchRequest as a generic DB-mediated request with pg_notify.
* Returns the requestId so the UI can poll for completion.
*/
export async function joinChannelByLink(
input: string
): Promise<ActionResult<{ requestId: string }>> {
const admin = await requireAdmin();
if (!admin.success) return admin;
const trimmed = input.trim();
if (!trimmed) return { success: false, error: "Input is required" };
try {
// Need at least one authenticated account for TDLib
const account = await prisma.telegramAccount.findFirst({
where: { isActive: true, authState: "AUTHENTICATED" },
select: { id: true },
});
if (!account) {
return { success: false, error: "At least one authenticated account is needed" };
}
// Create a fetch request to track progress
const fetchRequest = await prisma.channelFetchRequest.create({
data: {
accountId: account.id,
status: "PENDING",
},
});
// Signal worker via pg_notify
await prisma.$queryRawUnsafe(
`SELECT pg_notify('join_channel', $1)`,
JSON.stringify({
requestId: fetchRequest.id,
input: trimmed,
accountId: account.id,
})
);
return { success: true, data: { requestId: fetchRequest.id } };
} catch {
return { success: false, error: "Failed to request channel join" };
}
}
// ── Global destination channel ──

export async function setGlobalDestination(
@@ -574,6 +764,63 @@ export async function createDestinationChannel(
  }
}
/**
* Request the worker to rebuild the package database by scanning the
* destination channel for uploaded archives and recreating Package records.
* Uses ChannelFetchRequest as a generic DB-mediated request with pg_notify.
* Returns the requestId so the UI can poll for progress.
*/
export async function rebuildPackageDatabase(): Promise<
ActionResult<{ requestId: string }>
> {
const admin = await requireAdmin();
if (!admin.success) return admin;
try {
// Need at least one authenticated account for TDLib
const hasAccount = await prisma.telegramAccount.findFirst({
where: { isActive: true, authState: "AUTHENTICATED" },
select: { id: true },
});
if (!hasAccount) {
return {
success: false,
error:
"At least one authenticated account is needed to scan the destination channel",
};
}
// Need a destination channel
const destSetting = await prisma.globalSetting.findUnique({
where: { key: "destination_channel_id" },
});
if (!destSetting) {
return {
success: false,
error: "No destination channel configured",
};
}
// Create a fetch request to track progress
const fetchRequest = await prisma.channelFetchRequest.create({
data: {
accountId: hasAccount.id,
status: "PENDING",
},
});
// Signal worker via pg_notify
await prisma.$queryRawUnsafe(
`SELECT pg_notify('rebuild_packages', $1)`,
fetchRequest.id
);
return { success: true, data: { requestId: fetchRequest.id } };
} catch {
return { success: false, error: "Failed to request package database rebuild" };
}
}
/**
 * Request the worker to create a new Telegram supergroup as the destination.
 * Uses ChannelFetchRequest as a generic DB-mediated request with pg_notify.

View File

@@ -25,7 +25,7 @@ export default async function TelegramPage() {
    }),
  ]);

-  const serializedHistory = sendHistory.map((r) => ({
+  const serializedHistory = sendHistory.map((r: typeof sendHistory[number]) => ({
    id: r.id,
    packageName: r.package.fileName,
    recipientName: r.telegramLink.telegramName,
@@ -42,6 +42,7 @@ export default async function TelegramPage() {
      ingestionStatus={ingestionStatus}
      globalDestination={globalDestination}
      sendHistory={serializedHistory}
+      workerIntervalMinutes={parseInt(process.env.WORKER_INTERVAL_MINUTES ?? "60", 10)}
    />
  );
}

View File

@@ -11,6 +11,23 @@ export async function registerUser(input: unknown): Promise<ActionResult<{ id: s
    return { success: false, error: "Validation failed" };
  }

+  // Validate invite code
+  const invite = await prisma.inviteCode.findUnique({
+    where: { code: parsed.data.inviteCode },
+  });
+  if (!invite) {
+    return { success: false, error: "Invalid invite code. Please check the code and try again." };
+  }
+  if (invite.uses >= invite.maxUses) {
+    return { success: false, error: "This invite code has reached its maximum number of uses" };
+  }
+  if (invite.expiresAt && invite.expiresAt < new Date()) {
+    return { success: false, error: "This invite code has expired. Please request a new one." };
+  }

  const existing = await prisma.user.findUnique({
    where: { email: parsed.data.email },
  });
@@ -21,13 +38,15 @@ export async function registerUser(input: unknown): Promise<ActionResult<{ id: s
  const hashedPassword = await bcrypt.hash(parsed.data.password, 10);

-  // Self-hosted: all users are admins
-  const user = await prisma.user.create({
+  // Create user and increment invite usage in a transaction
+  const user = await prisma.$transaction(async (tx) => {
+    const newUser = await tx.user.create({
      data: {
        name: parsed.data.name,
        email: parsed.data.email,
        hashedPassword,
-        role: "ADMIN",
+        role: "USER",
+        usedInviteId: invite.id,
        settings: {
          create: {
            lowStockThreshold: 10,
@@ -39,5 +58,13 @@ export async function registerUser(input: unknown): Promise<ActionResult<{ id: s
      },
    });

+    await tx.inviteCode.update({
+      where: { id: invite.id },
+      data: { uses: { increment: 1 } },
+    });

+    return newUser;
+  });

  return { success: true, data: { id: user.id } };
}

View File

@@ -1,7 +1,7 @@
"use client";

import { useState, useTransition } from "react";
-import { useRouter } from "next/navigation";
+import { useRouter, useSearchParams } from "next/navigation";
import Link from "next/link";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
@@ -24,12 +24,19 @@ import { APP_NAME } from "@/lib/constants";
export default function RegisterPage() {
  const router = useRouter();
+  const searchParams = useSearchParams();
  const [error, setError] = useState<string | null>(null);
  const [isPending, startTransition] = useTransition();

  const form = useForm<RegisterInput>({
    resolver: zodResolver(registerSchema),
-    defaultValues: { name: "", email: "", password: "", confirmPassword: "" },
+    defaultValues: {
+      name: "",
+      email: "",
+      password: "",
+      confirmPassword: "",
+      inviteCode: searchParams.get("code") ?? "",
+    },
  });

  function onSubmit(values: RegisterInput) {
@@ -75,7 +82,7 @@ export default function RegisterPage() {
      <Card>
        <CardHeader>
          <CardTitle>Create Account</CardTitle>
-          <CardDescription>Fill in your details below</CardDescription>
+          <CardDescription>You need an invite code to register</CardDescription>
        </CardHeader>
        <CardContent>
          <Form {...form}>
@@ -86,6 +93,24 @@ export default function RegisterPage() {
              </div>
            )}

+            <FormField
+              control={form.control}
+              name="inviteCode"
+              render={({ field }) => (
+                <FormItem>
+                  <FormLabel>Invite Code</FormLabel>
+                  <FormControl>
+                    <Input
+                      placeholder="Enter your invite code"
+                      autoComplete="off"
+                      {...field}
+                    />
+                  </FormControl>
+                  <FormMessage />
+                </FormItem>
+              )}
+            />

            <FormField
              control={form.control}
              name="name"

View File

@@ -0,0 +1,36 @@
import { NextResponse } from "next/server";
import { prisma } from "@/lib/prisma";
import { authenticateApiRequest } from "@/lib/telegram/api-auth";
export async function GET(
request: Request,
{ params }: { params: Promise<{ id: string }> }
) {
const authResult = await authenticateApiRequest(request);
if ("error" in authResult) return authResult.error;
const { id } = await params;
const group = await prisma.packageGroup.findUnique({
where: { id },
select: { previewData: true },
});
if (!group || !group.previewData) {
return new NextResponse(null, { status: 404 });
}
const buffer =
group.previewData instanceof Buffer
? group.previewData
: Buffer.from(group.previewData);
return new NextResponse(buffer, {
status: 200,
headers: {
"Content-Type": "image/jpeg",
"Content-Length": String(buffer.length),
"Cache-Control": "public, max-age=3600, immutable",
},
});
}

View File

@@ -9,5 +9,9 @@ export async function GET(request: Request) {
  if ("error" in authResult) return authResult.error;

  const accounts = await getIngestionStatus();
-  return NextResponse.json({ accounts });
+  const workerIntervalMinutes = parseInt(
+    process.env.WORKER_INTERVAL_MINUTES ?? "60",
+    10
+  );
+  return NextResponse.json({ accounts, workerIntervalMinutes });
}

View File

@@ -45,33 +45,20 @@ export async function POST(request: Request) {
    );
  }

-  // Create ingestion runs marked as RUNNING — the worker will pick these up
-  // when it next polls, or we use pg_notify for immediate pickup
-  for (const account of accounts) {
-    // Only create if no run is already RUNNING for this account
-    const existing = await prisma.ingestionRun.findFirst({
-      where: { accountId: account.id, status: "RUNNING" },
-    });
-    if (!existing) {
-      await prisma.ingestionRun.create({
-        data: { accountId: account.id, status: "RUNNING" },
-      });
-    }
-  }
-
-  // Send pg_notify for immediate worker pickup
+  // Send pg_notify for immediate worker pickup.
+  // The worker creates its own IngestionRun records with proper activity tracking.
  try {
    await prisma.$queryRawUnsafe(
      `SELECT pg_notify('ingestion_trigger', $1)`,
-      accounts.map((a) => a.id).join(",")
+      accounts.map((a: { id: string }) => a.id).join(",")
    );
  } catch {
-    // pg_notify is best-effort — worker will pick up on next cycle anyway
+    // pg_notify is best-effort — worker will pick up on next scheduled cycle anyway
  }

  return NextResponse.json({
    triggered: true,
-    accountIds: accounts.map((a) => a.id),
-    message: `Ingestion queued for ${accounts.length} account(s)`,
+    accountIds: accounts.map((a: { id: string }) => a.id),
+    message: `Ingestion triggered for ${accounts.length} account(s)`,
  });
}

Some files were not shown because too many files have changed in this diff.