feat: update the privacy and terms pages with a fixed last-updated date and a dynamic file-retention period

feat: add an analytics service for Google Analytics integration

test: update the API service tests to reflect endpoint changes

fix: adjust the API service to support multiple file uploads and user authentication

feat: implement an auth store with Zustand for user management

fix: harden the Nginx configuration for security and analytics support
This commit is contained in:
Your Name
2026-03-07 11:14:05 +02:00
parent cfbcc8bd79
commit 0ad2ba0f02
73 changed files with 4696 additions and 462 deletions


@@ -21,9 +21,15 @@ MAX_CONTENT_LENGTH_MB=50
 UPLOAD_FOLDER=/tmp/uploads
 OUTPUT_FOLDER=/tmp/outputs
 FILE_EXPIRY_SECONDS=1800
+DATABASE_PATH=/app/data/saas_pdf.db
 # CORS
 CORS_ORIGINS=http://localhost:5173,http://localhost:3000
-# AdSense
-ADSENSE_CLIENT_ID=ca-pub-XXXXXXXXXXXXXXXX
+# Frontend Analytics / Ads (Vite)
+VITE_GA_MEASUREMENT_ID=G-XXXXXXXXXX
+VITE_ADSENSE_CLIENT_ID=ca-pub-XXXXXXXXXXXXXXXX
+VITE_ADSENSE_SLOT_HOME_TOP=1234567890
+VITE_ADSENSE_SLOT_HOME_BOTTOM=1234567891
+VITE_ADSENSE_SLOT_TOP_BANNER=1234567892
+VITE_ADSENSE_SLOT_BOTTOM_BANNER=1234567893

.gitignore

@@ -43,6 +43,7 @@ docker-compose.override.yml
 uploads/
 tmp/
 *.tmp
+backend/data/*.db
 # Logs
 *.log


@@ -2,13 +2,18 @@
 A free SaaS platform offering PDF, image, video, and text processing tools. Built with **Python Flask** (backend) and **React + Vite** (frontend), powered by **Celery + Redis** for async processing, and deployed on **AWS**.

-## 🛠 Tools (MVP)
+## 🛠 Tools (Current)

-1. **PDF to Word / Word to PDF** — Convert between PDF and Word documents
-2. **PDF Compressor** — Reduce PDF file size with quality options
-3. **Image Converter** — Convert between JPG, PNG, WebP formats
-4. **Video to GIF** — Create animated GIFs from video clips
-5. **Text Tools** — Word counter, text cleaner, case converter (client-side)
+1. **PDF Conversion** — PDF↔Word
+2. **PDF Optimization** — Compress PDF
+3. **PDF Utilities** — Merge, split, rotate, page numbers, watermark
+4. **PDF Security** — Protect and unlock PDF
+5. **PDF/Image Tools** — PDF→Images, Images→PDF
+6. **Image Tools** — Convert and resize images
+7. **Video Tools** — Video→GIF
+8. **Text Tools** — Word counter and text cleaner
+9. **Flowchart Tools** — Extract procedures from PDF and generate flowcharts (+ sample mode)
+10. **Accounts & History** — Email/password sign-in with recent generated-file history

 ## 🏗 Tech Stack
@@ -19,7 +24,7 @@ A free SaaS platform offering PDF, image, video, and text processing tools. Buil
 | File Processing | LibreOffice, Ghostscript, Pillow, ffmpeg |
 | Frontend | React 18 + Vite 5 + TypeScript |
 | Styling | Tailwind CSS (RTL support) |
-| i18n | react-i18next (Arabic + English) |
+| i18n | react-i18next (Arabic + English + French) |
 | Storage | AWS S3 (temp files with auto-cleanup) |
 | CDN | AWS CloudFront |
 | Server | AWS EC2 + Nginx |
@@ -33,6 +38,7 @@ cd SaaS-PDF
 # 2. Copy environment file
 cp .env.example .env
+cp frontend/.env.example frontend/.env

 # 3. Start all services with Docker
 docker-compose up --build
@@ -43,6 +49,31 @@ docker-compose up --build
 # Celery Flower: http://localhost:5555
 ```
+
+## ⚙️ Runtime Limits (Default)
+
+- File retention: **30 minutes** (`FILE_EXPIRY_SECONDS=1800`)
+- PDF max size: **20MB**
+- Word max size: **15MB**
+- Image max size: **10MB**
+- Video max size: **50MB**
+
+## 🔐 Accounts & Sessions
+
+- Session-backed authentication via `/api/auth/*`
+- Free account creation with email + password
+- Recent generated-file history via `/api/history`
+- Persistent SQLite storage at `DATABASE_PATH` (defaults to `backend/data/saas_pdf.db` locally)
+
+## 📈 Analytics & Ads Env
+
+- `VITE_GA_MEASUREMENT_ID`
+- `VITE_ADSENSE_CLIENT_ID`
+- `VITE_ADSENSE_SLOT_HOME_TOP`
+- `VITE_ADSENSE_SLOT_HOME_BOTTOM`
+- `VITE_ADSENSE_SLOT_TOP_BANNER`
+- `VITE_ADSENSE_SLOT_BOTTOM_BANNER`
+- `DATABASE_PATH`
 ## 📁 Project Structure
 ```
@@ -59,7 +90,7 @@ SaaS-PDF/
 ## 💰 Revenue Model

 - **Google AdSense** — Ads on result/download pages
-- **Freemium** (planned) — Pro features: no ads, higher limits, API access
+- **Freemium** (next phase) — Pro features: no ads, higher limits, API access

 ## 📄 License
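The README's new runtime limits can be read as a small per-type guard. A minimal illustrative sketch — the MB caps are the documented values, but `check_upload_size` and `MAX_UPLOAD_MB` are hypothetical names, not the project's actual validator:

```python
# Hypothetical sketch of the per-type upload caps listed in the README.
MAX_UPLOAD_MB = {"pdf": 20, "docx": 15, "jpg": 10, "mp4": 50}


def check_upload_size(ext: str, size_bytes: int) -> bool:
    """Return True when a file of size_bytes fits the documented cap for ext."""
    cap_mb = MAX_UPLOAD_MB.get(ext)
    if cap_mb is None:
        return False  # unknown type: reject rather than guess
    return size_bytes <= cap_mb * 1024 * 1024
```

A 19MB PDF passes while a 21MB one is rejected, matching the 20MB cap documented above.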


@@ -28,8 +28,8 @@ RUN pip install --no-cache-dir -r requirements.txt \
 # Copy application code
 COPY . .

-# Create temp directories
-RUN mkdir -p /tmp/uploads /tmp/outputs
+# Create temp and persistence directories
+RUN mkdir -p /tmp/uploads /tmp/outputs /app/data

 # Expose port
 EXPOSE 5000


@@ -5,6 +5,7 @@ from flask import Flask
 from config import config
 from app.extensions import cors, limiter, talisman, init_celery
+from app.services.account_service import init_account_db

 def create_app(config_name=None):
@@ -15,12 +16,19 @@ def create_app(config_name=None):
     app = Flask(__name__)
     app.config.from_object(config[config_name])

-    # Create upload/output directories
+    # Create upload/output/database directories
     os.makedirs(app.config["UPLOAD_FOLDER"], exist_ok=True)
     os.makedirs(app.config["OUTPUT_FOLDER"], exist_ok=True)
+    db_dir = os.path.dirname(app.config["DATABASE_PATH"])
+    if db_dir:
+        os.makedirs(db_dir, exist_ok=True)

     # Initialize extensions
-    cors.init_app(app, origins=app.config["CORS_ORIGINS"])
+    cors.init_app(
+        app,
+        origins=app.config["CORS_ORIGINS"],
+        supports_credentials=True,
+    )
     limiter.init_app(app)
@@ -36,11 +44,21 @@ def create_app(config_name=None):
         ],
         "style-src": ["'self'", "'unsafe-inline'", "https://fonts.googleapis.com"],
         "font-src": ["'self'", "https://fonts.gstatic.com"],
-        "img-src": ["'self'", "data:", "https://pagead2.googlesyndication.com"],
-        "frame-src": ["https://googleads.g.doubleclick.net"],
+        "img-src": [
+            "'self'",
+            "data:",
+            "https://pagead2.googlesyndication.com",
+            "https://tpc.googlesyndication.com",
+            "https://www.google-analytics.com",
+        ],
+        "frame-src": [
+            "https://googleads.g.doubleclick.net",
+            "https://tpc.googlesyndication.com",
+        ],
         "connect-src": [
             "'self'",
             "https://www.google-analytics.com",
+            "https://pagead2.googlesyndication.com",
             "https://*.amazonaws.com",
         ],
     }
@@ -53,25 +71,38 @@ def create_app(config_name=None):
     # Initialize Celery
     init_celery(app)

+    with app.app_context():
+        init_account_db()

     # Register blueprints
     from app.routes.health import health_bp
+    from app.routes.auth import auth_bp
+    from app.routes.account import account_bp
+    from app.routes.admin import admin_bp
     from app.routes.convert import convert_bp
     from app.routes.compress import compress_bp
     from app.routes.image import image_bp
     from app.routes.video import video_bp
+    from app.routes.history import history_bp
     from app.routes.tasks import tasks_bp
     from app.routes.download import download_bp
     from app.routes.pdf_tools import pdf_tools_bp
     from app.routes.flowchart import flowchart_bp
+    from app.routes.v1.tools import v1_bp

     app.register_blueprint(health_bp, url_prefix="/api")
+    app.register_blueprint(auth_bp, url_prefix="/api/auth")
+    app.register_blueprint(account_bp, url_prefix="/api/account")
+    app.register_blueprint(admin_bp, url_prefix="/api/internal/admin")
     app.register_blueprint(convert_bp, url_prefix="/api/convert")
     app.register_blueprint(compress_bp, url_prefix="/api/compress")
     app.register_blueprint(image_bp, url_prefix="/api/image")
     app.register_blueprint(video_bp, url_prefix="/api/video")
+    app.register_blueprint(history_bp, url_prefix="/api")
     app.register_blueprint(pdf_tools_bp, url_prefix="/api/pdf-tools")
     app.register_blueprint(flowchart_bp, url_prefix="/api/flowchart")
     app.register_blueprint(tasks_bp, url_prefix="/api/tasks")
     app.register_blueprint(download_bp, url_prefix="/api/download")
+    app.register_blueprint(v1_bp, url_prefix="/api/v1")

     return app


@@ -30,6 +30,7 @@ def init_celery(app):
         "app.tasks.image_tasks.*": {"queue": "image"},
         "app.tasks.video_tasks.*": {"queue": "video"},
         "app.tasks.pdf_tools_tasks.*": {"queue": "pdf_tools"},
+        "app.tasks.flowchart_tasks.*": {"queue": "flowchart"},
     }

     class ContextTask(celery.Task):
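The new `flowchart` entry above extends Celery's glob-style `task_routes` map: a task's dotted name is matched against these patterns to pick a queue. An illustrative stand-alone sketch of that matching — not Celery's actual resolver:

```python
from fnmatch import fnmatch

# Route table mirroring the diff above; glob patterns map task names to queues.
TASK_ROUTES = {
    "app.tasks.image_tasks.*": {"queue": "image"},
    "app.tasks.video_tasks.*": {"queue": "video"},
    "app.tasks.pdf_tools_tasks.*": {"queue": "pdf_tools"},
    "app.tasks.flowchart_tasks.*": {"queue": "flowchart"},
}


def pick_queue(task_name: str, default: str = "celery") -> str:
    """Return the queue of the first pattern matching task_name, else the default."""
    for pattern, options in TASK_ROUTES.items():
        if fnmatch(task_name, pattern):
            return options["queue"]
    return default
```

With this table, `app.tasks.flowchart_tasks.extract_flowchart_task` lands on the `flowchart` queue, and unrouted tasks fall through to Celery's default queue.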


@@ -0,0 +1,89 @@
"""Authenticated account endpoints — usage summary and API key management."""
from flask import Blueprint, jsonify, request

from app.extensions import limiter
from app.services.account_service import (
    create_api_key,
    get_user_by_id,
    list_api_keys,
    revoke_api_key,
)
from app.services.policy_service import get_usage_summary_for_user
from app.utils.auth import get_current_user_id

account_bp = Blueprint("account", __name__)


@account_bp.route("/usage", methods=["GET"])
@limiter.limit("120/hour")
def get_usage_route():
    """Return plan, quota, and effective file-size cap summary for the current user."""
    user_id = get_current_user_id()
    if user_id is None:
        return jsonify({"error": "Authentication required."}), 401
    user = get_user_by_id(user_id)
    if user is None:
        return jsonify({"error": "User not found."}), 404
    return jsonify(get_usage_summary_for_user(user_id, user["plan"])), 200


@account_bp.route("/api-keys", methods=["GET"])
@limiter.limit("60/hour")
def list_api_keys_route():
    """Return all API keys for the authenticated pro user."""
    user_id = get_current_user_id()
    if user_id is None:
        return jsonify({"error": "Authentication required."}), 401
    user = get_user_by_id(user_id)
    if user is None:
        return jsonify({"error": "User not found."}), 404
    if user["plan"] != "pro":
        return jsonify({"error": "API key management requires a Pro plan."}), 403
    return jsonify({"items": list_api_keys(user_id)}), 200


@account_bp.route("/api-keys", methods=["POST"])
@limiter.limit("20/hour")
def create_api_key_route():
    """Create a new API key for the authenticated pro user."""
    user_id = get_current_user_id()
    if user_id is None:
        return jsonify({"error": "Authentication required."}), 401
    user = get_user_by_id(user_id)
    if user is None:
        return jsonify({"error": "User not found."}), 404
    if user["plan"] != "pro":
        return jsonify({"error": "API key management requires a Pro plan."}), 403
    data = request.get_json(silent=True) or {}
    name = str(data.get("name", "")).strip()
    if not name:
        return jsonify({"error": "API key name is required."}), 400
    try:
        result = create_api_key(user_id, name)
    except ValueError as exc:
        return jsonify({"error": str(exc)}), 400
    return jsonify(result), 201


@account_bp.route("/api-keys/<int:key_id>", methods=["DELETE"])
@limiter.limit("30/hour")
def revoke_api_key_route(key_id: int):
    """Revoke one API key owned by the authenticated user."""
    user_id = get_current_user_id()
    if user_id is None:
        return jsonify({"error": "Authentication required."}), 401
    if not revoke_api_key(user_id, key_id):
        return jsonify({"error": "API key not found or already revoked."}), 404
    return jsonify({"message": "API key revoked."}), 200


@@ -0,0 +1,39 @@
"""Internal admin endpoints secured by INTERNAL_ADMIN_SECRET."""
from flask import Blueprint, current_app, jsonify, request

from app.extensions import limiter
from app.services.account_service import get_user_by_id, update_user_plan

admin_bp = Blueprint("admin", __name__)


def _check_admin_secret() -> bool:
    """Return whether the request carries the correct admin secret."""
    secret = current_app.config.get("INTERNAL_ADMIN_SECRET", "")
    if not secret:
        return False
    return request.headers.get("X-Admin-Secret", "") == secret


@admin_bp.route("/users/<int:user_id>/plan", methods=["POST"])
@limiter.limit("30/hour")
def update_plan_route(user_id: int):
    """Change the plan for one user — secured by X-Admin-Secret header."""
    if not _check_admin_secret():
        return jsonify({"error": "Unauthorized."}), 401
    data = request.get_json(silent=True) or {}
    plan = str(data.get("plan", "")).strip().lower()
    if plan not in ("free", "pro"):
        return jsonify({"error": "Plan must be 'free' or 'pro'."}), 400
    user = get_user_by_id(user_id)
    if user is None:
        return jsonify({"error": "User not found."}), 404
    try:
        updated = update_user_plan(user_id, plan)
    except ValueError as exc:
        return jsonify({"error": str(exc)}), 400
    return jsonify({"message": "Plan updated.", "user": updated}), 200
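One design note on `_check_admin_secret` above: comparing secrets with `==` can leak timing information. A hedged hardening sketch using the standard library's constant-time comparison — an idea for a follow-up, not part of this commit:

```python
import hmac


def check_admin_secret(configured: str, provided: str) -> bool:
    """Constant-time variant of the header check; an empty configured secret always fails."""
    if not configured:
        return False
    # hmac.compare_digest compares in time independent of where strings differ.
    return hmac.compare_digest(configured, provided)
```

The empty-secret guard mirrors the route's existing behavior: if `INTERNAL_ADMIN_SECRET` is unset, every request is rejected.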

backend/app/routes/auth.py

@@ -0,0 +1,100 @@
"""Authentication routes backed by Flask sessions."""
import re

from flask import Blueprint, jsonify, request

from app.extensions import limiter
from app.services.account_service import (
    authenticate_user,
    create_user,
    get_user_by_id,
)
from app.utils.auth import (
    get_current_user_id,
    login_user_session,
    logout_user_session,
)

auth_bp = Blueprint("auth", __name__)

EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
MIN_PASSWORD_LENGTH = 8
MAX_PASSWORD_LENGTH = 128


def _parse_credentials() -> tuple[str | None, str | None]:
    """Extract normalized credential fields from a JSON request body."""
    data = request.get_json(silent=True) or {}
    email = str(data.get("email", "")).strip().lower()
    password = str(data.get("password", ""))
    return email, password


def _validate_credentials(email: str, password: str) -> str | None:
    """Return an error message when credentials are invalid."""
    if not email or not EMAIL_PATTERN.match(email):
        return "A valid email address is required."
    if len(password) < MIN_PASSWORD_LENGTH:
        return f"Password must be at least {MIN_PASSWORD_LENGTH} characters."
    if len(password) > MAX_PASSWORD_LENGTH:
        return f"Password must be {MAX_PASSWORD_LENGTH} characters or less."
    return None


@auth_bp.route("/register", methods=["POST"])
@limiter.limit("10/hour")
def register_route():
    """Create a new account and start an authenticated session."""
    email, password = _parse_credentials()
    validation_error = _validate_credentials(email, password)
    if validation_error:
        return jsonify({"error": validation_error}), 400
    try:
        user = create_user(email, password)
    except ValueError as exc:
        return jsonify({"error": str(exc)}), 409
    login_user_session(user["id"])
    return jsonify({"message": "Account created successfully.", "user": user}), 201


@auth_bp.route("/login", methods=["POST"])
@limiter.limit("20/hour")
def login_route():
    """Authenticate an existing account and start an authenticated session."""
    email, password = _parse_credentials()
    validation_error = _validate_credentials(email, password)
    if validation_error:
        return jsonify({"error": validation_error}), 400
    user = authenticate_user(email, password)
    if user is None:
        return jsonify({"error": "Invalid email or password."}), 401
    login_user_session(user["id"])
    return jsonify({"message": "Signed in successfully.", "user": user}), 200


@auth_bp.route("/logout", methods=["POST"])
@limiter.limit("60/hour")
def logout_route():
    """End the active authenticated session."""
    logout_user_session()
    return jsonify({"message": "Signed out successfully."}), 200


@auth_bp.route("/me", methods=["GET"])
@limiter.limit("120/hour")
def me_route():
    """Return the authenticated user, if one exists in session."""
    user_id = get_current_user_id()
    if user_id is None:
        return jsonify({"authenticated": False, "user": None}), 200
    user = get_user_by_id(user_id)
    if user is None:
        logout_user_session()
        return jsonify({"authenticated": False, "user": None}), 200
    return jsonify({"authenticated": True, "user": user}), 200


@@ -2,7 +2,15 @@
 from flask import Blueprint, request, jsonify

 from app.extensions import limiter
-from app.utils.file_validator import validate_file, FileValidationError
+from app.services.policy_service import (
+    assert_quota_available,
+    build_task_tracking_kwargs,
+    PolicyError,
+    record_accepted_usage,
+    resolve_web_actor,
+    validate_actor_file,
+)
+from app.utils.file_validator import FileValidationError
 from app.utils.sanitizer import generate_safe_path
 from app.tasks.compress_tasks import compress_pdf_task
@@ -25,21 +33,31 @@ def compress_pdf_route():
     file = request.files["file"]
     quality = request.form.get("quality", "medium")

-    # Validate quality parameter
     if quality not in ("low", "medium", "high"):
         quality = "medium"

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code

-    # Save file to temp location
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)

-    # Dispatch async task
-    task = compress_pdf_task.delay(input_path, task_id, original_filename, quality)
+    task = compress_pdf_task.delay(
+        input_path,
+        task_id,
+        original_filename,
+        quality,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "compress-pdf", task.id)

     return jsonify({
         "task_id": task.id,

@@ -2,7 +2,15 @@
 from flask import Blueprint, request, jsonify

 from app.extensions import limiter
-from app.utils.file_validator import validate_file, FileValidationError
+from app.services.policy_service import (
+    assert_quota_available,
+    build_task_tracking_kwargs,
+    PolicyError,
+    record_accepted_usage,
+    resolve_web_actor,
+    validate_actor_file,
+)
+from app.utils.file_validator import FileValidationError
 from app.utils.sanitizer import generate_safe_path
 from app.tasks.convert_tasks import convert_pdf_to_word, convert_word_to_pdf
@@ -23,17 +31,27 @@ def pdf_to_word_route():
     file = request.files["file"]

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code

-    # Save file to temp location
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)

-    # Dispatch async task
-    task = convert_pdf_to_word.delay(input_path, task_id, original_filename)
+    task = convert_pdf_to_word.delay(
+        input_path,
+        task_id,
+        original_filename,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "pdf-to-word", task.id)

     return jsonify({
         "task_id": task.id,
@@ -55,9 +73,15 @@ def word_to_pdf_route():
     file = request.files["file"]

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(
-            file, allowed_types=["doc", "docx"]
-        )
+        original_filename, ext = validate_actor_file(
+            file, allowed_types=["doc", "docx"], actor=actor
+        )
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
@@ -65,7 +89,13 @@ def word_to_pdf_route():
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)

-    task = convert_word_to_pdf.delay(input_path, task_id, original_filename)
+    task = convert_word_to_pdf.delay(
+        input_path,
+        task_id,
+        original_filename,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "word-to-pdf", task.id)

     return jsonify({
         "task_id": task.id,


@@ -3,9 +3,20 @@ import logging
 from flask import Blueprint, request, jsonify

 from app.extensions import limiter
-from app.utils.file_validator import validate_file, FileValidationError
+from app.services.policy_service import (
+    assert_quota_available,
+    build_task_tracking_kwargs,
+    PolicyError,
+    record_accepted_usage,
+    resolve_web_actor,
+    validate_actor_file,
+)
+from app.utils.file_validator import FileValidationError
 from app.utils.sanitizer import generate_safe_path
-from app.tasks.flowchart_tasks import extract_flowchart_task
+from app.tasks.flowchart_tasks import (
+    extract_flowchart_task,
+    extract_sample_flowchart_task,
+)

 logger = logging.getLogger(__name__)
@@ -26,15 +37,27 @@ def extract_flowchart_route():
     file = request.files["file"]

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code

     task_id, input_path = generate_safe_path(ext)
     file.save(input_path)

-    task = extract_flowchart_task.delay(input_path, task_id, original_filename)
+    task = extract_flowchart_task.delay(
+        input_path,
+        task_id,
+        original_filename,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "pdf-flowchart", task.id)

     return jsonify({
         "task_id": task.id,
@@ -42,6 +65,29 @@ def extract_flowchart_route():
     }), 202

+@flowchart_bp.route("/extract-sample", methods=["POST"])
+@limiter.limit("20/minute")
+def extract_sample_flowchart_route():
+    """
+    Generate a sample flowchart payload for demo/testing flows.
+
+    Returns: JSON with task_id for polling
+    """
+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    task = extract_sample_flowchart_task.delay(**build_task_tracking_kwargs(actor))
+    record_accepted_usage(actor, "pdf-flowchart-sample", task.id)
+
+    return jsonify({
+        "task_id": task.id,
+        "message": "Sample flowchart generation started.",
+    }), 202

 @flowchart_bp.route("/chat", methods=["POST"])
 @limiter.limit("20/minute")
 def flowchart_chat_route():


@@ -0,0 +1,32 @@
"""Authenticated file history routes."""
from flask import Blueprint, jsonify, request

from app.extensions import limiter
from app.services.account_service import get_user_by_id, list_file_history
from app.services.policy_service import get_history_limit
from app.utils.auth import get_current_user_id

history_bp = Blueprint("history", __name__)


@history_bp.route("/history", methods=["GET"])
@limiter.limit("120/hour")
def list_history_route():
    """Return recent generated-file history for the authenticated user."""
    user_id = get_current_user_id()
    if user_id is None:
        return jsonify({"error": "Authentication required."}), 401
    user = get_user_by_id(user_id)
    if user is None:
        return jsonify({"error": "User not found."}), 404
    plan_limit = get_history_limit(user["plan"])
    try:
        requested = int(request.args.get("limit", plan_limit))
    except ValueError:
        requested = plan_limit
    limit = max(1, min(plan_limit, requested))
    return jsonify({"items": list_file_history(user_id, limit=limit)}), 200
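The `limit` clamp in the history route is worth isolating: a client-supplied `?limit=` can never exceed the plan's cap, nor drop below 1. The same expression on its own:

```python
def clamp_history_limit(requested: int, plan_limit: int) -> int:
    """Bound a client-requested history length into [1, plan_limit]."""
    return max(1, min(plan_limit, requested))
```

So `?limit=500` on a plan capped at 20 still returns at most 20 items, and `?limit=0` is bumped to 1.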


@@ -2,7 +2,15 @@
 from flask import Blueprint, request, jsonify

 from app.extensions import limiter
-from app.utils.file_validator import validate_file, FileValidationError
+from app.services.policy_service import (
+    assert_quota_available,
+    build_task_tracking_kwargs,
+    PolicyError,
+    record_accepted_usage,
+    resolve_web_actor,
+    validate_actor_file,
+)
+from app.utils.file_validator import FileValidationError
 from app.utils.sanitizer import generate_safe_path
 from app.tasks.image_tasks import convert_image_task, resize_image_task
@@ -43,19 +51,31 @@ def convert_image_route():
     except ValueError:
         quality = 85

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(file, allowed_types=ALLOWED_IMAGE_TYPES)
+        original_filename, ext = validate_actor_file(
+            file, allowed_types=ALLOWED_IMAGE_TYPES, actor=actor
+        )
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code

-    # Save file
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)

-    # Dispatch task
     task = convert_image_task.delay(
-        input_path, task_id, original_filename, output_format, quality
+        input_path,
+        task_id,
+        original_filename,
+        output_format,
+        quality,
+        **build_task_tracking_kwargs(actor),
     )
+    record_accepted_usage(actor, "image-convert", task.id)

     return jsonify({
         "task_id": task.id,
@@ -104,8 +124,16 @@ def resize_image_route():
     except ValueError:
         quality = 85

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(file, allowed_types=ALLOWED_IMAGE_TYPES)
+        original_filename, ext = validate_actor_file(
+            file, allowed_types=ALLOWED_IMAGE_TYPES, actor=actor
+        )
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
@@ -113,8 +141,15 @@ def resize_image_route():
     file.save(input_path)

     task = resize_image_task.delay(
-        input_path, task_id, original_filename, width, height, quality
+        input_path,
+        task_id,
+        original_filename,
+        width,
+        height,
+        quality,
+        **build_task_tracking_kwargs(actor),
     )
+    record_accepted_usage(actor, "image-resize", task.id)

     return jsonify({
         "task_id": task.id,


@@ -2,10 +2,18 @@
 import os
 import uuid

-from flask import Blueprint, request, jsonify
+from flask import Blueprint, request, jsonify, current_app
 from app.extensions import limiter
-from app.utils.file_validator import validate_file, FileValidationError
+from app.services.policy_service import (
+    assert_quota_available,
+    build_task_tracking_kwargs,
+    PolicyError,
+    record_accepted_usage,
+    resolve_web_actor,
+    validate_actor_file,
+)
+from app.utils.file_validator import FileValidationError
 from app.utils.sanitizer import generate_safe_path
 from app.tasks.pdf_tools_tasks import (
     merge_pdfs_task,
@@ -43,24 +51,36 @@ def merge_pdfs_route():
     if len(files) > 20:
         return jsonify({"error": "Maximum 20 files allowed."}), 400

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     task_id = str(uuid.uuid4())
     input_paths = []
     original_filenames = []

     for f in files:
         try:
-            original_filename, ext = validate_file(f, allowed_types=["pdf"])
+            original_filename, ext = validate_actor_file(f, allowed_types=["pdf"], actor=actor)
         except FileValidationError as e:
             return jsonify({"error": e.message}), e.code

-        upload_dir = os.path.join("/tmp/uploads", task_id)
+        upload_dir = os.path.join(current_app.config["UPLOAD_FOLDER"], task_id)
         os.makedirs(upload_dir, exist_ok=True)
         file_path = os.path.join(upload_dir, f"{uuid.uuid4()}.{ext}")
         f.save(file_path)
         input_paths.append(file_path)
         original_filenames.append(original_filename)

-    task = merge_pdfs_task.delay(input_paths, task_id, original_filenames)
+    task = merge_pdfs_task.delay(
+        input_paths,
+        task_id,
+        original_filenames,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "merge-pdf", task.id)

     return jsonify({
         "task_id": task.id,
@@ -98,15 +118,29 @@ def split_pdf_route():
             "error": "Please specify which pages to extract (e.g. 1,3,5-8)."
         }), 400

+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code

     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code

     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)

-    task = split_pdf_task.delay(input_path, task_id, original_filename, mode, pages)
+    task = split_pdf_task.delay(
+        input_path,
+        task_id,
+        original_filename,
+        mode,
+        pages,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "split-pdf", task.id)

     return jsonify({
         "task_id": task.id,
@@ -144,15 +178,29 @@ def rotate_pdf_route():
     pages = request.form.get("pages", "all")
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
 
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)
 
-    task = rotate_pdf_task.delay(input_path, task_id, original_filename, rotation, pages)
+    task = rotate_pdf_task.delay(
+        input_path,
+        task_id,
+        original_filename,
+        rotation,
+        pages,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "rotate-pdf", task.id)
 
     return jsonify({
         "task_id": task.id,
@@ -193,8 +241,14 @@ def add_page_numbers_route():
     except ValueError:
         start_number = 1
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
@@ -202,8 +256,14 @@ def add_page_numbers_route():
     file.save(input_path)
 
     task = add_page_numbers_task.delay(
-        input_path, task_id, original_filename, position, start_number
+        input_path,
+        task_id,
+        original_filename,
+        position,
+        start_number,
+        **build_task_tracking_kwargs(actor),
     )
+    record_accepted_usage(actor, "page-numbers", task.id)
 
     return jsonify({
         "task_id": task.id,
@@ -239,8 +299,14 @@ def pdf_to_images_route():
     except ValueError:
         dpi = 200
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
@@ -248,8 +314,14 @@ def pdf_to_images_route():
     file.save(input_path)
 
     task = pdf_to_images_task.delay(
-        input_path, task_id, original_filename, output_format, dpi
+        input_path,
+        task_id,
+        original_filename,
+        output_format,
+        dpi,
+        **build_task_tracking_kwargs(actor),
    )
+    record_accepted_usage(actor, "pdf-to-images", task.id)
 
     return jsonify({
         "task_id": task.id,
@@ -276,24 +348,38 @@ def images_to_pdf_route():
     if len(files) > 50:
         return jsonify({"error": "Maximum 50 images allowed."}), 400
 
+    actor = resolve_web_actor()
+    try:
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
     task_id = str(uuid.uuid4())
     input_paths = []
     original_filenames = []
 
     for f in files:
         try:
-            original_filename, ext = validate_file(f, allowed_types=ALLOWED_IMAGE_TYPES)
+            original_filename, ext = validate_actor_file(
+                f, allowed_types=ALLOWED_IMAGE_TYPES, actor=actor
+            )
         except FileValidationError as e:
             return jsonify({"error": e.message}), e.code
 
-        upload_dir = os.path.join("/tmp/uploads", task_id)
+        upload_dir = os.path.join(current_app.config["UPLOAD_FOLDER"], task_id)
         os.makedirs(upload_dir, exist_ok=True)
         file_path = os.path.join(upload_dir, f"{uuid.uuid4()}.{ext}")
         f.save(file_path)
         input_paths.append(file_path)
         original_filenames.append(original_filename)
 
-    task = images_to_pdf_task.delay(input_paths, task_id, original_filenames)
+    task = images_to_pdf_task.delay(
+        input_paths,
+        task_id,
+        original_filenames,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "images-to-pdf", task.id)
 
     return jsonify({
         "task_id": task.id,
@@ -333,8 +419,14 @@ def watermark_pdf_route():
     except ValueError:
         opacity = 0.3
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
@@ -342,8 +434,14 @@ def watermark_pdf_route():
     file.save(input_path)
 
     task = watermark_pdf_task.delay(
-        input_path, task_id, original_filename, watermark_text, opacity
+        input_path,
+        task_id,
+        original_filename,
+        watermark_text,
+        opacity,
+        **build_task_tracking_kwargs(actor),
     )
+    record_accepted_usage(actor, "watermark-pdf", task.id)
 
     return jsonify({
         "task_id": task.id,
@@ -377,15 +475,28 @@ def protect_pdf_route():
     if len(password) < 4:
         return jsonify({"error": "Password must be at least 4 characters."}), 400
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
 
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)
 
-    task = protect_pdf_task.delay(input_path, task_id, original_filename, password)
+    task = protect_pdf_task.delay(
+        input_path,
+        task_id,
+        original_filename,
+        password,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "protect-pdf", task.id)
 
     return jsonify({
         "task_id": task.id,
@@ -416,15 +527,28 @@ def unlock_pdf_route():
     if not password:
         return jsonify({"error": "Password is required."}), 400
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=["pdf"])
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
 
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)
 
-    task = unlock_pdf_task.delay(input_path, task_id, original_filename, password)
+    task = unlock_pdf_task.delay(
+        input_path,
+        task_id,
+        original_filename,
+        password,
+        **build_task_tracking_kwargs(actor),
+    )
+    record_accepted_usage(actor, "unlock-pdf", task.id)
 
     return jsonify({
         "task_id": task.id,
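Several of the routes above hand the raw `pages` string (e.g. `1,3,5-8`) straight to the Celery task; the parsing itself happens in the task layer and is not part of this diff. A minimal sketch of what such a parser might look like (a hypothetical helper with assumed name and semantics, not code from this commit):

```python
def parse_page_ranges(pages: str, page_count: int) -> list[int]:
    """Parse a spec like "1,3,5-8" into sorted, unique 1-based page numbers.

    Out-of-range pages are silently dropped; duplicates collapse.
    """
    selected = set()
    for part in pages.split(","):
        part = part.strip()
        if not part:
            continue
        if "-" in part:
            start_s, end_s = part.split("-", 1)
            start, end = int(start_s), int(end_s)
        else:
            start = end = int(part)
        for page in range(start, end + 1):
            # Clamp to the document's actual page count
            if 1 <= page <= page_count:
                selected.add(page)
    return sorted(selected)
```

A real implementation would also reject malformed input (non-numeric parts, reversed ranges) with a user-facing error instead of raising `ValueError`.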

View File

@@ -0,0 +1 @@
"""B2B API v1 blueprint package."""

View File

@@ -0,0 +1,682 @@
"""B2B API v1 tool routes — authenticated via X-API-Key, Pro plan only."""
import os
import uuid
import logging

from celery.result import AsyncResult
from flask import Blueprint, current_app, jsonify, request

from app.extensions import celery, limiter
from app.services.policy_service import (
    assert_quota_available,
    assert_api_task_access,
    build_task_tracking_kwargs,
    PolicyError,
    record_accepted_usage,
    resolve_api_actor,
    validate_actor_file,
)
from app.utils.file_validator import FileValidationError
from app.utils.sanitizer import generate_safe_path
from app.tasks.compress_tasks import compress_pdf_task
from app.tasks.convert_tasks import convert_pdf_to_word, convert_word_to_pdf
from app.tasks.image_tasks import convert_image_task, resize_image_task
from app.tasks.video_tasks import create_gif_task
from app.tasks.pdf_tools_tasks import (
    merge_pdfs_task,
    split_pdf_task,
    rotate_pdf_task,
    add_page_numbers_task,
    pdf_to_images_task,
    images_to_pdf_task,
    watermark_pdf_task,
    protect_pdf_task,
    unlock_pdf_task,
)
from app.tasks.flowchart_tasks import extract_flowchart_task

logger = logging.getLogger(__name__)

v1_bp = Blueprint("v1", __name__)

ALLOWED_IMAGE_TYPES = ["png", "jpg", "jpeg", "webp"]
ALLOWED_VIDEO_TYPES = ["mp4", "webm"]
ALLOWED_OUTPUT_FORMATS = ["jpg", "png", "webp"]


def _resolve_and_check() -> tuple:
    """Resolve API actor and assert quota. Returns (actor, error_response | None)."""
    try:
        actor = resolve_api_actor()
    except PolicyError as e:
        return None, (jsonify({"error": e.message}), e.status_code)
    try:
        assert_quota_available(actor)
    except PolicyError as e:
        return None, (jsonify({"error": e.message}), e.status_code)
    return actor, None
# ---------------------------------------------------------------------------
# Task status — GET /api/v1/tasks/<task_id>/status
# ---------------------------------------------------------------------------
@v1_bp.route("/tasks/<task_id>/status", methods=["GET"])
@limiter.limit("300/minute", override_defaults=True)
def get_task_status(task_id: str):
    """Poll the status of an async API task."""
    try:
        actor = resolve_api_actor()
    except PolicyError as e:
        return jsonify({"error": e.message}), e.status_code
    try:
        assert_api_task_access(actor, task_id)
    except PolicyError as e:
        return jsonify({"error": e.message}), e.status_code

    result = AsyncResult(task_id, app=celery)
    response: dict = {"task_id": task_id, "state": result.state}
    if result.state == "PENDING":
        response["progress"] = "Task is waiting in queue..."
    elif result.state == "PROCESSING":
        response["progress"] = (result.info or {}).get("step", "Processing...")
    elif result.state == "SUCCESS":
        response["result"] = result.result or {}
    elif result.state == "FAILURE":
        response["error"] = str(result.info) if result.info else "Task failed."
    return jsonify(response)
# ---------------------------------------------------------------------------
# Compress — POST /api/v1/compress/pdf
# ---------------------------------------------------------------------------
@v1_bp.route("/compress/pdf", methods=["POST"])
@limiter.limit("10/minute")
def compress_pdf_route():
    """Compress a PDF file."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    quality = request.form.get("quality", "medium")
    if quality not in ("low", "medium", "high"):
        quality = "medium"
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = compress_pdf_task.delay(
        input_path, task_id, original_filename, quality,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "compress-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Compression started."}), 202


# ---------------------------------------------------------------------------
# Convert — POST /api/v1/convert/pdf-to-word & /api/v1/convert/word-to-pdf
# ---------------------------------------------------------------------------
@v1_bp.route("/convert/pdf-to-word", methods=["POST"])
@limiter.limit("10/minute")
def pdf_to_word_route():
    """Convert a PDF to Word (DOCX)."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = convert_pdf_to_word.delay(
        input_path, task_id, original_filename,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "pdf-to-word", task.id)
    return jsonify({"task_id": task.id, "message": "Conversion started."}), 202


@v1_bp.route("/convert/word-to-pdf", methods=["POST"])
@limiter.limit("10/minute")
def word_to_pdf_route():
    """Convert a Word (DOC/DOCX) file to PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    try:
        original_filename, ext = validate_actor_file(
            file, allowed_types=["doc", "docx"], actor=actor
        )
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = convert_word_to_pdf.delay(
        input_path, task_id, original_filename,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "word-to-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Conversion started."}), 202
# ---------------------------------------------------------------------------
# Image — POST /api/v1/image/convert & /api/v1/image/resize
# ---------------------------------------------------------------------------
@v1_bp.route("/image/convert", methods=["POST"])
@limiter.limit("10/minute")
def convert_image_route():
    """Convert an image to a different format."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    output_format = request.form.get("format", "").lower()
    if output_format not in ALLOWED_OUTPUT_FORMATS:
        return jsonify({"error": f"Invalid format. Supported: {', '.join(ALLOWED_OUTPUT_FORMATS)}"}), 400
    try:
        quality = max(1, min(100, int(request.form.get("quality", "85"))))
    except ValueError:
        quality = 85
    try:
        original_filename, ext = validate_actor_file(
            file, allowed_types=ALLOWED_IMAGE_TYPES, actor=actor
        )
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = convert_image_task.delay(
        input_path, task_id, original_filename, output_format, quality,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "image-convert", task.id)
    return jsonify({"task_id": task.id, "message": "Image conversion started."}), 202


@v1_bp.route("/image/resize", methods=["POST"])
@limiter.limit("10/minute")
def resize_image_route():
    """Resize an image."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    try:
        width = int(request.form.get("width")) if request.form.get("width") else None
        height = int(request.form.get("height")) if request.form.get("height") else None
    except ValueError:
        return jsonify({"error": "Width and height must be integers."}), 400
    if width is None and height is None:
        return jsonify({"error": "At least one of width or height is required."}), 400
    if width and not (1 <= width <= 10000):
        return jsonify({"error": "Width must be between 1 and 10000."}), 400
    if height and not (1 <= height <= 10000):
        return jsonify({"error": "Height must be between 1 and 10000."}), 400
    try:
        quality = max(1, min(100, int(request.form.get("quality", "85"))))
    except ValueError:
        quality = 85
    try:
        original_filename, ext = validate_actor_file(
            file, allowed_types=ALLOWED_IMAGE_TYPES, actor=actor
        )
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = resize_image_task.delay(
        input_path, task_id, original_filename, width, height, quality,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "image-resize", task.id)
    return jsonify({"task_id": task.id, "message": "Image resize started."}), 202
# ---------------------------------------------------------------------------
# Video — POST /api/v1/video/to-gif
# ---------------------------------------------------------------------------
@v1_bp.route("/video/to-gif", methods=["POST"])
@limiter.limit("5/minute")
def video_to_gif_route():
    """Convert a video clip to an animated GIF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    try:
        start_time = float(request.form.get("start_time", 0))
        duration = float(request.form.get("duration", 5))
        fps = int(request.form.get("fps", 10))
        width = int(request.form.get("width", 480))
    except (ValueError, TypeError):
        return jsonify({"error": "Invalid parameters. Must be numeric."}), 400
    if start_time < 0:
        return jsonify({"error": "Start time cannot be negative."}), 400
    if not (0.5 <= duration <= 15):  # lower bound matches the error message
        return jsonify({"error": "Duration must be between 0.5 and 15 seconds."}), 400
    if not (1 <= fps <= 20):
        return jsonify({"error": "FPS must be between 1 and 20."}), 400
    if not (100 <= width <= 640):
        return jsonify({"error": "Width must be between 100 and 640 pixels."}), 400
    try:
        original_filename, ext = validate_actor_file(
            file, allowed_types=ALLOWED_VIDEO_TYPES, actor=actor
        )
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = create_gif_task.delay(
        input_path, task_id, original_filename, start_time, duration, fps, width,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "video-to-gif", task.id)
    return jsonify({"task_id": task.id, "message": "GIF creation started."}), 202
# ---------------------------------------------------------------------------
# PDF Tools — all single-file and multi-file routes
# ---------------------------------------------------------------------------
@v1_bp.route("/pdf-tools/merge", methods=["POST"])
@limiter.limit("10/minute")
def merge_pdfs_route():
    """Merge multiple PDF files into one."""
    actor, err = _resolve_and_check()
    if err:
        return err
    files = request.files.getlist("files")
    if not files or len(files) < 2:
        return jsonify({"error": "Please upload at least 2 PDF files."}), 400
    if len(files) > 20:
        return jsonify({"error": "Maximum 20 files allowed."}), 400
    task_id = str(uuid.uuid4())
    input_paths, original_filenames = [], []
    for f in files:
        try:
            original_filename, ext = validate_actor_file(f, allowed_types=["pdf"], actor=actor)
        except FileValidationError as e:
            return jsonify({"error": e.message}), e.code
        upload_dir = os.path.join(current_app.config["UPLOAD_FOLDER"], task_id)
        os.makedirs(upload_dir, exist_ok=True)
        file_path = os.path.join(upload_dir, f"{uuid.uuid4()}.{ext}")
        f.save(file_path)
        input_paths.append(file_path)
        original_filenames.append(original_filename)
    task = merge_pdfs_task.delay(
        input_paths, task_id, original_filenames,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "merge-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Merge started."}), 202
@v1_bp.route("/pdf-tools/split", methods=["POST"])
@limiter.limit("10/minute")
def split_pdf_route():
    """Split a PDF into pages or a range."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    mode = request.form.get("mode", "all")
    pages = request.form.get("pages")
    if mode not in ("all", "range"):
        mode = "all"
    if mode == "range" and not (pages and pages.strip()):
        return jsonify({"error": "Please specify which pages to extract."}), 400
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = split_pdf_task.delay(
        input_path, task_id, original_filename, mode, pages,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "split-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Split started."}), 202


@v1_bp.route("/pdf-tools/rotate", methods=["POST"])
@limiter.limit("10/minute")
def rotate_pdf_route():
    """Rotate pages in a PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    try:
        rotation = int(request.form.get("rotation", 90))
    except ValueError:
        rotation = 90
    if rotation not in (90, 180, 270):
        return jsonify({"error": "Rotation must be 90, 180, or 270 degrees."}), 400
    pages = request.form.get("pages", "all")
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = rotate_pdf_task.delay(
        input_path, task_id, original_filename, rotation, pages,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "rotate-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Rotation started."}), 202
@v1_bp.route("/pdf-tools/page-numbers", methods=["POST"])
@limiter.limit("10/minute")
def add_page_numbers_route():
    """Add page numbers to a PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    position = request.form.get("position", "bottom-center")
    valid_positions = [
        "bottom-center", "bottom-right", "bottom-left",
        "top-center", "top-right", "top-left",
    ]
    if position not in valid_positions:
        position = "bottom-center"
    try:
        start_number = max(1, int(request.form.get("start_number", 1)))
    except ValueError:
        start_number = 1
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = add_page_numbers_task.delay(
        input_path, task_id, original_filename, position, start_number,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "page-numbers", task.id)
    return jsonify({"task_id": task.id, "message": "Page numbering started."}), 202


@v1_bp.route("/pdf-tools/pdf-to-images", methods=["POST"])
@limiter.limit("10/minute")
def pdf_to_images_route():
    """Convert PDF pages to images."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    output_format = request.form.get("format", "png").lower()
    if output_format not in ("png", "jpg"):
        output_format = "png"
    try:
        dpi = max(72, min(600, int(request.form.get("dpi", 200))))
    except ValueError:
        dpi = 200
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = pdf_to_images_task.delay(
        input_path, task_id, original_filename, output_format, dpi,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "pdf-to-images", task.id)
    return jsonify({"task_id": task.id, "message": "Conversion started."}), 202
@v1_bp.route("/pdf-tools/images-to-pdf", methods=["POST"])
@limiter.limit("10/minute")
def images_to_pdf_route():
    """Convert multiple images to a single PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    files = request.files.getlist("files")
    if not files:
        return jsonify({"error": "Please upload at least 1 image."}), 400
    if len(files) > 50:
        return jsonify({"error": "Maximum 50 images allowed."}), 400
    task_id = str(uuid.uuid4())
    input_paths, original_filenames = [], []
    for f in files:
        try:
            original_filename, ext = validate_actor_file(
                f, allowed_types=ALLOWED_IMAGE_TYPES, actor=actor
            )
        except FileValidationError as e:
            return jsonify({"error": e.message}), e.code
        upload_dir = os.path.join(current_app.config["UPLOAD_FOLDER"], task_id)
        os.makedirs(upload_dir, exist_ok=True)
        file_path = os.path.join(upload_dir, f"{uuid.uuid4()}.{ext}")
        f.save(file_path)
        input_paths.append(file_path)
        original_filenames.append(original_filename)
    task = images_to_pdf_task.delay(
        input_paths, task_id, original_filenames,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "images-to-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Conversion started."}), 202


@v1_bp.route("/pdf-tools/watermark", methods=["POST"])
@limiter.limit("10/minute")
def watermark_pdf_route():
    """Add a text watermark to a PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    watermark_text = request.form.get("text", "").strip()
    if not watermark_text:
        return jsonify({"error": "Watermark text is required."}), 400
    if len(watermark_text) > 100:
        return jsonify({"error": "Watermark text must be 100 characters or less."}), 400
    try:
        opacity = max(0.1, min(1.0, float(request.form.get("opacity", 0.3))))
    except ValueError:
        opacity = 0.3
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = watermark_pdf_task.delay(
        input_path, task_id, original_filename, watermark_text, opacity,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "watermark-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Watermarking started."}), 202
@v1_bp.route("/pdf-tools/protect", methods=["POST"])
@limiter.limit("10/minute")
def protect_pdf_route():
    """Add password protection to a PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    password = request.form.get("password", "").strip()
    if not password:
        return jsonify({"error": "Password is required."}), 400
    if len(password) < 4:
        return jsonify({"error": "Password must be at least 4 characters."}), 400
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = protect_pdf_task.delay(
        input_path, task_id, original_filename, password,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "protect-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Protection started."}), 202


@v1_bp.route("/pdf-tools/unlock", methods=["POST"])
@limiter.limit("10/minute")
def unlock_pdf_route():
    """Remove password protection from a PDF."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400
    file = request.files["file"]
    password = request.form.get("password", "").strip()
    if not password:
        return jsonify({"error": "Password is required."}), 400
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)
    task = unlock_pdf_task.delay(
        input_path, task_id, original_filename, password,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "unlock-pdf", task.id)
    return jsonify({"task_id": task.id, "message": "Unlock started."}), 202


@v1_bp.route("/flowchart/extract", methods=["POST"])
@limiter.limit("10/minute")
def extract_flowchart_route():
    """Extract procedures from a PDF and generate flowcharts."""
    actor, err = _resolve_and_check()
    if err:
        return err
    if "file" not in request.files:
        return jsonify({"error": "No file uploaded."}), 400
    file = request.files["file"]
    try:
        original_filename, ext = validate_actor_file(file, allowed_types=["pdf"], actor=actor)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code
    task_id, input_path = generate_safe_path(ext)
    file.save(input_path)
    task = extract_flowchart_task.delay(
        input_path, task_id, original_filename,
        **build_task_tracking_kwargs(actor),
    )
    record_accepted_usage(actor, "pdf-flowchart", task.id)
    return jsonify({"task_id": task.id, "message": "Flowchart extraction started."}), 202
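The `/tasks/<task_id>/status` handler in this file maps Celery states onto a small, fixed JSON contract that B2B clients poll against. The same mapping, pulled out as a plain function purely for illustration (a sketch, not code from this commit):

```python
def build_status_response(task_id: str, state: str, info=None, result=None) -> dict:
    """Mirror the state-to-payload mapping used by get_task_status."""
    response = {"task_id": task_id, "state": state}
    if state == "PENDING":
        # Celery reports unknown task IDs as PENDING too
        response["progress"] = "Task is waiting in queue..."
    elif state == "PROCESSING":
        # Custom state set by the workers; info carries the current step
        response["progress"] = (info or {}).get("step", "Processing...")
    elif state == "SUCCESS":
        response["result"] = result or {}
    elif state == "FAILURE":
        response["error"] = str(info) if info else "Task failed."
    return response
```

Clients can therefore branch on `state` alone and treat `progress`, `result`, and `error` as state-dependent fields.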

View File

@@ -2,7 +2,15 @@
 from flask import Blueprint, request, jsonify
 from app.extensions import limiter
-from app.utils.file_validator import validate_file, FileValidationError
+from app.services.policy_service import (
+    assert_quota_available,
+    build_task_tracking_kwargs,
+    PolicyError,
+    record_accepted_usage,
+    resolve_web_actor,
+    validate_actor_file,
+)
+from app.utils.file_validator import FileValidationError
 from app.utils.sanitizer import generate_safe_path
 from app.tasks.video_tasks import create_gif_task
@@ -49,20 +57,28 @@ def video_to_gif_route():
     if width < 100 or width > 640:
         return jsonify({"error": "Width must be between 100 and 640 pixels."}), 400
 
+    actor = resolve_web_actor()
     try:
-        original_filename, ext = validate_file(file, allowed_types=ALLOWED_VIDEO_TYPES)
+        assert_quota_available(actor)
+    except PolicyError as e:
+        return jsonify({"error": e.message}), e.status_code
+
+    try:
+        original_filename, ext = validate_actor_file(
+            file, allowed_types=ALLOWED_VIDEO_TYPES, actor=actor
+        )
     except FileValidationError as e:
         return jsonify({"error": e.message}), e.code
 
-    # Save file
     task_id, input_path = generate_safe_path(ext, folder_type="upload")
     file.save(input_path)
 
-    # Dispatch task
     task = create_gif_task.delay(
         input_path, task_id, original_filename,
         start_time, duration, fps, width,
+        **build_task_tracking_kwargs(actor),
     )
+    record_accepted_usage(actor, "video-to-gif", task.id)
 
     return jsonify({
         "task_id": task.id,

View File

@@ -0,0 +1,517 @@
"""User accounts, API keys, history, and usage storage using SQLite."""
import hashlib
import json
import logging
import os
import secrets
import sqlite3
from datetime import datetime, timezone
from flask import current_app
from werkzeug.security import check_password_hash, generate_password_hash
logger = logging.getLogger(__name__)
VALID_PLANS = {"free", "pro"}
def _utc_now() -> str:
"""Return a stable UTC timestamp string."""
return datetime.now(timezone.utc).isoformat()
def get_current_period_month() -> str:
"""Return the active usage period in YYYY-MM format."""
return datetime.now(timezone.utc).strftime("%Y-%m")
def normalize_plan(plan: str | None) -> str:
"""Normalize plan values to the supported set."""
return "pro" if plan == "pro" else "free"
def _connect() -> sqlite3.Connection:
"""Create a SQLite connection with row access by column name."""
db_path = current_app.config["DATABASE_PATH"]
db_dir = os.path.dirname(db_path)
if db_dir:
os.makedirs(db_dir, exist_ok=True)
connection = sqlite3.connect(db_path)
connection.row_factory = sqlite3.Row
connection.execute("PRAGMA foreign_keys = ON")
return connection
def _column_exists(conn: sqlite3.Connection, table_name: str, column_name: str) -> bool:
"""Check whether one column exists in a SQLite table."""
rows = conn.execute(f"PRAGMA table_info({table_name})").fetchall()
return any(row["name"] == column_name for row in rows)
def _serialize_user(row: sqlite3.Row | None) -> dict | None:
"""Convert a user row into API-safe data."""
if row is None:
return None
return {
"id": row["id"],
"email": row["email"],
"plan": normalize_plan(row["plan"]),
"created_at": row["created_at"],
}
def _serialize_api_key(row: sqlite3.Row) -> dict:
"""Convert an API key row into public API-safe data."""
return {
"id": row["id"],
"name": row["name"],
"key_prefix": row["key_prefix"],
"last_used_at": row["last_used_at"],
"revoked_at": row["revoked_at"],
"created_at": row["created_at"],
}
def _normalize_email(email: str) -> str:
"""Normalize user emails for lookups and uniqueness."""
return email.strip().lower()
def _hash_api_key(raw_key: str) -> str:
"""Return a deterministic digest for one API key."""
return hashlib.sha256(raw_key.encode("utf-8")).hexdigest()
def init_account_db():
"""Initialize user, history, API key, and usage tables if they do not exist."""
with _connect() as conn:
conn.executescript(
"""
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
email TEXT NOT NULL UNIQUE,
password_hash TEXT NOT NULL,
plan TEXT NOT NULL DEFAULT 'free',
created_at TEXT NOT NULL,
updated_at TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS file_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
tool TEXT NOT NULL,
original_filename TEXT,
output_filename TEXT,
status TEXT NOT NULL,
download_url TEXT,
metadata_json TEXT NOT NULL DEFAULT '{}',
created_at TEXT NOT NULL,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS api_keys (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
name TEXT NOT NULL,
key_prefix TEXT NOT NULL,
key_hash TEXT NOT NULL UNIQUE,
last_used_at TEXT,
revoked_at TEXT,
created_at TEXT NOT NULL,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS usage_events (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
api_key_id INTEGER,
source TEXT NOT NULL,
tool TEXT NOT NULL,
task_id TEXT NOT NULL,
event_type TEXT NOT NULL,
created_at TEXT NOT NULL,
period_month TEXT NOT NULL,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE,
FOREIGN KEY (api_key_id) REFERENCES api_keys(id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_file_history_user_created
ON file_history(user_id, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_api_keys_user_created
ON api_keys(user_id, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_usage_events_user_source_period_event
ON usage_events(user_id, source, period_month, event_type);
CREATE INDEX IF NOT EXISTS idx_usage_events_task_lookup
ON usage_events(user_id, source, task_id, event_type);
"""
)
if not _column_exists(conn, "users", "plan"):
conn.execute(
"ALTER TABLE users ADD COLUMN plan TEXT NOT NULL DEFAULT 'free'"
)
if not _column_exists(conn, "users", "updated_at"):
conn.execute(
"ALTER TABLE users ADD COLUMN updated_at TEXT NOT NULL DEFAULT ''"
)
def create_user(email: str, password: str) -> dict:
"""Create a new user and return the public record."""
email = _normalize_email(email)
now = _utc_now()
try:
with _connect() as conn:
cursor = conn.execute(
"""
INSERT INTO users (email, password_hash, plan, created_at, updated_at)
VALUES (?, ?, 'free', ?, ?)
""",
(email, generate_password_hash(password), now, now),
)
user_id = cursor.lastrowid
row = conn.execute(
"SELECT id, email, plan, created_at FROM users WHERE id = ?",
(user_id,),
).fetchone()
except sqlite3.IntegrityError as exc:
raise ValueError("An account with this email already exists.") from exc
return _serialize_user(row) or {}
def authenticate_user(email: str, password: str) -> dict | None:
"""Return the public user record when credentials are valid."""
email = _normalize_email(email)
with _connect() as conn:
row = conn.execute(
"SELECT * FROM users WHERE email = ?",
(email,),
).fetchone()
if row is None or not check_password_hash(row["password_hash"], password):
return None
return _serialize_user(row)
def get_user_by_id(user_id: int) -> dict | None:
"""Fetch a public user record by id."""
with _connect() as conn:
row = conn.execute(
"SELECT id, email, plan, created_at FROM users WHERE id = ?",
(user_id,),
).fetchone()
return _serialize_user(row)
def update_user_plan(user_id: int, plan: str) -> dict | None:
    """Update one user plan and return the public record."""
    if plan not in VALID_PLANS:
        raise ValueError("Invalid plan.")
    normalized_plan = normalize_plan(plan)
with _connect() as conn:
conn.execute(
"""
UPDATE users
SET plan = ?, updated_at = ?
WHERE id = ?
""",
(normalized_plan, _utc_now(), user_id),
)
row = conn.execute(
"SELECT id, email, plan, created_at FROM users WHERE id = ?",
(user_id,),
).fetchone()
return _serialize_user(row)
def create_api_key(user_id: int, name: str) -> dict:
"""Create one API key and return the public record plus raw secret once."""
name = name.strip()
if not name:
raise ValueError("API key name is required.")
if len(name) > 100:
raise ValueError("API key name must be 100 characters or less.")
raw_key = f"spdf_{secrets.token_urlsafe(32)}"
now = _utc_now()
with _connect() as conn:
cursor = conn.execute(
"""
INSERT INTO api_keys (user_id, name, key_prefix, key_hash, created_at)
VALUES (?, ?, ?, ?, ?)
""",
(
user_id,
name,
raw_key[:16],
_hash_api_key(raw_key),
now,
),
)
row = conn.execute(
"""
SELECT id, name, key_prefix, last_used_at, revoked_at, created_at
FROM api_keys
WHERE id = ?
""",
(cursor.lastrowid,),
).fetchone()
result = _serialize_api_key(row)
result["raw_key"] = raw_key
return result
def list_api_keys(user_id: int) -> list[dict]:
"""Return all API keys for one user."""
with _connect() as conn:
rows = conn.execute(
"""
SELECT id, name, key_prefix, last_used_at, revoked_at, created_at
FROM api_keys
WHERE user_id = ?
ORDER BY created_at DESC
""",
(user_id,),
).fetchall()
return [_serialize_api_key(row) for row in rows]
def revoke_api_key(user_id: int, key_id: int) -> bool:
"""Revoke one API key owned by one user."""
with _connect() as conn:
cursor = conn.execute(
"""
UPDATE api_keys
SET revoked_at = ?
WHERE id = ? AND user_id = ? AND revoked_at IS NULL
""",
(_utc_now(), key_id, user_id),
)
return cursor.rowcount > 0
def get_api_key_actor(raw_key: str) -> dict | None:
"""Resolve one raw API key into the owning active user context."""
if not raw_key:
return None
key_hash = _hash_api_key(raw_key.strip())
now = _utc_now()
with _connect() as conn:
row = conn.execute(
"""
SELECT
api_keys.id AS api_key_id,
api_keys.user_id,
api_keys.name,
api_keys.key_prefix,
api_keys.last_used_at,
users.email,
users.plan,
users.created_at
FROM api_keys
INNER JOIN users ON users.id = api_keys.user_id
WHERE api_keys.key_hash = ? AND api_keys.revoked_at IS NULL
""",
(key_hash,),
).fetchone()
if row is None:
return None
conn.execute(
"UPDATE api_keys SET last_used_at = ? WHERE id = ?",
(now, row["api_key_id"]),
)
return {
"api_key_id": row["api_key_id"],
"user_id": row["user_id"],
"email": row["email"],
"plan": normalize_plan(row["plan"]),
"created_at": row["created_at"],
"name": row["name"],
"key_prefix": row["key_prefix"],
"last_used_at": now,
}
def record_file_history(
user_id: int,
tool: str,
original_filename: str | None,
output_filename: str | None,
status: str,
download_url: str | None,
metadata: dict | None = None,
):
"""Persist one generated-file history entry."""
with _connect() as conn:
conn.execute(
"""
INSERT INTO file_history (
user_id, tool, original_filename, output_filename,
status, download_url, metadata_json, created_at
)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)
""",
(
user_id,
tool,
original_filename,
output_filename,
status,
download_url,
json.dumps(metadata or {}, ensure_ascii=True),
_utc_now(),
),
)
def record_task_history(
user_id: int | None,
tool: str,
original_filename: str | None,
result: dict,
):
"""Persist task results when the request belongs to an authenticated user."""
if user_id is None:
return
metadata = {}
for key, value in result.items():
if key in {"status", "download_url", "filename"}:
continue
if key in {"procedures", "flowcharts", "pages"} and isinstance(value, list):
metadata[f"{key}_count"] = len(value)
continue
metadata[key] = value
try:
record_file_history(
user_id=user_id,
tool=tool,
original_filename=original_filename,
output_filename=result.get("filename"),
status=result.get("status", "completed"),
download_url=result.get("download_url"),
metadata=metadata,
)
except Exception:
logger.exception("Failed to persist task history for tool=%s", tool)
def list_file_history(user_id: int, limit: int = 50) -> list[dict]:
"""Return most recent file history entries for one user."""
with _connect() as conn:
rows = conn.execute(
"""
SELECT id, tool, original_filename, output_filename, status,
download_url, metadata_json, created_at
FROM file_history
WHERE user_id = ?
ORDER BY created_at DESC
LIMIT ?
""",
(user_id, limit),
).fetchall()
return [
{
"id": row["id"],
"tool": row["tool"],
"original_filename": row["original_filename"],
"output_filename": row["output_filename"],
"status": row["status"],
"download_url": row["download_url"],
"metadata": json.loads(row["metadata_json"] or "{}"),
"created_at": row["created_at"],
}
for row in rows
]
def record_usage_event(
user_id: int | None,
source: str,
tool: str,
task_id: str,
event_type: str,
api_key_id: int | None = None,
):
"""Persist one usage event when it belongs to an authenticated actor."""
if user_id is None:
return
with _connect() as conn:
conn.execute(
"""
INSERT INTO usage_events (
user_id, api_key_id, source, tool, task_id,
event_type, created_at, period_month
)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)
""",
(
user_id,
api_key_id,
source,
tool,
task_id,
event_type,
_utc_now(),
get_current_period_month(),
),
)
def count_usage_events(
user_id: int,
source: str,
event_type: str = "accepted",
period_month: str | None = None,
) -> int:
"""Count usage events for one user, source, period, and type."""
with _connect() as conn:
row = conn.execute(
"""
SELECT COUNT(*) AS count
FROM usage_events
WHERE user_id = ? AND source = ? AND event_type = ? AND period_month = ?
""",
(user_id, source, event_type, period_month or get_current_period_month()),
).fetchone()
return int(row["count"]) if row else 0
def has_task_access(user_id: int, source: str, task_id: str) -> bool:
"""Return whether one user owns one previously accepted task for one source."""
with _connect() as conn:
row = conn.execute(
"""
SELECT 1
FROM usage_events
WHERE user_id = ? AND source = ? AND task_id = ? AND event_type = 'accepted'
LIMIT 1
""",
(user_id, source, task_id),
).fetchone()
return row is not None
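
The key scheme above stores only a SHA-256 digest plus a 16-character display prefix; the raw `spdf_…` secret is returned exactly once at creation. A standalone sketch of the same scheme (function names here are illustrative, not the service's API):

```python
import hashlib
import secrets

def mint_key() -> tuple[str, str, str]:
    """Return (raw_key, key_prefix, key_hash); only prefix and hash are stored."""
    raw = f"spdf_{secrets.token_urlsafe(32)}"
    return raw, raw[:16], hashlib.sha256(raw.encode("utf-8")).hexdigest()

def verify_key(presented: str, stored_hash: str) -> bool:
    """Hash the presented key and compare against the stored digest."""
    return hashlib.sha256(presented.strip().encode("utf-8")).hexdigest() == stored_hash
```

Hashing instead of storing the raw key means a leaked database does not leak usable credentials, while the prefix still lets users recognize their keys in the dashboard.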

View File

@@ -0,0 +1,227 @@
"""Plan entitlements, actor resolution, and quota enforcement."""
from dataclasses import dataclass
from flask import current_app, request
from app.services.account_service import (
count_usage_events,
get_api_key_actor,
get_user_by_id,
get_current_period_month,
has_task_access,
normalize_plan,
record_usage_event,
)
from app.utils.auth import get_current_user_id, logout_user_session
from app.utils.file_validator import validate_file
FREE_PLAN = "free"
PRO_PLAN = "pro"
FREE_WEB_MONTHLY_LIMIT = 50
PRO_WEB_MONTHLY_LIMIT = 500
PRO_API_MONTHLY_LIMIT = 1000
FREE_HISTORY_LIMIT = 25
PRO_HISTORY_LIMIT = 250
FREE_HOMEPAGE_LIMIT_MB = 50
PRO_HOMEPAGE_LIMIT_MB = 100
@dataclass(frozen=True)
class ActorContext:
"""Resolved access context for one incoming request."""
source: str
actor_type: str
user_id: int | None
plan: str
api_key_id: int | None = None
class PolicyError(Exception):
"""A request failed access or quota policy validation."""
def __init__(self, message: str, status_code: int = 400):
self.message = message
self.status_code = status_code
super().__init__(message)
def get_history_limit(plan: str) -> int:
"""Return the default history limit for one plan."""
return PRO_HISTORY_LIMIT if normalize_plan(plan) == PRO_PLAN else FREE_HISTORY_LIMIT
def get_web_quota_limit(plan: str, actor_type: str) -> int | None:
"""Return the monthly accepted-task cap for one web actor."""
if actor_type == "anonymous":
return None
return PRO_WEB_MONTHLY_LIMIT if normalize_plan(plan) == PRO_PLAN else FREE_WEB_MONTHLY_LIMIT
def get_api_quota_limit(plan: str) -> int | None:
"""Return the monthly accepted-task cap for one API actor."""
return PRO_API_MONTHLY_LIMIT if normalize_plan(plan) == PRO_PLAN else None
def ads_enabled(plan: str, actor_type: str) -> bool:
    """Return whether ads should display for one actor."""
    return actor_type == "anonymous" or normalize_plan(plan) != PRO_PLAN
def get_effective_file_size_limits_bytes(plan: str) -> dict[str, int]:
"""Return effective backend upload limits for one plan."""
base_limits = current_app.config["FILE_SIZE_LIMITS"]
if normalize_plan(plan) != PRO_PLAN:
return dict(base_limits)
return {key: value * 2 for key, value in base_limits.items()}
def get_effective_file_size_limits_mb(plan: str) -> dict[str, int]:
"""Return effective frontend-friendly upload limits for one plan."""
byte_limits = get_effective_file_size_limits_bytes(plan)
return {
"pdf": byte_limits["pdf"] // (1024 * 1024),
"word": byte_limits["docx"] // (1024 * 1024),
"image": byte_limits["png"] // (1024 * 1024),
"video": byte_limits["mp4"] // (1024 * 1024),
"homepageSmartUpload": PRO_HOMEPAGE_LIMIT_MB
if normalize_plan(plan) == PRO_PLAN
else FREE_HOMEPAGE_LIMIT_MB,
}
def get_usage_summary_for_user(user_id: int, plan: str) -> dict:
"""Return usage/quota summary for one authenticated user."""
normalized_plan = normalize_plan(plan)
current_period = get_current_period_month()
web_used = count_usage_events(
user_id, "web", event_type="accepted", period_month=current_period
)
api_used = count_usage_events(
user_id, "api", event_type="accepted", period_month=current_period
)
return {
"plan": normalized_plan,
"period_month": current_period,
"ads_enabled": ads_enabled(normalized_plan, "session"),
"history_limit": get_history_limit(normalized_plan),
"file_limits_mb": get_effective_file_size_limits_mb(normalized_plan),
"web_quota": {
"used": web_used,
"limit": get_web_quota_limit(normalized_plan, "session"),
},
"api_quota": {
"used": api_used,
"limit": get_api_quota_limit(normalized_plan),
},
}
def resolve_web_actor() -> ActorContext:
"""Resolve the active web actor from session state."""
user_id = get_current_user_id()
if user_id is None:
return ActorContext(source="web", actor_type="anonymous", user_id=None, plan=FREE_PLAN)
user = get_user_by_id(user_id)
if user is None:
logout_user_session()
return ActorContext(source="web", actor_type="anonymous", user_id=None, plan=FREE_PLAN)
return ActorContext(
source="web",
actor_type="session",
user_id=user["id"],
plan=normalize_plan(user["plan"]),
)
def resolve_api_actor() -> ActorContext:
"""Resolve the active B2B API actor from X-API-Key header."""
raw_key = request.headers.get("X-API-Key", "").strip()
if not raw_key:
raise PolicyError("X-API-Key header is required.", 401)
actor = get_api_key_actor(raw_key)
if actor is None:
raise PolicyError("Invalid or revoked API key.", 401)
plan = normalize_plan(actor["plan"])
if plan != PRO_PLAN:
raise PolicyError("API access requires an active Pro plan.", 403)
return ActorContext(
source="api",
actor_type="api_key",
user_id=actor["user_id"],
plan=plan,
api_key_id=actor["api_key_id"],
)
def validate_actor_file(file_storage, allowed_types: list[str], actor: ActorContext):
"""Validate one uploaded file with plan-aware size limits."""
return validate_file(
file_storage,
allowed_types=allowed_types,
size_limit_overrides=get_effective_file_size_limits_bytes(actor.plan),
)
def assert_quota_available(actor: ActorContext):
"""Ensure an actor still has accepted-task quota for the current month."""
if actor.user_id is None:
return
if actor.source == "web":
limit = get_web_quota_limit(actor.plan, actor.actor_type)
if limit is None:
return
used = count_usage_events(actor.user_id, "web", event_type="accepted")
if used >= limit:
if normalize_plan(actor.plan) == PRO_PLAN:
raise PolicyError("Your monthly Pro web quota has been reached.", 429)
raise PolicyError(
"Your monthly free plan limit has been reached. Upgrade to Pro for higher limits.",
429,
)
return
limit = get_api_quota_limit(actor.plan)
if limit is None:
raise PolicyError("API access requires an active Pro plan.", 403)
used = count_usage_events(actor.user_id, "api", event_type="accepted")
if used >= limit:
raise PolicyError("Your monthly API quota has been reached.", 429)
def record_accepted_usage(actor: ActorContext, tool: str, celery_task_id: str):
"""Record one accepted usage event after task dispatch succeeds."""
record_usage_event(
user_id=actor.user_id,
source=actor.source,
tool=tool,
task_id=celery_task_id,
event_type="accepted",
api_key_id=actor.api_key_id,
)
def build_task_tracking_kwargs(actor: ActorContext) -> dict:
"""Return Celery kwargs required for task-side tracking."""
return {
"user_id": actor.user_id,
"usage_source": actor.source,
"api_key_id": actor.api_key_id,
}
def assert_api_task_access(actor: ActorContext, task_id: str):
"""Ensure one API actor can poll one task id."""
if actor.user_id is None or not has_task_access(actor.user_id, "api", task_id):
raise PolicyError("Task not found.", 404)

View File

@@ -0,0 +1,29 @@
"""Shared helpers for task completion tracking."""
from app.services.account_service import record_task_history, record_usage_event
def finalize_task_tracking(
*,
user_id: int | None,
tool: str,
original_filename: str | None,
result: dict,
usage_source: str,
api_key_id: int | None,
celery_task_id: str | None,
):
"""Persist task history and usage lifecycle events."""
record_task_history(user_id, tool, original_filename, result)
if user_id is None or not celery_task_id:
return
event_type = "completed" if result.get("status") == "completed" else "failed"
record_usage_event(
user_id=user_id,
source=usage_source,
tool=tool,
task_id=celery_task_id,
event_type=event_type,
api_key_id=api_key_id,
)
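
Each task therefore leaves up to two events per actor: one "accepted" at dispatch (the only type counted against quota) and one "completed" or "failed" from the worker. A hedged sketch of how a per-period summary could be derived from such an event log (this helper is illustrative and not part of the service):

```python
from collections import Counter

def summarize(events: list[dict], period: str) -> dict:
    """Tally event types for one billing period; in_flight = accepted minus terminal."""
    counts = Counter(
        e["event_type"] for e in events if e["period_month"] == period
    )
    return {
        "accepted": counts["accepted"],
        "completed": counts["completed"],
        "failed": counts["failed"],
        "in_flight": counts["accepted"] - counts["completed"] - counts["failed"],
    }
```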

View File

@@ -2,15 +2,48 @@
 import os
 import logging
 
+from flask import current_app
+
 from app.extensions import celery
 from app.services.compress_service import compress_pdf, PDFCompressionError
 from app.services.storage_service import storage
+from app.services.task_tracking_service import finalize_task_tracking
 from app.utils.sanitizer import cleanup_task_files
 
 
 def _cleanup(task_id: str):
     cleanup_task_files(task_id, keep_outputs=not storage.use_s3)
 
 
+def _get_output_dir(task_id: str) -> str:
+    """Resolve output directory from app config."""
+    output_dir = os.path.join(current_app.config["OUTPUT_FOLDER"], task_id)
+    os.makedirs(output_dir, exist_ok=True)
+    return output_dir
+
+
+def _finalize_task(
+    task_id: str,
+    user_id: int | None,
+    original_filename: str,
+    result: dict,
+    usage_source: str,
+    api_key_id: int | None,
+    celery_task_id: str | None,
+):
+    """Persist optional history and cleanup task files."""
+    finalize_task_tracking(
+        user_id=user_id,
+        tool="compress-pdf",
+        original_filename=original_filename,
+        result=result,
+        usage_source=usage_source,
+        api_key_id=api_key_id,
+        celery_task_id=celery_task_id,
+    )
+    _cleanup(task_id)
+    return result
+
+
 logger = logging.getLogger(__name__)
@@ -21,6 +54,9 @@ def compress_pdf_task(
     task_id: str,
     original_filename: str,
     quality: str = "medium",
+    user_id: int | None = None,
+    usage_source: str = "web",
+    api_key_id: int | None = None,
 ):
     """
     Async task: Compress a PDF file.
@@ -34,8 +70,7 @@ def compress_pdf_task(
     Returns:
         dict with download_url, compression stats, and file info
     """
-    output_dir = os.path.join("/tmp/outputs", task_id)
-    os.makedirs(output_dir, exist_ok=True)
+    output_dir = _get_output_dir(task_id)
     output_path = os.path.join(output_dir, f"{task_id}.pdf")
 
     try:
@@ -69,20 +104,40 @@ def compress_pdf_task(
             "reduction_percent": stats["reduction_percent"],
         }
 
-        _cleanup(task_id)
-
         logger.info(
             f"Task {task_id}: PDF compression completed — "
             f"{stats['reduction_percent']}% reduction"
         )
-        return result
+        return _finalize_task(
+            task_id,
+            user_id,
+            original_filename,
+            result,
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
     except PDFCompressionError as e:
         logger.error(f"Task {task_id}: Compression error — {e}")
-        _cleanup(task_id)
-        return {"status": "failed", "error": str(e)}
+        return _finalize_task(
+            task_id,
+            user_id,
+            original_filename,
+            {"status": "failed", "error": str(e)},
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
     except Exception as e:
         logger.error(f"Task {task_id}: Unexpected error — {e}")
-        _cleanup(task_id)
-        return {"status": "failed", "error": "An unexpected error occurred."}
+        return _finalize_task(
+            task_id,
+            user_id,
+            original_filename,
+            {"status": "failed", "error": "An unexpected error occurred."},
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )

View File

@@ -2,9 +2,12 @@
 import os
 import logging
 
+from flask import current_app
+
 from app.extensions import celery
 from app.services.pdf_service import pdf_to_word, word_to_pdf, PDFConversionError
 from app.services.storage_service import storage
+from app.services.task_tracking_service import finalize_task_tracking
 from app.utils.sanitizer import cleanup_task_files
@@ -12,11 +15,50 @@ def _cleanup(task_id: str):
     """Cleanup with local-aware flag."""
     cleanup_task_files(task_id, keep_outputs=not storage.use_s3)
 
 
+def _get_output_dir(task_id: str) -> str:
+    """Resolve output directory from app config."""
+    output_dir = os.path.join(current_app.config["OUTPUT_FOLDER"], task_id)
+    os.makedirs(output_dir, exist_ok=True)
+    return output_dir
+
+
+def _finalize_task(
+    task_id: str,
+    user_id: int | None,
+    tool: str,
+    original_filename: str,
+    result: dict,
+    usage_source: str,
+    api_key_id: int | None,
+    celery_task_id: str | None,
+):
+    """Persist optional history and cleanup task files."""
+    finalize_task_tracking(
+        user_id=user_id,
+        tool=tool,
+        original_filename=original_filename,
+        result=result,
+        usage_source=usage_source,
+        api_key_id=api_key_id,
+        celery_task_id=celery_task_id,
+    )
+    _cleanup(task_id)
+    return result
+
+
 logger = logging.getLogger(__name__)
 
 
 @celery.task(bind=True, name="app.tasks.convert_tasks.convert_pdf_to_word")
-def convert_pdf_to_word(self, input_path: str, task_id: str, original_filename: str):
+def convert_pdf_to_word(
+    self,
+    input_path: str,
+    task_id: str,
+    original_filename: str,
+    user_id: int | None = None,
+    usage_source: str = "web",
+    api_key_id: int | None = None,
+):
     """
     Async task: Convert PDF to Word document.
@@ -28,7 +70,7 @@ def convert_pdf_to_word(self, input_path: str, task_id: str, original_filename:
     Returns:
         dict with download_url and file info
     """
-    output_dir = os.path.join("/tmp/outputs", task_id)
+    output_dir = _get_output_dir(task_id)
 
     try:
         self.update_state(state="PROCESSING", meta={"step": "Converting PDF to Word..."})
@@ -58,24 +100,55 @@ def convert_pdf_to_word(self, input_path: str, task_id: str, original_filename:
         }
 
         # Cleanup local files
-        _cleanup(task_id)
-
         logger.info(f"Task {task_id}: PDF→Word conversion completed")
-        return result
+        return _finalize_task(
+            task_id,
+            user_id,
+            "pdf-to-word",
+            original_filename,
+            result,
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
     except PDFConversionError as e:
         logger.error(f"Task {task_id}: Conversion error — {e}")
-        _cleanup(task_id)
-        return {"status": "failed", "error": str(e)}
+        return _finalize_task(
+            task_id,
+            user_id,
+            "pdf-to-word",
+            original_filename,
+            {"status": "failed", "error": str(e)},
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
     except Exception as e:
         logger.error(f"Task {task_id}: Unexpected error — {e}")
-        _cleanup(task_id)
-        return {"status": "failed", "error": "An unexpected error occurred."}
+        return _finalize_task(
+            task_id,
+            user_id,
+            "pdf-to-word",
+            original_filename,
+            {"status": "failed", "error": "An unexpected error occurred."},
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
 
 @celery.task(bind=True, name="app.tasks.convert_tasks.convert_word_to_pdf")
-def convert_word_to_pdf(self, input_path: str, task_id: str, original_filename: str):
+def convert_word_to_pdf(
+    self,
+    input_path: str,
+    task_id: str,
+    original_filename: str,
+    user_id: int | None = None,
+    usage_source: str = "web",
+    api_key_id: int | None = None,
+):
     """
     Async task: Convert Word document to PDF.
@@ -87,7 +160,7 @@ def convert_word_to_pdf(self, input_path: str, task_id: str, original_filename:
     Returns:
         dict with download_url and file info
     """
-    output_dir = os.path.join("/tmp/outputs", task_id)
+    output_dir = _get_output_dir(task_id)
 
     try:
         self.update_state(state="PROCESSING", meta={"step": "Converting Word to PDF..."})
@@ -112,17 +185,40 @@ def convert_word_to_pdf(self, input_path: str, task_id: str, original_filename:
             "output_size": os.path.getsize(output_path),
         }
 
-        _cleanup(task_id)
-
         logger.info(f"Task {task_id}: Word→PDF conversion completed")
-        return result
+        return _finalize_task(
+            task_id,
+            user_id,
+            "word-to-pdf",
+            original_filename,
+            result,
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
     except PDFConversionError as e:
         logger.error(f"Task {task_id}: Conversion error — {e}")
-        _cleanup(task_id)
-        return {"status": "failed", "error": str(e)}
+        return _finalize_task(
+            task_id,
+            user_id,
+            "word-to-pdf",
+            original_filename,
+            {"status": "failed", "error": str(e)},
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )
 
     except Exception as e:
         logger.error(f"Task {task_id}: Unexpected error — {e}")
-        _cleanup(task_id)
-        return {"status": "failed", "error": "An unexpected error occurred."}
+        return _finalize_task(
+            task_id,
+            user_id,
+            "word-to-pdf",
+            original_filename,
+            {"status": "failed", "error": "An unexpected error occurred."},
+            usage_source,
+            api_key_id,
+            self.request.id,
+        )

View File

@@ -3,9 +3,12 @@ import os
import json import json
import logging import logging
from flask import current_app
from app.extensions import celery from app.extensions import celery
from app.services.flowchart_service import extract_and_generate, FlowchartError from app.services.flowchart_service import extract_and_generate, FlowchartError
from app.services.storage_service import storage from app.services.storage_service import storage
from app.services.task_tracking_service import finalize_task_tracking
from app.utils.sanitizer import cleanup_task_files from app.utils.sanitizer import cleanup_task_files
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@@ -15,17 +18,132 @@ def _cleanup(task_id: str):
cleanup_task_files(task_id, keep_outputs=not storage.use_s3) cleanup_task_files(task_id, keep_outputs=not storage.use_s3)
def _get_output_dir(task_id: str) -> str:
"""Resolve output directory from app config."""
output_dir = os.path.join(current_app.config["OUTPUT_FOLDER"], task_id)
os.makedirs(output_dir, exist_ok=True)
return output_dir
def _finalize_task(
task_id: str,
user_id: int | None,
tool: str,
original_filename: str | None,
result: dict,
usage_source: str,
api_key_id: int | None,
celery_task_id: str | None,
):
"""Persist optional history and cleanup task files."""
finalize_task_tracking(
user_id=user_id,
tool=tool,
original_filename=original_filename,
result=result,
usage_source=usage_source,
api_key_id=api_key_id,
celery_task_id=celery_task_id,
)
_cleanup(task_id)
return result
def _build_sample_result() -> dict:
"""Return deterministic sample flowchart data for demo mode."""
pages = [
{
"page": 1,
"text": (
"Employee Onboarding Procedure\n"
"1. Create employee profile in HR system.\n"
"2. Verify documents and eligibility.\n"
"3. Assign department and manager.\n"
"4. Send welcome package and access credentials.\n"
"5. Confirm first-day orientation schedule."
),
}
]
procedures = [
{
"id": "sample-proc-1",
"title": "Employee Onboarding Procedure",
"description": "Create profile, verify docs, assign team, and confirm orientation.",
"pages": [1],
"step_count": 5,
}
]
flowcharts = [
{
"id": "flow-sample-proc-1",
"procedureId": "sample-proc-1",
"title": "Employee Onboarding Procedure",
"steps": [
{
"id": "1",
"type": "start",
"title": "Begin: Employee Onboarding",
"description": "Start of onboarding process",
"connections": ["2"],
},
{
"id": "2",
"type": "process",
"title": "Create Employee Profile",
"description": "Register employee in HR system",
"connections": ["3"],
},
{
"id": "3",
"type": "decision",
"title": "Documents Verified?",
"description": "Check eligibility and required documents",
"connections": ["4"],
},
{
"id": "4",
"type": "process",
"title": "Assign Team and Access",
"description": "Assign manager, department, and credentials",
"connections": ["5"],
},
{
"id": "5",
"type": "end",
"title": "Onboarding Complete",
"description": "Employee is ready for orientation",
"connections": [],
},
],
}
]
return {
"procedures": procedures,
"flowcharts": flowcharts,
"pages": pages,
"total_pages": len(pages),
}
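The sample payload above links steps through their `connections` arrays, from a `start` node to an `end` node. As a quick illustration (the helper below is hypothetical, not part of this diff), a linear flowchart like the sample can be walked in order:

```python
# Sketch: walk a linear flowchart from its "start" step by following the first
# entry of each step's "connections" list. Field names mirror the sample above.
def walk_flowchart(flowchart: dict) -> list[str]:
    steps = {s["id"]: s for s in flowchart["steps"]}
    current = next(s for s in flowchart["steps"] if s["type"] == "start")
    order = []
    while current is not None:
        order.append(current["title"])
        nxt = current["connections"]
        current = steps[nxt[0]] if nxt else None  # empty list marks the end
    return order

sample = {
    "steps": [
        {"id": "1", "type": "start", "title": "Begin", "connections": ["2"]},
        {"id": "2", "type": "process", "title": "Create Profile", "connections": ["3"]},
        {"id": "3", "type": "end", "title": "Done", "connections": []},
    ]
}
print(walk_flowchart(sample))  # ['Begin', 'Create Profile', 'Done']
```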
@celery.task(bind=True, name="app.tasks.flowchart_tasks.extract_flowchart_task")
def extract_flowchart_task(
self,
input_path: str,
task_id: str,
original_filename: str,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""
Async task: Extract procedures from PDF and generate flowcharts.
Returns a JSON result containing procedures and their flowcharts.
"""
output_dir = _get_output_dir(task_id)

try:
self.update_state(
@@ -61,19 +179,87 @@ def extract_flowchart_task(
"procedures_count": len(result["procedures"]),
}

logger.info(
f"Task {task_id}: Flowchart extraction completed — "
f"{len(result['procedures'])} procedures, "
f"{result['total_pages']} pages"
)

return _finalize_task(
task_id,
user_id,
"pdf-flowchart",
original_filename,
final_result,
usage_source,
api_key_id,
self.request.id,
)
except FlowchartError as e:
logger.error(f"Task {task_id}: Flowchart error — {e}")
return _finalize_task(
task_id,
user_id,
"pdf-flowchart",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"pdf-flowchart",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
@celery.task(bind=True, name="app.tasks.flowchart_tasks.extract_sample_flowchart_task")
def extract_sample_flowchart_task(
self,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""
Async task: Build a sample flowchart payload without requiring file upload.
"""
try:
self.update_state(
state="PROCESSING",
meta={"step": "Preparing sample flowchart..."},
)
result = _build_sample_result()
final_result = {
"status": "completed",
"filename": "sample_flowcharts.json",
"procedures": result["procedures"],
"flowcharts": result["flowcharts"],
"pages": result["pages"],
"total_pages": result["total_pages"],
"procedures_count": len(result["procedures"]),
}
finalize_task_tracking(
user_id=user_id,
tool="pdf-flowchart-sample",
original_filename="sample-document.pdf",
result=final_result,
usage_source=usage_source,
api_key_id=api_key_id,
celery_task_id=self.request.id,
)
logger.info("Sample flowchart task completed")
return final_result
except Exception as e:
logger.error(f"Sample flowchart task failed — {e}")
return {"status": "failed", "error": "An unexpected error occurred."}

View File

@@ -2,15 +2,49 @@
import os
import logging

from flask import current_app

from app.extensions import celery
from app.services.image_service import convert_image, resize_image, ImageProcessingError
from app.services.storage_service import storage
from app.services.task_tracking_service import finalize_task_tracking
from app.utils.sanitizer import cleanup_task_files

def _cleanup(task_id: str):
cleanup_task_files(task_id, keep_outputs=not storage.use_s3)
def _get_output_dir(task_id: str) -> str:
"""Resolve output directory from app config."""
output_dir = os.path.join(current_app.config["OUTPUT_FOLDER"], task_id)
os.makedirs(output_dir, exist_ok=True)
return output_dir
def _finalize_task(
task_id: str,
user_id: int | None,
tool: str,
original_filename: str,
result: dict,
usage_source: str,
api_key_id: int | None,
celery_task_id: str | None,
):
"""Persist optional history and cleanup task files."""
finalize_task_tracking(
user_id=user_id,
tool=tool,
original_filename=original_filename,
result=result,
usage_source=usage_source,
api_key_id=api_key_id,
celery_task_id=celery_task_id,
)
_cleanup(task_id)
return result
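The `_finalize_task` helper above is called on every exit path, success and failure alike, so history recording and file cleanup cannot be skipped by an early return. A minimal sketch of that pattern (names and the list-based stand-ins are illustrative, not the real services):

```python
# Sketch of the "finalize then cleanup" pattern: one helper records history and
# triggers cleanup, and both success and failure branches return through it.
def finalize(result: dict, *, record: list, cleanup: list, task_id: str) -> dict:
    record.append({"task_id": task_id, "status": result["status"]})  # history entry
    cleanup.append(task_id)  # cleanup runs even when the task failed
    return result

history: list = []
cleaned: list = []
ok = finalize({"status": "completed"}, record=history, cleanup=cleaned, task_id="t1")
err = finalize({"status": "failed", "error": "boom"}, record=history, cleanup=cleaned, task_id="t2")
print(len(history), cleaned)  # 2 ['t1', 't2']
```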
logger = logging.getLogger(__name__)
@@ -22,6 +56,9 @@ def convert_image_task(
original_filename: str,
output_format: str,
quality: int = 85,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""
Async task: Convert an image to a different format.
@@ -36,8 +73,7 @@ def convert_image_task(
Returns:
dict with download_url and conversion stats
"""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}.{output_format}")

try:
@@ -70,20 +106,43 @@ def convert_image_task(
"format": stats["format"],
}

logger.info(f"Task {task_id}: Image conversion to {output_format} completed")

return _finalize_task(
task_id,
user_id,
"image-convert",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except ImageProcessingError as e:
logger.error(f"Task {task_id}: Image error — {e}")
return _finalize_task(
task_id,
user_id,
"image-convert",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"image-convert",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
@celery.task(bind=True, name="app.tasks.image_tasks.resize_image_task")
@@ -95,6 +154,9 @@ def resize_image_task(
width: int | None = None,
height: int | None = None,
quality: int = 85,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""
Async task: Resize an image.
@@ -111,8 +173,7 @@ def resize_image_task(
dict with download_url and resize info
"""
ext = os.path.splitext(original_filename)[1].lstrip(".")
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}.{ext}")

try:
@@ -144,17 +205,40 @@ def resize_image_task(
"new_height": stats["new_height"],
}

logger.info(f"Task {task_id}: Image resize completed")

return _finalize_task(
task_id,
user_id,
"image-resize",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except ImageProcessingError as e:
logger.error(f"Task {task_id}: Image error — {e}")
return _finalize_task(
task_id,
user_id,
"image-resize",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"image-resize",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)

View File

@@ -2,6 +2,8 @@
import os
import logging

from flask import current_app

from app.extensions import celery
from app.services.pdf_tools_service import (
merge_pdfs,
@@ -16,6 +18,7 @@ from app.services.pdf_tools_service import (
PDFToolsError,
)
from app.services.storage_service import storage
from app.services.task_tracking_service import finalize_task_tracking
from app.utils.sanitizer import cleanup_task_files
@@ -23,6 +26,37 @@ def _cleanup(task_id: str):
cleanup_task_files(task_id, keep_outputs=not storage.use_s3)
def _get_output_dir(task_id: str) -> str:
"""Resolve output directory from app config."""
output_dir = os.path.join(current_app.config["OUTPUT_FOLDER"], task_id)
os.makedirs(output_dir, exist_ok=True)
return output_dir
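The layout assumed by `_get_output_dir` is one directory per task under the configured output folder, which lets cleanup remove a single directory. A self-contained sketch of that behavior (using a temp directory in place of `current_app.config["OUTPUT_FOLDER"]`):

```python
# Sketch: per-task output directory <OUTPUT_FOLDER>/<task_id>/, created lazily.
import os
import tempfile

def get_output_dir(output_folder: str, task_id: str) -> str:
    path = os.path.join(output_folder, task_id)
    os.makedirs(path, exist_ok=True)  # idempotent: safe on task retries
    return path

root = tempfile.mkdtemp()
d = get_output_dir(root, "abc123")
print(os.path.isdir(d), d.endswith("abc123"))  # True True
```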
def _finalize_task(
task_id: str,
user_id: int | None,
tool: str,
original_filename: str,
result: dict,
usage_source: str,
api_key_id: int | None,
celery_task_id: str | None,
):
"""Persist optional history and cleanup task files."""
finalize_task_tracking(
user_id=user_id,
tool=tool,
original_filename=original_filename,
result=result,
usage_source=usage_source,
api_key_id=api_key_id,
celery_task_id=celery_task_id,
)
_cleanup(task_id)
return result
logger = logging.getLogger(__name__)
@@ -31,11 +65,16 @@ logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
@celery.task(bind=True, name="app.tasks.pdf_tools_tasks.merge_pdfs_task")
def merge_pdfs_task(
self,
input_paths: list[str],
task_id: str,
original_filenames: list[str],
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Merge multiple PDFs into one."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_merged.pdf")

try:
@@ -56,18 +95,42 @@ def merge_pdfs_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: Merge completed — {stats['files_merged']} files, {stats['total_pages']} pages")

return _finalize_task(
task_id,
user_id,
"merge-pdf",
", ".join(original_filenames),
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Merge error — {e}")
return _finalize_task(
task_id,
user_id,
"merge-pdf",
", ".join(original_filenames),
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"merge-pdf",
", ".join(original_filenames),
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -77,9 +140,12 @@ def merge_pdfs_task(
def split_pdf_task(
self, input_path: str, task_id: str, original_filename: str,
mode: str = "all", pages: str | None = None,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Split a PDF into individual pages."""
output_dir = _get_output_dir(task_id)

try:
self.update_state(state="PROCESSING", meta={"step": "Splitting PDF..."})
@@ -102,18 +168,42 @@ def split_pdf_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: Split completed — {stats['extracted_pages']} pages extracted")

return _finalize_task(
task_id,
user_id,
"split-pdf",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Split error — {e}")
return _finalize_task(
task_id,
user_id,
"split-pdf",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"split-pdf",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -123,10 +213,12 @@ def split_pdf_task(
def rotate_pdf_task(
self, input_path: str, task_id: str, original_filename: str,
rotation: int = 90, pages: str = "all",
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Rotate pages in a PDF."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_rotated.pdf")

try:
@@ -150,18 +242,42 @@ def rotate_pdf_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: Rotate completed — {stats['rotated_pages']} pages")

return _finalize_task(
task_id,
user_id,
"rotate-pdf",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Rotate error — {e}")
return _finalize_task(
task_id,
user_id,
"rotate-pdf",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"rotate-pdf",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -171,10 +287,12 @@ def rotate_pdf_task(
def add_page_numbers_task(
self, input_path: str, task_id: str, original_filename: str,
position: str = "bottom-center", start_number: int = 1,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Add page numbers to a PDF."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_numbered.pdf")

try:
@@ -196,18 +314,42 @@ def add_page_numbers_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: Page numbers added to {stats['total_pages']} pages")

return _finalize_task(
task_id,
user_id,
"page-numbers",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Page numbers error — {e}")
return _finalize_task(
task_id,
user_id,
"page-numbers",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"page-numbers",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -217,9 +359,12 @@ def add_page_numbers_task(
def pdf_to_images_task(
self, input_path: str, task_id: str, original_filename: str,
output_format: str = "png", dpi: int = 200,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Convert PDF pages to images."""
output_dir = _get_output_dir(task_id)

try:
self.update_state(state="PROCESSING", meta={"step": "Converting PDF to images..."})
@@ -243,18 +388,42 @@ def pdf_to_images_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: PDF→Images completed — {stats['page_count']} pages")

return _finalize_task(
task_id,
user_id,
"pdf-to-images",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: PDF→Images error — {e}")
return _finalize_task(
task_id,
user_id,
"pdf-to-images",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"pdf-to-images",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -262,11 +431,16 @@ def pdf_to_images_task(
# ---------------------------------------------------------------------------
@celery.task(bind=True, name="app.tasks.pdf_tools_tasks.images_to_pdf_task")
def images_to_pdf_task(
self,
input_paths: list[str],
task_id: str,
original_filenames: list[str],
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Combine images into a PDF."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_images.pdf")

try:
@@ -286,18 +460,42 @@ def images_to_pdf_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: Images→PDF completed — {stats['page_count']} pages")

return _finalize_task(
task_id,
user_id,
"images-to-pdf",
", ".join(original_filenames),
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Images→PDF error — {e}")
return _finalize_task(
task_id,
user_id,
"images-to-pdf",
", ".join(original_filenames),
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"images-to-pdf",
", ".join(original_filenames),
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -307,10 +505,12 @@ def images_to_pdf_task(
def watermark_pdf_task(
self, input_path: str, task_id: str, original_filename: str,
watermark_text: str, opacity: float = 0.3,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Add watermark to a PDF."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_watermarked.pdf")

try:
@@ -332,18 +532,42 @@ def watermark_pdf_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: Watermark added")

return _finalize_task(
task_id,
user_id,
"watermark-pdf",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Watermark error — {e}")
return _finalize_task(
task_id,
user_id,
"watermark-pdf",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"watermark-pdf",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -353,10 +577,12 @@ def watermark_pdf_task(
def protect_pdf_task(
self, input_path: str, task_id: str, original_filename: str,
password: str,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Add password protection to a PDF."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_protected.pdf")

try:
@@ -378,18 +604,42 @@ def protect_pdf_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: PDF protected")

return _finalize_task(
task_id,
user_id,
"protect-pdf",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Protect error — {e}")
return _finalize_task(
task_id,
user_id,
"protect-pdf",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"protect-pdf",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
# ---------------------------------------------------------------------------
@@ -399,10 +649,12 @@ def protect_pdf_task(
def unlock_pdf_task(
self, input_path: str, task_id: str, original_filename: str,
password: str,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
):
"""Async task: Remove password from a PDF."""
output_dir = _get_output_dir(task_id)
output_path = os.path.join(output_dir, f"{task_id}_unlocked.pdf")

try:
@@ -424,15 +676,39 @@ def unlock_pdf_task(
"output_size": stats["output_size"],
}

logger.info(f"Task {task_id}: PDF unlocked")

return _finalize_task(
task_id,
user_id,
"unlock-pdf",
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except PDFToolsError as e:
logger.error(f"Task {task_id}: Unlock error — {e}")
return _finalize_task(
task_id,
user_id,
"unlock-pdf",
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}")
return _finalize_task(
task_id,
user_id,
"unlock-pdf",
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
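Every task in this file gains `user_id`, `usage_source`, and `api_key_id` as keyword arguments with defaults. A sketch of why that keeps the upgrade safe for messages already sitting in the queue: a pre-upgrade call that omits the new arguments still binds cleanly (the stripped-down function below is illustrative, not the real task):

```python
# Hypothetical, minimal stand-in for a task signature extended with optional
# kwargs: old positional-only messages keep working after the upgrade.
def unlock_pdf_task(input_path, task_id, original_filename, password,
                    user_id=None, usage_source="web", api_key_id=None):
    return {"task_id": task_id, "user": user_id, "source": usage_source}

old_message = ("in.pdf", "t1", "doc.pdf", "secret")  # args recorded pre-upgrade
new_message = ("in.pdf", "t2", "doc.pdf", "secret")
print(unlock_pdf_task(*old_message))   # defaults fill the new fields
print(unlock_pdf_task(*new_message, user_id=7, usage_source="api", api_key_id=3))
```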

View File

@@ -2,15 +2,48 @@
import os
import logging

from flask import current_app

from app.extensions import celery
from app.services.video_service import video_to_gif, VideoProcessingError
from app.services.storage_service import storage
from app.services.task_tracking_service import finalize_task_tracking
from app.utils.sanitizer import cleanup_task_files

def _cleanup(task_id: str):
cleanup_task_files(task_id, keep_outputs=not storage.use_s3)
def _get_output_dir(task_id: str) -> str:
"""Resolve output directory from app config."""
output_dir = os.path.join(current_app.config["OUTPUT_FOLDER"], task_id)
os.makedirs(output_dir, exist_ok=True)
return output_dir
def _finalize_task(
task_id: str,
user_id: int | None,
original_filename: str,
result: dict,
usage_source: str,
api_key_id: int | None,
celery_task_id: str | None,
):
"""Persist optional history and cleanup task files."""
finalize_task_tracking(
user_id=user_id,
tool="video-to-gif",
original_filename=original_filename,
result=result,
usage_source=usage_source,
api_key_id=api_key_id,
celery_task_id=celery_task_id,
)
_cleanup(task_id)
return result
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@@ -24,6 +57,9 @@ def create_gif_task(
duration: float = 5, duration: float = 5,
fps: int = 10, fps: int = 10,
width: int = 480, width: int = 480,
user_id: int | None = None,
usage_source: str = "web",
api_key_id: int | None = None,
): ):
""" """
Async task: Convert video clip to animated GIF. Async task: Convert video clip to animated GIF.
@@ -40,8 +76,7 @@ def create_gif_task(
Returns: Returns:
dict with download_url and GIF info dict with download_url and GIF info
""" """
output_dir = os.path.join("/tmp/outputs", task_id) output_dir = _get_output_dir(task_id)
os.makedirs(output_dir, exist_ok=True)
output_path = os.path.join(output_dir, f"{task_id}.gif") output_path = os.path.join(output_dir, f"{task_id}.gif")
try: try:
@@ -80,17 +115,37 @@ def create_gif_task(
"height": stats["height"], "height": stats["height"],
} }
_cleanup(task_id)
logger.info(f"Task {task_id}: Video→GIF creation completed") logger.info(f"Task {task_id}: Video→GIF creation completed")
return result return _finalize_task(
task_id,
user_id,
original_filename,
result,
usage_source,
api_key_id,
self.request.id,
)
except VideoProcessingError as e: except VideoProcessingError as e:
logger.error(f"Task {task_id}: Video error — {e}") logger.error(f"Task {task_id}: Video error — {e}")
_cleanup(task_id) return _finalize_task(
return {"status": "failed", "error": str(e)} task_id,
user_id,
original_filename,
{"status": "failed", "error": str(e)},
usage_source,
api_key_id,
self.request.id,
)
except Exception as e: except Exception as e:
logger.error(f"Task {task_id}: Unexpected error — {e}") logger.error(f"Task {task_id}: Unexpected error — {e}")
_cleanup(task_id) return _finalize_task(
return {"status": "failed", "error": "An unexpected error occurred."} task_id,
user_id,
original_filename,
{"status": "failed", "error": "An unexpected error occurred."},
usage_source,
api_key_id,
self.request.id,
)
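Both task hunks above converge every success and failure branch on a single `_finalize_task` exit, so history tracking and temp-file cleanup run in the same order on every path. A minimal standalone sketch of that single-exit shape (the `make_finalizer`, `track`, and `cleanup` names are hypothetical stand-ins for `finalize_task_tracking` and `_cleanup`; no Celery required):

```python
# Sketch of the single-exit finalizer pattern used in the tasks above.
def make_finalizer(track, cleanup):
    def finalize(task_id, result, **meta):
        track(result=result, **meta)   # history is recorded for successes AND failures
        cleanup(task_id)               # temp files are removed on every exit path
        return result                  # the caller still receives the original payload
    return finalize

events = []
finalize = make_finalizer(
    track=lambda **kw: events.append(("track", kw["result"]["status"])),
    cleanup=lambda task_id: events.append(("cleanup", task_id)),
)

ok = finalize("t1", {"status": "completed"})
err = finalize("t2", {"status": "failed", "error": "boom"})
```

Routing the failure branches through the same call is what lets the commit record failed conversions in the user's history instead of silently discarding them.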

backend/app/utils/auth.py Normal file

@@ -0,0 +1,20 @@
"""Session helpers for authenticated routes."""
from flask import session
def get_current_user_id() -> int | None:
"""Return the authenticated user id from session storage."""
user_id = session.get("user_id")
return user_id if isinstance(user_id, int) else None
def login_user_session(user_id: int):
"""Persist the authenticated user in the Flask session."""
session.clear()
session.permanent = True
session["user_id"] = user_id
def logout_user_session():
"""Clear the active Flask session."""
session.clear()


@@ -20,7 +20,11 @@ class FileValidationError(Exception):
         super().__init__(self.message)


-def validate_file(file_storage, allowed_types: list[str] | None = None):
+def validate_file(
+    file_storage,
+    allowed_types: list[str] | None = None,
+    size_limit_overrides: dict[str, int] | None = None,
+):
     """
     Validate an uploaded file through multiple security layers.
@@ -65,7 +69,7 @@ def validate_file(file_storage, allowed_types: list[str] | None = None):
     file_size = file_storage.tell()
     file_storage.seek(0)

-    size_limits = config.get("FILE_SIZE_LIMITS", {})
+    size_limits = size_limit_overrides or config.get("FILE_SIZE_LIMITS", {})
     max_size = size_limits.get(ext, 20 * 1024 * 1024)  # Default 20MB

     if file_size > max_size:
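Note that `size_limit_overrides or config.get(...)` replaces the whole limit map rather than merging into it: an override map that omits an extension falls back to the hard-coded 20 MB default, not to `FILE_SIZE_LIMITS`. A small sketch (the default values here are illustrative, not the app's real config):

```python
DEFAULT_LIMITS = {"pdf": 20 * 1024 * 1024, "png": 10 * 1024 * 1024}  # illustrative

def effective_limit(ext, size_limit_overrides=None):
    # Mirrors: size_limits = size_limit_overrides or config.get("FILE_SIZE_LIMITS", {})
    limits = size_limit_overrides or DEFAULT_LIMITS
    return limits.get(ext, 20 * 1024 * 1024)  # hard-coded 20 MB fallback

assert effective_limit("pdf") == 20 * 1024 * 1024
assert effective_limit("pdf", {"pdf": 100 * 1024 * 1024}) == 100 * 1024 * 1024
# An override map that omits "png" falls back to 20 MB, not to DEFAULT_LIMITS:
assert effective_limit("png", {"pdf": 1}) == 20 * 1024 * 1024
```

Callers supplying overrides therefore need to pass a complete map for every extension they care about.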


@@ -9,3 +9,5 @@ import app.tasks.convert_tasks  # noqa: F401
 import app.tasks.compress_tasks  # noqa: F401
 import app.tasks.image_tasks  # noqa: F401
 import app.tasks.video_tasks  # noqa: F401
+import app.tasks.pdf_tools_tasks  # noqa: F401
+import app.tasks.flowchart_tasks  # noqa: F401


@@ -1,18 +1,31 @@
 import os
+from datetime import timedelta
 from dotenv import load_dotenv

 load_dotenv()

+BASE_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))


 class BaseConfig:
     """Base configuration."""
     SECRET_KEY = os.getenv("SECRET_KEY", "change-me-in-production")
+    INTERNAL_ADMIN_SECRET = os.getenv("INTERNAL_ADMIN_SECRET", "")

     # File upload settings
-    MAX_CONTENT_LENGTH = int(os.getenv("MAX_CONTENT_LENGTH_MB", 50)) * 1024 * 1024
+    MAX_CONTENT_LENGTH = int(
+        os.getenv("ABSOLUTE_MAX_CONTENT_LENGTH_MB", 100)
+    ) * 1024 * 1024
     UPLOAD_FOLDER = os.getenv("UPLOAD_FOLDER", "/tmp/uploads")
     OUTPUT_FOLDER = os.getenv("OUTPUT_FOLDER", "/tmp/outputs")
     FILE_EXPIRY_SECONDS = int(os.getenv("FILE_EXPIRY_SECONDS", 1800))
+    DATABASE_PATH = os.getenv(
+        "DATABASE_PATH", os.path.join(BASE_DIR, "data", "saas_pdf.db")
+    )
+    PERMANENT_SESSION_LIFETIME = timedelta(days=30)
+    SESSION_COOKIE_HTTPONLY = True
+    SESSION_COOKIE_SAMESITE = "Lax"
+    SESSION_COOKIE_SECURE = False

     # Allowed file extensions and MIME types
     ALLOWED_EXTENSIONS = {
@@ -84,6 +97,7 @@ class ProductionConfig(BaseConfig):
     """Production configuration."""
     DEBUG = False
     TESTING = False
+    SESSION_COOKIE_SECURE = True

     # Stricter rate limits in production
     RATELIMIT_DEFAULT = "60/hour"
@@ -94,6 +108,7 @@ class TestingConfig(BaseConfig):
     TESTING = True
     UPLOAD_FOLDER = "/tmp/test_uploads"
     OUTPUT_FOLDER = "/tmp/test_outputs"
+    DATABASE_PATH = "/tmp/test_saas_pdf.db"

     # Disable Redis-backed rate limiting; use in-memory instead
     RATELIMIT_STORAGE_URI = "memory://"


@@ -1,21 +1,35 @@
import io import io
import os import os
import shutil import shutil
import tempfile
import pytest import pytest
from unittest.mock import patch, MagicMock from unittest.mock import patch, MagicMock
from app import create_app from app import create_app
from app.services.account_service import init_account_db
@pytest.fixture @pytest.fixture
def app(): def app():
"""Create application for testing.""" """Create application for testing."""
os.environ['FLASK_ENV'] = 'testing' os.environ['FLASK_ENV'] = 'testing'
test_root = tempfile.mkdtemp(prefix='saas-pdf-tests-')
db_path = os.path.join(test_root, 'test_saas_pdf.db')
upload_folder = os.path.join(test_root, 'uploads')
output_folder = os.path.join(test_root, 'outputs')
os.environ['DATABASE_PATH'] = db_path
os.environ['UPLOAD_FOLDER'] = upload_folder
os.environ['OUTPUT_FOLDER'] = output_folder
app = create_app('testing') app = create_app('testing')
app.config.update({ app.config.update({
'TESTING': True, 'TESTING': True,
'UPLOAD_FOLDER': '/tmp/test_uploads', 'UPLOAD_FOLDER': upload_folder,
'OUTPUT_FOLDER': '/tmp/test_outputs', 'OUTPUT_FOLDER': output_folder,
'DATABASE_PATH': db_path,
}) })
with app.app_context():
init_account_db()
# Create temp directories # Create temp directories
os.makedirs(app.config['UPLOAD_FOLDER'], exist_ok=True) os.makedirs(app.config['UPLOAD_FOLDER'], exist_ok=True)
os.makedirs(app.config['OUTPUT_FOLDER'], exist_ok=True) os.makedirs(app.config['OUTPUT_FOLDER'], exist_ok=True)
@@ -23,8 +37,10 @@ def app():
yield app yield app
# Cleanup temp directories # Cleanup temp directories
shutil.rmtree(app.config['UPLOAD_FOLDER'], ignore_errors=True) shutil.rmtree(test_root, ignore_errors=True)
shutil.rmtree(app.config['OUTPUT_FOLDER'], ignore_errors=True) os.environ.pop('DATABASE_PATH', None)
os.environ.pop('UPLOAD_FOLDER', None)
os.environ.pop('OUTPUT_FOLDER', None)
@pytest.fixture @pytest.fixture


@@ -0,0 +1,67 @@
"""Tests for session-backed authentication routes."""
class TestAuthRoutes:
def test_register_success(self, client):
response = client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
assert response.status_code == 201
data = response.get_json()
assert data['user']['email'] == 'user@example.com'
assert data['user']['plan'] == 'free'
def test_register_duplicate_email(self, client):
client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
response = client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
assert response.status_code == 409
assert 'already exists' in response.get_json()['error'].lower()
def test_login_and_me(self, client):
client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
client.post('/api/auth/logout')
login_response = client.post(
'/api/auth/login',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
me_response = client.get('/api/auth/me')
assert login_response.status_code == 200
assert me_response.status_code == 200
me_data = me_response.get_json()
assert me_data['authenticated'] is True
assert me_data['user']['email'] == 'user@example.com'
def test_login_invalid_password(self, client):
client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
client.post('/api/auth/logout')
response = client.post(
'/api/auth/login',
json={'email': 'user@example.com', 'password': 'wrongpass123'},
)
assert response.status_code == 401
assert 'invalid email or password' in response.get_json()['error'].lower()
def test_me_without_session(self, client):
response = client.get('/api/auth/me')
assert response.status_code == 200
assert response.get_json() == {'authenticated': False, 'user': None}
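The tests above pin down the auth contract: 201 on register, 409 on a duplicate email, 401 on a bad password, and an `{'authenticated': False}` shape for anonymous `/me`. A toy in-memory model of that contract, not the real route code (which stores a password hash, never the password itself):

```python
# Toy model of the register/login contract exercised by the tests above.
accounts: dict[str, str] = {}  # email -> password (the real service stores a hash)

def register(email: str, password: str):
    if email in accounts:
        return 409, {"error": "An account with this email already exists"}
    accounts[email] = password
    return 201, {"user": {"email": email, "plan": "free"}}

def login(email: str, password: str):
    if accounts.get(email) != password:
        return 401, {"error": "Invalid email or password"}
    return 200, {"user": {"email": email}}

assert register("user@example.com", "secretpass123")[0] == 201
assert register("user@example.com", "secretpass123")[0] == 409
assert login("user@example.com", "wrongpass123")[0] == 401
assert login("user@example.com", "secretpass123")[0] == 200
```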


@@ -16,8 +16,8 @@ class TestCompressTaskRoute:
         mock_task.id = 'compress-task-id'
         monkeypatch.setattr(
-            'app.routes.compress.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.compress.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         monkeypatch.setattr(
             'app.routes.compress.generate_safe_path',
@@ -47,8 +47,8 @@ class TestCompressTaskRoute:
         mock_delay = MagicMock(return_value=mock_task)
         monkeypatch.setattr(
-            'app.routes.compress.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.compress.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         monkeypatch.setattr(
             'app.routes.compress.generate_safe_path',
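These tests (and the ones below) now patch a `validate_actor_file` that takes an extra `actor` argument; its implementation is not shown in this diff. Combined with the `size_limit_overrides` hook added to `validate_file`, one plausible shape is a thin wrapper that picks size limits from the actor's plan. Everything below except the `(file, allowed_types, actor)` signature is an assumption, with a toy `validate_file` standing in for the real sanitizer:

```python
# Toy stand-in for app.utils.sanitizer.validate_file: only the size check is
# modeled, and a "file" is just a (filename, ext, byte_size) tuple.
def validate_file(file_storage, allowed_types=None, size_limit_overrides=None):
    name, ext, size = file_storage
    limits = size_limit_overrides or {"pdf": 20 * 1024 * 1024}
    if size > limits.get(ext, 20 * 1024 * 1024):
        raise ValueError("File too large")
    return name, ext

# Hypothetical per-plan limits; the real values are not in this diff.
PLAN_SIZE_LIMITS = {
    "free": {"pdf": 20 * 1024 * 1024},
    "pro": {"pdf": 100 * 1024 * 1024},
}

def validate_actor_file(file_storage, allowed_types, actor):
    plan = (actor or {}).get("plan", "free")
    return validate_file(file_storage, allowed_types,
                         size_limit_overrides=PLAN_SIZE_LIMITS.get(plan))

# A 50 MB PDF passes for a pro actor but is rejected for a free one.
big = ("big.pdf", "pdf", 50 * 1024 * 1024)
assert validate_actor_file(big, ["pdf"], {"plan": "pro"}) == ("big.pdf", "pdf")
try:
    validate_actor_file(big, ["pdf"], {"plan": "free"})
    raise AssertionError("expected rejection")
except ValueError:
    pass
```

This also explains why every mocked validator in the tests gained a third `actor` parameter without otherwise changing behavior.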


@@ -10,8 +10,8 @@ class TestConvertTaskRoutes:
         mock_task.id = 'convert-pdf-word-id'
         monkeypatch.setattr(
-            'app.routes.convert.validate_file',
-            lambda f, allowed_types: ('document.pdf', 'pdf'),
+            'app.routes.convert.validate_actor_file',
+            lambda f, allowed_types, actor: ('document.pdf', 'pdf'),
         )
         monkeypatch.setattr(
             'app.routes.convert.generate_safe_path',
@@ -39,8 +39,8 @@ class TestConvertTaskRoutes:
         mock_task.id = 'convert-word-pdf-id'
         monkeypatch.setattr(
-            'app.routes.convert.validate_file',
-            lambda f, allowed_types: ('report.docx', 'docx'),
+            'app.routes.convert.validate_actor_file',
+            lambda f, allowed_types, actor: ('report.docx', 'docx'),
         )
         monkeypatch.setattr(
             'app.routes.convert.generate_safe_path',


@@ -0,0 +1,54 @@
"""Tests for flowchart task routes."""
import io
from unittest.mock import MagicMock
class TestFlowchartTaskRoutes:
def test_extract_flowchart_dispatches_task(self, client, monkeypatch):
"""Should dispatch extraction task for uploaded PDF."""
mock_task = MagicMock()
mock_task.id = "flow-task-id"
mock_delay = MagicMock(return_value=mock_task)
monkeypatch.setattr(
"app.routes.flowchart.validate_actor_file",
lambda f, allowed_types, actor: ("manual.pdf", "pdf"),
)
monkeypatch.setattr(
"app.routes.flowchart.generate_safe_path",
lambda ext: ("flow-task-id", "/tmp/test.pdf"),
)
monkeypatch.setattr(
"app.routes.flowchart.extract_flowchart_task.delay",
mock_delay,
)
response = client.post(
"/api/flowchart/extract",
data={"file": (io.BytesIO(b"%PDF-1.4"), "manual.pdf")},
content_type="multipart/form-data",
)
assert response.status_code == 202
body = response.get_json()
assert body["task_id"] == "flow-task-id"
args = mock_delay.call_args[0]
assert args[0] == "/tmp/test.pdf"
assert args[1] == "flow-task-id"
assert args[2] == "manual.pdf"
def test_extract_sample_dispatches_task(self, client, monkeypatch):
"""Should dispatch sample extraction task without file upload."""
mock_task = MagicMock()
mock_task.id = "sample-flow-task-id"
monkeypatch.setattr(
"app.routes.flowchart.extract_sample_flowchart_task.delay",
MagicMock(return_value=mock_task),
)
response = client.post("/api/flowchart/extract-sample")
assert response.status_code == 202
body = response.get_json()
assert body["task_id"] == "sample-flow-task-id"


@@ -0,0 +1,68 @@
"""Tests for authenticated file history routes."""
from app.services.account_service import record_file_history
class TestHistoryRoutes:
def test_history_requires_auth(self, client):
response = client.get('/api/history')
assert response.status_code == 401
assert 'authentication required' in response.get_json()['error'].lower()
def test_history_returns_items(self, client, app):
register_response = client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
user_id = register_response.get_json()['user']['id']
with app.app_context():
record_file_history(
user_id=user_id,
tool='pdf-to-word',
original_filename='report.pdf',
output_filename='report.docx',
status='completed',
download_url='/api/download/123/report.docx',
metadata={'output_size': 2048},
)
response = client.get('/api/history?limit=10')
assert response.status_code == 200
data = response.get_json()
assert len(data['items']) == 1
assert data['items'][0]['tool'] == 'pdf-to-word'
assert data['items'][0]['output_filename'] == 'report.docx'
def test_history_limit_is_applied(self, client, app):
register_response = client.post(
'/api/auth/register',
json={'email': 'user@example.com', 'password': 'secretpass123'},
)
user_id = register_response.get_json()['user']['id']
with app.app_context():
record_file_history(
user_id=user_id,
tool='pdf-to-word',
original_filename='first.pdf',
output_filename='first.docx',
status='completed',
download_url='/api/download/1/first.docx',
metadata=None,
)
record_file_history(
user_id=user_id,
tool='word-to-pdf',
original_filename='second.docx',
output_filename='second.pdf',
status='completed',
download_url='/api/download/2/second.pdf',
metadata=None,
)
response = client.get('/api/history?limit=1')
assert response.status_code == 200
assert len(response.get_json()['items']) == 1
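`record_file_history` itself lives in `app.services.account_service` and is not part of this diff. Below is a hypothetical sketch of the storage it implies, with table and column names inferred from the tests above, including the newest-first `LIMIT` behavior the limit test relies on; the real schema may differ:

```python
# Hypothetical sketch of the table behind record_file_history (schema guessed
# from the tests; the real implementation is in account_service).
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE file_history (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        user_id INTEGER NOT NULL,
        tool TEXT NOT NULL,
        original_filename TEXT,
        output_filename TEXT,
        status TEXT NOT NULL,
        download_url TEXT,
        metadata TEXT
    )
""")

def record_file_history(user_id, tool, original_filename, output_filename,
                        status, download_url, metadata):
    conn.execute(
        "INSERT INTO file_history (user_id, tool, original_filename,"
        " output_filename, status, download_url, metadata)"
        " VALUES (?, ?, ?, ?, ?, ?, ?)",
        (user_id, tool, original_filename, output_filename, status,
         download_url, json.dumps(metadata) if metadata else None),
    )

def fetch_history(user_id, limit=10):
    # Newest first, capped by limit — matches test_history_limit_is_applied.
    return conn.execute(
        "SELECT tool, output_filename FROM file_history"
        " WHERE user_id = ? ORDER BY id DESC LIMIT ?",
        (user_id, limit),
    ).fetchall()

record_file_history(1, 'pdf-to-word', 'first.pdf', 'first.docx',
                    'completed', '/api/download/1/first.docx', None)
record_file_history(1, 'word-to-pdf', 'second.docx', 'second.pdf',
                    'completed', '/api/download/2/second.pdf', None)
```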


@@ -10,8 +10,8 @@ class TestImageTaskRoutes:
         mock_task.id = 'img-convert-id'
         monkeypatch.setattr(
-            'app.routes.image.validate_file',
-            lambda f, allowed_types: ('photo.png', 'png'),
+            'app.routes.image.validate_actor_file',
+            lambda f, allowed_types, actor: ('photo.png', 'png'),
         )
         monkeypatch.setattr(
             'app.routes.image.generate_safe_path',
@@ -55,8 +55,8 @@ class TestImageTaskRoutes:
         mock_task.id = 'img-resize-id'
         monkeypatch.setattr(
-            'app.routes.image.validate_file',
-            lambda f, allowed_types: ('photo.jpg', 'jpg'),
+            'app.routes.image.validate_actor_file',
+            lambda f, allowed_types, actor: ('photo.jpg', 'jpg'),
         )
         monkeypatch.setattr(
             'app.routes.image.generate_safe_path',
@@ -83,8 +83,8 @@ class TestImageTaskRoutes:
     def test_resize_image_no_dimensions(self, client, monkeypatch):
         """Should return 400 when both width and height are missing."""
         monkeypatch.setattr(
-            'app.routes.image.validate_file',
-            lambda f, allowed_types: ('photo.jpg', 'jpg'),
+            'app.routes.image.validate_actor_file',
+            lambda f, allowed_types, actor: ('photo.jpg', 'jpg'),
         )
         data = {
             'file': (io.BytesIO(b'\xff\xd8\xff'), 'photo.jpg'),
@@ -100,8 +100,8 @@ class TestImageTaskRoutes:
     def test_resize_image_invalid_width(self, client, monkeypatch):
         """Should return 400 for width out of range."""
         monkeypatch.setattr(
-            'app.routes.image.validate_file',
-            lambda f, allowed_types: ('photo.jpg', 'jpg'),
+            'app.routes.image.validate_actor_file',
+            lambda f, allowed_types, actor: ('photo.jpg', 'jpg'),
         )
         data = {
             'file': (io.BytesIO(b'\xff\xd8\xff'), 'photo.jpg'),


@@ -76,7 +76,7 @@ class TestConcurrentRequests:
             return t

         # Apply all patches BEFORE threads start — avoids concurrent patch/unpatch
-        with patch('app.routes.compress.validate_file', return_value=('t.pdf', 'pdf')), \
+        with patch('app.routes.compress.validate_actor_file', return_value=('t.pdf', 'pdf')), \
              patch('app.routes.compress.generate_safe_path',
                    side_effect=lambda ext, folder_type: (f'tid-x', '/tmp/up/t.pdf')), \
              patch('werkzeug.datastructures.file_storage.FileStorage.save'), \
@@ -121,7 +121,7 @@ class TestConcurrentRequests:
         errors: list[Exception] = []
         lock = threading.Lock()

-        with patch('app.routes.pdf_tools.validate_file', return_value=('t.pdf', 'pdf')), \
+        with patch('app.routes.pdf_tools.validate_actor_file', return_value=('t.pdf', 'pdf')), \
              patch('app.routes.pdf_tools.generate_safe_path',
                    side_effect=lambda ext, folder_type: ('split-x', '/tmp/up/t.pdf')), \
              patch('werkzeug.datastructures.file_storage.FileStorage.save'), \
@@ -180,8 +180,8 @@ class TestFileSizeLimits:
     def test_normal_size_file_is_accepted(self, client, monkeypatch):
         """A file within the size limit reaches the route logic."""
         monkeypatch.setattr(
-            'app.routes.compress.validate_file',
-            lambda f, allowed_types: ('t.pdf', 'pdf'),
+            'app.routes.compress.validate_actor_file',
+            lambda f, allowed_types, actor: ('t.pdf', 'pdf'),
         )
         monkeypatch.setattr(
             'app.routes.compress.generate_safe_path',


@@ -20,8 +20,8 @@ def _mock_validate_and_task(monkeypatch, task_module_path, task_name):
     # Mock file validator to accept any file
     monkeypatch.setattr(
-        'app.routes.pdf_tools.validate_file',
-        lambda f, allowed_types: ('test.pdf', 'pdf'),
+        'app.routes.pdf_tools.validate_actor_file',
+        lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
     )
     monkeypatch.setattr(
         'app.routes.pdf_tools.generate_safe_path',
@@ -62,8 +62,8 @@ class TestMergePdfs:
         mock_task = MagicMock()
         mock_task.id = 'merge-task-id'
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         monkeypatch.setattr(
             'app.routes.pdf_tools.merge_pdfs_task.delay',
@@ -95,8 +95,8 @@ class TestMergePdfs:
     def test_merge_too_many_files(self, client, monkeypatch):
         """Should return 400 when more than 20 files provided."""
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
             'files': [
@@ -166,8 +166,8 @@ class TestSplitPdf:
     def test_split_range_mode_requires_pages(self, client, monkeypatch):
         """Should return 400 when range mode is selected without pages."""
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
@@ -195,8 +195,8 @@ class TestRotatePdf:
     def test_rotate_invalid_degrees(self, client, monkeypatch):
         """Should reject invalid rotation angles."""
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
             'file': (io.BytesIO(b'%PDF-1.4'), 'test.pdf'),
@@ -327,8 +327,8 @@ class TestImagesToPdf:
         mock_task = MagicMock()
         mock_task.id = 'images-task-id'
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.png', 'png'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.png', 'png'),
         )
         monkeypatch.setattr(
             'app.routes.pdf_tools.images_to_pdf_task.delay',
@@ -356,8 +356,8 @@ class TestImagesToPdf:
     def test_images_to_pdf_too_many(self, client, monkeypatch):
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.png', 'png'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.png', 'png'),
         )
         data = {
             'files': [
@@ -384,8 +384,8 @@ class TestWatermarkPdf:
     def test_watermark_no_text(self, client, monkeypatch):
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
             'file': (io.BytesIO(b'%PDF-1.4'), 'test.pdf'),
@@ -401,8 +401,8 @@ class TestWatermarkPdf:
     def test_watermark_text_too_long(self, client, monkeypatch):
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
             'file': (io.BytesIO(b'%PDF-1.4'), 'test.pdf'),
@@ -443,8 +443,8 @@ class TestProtectPdf:
     def test_protect_no_password(self, client, monkeypatch):
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
             'file': (io.BytesIO(b'%PDF-1.4'), 'test.pdf'),
@@ -460,8 +460,8 @@ class TestProtectPdf:
     def test_protect_short_password(self, client, monkeypatch):
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
         )
         data = {
             'file': (io.BytesIO(b'%PDF-1.4'), 'test.pdf'),
@@ -501,8 +501,8 @@ class TestUnlockPdf:
     def test_unlock_no_password(self, client, monkeypatch):
         monkeypatch.setattr(
-            'app.routes.pdf_tools.validate_file',
-            lambda f, allowed_types: ('test.pdf', 'pdf'),
+            'app.routes.pdf_tools.validate_actor_file',
+            lambda f, allowed_types, actor: ('test.pdf', 'pdf'),
        )
         data = {
             'file': (io.BytesIO(b'%PDF-1.4'), 'test.pdf'),


@@ -16,8 +16,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'split-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('split-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.split_pdf_task.delay', mock_delay)
@@ -41,8 +41,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'rotate-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('rotate-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.rotate_pdf_task.delay', mock_delay)
@@ -66,8 +66,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'wm-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('wm-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.watermark_pdf_task.delay', mock_delay)
@@ -91,8 +91,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'protect-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('protect-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.protect_pdf_task.delay', mock_delay)
@@ -113,8 +113,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'unlock-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('unlock-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.unlock_pdf_task.delay', mock_delay)
@@ -133,8 +133,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'pn-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('pn-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.add_page_numbers_task.delay', mock_delay)
@@ -157,8 +157,8 @@ class TestPdfToolsTaskRoutes:
         mock_task.id = 'p2i-id'
         mock_delay = MagicMock(return_value=mock_task)
-        monkeypatch.setattr('app.routes.pdf_tools.validate_file',
-                            lambda f, allowed_types: ('test.pdf', 'pdf'))
+        monkeypatch.setattr('app.routes.pdf_tools.validate_actor_file',
+                            lambda f, allowed_types, actor: ('test.pdf', 'pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.generate_safe_path',
                             lambda ext, folder_type: ('p2i-id', '/tmp/test.pdf'))
         monkeypatch.setattr('app.routes.pdf_tools.pdf_to_images_task.delay', mock_delay)


@@ -14,8 +14,8 @@ class TestVideoToGif:
     def test_to_gif_invalid_params(self, client, monkeypatch):
         """Should return 400 for non-numeric parameters."""
         monkeypatch.setattr(
-            'app.routes.video.validate_file',
-            lambda f, allowed_types: ('test.mp4', 'mp4'),
+            'app.routes.video.validate_actor_file',
+            lambda f, allowed_types, actor: (r'test.mp4', r'mp4'),
         )
         data = {
             'file': (io.BytesIO(b'\x00\x00\x00\x1cftyp'), 'test.mp4'),
@@ -32,8 +32,8 @@ class TestVideoToGif:
     def test_to_gif_negative_start(self, client, monkeypatch):
         """Should reject negative start time."""
         monkeypatch.setattr(
-            'app.routes.video.validate_file',
-            lambda f, allowed_types: ('test.mp4', 'mp4'),
+            'app.routes.video.validate_actor_file',
+            lambda f, allowed_types, actor: (r'test.mp4', r'mp4'),
         )
         data = {
             'file': (io.BytesIO(b'\x00\x00\x00\x1cftyp'), 'test.mp4'),
@@ -52,8 +52,8 @@ class TestVideoToGif:
     def test_to_gif_duration_too_long(self, client, monkeypatch):
         """Should reject duration > 15 seconds."""
         monkeypatch.setattr(
-            'app.routes.video.validate_file',
-            lambda f, allowed_types: ('test.mp4', 'mp4'),
+            'app.routes.video.validate_actor_file',
+            lambda f, allowed_types, actor: (r'test.mp4', r'mp4'),
         )
         data = {
             'file': (io.BytesIO(b'\x00\x00\x00\x1cftyp'), 'test.mp4'),
@@ -73,8 +73,8 @@ class TestVideoToGif:
     def test_to_gif_fps_out_of_range(self, client, monkeypatch):
         """Should reject FPS > 20."""
         monkeypatch.setattr(
-            'app.routes.video.validate_file',
-            lambda f, allowed_types: ('test.mp4', 'mp4'),
+            'app.routes.video.validate_actor_file',
+            lambda f, allowed_types, actor: (r'test.mp4', r'mp4'),
         )
         data = {
             'file': (io.BytesIO(b'\x00\x00\x00\x1cftyp'), 'test.mp4'),
@@ -93,8 +93,8 @@ class TestVideoToGif:
     def test_to_gif_width_out_of_range(self, client, monkeypatch):
         """Should reject width > 640."""
         monkeypatch.setattr(
-            'app.routes.video.validate_file',
-            lambda f, allowed_types: ('test.mp4', 'mp4'),
+            'app.routes.video.validate_actor_file',
+            lambda f, allowed_types, actor: (r'test.mp4', r'mp4'),
        )
         data = {
             'file': (io.BytesIO(b'\x00\x00\x00\x1cftyp'), 'test.mp4'),
@@ -116,8 +116,8 @@ class TestVideoToGif:
     mock_task.id = 'gif-task-id'
     monkeypatch.setattr(
-        'app.routes.video.validate_file',
-        lambda f, allowed_types: ('test.mp4', 'mp4'),
+        'app.routes.video.validate_actor_file',
+        lambda f, allowed_types, actor: (r'test.mp4', r'mp4'),
     )
     monkeypatch.setattr(
         'app.routes.video.generate_safe_path',


@@ -11,8 +11,8 @@ class TestVideoTaskRoutes:
     mock_delay = MagicMock(return_value=mock_task)
     monkeypatch.setattr(
-        'app.routes.video.validate_file',
-        lambda f, allowed_types: ('video.mp4', 'mp4'),
+        'app.routes.video.validate_actor_file',
+        lambda f, allowed_types, actor: ('video.mp4', 'mp4'),
     )
     monkeypatch.setattr(
         'app.routes.video.generate_safe_path',
@@ -53,8 +53,8 @@ class TestVideoTaskRoutes:
     mock_delay = MagicMock(return_value=mock_task)
     monkeypatch.setattr(
-        'app.routes.video.validate_file',
-        lambda f, allowed_types: ('video.mp4', 'mp4'),
+        'app.routes.video.validate_actor_file',
+        lambda f, allowed_types, actor: ('video.mp4', 'mp4'),
     )
     monkeypatch.setattr(
         'app.routes.video.generate_safe_path',


@@ -26,6 +26,7 @@ services:
     volumes:
       - upload_data:/tmp/uploads
       - output_data:/tmp/outputs
+      - db_data:/app/data
     depends_on:
       redis:
         condition: service_healthy
@@ -40,7 +41,7 @@ services:
       celery -A celery_worker.celery worker
       --loglevel=warning
       --concurrency=4
-      -Q default,convert,compress,image,video,pdf_tools
+      -Q default,convert,compress,image,video,pdf_tools,flowchart
     env_file:
       - .env
     environment:
@@ -51,6 +52,7 @@ services:
     volumes:
       - upload_data:/tmp/uploads
       - output_data:/tmp/outputs
+      - db_data:/app/data
    depends_on:
       redis:
         condition: service_healthy
@@ -71,6 +73,8 @@ services:
       - REDIS_URL=redis://redis:6379/0
       - CELERY_BROKER_URL=redis://redis:6379/0
       - CELERY_RESULT_BACKEND=redis://redis:6379/1
+    volumes:
+      - db_data:/app/data
     depends_on:
       redis:
         condition: service_healthy
@@ -97,6 +101,13 @@ services:
       context: ./frontend
       dockerfile: Dockerfile
       target: build
+    environment:
+      - VITE_GA_MEASUREMENT_ID=${VITE_GA_MEASUREMENT_ID:-}
+      - VITE_ADSENSE_CLIENT_ID=${VITE_ADSENSE_CLIENT_ID:-}
+      - VITE_ADSENSE_SLOT_HOME_TOP=${VITE_ADSENSE_SLOT_HOME_TOP:-}
+      - VITE_ADSENSE_SLOT_HOME_BOTTOM=${VITE_ADSENSE_SLOT_HOME_BOTTOM:-}
+      - VITE_ADSENSE_SLOT_TOP_BANNER=${VITE_ADSENSE_SLOT_TOP_BANNER:-}
+      - VITE_ADSENSE_SLOT_BOTTOM_BANNER=${VITE_ADSENSE_SLOT_BOTTOM_BANNER:-}
     volumes:
       - frontend_build:/app/dist
@@ -104,4 +115,5 @@ volumes:
   redis_data:
   upload_data:
   output_data:
+  db_data:
   frontend_build:


@@ -44,7 +44,7 @@ services:
       celery -A celery_worker.celery worker
       --loglevel=info
       --concurrency=2
-      -Q default,convert,compress,image,video,pdf_tools
+      -Q default,convert,compress,image,video,pdf_tools,flowchart
     env_file:
       - .env
     environment:
@@ -80,6 +80,12 @@ services:
       - /app/node_modules
     environment:
       - NODE_ENV=development
+      - VITE_GA_MEASUREMENT_ID=${VITE_GA_MEASUREMENT_ID:-}
+      - VITE_ADSENSE_CLIENT_ID=${VITE_ADSENSE_CLIENT_ID:-}
+      - VITE_ADSENSE_SLOT_HOME_TOP=${VITE_ADSENSE_SLOT_HOME_TOP:-}
+      - VITE_ADSENSE_SLOT_HOME_BOTTOM=${VITE_ADSENSE_SLOT_HOME_BOTTOM:-}
+      - VITE_ADSENSE_SLOT_TOP_BANNER=${VITE_ADSENSE_SLOT_TOP_BANNER:-}
+      - VITE_ADSENSE_SLOT_BOTTOM_BANNER=${VITE_ADSENSE_SLOT_BOTTOM_BANNER:-}
   # --- Nginx Reverse Proxy ---
   nginx:

frontend/.env.example

@@ -0,0 +1,6 @@
+VITE_GA_MEASUREMENT_ID=G-XXXXXXXXXX
+VITE_ADSENSE_CLIENT_ID=ca-pub-XXXXXXXXXXXXXXXX
+VITE_ADSENSE_SLOT_HOME_TOP=1234567890
+VITE_ADSENSE_SLOT_HOME_BOTTOM=1234567891
+VITE_ADSENSE_SLOT_TOP_BANNER=1234567892
+VITE_ADSENSE_SLOT_BOTTOM_BANNER=1234567893


@@ -10,11 +10,11 @@
     <meta name="robots" content="index, follow" />
     <meta property="og:type" content="website" />
     <meta property="og:title" content="SaaS-PDF — Free Online File Tools" />
-    <meta property="og:description" content="16+ free tools: merge, split, compress, convert PDFs, images, videos & text. No signup required." />
+    <meta property="og:description" content="18+ free tools: merge, split, compress, convert PDFs, images, videos & text. No signup required." />
     <meta property="og:site_name" content="SaaS-PDF" />
     <meta name="twitter:card" content="summary_large_image" />
     <meta name="twitter:title" content="SaaS-PDF — Free Online File Tools" />
-    <meta name="twitter:description" content="16+ free tools: merge, split, compress, convert PDFs, images, videos & text. No signup required." />
+    <meta name="twitter:description" content="18+ free tools: merge, split, compress, convert PDFs, images, videos & text. No signup required." />
     <link rel="preconnect" href="https://fonts.googleapis.com" />
     <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
     <link href="https://fonts.googleapis.com/css2?family=Inter:wght@300;400;500;600;700&family=Tajawal:wght@300;400;500;700&display=swap" rel="stylesheet" />


@@ -1,8 +1,10 @@
-import { lazy, Suspense } from 'react';
-import { Routes, Route } from 'react-router-dom';
+import { lazy, Suspense, useEffect } from 'react';
+import { Routes, Route, useLocation } from 'react-router-dom';
 import Header from '@/components/layout/Header';
 import Footer from '@/components/layout/Footer';
 import { useDirection } from '@/hooks/useDirection';
+import { initAnalytics, trackPageView } from '@/services/analytics';
+import { useAuthStore } from '@/stores/authStore';

 // Pages
 const HomePage = lazy(() => import('@/pages/HomePage'));
@@ -10,6 +12,7 @@ const AboutPage = lazy(() => import('@/pages/AboutPage'));
 const PrivacyPage = lazy(() => import('@/pages/PrivacyPage'));
 const NotFoundPage = lazy(() => import('@/pages/NotFoundPage'));
 const TermsPage = lazy(() => import('@/pages/TermsPage'));
+const AccountPage = lazy(() => import('@/pages/AccountPage'));

 // Tool Pages
 const PdfToWord = lazy(() => import('@/components/tools/PdfToWord'));
@@ -41,6 +44,17 @@ function LoadingFallback() {
 export default function App() {
   useDirection();
+  const location = useLocation();
+  const refreshUser = useAuthStore((state) => state.refreshUser);
+
+  useEffect(() => {
+    initAnalytics();
+    void refreshUser();
+  }, [refreshUser]);
+
+  useEffect(() => {
+    trackPageView(`${location.pathname}${location.search}`);
+  }, [location.pathname, location.search]);

   return (
     <div className="flex min-h-screen flex-col bg-slate-50 transition-colors duration-300 dark:bg-slate-950">
@@ -52,6 +66,7 @@ export default function App() {
           {/* Pages */}
           <Route path="/" element={<HomePage />} />
           <Route path="/about" element={<AboutPage />} />
+          <Route path="/account" element={<AccountPage />} />
           <Route path="/privacy" element={<PrivacyPage />} />
           <Route path="/terms" element={<TermsPage />} />
View File

@@ -1,4 +1,5 @@
 import { useEffect, useRef } from 'react';
+import { useAuthStore } from '@/stores/authStore';

 interface AdSlotProps {
   /** AdSense ad slot ID */
@@ -21,9 +22,50 @@ export default function AdSlot({
   responsive = true,
   className = '',
 }: AdSlotProps) {
+  const user = useAuthStore((s) => s.user);
   const adRef = useRef<HTMLModElement>(null);
   const isLoaded = useRef(false);
+  const clientId = (import.meta.env.VITE_ADSENSE_CLIENT_ID || '').trim();
+  const slotMap: Record<string, string | undefined> = {
+    'home-top': import.meta.env.VITE_ADSENSE_SLOT_HOME_TOP,
+    'home-bottom': import.meta.env.VITE_ADSENSE_SLOT_HOME_BOTTOM,
+    'top-banner': import.meta.env.VITE_ADSENSE_SLOT_TOP_BANNER,
+    'bottom-banner': import.meta.env.VITE_ADSENSE_SLOT_BOTTOM_BANNER,
+  };
+  const resolvedSlot = /^\d+$/.test(slot) ? slot : slotMap[slot];

   useEffect(() => {
-    if (isLoaded.current) return;
+    if (isLoaded.current || !clientId || !resolvedSlot) return;
+
+    const existingScript = document.querySelector<HTMLScriptElement>(
+      `script[data-adsense-client="${clientId}"]`
+    );
+    if (!existingScript) {
+      const script = document.createElement('script');
+      script.async = true;
+      script.src = `https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=${clientId}`;
+      script.crossOrigin = 'anonymous';
+      script.setAttribute('data-adsense-client', clientId);
+      document.head.appendChild(script);
+    }
+
     try {
       // Push ad to AdSense queue
-      const adsbygoogle = (window as any).adsbygoogle || [];
+      const adsWindow = window as Window & { adsbygoogle?: unknown[] };
+      const adsbygoogle = adsWindow.adsbygoogle || [];
       adsbygoogle.push({});
+      adsWindow.adsbygoogle = adsbygoogle;
       isLoaded.current = true;
     } catch {
       // AdSense not loaded (e.g., ad blocker)
     }
-  }, []);
+  }, [clientId, resolvedSlot]);
+
+  if (!clientId || !resolvedSlot) return null;
+
+  // Pro users see no ads
+  if (user?.plan === 'pro') return null;

   return (
     <div className={`ad-slot ${className}`}>
@@ -43,8 +73,8 @@ export default function AdSlot({
       ref={adRef}
       className="adsbygoogle"
       style={{ display: 'block' }}
-      data-ad-client={import.meta.env.VITE_ADSENSE_CLIENT_ID || ''}
-      data-ad-slot={slot}
+      data-ad-client={clientId}
+      data-ad-slot={resolvedSlot}
       data-ad-format={format}
       data-full-width-responsive={responsive ? 'true' : 'false'}
     />
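The slot-resolution rule this hunk introduces (a purely numeric `slot` prop passes through untouched; anything else is treated as a named slot and looked up in the env-derived map) can be sketched as a standalone function. The map values below are placeholder IDs, not real AdSense slots; in the component the map is filled from the `VITE_ADSENSE_SLOT_*` variables:

```typescript
// Standalone sketch of AdSlot's resolution rule, with placeholder slot IDs.
type SlotMap = Record<string, string | undefined>;

function resolveSlot(slot: string, slotMap: SlotMap): string | undefined {
  // Numeric IDs are used as-is; named slots are looked up in the map.
  return /^\d+$/.test(slot) ? slot : slotMap[slot];
}

const slotMap: SlotMap = {
  'home-top': '1234567890',
  'top-banner': undefined, // env var left unset → slot unresolved
};

console.log(resolveSlot('9876543210', slotMap)); // numeric → passes through
console.log(resolveSlot('home-top', slotMap)); // named → mapped ID
console.log(resolveSlot('top-banner', slotMap)); // unresolved → undefined, component renders null
```

An unresolved slot short-circuits the component to `return null`, so a missing env variable silently disables that placement instead of pushing a broken ad unit.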


@@ -1,8 +1,8 @@
 import { useState, useEffect, useRef } from 'react';
 import { Link } from 'react-router-dom';
 import { useTranslation } from 'react-i18next';
-import { FileText, Moon, Sun, Menu, X, ChevronDown } from 'lucide-react';
-// ...existing code...
+import { FileText, Moon, Sun, Menu, X, ChevronDown, UserRound } from 'lucide-react';
+import { useAuthStore } from '@/stores/authStore';

 interface LangOption {
   code: string;
   label: string;
@@ -40,6 +40,7 @@ function useDarkMode() {
 export default function Header() {
   const { t, i18n } = useTranslation();
   const { isDark, toggle: toggleDark } = useDarkMode();
+  const user = useAuthStore((state) => state.user);
   const [langOpen, setLangOpen] = useState(false);
   const [mobileOpen, setMobileOpen] = useState(false);
   const langRef = useRef<HTMLDivElement>(null);
@@ -85,10 +86,24 @@ export default function Header() {
           >
             {t('common.about')}
           </Link>
+          <Link
+            to="/account"
+            className="text-sm font-medium text-slate-600 transition-colors hover:text-primary-600 dark:text-slate-300 dark:hover:text-primary-400"
+          >
+            {t('common.account')}
+          </Link>
         </nav>

         {/* Actions */}
         <div className="flex items-center gap-2">
+          <Link
+            to="/account"
+            className="hidden max-w-[220px] items-center gap-2 rounded-xl border border-slate-200 px-3 py-2 text-sm font-medium text-slate-600 transition-colors hover:bg-slate-50 md:flex dark:border-slate-700 dark:text-slate-300 dark:hover:bg-slate-800"
+          >
+            <UserRound className="h-4 w-4" />
+            <span className="truncate">{user?.email || t('common.account')}</span>
+          </Link>
           {/* Dark Mode Toggle */}
           <button
             onClick={toggleDark}
@@ -167,6 +182,13 @@ export default function Header() {
           >
             {t('common.about')}
           </Link>
+          <Link
+            to="/account"
+            onClick={() => setMobileOpen(false)}
+            className="block rounded-lg px-3 py-2.5 text-sm font-medium text-slate-600 transition-colors hover:bg-slate-50 dark:text-slate-300 dark:hover:bg-slate-800"
+          >
+            {user?.email || t('common.account')}
+          </Link>
         </nav>
       )}
     </header>


@@ -2,6 +2,7 @@ import { useTranslation } from 'react-i18next';
 import { Download, RotateCcw, Clock } from 'lucide-react';
 import type { TaskResult } from '@/services/api';
 import { formatFileSize } from '@/utils/textTools';
+import { trackEvent } from '@/services/analytics';

 interface DownloadButtonProps {
   /** Task result containing download URL */
@@ -61,6 +62,9 @@ export default function DownloadButton({ result, onStartOver }: DownloadButtonPr
       <a
         href={result.download_url}
         download={result.filename}
+        onClick={() => {
+          trackEvent('download_clicked', { filename: result.filename || 'unknown' });
+        }}
         className="btn-success w-full"
         target="_blank"
         rel="noopener noreferrer"


@@ -7,6 +7,7 @@ import ToolSelectorModal from '@/components/shared/ToolSelectorModal';
 import { useFileStore } from '@/stores/fileStore';
 import { getToolsForFile, detectFileCategory, getCategoryLabel } from '@/utils/fileRouting';
 import type { ToolOption } from '@/utils/fileRouting';
+import { TOOL_LIMITS_MB } from '@/config/toolLimits';

 /**
  * The MIME types we accept on the homepage smart upload zone.
@@ -62,11 +63,11 @@ export default function HeroUploadZone() {
     onDrop,
     accept: ACCEPTED_TYPES,
     maxFiles: 1,
-    maxSize: 100 * 1024 * 1024, // 100 MB (matches nginx config)
+    maxSize: TOOL_LIMITS_MB.homepageSmartUpload * 1024 * 1024,
     onDropRejected: (rejections) => {
       const rejection = rejections[0];
       if (rejection?.errors[0]?.code === 'file-too-large') {
-        setError(t('common.maxSize', { size: 100 }));
+        setError(t('common.maxSize', { size: TOOL_LIMITS_MB.homepageSmartUpload }));
       } else {
         setError(t('home.unsupportedFile'));
       }


@@ -6,6 +6,7 @@ import AdSlot from '@/components/layout/AdSlot';
 import ProgressBar from '@/components/shared/ProgressBar';
 import DownloadButton from '@/components/shared/DownloadButton';
 import { useTaskPolling } from '@/hooks/useTaskPolling';
+import { uploadFiles } from '@/services/api';
 import { generateToolSchema } from '@/utils/seo';
 import { useFileStore } from '@/stores/fileStore';
@@ -61,20 +62,7 @@ export default function ImagesToPdf() {
     setError(null);

     try {
-      const formData = new FormData();
-      files.forEach((f) => formData.append('files', f));
-
-      const response = await fetch('/api/pdf-tools/images-to-pdf', {
-        method: 'POST',
-        body: formData,
-      });
-
-      const data = await response.json();
-      if (!response.ok) {
-        throw new Error(data.error || 'Upload failed.');
-      }
-
+      const data = await uploadFiles('/pdf-tools/images-to-pdf', files, 'files');
       setTaskId(data.task_id);
       setPhase('processing');
     } catch (err) {


@@ -7,7 +7,7 @@ import ProgressBar from '@/components/shared/ProgressBar';
 import DownloadButton from '@/components/shared/DownloadButton';
 import AdSlot from '@/components/layout/AdSlot';
 import { useTaskPolling } from '@/hooks/useTaskPolling';
-import { uploadFile, type TaskResponse } from '@/services/api';
+import { uploadFiles } from '@/services/api';
 import { generateToolSchema } from '@/utils/seo';
 import { useFileStore } from '@/stores/fileStore';
@@ -62,20 +62,7 @@ export default function MergePdf() {
     setError(null);

     try {
-      const formData = new FormData();
-      files.forEach((f) => formData.append('files', f));
-
-      const response = await fetch('/api/pdf-tools/merge', {
-        method: 'POST',
-        body: formData,
-      });
-
-      const data = await response.json();
-      if (!response.ok) {
-        throw new Error(data.error || 'Upload failed.');
-      }
-
+      const data = await uploadFiles('/pdf-tools/merge', files, 'files');
       setTaskId(data.task_id);
       setPhase('processing');
     } catch (err) {
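Both refactored components now delegate to a shared `uploadFiles` helper from `@/services/api`; the helper itself is outside this diff. A minimal sketch of the multipart assembly it takes over from the deleted inline `fetch` blocks, using the same `'files'` field name the backend routes expect (the real helper also POSTs the form and parses the JSON `task_id` response, which is stubbed out here):

```typescript
// Sketch of the FormData assembly the inline fetch blocks used to perform.
// Blob stands in for browser File objects so the sketch runs outside a browser.
function buildUploadForm(
  files: Array<{ name: string; blob: Blob }>,
  field = 'files'
): FormData {
  const form = new FormData();
  for (const f of files) {
    form.append(field, f.blob, f.name); // one part per file, single field name
  }
  return form;
}

const pdfs = [
  { name: 'a.pdf', blob: new Blob(['%PDF-1.4 a'], { type: 'application/pdf' }) },
  { name: 'b.pdf', blob: new Blob(['%PDF-1.4 b'], { type: 'application/pdf' }) },
];
const form = buildUploadForm(pdfs);
console.log(form.getAll('files').length); // 2
```

Centralizing this removes the duplicated error handling (`response.ok` check, JSON parsing) that each tool page previously reimplemented.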


@@ -4,14 +4,6 @@ import { Helmet } from 'react-helmet-async';
 import {
   PenLine,
   Save,
-  Download,
-  Undo2,
-  Redo2,
-  PlusCircle,
-  Trash2,
-  RotateCw,
-  FileOutput,
-  PanelLeft,
   Share2,
   ShieldCheck,
   Info,
@@ -24,6 +16,7 @@ import { useFileUpload } from '@/hooks/useFileUpload';
 import { useTaskPolling } from '@/hooks/useTaskPolling';
 import { generateToolSchema } from '@/utils/seo';
 import { useFileStore } from '@/stores/fileStore';
+import { TOOL_LIMITS_MB } from '@/config/toolLimits';

 export default function PdfEditor() {
   const { t } = useTranslation();
@@ -40,7 +33,7 @@ export default function PdfEditor() {
     reset,
   } = useFileUpload({
     endpoint: '/compress/pdf',
-    maxSizeMB: 200,
+    maxSizeMB: TOOL_LIMITS_MB.pdf,
     acceptedTypes: ['pdf'],
     extraData: { quality: 'high' },
   });
@@ -77,16 +70,6 @@ export default function PdfEditor() {
     url: `${window.location.origin}/tools/pdf-editor`,
   });

-  const toolbarButtons = [
-    { icon: Undo2, label: t('tools.pdfEditor.undo'), shortcut: 'Ctrl+Z' },
-    { icon: Redo2, label: t('tools.pdfEditor.redo'), shortcut: 'Ctrl+Y' },
-    { icon: PlusCircle, label: t('tools.pdfEditor.addPage') },
-    { icon: Trash2, label: t('tools.pdfEditor.deletePage') },
-    { icon: RotateCw, label: t('tools.pdfEditor.rotate') },
-    { icon: FileOutput, label: t('tools.pdfEditor.extractPage') },
-    { icon: PanelLeft, label: t('tools.pdfEditor.thumbnails') },
-  ];
-
   return (
     <>
       <Helmet>
@@ -117,7 +100,7 @@ export default function PdfEditor() {
             onFileSelect={selectFile}
             file={file}
             accept={{ 'application/pdf': ['.pdf'] }}
-            maxSizeMB={200}
+            maxSizeMB={TOOL_LIMITS_MB.pdf}
             isUploading={isUploading}
             uploadProgress={uploadProgress}
             error={uploadError}
@@ -145,28 +128,6 @@ export default function PdfEditor() {
             </ol>
           </div>

-          {/* Toolbar Preview */}
-          <div className="rounded-2xl bg-white p-4 ring-1 ring-slate-200 dark:bg-slate-800 dark:ring-slate-700">
-            <p className="mb-3 text-xs font-medium uppercase tracking-wide text-slate-400 dark:text-slate-500">
-              {t('tools.pdfEditor.thumbnails')}
-            </p>
-            <div className="flex flex-wrap gap-2">
-              {toolbarButtons.map((btn) => {
-                const Icon = btn.icon;
-                return (
-                  <div
-                    key={btn.label}
-                    className="flex items-center gap-1.5 rounded-lg bg-slate-50 px-3 py-2 text-xs font-medium text-slate-600 ring-1 ring-slate-200 dark:bg-slate-700 dark:text-slate-300 dark:ring-slate-600"
-                    title={btn.shortcut ? `${btn.label} (${btn.shortcut})` : btn.label}
-                  >
-                    <Icon className="h-4 w-4" />
-                    <span className="hidden sm:inline">{btn.label}</span>
-                  </div>
-                );
-              })}
-            </div>
-          </div>
-
           {/* Upload Button */}
           <button
             onClick={handleUpload}


@@ -4,6 +4,7 @@ import { Helmet } from 'react-helmet-async';
 import { GitBranch } from 'lucide-react';
 import AdSlot from '@/components/layout/AdSlot';
 import { useTaskPolling } from '@/hooks/useTaskPolling';
+import { startTask, uploadFile } from '@/services/api';
 import { generateToolSchema } from '@/utils/seo';
 import { useFileStore } from '@/stores/fileStore';
@@ -65,8 +66,8 @@ export default function PdfFlowchart() {
       setUploading(false);
     }
   },
-  onError: () => {
-    setError(taskError || t('common.error'));
+  onError: (err) => {
+    setError(err || t('common.error'));
     setStep(0);
     setUploading(false);
   },
@@ -86,16 +87,7 @@ export default function PdfFlowchart() {
     setError(null);

     try {
-      const formData = new FormData();
-      formData.append('file', file);
-
-      const res = await fetch('/api/flowchart/extract', {
-        method: 'POST',
-        body: formData,
-      });
-
-      const data = await res.json();
-      if (!res.ok) throw new Error(data.error || 'Upload failed.');
+      const data = await uploadFile('/flowchart/extract', file);
       setTaskId(data.task_id);
     } catch (err) {
       setError(err instanceof Error ? err.message : 'Upload failed.');
@@ -108,11 +100,7 @@ export default function PdfFlowchart() {
     setError(null);

     try {
-      const res = await fetch('/api/flowchart/extract-sample', {
-        method: 'POST',
-      });
-
-      const data = await res.json();
-      if (!res.ok) throw new Error(data.error || 'Sample failed.');
+      const data = await startTask('/flowchart/extract-sample');
       setTaskId(data.task_id);
     } catch (err) {
       setError(err instanceof Error ? err.message : 'Sample failed.');


@@ -2,6 +2,7 @@ import { useState, useRef, useEffect } from 'react';
 import { useTranslation } from 'react-i18next';
 import { Send, Bot, User, Sparkles, X, Loader2 } from 'lucide-react';
 import type { Flowchart, ChatMessage } from './types';
+import api from '@/services/api';

 interface FlowChatProps {
   flow: Flowchart;
@@ -42,16 +43,12 @@ export default function FlowChat({ flow, onClose, onFlowUpdate }: FlowChatProps)
     setIsTyping(true);

     try {
-      const res = await fetch('/api/flowchart/chat', {
-        method: 'POST',
-        headers: { 'Content-Type': 'application/json' },
-        body: JSON.stringify({
+      const res = await api.post('/flowchart/chat', {
         message: text,
         flow_id: flow.id,
         flow_data: flow,
-        }),
       });
-      const data = await res.json();
+      const data = res.data;

       const assistantMsg: ChatMessage = {
         id: (Date.now() + 1).toString(),


@@ -0,0 +1,9 @@
+export const TOOL_LIMITS_MB = {
+  pdf: 20,
+  word: 15,
+  image: 10,
+  video: 50,
+  homepageSmartUpload: 50,
+} as const;
+
+export const FILE_RETENTION_MINUTES = 30;
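The components above consume these constants for client-side checks. A sketch of the MB-to-bytes comparison the upload code performs inline; `exceedsLimit` is an illustrative helper, not part of the diff:

```typescript
// TOOL_LIMITS_MB duplicated from the new config file above so the sketch is
// self-contained; exceedsLimit is hypothetical and only shows the comparison.
const TOOL_LIMITS_MB = {
  pdf: 20,
  word: 15,
  image: 10,
  video: 50,
  homepageSmartUpload: 50,
} as const;

type Tool = keyof typeof TOOL_LIMITS_MB;

function exceedsLimit(tool: Tool, sizeBytes: number): boolean {
  // Limits are stored in MB; uploads report bytes, hence the 1024 * 1024 factor.
  return sizeBytes > TOOL_LIMITS_MB[tool] * 1024 * 1024;
}

console.log(exceedsLimit('pdf', 25 * 1024 * 1024)); // 25 MB vs 20 MB limit → true
console.log(exceedsLimit('video', 25 * 1024 * 1024)); // 25 MB vs 50 MB limit → false
```

Keeping the numbers in one module means the dropzone `maxSize`, the hook's rejection message, and the tool pages can no longer drift apart, as the hard-coded 100 MB / 200 MB values did before this commit.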


@@ -1,5 +1,6 @@
import { useState, useCallback, useRef } from 'react'; import { useState, useCallback, useRef } from 'react';
import { uploadFile, type TaskResponse } from '@/services/api'; import { uploadFile, type TaskResponse } from '@/services/api';
import { trackEvent } from '@/services/analytics';
interface UseFileUploadOptions { interface UseFileUploadOptions {
endpoint: string; endpoint: string;
@@ -38,26 +39,45 @@ export function useFileUpload({
setError(null); setError(null);
setTaskId(null); setTaskId(null);
setUploadProgress(0); setUploadProgress(0);
const ext = selectedFile.name.split('.').pop()?.toLowerCase() || 'unknown';
const sizeMb = Number((selectedFile.size / (1024 * 1024)).toFixed(2));
// Client-side size check // Client-side size check
const maxBytes = maxSizeMB * 1024 * 1024; const maxBytes = maxSizeMB * 1024 * 1024;
if (selectedFile.size > maxBytes) { if (selectedFile.size > maxBytes) {
setError(`File too large. Maximum size is ${maxSizeMB}MB.`); setError(`File too large. Maximum size is ${maxSizeMB}MB.`);
trackEvent('upload_rejected_client', {
endpoint,
reason: 'size_limit',
file_ext: ext,
size_mb: sizeMb,
max_size_mb: maxSizeMB,
});
return; return;
} }
// Client-side type check // Client-side type check
if (acceptedTypes && acceptedTypes.length > 0) { if (acceptedTypes && acceptedTypes.length > 0) {
const ext = selectedFile.name.split('.').pop()?.toLowerCase(); const selectedExt = selectedFile.name.split('.').pop()?.toLowerCase();
if (!ext || !acceptedTypes.includes(ext)) { if (!selectedExt || !acceptedTypes.includes(selectedExt)) {
setError(`Invalid file type. Accepted: ${acceptedTypes.join(', ')}`); setError(`Invalid file type. Accepted: ${acceptedTypes.join(', ')}`);
trackEvent('upload_rejected_client', {
endpoint,
reason: 'invalid_type',
file_ext: ext,
});
return; return;
} }
} }
setFile(selectedFile); setFile(selectedFile);
trackEvent('file_selected', {
endpoint,
file_ext: ext,
size_mb: sizeMb,
});
}, },
[maxSizeMB, acceptedTypes] [maxSizeMB, acceptedTypes, endpoint]
); );
const startUpload = useCallback(async (): Promise<string | null> => { const startUpload = useCallback(async (): Promise<string | null> => {
@@ -69,6 +89,7 @@ export function useFileUpload({
setIsUploading(true); setIsUploading(true);
setError(null); setError(null);
setUploadProgress(0); setUploadProgress(0);
trackEvent('upload_started', { endpoint });
try { try {
const response: TaskResponse = await uploadFile( const response: TaskResponse = await uploadFile(
@@ -80,11 +101,13 @@ export function useFileUpload({
setTaskId(response.task_id);
setIsUploading(false);
trackEvent('upload_accepted', { endpoint });
return response.task_id;
} catch (err) {
const message = err instanceof Error ? err.message : 'Upload failed.';
setError(message);
setIsUploading(false);
trackEvent('upload_failed', { endpoint });
return null;
}
}, [file, endpoint]);
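The hook imports a `trackEvent` helper from `@/services/analytics`, which this commit adds alongside the `VITE_GA_MEASUREMENT_ID` env variable. A plausible minimal implementation forwards events to the standard gtag.js global and degrades to a no-op when GA is not loaded, so tracking can never break the upload flow. This is a sketch under those assumptions; the real service in the commit may differ:

```typescript
// Minimal sketch of the assumed analytics service. Safe no-op when the
// gtag.js global is absent (ad blockers, missing VITE_GA_MEASUREMENT_ID).
type EventParams = Record<string, string | number | undefined>;

export function trackEvent(name: string, params: EventParams = {}): void {
  const w = globalThis as {
    gtag?: (cmd: 'event', name: string, params?: EventParams) => void;
  };
  if (typeof w.gtag !== 'function') return; // GA not loaded: do nothing
  w.gtag('event', name, params);
}
```

Because every call site above passes flat string/number parameters (`endpoint`, `file_ext`, `size_mb`), the payloads map directly onto GA4 custom event parameters.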

View File

@@ -1,5 +1,6 @@
import { useState, useEffect, useCallback, useRef } from 'react';
import { getTaskStatus, type TaskStatus, type TaskResult } from '@/services/api';
import { trackEvent } from '@/services/analytics';
interface UseTaskPollingOptions {
taskId: string | null;
@@ -54,22 +55,26 @@ export function useTaskPolling({
if (taskResult?.status === 'completed') {
setResult(taskResult);
trackEvent('task_completed', { task_id: taskId });
onComplete?.(taskResult);
} else {
const errMsg = taskResult?.error || 'Processing failed.';
setError(errMsg);
trackEvent('task_failed', { task_id: taskId, reason: 'result_failed' });
onError?.(errMsg);
}
} else if (taskStatus.state === 'FAILURE') {
stopPolling();
const errMsg = taskStatus.error || 'Task failed.';
setError(errMsg);
trackEvent('task_failed', { task_id: taskId, reason: 'state_failure' });
onError?.(errMsg);
}
} catch (err) {
stopPolling();
const errMsg = err instanceof Error ? err.message : 'Polling failed.';
setError(errMsg);
trackEvent('task_failed', { task_id: taskId, reason: 'polling_error' });
onError?.(errMsg);
}
};
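The branching above distinguishes three outcomes: a SUCCESS task whose result payload is `completed` or `failed` (the `result_failed` analytics reason), a terminal FAILURE state (`state_failure`), and everything else, which keeps polling. Restated as a pure classifier for clarity; state names follow Celery's conventions (the backend uses Celery), while the function name and return labels are illustrative:

```typescript
// Pure restatement of the terminal-state logic in useTaskPolling.
type CeleryState = 'PENDING' | 'STARTED' | 'SUCCESS' | 'FAILURE';

export function classifyPoll(
  state: CeleryState,
  resultStatus?: 'completed' | 'failed'
): 'task_completed' | 'task_failed' | 'keep_polling' {
  if (state === 'SUCCESS') {
    // A SUCCESS task can still carry a failed result payload.
    return resultStatus === 'completed' ? 'task_completed' : 'task_failed';
  }
  if (state === 'FAILURE') return 'task_failed';
  return 'keep_polling'; // PENDING / STARTED: poll again
}
```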

View File

@@ -18,6 +18,10 @@
"terms": "شروط الاستخدام",
"language": "اللغة",
"allTools": "كل الأدوات",
"account": "الحساب",
"signIn": "تسجيل الدخول",
"email": "البريد الإلكتروني",
"password": "كلمة المرور",
"darkMode": "الوضع الداكن",
"lightMode": "الوضع الفاتح"
},
@@ -30,10 +34,10 @@
"videoTools": "أدوات الفيديو",
"textTools": "أدوات النصوص",
"uploadCta": "اسحب ملفك هنا أو اضغط لاختياره",
"uploadOr": "ندعم: PDF, Word, JPG, PNG, WebP, MP4 — الحد الأقصى للحجم: 50 ميجابايت.",
"uploadSubtitle": "نستخرج معاينة سريعة ونعرض الأدوات المناسبة فوراً.",
"editNow": "حسّن ملف PDF الآن",
"editNowTooltip": "افتح أداة تحسين PDF السريعة لإنشاء نسخة نظيفة قابلة للتنزيل",
"suggestedTools": "الأدوات المقترحة لملفك",
"suggestedToolsDesc": "بعد رفع الملف سنعرض الأدوات المتوافقة تلقائيًا: تحرير نص، تمييز، دمج/تقسيم، ضغط، تحويل إلى Word/صورة، تحويل فيديو إلى GIF، والمزيد.",
"selectTool": "اختر أداة",
@@ -203,17 +207,17 @@
"topLeft": "أعلى اليسار"
},
"pdfEditor": {
"title": "تحسين PDF السريع",
"description": "أنشئ نسخة محسّنة ونظيفة من ملف PDF بضغطة واحدة مع الحفاظ على الملف الأصلي بدون تغيير.",
"shortDesc": "تحسين PDF",
"intro": "ارفع ملف PDF وأنشئ نسخة محسّنة جاهزة للمشاركة والتنزيل.",
"steps": {
"step1": "ارفع ملف PDF.",
"step2": "اضغط تحسين لإنشاء نسخة معالجة جديدة.",
"step3": "نزّل الملف الناتج أو شارك رابط التحميل."
},
"save": "تحسين وحفظ نسخة",
"saveTooltip": "إنشاء نسخة محسّنة من الملف",
"downloadFile": "تحميل الملف",
"downloadTooltip": "تنزيل PDF النهائي",
"undo": "تراجع",
@@ -224,7 +228,7 @@
"extractPage": "استخراج كملف جديد",
"thumbnails": "عرض الصفحات",
"share": "مشاركة",
"versionNote": "هذه الأداة تركّز حاليًا على تحسين ملف PDF وإخراج نسخة نظيفة. لا يتم تعديل الملف الأصلي أبدًا.",
"privacyNote": "ملفاتك محمية — نقوم بفحص الملفات أمنياً قبل المعالجة، ونستخدم اتصالاً مشفّراً (HTTPS). راجع سياسة الخصوصية للحصول على المزيد من التفاصيل.",
"preparingPreview": "جاري تجهيز المعاينة…",
"preparingPreviewSub": "قد يستغرق الأمر بضع ثوانٍ حسب حجم الملف.",
@@ -233,7 +237,7 @@
"savedSuccess": "تم حفظ التعديلات بنجاح — يمكنك الآن تنزيل الملف.",
"processingFailed": "فشل في معالجة الملف. جرّب إعادة التحميل أو حاول لاحقًا.",
"retry": "إعادة المحاولة",
"fileTooLarge": "حجم الملف أكبر من المسموح (20MB). قلِّل حجم الملف وحاول مرة أخرى."
},
"pdfFlowchart": {
"title": "PDF إلى مخطط انسيابي",
@@ -332,6 +336,58 @@
"sendMessage": "إرسال"
}
},
"account": {
"metaTitle": "الحساب",
"heroTitle": "احتفظ بنشاط ملفاتك داخل مساحة عمل آمنة واحدة",
"heroSubtitle": "أنشئ حسابًا مجانيًا للاحتفاظ بآخر التنزيلات، والعودة إلى المهام المكتملة، وبناء سجل فعلي لعمليات ملفاتك.",
"benefitsTitle": "لماذا تنشئ حسابًا",
"benefit1": "احتفظ بالملفات الناتجة الأخيرة في سجل واحد بدل فقدان الروابط بعد كل جلسة.",
"benefit2": "اعرف أي أداة أنتجت كل ملف حتى تصبح العمليات المتكررة أسرع وأقل عرضة للأخطاء.",
"benefit3": "جهّز مساحة عملك لحدود الاشتراك المستقبلية، والمعالجة المجمعة، والإعدادات المحفوظة.",
"loadFailed": "تعذر تحميل بيانات الحساب. حاول مرة أخرى.",
"passwordMismatch": "كلمتا المرور غير متطابقتين.",
"signInTitle": "سجّل الدخول إلى مساحة عملك",
"registerTitle": "أنشئ مساحة العمل المجانية",
"formSubtitle": "استخدم نفس الحساب عبر الجلسات حتى يبقى سجل الملفات الناتجة متاحًا لك.",
"createAccount": "إنشاء حساب",
"emailPlaceholder": "name@example.com",
"passwordPlaceholder": "أدخل كلمة مرور قوية",
"confirmPassword": "تأكيد كلمة المرور",
"confirmPasswordPlaceholder": "أعد إدخال كلمة المرور",
"submitLogin": "تسجيل الدخول",
"submitRegister": "إنشاء حساب مجاني",
"freePlanBadge": "الخطة المجانية",
"proPlanBadge": "خطة برو",
"signedInAs": "تم تسجيل الدخول باسم",
"currentPlan": "الخطة الحالية",
"logoutCta": "تسجيل الخروج",
"upgradeNotice": "تواصل معنا للترقية إلى خطة برو للحصول على حدود أعلى وإلغاء الإعلانات ووصول B2B API.",
"plans": {
"free": "مجاني",
"pro": "برو"
},
"webQuotaTitle": "مهام الويب هذا الشهر",
"apiQuotaTitle": "مهام API هذا الشهر",
"quotaPeriod": "الفترة",
"apiKeysTitle": "مفاتيح API",
"apiKeysSubtitle": "أدر مفاتيح B2B API. كل مفتاح يمنحك وصولاً متزامناً بمستوى برو لجميع الأدوات.",
"apiKeyNamePlaceholder": "اسم المفتاح (مثال: إنتاج)",
"apiKeyCreate": "إنشاء مفتاح",
"apiKeyCopyWarning": "انسخ هذا المفتاح الآن — لن يظهر مرة أخرى.",
"apiKeysEmpty": "لا توجد مفاتيح بعد. أنشئ مفتاحًا أعلاه.",
"apiKeyRevoked": "ملغي",
"apiKeyRevoke": "إلغاء المفتاح",
"historyTitle": "سجل الملفات الأخير",
"historySubtitle": "ستظهر هنا تلقائيًا كل المهام الناجحة أو الفاشلة المرتبطة بحسابك.",
"historyLoading": "جارٍ تحميل النشاط الأخير...",
"historyEmpty": "لا يوجد سجل ملفات بعد. عالج أي ملف أثناء تسجيل الدخول وسيظهر هنا.",
"downloadResult": "تحميل النتيجة",
"createdAt": "تاريخ الإنشاء",
"originalFile": "الملف الأصلي",
"outputFile": "الملف الناتج",
"statusCompleted": "مكتمل",
"statusFailed": "فشل"
},
"result": {
"conversionComplete": "اكتمل التحويل!",
"compressionComplete": "اكتمل الضغط!",

View File

@@ -18,6 +18,10 @@
"terms": "Terms of Service",
"language": "Language",
"allTools": "All Tools",
"account": "Account",
"signIn": "Sign In",
"email": "Email",
"password": "Password",
"darkMode": "Dark Mode",
"lightMode": "Light Mode"
},
@@ -30,10 +34,10 @@
"videoTools": "Video Tools",
"textTools": "Text Tools",
"uploadCta": "Drag your file here or click to browse",
"uploadOr": "Supported: PDF, Word, JPG, PNG, WebP, MP4 — Max size: 50 MB.",
"uploadSubtitle": "We generate a quick preview and instantly show matching tools.",
"editNow": "Optimize PDF Now",
"editNowTooltip": "Open quick PDF optimization for a cleaner downloadable copy",
"suggestedTools": "Suggested Tools for Your File",
"suggestedToolsDesc": "After uploading, we automatically show compatible tools: text editing, highlighting, merge/split, compress, convert to Word/image, video to GIF, and more.",
"selectTool": "Choose a Tool",
@@ -203,17 +207,17 @@
"topLeft": "Top Left"
},
"pdfEditor": {
"title": "Quick PDF Optimizer",
"description": "Create a cleaner, optimized copy of your PDF with one click while keeping the original untouched.",
"shortDesc": "Optimize PDF",
"intro": "Upload your PDF and generate an optimized copy ready for sharing and download.",
"steps": {
"step1": "Upload your PDF file.",
"step2": "Click optimize to create a fresh processed copy.",
"step3": "Download or share the generated file link."
},
"save": "Optimize & Save Copy",
"saveTooltip": "Create an optimized copy of the file",
"downloadFile": "Download File",
"downloadTooltip": "Download the final PDF",
"undo": "Undo",
@@ -224,7 +228,7 @@
"extractPage": "Extract as New File",
"thumbnails": "View Pages",
"share": "Share",
"versionNote": "This tool currently focuses on PDF optimization and clean output generation. The original file is never modified.",
"privacyNote": "Your files are protected — we perform security checks before processing and use encrypted connections (HTTPS). See our Privacy Policy for more details.",
"preparingPreview": "Preparing preview…",
"preparingPreviewSub": "This may take a few seconds depending on file size.",
@@ -233,7 +237,7 @@
"savedSuccess": "Changes saved successfully — you can now download the file.",
"processingFailed": "Failed to process the file. Try re-uploading or try again later.",
"retry": "Retry",
"fileTooLarge": "File size exceeds the limit (20MB). Please reduce the file size and try again."
},
"pdfFlowchart": {
"title": "PDF to Flowchart",
@@ -332,6 +336,58 @@
"sendMessage": "Send"
}
},
"account": {
"metaTitle": "Account",
"heroTitle": "Save your file activity in one secure workspace",
"heroSubtitle": "Create a free account to keep recent downloads, return to finished tasks, and build a usable history for your document workflow.",
"benefitsTitle": "Why create an account",
"benefit1": "Keep recent generated files in one timeline instead of losing links after each session.",
"benefit2": "See which tool produced each result so repeated work is faster and less error-prone.",
"benefit3": "Prepare your workspace for future premium limits, batch tools, and saved settings.",
"loadFailed": "We couldn't load your account data. Please try again.",
"passwordMismatch": "Passwords do not match.",
"signInTitle": "Sign in to your workspace",
"registerTitle": "Create your free workspace",
"formSubtitle": "Use the same account across sessions to keep your generated file history available.",
"createAccount": "Create Account",
"emailPlaceholder": "name@example.com",
"passwordPlaceholder": "Enter a strong password",
"confirmPassword": "Confirm Password",
"confirmPasswordPlaceholder": "Re-enter your password",
"submitLogin": "Sign In",
"submitRegister": "Create Free Account",
"freePlanBadge": "Free Plan",
"proPlanBadge": "Pro Plan",
"signedInAs": "Signed in as",
"currentPlan": "Current plan",
"logoutCta": "Sign Out",
"upgradeNotice": "Contact us to upgrade to Pro for higher limits, no ads, and B2B API access.",
"plans": {
"free": "Free",
"pro": "Pro"
},
"webQuotaTitle": "Web Tasks This Month",
"apiQuotaTitle": "API Tasks This Month",
"quotaPeriod": "Period",
"apiKeysTitle": "API Keys",
"apiKeysSubtitle": "Manage your B2B API keys. Each key gives Pro-level async access to all tools.",
"apiKeyNamePlaceholder": "Key name (e.g. Production)",
"apiKeyCreate": "Create Key",
"apiKeyCopyWarning": "Copy this key now — it will never be shown again.",
"apiKeysEmpty": "No API keys yet. Create one above.",
"apiKeyRevoked": "Revoked",
"apiKeyRevoke": "Revoke key",
"historyTitle": "Recent file history",
"historySubtitle": "Completed and failed tasks tied to your account appear here automatically.",
"historyLoading": "Loading recent activity...",
"historyEmpty": "No file history yet. Process a file while signed in and it will appear here.",
"downloadResult": "Download Result",
"createdAt": "Created",
"originalFile": "Original file",
"outputFile": "Output file",
"statusCompleted": "Completed",
"statusFailed": "Failed"
},
"result": {
"conversionComplete": "Conversion Complete!",
"compressionComplete": "Compression Complete!",

View File

@@ -18,6 +18,10 @@
"terms": "Conditions d'utilisation",
"language": "Langue",
"allTools": "Tous les outils",
"account": "Compte",
"signIn": "Se connecter",
"email": "E-mail",
"password": "Mot de passe",
"darkMode": "Mode sombre",
"lightMode": "Mode clair"
},
@@ -30,10 +34,10 @@
"videoTools": "Outils vidéo",
"textTools": "Outils de texte",
"uploadCta": "Glissez votre fichier ici ou cliquez pour parcourir",
"uploadOr": "Formats supportés : PDF, Word, JPG, PNG, WebP, MP4 — Taille max : 50 Mo.",
"uploadSubtitle": "Nous générons un aperçu rapide et affichons les outils adaptés instantanément.",
"editNow": "Optimiser le PDF maintenant",
"editNowTooltip": "Ouvrir l'optimiseur PDF rapide pour générer une copie propre téléchargeable",
"suggestedTools": "Outils suggérés pour votre fichier",
"suggestedToolsDesc": "Après le téléchargement, nous affichons automatiquement les outils compatibles : édition de texte, surlignage, fusion/division, compression, conversion en Word/image, vidéo en GIF, et plus.",
"selectTool": "Choisir un outil",
@@ -203,17 +207,17 @@
"topLeft": "Haut gauche"
},
"pdfEditor": {
"title": "Optimiseur PDF rapide",
"description": "Créez une copie PDF plus propre et optimisée en un clic, sans modifier le fichier original.",
"shortDesc": "Optimiser PDF",
"intro": "Téléchargez votre PDF et générez une copie optimisée prête à partager et à télécharger.",
"steps": {
"step1": "Téléchargez votre fichier PDF.",
"step2": "Cliquez sur optimiser pour créer une nouvelle copie traitée.",
"step3": "Téléchargez le fichier généré ou partagez son lien."
},
"save": "Optimiser et enregistrer",
"saveTooltip": "Créer une copie optimisée du fichier",
"downloadFile": "Télécharger le fichier",
"downloadTooltip": "Télécharger le PDF final",
"undo": "Annuler",
@@ -224,7 +228,7 @@
"extractPage": "Extraire comme nouveau fichier",
"thumbnails": "Voir les pages",
"share": "Partager",
"versionNote": "Cet outil se concentre actuellement sur l'optimisation PDF et la génération d'une copie propre. Le fichier original n'est jamais modifié.",
"privacyNote": "Vos fichiers sont protégés — nous effectuons des vérifications de sécurité avant le traitement et utilisons des connexions chiffrées (HTTPS). Consultez notre politique de confidentialité pour plus de détails.",
"preparingPreview": "Préparation de l'aperçu…",
"preparingPreviewSub": "Cela peut prendre quelques secondes selon la taille du fichier.",
@@ -233,7 +237,7 @@
"savedSuccess": "Modifications enregistrées avec succès — vous pouvez maintenant télécharger le fichier.",
"processingFailed": "Échec du traitement du fichier. Essayez de le re-télécharger ou réessayez plus tard.",
"retry": "Réessayer",
"fileTooLarge": "La taille du fichier dépasse la limite (20 Mo). Veuillez réduire la taille du fichier et réessayer."
},
"pdfFlowchart": {
"title": "PDF vers Organigramme",
@@ -332,6 +336,58 @@
"sendMessage": "Envoyer"
}
},
"account": {
"metaTitle": "Compte",
"heroTitle": "Conservez l'activité de vos fichiers dans un espace sécurisé",
"heroSubtitle": "Créez un compte gratuit pour retrouver vos téléchargements récents, revenir sur les tâches terminées et garder un historique utile de votre flux documentaire.",
"benefitsTitle": "Pourquoi créer un compte",
"benefit1": "Conservez les fichiers générés récents dans une seule chronologie au lieu de perdre les liens à chaque session.",
"benefit2": "Identifiez l'outil qui a produit chaque résultat pour accélérer les tâches répétitives et réduire les erreurs.",
"benefit3": "Préparez votre espace pour les futures limites premium, les traitements par lots et les préférences enregistrées.",
"loadFailed": "Impossible de charger les données du compte. Veuillez réessayer.",
"passwordMismatch": "Les mots de passe ne correspondent pas.",
"signInTitle": "Connectez-vous à votre espace",
"registerTitle": "Créez votre espace gratuit",
"formSubtitle": "Utilisez le même compte entre les sessions pour conserver l'historique de vos fichiers générés.",
"createAccount": "Créer un compte",
"emailPlaceholder": "nom@example.com",
"passwordPlaceholder": "Entrez un mot de passe fort",
"confirmPassword": "Confirmer le mot de passe",
"confirmPasswordPlaceholder": "Saisissez à nouveau votre mot de passe",
"submitLogin": "Se connecter",
"submitRegister": "Créer un compte gratuit",
"freePlanBadge": "Forfait gratuit",
"proPlanBadge": "Forfait Pro",
"signedInAs": "Connecté en tant que",
"currentPlan": "Forfait actuel",
"logoutCta": "Se déconnecter",
"upgradeNotice": "Contactez-nous pour passer au forfait Pro : limites plus élevées, sans publicité et accès API B2B.",
"plans": {
"free": "Gratuit",
"pro": "Pro"
},
"webQuotaTitle": "Tâches web ce mois-ci",
"apiQuotaTitle": "Tâches API ce mois-ci",
"quotaPeriod": "Période",
"apiKeysTitle": "Clés API",
"apiKeysSubtitle": "Gérez vos clés API B2B. Chaque clé donne un accès asynchrone Pro à tous les outils.",
"apiKeyNamePlaceholder": "Nom de la clé (ex. Production)",
"apiKeyCreate": "Créer une clé",
"apiKeyCopyWarning": "Copiez cette clé maintenant — elle ne sera plus affichée.",
"apiKeysEmpty": "Aucune clé API pour l'instant. Créez-en une ci-dessus.",
"apiKeyRevoked": "Révoquée",
"apiKeyRevoke": "Révoquer la clé",
"historyTitle": "Historique récent des fichiers",
"historySubtitle": "Les tâches réussies et échouées liées à votre compte apparaissent ici automatiquement.",
"historyLoading": "Chargement de l'activité récente...",
"historyEmpty": "Aucun historique pour l'instant. Traitez un fichier en étant connecté et il apparaîtra ici.",
"downloadResult": "Télécharger le résultat",
"createdAt": "Créé le",
"originalFile": "Fichier source",
"outputFile": "Fichier de sortie",
"statusCompleted": "Terminé",
"statusFailed": "Échec"
},
"result": {
"conversionComplete": "Conversion terminée !",
"compressionComplete": "Compression terminée !",

View File

@@ -1,5 +1,6 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { FILE_RETENTION_MINUTES } from '@/config/toolLimits';
export default function AboutPage() {
const { t } = useTranslation();
@@ -22,20 +23,21 @@ export default function AboutPage() {
<h2>Why use our tools?</h2>
<ul>
<li><strong>100% Free</strong> No hidden charges, no sign-up required.</li>
<li><strong>Private & Secure</strong> Files are auto-deleted within {FILE_RETENTION_MINUTES} minutes.</li>
<li><strong>Fast Processing</strong> Server-side processing for reliable results.</li>
<li><strong>Works Everywhere</strong> Desktop, tablet, or mobile.</li>
</ul>
<h2>Available Tools</h2>
<ul>
<li>PDF conversion tools (PDF↔Word)</li>
<li>PDF optimization and utility tools (compress, merge, split, rotate, page numbers)</li>
<li>PDF security tools (watermark, protect, unlock)</li>
<li>PDF/image conversion tools (PDF→Images, Images→PDF)</li>
<li>Image processing tools (convert, resize)</li>
<li>Video to GIF tool</li>
<li>Text tools (word counter, cleaner)</li>
<li>PDF to flowchart extraction tool</li>
</ul>
<h2>Contact</h2>

View File

@@ -0,0 +1,643 @@
import { useEffect, useMemo, useState, type FormEvent } from 'react';
import { Helmet } from 'react-helmet-async';
import { useTranslation } from 'react-i18next';
import {
BadgeCheck,
Check,
Copy,
Download,
FolderClock,
KeyRound,
LogOut,
ShieldCheck,
Sparkles,
Trash2,
UserRound,
Zap,
} from 'lucide-react';
import {
getHistory,
getUsage,
getApiKeys,
createApiKey,
revokeApiKey,
type HistoryEntry,
type UsageSummary,
type ApiKey,
} from '@/services/api';
import { useAuthStore } from '@/stores/authStore';
type AuthMode = 'login' | 'register';
const toolKeyMap: Record<string, string> = {
'pdf-to-word': 'tools.pdfToWord.title',
'word-to-pdf': 'tools.wordToPdf.title',
'compress-pdf': 'tools.compressPdf.title',
'image-convert': 'tools.imageConvert.title',
'image-resize': 'tools.imageConvert.title',
'video-to-gif': 'tools.videoToGif.title',
'merge-pdf': 'tools.mergePdf.title',
'split-pdf': 'tools.splitPdf.title',
'rotate-pdf': 'tools.rotatePdf.title',
'page-numbers': 'tools.pageNumbers.title',
'pdf-to-images': 'tools.pdfToImages.title',
'images-to-pdf': 'tools.imagesToPdf.title',
'watermark-pdf': 'tools.watermarkPdf.title',
'protect-pdf': 'tools.protectPdf.title',
'unlock-pdf': 'tools.unlockPdf.title',
'pdf-flowchart': 'tools.pdfFlowchart.title',
'pdf-flowchart-sample': 'tools.pdfFlowchart.title',
};
function formatHistoryTool(tool: string, t: (key: string) => string) {
const translationKey = toolKeyMap[tool];
return translationKey ? t(translationKey) : tool;
}
export default function AccountPage() {
const { t, i18n } = useTranslation();
const user = useAuthStore((state) => state.user);
const authLoading = useAuthStore((state) => state.isLoading);
const initialized = useAuthStore((state) => state.initialized);
const login = useAuthStore((state) => state.login);
const register = useAuthStore((state) => state.register);
const logout = useAuthStore((state) => state.logout);
const [mode, setMode] = useState<AuthMode>('login');
const [email, setEmail] = useState('');
const [password, setPassword] = useState('');
const [confirmPassword, setConfirmPassword] = useState('');
const [submitError, setSubmitError] = useState<string | null>(null);
const [historyItems, setHistoryItems] = useState<HistoryEntry[]>([]);
const [historyLoading, setHistoryLoading] = useState(false);
const [historyError, setHistoryError] = useState<string | null>(null);
// Usage summary state
const [usage, setUsage] = useState<UsageSummary | null>(null);
// API Keys state (pro only)
const [apiKeys, setApiKeys] = useState<ApiKey[]>([]);
const [apiKeysLoading, setApiKeysLoading] = useState(false);
const [newKeyName, setNewKeyName] = useState('');
const [newKeyCreating, setNewKeyCreating] = useState(false);
const [newKeyError, setNewKeyError] = useState<string | null>(null);
const [revealedKey, setRevealedKey] = useState<string | null>(null);
const [copiedKey, setCopiedKey] = useState(false);
const dateFormatter = useMemo(
() =>
new Intl.DateTimeFormat(i18n.language, {
dateStyle: 'medium',
timeStyle: 'short',
}),
[i18n.language]
);
useEffect(() => {
if (!user) {
setHistoryItems([]);
setHistoryError(null);
setUsage(null);
setApiKeys([]);
return;
}
const loadHistory = async () => {
setHistoryLoading(true);
setHistoryError(null);
try {
const items = await getHistory();
setHistoryItems(items);
} catch (error) {
setHistoryError(error instanceof Error ? error.message : t('account.loadFailed'));
} finally {
setHistoryLoading(false);
}
};
const loadUsage = async () => {
try {
const data = await getUsage();
setUsage(data);
} catch {
// non-critical, ignore
}
};
const loadApiKeys = async () => {
if (user.plan !== 'pro') return;
setApiKeysLoading(true);
try {
const keys = await getApiKeys();
setApiKeys(keys);
} catch {
// non-critical
} finally {
setApiKeysLoading(false);
}
};
void loadHistory();
void loadUsage();
void loadApiKeys();
}, [t, user]);
const handleSubmit = async (event: FormEvent<HTMLFormElement>) => {
event.preventDefault();
setSubmitError(null);
if (mode === 'register' && password !== confirmPassword) {
setSubmitError(t('account.passwordMismatch'));
return;
}
try {
if (mode === 'login') {
await login(email, password);
} else {
await register(email, password);
}
setPassword('');
setConfirmPassword('');
} catch (error) {
setSubmitError(error instanceof Error ? error.message : t('account.loadFailed'));
}
};
const handleLogout = async () => {
setSubmitError(null);
try {
await logout();
setHistoryItems([]);
setUsage(null);
setApiKeys([]);
} catch (error) {
setSubmitError(error instanceof Error ? error.message : t('account.loadFailed'));
}
};
const handleCreateApiKey = async (e: FormEvent<HTMLFormElement>) => {
e.preventDefault();
setNewKeyError(null);
const name = newKeyName.trim();
if (!name) return;
setNewKeyCreating(true);
try {
const key = await createApiKey(name);
setApiKeys((prev) => [key, ...prev]);
setRevealedKey(key.raw_key ?? null);
setNewKeyName('');
} catch (error) {
setNewKeyError(error instanceof Error ? error.message : t('account.loadFailed'));
} finally {
setNewKeyCreating(false);
}
};
const handleRevokeApiKey = async (keyId: number) => {
try {
await revokeApiKey(keyId);
setApiKeys((prev) =>
prev.map((k) =>
k.id === keyId ? { ...k, revoked_at: new Date().toISOString() } : k
)
);
} catch {
// ignore
}
};
const handleCopyKey = async () => {
if (!revealedKey) return;
await navigator.clipboard.writeText(revealedKey);
setCopiedKey(true);
setTimeout(() => setCopiedKey(false), 2000);
};
return (
<>
<Helmet>
<title>{t('account.metaTitle')} {t('common.appName')}</title>
<meta name="description" content={t('account.heroSubtitle')} />
</Helmet>
{!initialized && authLoading ? (
<div className="flex min-h-[40vh] items-center justify-center">
<div className="h-10 w-10 animate-spin rounded-full border-4 border-primary-200 border-t-primary-600 dark:border-primary-800 dark:border-t-primary-400" />
</div>
) : user ? (
<div className="space-y-8">
<section className="overflow-hidden rounded-[2rem] bg-gradient-to-br from-amber-100 via-orange-50 to-white p-8 shadow-sm ring-1 ring-amber-200 dark:from-amber-950/60 dark:via-slate-900 dark:to-slate-950 dark:ring-amber-900/50">
<div className="flex flex-col gap-6 lg:flex-row lg:items-center lg:justify-between">
<div className="max-w-2xl space-y-4">
<div className="inline-flex items-center gap-2 rounded-full bg-white/80 px-4 py-2 text-sm font-semibold text-amber-900 ring-1 ring-amber-200 dark:bg-amber-400/10 dark:text-amber-200 dark:ring-amber-700/40">
{user.plan === 'pro' ? <Zap className="h-4 w-4" /> : <BadgeCheck className="h-4 w-4" />}
{user.plan === 'pro' ? t('account.proPlanBadge') : t('account.freePlanBadge')}
</div>
<h1 className="text-3xl font-black tracking-tight text-slate-900 dark:text-white sm:text-4xl">
{t('account.heroTitle')}
</h1>
<p className="max-w-xl text-base leading-7 text-slate-600 dark:text-slate-300">
{t('account.heroSubtitle')}
</p>
{user.plan === 'free' && (
<p className="text-sm font-medium text-amber-700 dark:text-amber-300">
{t('account.upgradeNotice')}
</p>
)}
</div>
<div className="rounded-[1.5rem] bg-white/90 p-5 shadow-sm ring-1 ring-slate-200 dark:bg-slate-900/90 dark:ring-slate-800">
<div className="space-y-3">
<div className="flex items-center gap-3 text-slate-800 dark:text-slate-100">
<UserRound className="h-5 w-5 text-primary-600 dark:text-primary-400" />
<span className="text-sm font-medium">{t('account.signedInAs')}</span>
</div>
<p className="max-w-xs break-all text-lg font-semibold text-slate-900 dark:text-white">
{user.email}
</p>
<div className="flex items-center gap-2 text-sm text-slate-500 dark:text-slate-400">
<Sparkles className="h-4 w-4" />
<span>
{t('account.currentPlan')}: {user.plan === 'pro' ? t('account.plans.pro') : t('account.plans.free')}
</span>
</div>
<button type="button" onClick={handleLogout} className="btn-secondary w-full">
<LogOut className="h-4 w-4" />
{t('account.logoutCta')}
</button>
</div>
</div>
</div>
</section>
{/* Usage / Quota Cards */}
{usage && (
<section className="grid gap-4 sm:grid-cols-2">
<div className="card rounded-[1.5rem] p-5">
<p className="text-xs font-semibold uppercase tracking-widest text-slate-400 dark:text-slate-500">
{t('account.webQuotaTitle')}
</p>
<p className="mt-1 text-2xl font-bold text-slate-900 dark:text-white">
{usage.web_quota.used}
<span className="text-base font-normal text-slate-400"> / {usage.web_quota.limit ?? '∞'}</span>
</p>
{usage.web_quota.limit != null && (
<div className="mt-3 h-2 w-full overflow-hidden rounded-full bg-slate-200 dark:bg-slate-700">
<div
className="h-full rounded-full bg-primary-500 transition-all"
style={{ width: `${Math.min(100, (usage.web_quota.used / usage.web_quota.limit) * 100)}%` }}
/>
</div>
)}
<p className="mt-2 text-xs text-slate-400">{t('account.quotaPeriod')}: {usage.period_month}</p>
</div>
{usage.api_quota.limit != null && (
<div className="card rounded-[1.5rem] p-5">
<p className="text-xs font-semibold uppercase tracking-widest text-slate-400 dark:text-slate-500">
{t('account.apiQuotaTitle')}
</p>
<p className="mt-1 text-2xl font-bold text-slate-900 dark:text-white">
{usage.api_quota.used}
<span className="text-base font-normal text-slate-400"> / {usage.api_quota.limit}</span>
</p>
<div className="mt-3 h-2 w-full overflow-hidden rounded-full bg-slate-200 dark:bg-slate-700">
<div
className="h-full rounded-full bg-emerald-500 transition-all"
style={{ width: `${Math.min(100, (usage.api_quota.used / usage.api_quota.limit) * 100)}%` }}
/>
</div>
<p className="mt-2 text-xs text-slate-400">{t('account.quotaPeriod')}: {usage.period_month}</p>
</div>
)}
</section>
)}
{/* API Key Management — Pro only */}
{user.plan === 'pro' && (
<section className="card rounded-[2rem] p-0">
<div className="border-b border-slate-200 px-6 py-5 dark:border-slate-700">
<div className="flex items-center gap-3">
<KeyRound className="h-5 w-5 text-primary-600 dark:text-primary-400" />
<div>
<h2 className="text-xl font-semibold text-slate-900 dark:text-white">
{t('account.apiKeysTitle')}
</h2>
<p className="text-sm text-slate-500 dark:text-slate-400">
{t('account.apiKeysSubtitle')}
</p>
</div>
</div>
</div>
<div className="space-y-4 p-6">
{/* Create key form */}
<form onSubmit={handleCreateApiKey} className="flex gap-2">
<input
type="text"
value={newKeyName}
onChange={(e) => setNewKeyName(e.target.value)}
placeholder={t('account.apiKeyNamePlaceholder')}
maxLength={100}
className="input flex-1"
/>
<button type="submit" className="btn-primary" disabled={newKeyCreating || !newKeyName.trim()}>
{newKeyCreating ? '…' : t('account.apiKeyCreate')}
</button>
</form>
{newKeyError && (
<p className="text-sm text-red-600 dark:text-red-400">{newKeyError}</p>
)}
{/* Revealed key — shown once after creation */}
{revealedKey && (
<div className="flex items-center gap-3 rounded-xl border border-emerald-200 bg-emerald-50 px-4 py-3 dark:border-emerald-800/60 dark:bg-emerald-950/30">
<code className="flex-1 break-all font-mono text-xs text-emerald-800 dark:text-emerald-200">
{revealedKey}
</code>
<button type="button" onClick={handleCopyKey} className="shrink-0 text-emerald-700 dark:text-emerald-300">
{copiedKey ? <Check className="h-4 w-4" /> : <Copy className="h-4 w-4" />}
</button>
<button type="button" onClick={() => setRevealedKey(null)} className="shrink-0 text-slate-400 hover:text-slate-600">
×
</button>
</div>
)}
{revealedKey && (
<p className="text-xs text-amber-600 dark:text-amber-400">{t('account.apiKeyCopyWarning')}</p>
)}
{/* Key list */}
{apiKeysLoading ? (
<p className="text-sm text-slate-500">{t('account.historyLoading')}</p>
) : apiKeys.length === 0 ? (
<p className="text-sm text-slate-500 dark:text-slate-400">{t('account.apiKeysEmpty')}</p>
) : (
<ul className="space-y-2">
{apiKeys.map((key) => (
<li
key={key.id}
className={`flex items-center justify-between rounded-xl border px-4 py-3 ${
key.revoked_at
? 'border-slate-200 bg-slate-50 opacity-50 dark:border-slate-700 dark:bg-slate-900/40'
: 'border-slate-200 bg-white dark:border-slate-700 dark:bg-slate-900/70'
}`}
>
<div className="space-y-0.5">
<p className="text-sm font-semibold text-slate-900 dark:text-white">{key.name}</p>
<p className="font-mono text-xs text-slate-400">{key.key_prefix}</p>
{key.revoked_at && (
<p className="text-xs text-red-500">{t('account.apiKeyRevoked')}</p>
)}
</div>
{!key.revoked_at && (
<button
type="button"
onClick={() => handleRevokeApiKey(key.id)}
className="ml-4 text-slate-400 hover:text-red-500 dark:hover:text-red-400"
title={t('account.apiKeyRevoke')}
>
<Trash2 className="h-4 w-4" />
</button>
)}
</li>
))}
</ul>
)}
</div>
</section>
)}
<section className="card rounded-[2rem] p-0">
<div className="border-b border-slate-200 px-6 py-5 dark:border-slate-700">
<div className="flex items-center gap-3">
<FolderClock className="h-5 w-5 text-primary-600 dark:text-primary-400" />
<div>
<h2 className="text-xl font-semibold text-slate-900 dark:text-white">
{t('account.historyTitle')}
</h2>
<p className="text-sm text-slate-500 dark:text-slate-400">
{t('account.historySubtitle')}
</p>
</div>
</div>
</div>
<div className="space-y-4 p-6">
{historyLoading ? (
<p className="text-sm text-slate-500 dark:text-slate-400">{t('account.historyLoading')}</p>
) : historyError ? (
<div className="rounded-2xl border border-red-200 bg-red-50 px-4 py-3 text-sm text-red-700 dark:border-red-900/60 dark:bg-red-950/40 dark:text-red-300">
{historyError}
</div>
) : historyItems.length === 0 ? (
<div className="rounded-[1.5rem] border border-dashed border-slate-300 bg-slate-50 px-6 py-10 text-center dark:border-slate-700 dark:bg-slate-900/60">
<p className="text-base font-medium text-slate-700 dark:text-slate-200">{t('account.historyEmpty')}</p>
</div>
) : (
historyItems.map((item) => {
const metadataError =
typeof item.metadata?.error === 'string' ? item.metadata.error : null;
return (
<article
key={item.id}
className="rounded-[1.5rem] border border-slate-200 bg-white p-5 shadow-sm dark:border-slate-700 dark:bg-slate-900/70"
>
<div className="flex flex-col gap-4 sm:flex-row sm:items-start sm:justify-between">
<div className="space-y-2">
<p className="text-xs font-semibold uppercase tracking-[0.2em] text-slate-400 dark:text-slate-500">
{formatHistoryTool(item.tool, t)}
</p>
<h3 className="text-lg font-semibold text-slate-900 dark:text-white">
{item.output_filename || item.original_filename || formatHistoryTool(item.tool, t)}
</h3>
<p className="text-sm text-slate-500 dark:text-slate-400">
{t('account.createdAt')}: {dateFormatter.format(new Date(item.created_at))}
</p>
</div>
<span
className={`inline-flex rounded-full px-3 py-1 text-xs font-semibold ${
item.status === 'completed'
? 'bg-emerald-100 text-emerald-700 dark:bg-emerald-900/30 dark:text-emerald-300'
: 'bg-red-100 text-red-700 dark:bg-red-900/30 dark:text-red-300'
}`}
>
{item.status === 'completed'
? t('account.statusCompleted')
: t('account.statusFailed')}
</span>
</div>
<div className="mt-4 grid gap-3 text-sm text-slate-600 dark:text-slate-300 sm:grid-cols-2">
<div className="rounded-xl bg-slate-50 px-4 py-3 dark:bg-slate-800/80">
<p className="text-xs uppercase tracking-wide text-slate-400 dark:text-slate-500">
{t('account.originalFile')}
</p>
<p className="mt-1 break-all font-medium text-slate-800 dark:text-slate-100">
{item.original_filename || '—'}
</p>
</div>
<div className="rounded-xl bg-slate-50 px-4 py-3 dark:bg-slate-800/80">
<p className="text-xs uppercase tracking-wide text-slate-400 dark:text-slate-500">
{t('account.outputFile')}
</p>
<p className="mt-1 break-all font-medium text-slate-800 dark:text-slate-100">
{item.output_filename || '—'}
</p>
</div>
</div>
{metadataError ? (
<p className="mt-4 rounded-xl bg-red-50 px-4 py-3 text-sm text-red-700 dark:bg-red-950/40 dark:text-red-300">
{metadataError}
</p>
) : null}
{item.download_url && item.status === 'completed' ? (
<a href={item.download_url} className="btn-primary mt-4 inline-flex">
<Download className="h-4 w-4" />
{t('account.downloadResult')}
</a>
) : null}
</article>
);
})
)}
</div>
</section>
</div>
) : (
<div className="grid gap-8 lg:grid-cols-[1.1fr_0.9fr]">
<section className="overflow-hidden rounded-[2rem] bg-gradient-to-br from-cyan-100 via-white to-amber-50 p-8 shadow-sm ring-1 ring-cyan-200 dark:from-cyan-950/50 dark:via-slate-950 dark:to-amber-950/30 dark:ring-cyan-900/40">
<div className="max-w-xl space-y-5">
<div className="inline-flex items-center gap-2 rounded-full bg-white/80 px-4 py-2 text-sm font-semibold text-cyan-900 ring-1 ring-cyan-200 dark:bg-cyan-400/10 dark:text-cyan-200 dark:ring-cyan-700/40">
<ShieldCheck className="h-4 w-4" />
{t('account.benefitsTitle')}
</div>
<h1 className="text-3xl font-black tracking-tight text-slate-900 dark:text-white sm:text-4xl">
{t('account.heroTitle')}
</h1>
<p className="text-base leading-7 text-slate-600 dark:text-slate-300">
{t('account.heroSubtitle')}
</p>
</div>
<div className="mt-8 grid gap-4">
{[t('account.benefit1'), t('account.benefit2'), t('account.benefit3')].map((benefit) => (
<div
key={benefit}
className="flex items-start gap-3 rounded-[1.25rem] bg-white/80 px-4 py-4 shadow-sm ring-1 ring-white dark:bg-slate-900/80 dark:ring-slate-800"
>
<KeyRound className="mt-0.5 h-5 w-5 text-primary-600 dark:text-primary-400" />
<p className="text-sm font-medium leading-6 text-slate-700 dark:text-slate-200">{benefit}</p>
</div>
))}
</div>
</section>
<section className="overflow-hidden rounded-[2rem] bg-white shadow-sm ring-1 ring-slate-200 dark:bg-slate-900 dark:ring-slate-800">
<div className="grid grid-cols-2 border-b border-slate-200 dark:border-slate-800">
<button
type="button"
onClick={() => {
setMode('login');
setSubmitError(null);
}}
className={`px-5 py-4 text-sm font-semibold transition-colors ${
mode === 'login'
? 'bg-slate-900 text-white dark:bg-white dark:text-slate-900'
: 'text-slate-500 hover:bg-slate-50 dark:text-slate-400 dark:hover:bg-slate-800/70'
}`}
>
{t('common.signIn')}
</button>
<button
type="button"
onClick={() => {
setMode('register');
setSubmitError(null);
}}
className={`px-5 py-4 text-sm font-semibold transition-colors ${
mode === 'register'
? 'bg-slate-900 text-white dark:bg-white dark:text-slate-900'
: 'text-slate-500 hover:bg-slate-50 dark:text-slate-400 dark:hover:bg-slate-800/70'
}`}
>
{t('account.createAccount')}
</button>
</div>
<div className="p-6 sm:p-8">
<div className="mb-6">
<h2 className="text-2xl font-bold text-slate-900 dark:text-white">
{mode === 'login' ? t('account.signInTitle') : t('account.registerTitle')}
</h2>
<p className="mt-2 text-sm text-slate-500 dark:text-slate-400">
{t('account.formSubtitle')}
</p>
</div>
<form onSubmit={handleSubmit} className="space-y-4">
<label className="block">
<span className="mb-2 block text-sm font-medium text-slate-700 dark:text-slate-200">
{t('common.email')}
</span>
<input
type="email"
required
value={email}
onChange={(event) => setEmail(event.target.value)}
placeholder={t('account.emailPlaceholder')}
className="input-field"
/>
</label>
<label className="block">
<span className="mb-2 block text-sm font-medium text-slate-700 dark:text-slate-200">
{t('common.password')}
</span>
<input
type="password"
required
minLength={8}
value={password}
onChange={(event) => setPassword(event.target.value)}
placeholder={t('account.passwordPlaceholder')}
className="input-field"
/>
</label>
{mode === 'register' ? (
<label className="block">
<span className="mb-2 block text-sm font-medium text-slate-700 dark:text-slate-200">
{t('account.confirmPassword')}
</span>
<input
type="password"
required
minLength={8}
value={confirmPassword}
onChange={(event) => setConfirmPassword(event.target.value)}
placeholder={t('account.confirmPasswordPlaceholder')}
className="input-field"
/>
</label>
) : null}
{submitError ? (
<div className="rounded-xl border border-red-200 bg-red-50 px-4 py-3 text-sm text-red-700 dark:border-red-900/60 dark:bg-red-950/40 dark:text-red-300">
{submitError}
</div>
) : null}
<button type="submit" className="btn-primary w-full" disabled={authLoading}>
{mode === 'login' ? t('account.submitLogin') : t('account.submitRegister')}
</button>
</form>
</div>
</section>
</div>
)}
</>
);
}
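
The quota cards above compute their bar widths inline with `Math.min(100, (used / limit) * 100)`. A small sketch of that math as a standalone helper (the function name and null-handling convention are illustrative, not part of the component):

```typescript
// Hypothetical helper mirroring the inline quota-bar math in AccountPage:
// clamp the fill percentage to 100 and render no bar for unlimited plans.
function quotaPercent(used: number, limit: number | null): number {
  if (limit == null || limit <= 0) return 0; // unlimited or invalid limit → no bar
  return Math.min(100, (used / limit) * 100);
}

console.log(quotaPercent(5, 10)); // 50
console.log(quotaPercent(42, 10)); // 100 (clamped)
console.log(quotaPercent(3, null)); // 0
```

Extracting the clamp like this also makes the over-quota case (usage beyond the limit) easy to unit-test, which the inline style does not.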

View File

@@ -1,5 +1,8 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
+import { FILE_RETENTION_MINUTES } from '@/config/toolLimits';
+
+const LAST_UPDATED = '2026-03-06';

export default function PrivacyPage() {
const { t } = useTranslation();
@@ -13,7 +16,7 @@ export default function PrivacyPage() {
<div className="prose mx-auto max-w-2xl dark:prose-invert">
<h1>{t('common.privacy')}</h1>
-<p><em>Last updated: {new Date().toISOString().split('T')[0]}</em></p>
+<p><em>Last updated: {LAST_UPDATED}</em></p>

<h2>1. Data Collection</h2>
<p>
@@ -24,7 +27,7 @@ export default function PrivacyPage() {
<h2>2. File Processing & Storage</h2>
<ul>
<li>Uploaded files are processed on our secure servers.</li>
-<li>All uploaded and output files are <strong>automatically deleted within 2 hours</strong>.</li>
+<li>All uploaded and output files are <strong>automatically deleted within {FILE_RETENTION_MINUTES} minutes</strong>.</li>
<li>Files are stored in encrypted cloud storage during processing.</li>
<li>We do not access, read, or share the content of your files.</li>
</ul>

View File

@@ -1,5 +1,8 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
+import { FILE_RETENTION_MINUTES } from '@/config/toolLimits';
+
+const LAST_UPDATED = '2026-03-06';

export default function TermsPage() {
const { t } = useTranslation();
@@ -13,7 +16,7 @@ export default function TermsPage() {
<div className="prose mx-auto max-w-2xl dark:prose-invert">
<h1>{t('common.terms')}</h1>
-<p><em>Last updated: {new Date().toISOString().split('T')[0]}</em></p>
+<p><em>Last updated: {LAST_UPDATED}</em></p>

<h2>1. Acceptance of Terms</h2>
<p>
@@ -37,7 +40,7 @@ export default function TermsPage() {
<h2>4. File Handling</h2>
<ul>
-<li>All uploaded and processed files are automatically deleted within 2 hours.</li>
+<li>All uploaded and processed files are automatically deleted within {FILE_RETENTION_MINUTES} minutes.</li>
<li>We are not responsible for any data loss during processing.</li>
<li>You are responsible for maintaining your own file backups.</li>
</ul>

View File

@@ -0,0 +1,67 @@
type AnalyticsValue = string | number | boolean | undefined;
declare global {
interface Window {
dataLayer: unknown[];
gtag?: (...args: unknown[]) => void;
}
}
const GA_MEASUREMENT_ID = (import.meta.env.VITE_GA_MEASUREMENT_ID || '').trim();
let initialized = false;
function ensureGtagShim() {
window.dataLayer = window.dataLayer || [];
window.gtag =
window.gtag ||
function gtag() {
// gtag.js only processes `arguments` objects pushed onto dataLayer;
// pushing a plain rest-args array is silently ignored, so forward
// the `arguments` object itself, as in Google's stock snippet.
// eslint-disable-next-line prefer-rest-params
window.dataLayer.push(arguments);
};
}
function loadGaScript() {
if (!GA_MEASUREMENT_ID) return;
const existing = document.querySelector<HTMLScriptElement>(
`script[data-ga4-id="${GA_MEASUREMENT_ID}"]`
);
if (existing) return;
const script = document.createElement('script');
script.async = true;
script.src = `https://www.googletagmanager.com/gtag/js?id=${GA_MEASUREMENT_ID}`;
script.setAttribute('data-ga4-id', GA_MEASUREMENT_ID);
document.head.appendChild(script);
}
export function initAnalytics() {
if (initialized || !GA_MEASUREMENT_ID || typeof window === 'undefined') return;
ensureGtagShim();
loadGaScript();
window.gtag?.('js', new Date());
window.gtag?.('config', GA_MEASUREMENT_ID, { send_page_view: false });
initialized = true;
}
export function trackPageView(path: string) {
if (!initialized || !window.gtag) return;
window.gtag('event', 'page_view', {
page_path: path,
page_location: `${window.location.origin}${path}`,
page_title: document.title,
});
}
export function trackEvent(
eventName: string,
params: Record<string, AnalyticsValue> = {}
) {
if (!initialized || !window.gtag) return;
window.gtag('event', eventName, params);
}
export function analyticsEnabled() {
return Boolean(GA_MEASUREMENT_ID);
}
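
The shim above follows Google's standard pattern: any `gtag(...)` calls made before `gtag/js` finishes loading are queued on `window.dataLayer`, and the script drains the queue once it arrives (the stock snippet pushes the `arguments` object for each call). A dependency-free sketch of that queue behavior — this is an illustration of the semantics, not the module itself:

```typescript
// Standalone model of the dataLayer command queue used by gtag.js.
const dataLayer: IArguments[] = [];

function gtag(..._args: unknown[]) {
  // Google's snippet pushes the `arguments` object; gtag.js later replays
  // each queued entry as a command ('js', 'config', 'event', ...).
  // eslint-disable-next-line prefer-rest-params
  dataLayer.push(arguments);
}

// Commands issued before the script loads simply accumulate:
gtag('js', new Date());
gtag('config', 'G-XXXXXXXXXX', { send_page_view: false });

console.log(dataLayer.length); // 2
console.log(dataLayer[0][0]); // js
```

This is why `initAnalytics` can safely call `gtag('config', ...)` immediately after injecting the script tag: the command waits in the queue until gtag.js is ready.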

View File

@@ -103,14 +103,13 @@ describe('API Service — Endpoint Format Tests', () => {
// PDF Tools endpoints
// ----------------------------------------------------------
describe('PDF Tools API', () => {
-it('Merge: should POST multiple files to /api/pdf-tools/merge', () => {
+it('Merge: should POST multiple files to /pdf-tools/merge', () => {
-// MergePdf.tsx uses fetch('/api/pdf-tools/merge') directly, not api.post
const formData = new FormData();
formData.append('files', new Blob(['%PDF-1.4']), 'a.pdf');
formData.append('files', new Blob(['%PDF-1.4']), 'b.pdf');
-const url = '/api/pdf-tools/merge';
+const url = '/pdf-tools/merge';
-expect(url).toBe('/api/pdf-tools/merge');
+expect(url).toBe('/pdf-tools/merge');
expect(formData.getAll('files').length).toBe(2);
});

@@ -159,14 +158,13 @@ describe('API Service — Endpoint Format Tests', () => {
expect(formData.get('format')).toBe('png');
});

-it('Images to PDF: should POST multiple files to /api/pdf-tools/images-to-pdf', () => {
+it('Images to PDF: should POST multiple files to /pdf-tools/images-to-pdf', () => {
-// ImagesToPdf.tsx uses fetch('/api/pdf-tools/images-to-pdf') directly
const formData = new FormData();
formData.append('files', new Blob(['\x89PNG']), 'img1.png');
formData.append('files', new Blob(['\x89PNG']), 'img2.png');
-const url = '/api/pdf-tools/images-to-pdf';
+const url = '/pdf-tools/images-to-pdf';
-expect(url).toBe('/api/pdf-tools/images-to-pdf');
+expect(url).toBe('/pdf-tools/images-to-pdf');
expect(formData.getAll('files').length).toBe(2);
});

@@ -264,9 +262,8 @@ describe('Frontend Tool → Backend Endpoint Mapping', () => {
AddPageNumbers: { method: 'POST', endpoint: '/pdf-tools/page-numbers', fieldName: 'file' },
PdfToImages: { method: 'POST', endpoint: '/pdf-tools/pdf-to-images', fieldName: 'file' },
VideoToGif: { method: 'POST', endpoint: '/video/to-gif', fieldName: 'file' },
-// Multi-file tools use fetch() directly with full path:
-MergePdf: { method: 'POST', endpoint: '/api/pdf-tools/merge', fieldName: 'files' },
-ImagesToPdf: { method: 'POST', endpoint: '/api/pdf-tools/images-to-pdf', fieldName: 'files' },
+MergePdf: { method: 'POST', endpoint: '/pdf-tools/merge', fieldName: 'files' },
+ImagesToPdf: { method: 'POST', endpoint: '/pdf-tools/images-to-pdf', fieldName: 'files' },
};

Object.entries(toolEndpointMap).forEach(([tool, config]) => {
@@ -276,4 +273,4 @@ describe('Frontend Tool → Backend Endpoint Mapping', () => {
expect(config.fieldName).toMatch(/^(file|files)$/);
});
});
});

View File

@@ -3,6 +3,7 @@ import axios from 'axios';
const api = axios.create({
baseURL: '/api',
timeout: 120000, // 2 minute timeout for file processing
+withCredentials: true,
headers: {
Accept: 'application/json',
},
@@ -77,6 +78,38 @@ export interface TaskResult {
total_pages?: number;
}
export interface AuthUser {
id: number;
email: string;
plan: string;
created_at: string;
}
interface AuthResponse {
message: string;
user: AuthUser;
}
interface AuthSessionResponse {
authenticated: boolean;
user: AuthUser | null;
}
interface HistoryResponse {
items: HistoryEntry[];
}
export interface HistoryEntry {
id: number;
tool: string;
original_filename: string | null;
output_filename: string | null;
status: 'completed' | 'failed' | string;
download_url: string | null;
metadata: Record<string, unknown>;
created_at: string;
}
/**
* Upload a file and start a processing task.
*/
@@ -108,6 +141,87 @@ export async function uploadFile(
return response.data;
}
/**
* Upload multiple files and start a processing task.
*/
export async function uploadFiles(
endpoint: string,
files: File[],
fileField = 'files',
extraData?: Record<string, string>,
onProgress?: (percent: number) => void
): Promise<TaskResponse> {
const formData = new FormData();
files.forEach((file) => formData.append(fileField, file));
if (extraData) {
Object.entries(extraData).forEach(([key, value]) => {
formData.append(key, value);
});
}
const response = await api.post<TaskResponse>(endpoint, formData, {
headers: { 'Content-Type': 'multipart/form-data' },
onUploadProgress: (event) => {
if (event.total && onProgress) {
const percent = Math.round((event.loaded / event.total) * 100);
onProgress(percent);
}
},
});
return response.data;
}
/**
* Start a task endpoint that does not require file upload.
*/
export async function startTask(endpoint: string): Promise<TaskResponse> {
const response = await api.post<TaskResponse>(endpoint);
return response.data;
}
/**
* Create a new account and return the authenticated user.
*/
export async function registerUser(email: string, password: string): Promise<AuthUser> {
const response = await api.post<AuthResponse>('/auth/register', { email, password });
return response.data.user;
}
/**
* Sign in and return the authenticated user.
*/
export async function loginUser(email: string, password: string): Promise<AuthUser> {
const response = await api.post<AuthResponse>('/auth/login', { email, password });
return response.data.user;
}
/**
* End the current authenticated session.
*/
export async function logoutUser(): Promise<void> {
await api.post('/auth/logout');
}
/**
* Return the current authenticated user, if any.
*/
export async function getCurrentUser(): Promise<AuthUser | null> {
const response = await api.get<AuthSessionResponse>('/auth/me');
return response.data.user;
}
/**
* Return recent authenticated file history.
*/
export async function getHistory(limit = 50): Promise<HistoryEntry[]> {
const response = await api.get<HistoryResponse>('/history', {
params: { limit },
});
return response.data.items;
}
/**
* Poll task status.
*/
@@ -128,4 +242,63 @@ export async function checkHealth(): Promise<boolean> {
}
}
// --- Account / Usage / API Keys ---
export interface UsageSummary {
plan: string;
period_month: string;
ads_enabled: boolean;
history_limit: number;
file_limits_mb: {
pdf: number;
word: number;
image: number;
video: number;
homepageSmartUpload: number;
};
web_quota: { used: number; limit: number | null };
api_quota: { used: number; limit: number | null };
}
export interface ApiKey {
id: number;
name: string;
key_prefix: string;
last_used_at: string | null;
revoked_at: string | null;
created_at: string;
raw_key?: string; // only present on creation
}
/**
* Return the current user's plan, quota, and file-limit summary.
*/
export async function getUsage(): Promise<UsageSummary> {
const response = await api.get<UsageSummary>('/account/usage');
return response.data;
}
/**
* Return all API keys for the authenticated pro user.
*/
export async function getApiKeys(): Promise<ApiKey[]> {
const response = await api.get<{ items: ApiKey[] }>('/account/api-keys');
return response.data.items;
}
/**
* Create a new API key with the given name. Returns the key including raw_key once.
*/
export async function createApiKey(name: string): Promise<ApiKey> {
const response = await api.post<ApiKey>('/account/api-keys', { name });
return response.data;
}
/**
* Revoke one API key by id.
*/
export async function revokeApiKey(keyId: number): Promise<void> {
await api.delete(`/account/api-keys/${keyId}`);
}
export default api;

View File

@@ -0,0 +1,71 @@
import { create } from 'zustand';
import {
getCurrentUser,
loginUser,
logoutUser,
registerUser,
type AuthUser,
} from '@/services/api';
interface AuthState {
user: AuthUser | null;
isLoading: boolean;
initialized: boolean;
refreshUser: () => Promise<AuthUser | null>;
login: (email: string, password: string) => Promise<AuthUser>;
register: (email: string, password: string) => Promise<AuthUser>;
logout: () => Promise<void>;
}
export const useAuthStore = create<AuthState>((set) => ({
user: null,
isLoading: false,
initialized: false,
refreshUser: async () => {
set({ isLoading: true });
try {
const user = await getCurrentUser();
set({ user, isLoading: false, initialized: true });
return user;
} catch {
set({ user: null, isLoading: false, initialized: true });
return null;
}
},
login: async (email: string, password: string) => {
set({ isLoading: true });
try {
const user = await loginUser(email, password);
set({ user, isLoading: false, initialized: true });
return user;
} catch (error) {
set({ isLoading: false, initialized: true });
throw error;
}
},
register: async (email: string, password: string) => {
set({ isLoading: true });
try {
const user = await registerUser(email, password);
set({ user, isLoading: false, initialized: true });
return user;
} catch (error) {
set({ isLoading: false, initialized: true });
throw error;
}
},
logout: async () => {
set({ isLoading: true });
try {
await logoutUser();
set({ user: null, isLoading: false, initialized: true });
} catch (error) {
set({ isLoading: false });
throw error;
}
},
}));
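
Every action in the store above follows the same shape: `set({ isLoading: true })` on entry, then `set(...)` again once the request settles, re-throwing on failure so callers can surface the error. A dependency-free sketch of that pattern with a minimal zustand-like `create` — this simplified `create` is for illustration only, not zustand's real implementation (which adds selectors, subscriptions, and React bindings):

```typescript
// Minimal zustand-like store core: `create` hands the initializer a `set`
// that shallow-merges partial state, mirroring how the auth store updates.
type SetState<T> = (partial: Partial<T>) => void;

function create<T extends object>(init: (set: SetState<T>) => T): { getState: () => T } {
  let state: T;
  const set: SetState<T> = (partial) => {
    state = { ...state, ...partial };
  };
  state = init(set);
  return { getState: () => state };
}

interface DemoAuthState {
  user: string | null;
  isLoading: boolean;
  login: (email: string) => Promise<void>;
}

const store = create<DemoAuthState>((set) => ({
  user: null,
  isLoading: false,
  login: async (email) => {
    set({ isLoading: true }); // mark busy before the request
    await Promise.resolve(); // stand-in for the network call
    set({ user: email, isLoading: false }); // settle on success
  },
}));

store.getState().login('user@example.com').then(() => {
  console.log(store.getState().user); // user@example.com
  console.log(store.getState().isLoading); // false
});
```

Note that actions read fresh state through `getState()` rather than a captured snapshot, which is what keeps the loading flag consistent across overlapping calls.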

View File

@@ -17,7 +17,7 @@ server {
add_header X-XSS-Protection "1; mode=block" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Permissions-Policy "camera=(), microphone=(), geolocation=()" always;
-add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline' https://pagead2.googlesyndication.com; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com; font-src 'self' https://fonts.gstatic.com; img-src 'self' data: blob:; connect-src 'self'; frame-ancestors 'self'" always;
+add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline' https://pagead2.googlesyndication.com https://www.googletagmanager.com https://www.google-analytics.com; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com; font-src 'self' https://fonts.gstatic.com; img-src 'self' data: blob: https://pagead2.googlesyndication.com https://www.google-analytics.com; connect-src 'self' https://www.google-analytics.com https://pagead2.googlesyndication.com; frame-src https://googleads.g.doubleclick.net https://tpc.googlesyndication.com; frame-ancestors 'self'" always;

# API requests → Flask backend
location /api/ {

View File

@@ -30,7 +30,7 @@ server {
add_header X-XSS-Protection "1; mode=block" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Permissions-Policy "camera=(), microphone=(), geolocation=()" always;
-add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline' https://pagead2.googlesyndication.com; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com; font-src 'self' https://fonts.gstatic.com; img-src 'self' data: blob:; connect-src 'self'; frame-ancestors 'self'" always;
+add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline' https://pagead2.googlesyndication.com https://www.googletagmanager.com https://www.google-analytics.com; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com; font-src 'self' https://fonts.gstatic.com; img-src 'self' data: blob: https://pagead2.googlesyndication.com https://www.google-analytics.com; connect-src 'self' https://www.google-analytics.com https://pagead2.googlesyndication.com; frame-src https://googleads.g.doubleclick.net https://tpc.googlesyndication.com; frame-ancestors 'self'" always;

# Gzip
gzip on;