feat: Initialize frontend with React, Vite, and Tailwind CSS
- Set up main entry point for React application.
- Create About, Home, NotFound, Privacy, and Terms pages with SEO support.
- Implement API service for file uploads and task management.
- Add global styles using Tailwind CSS.
- Create utility functions for SEO and text processing.
- Configure Vite for development and production builds.
- Set up Nginx configuration for serving frontend and backend.
- Add scripts for cleanup of expired files and sitemap generation.
- Implement deployment script for production environment.
.env.example (new file, 29 lines)
@@ -0,0 +1,29 @@
# Flask
FLASK_ENV=development
FLASK_DEBUG=1
SECRET_KEY=change-me-in-production

# Redis
REDIS_URL=redis://redis:6379/0

# Celery
CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/1

# AWS S3
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_S3_BUCKET=saas-pdf-temp-files
AWS_S3_REGION=eu-west-1

# File Processing
MAX_CONTENT_LENGTH_MB=50
UPLOAD_FOLDER=/tmp/uploads
OUTPUT_FOLDER=/tmp/outputs
FILE_EXPIRY_SECONDS=1800

# CORS
CORS_ORIGINS=http://localhost:5173,http://localhost:3000

# AdSense
ADSENSE_CLIENT_ID=ca-pub-XXXXXXXXXXXXXXXX
.gitignore (vendored, new file, 54 lines)
@@ -0,0 +1,54 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
*.egg-info/
dist/
build/
*.egg
.eggs/
venv/
.venv/
env/

# Node
node_modules/
frontend/dist/
frontend/build/
.npm
*.tsbuildinfo

# Environment
.env
.env.local
.env.production

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db
desktop.ini

# Docker
docker-compose.override.yml

# Uploads & temp files
uploads/
tmp/
*.tmp

# Logs
*.log
logs/

# Coverage
htmlcov/
.coverage
coverage/
README.md (new file, 66 lines)
@@ -0,0 +1,66 @@
# SaaS-PDF — Free Online Tools Platform

A free SaaS platform offering PDF, image, video, and text processing tools. Built with **Python Flask** (backend) and **React + Vite** (frontend), powered by **Celery + Redis** for async processing, and deployed on **AWS**.

## 🛠 Tools (MVP)

1. **PDF to Word / Word to PDF** — Convert between PDF and Word documents
2. **PDF Compressor** — Reduce PDF file size with quality options
3. **Image Converter** — Convert between JPG, PNG, WebP formats
4. **Video to GIF** — Create animated GIFs from video clips
5. **Text Tools** — Word counter, text cleaner, case converter (client-side)

## 🏗 Tech Stack

| Layer | Technology |
|-------|-----------|
| Backend API | Python 3.12 + Flask 3.x |
| Task Queue | Celery 5.x + Redis |
| File Processing | LibreOffice, Ghostscript, Pillow, ffmpeg |
| Frontend | React 18 + Vite 5 + TypeScript |
| Styling | Tailwind CSS (RTL support) |
| i18n | react-i18next (Arabic + English) |
| Storage | AWS S3 (temp files with auto-cleanup) |
| CDN | AWS CloudFront |
| Server | AWS EC2 + Nginx |

## 🚀 Quick Start (Development)

```bash
# 1. Clone the repo
git clone https://github.com/aborayan2022/SaaS-PDF.git
cd SaaS-PDF

# 2. Copy environment file
cp .env.example .env

# 3. Start all services with Docker
docker-compose up --build

# 4. Access the app
# Frontend: http://localhost:5173
# Backend API: http://localhost:5000/api
# Celery Flower: http://localhost:5555
```

## 📁 Project Structure

```
SaaS-PDF/
├── backend/     # Flask API + Celery Workers
├── frontend/    # React + Vite + TypeScript
├── nginx/       # Reverse proxy configuration
├── scripts/     # Deployment & maintenance scripts
├── docs/        # Project documentation
├── docker-compose.yml
└── docker-compose.prod.yml
```

## 💰 Revenue Model

- **Google AdSense** — Ads on result/download pages
- **Freemium** (planned) — Pro features: no ads, higher limits, API access

## 📄 License

MIT
backend/Dockerfile (new file, 41 lines)
@@ -0,0 +1,41 @@
FROM python:3.12-slim-bookworm

# Prevent interactive prompts during package installation
ENV DEBIAN_FRONTEND=noninteractive

# Install system dependencies for file processing
RUN apt-get update && apt-get install -y --no-install-recommends \
    libreoffice-core \
    libreoffice-writer \
    libreoffice-calc \
    libreoffice-draw \
    ghostscript \
    ffmpeg \
    libmagic1 \
    imagemagick \
    curl \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Copy requirements first for Docker layer caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY . .

# Create temp directories
RUN mkdir -p /tmp/uploads /tmp/outputs

# Expose port
EXPOSE 5000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --retries=3 \
    CMD curl -f http://localhost:5000/api/health || exit 1

# Run with Gunicorn
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "--workers", "4", "--timeout", "120", "wsgi:app"]
backend/app/__init__.py (new file, 73 lines)
@@ -0,0 +1,73 @@
"""Flask Application Factory."""
import os

from flask import Flask

from config import config
from app.extensions import cors, limiter, talisman, init_celery


def create_app(config_name=None):
    """Create and configure the Flask application."""
    if config_name is None:
        config_name = os.getenv("FLASK_ENV", "development")

    app = Flask(__name__)
    app.config.from_object(config[config_name])

    # Create upload/output directories
    os.makedirs(app.config["UPLOAD_FOLDER"], exist_ok=True)
    os.makedirs(app.config["OUTPUT_FOLDER"], exist_ok=True)

    # Initialize extensions
    cors.init_app(app, origins=app.config["CORS_ORIGINS"])

    limiter.init_app(app)

    # Talisman security headers (relaxed CSP for AdSense)
    csp = {
        "default-src": "'self'",
        "script-src": [
            "'self'",
            "'unsafe-inline'",
            "https://pagead2.googlesyndication.com",
            "https://www.googletagmanager.com",
            "https://www.google-analytics.com",
        ],
        "style-src": ["'self'", "'unsafe-inline'", "https://fonts.googleapis.com"],
        "font-src": ["'self'", "https://fonts.gstatic.com"],
        "img-src": ["'self'", "data:", "https://pagead2.googlesyndication.com"],
        "frame-src": ["https://googleads.g.doubleclick.net"],
        "connect-src": [
            "'self'",
            "https://www.google-analytics.com",
            "https://*.amazonaws.com",
        ],
    }
    talisman.init_app(
        app,
        content_security_policy=csp,
        force_https=config_name == "production",
    )

    # Initialize Celery
    init_celery(app)

    # Register blueprints
    from app.routes.health import health_bp
    from app.routes.convert import convert_bp
    from app.routes.compress import compress_bp
    from app.routes.image import image_bp
    from app.routes.video import video_bp
    from app.routes.tasks import tasks_bp
    from app.routes.download import download_bp

    app.register_blueprint(health_bp, url_prefix="/api")
    app.register_blueprint(convert_bp, url_prefix="/api/convert")
    app.register_blueprint(compress_bp, url_prefix="/api/compress")
    app.register_blueprint(image_bp, url_prefix="/api/image")
    app.register_blueprint(video_bp, url_prefix="/api/video")
    app.register_blueprint(tasks_bp, url_prefix="/api/tasks")
    app.register_blueprint(download_bp, url_prefix="/api/download")

    return app
backend/app/extensions.py (new file, 43 lines)
@@ -0,0 +1,43 @@
"""Flask extensions initialization."""
from celery import Celery
from flask_cors import CORS
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address
from flask_talisman import Talisman


# Initialize extensions (will be bound to app in create_app)
cors = CORS()
limiter = Limiter(key_func=get_remote_address)
talisman = Talisman()
celery = Celery()


def init_celery(app):
    """Initialize Celery with Flask app context."""
    celery.conf.broker_url = app.config["CELERY_BROKER_URL"]
    celery.conf.result_backend = app.config["CELERY_RESULT_BACKEND"]
    celery.conf.result_expires = app.config.get("FILE_EXPIRY_SECONDS", 1800)
    celery.conf.task_serializer = "json"
    celery.conf.result_serializer = "json"
    celery.conf.accept_content = ["json"]
    celery.conf.timezone = "UTC"
    celery.conf.task_track_started = True

    # Set task routes
    celery.conf.task_routes = {
        "app.tasks.convert_tasks.*": {"queue": "convert"},
        "app.tasks.compress_tasks.*": {"queue": "compress"},
        "app.tasks.image_tasks.*": {"queue": "image"},
        "app.tasks.video_tasks.*": {"queue": "video"},
    }

    # Defined inside init_celery so the class closes over `app`
    class ContextTask(celery.Task):
        """Make Celery tasks work with Flask app context."""
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery
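The ContextTask pattern above only works because the class is defined inside init_celery, where it closes over the `app` argument. A dependency-free sketch of that closure mechanism (StubTask and bind_context are illustrative names, not part of the codebase):

```python
class StubTask:
    """Minimal stand-in for celery.Task — illustrative only."""
    def run(self, x):
        return x * 2


def bind_context(app):
    """Mirror of init_celery's structure: the nested class closes over `app`."""
    class ContextTask(StubTask):
        def __call__(self, *args, **kwargs):
            # Stand-in for `with app.app_context():` — record that the
            # captured app was touched before running the task body.
            app["entered"] = app.get("entered", 0) + 1
            return self.run(*args, **kwargs)
    return ContextTask


app = {"name": "demo"}
task = bind_context(app)()
result = task(21)  # runs StubTask.run with access to the captured `app`
```

If ContextTask were defined at module level instead, `app` would be undefined when a task runs, which is why the class lives inside the init function.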
backend/app/middleware/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
"""Backend application middleware."""
backend/app/middleware/rate_limiter.py (new file, 18 lines)
@@ -0,0 +1,18 @@
"""Rate limiting middleware configuration."""
from app.extensions import limiter


# Custom rate limits for specific operations
UPLOAD_LIMIT = "10/minute"
DOWNLOAD_LIMIT = "30/minute"
API_LIMIT = "100/hour"


def get_upload_limit():
    """Get the rate limit for file upload endpoints."""
    return UPLOAD_LIMIT


def get_download_limit():
    """Get the rate limit for file download endpoints."""
    return DOWNLOAD_LIMIT
backend/app/routes/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
"""Backend application routes."""
backend/app/routes/compress.py (new file, 47 lines)
@@ -0,0 +1,47 @@
"""PDF compression routes."""
from flask import Blueprint, request, jsonify

from app.extensions import limiter
from app.utils.file_validator import validate_file, FileValidationError
from app.utils.sanitizer import generate_safe_path
from app.tasks.compress_tasks import compress_pdf_task

compress_bp = Blueprint("compress", __name__)


@compress_bp.route("/pdf", methods=["POST"])
@limiter.limit("10/minute")
def compress_pdf_route():
    """
    Compress a PDF file.

    Accepts: multipart/form-data with 'file' field (PDF)
    Optional form field 'quality': "low", "medium", "high" (default: "medium")
    Returns: JSON with task_id for polling
    """
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400

    file = request.files["file"]
    quality = request.form.get("quality", "medium")

    # Validate quality parameter
    if quality not in ("low", "medium", "high"):
        quality = "medium"

    try:
        original_filename, ext = validate_file(file, allowed_types=["pdf"])
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code

    # Save file to temp location
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)

    # Dispatch async task
    task = compress_pdf_task.delay(input_path, task_id, original_filename, quality)

    return jsonify({
        "task_id": task.id,
        "message": "Compression started. Poll /api/tasks/{task_id}/status for progress.",
    }), 202
backend/app/routes/convert.py (new file, 73 lines)
@@ -0,0 +1,73 @@
"""PDF conversion routes (PDF↔Word)."""
from flask import Blueprint, request, jsonify

from app.extensions import limiter
from app.utils.file_validator import validate_file, FileValidationError
from app.utils.sanitizer import generate_safe_path
from app.tasks.convert_tasks import convert_pdf_to_word, convert_word_to_pdf

convert_bp = Blueprint("convert", __name__)


@convert_bp.route("/pdf-to-word", methods=["POST"])
@limiter.limit("10/minute")
def pdf_to_word_route():
    """
    Convert a PDF file to Word (DOCX).

    Accepts: multipart/form-data with 'file' field (PDF)
    Returns: JSON with task_id for polling
    """
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400

    file = request.files["file"]

    try:
        original_filename, ext = validate_file(file, allowed_types=["pdf"])
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code

    # Save file to temp location
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)

    # Dispatch async task
    task = convert_pdf_to_word.delay(input_path, task_id, original_filename)

    return jsonify({
        "task_id": task.id,
        "message": "Conversion started. Poll /api/tasks/{task_id}/status for progress.",
    }), 202


@convert_bp.route("/word-to-pdf", methods=["POST"])
@limiter.limit("10/minute")
def word_to_pdf_route():
    """
    Convert a Word (DOC/DOCX) file to PDF.

    Accepts: multipart/form-data with 'file' field (DOC/DOCX)
    Returns: JSON with task_id for polling
    """
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400

    file = request.files["file"]

    try:
        original_filename, ext = validate_file(
            file, allowed_types=["doc", "docx"]
        )
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code

    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)

    task = convert_word_to_pdf.delay(input_path, task_id, original_filename)

    return jsonify({
        "task_id": task.id,
        "message": "Conversion started. Poll /api/tasks/{task_id}/status for progress.",
    }), 202
backend/app/routes/download.py (new file, 35 lines)
@@ -0,0 +1,35 @@
"""Local file download route — used when S3 is not configured."""
import os

from flask import Blueprint, send_file, abort, request, current_app

download_bp = Blueprint("download", __name__)


@download_bp.route("/<task_id>/<filename>", methods=["GET"])
def download_file(task_id: str, filename: str):
    """
    Serve a processed file from local filesystem.

    Only active in development (when S3 is not configured).
    """
    # Security: sanitize inputs
    # Only allow UUID-style task IDs and safe filenames
    if ".." in task_id or "/" in task_id or "\\" in task_id:
        abort(400, "Invalid task ID.")
    if ".." in filename or "/" in filename or "\\" in filename:
        abort(400, "Invalid filename.")

    output_dir = current_app.config["OUTPUT_FOLDER"]
    file_path = os.path.join(output_dir, task_id, filename)

    if not os.path.isfile(file_path):
        abort(404, "File not found or expired.")

    download_name = request.args.get("name", filename)

    return send_file(
        file_path,
        as_attachment=True,
        download_name=download_name,
    )
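The traversal guards in download_file can be read as one predicate applied to each URL component. A standalone sketch of that check (is_safe_component is a hypothetical helper name, not in the codebase):

```python
def is_safe_component(value: str) -> bool:
    """Reject path components that could escape the output directory.

    Mirrors the checks in download_file: no parent-directory references
    and no path separators (forward or backward slash).
    """
    return ".." not in value and "/" not in value and "\\" not in value
```

Anything failing this predicate is rejected with 400 before the path is ever joined, so os.path.join can only produce paths under OUTPUT_FOLDER.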
backend/app/routes/health.py (new file, 14 lines)
@@ -0,0 +1,14 @@
"""Health check endpoint."""
from flask import Blueprint, jsonify

health_bp = Blueprint("health", __name__)


@health_bp.route("/health", methods=["GET"])
def health_check():
    """Simple health check — returns 200 if the service is running."""
    return jsonify({
        "status": "healthy",
        "service": "SaaS-PDF API",
        "version": "1.0.0",
    })
backend/app/routes/image.py (new file, 122 lines)
@@ -0,0 +1,122 @@
"""Image processing routes."""
from flask import Blueprint, request, jsonify

from app.extensions import limiter
from app.utils.file_validator import validate_file, FileValidationError
from app.utils.sanitizer import generate_safe_path
from app.tasks.image_tasks import convert_image_task, resize_image_task

image_bp = Blueprint("image", __name__)

ALLOWED_IMAGE_TYPES = ["png", "jpg", "jpeg", "webp"]
ALLOWED_OUTPUT_FORMATS = ["jpg", "png", "webp"]


@image_bp.route("/convert", methods=["POST"])
@limiter.limit("10/minute")
def convert_image_route():
    """
    Convert an image to a different format.

    Accepts: multipart/form-data with:
        - 'file': Image file (PNG, JPG, JPEG, WebP)
        - 'format': Target format ("jpg", "png", "webp")
        - 'quality' (optional): Quality 1-100 (default: 85)
    Returns: JSON with task_id for polling
    """
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400

    file = request.files["file"]
    output_format = request.form.get("format", "").lower()
    quality = request.form.get("quality", "85")

    # Validate output format
    if output_format not in ALLOWED_OUTPUT_FORMATS:
        return jsonify({
            "error": f"Invalid format. Supported: {', '.join(ALLOWED_OUTPUT_FORMATS)}"
        }), 400

    # Validate quality
    try:
        quality = max(1, min(100, int(quality)))
    except ValueError:
        quality = 85

    try:
        original_filename, ext = validate_file(file, allowed_types=ALLOWED_IMAGE_TYPES)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code

    # Save file
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)

    # Dispatch task
    task = convert_image_task.delay(
        input_path, task_id, original_filename, output_format, quality
    )

    return jsonify({
        "task_id": task.id,
        "message": "Image conversion started. Poll /api/tasks/{task_id}/status for progress.",
    }), 202


@image_bp.route("/resize", methods=["POST"])
@limiter.limit("10/minute")
def resize_image_route():
    """
    Resize an image.

    Accepts: multipart/form-data with:
        - 'file': Image file
        - 'width' (optional): Target width
        - 'height' (optional): Target height
        - 'quality' (optional): Quality 1-100 (default: 85)
    Returns: JSON with task_id for polling
    """
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400

    file = request.files["file"]
    width = request.form.get("width")
    height = request.form.get("height")
    quality = request.form.get("quality", "85")

    # Validate dimensions
    try:
        width = int(width) if width else None
        height = int(height) if height else None
    except ValueError:
        return jsonify({"error": "Width and height must be integers."}), 400

    if width is None and height is None:
        return jsonify({"error": "At least one of width or height is required."}), 400

    if width and (width < 1 or width > 10000):
        return jsonify({"error": "Width must be between 1 and 10000."}), 400
    if height and (height < 1 or height > 10000):
        return jsonify({"error": "Height must be between 1 and 10000."}), 400

    try:
        quality = max(1, min(100, int(quality)))
    except ValueError:
        quality = 85

    try:
        original_filename, ext = validate_file(file, allowed_types=ALLOWED_IMAGE_TYPES)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code

    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)

    task = resize_image_task.delay(
        input_path, task_id, original_filename, width, height, quality
    )

    return jsonify({
        "task_id": task.id,
        "message": "Image resize started. Poll /api/tasks/{task_id}/status for progress.",
    }), 202
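Both image routes repeat the same clamp-or-default handling of the 'quality' form field. A standalone sketch of that logic (clamp_quality is a hypothetical helper name, not in the codebase):

```python
def clamp_quality(raw, default=85):
    """Clamp a quality form value to the 1-100 range, falling back to the default.

    Mirrors the try/except around int(quality) in the image routes; also
    catches TypeError so a missing (None) value falls back rather than raising.
    """
    try:
        return max(1, min(100, int(raw)))
    except (TypeError, ValueError):
        return default
```

Out-of-range values are clamped rather than rejected, so a request with quality=500 still succeeds with quality 100.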
backend/app/routes/tasks.py (new file, 39 lines)
@@ -0,0 +1,39 @@
"""Task status polling endpoint."""
from flask import Blueprint, jsonify
from celery.result import AsyncResult

from app.extensions import celery

tasks_bp = Blueprint("tasks", __name__)


@tasks_bp.route("/<task_id>/status", methods=["GET"])
def get_task_status(task_id: str):
    """
    Get the status of an async task.

    Returns:
        JSON with task state and result (if completed)
    """
    result = AsyncResult(task_id, app=celery)

    response = {
        "task_id": task_id,
        "state": result.state,
    }

    if result.state == "PENDING":
        response["progress"] = "Task is waiting in queue..."

    elif result.state == "PROCESSING":
        meta = result.info or {}
        response["progress"] = meta.get("step", "Processing...")

    elif result.state == "SUCCESS":
        task_result = result.result or {}
        response["result"] = task_result

    elif result.state == "FAILURE":
        response["error"] = str(result.info) if result.info else "Task failed."

    return jsonify(response)
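A client consumes this endpoint by polling until it sees a terminal state. A minimal sketch of such a loop, with the HTTP call abstracted behind a callable so the shape is clear (poll_task and the simulated responses are illustrative, not part of the codebase):

```python
import time


def poll_task(fetch_status, task_id, interval=0.0, max_polls=50):
    """Poll a tasks-status endpoint until a terminal state is reached.

    `fetch_status` is any callable returning the JSON dict the endpoint
    emits; in a real client it would wrap requests.get(...).json().
    """
    for _ in range(max_polls):
        status = fetch_status(task_id)
        if status["state"] == "SUCCESS":
            return status["result"]
        if status["state"] == "FAILURE":
            raise RuntimeError(status.get("error", "Task failed."))
        time.sleep(interval)
    raise TimeoutError(f"Task {task_id} did not finish in time.")


# Simulated responses standing in for the live endpoint:
responses = iter([
    {"state": "PENDING", "progress": "Task is waiting in queue..."},
    {"state": "PROCESSING", "progress": "Compressing..."},
    {"state": "SUCCESS", "result": {"reduction_percent": 40.0}},
])
outcome = poll_task(lambda _tid: next(responses), "abc123")
```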
backend/app/routes/video.py (new file, 70 lines)
@@ -0,0 +1,70 @@
"""Video processing routes."""
from flask import Blueprint, request, jsonify

from app.extensions import limiter
from app.utils.file_validator import validate_file, FileValidationError
from app.utils.sanitizer import generate_safe_path
from app.tasks.video_tasks import create_gif_task

video_bp = Blueprint("video", __name__)

ALLOWED_VIDEO_TYPES = ["mp4", "webm"]


@video_bp.route("/to-gif", methods=["POST"])
@limiter.limit("5/minute")
def video_to_gif_route():
    """
    Convert a video clip to an animated GIF.

    Accepts: multipart/form-data with:
        - 'file': Video file (MP4, WebM, max 50MB)
        - 'start_time' (optional): Start time in seconds (default: 0)
        - 'duration' (optional): Duration in seconds, max 15 (default: 5)
        - 'fps' (optional): Frames per second, max 20 (default: 10)
        - 'width' (optional): Output width, max 640 (default: 480)
    Returns: JSON with task_id for polling
    """
    if "file" not in request.files:
        return jsonify({"error": "No file provided."}), 400

    file = request.files["file"]

    # Parse and validate parameters
    try:
        start_time = float(request.form.get("start_time", 0))
        duration = float(request.form.get("duration", 5))
        fps = int(request.form.get("fps", 10))
        width = int(request.form.get("width", 480))
    except (ValueError, TypeError):
        return jsonify({"error": "Invalid parameters. Must be numeric."}), 400

    # Enforce limits
    if start_time < 0:
        return jsonify({"error": "Start time cannot be negative."}), 400
    if duration < 0.5 or duration > 15:
        return jsonify({"error": "Duration must be between 0.5 and 15 seconds."}), 400
    if fps < 1 or fps > 20:
        return jsonify({"error": "FPS must be between 1 and 20."}), 400
    if width < 100 or width > 640:
        return jsonify({"error": "Width must be between 100 and 640 pixels."}), 400

    try:
        original_filename, ext = validate_file(file, allowed_types=ALLOWED_VIDEO_TYPES)
    except FileValidationError as e:
        return jsonify({"error": e.message}), e.code

    # Save file
    task_id, input_path = generate_safe_path(ext, folder_type="upload")
    file.save(input_path)

    # Dispatch task
    task = create_gif_task.delay(
        input_path, task_id, original_filename,
        start_time, duration, fps, width,
    )

    return jsonify({
        "task_id": task.id,
        "message": "GIF creation started. Poll /api/tasks/{task_id}/status for progress.",
    }), 202
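The limit checks in video_to_gif_route can be summarized as a pure function returning the first violated rule's error message. A sketch under that framing (validate_gif_params is a hypothetical name; the messages are the ones the route returns):

```python
def validate_gif_params(start_time, duration, fps, width):
    """Return the first limit-violation message, or None if all params are valid."""
    if start_time < 0:
        return "Start time cannot be negative."
    if duration < 0.5 or duration > 15:
        return "Duration must be between 0.5 and 15 seconds."
    if fps < 1 or fps > 20:
        return "FPS must be between 1 and 20."
    if width < 100 or width > 640:
        return "Width must be between 100 and 640 pixels."
    return None
```

The route turns a non-None result into a 400 JSON response; None means the request proceeds to file validation.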
backend/app/services/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
"""Backend application services."""
109
backend/app/services/compress_service.py
Normal file
109
backend/app/services/compress_service.py
Normal file
@@ -0,0 +1,109 @@
|
|||||||
|
"""PDF compression service using Ghostscript."""
|
||||||
|
import os
|
||||||
|
import subprocess
|
||||||
|
import logging
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class PDFCompressionError(Exception):
|
||||||
|
"""Custom exception for PDF compression failures."""
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
# Ghostscript quality presets
|
||||||
|
QUALITY_PRESETS = {
|
||||||
|
"low": "/screen", # 72 dpi — smallest file, lowest quality
|
||||||
|
"medium": "/ebook", # 150 dpi — good balance (default)
|
||||||
|
"high": "/printer", # 300 dpi — high quality, moderate compression
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def compress_pdf(
|
||||||
|
input_path: str, output_path: str, quality: str = "medium"
|
||||||
|
) -> dict:
|
||||||
|
"""
|
||||||
|
Compress a PDF file using Ghostscript.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
input_path: Path to the input PDF file
|
||||||
|
output_path: Path for the compressed output file
|
||||||
|
quality: Compression quality — "low", "medium", or "high"
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
dict with original_size, compressed_size, reduction_percent
|
||||||
|
|
||||||
|
Raises:
|
||||||
|
PDFCompressionError: If compression fails
|
||||||
|
"""
|
||||||
|
if quality not in QUALITY_PRESETS:
|
||||||
|
quality = "medium"
|
||||||
|
|
||||||
|
gs_quality = QUALITY_PRESETS[quality]
|
||||||
|
|
||||||
|
# Ensure output directory exists
|
||||||
|
os.makedirs(os.path.dirname(output_path), exist_ok=True)
|
||||||
|
|
||||||
|
cmd = [
|
||||||
|
"gs",
|
||||||
|
"-sDEVICE=pdfwrite",
|
||||||
|
"-dCompatibilityLevel=1.4",
|
||||||
|
f"-dPDFSETTINGS={gs_quality}",
|
||||||
|
"-dNOPAUSE",
|
||||||
|
"-dQUIET",
|
||||||
|
"-dBATCH",
|
||||||
|
"-dColorImageResolution=150",
|
||||||
|
"-dGrayImageResolution=150",
|
||||||
|
"-dMonoImageResolution=150",
|
||||||
|
f"-sOutputFile={output_path}",
|
||||||
|
input_path,
|
||||||
|
]
|
||||||
|
|
||||||
|
try:
|
||||||
|
original_size = os.path.getsize(input_path)
|
||||||
|
|
||||||
|
result = subprocess.run(
|
||||||
|
cmd,
|
||||||
|
capture_output=True,
|
||||||
|
text=True,
|
||||||
|
timeout=120,
|
||||||
|
)
|
||||||
|
|
||||||
|
if result.returncode != 0:
|
||||||
|
logger.error(f"Ghostscript compression failed: {result.stderr}")
|
||||||
|
raise PDFCompressionError(
|
||||||
|
f"Compression failed: {result.stderr or 'Unknown error'}"
|
||||||
|
)
|
||||||
|
|
||||||
|
if not os.path.exists(output_path):
|
||||||
|
raise PDFCompressionError("Compressed file was not created.")
|
||||||
|
|
||||||
|
compressed_size = os.path.getsize(output_path)
|
||||||
|
|
||||||
|
# If compressed file is larger, keep original
|
||||||
|
if compressed_size >= original_size:
|
||||||
|
import shutil
|
||||||
|
shutil.copy2(input_path, output_path)
|
||||||
|
compressed_size = original_size
|
||||||
|
|
||||||
|
reduction = (
|
||||||
|
((original_size - compressed_size) / original_size) * 100
|
||||||
|
if original_size > 0
|
||||||
|
else 0
|
||||||
|
)
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
f"PDF compression: {original_size} → {compressed_size} "
|
||||||
|
f"({reduction:.1f}% reduction)"
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"original_size": original_size,
|
||||||
|
"compressed_size": compressed_size,
|
||||||
|
"reduction_percent": round(reduction, 1),
|
||||||
|
}
|
||||||
|
|
||||||
|
except subprocess.TimeoutExpired:
|
||||||
|
raise PDFCompressionError("Compression timed out. File may be too large.")
|
||||||
|
except FileNotFoundError:
|
||||||
|
raise PDFCompressionError("Ghostscript is not installed on the server.")
|
||||||
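The post-processing at the end of `compress_pdf` (fall back to the original file when Ghostscript produces a larger one, then compute the reduction) can be exercised in isolation. A minimal sketch; the helper name `summarize_compression` is illustrative and not part of the service:

```python
def summarize_compression(original_size: int, compressed_size: int) -> dict:
    """Mirror the service's size guard: if the "compressed" file is larger,
    report the original size and a 0% reduction instead."""
    if compressed_size >= original_size:
        compressed_size = original_size
    reduction = (
        ((original_size - compressed_size) / original_size) * 100
        if original_size > 0
        else 0
    )
    return {
        "original_size": original_size,
        "compressed_size": compressed_size,
        "reduction_percent": round(reduction, 1),
    }
```

This matches the guard against zero-byte inputs as well, so an empty file reports a 0% reduction rather than dividing by zero.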
169
backend/app/services/image_service.py
Normal file
@@ -0,0 +1,169 @@
"""Image processing service using Pillow."""
import os
import logging

from PIL import Image

logger = logging.getLogger(__name__)


class ImageProcessingError(Exception):
    """Custom exception for image processing failures."""
    pass


# Supported format mappings
FORMAT_MAP = {
    "jpg": "JPEG",
    "jpeg": "JPEG",
    "png": "PNG",
    "webp": "WEBP",
}


def convert_image(
    input_path: str,
    output_path: str,
    output_format: str,
    quality: int = 85,
) -> dict:
    """
    Convert an image to a different format.

    Args:
        input_path: Path to the input image
        output_path: Path for the output image
        output_format: Target format ("jpg", "png", "webp")
        quality: Output quality 1-100 (for lossy formats)

    Returns:
        dict with original_size, converted_size, dimensions

    Raises:
        ImageProcessingError: If conversion fails
    """
    output_format = output_format.lower()
    if output_format not in FORMAT_MAP:
        raise ImageProcessingError(
            f"Unsupported output format: {output_format}. "
            f"Supported: {', '.join(FORMAT_MAP.keys())}"
        )

    pil_format = FORMAT_MAP[output_format]
    os.makedirs(os.path.dirname(output_path), exist_ok=True)

    try:
        original_size = os.path.getsize(input_path)

        # Open and re-encode (strips any malicious payloads)
        with Image.open(input_path) as img:
            # Convert RGBA to RGB for JPEG (JPEG doesn't support alpha)
            if pil_format == "JPEG" and img.mode in ("RGBA", "P", "LA"):
                background = Image.new("RGB", img.size, (255, 255, 255))
                if img.mode == "P":
                    img = img.convert("RGBA")
                background.paste(img, mask=img.split()[-1] if "A" in img.mode else None)
                img = background

            width, height = img.size

            # Save with quality setting
            save_kwargs = {}
            if pil_format in ("JPEG", "WEBP"):
                save_kwargs["quality"] = max(1, min(100, quality))
                save_kwargs["optimize"] = True
            elif pil_format == "PNG":
                save_kwargs["optimize"] = True

            img.save(output_path, format=pil_format, **save_kwargs)

        converted_size = os.path.getsize(output_path)

        logger.info(
            f"Image conversion: {input_path} → {output_format} "
            f"({original_size} → {converted_size})"
        )

        return {
            "original_size": original_size,
            "converted_size": converted_size,
            "width": width,
            "height": height,
            "format": output_format,
        }

    except (IOError, OSError, Image.DecompressionBombError) as e:
        raise ImageProcessingError(f"Image processing failed: {str(e)}")


def resize_image(
    input_path: str,
    output_path: str,
    width: int | None = None,
    height: int | None = None,
    quality: int = 85,
) -> dict:
    """
    Resize an image while maintaining aspect ratio.

    Args:
        input_path: Path to the input image
        output_path: Path for the resized image
        width: Target width (None to auto-calculate from height)
        height: Target height (None to auto-calculate from width)
        quality: Output quality 1-100

    Returns:
        dict with original and new dimensions

    Raises:
        ImageProcessingError: If resize fails
    """
    if width is None and height is None:
        raise ImageProcessingError("At least one of width or height must be specified.")

    os.makedirs(os.path.dirname(output_path), exist_ok=True)

    try:
        with Image.open(input_path) as img:
            orig_width, orig_height = img.size

            # Calculate missing dimension to maintain aspect ratio
            if width and not height:
                ratio = width / orig_width
                height = int(orig_height * ratio)
            elif height and not width:
                ratio = height / orig_height
                width = int(orig_width * ratio)

            # Resize using high-quality resampling
            resized = img.resize((width, height), Image.Resampling.LANCZOS)

            # Detect format from output extension
            ext = os.path.splitext(output_path)[1].lower().strip(".")
            pil_format = FORMAT_MAP.get(ext, "PNG")

            save_kwargs = {"optimize": True}
            if pil_format in ("JPEG", "WEBP"):
                save_kwargs["quality"] = quality
                # Handle RGBA for JPEG
                if resized.mode in ("RGBA", "P", "LA"):
                    background = Image.new("RGB", resized.size, (255, 255, 255))
                    if resized.mode == "P":
                        resized = resized.convert("RGBA")
                    background.paste(
                        resized, mask=resized.split()[-1] if "A" in resized.mode else None
                    )
                    resized = background

            resized.save(output_path, format=pil_format, **save_kwargs)

            return {
                "original_width": orig_width,
                "original_height": orig_height,
                "new_width": width,
                "new_height": height,
            }

    except (IOError, OSError, Image.DecompressionBombError) as e:
        raise ImageProcessingError(f"Image resize failed: {str(e)}")
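The aspect-ratio arithmetic inside `resize_image` is self-contained and worth checking on its own. A minimal sketch of that step; `fit_dimensions` is an illustrative name, not a function exported by the service:

```python
def fit_dimensions(orig_width, orig_height, width=None, height=None):
    """Compute the missing dimension the same way resize_image does,
    preserving the original aspect ratio."""
    if width is None and height is None:
        raise ValueError("At least one of width or height must be specified.")
    if width and not height:
        height = int(orig_height * (width / orig_width))
    elif height and not width:
        width = int(orig_width * (height / orig_height))
    return width, height
```

When both dimensions are given, the service passes them through unchanged, so the aspect ratio is only guaranteed when exactly one is supplied.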
170
backend/app/services/pdf_service.py
Normal file
@@ -0,0 +1,170 @@
"""PDF conversion service using LibreOffice headless."""
import os
import subprocess
import logging
import tempfile

logger = logging.getLogger(__name__)


class PDFConversionError(Exception):
    """Custom exception for PDF conversion failures."""
    pass


def pdf_to_word(input_path: str, output_dir: str) -> str:
    """
    Convert a PDF file to Word (DOCX) format using LibreOffice headless.

    Args:
        input_path: Path to the input PDF file
        output_dir: Directory for the output file

    Returns:
        Path to the converted DOCX file

    Raises:
        PDFConversionError: If conversion fails
    """
    os.makedirs(output_dir, exist_ok=True)

    # Use a unique user profile per process to avoid lock conflicts
    user_install_dir = tempfile.mkdtemp(prefix="lo_pdf2word_")

    cmd = [
        "soffice",
        "--headless",
        "--norestore",
        f"-env:UserInstallation=file://{user_install_dir}",
        "--infilter=writer_pdf_import",
        "--convert-to", "docx",
        "--outdir", output_dir,
        input_path,
    ]

    try:
        logger.info(f"Running LibreOffice PDF→Word: {' '.join(cmd)}")

        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=120,  # 2 minute timeout
            env={**os.environ, "HOME": user_install_dir},
        )

        logger.info(f"LibreOffice stdout: {result.stdout}")
        logger.info(f"LibreOffice stderr: {result.stderr}")
        logger.info(f"LibreOffice returncode: {result.returncode}")

        # LibreOffice names output based on input filename
        input_basename = os.path.splitext(os.path.basename(input_path))[0]
        output_path = os.path.join(output_dir, f"{input_basename}.docx")

        # Check output file first — LibreOffice may return non-zero
        # due to harmless warnings (e.g. javaldx) even on success
        if os.path.exists(output_path) and os.path.getsize(output_path) > 0:
            logger.info(f"PDF→Word conversion successful: {output_path}")
            return output_path

        # No output file — now treat as real error
        if result.returncode != 0:
            # Filter out known harmless warnings
            stderr = result.stderr or ""
            real_errors = [
                line for line in stderr.strip().splitlines()
                if not line.startswith("Warning: failed to launch javaldx")
            ]
            error_msg = "\n".join(real_errors) if real_errors else stderr
            logger.error(f"LibreOffice PDF→Word failed: {error_msg}")
            raise PDFConversionError(
                f"Conversion failed: {error_msg or 'Unknown error'}"
            )

        # Return code 0 but no output file
        files_in_dir = os.listdir(output_dir) if os.path.exists(output_dir) else []
        logger.error(
            f"Expected output not found at {output_path}. "
            f"Files in output dir: {files_in_dir}"
        )
        raise PDFConversionError("Output file was not created.")

    except subprocess.TimeoutExpired:
        raise PDFConversionError("Conversion timed out. File may be too large.")
    except FileNotFoundError:
        raise PDFConversionError("LibreOffice is not installed on the server.")
    finally:
        # Cleanup temporary user profile
        import shutil
        shutil.rmtree(user_install_dir, ignore_errors=True)


def word_to_pdf(input_path: str, output_dir: str) -> str:
    """
    Convert a Word (DOC/DOCX) file to PDF format using LibreOffice headless.

    Args:
        input_path: Path to the input Word file
        output_dir: Directory for the output file

    Returns:
        Path to the converted PDF file

    Raises:
        PDFConversionError: If conversion fails
    """
    os.makedirs(output_dir, exist_ok=True)

    # Use a unique user profile per process to avoid lock conflicts
    user_install_dir = tempfile.mkdtemp(prefix="lo_word2pdf_")

    cmd = [
        "soffice",
        "--headless",
        "--norestore",
        f"-env:UserInstallation=file://{user_install_dir}",
        "--convert-to", "pdf",
        "--outdir", output_dir,
        input_path,
    ]

    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=120,
            env={**os.environ, "HOME": user_install_dir},
        )

        input_basename = os.path.splitext(os.path.basename(input_path))[0]
        output_path = os.path.join(output_dir, f"{input_basename}.pdf")

        # Check output file first — LibreOffice may return non-zero
        # due to harmless warnings (e.g. javaldx) even on success
        if os.path.exists(output_path) and os.path.getsize(output_path) > 0:
            logger.info(f"Word→PDF conversion successful: {output_path}")
            return output_path

        if result.returncode != 0:
            stderr = result.stderr or ""
            real_errors = [
                line for line in stderr.strip().splitlines()
                if not line.startswith("Warning: failed to launch javaldx")
            ]
            error_msg = "\n".join(real_errors) if real_errors else stderr
            logger.error(f"LibreOffice Word→PDF failed: {error_msg}")
            raise PDFConversionError(
                f"Conversion failed: {error_msg or 'Unknown error'}"
            )

        raise PDFConversionError("Output file was not created.")

    except subprocess.TimeoutExpired:
        raise PDFConversionError("Conversion timed out. File may be too large.")
    except FileNotFoundError:
        raise PDFConversionError("LibreOffice is not installed on the server.")
    finally:
        # Cleanup temporary user profile
        import shutil
        shutil.rmtree(user_install_dir, ignore_errors=True)
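Both conversion functions filter LibreOffice's stderr the same way before deciding whether a non-zero exit code is a real failure. That filtering step can be sketched and tested on its own (the function name `filter_real_errors` is illustrative, not part of the module):

```python
def filter_real_errors(stderr: str) -> str:
    """Drop the harmless javaldx warning lines, exactly as the conversion
    functions do; if nothing else remains, fall back to the raw stderr."""
    real_errors = [
        line for line in stderr.strip().splitlines()
        if not line.startswith("Warning: failed to launch javaldx")
    ]
    return "\n".join(real_errors) if real_errors else stderr
```

The fallback matters: when stderr contains only the javaldx warning, the error message shown to the user is still non-empty.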
154
backend/app/services/storage_service.py
Normal file
@@ -0,0 +1,154 @@
"""Storage service — S3 in production, local files in development."""
import os
import shutil
import logging

from flask import current_app

logger = logging.getLogger(__name__)


def _is_s3_configured() -> bool:
    """Check if AWS S3 credentials are provided."""
    key = current_app.config.get("AWS_ACCESS_KEY_ID")
    secret = current_app.config.get("AWS_SECRET_ACCESS_KEY")
    return bool(key and secret and key.strip() and secret.strip())


class StorageService:
    """Handle file storage — uses S3 when configured, local filesystem otherwise."""

    def __init__(self):
        self._client = None

    @property
    def use_s3(self) -> bool:
        return _is_s3_configured()

    @property
    def client(self):
        """Lazy-initialize S3 client (only when S3 is configured)."""
        if self._client is None:
            import boto3
            self._client = boto3.client(
                "s3",
                region_name=current_app.config["AWS_S3_REGION"],
                aws_access_key_id=current_app.config["AWS_ACCESS_KEY_ID"],
                aws_secret_access_key=current_app.config["AWS_SECRET_ACCESS_KEY"],
            )
        return self._client

    @property
    def bucket(self):
        return current_app.config["AWS_S3_BUCKET"]

    def upload_file(self, local_path: str, task_id: str, folder: str = "outputs") -> str:
        """
        Upload / store a file.

        In S3 mode: uploads to S3 bucket.
        In local mode: copies file to the outputs directory.

        Returns:
            S3 key or local relative path (used as identifier)
        """
        filename = os.path.basename(local_path)
        key = f"{folder}/{task_id}/{filename}"

        if self.use_s3:
            from botocore.exceptions import ClientError
            try:
                self.client.upload_file(local_path, self.bucket, key)
                return key
            except ClientError as e:
                raise RuntimeError(f"Failed to upload file to S3: {e}")
        else:
            # Local mode — keep file in the outputs directory
            output_dir = current_app.config["OUTPUT_FOLDER"]
            dest_dir = os.path.join(output_dir, task_id)
            os.makedirs(dest_dir, exist_ok=True)
            dest_path = os.path.join(dest_dir, filename)

            if os.path.abspath(local_path) != os.path.abspath(dest_path):
                shutil.copy2(local_path, dest_path)

            logger.info(f"[Local] Stored file: {dest_path}")
            return key

    def generate_presigned_url(
        self, s3_key: str, expiry: int | None = None, original_filename: str | None = None
    ) -> str:
        """
        Generate a download URL.

        S3 mode: presigned URL.
        Local mode: /api/download/<task_id>/<filename>
        """
        if self.use_s3:
            from botocore.exceptions import ClientError
            if expiry is None:
                expiry = current_app.config.get("FILE_EXPIRY_SECONDS", 1800)

            params = {
                "Bucket": self.bucket,
                "Key": s3_key,
            }
            if original_filename:
                params["ResponseContentDisposition"] = (
                    f'attachment; filename="{original_filename}"'
                )
            try:
                url = self.client.generate_presigned_url(
                    "get_object",
                    Params=params,
                    ExpiresIn=expiry,
                )
                return url
            except ClientError as e:
                raise RuntimeError(f"Failed to generate presigned URL: {e}")
        else:
            # Local mode — return path to Flask download route
            parts = s3_key.strip("/").split("/")
            # key = "outputs/<task_id>/<filename>"
            if len(parts) >= 3:
                task_id = parts[1]
                filename = parts[2]
            else:
                task_id = parts[0]
                filename = parts[-1]

            download_name = original_filename or filename
            return f"/api/download/{task_id}/{filename}?name={download_name}"

    def delete_file(self, s3_key: str):
        """Delete a file from S3 (no-op in local mode)."""
        if self.use_s3:
            from botocore.exceptions import ClientError
            try:
                self.client.delete_object(Bucket=self.bucket, Key=s3_key)
            except ClientError:
                pass

    def file_exists(self, s3_key: str) -> bool:
        """Check if a file exists."""
        if self.use_s3:
            from botocore.exceptions import ClientError
            try:
                self.client.head_object(Bucket=self.bucket, Key=s3_key)
                return True
            except ClientError:
                return False
        else:
            parts = s3_key.strip("/").split("/")
            if len(parts) >= 3:
                task_id = parts[1]
                filename = parts[2]
            else:
                task_id = parts[0]
                filename = parts[-1]
            output_dir = current_app.config["OUTPUT_FOLDER"]
            return os.path.isfile(os.path.join(output_dir, task_id, filename))


# Singleton instance
storage = StorageService()
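The key format `"outputs/<task_id>/<filename>"` is parsed identically in `generate_presigned_url` and `file_exists`. A minimal sketch of that shared parsing logic, including the fallback for short keys (`parse_storage_key` is an illustrative name, not a method of the service):

```python
def parse_storage_key(s3_key: str) -> tuple[str, str]:
    """Split a key of the form "outputs/<task_id>/<filename>" into
    (task_id, filename), falling back as the service does for short keys."""
    parts = s3_key.strip("/").split("/")
    if len(parts) >= 3:
        return parts[1], parts[2]
    return parts[0], parts[-1]
```

Keeping the parsing in one helper like this would also remove the duplication between the two methods.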
176
backend/app/services/video_service.py
Normal file
@@ -0,0 +1,176 @@
"""Video to GIF conversion service using ffmpeg."""
import os
import re
import subprocess
import logging

logger = logging.getLogger(__name__)


class VideoProcessingError(Exception):
    """Custom exception for video processing failures."""
    pass


# Safety constraints
MAX_DURATION = 15  # seconds
MAX_WIDTH = 640    # pixels
MAX_FPS = 20
DEFAULT_FPS = 10
DEFAULT_WIDTH = 480


def video_to_gif(
    input_path: str,
    output_path: str,
    start_time: float = 0,
    duration: float = 5,
    fps: int = DEFAULT_FPS,
    width: int = DEFAULT_WIDTH,
) -> dict:
    """
    Convert a video clip to an animated GIF using ffmpeg.

    Args:
        input_path: Path to the input video (MP4/WebM)
        output_path: Path for the output GIF
        start_time: Start time in seconds
        duration: Duration in seconds (max 15)
        fps: Frames per second (max 20)
        width: Output width in pixels (max 640)

    Returns:
        dict with output_size, duration, fps, dimensions

    Raises:
        VideoProcessingError: If conversion fails
    """
    # Sanitize numeric parameters (prevent injection)
    start_time = max(0, float(start_time))
    duration = max(0.5, min(MAX_DURATION, float(duration)))
    fps = max(1, min(MAX_FPS, int(fps)))
    width = max(100, min(MAX_WIDTH, int(width)))

    os.makedirs(os.path.dirname(output_path), exist_ok=True)

    # Two-pass palette approach for high-quality GIF
    palette_path = output_path + ".palette.png"

    try:
        # Pass 1: Generate optimized palette
        palette_cmd = [
            "ffmpeg",
            "-y",
            "-ss", str(start_time),
            "-t", str(duration),
            "-i", input_path,
            "-vf", f"fps={fps},scale={width}:-1:flags=lanczos,palettegen=stats_mode=diff",
            palette_path,
        ]

        result = subprocess.run(
            palette_cmd,
            capture_output=True,
            text=True,
            timeout=60,
        )

        if result.returncode != 0:
            logger.error(f"ffmpeg palette generation failed: {result.stderr}")
            raise VideoProcessingError("Failed to process video for GIF creation.")

        # Pass 2: Create GIF using palette
        gif_cmd = [
            "ffmpeg",
            "-y",
            "-ss", str(start_time),
            "-t", str(duration),
            "-i", input_path,
            "-i", palette_path,
            "-lavfi", f"fps={fps},scale={width}:-1:flags=lanczos [x]; [x][1:v] paletteuse=dither=bayer:bayer_scale=5",
            output_path,
        ]

        result = subprocess.run(
            gif_cmd,
            capture_output=True,
            text=True,
            timeout=120,
        )

        if result.returncode != 0:
            logger.error(f"ffmpeg GIF creation failed: {result.stderr}")
            raise VideoProcessingError("Failed to create GIF from video.")

        if not os.path.exists(output_path):
            raise VideoProcessingError("GIF file was not created.")

        output_size = os.path.getsize(output_path)

        # Get actual output dimensions
        actual_width, actual_height = _get_gif_dimensions(output_path)

        logger.info(
            f"Video→GIF: {input_path} → {output_path} "
            f"({output_size} bytes, {duration}s, {fps}fps, {actual_width}x{actual_height})"
        )

        return {
            "output_size": output_size,
            "duration": duration,
            "fps": fps,
            "width": actual_width,
            "height": actual_height,
        }

    except subprocess.TimeoutExpired:
        raise VideoProcessingError("GIF creation timed out. Video may be too large.")
    except FileNotFoundError:
        raise VideoProcessingError("ffmpeg is not installed on the server.")
    finally:
        # Cleanup palette file
        if os.path.exists(palette_path):
            os.remove(palette_path)


def get_video_duration(input_path: str) -> float:
    """Get the duration of a video file in seconds."""
    cmd = [
        "ffprobe",
        "-v", "error",
        "-show_entries", "format=duration",
        "-of", "default=noprint_wrappers=1:nokey=1",
        input_path,
    ]

    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, timeout=10
        )
        return float(result.stdout.strip())
    except (subprocess.TimeoutExpired, ValueError):
        return 0.0


def _get_gif_dimensions(gif_path: str) -> tuple[int, int]:
    """Get GIF dimensions using ffprobe."""
    cmd = [
        "ffprobe",
        "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=width,height",
        "-of", "csv=p=0",
        gif_path,
    ]

    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, timeout=10
        )
        parts = result.stdout.strip().split(",")
        if len(parts) == 2:
            return int(parts[0]), int(parts[1])
    except (subprocess.TimeoutExpired, ValueError):
        pass

    return 0, 0
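The parameter-sanitizing step at the top of `video_to_gif` clamps every numeric input before any ffmpeg command line is built, which is what prevents injection through these values. A minimal sketch of those clamps with the module's constants inlined (`clamp_gif_params` is an illustrative name):

```python
def clamp_gif_params(start_time, duration, fps, width):
    """Apply the same safety clamps video_to_gif performs before
    building any ffmpeg command line."""
    return (
        max(0, float(start_time)),
        max(0.5, min(15, float(duration))),   # MAX_DURATION
        max(1, min(20, int(fps))),            # MAX_FPS
        max(100, min(640, int(width))),       # MAX_WIDTH
    )
```

Because each value is coerced to `float`/`int` first, a malicious string argument raises `ValueError` rather than reaching the subprocess.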
1
backend/app/tasks/__init__.py
Normal file
@@ -0,0 +1 @@
"""Celery tasks for async file processing."""
88
backend/app/tasks/compress_tasks.py
Normal file
@@ -0,0 +1,88 @@
"""Celery tasks for PDF compression."""
import os
import logging

from app.extensions import celery
from app.services.compress_service import compress_pdf, PDFCompressionError
from app.services.storage_service import storage
from app.utils.sanitizer import cleanup_task_files


def _cleanup(task_id: str):
    cleanup_task_files(task_id, keep_outputs=not storage.use_s3)


logger = logging.getLogger(__name__)


@celery.task(bind=True, name="app.tasks.compress_tasks.compress_pdf_task")
def compress_pdf_task(
    self,
    input_path: str,
    task_id: str,
    original_filename: str,
    quality: str = "medium",
):
    """
    Async task: Compress a PDF file.

    Args:
        input_path: Path to the uploaded PDF file
        task_id: Unique task identifier
        original_filename: Original filename for download
        quality: Compression quality ("low", "medium", "high")

    Returns:
        dict with download_url, compression stats, and file info
    """
    output_dir = os.path.join("/tmp/outputs", task_id)
    os.makedirs(output_dir, exist_ok=True)
    output_path = os.path.join(output_dir, f"{task_id}.pdf")

    try:
        self.update_state(
            state="PROCESSING",
            meta={"step": f"Compressing PDF ({quality} quality)..."},
        )

        # Compress using Ghostscript
        stats = compress_pdf(input_path, output_path, quality)

        self.update_state(state="PROCESSING", meta={"step": "Uploading result..."})

        # Upload to S3
        s3_key = storage.upload_file(output_path, task_id, folder="outputs")

        # Generate download filename
        name_without_ext = os.path.splitext(original_filename)[0]
        download_name = f"{name_without_ext}_compressed.pdf"

        download_url = storage.generate_presigned_url(
            s3_key, original_filename=download_name
        )

        result = {
            "status": "completed",
            "download_url": download_url,
            "filename": download_name,
            "original_size": stats["original_size"],
            "compressed_size": stats["compressed_size"],
            "reduction_percent": stats["reduction_percent"],
        }

        _cleanup(task_id)

        logger.info(
            f"Task {task_id}: PDF compression completed — "
            f"{stats['reduction_percent']}% reduction"
        )
        return result

    except PDFCompressionError as e:
        logger.error(f"Task {task_id}: Compression error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": str(e)}

    except Exception as e:
        logger.error(f"Task {task_id}: Unexpected error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": "An unexpected error occurred."}
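The task derives the client-facing filename from the original upload by swapping the extension for a `_compressed.pdf` suffix. A minimal sketch of that step (`compressed_download_name` is an illustrative name, not part of the task module):

```python
import os

def compressed_download_name(original_filename: str) -> str:
    """Derive the download filename the task builds for the client:
    strip the extension, then append the _compressed.pdf suffix."""
    name_without_ext = os.path.splitext(original_filename)[0]
    return f"{name_without_ext}_compressed.pdf"
```

Note that `os.path.splitext` only removes the final extension, so dotted names like `archive.v2.pdf` keep their inner dots.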
128
backend/app/tasks/convert_tasks.py
Normal file
@@ -0,0 +1,128 @@
"""Celery tasks for PDF conversion (PDF↔Word)."""
import os
import logging

from app.extensions import celery
from app.services.pdf_service import pdf_to_word, word_to_pdf, PDFConversionError
from app.services.storage_service import storage
from app.utils.sanitizer import cleanup_task_files


def _cleanup(task_id: str):
    """Cleanup with local-aware flag."""
    cleanup_task_files(task_id, keep_outputs=not storage.use_s3)


logger = logging.getLogger(__name__)


@celery.task(bind=True, name="app.tasks.convert_tasks.convert_pdf_to_word")
def convert_pdf_to_word(self, input_path: str, task_id: str, original_filename: str):
    """
    Async task: Convert PDF to Word document.

    Args:
        input_path: Path to the uploaded PDF file
        task_id: Unique task identifier
        original_filename: Original filename for download

    Returns:
        dict with download_url and file info
    """
    output_dir = os.path.join("/tmp/outputs", task_id)

    try:
        self.update_state(state="PROCESSING", meta={"step": "Converting PDF to Word..."})

        # Convert using LibreOffice
        output_path = pdf_to_word(input_path, output_dir)

        self.update_state(state="PROCESSING", meta={"step": "Uploading result..."})

        # Upload to S3
        s3_key = storage.upload_file(output_path, task_id, folder="outputs")

        # Generate download filename
        name_without_ext = os.path.splitext(original_filename)[0]
        download_name = f"{name_without_ext}.docx"

        # Generate presigned URL
        download_url = storage.generate_presigned_url(
            s3_key, original_filename=download_name
        )

        result = {
            "status": "completed",
            "download_url": download_url,
            "filename": download_name,
            "output_size": os.path.getsize(output_path),
        }

        # Cleanup local files
        _cleanup(task_id)

        logger.info(f"Task {task_id}: PDF→Word conversion completed")
        return result

    except PDFConversionError as e:
        logger.error(f"Task {task_id}: Conversion error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": str(e)}

    except Exception as e:
        logger.error(f"Task {task_id}: Unexpected error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": "An unexpected error occurred."}


@celery.task(bind=True, name="app.tasks.convert_tasks.convert_word_to_pdf")
def convert_word_to_pdf(self, input_path: str, task_id: str, original_filename: str):
    """
    Async task: Convert Word document to PDF.

    Args:
        input_path: Path to the uploaded Word file
        task_id: Unique task identifier
        original_filename: Original filename for download

    Returns:
        dict with download_url and file info
    """
    output_dir = os.path.join("/tmp/outputs", task_id)

    try:
        self.update_state(state="PROCESSING", meta={"step": "Converting Word to PDF..."})

        output_path = word_to_pdf(input_path, output_dir)

        self.update_state(state="PROCESSING", meta={"step": "Uploading result..."})

        s3_key = storage.upload_file(output_path, task_id, folder="outputs")

        name_without_ext = os.path.splitext(original_filename)[0]
        download_name = f"{name_without_ext}.pdf"

        download_url = storage.generate_presigned_url(
            s3_key, original_filename=download_name
        )

        result = {
            "status": "completed",
            "download_url": download_url,
            "filename": download_name,
            "output_size": os.path.getsize(output_path),
        }

        _cleanup(task_id)

        logger.info(f"Task {task_id}: Word→PDF conversion completed")
|
||||||
|
return result
|
||||||
|
|
||||||
|
except PDFConversionError as e:
|
||||||
|
logger.error(f"Task {task_id}: Conversion error — {e}")
|
||||||
|
_cleanup(task_id)
|
||||||
|
return {"status": "failed", "error": str(e)}
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Task {task_id}: Unexpected error — {e}")
|
||||||
|
_cleanup(task_id)
|
||||||
|
return {"status": "failed", "error": "An unexpected error occurred."}
|
||||||
160
backend/app/tasks/image_tasks.py
Normal file
@@ -0,0 +1,160 @@
"""Celery tasks for image processing."""
import os
import logging

from app.extensions import celery
from app.services.image_service import convert_image, resize_image, ImageProcessingError
from app.services.storage_service import storage
from app.utils.sanitizer import cleanup_task_files


def _cleanup(task_id: str):
    cleanup_task_files(task_id, keep_outputs=not storage.use_s3)


logger = logging.getLogger(__name__)


@celery.task(bind=True, name="app.tasks.image_tasks.convert_image_task")
def convert_image_task(
    self,
    input_path: str,
    task_id: str,
    original_filename: str,
    output_format: str,
    quality: int = 85,
):
    """
    Async task: Convert an image to a different format.

    Args:
        input_path: Path to the uploaded image
        task_id: Unique task identifier
        original_filename: Original filename for download
        output_format: Target format ("jpg", "png", "webp")
        quality: Output quality 1-100

    Returns:
        dict with download_url and conversion stats
    """
    output_dir = os.path.join("/tmp/outputs", task_id)
    os.makedirs(output_dir, exist_ok=True)
    output_path = os.path.join(output_dir, f"{task_id}.{output_format}")

    try:
        self.update_state(
            state="PROCESSING",
            meta={"step": f"Converting image to {output_format.upper()}..."},
        )

        stats = convert_image(input_path, output_path, output_format, quality)

        self.update_state(state="PROCESSING", meta={"step": "Uploading result..."})

        s3_key = storage.upload_file(output_path, task_id, folder="outputs")

        name_without_ext = os.path.splitext(original_filename)[0]
        download_name = f"{name_without_ext}.{output_format}"

        download_url = storage.generate_presigned_url(
            s3_key, original_filename=download_name
        )

        result = {
            "status": "completed",
            "download_url": download_url,
            "filename": download_name,
            "original_size": stats["original_size"],
            "converted_size": stats["converted_size"],
            "width": stats["width"],
            "height": stats["height"],
            "format": stats["format"],
        }

        _cleanup(task_id)

        logger.info(f"Task {task_id}: Image conversion to {output_format} completed")
        return result

    except ImageProcessingError as e:
        logger.error(f"Task {task_id}: Image error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": str(e)}

    except Exception as e:
        logger.error(f"Task {task_id}: Unexpected error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": "An unexpected error occurred."}


@celery.task(bind=True, name="app.tasks.image_tasks.resize_image_task")
def resize_image_task(
    self,
    input_path: str,
    task_id: str,
    original_filename: str,
    width: int | None = None,
    height: int | None = None,
    quality: int = 85,
):
    """
    Async task: Resize an image.

    Args:
        input_path: Path to the uploaded image
        task_id: Unique task identifier
        original_filename: Original filename for download
        width: Target width
        height: Target height
        quality: Output quality 1-100

    Returns:
        dict with download_url and resize info
    """
    ext = os.path.splitext(original_filename)[1].lstrip(".")
    output_dir = os.path.join("/tmp/outputs", task_id)
    os.makedirs(output_dir, exist_ok=True)
    output_path = os.path.join(output_dir, f"{task_id}.{ext}")

    try:
        self.update_state(
            state="PROCESSING",
            meta={"step": "Resizing image..."},
        )

        stats = resize_image(input_path, output_path, width, height, quality)

        self.update_state(state="PROCESSING", meta={"step": "Uploading result..."})

        s3_key = storage.upload_file(output_path, task_id, folder="outputs")

        name_without_ext = os.path.splitext(original_filename)[0]
        download_name = f"{name_without_ext}_resized.{ext}"

        download_url = storage.generate_presigned_url(
            s3_key, original_filename=download_name
        )

        result = {
            "status": "completed",
            "download_url": download_url,
            "filename": download_name,
            "original_width": stats["original_width"],
            "original_height": stats["original_height"],
            "new_width": stats["new_width"],
            "new_height": stats["new_height"],
        }

        _cleanup(task_id)

        logger.info(f"Task {task_id}: Image resize completed")
        return result

    except ImageProcessingError as e:
        logger.error(f"Task {task_id}: Image error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": str(e)}

    except Exception as e:
        logger.error(f"Task {task_id}: Unexpected error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": "An unexpected error occurred."}
96
backend/app/tasks/video_tasks.py
Normal file
@@ -0,0 +1,96 @@
"""Celery tasks for video processing."""
import os
import logging

from app.extensions import celery
from app.services.video_service import video_to_gif, VideoProcessingError
from app.services.storage_service import storage
from app.utils.sanitizer import cleanup_task_files


def _cleanup(task_id: str):
    cleanup_task_files(task_id, keep_outputs=not storage.use_s3)


logger = logging.getLogger(__name__)


@celery.task(bind=True, name="app.tasks.video_tasks.create_gif_task")
def create_gif_task(
    self,
    input_path: str,
    task_id: str,
    original_filename: str,
    start_time: float = 0,
    duration: float = 5,
    fps: int = 10,
    width: int = 480,
):
    """
    Async task: Convert video clip to animated GIF.

    Args:
        input_path: Path to the uploaded video
        task_id: Unique task identifier
        original_filename: Original filename for download
        start_time: Start time in seconds
        duration: Duration in seconds
        fps: Frames per second
        width: Output width in pixels

    Returns:
        dict with download_url and GIF info
    """
    output_dir = os.path.join("/tmp/outputs", task_id)
    os.makedirs(output_dir, exist_ok=True)
    output_path = os.path.join(output_dir, f"{task_id}.gif")

    try:
        self.update_state(
            state="PROCESSING",
            meta={"step": "Creating GIF from video..."},
        )

        stats = video_to_gif(
            input_path, output_path,
            start_time=start_time,
            duration=duration,
            fps=fps,
            width=width,
        )

        self.update_state(state="PROCESSING", meta={"step": "Uploading result..."})

        s3_key = storage.upload_file(output_path, task_id, folder="outputs")

        name_without_ext = os.path.splitext(original_filename)[0]
        download_name = f"{name_without_ext}.gif"

        download_url = storage.generate_presigned_url(
            s3_key, original_filename=download_name
        )

        result = {
            "status": "completed",
            "download_url": download_url,
            "filename": download_name,
            "output_size": stats["output_size"],
            "duration": stats["duration"],
            "fps": stats["fps"],
            "width": stats["width"],
            "height": stats["height"],
        }

        _cleanup(task_id)

        logger.info(f"Task {task_id}: Video→GIF creation completed")
        return result

    except VideoProcessingError as e:
        logger.error(f"Task {task_id}: Video error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": str(e)}

    except Exception as e:
        logger.error(f"Task {task_id}: Unexpected error — {e}")
        _cleanup(task_id)
        return {"status": "failed", "error": "An unexpected error occurred."}
1
backend/app/utils/__init__.py
Normal file
@@ -0,0 +1 @@
"""Backend application utilities."""
31
backend/app/utils/cleanup.py
Normal file
@@ -0,0 +1,31 @@
"""Scheduled cleanup of expired temporary files."""
import os
import shutil
import time

from flask import current_app


def cleanup_expired_files():
    """Remove files older than FILE_EXPIRY_SECONDS from upload/output dirs."""
    expiry = current_app.config.get("FILE_EXPIRY_SECONDS", 1800)
    now = time.time()
    removed_count = 0

    for folder_key in ["UPLOAD_FOLDER", "OUTPUT_FOLDER"]:
        folder = current_app.config.get(folder_key)
        if not folder or not os.path.exists(folder):
            continue

        for task_dir_name in os.listdir(folder):
            task_dir = os.path.join(folder, task_dir_name)
            if not os.path.isdir(task_dir):
                continue

            # Check directory age based on modification time
            dir_mtime = os.path.getmtime(task_dir)
            if now - dir_mtime > expiry:
                shutil.rmtree(task_dir, ignore_errors=True)
                removed_count += 1

    return removed_count
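The mtime-based sweep in `cleanup_expired_files` can be exercised outside a Flask app context with a dependency-free sketch; `remove_expired` is a hypothetical stand-in that takes the folder and expiry as plain arguments instead of reading them from `current_app.config`:

```python
import os
import shutil
import time


def remove_expired(root: str, expiry_seconds: int) -> int:
    """Delete task directories under root whose mtime is older than expiry_seconds."""
    now = time.time()
    removed = 0
    for name in os.listdir(root):
        task_dir = os.path.join(root, name)
        if not os.path.isdir(task_dir):
            continue
        # Same age test as cleanup_expired_files: directory modification time.
        if now - os.path.getmtime(task_dir) > expiry_seconds:
            shutil.rmtree(task_dir, ignore_errors=True)
            removed += 1
    return removed
```

Note that mtime-based expiry assumes a task directory is not touched after its files are written; a task that keeps writing resets its own clock.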
111
backend/app/utils/file_validator.py
Normal file
@@ -0,0 +1,111 @@
"""File validation utilities — multi-layer security checks."""
import os

import magic
from flask import current_app
from werkzeug.utils import secure_filename


class FileValidationError(Exception):
    """Custom exception for file validation failures."""

    def __init__(self, message: str, code: int = 400):
        self.message = message
        self.code = code
        super().__init__(self.message)


def validate_file(file_storage, allowed_types: list[str] | None = None):
    """
    Validate an uploaded file through multiple security layers.

    Args:
        file_storage: Flask FileStorage object from request.files
        allowed_types: List of allowed extensions (e.g., ["pdf", "docx"]).
            If None, uses all allowed extensions from config.

    Returns:
        tuple: (sanitized_filename, detected_extension)

    Raises:
        FileValidationError: If validation fails at any layer.
    """
    config = current_app.config

    # Layer 1: Check if file exists and has a filename
    if not file_storage or file_storage.filename == "":
        raise FileValidationError("No file provided.")

    filename = secure_filename(file_storage.filename)
    if not filename:
        raise FileValidationError("Invalid filename.")

    # Layer 2: Check file extension against whitelist
    ext = _get_extension(filename)
    allowed_extensions = config.get("ALLOWED_EXTENSIONS", {})

    if allowed_types:
        valid_extensions = {k: v for k, v in allowed_extensions.items() if k in allowed_types}
    else:
        valid_extensions = allowed_extensions

    if ext not in valid_extensions:
        raise FileValidationError(
            f"File type '.{ext}' is not allowed. "
            f"Allowed types: {', '.join(valid_extensions.keys())}"
        )

    # Layer 3: Check file size against type-specific limits
    file_storage.seek(0, os.SEEK_END)
    file_size = file_storage.tell()
    file_storage.seek(0)

    size_limits = config.get("FILE_SIZE_LIMITS", {})
    max_size = size_limits.get(ext, 20 * 1024 * 1024)  # Default 20MB

    if file_size > max_size:
        max_mb = max_size / (1024 * 1024)
        raise FileValidationError(
            f"File too large. Maximum size for .{ext} files is {max_mb:.0f}MB."
        )

    if file_size == 0:
        raise FileValidationError("File is empty.")

    # Layer 4: Check MIME type using magic bytes
    file_header = file_storage.read(8192)
    file_storage.seek(0)

    detected_mime = magic.from_buffer(file_header, mime=True)
    expected_mimes = valid_extensions.get(ext, [])

    if detected_mime not in expected_mimes:
        raise FileValidationError(
            f"File content does not match extension '.{ext}'. "
            f"Detected type: {detected_mime}"
        )

    # Layer 5: Additional content checks for specific types
    if ext == "pdf":
        _check_pdf_safety(file_header)

    return filename, ext


def _get_extension(filename: str) -> str:
    """Extract and normalize file extension."""
    if "." not in filename:
        return ""
    return filename.rsplit(".", 1)[1].lower()


def _check_pdf_safety(file_header: bytes):
    """Check PDF for potentially dangerous embedded content."""
    dangerous_patterns = [b"/JS", b"/JavaScript", b"/Launch", b"/EmbeddedFile"]

    for pattern in dangerous_patterns:
        if pattern in file_header:
            raise FileValidationError(
                "PDF contains potentially unsafe content (embedded scripts)."
            )
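The Layer 4 check relies on `magic.from_buffer` (python-magic) rather than trusting the extension. The idea can be illustrated with a dependency-free sketch; `sniff_mime` is a simplified hypothetical stand-in, not the library's implementation. The signatures listed are real file-format magic numbers, while the fallback MIME type is an assumption:

```python
# Minimal magic-byte sniffing: match the start of the file header against
# known signatures instead of trusting the filename extension.
SIGNATURES = {
    b"%PDF-": "application/pdf",
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"\xff\xd8\xff": "image/jpeg",
}


def sniff_mime(header: bytes) -> str:
    """Return a MIME type guessed from leading magic bytes."""
    for signature, mime in SIGNATURES.items():
        if header.startswith(signature):
            return mime
    return "application/octet-stream"  # assumed fallback for unknown content
```

This is why a `.pdf` upload containing `b'hello'` fails validation: the header matches no PDF signature, so the detected type contradicts the extension.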
77
backend/app/utils/sanitizer.py
Normal file
@@ -0,0 +1,77 @@
"""Filename sanitization and temporary file management."""
import os
import uuid

from flask import current_app


def generate_safe_path(extension: str, folder_type: str = "upload") -> tuple[str, str]:
    """
    Generate a safe file path using UUID.

    Args:
        extension: File extension (without dot)
        folder_type: "upload" for input files, "output" for processed files

    Returns:
        tuple: (task_id, full_file_path)
    """
    task_id = str(uuid.uuid4())

    if folder_type == "upload":
        base_dir = current_app.config["UPLOAD_FOLDER"]
    else:
        base_dir = current_app.config["OUTPUT_FOLDER"]

    # Create task-specific directory
    task_dir = os.path.join(base_dir, task_id)
    os.makedirs(task_dir, exist_ok=True)

    filename = f"{task_id}.{extension}"
    file_path = os.path.join(task_dir, filename)

    return task_id, file_path


def get_output_path(task_id: str, extension: str) -> str:
    """
    Get the output file path for a processed file.

    Args:
        task_id: The task UUID
        extension: Output file extension

    Returns:
        Full output file path
    """
    output_dir = current_app.config["OUTPUT_FOLDER"]
    task_dir = os.path.join(output_dir, task_id)
    os.makedirs(task_dir, exist_ok=True)

    filename = f"{task_id}.{extension}"
    return os.path.join(task_dir, filename)


def cleanup_task_files(task_id: str, keep_outputs: bool = False):
    """
    Remove temporary files for a given task.

    Args:
        task_id: The task UUID
        keep_outputs: If True, only clean uploads (used in local storage mode)
    """
    import shutil

    upload_dir = current_app.config.get("UPLOAD_FOLDER", "/tmp/uploads")
    output_dir = current_app.config.get("OUTPUT_FOLDER", "/tmp/outputs")

    # Always clean uploads
    upload_task_dir = os.path.join(upload_dir, task_id)
    if os.path.exists(upload_task_dir):
        shutil.rmtree(upload_task_dir, ignore_errors=True)

    # Only clean outputs when using S3 (files already uploaded to S3)
    if not keep_outputs:
        output_task_dir = os.path.join(output_dir, task_id)
        if os.path.exists(output_task_dir):
            shutil.rmtree(output_task_dir, ignore_errors=True)
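The `keep_outputs` flag encodes the storage-mode split: when results go to S3, local outputs are disposable; in local-storage mode they must survive until the user downloads them. A minimal sketch of that semantics, with `cleanup_dirs` as a hypothetical context-free stand-in for `cleanup_task_files`:

```python
import os
import shutil


def cleanup_dirs(upload_dir: str, output_dir: str, task_id: str,
                 keep_outputs: bool = False) -> None:
    """Uploads are always removed; outputs survive only when keep_outputs is set."""
    shutil.rmtree(os.path.join(upload_dir, task_id), ignore_errors=True)
    if not keep_outputs:
        shutil.rmtree(os.path.join(output_dir, task_id), ignore_errors=True)
```

The task modules wire this up as `keep_outputs=not storage.use_s3`, so the flag flips automatically with the storage backend.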
11
backend/celery_worker.py
Normal file
@@ -0,0 +1,11 @@
"""Celery worker entry point."""
from app import create_app
from app.extensions import celery

app = create_app()

# Import all tasks so Celery discovers them
import app.tasks.convert_tasks  # noqa: F401
import app.tasks.compress_tasks  # noqa: F401
import app.tasks.image_tasks  # noqa: F401
import app.tasks.video_tasks  # noqa: F401
93
backend/config/__init__.py
Normal file
@@ -0,0 +1,93 @@
import os
from dotenv import load_dotenv

load_dotenv()


class BaseConfig:
    """Base configuration."""
    SECRET_KEY = os.getenv("SECRET_KEY", "change-me-in-production")

    # File upload settings
    MAX_CONTENT_LENGTH = int(os.getenv("MAX_CONTENT_LENGTH_MB", 50)) * 1024 * 1024
    UPLOAD_FOLDER = os.getenv("UPLOAD_FOLDER", "/tmp/uploads")
    OUTPUT_FOLDER = os.getenv("OUTPUT_FOLDER", "/tmp/outputs")
    FILE_EXPIRY_SECONDS = int(os.getenv("FILE_EXPIRY_SECONDS", 1800))

    # Allowed file extensions and MIME types
    ALLOWED_EXTENSIONS = {
        "pdf": ["application/pdf"],
        "doc": ["application/msword"],
        "docx": [
            "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
        ],
        "png": ["image/png"],
        "jpg": ["image/jpeg"],
        "jpeg": ["image/jpeg"],
        "webp": ["image/webp"],
        "mp4": ["video/mp4"],
        "webm": ["video/webm"],
    }

    # File size limits per type (bytes)
    FILE_SIZE_LIMITS = {
        "pdf": 20 * 1024 * 1024,   # 20MB
        "doc": 15 * 1024 * 1024,   # 15MB
        "docx": 15 * 1024 * 1024,  # 15MB
        "png": 10 * 1024 * 1024,   # 10MB
        "jpg": 10 * 1024 * 1024,   # 10MB
        "jpeg": 10 * 1024 * 1024,  # 10MB
        "webp": 10 * 1024 * 1024,  # 10MB
        "mp4": 50 * 1024 * 1024,   # 50MB
        "webm": 50 * 1024 * 1024,  # 50MB
    }

    # Redis
    REDIS_URL = os.getenv("REDIS_URL", "redis://redis:6379/0")

    # Celery
    CELERY_BROKER_URL = os.getenv("CELERY_BROKER_URL", "redis://redis:6379/0")
    CELERY_RESULT_BACKEND = os.getenv("CELERY_RESULT_BACKEND", "redis://redis:6379/1")

    # AWS S3
    AWS_ACCESS_KEY_ID = os.getenv("AWS_ACCESS_KEY_ID")
    AWS_SECRET_ACCESS_KEY = os.getenv("AWS_SECRET_ACCESS_KEY")
    AWS_S3_BUCKET = os.getenv("AWS_S3_BUCKET", "saas-pdf-temp-files")
    AWS_S3_REGION = os.getenv("AWS_S3_REGION", "eu-west-1")

    # CORS
    CORS_ORIGINS = os.getenv("CORS_ORIGINS", "http://localhost:5173").split(",")

    # Rate Limiting
    RATELIMIT_STORAGE_URI = os.getenv("REDIS_URL", "redis://redis:6379/0")
    RATELIMIT_DEFAULT = "100/hour"


class DevelopmentConfig(BaseConfig):
    """Development configuration."""
    DEBUG = True
    TESTING = False


class ProductionConfig(BaseConfig):
    """Production configuration."""
    DEBUG = False
    TESTING = False
    # Stricter rate limits in production
    RATELIMIT_DEFAULT = "60/hour"


class TestingConfig(BaseConfig):
    """Testing configuration."""
    DEBUG = True
    TESTING = True
    UPLOAD_FOLDER = "/tmp/test_uploads"
    OUTPUT_FOLDER = "/tmp/test_outputs"


config = {
    "development": DevelopmentConfig,
    "production": ProductionConfig,
    "testing": TestingConfig,
    "default": DevelopmentConfig,
}
27
backend/requirements.txt
Normal file
@@ -0,0 +1,27 @@
# Core Framework
flask>=3.0,<4.0
flask-cors>=4.0,<5.0
flask-limiter[redis]>=3.5,<4.0
flask-talisman>=1.1,<2.0
gunicorn>=22.0,<23.0
python-dotenv>=1.0,<2.0

# Task Queue
celery[redis]>=5.3,<6.0
redis>=5.0,<6.0
flower>=2.0,<3.0

# File Processing
Pillow>=10.0,<11.0
python-magic>=0.4.27,<1.0
ffmpeg-python>=0.2,<1.0

# AWS
boto3>=1.34,<2.0

# Security
werkzeug>=3.0,<4.0

# Testing
pytest>=8.0,<9.0
pytest-flask>=1.3,<2.0
0
backend/tests/__init__.py
Normal file
26
backend/tests/conftest.py
Normal file
@@ -0,0 +1,26 @@
import os
import pytest
from app import create_app


@pytest.fixture
def app():
    """Create application for testing."""
    os.environ['FLASK_ENV'] = 'testing'
    app = create_app()
    app.config.update({
        'TESTING': True,
    })
    yield app


@pytest.fixture
def client(app):
    """Flask test client."""
    return app.test_client()


@pytest.fixture
def runner(app):
    """Flask test CLI runner."""
    return app.test_cli_runner()
21
backend/tests/test_compress.py
Normal file
@@ -0,0 +1,21 @@
"""Tests for PDF compression endpoint."""
import io


def test_compress_pdf_no_file(client):
    """POST /api/compress/pdf without file should return 400."""
    response = client.post('/api/compress/pdf')
    assert response.status_code == 400


def test_compress_pdf_wrong_extension(client):
    """POST /api/compress/pdf with non-PDF should return 400."""
    data = {
        'file': (io.BytesIO(b'hello'), 'test.docx'),
    }
    response = client.post(
        '/api/compress/pdf',
        data=data,
        content_type='multipart/form-data',
    )
    assert response.status_code == 400
42
backend/tests/test_convert.py
Normal file
@@ -0,0 +1,42 @@
"""Tests for file conversion endpoints."""
import io


def test_pdf_to_word_no_file(client):
    """POST /api/convert/pdf-to-word without file should return 400."""
    response = client.post('/api/convert/pdf-to-word')
    assert response.status_code == 400
    data = response.get_json()
    assert 'error' in data


def test_pdf_to_word_wrong_extension(client):
    """POST /api/convert/pdf-to-word with non-PDF should return 400."""
    data = {
        'file': (io.BytesIO(b'hello world'), 'test.txt'),
    }
    response = client.post(
        '/api/convert/pdf-to-word',
        data=data,
        content_type='multipart/form-data',
    )
    assert response.status_code == 400


def test_word_to_pdf_no_file(client):
    """POST /api/convert/word-to-pdf without file should return 400."""
    response = client.post('/api/convert/word-to-pdf')
    assert response.status_code == 400


def test_word_to_pdf_wrong_extension(client):
    """POST /api/convert/word-to-pdf with non-Word file should return 400."""
    data = {
        'file': (io.BytesIO(b'hello world'), 'test.pdf'),
    }
    response = client.post(
        '/api/convert/word-to-pdf',
        data=data,
        content_type='multipart/form-data',
    )
    assert response.status_code == 400
15
backend/tests/test_health.py
Normal file
@@ -0,0 +1,15 @@
"""Tests for health check and app creation."""


def test_health_endpoint(client):
    """GET /api/health should return 200."""
    response = client.get('/api/health')
    assert response.status_code == 200
    data = response.get_json()
    assert data['status'] == 'healthy'


def test_app_creates(app):
    """App should create without errors."""
    assert app is not None
    assert app.config['TESTING'] is True
27
backend/tests/test_image.py
Normal file
@@ -0,0 +1,27 @@
"""Tests for image conversion & resize endpoints."""
import io


def test_image_convert_no_file(client):
    """POST /api/image/convert without file should return 400."""
    response = client.post('/api/image/convert')
    assert response.status_code == 400


def test_image_resize_no_file(client):
    """POST /api/image/resize without file should return 400."""
    response = client.post('/api/image/resize')
    assert response.status_code == 400


def test_image_convert_wrong_type(client):
    """POST /api/image/convert with non-image should return 400."""
    data = {
        'file': (io.BytesIO(b'not an image'), 'test.pdf'),
    }
    response = client.post(
        '/api/image/convert',
        data=data,
        content_type='multipart/form-data',
    )
    assert response.status_code == 400
19
backend/tests/test_utils.py
Normal file
@@ -0,0 +1,19 @@
"""Tests for file utility functions."""
import sys
import os

# Add backend to path so we can import utils directly
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..'))

from app.utils.sanitizer import generate_safe_path


def test_generate_safe_path(app):
    """generate_safe_path should produce a UUID-based path inside a task directory."""
    with app.app_context():
        task_id, path = generate_safe_path('pdf', folder_type='upload')
        assert path.endswith('.pdf')
        assert task_id in path
        # Should contain a UUID directory: base_dir / uuid / filename.pdf
        parts = path.replace('\\', '/').split('/')
        assert len(parts) >= 3
7
backend/wsgi.py
Normal file
@@ -0,0 +1,7 @@
"""WSGI entry point for Gunicorn."""
from app import create_app

app = create_app()

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
107
docker-compose.prod.yml
Normal file
@@ -0,0 +1,107 @@
services:
  # --- Redis ---
  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 3s
      retries: 5
    restart: always

  # --- Flask Backend ---
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    env_file:
      - .env
    environment:
      - FLASK_ENV=production
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/1
    volumes:
      - upload_data:/tmp/uploads
      - output_data:/tmp/outputs
    depends_on:
      redis:
        condition: service_healthy
    restart: always

  # --- Celery Worker ---
  celery_worker:
    build:
      context: ./backend
      dockerfile: Dockerfile
    command: >
      celery -A celery_worker.celery worker
      --loglevel=warning
      --concurrency=4
      -Q default,convert,compress,image,video
    env_file:
      - .env
    environment:
      - FLASK_ENV=production
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/1
    volumes:
      - upload_data:/tmp/uploads
      - output_data:/tmp/outputs
    depends_on:
      redis:
        condition: service_healthy
    restart: always

  # --- Celery Beat (Scheduled Tasks) ---
  celery_beat:
    build:
      context: ./backend
      dockerfile: Dockerfile
    command: >
      celery -A celery_worker.celery beat
      --loglevel=warning
    env_file:
      - .env
    environment:
      - FLASK_ENV=production
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/1
    depends_on:
      redis:
        condition: service_healthy
    restart: always

  # --- Nginx (serves built frontend + reverse proxy) ---
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx/nginx.prod.conf:/etc/nginx/conf.d/default.conf:ro
      - frontend_build:/usr/share/nginx/html:ro
      - ./nginx/ssl:/etc/nginx/ssl:ro
    depends_on:
      - backend
      - frontend_build_step
    restart: always

  # --- Frontend Build (one-shot) ---
  frontend_build_step:
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: build
    volumes:
      - frontend_build:/app/dist

volumes:
  redis_data:
  upload_data:
  output_data:
  frontend_build:
99
docker-compose.yml
Normal file
@@ -0,0 +1,99 @@
services:
  # --- Redis ---
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 3s
      retries: 5

  # --- Flask Backend ---
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    ports:
      - "5000:5000"
    env_file:
      - .env
    environment:
      - FLASK_ENV=development
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/1
    volumes:
      - ./backend:/app
      - upload_data:/tmp/uploads
      - output_data:/tmp/outputs
    depends_on:
      redis:
        condition: service_healthy
    restart: unless-stopped

  # --- Celery Worker ---
  celery_worker:
    build:
      context: ./backend
      dockerfile: Dockerfile
    command: >
      celery -A celery_worker.celery worker
      --loglevel=info
      --concurrency=2
      -Q default,convert,compress,image,video
    env_file:
      - .env
    environment:
      - FLASK_ENV=development
      - REDIS_URL=redis://redis:6379/0
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/1
    volumes:
      - ./backend:/app
      - upload_data:/tmp/uploads
      - output_data:/tmp/outputs
    depends_on:
      redis:
        condition: service_healthy
    healthcheck:
      test: ["CMD", "celery", "-A", "celery_worker.celery", "inspect", "ping"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 30s
    restart: unless-stopped

  # --- React Frontend (Vite Dev) ---
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: development
    ports:
      - "5173:5173"
    volumes:
      - ./frontend:/app
      - /app/node_modules
    environment:
      - NODE_ENV=development

  # --- Nginx Reverse Proxy ---
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - backend
      - frontend
    restart: unless-stopped

volumes:
  redis_data:
  upload_data:
  output_data:
0
docs/Plan-1.md
Normal file
41
frontend/Dockerfile
Normal file
@@ -0,0 +1,41 @@
# ---- Build Stage ----
FROM node:20-alpine AS build

WORKDIR /app

# Install dependencies
COPY package.json ./
RUN npm install

# Copy source code
COPY . .

# Build for production
RUN npm run build

# ---- Production Stage ----
FROM nginx:alpine AS production

# Copy built assets
COPY --from=build /app/dist /usr/share/nginx/html

# Copy nginx config for SPA routing
COPY nginx-frontend.conf /etc/nginx/conf.d/default.conf

EXPOSE 80

CMD ["nginx", "-g", "daemon off;"]

# ---- Development Stage ----
FROM node:20-alpine AS development

WORKDIR /app

COPY package.json ./
RUN npm install

COPY . .

EXPOSE 5173

CMD ["npm", "run", "dev", "--", "--host", "0.0.0.0"]
17
frontend/index.html
Normal file
@@ -0,0 +1,17 @@
<!DOCTYPE html>
<html lang="en" dir="ltr">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/favicon.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <meta name="description" content="Free online tools for PDF, image, video, and text processing. Convert, compress, and transform your files instantly." />
    <link rel="preconnect" href="https://fonts.googleapis.com" />
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
    <link href="https://fonts.googleapis.com/css2?family=Inter:wght@300;400;500;600;700&family=Tajawal:wght@300;400;500;700&display=swap" rel="stylesheet" />
    <title>SaaS-PDF — Free Online File Tools</title>
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/main.tsx"></script>
  </body>
</html>
21
frontend/nginx-frontend.conf
Normal file
@@ -0,0 +1,21 @@
server {
    listen 80;
    root /usr/share/nginx/html;
    index index.html;

    # SPA fallback
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Cache static assets
    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }

    # Security headers
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-XSS-Protection "1; mode=block" always;
}
38
frontend/package.json
Normal file
@@ -0,0 +1,38 @@
{
  "name": "saas-pdf-frontend",
  "private": true,
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc --noEmit && vite build",
    "preview": "vite preview",
    "lint": "eslint ."
  },
  "dependencies": {
    "axios": "^1.7.0",
    "i18next": "^23.11.0",
    "i18next-browser-languagedetector": "^8.0.0",
    "lucide-react": "^0.400.0",
    "react": "^18.3.0",
    "react-dom": "^18.3.0",
    "react-dropzone": "^14.2.0",
    "react-ga4": "^2.1.0",
    "react-helmet-async": "^2.0.0",
    "react-i18next": "^14.1.0",
    "react-router-dom": "^6.23.0",
    "sonner": "^1.5.0",
    "zustand": "^4.5.0"
  },
  "devDependencies": {
    "@types/node": "^20.14.0",
    "@types/react": "^18.3.0",
    "@types/react-dom": "^18.3.0",
    "@vitejs/plugin-react": "^4.3.0",
    "autoprefixer": "^10.4.0",
    "postcss": "^8.4.0",
    "tailwindcss": "^3.4.0",
    "typescript": "^5.5.0",
    "vite": "^5.4.0"
  }
}
6
frontend/postcss.config.js
Normal file
@@ -0,0 +1,6 @@
export default {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
};
1
frontend/public/ads.txt
Normal file
@@ -0,0 +1 @@
google.com, pub-XXXXXXXXXXXXXXXX, DIRECT, f08c47fec0942fa0
7
frontend/public/favicon.svg
Normal file
@@ -0,0 +1,7 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 64 64" fill="none">
  <rect width="64" height="64" rx="14" fill="#4F46E5"/>
  <path d="M18 20h20a2 2 0 0 1 2 2v20a2 2 0 0 1-2 2H18a2 2 0 0 1-2-2V22a2 2 0 0 1 2-2z" stroke="#fff" stroke-width="2.5" fill="none"/>
  <path d="M22 28h12M22 33h8" stroke="#fff" stroke-width="2" stroke-linecap="round"/>
  <path d="M42 24l6-6M48 18v6h-6" stroke="#93C5FD" stroke-width="2.5" stroke-linecap="round" stroke-linejoin="round"/>
  <path d="M42 40l6 6M48 46v-6h-6" stroke="#93C5FD" stroke-width="2.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>
6
frontend/public/robots.txt
Normal file
@@ -0,0 +1,6 @@
# robots.txt — SaaS-PDF
User-agent: *
Allow: /
Disallow: /api/

Sitemap: https://yourdomain.com/sitemap.xml
71
frontend/src/App.tsx
Normal file
@@ -0,0 +1,71 @@
import { lazy, Suspense } from 'react';
import { Routes, Route } from 'react-router-dom';
import Header from '@/components/layout/Header';
import Footer from '@/components/layout/Footer';
import { useDirection } from '@/hooks/useDirection';

// Pages
const HomePage = lazy(() => import('@/pages/HomePage'));
const AboutPage = lazy(() => import('@/pages/AboutPage'));
const PrivacyPage = lazy(() => import('@/pages/PrivacyPage'));
const NotFoundPage = lazy(() => import('@/pages/NotFoundPage'));
const TermsPage = lazy(() => import('@/pages/TermsPage'));

// Tool Pages
const PdfToWord = lazy(() => import('@/components/tools/PdfToWord'));
const WordToPdf = lazy(() => import('@/components/tools/WordToPdf'));
const PdfCompressor = lazy(() => import('@/components/tools/PdfCompressor'));
const ImageConverter = lazy(() => import('@/components/tools/ImageConverter'));
const VideoToGif = lazy(() => import('@/components/tools/VideoToGif'));
const WordCounter = lazy(() => import('@/components/tools/WordCounter'));
const TextCleaner = lazy(() => import('@/components/tools/TextCleaner'));

function LoadingFallback() {
  return (
    <div className="flex min-h-[40vh] items-center justify-center">
      <div className="h-10 w-10 animate-spin rounded-full border-4 border-primary-200 border-t-primary-600" />
    </div>
  );
}

export default function App() {
  useDirection();

  return (
    <div className="flex min-h-screen flex-col bg-slate-50">
      <Header />

      <main className="container mx-auto flex-1 px-4 py-8 sm:px-6 lg:px-8">
        <Suspense fallback={<LoadingFallback />}>
          <Routes>
            {/* Pages */}
            <Route path="/" element={<HomePage />} />
            <Route path="/about" element={<AboutPage />} />
            <Route path="/privacy" element={<PrivacyPage />} />
            <Route path="/terms" element={<TermsPage />} />

            {/* PDF Tools */}
            <Route path="/tools/pdf-to-word" element={<PdfToWord />} />
            <Route path="/tools/word-to-pdf" element={<WordToPdf />} />
            <Route path="/tools/compress-pdf" element={<PdfCompressor />} />

            {/* Image Tools */}
            <Route path="/tools/image-converter" element={<ImageConverter />} />

            {/* Video Tools */}
            <Route path="/tools/video-to-gif" element={<VideoToGif />} />

            {/* Text Tools */}
            <Route path="/tools/word-counter" element={<WordCounter />} />
            <Route path="/tools/text-cleaner" element={<TextCleaner />} />

            {/* 404 */}
            <Route path="*" element={<NotFoundPage />} />
          </Routes>
        </Suspense>
      </main>

      <Footer />
    </div>
  );
}
53
frontend/src/components/layout/AdSlot.tsx
Normal file
@@ -0,0 +1,53 @@
import { useEffect, useRef } from 'react';

interface AdSlotProps {
  /** AdSense ad slot ID */
  slot: string;
  /** Ad format: 'auto', 'rectangle', 'horizontal', 'vertical' */
  format?: string;
  /** Responsive mode */
  responsive?: boolean;
  /** Additional CSS class */
  className?: string;
}

/**
 * Google AdSense ad slot component.
 * Loads the ad unit once and handles cleanup.
 */
export default function AdSlot({
  slot,
  format = 'auto',
  responsive = true,
  className = '',
}: AdSlotProps) {
  const adRef = useRef<HTMLModElement>(null);
  const isLoaded = useRef(false);

  useEffect(() => {
    if (isLoaded.current) return;

    try {
      // Push ad to AdSense queue
      const adsbygoogle = (window as any).adsbygoogle || [];
      adsbygoogle.push({});
      isLoaded.current = true;
    } catch {
      // AdSense not loaded (e.g., ad blocker)
    }
  }, []);

  return (
    <div className={`ad-slot ${className}`}>
      <ins
        ref={adRef}
        className="adsbygoogle"
        style={{ display: 'block' }}
        data-ad-client={import.meta.env.VITE_ADSENSE_CLIENT_ID || ''}
        data-ad-slot={slot}
        data-ad-format={format}
        data-full-width-responsive={responsive ? 'true' : 'false'}
      />
    </div>
  );
}
45
frontend/src/components/layout/Footer.tsx
Normal file
@@ -0,0 +1,45 @@
import { Link } from 'react-router-dom';
import { useTranslation } from 'react-i18next';
import { FileText } from 'lucide-react';

export default function Footer() {
  const { t } = useTranslation();

  return (
    <footer className="border-t border-slate-200 bg-slate-50">
      <div className="mx-auto max-w-7xl px-4 py-8 sm:px-6 lg:px-8">
        <div className="flex flex-col items-center justify-between gap-4 sm:flex-row">
          {/* Brand */}
          <div className="flex items-center gap-2 text-slate-600">
            <FileText className="h-5 w-5" />
            <span className="text-sm font-medium">
              © {new Date().getFullYear()} {t('common.appName')}
            </span>
          </div>

          {/* Links */}
          <div className="flex items-center gap-6">
            <Link
              to="/privacy"
              className="text-sm text-slate-500 transition-colors hover:text-primary-600"
            >
              {t('common.privacy')}
            </Link>
            <Link
              to="/terms"
              className="text-sm text-slate-500 transition-colors hover:text-primary-600"
            >
              {t('common.terms')}
            </Link>
            <Link
              to="/about"
              className="text-sm text-slate-500 transition-colors hover:text-primary-600"
            >
              {t('common.about')}
            </Link>
          </div>
        </div>
      </div>
    </footer>
  );
}
50
frontend/src/components/layout/Header.tsx
Normal file
@@ -0,0 +1,50 @@
import { Link } from 'react-router-dom';
import { useTranslation } from 'react-i18next';
import { FileText, Globe } from 'lucide-react';

export default function Header() {
  const { t, i18n } = useTranslation();

  const toggleLanguage = () => {
    const newLang = i18n.language === 'ar' ? 'en' : 'ar';
    i18n.changeLanguage(newLang);
  };

  return (
    <header className="sticky top-0 z-50 border-b border-slate-200 bg-white/80 backdrop-blur-lg">
      <div className="mx-auto flex h-16 max-w-7xl items-center justify-between px-4 sm:px-6 lg:px-8">
        {/* Logo */}
        <Link to="/" className="flex items-center gap-2 text-xl font-bold text-primary-600">
          <FileText className="h-7 w-7" />
          <span>{t('common.appName')}</span>
        </Link>

        {/* Navigation */}
        <nav className="hidden items-center gap-6 md:flex">
          <Link
            to="/"
            className="text-sm font-medium text-slate-600 transition-colors hover:text-primary-600"
          >
            {t('common.home')}
          </Link>
          <Link
            to="/about"
            className="text-sm font-medium text-slate-600 transition-colors hover:text-primary-600"
          >
            {t('common.about')}
          </Link>
        </nav>

        {/* Language Toggle */}
        <button
          onClick={toggleLanguage}
          className="flex items-center gap-1.5 rounded-lg px-3 py-2 text-sm font-medium text-slate-600 transition-colors hover:bg-slate-100"
          aria-label={t('common.language')}
        >
          <Globe className="h-4 w-4" />
          <span>{i18n.language === 'ar' ? 'English' : 'العربية'}</span>
        </button>
      </div>
    </header>
  );
}
88
frontend/src/components/shared/DownloadButton.tsx
Normal file
@@ -0,0 +1,88 @@
import { useTranslation } from 'react-i18next';
import { Download, RotateCcw, Clock } from 'lucide-react';
import type { TaskResult } from '@/services/api';
import { formatFileSize } from '@/utils/textTools';

interface DownloadButtonProps {
  /** Task result containing download URL */
  result: TaskResult;
  /** Called when user wants to start over */
  onStartOver: () => void;
}

export default function DownloadButton({ result, onStartOver }: DownloadButtonProps) {
  const { t } = useTranslation();

  if (!result.download_url) return null;

  return (
    <div className="rounded-2xl bg-emerald-50 p-6 ring-1 ring-emerald-200">
      {/* Success header */}
      <div className="mb-4 text-center">
        <p className="text-lg font-semibold text-emerald-800">
          {t('result.conversionComplete')}
        </p>
        <p className="mt-1 text-sm text-emerald-600">
          {t('result.downloadReady')}
        </p>
      </div>

      {/* File stats */}
      {(result.original_size || result.compressed_size) && (
        <div className="mb-4 grid grid-cols-2 gap-3 sm:grid-cols-3">
          {result.original_size && (
            <div className="rounded-lg bg-white p-3 text-center">
              <p className="text-xs text-slate-500">{t('result.originalSize')}</p>
              <p className="text-sm font-semibold text-slate-900">
                {formatFileSize(result.original_size)}
              </p>
            </div>
          )}
          {result.compressed_size && (
            <div className="rounded-lg bg-white p-3 text-center">
              <p className="text-xs text-slate-500">{t('result.newSize')}</p>
              <p className="text-sm font-semibold text-slate-900">
                {formatFileSize(result.compressed_size)}
              </p>
            </div>
          )}
          {result.reduction_percent !== undefined && (
            <div className="rounded-lg bg-white p-3 text-center">
              <p className="text-xs text-slate-500">{t('result.reduction')}</p>
              <p className="text-sm font-semibold text-emerald-600">
                {result.reduction_percent}%
              </p>
            </div>
          )}
        </div>
      )}

      {/* Download button */}
      <a
        href={result.download_url}
        download={result.filename}
        className="btn-success w-full"
        target="_blank"
        rel="noopener noreferrer"
      >
        <Download className="h-5 w-5" />
        {t('common.download')} — {result.filename}
      </a>

      {/* Expiry notice */}
      <div className="mt-3 flex items-center justify-center gap-1.5 text-xs text-slate-500">
        <Clock className="h-3.5 w-3.5" />
        {t('result.linkExpiry')}
      </div>

      {/* Start over */}
      <button
        onClick={onStartOver}
        className="mt-4 flex w-full items-center justify-center gap-2 text-sm font-medium text-primary-600 transition-colors hover:text-primary-700"
      >
        <RotateCcw className="h-4 w-4" />
        {t('common.startOver')}
      </button>
    </div>
  );
}
132
frontend/src/components/shared/FileUploader.tsx
Normal file
@@ -0,0 +1,132 @@
import { useCallback } from 'react';
import { useDropzone, type Accept } from 'react-dropzone';
import { useTranslation } from 'react-i18next';
import { Upload, File, X } from 'lucide-react';
import { formatFileSize } from '@/utils/textTools';

interface FileUploaderProps {
  /** Called when a file is selected/dropped */
  onFileSelect: (file: File) => void;
  /** Currently selected file */
  file: File | null;
  /** Accepted MIME types */
  accept?: Accept;
  /** Maximum file size in MB */
  maxSizeMB?: number;
  /** Whether upload is in progress */
  isUploading?: boolean;
  /** Upload progress percentage */
  uploadProgress?: number;
  /** Error message */
  error?: string | null;
  /** Reset handler */
  onReset?: () => void;
  /** Descriptive text for accepted file types */
  acceptLabel?: string;
}

export default function FileUploader({
  onFileSelect,
  file,
  accept,
  maxSizeMB = 20,
  isUploading = false,
  uploadProgress = 0,
  error,
  onReset,
  acceptLabel,
}: FileUploaderProps) {
  const { t } = useTranslation();

  const onDrop = useCallback(
    (acceptedFiles: File[]) => {
      if (acceptedFiles.length > 0) {
        onFileSelect(acceptedFiles[0]);
      }
    },
    [onFileSelect]
  );

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    onDrop,
    accept,
    maxFiles: 1,
    maxSize: maxSizeMB * 1024 * 1024,
    disabled: isUploading,
  });

  return (
    <div className="w-full">
      {/* Drop Zone */}
      {!file && (
        <div
          {...getRootProps()}
          className={`upload-zone ${isDragActive ? 'drag-active' : ''}`}
        >
          <input {...getInputProps()} />
          <Upload
            className={`mb-4 h-12 w-12 ${
              isDragActive ? 'text-primary-500' : 'text-slate-400'
            }`}
          />
          <p className="mb-2 text-base font-medium text-slate-700">
            {t('common.dragDrop')}
          </p>
          {acceptLabel && (
            <p className="text-sm text-slate-500">{acceptLabel}</p>
          )}
          <p className="mt-1 text-xs text-slate-400">
            {t('common.maxSize', { size: maxSizeMB })}
          </p>
        </div>
      )}

      {/* Selected File */}
      {file && !isUploading && (
        <div className="flex items-center gap-3 rounded-xl bg-primary-50 p-4 ring-1 ring-primary-200">
          <File className="h-8 w-8 flex-shrink-0 text-primary-600" />
          <div className="min-w-0 flex-1">
            <p className="truncate text-sm font-medium text-slate-900">
              {file.name}
            </p>
            <p className="text-xs text-slate-500">{formatFileSize(file.size)}</p>
          </div>
          {onReset && (
            <button
              onClick={onReset}
              className="rounded-lg p-1.5 text-slate-400 transition-colors hover:bg-slate-200 hover:text-slate-600"
              aria-label="Remove file"
            >
              <X className="h-5 w-5" />
            </button>
          )}
        </div>
      )}

      {/* Upload Progress */}
      {isUploading && (
        <div className="rounded-xl bg-slate-50 p-4 ring-1 ring-slate-200">
          <div className="mb-2 flex items-center justify-between">
            <span className="text-sm font-medium text-slate-700">
              {t('common.upload')}...
            </span>
            <span className="text-sm text-slate-500">{uploadProgress}%</span>
          </div>
          <div className="h-2 w-full overflow-hidden rounded-full bg-slate-200">
            <div
              className="h-full rounded-full bg-primary-600 transition-all duration-300"
              style={{ width: `${uploadProgress}%` }}
            />
          </div>
        </div>
      )}

      {/* Error */}
      {error && (
        <div className="mt-3 rounded-xl bg-red-50 p-3 ring-1 ring-red-200">
          <p className="text-sm text-red-700">{error}</p>
        </div>
      )}
    </div>
  );
}
42
frontend/src/components/shared/ProgressBar.tsx
Normal file
@@ -0,0 +1,42 @@
import { useTranslation } from 'react-i18next';
import { Loader2, CheckCircle2 } from 'lucide-react';

interface ProgressBarProps {
  /** Current task state */
  state: 'PENDING' | 'PROCESSING' | 'SUCCESS' | 'FAILURE' | string;
  /** Progress message */
  message?: string;
}

export default function ProgressBar({ state, message }: ProgressBarProps) {
  const { t } = useTranslation();

  const isActive = state === 'PENDING' || state === 'PROCESSING';
  const isComplete = state === 'SUCCESS';

  return (
    <div className="rounded-xl bg-slate-50 p-5 ring-1 ring-slate-200">
      <div className="flex items-center gap-3">
        {isActive && (
          <Loader2 className="h-6 w-6 animate-spin text-primary-600" />
        )}
        {isComplete && (
          <CheckCircle2 className="h-6 w-6 text-emerald-600" />
        )}

        <div className="flex-1">
          <p className="text-sm font-medium text-slate-700">
            {message || t('common.processing')}
          </p>
        </div>
      </div>

      {/* Animated progress bar for active states */}
      {isActive && (
        <div className="mt-3 h-1.5 w-full overflow-hidden rounded-full bg-slate-200">
          <div className="progress-bar-animated h-full w-2/3 rounded-full bg-primary-500 transition-all" />
        </div>
      )}
    </div>
  );
}
43
frontend/src/components/shared/ToolCard.tsx
Normal file
@@ -0,0 +1,43 @@
import { Link } from 'react-router-dom';
import type { ReactNode } from 'react';

interface ToolCardProps {
  /** Tool route path */
  to: string;
  /** Tool title */
  title: string;
  /** Short description */
  description: string;
  /** Pre-rendered icon element */
  icon: ReactNode;
  /** Icon background color class */
  bgColor: string;
}

export default function ToolCard({
  to,
  title,
  description,
  icon,
  bgColor,
}: ToolCardProps) {
  return (
    <Link to={to} className="tool-card group block">
      <div className="flex items-start gap-4">
        <div
          className={`flex h-12 w-12 flex-shrink-0 items-center justify-center rounded-xl ${bgColor}`}
        >
          {icon}
        </div>
        <div className="min-w-0 flex-1">
          <h3 className="text-base font-semibold text-slate-900 group-hover:text-primary-600 transition-colors">
            {title}
          </h3>
          <p className="mt-1 text-sm text-slate-500 line-clamp-2">
            {description}
          </p>
        </div>
      </div>
    </Link>
  );
}
176
frontend/src/components/tools/ImageConverter.tsx
Normal file
@@ -0,0 +1,176 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { ImageIcon } from 'lucide-react';
import FileUploader from '@/components/shared/FileUploader';
import ProgressBar from '@/components/shared/ProgressBar';
import DownloadButton from '@/components/shared/DownloadButton';
import AdSlot from '@/components/layout/AdSlot';
import { useFileUpload } from '@/hooks/useFileUpload';
import { useTaskPolling } from '@/hooks/useTaskPolling';
import { generateToolSchema } from '@/utils/seo';

type OutputFormat = 'jpg' | 'png' | 'webp';

export default function ImageConverter() {
  const { t } = useTranslation();
  const [phase, setPhase] = useState<'upload' | 'processing' | 'done'>('upload');
  const [format, setFormat] = useState<OutputFormat>('jpg');
  const [quality, setQuality] = useState(85);

  const {
    file,
    uploadProgress,
    isUploading,
    taskId,
    error: uploadError,
    selectFile,
    startUpload,
    reset,
  } = useFileUpload({
    endpoint: '/image/convert',
    maxSizeMB: 10,
    acceptedTypes: ['png', 'jpg', 'jpeg', 'webp'],
    extraData: { format, quality: quality.toString() },
  });

  const { status, result, error: taskError } = useTaskPolling({
    taskId,
    onComplete: () => setPhase('done'),
    onError: () => setPhase('done'),
  });

  const handleUpload = async () => {
    const id = await startUpload();
    if (id) setPhase('processing');
  };

  const handleReset = () => {
    reset();
    setPhase('upload');
  };

  const formats: { value: OutputFormat; label: string }[] = [
    { value: 'jpg', label: 'JPG' },
    { value: 'png', label: 'PNG' },
    { value: 'webp', label: 'WebP' },
  ];

  const schema = generateToolSchema({
    name: t('tools.imageConvert.title'),
    description: t('tools.imageConvert.description'),
    url: `${window.location.origin}/tools/image-converter`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.imageConvert.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.imageConvert.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/image-converter`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-purple-100">
            <ImageIcon className="h-8 w-8 text-purple-600" />
          </div>
          <h1 className="section-heading">{t('tools.imageConvert.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.imageConvert.description')}</p>
        </div>

        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {phase === 'upload' && (
          <div className="space-y-4">
            <FileUploader
              onFileSelect={selectFile}
              file={file}
              accept={{
                'image/png': ['.png'],
                'image/jpeg': ['.jpg', '.jpeg'],
                'image/webp': ['.webp'],
              }}
              maxSizeMB={10}
              isUploading={isUploading}
              uploadProgress={uploadProgress}
              error={uploadError}
              onReset={handleReset}
              acceptLabel="Images (PNG, JPG, WebP)"
            />

            {file && !isUploading && (
              <>
                {/* Format Selector */}
                <div>
                  <label className="mb-2 block text-sm font-medium text-slate-700">
                    Convert to:
                  </label>
                  <div className="grid grid-cols-3 gap-3">
                    {formats.map((f) => (
                      <button
                        key={f.value}
                        onClick={() => setFormat(f.value)}
                        className={`rounded-xl p-3 text-center ring-1 transition-all ${
                          format === f.value
                            ? 'bg-primary-50 ring-primary-300 text-primary-700 font-semibold'
                            : 'bg-white ring-slate-200 text-slate-600 hover:bg-slate-50'
                        }`}
                      >
                        {f.label}
                      </button>
                    ))}
                  </div>
                </div>

                {/* Quality Slider (for lossy formats) */}
                {format !== 'png' && (
                  <div>
                    <label className="mb-2 flex items-center justify-between text-sm font-medium text-slate-700">
                      <span>Quality</span>
                      <span className="text-primary-600">{quality}%</span>
                    </label>
                    <input
                      type="range"
                      min="10"
                      max="100"
                      value={quality}
                      onChange={(e) => setQuality(Number(e.target.value))}
                      className="w-full accent-primary-600"
                    />
                  </div>
                )}

                <button onClick={handleUpload} className="btn-primary w-full">
                  {t('tools.imageConvert.shortDesc')}
                </button>
              </>
            )}
          </div>
        )}

        {phase === 'processing' && !result && (
          <ProgressBar state={status?.state || 'PENDING'} message={status?.progress} />
        )}

        {phase === 'done' && result && result.status === 'completed' && (
          <DownloadButton result={result} onStartOver={handleReset} />
        )}

        {phase === 'done' && taskError && (
          <div className="space-y-4">
            <div className="rounded-xl bg-red-50 p-4 ring-1 ring-red-200">
              <p className="text-sm text-red-700">{taskError}</p>
            </div>
            <button onClick={handleReset} className="btn-secondary w-full">
              {t('common.startOver')}
            </button>
          </div>
        )}

        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
148
frontend/src/components/tools/PdfCompressor.tsx
Normal file
@@ -0,0 +1,148 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { Minimize2 } from 'lucide-react';
import FileUploader from '@/components/shared/FileUploader';
import ProgressBar from '@/components/shared/ProgressBar';
import DownloadButton from '@/components/shared/DownloadButton';
import AdSlot from '@/components/layout/AdSlot';
import { useFileUpload } from '@/hooks/useFileUpload';
import { useTaskPolling } from '@/hooks/useTaskPolling';
import { generateToolSchema } from '@/utils/seo';

type Quality = 'low' | 'medium' | 'high';

export default function PdfCompressor() {
  const { t } = useTranslation();
  const [phase, setPhase] = useState<'upload' | 'processing' | 'done'>('upload');
  const [quality, setQuality] = useState<Quality>('medium');

  const {
    file,
    uploadProgress,
    isUploading,
    taskId,
    error: uploadError,
    selectFile,
    startUpload,
    reset,
  } = useFileUpload({
    endpoint: '/compress/pdf',
    maxSizeMB: 20,
    acceptedTypes: ['pdf'],
    extraData: { quality },
  });

  const { status, result, error: taskError } = useTaskPolling({
    taskId,
    onComplete: () => setPhase('done'),
    onError: () => setPhase('done'),
  });

  const handleUpload = async () => {
    const id = await startUpload();
    if (id) setPhase('processing');
  };

  const handleReset = () => {
    reset();
    setPhase('upload');
  };

  const qualityOptions: { value: Quality; label: string; desc: string }[] = [
    { value: 'low', label: t('tools.compressPdf.qualityLow'), desc: '72 DPI' },
    { value: 'medium', label: t('tools.compressPdf.qualityMedium'), desc: '150 DPI' },
    { value: 'high', label: t('tools.compressPdf.qualityHigh'), desc: '300 DPI' },
  ];

  const schema = generateToolSchema({
    name: t('tools.compressPdf.title'),
    description: t('tools.compressPdf.description'),
    url: `${window.location.origin}/tools/compress-pdf`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.compressPdf.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.compressPdf.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/compress-pdf`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-orange-100">
            <Minimize2 className="h-8 w-8 text-orange-600" />
          </div>
          <h1 className="section-heading">{t('tools.compressPdf.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.compressPdf.description')}</p>
        </div>

        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {phase === 'upload' && (
          <div className="space-y-4">
            <FileUploader
              onFileSelect={selectFile}
              file={file}
              accept={{ 'application/pdf': ['.pdf'] }}
              maxSizeMB={20}
              isUploading={isUploading}
              uploadProgress={uploadProgress}
              error={uploadError}
              onReset={handleReset}
              acceptLabel="PDF (.pdf)"
            />

            {/* Quality Selector */}
            {file && !isUploading && (
              <>
                <div className="grid grid-cols-3 gap-3">
                  {qualityOptions.map((opt) => (
                    <button
                      key={opt.value}
                      onClick={() => setQuality(opt.value)}
                      className={`rounded-xl p-3 text-center ring-1 transition-all ${
                        quality === opt.value
                          ? 'bg-primary-50 ring-primary-300 text-primary-700'
                          : 'bg-white ring-slate-200 text-slate-600 hover:bg-slate-50'
                      }`}
                    >
                      <p className="text-sm font-medium">{opt.label}</p>
                      <p className="text-xs text-slate-400 mt-0.5">{opt.desc}</p>
                    </button>
                  ))}
                </div>
                <button onClick={handleUpload} className="btn-primary w-full">
                  {t('tools.compressPdf.shortDesc')}
                </button>
              </>
            )}
          </div>
        )}

        {phase === 'processing' && !result && (
          <ProgressBar state={status?.state || 'PENDING'} message={status?.progress} />
        )}

        {phase === 'done' && result && result.status === 'completed' && (
          <DownloadButton result={result} onStartOver={handleReset} />
        )}

        {phase === 'done' && taskError && (
          <div className="space-y-4">
            <div className="rounded-xl bg-red-50 p-4 ring-1 ring-red-200">
              <p className="text-sm text-red-700">{taskError}</p>
            </div>
            <button onClick={handleReset} className="btn-secondary w-full">
              {t('common.startOver')}
            </button>
          </div>
        )}

        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
128
frontend/src/components/tools/PdfToWord.tsx
Normal file
@@ -0,0 +1,128 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { FileText } from 'lucide-react';
import FileUploader from '@/components/shared/FileUploader';
import ProgressBar from '@/components/shared/ProgressBar';
import DownloadButton from '@/components/shared/DownloadButton';
import AdSlot from '@/components/layout/AdSlot';
import { useFileUpload } from '@/hooks/useFileUpload';
import { useTaskPolling } from '@/hooks/useTaskPolling';
import { generateToolSchema } from '@/utils/seo';

export default function PdfToWord() {
  const { t } = useTranslation();
  const [phase, setPhase] = useState<'upload' | 'processing' | 'done'>('upload');

  const {
    file,
    uploadProgress,
    isUploading,
    taskId,
    error: uploadError,
    selectFile,
    startUpload,
    reset,
  } = useFileUpload({
    endpoint: '/convert/pdf-to-word',
    maxSizeMB: 20,
    acceptedTypes: ['pdf'],
  });

  const { status, result, error: taskError } = useTaskPolling({
    taskId,
    onComplete: () => setPhase('done'),
    onError: () => setPhase('done'),
  });

  const handleUpload = async () => {
    const id = await startUpload();
    if (id) setPhase('processing');
  };

  const handleReset = () => {
    reset();
    setPhase('upload');
  };

  const schema = generateToolSchema({
    name: t('tools.pdfToWord.title'),
    description: t('tools.pdfToWord.description'),
    url: `${window.location.origin}/tools/pdf-to-word`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.pdfToWord.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.pdfToWord.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/pdf-to-word`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        {/* Tool Header */}
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-red-100">
            <FileText className="h-8 w-8 text-red-600" />
          </div>
          <h1 className="section-heading">{t('tools.pdfToWord.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.pdfToWord.description')}</p>
        </div>

        {/* Ad Slot - Top */}
        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {/* Upload Phase */}
        {phase === 'upload' && (
          <div className="space-y-4">
            <FileUploader
              onFileSelect={selectFile}
              file={file}
              accept={{ 'application/pdf': ['.pdf'] }}
              maxSizeMB={20}
              isUploading={isUploading}
              uploadProgress={uploadProgress}
              error={uploadError}
              onReset={handleReset}
              acceptLabel="PDF (.pdf)"
            />
            {file && !isUploading && (
              <button onClick={handleUpload} className="btn-primary w-full">
                {t('tools.pdfToWord.shortDesc')}
              </button>
            )}
          </div>
        )}

        {/* Processing Phase */}
        {phase === 'processing' && !result && (
          <ProgressBar
            state={status?.state || 'PENDING'}
            message={status?.progress}
          />
        )}

        {/* Done Phase */}
        {phase === 'done' && result && result.status === 'completed' && (
          <DownloadButton result={result} onStartOver={handleReset} />
        )}

        {/* Error */}
        {phase === 'done' && taskError && (
          <div className="space-y-4">
            <div className="rounded-xl bg-red-50 p-4 ring-1 ring-red-200">
              <p className="text-sm text-red-700">{taskError}</p>
            </div>
            <button onClick={handleReset} className="btn-secondary w-full">
              {t('common.startOver')}
            </button>
          </div>
        )}

        {/* Ad Slot - Bottom */}
        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
146
frontend/src/components/tools/TextCleaner.tsx
Normal file
@@ -0,0 +1,146 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { Eraser, Copy, Check } from 'lucide-react';
import AdSlot from '@/components/layout/AdSlot';
import { removeExtraSpaces, convertCase, removeDiacritics } from '@/utils/textTools';
import { generateToolSchema } from '@/utils/seo';

export default function TextCleaner() {
  const { t } = useTranslation();
  const [input, setInput] = useState('');
  const [output, setOutput] = useState('');
  const [copied, setCopied] = useState(false);

  const applyTransform = (type: string) => {
    let result = input;
    switch (type) {
      case 'removeSpaces':
        result = removeExtraSpaces(input);
        break;
      case 'upper':
        result = convertCase(input, 'upper');
        break;
      case 'lower':
        result = convertCase(input, 'lower');
        break;
      case 'title':
        result = convertCase(input, 'title');
        break;
      case 'sentence':
        result = convertCase(input, 'sentence');
        break;
      case 'removeDiacritics':
        result = removeDiacritics(input);
        break;
      default:
        break;
    }
    setOutput(result);
    setCopied(false);
  };

  const copyToClipboard = async () => {
    try {
      await navigator.clipboard.writeText(output || input);
      setCopied(true);
      setTimeout(() => setCopied(false), 2000);
    } catch {
      // Clipboard API not available
    }
  };

  const buttons = [
    { key: 'removeSpaces', label: t('tools.textCleaner.removeSpaces'), color: 'bg-blue-600 hover:bg-blue-700' },
    { key: 'upper', label: t('tools.textCleaner.toUpperCase'), color: 'bg-purple-600 hover:bg-purple-700' },
    { key: 'lower', label: t('tools.textCleaner.toLowerCase'), color: 'bg-emerald-600 hover:bg-emerald-700' },
    { key: 'title', label: t('tools.textCleaner.toTitleCase'), color: 'bg-orange-600 hover:bg-orange-700' },
    { key: 'sentence', label: t('tools.textCleaner.toSentenceCase'), color: 'bg-rose-600 hover:bg-rose-700' },
    { key: 'removeDiacritics', label: t('tools.textCleaner.removeDiacritics'), color: 'bg-amber-600 hover:bg-amber-700' },
  ];

  const schema = generateToolSchema({
    name: t('tools.textCleaner.title'),
    description: t('tools.textCleaner.description'),
    url: `${window.location.origin}/tools/text-cleaner`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.textCleaner.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.textCleaner.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/text-cleaner`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-indigo-100">
            <Eraser className="h-8 w-8 text-indigo-600" />
          </div>
          <h1 className="section-heading">{t('tools.textCleaner.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.textCleaner.description')}</p>
        </div>

        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {/* Input */}
        <textarea
          value={input}
          onChange={(e) => {
            setInput(e.target.value);
            setCopied(false);
          }}
          placeholder={t('tools.wordCounter.placeholder')}
          className="input-field mb-4 min-h-[150px] resize-y text-sm"
          dir="auto"
        />

        {/* Transform Buttons */}
        <div className="mb-4 flex flex-wrap gap-2">
          {buttons.map((btn) => (
            <button
              key={btn.key}
              onClick={() => applyTransform(btn.key)}
              disabled={!input.trim()}
              className={`rounded-lg px-4 py-2 text-xs font-medium text-white transition-colors disabled:opacity-40 ${btn.color}`}
            >
              {btn.label}
            </button>
          ))}
        </div>

        {/* Output */}
        {output && (
          <div className="relative">
            <textarea
              value={output}
              readOnly
              className="input-field min-h-[150px] resize-y bg-emerald-50 text-sm"
              dir="auto"
            />
            <button
              onClick={copyToClipboard}
              className="absolute right-3 top-3 flex items-center gap-1 rounded-lg bg-white px-3 py-1.5 text-xs font-medium text-slate-600 shadow-sm ring-1 ring-slate-200 transition-colors hover:bg-slate-50"
            >
              {copied ? (
                <>
                  <Check className="h-3.5 w-3.5 text-emerald-600" />
                  Copied!
                </>
              ) : (
                <>
                  <Copy className="h-3.5 w-3.5" />
                  {t('tools.textCleaner.copyResult')}
                </>
              )}
            </button>
          </div>
        )}

        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
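TextCleaner imports `removeExtraSpaces`, `convertCase`, and `removeDiacritics` from `@/utils/textTools`, a file not shown in this hunk. A minimal sketch of what those helpers could look like — the bodies are assumptions for illustration (in the real module they would be exported), only the names come from the imports above:

```typescript
// Hypothetical sketches of the '@/utils/textTools' helpers used by TextCleaner.
// Only the function names are taken from the component's imports.

function removeExtraSpaces(text: string): string {
  // Collapse runs of spaces/tabs within each line and trim line edges
  return text
    .split('\n')
    .map((line) => line.replace(/[ \t]+/g, ' ').trim())
    .join('\n');
}

function removeDiacritics(text: string): string {
  // Decompose accented characters (NFD), then strip combining marks
  return text.normalize('NFD').replace(/[\u0300-\u036f]/g, '');
}

type CaseMode = 'upper' | 'lower' | 'title' | 'sentence';

function convertCase(text: string, mode: CaseMode): string {
  switch (mode) {
    case 'upper':
      return text.toUpperCase();
    case 'lower':
      return text.toLowerCase();
    case 'title':
      // Capitalize the first letter of each whitespace-separated word
      return text.replace(
        /\w\S*/g,
        (w) => w[0].toUpperCase() + w.slice(1).toLowerCase(),
      );
    case 'sentence':
      // Lowercase everything, then capitalize sentence starts
      return text
        .toLowerCase()
        .replace(/(^\s*\w|[.!?]\s+\w)/g, (m) => m.toUpperCase());
  }
}
```

Note that `dir="auto"` on the textareas suggests RTL text is expected; `removeDiacritics` as sketched only strips Latin combining marks, so Arabic tashkeel would need a wider Unicode range.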
192
frontend/src/components/tools/VideoToGif.tsx
Normal file
@@ -0,0 +1,192 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { Film } from 'lucide-react';
import FileUploader from '@/components/shared/FileUploader';
import ProgressBar from '@/components/shared/ProgressBar';
import DownloadButton from '@/components/shared/DownloadButton';
import AdSlot from '@/components/layout/AdSlot';
import { useFileUpload } from '@/hooks/useFileUpload';
import { useTaskPolling } from '@/hooks/useTaskPolling';
import { generateToolSchema } from '@/utils/seo';

export default function VideoToGif() {
  const { t } = useTranslation();
  const [phase, setPhase] = useState<'upload' | 'processing' | 'done'>('upload');
  const [startTime, setStartTime] = useState(0);
  const [duration, setDuration] = useState(5);
  const [fps, setFps] = useState(10);
  const [width, setWidth] = useState(480);

  const {
    file,
    uploadProgress,
    isUploading,
    taskId,
    error: uploadError,
    selectFile,
    startUpload,
    reset,
  } = useFileUpload({
    endpoint: '/video/to-gif',
    maxSizeMB: 50,
    acceptedTypes: ['mp4', 'webm'],
    extraData: {
      start_time: startTime.toString(),
      duration: duration.toString(),
      fps: fps.toString(),
      width: width.toString(),
    },
  });

  const { status, result, error: taskError } = useTaskPolling({
    taskId,
    onComplete: () => setPhase('done'),
    onError: () => setPhase('done'),
  });

  const handleUpload = async () => {
    const id = await startUpload();
    if (id) setPhase('processing');
  };

  const handleReset = () => {
    reset();
    setPhase('upload');
  };

  const schema = generateToolSchema({
    name: t('tools.videoToGif.title'),
    description: t('tools.videoToGif.description'),
    url: `${window.location.origin}/tools/video-to-gif`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.videoToGif.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.videoToGif.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/video-to-gif`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-emerald-100">
            <Film className="h-8 w-8 text-emerald-600" />
          </div>
          <h1 className="section-heading">{t('tools.videoToGif.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.videoToGif.description')}</p>
        </div>

        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {phase === 'upload' && (
          <div className="space-y-4">
            <FileUploader
              onFileSelect={selectFile}
              file={file}
              accept={{
                'video/mp4': ['.mp4'],
                'video/webm': ['.webm'],
              }}
              maxSizeMB={50}
              isUploading={isUploading}
              uploadProgress={uploadProgress}
              error={uploadError}
              onReset={handleReset}
              acceptLabel="Video (MP4, WebM) — max 50MB"
            />

            {file && !isUploading && (
              <>
                {/* GIF Options */}
                <div className="grid grid-cols-2 gap-4">
                  <div>
                    <label className="mb-1 block text-sm font-medium text-slate-700">
                      {t('tools.videoToGif.startTime')}
                    </label>
                    <input
                      type="number"
                      min="0"
                      step="0.5"
                      value={startTime}
                      onChange={(e) => setStartTime(Number(e.target.value))}
                      className="input-field"
                    />
                  </div>
                  <div>
                    <label className="mb-1 block text-sm font-medium text-slate-700">
                      {t('tools.videoToGif.duration')}
                    </label>
                    <input
                      type="number"
                      min="0.5"
                      max="15"
                      step="0.5"
                      value={duration}
                      onChange={(e) => setDuration(Number(e.target.value))}
                      className="input-field"
                    />
                  </div>
                  <div>
                    <label className="mb-1 block text-sm font-medium text-slate-700">
                      {t('tools.videoToGif.fps')}
                    </label>
                    <input
                      type="number"
                      min="1"
                      max="20"
                      value={fps}
                      onChange={(e) => setFps(Number(e.target.value))}
                      className="input-field"
                    />
                  </div>
                  <div>
                    <label className="mb-1 block text-sm font-medium text-slate-700">
                      {t('tools.videoToGif.width')}
                    </label>
                    <input
                      type="number"
                      min="100"
                      max="640"
                      step="10"
                      value={width}
                      onChange={(e) => setWidth(Number(e.target.value))}
                      className="input-field"
                    />
                  </div>
                </div>

                <button onClick={handleUpload} className="btn-primary w-full">
                  {t('tools.videoToGif.shortDesc')}
                </button>
              </>
            )}
          </div>
        )}

        {phase === 'processing' && !result && (
          <ProgressBar state={status?.state || 'PENDING'} message={status?.progress} />
        )}

        {phase === 'done' && result && result.status === 'completed' && (
          <DownloadButton result={result} onStartOver={handleReset} />
        )}

        {phase === 'done' && taskError && (
          <div className="space-y-4">
            <div className="rounded-xl bg-red-50 p-4 ring-1 ring-red-200">
              <p className="text-sm text-red-700">{taskError}</p>
            </div>
            <button onClick={handleReset} className="btn-secondary w-full">
              {t('common.startOver')}
            </button>
          </div>
        )}

        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
81
frontend/src/components/tools/WordCounter.tsx
Normal file
@@ -0,0 +1,81 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { Hash } from 'lucide-react';
import AdSlot from '@/components/layout/AdSlot';
import { countText, type TextStats } from '@/utils/textTools';
import { generateToolSchema } from '@/utils/seo';

export default function WordCounter() {
  const { t } = useTranslation();
  const [text, setText] = useState('');

  const stats: TextStats = countText(text);

  const statItems = [
    { label: t('tools.wordCounter.words'), value: stats.words, color: 'bg-blue-50 text-blue-700' },
    { label: t('tools.wordCounter.characters'), value: stats.characters, color: 'bg-purple-50 text-purple-700' },
    { label: t('tools.wordCounter.sentences'), value: stats.sentences, color: 'bg-emerald-50 text-emerald-700' },
    { label: t('tools.wordCounter.paragraphs'), value: stats.paragraphs, color: 'bg-orange-50 text-orange-700' },
  ];

  const schema = generateToolSchema({
    name: t('tools.wordCounter.title'),
    description: t('tools.wordCounter.description'),
    url: `${window.location.origin}/tools/word-counter`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.wordCounter.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.wordCounter.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/word-counter`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-blue-100">
            <Hash className="h-8 w-8 text-blue-600" />
          </div>
          <h1 className="section-heading">{t('tools.wordCounter.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.wordCounter.description')}</p>
        </div>

        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {/* Stats Grid */}
        <div className="mb-4 grid grid-cols-2 gap-3 sm:grid-cols-4">
          {statItems.map((item) => (
            <div
              key={item.label}
              className={`rounded-xl p-4 text-center ${item.color}`}
            >
              <p className="text-2xl font-bold">{item.value}</p>
              <p className="text-xs font-medium opacity-80">{item.label}</p>
            </div>
          ))}
        </div>

        {/* Reading Time */}
        {stats.words > 0 && (
          <p className="mb-4 text-center text-sm text-slate-500">
            📖 Reading time: {stats.readingTime}
          </p>
        )}

        {/* Text Input */}
        <textarea
          value={text}
          onChange={(e) => setText(e.target.value)}
          placeholder={t('tools.wordCounter.placeholder')}
          className="input-field min-h-[300px] resize-y font-mono text-sm"
          dir="auto"
        />

        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
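WordCounter.tsx imports `countText` from `@/utils/textTools`, whose implementation is not part of this hunk. A minimal sketch of what such a counter might compute, assuming the `TextStats` shape used above (the field names come from the component; the splitting rules and the ~200 words-per-minute reading speed are assumptions, not the committed implementation):

```typescript
// Hypothetical sketch of countText from '@/utils/textTools'; field names
// follow the TextStats usage in WordCounter.tsx.
interface TextStats {
  words: number;
  characters: number;
  sentences: number;
  paragraphs: number;
  readingTime: string;
}

function countText(text: string): TextStats {
  const trimmed = text.trim();
  // Words: runs of non-whitespace.
  const words = trimmed ? trimmed.split(/\s+/).length : 0;
  // Sentences: terminal punctuation (including Arabic '؟') followed by
  // whitespace or end of input.
  const sentences = (trimmed.match(/[.!?؟]+(?=\s|$)/g) || []).length;
  // Paragraphs: blocks separated by blank lines.
  const paragraphs = trimmed ? trimmed.split(/\n\s*\n/).length : 0;
  // Assumed average reading speed of ~200 words per minute.
  const minutes = Math.max(1, Math.ceil(words / 200));
  return {
    words,
    characters: text.length,
    sentences,
    paragraphs,
    readingTime: `${minutes} min`,
  };
}
```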
121
frontend/src/components/tools/WordToPdf.tsx
Normal file
@@ -0,0 +1,121 @@
import { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { FileOutput } from 'lucide-react';
import FileUploader from '@/components/shared/FileUploader';
import ProgressBar from '@/components/shared/ProgressBar';
import DownloadButton from '@/components/shared/DownloadButton';
import AdSlot from '@/components/layout/AdSlot';
import { useFileUpload } from '@/hooks/useFileUpload';
import { useTaskPolling } from '@/hooks/useTaskPolling';
import { generateToolSchema } from '@/utils/seo';

export default function WordToPdf() {
  const { t } = useTranslation();
  const [phase, setPhase] = useState<'upload' | 'processing' | 'done'>('upload');

  const {
    file,
    uploadProgress,
    isUploading,
    taskId,
    error: uploadError,
    selectFile,
    startUpload,
    reset,
  } = useFileUpload({
    endpoint: '/convert/word-to-pdf',
    maxSizeMB: 15,
    acceptedTypes: ['doc', 'docx'],
  });

  const { status, result, error: taskError } = useTaskPolling({
    taskId,
    onComplete: () => setPhase('done'),
    onError: () => setPhase('done'),
  });

  const handleUpload = async () => {
    const id = await startUpload();
    if (id) setPhase('processing');
  };

  const handleReset = () => {
    reset();
    setPhase('upload');
  };

  const schema = generateToolSchema({
    name: t('tools.wordToPdf.title'),
    description: t('tools.wordToPdf.description'),
    url: `${window.location.origin}/tools/word-to-pdf`,
  });

  return (
    <>
      <Helmet>
        <title>{t('tools.wordToPdf.title')} — {t('common.appName')}</title>
        <meta name="description" content={t('tools.wordToPdf.description')} />
        <link rel="canonical" href={`${window.location.origin}/tools/word-to-pdf`} />
        <script type="application/ld+json">{JSON.stringify(schema)}</script>
      </Helmet>

      <div className="mx-auto max-w-2xl">
        <div className="mb-8 text-center">
          <div className="mx-auto mb-4 flex h-16 w-16 items-center justify-center rounded-2xl bg-blue-100">
            <FileOutput className="h-8 w-8 text-blue-600" />
          </div>
          <h1 className="section-heading">{t('tools.wordToPdf.title')}</h1>
          <p className="mt-2 text-slate-500">{t('tools.wordToPdf.description')}</p>
        </div>

        <AdSlot slot="top-banner" format="horizontal" className="mb-6" />

        {phase === 'upload' && (
          <div className="space-y-4">
            <FileUploader
              onFileSelect={selectFile}
              file={file}
              accept={{
                'application/msword': ['.doc'],
                'application/vnd.openxmlformats-officedocument.wordprocessingml.document': ['.docx'],
              }}
              maxSizeMB={15}
              isUploading={isUploading}
              uploadProgress={uploadProgress}
              error={uploadError}
              onReset={handleReset}
              acceptLabel="Word (.doc, .docx)"
            />
            {file && !isUploading && (
              <button onClick={handleUpload} className="btn-primary w-full">
                {t('tools.wordToPdf.shortDesc')}
              </button>
            )}
          </div>
        )}

        {phase === 'processing' && !result && (
          <ProgressBar state={status?.state || 'PENDING'} message={status?.progress} />
        )}

        {phase === 'done' && result && result.status === 'completed' && (
          <DownloadButton result={result} onStartOver={handleReset} />
        )}

        {phase === 'done' && taskError && (
          <div className="space-y-4">
            <div className="rounded-xl bg-red-50 p-4 ring-1 ring-red-200">
              <p className="text-sm text-red-700">{taskError}</p>
            </div>
            <button onClick={handleReset} className="btn-secondary w-full">
              {t('common.startOver')}
            </button>
          </div>
        )}

        <AdSlot slot="bottom-banner" className="mt-8" />
      </div>
    </>
  );
}
20
frontend/src/hooks/useDirection.ts
Normal file
@@ -0,0 +1,20 @@
import { useEffect } from 'react';
import { useTranslation } from 'react-i18next';

/**
 * Hook that manages the HTML dir attribute based on current language.
 */
export function useDirection() {
  const { i18n } = useTranslation();
  const isRTL = i18n.language === 'ar';

  useEffect(() => {
    const dir = isRTL ? 'rtl' : 'ltr';
    const lang = i18n.language;

    document.documentElement.setAttribute('dir', dir);
    document.documentElement.setAttribute('lang', lang);
  }, [i18n.language, isRTL]);

  return { isRTL, language: i18n.language };
}
110
frontend/src/hooks/useFileUpload.ts
Normal file
@@ -0,0 +1,110 @@
import { useState, useCallback, useRef } from 'react';
import { uploadFile, type TaskResponse } from '@/services/api';

interface UseFileUploadOptions {
  endpoint: string;
  maxSizeMB?: number;
  acceptedTypes?: string[];
  extraData?: Record<string, string>;
}

interface UseFileUploadReturn {
  file: File | null;
  uploadProgress: number;
  isUploading: boolean;
  taskId: string | null;
  error: string | null;
  selectFile: (file: File) => void;
  startUpload: () => Promise<string | null>;
  reset: () => void;
}

export function useFileUpload({
  endpoint,
  maxSizeMB = 20,
  acceptedTypes,
  extraData,
}: UseFileUploadOptions): UseFileUploadReturn {
  const [file, setFile] = useState<File | null>(null);
  const [uploadProgress, setUploadProgress] = useState(0);
  const [isUploading, setIsUploading] = useState(false);
  const [taskId, setTaskId] = useState<string | null>(null);
  const [error, setError] = useState<string | null>(null);
  const extraDataRef = useRef(extraData);
  extraDataRef.current = extraData;

  const selectFile = useCallback(
    (selectedFile: File) => {
      setError(null);
      setTaskId(null);
      setUploadProgress(0);

      // Client-side size check
      const maxBytes = maxSizeMB * 1024 * 1024;
      if (selectedFile.size > maxBytes) {
        setError(`File too large. Maximum size is ${maxSizeMB}MB.`);
        return;
      }

      // Client-side type check
      if (acceptedTypes && acceptedTypes.length > 0) {
        const ext = selectedFile.name.split('.').pop()?.toLowerCase();
        if (!ext || !acceptedTypes.includes(ext)) {
          setError(`Invalid file type. Accepted: ${acceptedTypes.join(', ')}`);
          return;
        }
      }

      setFile(selectedFile);
    },
    [maxSizeMB, acceptedTypes]
  );

  const startUpload = useCallback(async (): Promise<string | null> => {
    if (!file) {
      setError('No file selected.');
      return null;
    }

    setIsUploading(true);
    setError(null);
    setUploadProgress(0);

    try {
      const response: TaskResponse = await uploadFile(
        endpoint,
        file,
        extraDataRef.current,
        (percent) => setUploadProgress(percent)
      );

      setTaskId(response.task_id);
      setIsUploading(false);
      return response.task_id;
    } catch (err) {
      const message = err instanceof Error ? err.message : 'Upload failed.';
      setError(message);
      setIsUploading(false);
      return null;
    }
  }, [file, endpoint]);

  const reset = useCallback(() => {
    setFile(null);
    setUploadProgress(0);
    setIsUploading(false);
    setTaskId(null);
    setError(null);
  }, []);

  return {
    file,
    uploadProgress,
    isUploading,
    taskId,
    error,
    selectFile,
    startUpload,
    reset,
  };
}
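The size and extension checks inside `selectFile` are independent of React state, so they can be read as a pure function. A sketch of that logic (`validateFile` is a hypothetical name for illustration, not part of the committed hook; it mirrors the checks above and returns the error message, or `null` when the file passes):

```typescript
// Hypothetical pure-function form of useFileUpload's client-side checks.
function validateFile(
  name: string,
  sizeBytes: number,
  maxSizeMB: number,
  acceptedTypes?: string[]
): string | null {
  // Size check, as in selectFile.
  const maxBytes = maxSizeMB * 1024 * 1024;
  if (sizeBytes > maxBytes) {
    return `File too large. Maximum size is ${maxSizeMB}MB.`;
  }
  // Extension check: lowercase the last dot-separated segment.
  if (acceptedTypes && acceptedTypes.length > 0) {
    const ext = name.split('.').pop()?.toLowerCase();
    if (!ext || !acceptedTypes.includes(ext)) {
      return `Invalid file type. Accepted: ${acceptedTypes.join(', ')}`;
    }
  }
  return null; // null = file accepted
}
```

Note that, like the hook, this trusts the filename extension; the server still has to verify the actual content type.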
87
frontend/src/hooks/useTaskPolling.ts
Normal file
@@ -0,0 +1,87 @@
import { useState, useEffect, useCallback, useRef } from 'react';
import { getTaskStatus, type TaskStatus, type TaskResult } from '@/services/api';

interface UseTaskPollingOptions {
  taskId: string | null;
  intervalMs?: number;
  onComplete?: (result: TaskResult) => void;
  onError?: (error: string) => void;
}

interface UseTaskPollingReturn {
  status: TaskStatus | null;
  isPolling: boolean;
  result: TaskResult | null;
  error: string | null;
  stopPolling: () => void;
}

export function useTaskPolling({
  taskId,
  intervalMs = 1500,
  onComplete,
  onError,
}: UseTaskPollingOptions): UseTaskPollingReturn {
  const [status, setStatus] = useState<TaskStatus | null>(null);
  const [isPolling, setIsPolling] = useState(false);
  const [result, setResult] = useState<TaskResult | null>(null);
  const [error, setError] = useState<string | null>(null);
  const intervalRef = useRef<ReturnType<typeof setInterval> | null>(null);

  const stopPolling = useCallback(() => {
    if (intervalRef.current) {
      clearInterval(intervalRef.current);
      intervalRef.current = null;
    }
    setIsPolling(false);
  }, []);

  useEffect(() => {
    if (!taskId) return;

    setIsPolling(true);
    setResult(null);
    setError(null);

    const poll = async () => {
      try {
        const taskStatus = await getTaskStatus(taskId);
        setStatus(taskStatus);

        if (taskStatus.state === 'SUCCESS') {
          stopPolling();
          const taskResult = taskStatus.result;

          if (taskResult?.status === 'completed') {
            setResult(taskResult);
            onComplete?.(taskResult);
          } else {
            const errMsg = taskResult?.error || 'Processing failed.';
            setError(errMsg);
            onError?.(errMsg);
          }
        } else if (taskStatus.state === 'FAILURE') {
          stopPolling();
          const errMsg = taskStatus.error || 'Task failed.';
          setError(errMsg);
          onError?.(errMsg);
        }
      } catch (err) {
        stopPolling();
        const errMsg = err instanceof Error ? err.message : 'Polling failed.';
        setError(errMsg);
        onError?.(errMsg);
      }
    };

    // Poll immediately, then set interval
    poll();
    intervalRef.current = setInterval(poll, intervalMs);

    return () => {
      stopPolling();
    };
  }, [taskId, intervalMs]); // eslint-disable-line react-hooks/exhaustive-deps

  return { status, isPolling, result, error, stopPolling };
}
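Stripped of React state, the hook above is a poll-until-terminal loop: fetch status, stop on `SUCCESS` or `FAILURE`, otherwise wait `intervalMs` and try again. A minimal non-React sketch of that control flow (`pollUntilDone` and the trimmed-down `PollStatus` type are hypothetical names for illustration, not the committed API; the real hook uses `setInterval` plus a cleanup callback rather than an awaited loop):

```typescript
// Hypothetical sketch of the polling loop in useTaskPolling, without React.
type PollState = 'PENDING' | 'PROGRESS' | 'SUCCESS' | 'FAILURE';

interface PollStatus {
  state: PollState;
  result?: { status: string; error?: string };
  error?: string;
}

async function pollUntilDone(
  getStatus: () => Promise<PollStatus>,
  intervalMs = 1500
): Promise<PollStatus> {
  for (;;) {
    const status = await getStatus();
    // Terminal states end the loop; the caller inspects result/error.
    if (status.state === 'SUCCESS' || status.state === 'FAILURE') {
      return status;
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

The `setInterval` form in the hook trades this sequential loop for cancellability from the outside (`stopPolling`, effect cleanup), at the cost of possibly overlapping requests if one poll takes longer than `intervalMs`.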
96
frontend/src/i18n/ar.json
Normal file
@@ -0,0 +1,96 @@
{
  "common": {
    "appName": "SaaS-PDF",
    "tagline": "أدوات ملفات مجانية على الإنترنت",
    "upload": "رفع ملف",
    "download": "تحميل",
    "processing": "جاري المعالجة...",
    "dragDrop": "اسحب الملف وأفلته هنا، أو اضغط للاختيار",
    "maxSize": "الحد الأقصى لحجم الملف: {{size}} ميجابايت",
    "tryOtherTools": "جرب أدوات أخرى",
    "error": "خطأ",
    "success": "تم بنجاح",
    "loading": "جاري التحميل...",
    "startOver": "ابدأ من جديد",
    "home": "الرئيسية",
    "about": "عن الموقع",
    "privacy": "سياسة الخصوصية",
    "terms": "شروط الاستخدام",
    "language": "اللغة",
    "allTools": "كل الأدوات"
  },
  "home": {
    "hero": "حوّل ملفاتك فوراً",
    "heroSub": "أدوات مجانية لمعالجة ملفات PDF والصور والفيديو والنصوص. بدون تسجيل.",
    "popularTools": "الأدوات الشائعة",
    "pdfTools": "أدوات PDF",
    "imageTools": "أدوات الصور",
    "videoTools": "أدوات الفيديو",
    "textTools": "أدوات النصوص"
  },
  "tools": {
    "pdfToWord": {
      "title": "PDF إلى Word",
      "description": "حوّل ملفات PDF إلى مستندات Word قابلة للتعديل مجاناً.",
      "shortDesc": "PDF → Word"
    },
    "wordToPdf": {
      "title": "Word إلى PDF",
      "description": "حوّل مستندات Word (DOC, DOCX) إلى صيغة PDF مجاناً.",
      "shortDesc": "Word → PDF"
    },
    "compressPdf": {
      "title": "ضغط PDF",
      "description": "قلّل حجم ملف PDF مع الحفاظ على الجودة. اختر مستوى الضغط.",
      "shortDesc": "ضغط PDF",
      "qualityLow": "أقصى ضغط",
      "qualityMedium": "متوازن",
      "qualityHigh": "جودة عالية"
    },
    "imageConvert": {
      "title": "محوّل الصور",
      "description": "حوّل الصور بين صيغ JPG و PNG و WebP فوراً.",
      "shortDesc": "تحويل الصور"
    },
    "videoToGif": {
      "title": "فيديو إلى GIF",
      "description": "أنشئ صور GIF متحركة من مقاطع الفيديو. خصّص وقت البداية والمدة والجودة.",
      "shortDesc": "فيديو → GIF",
      "startTime": "وقت البداية (ثوانٍ)",
      "duration": "المدة (ثوانٍ)",
      "fps": "إطارات في الثانية",
      "width": "العرض (بكسل)"
    },
    "wordCounter": {
      "title": "عدّاد الكلمات",
      "description": "عُد الكلمات والحروف والجمل والفقرات في نصك فوراً.",
      "shortDesc": "عد الكلمات",
      "words": "كلمات",
      "characters": "حروف",
      "sentences": "جمل",
      "paragraphs": "فقرات",
      "placeholder": "اكتب أو الصق نصك هنا..."
    },
    "textCleaner": {
      "title": "منظّف النصوص",
      "description": "أزل المسافات الزائدة، حوّل حالة الحروف، ونظّف نصك فوراً.",
      "shortDesc": "تنظيف النص",
      "removeSpaces": "إزالة المسافات الزائدة",
      "toUpperCase": "أحرف كبيرة",
      "toLowerCase": "أحرف صغيرة",
      "toTitleCase": "حروف العنوان",
      "toSentenceCase": "حالة الجملة",
      "removeDiacritics": "إزالة التشكيل العربي",
      "copyResult": "نسخ النتيجة"
    }
  },
  "result": {
    "conversionComplete": "اكتمل التحويل!",
    "compressionComplete": "اكتمل الضغط!",
    "originalSize": "الحجم الأصلي",
    "newSize": "الحجم الجديد",
    "reduction": "نسبة التقليل",
    "downloadReady": "ملفك جاهز للتحميل.",
    "linkExpiry": "رابط التحميل ينتهي خلال 30 دقيقة."
  }
}
96
frontend/src/i18n/en.json
Normal file
@@ -0,0 +1,96 @@
{
  "common": {
    "appName": "SaaS-PDF",
    "tagline": "Free Online File Tools",
    "upload": "Upload File",
    "download": "Download",
    "processing": "Processing...",
    "dragDrop": "Drag & drop your file here, or click to browse",
    "maxSize": "Maximum file size: {{size}}MB",
    "tryOtherTools": "Try Other Tools",
    "error": "Error",
    "success": "Success",
    "loading": "Loading...",
    "startOver": "Start Over",
    "home": "Home",
    "about": "About",
    "privacy": "Privacy Policy",
    "terms": "Terms of Service",
    "language": "Language",
    "allTools": "All Tools"
  },
  "home": {
    "hero": "Transform Your Files Instantly",
    "heroSub": "Free online tools for PDF, image, video, and text processing. No registration required.",
    "popularTools": "Popular Tools",
    "pdfTools": "PDF Tools",
    "imageTools": "Image Tools",
    "videoTools": "Video Tools",
    "textTools": "Text Tools"
  },
  "tools": {
    "pdfToWord": {
      "title": "PDF to Word",
      "description": "Convert PDF files to editable Word documents online for free.",
      "shortDesc": "PDF → Word"
    },
    "wordToPdf": {
      "title": "Word to PDF",
      "description": "Convert Word documents (DOC, DOCX) to PDF format online for free.",
      "shortDesc": "Word → PDF"
    },
    "compressPdf": {
      "title": "Compress PDF",
      "description": "Reduce PDF file size while maintaining quality. Choose your compression level.",
      "shortDesc": "Compress PDF",
      "qualityLow": "Maximum Compression",
      "qualityMedium": "Balanced",
      "qualityHigh": "High Quality"
    },
    "imageConvert": {
      "title": "Image Converter",
      "description": "Convert images between JPG, PNG, and WebP formats instantly.",
      "shortDesc": "Convert Images"
    },
    "videoToGif": {
      "title": "Video to GIF",
      "description": "Create animated GIFs from video clips. Customize start time, duration, and quality.",
      "shortDesc": "Video → GIF",
      "startTime": "Start Time (seconds)",
      "duration": "Duration (seconds)",
      "fps": "Frames Per Second",
      "width": "Width (pixels)"
    },
    "wordCounter": {
      "title": "Word Counter",
      "description": "Count words, characters, sentences, and paragraphs in your text instantly.",
      "shortDesc": "Count Words",
      "words": "Words",
      "characters": "Characters",
      "sentences": "Sentences",
      "paragraphs": "Paragraphs",
      "placeholder": "Type or paste your text here..."
    },
    "textCleaner": {
      "title": "Text Cleaner",
      "description": "Remove extra spaces, convert text case, and clean up your text instantly.",
      "shortDesc": "Clean Text",
      "removeSpaces": "Remove Extra Spaces",
      "toUpperCase": "UPPER CASE",
      "toLowerCase": "lower case",
      "toTitleCase": "Title Case",
      "toSentenceCase": "Sentence case",
      "removeDiacritics": "Remove Arabic Diacritics",
      "copyResult": "Copy Result"
    }
  },
  "result": {
    "conversionComplete": "Conversion Complete!",
    "compressionComplete": "Compression Complete!",
    "originalSize": "Original Size",
    "newSize": "New Size",
    "reduction": "Reduction",
    "downloadReady": "Your file is ready for download.",
    "linkExpiry": "Download link expires in 30 minutes."
  }
}
27
frontend/src/i18n/index.ts
Normal file
@@ -0,0 +1,27 @@
import i18n from 'i18next';
import { initReactI18next } from 'react-i18next';
import LanguageDetector from 'i18next-browser-languagedetector';

import en from './en.json';
import ar from './ar.json';

i18n
  .use(LanguageDetector)
  .use(initReactI18next)
  .init({
    resources: {
      en: { translation: en },
      ar: { translation: ar },
    },
    fallbackLng: 'en',
    supportedLngs: ['en', 'ar'],
    interpolation: {
      escapeValue: false,
    },
    detection: {
      order: ['querystring', 'cookie', 'localStorage', 'navigator'],
      caches: ['localStorage', 'cookie'],
    },
  });

export default i18n;
17
frontend/src/main.tsx
Normal file
@@ -0,0 +1,17 @@
import React from 'react';
import ReactDOM from 'react-dom/client';
import { BrowserRouter } from 'react-router-dom';
import { HelmetProvider } from 'react-helmet-async';
import App from './App';
import './i18n';
import './styles/global.css';

ReactDOM.createRoot(document.getElementById('root')!).render(
  <React.StrictMode>
    <HelmetProvider>
      <BrowserRouter>
        <App />
      </BrowserRouter>
    </HelmetProvider>
  </React.StrictMode>,
);
49
frontend/src/pages/AboutPage.tsx
Normal file
@@ -0,0 +1,49 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';

export default function AboutPage() {
  const { t } = useTranslation();

  return (
    <>
      <Helmet>
        <title>{t('common.about')} — {t('common.appName')}</title>
        <meta name="description" content="About our free online file conversion tools." />
      </Helmet>

      <div className="prose mx-auto max-w-2xl dark:prose-invert">
        <h1>{t('common.about')}</h1>

        <p>
          We provide free, fast, and secure online tools for converting, compressing,
          and processing files — PDFs, images, videos, and text.
        </p>

        <h2>Why use our tools?</h2>
        <ul>
          <li><strong>100% Free</strong> — No hidden charges, no sign-up required.</li>
          <li><strong>Private & Secure</strong> — Files are auto-deleted within 2 hours.</li>
          <li><strong>Fast Processing</strong> — Server-side processing for reliable results.</li>
          <li><strong>Works Everywhere</strong> — Desktop, tablet, or mobile.</li>
        </ul>

        <h2>Available Tools</h2>
        <ul>
          <li>PDF to Word Converter</li>
          <li>Word to PDF Converter</li>
          <li>PDF Compressor</li>
          <li>Image Format Converter</li>
          <li>Video to GIF Creator</li>
          <li>Word Counter</li>
          <li>Text Cleaner & Formatter</li>
        </ul>

        <h2>Contact</h2>
        <p>
          Have feedback or feature requests? Reach out at{' '}
          <a href="mailto:support@example.com">support@example.com</a>.
        </p>
      </div>
    </>
  );
}
93
frontend/src/pages/HomePage.tsx
Normal file
@@ -0,0 +1,93 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import {
  FileText,
  FileOutput,
  Minimize2,
  ImageIcon,
  Film,
  Hash,
  Eraser,
} from 'lucide-react';
import ToolCard from '@/components/shared/ToolCard';
import AdSlot from '@/components/layout/AdSlot';

interface ToolInfo {
  key: string;
  path: string;
  icon: React.ReactNode;
  bgColor: string;
}

const tools: ToolInfo[] = [
  { key: 'pdfToWord', path: '/tools/pdf-to-word', icon: <FileText className="h-6 w-6 text-red-600" />, bgColor: 'bg-red-50' },
  { key: 'wordToPdf', path: '/tools/word-to-pdf', icon: <FileOutput className="h-6 w-6 text-blue-600" />, bgColor: 'bg-blue-50' },
  { key: 'compressPdf', path: '/tools/compress-pdf', icon: <Minimize2 className="h-6 w-6 text-orange-600" />, bgColor: 'bg-orange-50' },
  { key: 'imageConvert', path: '/tools/image-converter', icon: <ImageIcon className="h-6 w-6 text-purple-600" />, bgColor: 'bg-purple-50' },
  { key: 'videoToGif', path: '/tools/video-to-gif', icon: <Film className="h-6 w-6 text-emerald-600" />, bgColor: 'bg-emerald-50' },
  { key: 'wordCounter', path: '/tools/word-counter', icon: <Hash className="h-6 w-6 text-blue-600" />, bgColor: 'bg-blue-50' },
  { key: 'textCleaner', path: '/tools/text-cleaner', icon: <Eraser className="h-6 w-6 text-indigo-600" />, bgColor: 'bg-indigo-50' },
];

export default function HomePage() {
  const { t } = useTranslation();

  return (
    <>
      <Helmet>
        <title>{t('common.appName')} — {t('home.heroSub')}</title>
        <meta name="description" content={t('home.heroSub')} />
        <link rel="canonical" href={window.location.origin} />
        <script type="application/ld+json">
          {JSON.stringify({
            '@context': 'https://schema.org',
            '@type': 'WebSite',
            name: t('common.appName'),
            url: window.location.origin,
            description: t('home.heroSub'),
            potentialAction: {
              '@type': 'SearchAction',
              target: `${window.location.origin}/tools/{search_term_string}`,
              'query-input': 'required name=search_term_string',
            },
          })}
        </script>
      </Helmet>

      {/* Hero Section */}
      <section className="py-12 text-center sm:py-16">
        <h1 className="text-4xl font-bold tracking-tight text-slate-900 sm:text-5xl">
          {t('home.hero')}
        </h1>
        <p className="mx-auto mt-4 max-w-xl text-lg text-slate-500">
          {t('home.heroSub')}
        </p>
      </section>

      {/* Ad Slot */}
      <AdSlot slot="home-top" format="horizontal" className="mb-8" />

      {/* Tools Grid */}
      <section>
        <h2 className="mb-6 text-center text-xl font-semibold text-slate-800">
          {t('home.popularTools')}
        </h2>
        <div className="grid gap-4 sm:grid-cols-2 lg:grid-cols-3">
          {tools.map((tool) => (
            <ToolCard
              key={tool.key}
              to={tool.path}
              icon={tool.icon}
              title={t(`tools.${tool.key}.title`)}
              description={t(`tools.${tool.key}.shortDesc`)}
              bgColor={tool.bgColor}
            />
          ))}
        </div>
      </section>

      {/* Ad Slot - Bottom */}
      <AdSlot slot="home-bottom" className="mt-12" />
    </>
  );
}
34
frontend/src/pages/NotFoundPage.tsx
Normal file
@@ -0,0 +1,34 @@
import { Link } from 'react-router-dom';
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';
import { Home } from 'lucide-react';

export default function NotFoundPage() {
  const { t } = useTranslation();

  return (
    <>
      <Helmet>
        <title>404 — {t('common.appName')}</title>
        <meta name="robots" content="noindex" />
      </Helmet>

      <div className="flex flex-col items-center justify-center py-24 text-center">
        <p className="text-7xl font-bold text-primary-600">404</p>
        <h1 className="mt-4 text-2xl font-semibold text-slate-900">
          Page Not Found
        </h1>
        <p className="mt-2 text-slate-500">
          The page you're looking for doesn't exist or has been moved.
        </p>
        <Link
          to="/"
          className="btn-primary mt-8 inline-flex items-center gap-2"
        >
          <Home className="h-4 w-4" />
          {t('common.home')}
        </Link>
      </div>
    </>
  );
}
59
frontend/src/pages/PrivacyPage.tsx
Normal file
@@ -0,0 +1,59 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';

export default function PrivacyPage() {
  const { t } = useTranslation();

  return (
    <>
      <Helmet>
        <title>{t('common.privacy')} — {t('common.appName')}</title>
        <meta name="description" content="Privacy policy for our online tools." />
      </Helmet>

      <div className="prose mx-auto max-w-2xl dark:prose-invert">
        <h1>{t('common.privacy')}</h1>
        <p><em>Last updated: {new Date().toISOString().split('T')[0]}</em></p>

        <h2>1. Data Collection</h2>
        <p>
          We only collect files you intentionally upload for processing. We do not
          require registration, and we do not store personal information.
        </p>

        <h2>2. File Processing & Storage</h2>
        <ul>
          <li>Uploaded files are processed on our secure servers.</li>
          <li>All uploaded and output files are <strong>automatically deleted within 2 hours</strong>.</li>
          <li>Files are stored in encrypted cloud storage during processing.</li>
          <li>We do not access, read, or share the content of your files.</li>
        </ul>

        <h2>3. Cookies & Analytics</h2>
        <p>
          We use essential cookies to remember your language preference. We may use
          Google Analytics and Google AdSense, which may place their own cookies.
          You can manage cookie preferences in your browser settings.
        </p>

        <h2>4. Third-Party Services</h2>
        <ul>
          <li><strong>Google AdSense</strong> — for displaying advertisements.</li>
          <li><strong>AWS S3</strong> — for temporary file storage.</li>
        </ul>

        <h2>5. Security</h2>
        <p>
          We employ industry-standard security measures including HTTPS encryption,
          file validation, rate limiting, and automatic file cleanup.
        </p>

        <h2>6. Contact</h2>
        <p>
          Questions about this policy? Contact us at{' '}
          <a href="mailto:support@example.com">support@example.com</a>.
        </p>
      </div>
    </>
  );
}
66
frontend/src/pages/TermsPage.tsx
Normal file
@@ -0,0 +1,66 @@
import { useTranslation } from 'react-i18next';
import { Helmet } from 'react-helmet-async';

export default function TermsPage() {
  const { t } = useTranslation();

  return (
    <>
      <Helmet>
        <title>{t('common.terms')} — {t('common.appName')}</title>
        <meta name="description" content="Terms of service for our online tools." />
      </Helmet>

      <div className="prose mx-auto max-w-2xl dark:prose-invert">
        <h1>{t('common.terms')}</h1>
        <p><em>Last updated: {new Date().toISOString().split('T')[0]}</em></p>

        <h2>1. Acceptance of Terms</h2>
        <p>
          By accessing and using SaaS-PDF, you agree to be bound by these Terms of
          Service. If you do not agree, please discontinue use immediately.
        </p>

        <h2>2. Service Description</h2>
        <p>
          SaaS-PDF provides free online tools for file conversion, compression,
          and transformation. The service is provided “as is” without
          warranties of any kind.
        </p>

        <h2>3. Acceptable Use</h2>
        <ul>
          <li>You may only upload files that you have the right to process.</li>
          <li>You must not upload malicious, illegal, or copyrighted content without authorization.</li>
          <li>Automated or excessive use of the service is prohibited.</li>
        </ul>

        <h2>4. File Handling</h2>
        <ul>
          <li>All uploaded and processed files are automatically deleted within 2 hours.</li>
          <li>We are not responsible for any data loss during processing.</li>
          <li>You are responsible for maintaining your own file backups.</li>
        </ul>

        <h2>5. Limitation of Liability</h2>
        <p>
          SaaS-PDF shall not be liable for any direct, indirect, incidental, or
          consequential damages resulting from the use or inability to use the
          service.
        </p>

        <h2>6. Changes to Terms</h2>
        <p>
          We reserve the right to modify these terms at any time. Continued use of
          the service after changes constitutes acceptance of the updated terms.
        </p>

        <h2>7. Contact</h2>
        <p>
          Questions about these terms? Contact us at{' '}
          <a href="mailto:support@example.com">support@example.com</a>.
        </p>
      </div>
    </>
  );
}
114
frontend/src/services/api.ts
Normal file
@@ -0,0 +1,114 @@
import axios from 'axios';

const api = axios.create({
  baseURL: '/api',
  timeout: 120000, // 2 minute timeout for file processing
  headers: {
    Accept: 'application/json',
  },
});

// Request interceptor (pass-through for now; a logging hook can be added here)
api.interceptors.request.use(
  (config) => config,
  (error) => Promise.reject(error)
);

// Response interceptor for error handling
api.interceptors.response.use(
  (response) => response,
  (error) => {
    if (error.response) {
      const message = error.response.data?.error || 'An error occurred.';
      return Promise.reject(new Error(message));
    }
    if (error.request) {
      return Promise.reject(new Error('Network error. Please check your connection.'));
    }
    return Promise.reject(error);
  }
);

// --- API Functions ---

export interface TaskResponse {
  task_id: string;
  message: string;
}

export interface TaskStatus {
  task_id: string;
  state: 'PENDING' | 'PROCESSING' | 'SUCCESS' | 'FAILURE';
  progress?: string;
  result?: TaskResult;
  error?: string;
}

export interface TaskResult {
  status: 'completed' | 'failed';
  download_url?: string;
  filename?: string;
  error?: string;
  original_size?: number;
  compressed_size?: number;
  reduction_percent?: number;
  width?: number;
  height?: number;
  output_size?: number;
  duration?: number;
  fps?: number;
  format?: string;
}

/**
 * Upload a file and start a processing task.
 */
export async function uploadFile(
  endpoint: string,
  file: File,
  extraData?: Record<string, string>,
  onProgress?: (percent: number) => void
): Promise<TaskResponse> {
  const formData = new FormData();
  formData.append('file', file);

  if (extraData) {
    Object.entries(extraData).forEach(([key, value]) => {
      formData.append(key, value);
    });
  }

  const response = await api.post<TaskResponse>(endpoint, formData, {
    headers: { 'Content-Type': 'multipart/form-data' },
    onUploadProgress: (event) => {
      if (event.total && onProgress) {
        const percent = Math.round((event.loaded / event.total) * 100);
        onProgress(percent);
      }
    },
  });

  return response.data;
}

/**
 * Poll task status.
 */
export async function getTaskStatus(taskId: string): Promise<TaskStatus> {
  const response = await api.get<TaskStatus>(`/tasks/${taskId}/status`);
  return response.data;
}

/**
 * Check API health.
 */
export async function checkHealth(): Promise<boolean> {
  try {
    const response = await api.get('/health');
    return response.data.status === 'healthy';
  } catch {
    return false;
  }
}

export default api;
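A typical consumer of this service uploads a file, then polls `getTaskStatus` until the task settles in `SUCCESS` or `FAILURE`. The sketch below is an assumed helper, not part of the API above: it factors the polling loop into a function that takes a `fetchStatus` callback, so the loop can be exercised with a stub and, in the app, wired up as `waitForTask(() => getTaskStatus(taskId))`.

```typescript
// Minimal status shape, matching the state field of TaskStatus above.
interface PollStatus {
  state: 'PENDING' | 'PROCESSING' | 'SUCCESS' | 'FAILURE';
  error?: string;
}

/**
 * Hypothetical polling helper: call fetchStatus until the task reaches a
 * terminal state, sleeping intervalMs between attempts.
 */
async function waitForTask(
  fetchStatus: () => Promise<PollStatus>,
  intervalMs = 1000,
  maxAttempts = 120
): Promise<PollStatus> {
  for (let i = 0; i < maxAttempts; i++) {
    const status = await fetchStatus();
    if (status.state === 'SUCCESS' || status.state === 'FAILURE') {
      return status;
    }
    // Wait before the next poll.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Task polling timed out');
}
```

Injecting `fetchStatus` keeps the loop independent of axios, which also makes the 2-hour file-expiry window easy to respect by bounding `maxAttempts`.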
90
frontend/src/styles/global.css
Normal file
@@ -0,0 +1,90 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

@layer base {
  :root {
    --color-bg: #ffffff;
    --color-surface: #f8fafc;
    --color-text: #0f172a;
    --color-text-secondary: #64748b;
    --color-border: #e2e8f0;
  }

  html {
    scroll-behavior: smooth;
  }

  body {
    @apply bg-white text-slate-900 antialiased;
    font-family: 'Inter', 'Tajawal', system-ui, sans-serif;
  }

  /* RTL Support */
  [dir="rtl"] body {
    font-family: 'Tajawal', 'Inter', system-ui, sans-serif;
  }

  [dir="rtl"] .ltr-only {
    direction: ltr;
  }
}

@layer components {
  .btn-primary {
    @apply inline-flex items-center justify-center gap-2 rounded-xl bg-primary-600 px-6 py-3 text-sm font-semibold text-white shadow-sm transition-all hover:bg-primary-700 focus-visible:outline focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:outline-primary-600 disabled:opacity-50 disabled:cursor-not-allowed;
  }

  .btn-secondary {
    @apply inline-flex items-center justify-center gap-2 rounded-xl bg-white px-6 py-3 text-sm font-semibold text-slate-900 shadow-sm ring-1 ring-inset ring-slate-300 transition-all hover:bg-slate-50 disabled:opacity-50 disabled:cursor-not-allowed;
  }

  .btn-success {
    @apply inline-flex items-center justify-center gap-2 rounded-xl bg-emerald-600 px-6 py-3 text-sm font-semibold text-white shadow-sm transition-all hover:bg-emerald-700 disabled:opacity-50 disabled:cursor-not-allowed;
  }

  .card {
    @apply rounded-2xl bg-white p-6 shadow-sm ring-1 ring-slate-200 transition-shadow hover:shadow-md;
  }

  .tool-card {
    @apply card cursor-pointer hover:ring-primary-300 hover:shadow-lg transition-all duration-200;
  }

  .input-field {
    @apply block w-full rounded-xl border-0 py-3 px-4 text-slate-900 shadow-sm ring-1 ring-inset ring-slate-300 placeholder:text-slate-400 focus:ring-2 focus:ring-inset focus:ring-primary-600 sm:text-sm sm:leading-6;
  }

  .section-heading {
    @apply text-2xl font-bold tracking-tight text-slate-900 sm:text-3xl;
  }
}

/* Upload zone styles */
.upload-zone {
  @apply flex flex-col items-center justify-center rounded-2xl border-2 border-dashed border-slate-300 bg-slate-50 p-8 text-center transition-colors cursor-pointer;
}

.upload-zone:hover,
.upload-zone.drag-active {
  @apply border-primary-400 bg-primary-50;
}

.upload-zone.drag-active {
  @apply ring-2 ring-primary-300;
}

/* Progress bar animation */
@keyframes progress-pulse {
  0%, 100% { opacity: 1; }
  50% { opacity: 0.6; }
}

.progress-bar-animated {
  animation: progress-pulse 1.5s ease-in-out infinite;
}

/* Ad slot container */
.ad-slot {
  @apply flex items-center justify-center bg-slate-50 rounded-xl border border-slate-200 min-h-[90px] overflow-hidden;
}
69
frontend/src/utils/seo.ts
Normal file
@@ -0,0 +1,69 @@
/**
 * SEO utility functions for structured data generation.
 */

export interface ToolSeoData {
  name: string;
  description: string;
  url: string;
  category?: string;
}

/**
 * Generate WebApplication JSON-LD structured data for a tool page.
 */
export function generateToolSchema(tool: ToolSeoData): object {
  return {
    '@context': 'https://schema.org',
    '@type': 'WebApplication',
    name: tool.name,
    url: tool.url,
    applicationCategory: tool.category || 'UtilitiesApplication',
    operatingSystem: 'Any',
    offers: {
      '@type': 'Offer',
      price: '0',
      priceCurrency: 'USD',
    },
    description: tool.description,
    inLanguage: ['en', 'ar'],
  };
}

/**
 * Generate BreadcrumbList JSON-LD.
 */
export function generateBreadcrumbs(
  items: { name: string; url: string }[]
): object {
  return {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: items.map((item, index) => ({
      '@type': 'ListItem',
      position: index + 1,
      name: item.name,
      item: item.url,
    })),
  };
}

/**
 * Generate FAQ structured data.
 */
export function generateFAQ(
  questions: { question: string; answer: string }[]
): object {
  return {
    '@context': 'https://schema.org',
    '@type': 'FAQPage',
    mainEntity: questions.map((q) => ({
      '@type': 'Question',
      name: q.question,
      acceptedAnswer: {
        '@type': 'Answer',
        text: q.answer,
      },
    })),
  };
}
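Using these helpers means serializing the returned object into a JSON-LD `<script type="application/ld+json">` tag (e.g. via Helmet). The sketch below re-states `generateBreadcrumbs` so the output shape can be checked standalone; the example URLs are illustrative.

```typescript
// Re-stated from generateBreadcrumbs above so this sketch runs standalone.
function breadcrumbSchema(items: { name: string; url: string }[]): object {
  return {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: items.map((item, index) => ({
      '@type': 'ListItem',
      position: index + 1, // BreadcrumbList positions are 1-based
      name: item.name,
      item: item.url,
    })),
  };
}

// Serialize for injection into a JSON-LD <script> tag.
const jsonLd = JSON.stringify(
  breadcrumbSchema([
    { name: 'Home', url: 'https://example.com/' },
    { name: 'Compress PDF', url: 'https://example.com/tools/compress-pdf' },
  ])
);
```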
124
frontend/src/utils/textTools.ts
Normal file
@@ -0,0 +1,124 @@
/**
 * Client-side text processing utilities.
 * These run entirely in the browser — no API calls needed.
 */

export interface TextStats {
  words: number;
  characters: number;
  charactersNoSpaces: number;
  sentences: number;
  paragraphs: number;
  readingTime: string;
}

/**
 * Count words, characters, sentences, and paragraphs.
 * Supports both English and Arabic text.
 */
export function countText(text: string): TextStats {
  if (!text.trim()) {
    return {
      words: 0,
      characters: 0,
      charactersNoSpaces: 0,
      sentences: 0,
      paragraphs: 0,
      readingTime: '0 min',
    };
  }

  const characters = text.length;
  const charactersNoSpaces = text.replace(/\s/g, '').length;

  // Word count — split by whitespace, filter empty
  const words = text
    .trim()
    .split(/\s+/)
    .filter((w) => w.length > 0).length;

  // Sentence count — split by sentence-ending punctuation
  const sentences = text
    .split(/[.!?؟。]+/)
    .filter((s) => s.trim().length > 0).length;

  // Paragraph count — split by double newlines or single newlines
  const paragraphs = text
    .split(/\n\s*\n|\n/)
    .filter((p) => p.trim().length > 0).length;

  // Reading time — avg ~200 words/min for English, ~150 for Arabic,
  // so use 180 as a middle ground for mixed content
  const avgWPM = 180;
  const minutes = Math.ceil(words / avgWPM);
  const readingTime = minutes <= 1 ? '< 1 min' : `${minutes} min`;

  return {
    words,
    characters,
    charactersNoSpaces,
    sentences,
    paragraphs,
    readingTime,
  };
}

/**
 * Remove extra whitespace (multiple spaces, tabs, etc.)
 */
export function removeExtraSpaces(text: string): string {
  return text
    .replace(/[^\S\n]+/g, ' ') // multiple spaces → single space
    .replace(/\n{3,}/g, '\n\n') // 3+ newlines → 2
    .trim();
}

/**
 * Convert text case.
 */
export function convertCase(
  text: string,
  type: 'upper' | 'lower' | 'title' | 'sentence'
): string {
  switch (type) {
    case 'upper':
      return text.toUpperCase();

    case 'lower':
      return text.toLowerCase();

    case 'title':
      return text.replace(
        /\w\S*/g,
        (txt) => txt.charAt(0).toUpperCase() + txt.substring(1).toLowerCase()
      );

    case 'sentence':
      return text
        .toLowerCase()
        .replace(/(^\s*\w|[.!?؟]\s*\w)/g, (match) => match.toUpperCase());

    default:
      return text;
  }
}

/**
 * Remove Arabic diacritics (tashkeel) from text.
 */
export function removeDiacritics(text: string): string {
  // Arabic diacritics Unicode range: \u064B-\u065F, \u0670
  return text.replace(/[\u064B-\u065F\u0670]/g, '');
}

/**
 * Format file size in human-readable form.
 */
export function formatFileSize(bytes: number): string {
  if (bytes === 0) return '0 B';

  const units = ['B', 'KB', 'MB', 'GB'];
  const k = 1024;
  const i = Math.floor(Math.log(bytes) / Math.log(k));

  return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${units[i]}`;
}
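`formatFileSize` is what turns the byte counts in `TaskResult` (`original_size`, `compressed_size`, and so on) into labels for the UI. A quick standalone check of its behavior, re-stating the function so the sketch runs on its own:

```typescript
// Re-stated from formatFileSize above so the sketch runs standalone.
function formatFileSize(bytes: number): string {
  if (bytes === 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB'];
  const k = 1024;
  // Pick the largest unit whose power of 1024 fits into the value.
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  return `${parseFloat((bytes / Math.pow(k, i)).toFixed(1))} ${units[i]}`;
}

const labels = [0, 512, 1536, 1048576, 2684354560].map(formatFileSize);
// e.g. a compressed_size of 1536 bytes renders as "1.5 KB"
```

Note the binary (1024-based) scale: a 1,000,000-byte file shows as "976.6 KB", not "1 MB".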
1
frontend/src/vite-env.d.ts
vendored
Normal file
@@ -0,0 +1 @@
/// <reference types="vite/client" />
44
frontend/tailwind.config.js
Normal file
@@ -0,0 +1,44 @@
/** @type {import('tailwindcss').Config} */
export default {
  content: [
    "./index.html",
    "./src/**/*.{js,ts,jsx,tsx}",
  ],
  darkMode: 'class',
  theme: {
    extend: {
      colors: {
        primary: {
          50: '#eff6ff',
          100: '#dbeafe',
          200: '#bfdbfe',
          300: '#93c5fd',
          400: '#60a5fa',
          500: '#3b82f6',
          600: '#2563eb',
          700: '#1d4ed8',
          800: '#1e40af',
          900: '#1e3a8a',
          950: '#172554',
        },
        accent: {
          50: '#fdf4ff',
          100: '#fae8ff',
          200: '#f5d0fe',
          300: '#f0abfc',
          400: '#e879f9',
          500: '#d946ef',
          600: '#c026d3',
          700: '#a21caf',
          800: '#86198f',
          900: '#701a75',
        },
      },
      fontFamily: {
        sans: ['Inter', 'Tajawal', 'system-ui', 'sans-serif'],
        arabic: ['Tajawal', 'Inter', 'sans-serif'],
      },
    },
  },
  plugins: [],
};
23
frontend/tsconfig.json
Normal file
@@ -0,0 +1,23 @@
{
  "compilerOptions": {
    "target": "ES2020",
    "useDefineForClassFields": true,
    "lib": ["ES2020", "DOM", "DOM.Iterable"],
    "module": "ESNext",
    "skipLibCheck": true,
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "isolatedModules": true,
    "moduleDetection": "force",
    "noEmit": true,
    "jsx": "react-jsx",
    "strict": true,
    "noUnusedLocals": false,
    "noUnusedParameters": false,
    "noFallthroughCasesInSwitch": true,
    "paths": {
      "@/*": ["./src/*"]
    }
  },
  "include": ["src"]
}
34
frontend/vite.config.ts
Normal file
@@ -0,0 +1,34 @@
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';
import path from 'path';

export default defineConfig({
  plugins: [react()],
  resolve: {
    alias: {
      '@': path.resolve(__dirname, './src'),
    },
  },
  server: {
    port: 5173,
    host: true,
    proxy: {
      '/api': {
        target: 'http://backend:5000',
        changeOrigin: true,
      },
    },
  },
  build: {
    outDir: 'dist',
    sourcemap: false,
    rollupOptions: {
      output: {
        manualChunks: {
          vendor: ['react', 'react-dom', 'react-router-dom'],
          i18n: ['i18next', 'react-i18next'],
        },
      },
    },
  },
});
48
nginx/nginx.conf
Normal file
@@ -0,0 +1,48 @@
upstream backend {
    server backend:5000;
}

upstream frontend {
    server frontend:5173;
}

server {
    listen 80;
    server_name localhost;
    client_max_body_size 100M;

    # Security headers
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-XSS-Protection "1; mode=block" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;

    # API requests → Flask backend
    location /api/ {
        proxy_pass http://backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Timeout for large file uploads
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;
        proxy_connect_timeout 60s;
    }

    # Frontend (Vite dev server in dev, static in prod)
    location / {
        proxy_pass http://frontend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    # Health check
    location /health {
        proxy_pass http://backend/api/health;
    }
}
65
nginx/nginx.prod.conf
Normal file
@@ -0,0 +1,65 @@
upstream backend {
    server backend:5000;
}

server {
    listen 80;
    server_name _;
    client_max_body_size 100M;

    # Redirect HTTP to HTTPS
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl http2;
    server_name _;
    client_max_body_size 100M;

    # SSL certificates (mount via certbot / Let's Encrypt)
    ssl_certificate /etc/nginx/ssl/fullchain.pem;
    ssl_certificate_key /etc/nginx/ssl/privkey.pem;
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_prefer_server_ciphers on;
    ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256;

    # Security headers
    add_header Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" always;
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-XSS-Protection "1; mode=block" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;

    # Gzip
    gzip on;
    gzip_types text/plain text/css application/json application/javascript text/xml application/xml text/javascript image/svg+xml;
    gzip_min_length 1000;

    # API requests → Flask
    location /api/ {
        proxy_pass http://backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;
    }

    # Frontend static files
    location / {
        root /usr/share/nginx/html;
        try_files $uri $uri/ /index.html;

        # Cache static assets
        location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
            expires 1y;
            add_header Cache-Control "public, immutable";
        }
    }

    # Health check
    location /health {
        proxy_pass http://backend/api/health;
    }
}
85
scripts/cleanup_expired_files.py
Normal file
@@ -0,0 +1,85 @@
#!/usr/bin/env python3
"""
cleanup_expired_files.py
Removes expired upload/output files older than FILE_EXPIRY_SECONDS.

Usage:
    python scripts/cleanup_expired_files.py            # Dry run
    python scripts/cleanup_expired_files.py --execute  # Actually delete
"""

import os
import sys
import time
import shutil
import argparse

# Default to FILE_EXPIRY_SECONDS from the environment, falling back to 2 hours
DEFAULT_EXPIRY_SECONDS = int(os.environ.get('FILE_EXPIRY_SECONDS', 7200))
UPLOAD_DIR = os.path.join(os.path.dirname(__file__), '..', 'backend', 'uploads')


def cleanup(upload_dir: str, expiry_seconds: int, dry_run: bool = True) -> dict:
    """Remove directories older than expiry_seconds."""
    now = time.time()
    stats = {'scanned': 0, 'deleted': 0, 'freed_bytes': 0, 'errors': 0}

    if not os.path.isdir(upload_dir):
        print(f"Upload directory does not exist: {upload_dir}")
        return stats

    for entry in os.listdir(upload_dir):
        full_path = os.path.join(upload_dir, entry)
        if not os.path.isdir(full_path):
            continue

        stats['scanned'] += 1
        mod_time = os.path.getmtime(full_path)
        age = now - mod_time

        if age > expiry_seconds:
            # Calculate size
            dir_size = sum(
                os.path.getsize(os.path.join(dp, f))
                for dp, _, filenames in os.walk(full_path)
                for f in filenames
            )

            if dry_run:
                print(f"[DRY RUN] Would delete: {entry} (age: {age:.0f}s, size: {dir_size / 1024:.1f} KB)")
            else:
                try:
                    shutil.rmtree(full_path)
                    print(f"Deleted: {entry} (age: {age:.0f}s, size: {dir_size / 1024:.1f} KB)")
                    stats['deleted'] += 1
                    stats['freed_bytes'] += dir_size
                except Exception as e:
                    print(f"Error deleting {entry}: {e}")
                    stats['errors'] += 1

    return stats


def main():
    parser = argparse.ArgumentParser(description='Cleanup expired upload files')
    parser.add_argument('--execute', action='store_true', help='Actually delete files (default is dry run)')
    parser.add_argument('--expiry', type=int, default=DEFAULT_EXPIRY_SECONDS, help='Expiry time in seconds')
    parser.add_argument('--dir', type=str, default=UPLOAD_DIR, help='Upload directory path')
    args = parser.parse_args()

    dry_run = not args.execute
    if dry_run:
        print("=== DRY RUN MODE (use --execute to delete) ===\n")

    stats = cleanup(args.dir, args.expiry, dry_run)

    print("\n--- Summary ---")
    print(f"Scanned: {stats['scanned']} directories")
    print(f"Deleted: {stats['deleted']} directories")
    print(f"Freed: {stats['freed_bytes'] / 1024 / 1024:.2f} MB")
    if stats['errors']:
        print(f"Errors: {stats['errors']}")


if __name__ == '__main__':
    main()
58
scripts/deploy.sh
Normal file
@@ -0,0 +1,58 @@
#!/bin/bash
# deploy.sh — Production deployment script for SaaS-PDF
set -euo pipefail

echo "========================================="
echo " SaaS-PDF Production Deployment"
echo "========================================="

# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m'

# Check Docker
if ! command -v docker &> /dev/null; then
    echo -e "${RED}Docker is not installed.${NC}"
    exit 1
fi
if ! command -v docker-compose &> /dev/null && ! docker compose version &> /dev/null; then
    echo -e "${RED}Docker Compose is not installed.${NC}"
    exit 1
fi

# Check .env
if [ ! -f ".env" ]; then
    echo -e "${RED}.env file not found! Copy .env.example and configure it.${NC}"
    exit 1
fi

echo -e "${YELLOW}1/5 — Pulling latest code...${NC}"
git pull origin main 2>/dev/null || echo "Not a git repo or no remote, skipping pull."

echo -e "${YELLOW}2/5 — Building Docker images...${NC}"
docker compose -f docker-compose.prod.yml build --no-cache

echo -e "${YELLOW}3/5 — Stopping old containers...${NC}"
docker compose -f docker-compose.prod.yml down --remove-orphans

echo -e "${YELLOW}4/5 — Starting services...${NC}"
docker compose -f docker-compose.prod.yml up -d

echo -e "${YELLOW}5/5 — Waiting for health check...${NC}"
sleep 10

# Health check
if curl -sf http://localhost/health > /dev/null 2>&1; then
    echo -e "${GREEN}✓ Deployment successful! Service is healthy.${NC}"
else
    echo -e "${RED}✗ Health check failed. Check logs:${NC}"
    echo "  docker compose -f docker-compose.prod.yml logs backend"
    exit 1
fi

echo ""
echo -e "${GREEN}Deployment complete!${NC}"
echo "  App:  http://localhost"
echo "  Logs: docker compose -f docker-compose.prod.yml logs -f"
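The fixed `sleep 10` before the health check can be flaky: a slow build may need longer, a fast one wastes time. A polling helper is a common alternative; the sketch below is illustrative and not part of the script (the function name and retry counts are assumptions):

```shell
# wait_for_health: retry a command until it succeeds or attempts run out.
# Usage: wait_for_health <max_attempts> <delay_seconds> <command...>
wait_for_health() {
    local max_attempts=$1 delay=$2
    shift 2
    local attempt=1
    while [ "$attempt" -le "$max_attempts" ]; do
        # Run the probe command, discarding its output.
        if "$@" > /dev/null 2>&1; then
            return 0
        fi
        attempt=$((attempt + 1))
        sleep "$delay"
    done
    return 1
}

# In deploy.sh this could replace the fixed sleep, e.g.:
#   wait_for_health 12 5 curl -sf http://localhost/health
```

With 12 attempts at 5-second intervals, the script would wait up to a minute instead of failing any deployment slower than 10 seconds.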
86
scripts/generate_sitemap.py
Normal file
86
scripts/generate_sitemap.py
Normal file
@@ -0,0 +1,86 @@
#!/usr/bin/env python3
"""
generate_sitemap.py
Generates sitemap.xml for SEO.

Usage:
    python scripts/generate_sitemap.py --domain https://yourdomain.com
"""

import argparse
from datetime import datetime


TOOLS = [
    '/tools/pdf-to-word',
    '/tools/word-to-pdf',
    '/tools/compress-pdf',
    '/tools/image-converter',
    '/tools/video-to-gif',
    '/tools/word-counter',
    '/tools/text-cleaner',
]

PAGES = [
    '/',
    '/about',
    '/privacy',
]


def generate_sitemap(domain: str) -> str:
    today = datetime.now().strftime('%Y-%m-%d')
    urls = []

    # Home page — highest priority
    urls.append(f'''  <url>
    <loc>{domain}/</loc>
    <lastmod>{today}</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>''')

    # Tool pages — high priority
    for tool in TOOLS:
        urls.append(f'''  <url>
    <loc>{domain}{tool}</loc>
    <lastmod>{today}</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.9</priority>
  </url>''')

    # Static pages — lower priority
    for page in PAGES[1:]:
        urls.append(f'''  <url>
    <loc>{domain}{page}</loc>
    <lastmod>{today}</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>''')

    sitemap = f'''<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{chr(10).join(urls)}
</urlset>'''

    return sitemap


def main():
    parser = argparse.ArgumentParser(description='Generate sitemap.xml')
    parser.add_argument('--domain', type=str, required=True, help='Site domain (e.g. https://yourdomain.com)')
    parser.add_argument('--output', type=str, default='frontend/public/sitemap.xml', help='Output file path')
    args = parser.parse_args()

    domain = args.domain.rstrip('/')
    sitemap = generate_sitemap(domain)

    with open(args.output, 'w', encoding='utf-8') as f:
        f.write(sitemap)

    print(f"Sitemap generated: {args.output}")
    print(f"URLs: {len(TOOLS) + len(PAGES)}")


if __name__ == '__main__':
    main()
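Since the sitemap XML is built by string formatting rather than an XML library, a quick sanity check that the output is well-formed and contains the expected `<loc>` entries can be useful. The snippet below is a hypothetical helper, not part of the repo; it parses sitemap text with the standard library and extracts the URLs:

```python
# Hypothetical sanity check for a generated sitemap: parse the XML
# and return every <loc> URL, respecting the sitemap namespace.
import xml.etree.ElementTree as ET

NS = {'sm': 'http://www.sitemaps.org/schemas/sitemap/0.9'}

def sitemap_urls(xml_text: str) -> list:
    # ET.fromstring raises ParseError on malformed XML, which doubles
    # as a well-formedness check for the string-built sitemap.
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall('sm:url/sm:loc', NS)]

sample = '''<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>'''

print(sitemap_urls(sample))  # ['https://example.com/']
```

Running this against the real `frontend/public/sitemap.xml` after generation would confirm the URL count matches the script's reported total.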