Complete API documentation for FluxUpload.
## FluxUpload

The main class that orchestrates file uploads.

### Constructor

```ts
new FluxUpload(options: FluxUploadOptions)
```
| Property | Type | Description |
|---|---|---|
| `storage` | `Plugin \| Plugin[]` | Required. Storage driver(s) |
| `validators` | `Plugin[]` | Validation plugins |
| `transformers` | `Plugin[]` | Transform plugins |
| `limits` | `Object` | `{ fileSize, files, fields, fieldSize }` |
| `onField` | `Function` | `(name, value) => void` |
| `onFile` | `Function` | `(fileInfo, stream) => void` |
| `onError` | `Function` | `(error) => void` |
| `onFinish` | `Function` | `(result) => void` |
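Putting the options together, a minimal sketch; the CommonJS `require` shape is an assumption based on the `require` and import examples later in this document:

```js
// Assumption: the class and plugins are exposed as in the TypeScript
// import example below; the exact require() shape may differ.
const FluxUpload = require('fluxupload');
const { LocalStorage, QuotaLimiter } = require('fluxupload');

const uploader = new FluxUpload({
  storage: new LocalStorage({ destination: './uploads' }),
  validators: [new QuotaLimiter({ maxFileSize: 10 * 1024 * 1024 })], // 10 MB cap
  limits: { files: 5, fields: 20 },
  onFile: (fileInfo, stream) => console.log('receiving', fileInfo),
  onError: (error) => console.error('upload failed:', error),
  onFinish: (result) => console.log('stored', result.files.length, 'file(s)')
});
```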
### handle(req)

Process an incoming HTTP request with `multipart/form-data`.

```js
const result = await uploader.handle(req);
// result: { fields: {}, files: [], errors: [] }
```
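In context, `handle()` consumes the raw request stream. A sketch assuming a plain Node `http` server and the `uploader` instance from the constructor example above:

```js
const http = require('http');

http.createServer(async (req, res) => {
  if (req.method === 'POST' && req.url === '/upload') {
    const result = await uploader.handle(req);
    res.writeHead(result.errors.length ? 400 : 200, {
      'Content-Type': 'application/json'
    });
    res.end(JSON.stringify(result));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);
```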
### parse(buffer, boundary)

Parse multipart data directly from a buffer.

```js
const result = await uploader.parse(buffer, boundary);
```
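The boundary normally comes from the request's `Content-Type` header. A hedged sketch of extracting it (assumes a well-formed header; a production parser should handle quoting and extra parameters more carefully):

```js
// e.g. Content-Type: multipart/form-data; boundary=----WebKitFormBoundaryABC
const contentType = req.headers['content-type'] || '';
const match = contentType.match(/boundary=(?:"([^"]+)"|([^;]+))/);
const boundary = match && (match[1] || match[2]);

const result = await uploader.parse(buffer, boundary);
```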
Shut down all plugins and clean up resources.
## QuotaLimiter

Enforce file size limits with immediate stream abort.

```js
new QuotaLimiter({
  maxFileSize: 50 * 1024 * 1024,   // 50 MB per file
  maxTotalSize: 200 * 1024 * 1024  // 200 MB total
})
```
## MagicByteDetector

Verify file types by checking binary signatures (magic bytes) rather than trusting the client-reported MIME type.

```js
new MagicByteDetector({
  allowed: ['image/*', 'application/pdf'],
  denied: ['application/x-executable']
})
```

Supported: JPEG, PNG, GIF, WebP, BMP, TIFF, PDF, ZIP, MP3, MP4, and more.
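To make "binary signatures" concrete: detection inspects the leading bytes of the payload instead of the declared type. A standalone illustration of the idea, not the plugin's internals:

```js
// Well-known signatures: PNG starts with 89 50 4E 47 0D 0A 1A 0A,
// JPEG with FF D8 FF, PDF with the ASCII bytes "%PDF".
const SIGNATURES = [
  { type: 'image/png', bytes: [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a] },
  { type: 'image/jpeg', bytes: [0xff, 0xd8, 0xff] },
  { type: 'application/pdf', bytes: [0x25, 0x50, 0x44, 0x46] } // "%PDF"
];

function sniff(buffer) {
  const hit = SIGNATURES.find(({ bytes }) => bytes.every((b, i) => buffer[i] === b));
  return hit ? hit.type : null;
}
```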
## ImageDimensionProbe

Validate image dimensions without a full decode.

```js
new ImageDimensionProbe({
  minWidth: 100,
  maxWidth: 4096,
  minHeight: 100,
  maxHeight: 4096
})
```
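"Without a full decode" works because common formats store dimensions at a fixed header offset. For PNG, width and height are big-endian 32-bit integers at byte offsets 16 and 20, inside the IHDR chunk. An illustrative sketch, not the plugin's implementation:

```js
// PNG layout: 8-byte signature, 4-byte chunk length, 4-byte "IHDR",
// then width and height as big-endian uint32 values.
function pngDimensions(buffer) {
  return {
    width: buffer.readUInt32BE(16),
    height: buffer.readUInt32BE(20)
  };
}
```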
## RateLimiter

Token-bucket rate limiting per IP.

```js
new RateLimiter({
  maxRequests: 100,
  windowMs: 60000, // 1 minute
  keyGenerator: (req) => req.ip
})
```
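For reference, the token-bucket idea behind the plugin: each key owns a bucket that refills at a steady rate, and each request spends one token. A standalone sketch, not the plugin's code:

```js
function makeBucket(capacity, refillPerMs) {
  let tokens = capacity;
  let last = Date.now();
  return {
    take() {
      const now = Date.now();
      tokens = Math.min(capacity, tokens + (now - last) * refillPerMs);
      last = now;
      if (tokens < 1) return false; // over the limit: reject
      tokens -= 1;
      return true;
    }
  };
}

// Roughly the config above: 100 requests per 60,000 ms window
const bucket = makeBucket(100, 100 / 60000);
```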
## CsrfProtection

CSRF token validation.

```js
new CsrfProtection({
  headerName: 'X-CSRF-Token',
  cookieName: 'csrf-token',
  tokenLifetime: 3600000 // 1 hour, in ms
})
```
## StreamHasher

Calculate file checksums during upload.

```js
new StreamHasher({
  algorithm: 'sha256', // sha256, sha512, md5
  encoding: 'hex'      // hex, base64, base64url
})
```

The hash is available at `result.files[n].metadata.hash`.
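To verify integrity end to end, the reported hash can be compared against one computed independently with Node's `crypto` module. A sketch assuming the `sha256`/`hex` configuration above; the local file path is hypothetical:

```js
const crypto = require('crypto');
const fs = require('fs');

const expected = result.files[0].metadata.hash;
const actual = crypto
  .createHash('sha256')
  .update(fs.readFileSync('./original.bin')) // hypothetical local copy
  .digest('hex');

console.log(expected === actual ? 'intact' : 'mismatch');
```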
## StreamCompressor

Compress files on the fly.

```js
new StreamCompressor({
  algorithm: 'gzip', // gzip, deflate, brotli
  level: 6,
  compressibleTypes: ['text/*', 'application/json', 'image/svg+xml']
})
```
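Files compressed with `gzip` can be read back using Node's `zlib` streams. A sketch; the stored file name is hypothetical, since whether the plugin changes the extension is not specified above:

```js
const fs = require('fs');
const zlib = require('zlib');

fs.createReadStream('./uploads/notes.txt.gz') // hypothetical stored name
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('./notes.txt'));
```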
## LocalStorage

Filesystem storage with atomic writes.

```js
new LocalStorage({
  destination: './uploads',
  naming: 'uuid', // uuid, original, timestamp, hash, slugify
  createDirectories: true,
  fileMode: 0o644,
  dirMode: 0o755
})
```
Methods:

- `delete(filename)` - Delete a file
- `exists(filename)` - Check if a file exists
- `getMetadata(filename)` - Get file metadata
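A sketch of the housekeeping methods above (that they return promises, and the metadata shape, are assumptions):

```js
const storage = new LocalStorage({ destination: './uploads' });

if (await storage.exists('report.pdf')) {
  const meta = await storage.getMetadata('report.pdf'); // assumed shape: size, mtime, ...
  console.log(meta);
  await storage.delete('report.pdf');
}
```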
## S3Storage

AWS S3 and S3-compatible storage (MinIO, DigitalOcean Spaces, etc.).

```js
new S3Storage({
  bucket: 'my-bucket',
  region: 'us-east-1',
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  sessionToken: process.env.AWS_SESSION_TOKEN, // optional
  endpoint: 'https://custom-endpoint.com', // optional, for S3-compatible providers
  prefix: 'uploads/',
  acl: 'private', // private, public-read
  storageClass: 'STANDARD',
  naming: 'uuid'
})
```
Methods:

- `getPresignedUrl(key, options)` - Generate a presigned upload URL
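A sketch of generating a presigned URL so a client can upload directly to the bucket; the `options` shape, including the expiry field name, is an assumption:

```js
const s3 = new S3Storage({ bucket: 'my-bucket', region: 'us-east-1' });

// Hand this URL to the client for a direct PUT to S3.
const url = await s3.getPresignedUrl('uploads/avatar.png', {
  expiresIn: 900 // seconds; hypothetical option name
});
```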
## Logging

```js
const { getLogger } = require('fluxupload');

const logger = getLogger({
  level: 'info',
  format: 'json'
});

logger.info('Message', { data });
```
## Metrics

```js
const { getCollector } = require('fluxupload');

const metrics = getCollector();

// Export in Prometheus exposition format
metrics.toPrometheus();
```
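A typical use is exposing the collector on a Prometheus scrape endpoint. A sketch assuming a plain Node `http` server:

```js
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/metrics') {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end(metrics.toPrometheus());
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(9100);
```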
## ProgressTracker

```js
const { ProgressTracker } = require('fluxupload');

const tracker = new ProgressTracker();
tracker.on('progress', (p) => {
  console.log(p.percentage);
});
```
## HealthCheck

```js
const { HealthCheck } = require('fluxupload');

const health = new HealthCheck();
await health.check();
```
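Wired into a liveness endpoint, a sketch that only distinguishes resolve from reject, since the resolved value's shape is not documented above:

```js
const http = require('http');

http.createServer(async (req, res) => {
  if (req.url === '/health') {
    try {
      await health.check();
      res.writeHead(200);
      res.end('ok');
    } catch (err) {
      res.writeHead(503);
      res.end('unhealthy');
    }
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```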
## TypeScript

FluxUpload includes full TypeScript definitions.

```ts
import FluxUpload, {
  LocalStorage,
  S3Storage,
  QuotaLimiter,
  MagicByteDetector,
  ImageDimensionProbe,
  StreamHasher,
  Plugin,
  UploadContext,
  UploadResult
} from 'fluxupload';

const uploader = new FluxUpload({
  storage: new LocalStorage({ destination: './uploads' })
});

const result: UploadResult = await uploader.handle(req);
```