Protect your research data with proper backups. This guide covers backup strategies, data export, and recovery procedures.
The fastest way to back up your data:
# Automatic backup with timestamp
npm run backup
# Manual copy
cp src/backend/data/tracker.db backups/tracker-$(date +%Y%m%d).db
Your data is backed up! The database file contains everything: projects, tasks, goals, tags, and settings.
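Before you rely on a backup, confirm it is a healthy SQLite file. A quick check (a sketch, assuming the dated filename from the command above):
# Verify the backup opens and passes SQLite's integrity check
sqlite3 backups/tracker-$(date +%Y%m%d).db "PRAGMA integrity_check;"
# Should output: ok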
All your research data is stored in a single SQLite database file:
Location: src/backend/data/tracker.db
Contents: projects, tasks, goals, tags, and settings
File Size: Typically 1-5 MB (even with hundreds of tasks)
Weekly Manual Backup:
# Create a backups folder
mkdir -p backups
# Copy database with date
cp src/backend/data/tracker.db backups/tracker-$(date +%Y%m%d).db
Pro: Simple, full control
Con: Easy to forget
Best For: Getting started, infrequent users
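If forgetting is the problem, a shell alias shrinks the manual copy to one short command. A sketch for your ~/.bashrc or ~/.zshrc; the project path is a placeholder:
# Single quotes defer $(date) until the alias actually runs
alias tracker-backup='cp /path/to/project_tracker/src/backend/data/tracker.db /path/to/project_tracker/backups/tracker-$(date +%Y%m%d).db'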
Setup Automatic Backups:
Create backup.sh in the project root:
#!/bin/bash
# Create backup directory if it doesn't exist
mkdir -p backups
# Backup with timestamp
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
cp src/backend/data/tracker.db "backups/tracker-${TIMESTAMP}.db"
# Keep only last 30 backups
ls -t backups/tracker-*.db | tail -n +31 | xargs -r rm
echo "Backup created: tracker-${TIMESTAMP}.db"
echo "Total backups: $(ls backups/tracker-*.db | wc -l)"
Make it executable:
chmod +x backup.sh
Run it:
./backup.sh
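One caveat: cp can capture a half-written file if the app is writing at that moment. SQLite's .backup dot-command takes a consistent snapshot even while the app runs; if you prefer, swap the cp line in backup.sh for this sketch:
# Online backup: safe even while the app is running
sqlite3 src/backend/data/tracker.db ".backup backups/tracker-${TIMESTAMP}.db"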
Automate Daily:
macOS/Linux (cron):
# Edit crontab
crontab -e
# Add daily backup at 11 PM
0 23 * * * cd /path/to/project_tracker && ./backup.sh
Windows (Task Scheduler):
Create a daily scheduled task that runs bash backup.sh (see the sketch after this list).
Pro: Automatic, version history
Con: Requires initial setup
Best For: Regular users, important research data
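One way to register the daily task from a terminal is schtasks. This is a sketch only: it assumes Git Bash at its default install path, and both paths need adjusting for your machine.
schtasks /Create /SC DAILY /ST 23:00 /TN "TrackerBackup" /TR "\"C:\Program Files\Git\bin\bash.exe\" C:\path\to\project_tracker\backup.sh"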
Store your database in a cloud-synced folder:
Dropbox/Google Drive/OneDrive:
Option A: Symlink the Database
# Move database to Dropbox
mv src/backend/data/tracker.db ~/Dropbox/ResearchTracker/tracker.db
# Create symlink
ln -s ~/Dropbox/ResearchTracker/tracker.db src/backend/data/tracker.db
Option B: Change Database Path
Create .env file:
DATABASE_PATH=/Users/yourname/Dropbox/ResearchTracker/tracker.db
Pro: Automatic sync, accessible from multiple devices
Con: Sync conflicts possible with simultaneous access
Best For: Users who switch between computers, want cloud backup
⚠️ Warning: Don't run the app on multiple devices simultaneously with cloud sync. SQLite doesn't support concurrent access across machines.
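A conflict-free alternative is to keep the live database out of the synced folder entirely and copy snapshots into it instead, e.g. as an extra line in backup.sh (a sketch; the Dropbox path is an example):
# Push a point-in-time copy to the synced folder; the live DB stays local
cp src/backend/data/tracker.db ~/Dropbox/ResearchTracker/tracker-snapshot.db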
Track your database with Git:
# Initialize git in data folder
cd src/backend/data
git init
# Add database
git add tracker.db
# Commit changes
git commit -m "Initial database state"
# Daily commits
git add tracker.db
git commit -m "Daily backup - $(date +%Y-%m-%d)"
# Push to private GitHub repo
git remote add origin git@github.com:yourusername/research-tracker-data.git
git push origin main
Pro: Full version history, can revert to any point
Con: Not ideal for binary files, requires Git knowledge
Best For: Power users, those already using Git
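The daily commit can be automated too. A sketch, assuming the data folder was initialized as above; paths are placeholders:
#!/bin/bash
# git-backup.sh - commit the database once a day
cd /path/to/project_tracker/src/backend/data || exit 1
git add tracker.db
# Commit only if the file actually changed
git diff --cached --quiet || git commit -m "Daily backup - $(date +%Y-%m-%d)"
git push origin main
Schedule it like backup.sh, e.g. 30 23 * * * /path/to/project_tracker/git-backup.sh in your crontab.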
Graduate Student (High Stakes): automated daily backups (cron) plus cloud sync, with a monthly restore test.
Researcher (Medium Stakes): automated daily backups via backup.sh.
Casual User (Low Stakes): a weekly manual copy is enough.
Always back up before risky events such as app updates or bulk edits:
# Pre-event backup
cp src/backend/data/tracker.db backups/tracker-before-update-$(date +%Y%m%d).db
Replace current database with backup:
# CAUTION: This deletes current data!
# Stop the app first
cp backups/tracker-20260107.db src/backend/data/tracker.db
# Restart app
npm run dev
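To make a restore reversible, set the current database aside instead of overwriting it. A sketch:
# Keep the current database, just in case the backup is worse
mv src/backend/data/tracker.db src/backend/data/tracker-replaced-$(date +%Y%m%d).db
cp backups/tracker-20260107.db src/backend/data/tracker.db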
Use SQLite to extract specific data:
# Install SQLite tools
# macOS: brew install sqlite
# Ubuntu: sudo apt-get install sqlite3
# Open backup
sqlite3 backups/tracker-20260107.db
# List tables
.tables
# Export specific project
.mode insert projects
.output restored-projects.sql
SELECT * FROM projects WHERE name LIKE '%Dissertation%';
.quit
# Import to current database
sqlite3 src/backend/data/tracker.db < restored-projects.sql
Export Projects:
sqlite3 src/backend/data/tracker.db -csv -header "SELECT * FROM projects" > projects.csv
Export Tasks:
sqlite3 src/backend/data/tracker.db -csv -header "SELECT * FROM tasks" > tasks.csv
Export Everything:
# Create exports folder
mkdir -p exports
# Export all tables
for table in $(sqlite3 src/backend/data/tracker.db "SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%'"); do
sqlite3 src/backend/data/tracker.db -csv -header "SELECT * FROM $table" > "exports/${table}.csv"
done
# Install jq if needed
# macOS: brew install jq
# Ubuntu: sudo apt-get install jq
# Export tasks to JSON (-json already emits a complete array; no manual brackets needed)
sqlite3 src/backend/data/tracker.db -json "SELECT * FROM tasks" > tasks.json
# Pretty-print with jq
jq . tasks.json > tasks-pretty.json
Export for Excel: open the CSV files above directly, or use Excel's Data > From Text/CSV import.
Export for Notion: use Notion's built-in CSV import.
Export for Trello: Trello has no direct CSV import; paste tasks into lists or use a third-party converter.
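Exports are easier to import if you keep only the columns the target tool needs. The column names below (title, status, due_date) are hypothetical; check yours with .schema tasks first. A sketch:
# Export a trimmed CSV for import elsewhere (column names are assumptions)
sqlite3 src/backend/data/tracker.db -csv -header "SELECT title, status, due_date FROM tasks" > tasks-import.csv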
# Full restore
cp backups/tracker-20260107.db src/backend/data/tracker.db
# Import tasks from CSV, skipping the header row (--skip needs sqlite3 3.32+)
sqlite3 src/backend/data/tracker.db
.mode csv
.import --skip 1 tasks.csv tasks
# Attach second database
sqlite3 src/backend/data/tracker.db
ATTACH 'backups/old-tracker.db' AS old;
# Copy specific projects
INSERT INTO projects SELECT * FROM old.projects WHERE id > 100;
# Copy related tasks
INSERT INTO tasks SELECT * FROM old.tasks WHERE project_id > 100;
DETACH old;
.quit
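If any ids already exist in the current database, the plain INSERT above aborts on a primary-key conflict. INSERT OR IGNORE skips duplicate rows instead; a sketch using the same example id ranges:
# Merge while silently skipping rows whose ids already exist
sqlite3 src/backend/data/tracker.db <<'SQL'
ATTACH 'backups/old-tracker.db' AS old;
INSERT OR IGNORE INTO projects SELECT * FROM old.projects WHERE id > 100;
INSERT OR IGNORE INTO tasks SELECT * FROM old.tasks WHERE project_id > 100;
DETACH old;
SQL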
Monthly Backup Test:
# Test restore
cp backups/tracker-latest.db /tmp/test-tracker.db
DATABASE_PATH=/tmp/test-tracker.db npm run dev
# Check in browser, then stop and clean up
rm /tmp/test-tracker.db
# Check database integrity
sqlite3 src/backend/data/tracker.db "PRAGMA integrity_check;"
# Should output: ok
Keep Last 30 Days:
# Remove backups older than 30 days
find backups/ -name "tracker-*.db" -mtime +30 -delete
Keep Weekly Backups:
# Keep one backup per week
# Archive the earliest matching backup to long-term storage (adjust the filename pattern)
mkdir -p archives
ls backups/tracker-2026* | head -n 1 | xargs -I {} cp {} archives/
# Compress backups older than 3 months
find backups/ -name "tracker-*.db" -mtime +90 -exec gzip {} \;
# Decompress if needed
gunzip backups/tracker-20251001.db.gz
# Check current database size
du -h src/backend/data/tracker.db
# Estimate yearly storage
# Typical: 2MB database × 365 daily backups = ~730MB/year
# With monthly cleanup: ~60MB/year
Cloud storage options: Dropbox, Google Drive, OneDrive, iCloud (macOS), a private GitHub repo, or Backblaze B2.
Example: Dropbox
# 1. Install Dropbox
# 2. Create folder
mkdir ~/Dropbox/ResearchTrackerBackups
# 3. Add to backup script
cp src/backend/data/tracker.db ~/Dropbox/ResearchTrackerBackups/tracker-$(date +%Y%m%d).db
If your computer dies: reinstall the app on a new machine, copy your latest backup to src/backend/data/tracker.db, then run npm install and npm run dev.
If your database becomes corrupted:
# Check for corruption (integrity_check reports problems; it does not repair them)
sqlite3 src/backend/data/tracker.db "PRAGMA integrity_check;"
# If the check fails, restore from backup
cp backups/tracker-latest.db src/backend/data/tracker.db
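If the check fails and no recent backup exists, sqlite3 (3.29+) ships a .recover command that salvages readable data into a fresh database. A sketch:
# Salvage what sqlite3 can read into a new file
sqlite3 src/backend/data/tracker.db ".recover" | sqlite3 recovered.db
# Inspect recovered.db before replacing the damaged file with it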
If you delete important data: restore a recent backup to a temporary path and pull out just the missing rows, following the selective-restore steps above.
Initial Setup (Do Once): create the backups/ folder, add backup.sh, and schedule it via cron or Task Scheduler.
Daily (Automated): the scheduled backup.sh runs on its own; no action needed.
Weekly (5 minutes): confirm fresh backup files are appearing in backups/.
Monthly (15 minutes): run the restore test and PRAGMA integrity_check; prune backups older than 30 days.
Quarterly (30 minutes): compress old backups and copy one to off-site or cloud storage.
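The monthly and quarterly steps can be bundled into one script built from the commands in this guide. A sketch; the retention windows are example values:
#!/bin/bash
# maintenance.sh - run monthly
# 1. Verify database integrity
sqlite3 src/backend/data/tracker.db "PRAGMA integrity_check;"
# 2. Compress backups older than 30 days
find backups/ -name "tracker-*.db" -mtime +30 -exec gzip {} \;
# 3. Delete compressed backups older than 180 days
find backups/ -name "tracker-*.db.gz" -mtime +180 -delete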
# Encrypt backup with GPG
gpg --symmetric --cipher-algo AES256 src/backend/data/tracker.db
# Creates: tracker.db.gpg
# Decrypt
gpg tracker.db.gpg
# Only back up if the database changed
if ! cmp -s src/backend/data/tracker.db backups/tracker-latest.db; then
  cp src/backend/data/tracker.db backups/tracker-$(date +%Y%m%d).db
  # Refresh the comparison copy so the next run diffs against the newest state
  cp src/backend/data/tracker.db backups/tracker-latest.db
fi
# Backup to remote server via SSH
scp src/backend/data/tracker.db user@server:/backups/tracker-$(date +%Y%m%d).db
# Or use rsync
rsync -avz src/backend/data/tracker.db user@server:/backups/
Remember: Your research data is valuable. Back it up regularly! 💾