# surfe-diem-api

A surf and weather forecasting data API built with FastAPI.
surfe-diem provides surf spot, buoy, tide, and weather data via a modern REST API. Designed for surf forecasting apps, dashboards, and enthusiasts.
## Project Structure

- `app/` - Main application code (models, routers, classes, etc.)
- `tests/` - Unit and integration tests
- `jobs/` - Shell scripts for setup and maintenance
- `data/` - Static data files (JSON, etc.)
- `tools/` - Utility scripts
## Getting Started

```bash
git clone https://github.com/crubio/surfe-diem-api.git
cd surfe-diem-api
pip3 install -r requirements.txt
```

Set the required environment variables (see the Environment Variables section below for details), then start the development server:

```bash
uvicorn app.main:app --host=0.0.0.0 --port=5000 --reload
```

Visit http://127.0.0.1:5000/api/v1 for interactive docs.
## API Endpoints

- `GET /` - Health check
- `GET /api/v1/spots` - Get surf spots
- `POST /api/v1/spots` - Create new surf spot (admin only)
- `GET /api/v1/locations` - Get buoy locations
- `GET /api/v1/forecast` - Get weather forecast
- `GET /api/v1/weather` - Get current weather
- `GET /api/v1/tides/find_closest` - Find nearest tide station
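For a quick sanity check from Python, the read endpoints can be exercised with `httpx` (already in requirements.txt). This is a minimal sketch assuming a local server on port 5000:

```python
import httpx

BASE_URL = "http://127.0.0.1:5000"

# Health check, then fetch the list of surf spots.
print(httpx.get(f"{BASE_URL}/").json())
print(httpx.get(f"{BASE_URL}/api/v1/spots").json())
```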
### Batch Forecast

- `POST /api/v1/batch-forecast` - Batch forecast for multiple locations
The batch forecast endpoint is designed for efficiency: instead of making multiple API calls from your frontend, send a list of buoy IDs and spot IDs in a single request and get current forecast data for all of them.
Request:

```json
{
  "buoy_ids": ["46042", "46232"],
  "spot_ids": [1, 2, 3]
}
```

Response:
```json
{
  "buoys": [
    {
      "id": "46042",
      "name": "Monterey Bay Buoy",
      "description": "NOAA Buoy",
      "location": "Monterey Bay, CA",
      "url": "https://www.ndbc.noaa.gov/station_page.php?station=46042",
      "latest_observation": {...},
      "weather_forecast": {...}
    }
  ],
  "spots": [
    {
      "id": 1,
      "name": "Steamer Lane",
      "timezone": "America/Los_Angeles",
      "latitude": 36.9519,
      "longitude": -122.0308,
      "subregion_name": "Santa Cruz",
      "weather_forecast": {...},
      "current_weather": {...}
    }
  ],
  "errors": [
    {
      "id": "invalid_id",
      "type": "buoy",
      "error": "Buoy location not found or inactive"
    }
  ]
}
```
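A client call against this endpoint might look like the following sketch, assuming a local server on port 5000 and reusing the sample IDs from above:

```python
import httpx

# Fetch forecasts for two buoys and three spots in one round trip.
payload = {"buoy_ids": ["46042", "46232"], "spot_ids": [1, 2, 3]}
response = httpx.post(
    "http://127.0.0.1:5000/api/v1/batch-forecast",
    json=payload,
    timeout=30.0,  # upstream weather/buoy lookups can be slow
)
response.raise_for_status()

data = response.json()
print(len(data["buoys"]), "buoys,", len(data["spots"]), "spots")
for err in data["errors"]:
    print(f"{err['type']} {err['id']}: {err['error']}")
```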
## Environment Variables

| Variable | Description | Example |
|---|---|---|
| DATABASE_HOSTNAME | Postgres host | localhost |
| DATABASE_PORT | Postgres port | 5432 |
| DATABASE_PASSWORD | Postgres password | yourpassword |
| DATABASE_NAME | Postgres DB name | surfe_diem_api |
| DATABASE_USERNAME | Postgres user | postgres |
| SECRET_KEY | JWT secret key | (generate your own) |
| ALGORITHM | JWT algorithm | HS256 |
| ACCESS_TOKEN_EXPIRE_MINUTES | JWT expiry in minutes | 60 |
| SQLITE_URI | SQLite DB URI | sqlite:///./surfe-diem-api.db |
| DATABASE_URL | Alternative DB URL | postgresql://user:pass@host/db |
| DATABASE_URI | Alternative DB URI | postgresql://user:pass@host/db |
| SQLITE_DB | SQLite database file path | ./surfe-diem-api.db |
| ENVIRONMENT | Environment name | development |
### PostgreSQL Setup

Create a database in PostgreSQL, then create a `.env` file with:

```
DATABASE_HOSTNAME=localhost
DATABASE_PORT=5432
DATABASE_PASSWORD=your_secure_password
DATABASE_NAME=your_database_name
DATABASE_USERNAME=your_username
SECRET_KEY=your_secure_secret_key_here
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=60
```

### SQLite Setup

Create a `.env` file with:

```
SECRET_KEY=your_secure_secret_key_here
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=60
DATABASE_URL=sqlite:///./surfe-diem-api.db
DATABASE_URI=sqlite:///./surfe-diem-api.db
SQLITE_URI=sqlite:///./surfe-diem-api.db
SQLITE_DB=./surfe-diem-api.db
ENVIRONMENT=development
```
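For illustration, these variables could be loaded into a typed settings object. This is a hypothetical sketch using pydantic-settings, not necessarily how this repo's config module is written:

```python
# Hypothetical config sketch; the actual app config may differ.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Env var names match field names case-insensitively, e.g. DATABASE_PORT.
    model_config = SettingsConfigDict(env_file=".env")

    database_hostname: str = "localhost"
    database_port: str = "5432"
    database_password: str = ""
    database_name: str = "surfe_diem_api"
    database_username: str = "postgres"
    secret_key: str  # required; no safe default for a JWT secret
    algorithm: str = "HS256"
    access_token_expire_minutes: int = 60
    environment: str = "development"

settings = Settings()
```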
## Deployment

The API can be deployed to any cloud platform that supports Python applications:

- Heroku: Uses the included Procfile
- Render: Supports automatic deployments from GitHub
- Railway: Compatible with Python web services
- Docker: Containerized deployment option
The API runs on port 5000 by default, but can be configured via the PORT environment variable.
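As an illustration, a launcher that honors the PORT variable might look like this sketch (the repo's actual entry point may differ):

```python
# run.py - hypothetical launcher; falls back to the default port 5000.
import os

import uvicorn

if __name__ == "__main__":
    port = int(os.environ.get("PORT", "5000"))
    uvicorn.run("app.main:app", host="0.0.0.0", port=port)
```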
### Docker

```bash
docker-compose up -d
```

This will start PostgreSQL and the API in containers. To build and run the image directly:

```bash
docker build -t surfe-diem-api .
docker run -p 5000:8000 surfe-diem-api
```

## Testing

The testing setup requires these additional packages (already in requirements.txt):
- `pytest` - Testing framework
- `httpx` - HTTP client for testing
- `pytest-asyncio` - Async test support
- `tests/test_basic.py` - Unit tests for core classes
- `tests/test_integration.py` - Integration tests for API endpoints
```bash
# Run all tests
python3 -m pytest

# Run specific test file
python3 -m pytest tests/test_integration.py

# Run tests with verbose output
python3 -m pytest -v

# Run tests matching a pattern
python3 -m pytest -k "batch_forecast"
```

- Unit Tests: Test individual functions/classes in isolation
- Integration Tests: Test API endpoints with the full application stack
Example integration test:
```python
def test_batch_forecast_endpoint_empty_request(client):
    """Test the batch forecast endpoint with empty request."""
    response = client.post("/api/v1/batch-forecast", json={
        "buoy_ids": [],
        "spot_ids": []
    })
    assert response.status_code == 200
    data = response.json()
    assert "buoys" in data
    assert "spots" in data
    assert "errors" in data
```
## Code Style

- Use type hints everywhere
- Follow PEP 8 guidelines
- Use dataclasses for data models
- Add docstrings to all functions
## Adding New Endpoints

- Create a new router in `app/routers/` (see the sketch after this list)
- Add Pydantic models for request/response validation
- Include the router in `app/main.py`
- Write integration tests
- Update this README
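To make the steps concrete, here is a hypothetical new router; the file name, prefix, and models are illustrative, not part of the existing codebase:

```python
# app/routers/conditions.py - hypothetical example router.
from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter(prefix="/api/v1/conditions", tags=["conditions"])

class ConditionsResponse(BaseModel):
    spot_id: int
    rating: str

@router.get("/{spot_id}", response_model=ConditionsResponse)
def get_conditions(spot_id: int) -> ConditionsResponse:
    """Return a placeholder conditions rating for a spot."""
    return ConditionsResponse(spot_id=spot_id, rating="fair")
```

It would then be registered in `app/main.py` with `app.include_router(conditions.router)`.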
## Database Migrations

```bash
# Create a new migration
alembic revision --autogenerate -m "description"

# Apply migrations
alembic upgrade head
```

The production database is backed up as `surfe-diem-api.db.backup`. To update the production database:
- Create a new backup of your local database
- Commit the backup file to GitHub
- Deploy the updated code to your hosting platform
## Contributing

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Run the test suite
- Submit a pull request
## Troubleshooting

**Database Connection Errors**
- Check your `.env` file configuration
- Ensure the database is running
- Verify connection credentials
**Import Errors**
- Make sure all dependencies are installed: `pip install -r requirements.txt`
- Check Python path and virtual environment
**Test Failures**
- Ensure the database is properly set up
- Check that external APIs are accessible
- Verify test data exists in the database
**Getting Help**

- Check the API documentation at `/api/v1`
- Review the test files for usage examples
- Open an issue with detailed error information
For questions or support, please open an issue on GitHub.
Built with ❤️ for the surfing community