
Conversation

@edmundmiller
Contributor

Summary

Create a new pulumi/seqera_platform directory with support for multiple independent Seqera Platform workspaces. This enables managing separate workspaces with different configurations while sharing common code.

Structure

pulumi/seqera_platform/
├── shared/                       # Shared Python modules
│   ├── providers/                # AWS, GitHub, Seqera providers
│   ├── infrastructure/           # S3, IAM, compute environments
│   ├── integrations/             # GitHub, workspace participants
│   ├── config/                   # Configuration management
│   └── utils/                    # Helper functions
├── awsmegatests/                 # AWS Megatests workspace
│   └── workspace_config.py       # CPU, GPU, ARM enabled
└── resource_optimization/        # Resource Optimization workspace
    └── workspace_config.py       # CPU only (for Florian)

Workspaces

AWS Megatests

  • Full infrastructure with CPU, GPU, and ARM compute environments
  • S3 bucket: nf-core-awsmegatests
  • GitHub integration enabled
  • Team participant management

Resource Optimization

  • CPU-only compute environment (no ARM, no GPU)
  • S3 bucket: nf-core-resource-optimization
  • Focused setup for resource profiling and optimization work
  • GitHub integration disabled by default

Key Features

  • Independent Workspaces: Each workspace is a separate Pulumi project with its own state
  • Shared Code: Common functionality in shared/ module to avoid duplication
  • Flexible Configuration: workspace_config.py controls which resources to deploy (see the sketch after this list)
  • Preserved Legacy: Original pulumi/AWSMegatests/ remains untouched
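
The exact schema of workspace_config.py is not reproduced in this summary. As a rough illustration only, a CPU-only configuration for the resource_optimization workspace might look like the sketch below; all field names are assumptions, not the actual interface from this PR.

# Hypothetical workspace_config.py for the resource_optimization workspace.
# Field names below are illustrative assumptions, not the schema from this PR.
from dataclasses import dataclass

@dataclass(frozen=True)
class WorkspaceConfig:
    workspace_name: str
    s3_bucket: str
    enable_cpu: bool = True
    enable_gpu: bool = False
    enable_arm: bool = False
    enable_github_integration: bool = False

CONFIG = WorkspaceConfig(
    workspace_name="resource_optimization",
    s3_bucket="nf-core-resource-optimization",  # bucket named in this PR
    enable_cpu=True,                             # CPU-only compute environment
    enable_gpu=False,                            # no GPU
    enable_arm=False,                            # no ARM
    enable_github_integration=False,             # disabled by default
)

Under the same assumed schema, the awsmegatests workspace would flip enable_gpu, enable_arm, and enable_github_integration to True.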

Next Steps

  • Test awsmegatests workspace in a dev stack
  • Test resource_optimization workspace
  • Validate ESC environment configuration
  • Compare outputs with legacy AWSMegatests
  • Create migration plan for deprecating legacy structure

Files Changed

  • 36 files changed
  • 3,258 insertions
  • Complete documentation for both workspaces

🤖 Generated with Claude Code

edmundmiller and others added 3 commits September 22, 2025 17:02
…ing errors

- Add multipart upload permissions to the S3 IAM policy for large files (>5 GB); see the policy sketch after this list
- Remove problematic closures from publishDir tags configuration
- Update policy version hash to trigger compute environment recreation
- Fixes workflow failures with S3 copy operations and casting errors
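
The policy diff itself is not included here. As a sketch of the kind of multipart-upload actions that copies of objects larger than 5 GB typically require, the added statement might look roughly like the following; the statement layout and resource scoping are assumptions, with the bucket name taken from the PR description.

# Sketch only: S3 actions that multipart copies of objects > 5 GB generally need.
# The actual policy in this repository may differ.
import json

multipart_statement = {
    "Effect": "Allow",
    "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts",
        "s3:ListBucketMultipartUploads",
    ],
    "Resource": [
        "arn:aws:s3:::nf-core-awsmegatests",
        "arn:aws:s3:::nf-core-awsmegatests/*",
    ],
}

policy_document = json.dumps({"Version": "2012-10-17", "Statement": [multipart_statement]})
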
🗑️ Removed Legacy Files:
- seqerakit CLI config files (*.yml) - replaced by Terraform provider approach
- Legacy GitHub workflow (deploy-seqerakit.yml) - marked as TODO/placeholder
- Development Pulumi config (Pulumi.dev.yaml)
- Empty tests directory with only .pyc files
- Cache directories (.pytest_cache, .mypy_cache, .ruff_cache, __pycache__)
- SDK build artifacts (build/, *.egg-info/)

✅ Kept Active Infrastructure:
- seqerakit/current-env-*.json - used by the Terraform provider (constants.py); see the loader sketch after this list
- seqerakit/configs/nextflow-*.config - used by compute environments
- Complete src/ modular structure - restored from parent commit
- SDK source code in sdks/seqera/pulumi_seqera/
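
How constants.py reads those current-env-*.json specs is not shown in this summary. A hypothetical loader, with paths and structure as assumptions, could look like:

# Hypothetical sketch of loading the kept current-env-*.json specs from
# constants.py; the real paths and JSON structure are assumptions.
import json
from pathlib import Path

SEQERAKIT_DIR = Path(__file__).resolve().parent / "seqerakit"

def load_current_envs() -> dict[str, dict]:
    """Return every current-env-*.json spec keyed by its file stem."""
    envs: dict[str, dict] = {}
    for spec_path in sorted(SEQERAKIT_DIR.glob("current-env-*.json")):
        with spec_path.open() as fh:
            envs[spec_path.stem] = json.load(fh)
    return envs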

🏗️ Architecture Migration:
The project has migrated from the seqerakit CLI approach to the native Terraform provider.
The .yml files were CLI wrappers around the .json specs and are now obsolete.

Result: Removed ~15 legacy files while preserving all active infrastructure.
Create new pulumi/seqera_platform directory with support for multiple
independent Seqera Platform workspaces. This enables managing separate
workspaces with different configurations while sharing common code.

Structure:
- shared/: Reusable modules for providers, infrastructure, and integrations
- awsmegatests/: Full workspace with CPU, GPU, and ARM compute environments
- resource_optimization/: Focused workspace with CPU-only for resource testing

Each workspace is an independent Pulumi project with workspace-specific
configuration defined in workspace_config.py. The shared modules are
imported via sys.path manipulation to avoid code duplication.
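
A minimal sketch of that import pattern, with module and function names as placeholders rather than the actual API, is:

# Top of a workspace entry point such as awsmegatests/__main__.py (sketch).
# Makes the sibling shared/ package importable without duplicating code;
# the imported names below are illustrative assumptions.
import sys
from pathlib import Path

# .../pulumi/seqera_platform/<workspace>/ -> add .../pulumi/seqera_platform/
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

from shared.config import load_workspace_config      # hypothetical helper
from shared.infrastructure import create_s3_bucket   # hypothetical helper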

🤖 Generated with Claude Code (https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
