
Commit f0634f3

docs: Update Sugar project URL and add swarm-external-secrets project (#227)
1 parent b7dca35 commit f0634f3

10 files changed (+28, -11 lines)


bkp/blogs/ibis-framework/index.ipynb

Lines changed: 3 additions & 3 deletions
@@ -84,10 +84,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"import matplotlib.pyplot as plt\n",
-"import pandas as pd\n",
 "import sqlite3\n",
-"import ibis"
+"\n",
+"import ibis\n",
+"import matplotlib.pyplot as plt"
 ]
 },
 {

pages/blog/console-based-representation-in-astx/index.ipynb

Lines changed: 3 additions & 3 deletions
@@ -74,9 +74,9 @@
 "outputs": [],
 "source": [
 "# import display for AST visualization\n",
-"from IPython.display import display\n",
-"from astx.viz import traverse_ast_ascii, graph_to_ascii\n",
-"import astx"
+"import astx\n",
+"\n",
+"from astx.viz import graph_to_ascii, traverse_ast_ascii"
 ]
 },
 {

pages/blog/console-based-representation-in-astx/index.md

Lines changed: 2 additions & 2 deletions
@@ -54,9 +54,9 @@ The first thing to do is, in your Jupyter Notebook instance, import `display`, w
 
 ```python
 # import display for AST visualization
-from IPython.display import display
-from astx.viz import traverse_ast_ascii, graph_to_ascii
 import astx
+
+from astx.viz import graph_to_ascii, traverse_ast_ascii
 ```
 
 Then we create an instance of the Module class, and this instance will be the first node of the tree, or the root node. After that, we declare the variables and literal that will be part of the basic operation that we will parse into an AST.

pages/blog/scaling-machine-learning-projects-with-dask/index.ipynb

Lines changed: 2 additions & 1 deletion
@@ -183,8 +183,8 @@
 "source": [
 "#Lazy Evalution with dask\n",
 "import dask.dataframe as dd\n",
-"import pandas as pd\n",
 "import numpy as np\n",
+"import pandas as pd\n",
 "\n",
 "# Creating a dummy dataset\n",
 "num_rows = 100 # Number of rows\n",
@@ -286,6 +286,7 @@
 "# Dynamic task scheduling with Dask\n",
 "import dask\n",
 "\n",
+"\n",
 "@dask.delayed\n",
 "def square(x):\n",
 " return x * x\n",

pages/blog/scaling-machine-learning-projects-with-dask/index.md

Lines changed: 2 additions & 1 deletion
@@ -136,8 +136,8 @@ print(result.compute()) # Outputting the computed result
 ```python
 #Lazy Evalution with dask
 import dask.dataframe as dd
-import pandas as pd
 import numpy as np
+import pandas as pd
 
 # Creating a dummy dataset
 num_rows = 100 # Number of rows
@@ -230,6 +230,7 @@ print(y_dask.compute())
 # Dynamic task scheduling with Dask
 import dask
 
+
 @dask.delayed
 def square(x):
     return x * x

pages/blog/streamlining-project-automation-with-makim/index.ipynb

Lines changed: 1 addition & 0 deletions
@@ -92,6 +92,7 @@
 "outputs": [],
 "source": [
 "import os\n",
+"\n",
 "os.environ[\"NO_COLOR\"] = \"1\""
 ]
 },

pages/blog/streamlining-project-automation-with-makim/index.md

Lines changed: 1 addition & 0 deletions
@@ -61,6 +61,7 @@ For this tutorial, we will disable the output color feature provided by typer, t
 
 ```python
 import os
+
 os.environ["NO_COLOR"] = "1"
 ```
 

pages/blog/unlocking-the-power-of-multiple-dispatch-in-python-with-plum-dispatch/index.ipynb

Lines changed: 1 addition & 0 deletions
@@ -72,6 +72,7 @@
 "source": [
 "from plum import dispatch\n",
 "\n",
+"\n",
 "class Processor:\n",
 " @dispatch\n",
 " def process(self, data: int):\n",

pages/blog/unlocking-the-power-of-multiple-dispatch-in-python-with-plum-dispatch/index.md

Lines changed: 1 addition & 0 deletions
@@ -41,6 +41,7 @@ To demonstrate the basic usage of `plum-dispatch`, let's start with a simple exa
 ```python
 from plum import dispatch
 
+
 class Processor:
     @dispatch
     def process(self, data: int):

pages/projects/list/index.md

Lines changed: 12 additions & 1 deletion
@@ -170,14 +170,25 @@ projects:
     type: incubated
     maintainer_name: Ivan Ogasawara
     maintainer_email: [email protected]
-    url: https://osl-incubator.github.io/sugar/
+    url: https://sugar-org.github.io/sugar/
     communication_channel:
       provider: discord
       url: https://opensciencelabs.org/discord
     description: |
       Sugar aims to organize your stack of containers, gathering some useful scripts
       and keeping this information centralized in a configuration file. So the command
       line would be very simple.
+
+  - name: swarm-external-secrets
+    type: incubated
+    maintainer_name: Sai Sanjay
+    maintainer_email: [email protected]
+    url: https://sugar-org.github.io/swarm-external-secrets/
+    communication_channel:
+      provider: discord
+      url: https://opensciencelabs.org/discord
+    description: |
+      A Docker Swarm secrets plugin that integrates with multiple secret management providers including HashiCorp Vault, AWS Secrets Manager, Azure Key Vault, and OpenBao.
 ---
 
 # Affiliated and Incubated Projects
