Merged

83 commits
57d0ba9
added auto-publish workflow
pauladkisson Aug 13, 2025
3d4664f
added pyproject.toml
pauladkisson Aug 13, 2025
f376da2
added sparse_environment.yaml
pauladkisson Aug 13, 2025
f51bc39
pinned openssl and certifi
pauladkisson Aug 14, 2025
9cc85fa
updated syntax for pn.Card in savingInputParameters
pauladkisson Aug 16, 2025
2e97412
established initial python 3.12 environment
pauladkisson Aug 16, 2025
622327a
Fixed homepage by swapping the panel.widgets.DataFrame with panel.wid…
pauladkisson Aug 18, 2025
dafb0c3
implement venus' changes in savesStoresList.py
pauladkisson Aug 18, 2025
cbf9692
Revert "implement venus' changes in savesStoresList.py"
pauladkisson Aug 19, 2025
3421a46
added dependencies to pyproject.toml
pauladkisson Aug 20, 2025
df45f33
Updated auto-publish to use TestPyPI
pauladkisson Aug 20, 2025
54241e7
removed old lock files
pauladkisson Aug 20, 2025
6a6f03d
updated version
pauladkisson Aug 20, 2025
764480a
added workflow_dispatch
pauladkisson Aug 20, 2025
e68180a
moved auto-publish to the correct location
pauladkisson Aug 20, 2025
06b0f23
moved auto-publish to the correct location
pauladkisson Aug 20, 2025
c1735d0
.yml
pauladkisson Aug 20, 2025
dc1bbcb
fixed workflow warning
pauladkisson Aug 20, 2025
0149473
switched to modern attestations approach
pauladkisson Aug 20, 2025
a437368
switched to modern attestations approach
pauladkisson Aug 20, 2025
c8f16d9
updated comment header
pauladkisson Aug 20, 2025
2d56292
updated version
pauladkisson Aug 20, 2025
3fc2de3
Fix bugs exposed by pip-installable environment (#145)
pauladkisson Aug 21, 2025
fe4d3dd
alpha5
pauladkisson Aug 21, 2025
316c63a
reorganized into src/guppy pattern
pauladkisson Aug 21, 2025
1103e45
added main entry point
pauladkisson Aug 21, 2025
ed7d141
added main entry point pt 2
pauladkisson Aug 21, 2025
c3d0ac1
reverted back to subprocess calls bc matplotlib needs to be in the ma…
pauladkisson Aug 21, 2025
32ad20e
fixed subprocess.call syntax to work with guppy entry point
pauladkisson Aug 21, 2025
c040672
removed readh5 clutter
pauladkisson Aug 21, 2025
ad3dd98
saving log file to home directory to avoid permissions errors
pauladkisson Aug 21, 2025
c6525cc
update pyproject.toml
pauladkisson Aug 21, 2025
c30b8ec
update pyproject.toml to use direct dependencies
pauladkisson Aug 26, 2025
38943b3
added explicit support for python 3.10-3.13
pauladkisson Aug 26, 2025
81aa0c0
removed old spec files
pauladkisson Aug 27, 2025
e6430c7
Added installation instructions to the readme
pauladkisson Aug 27, 2025
71aa80a
Added changelog
pauladkisson Aug 27, 2025
37ab6fb
Added manifest
pauladkisson Aug 27, 2025
ee9c99c
updated changelog to explicitly mention breaking changes
pauladkisson Aug 27, 2025
057ee6d
added headless mode for step 1 and appropriate test
pauladkisson Aug 28, 2025
be54ce4
added headless mode for step 2 and appropriate test
pauladkisson Aug 28, 2025
6be04a9
renamed test_step2
pauladkisson Aug 28, 2025
5a1fedb
added headless mode for step 3 and appropriate test
pauladkisson Aug 28, 2025
b9c2fdf
fixed storenames_map
pauladkisson Aug 28, 2025
30ec4a2
added headless mode for step 4 and appropriate test
pauladkisson Aug 28, 2025
04c5642
added headless mode for step 5 and appropriate test
pauladkisson Aug 28, 2025
838d911
added the rest of the data modalities (except npm) for step 3
pauladkisson Aug 29, 2025
1d24eca
added the rest of the data modalities (except npm) for step 2
pauladkisson Aug 29, 2025
58092e5
added npm to step 2
pauladkisson Aug 29, 2025
31dc80d
added npm to step 3
pauladkisson Aug 29, 2025
fb20557
added the rest of the data modalities for step 4
pauladkisson Aug 29, 2025
2664ab5
added the rest of the data modalities for step 5
pauladkisson Aug 29, 2025
c132433
merge main
pauladkisson Aug 30, 2025
08e5799
updated run-tests.yml
pauladkisson Aug 30, 2025
20dc6e8
Added load-data action (#156)
pauladkisson Sep 1, 2025
0ad6de1
merge main
pauladkisson Sep 1, 2025
70b7227
added inputs to the workflow dispatch
pauladkisson Sep 1, 2025
1f9a490
switched to rclone lsjson
pauladkisson Sep 1, 2025
db38acf
fixed hashing
pauladkisson Sep 1, 2025
4274192
updated dependency groups to include test
pauladkisson Sep 1, 2025
58848bf
DEBUG commented out plt.switch_backend
pauladkisson Sep 1, 2025
7909b04
Only set matplotlib backend if not in CI environment
pauladkisson Sep 1, 2025
507163b
updated cache key
pauladkisson Sep 1, 2025
5dc12ee
updated cache key
pauladkisson Sep 1, 2025
61209c5
updated testing data location
pauladkisson Sep 1, 2025
c1a629c
fixed bug with npm params
pauladkisson Sep 2, 2025
a4e12ac
added extra assertions to test_step2 to ensure npm creates the requis…
pauladkisson Sep 2, 2025
2a32bc6
updated default os to windows-2022
pauladkisson Sep 2, 2025
ac86e35
updated pr-tests
pauladkisson Sep 2, 2025
d6a33e4
updated pr-tests
pauladkisson Sep 2, 2025
b05382d
removed assess-file-changes
pauladkisson Sep 2, 2025
00b0e67
fixed data cache hash
pauladkisson Sep 3, 2025
35e5731
updated npm_timestamp to use name instead of index
pauladkisson Sep 16, 2025
96facf3
disable parallel execution for ci tests
pauladkisson Sep 18, 2025
1dc4b3d
Merge branch 'dev' into testing
pauladkisson Oct 16, 2025
2144e96
Merge branch 'dev' into testing
pauladkisson Oct 17, 2025
a62f665
Updated the paths in test_step2.
pauladkisson Oct 27, 2025
cd07797
Updated the paths in the rest of the tests.
pauladkisson Oct 28, 2025
787fefc
Added new data to test_step2.py.
pauladkisson Oct 29, 2025
cd72eb2
Added timestamp specification for step two.
pauladkisson Oct 29, 2025
617f381
Added extra assertions for the rest of the npm data sets.
pauladkisson Oct 29, 2025
b09332d
Added new data to test_step3-5.
pauladkisson Oct 30, 2025
d93e382
Added fix for sampledata_npm_5
pauladkisson Oct 30, 2025
5 changes: 3 additions & 2 deletions .github/actions/load-data/action.yml
@@ -13,17 +13,18 @@ runs:
rclone_config: ${{ inputs.rclone-config }}

- name: Get dataset version hash
id: hash
shell: bash
run: |
HASH=$(rclone lsl remote:"SampleData" --drive-shared-with-me)
HASH=$(rclone lsjson remote:"SampleData" --drive-shared-with-me --recursive | sed 's/,$//' | sort | sha256sum | cut -d' ' -f1)
echo "DATASET_HASH=$HASH" >> $GITHUB_OUTPUT

- name: Cache datasets
uses: actions/cache@v4
id: cache-datasets
with:
path: ./testing_data
key: ephys-datasets-${{ steps.ephys.outputs.DATASET_HASH }}
key: ${{ steps.hash.outputs.DATASET_HASH }}
enableCrossOsArchive: true

- if: ${{ steps.cache-datasets.outputs.cache-hit != 'true' }}
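The cache key is now a SHA-256 digest of the sorted rclone lsjson listing of the SampleData folder, so the cache is invalidated whenever any file in that folder changes. A rough Python sketch of the same idea (not byte-for-byte identical to the shell pipeline above):

import hashlib

def dataset_cache_key(lsjson_output: str) -> str:
    # Normalize and sort the per-file JSON lines, then hash them,
    # mirroring the `sed | sort | sha256sum` pipeline in the action.
    lines = sorted(line.rstrip(",") for line in lsjson_output.splitlines())
    return hashlib.sha256("\n".join(lines).encode()).hexdigest()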
1 change: 1 addition & 0 deletions .github/workflows/all_os_versions.txt
@@ -0,0 +1 @@
["ubuntu-latest", "macos-latest", "windows-2022"]
1 change: 1 addition & 0 deletions .github/workflows/all_python_versions.txt
@@ -0,0 +1 @@
["3.10", "3.11", "3.12", "3.13"]
41 changes: 40 additions & 1 deletion .github/workflows/pr-tests.yml
@@ -10,4 +10,43 @@ on:

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
cancel-in-progress: true

jobs:
load_python_and_os_versions:
runs-on: ubuntu-latest
outputs:
ALL_PYTHON_VERSIONS: ${{ steps.load_python_versions.outputs.python_versions }}
ALL_OS_VERSIONS: ${{ steps.load_os_versions.outputs.os_versions }}
steps:
- uses: actions/checkout@v4
- id: load_python_versions
run: echo "python_versions=$(cat ./.github/workflows/all_python_versions.txt)" >> "$GITHUB_OUTPUT"
- id: load_os_versions
run: echo "os_versions=$(cat ./.github/workflows/all_os_versions.txt)" >> "$GITHUB_OUTPUT"
- name: Debugging
run: |
echo "Loaded Python versions: ${{ steps.load_python_versions.outputs.python_versions }}"
echo "Loaded OS versions: ${{ steps.load_os_versions.outputs.os_versions }}"

run-tests:
needs: [load_python_and_os_versions]
uses: ./.github/workflows/run-tests.yml
secrets:
RCLONE_CONFIG: ${{ secrets.RCLONE_CONFIG }}
with: # Ternary operator: condition && value_if_true || value_if_false
python-versions: ${{ github.event.pull_request.draft == true && '["3.10"]' || needs.load_python_and_os_versions.outputs.ALL_PYTHON_VERSIONS }}
os-versions: ${{ github.event.pull_request.draft == true && '["ubuntu-latest"]' || needs.load_python_and_os_versions.outputs.ALL_OS_VERSIONS }}

check-final-status:
name: All tests passing
if: always()
needs:
- run-tests
runs-on: ubuntu-latest
steps:
- name: Decide whether all jobs succeeded or at least one failed
uses: re-actors/alls-green@release/v1
with:
allowed-skips: run-tests
jobs: ${{ toJSON(needs) }}
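The `&&`/`||` expression in the `with:` block acts as a ternary: draft PRs run a reduced smoke-test matrix, while ready-for-review PRs get the full version lists loaded from the files above. The same selection logic, written out in Python purely as an illustration:

def select_matrix(is_draft: bool, all_python: list[str], all_os: list[str]) -> tuple[list[str], list[str]]:
    # Draft PRs: one Python version on one OS; otherwise the full matrix.
    if is_draft:
        return ["3.10"], ["ubuntu-latest"]
    return all_python, all_os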
48 changes: 47 additions & 1 deletion .github/workflows/run-tests.yml
@@ -11,4 +11,50 @@ on:
description: 'List of OS versions to use in matrix, as JSON string'
required: true
type: string
workflow_dispatch:
secrets:
RCLONE_CONFIG:
required: true
workflow_dispatch:
inputs:
python-versions:
description: 'List of Python versions to use in matrix, as JSON string'
required: true
type: string
default: '["3.10", "3.11", "3.12", "3.13"]'
os-versions:
description: 'List of OS versions to use in matrix, as JSON string'
required: true
type: string
default: '["ubuntu-latest", "windows-2022", "macos-latest"]'

jobs:
run:
name: ${{ matrix.os }} Python ${{ matrix.python-version }}
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
python-version: ${{ fromJson(inputs.python-versions) }}
os: ${{ fromJson(inputs.os-versions) }}
steps:
- uses: actions/checkout@v5
- name: Setup Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

- name: Install pip
run: python -m pip install -U pip # Official recommended way

- name: Install GuPPy with testing requirements
run: |
python -m pip install "."
python -m pip install --group test

- name: Prepare data for tests
uses: ./.github/actions/load-data
with:
rclone-config: ${{ secrets.RCLONE_CONFIG }}

- name: Run tests
run: pytest tests -vv -rsx # -n auto --dist loadscope # TODO: re-enable parallel execution when logging issues with Windows are resolved
3 changes: 3 additions & 0 deletions .gitignore
@@ -6,3 +6,6 @@ z-score_methods.tgn
GuPPy/runFiberPhotometryAnalysis.ipynb
.vscode/
*.egg-info/
.clinerules/

testing_data/
7 changes: 7 additions & 0 deletions pyproject.toml
@@ -56,6 +56,13 @@ dependencies = [
"tables",
]

[dependency-groups]
test = [
"pytest",
"pytest-cov",
"pytest-xdist" # Runs tests on parallel
]

[project.scripts]
guppy = "guppy.main:main"

5 changes: 4 additions & 1 deletion src/guppy/preprocess.py
@@ -17,7 +17,10 @@
from matplotlib.widgets import MultiCursor
from pathlib import Path
from .combineDataFn import processTimestampsForCombiningData
plt.switch_backend('TKAgg')

# Only set matplotlib backend if not in CI environment
if not os.getenv('CI'):
plt.switch_backend('TKAgg')

def takeOnlyDirs(paths):
removePaths = []
59 changes: 49 additions & 10 deletions src/guppy/saveStoresList.py
@@ -133,6 +133,16 @@ def saveStorenames(inputParameters, data, event_name, flag, filepath):
# getting input parameters
inputParameters = inputParameters

# Headless path: if storenames_map provided, write storesList.csv without building the Panel UI
storenames_map = inputParameters.get("storenames_map")
if isinstance(storenames_map, dict) and len(storenames_map) > 0:
op = make_dir(filepath)
arr = np.asarray([list(storenames_map.keys()), list(storenames_map.values())], dtype=str)
np.savetxt(os.path.join(op, 'storesList.csv'), arr, delimiter=",", fmt='%s')
insertLog(f"Storeslist file saved at {op}", logging.INFO)
insertLog('Storeslist : \n'+str(arr), logging.INFO)
return

# reading storenames from the data fetched using 'readtsq' function
if isinstance(data, pd.DataFrame):
data['name'] = np.asarray(data['name'], dtype=str)
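With this change, providing a `storenames_map` in the input parameters skips the Panel UI and writes `storesList.csv` directly. A minimal sketch of a headless call, with made-up store names:

inputParameters = {
    # maps raw store names to the names GuPPy should use downstream (example values only)
    "storenames_map": {
        "Dv1A": "control_region1",
        "Dv2A": "signal_region1",
        "PrtN": "event_port_entry",
    },
    # ... plus the usual fields such as 'abspath' and the folder names to analyze
}
# saveStorenames(inputParameters, data, event_name, flag, filepath) now returns
# right after writing storesList.csv into the output directory.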
@@ -583,7 +593,7 @@ def check_channels(state):
return unique_state.shape[0], unique_state

# function to decide NPM timestamps unit (seconds, ms or us)
def decide_ts_unit_for_npm(df):
def decide_ts_unit_for_npm(df, timestamp_column_name=None, time_unit=None, headless=False):
col_names = np.array(list(df.columns))
col_names_ts = ['']
for name in col_names:
@@ -592,6 +602,18 @@ def decide_ts_unit_for_npm(df):

ts_unit = 'seconds'
if len(col_names_ts)>2:
# Headless path: auto-select column/unit without any UI
if headless:
if timestamp_column_name is not None:
assert timestamp_column_name in col_names_ts, f"Provided timestamp_column_name '{timestamp_column_name}' not found in columns {col_names_ts[1:]}"
chosen = timestamp_column_name
else:
chosen = col_names_ts[1]
df.insert(1, 'Timestamp', df[chosen])
df = df.drop(col_names_ts[1:], axis=1)
valid_units = {'seconds', 'milliseconds', 'microseconds'}
ts_unit = time_unit if (isinstance(time_unit, str) and time_unit in valid_units) else 'seconds'
return df, ts_unit
#def comboBoxSelected(event):
# print(event.widget.get())
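In headless mode the timestamp column and unit are taken from arguments instead of a pop-up. A sketch of the new call, where the column name is hypothetical and must match one of the timestamp-like columns detected in `df`:

df, ts_unit = decide_ts_unit_for_npm(
    df,
    timestamp_column_name="ComputerTimestamp",  # hypothetical; must be a detected timestamp column
    time_unit="milliseconds",                   # one of 'seconds', 'milliseconds', 'microseconds'
    headless=True,
)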

@@ -741,10 +763,19 @@ def read_doric(filepath):
# and recognize type of 'csv' files either from
# Neurophotometrics, Doric systems or custom made 'csv' files
# and read data accordingly
def import_np_doric_csv(filepath, isosbestic_control, num_ch):
def import_np_doric_csv(filepath, isosbestic_control, num_ch, inputParameters=None):

insertLog("If it exists, importing either NPM or Doric or csv file based on the structure of file",
logging.DEBUG)
# Headless configuration (used to avoid any UI prompts when running tests)
headless = bool(os.environ.get('GUPPY_BASE_DIR'))
npm_timestamp_column_name = None
npm_time_unit = None
npm_split_events = None
if isinstance(inputParameters, dict):
npm_timestamp_column_name = inputParameters.get('npm_timestamp_column_name')
npm_time_unit = inputParameters.get('npm_time_unit', 'seconds')
npm_split_events = inputParameters.get('npm_split_events', True)
path = sorted(glob.glob(os.path.join(filepath, '*.csv'))) + \
sorted(glob.glob(os.path.join(filepath, '*.doric')))
path_chev = glob.glob(os.path.join(filepath, '*chev*'))
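Headless handling of NPM files is switched on by the GUPPY_BASE_DIR environment variable and configured through three optional input-parameter keys. A sketch, with hypothetical values:

import os

os.environ["GUPPY_BASE_DIR"] = "/path/to/data"  # any set value enables headless mode here

inputParameters = {
    "npm_timestamp_column_name": "ComputerTimestamp",  # hypothetical column name
    "npm_time_unit": "milliseconds",                   # defaults to 'seconds' if omitted
    "npm_split_events": True,                          # defaults to True: one event file per TTL type
    # ... remaining GuPPy input parameters
}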
@@ -879,16 +910,19 @@ def import_np_doric_csv(filepath, isosbestic_control, num_ch):
elif flag=='event_np':
type_val = np.array(df.iloc[:,1])
type_val_unique = np.unique(type_val)
window = tk.Tk()
if len(type_val_unique)>1:
response = messagebox.askyesno('Multiple event TTLs', 'Based on the TTL file,\
if headless:
response = 1 if bool(npm_split_events) else 0
else:
window = tk.Tk()
if len(type_val_unique)>1:
response = messagebox.askyesno('Multiple event TTLs', 'Based on the TTL file,\
it looks like TTLs \
belongs to multipe behavior type. \
Do you want to create multiple files for each \
behavior type ?')
else:
response = 0
window.destroy()
else:
response = 0
window.destroy()
if response==1:
timestamps = np.array(df.iloc[:,0])
for j in range(len(type_val_unique)):
@@ -907,7 +941,12 @@ def import_np_doric_csv(filepath, isosbestic_control, num_ch):
event_from_filename.append('event'+str(0))
else:
file = f'file{str(i)}_'
df, ts_unit = decide_ts_unit_for_npm(df)
df, ts_unit = decide_ts_unit_for_npm(
df,
timestamp_column_name=npm_timestamp_column_name,
time_unit=npm_time_unit,
headless=headless
)
df, indices_dict, num_channels = decide_indices(file, df, flag)
keys = list(indices_dict.keys())
for k in range(len(keys)):
@@ -1004,7 +1043,7 @@ def execute(inputParameters):
for i in folderNames:
filepath = os.path.join(inputParameters['abspath'], i)
data = readtsq(filepath)
event_name, flag = import_np_doric_csv(filepath, isosbestic_control, num_ch)
event_name, flag = import_np_doric_csv(filepath, isosbestic_control, num_ch, inputParameters=inputParameters)
saveStorenames(inputParameters, data, event_name, flag, filepath)
insertLog('#'*400, logging.INFO)
except Exception as e:
49 changes: 33 additions & 16 deletions src/guppy/savingInputParameters.py
@@ -24,23 +24,31 @@ def savingInputParameters():
else:
pass

# Create the main window
folder_selection = tk.Tk()
folder_selection.title("Select the folder path where your data is located")
folder_selection.geometry("700x200")
def select_folder():
# Determine base folder path (headless-friendly via env var)
base_dir_env = os.environ.get('GUPPY_BASE_DIR')
is_headless = base_dir_env and os.path.isdir(base_dir_env)
if is_headless:
global folder_path
folder_path = filedialog.askdirectory(title="Select the folder path where your data is located")
if folder_path:
print(f"Folder path set to {folder_path}")
folder_selection.destroy()
else:
folder_path = os.path.expanduser('~')
print(f"Folder path set to {folder_path}")

select_button = ttk.Button(folder_selection, text="Select a Folder", command=select_folder)
select_button.pack(pady=5)
folder_selection.mainloop()
folder_path = base_dir_env
print(f"Folder path set to {folder_path} (from GUPPY_BASE_DIR)")
else:
# Create the main window
folder_selection = tk.Tk()
folder_selection.title("Select the folder path where your data is located")
folder_selection.geometry("700x200")
def select_folder():
global folder_path
folder_path = filedialog.askdirectory(title="Select the folder path where your data is located")
if folder_path:
print(f"Folder path set to {folder_path}")
folder_selection.destroy()
else:
folder_path = os.path.expanduser('~')
print(f"Folder path set to {folder_path}")

select_button = ttk.Button(folder_selection, text="Select a Folder", command=select_folder)
select_button.pack(pady=5)
folder_selection.mainloop()

current_dir = os.getcwd()
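When GUPPY_BASE_DIR points at an existing directory, the tkinter folder-selection dialog is skipped and that directory becomes the base path. A sketch of a scripted/CI run under that assumption:

import os
from guppy.savingInputParameters import savingInputParameters

os.environ["GUPPY_BASE_DIR"] = "/path/to/photometry/data"  # must be an existing directory
template = savingInputParameters()  # no Tk window is opened in this case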

@@ -525,4 +533,13 @@ def onclickpsth(event=None):
template.main.append(group)
template.main.append(visualize)

# Expose minimal hooks and widgets to enable programmatic testing
template._hooks = {
"onclickProcess": onclickProcess,
"getInputParameters": getInputParameters,
}
template._widgets = {
"files_1": files_1,
}

return template
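These underscore-prefixed attributes let tests drive the dashboard without a browser or mouse; the exact widget API is not shown in this diff, so the interaction below is only indicative:

template = savingInputParameters()
hooks, widgets = template._hooks, template._widgets

widgets["files_1"].value = ["session_folder"]  # hypothetical selection value
params = hooks["getInputParameters"]()         # assumed to collect the current input parameters
hooks["onclickProcess"]()                      # equivalent to pressing the process button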
8 changes: 8 additions & 0 deletions src/guppy/testing/__init__.py
@@ -0,0 +1,8 @@
from .api import step1, step2, step3, step4

__all__ = [
"step1",
"step2",
"step3",
"step4",
]
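guppy.testing re-exports the headless pipeline helpers defined in guppy/testing/api.py (not shown in this diff), so tests can run individual analysis stages programmatically, e.g.:

from guppy.testing import step1, step2, step3, step4  # call signatures live in guppy/testing/api.py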