Tests #43

Open. Wants to merge 18 commits into master.

Changes from all commits
23 changes: 22 additions & 1 deletion .gitignore
@@ -1,3 +1,24 @@
# Data needed by the notebooks; retrieve it by running download_data.sh
notebooks/eventseg/Sherlock_AG_movie.npy
notebooks/eventseg/Sherlock_AG_recall.npy
notebooks/fmrisim/Corr_MVPA_Data_dataspace/
notebooks/fmrisim/Corr_MVPA_archive.tar.gz
notebooks/isc/brainiak-aperture-isc-data/
notebooks/srm/brainiak-aperture-srm-data/
notebooks/fmrisim/Corr_MVPA/
notebooks/iem/AL61_Bilat-V1_attnContrast.mat
notebooks/iem/RademakerEtAl2019_WM_S05_avgTime.npz
notebooks/isc/brainiak-aperture-isc-data.tgz
notebooks/srm/brainiak-aperture-srm-data.tgz
notebooks/htfa/data
notebooks/real-time/sample-config.toml

# Files generated when running fmrisim notebook
notebooks/fmrisim/Condition_A.txt
notebooks/fmrisim/Condition_B.txt
notebooks/fmrisim/epoch_file.npy


# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
@@ -133,4 +154,4 @@ dmypy.json
.DS_Store

# htfa temp files
notebooks/htfa/*_frames/*.jpg
notebooks/htfa/*_frames/*.jpg
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "notebooks/real-time/rt-cloud"]
path = notebooks/real-time/rt-cloud
url = https://github.com/brainiak/rt-cloud.git
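For reference, the pinned submodule still has to be fetched after cloning; a minimal sketch using standard git commands (the clone URL is a placeholder, not taken from this PR):

# In an existing checkout: fetch the rt-cloud commit pinned by .gitmodules
git submodule update --init notebooks/real-time/rt-cloud

# Or clone with submodules in one step (<repo-url> is a placeholder)
git clone --recurse-submodules <repo-url>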
2 changes: 2 additions & 0 deletions download_data.sh
@@ -0,0 +1,2 @@
#!/bin/bash
find notebooks -name 'download_data.sh' -execdir bash download_data.sh \;
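The one-liner above runs each per-notebook download_data.sh from inside its own directory; roughly the same behavior written as an explicit loop (a sketch, assuming the scripts sit one level below notebooks/ as they do in this PR):

#!/bin/bash
# Sketch: explicit-loop equivalent of the find/-execdir one-liner above
for script in notebooks/*/download_data.sh; do
    (cd "$(dirname "$script")" && bash download_data.sh)
done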
4 changes: 4 additions & 0 deletions notebooks/brsa/requirements.txt
@@ -0,0 +1,4 @@
scipy
numpy
brainiak
matplotlib
4 changes: 4 additions & 0 deletions notebooks/eventseg/download_data.sh
@@ -0,0 +1,4 @@
#!/bin/bash

wget -nc https://ndownloader.figshare.com/files/22927253 -O Sherlock_AG_movie.npy
wget -nc https://ndownloader.figshare.com/files/22927256 -O Sherlock_AG_recall.npy
4 changes: 4 additions & 0 deletions notebooks/eventseg/requirements.txt
@@ -0,0 +1,4 @@
numpy
matplotlib
brainiak
scipy
11 changes: 11 additions & 0 deletions notebooks/fcma/requirements.txt
@@ -0,0 +1,11 @@
nxviz<=0.6.3
pandas
brainiak
mpi4py
nilearn
matplotlib
seaborn
networkx
nibabel
numpy
scikit_learn
8 changes: 8 additions & 0 deletions notebooks/fmrisim/download_data.sh
@@ -0,0 +1,8 @@
#!/bin/bash

# Download the archive containing the example data if we don't have it already
wget -nc https://dataspace.princeton.edu/bitstream/88435/dsp01dn39x4181/2/Corr_MVPA_archive.tar.gz

# If the extracted directory doesn't exist yet, extract the file and rename the output directory.
test ! -e Corr_MVPA && tar xzkvf Corr_MVPA_archive.tar.gz Corr_MVPA_Data_dataspace/Participant_01_rest_run01.nii && mv Corr_MVPA_Data_dataspace Corr_MVPA
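The single-line guard above extracts and renames only when Corr_MVPA is absent; the same logic written out as an if-block (a sketch with the same behavior, not part of the diff):

#!/bin/bash
# Sketch: expanded form of the guarded extraction above
if [ ! -e Corr_MVPA ]; then
    tar xzkvf Corr_MVPA_archive.tar.gz Corr_MVPA_Data_dataspace/Participant_01_rest_run01.nii \
        && mv Corr_MVPA_Data_dataspace Corr_MVPA
fi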

11 changes: 5 additions & 6 deletions notebooks/fmrisim/fmrisim_multivariate_example.ipynb
@@ -111,7 +111,7 @@
"source": [
"*1.2 Load participant data*<a id=\"load_ppt\"></a>\n",
"\n",
"Any 4 dimensional fMRI data that is readible by nibabel can be used as input to this pipeline. For this example, data is taken from the open access repository DataSpace: http://arks.princeton.edu/ark:/88435/dsp01dn39x4181. This file is unzipped and placed in the home directory with the name Corr_MVPA "
"Any 4 dimensional fMRI data that is readible by nibabel can be used as input to this pipeline. For this example, data is taken from the open access repository DataSpace: http://arks.princeton.edu/ark:/88435/dsp01dn39x4181. This file is unzipped and placed same directory as this notebook with the name Corr_MVPA "
@manojneuro (Collaborator) commented on Sep 30, 2021:

@CameronTEllis FYI: please note a minor update to the data directory to make it compatible with automated testing.

]
},
{
@@ -122,8 +122,7 @@
},
"outputs": [],
"source": [
"home = str(Path.home())\n",
"nii = nibabel.load(home + '/Corr_MVPA/Participant_01_rest_run01.nii')\n",
"nii = nibabel.load('Corr_MVPA/Participant_01_rest_run01.nii')\n",
"volume = nii.get_data()"
]
},
@@ -5453,18 +5452,18 @@
"outputs": [],
"source": [
"fmrisim.export_epoch_file(stimfunction=[np.hstack((stimfunc_A, stimfunc_B))],\n",
" filename=home + '/epoch_file.npy',\n",
" filename='epoch_file.npy',\n",
" tr_duration=tr,\n",
" temporal_resolution=temporal_res,\n",
" )\n",
"\n",
"fmrisim.export_3_column(stimfunction=stimfunc_A,\n",
" filename=home + '/Condition_A.txt',\n",
" filename='Condition_A.txt',\n",
" temporal_resolution=temporal_res,\n",
" )\n",
"\n",
"fmrisim.export_3_column(stimfunction=stimfunc_B,\n",
" filename=home + '/Condition_B.txt',\n",
" filename='Condition_B.txt',\n",
" temporal_resolution=temporal_res,\n",
" )"
]
6 changes: 6 additions & 0 deletions notebooks/fmrisim/requirements.txt
@@ -0,0 +1,6 @@
matplotlib
nibabel
scipy
numpy
brainiak
scikit_learn
16 changes: 16 additions & 0 deletions notebooks/htfa/download_data.sh
@@ -0,0 +1,16 @@
#!/bin/bash

if [ -d "data/" ]; then
echo "Skipping download of data for HTFA notebook, already present"
else
mkdir data
wget --save-cookies cookies.txt --keep-session-cookies --no-check-certificate -q \
"https://docs.google.com/uc?export=download&id=1IBA39ZZjeGS1u_DvZdiw1AZZQMS3K5q0" -O- \
| sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p' > confirm
wget --load-cookies cookies.txt --no-check-certificate -q \
"https://docs.google.com/uc?export=download&confirm="$(cat confirm)"&id=1IBA39ZZjeGS1u_DvZdiw1AZZQMS3K5q0" -O data/pieman.zip
rm cookies.txt confirm
unzip data/pieman.zip -d data/
rm data/pieman.zip
fi
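The cookie and confirm steps above work around Google Drive's large-file confirmation page; if the gdown utility is installed, a simpler sketch with the same file ID would be (gdown is an assumption, not a dependency of this PR):

#!/bin/bash
# Alternative sketch using gdown (pip install gdown); same Google Drive file ID as above
mkdir -p data
gdown "https://drive.google.com/uc?id=1IBA39ZZjeGS1u_DvZdiw1AZZQMS3K5q0" -O data/pieman.zip
unzip data/pieman.zip -d data/
rm data/pieman.zip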

1,472 changes: 785 additions & 687 deletions notebooks/htfa/htfa.ipynb

Large diffs are not rendered by default.

12 changes: 12 additions & 0 deletions notebooks/htfa/requirements.txt
@@ -0,0 +1,12 @@
holoviews
numpy
pandas
brainiak
nilearn
timecorr
matplotlib
mpi4py
nibabel
seaborn
nltools
ipython
4 changes: 4 additions & 0 deletions notebooks/iem/download_data.sh
@@ -0,0 +1,4 @@
#!/bin/bash

wget -nc https://zenodo.org/record/4950267/files/RademakerEtAl2019_WM_S05_avgTime.npz
wget -nc https://zenodo.org/record/4950267/files/AL61_Bilat-V1_attnContrast.mat
5 changes: 5 additions & 0 deletions notebooks/iem/requirements.txt
@@ -0,0 +1,5 @@
scipy
brainiak
numpy
matplotlib
scikit_learn
24 changes: 7 additions & 17 deletions notebooks/isc/ISC.ipynb
@@ -40,7 +40,7 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
@@ -75,25 +75,15 @@
"name": "stdout",
"output_type": "stream",
"text": [
"--2020-12-07 01:33:28-- https://zenodo.org/record/4300904/files/brainiak-aperture-isc-data.tgz\n",
"Resolving zenodo.org (zenodo.org)... 137.138.76.77\n",
"Connecting to zenodo.org (zenodo.org)|137.138.76.77|:443... connected.\n",
"HTTP request sent, awaiting response... 200 OK\n",
"Length: 3146248838 (2.9G) [application/octet-stream]\n",
"Saving to: ‘brainiak-aperture-isc-data.tgz’\n",
"\n",
"brainiak-aperture-i 100%[===================>] 2.93G 6.53MB/s in 4m 32s \n",
"\n",
"2020-12-07 01:38:02 (11.0 MB/s) - ‘brainiak-aperture-isc-data.tgz’ saved [3146248838/3146248838]\n",
"File ‘brainiak-aperture-isc-data.tgz’ already there; not retrieving.\n",
"\n"
]
}
],
"source": [
"# Download and extract example data from Zenodo\n",
"!wget https://zenodo.org/record/4300904/files/brainiak-aperture-isc-data.tgz\n",
"!tar -xzf brainiak-aperture-isc-data.tgz\n",
"!rm brainiak-aperture-isc-data.tgz"
"!wget -nc https://zenodo.org/record/4300904/files/brainiak-aperture-isc-data.tgz\n",
"!tar --skip-old-files -xzf brainiak-aperture-isc-data.tgz\n"
]
},
{
@@ -637,9 +627,9 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "brainiak_pr_venv_ORLfW",
"language": "python",
"name": "python3"
"name": "brainiak_pr_venv_orlfw"
},
"language_info": {
"codemirror_mode": {
@@ -651,7 +641,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.7"
"version": "3.8.11"
}
},
"nbformat": 4,
4 changes: 4 additions & 0 deletions notebooks/isc/download_data.sh
@@ -0,0 +1,4 @@
#!/bin/bash

# Download the data; the Python notebook handles extraction
wget -nc https://zenodo.org/record/4300904/files/brainiak-aperture-isc-data.tgz
7 changes: 7 additions & 0 deletions notebooks/isc/requirements.txt
@@ -0,0 +1,7 @@
matplotlib
scipy
seaborn
brainiak
nibabel
numpy
nilearn
7 changes: 7 additions & 0 deletions notebooks/matnormal/requirements.txt
@@ -0,0 +1,7 @@
tensorflow
numpy
matplotlib
brainiak
seaborn
scipy
scikit_learn
31 changes: 31 additions & 0 deletions notebooks/real-time/requirements.txt
@@ -0,0 +1,31 @@
awscli
bcrypt
bids
brainiak
dicom
indexed_gzip
inflect
inotify
ipython
jupyter
matplotlib
mypy
nibabel
nilearn
nodejs
numpy
pandas
pybids
pydicom
pyOpenSSL
python_bcrypt
python_dateutil
requests
rpyc
scikit_learn
scipy
toml
tornado
watchdog
websocket_client
wsaccel
1 change: 1 addition & 0 deletions notebooks/real-time/rt-cloud
Submodule rt-cloud added at 557821
22 changes: 13 additions & 9 deletions notebooks/real-time/rtcloud_notebook.ipynb
@@ -54,6 +54,12 @@
"metadata": {},
"outputs": [],
"source": [
"# If rt-cloud repo is present in local directory and RTCLOUD_PATH is not already set, \n",
"# then set RTCLOUD_PATH to the local repo.\n",
"import os\n",
"if 'RTCLOUD_PATH' not in os.environ and os.path.exists('rt-cloud'):\n",
" os.environ['RTCLOUD_PATH'] = os.path.abspath('rt-cloud')\n",
"\n",
"!echo $RTCLOUD_PATH"
]
},
@@ -73,7 +79,6 @@
"import warnings; warnings.simplefilter('ignore')\n",
"\n",
"#---- Import the necessary python modules\n",
"import os\n",
"import sys\n",
"import threading\n",
"import argparse\n",
@@ -279,9 +284,7 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": false
},
"metadata": {},
"outputs": [],
"source": [
"%%html\n",
@@ -299,7 +302,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Instead of running the classification script from the web browser (as above), it can be run from the command line. The command below will run the sample.py script (of the real-time aperture notebook) as if running on the command line."
"Instead of running the classification script from the web browser (as above), it can be run directly by running the scripts main function. "
]
},
{
@@ -308,8 +311,9 @@
"metadata": {},
"outputs": [],
"source": [
"# Run the classification script from the command line instead of from the browser interface\n",
"!python -u $scriptToRun -c $configFile"
"# Run the classification script directly by importing the scripts main function.\n",
"from sample import main\n",
"main(['-c', configFile])"
]
},
{
@@ -328,7 +332,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -342,7 +346,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.11"
"version": "3.8.3"
}
},
"nbformat": 4,
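The new cell at the top of this notebook only falls back to the bundled rt-cloud submodule when RTCLOUD_PATH is unset; a sketch of pointing it at an existing checkout before launching the notebook (the path is an example, not prescribed by this PR):

#!/bin/bash
# Example only: use an existing rt-cloud checkout instead of the bundled submodule
export RTCLOUD_PATH=/path/to/rt-cloud
jupyter notebook rtcloud_notebook.ipynb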
7 changes: 3 additions & 4 deletions notebooks/srm/SRM.ipynb
@@ -94,9 +94,8 @@
],
"source": [
"# Download and extract example data from Zenodo\n",
"!wget https://zenodo.org/record/4300825/files/brainiak-aperture-srm-data.tgz\n",
"!tar -xzf brainiak-aperture-srm-data.tgz\n",
"!rm brainiak-aperture-srm-data.tgz"
"!wget -nc https://zenodo.org/record/4300825/files/brainiak-aperture-srm-data.tgz\n",
"!tar --skip-old-files -xzf brainiak-aperture-srm-data.tgz\n"
]
},
{
@@ -556,7 +555,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.12"
"version": "3.8.3"
}
},
"nbformat": 4,
3 changes: 3 additions & 0 deletions notebooks/srm/download_data.sh
@@ -0,0 +1,3 @@
#!/bin/bash

wget -nc https://zenodo.org/record/4300825/files/brainiak-aperture-srm-data.tgz
7 changes: 7 additions & 0 deletions notebooks/srm/requirements.txt
@@ -0,0 +1,7 @@
brainiak
matplotlib
nibabel
nilearn
seaborn
numpy
scipy