Commit 3bb78273 authored by mwaskom, committed by Alexandre Gramfort

Add basic volume example and clarify basic surface example

Adding underlying files for logo

Adapting MNE Python gen_rst sphinxext for thumbnails

Change name of label example for ordering

Change name of contour example for ordering

Fix gen_rst so example index isn't double-headered

Delete montage example for now

Move view example for ordering

Rename view example

Improve code and aesthetics of parc values example

Updating basic volume example further

Name full and thumb images differently in gen_rst

Track volume mask file

Change datatype of mask file for better storage

Demonstrate alpha in parcellation example

Unify license statements across the package

Change example README as it is now incorporated into the docs page

Change view in label example

Updating website banner

Converting README to markdown and updating some aspects of it

Flattening and tweaking doc index

Increasing the size of the example thumbnails

Update Alex's affiliation

removing large files
parent 57f1d826
Copyright (c) 2011, Neuroimaging in Python Team
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of the <organization> nor the
names of its contributors may be used to endorse or promote products
derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL <COPYRIGHT HOLDER> BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
.. -*- mode: rst -*-
PySurfer
========
PySurfer is a Python module for interacting with cortical surface
representations of neuroimaging data from Freesurfer. It extends Mayavi's
powerful visualization engine with a high-level interface for working with
MRI and MEG data.
![](doc/logo_files/pysurfer_logo_small.png)
PySurfer is a Python package for interacting with cortical surface
representations of neuroimaging data. It extends Mayavi's powerful
visualization engine with a high-level interface for working with MRI and MEG
data.
PySurfer offers both a command-line interface designed to broadly replicate
Freesurfer's Tksurfer program as well as a Python library for writing scripts
to efficiently explore complex datasets.
PySurfer offers both a command-line interface designed to broadly replicate
the Freesurfer Tksurfer program and a Python library for writing scripts to
efficiently explore complex datasets and prepare publication-ready figures.
The goal of the project is to facilitate the production of figures that are
both beautiful and scientifically informative.
Important Links
---------------
@@ -27,40 +27,38 @@ Important Links
Install
-------
This package uses distutils, which is the default way of installing Python
modules. To install in your home directory, use::
python setup.py install --home
To install for all users on Unix/Linux::
This package uses setuptools. To install it for all users, run:
python setup.py build
sudo python setup.py install
For information about dependencies, please see the online documentation:
http://pysurfer.github.com/install.html
If you do not have sudo privileges, you can install locally:
python setup.py install --home
For information about dependencies, please see the [online
documentation](http://pysurfer.github.com/install.html)
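A quick way to confirm that the main dependencies are importable is a short check like the one below. This is a minimal sketch, and the package list here is an assumption; see the online documentation for the authoritative requirements.

```python
import importlib


def find_missing(names):
    """Return the subset of module names that cannot be imported."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing


# These dependency names are assumptions; consult the install docs for
# the real list required by your PySurfer version.
print(find_missing(["numpy", "scipy", "nibabel", "mayavi"]))
```

An empty list means everything imported cleanly; any names printed need to be installed before PySurfer will work.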
License
-------
Available under the BSD (3-clause) license.
Available under the Revised BSD (3-clause) license.
Testing
-------
You can launch the test suite for the io library using nosetests from the
source folder.
You can launch the test suite by running `nosetests` from the source folder.
For the visualization module the best way to test is to build the documentation,
which will run the example scripts and automatically generate static image output.
From the source directory::
Another way to test is to build the documentation, which will run the example
scripts and automatically generate static image output. From the source
directory:
cd doc/
make clean
make html
The resulting documentation will live at _build/html/index.html, which can
be compared to the online docs.
The resulting documentation will live at _build/html/index.html, which can be
compared to the online docs.
Either method will work only if you have Freesurfer installed on your
machine with a valid SUBJECTS_DIR folder.
Either method will work only if you have Freesurfer installed on your machine
with a valid SUBJECTS\_DIR folder.
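Before running the tests, it can help to confirm that the Freesurfer environment is actually in place. The helper below is a hypothetical sketch (the function name and messages are not part of PySurfer) that checks the two environment variables the examples rely on.

```python
import os


def freesurfer_env_problems():
    """Return a list of human-readable problems with the Freesurfer setup."""
    problems = []
    subjects_dir = os.environ.get("SUBJECTS_DIR")
    if not subjects_dir:
        problems.append("SUBJECTS_DIR is not set")
    elif not os.path.isdir(subjects_dir):
        problems.append("SUBJECTS_DIR is not a directory: %s" % subjects_dir)
    if not os.environ.get("FREESURFER_HOME"):
        problems.append("FREESURFER_HOME is not set")
    return problems


for problem in freesurfer_env_problems():
    print(problem)
```

If this prints nothing, both variables are set and `SUBJECTS_DIR` points at a real folder, so the test methods above should be able to find the anatomy files.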
doc/_static/banner.png (changed: 44.5 KB → 65.9 KB)
@@ -4,6 +4,6 @@
{% block header %}
<div style="background-color: white; text-align: left; padding: 10px 10px 15px 15px">
<a href="{{pathto('index') }}">
<img src="{{ pathto("_static/banner.png", 1) }}" alt="PySurfer logo" border="0" />
<img src="{{ pathto("_static/banner.png", 1) }}" alt="PySurfer logo" height="180" width="600" border="0" />
</div>
{% endblock %}
Documentation
=============
Detailed Documentation
======================
.. _doc-index:
Introduction
============
PySurfer is a Python-based program for visualization and interaction
with cortical surface representations of neuroimaging data from
`Freesurfer <http://surfer.nmr.mgh.harvard.edu/>`_. It extends
`Mayavi's <http://github.enthought.com/mayavi/mayavi/index.html>`_ powerful
visualization engine with a high-level interface for working with
MRI and MEG data.
PySurfer is a Python library for visualizing cortical surface representations
of neuroimaging data. The package is primarily intended for use with
`Freesurfer <http://surfer.nmr.mgh.harvard.edu/>`_, but it can plot data that
are drawn from a variety of sources. PySurfer extends `Mayavi's
<http://github.enthought.com/mayavi/mayavi/index.html>`_ powerful rendering
engine with a high-level interface for working with MRI and MEG data.
PySurfer offers both a command-line interface designed to broadly
replicate Freesurfer's Tksurfer program and a Python library
for writing scripts to efficiently explore complex datasets.
Contents
--------
More Information
----------------
.. toctree::
:maxdepth: 2
:maxdepth: 1
install
examples/index.rst
@@ -25,20 +21,16 @@ Contents
Authors
-------
Michael Waskom, Stanford University
Alexandre Gramfort, Harvard Med. School MGH Martinos Center - INRIA
Scott Burns, Harvard Med. School MGH Martinos Center
Martin Luessi, Harvard Med. School MGH Martinos Center
Eric Larson, University of Washington ILABS
- Michael Waskom, Stanford University
- Alexandre Gramfort, Telecom ParisTech - CNRS, CEA - Neurospin
- Scott Burns, Vanderbilt University
- Martin Luessi, Harvard Medical School MGH Martinos Center
- Eric Larson, University of Washington ILABS
License
-------
The PySurfer source code is available under the simplified BSD license
The PySurfer source code is available under the Revised BSD (3-Clause) license
Support
-------
PySurfer Examples
=================
PySurfer Example Gallery
========================
This directory contains a set of example scripts intended to demonstrate the
functionality offered by PySurfer, from the most basic tasks to more complicated
visualizations. It also serves as a test suite for the visualization module,
which is difficult to test numerically.
This page offers a set of example scripts intended to demonstrate the
functionality offered by PySurfer, from the most basic tasks to more complex
visualizations.
Any example with a filename that begins with ``plot`` will be executed when
the documentation is built, with any figures built during execution captured
and saved in the HTML file.
For each script that produces a figure, the image is automatically captured and
displayed along with the code that generated it.
The automatic doc engine writes the example index in alphabetical order by
filename, and the general intention is to move from basic examples to more
complex tasks. Please keep this in mind when naming example scripts.
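For illustration, the ordering rule described above can be sketched in a few lines. This is a simplified stand-in for the real doc engine, not its actual implementation, and assumes executable examples follow the ``plot_*.py`` naming pattern.

```python
import glob
import os


def ordered_examples(example_dir):
    """Return example scripts in the order the index lists them.

    Only files whose names begin with ``plot`` are executed during the
    doc build, and the index is sorted alphabetically by filename.
    """
    paths = glob.glob(os.path.join(example_dir, "plot_*.py"))
    return sorted(os.path.basename(p) for p in paths)
```

Because the order is purely alphabetical, prefixing filenames deliberately (for example ``plot_basics.py`` before ``plot_zvolume.py``) is what moves simple examples ahead of complex ones.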
Contributions are more than welcome!
Used PySurfer to present neuroimaging data in a way that isn’t covered here?
Please `contribute <https://github.com/nipy/PySurfer>`_ it!
"""
=======================
Display fMRI Activation
=======================
Load a statistical overlay on the inflated surface.
The most straightforward way to plot activations is when you already have
a map of them defined on the Freesurfer surface. This map can be stored in any
file format that Nibabel can understand.
"""
print __doc__
import os.path as op
from surfer import Brain
"""
Bring up the visualization
Bring up the visualization window.
"""
brain = Brain("fsaverage", "lh", "inflated")
"""
Get a path to the overlay file.
"""
overlay_file = op.join("example_data", "lh.sig.nii.gz")
overlay_file = "example_data/lh.sig.nii.gz"
"""
Display the overlay on the surface using the defaults
to control thresholding and colorbar saturation.
These can be set through your config file.
Display the overlay on the surface using the defaults to control thresholding
and colorbar saturation. These can be set through your config file.
"""
brain.add_overlay(overlay_file)
@@ -34,7 +35,7 @@ You can then turn the overlay off.
brain.overlays["sig"].remove()
"""
Now add the overlay again, but this time with set threshold
and showing only the positive activations
Now add the overlay again, but this time with set threshold and showing only
the positive activations.
"""
brain.add_overlay(overlay_file, min=5, max=20, sign="pos")
"""
======================
Display an fMRI Volume
======================
To plot data that are stored as a volume image on the surface, the process is
only slightly more complicated. You'll have to use the function
``surfer.project_volume_data``, which makes an external call to the Freesurfer
program ``mri_vol2surf``.
Note: In PySurfer versions 0.4 and lower, the project_volume_data function must
be imported from ``surfer.io``.
"""
print __doc__
import os
from surfer import Brain, project_volume_data
"""
Bring up the visualization window.
"""
brain = Brain("fsaverage", "lh", "inflated")
"""
Get a path to the volume file.
"""
volume_file = "example_data/zstat.nii.gz"
"""
There are two options for specifying the registration between the volume and
the surface you want to plot on. The first is to give a path to a
Freesurfer-style linear transformation matrix that will align the statistical
volume with the Freesurfer anatomy.
Most of the time you will be plotting data that are in MNI152 space on the
fsaverage brain. For this case, Freesurfer actually ships a registration matrix
file to align your data with the surface.
"""
reg_file = os.path.join(os.environ["FREESURFER_HOME"],
"average/mni152.register.dat")
zstat = project_volume_data(volume_file, "lh", reg_file)
"""
Note that the contours of the fsaverage surface don't perfectly match the
MNI brain, so this will only approximate the location of your activation
(although it generally does a pretty good job). A more accurate way to
visualize data would be to run the MNI152 brain through the recon-all pipeline.
Alternatively, if your data are already in register with the Freesurfer
anatomy, you can provide project_volume_data with the subject ID, avoiding the
need to specify a registration file.
By default, 3mm of smoothing is applied on the surface to clean up the overlay a
bit, although the extent of smoothing can be controlled.
"""
zstat = project_volume_data(volume_file, "lh",
subject_id="fsaverage", smooth_fwhm=0.5)
"""
Once you have the statistical data loaded into Python, you can simply pass it
to the `add_overlay` method of the Brain object.
"""
brain.add_overlay(zstat, min=2, max=12)
"""
It can also be a good idea to plot the inverse of the mask that was used in the
analysis, so you can be clear about areas that were not included.
It's good to change some parameters of the sampling to account for the fact that
you are projecting binary (0, 1) data.
"""
mask_file = "example_data/mask.nii.gz"
mask = project_volume_data(mask_file, "lh", subject_id="fsaverage",
smooth_fwhm=0, projsum="max").astype(bool)
mask = ~mask
brain.add_data(mask, min=0, max=10, thresh=.5,
colormap="bone", alpha=.6, colorbar=False)
brain.show_view("medial")
@@ -39,7 +39,8 @@ brain.add_label("BA44", borders=True)
brain.add_label("BA6", alpha=.7)
# Finally, you can plot the label in any color you want.
brain.show_view("medial")
brain.show_view(dict(azimuth=-42, elevation=105, distance=225,
focalpoint=[-30, -20, 15]))
# Use any valid matplotlib color.
brain.add_label("V1", color="steelblue", alpha=.6)
"""
==================
Display ROI Values
==================
Here we demonstrate how to take the results of an ROI analysis
performed within each region of some parcellation and display
those values on the surface to quickly summarize the analysis.
Here we demonstrate how to take the results of an ROI analysis performed within
each region of some parcellation and display those values on the surface to
quickly summarize the analysis.
"""
print __doc__
import os
import os.path as op
import numpy as np
import nibabel as nib
from surfer import Brain
@@ -20,36 +20,36 @@ hemi = "lh"
surface = "inflated"
"""
Bring up the visualization
Bring up the visualization.
"""
brain = Brain(subject_id, hemi, surface,
config_opts=dict(background="lightslategray",
cortex="high_contrast"))
config_opts=dict(background="white"))
"""
Read in the aparc annotation file
Read in the Buckner resting state network annotation. (This requires a
relatively recent version of Freesurfer, or it can be downloaded separately).
"""
aparc_file = op.join(os.environ["SUBJECTS_DIR"],
subject_id, "label",
hemi + ".aparc.a2009s.annot")
aparc_file = os.path.join(os.environ["SUBJECTS_DIR"],
subject_id, "label",
hemi + ".Yeo2011_17Networks_N1000.annot")
labels, ctab, names = nib.freesurfer.read_annot(aparc_file)
"""
Make a random vector of scalar data corresponding to
a value for each region in the parcellation.
Make a random vector of scalar data corresponding to a value for each region in
the parcellation.
"""
roi_data = np.random.random(len(names))
rs = np.random.RandomState(4)
roi_data = rs.uniform(.5, .75, size=len(names))
"""
Make a vector containing the data point at each vertex.
"""
vtx_data = np.zeros(len(labels))
for i, data in enumerate(roi_data):
vtx_data[labels == i] = data
vtx_data = roi_data[labels]
"""
Display these values on the brain.
Use the hot colormap and add an alpha channel
so the underlying anatomy is visible.
Display these values on the brain. Use a sequential colormap (assuming
these data move from low to high values), and add an alpha channel so the
underlying anatomy is visible.
"""
brain.add_data(vtx_data, 0, 1, colormap="hot", alpha=.7)
brain.add_data(vtx_data, .5, .75, colormap="GnBu", alpha=.8)
@@ -44,6 +44,6 @@ plot things separately for the left and right hemispheres.
"""
subjects_dir = os.environ["SUBJECTS_DIR"]
annot_path = pjoin(subjects_dir, subject_id, "label", "lh.aparc.annot")
brain.add_annotation(annot_path, hemi='lh', borders=False)
brain.add_annotation(annot_path, hemi='lh', borders=False, alpha=.75)
annot_path = pjoin(subjects_dir, subject_id, "label", "rh.aparc.a2009s.annot")
brain.add_annotation(annot_path, hemi='rh', remove_existing=False)
"""
======================
Make a multiview image
======================
Make one image from multiple views.
"""
print __doc__
from surfer import Brain
sub = 'fsaverage'
hemi = 'lh'
surf = 'inflated'
bgcolor = 'w'
brain = Brain(sub, hemi, surf, config_opts={'background': bgcolor})
###############################################################################
# Get a set of images as a montage; note that the data could be saved if desired
image = brain.save_montage(None, ['l', 'v', 'm'], orientation='v')
brain.close()
###############################################################################
# View created image
import pylab as pl
fig = pl.figure(figsize=(5, 3), facecolor=bgcolor)
ax = pl.axes(frameon=False)
ax.imshow(image, origin='upper')
pl.xticks(())
pl.yticks(())
pl.draw()
pl.show()