accessing hip hop 2D coordinate spaces via python api

Questions about Anatomist manipulation

Moderators: denghien, riviere

JGrif
Posts: 1
Joined: Mon Apr 13, 2015 8:40 am

accessing hip hop 2D coordinate spaces via python api

Post by JGrif »

Dear List,

I am in need of a little assistance working with the outputs of the 'hip-hop' toolbox.

I have run some brains through the pipeline and now have the 'rectangular_flat' and other .gii files.

However, I'm a little uncertain about the format of the outputs and how I can use them.

In particular, I would like to know how to access the spherical and polar coordinate space representations, and how to identify the corresponding xyz locations/vertices for these spaces. I am looking to project data from custom data structures, and also from NIfTI fMRI images, onto these 2D representations. I would prefer to do this via the pyanatomist scripting API rather than the GUI if possible.

If, for example, someone could show me how to generate something similar to the 2D projection figures in the Auzias et al. Hip-Hop paper via the python api, that would be extremely helpful.


Many thanks,

John
riviere
Site Admin
Posts: 1361
Joined: Tue Jan 06, 2004 12:21 pm
Location: CEA NeuroSpin, Saint Aubin, France

Re: accessing hip hop 2D coordinate spaces via python api

Post by riviere »

Hi,

Hip Hop produces coordinate fields as textures for the cortical meshes: the .gii files contain the coordinate (longitude or latitude) for each vertex of the corresponding cortical mesh (keeping the same ordering).

For instance, using the python API of pyaims / pyanatomist:


```python
from soma import aims

white_mesh = aims.read('Lwhite.gii')
longitude = aims.read('Lwhite_lon.gii')
latitude = aims.read('Lwhite_lat.gii')

# for a given vertex index i, the corresponding 3D and 2D coords:
i = 43  # or any other index below len(white_mesh.vertex())
print('vertex %d: 3D pos:' % i, tuple(white_mesh.vertex()[i]),
      ', 2D coords:', (longitude[0][i], latitude[0][i]))
```
Since each vertex has 3D coordinates in the "real" space, the point matching a specific 2D coordinate is the vertex whose longitude/latitude pair is closest to the requested coordinates. This can be done either by "brute force", computing the distance between the given coordinate pair and every vertex in the mesh, or with some optimization (using a kdtree, for instance).
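A brute-force version of that lookup might look like the following sketch (pure numpy; the arrays here are synthetic stand-ins — with real data you would build them from the textures read above, e.g. `lon = numpy.asarray(longitude[0])`):

```python
import numpy as np

def nearest_vertex(lon, lat, target_lon, target_lat):
    """Index of the vertex whose (longitude, latitude) pair is closest
    to the requested 2D coordinate (Euclidean distance in coordinate
    space; a kdtree would be faster for many repeated queries)."""
    d2 = (lon - target_lon) ** 2 + (lat - target_lat) ** 2
    return int(np.argmin(d2))

# synthetic per-vertex coordinate textures (4 vertices)
lon = np.array([10., 20., 30., 40.])
lat = np.array([5., 15., 25., 35.])
i = nearest_vertex(lon, lat, 21., 14.)  # -> 1
# white_mesh.vertex()[i] would then give the matching 3D position
```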

fMRI images need to be projected onto the cortical surface mesh (and perhaps preprocessed first). There are tools for this in the cortical surface toolbox of BrainVisa (this projection is done independently of the Hip Hop coordinates). It produces activation textures matching the same cortical meshes as the Hip Hop coordinates, so they are also in the same vertex order. Projected activation locations can thus be mapped into the 2D Hip Hop coordinate system, for inter-subject comparison for instance (as I suppose this is what you want to do).
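For inter-subject comparison, one simple way to put the projected activations into a common 2D frame is to average the per-vertex values onto a regular longitude/latitude grid. A minimal sketch, assuming per-vertex lon/lat arrays in degrees (longitude in [0, 360) and latitude in [0, 180) — an assumption about the coordinate convention, not confirmed here) and an activation array in the same vertex order:

```python
import numpy as np

def bin_to_grid(lon, lat, values, n_lon=36, n_lat=18):
    """Average per-vertex values into a regular (n_lat, n_lon) grid
    over longitude/latitude space; cells with no vertex stay NaN."""
    lon_idx = np.clip((lon / 360. * n_lon).astype(int), 0, n_lon - 1)
    lat_idx = np.clip((lat / 180. * n_lat).astype(int), 0, n_lat - 1)
    sums = np.zeros((n_lat, n_lon))
    counts = np.zeros((n_lat, n_lon))
    np.add.at(sums, (lat_idx, lon_idx), values)    # unbuffered accumulation
    np.add.at(counts, (lat_idx, lon_idx), 1)
    grid = np.full((n_lat, n_lon), np.nan)
    mask = counts > 0
    grid[mask] = sums[mask] / counts[mask]
    return grid

# synthetic example: two vertices fall in the same cell and average together
lon = np.array([10., 12., 200.])
lat = np.array([5., 6., 90.])
act = np.array([1., 3., 7.])
grid = bin_to_grid(lon, lat, act)
```

Grids built this way (one per subject) share the same coordinate frame, so they can be compared or averaged directly.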

Is it clear enough?

Denis