Although my question relates to the new Primatologist toolbox, I am asking it here because Primatologist comes with BrainRat. Let me know if you would rather I post it on the BrainVISA forum.
I have installed BrainVISA 4.6 on an Ubuntu 14.04 machine, as well as on a Debian machine. I tried the Primatologist pipeline on both machines with several T1 and T2 macaque images. Every time, the pipeline crashes during the "Atlas Registration -> Transform Application" step, returning the following error:
<class 'sqlite3.OperationalError'>: too many SQL variables
The log says:
processes.py (3172) in _processExecution:
client.py (578) in delete_workflow:
return self._engine_proxy.delete_workflow(workflow_id, force)
engine.py (904) in delete_workflow:
database_server.py (1484) in delete_workflow:
database_server.py (579) in clean:
raise DatabaseError('%s: %s \n' % (type(e), e))
Do you have any clue what's happening?
Thanks in advance,
Institut de Neurosciences de La Timone,
Marseille, France https://meca-brain.org
Is the problem specific to Primatologist? It seems to have trouble using the SQLite database of Soma-Workflow, so I wonder whether the Soma-Workflow database has become corrupted.
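For background, SQLite caps the number of bound parameters in a single statement (the SQLITE_MAX_VARIABLE_NUMBER compile-time limit, 999 in older releases and 32766 since SQLite 3.32). A query built as `DELETE ... WHERE id IN (?,?,...)` with more placeholders than that, which a workflow cleanup over many jobs could plausibly produce, raises exactly the error in your traceback. A minimal sketch reproducing it (the `jobs` table is purely illustrative, not Soma-Workflow's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER)")

# 300000 placeholders: well above any default or compile-time
# value of SQLITE_MAX_VARIABLE_NUMBER, so the statement fails.
n = 300000
placeholders = ",".join("?" * n)
try:
    conn.execute("DELETE FROM jobs WHERE id IN (%s)" % placeholders,
                 list(range(n)))
except sqlite3.OperationalError as e:
    print(e)  # too many SQL variables
```

The usual workaround on the application side is to delete in chunks (e.g. batches of a few hundred ids per statement) so each statement stays under the limit.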
If you are using local computing, you can exit BrainVISA and remove the file(s) $HOME/.soma-workflow/soma_workflow*.db
(you can also clean up the remaining temporary transfer files in $HOME/.soma-workflow/transfered_files/*/*).
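Concretely, for a local setup the cleanup above can be done from a terminal. The paths are the defaults just mentioned; double-check them before deleting if you have customized your Soma-Workflow configuration:

```shell
# Exit BrainVISA first, then remove Soma-Workflow's local SQLite database(s)
SWF_DIR="$HOME/.soma-workflow"
rm -f "$SWF_DIR"/soma_workflow*.db
# Optionally also clean up leftover temporary transfer files
rm -rf "$SWF_DIR"/transfered_files/*/*
```

The database is recreated automatically the next time a workflow is run, so removing it only discards the recorded workflow history, not your data.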
If you are using remote (client/server) computing, you can use soma_workflow_gui: in the connection dialog there is a button to kill the servers and remove the database.
If the problem persists, we'll have to investigate further...