pipeline:meridien:sp_polishing [2021/03/16 10:15] aali
: After running the polishing routine, the wrapper automatically converts the shiny.star file in the Output folder to a BDB stack and saves it in the Polish Stack folder.
</code>
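To verify the conversion, the resulting BDB stack can be listed. This is only a sketch: it assumes EMAN2's e2bdb.py is on your PATH and that Polish is the name of the Polish Stack folder.

```shell
# List the BDB databases in the Polish Stack folder (folder name assumed)
e2bdb.py Polish
```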
; %%--%%training_params : If you want to run the training part of the polishing, leave this as None; otherwise provide the optimize_params.txt file. (default none)
; %%--%%min_no_particles : Number of particles to include for training. (default 5000)
; %%--%%submission_template : Submission template for the MPI command. When running this tool on a cluster, always provide a submission template. The number of MPI processes and threads in the submission template must match the MPI procs and threads values specified below. (default none)
; %%--%%submission_command : Submission command for the cluster. (default sbatch)
; %%--%%relion_mpirun_executable : Since more than one mpirun environment can be installed on a workstation or cluster, it can sometimes be necessary to provide the RELION-specific mpirun executable. Type which -a mpirun in a terminal and choose the one that RELION requires. (default mpirun)
; %%--%%relion_polishing_executable : As with the mpirun executable, it can sometimes be necessary to provide the specific RELION polishing executable. Type which relion_motion_refine_mpi and copy the path here. (default relion_motion_refine_mpi)
; %%--%%mpi_procs : The number of MPI processes used for RELION multiprocessing. For the training part only one MPI process is used. When running the program on a cluster, this value must match the one used in the submission template. (default 1)
; %%--%%no_of_threads : The number of threads used during polishing. When running the program on a cluster, this value must match the one used in the submission template. (default 1)
\\
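Putting the options above together, a cluster run might look like the sketch below. Only the option names are documented above; the wrapper's program name (sp_polishing.py), the paths, and the numeric values are assumptions for illustration.

```shell
# Hedged sketch: program name, paths, and values are assumptions;
# only the option names come from the parameter list above.
sp_polishing.py \
  --training_params=optimize_params.txt \
  --submission_template=my_template.sh \
  --submission_command=sbatch \
  --relion_mpirun_executable=/path/to/relion/mpirun \
  --relion_polishing_executable=/path/to/relion_motion_refine_mpi \
  --mpi_procs=12 \
  --no_of_threads=4
```

Note that the 12 MPI processes and 4 threads shown here would have to match the values requested in my_template.sh.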
===== =====
To run it properly on a cluster, do the following before running:
<code>
module unload sphire
module load sphire/
module load relion
==== Reference ====
Jasenko
==== Developer Notes ====