 : After running the polishing routine, the wrapper automatically converts the shiny.star file in the Output folder to a BDB stack and saves it in the folder specified as Polish Stack.
  
</code>
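As a hypothetical illustration of how the converted stack can then be referenced in other SPHIRE/EMAN2 tools: BDB stacks are addressed with the ''bdb:'' prefix. The folder name ''Polish_Stack'' and the stack name ''data'' below are assumptions; substitute the values from your own run.

<code>
# BDB stacks are referenced as bdb:<folder>#<stack name>
# (folder and stack names here are assumptions; use your own)
STACK="bdb:Polish_Stack#data"
echo "$STACK"    # prints bdb:Polish_Stack#data
</code>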
  ; %%--%%training_params : If you want to run the training part of the polishing, leave this as None; otherwise provide the optimize_params.txt file. (default none)
  ; %%--%%min_no_particles : Number of particles to include for training. (default 5000)
  ; %%--%%submission_template : Submission template for the MPI command. When running this tool on a cluster, always provide a submission template. The number of MPI processes and threads in the submission template must match the MPI procs and threads values specified below. (default none)
  ; %%--%%submission_command : Submission command for the cluster. (default sbatch)
  ; %%--%%relion_mpirun_executable : Since more than one mpirun environment can be installed on a workstation or cluster, it can sometimes be necessary to provide the RELION-specific mpirun executable. Run ''which -a mpirun'' in a terminal and choose the one that RELION requires. (default mpirun)
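As a sketch of what such a submission template might look like, assuming a SLURM cluster (the default submission command is ''sbatch''). The partition name, resource numbers, and the command-line placeholder are all assumptions; check the templates shipped with your SPHIRE installation.

<code>
#!/bin/bash
#SBATCH --job-name=polishing
#SBATCH --partition=YOUR_PARTITION   # assumption: replace with your cluster's partition
#SBATCH --ntasks=12                  # MPI processes: must match the MPI procs value passed to the tool
#SBATCH --cpus-per-task=4            # threads: must match the threads value passed to the tool

XXX_SXCMD_XXX                        # assumed placeholder that the wrapper replaces with the actual command line
</code>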
=====   =====
  
To run the tool properly on a cluster at the MPI Dortmund, complete the following steps before running it. These settings may differ on other systems.
  
<code>
module unload sphire    # unload the default sphire module
  
module load sphire/clem/Polishing_SPHIRE_v1.2    # or: module load sphire/clr/Polishing_SPHIRE_v1.2
# Only needed if you have an installation other than the default one;
# since this tool is not yet released, it lives on a different git branch.
  
module load relion
==== Reference ====
  
Zivanov J., Nakane T., Scheres S.H.W. (2019) "A Bayesian approach to beam-induced motion correction in cryo-EM single-particle analysis." //IUCrJ// **6**: 5-17.
==== Developer Notes ====
  
pipeline/meridien/sp_polishing.txt ยท Last modified: 2021/03/16 10:44 by aali