
Re: [modeller_usage] loops and disulfide



Hello Joel,

I'll send you something that I think is exactly what you want. Check MyModel.py.

I use two scripts to model my stuff when I want to define custom restraints. I use two scripts because parallel processing just doesn't like having everything in one script, don't ask me why :x

The first is build_model.py; it's the main script that MODELLER runs. It has a few extra features, but don't mind those.

The important lines are:

(line 11)   from MyModel import MyModel

(line 123)  a = MyModel(env,

This just imports and uses the MyModel class defined in the second script, MyModel.py. That second one has what you need. Basically, it defines a new class:

class MyModel(loopmodel):

that inherits from loopmodel instead of automodel. It then defines a method called special_patches that holds my custom restraints, including the disulfide bridges:

self.patch(residue_type='DISU', residues=(self.residues['981:B'], self.residues['984:B']))
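Stripped down, the whole trick is just this sketch (not my exact file: the try/except stand-in is only there so the skeleton can be read and run on a machine without MODELLER, and the residue IDs are from my protein, so substitute your own):

```python
try:
    from modeller.automodel import loopmodel   # the real MODELLER class
except ImportError:
    class loopmodel(object):                   # stand-in so this sketch runs anywhere
        pass

class MyModel(loopmodel):
    """loopmodel subclass; special_patches is called once before optimization."""
    def special_patches(self, aln):
        # DISU patches a pair of CYS residues into a disulfide bridge
        # (residue numbers and chain B are examples from my protein)
        self.patch(residue_type='DISU',
                   residues=(self.residues['981:B'], self.residues['984:B']))
```

Then you write a = MyModel(env, alnfile=..., knowns=..., sequence=...) exactly where you would have written a = loopmodel(env, ...), and keep setting a.loop.starting_model and friends as usual.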

Take a look at the scripts, and if you have any doubts just write back :)

Regards!
João [ .. ] Rodrigues

(Blog)   http://doeidoei.wordpress.com
(Skype) rodrigues.jglm


On Mon, Nov 9, 2009 at 7:00 PM, Joel Tyndall wrote:

Hi folks,

I’m looking to generate a model with a disulfide constraint as well as loop refinement. I am still getting my head around the Python scripts. In the old TOP files one could do multiple things in a single file, but I can’t seem to get past this issue with the Python scripts. I can run things separately, but I am at a loss trying to combine them. I seem to be stuck with

a = loopmodel(env, ...

vs

a = MyModel(env, ...   (having defined the disulfide patch)

Could someone help, or direct me to the right web page? Or is the process to actually run two separate scripts (generate the disulfide, then run loop refinement)?

Many thanks in advance

An example script would be awesome

J

_________________________________

Joel Tyndall, PhD

Senior Lecturer in Medicinal Chemistry
National School of Pharmacy
University of Otago
PO Box 56 Dunedin 9054
New Zealand                   

http://www.researcherid.com/rid/C-2803-2008

Pukeka Matua
Te Kura Taiwhanga Putaiao
Te Whare Wananga o Otago
Pouaka Poutapeta 56 Otepoti 9054
Aotearoa

Ph / Waea               +64 3 4797293
Fax / Waeawhakaahua     +64 3 4797034

 


_______________________________________________
modeller_usage mailing list
">
https://salilab.org/mailman/listinfo/modeller_usage


# MyModel.py

from modeller import *
from modeller.automodel import *    # Load the automodel and loopmodel classes


class MyModel(loopmodel):
    """Includes disulfides in a model"""
    
    def special_patches(self, aln):
    
        rsr = self.restraints
        at = self.atoms
        
        # Close Contacts from PDB
        #REMARK 500  ATM1  RES C  SSEQI   ATM2  RES C  SSEQI           DISTANCE          
        #REMARK 500   SG   CYS B   988     CD   GLN B   991              1.80            
        #REMARK 500   O    ARG B   929     N    GLY B   931              2.16            
        #REMARK 500   OG   SER B  1408     OD1  ASP B  1427              2.18            
        #REMARK 500   OE1  GLN B  1237     OH   TYR B  1239              2.18                

        # Close contacts from PDB file

        rsr.add(forms.gaussian(group=physical.xy_distance, feature=features.distance(at['O:922:B'], at['N:924:B']), mean=2.16, stdev=0.2))
        rsr.add(forms.gaussian(group=physical.xy_distance, feature=features.distance(at['OG:1401:B'], at['OD1:1420:B']), mean=2.18, stdev=0.2))
        rsr.add(forms.gaussian(group=physical.xy_distance, feature=features.distance(at['OE1:1230:B'], at['OH:1232:B']), mean=2.18, stdev=0.2))

        # Thioester
        # Mutated GLN to CYS
        self.patch(residue_type='DISU', residues=(self.residues['981:B'], self.residues['984:B'])) 

        # S-S Bonds

        self.patch(residue_type='DISU', residues=(self.residues['537:A'], self.residues['787:B']))
        self.patch(residue_type='DISU', residues=(self.residues['605:A'], self.residues['640:A']))
        self.patch(residue_type='DISU', residues=(self.residues['664:B'], self.residues['691:B']))
        self.patch(residue_type='DISU', residues=(self.residues['665:B'], self.residues['698:B']))
        self.patch(residue_type='DISU', residues=(self.residues['678:B'], self.residues['699:B']))
        self.patch(residue_type='DISU', residues=(self.residues['844:B'], self.residues['1484:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1072:B'], self.residues['1129:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1329:B'], self.residues['1460:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1360:B'], self.residues['1429:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1477:B'], self.residues['1482:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1489:B'], self.residues['1561:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1508:B'], self.residues['1632:B']))
        self.patch(residue_type='DISU', residues=(self.residues['1608:B'], self.residues['1617:B']))
# build_model.py
# coding=ISO-8859-15
#
# PIR2PDB script:
# takes as input a PIR alignment file and a PDB file from the ModeloMatic server,
# starts a modelling run with a local MODELLER installation,
# and wraps everything in a neat tar.gz package.

# 25.10.2009 - João Rodrigues

## Main Imports

from MyModel import MyModel
from modeller import *                      # Load standard Modeller classes
from modeller.automodel import *            # Load the automodel class
from modeller.parallel import *             # Load the parallel class, to use multiple processors
import os, sys, optparse, tarfile, time     # Automation

## Logger

class Logger:
    def __init__(self, stdout, filename):
        self.stdout = stdout
        self.logfile = open(filename, 'a')
        self.logfile.write('\n\nNew run at %s\n\n' % time.ctime())
    def write(self, text):
        self.stdout.write(text)
        self.logfile.write(text)
        self.logfile.flush()
    def close(self):
        # Close only the log file; sys.stdout itself should stay open
        self.logfile.close()


## Define & Process Arguments

def process_args():
    """To process arguments in the command line when calling the script"""

    parser = optparse.OptionParser()
    parser.add_option("-A", "--alignfile", action="store", dest="alignfile", help="Path to Alignment File in PIR Format")
    parser.add_option("-T", "--templatefile", action="store", dest="templatefile", help="Path to Template File in PDB Format")    
    parser.add_option("-P", "--processors", action="store", dest="processors", default=2, help="Number of processors to be used in the modelling run [Default: 2]")
    parser.add_option("-M", "--nummodels", action="store", dest="nummodels", default=100, help="Number of models to generate [Default: 100]")
    parser.add_option("-L", "--loopmodels", action="store", dest="loopmodels", default=10, help="Number of models to generate [Default: 10]")    
    parser.add_option("-R", "--refine_mode", action="store", dest="refine_mode", default='slow', help="Model refinement level (check: http://salilab.org/modeller/manual/node44.html) [Default: slow]")
    parser.add_option("-v", "--verbose", action="store_true", dest="verbose", default=False, help="Verbose Output")
    parser.add_option("-c", "--clean", action="store_true", dest="clean", default=False, help="Deletes Files after creating tar.gz archive [Default: False]")
    return parser.parse_args()

(opts, args) = process_args()

#print opts
#sys.exit(0)

if opts.alignfile is None or opts.templatefile is None:
    print 'USAGE: python build_model.py -T <template file> -A <alignment file>'
    print 'HELP:  python build_model.py -h'
    sys.exit(0)

## Find MODELLER Path
# Reads the MODINSTALL9v7 environment variable set by the MODELLER installation

modellerPath = os.popen('env | grep -i modinstall9v7').read().strip().split('=')[-1]

## Define Model and Template Names

# Convention: Model name is the first word present in *ali file followed by a _ (underscore) character
# MODEL_<name of template(s)>.ali
targetName = os.path.basename(os.path.abspath(opts.alignfile)).split('_')[0]

# Convention: Templates are the only PDB files in the folder
templateName = os.path.basename(os.path.abspath(opts.templatefile)).split('.')[0]

## Create target directory
i = 0
dirName = targetName
while 1:
    if not os.path.exists(dirName+'.tar.gz'):
        break
    else:
        i += 1
        dirName = dirName.split('_')[0]+'_'+str(i)

## Define LOG File

logger = Logger(sys.stdout, '%s.log' %dirName)
sys.stdout = logger

## Create the MODELLER job
j = job(modeller_path=os.path.join(modellerPath, "bin/modslave.py"))

## Set Number of Processors to Run Modelling Task & Append Slaves accordingly
# Convention: -PX is a flag passed to the script that defines the number of processors to be used
# Example: $MODELLER/bin/modpy.sh python PBP1_3DWK.py -P6 defines 6 processors
for i in range(int(opts.processors)):
    j.append(local_slave())

## Request verbose output (OPTIONAL)
if opts.verbose:
    log.verbose()

## Create and Setup a new MODELLER environment to build this model in
env = environ()
env.io.atom_files_directory = ['./'] # directories for input atom files
env.io.hetatm = True # Read in HETATM records from template PDBs

## Model Optimization

# Determine usage of automodel or loopmodel
# Convention: Alignment file has gaps (----) in the MODEL part then it has loops..

pirFile = [i for i in os.listdir('.') if i.split('.')[-1] == 'ali' and i.split('_')[0] == targetName][0]
# Note: str.find() returns an int, so compare against -1 (not the string '-1')
if ''.join(open(pirFile).read().split('>')[1].strip().split('\n')[3:]).find('-') != -1:
    use_loopmodel = True
else:
    use_loopmodel = False

# Use dope_loopmodel or dopehr_loopmodel in place of loopmodel to obtain better quality loops (but slower) #

md_level = {'very_slow':refine.very_slow, 'slow':refine.slow, 'fast':refine.fast, 'very_fast':refine.very_fast} # Check User Refinement Level

if use_loopmodel:
    a = MyModel(env,
                alnfile  = '%s_%s.ali' %(targetName, templateName),      # alignment filename
                knowns   = (templateName,),                              # codes of the templates
                sequence = targetName)                                   # code of the target

    # Loop Modelling
    a.loop.starting_model = 1
    a.loop.ending_model   = int(opts.loopmodels)
    a.loop.md_level       = md_level[opts.refine_mode]

else:
    a = automodel(env,
                alnfile  = '%s_%s.ali' %(targetName, templateName),      # alignment filename
                knowns   = (templateName,),                              # codes of the templates
                sequence = targetName)                                   # code of the target

# Set General Modelling Parameters

a.starting_model = 1    # Together with ending_model, keeps the total at most nummodels

# Sanity check: ensure at least one full model when nummodels < loopmodels
fullmodels = int(opts.nummodels) / int(opts.loopmodels)
if fullmodels < 1:
    fullmodels = 1
a.ending_model = fullmodels

# Optimization
# CG
a.library_schedule = autosched.slow          # Very thorough VTFM optimization
a.max_var_iterations = 300                   # Length of the optimizations
a.max_molpdf = 1e6                           # Do not stop unless obj. func. > 1E6
# MD
a.md_level = md_level[opts.refine_mode]      # Model refinement level

a.use_parallel_job(j)                        # Use the job for model building
a.make()                                     # do the actual homology modeling

## Wrap up && Clean Up

tar = tarfile.open(dirName+'.tar.gz', 'w:gz')

for eachFile in os.listdir('.'): # Copy relevant files to directory (pdbs, modeller output, alignments)
    if eachFile.startswith(targetName) and not os.path.isdir(eachFile) and eachFile.split('.')[-1] != 'gz':
        tar.add(eachFile)
        if opts.clean:
            os.remove(eachFile)
tar.close()

print 'DONE'
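In case it helps, this is roughly how I invoke it (the filenames are placeholders; modpy.sh is MODELLER's wrapper script that sets up its Python environment, as mentioned in the comments above):

```shell
# 4 slave processes, 50 models total, 10 loop models each, slow refinement
$MODELLER/bin/modpy.sh python build_model.py -A TARGET_TEMPLATE.ali -T TEMPLATE.pdb -P 4 -M 50 -L 10 -R slow
```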