Learning Methods for DNA Binding in Computational Biology

Mark Kon, Dustin Holloway, Yue Fan, Chaitanya Sai and Charles DeLisi

Abstract — We describe some recently developed and new applications of learning theory to the prediction of binding by transcription factors to DNA in yeast and humans, as well as the location of binding sites. This has potential applications to new types of biochemical binding as well. Some related algorithms for identifying binding site locations are described.

I. INTRODUCTION

We present here a review of recent results and some developing methods using kernel-based learning for pairing genes with DNA regions and with specific locations involved in genetic transcription control. Identification of such pairings is a fundamental part of understanding gene expression regulation in a cell, is a key to identifying fundamental biochemical interactions and pathways, and gives basic insight into the cell's development and its response to damage or stress. At the center of this process is the direct interaction of transcription factors (TFs) and the specific locations (cis-elements) they bind to in DNA. The biological binding mechanism is hard to solve chemically, or to predict using other non-experimental methods (fig. 1 below).

Fig. 1: DNA binding in Agrobacterium tumefaciens (E. Zhuang, K. Kahn, [62])

Diversity of information: Information which can be relevant to computational determination of TF-gene identifications is highly diverse, including such elements as microarray experiment correlations [27], [34], [6], gene ontology [11], and phylogeny [5] information. It is sparse, with some data available for some genes only, and very high-dimensional (DNA string counts give thousands of data dimensions; see §II).

Learning methods which extrapolate binding rules from examples have the potential for widespread and practical use.
Current approaches include artificial neural network (ANN) methods (e.g., the ANN-Spec algorithm [59], which combines weight matrices and artificial neural networks); radial basis function (RBF) methods [10]; kernel learning methods, e.g., the support vector machine (SVM) [58]; and k-nearest neighbor (kNN) methods.

Note that the notion of a TF binding with a gene here actually involves binding of DNA regions adjacent to the gene, such as introns or the upstream or downstream region of the gene (fig. 3 in §II), though we will sometimes loosely state that a gene binds a TF. A typical HTH-type DNA binding is illustrated in fig. 1.

Methods in computational biology are making it possible to avoid experimental procedures (see, e.g., [9], [18]) for identifying gene-TF interactions, using computed implementations of new mathematical and statistical methods.

Kernel learning: In almost all cases, any type of biological and computational information applied to identification of a TF binding target (i.e., a DNA location where it binds) needs to be put into a structured framework. Kernel-based learning methods, particularly support vector machines (SVMs) [56], [57], [7], [46], give a natural basis for integrating input data. These methods have been used in genomics in a number of ways, for example for predicting gene function from gene datasets [40], [28].

We complete the introduction by briefly mentioning some of the context relating kernel learning to prior neural net methods.

Manuscript received January 31, 2007. Mark Kon and Yue Fan are in the Department of Mathematics and Statistics, Boston University, Boston, MA 02215 (emails [email protected], [email protected]). M. Kon is the corresponding and presenting author. Telephone: 617-353-9549.
Kernel learning methods started as applications in artificial and then RBF-based neural networks [42], [15] before reaching their current formulations [51]. They have extended into statistical and machine learning methods, and are now used by theoretical and applied workers in neural networks, statistics, and computer science. Kernel learning has expanded over 20 years as neural paradigms (e.g., Grossberg equation feedforward nets [17]) have extended to statistically oriented RBF networks, and then to kernel approaches for regularization networks and SVMs. Perceptrons have effectively been generalized to SVMs, and feedforward neural nets have extended to include RBF nets, with kernels based on optimization of well-defined Lagrangians. Kernel learning is now used in computer science, statistics, artificial intelligence and computational biology, as well as in neuroscience.

We remark that some of the results here are abbreviated because of space limitations; they are given in more detail in [23] and [25].

Dustin Holloway is in the Departments of Molecular Biology, Cell Biology, and Biochemistry, Boston University, Boston, MA, USA 02215 (email [email protected]). Chaitanya Sai is in the Department of Cognitive and Neural Sciences at Boston University, Boston, MA 02215 (email [email protected]). Charles DeLisi is in Bioinformatics and Systems Biology, Boston University, Boston, MA 02215 (email [email protected]).

II. BIOLOGICAL BACKGROUND, FEATURE SPACE METHODS

The transformation of DNA information into working proteins in the cell involves transcription of DNA to pre-mRNA, which becomes mRNA and leads to a protein, which then determines cellular properties. We will consider only the start of this process (transcription); see fig. 2.

Fig. 2: Transcription: a strand of DNA is transcribed into RNA (http://biology.unm.edu/ccouncil/Biology_124/Summaries/T&T.html)

The promoter region of DNA near a given gene is the domain to which TFs generally bind; this is the upstream region in the case of yeast (see fig. 3). It contains regulatory sequences which attract transcription factors (TFs), whose presence is required for transcription of the DNA into RNA. Regulatory sequences are inexactly repeating patterns known as motifs, which stand out as similar patterns across species; their function is to attract specific TFs.

Fig. 3: Gene and upstream motif structure for yeast

To learn the rules for interaction with a given gene g (for fixed TF t), consider a training data set S := {(g_i, y_i)}_{i=1}^n, where g_i ∈ G and y_i ∈ {−1, 1}. Define the correct classifier by

  y_i = 1 if g_i attaches TF t (under any experimental condition); y_i = −1 otherwise.

We wish to learn f: G → {−1, 1}, defined by f(g) = y, the correct classification of g, from the examples in S. For the initial case of S. cerevisiae, we define the gene g formally by its upstream (binding region) sequence g = b_1 b_2 ⋯ b_N, of at most a fixed number N of bases (for a fixed orientation and side of the double helix). Thus g ∈ P^N, with P = {A, G, C, T}.

For any g ∈ G, we form a feature vector Φ(g) using information about g or its upstream region. This might be a string vector Φ_str(g) based on a lexical enumeration {STR_i} of all consecutive 6-mers (strings of length 6), e.g., STR_1 = ACAGTC, STR_2 = CGTACA, etc., appearing in g's upstream region. The component Φ_str(g)_i is then the count of STR_i in gene g's upstream region. The feature map Φ_str maps G into the string feature space F_str consisting of the possible string vectors Φ_str(g).

Another feature map is Φ_mot: G → F_mot, with Φ_mot(g)_i the upstream count of occurrences of MOT_i, where {MOT_i} is an enumeration of 104 motifs (known binding sequences for any TF in S. cerevisiae). Note that motif counting is a standard way of identifying t-binding and binding locations for t near g.
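As a concrete illustration of the string and motif feature maps, the following Python sketch counts 6-mers and (possibly overlapping) motif occurrences in an upstream sequence. This is a minimal stand-in, not the implementation used in the paper; the two-motif list is a toy placeholder for the 104-motif enumeration, and the sequence is assumed to contain only the bases A, C, G, T.

```python
from itertools import product

# Lexical enumeration {STR_i} of all 4^6 = 4096 consecutive 6-mers:
# AAAAAA, AAAAAC, AAAAAG, ...
BASES = "ACGT"
SIX_MERS = ["".join(p) for p in product(BASES, repeat=6)]
SIX_MER_INDEX = {s: i for i, s in enumerate(SIX_MERS)}

def phi_str(upstream: str) -> list:
    """String feature map Phi_str: count of each 6-mer in the upstream region."""
    counts = [0] * len(SIX_MERS)
    for i in range(len(upstream) - 5):
        counts[SIX_MER_INDEX[upstream[i:i + 6]]] += 1
    return counts

def phi_mot(upstream: str, motifs: list) -> list:
    """Motif feature map Phi_mot: count of (overlapping) motif occurrences."""
    counts = []
    for m in motifs:
        c = sum(1 for i in range(len(upstream) - len(m) + 1)
                if upstream[i:i + len(m)] == m)
        counts.append(c)
    return counts

# Toy upstream region and a placeholder motif list (real studies use 104 motifs).
u = "ACAGTCGTACA"
x_str = phi_str(u)
x_mot = phi_mot(u, ["ACA", "GTC"])
```

Stacking such count vectors for every gene yields the feature matrix on which the kernels of §III operate.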
Another useful map is Φ_exp(g), an expression data profile for g, i.e., a Boolean array indicating expression or lack of expression of g in a set of microarray experiments. TFs bind to the promoter region at transcription factor binding sites (TFBS).

Definitions: Assume a fixed species (e.g., the yeast S. cerevisiae) has genome G (its collection of genes). Generally, a fixed TF t attaches to a regulatory motif, a stretch of DNA between 5 and 20 base pairs long (fig. 3).

General feature maps: Consider a general map Φ: G → F, with F the feature space (a vector space), and x = Φ(g) ∈ F for g ∈ G. If f: G → {−1, 1} is a classification map, we may wish to assign the classification −1 or 1 directly to the feature vector x rather than to its corresponding gene g = Φ^{−1}(x). If Φ is invertible, the composition f ∘ Φ^{−1}: F → {−1, 1} accomplishes this, assigning a classification to each x ∈ F. In the example of the string feature space F = F_str, f ∘ Φ^{−1} maps a sequence x of string counts in F into yes (1) or no (−1).

We replace the data S = {(g_i, y_i)} with the equivalent data S' = {(x_i, y_i)}, with x_i = Φ(g_i). Given the data (examples) S', we seek to approximate the (unknown) exact classifier function f̂: F → {−1, 1} which correctly generalizes S'. For all (test) feature vectors x, this yields f̂(x) = y, with y the correct classification of x. The problem of determining f̂ may be reformulated as seeking f: F → ℝ, where

  f̂(x) = 1 if f(x) > 0;  f̂(x) = −1 if f(x) < 0;

for technical reasons it will be easier to work with procedures which (equivalently) find such an f instead.

Each feature map Φ gives rise to a kernel function K(x, x'), representing an imposed geometry on F in which K(x, x') = x · x' is the inner product of its arguments (see §III). Many of these data sets (each of which yields a different feature map Φ and kernel) give weak classifiers at best, but they can be combined using kernel methods to give better predictions (§III). The figure below depicts predictive accuracy for each kernel individually, and then for the combination of all kernels.
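One simple way to combine several weak kernels is to sum them after cosine normalization, so that each feature map contributes on a comparable scale. The sketch below illustrates this with plain linear kernels K(x, x') = x · x'; it is a generic combination rule for illustration only, and the weighting actually used in the studies cited here may differ (see [23]).

```python
import math

def linear_kernel(x, x2):
    """K(x, x') = x . x', the inner product imposed by one feature map."""
    return sum(a * b for a, b in zip(x, x2))

def normalized(K, x, x2):
    """Cosine-normalize so each kernel contributes on a comparable scale."""
    denom = math.sqrt(K(x, x) * K(x2, x2))
    return K(x, x2) / denom if denom else 0.0

def combined_kernel(xs, xs2):
    """Sum of normalized linear kernels over several feature representations.

    xs and xs2 are lists of feature vectors for the same two genes, one
    vector per feature map (string counts, motif counts, expression, ...).
    """
    return sum(normalized(linear_kernel, a, b) for a, b in zip(xs, xs2))

# Two genes, each represented under two toy feature maps.
g1 = [[1, 0, 2], [0, 1]]
g2 = [[2, 0, 4], [1, 0]]
k = combined_kernel(g1, g2)  # first map: perfectly aligned; second: orthogonal
```

The combined kernel is again a valid kernel (a sum of kernels is a kernel), so it can be handed directly to any kernel classifier such as the SVM of §III.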
Performances in terms of sensitivity (dashed), positive predictive value (PPV, solid) and F1 score (blue; the harmonic mean of sensitivity and PPV) are given. The binding data are from [61], [20], [30], [33]. We have used 26 feature maps (and kernels) in our yeast studies; Table 1 contains a summary of them.

Fig. 4: Predictive accuracies of the 26 kernels above using SVM to predict binding of genes with a fixed TF t (averaged over t). PPV is positive predictive value, while F1 is the harmonic mean of PPV and sensitivity (see [23]).

Table 1: List of feature maps used for S. cerevisiae binding prediction. A k-mer denotes a string of length k [23]. MT and DG are calculated using the EMBOSS toolbox.

III. SVM LEARNING AND THE KERNEL TRICK

Here is a brief overview of the SVM learning approach. The SVM provides a discriminant function of the form f(x) = w · x + b, with the sign of f(x) yielding the classification y = 1 (indicating that g = Φ^{−1}(x) binds t) and y = −1 otherwise. The optimal w can be found using quadratic programming.
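The discriminant f(x) = w · x + b can be sketched on toy data as follows. The paper obtains the optimal w by quadratic programming; in this illustration a Pegasos-style subgradient step stands in for the QP solver (a simplification, not the authors' method), which is enough to show how the learned (w, b) turns feature vectors into ±1 binding predictions.

```python
def train_svm(X, y, lam=0.01, epochs=500):
    """Soft-margin linear SVM trained by Pegasos-style subgradient steps
    (a stand-in for the quadratic-programming solver)."""
    Xa = [list(xi) + [1.0] for xi in X]  # fold the bias b into w
    w = [0.0] * len(Xa[0])
    t = 0
    for _ in range(epochs):
        for xi, yi in zip(Xa, y):
            t += 1
            eta = 1.0 / (lam * t)
            margin = yi * sum(wj * xj for wj, xj in zip(w, xi))
            w = [(1.0 - eta * lam) * wj for wj in w]  # regularization shrink
            if margin < 1.0:                          # hinge-loss subgradient
                w = [wj + eta * yi * xj for wj, xj in zip(w, xi)]
    return w[:-1], w[-1]                              # (w, b)

def classify(w, b, x):
    """y = +1 (gene predicted to bind the TF) if f(x) = w.x + b > 0, else -1."""
    f = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if f > 0 else -1

# Toy, linearly separable feature vectors for four "genes".
X = [[2.0, 2.0], [1.0, 3.0], [-2.0, -2.0], [-1.0, -3.0]]
y = [1, 1, -1, -1]
w, b = train_svm(X, y)
preds = [classify(w, b, xi) for xi in X]
```

In practice the classifier is applied to held-out genes, and the same machinery works with any of the kernels above via the kernel trick, replacing the inner products by K(x, x').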