Star TEX: The Next Generation


Didier Verna

TUGboat, Volume 33 (2012), No. 2

Abstract

While TEX is unanimously praised for its typesetting capabilities, it is also regularly blamed for its poor programmatic offerings. A macro-expansion system is indeed far from the best choice in terms of general-purpose programming. Several solutions have been proposed to modernize TEX on the programming side. All of them currently involve a heterogeneous approach in which TEX is mixed with a full-blown programming language. This paper advocates another, homogeneous approach in which TEX is first rewritten in a modern language, Common Lisp, which serves both at the core of the program and at the scripting level. All programmatic macros of TEX are hence rendered obsolete, as the underlying language itself can be used for user-level programming.

Prologue

  TEX
  The [final] frontier.
  These are the voyages,
  Of a software enterprise.
  Its continuing mission:
  To explore new tokens,
  To seek out a new life,
  New forms of implementation.

1 Introduction

In 2010, I asked Donald Knuth why he chose to design and implement TEX as a macro-expansion system rather than as a full-blown, procedure-based programming language. His answer was twofold:

1. He wanted something relatively simple for his secretary, who was not a computer scientist.
2. The very limited computing resources at that time practically mandated the use of something much lighter than a complete programming language.

The first part of the answer left me with a slight feeling of skepticism. It remains to be seen that TEX is simple to use. Although it probably is for simple things, its programmatic capabilities are notoriously tricky. The second part of the answer, on the other hand, was both very convincing and arguably now obsolete as well. Time has passed and the situation today is very different from what it was 30 years ago. The available computing power has grown exponentially, and so have our overall skills in language design and implementation.

Several ideas on how to modernize TEX already exist. Some of them have actually been implemented. In this paper, we present ours. The possible future that we would like to see happening for TEX is somewhat different from the current direction(s) TEX's evolution has been following. In our view, modernizing TEX can start with grounding it in an old yet very modern programming language: Common Lisp.

Section 2 clarifies what our notion of a "better", or "more modern", TEX is. Section 3 presents several approaches sharing a similar goal. Section 4 justifies the choice of Common Lisp. Finally, section 5 outlines the projected final shape of the project.

2 A better TEX

The TEX community has this particularity of being a mix of two populations: people coming from the world of typography, who love it for its beautiful typesetting, and people coming from the world of computer science, who appreciate it for its automation and similarity to a programming language. It is true that the way TEX works is much closer to that of a compiled language than a WYSIWYG editor.

In both worlds, TEX is unanimously acclaimed for the quality of its typesetting. This shouldn't be surprising, as it has always been TEX's primary objective. The question of its programmatic capabilities, however, is much more arguable. People unfamiliar with programming in general easily acknowledge that TEX's programmatic interface is not trivial to use. For people coming from the world of computer science, it is even more obvious that TEX is no match for a real programming language, partly due to its macro-expansion nature.

Let us recall that TEX was not originally meant to be a programming language. To quote its author, Donald Knuth [8]:

  I'm not going to design a programming language; I want to have just a typesetting language. [...] In some sense I put in many of TEX's programming features only after kicking and screaming.

From this perspective, it seems natural to consider that a "better" TEX, today, would essentially deliver the same typesetting quality from a more modern programmatic interface. More precisely:

• access to the typesetting subset of TEX's primitives should be provided with a consistent syntax (in particular, no more need for \relax),
• the programmatic API (\def, \if, etc.) should be dropped in favor of support from a real programming language,
• the system should remain simple to use, at least for simple things, just as TEX is today,
• the system should remain highly extensible and customizable, just as TEX is today,
• and finally, preserving backward compatibility, although not considered mandatory, should also be considered.
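To make the wish list above more concrete, here is a minimal, purely hypothetical sketch of what a consistent, function-based user level could look like in Common Lisp. The names star-tex-sketch, typeset-paragraph and emphasized-quote are invented for this illustration and merely stand in for whatever primitives an actual system would expose; a trivial stub is included so that the fragment runs as ordinary ANSI Common Lisp.

  ;; Hypothetical sketch only: STAR-TEX-SKETCH and TYPESET-PARAGRAPH are
  ;; invented names, not part of any existing system.
  (defpackage :star-tex-sketch
    (:use :common-lisp))
  (in-package :star-tex-sketch)

  ;; Stub for an assumed typesetting primitive, so that the sketch actually runs.
  (defun typeset-paragraph (text &key (font :roman) (size 10))
    (format t "~&[~a @ ~apt] ~a~%" font size text))

  ;; A user-level command is just an ordinary function with keyword arguments;
  ;; there is no \def, no \relax, and no expansion order to fight.
  (defun emphasized-quote (author text)
    (typeset-paragraph (format nil "~a -- ~a" text author)
                       :font :italic :size 11))

  ;; Example use, e.g. at the REPL:
  (emphasized-quote "D. Knuth" "I want to have just a typesetting language.")

Nothing here prescribes the real interface; it only shows that, once the programmatic layer is a genuine language, user-level commands can be plain functions rather than token-rewriting macros.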
3 Alternatives

The idea of grounding TEX into a real programming language is not new. Let us mention some such attempts that we are aware of. eval4tex (http://www.ccs.neu.edu/home/dorai/eval4tex/) and sTEXme (http://stexme.sourceforge.net/) both use Scheme (another dialect of Lisp). PerlTEX [13, 15], as its name suggests, chooses Perl. QaTEX/PyTEX [4] use Python, and finally, also as its name suggests, LuaTEX (http://www.luatex.org) employs Lua.

These approaches, although motivated more or less by the same general idea, work in very different ways. Some of them wrap TEX in a programming language by giving the language access to TEX's internals. Others wrap a programming language in TEX by allowing TEX to execute code (LuaTEX belongs to this category). Some do both. For instance, sTEXme ships with an extended Scheme engine that can access TEX's internals, as well as a modified TEX engine that can evaluate Scheme code.

Some of them allow authors to write TEX macros in a different language (PerlTEX lets you write TEX macros in Perl). Others, like QaTEX, aim at completely getting rid of TEX macros so that all programmatic functionality is written in another language (Python, with PyTEX in that case). This particular case is closer to what we have in mind.

Finally, some approaches use a synchronous dual-process scheme in which both TEX and the programming language of choice run in parallel, communicating either via standard input/output redirection, or by file or socket I/O. That is the case of sTEXme and PerlTEX. Others, like eval4tex, use a multi-pass scheme instead. In a first pass, the "foreign" code is extracted and sent to the programming language. The programming language in question executes its code and sends it back to TEX. In the final pass, TEX is left with only regular TEX macros to process.

In spite of all these variations, it is worth stressing that all these approaches have something in common: they are heterogeneous. They involve both a programming language engine on one hand, and the original, though possibly modified, TEX engine on the other hand. Even LuaTEX, which is somehow more integrated than the other alternatives, is built like this: a Lua interpreter is embedded in a more or less regular TEX, written in WEB and C.

The idea that we suggest in this paper is that another, fully integrated approach is also possible. In this approach, another programming language would be used to completely rewrite TEX and provide the desired programmatic layer at the same time. We are aware of at least one previous attempt at a fully integrated approach. NTS (http://nts.tug.org), the "New Typesetting System", was supposed to be a complete re-implementation of TEX in Java, but the project was never widely adopted. We don't think that this project's demise indicates in any way that the approach is doomed in general. On the contrary, the remaining sections explain why we think that Common Lisp is a very good candidate for it.

4 Common Lisp: why?

Lisp is a very old language. It was invented by John McCarthy in the late 1950s [9]. Contrary to what many people seem to think, however, being old doesn't imply being obsolete. In this case, it is a synonym for being mature and modern. When Lisp was invented, it was in fact way ahead of its time, to the point that even its inventor hadn't realized the extent of his creation. A sign of this is the recent incorporation of features that Lisp already provided 50 years ago into so-called "modern" programming languages, such as C# with the addition of dynamic types, or C++ and Java with the addition of lambda (anonymous) functions.
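For readers who have never seen the language, the following short fragment shows what two of those features look like in Lisp itself. It is generic, standard Common Lisp added purely for illustration, unrelated to any of the projects discussed above, and should run unchanged on any ANSI-conforming implementation.

  ;; Anonymous functions and higher-order functions, straight from the standard.
  (defun squares-of-evens (numbers)
    "Return the squares of the even elements of NUMBERS."
    (mapcar (lambda (n) (* n n))
            (remove-if-not #'evenp numbers)))

  ;; Dynamic typing: a single list may freely mix objects of different types,
  ;; and TYPE-OF inspects them at run time.
  (defparameter *mixed* (list 1 "two" 3.0 'four))

  ;; Example use at the REPL:
  ;;   (squares-of-evens '(1 2 3 4 5))  => (4 16)
  ;;   (mapcar #'type-of *mixed*)       => four different type names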
4.1 Standardization

Common Lisp, in particular, was standardized in 1994 [1] (it was in fact the first object-oriented language to get an ANSI standard). Remember how stability was important to Donald Knuth for TEX? This means that even across different implementations (there are half a dozen or so), the core of the language will never change, and in fact hasn't for the last 20 years. Compare this to modern scripting languages such as Python or Ruby, for which the "standard" is essentially the current implementation of the current version of the language written by the author of the language.

The Common Lisp standard is fairly comprehensive: it includes not only the language core but a large library of functions. Of course, because the standard is 20 years old, it lacks several things that are considered important today (such as a multi-threading API). Every Common Lisp implementation provides its own version of non-standard features, but again,
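The stability argument is easy to see in practice. The fragment below uses only symbols defined by the 1994 ANSI standard, including CLOS, the standard object system mentioned above, and therefore behaves identically on every conforming implementation. It is a generic illustration of that point, not code from the project described in this paper; the class and function names are invented for the example.

  ;; Portable ANSI Common Lisp: classes and generic functions are part of
  ;; the standard, so this runs the same on any conforming system.
  (defclass glue ()
    ((natural :initarg :natural :reader glue-natural)
     (stretch :initarg :stretch :reader glue-stretch :initform 0)
     (shrink  :initarg :shrink  :reader glue-shrink  :initform 0)))

  (defgeneric describe-spacing (object)
    (:documentation "Print a human-readable description of a spacing object."))

  (defmethod describe-spacing ((g glue))
    (format t "~&glue: ~apt plus ~apt minus ~apt~%"
            (glue-natural g) (glue-stretch g) (glue-shrink g)))

  ;; Example use:
  (describe-spacing (make-instance 'glue :natural 12 :stretch 4 :shrink 2))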
Recommended publications
  • The Machine that Builds Itself: How the Strengths of Lisp Family Languages Facilitate Building Complex and Flexible Bioinformatic Models, by Bohdan B. Khomtchouk, Edmund Weitz, and Claes Wahlestedt
  • From JSON to JSEN through Virtual Languages, by Antonello Ceravola and Frank Joublin (Open Journal of Web Technology, vol. 8, issue 1, 2021)
  • The Semantics of Syntax: Applying Denotational Semantics to Hygienic Macro Systems, by Neelakantan R. Krishnaswami
  • Functional Programming Patterns in Scala and Clojure: Write Lean Programs for the JVM, by Michael Bevilacqua-Linn
  • Lisp, Jazz, Aikido: Three Expressions of a Single Essence, by Didier Verna
  • Lisp, Jazz, Aikido: Three Expressions of a Single Essence, by Didier Verna (HAL preprint hal-01813470)
  • From Live Coding to Virtual Being, by Nikolai Suslov and Tatiana Soshenina
  • Functional Programming in Lisp, by Neil T. Dantam (CSCI-561, Colorado School of Mines, Fall 2020)
  • Scheme in Scheme
  • Clojure (for the Masses), by Tero Kadenius and Tarmo Aidantausta (2011)
  • LISP: A Programmable Programming Language, by Christoph Müller (Offenes Kolloquium für Informatik, 2017)
  • Functional Programming: A PragPub Anthology. Exploring Clojure, Elixir, Haskell, Scala, and Swift, by Michael Swaine and the PragPub writers