Less Wrong Sequences Pdf
Pages: 16 · File type: PDF · Size: 1020 KB
Print-friendly versions of the LessWrong Sequences. Enjoy!

Core sequences:

Mysterious Answers to Mysterious Questions. How to see through the many disguises of answers, beliefs, or statements that do not answer, say, or mean anything. The first (and probably the most important) core sequence on Less Wrong. [epub | pdf | markup]

Reductionism. The second core sequence on Less Wrong. How to take reality apart into simpler pieces, and how to live in this universe, where we have always lived, without feeling disappointed that complicated things are made of simpler things. Includes the Zombies and Joy in the Merely Real subsequences. [epub | pdf | markup]

Quantum Physics. A non-mysterious introduction to quantum mechanics, intended to be accessible to anyone who can grok algebra and complex numbers. Cleaning up the old confusion about QM is used to introduce basic issues in rationality (such as the technical version of Occam's Razor), epistemology, reductionism, naturalism, and the philosophy of science. Don't skip it, even though the exact reasons for that advice are hard to explain in advance of reading. [epub | pdf | markup]

Fun Theory. A specific theory of transhuman values. How much fun there is in the universe; whether we will someday run out of fun; whether we are having fun yet; whether we could be having more fun. Part of the complexity-of-value thesis; also part of a fully general answer to religious theodicy. [epub | pdf | markup]

Minor sequences: smaller collections of posts, usually depending on some, but not all, of the points made in the major sequences.

Map and Territory. A collection of introductory posts dealing with the fundamentals of rationality: the difference between the map and the territory, Bayes's theorem and the nature of evidence, why anyone should care about truth, and minds as reflective cognitive engines.
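The "Bayes's theorem and the nature of evidence" material in Map and Territory rests on one formula: P(H|E) = P(E|H)P(H) / P(E). A minimal sketch in Python, with hypothetical numbers chosen purely for illustration:

```python
def posterior(prior, likelihood, false_positive_rate):
    """P(H|E) via Bayes' theorem.

    prior               -- P(H), belief in hypothesis H before seeing evidence E
    likelihood          -- P(E|H), probability of E if H is true
    false_positive_rate -- P(E|~H), probability of E if H is false
    """
    # Total probability of the evidence: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# A 90%-sensitive test with a 10% false-positive rate, applied to a
# hypothesis with a 1% prior, leaves the posterior well under 10%:
p = posterior(prior=0.01, likelihood=0.9, false_positive_rate=0.1)
print(round(p, 3))  # → 0.083
```

The point the sequence draws from this arithmetic: strong-seeming evidence can still leave a low posterior when the prior is low enough.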
Note: the essay on Bayes's theorem was intentionally left out of this sequence; it deserves its own. [epub | pdf | markup]

LessWrong (also written Less Wrong) is a discussion forum founded by Eliezer Yudkowsky, focused on rationality and futurist thinking. It is operated by the Machine Intelligence Research Institute.

History. According to the LessWrong FAQ, the site grew out of Overcoming Bias, an earlier group blog focused on human rationality. Overcoming Bias began in November 2006, with artificial intelligence (AI) theorist Eliezer Yudkowsky and economist Robin Hanson as its principal contributors. In February 2009, Yudkowsky's posts were used as the seed material for the LessWrong community blog, and Overcoming Bias became Hanson's personal blog.

LessWrong is closely associated with effective altruism. GiveWell, an effective-altruism-focused charity evaluator, has benefited from outreach to LessWrong.

Roko's basilisk. In July 2010, LessWrong contributor Roko posted a thought experiment to the site in which an otherwise benevolent future AI system tortures those who did not work to bring the system into existence. The idea came to be known as Roko's basilisk, after Roko's claim that merely hearing about the idea would give the hypothetical AI system a stronger incentive to employ blackmail. Yudkowsky deleted Roko's posts on the topic, writing that he had done so because, although Roko's reasoning was mistaken, the topic should not be discussed publicly in case some version of the argument could be made to work. Discussion of Roko's basilisk was banned on LessWrong for several years thereafter.

Media coverage. LessWrong has been covered by Business Insider and Slate, and core concepts from LessWrong have been referenced in columns in The Guardian.
LessWrong has been mentioned briefly in articles on the technological singularity and on the work of the Machine Intelligence Research Institute (formerly the Singularity Institute), and in articles about online monarchists and neo-reactionaries, in which it was portrayed in a positive light.

Jargon and memes. The Less Wrong community uses an extensive set of in-house jargon and memes.

Meetups. There are international meetup groups around the world for people who subscribe to related ideas. Their connection to the main site has weakened in recent years; the broader community is now often referred to as the rationalist movement.

Current status. Less Wrong is now much less active than at its 2012 peak, with many major contributors having left to start their own blogs or otherwise join what is commonly called the Less Wrong diaspora.

Original Sequences. The Sequences are a series of essays written by Eliezer Yudkowsky between 2006 and 2009 on the blogs Overcoming Bias and Less Wrong. About half of these essays were organized into a number of thematically connected sequences of blog posts - hence the name. In 2015, the sequences were edited into the e-book Rationality: From AI to Zombies. The e-book leaves out some of the original posts and adds some essays that Yudkowsky wrote during the same period but that had never previously been collected into a named sequence. This page collects the old, deprecated sequences. Of these, Map and Territory, Mysterious Answers to Mysterious Questions, How to Actually Change Your Mind, and Reductionism were classified as core sequences. There is also a distinction between major and minor sequences, though this is based on a sequence's length rather than its importance. Bolded essays are more important than the other posts in a sequence, while italicized essays are relatively unimportant. Minor essays are less important still, and are sometimes left out of the sequence's index post altogether.
Some essays are indented to indicate that they are tangential or standalone developments of a previous essay.

Other formats. The sequences have been converted into eReader-compatible formats by several projects. Two abridged indexes of Yudkowsky's sequences are XiXiDu's guide and Academian's guide; the latter targets people who already have a science background.

The sequences include: Map and Territory; Mysterious Answers to Mysterious Questions; How to Actually Change Your Mind (see also: Privileging the Hypothesis, the Litany of Gendlin, the Litany of Tarski), with the subsequences Politics is the Mind-Killer, Death Spirals and the Cult Attractor, Seeing with Fresh Eyes, Noticing Confusion, Against Doublethink, Overly Convenient Excuses, and Letting Go; The Simple Math of Evolution; A Human's Guide to Words; Reductionism, including Joy in the Merely Real; Quantum Physics, an introduction to basic quantum mechanics; Yudkowsky's Coming of Age; Challenging the Difficult; and The Craft and the Community.

A sequence is a series of multiple posts on Less Wrong on the same topic, which coherently and fully explore a particular thesis. The original sequences were written by Eliezer Yudkowsky with the goal of creating a book on rationality. Since then, MIRI has compiled and edited the sequences into Rationality: From AI to Zombies. If you're new to Less Wrong, this book is the best place to start.

Rationality: From AI to Zombies. This e-book collects six books' worth of essays on the science and philosophy of human rationality. It is one of the best places to start for people who want to better understand the topics that come up on Less Wrong, such as cognitive bias, the map-territory distinction, metaethics, and existential risk. The e-book can be downloaded on a pay-what-you-want basis from intelligence.org. Its six books are in turn divided into twenty-six sections:

Book I: Map and Territory. An introduction to the Bayesian notion of rational belief. A. Predictably Wrong; B. Fake Beliefs; C. Noticing Confusion; D. Mysterious Answers.

Book II: How to Actually Change Your Mind. A guide to recognizing motivated reasoning and overcoming confirmation bias. E. Overly Convenient Excuses; F. Politics and Rationality; G. Against Rationalization; H. Against Doublethink; I. Seeing with Fresh Eyes; J. Death Spirals; K. Letting Go.

Book III: The Machine in the Ghost. Essays on the general theme of minds, goals, and concepts. L. The Simple Math of Evolution; M. Fragile Purposes; N. A Human's Guide to Words.

Book IV: Mere Reality. Essays on science and the physical world. O. Lawful Truth; P. Reductionism 101; Q. Joy in the Merely Real; R. Physics 201; S. Quantum Physics and Many Worlds; T. Science and Rationality.

Book V: Mere Goodness. Essays on ethics and the things people value in general. U. Fake Preferences; V. Value Theory; W. Quantified Humanism.

Book VI: Becoming Stronger. Essays on self-improvement, individual rationality, and the rationality of groups. X. Yudkowsky's Coming of Age; Y. Challenging the Difficult; Z. The Craft and the Community.

Other sequences by Eliezer Yudkowsky. The following essay collections come from the original sequences, an earlier version of much of the material in Rationality: From AI to Zombies:

Ethical Injunctions: on prohibitions worth following even when you have thought of a clever reason to believe they don't apply.

Metaethics sequence: a longer version of Value Theory, discussing the apparent arbitrariness of human morality.

Fun Theory sequence: on the complexity of human value, and what the universe might look like if things were much, much better. Fun Theory is the optimistic, future-oriented part of the theory of value, asking: how much fun is there in the universe; will we someday run out of fun; are we having fun yet; could we be having more fun?

Quantum physics sequence: a much longer version of Quantum Physics and Many Worlds, venturing further into the implications of physics for our concepts of personal identity and time.
Other collections from the same time period (2006-2009) include:

The Hanson-Yudkowsky AI-Foom Debate: a blog conversation between Eliezer Yudkowsky and Robin Hanson on the topic of intelligence explosion and how worried we should be about superintelligent AI.

Free Will: Yudkowsky's answer to the challenge he poses in Rationality: From AI to Zombies to come up with an explanation of the human sense that we have free will.

Yudkowsky also wrote further sequences. Sequences of essays by other authors include Scott Alexander's sequences and Luke Muehlhauser's The Science of Winning at Life.