PsychoPy - Psychology Software for Python, Release 3.0.2

Jonathan Peirce
Jan 29, 2019

Contents

1   About PsychoPy
2   General issues
3   Installation
4   Manual install
5   Getting Started
6   Builder
7   Coder
8   Running studies online
9   Reference Manual (API)
10  Troubleshooting
11  Recipes ("How-to"s)
12  Frequently Asked Questions (FAQs)
13  Resources (e.g. for teaching)
14  For Developers
15  PsychoPy Experiment file format (.psyexp)
Python Module Index

Chapter 1: About PsychoPy

1.1 Citing PsychoPy

If you use this software, please cite one of the publications that describe it.

• Peirce, J. W., & MacAskill, M. R. (2018). Building Experiments in PsychoPy. London: Sage.
• Peirce, J. W. (2009). Generating stimuli for neuroscience using PsychoPy. Frontiers in Neuroinformatics, 2(10), 1-8. doi:10.3389/neuro.11.010.2008
• Peirce, J. W. (2007). PsychoPy - Psychophysics software in Python. Journal of Neuroscience Methods, 162(1-2), 8-13. doi:10.1016/j.jneumeth.2006.11.017

Citing these papers gives the reviewer/reader of your study information about how the system works and attributes some credit for its original creation. Academic assessment (whether for promotion or for getting appointed to a job in the first place) prioritises publications over making useful tools for others, so citations provide a way for the developers to justify their continued involvement in the development of the package.

Chapter 2: General issues

These are issues that users should be aware of whether they are using the Builder or the Coder view.

2.1 Monitor Center

PsychoPy provides a simple and intuitive way for you to calibrate your monitor, record other information about it, and then import that information into your experiment. The information is entered in the Monitor Center (Tools menu), which allows you to store details of multiple monitors and keep track of multiple calibrations for the same monitor.

For experiments written in the Builder view, you can import this information by simply specifying the name of the monitor that you wish to use in the Experiment Settings dialog. For experiments created as scripts, you can retrieve the information when creating the Window by naming the monitor that you created in Monitor Center, e.g.:

    from psychopy import visual
    win = visual.Window([1024, 768], monitor='SonyG500')

Of course, the name of the monitor in the script needs to match exactly the name given in the Monitor Center.

2.1.1 Real-world units

One of the particular features of PsychoPy is that you can specify the size and location of stimuli in units that are independent of your particular setup, such as degrees of visual angle (see Units for the window and stimuli). In order for this to be possible you need to inform PsychoPy of some characteristics of your monitor. Your choice of units determines the information you need to provide:

    Units                                   Requires
    'norm' (normalised to width/height)     n/a
    'pix' (pixels)                          screen width in pixels
    'cm' (centimeters on the screen)        screen width in pixels and screen width in cm
    'deg' (degrees of visual angle)         screen width (pixels), screen width (cm) and viewing distance (cm)
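If you prefer to define or update a monitor from code rather than through the Monitor Center dialog, the psychopy.monitors module stores the same information. The following is a minimal sketch, assuming a hypothetical monitor named 'SonyG500' and placeholder measurements that you would replace with your own:

    from psychopy import monitors, visual

    # Create (or load, if already saved) a monitor definition by name
    mon = monitors.Monitor('SonyG500')
    mon.setSizePix([1024, 768])  # screen resolution in pixels
    mon.setWidth(40.0)           # screen width in cm
    mon.setDistance(57.0)        # viewing distance in cm
    mon.saveMon()                # save so Monitor Center and later scripts can reuse it

    # The Window accepts either the Monitor object or its saved name
    win = visual.Window([1024, 768], monitor=mon, units='deg')

With the width, resolution and viewing distance stored, any of the units listed in the table above can be used.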
2.1.2 Calibrating your monitor

PsychoPy can also store and use information about the gamma correction required for your monitor. If you have a Spectrascan PR650 (other devices will hopefully be added) you can perform an automated calibration, in which PsychoPy measures the gamma value that needs to be applied to your monitor. Alternatively this value can be entered manually into the grid to the right of the Monitor Center. To run a calibration, connect the PR650 via the serial port and, immediately after turning it on, press the Find PR650 button in the Monitor Center.

Note that if you don't have a photometer to hand, PsychoPy includes a method for determining the necessary gamma value psychophysically (see gammaMotionNull and gammaMotionAnalysis in the demos menu).

The two additional tables in the Calibration box of the Monitor Center provide conversions from the DKL and LMS colour spaces to RGB.

2.2 Units for the window and stimuli

One of the key advantages of PsychoPy over many other experiment-building software packages is that stimuli can be described in a wide variety of real-world, device-independent units. In most other systems you provide the stimuli at a fixed size and location in pixels, or as a percentage of the screen, and then have to calculate how many cm or degrees of visual angle that was. In PsychoPy, after providing information about your monitor via the Monitor Center, you can simply specify your stimulus in the unit of your choice and let PsychoPy calculate the appropriate pixel size for you.

Your choice of unit depends on the circumstances. For demos, the two normalised units ('norm' and 'height') are often handy because the stimulus scales naturally with the window size. For running an experiment it is usually best to use something like 'cm' or 'deg' so that the stimulus is a fixed size irrespective of the monitor/window.

For all units, the centre of the screen is represented by coordinates (0, 0); negative values mean down/left and positive values mean up/right.

2.2.1 Height units

With 'height' units everything is specified relative to the height of the window (note: the window, not the screen). As a result, the dimensions of a screen with a standard 4:3 aspect ratio range from (-0.6667, -0.5) in the bottom left to (+0.6667, +0.5) in the top right. For a standard widescreen (16:10 aspect ratio) the bottom left of the screen is (-0.8, -0.5) and the top right is (+0.8, +0.5). This type of unit can be useful in that it scales with window size, unlike degrees of visual angle or centimeters on screen, but stimuli remain square, unlike normalised units. Obviously it has the disadvantage that the locations of the right and left edges of the screen have to be determined from a knowledge of the screen dimensions (which can be read at any point from the Window.size attribute).

Spatial frequency: cycles per stimulus (so it scales with the size of the stimulus).
Requires: no monitor information.

2.2.2 Normalised units

In normalised ('norm') units the window ranges in both x and y from -1 to +1. That is, the top right of the window has coordinates (1, 1) and the bottom left is (-1, -1). Note that, in this scheme, setting the height of the stimulus to 1.0 will make it half the height of the window, not the full height (because the window has a total height of 2, running from -1 to +1). Also note that specifying equal width and height will not produce a square stimulus if your window is not square; the image will have the same aspect ratio as your window. For example, on a 1024x768 window, size=(0.75, 1) will be square.

Spatial frequency: cycles per stimulus (so it scales with the size of the stimulus).
Requires: no monitor information.
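As a rough sketch of the aspect-ratio behaviour described above (the window size and colours are arbitrary choices for illustration):

    from psychopy import visual, core

    win = visual.Window([1024, 768], units='norm')

    # width=2, height=2 spans the full window, since norm coordinates run from -1 to +1
    background = visual.Rect(win, width=2, height=2, fillColor='gray')

    # On a 1024x768 (4:3) window this rectangle is square:
    # 0.75/2 * 1024 = 384 px wide and 1/2 * 768 = 384 px high
    square = visual.Rect(win, width=0.75, height=1.0, fillColor='white')

    background.draw()
    square.draw()
    win.flip()
    core.wait(2.0)
    win.close()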
2.2.3 Centimeters on screen

Set the size and location of the stimulus in centimeters on the screen.

Spatial frequency: cycles per cm.
Requires: information about the screen width in cm and the screen size in pixels.
Assumes: pixels are square. This can be verified by drawing a stimulus with matching width and height and checking that it is in fact square. For a CRT this can be controlled by setting the size of the viewable screen (settings on the monitor itself).

2.2.4 Degrees of visual angle

Use degrees of visual angle to set the size and location of the stimulus. This is, of course, dependent on the distance at which the participant sits from the screen as well as on the screen itself, so make sure that this is controlled, and remember to change the setting in Monitor Center if the viewing distance changes.

Spatial frequency: cycles per degree.
Requires: information about the screen width in cm and in pixels, and the viewing distance in cm.

There are actually three variants: 'deg', 'degFlat', and 'degFlatPos'.

'deg': Most people using degrees of visual angle choose to assume that a degree of visual angle spans the same number of pixels at all parts of the screen. This isn't actually true for standard flat screens: a degree of visual angle at the edge of the screen spans more pixels because it is further from the eye. For moderate eccentricities the error is small (a 0.2% error in size calculation at 3 deg eccentricity) but it grows as stimuli are placed further from the centre of the screen (a 2% error at 10 deg). For most studies this form of calculation is preferred, as it does not result in a warped appearance of visual stimuli, but if you need greater precision at far eccentricities then choose one of the alternatives below.

'degFlatPos': This accounts for the flatness of the screen when calculating the position coordinates of visual stimuli, but leaves size and spatial frequency uncorrected. This means that an evenly spaced grid of visual stimuli will appear warped in position but the stimuli themselves will keep a constant size and spatial frequency.

'degFlat': This corrects the degree calculations for the flatness of the screen at each vertex of your stimuli. Square stimuli in the periphery will, therefore, become more spaced apart, but they will also get larger and rhomboid in the pixels that they occupy.
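A short sketch of how these unit variants might be selected in a script; it assumes the hypothetical 'SonyG500' monitor from above, which must already store screen width, resolution and viewing distance:

    from psychopy import visual, core

    # 'deg' is the window default here; individual stimuli can override it
    win = visual.Window([1024, 768], monitor='SonyG500', units='deg')

    # A 2-deg grating at 3 deg eccentricity; sf is in cycles per degree
    central = visual.GratingStim(win, size=2, pos=(3, 0), sf=1.0)

    # The same stimulus placed with flat-screen-corrected position ('degFlatPos'),
    # leaving its size and spatial frequency uncorrected
    peripheral = visual.GratingStim(win, size=2, pos=(10, 0), sf=1.0, units='degFlatPos')

    central.draw()
    peripheral.draw()
    win.flip()
    core.wait(2.0)
    win.close()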