Crowdsourcing User Reviews to Support the Evolution of Mobile Apps

Fabio Palomba¹, Mario Linares-Vásquez², Gabriele Bavota³, Rocco Oliveto⁴, Massimiliano Di Penta⁵, Denys Poshyvanyk⁶, Andrea De Lucia⁷
¹Delft University of Technology, The Netherlands; ²Universidad de los Andes, Colombia; ³Università della Svizzera italiana (USI), Lugano, Switzerland; ⁴University of Molise, Pesche (IS), Italy; ⁵University of Sannio, Italy; ⁶The College of William and Mary, USA; ⁷University of Salerno, Italy

Abstract

In recent software development and distribution scenarios, app stores are playing a major role, especially for mobile apps. On one hand, app stores allow continuous releases of app updates. On the other hand, they have become the premier point of interaction between app providers and users. After installing/updating apps, users can post reviews and provide ratings, expressing their level of satisfaction with apps, and possibly pointing out bugs or desired features. In this paper we empirically investigate, by performing a study on the evolution of 100 open source Android apps and by surveying 73 developers, to what extent app developers take user reviews into account, and whether addressing them contributes to apps' success in terms of ratings. In order to perform the study, as well as to provide a monitoring mechanism for developers and project managers, we devised an approach, named CRISTAL, for tracing informative crowd reviews onto source code changes, and for monitoring the extent to which developers accommodate crowd requests and follow-up user reactions as reflected in their ratings. The results of our study indicate that (i) on average, half of the informative reviews are addressed, and over 75% of the interviewed developers claimed to take them into account often or very often, and that (ii) developers implementing user reviews are rewarded in terms of significantly increased user ratings.

Keywords: Mobile app evolution, User reviews, Mining app stores, Empirical Study

1. Introduction

Online app stores are promoting and supporting a more dynamic way of distributing software directly to users, with release periods shorter than in the case of traditional software systems. For instance, periodically planned releases that follow an established road map have been replaced by continuous releases that become available for an upgrade with a cadence of a few weeks, if not days [63]. This phenomenon is particularly evident for (though not limited to) mobile apps, where releases are managed through online app stores, such as the Apple App Store [2], Google Play Market [28], or Windows Phone App Store [57]. The pressure for continuous delivery of apps is a constant in the mobile economy, in which users are demanding new and better features from the apps, and the ratings/reviews provided by the users are a strong mechanism to promote the apps in the market.

The distribution of updates (related to the introduction of new features and to bug fixes) through online app stores is accompanied by a mechanism that allows users to rate releases using scores (i.e., star ratings) and text reviews. The former (i.e., the score) is usually expressed as a choice of one to five stars, and the latter (i.e., the review) is a free-text description that does not have a predefined structure and is used to describe informally bugs and desired features. The review is also used to describe impressions, positions, comparisons, and attitudes toward the apps [9, 32, 38, 66, 71]. Therefore, online store reviews are free and fast crowd feedback mechanisms that can be used by developers as a backlog for the development process. Also, given this easy online access to app-store-review mechanisms, thousands of these informative reviews can describe various issues exhibited by the apps in certain combinations of devices, screen sizes, operating systems, and network conditions that may not necessarily be reproducible during regular development/testing activities.

Consequently, by analyzing ratings and reviews, development teams are encouraged to improve their apps, for example by fixing bugs or by adding commonly requested features. According to a Gartner report's recommendation [40], given the complexity of mobile testing, "development teams should monitor app store reviews to identify issues that are difficult to catch during testing, and to clarify issues that cause problems on the users' side". Moreover, useful app reviews reflect crowd-based needs and are a valuable source of comments, bug reports, feature requests, and informal user experience feedback [8, 9, 25, 38, 41, 48, 58, 60, 61, 66]. However, because of this easy access to the app stores and the lack of control over the reviews' content, relying on crowdsourced reviews for planning releases may not be widely used by mobile developers because (i) a large number of reviews needs to be analyzed, and (ii) some of these reviews may not provide tangible benefits to the developers [9].

In this paper we investigate to what extent app development teams can leverage crowdsourcing mechanisms for planning future changes, and how these changes impact user satisfaction as measured by follow-up ratings. Specifically, we present (i) a mining study conducted on 100 Android open source apps in which we linked user reviews to source code changes, and analyzed the impact of implementing those user reviews in terms of app success (i.e., ratings); and (ii) an online survey featuring 73 responses from mobile app developers, aimed at investigating whether they rely on user reviews, what kind of requirements developers try to elicit from user reviews, and whether they observe benefits in doing that. The survey does not aim at obtaining a different, subjective measure of success. Instead, it helps in understanding to what extent developers actually rely on user reviews to improve their apps.

To support our study and to provide developers and project managers with an automated mechanism to monitor whether and how user reviews have been implemented, we have developed an approach named CRISTAL (Crowdsourcing RevIews to SupporT App evoLution) able to identify traceability links between incoming app reviews and source code changes likely addressing them, and to use such links to analyze the impact of crowd reviews on the development process. Although approaches for recovering traceability links between requirements and source code [1, 51], or even between development mailing lists and code [4], exist, such approaches are not directly applicable in our context because of specific characteristics of user reviews in mobile markets. First, as shown by Chen et al. [9], not all the reviews can be considered useful and/or informative. Also, there is a vocabulary mismatch between user reviews and source code or issues reported in issue trackers. Unlike issue reports and emails, reviews do not refer to implementation details; moreover, because of the GUI-driven nature of mobile apps and the user interaction based on gestures, user reviews include mobile-specific GUI terms. In order to address these challenges, CRISTAL includes a multi-step reviews-to-code-changes traceability recovery approach that first identifies informative comments among reviews (based on a recent approach by Chen et al. [9]), and then traces crowd reviews onto commit notes and issue reports by leveraging a set of heuristics specifically designed for this traceability task.
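
To give a concrete feel for this kind of traceability step, the minimal Python sketch below filters reviews with a simple keyword heuristic and links the remaining ones to commit messages by token overlap. It is not CRISTAL's implementation: the keyword list stands in for the informative-review classifier of Chen et al. [9], and the tokenizer, stopword set, Jaccard similarity, and threshold are illustrative assumptions rather than the heuristics used in the paper.

```python
# Illustrative sketch only: a toy review-to-commit linker based on token overlap.
# The keyword list below is a crude stand-in for an informative-review classifier;
# the stopword set and the 0.15 threshold are assumptions made for this example.
import re
from typing import List, Tuple

INFORMATIVE_HINTS = {"crash", "bug", "fix", "freeze", "error", "add", "feature",
                     "slow", "battery", "login", "wish", "please", "support"}

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "on", "is", "it",
             "this", "that", "for", "with", "when", "my", "i"}

def tokenize(text: str) -> set:
    """Lowercase, split on non-letters, and drop stopwords."""
    return {t for t in re.split(r"[^a-z]+", text.lower()) if t and t not in STOPWORDS}

def is_informative(review: str) -> bool:
    """Crude stand-in for an informative-review classifier."""
    return bool(tokenize(review) & INFORMATIVE_HINTS)

def jaccard(a: set, b: set) -> float:
    """Token-set overlap between a review and a commit message."""
    return len(a & b) / len(a | b) if a | b else 0.0

def link_reviews_to_commits(reviews: List[str], commits: List[str],
                            threshold: float = 0.15) -> List[Tuple[str, str, float]]:
    """Return (review, commit, score) links whose overlap exceeds the threshold."""
    links = []
    for review in filter(is_informative, reviews):
        r_tokens = tokenize(review)
        for commit in commits:
            score = jaccard(r_tokens, tokenize(commit))
            if score >= threshold:
                links.append((review, commit, score))
    return sorted(links, key=lambda x: -x[2])

if __name__ == "__main__":
    reviews = ["App crashes when I rotate the screen during login",
               "Love the colors, five stars!"]
    commits = ["Fix crash on screen rotation in login activity",
               "Update translations"]
    for review, commit, score in link_reviews_to_commits(reviews, commits):
        print(f"{score:.2f}  {review!r}  ->  {commit!r}")
```

In practice, a lexical heuristic of this kind is only a starting point, precisely because of the vocabulary and abstraction mismatch discussed above; that gap is what motivates the dedicated, multi-step heuristics in CRISTAL.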

In summary, the paper makes the following contributions:

1. CRISTAL's novel reviews-to-code-changes traceability recovery approach, designed to deal with (i) the noise present in crowd reviews, and (ii) the inherent abstraction mismatch between reviews and developers' source code lexicon.

2. CRISTAL's monitoring mechanism. After linking reviews to changes, the premier goal of CRISTAL is to enable developers to track how many reviews have been addressed, and to analyze the ratings to assess users' reaction to these changes. More specifically, the ability to monitor the extent to which user reviews have been addressed over the projects' evolution history can be used as a support for release planning activities.

3. Results of an empirical study conducted on 100 Android apps. The study leverages CRISTAL to provide quantitative evidence on (i) how development teams follow suggestions contained in informative reviews, and (ii) how users react to those changes. The results of the study demonstrate that, on average, 49% of the informative reviews are implemented by developers in the new app release, and that this results in a substantial reward in terms of an increase in the app's rating. It should be noted that the value of responding to a user review of a mobile app has never been explored. Our analysis of app reviews and responses from 10,713 top apps in the Google Play Store shows that developers of frequently-reviewed apps never respond to reviews. However, we observe that there are positive effects of responding to reviews (users change their rating 38.7% of the time following a developer response), with a median rating increase of 20%.

4. Results of a survey conducted with 73 Android developers, aimed at investigating the developers' perception of user reviews, to what extent they address them, and whether they observe any tangible benefits from such an activity. The achieved results show that developers (i) strongly rely on user reviews when planning the new release of their apps, (ii) mainly look in user reviews for bugs experienced by their users or for suggestions for new features, and (iii) confirm the positive effect of implementing change requests embedded in the user reviews on the app's ratings.

5. A comprehensive replication package [69] that includes all the materials used in our studies.

Paper structure. Section 2 describes CRISTAL, while Section 3 reports results of a study aimed at evaluating