Welcome to PharmaSUG 2009 ...... 1
PharmaSUG 2009 Conference Committee ...... 2
PharmaSUG 2009 Section Chairs ...... 3
PharmaSUG 2009 Special Volunteers ...... 4
Consolidated Conference Schedule ...... 5
Conference Highlights ...... 9
Area Attractions ...... 12
SAS at PharmaSUG 2009 ...... 14
SAS® Keynote Speaker ...... 15
Conference Keynote Speaker ...... 16
Abstracts ...... 17
Pre-Post Conference Seminars ...... 88
Sponsors ...... 96
Exhibitions and Demonstrations ...... 98

Welcome to PharmaSUG 2009!

The PharmaSUG 2009 Conference Committee welcomes you to the annual meeting of the Pharmaceutical Industry SAS® Users Group. We are pleased to be located at the Hilton Portland & Executive Tower in Portland, Oregon.

The conference formally begins with the Opening Session. This year the conference committee will be introduced and we’ll have two outstanding speakers addressing our conference. Kathy Council, vice president of Publications from SAS® Institute, will talk about “Celebrating the SAS User - A history of the partnership between SAS and its user community”, and Representative Dr. Mitch Greenlick, elected to the Oregon House of Representatives (House District 33) in 2002, will talk about “Towards Universal Access to Health Care”. An excellent dinner will immediately follow.

We are pleased to again supplement the conference with pre- and post-conference training seminars. These seminars, ranging from beginning to advanced levels and taught by renowned experts, will be offered Saturday (May 30), Sunday (May 31), Wednesday afternoon (June 3) and Thursday (June 4). In addition to seminars, this year Toastmasters International and Toastmasters District 7 will be providing free speaker training on Sunday (May 31).

Two and a half days packed with paper presentations and invited Hands-on Workshops will begin on Monday morning and continue through Wednesday morning. SAS personnel will present selected papers. A record number of Poster presentations will be available for your viewing in the SAS Exhibit and Demo room.

The Exhibit and Demo room will be open all day on Monday and Tuesday, plus Wednesday morning. This year, we will have over 12,500 sq. ft. of space for exhibits and demonstrations! This is also the location of the snack breaks and the SAS User Appreciation Mixer on Monday evening.

Monday evening is the 5th annual Game Night. This year, again we’re holding a ‘no money exchange’ Texas Hold ‘Em poker tournament and a Ping-Pong tournament. Both tournaments are additional-fee events for participants, but spectators may attend for free. In addition to the tournaments, this year we are offering Yoga classes Monday and Tuesday evenings, 8:00 – 9:30 p.m., $20 per session payable at the door.

Portland offers many cultural and fun activities for all ages. We hope you can spend a few days prior to or after the conference to do some sightseeing in this versatile city. More information about things to do in Portland is contained later in this book.

Thank you for attending PharmaSUG 2009. We wish you a terrific conference that is both educational and fun. Welcome to Portland!

Syamala Ponnapalli, Academic Chair
Ellen Brookstein, Operations Chair

PharmaSUG 2009 Conference Committee

Academic Chair Registration Desk Coordinator Sponsor & Exhibitor Coordinators Syamala Ponnapalli Jacques Lanoue Iza Peszek ICON Clinical Research MannKind Corporation Comprehensive Neuro Sciences Matt Becker Freddie Paino Operations Chair PharmaNet Quintiles Ellen Brookstein Octagon Research Scholarship and Speaker Sharing Co- Media & Materials Coordinators ordinator Dan Downing Publications Coordinator Richard Allen CVS Caremark James Wu Peak Statistical Services Joshua Horstman Sanofi-Aventis First Phase Consulting, Inc Seminar Coordinators Treasurer Margaret Hung Newsletter Editors Cindy Song Stat-Tech Services, LLC Mal Foley Sanofi-Aventis Cecilia Mauldin Contractor/Trainer Comprehensive Neuro Sciences Cindy Song Registrar Sanofi-Aventis Elaine Dempsey Meal Coordinator Omnicare Clinical Research Ellen Brookstein Events Coordinator Octagon Research Kim Truett Assistant Registrar Brian Shilling KCT Data, Inc. Matt Becker Octagon Research PharmaNet

Volunteer Coordinators
Carol Matthews, United Biosource
Lynn Mullins, Statking Consulting

PharmaSUG 2009 Section Chairs

Applications Development
Mary Anne Hope, BCBS of Arizona
Gopal Rajagopal, Merck & Co Inc

Management
Jackie Lane, ICON Clinical Research
Pete Yribar, Genentech, Inc

Statistics & Pharmacokinetics
Guowei Wu, Merck & Co Inc
Xingshu Zhu, Merck & Co Inc

Coders Corner
Sandy Paternotte, PPD
Meera Kumar, Sanofi-Aventis

Posters
Jeanina Worden, ClinOps, LLC
Helen Wang, Merck & Co Inc

Technical Techniques
Jim Johnson, RPS
Eric Larson

Tutorials
Nancy Brucken, i3 Statprobe
Xiaohui Wang, Novartis

Data Management & Quality
Laura DiTullio, GE Healthcare
Dave Izard, Octagon Research

Public Health Research
Richard Allen, Peak Statistical Services
Paul Slagle, United Biosources

Hands-On Workshops
Susan Fehrer, BioClin Inc
Alissa Ruelle, PharmaNet

Regulatory Submissions & Standards
Eunice Ndungu, Merck & Co Inc
Cindy Lee Hsiu-yung, Eli Lilly

PharmaSUG 2009 Special Volunteers

Copyright Coordinator FDA & Industry Recruiter SAS Liason Lin Yan Sandra Minjoe Michael Smith Merck & Co Inc Genentech, Inc SAS Institute, Inc. Sandy Paternotte PPD, Inc Photographers Scholarship & Speaker Sharing Diane Carow Assistant Game Night Coordinators Quintiles Pamela Butler Truett Kim (ping pong) Sanofi-Aventis KCT Data, Inc. Speaker Training Yribar Pete (poker) Paul McDonald Webmaster Genentech, Inc PRA International Michelle Langston

Graphic Artists
Kim Riddell, KCRiddell, LLC
Ramona Martin, Rographix Art & Design Studio

Code Clinic
Kirk Paul Lafler, Software Intelligence Corp

Consolidated Conference Schedule

SATURDAY, May 30
7:30 AM - 8:00 AM   Seminar Registration – Seminar Attendees Only ...... Plaza Foyer
8:00 AM - Noon      Morning Seminar Sessions *
   1. CDISC De-mystified: An Introduction to CDISC ...... Broadway I/II
   2. Testing and Validating SAS® Programs in an FDA Regulated Environment ...... Broadway III
   3. Building the General-Purpose Program: Exploring the Developer’s Mindset and Toolbox ...... Broadway IV
Noon - 1:00 PM      Lunch – On Your Own
12:30 PM - 1:00 PM  Seminar Registration – Seminar Attendees Only ...... Plaza Foyer
1:00 PM - 5:00 PM   Afternoon Seminar Sessions *
   4. Practical CDISC: Implementing CDISC with SAS® ...... Broadway I/II
   5. Advanced ODS ...... Broadway III
   6. SAS® & XML ...... Broadway IV

SUNDAY, May 31
7:30 AM - 8:00 AM   Seminar Registration – Seminar Attendees Only ...... Plaza Foyer
8:00 AM - Noon      Morning Seminar Sessions *
   7. Advanced Reporting and Analysis Techniques for the SAS® Power User: It’s Not Just About the PROCs! ...... Broadway I/II
   8. Best Practices in Base SAS® Coding ...... Broadway III
Noon - 1:00 PM      Lunch – On Your Own
Noon - 4:00 PM      Presentation Rehearsal ...... Galleria North
12:30 PM - 1:00 PM  Seminar Registration – Seminar Attendees Only ...... Plaza Foyer
1:00 PM - 6:00 PM   Conference Registration Desk Open ...... Plaza Foyer
1:00 PM - 5:00 PM   Afternoon Seminar Sessions *
   9. Leadership 101: What Everyone Should Know About Managing SAS® Programmers ...... Broadway I/II
   10. Quick Results With SAS/GRAPH® Software ...... Broadway III
2:00 PM - 4:00 PM   Toastmasters Training on Public Speaking ...... Galleria South
5:00 PM - 6:00 PM   Presenter, Section Chair, and Volunteer Meeting ...... Galleria North
6:00 PM - 7:30 PM   Opening Session ...... Pavilion Ballroom
7:30 PM - 9:00 PM   Welcome Dinner ...... Pavilion Ballroom

MONDAY, June 1
7:30 AM - 9:00 AM   Light Breakfast ...... Pavilion Ballroom
7:30 AM - 10:00 AM  Conference Registration Desk Open ...... Plaza Foyer
8:30 AM - Noon      Morning Sessions
   Application Development (AD) ...... Galleria South
   Management (MA) ...... Galleria North
   Regulatory Submissions & Standards (RS) ...... Broadway III/IV
   Tutorials (TU) ...... Broadway I/II
8:30 AM - Noon      Hands-On Workshops (HOW) ...... Parlor
9:00 AM - Noon      Exhibit and Demo Room (includes Poster Displays) ...... Grand Ballroom
9:45 AM - 10:15 AM  Morning Break ...... Grand Ballroom I

* The cost for these events is NOT included in your registration fee.

Consolidated Conference Schedule (Continued)

MONDAY, June 1 (continued)
Noon - 1:30 PM      Networking Lunch ...... Pavilion Ballroom
1:30 PM - 5:00 PM   Afternoon Sessions
   Application Development (AD) ...... Galleria South
   Regulatory Submissions & Standards (RS) ...... Broadway III/IV
   Technical Techniques (TT) ...... Galleria North
   Tutorials (TU) ...... Broadway I/II
1:30 PM - 5:00 PM   Hands-On Workshops ...... Parlor
1:30 PM - 5:00 PM   Exhibit and Demo Room (includes Poster Displays) ...... Grand Ballroom
2:00 PM - 4:00 PM   Conference Registration Desk Open ...... Plaza Foyer
2:30 PM - 3:30 PM   Authors Explain Posters ...... Grand Ballroom
2:45 PM - 3:15 PM   Afternoon Break ...... Grand Ballroom I
5:30 PM - 6:30 PM   SAS User Appreciation Mixer ...... Grand Ballroom
8:00 PM - 9:30 PM   Yoga Class* ...... 3rd floor room
Game Night
7:00 PM - 8:00 PM   Ping Pong Warm-up and Practice* ...... Pavilion East
8:00 PM - 11:00 PM  Ping Pong Tournament* ...... Pavilion East
8:00 PM - 11:00 PM  Texas Hold ‘Em Poker Tournament* ...... Pavilion West
8:00 PM             Ice Cream Social ...... Plaza Foyer

TUESDAY, June 2
7:30 AM - 9:00 AM   Light Breakfast ...... Pavilion Ballroom
7:30 AM - 10:00 AM  Conference Registration Desk Open ...... Plaza Foyer
8:00 AM - Noon      Morning Sessions
   Application Development (AD) ...... Galleria South
   Coder’s Corner (CC) ...... Broadway III/IV
   Statistics & Pharmacokinetics (SP) ...... Broadway I/II
   Technical Techniques (TT) ...... Galleria North
   Tutorials (TU) ...... Broadway I/II
8:30 AM - Noon      Hands-On Workshops ...... Parlor
9:00 AM - Noon      Exhibit and Demo Room (includes Poster Displays) ...... Grand Ballroom
9:45 AM - 10:15 AM  Morning Break ...... Grand Ballroom I
Noon - 1:30 PM      Networking Lunch ...... Pavilion Ballroom
Noon - 1:30 PM      PharmaSUG 2010 Planning Meeting ...... Alexander’s
1:30 PM - 5:20 PM   Afternoon Sessions
   Application Development (AD) ...... Galleria South
   Data Management (DM) ...... Galleria South
   Coder’s Corner (CC) ...... Broadway III/IV
   Public Health Research (PR) ...... Galleria North
   Regulatory Submissions & Standards (RS) ...... Broadway III/IV
   Statistics & Pharmacokinetics (SP) ...... Broadway I/II
   Technical Techniques (TT) ...... Galleria North
1:30 PM - 3:00 PM   SAS Hands-On Workshops ...... Parlor
1:30 PM - 5:00 PM   Exhibit and Demo Room (includes Poster Displays) ...... Grand Ballroom
2:00 PM - 4:00 PM   Conference Registration Desk Open ...... Plaza Foyer
2:30 PM - 3:30 PM   Authors Explain Posters ...... Grand Ballroom
2:45 PM - 3:15 PM   Afternoon Break ...... Grand Ballroom I
6:00 PM - 8:00 PM   Volunteer Party (by invitation only) ...... Alexander’s
8:00 PM - 9:30 PM   Yoga Class* ...... 3rd floor rooms

* The cost for these events is NOT included in your registration fee.

Consolidated Conference Schedule (Continued)

WEDNESDAY, June 3
7:30 AM - 9:00 AM   Light Breakfast ...... Pavilion Ballroom
8:00 AM - 10:00 AM  Conference Registration Desk Open ...... Plaza Foyer
8:30 AM - 11:30 AM  Morning Sessions
   Data Management (DM) ...... Galleria South
   Management (MA) ...... Parlor
   Public Health Research (PR) ...... Galleria North
   Regulatory Submissions & Standards (RS) ...... Broadway III/IV
   Statistics & Pharmacokinetics (SP) ...... Broadway I/II
9:00 AM - 11:30 AM  Exhibit and Demo Room ...... Grand Ballroom
9:45 AM - 10:15 AM  Morning Break ...... Grand Ballroom I
11:30 AM - 1:00 PM  Closing Session ...... Pavilion Ballroom
12:45 PM - 2:00 PM  Seminar Registration & Box Lunch – Seminar Attendees Only ...... Registration Desk
1:30 PM - 5:30 PM   Afternoon Seminar Sessions *
   11. Insights into ADaM ...... Broadway I/II
   12. Merging, Combining and Sub-setting SAS® Datasets (Tricks, Traps, and Techniques) ...... Broadway III

THURSDAY, June 4
7:30 AM - 8:00 AM   Seminar Registration – Seminar Attendees Only ...... Plaza Foyer
8:00 AM - 5:00 PM   Seminar Sessions *
   13. Advanced Techniques in the SAS® Macro Language ...... Broadway I

* The cost for these events is NOT included in your registration fee.


Conference Highlights

Presenter, Section Chair, Volunteer Meeting

SUNDAY, 5:00 PM – 6:00 PM, Galleria North
All paper presenters, section chairs, and volunteers are requested to attend the Presenter, Section Chair, and Volunteer meeting on Sunday afternoon.

Opening Session, Keynote, and Dinner

SUNDAY, 6:00 PM – 9:30 PM, Pavilion Ballroom
PharmaSUG 2009 officially kicks off at the Opening Session on Sunday evening in the Pavilion Ballroom from 6:00 to 7:30. The opening session is your official welcome to the conference. Here you will hear a summary of the conference activities and be introduced to the conference chairs and conference planners. Our keynote speakers are Kathy Council of SAS Institute, who will present “Celebrating the SAS User - A history of the partnership between SAS and its user community”, and Representative Dr. Mitch Greenlick, elected to the Oregon House of Representatives from House District 33 (Northwest Portland), who will present “Towards Universal Access to Health Care”.

Badges

Your name badge will be your admission to the entire conference. It is required to attend the presentations and meal functions. You must wear your name badge in a visible location during the conference and all conference activities.

PharmaSUG 2010 Planning Meeting

TUESDAY, 12:00 PM - 1:30 PM, Alexander’s
Get involved in planning next year’s conference. YOUR ideas and input are important for planning future conferences. This is a working lunch, where we will enjoy a meal while contributing to the planning of next year’s conference.

Presentation Rehearsal

SUNDAY, 1:00 PM – 4:00 PM, Galleria North A presentation rehearsal room is available for presenters to polish their presentations. Prior sign-up outside the door is required.

Meals

Your conference registration fee includes the following meals: Sunday night reception and dinner plus lunch on Monday and Tuesday. A light continental breakfast on Monday, Tuesday, and Wednesday mornings is also included in your conference registration fee. See the Consolidated Schedule, pages 5-7, for meal times and locations.

Snack Breaks

Morning coffee/tea will be provided on Monday, Tuesday and Wednesday from 9:45-10:15 AM. Afternoon refreshments will be provided on Monday and Tuesday from 2:45-3:15 PM. All morning and afternoon snack breaks will be held in the Exhibit & Demo Room (Grand Ballroom).


SAS User Appreciation Mixer

MONDAY, 5:30 PM - 6:30 PM, Exhibit & Demo Room (Grand Ballroom)
Join SAS for a mixer and celebrate our continued shared successes. Mingle and relax with other PharmaSUG attendees and enjoy complimentary food and drinks.

Exhibit and Demo Room

MONDAY and TUESDAY, 9:00 AM – 12:00 PM, 1:30 PM - 5:00 PM; WEDNESDAY, 9:00 AM - 11:30 AM, Grand Ballroom
The Exhibit and Demo Room will be open Monday through Wednesday and features an extensive display of SAS products and services. SAS staff will be on hand to discuss and demonstrate the latest software and services available. Additional corporate vendors will be in the Exhibit and Demo Room. Participating vendors will provide live demos and descriptive product literature. Only vendors who register for the vendors’ exhibit are allowed to display their literature and products. One of these booths will contain information about PharmaSUG, including the 2010 conference, and the position referral manual. A diagram of the demo room will be available just inside that room.
New this year is the Code Clinic (Code Doctors). Have a free consultation with our Code Doctors: bring a printout and/or e-file containing your code, source data and error messages for a definitive diagnosis.

Hands-On Workshops

MONDAY and TUESDAY, Parlor We are once again pleased to offer our own Hands-on Workshop Section, this year in an auditorium style setting. Presenters were invited to participate, and SAS Institute provides the computers for our use. Abstracts for the Hands-On Workshops are listed later in this book.

Posters

MONDAY and TUESDAY, Viewing All Day, Exhibit and Demo Room (Grand Ballroom)
MONDAY and TUESDAY, 2:30 PM - 3:30 PM, Authors Explain Posters
Posters are another way to present technical information for viewing. These are displayed Monday and Tuesday for your viewing pleasure and learning. Abstracts for the Posters are listed later on in this program book. The papers that correspond to the posters will be posted on the PharmaSUG web site at www.pharmasug.org. You may talk to the author one-on-one about the material in the poster from 2:30 PM to 3:30 PM on both Monday and Tuesday.

Position Referral Manual

A Position Referral Manual with job postings and resumés will be on display during the conference at the PharmaSUG booth in Grand Hall East. There is no fee for placing job postings or resumés in the binder, but the material must fit into an 8.5" x 11" clear plastic sheet protector. You are strongly encouraged to place multiple copies of your material in the sheet protector so interested attendees may take one. We reserve the right to remove inappropriate material. Direct recruiting at the conference is prohibited.


Closing Session

WEDNESDAY, 11:30 AM – 1:00 PM, Pavilion Ballroom The Closing Session concludes the conference. It includes the announcement of the Best Paper Awards for each section, plans for PharmaSUG 2010, and lots of door prizes.

Web Site

After the conference, visit the PharmaSUG web site at www.pharmasug.org to read a wrap-up and review of PharmaSUG 2009, fill out the on-line conference evaluation form, learn who the best paper award winners were in each section, and see plans for PharmaSUG 2010 in Orlando, Florida!


Area Attractions

Places to Explore on Your Own!

The Hilton Portland & Executive Tower hotel is located in the heart of Portland's city center financial and entertainment districts. The Hilton's central location is within blocks of downtown Portland's best restaurants such as Jake's Famous Crawfish and Higgins Restaurant and Bar. Upscale shopping is nearby at Nordstrom, Saks Fifth Avenue and Nike Town. Area attractions include the Portland Art Museum, Rose Garden Arena, Oregon Zoo, OMSI, Oregon Convention Center and more…

Conference City
There are plenty of things to do in Portland for sure. The city offers many cultural and fun activities for all ages. So, spend a few days prior to or after the conference to do some sight-seeing in this versatile city.

• Rose Garden Arena - Home of the Portland Trailblazers basketball team
• Oregon Zoo - with over 1,000 mammals, birds & reptiles as well as the Washington Park & Zoo Railway
• Oregon Museum of Science & Industry - providing hours of entertainment for the young and young at heart with an IMAX theater, laser light shows, tours of the USS Blueback submarine and much more
• Tax free shopping at Saks Fifth Avenue, Nike Town, Nordstrom, Macy's and many others
• Portland Art Museum - featuring many artists as well as traveling exhibitions
• Oregon Historical Society - with over 85,000 artifacts
• Portland Classical Chinese Garden
• The annual Rose Festival Parade in June
• Willamette Valley Wineries



SAS at PharmaSUG 2009

We are particularly grateful to SAS Institute and to the many SAS employees for all the help and support they have given to PharmaSUG 2009. Just as SAS is always there for their users, SAS has been here for the organizers of PharmaSUG. A great big “thank you” has to go to SAS!

This section describes some of the ways that SAS is specifically present at this year’s conference.

SAS User Appreciation Mixer

MONDAY, 5:30 PM - 6:30 PM, Exhibit & Demo Room (Grand Ballroom)
Join SAS for a mixer and celebrate our continued shared successes. Mingle and relax with other PharmaSUG attendees and enjoy complimentary food and drinks.

Exhibits, Demo, and Publications

MONDAY, 9:00 AM – 12:00 PM, 1:30 PM – 6:30 PM, TUESDAY, 9:00 AM – 12:00 PM, 1:30 PM - 5:00 PM, WEDNESDAY, 9:00 AM - 11:30 AM, Grand Ballroom

The Exhibit and Demo Room features an extensive display of SAS products and services. SAS Staff will be on hand to discuss and demonstrate the latest software and services available.

Papers and Presentations

Below is a list of papers in which SAS developers are one or more of the authors. You can find the abstracts and timing for these papers in the Abstract Section in this conference program.

Paper      Presentation Title
SA-AD-01   The ODS Menu for All Appetites and Applications
SA-AD-02   Clinical Trial Reporting Using SAS/GRAPH SG Procedures
SA-CC-13   The Care and Feeding of SAS Macro Program Parameters
SA-PR-01   SAS and Life Sciences: Trends, Capabilities and Progress
SA-RS-01   CDISC Implementation Strategy: SAS® Clinical Data Integration Success Factors
SA-RS-02   Using SAS® Clinical Data Integration Server to Implement and Manage CDISC Standards
SA-RS-03   Supporting CDISC Standards in Base SAS Using the SAS Clinical Standards Toolkit
SA-ST-01   Introduction to Logistic Regression
SA-TT-01   Inline formatting with ODS Markup
SA-TT-02   Tiptoe Through the Templates

Paper      Hands-On Workshop Title
SA-HW-01   More Tips and Tricks for Creating Multi-Sheet Microsoft Excel Workbooks the Easy Way with SAS®
SA-HW-02   A SAS Programmer’s Guide to SAS Enterprise Guide

Additionally, two of our pre-conference seminars were presented by SAS authors:
SEMINAR   TITLE
6         SAS® & XML
8         Best Practices in Base SAS® Coding

SAS® Keynote Speaker

Kathy Council
Title: Celebrating the SAS User - A history of the partnership between SAS and its user community
Abstract: You won’t want to miss this year’s Opening Session & Keynote Address on Sunday evening! Our special guest is Kathy Council, Vice President of Publications, SAS Americas. Here’s a preview of Ms. Council’s address: In the past three decades, SAS has grown from its beginnings as a "statistical analysis system" to becoming the world's largest privately-held software company. SAS is the leading business intelligence provider, with software installed at more than 45,000 sites in 100 countries. With this success came the challenge of sustaining close customer relationships while extending operations globally.

In a recent webcast to all SAS employees, Jim Goodnight, president and CEO of SAS, said "When we started 30+ years ago...we had a solid customer base to build from. Many of these customers are still SAS customers today. Frankly, it's that focus on the customer that has allowed us to get as far as we have for as long as we have..." He challenged all employees "...to build on our success, always focus on the customer and make this year our best ever." This focus on the customer continues to be the mantra for all of us at SAS. And because we listen, we have developed very close relationships with you. We listen to you via many channels, and that knowledge feeds back to all departments at SAS including R&D, Publications, Education and Technical Support.

To show our appreciation for your valuable feedback, we even reward users for their contributions with the annual User Feedback and SAS Enterprise Intelligence Awards. In this presentation, you will hear stories of how SAS has improved its software and services by listening carefully to what you have to say. You will learn how SAS focuses on maintaining good customer relationships that improve customer satisfaction and loyalty -- even in tough economic times. And every indicator points to an even stronger future: after more than 30 consecutive years of growth and profitability, SAS continues to be acclaimed as an outstanding employer, vendor and corporate citizen.

Bio: As vice president of Publications, Kathy Council oversees development and delivery of online and printed documentation and information products that support SAS products globally. Products developed within Publications include the SAS HELP system, software documentation, tutorials and self-paced e-learning. The Publications division incorporates all publishing functions including content development, product coordination and testing, design, production, and sales and marketing. Publishing tools, content management systems, and the HELP and online documentation delivery systems are also developed within the division.

Vice president since 1985, Council has headed the Publications unit since originally joining SAS in 1977. Initially hired to handle promotional materials for the young company, she spent her early years at SAS working as a technical support consultant, trainer and coordinator of the annual SAS Users Group (SUGI) Conference. She also authored or co-authored a number of early SAS manuals. She has a bachelor's degree in mathematics from the University of North Carolina at Greensboro and a master's degree in statistics from North Carolina State University in Raleigh, NC.

Council is active in the community on many levels. Past member and president of the board of advisers for the College of Humanities and Social Sciences (CHASS) at North Carolina State University, she is now a member of the Universities Foundation Board where she represents CHASS' needs. A recent member of the Cary Chamber of Commerce in Cary, NC, Council is currently co-chair of the Business of Women series for the chamber.


Conference Keynote Speaker

Rep. Mitch Greenlick
Title: Towards Universal Access to Health Care
Abstract: Western European countries began a move toward covering all of their citizens in some form of a national health care system in the 19th Century. We have toyed with this concept for nearly 100 years in America, currently leaving more than 50 million Americans without access to health insurance. This talk will discuss national efforts to reform the health care system and will look specifically at how Oregon has worked to deal with the problem at the state level.
Bio: Mitch Greenlick (D, HD33) was elected to the Oregon House of Representatives in 2002. House District 33 comprises Northwest Portland, Northwest Multnomah County and Northeast Washington County. He serves as Chair of the House Committee on Health Care, a post he has held since 2007. He also serves on the House Committee on Land Use and on the Ways and Means Sub-committee on Human Services. In past sessions he has served on the House Committee on Transportation, on the House Committee on Education and its Higher Education Sub-committee, on the House Rules Committee and as vice-chair of the House Committee on Land Use.

Representative Greenlick was the sponsor or the chief co-sponsor of five bills signed by the Governor during his freshman session in 2003. During the 2005 session he successfully passed eight bills on which he was the sponsor or the chief co-sponsor. The key bills for which Mitch led the fight in 2003 were the venture capital bill (HB 3613), the bulk prescription purchasing bill (SB 875), and the patient safety bill (HB 2340). In 2005 he sponsored a bill to change HIV testing procedures during ob visits (HB 2706), several annexation bills, and a bill to make Oregon the first state to require safe handling of airbags containing sodium azide when an automobile is dismantled (HB 2507). He successfully sponsored several key bills during the 2007 session and was an architect of SB 329, the bill which created the Oregon Health Fund Board.

Mitch was the chief petitioner for the HOPE for Oregon Families Initiative. The HOPE initiative campaign was an unsuccessful attempt to put a constitutional amendment on the November 2006 ballot, although 116,000 signatures backed the initiative. If passed, the HOPE amendment would have put the right to health care into the Oregon constitution.

Mitch received his BS and MS from in and his Ph.D. in Medical Care Organization from the University of . In addition to his duties as a legislator, Mitch is professor emeritus and past chair of the Department of Public Health and Preventive Medicine in the Medical School of OHSU. Mitch was director of the Kaiser Permanente Center for Health Research and Vice President for Research, Kaiser Foundation Hospitals for more than 30 years, until he retired from KP in 1995.

He served as a Trustee of the Northwest Health Foundation for ten years through 2008. Mitch was elected to the National Academies of Science’s Institute of Medicine in 1971. In 1995 he was awarded the Presidential Award by the Association for Health Services Research (now Academy Health) for his lifetime achievements in health services research. In 2005 he was awarded the “Public Health Genius” award by the Community Health Partnership and the “Lifetime Achievement” award by the Oregon Public Health Association. He has published more than 200 books, articles, and papers.

Abstracts

The remainder of this program provides the details on each of the presentations given at PharmaSUG 2009 including time, place, level, authors, title, and abstract. Please note that there may be a few last minute changes to the information given in this section. These changes will be noted outside the room where the presentation is scheduled on the day of the presentation. The abstracts given on the following pages are ordered by paper number within section. The sections begin on the following pages.

APPLICATIONS DEVELOPMENT ...... 19
CODERS CORNER ...... 27
DATA MANAGEMENT & QUALITY ...... 36
HANDS-ON WORKSHOPS ...... 39
MANAGEMENT ...... 42
POSTERS ...... 46
PUBLIC HEALTH RESEARCH ...... 56
REGULATORY SUBMISSIONS & STANDARDS ...... 59
SAS INSTITUTE PAPERS ...... 65
STATISTICS & PHARMACOKINETICS ...... 71
TECHNICAL TECHNIQUES ...... 78
TUTORIALS ...... 84
Pre-Post Conference Seminars ...... 88


Applications Development

Co-chairs: MaryAnn Hope (BCBS of Arizona) Gopal Rajagopal (Merck & Co Inc) ______

Paper AD01 Audience Level: intermediate Monday: 3:30 – 4:00 p.m. Room: Galleria South

Title: Managing very large EXCEL files using the XLS engine

The use of EXCEL spreadsheets is very common in SAS applications, especially in the pharmaceutical industry. EXCEL sheets are fairly easy to manipulate and easy to edit. Also, most users are relatively at ease with using their EXCEL skills. It is quite common for SAS applications to use EXCEL spreadsheets to hold and maintain metadata in their operational model. Given the excellent user interface and user skill levels in EXCEL, it is an excellent candidate for this role. This use of EXCEL spreadsheets does come with some limitations, however. Spreadsheets are limited to 256 columns and about 64K rows. For large applications, this can be a ‘show-stopper’. However, there are ways to work around these limitations. By using these ‘work-around’ solutions in two SAS interface macros (to EXCEL) we wind up with a simple implementation. There are, of course, several methods to handle the task of splitting the large data into multiple sheets and combining them, when needed. These multiple sheets could be saved as multiple files or as a workbook. A recent paper by this author at the 2008 PharmaSUG conference discussed the solution of using multiple files in SAS 8.2. With SAS 9.x and the new XLS engine, there is a much more elegant and transparent solution possible. This paper describes the solution of storing data in EXCEL sheets and retrieving data from EXCEL sheets no matter how large the data is.
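
As a rough illustration of the general idea (a sketch, not the author's macros), the XLS libname engine can write one data set per worksheet into a single workbook; the library, data set, and path names below are assumptions, and SAS/ACCESS Interface to PC Files is assumed to be licensed.

   /* Hypothetical sketch: split a large data set across two sheets of one workbook */
   libname xlout xls 'C:\temp\bigdata.xls';

   data xlout.part1 xlout.part2;                  /* each output member becomes a worksheet */
      set work.bigdata;
      if _n_ le 60000 then output xlout.part1;    /* stay under the ~64K-row sheet limit */
      else output xlout.part2;
   run;

   libname xlout clear;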

Author(s): John H Adams ______

Paper AD02 Audience Level: intermediate Tuesday: 9:00 – 9:30 a.m. Room: Galleria South

Title: List Processing Routine CallXinc: Calling Parameterized Include Programs Using a Data Set as List of Parameters

This article reviews the list processing routine CallXinc, a parameterized include program. This routine reads a data set, converts each character variable in each row into a global macro variable assignment statement, and calls another parameterized include program. Examples are provided which illustrate list processing using this routine.
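
As a rough sketch of this style of list processing (not the CallXinc routine itself), CALL EXECUTE can turn each row of a parameter data set into %LET statements followed by an %INCLUDE; wrapping the macro statements in %NRSTR keeps them executing in step with each include. The data set, variable, and file names are hypothetical.

   /* Hypothetical sketch: run a parameterized include program once per row of a list */
   data _null_;
      set work.params;                                        /* columns COUNTRY and CUTOFF */
      call execute(cats('%nrstr(%let country=', country, ';)'));
      call execute(cats('%nrstr(%let cutoff=',  cutoff,  ';)'));
      call execute('%nrstr(%include "report_by_country.sas";)');
   run;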

Author(s): Ronald J Fehd

______

Paper AD03 Audience Level: intermediate Monday: 11:00 – 11:30 a.m. Room: Galleria South

Title: Using Functions SYSFUNC and IFC to Conditionally Execute Statements in Open Code

This paper explains how to conditionally execute statements in open code without having to wrap the code in a macro. This is accomplished by combining the macro function SYSFUNC with the data step function IFC. The result is that parameterized include programs gain the power of conditional processing --- %if ... %then --- for which macros are usually used.
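
A small example of the general technique (a sketch, not the paper's code): %SYSFUNC evaluates IFC, and whichever %NRSTR-quoted branch IFC returns then executes in open code, so no wrapper macro is needed.

   /* Conditionally run a step in open code; the data set name is hypothetical */
   %sysfunc(ifc(%sysfunc(exist(work.demog)),
            %nrstr(proc print data=work.demog(obs=5); run;),
            %nrstr(%put NOTE: work.demog does not exist yet;)
            ))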

Author(s): Ronald J Fehd

______


Paper AD04 Audience Level: beginner Tuesday: 11:00 - 11:30 a.m. Room: Galleria South

Title: Toolkits for Simulation

When simulation is conducted in the SAS System using a procedure like PROC NLP to find an optimization result, warnings and errors can be generated due to the conditions of how the optimization is run. These types of messages are different from those warnings/errors that indicate the condition of how the program statements are executed, which can be captured using automatic macro variables, such as &SYSERR. In the current version of SAS, there is no clearly defined solution at the end of the optimization to trap warnings and errors associated with initial parameter settings fitted into a statistical procedure such as PROC NLP. This can make the simulation a lengthy iterative trial and error process. It is a common practice to use a remote computer to run the batch simulation job under these conditions, and it would be helpful to get a real-time email notification when the simulation batch job is completed. The purpose of this paper is to describe a toolkit to assist with the simulation process. This toolkit provides three key features to resolve these demands and help create a smooth simulation process. First, it generates a summary report to summarize whether there are warning/error messages associated with each iterative process of the simulation. Second, it calculates the process duration time for each iteration of the simulation. Finally, it sends out an email to notify the user that the simulation is complete, along with the previous two items.
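
As a minimal sketch of the notification idea alone (not the toolkit), a FILENAME EMAIL fileref can send the completion message from the batch job, assuming the EMAILSYS/EMAILHOST system options are configured at your site; the address and subject are hypothetical.

   filename notify email
      to='analyst@example.com'
      subject='Simulation batch job finished';

   data _null_;
      file notify;
      put "The simulation completed at %sysfunc(datetime(), datetime20.).";
   run;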

Author(s): Huei-Ling Chen, Qian Dong

______

Paper AD05 Audience Level: intermediate Monday: 10:30 - 11:00 a.m. Room: Galleria South

Title: SAS® & VBSCRIPT FOR GENERATION OF POWERPOINT PRESENTATION

It is often required to present results of interim analysis during pre-clinical / clinical trials. The interim analysis results need to be delivered in a short span of time after the database lock. Some analysis reports are presented to the stakeholders as PowerPoint presentations. The process of organizing tables and graphs in a PowerPoint presentation is laborious and it needs to be done several times during the trial. A technique is presented in this paper which uses SAS/Graphs and Windows Scripting Host (WSH) to automatically generate a PowerPoint presentation whenever the data is updated. This way, the presentation can be updated with any changes in tables / graphs by a simple macro call in SAS. WSH is a simple, powerful, and comprehensive tool which facilitates controlling Microsoft applications like Word, PowerPoint, and Excel programmatically. The WSH allows VBScript programs to run on the Windows operating system as stand-alone applications. SAS and WSH can be used to automate routine tasks performed using MS Office. VBScript can be generated through a SAS program which can then be executed using WSH to perform various tasks in MS Office and other Windows based applications. The graphs generated during analysis are stored as images. Tables can be converted into SAS/Graph objects using the GPRINT procedure, which are then saved as images. A SAS macro is then used to generate a VBScript which in turn populates a PowerPoint presentation with the image files.
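
A rough sketch of the mechanism (not the authors' macro): a DATA step writes a small VBScript file and an X statement runs it through the Windows Script Host. The PowerPoint object calls, layout constant, and file paths are assumptions for illustration only.

   filename vbs 'C:\temp\make_ppt.vbs';

   data _null_;
      file vbs;
      put 'Set ppt = CreateObject("PowerPoint.Application")';
      put 'ppt.Visible = True';
      put 'Set pres  = ppt.Presentations.Add';
      put 'Set slide = pres.Slides.Add(1, 12)';   /* 12 = ppLayoutBlank (assumed) */
      put 'slide.Shapes.AddPicture "C:\temp\plot1.png", False, True, 40, 60, 640, 420';
      put 'pres.SaveAs "C:\temp\interim_results.ppt"';
      put 'ppt.Quit';
   run;

   options noxwait;
   x 'cscript //nologo C:\temp\make_ppt.vbs';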

Author(s): Suhas Ramesh K Sanjee, Christopher Tong

______


Paper AD07 Audience Level: intermediate Monday: 8:00 - 8:30 a.m. Room: Galleria South

Title: Automate the Process of Creating and Updating Titles and Footnotes of TLG for a Clinical Study Report from a Table Shell Document

Preparation of TLG (tables, listings, and figures) for a clinical study report typically calls for programming efforts to manually type and/or copy the titles and footnotes from a table shell document into table programs. Since table shells may undergo a lot of changes until very late in the preparation stage, it is highly desirable to automate this process to ensure technical accuracy and operational efficiency. This paper introduces a SAS macro named %get_headfoot to extract the table numbers, titles, footnotes and output file names from a table shell document and use the information to create a SAS program called headfoot.sas, which contains a collection of SAS macro variables storing table titles and footnotes. Headfoot.sas will be included in each table program, with the corresponding macro variables invoked to generate titles and footnotes. As the table shell document evolves, simple re-runs of both the macro %get_headfoot and table programs automatically update the titles and footnotes in the output files. The macro %get_headfoot also saves the title and footnote information into a SAS data set, which can be used not only to create a CSV file serving as a project tracking sheet, but also to generate a tracking report that reflects any deletions, additions of tables, listings, and figures, and/or any updates of table numbers, titles, footnotes and output file names in the new version of table shells. The use of this macro results in significant reduction of programming load and error-prone manual processing in creating and updating report titles and footnotes. It also helps to limit the need to manually enter information for project management activities. Another notable benefit comes from its built-in capability to output a series of reports to aid the user to check the completeness of table shells and track any updates in them, therefore it ensures complete implementation of the table shell requirements and serves as a tracking tool for table shells. The macro is easy to use and works well in the PC environment as well as on the UNIX platform.

Author(s): Xiangchen(Bob) Cui, Mei Wu, Shan Chen

______

Paper AD08 Audience Level: intermediate Monday: 11:30 a.m. - 12:00 p.m. Room: Galleria South

Title: Those pesky SAS v5 transport files, what’s inside?

The use of SAS V5 transport files, unfortunately, is still currently required for FDA submissions. Many users in the pharmaceutical industry, however, are not well versed on the structure or determining the contents of transport files. It is quite common for many of us to receive a SAS transport file from a vendor or a CRO. The first question is always – is it a .XPT file format or a .CPT file format? Next, we don’t know what the contents are. Is there one dataset or are there multiple datasets? Of course, a proficient programmer could quickly write some code to look into and/or extract the desired datasets. The average user that doesn’t do this task very often, however, needs a simpler solution. This paper describes a utility macro that automatically determines the transport file type (.XPT or .CPT), the list of datasets it contains, the list of variables (plus their attributes) in each dataset and allows the user to extract any or all datasets.
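
One simple way to peek at such a file (a sketch, not the paper's macro): a V5 transport (XPORT) file begins with a "LIBRARY HEADER RECORD" line, which a CPORT file does not, so reading the first 80-byte record gives a quick hint. The file path is hypothetical.

   data _null_;
      infile 'C:\transfer\unknown.xpt' recfm=f lrecl=80 obs=1;
      input firstrec $char80.;
      if index(firstrec, 'LIBRARY HEADER RECORD') then
         put 'NOTE: Looks like a V5 transport (XPORT) file - try the XPORT engine.';
      else
         put 'NOTE: No XPORT header found - it may be a CPORT file (try PROC CIMPORT).';
   run;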

Author(s): John H Adams

______


Paper AD09 Audience Level: advanced Monday: 2:30 - 3:30 p.m. Room: Galleria South

Title: %ANYTL: A versatile Table/Listing Macro

Unlike traditional table macros, %ANYTL requires only 3 parameters (corresponding to the row, column and title/footnote of a table respectively), but can handle most typical table/listing production tasks. It is more like a simple language, using a grammar similar to PROC TABULATE: it distinguishes continuous data, categorical data and text with just a few specific special characters. %ANYTL is also very flexible in that it allows the user to manipulate the summary result prior to output to a table. The big N, column width, pagination, and wrapping of the table can be either automatically calculated or specified by the user. This macro has proven to be powerful while succinct and easy to use.

Author(s): Yang Chen

______

Paper AD10 Audience Level: advanced Tuesday: 8:30 - 9:00 a.m. Room: Galleria South

Title: Data capture to study-specific metadata for analysis datasets: Automation along the way

Development of analysis datasets and associated metadata that conform to industry (CDISC) standards is one of the essential steps in preparation and analysis of clinical trial data intended for FDA submission. Data captured at investigation sites through case report forms (CRFs) and stored in a database is often subjected to a sequence of processing steps eventually resulting in analysis datasets. Metadata of the datasets and variables in the analysis datasets form a vital part of the Define document (of any study / trial) that is submitted to FDA. We outline a strategy (based on a SAS macro we developed) towards automating some of the steps involved in generation of study specific metadata for analysis datasets.

Author(s): Sathish Sundaram, Ganesh Munuswamy, Mominul Islam

______

Paper AD11 Audience Level: intermediate Monday: 4:00 - 4:30 p.m. Room: Galleria South

Title: Converting CDISC Controlled Terminology to SAS Formats

CDISC Controlled Terminology (CT) is used to define and support the terminology needs of the CDISC domains. Applying CT to individual variables with different kinds of data across different datasets in different studies can be a very cumbersome and painstaking process. This paper describes an easy way of extracting unique values of the variable and unique values of the CT term and provides a GUI interface to carry out one-to-one mapping for CT terms and data values. This mapping allows generating customized SAS formats which can be applied directly to raw dataset variables when converting to CDISC compliant datasets. This utility not only reduces the time required to apply CT to applicable CDISC variables but also eliminates any errors caused by manually typing in each unique value for a variable. The software/modules used in this utility are SAS, Microsoft Excel, VBA and SAS Stored procedures.
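
For readers unfamiliar with the end product such a utility generates, here is a minimal hand-written sketch (not the utility's output): a mapping data set fed to PROC FORMAT through CNTLIN= becomes a format that recodes raw values to controlled terminology. The format, variable, and value names are hypothetical.

   data ctmap;
      length fmtname $8 type $1 start $40 label $40;
      fmtname = 'SEXCT'; type = 'C';
      input start $ label $;
      datalines;
   MALE M
   FEMALE F
   UNKNOWN U
   ;
   run;

   proc format cntlin=ctmap;
   run;

   /* later, in the SDTM conversion: sex = put(upcase(sex_raw), $sexct.); */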

Author(s): Sandeep R Juneja, Vivek Mohan

______


Paper AD12 Audience Level: advanced Tuesday: 9:30 - 10:30 a.m. Room: Galleria South

Title: Let SAS Create SPLUS Graphs

To combine the superb graphic visualization of SPLUS with the powerful data manipulation and analysis features of SAS, this paper introduces a new technical approach and presents a practical application that allows the reader to generate SPLUS graphs from within a SAS environment. The approach creates a software interface between SAS and SPLUS that merges SAS macro language and SPLUS scripting language.

Author(s): Jacksen Lou

______

Paper AD13 Audience Level: intermediate Tuesday: 10:30 - 11:00 a.m. Room: Galleria South

Title: Medical Coding System for Clinical Trials – 21 CFR Part 11 Compliant SAS/AF® Application

Medical coding in clinical trials classifies the clinical research data, such as adverse events and medications, captured during the drug development process. This paper presents a SAS/AF application that addresses the complexity of the task and accommodates standard terminologies such as MedDRA and WHODRUG to ensure the data is interpreted in a consistent manner. It facilitates coding algorithms based on standard dictionaries and proprietary synonyms, auto-encoding and manually coding/decoding verbatim terms, creating predefined reports and user-defined custom reports, and automatically loading dictionary data from ASCII files to SAS. Above all, the application adheres to the requirements of FDA 21 CFR Part 11 and enforces system security as well as maintaining an audit trail in a separate database.

Author(s): Annie Guo

______

Paper AD14 Audience Level: beginner Monday: 8:30 - 9:30 a.m. Room: Galleria South

Title: Creating Your Own Worksheet Formats in exportToXL

%exportToXL is a freely available SAS macro which allows the user to create custom-formatted Excel worksheets, using Dynamic Data Exchange (DDE). It works on all versions of PC Base SAS, Windows, and Excel. Using this macro, the user can create custom-formatted worksheets by either exporting the SAS data onto a pre-formatted Excel worksheet, or by using a worksheet format. This is a set of commands which tells Excel how to format certain aspects of a worksheet such as font sizes and column widths. While %exportToXL comes with a few different pre-programmed worksheet formats, any programmer with some basic knowledge of the SAS macro language and DDE commands can make his/her own. This paper is a tutorial about how to do just that.
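
As a tiny illustration of the DDE mechanism the macro wraps (a sketch, not %exportToXL itself), a DATA step can write rows straight into an open Excel workbook; Excel must already be running with Book1/Sheet1 available, and '09'x is the tab character that separates cells.

   filename xlrange dde 'excel|[Book1]Sheet1!r1c1:r20c3' notab;

   data _null_;
      set sashelp.class(obs=19);
      file xlrange;
      put name '09'x sex '09'x age;
   run;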

Author(s): Nathaniel B Derby

______


Paper AD15 Audience Level: beginner Tuesday: 8:00 - 8:30 a.m. Room: Galleria South

Title: Generic System for Generating Data Listings

Presenting data through the listing format is the fundamental report used to reveal what is in the database. Without systematic functions to automate the dynamic adjustments for layout requirements with data listings, the SAS programming work can be tedious and time consuming. Especially when large volumes of data are mixed with a variety of complex information for mega trials, it becomes more critical to have a system in place, where global standards can be established to ensure consistency and accuracy. To reinforce an efficient and cost-effective process for conducting clinical trials, the Arrow Analysis and Reporting System was established at Biometrics and Clinical Informatics, J&JPRD. This in-house system is implemented with several sub-systems, including the one for data listings generation. This paper will show the implementation concept of the in-house sub-system for data listings generation, and how SAS features/functions can be systematically constructed in an efficient fashion to empower the data listings.

Author(s): JauRuey A Huang, Cheng Jun Tian

______

Paper AD17 Audience Level: advanced Tuesday: 2:30 - 3:00 p.m. Room: Galleria South

Title: A Statistical Analysis System for Categorical Data using SAS

This article describes a SAS application system that generates standard tables for categorical data analyses. Categorical data analysis is basically about the selection of counts and denominators. This system allows users to select different denominators according to various requirements. It starts from data preparation and ends with results presentation. It can handle various data structures and analysis requirements, including the selection of the best or worst record per subject for data with multiple layers, etc. Common applications include tables of adverse events and distribution tables of several categories. Statistical analyses include most of the analysis options in “proc freq” and more. The output file can be PDF/RTF or TEXT files. The system has been successfully used in many submissions in Johnson & Johnson Pharmaceutical Research and Development.

Author(s): Yining Wang, Cheng Jun Tian, Lisa Zhou

______


Paper AD18 Audience Level: advanced Tuesday: 11:30 a.m. - 12:00 p.m. Room: Galleria South

Title: CUSTOMIZED EFFICIENT DATA-DRIVEN SAS BASED SYSTEM TAILORED TO CLIENTS NEEDS

The Pregnancy Risk Assessment Monitoring System (PRAMS) is a surveillance project of the Centers for Disease Control and Prevention (CDC) and state health departments. The PRAMS collects state-specific, population-based data on maternal attitudes and experiences before, during, and shortly after pregnancy. The PRAMS collected data is available online via the CDC’s PRAMS On-Line Data for Epidemiologic Research (CPONDER). CPONDER is an online query system allowing users to retrieve and explore statistical and comparative analysis from the PRAMS surveillance data in both tabular and graphical format. CPONDER was designed to be a data-driven, high performance system capable of handling large amounts of pre-processed data. A unique requirement of the system is that it must provide the ability to handle data incompatibility between states. Data may also be compatible in earlier years but become incompatible in later years. Much of the data behind the CPONDER system was processed using two major tools: SAS and SUDAAN. Advanced SAS functions were utilized such as SAS dictionary tables, callable SUDAAN, PROC SQL and highly complex SAS macros. Our paper provides further details of the CPONDER system and the data processing tasks under the hood. Unique sample codes are also provided for illustration purposes.

Author(s): Mila Chiflikyan, Renee' B Karlsen, Mai Nguyen

______

Paper AD21 Audience Level: advanced Tuesday: 1:30 - 2:30 p.m. Room: Galleria South

Title: Data Management: Building a Dynamic Application

You have written a series of interesting and often complex SAS® programs that perform a variety of data entry operations, data checks, exception reporting, statistical analyses, and summary reporting. Since the next study is somewhat similar to the last one, you are planning to build another set of programs based on (cannibalized from) the ones that you just used. We have been there before, but STOP. Wouldn't you rather build and validate the programs just once? The solution lies in building and using data dictionaries that act as control files. These control files are in turn used to create a series of SAS macro variables that are available as arrays to each of the various programs. All project, data set, and variable specific information is stored in the macro variables and hence never in the programs themselves. Once implemented all the programs in your application become data independent. Let them change the data. Let them redefine the project. Your code is ready.
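
A minimal sketch of the control-file idea (not the authors' application): a data dictionary can be surfaced as numbered global macro variables that later programs loop over like an array. The data set and variable names are assumptions.

   data _null_;
      set work.var_dictionary end=last;       /* one row per variable definition */
      call symputx(cats('varname', _n_), varname, 'G');
      call symputx(cats('vartype', _n_), vartype, 'G');
      if last then call symputx('varcount', _n_, 'G');
   run;

   %put NOTE: &varcount variables defined - first is &varname1 (&vartype1);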

Author(s): Art Carpenter, Richard O Smith

______


Paper AD22 Audience Level: intermediate Tuesday: 3:00 - 4:00 p.m. Room: Galleria South

Title: Building Reusable and Highly Effective Tools with the SAS® Macro Language

The SAS® Macro Language is a powerful feature for extending the capabilities of the SAS System. This presentation highlights a collection of techniques for constructing reusable and effective macro tools. Attendees are introduced to the techniques associated with building functional macros that process statements containing SAS code; design reusable macro techniques; create macros containing keyword and positional parameters; utilize defensive programming tactics and techniques; build a library of macro utilities; interface the macro language with the SQL procedure; and develop efficient and portable macro language code.

Author(s): Kirk Paul Lafler

______

Paper AD23 Audience Level: intermediate Monday: 1:30 - 2:00 p.m. Room: Galleria South

Title: ODS Statistical Graphics: How You SHOULD Be Developing Your SAS® Apps

SAS® has always been proclaimed by its competitors as trailing other applications in areas such as graphics. In particular, graphical analysis of data has always been a problem area, and manipulation or adjustment of graphs has always been more limited than in some other statistical analysis and graphics packages. That is changing with ODS Statistical Graphics in SAS 9.1.3 and SAS 9.2, not to mention the other new graphical features introduced in SAS 9.2. As a result, any application development process has new tools to facilitate the development of analyses, plots, and charts. This will be illustrated with a basic cost modeling example. We will look at building a standard statistical plot with an information box at the side. We will show how to use ODS Statistical Graphics and ODS statements to build such charts for multiple facilities. We will also look at how to work with multiple destinations, since the exact same graphs that are needed in PDF formats are also likely to be needed for company webpages or in acceptable formats for documents and presentations.
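
As a minimal sketch of the general approach (not the paper's cost-modeling example), and assuming SAS 9.2, ODS Statistical Graphics output can be routed to several destinations in one pass.

   ods pdf file='facility_plot.pdf';
   ods rtf file='facility_plot.rtf';
   ods graphics on;

   proc sgplot data=sashelp.class;        /* stand-in for the facility data */
      scatter x=height y=weight;
      reg     x=height y=weight;
   run;

   ods graphics off;
   ods rtf close;
   ods pdf close;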

Author(s): David L Cassell

______

Paper AD24 Audience Level: intermediate Monday: 2:00 - 2:30 p.m. Room: Galleria South

Title: Playing Favorites: How to Manage Date Conflicts When Some Date Ranges are Preferred Over Others

Data with dates often require reconciling conflicting date ranges. Sometimes a set of consecutive, non-overlapping date ranges needs to be created from a set of overlapping date ranges. This is easy when all date ranges are considered equal. However, if some date ranges are preferred over others, more thought is required. One example is reconciling clinical data and determining disease status from overlapping date ranges of normal and abnormal lab values. If abnormal lab values are of interest, then ranges of abnormal values are preferred over normal values when overlap exists. This paper provides one solution to any number of preferences, demonstrates that it executes in a reasonable time with time trial results, and provides a macro implementation.

Author(s): Eric Wong

______

Coder’s Corner

Co-chairs: Sandy Paternotte (PPD) Meera Kumar (Sanofi-Aventis) ______

Paper CC01 Audience Level: intermediate Tuesday: 9:00 - 9:15 a.m. Room: Broadway III/IV

Title: Step-by-Step approach in generating multiple plots on one page

In statistical analysis and reporting, it is essential to provide a clear presentation of analysis results. To achieve this goal, the appearance of figures plays a very important role in data presentation. The most common SAS procedure for generating figures is PROC GPLOT. However, given the limitations of the GPLOT procedure, it is necessary to provide a simple solution for the high demand for presenting multiple graphs on one page. The purpose of this paper is to provide user friendly step-by-step instructions for generating multiple graphs on one page by using the SAS procedure PROC GREPLAY. This paper uses features of SAS/GRAPH and its procedures.
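
A bare-bones sketch of the kind of GREPLAY step the paper walks through (not the paper's own code), assuming two graphs named PLOT1 and PLOT2 are already stored in the WORK.GSEG catalog:

   proc greplay igout=work.gseg tc=work.tempcat nofs;
      tdef twopanel des='Two plots stacked on one page'
         1 / llx=0 lly=50  ulx=0 uly=100  urx=100 ury=100  lrx=100 lry=50
         2 / llx=0 lly=0   ulx=0 uly=50   urx=100 ury=50   lrx=100 lry=0;
      template twopanel;
      treplay 1:plot1 2:plot2;
   run;
   quit;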

Author(s): Yogesh K Pande

______

Paper CC02 Audience Level: beginner Tuesday: 1:45 - 2:00 p.m. Room: Broadway III/IV

Title: MACRO %NEWFLOW

If you use SAS ‘Proc Report’ frequently, you will be familiar with the ‘flow’ option. The ‘flow’ option wraps the value of a character variable in its column, and it honors the split character which is defined in ‘Proc Report’. However, the ‘flow’ option has two limitations. First, it does not allow you to insert any indent spaces for wrapped lines. Second, if a character variable contains leading blank space, the ‘flow’ option does not carry the leading blank space on to wrapped lines. A macro program called ‘%NEWFLOW’ has been designed to overcome these limitations. In this paper, I will demonstrate the application of %NEWFLOW with examples, discuss the logic of %NEWFLOW with a flowchart, and finally, give the source code of the macro program.

Author(s): Jian (Daniel) Huang

______

Paper CC03 Audience Level: beginner Tuesday: 9:15 - 9:30 a.m. Room: Broadway III/IV

Title: PRODUCING SIMPLE AND QUICK GRAPHS WITH PROC GPLOT

PROC GPLOT is a widely used procedure in SAS®/GRAPH to produce scatter, line, box, and Kaplan-Meier plots. In this paper we present step by step procedures to generate several common graphics. We highlight the use of the INTERPOL symbol option. Combining various options with PROC GPLOT, we explore the flexibility and broad functionality of this SAS/GRAPH procedure to produce graphics simply and quickly.
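
A quick self-contained example of the INTERPOL= option highlighted above (using a SASHELP data set rather than the paper's clinical data):

   symbol1 value=dot interpol=rl color=blue;   /* rl = linear regression line */

   proc gplot data=sashelp.class;
      plot weight*height;
   run;
   quit;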

Author(s): Xingshu Zhu, Sheng Zhang, Shuping Zhang, Weifeng Xu

______


Paper CC05 Audience Level: intermediate Tuesday: 9:45 - 10:00 a.m. Room: Broadway III/IV

Title: Customizing Your Own Box Plot

PROC BOXPLOT is a good and efficient method for exploring data distributions (the mean, quartiles, minimum and maximum observations for a group), but the style is limited to skeletal, schematic and schematicfar. If you want to show the inference (Mean±SE/Mean±SD) as the upper/lower whisker as well as specific patient information or the user prepared percentile with upper/lower whisker, you can not count on PROC BOXPLOT to produce this kind of plot, and this is when PROC GPLOT and the annotate facility come in handy. This paper will show how to produce customized box plots and is targeted to the intermediate level audience.

Author(s): Jade(Xiqun) Huang, Wenjie Wang

______

Paper CC06 Audience Level: intermediate Tuesday: 10:45 - 11:00 a.m. Room: Broadway III/IV

Title: A SAS Macro for Creating AE dataset

Adverse events (AE) are commonly reported and analyzed through the use of various summary tables as requested by FDA and other regulatory agencies. However, writing and validating these SAS programs are very labour-intensive and time-consuming. In this paper, we introduce a SAS macro program that creates one adverse event analysis dataset for generating different summary tables.

Author(s): Suwen Li, Daniel Li, Stephanie Sproule

______

Paper CC07 Audience Level: beginner Tuesday: 8:00 - 8:15 a.m. Room: Broadway III/IV

Title: SAS 1-liners

In the programming world, code efficiency rules supreme. Terms used to describe sections of code as ‘slick’ or ‘smooth’ illustrate the uncertain boundary between art and science in programming. Those enamoured by the extremes of code efficiency often find themselves addicted to (obsessed with?) the thrill that conquering a programming problem with the least number of keystrokes can bring. To that end, this paper seeks to challenge both the beginning and experienced programmer alike to look for new ways to accomplish mundane SAS programming tasks as stylistically as is only humanly possible.

Author(s): Stephen W Hunt

______


Paper CC08 Audience Level: intermediate Tuesday: 8:15 - 8:30 a.m. Room: Broadway III/IV

Title: One-Step Change from Baseline Calculations

Change from baseline is a common measure of safety and/or efficacy in clinical trials. The traditional way of calculating changes from baseline in a vertically-structured dataset requires multiple DATA steps, and thus several passes through the data. This paper demonstrates how change from baseline calculations can be performed with a single pass through the data, through use of the Dorfman-Whitlock DO (DOW) loop.
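A minimal sketch of the single-pass idea with hypothetical variable names, assuming a vertical dataset sorted by subject and parameter with the baseline record (AVISITN=0) first within each group:

   data chg;
      do until (last.paramcd);
         set vs;
         by usubjid paramcd;
         if avisitn = 0 then base = aval;  /* baseline record encountered first in each group */
         chg = aval - base;                /* change computed as each record is read */
         output;
      end;
   run;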

Author(s): Nancy Brucken

______

Paper CC09 Audience Level: intermediate Tuesday: 10:00 - 10:15 a.m. Room: Broadway III/IV

Title: Multiple Graphs on One Page Using GREPLAY with "100% Templates"

Creating multiple graphs on one page has been a challenge in SAS before version 9.2. However, even in earlier SAS versions there is an easy and elegant way to create multiple graphs on one page without facing the problems of distorted fonts and axes. The method presented in this paper uses PROC GREPLAY to overlay graphics. Individual graphs are created using SAS graphics procedures and stored in a SAS graphics catalog. When each individual graph is produced, the axes are scaled using the LENGTH parameter of the AXIS statement, and the graphs are placed appropriately on the page using the ORIGIN parameter of the AXIS statement. In the next step, templates are defined with PROC GREPLAY for display of the individual graphs. But instead of defining a grid with scaled templates, all templates use 100% of the output area. The individual stored graphs are then simply overlaid to create one output with multiple graphs on one page. The beauty of this approach is that the graphics procedure takes care of the scaling. This prevents the graphics from being distorted when PROC GREPLAY is used to place them in a scaled template. Another advantage is that we are not limited to the output of a single graphics procedure but can combine output from different procedures like GPLOT and GCHART. The AXIS statement gives full and transparent control over the size of the individual graphs as well as the placement of the graphs in the output area.
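A minimal sketch of the two pieces of the approach (names are hypothetical): AXIS statements size and position each stored graph, and GREPLAY then overlays the stored graphs in templates that each cover 100% of the page.

   /* Each graph is sized and positioned with the AXIS LENGTH= and ORIGIN= parameters */
   axis1 origin=(15 pct, 55 pct) length=35 pct;   /* vertical axis of the upper graph   */
   axis2 origin=(15 pct, 55 pct) length=70 pct;   /* horizontal axis of the upper graph */

   proc gplot data=sashelp.class gout=work.gcat;
      plot height*age / vaxis=axis1 haxis=axis2 name='top';
   run;
   quit;
   /* a second graph, 'bottom', would be created analogously with its own AXIS settings */

   /* Overlay the stored graphs using templates that each occupy 100% of the page */
   proc greplay igout=work.gcat tc=work.tmpl nofs;
      tdef full des='100% templates'
         1 / llx=0 lly=0 ulx=0 uly=100 urx=100 ury=100 lrx=100 lry=0
         2 / llx=0 lly=0 ulx=0 uly=100 urx=100 ury=100 lrx=100 lry=0;
      template full;
      treplay 1:top 2:bottom;
   run;
   quit;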

Author(s): Dirk Spruck

______


Paper CC10 Audience Level: intermediate Tuesday: 11:00 - 11:15 a.m. Room: Broadway III/IV

Title: Presenting Descriptive Statistics by the Rapid Processing of Datasets from Proc Means

Calculation and presentation of descriptive statistics for clinical data is frequently under the domain of complex macros. Less commonly used features of SAS procedures, however, can rapidly output datasets containing statistics for multiple variables and data subgroups. Post-processing of these output datasets with simple but little-utilized SAS features generates SAS datasets for subsequent output with very transparent and adaptable code. In this paper, I use Proc Means with keywords, formats, and multiple output statements to generate individual datasets for each statistic. Processing of these datasets is aided by use of the automatic _type_ variable. A column is created to capture overall subgroup totals without the need to double-set the input data. An uncommonly used dataset merge, employing the set statement with a by statement, is used to reconstitute the statistics into one dataset. Finally, I show an example use of the putN and inputN functions to capture the specified decimal precision for each statistic.
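A minimal sketch of the first step, with hypothetical dataset and variable names: multiple OUTPUT statements yield one dataset per statistic, and _type_ separates the overall row from the subgroup rows.

   proc means data=adlb noprint;
      class trt01p;
      var aval;
      output out=stat_n    n=value;
      output out=stat_mean mean=value;
      output out=stat_sd   std=value;
   run;
   /* In each output dataset, _type_=0 holds the overall row and _type_=1 the by-treatment rows */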

Author(s): Rod Norman

______

Paper CC11 Audience Level: beginner Tuesday: 11:15 - 11:30 a.m. Room: Broadway III/IV

Title: SLEEPLESS IN SEATTLE - FOR HOW MANY CONSECUTIVE NIGHTS?

Looking for the number of consecutive events is an important edit check to find irregularities in otherwise normal data, or a key derivation step for an important analysis of variables. In this paper, we demonstrate a simple algorithm to find consecutive sleepless events in sample data that meet certain criteria. A macro based on the algorithm described in this paper is attached at the end.

Author(s): Eric Qi, Fikret Karahoda

______

Paper CC12 Audience Level: intermediate Tuesday: 10:15 - 10:30 a.m. Room: Broadway III/IV

Title: Using SAS® to Create Graphs with Pop-up Functions

In addition to static graph features, SAS provides a strong capability to create dynamic graph displays for information visualization. In this presentation, we will demonstrate a SAS component that can generate a SAS graph with pop-up and drill-down functions. In a pop-up graph, when the mouse is moved over different regions or points of the graph, additional information appears in a pop-up box. A drill-down graph is often used to link different graph areas to new graphs or tables. This type of SAS graph output is quite suitable for web-based presentation. The topic is prepared for an intermediate and advanced audience. Key words: SAS/GRAPH, information visualization, pop-up, drill-down, HTML, Javameta, web-based analysis.

Author(s): Shiqun (Stan) Li, Wei Zhou

______


Paper CC14 Audience Level: intermediate Tuesday: 2:00 - 2:15 p.m. Room: Broadway III/IV

Title: Automated Verification of Data Set Metadata to Specification

In clinical trial data set programming, it is important to compare the data set metadata to the data set specification in order to ensure that each variable name, label, type, length, and format match. Considering the amount of detailed information, it can be time-consuming and hard to guarantee accuracy if this verification step is performed manually. This paper describes a method using SAS that can automate the verification of each data set's metadata against its corresponding data set specification. The program described retrieves each data set name and label, and each variable name, label, type, length, and format, from each data set and from a corresponding specification input file, then creates validation SAS programs and one VBScript file. The VBScript file executes each validation program and provides comparison outputs for all data sets. This paper further discusses how to convert the data set specification from PDF, plain text, MS WORD or MS EXCEL format into a CSV file that is used by this SAS program as input.
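A minimal sketch of the metadata-gathering and comparison steps (file names, libref and the assumption that the specification columns are named to match are all hypothetical): PROC CONTENTS captures each variable's attributes, and the specification is read from a CSV file.

   /* Collect variable-level metadata for every data set in the library */
   proc contents data=sdtm._all_ noprint
      out=meta(keep=memname name type length label format);
   run;

   /* Read the specification exported to CSV, then compare */
   proc import datafile='spec.csv' out=spec dbms=csv replace;
   run;

   proc sort data=meta; by memname name; run;
   proc sort data=spec; by memname name; run;
   proc compare base=spec compare=meta listall;
      id memname name;
   run;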

Author(s): Zhuo Chen, David A Gray

______

Paper CC15 Audience Level: intermediate Tuesday: 11:45 a.m. - 12:00 p.m. Room: Broadway III/IV

Title: Get Your TLGs LinkedIn: Dynamically Hyperlinked Metadata and Output Files

This paper presents a dynamic solution for easily accessible and user-friendly output file locations. A content-rich Excel spreadsheet is created cataloguing all current output, including title, footnote and table number specifications. For ease of review and quick reference after a study output production run, the spreadsheet is linked to each current output file. Excellent for ongoing study review and TLG management, additional features include options to rename files in a standard format and the ability to combine the spreadsheet and content files in a zip folder, allowing the files to be uncompressed and accessed remotely using a standalone approach. Using a combination of SAS, VBS and VBA code, this semi-automated process greatly enhances study output review. Intended audience: intermediate to advanced, Windows platform, based on SAS 9.1.3, interactive SAS.

Author(s): Suzanne M Humphreys

______

Paper CC16 Audience Level: advanced Tuesday: 11:30 - 11:45 a.m. Room: Broadway III/IV

Title: Dynamic reporting – a data driven approach

One of the essential processes in clinical trials is generating the Tables, Figures & Listings (TFLs) used to present and communicate study results with SAS. Making the program flexible enough to meet the different reporting requirements of various studies is a challenge. This paper presents a data-driven approach to dynamic reporting that takes advantage of one of SAS's features: the SASHELP dictionary tables.
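A minimal sketch of the data-driven idea (libref and member names are hypothetical): a SASHELP dictionary table supplies the variable list that drives the reporting code.

   proc sql noprint;
      select name
         into :varlist separated by ' '
         from dictionary.columns
         where libname = 'ADAM' and memname = 'ADSL';
   quit;
   %put NOTE: variables found in ADAM.ADSL: &varlist;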

Author(s): Shan Bai

______


Paper CC17 Audience Level: intermediate Tuesday: 10:30 - 10:45 a.m. Room: Broadway III/IV

Title: Check Cumulative Incidence Plot

Validating plots generated by internal SAS macro calls is an important job responsibility for SAS programmers. When validating survival plots, the accuracy of the number of patients at risk is one of the major focuses. However, this is not straightforward. In this paper, the authors first introduce a simple method, using DATA steps only, to calculate the number of patients at risk for a cumulative incidence rate plot. The authors then present a more complex approach that uses the product-limit table output from PROC LIFETEST to verify the number of patients at risk and the cumulative incidence rates as well.
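A minimal sketch of the second approach, with hypothetical ADaM-style names: the product-limit table from PROC LIFETEST is captured with ODS OUTPUT and can then be used to check the number of patients at risk.

   ods output ProductLimitEstimates=ple;
   proc lifetest data=adtte;
      time aval*cnsr(1);     /* cnsr=1 flags censored observations */
      strata trtp;
   run;
   /* PLE now holds event/censor times and the number left (at risk) for checking against the plot */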

Author(s): Aiming Yang, Lin Yan

______

Paper CC18 Audience Level: intermediate Tuesday: 2:30 - 2:45 p.m. Room: Broadway III/IV

Title: Importing Data from Microsoft Word into SAS®

Documents in Word format are popular in the pharmaceutical industry. However, SAS does not provide a procedure to import data from Word into SAS the way PROC IMPORT reads data from Excel. This deficiency has stimulated SAS programmers to explore techniques for programmatically converting Word documents into SAS data. This paper reviews several published solutions, with their pros and cons, and presents a creative way to import any Word-readable document into SAS programmatically. In addition, a SAS macro, %word2sas, used for the conversion process is introduced. With this convenient and reliable solution, SAS programmers will become more effective and innovative in leveraging MS Word and SAS for their daily work.

Author(s): Jay Zhou

______

Paper CC19 Audience Level: beginner Tuesday: 2:15 - 2:30 p.m. Room: Broadway III/IV

Title: Your Place or Mine: Data-Driven Summary Statistic Precision

The number of decimal places required for displaying summary statistics for a specific parameter is generally a function of the precision of the raw values collected for that parameter and the specific summary statistic requested. This logic can easily be hard-coded for tables displaying only a limited number of parameters, but becomes more difficult to maintain for tables displaying many such parameters. This paper presents a data-driven solution to the problem of displaying summary statistics at the correct level of precision for a large number of parameters, such as those commonly found in summaries of clinical laboratory data.
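A minimal sketch of one way to derive the collected precision (variable names are hypothetical); the maximum number of decimal places per parameter can then drive the display formats of the summary statistics.

   data prec;
      set lb;
      length cval $32;
      cval = strip(put(aval, best32.));
      if index(cval, '.') then ndec = length(scan(cval, 2, '.'));
      else ndec = 0;
   run;

   proc sql;
      create table maxdec as
         select paramcd, max(ndec) as maxdec
         from prec
         group by paramcd;
   quit;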

Author(s): Nancy E Brucken

______

Paper CC20 Audience Level: intermediate Tuesday: 1:30 - 1:45 p.m. Room: Broadway III/IV

Title: Are you computing Confidence Interval for binomial proportion using PROC FREQ? Be Careful!!

PROC FREQ is the most commonly used procedure for the analysis of categorical data. However, in some situations the output generated by this procedure needs special attention. One such case is computing the confidence interval for responders using the binomial proportion. In many situations it is possible that the data do not contain any responders; however, the summary table still needs the confidence interval for responders. Does PROC FREQ calculate the confidence interval for responders when the data contain only non-responders? No, currently there is no functionality in PROC FREQ to handle this situation. This paper discusses the limitations of PROC FREQ in this situation and provides a solution.
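For reference, a minimal sketch of the standard request (names are hypothetical); when no 'Y' responses exist in the data, the level is absent and this request alone returns no interval, which is the gap the paper addresses.

   proc freq data=resp;
      tables response / binomial(level='Y') alpha=0.05;
   run;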

Author(s): Sandeep S Sawant

______

Paper CC21 Audience Level: beginner Tuesday: 8:45 - 9:00 a.m. Room: Broadway III/IV

Title: Touring the SASHelp Neighborhood

The SASHelp data library can be of tremendous help to a user who is interested in programmatically accessing metadata that describes data sets and catalogs in data libraries defined by the user, as well as key operational information for the SAS environment. After a tour of some of the noteworthy landmarks in the SASHelp neighborhood, I will discuss ways in which that information can be employed to help you write analysis data set specifications and sanitize an interactive workspace in order to ensure you always submit your SAS programs with a clean slate.
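One small stop on the tour, as a minimal sketch: the SASHELP.VCOLUMN view lists every column in every assigned library, which is handy when drafting analysis data set specifications.

   proc print data=sashelp.vcolumn noobs;
      where libname = 'WORK';
      var memname name type length label;
   run;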

Author(s): William Turner

______

Paper CC22 Audience Level: advanced Tuesday: 2:45 - 3:00 p.m. Room: Broadway III/IV

Title: A Macro for Transforming Almost Any Character Calendar Date into a SAS Date Value

We provide a macro for the creation of SAS date values from almost all standard and non-standard types of character representations of Gregorian calendar dates. The program can work with quite loosely written ‘contaminated’ data, with practically no restrictions on the type and number of delimiters or the sequence of month, day, and year values in the string to be transformed, e.g., “Jan 17th 2008”, “2000-** Jan 2nd”, “03##Sept.$$$02”, “6/XII/08”, “ 2 012003”, etc. The current version of the program supports the following date types: 1) month, day, and year are ‘numbers’; 2) year and month are ‘numbers’; 3) all types of Julian dates; 4) all standard week date formats; 5) all standard year-quarter combinations; 6) day and year are ‘numbers’, while month is presented as a Gregorian month name; 7) day and year are ‘numbers’, while month is presented as a Roman numeral; 8) a special type of non-standard date, where month, day, and year are ‘numbers’ but there are 0, 1, or 2 delimiters. Each date type is transformed in a separate macro designed specifically for it. Using the program requires the user to provide laconic directives identifying the type of date and the sequence of month, day, and year in a string that contains an unambiguous set of high-quality calendar date elements. The program is constructed in a modular fashion, which gives the user ease and flexibility when creating and adding new modules or updating and/or removing existing ones. Even users with minimal knowledge of SAS can use it effectively.
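The macro handles far messier strings than SAS informats do; purely as a point of reference (not the authors' macro), the ANYDTDTE informat covers many of the standard forms.

   data dates;
      infile datalines truncover;
      input rawdate $char20.;
      sasdate = input(strip(rawdate), anydtdte20.);
      format sasdate date9.;
      datalines;
   17JAN2008
   2008-01-17
   1/17/2008
   ;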

Author(s): Ruben Chiflikyan, Donna Medeiros, Mila Chiflikyan

______


Paper CC23 Audience Level: beginner Tuesday: 9:30 - 9:45 a.m. Room: Broadway III/IV

Title: Reverse-engineer a Reference Curve: Capturing Tabular Data from Graphical Output

The pharmaceutical industry is a competitive arena: new drugs inevitably have to find a market among competitors targeting the same indication. Comparisons against these competitor drugs can be problematic, however, since proprietary and legacy data may not be available in an analysis-ready format. In some cases data must be captured, or ‘reverse-engineered’, from graphical output. How would you meet a sponsor's request to overlay a reference curve on an efficacy plot where the data that produced the reference curve are not available? The most obvious but least appealing approach would of course be to estimate ‘by eye’, or with pencil and ruler, the X and Y coordinates for each data point on the reference curve as it appears in graphical output. This paper introduces a more accurate and less labor-intensive alternative: capture screen coordinates using an application like Windows Paint, and scale the screen coordinates to X,Y data coordinates which can then be plotted by a SAS/GRAPH procedure. To illustrate the technique, a ‘case study’ is presented in which a legacy Kaplan-Meier survival curve is added as a reference curve to a survival plot.
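The scaling step reduces to linear interpolation between two reference points whose pixel and data coordinates are both known; a minimal sketch with hypothetical variable names:

   /* (px0,py0)/(px1,py1) are pixel positions of two known axis reference points, */
   /* (x0,y0)/(x1,y1) are their data values; px,py are the captured curve points. */
   data datacoords;
      set pixelcoords;
      x = x0 + (px - px0) * (x1 - x0) / (px1 - px0);
      y = y0 + (py - py0) * (y1 - y0) / (py1 - py0);
   run;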

Author(s): Brian Fairfield-Carter

______


Data Management & Quality

Co-chairs: Laura DiTullio (GE Healthcare) Dave Izard (Octagon Research) ______

Paper DM02 Audience Level: advanced Wednesday: 10:00 - 10:30 a.m. Room: Galleria South

Title: A Flexible, User-Friendly Methodology for Data Set Comparison

A common activity performed by data management personnel within a clinical trial is the tracking of changes in study data. On a periodic basis the database is downloaded into SAS® data sets and the current version is compared to the data from the previous version. One option for doing this is the COMPARE procedure built into SAS®. However, this can result in a large amount of output and is not designed to do comparisons on multiple pairs of data sets. This paper proposes an alternate method for performing data set comparisons of this type. In this approach an initialization file is created which contains the names and primary keys for each of the data sets to be compared. The file is read into memory and, using a macro loop, the data sets are processed iteratively. For each data set a number of comparisons are made and the results are written in a format convenient for data management to analyze.
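A minimal sketch of the driver idea (librefs, macro name and parameters are hypothetical), with the initialization file supplying one data set name and key list per call:

   %macro cmpds(ds=, keys=);
      proc sort data=prev.&ds out=_prev; by &keys; run;
      proc sort data=curr.&ds out=_curr; by &keys; run;
      proc compare base=_prev compare=_curr listall;
         id &keys;
      run;
   %mend cmpds;

   %cmpds(ds=ae, keys=usubjid aeseq)
   %cmpds(ds=vs, keys=usubjid visitnum vstestcd)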

Author(s): Michael A Thompson

______

Paper DM03 Audience Level: intermediate Wednesday: 10:30 - 11:00 a.m. Room: Galleria South

Title: Clinically Significant Data Integration Studio

SAS Data Integration Studio is a traditional ETL (Extract/Transform/Load) solution for accessing a variety of data sources, transforming those data sources in a structured process, and managing the metadata around that process. One of the challenges of using a traditional ETL product to manage the clinical data transformation process is that this process requires a certain level of flexibility. The clinical data transformation process is sometimes its own art form, whereas an ETL product requires a very rigid process with unyielding inputs and outputs. These two goals often contradict each other and thus make the use of a traditional ETL product frustrating for clinical programmers. However, there are advantages to using a structured ETL process, the most important being the management of the metadata and the reusability of processes. Over the last few years SAS has built a clinical plug-in to the DI product which includes the SDTM 3.1.1 model, specific SDTM add-ons, and metadata reporting tools. These tools help to facilitate the development of processes for transforming clinical data. In addition, the SAS Open Metadata Architecture provides the ability to build additional components that are relevant to the clinical transformation process. This paper will provide an overview of: • Best practices for using DI in a clinical data transformation process • Advantages and limitations of the Clinical DI Plug-In • Capabilities to extend DI to be more clinically significant.

Author(s): Chris Decker, Stephen Baker

______


Paper DM04 Audience Level: intermediate Wednesday: 8:00 - 9:00 a.m. Room: Galleria South

Title: Clinical Data Acquisition Standards Harmonization (CDASH) Standard Version 1.0

This isn't a paper that I have drafted, but I am on the Core Team for CDISC's CDASH initiative and will gladly give this presentation if you think the topic is of interest. The aim of the Clinical Data Acquisition Standards Harmonization (CDASH) Standard Version 1.0 (released in October 2008) is to describe recommended basic standards for the collection of clinical trial data. CDASH has a standard presentation that I am happy to give regarding this new standard; if you think that your audience would be interested, I will already be at the conference and will be happy to talk about it. This would not be competing for best paper or anything - it would simply be informational.

Author(s): Kim Truett

______

Paper DM05 Audience Level: beginner Tuesday: 4:30 - 5:00 p.m. Room: Galleria South

Title: Using Empirical Rules in Quality Control of Clinical Laboratory Data

A statistical programmer without any medical laboratory technology background can still use simple empirical rules to identify suspicious values in clinical laboratory data. This paper illustrates the application of simple empirical rules in hematology and clinical chemistry. Although these rules are only empirical, their application can be quite useful for quality control purposes (e.g., reviewing outliers in preliminary data analysis, checking laboratory units reported by more than one laboratory data provider for the same analyte, and creating edit checks and exception reports).

Author(s): Faustino Daria

______

Paper DM06 Audience Level: beginner Wednesday: 9:00 - 9:30 a.m. Room: Galleria South

Title: Using CDISC Lab Terminology and Determining Standard Units

In any clinical trial submission, laboratory test results are always necessary. However, there are many challenges that can face programming teams working through lab data. Lab test names can differ depending on which lab handles the testing. Units can vary from lab to lab, creating the need for standardization of units and conversions. Even within a submission, a need arises to maintain as much standardization as possible in lab test names, units and conversions. In addition to these difficulties, groups can also struggle to achieve the level of standardization required by CDISC (Clinical Data Interchange Standards Consortium) for SDTM (Study Data Tabulation Model) submissions. When submitting CDISC-compliant LB SDTM data sets, it is important to use CDISC-compliant terminology for lab tests. This paper will discuss the help that this terminology can provide in both individual and ISS studies. It will also provide practical SAS techniques to ensure that LB submissions have LBTEST and LBTESTCD parameters that are SDTM compliant. In addition, standardized lab values (the --STRES variables) will be discussed for use in individual and ISS studies. Particular attention will be paid to examples of SAS code that help determine whether the values of the --STRES variables are unique for each LBTESTCD throughout a submission. The ideas discussed in this paper will be applicable to SAS users on any platform. This paper is intended for those who are interested in creating CDISC-compliant LB SDTM data sets and should be accessible to SAS users of all levels of skill and experience.
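A minimal sketch of one such check (libref is hypothetical): flag any LBTESTCD reported with more than one standard unit across the submission.

   proc sql;
      create table mixed_units as
         select lbtestcd, count(distinct lbstresu) as n_units
         from sdtm.lb
         group by lbtestcd
         having calculated n_units > 1;
   quit;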

Author(s): Mat D Davis

______

Paper DM07 Audience Level: intermediate Wednesday: 9:30 - 10:00 a.m. Room: Galleria South

Title: A Simple Macro to Flag New/Changed Records

In a long clinical study or a study with many subjects, listing reviews may be needed many times during the study. To gain efficiency, a reviewer would like to know what new or changed records have emerged since the last run of the listings. This paper describes a simple macro developed in SAS that flags new or changed records in the listing output. This flag identifies the records that are new or changed since the last run. The macro also provides an option to output only new and changed records rather than a cumulative listing.
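A rough sketch of the underlying comparison (not the authors' macro; names are hypothetical): PROC COMPARE isolates changed records and a merge identifies new ones.

   proc sort data=prev; by usubjid visitnum; run;
   proc sort data=curr; by usubjid visitnum; run;

   /* Records whose values changed since the previous run */
   proc compare base=prev compare=curr out=changed outnoequal noprint;
      id usubjid visitnum;
   run;

   /* Records that are new since the previous run */
   data newrecs;
      merge prev(in=inprev keep=usubjid visitnum) curr(in=incurr);
      by usubjid visitnum;
      if incurr and not inprev;
   run;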

Author(s): Zhuo Chen, David A Gray

______

Paper DM08 Audience Level: intermediate Tuesday: 5:00 - 5:30 p.m. Room: Galleria South

Title: Working with a CRO / TPO Using SDD to Implement an Adaptive Design Clinical Trial Monitored by a DMC

Clinical trials are in and of themselves complex, due to dictionary look-ups, conversion of tables, different ways of collecting case report forms, etc. Very recently, Bayesian trial designs have become more accepted and are used by pharmaceutical companies. In these designs another level of complexity is added when a separate Data Monitoring Committee (DMC) is set up to review safety and efficacy data while the sponsor remains blinded, and a CRO is used to create unblinded tables for the DMC to review. Although SDD will accommodate (or allow for) multiple entities working on the same platform seamlessly, there are still challenges in ensuring that study data remain blinded when the sponsor and the CRO are both working in SDD. Because SDD is an emerging technology, not all procedures are yet in place. This paper/presentation addresses the challenges, and the solutions we used to execute an adaptive design trial monitored by a DMC while working with a CRO.

Author(s): Barry M Brolley, Maruful Chowdhury

______

Paper DM09 Audience Level: intermediate Tuesday: 4:00 - 4:30 p.m. Room: Galleria South

Title: Using the IBM® Tivoli® Storage Manager Hierarchical Storage Management facility (HSM) to manage large datasets at IMS.

The main focus of this paper is to illustrate how the LifeLink™ Statistical Services Programming Group at IMS manages exceptionally large datasets using the IBM® Tivoli® Storage Manager Hierarchical Storage Management facility (HSM). The LifeLink™ programming group provides programming services in support of IMS' Commercial Effectiveness Consulting Services organization as they deliver customized analyses using Anonymized Patient Level Data (APLD)/Longitudinal Prescription Data (LRx). IMS' APLD/LRx database contains approximately 64% of all prescriptions written in the U.S., so it's not unusual for the datasets processed by the LifeLink™ group to exceed 100 gigabytes in size and contain hundreds of millions of observations. At any given time, the group is working on 20-25 custom deliverables which follow a project timeline of approximately 3-6 weeks per project. The group is also responsible for approximately 50 recurring deliverables which follow monthly, quarterly, semi-annual or annual cycles. Considering the number of projects and the size of the datasets being processed, one can see why a tool like IBM®'s HSM is necessary. This paper will provide a basic overview of how HSM is used at IMS to support these large data volumes and will secondarily examine other efficiencies in use by the LifeLink™ Programming Group. IMS LifeLink™ is a unique global program of patient-centered information, analytics and consulting.

Author(s): Paul Doucette

Hands-On Workshops

Co-chairs: Susan Fehrer (BioClin Inc) Alissa Ruelle (PharmaNet) ______

Paper HW01 Audience Level: beginner Monday: 1:30 - 3:30 p.m. Room: Parlor

Title: Exploring DICTIONARY Tables and SASHELP Views

SAS® users can quickly and conveniently obtain useful information about their SAS session with a number of read-only SAS data views called DICTIONARY tables or SASHELP views. At any time during a SAS session, information about currently defined system options, libnames, table names, column names and attributes, formats, indexes, and more can be accessed and captured. This hands-on workshop (HOW) explores the purpose of DICTIONARY tables and views, how they are accessed, and what information is available to SAS users. Attendees will learn how these important tables and views can be accessed and applied using real-world scenarios.
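A small taste of what the workshop covers, runnable in any SAS session: DICTIONARY.TABLES lists every table in every assigned library along with observation counts and creation dates.

   proc sql;
      select libname, memname, nobs, crdate
         from dictionary.tables
         where libname = 'SASHELP' and memtype = 'DATA';
   quit;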

Author(s): Kirk Paul Lafler

______

Paper HW02 Audience Level: beginner Monday: 10:00 a.m. - 12:00 p.m. Room: Parlor

Title: Understanding the define.xml and converting it to a relational database

When submitting clinical study data in electronic format to the FDA, not only information from trials has to be submitted, but also information to help understand the data. Part of this information is a data definition file, which is the metadata describing the format and content of the submitted data sets. When submitting data in the CDISC SDTM format it is required to submit the data definition file in the Case Report Tabulation Data Definition Specification (define.xml) format as prepared by the CDISC define.xml team. This workshop will provide a quick introduction to XML (eXtensible Markup Language) and will explain the structure and content of the define.xml file. The SAS XML Mapper will be used to convert the define.xml file into SAS datasets.
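A minimal sketch of the final step (file names are hypothetical): once the SAS XML Mapper has produced a map file, the XML LIBNAME engine reads define.xml as SAS data sets.

   filename define 'define.xml';
   filename sxlemap 'define.map';           /* map generated by the SAS XML Mapper */
   libname define xml xmlmap=sxlemap access=readonly;

   proc copy in=define out=work;            /* materialize the mapped tables */
   run;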

Author(s): Lex Jansen

______

Paper HW03 Audience Level: beginner Monday: 3:30 - 5:30 p.m. Room: Parlor

Title: PROC REPORT: Compute Block Basics – Part II Practicum

One of the unique features of the REPORT procedure is the Compute Block. Unlike most other SAS procedures, PROC REPORT has the ability to modify values within a column, to insert lines of text into the report, to create columns, and to control the content of a column. Through compute blocks it is possible to use a number of SAS language elements, many of which can otherwise only be used in the DATA step. While powerful, the compute block can also be complex and potentially confusing. This tutorial introduces basic compute block concepts, statements, and usages. It discusses a few of the issues that tend to cause folks consternation when first learning how to use the compute block in PROC REPORT. This paper is being presented in conjunction with the tutorial PROC REPORT: Compute Block Basics - Part I; consult that paper for additional details.
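A minimal flavor of what the practicum covers, using SASHELP.CLASS: a compute block that builds a computed column from a column to its left.

   proc report data=sashelp.class nowd;
      column name height tall;
      define height / display;
      define tall   / computed 'Tall?';
      compute tall / character length=3;
         tall = ifc(height > 60, 'Yes', 'No');
      endcomp;
   run;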

Author(s): Art Carpenter

______


Paper HW04 Audience Level: beginner Monday: 8:00 - 10:00 a.m. Room: Parlor

Title: A Pragmatic Programmers Introduction to Data Integration Studio: A Hands on Workshop

ETL is the process of moving data from a source system (such as operational systems or a table in a database) into a structure that supports analytics and reporting (target). This workshop will guide participants through a structured, hands-on exercise designed to give them a broad overview of what can be accomplished with Data Integration Studio. Here we will prepare data for use by extracting data from an external file, creating transformations that enrich our data, combining it with other data for completeness, and finally loading the data into tables that are part of a star schema. The goal of this workshop is to get users comfortable with the tool and demonstrate its capability.

Author(s): Greg Nelson

______

Paper HW06 Audience Level: intermediate Tuesday: 1:30 - 3:30 p.m. Room: Parlor

Title: SAS® Enterprise Guide 4.2 and Stored Processes

New in SAS 9, stored processes are SAS programs that are stored centrally on a server so that they can be accessed from anywhere in the organization. The advantage of stored processes over other approaches is in the ability to centrally maintain and manage code. They can provide better control over changes, enhance security and application integrity, and ensure that every client executes the latest version of code. SAS Stored Processes can be used in many different client applications, including SAS Enterprise Guide and the SAS Stored Process Web Application. This paper will cover creating and managing SAS Stored Processes using Enterprise Guide and the SAS Management Console, including the following topics: • A brief overview of using SAS® Management Console to set up and manage Metadata, Workspace and Stored Process Servers; and • Using SAS® Enterprise Guide to create and view stored processes.

Author(s): Frederick Pratter

______


Management

Co-chairs: Jackie Lane (ICON Clinical Research) Pete Yribar (Genentech, Inc.) ______

Paper MA01 Audience Level: intermediate Monday: 8:00 - 8:30 a.m. Room: Galleria North

Title: Technical Support in Biometrics

As biometrics organizations become larger and drug development becomes more complicated, there is a growing need to establish a dedicated group within the biometrics organization (typically consisting of biostatisticians and clinical SAS programmers) to build biostatistical applications (e.g., a clinical data reporting macro library, e-Submission) and at the same time provide biostatistical system and application support to users on a daily basis. This paper discusses the organization and functions of such a group, as well as the kinds of services it can provide to the biostatistics user community. The author has worked at large pharmaceutical firms and is currently working at a large biotech company.

Author(s): Wei Dong

______

Paper MA02 Audience Level: intermediate Monday: 11:30 a.m. - 12:00 p.m. Room: Galleria North

Title: Managing Independent Consultants - An Oxymoron?

Managing a group of experienced consultants can be challenging enough. Throw in some variables such as different time zones and communication styles and it can become a bit overwhelming. This paper uses real-life examples to make a few key points on working effectively with a team of consultants: - The virtual team: working successfully with centralized AND remote consultants - Effective use of team leads in a consulting environment - Maintaining good communication - How to keep consultants accountable without feeling like "Big Brother".

Author(s): David J Polus

______

Paper MA03 Audience Level: intermediate Monday: 9:30 - 10:00 a.m. Room: Galleria North

Title: Empowering SAS® Programmers: The Role of the Manager

How does a manager empower SAS® programmers in a fast-paced environment? SAS programmers need skills, information and resources, authority and motivation to be successful. A manager can empower programmers by helping them to develop their skills, providing them with complete information and resources, giving them appropriate authority and motivating them. The manager who does these things will give their programmers the best chance for success.

Author(s): Carey G Smoak

______


Paper MA04 Audience Level: intermediate Monday: 10:00 - 11:00 a.m. Room: Galleria North

Title: Establishing a True Virtual Partnership

The SAS Clinical Programming realm is beset by challenges facing organizations looking to partner with service providers both local and international. Pharmaceutical and Biotech companies are both trying to identify a feasible solution for low-cost outsourcing of clinical programming activities. This challenge is intensified by the opposing objectives of maintaining quality output while minimizing training and onboarding costs. Outsourcing models vary widely from company to company and the success of those models is equally variable. This paper will examine one such model and look to identify success factors common to other models in order to establish a baseline for a true virtual partnership. The authors will determine the most effective strategy for merging the efforts of multiple divergent organizations into a single effort that is aligned with the strategies of each. They will examine the challenges presented by a geographically dispersed project team and the requirements to achieve a successful working environment. The paper will evaluate and determine the most effective process for negating the growing pains that are inevitable in a virtual partnership. It will identify the strengths of the selected partnership structure and examine other possible structures that could potentially achieve the same objectives. It will also examine the nature of the supplier-purchaser relationship within the service provider model as partner organizations look to grow within the partnership environment. Ultimately, this paper provides Pharma and Biotech managers with an in-depth analysis of a successful clinical programming outsourcing project with the goal of identifying the lessons learned and applying them to a real-world situation.

Author(s): Jonathan Botha, Sandra Vermeulen

______

Paper MA05 Audience Level: beginner Monday: 9:00 - 9:30 a.m. Room: Galleria North

Title: Using a Personality Inventory to Better Lead Your Team

The field of organizational psychology studies and attempts to understand the way people act in groups. Behavior is in turn an expression of one's personality. An individual's personality influences many areas of work life. It affects how he communicates with colleagues, how he approaches a new task, how he manages a team, and how he views the organization. The result of this on the company can be measured by the individual's productivity and the cooperation of those working close to him. As a manager and leader, understanding the personalities of your team members can help to enhance team performance. This paper presents a tool that helps managers achieve this understanding and discusses how to make use of it.

Author(s): Wayne Woo

______


Paper MA07 Audience Level: beginner Monday: 11:00 - 11:30 a.m. Room: Galleria North

Title: Reducing Risk in your SAS Environment

As all managers know, even when projects appear to be running smoothly, risk must be constantly assessed, mitigated and dealt with. This is an everyday practice with projects, but what about issues specific to your SAS environment as a whole? Are there issues that can arise during a project that may not be thought of during project planning? It is important to take into account anything that can add risk to your project. This paper will discuss issues that pertain to a SAS environment that can lead to risk now or down the road.

Author(s): Brian K. Varney

______

Paper panel #3 Audience Level: intermediate Wednesday: 8:30 a.m. – 10:00 a.m. Room: Parlor

Title: Managing Global Resources: A Panel Discussion
Facilitator: Charles Du Mond

Pharmaceutical development is a global process. Companies ranging from small virtual start-up biotechs to large pharma all use global resources. Biostatistics and programming departments have also gone global, regardless of whether it is to provide local services on a global scale, or to distribute the workload across lower and higher cost centers. The panel discussion will focus on the challenges of managing a global workforce, maintaining quality with distributed work, and harmonizing work practices across multiple offices.


Posters

Co-chairs: Jeanina Worden (ClinOps, LLC) Helen Wang (Merck & Co., Inc.) ______

Paper PO01 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: Imbedding Custom Table of Contents Code in RTF Documents

The SAS Output Delivery System (ODS) allows output to be printed directly to a Rich Text Format (RTF) file. RTF files can be viewed by Microsoft Word and other word-processing packages. A table of contents (TOC) can greatly enhance an RTF file and make it more user-friendly. Headers and footers, as well as custom footnotes, can also greatly enhance an RTF file. This paper will describe a method to imbed “invisible” RTF control words into a file for the purpose of creating a custom TOC. The example presented will also include document headers, footers, and custom footnotes. The example code was run with SAS Version 9.1.2 on Windows XP Professional. The RTF file is viewed with Microsoft Office Word 2003.

Author(s): Lori S Parsons

______

Paper PO02 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: A Sample Method to Generate Patient Profiles

Generating patient profiles is a tedious job for programmers at pharmaceutical companies and CROs, since a lot of data manipulation and many procedures, such as PROC PRINT and PROC REPORT, are involved. This paper describes a simple approach to generating patient profiles that follow the order in which data are collected on the Case Report Forms (CRF). We introduce a macro that automatically creates SAS code to generate the tables by using the CALL EXECUTE routine within DATA steps.
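A minimal sketch of the mechanism (the %profile macro here is a hypothetical stand-in for the generated reporting code): CALL EXECUTE stacks one macro call per subject for execution after the DATA step ends.

   data _null_;
      set adsl;                       /* one record per subject */
      call execute(cats('%nrstr(%profile)(subjid=', usubjid, ');'));
   run;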

Author(s): Suwen Li, Daniel Li, Stephanie Sproule, David Zhu

______

Paper PO03 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: A Generalized Procedure to Create SAS/Graph Error Bar Plots

Different methodologies exist to create error-bar-related plots. Procedures exist to use the dataset by itself to create error bars for mean +/- standard deviation (using the INTERPOLATION=STD1JT option), mean +/- standard error (using the INTERPOLATION=STD1TJM option), or minimum and maximum values, but it becomes cumbersome to generate figures using the plotting procedure alone, as one has to go through the numerous available options to find the right one. This paper is an attempt to provide a general procedure for plotting all types of error-bar-related graphs.

Author(s): Sanjiv Ramalingam

______


Paper PO04 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: All-purpose Programs

A frequent practice in programming for clinical trial reporting is to reuse a program by copying the code and making a small change to meet new specifications. The new program file might be called by a different name and yet have only one line changed. Or the program might have the same name, but creates output that differs from before due to a change in code. The original program can be changed several times to meet the needs of new ad hoc analyses or new exploratory analyses. The resultant proliferation of new or changing programs is a source of confusion and can lead to results that cannot be replicated or that are difficult to validate. Associated files, such as the log and output, may also have different file names if there is a change in the new program name, and this adds to the confusion. In this paper, I will present strategies for creating a SAS® program that does not change, but that produces different output depending on environmental variables, global macro variables, or simply the location of the program in the file hierarchy. By building this versatility into a program once, the programmer can avoid file proliferation and the need for repeated validation, while at the same time meet the needs of people who want changes in analyses and displays of data. The strategies and code described have been developed on a Unix platform but should be adaptable to any SAS version and any platform, and should be of use to SAS users at all levels of experience.

Author(s): James Young

______

Paper PO05 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: A SAS Macro Application for Generating Patient-Level Reports in Clinical Studies

In clinical observational studies, participating physicians sometimes seek to obtain patient-level reports for a specific subgroup of patients in the studies to help make clinical treatment decisions. The patient-level reports often contain baseline characteristics, vital signs, adverse events and concomitant drug usage. Many analysis datasets (ADS) in pharmaceutical companies use record-based databases. Patient-level information cannot be directly derived from those databases since each patient may have multiple records in an individual dataset (e.g., multiple adverse events or therapy dosages). Moreover, multiple datasets are involved in reproducing a patient-level summary report. Therefore, it comes in handy to have a SAS macro that summarizes patient-level information easily and quickly, with minimal formatting and sufficient but not redundant information for clinical investigators. The macro %PATINFO described in this paper was written with that goal in mind. It can be used to generate patient-level reports in clinical observational studies with a study-specific template as its key parameter. The template can be created easily in Excel with a few parameters such as variable names, variable labels and dataset names. The template provides a great deal of flexibility to meet the various data structures of different studies. This macro can also be applied to non-observational clinical studies when patient-level reports are needed.

Author(s): Yao Huang

______


Paper PO06 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: Open Source SAS® Software Applications: the OS3A Program for Community Programming

The Web ‘changes everything’!! We are entering a new paradigm of worldwide collaboration in scientific, educational and economic activity based on a large decrease in communication costs created by the internet. This will change how we write programs in the future. The paradigm shift has made possible new methods of open source software development and distribution. Can SAS® software users take advantage of the opportunities? How could this open source model ever work? At first glance, the open source model should fail as an example of the free rider problem of classic economics. The adverse incentives can be explained with a “programmers’ dilemma” analysis in game theory. The open source licenses have conditions that overcome the free rider problem based on the principles of the Open Source Initiative. As a practical matter, the facilities provided by several sites supporting open source projects, and the practices developed by the many open source projects over the past 15 years, are key in generating cooperative peer production by software application developers.

Author(s): Dante diTommaso, Ann M Martin, Paul M OldenKamp, Paul D Hamilton

______

Paper PO08 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Generating Model Based Subgroup Analysis Using SAS® Procedures

Subgroup analysis is often carried out for clinical trials to further understand the treatment effect in some subgroups of patients. One approach is to repeat, for the subgroup of patients of interest, the analysis done for the whole population. Another is to generate the subgroup analysis based on a statistical model with an interaction term of subgroup and treatment. The reason the latter approach is not commonly used is the complexity of interpreting the interaction in the model. This article is an effort to provide a SAS tool to generate such an analysis with ease. Through an example, we demonstrate the programs and results of the subgroup analysis with these two approaches.
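A minimal sketch of the second approach, with hypothetical ADaM-style names: one model with a subgroup-by-treatment interaction, from which within-subgroup treatment estimates can be reported.

   proc glm data=adeff;
      class trtp subgrp;
      model chg = trtp subgrp trtp*subgrp;
      lsmeans trtp*subgrp / cl;
   run;
   quit;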

Author(s): Tracy Lin, Jie Huang

______

Paper PO09 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: SAS Viewer giving way to Universal Viewer

The SAS Viewer has been a useful, free, but somewhat limited tool provided by SAS for a long time. It has bugs and limitations, and starting with SAS 9.2 it will no longer access the default data tables. The Universal Viewer is a fresh take on a tool to fill this niche. It does not lock data tables, so your SAS jobs complete rather than failing as they can with the SAS Viewer. It allows the user to sort on more than one column. It gives the right answer when running a WHERE clause against missing values.

Author(s): Steve Wright

______


Paper PO10 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Chic FREQ and UNIVARIATE

There are many standard and often-used SAS procedures for reporting the results of clinical trials. The aim of the efficient SAS programmer is to be able to macroize the programs that generate the regular tables, reducing rework and repetitive programming. This paper will present two flexible and dynamic macros utilizing the PROC FREQ and PROC UNIVARIATE procedures. These macros allow the user to produce commonly used statistical output, including flexible control over formatting and output options.

Author(s): Jasmin Fredette

______

Paper PO11 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: Open Source Collaborations: what applications are available for Clinical Research

The purpose of this paper is to give an overview of the open source software collaborations ongoing in the pharmaceutical and healthcare industry. Today, a great many low-cost, widely used applications are available. Statistical programmers can learn from these examples to provide value for their organizations and the patients they work for. From these examples it is clear that the statistical programming community has an interest in actively surveying the development of these applications and taking a leading role in their organizations to bring innovation and cost reduction to their companies.

Author(s): Ann M Martin

______

Paper PO12 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: New Destinations: RTF Tagsets in SAS Version 9.2

The SAS® System has had the ability to directly create Rich Text Format documents for many years. First available as a home-grown macro application, the original RTF destination was made available by SAS Institute with ODS in Version 7. However, the original ODS RTF destination was designed with significant limitations: while ODS knew about and controlled the output across a page (horizontally), it was completely ignorant of vertical page layout. This led to a host of clever work-arounds from the SAS programmer community. With the release of Version 9.2, a new RTF tool has become available. Now, typical reporting tools such as Proc Report and Proc Tabulate know about the horizontal and vertical page dimensions as text is flowed into a report. This, along with other subtle but important changes in 9.2 ODS, may lead us to reexamine our output programs and can greatly simplify production reporting tools.
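A minimal sketch of switching to the new destination (file name is hypothetical): the measured tagsets.rtf destination in 9.2 is invoked much like the original RTF destination.

   ods tagsets.rtf file='report.rtf' style=journal;
   proc report data=sashelp.class nowd;
      column name sex age height weight;
   run;
   ods tagsets.rtf close;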

Author(s): Paul D Hamilton

______


Paper PO13 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Managing Project Schedule, SAS Can Help

It is always easier to have all project plans in front of you while you try to schedule workloads or reallocate resources. A graphic schedule displays complex timelines in order and helps you rearrange your plans or resources to complete projects on time, especially when some timelines overlap. Microsoft Office Project's Gantt chart is a powerful planning and scheduling tool with a number of visual functions to keep you and your team members up to date. You can also create a Gantt chart in Microsoft Office Excel with simple functions. However, for those companies which do not purchase Microsoft Project or other Gantt chart software but do license SAS®, there is an alternative option: one of the SAS procedures, PROC GANTT. As for your source projects, you can simply enter your schedules in a SAS DATA step or in Microsoft Excel as usual.

Author(s): Hui-Ping Chen

______

Paper PO14 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: Electronic validation of tables and listings in Rich Text Format or Microsoft Word format without leaving your own SAS

Validation of clinical trial data reporting is an extremely important step in ensuring quality and accuracy in a regulatory submission. The way validation is performed varies between and sometimes even within companies, but in general falls into one of two categories: a review of the main programmer's code by another programmer; or independent programming of the main programmer's analysis data and/or output report for comparison purposes. In a trial where actual output (i.e., tables and listings) will be submitted to the regulatory body it is preferable to validate the final output itself rather than the analysis data used to generate that output. Even with perfect data there is still plenty of room for error in the data summarization and presentation process, such as miscalculated denominators, omitted patients, or juxtaposed columns and rows. This paper presents a process for facilitating the validation of output in Microsoft Word or Rich Text Format. It is not necessary for the main and validation programmers to validate their respective SAS analysis data a priori or even use the same variable names and variable formats. Using a SAS macro, the table or listing is read from the .doc or .rtf output into SAS and converted into a SAS data set (where each observation, in order, corresponds to a row of the table or listing). The resulting data set is then electronically compared to the validation programmer's results. The macro prints the rows where there are discrepancies for further investigation. In addition to the added value of having the actual final output validated, this process also greatly reduces the tedious and human error-prone aspect of comparing anything "by eye".

Author(s): Gina M Garding

______


Paper PO16 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: A Framework For Achieving An Industry-Driven, Open-Source Clinical Reporting System

How many in-house reporting systems have been presented at recent clinical development conferences? Each has innovative features that spark interest in the development community, but each remains a separate, incompatible system that is unlikely to be replicated within any other organization. We could continue individually culling ideas that are feasible for our separate environments. Or we could combine the best of each into a collaborative project that leverages standards in a flexible framework. Our purpose is two-fold. First, establish a clear rationale for a global, open-source clinical reporting library that builds on the success of industry standards such as CDISC's Study Data Tabulation Model (SDTM) and MSSO's Standardised MedDRA Queries (SMQs). We outline the benefits of participation for sponsors, regulatory agencies, individual programmers, and ultimately for patients who benefit from development transparency and efficiency. Second, outline a framework for a viable open-source project, minimum requirements to make adoption a realistic option, and an initial development plan.

Author(s): Dante diTommaso, Ann M Martin, Paul M OldenKamp, Paul D Hamilton

______

Paper PO17 Audience Level: advanced on display Monday and Tuesday Room: Grand Ballroom

Title: Arrow Clinical Data Analysis System, An Integration of Statistical Analysis and Reporting

The Arrow system is written in SAS. The system has been developed and successfully used in a large number of clinical studies over 15 years. It has now become a company standard for statistical analysis and submission processes at Johnson and Johnson Pharmaceutical Research and Development, LLC. This paper introduces the system from the viewpoint of system design. Basic design aspects are covered herein, including system scope, design criteria, architecture and structure, programming style, version control, self-diagnosis, system maintenance, and recovery process. In addition, some specifications of Arrow sub-systems are briefly described.

Author(s): Cheng Jun Tian, Denis Michel

______

Paper PO18 Audience Level: advanced on display Monday and Tuesday Room: Grand Ballroom

Title: Need for a Good Programming Practice for Clinical Trials

There is a need for a consensus Good Programming Practice for Clinical Trials. This would avoid collaborating organizations needing to review each other's Good Programming Practice before being able to work together. It would enhance the readability of our programs to the benefit of regulatory authorities and ultimately the patients. It would also enable open source collaborations in developing tools for the statistical programmer community, such as a clinical trial reporting system. This Good Programming Practice could be developed through an open collaboration on, e.g., www.sasCommunity.org. Currently, a practice titled 'Good Programming Practice for Clinical Trials' has been posted on sasCommunity.org in the hope of receiving contributions from other programmers, companies, academia and regulatory authorities. The aim is to review the main points of this Good Programming Practice in order to get adoption of a consensus document, endorsed by the boards of PharmaSUG and PhUSE, by the end of 2009.

Author(s): Ann M Martin ______


______

Paper PO19 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Arrow Statistical Graphic System

Arrow Statistical Graphic System (Arrow/GR), written in SAS, presents effective statistical graphs for clinical data analysis in Johnson & Johnson Pharmaceutical Research & Development. Arrow/GR is a sub-system of the Arrow Clinical Data Analysis System, focusing on data visualization. It covers services for all phases of clinical trials. Arrow/GR currently provides eight types of statistical graphs: mean plots, vertical bar charts, horizontal bar charts, box plots, line plots, dot plots, survival curve plots, and point estimates with confidence intervals plots. With Arrow/GR, graphic presentation becomes easy, reliable, repeatable, and yet flexible. Arrow/GR is a modularized SAS macro package with a well-organized structure. It sets up J&J's graphic standard by applying graphic templates and a consistent selection of colors, symbols, and line types. The graphic data are the output of well-validated Arrow statistical analysis modules. This makes the statistical graphs produced by Arrow/GR solid and straightforward to quality control.

Author(s): Jiangfan Li, Cheng Jun Tian, Qin Li ______

Paper PO20 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Oracle Clinical for Clinical Trial SAS Programmer

This paper is intended for SAS programmers who are interested in understanding the difference in the database structure between Oracle Clinical and SAS. It also helps SAS programmers to use SAS/ACCESS® to extract raw data from Oracle Clinical. This paper will discuss the database structure of Oracle Clinical and its relationship with the extracted SAS data.

Author(s): Kevin Lee ______

Paper PO21 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Statistics for Clinical Trial SAS Programmers 1: paired t-test

This paper is intended for SAS programmers who are interested in understanding common statistical methods used in clinical trials. The paper will focus on the paired t-test, which is used to assess statistical significance. It will introduce the simple statistical background and sample SAS code using SAS/STAT®, and it will also show how to use JMP® Software to perform the paired t-test.
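
As a purely illustrative sketch (not code from the paper), a paired t-test can be run in SAS/STAT with the PAIRED statement of PROC TTEST; the data set and variable names below are assumptions:

   proc ttest data=vitals alpha=0.05;
      paired sbp_post*sbp_pre;   /* tests H0: mean(post - pre) = 0 */
   run;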

Author(s): Kevin Lee

______


Paper PO22 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: When Simpler is Better – Visualizing Laboratory Data Using “SG Procedures”

In SAS® 9.2, SAS/GRAPH introduces a family of new procedures to create stand-alone graphs that use the ODS Statistical Graphics infrastructure and are designed to use ODS styles. These new “SG procedures” include SGPLOT, SGPANEL, and SGSCATTER. With a simple and clear syntax, production-quality plots can be generated to assist your data exploration and presentation. This paper provides examples that illustrate how you can use these new procedures to examine and visualize laboratory data.
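
A minimal sketch (not from the paper) of the kind of SG procedure call the abstract describes; the data set and variable names for the laboratory parameter are assumptions:

   proc sgplot data=lab;
      vbox aval / category=visitnum group=trtgrp;   /* lab result by visit, grouped by treatment */
      yaxis label="ALT (U/L)";
   run;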

Author(s): Wei Cheng ______

Paper PO23 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Establishing Consistent Programming for Pharmaceutical Safety Summarizations Using a DOW Loop


The three basic categories of data collected during a pharmaceutical study to assess safety are clinical laboratory tests, electrocardiogram parameters, and vital signs. Simple summary statistics of the results and change from baseline are usually presented in an ICH summary report, and often it is useful to include a presentation of out-of-range shifts relative to baseline. When several varieties of standard programs are available to facilitate summaries, one could argue that a consistent approach across all data and types of summaries in a specific study can increase efficiency and minimize the possibility of multiple errors. This paper will illustrate a basic approach which may be applied similarly to various types of data regardless of vertical or horizontal structure. To minimize data step processing and superfluous merging or processing, a DOW loop is employed. Advantages and disadvantages of using this process are discussed, and preliminary results to date are presented.
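
The DOW-loop pattern mentioned above can be sketched as follows; this is an illustrative double-DOW change-from-baseline example with assumed data set, BY, and variable names, not code from the paper:

   data chg;
      /* first DOW loop: read one subject/parameter BY group and capture the baseline */
      do until (last.paramcd);
         set labs;                          /* assumed: sorted by usubjid paramcd visitnum */
         by usubjid paramcd;
         if visitnum = 1 then base = aval;  /* visit 1 taken as baseline (assumption) */
      end;
      /* second DOW loop: re-read the same group and output change from baseline */
      do until (last.paramcd);
         set labs;
         by usubjid paramcd;
         chg = aval - base;
         output;
      end;
   run;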

Author(s): Bradford J Danner, Steven R Kirby ______

Paper PO24 Audience Level: beginner on display Monday and Tuesday Room: Grand Ballroom

Title: Biostatistics and Data Management: A Match Made in ...Tasmania?

The relationship between the functional teams of Biostatistics and Data Management is very important and complex. It can also be very stressful and aggravating. But, at the same time it is gratifying and rewarding. Like I said, a complex relationship. I did not realize the complexity and mutual need that existed between these two groups until last year when I ended up being the lone Biostats person in a CRF workshop at a Clinical Data Management conference. Shortly thereafter, I also saw a documentary about the relationship between the male and female Tasmanian Devil on public television and I immediately saw a parallel that created the inspiration for this presentation.

Author(s): Elaine Z Dempsey

______


Paper PO25 Audience Level: intermediate on display Monday and Tuesday Room: Grand Ballroom

Title: Catch the Bad Guys!!! A Utility Program to Check SAS® Log Files


SAS log files are generated whenever we run a SAS program. The log is an important part of the validation process because it enables us to make our programs error free. Normally we look for messages like “ERROR:” or “WARNING:”, but we are going to discuss some others which we really don’t want to see in our log file. This utility program scans through one or more SAS log files for all these messages and gives a summary report of the bad ones, allowing a programmer to fix the code and get rid of these bad messages. Looking at this summary report is better than going through each SAS log file and finding all the unwanted messages. This program was developed in a PC SAS environment and works with both the 8.x and 9.x versions of SAS.
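
A bare-bones version of this kind of log scan might look like the sketch below; the log path, message list, and data set name are illustrative assumptions, not the author's utility:

   data logcheck;
      infile "C:\project\logs\ae_table.log" truncover;   /* assumed log location */
      input line $char256.;
      upline = upcase(line);
      if index(upline, 'ERROR')         or
         index(upline, 'WARNING')       or
         index(upline, 'UNINITIALIZED') or
         index(upline, 'MERGE STATEMENT HAS MORE THAN ONE') or
         index(upline, 'CONVERTED') then output;   /* keep only suspicious log lines */
   run;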

Author(s): Amit K Baid ______


Public Health Research

Co-chairs: Richard Allen (Peak Statistical Services) Paul Slagle (United Biosources) ______

Paper PR01 Audience Level: beginner Wednesday: 9:00 - 10:00 a.m. Room: Galleria North

Title: Data Convergence in BioPharma and Healthcare: Overview and Implications

The increased pace of advances in science, technology and healthcare and the need for medical breakthroughs have blurred the lines between traditional Biotechnology, Pharmaceutical Research and Healthcare. Anyone who is familiar with these sectors knows that the data and information needs of these industries are large and complex and often daunting even within a single organization. In this paper, we will discuss implications of the patient data flow produced by each of these organizations. We will define what each of these types of organizations do in terms of their interaction with the patient and the data footprint that each of them generate. Further, we will outline trends in the context of each industry and what that might mean for the integration of data and systems. Finally, we will talk about the implications of interoperability within and between these sectors including the electronic and personal health record (EHR/PHR), health information exchanges (HIEs), the role of patient registries and epidemiology databases and the converging data and interoperability standards (for example, CDISC, ICH and HL7).

Author(s): Greg Nelson ______

Paper PR02 Audience Level: beginner Wednesday: 8:30 - 9:00 a.m. Room: Galleria North

Title: Pharmacogenomics: JMP Right In!

The understanding of pharmacogenomics has greatly accelerated in the past decade. With the unraveling of the human genome, the ongoing HapMap project and the development of microarray SNP chips by Affymetrix and Illumina, the chore of associating genetic markers with drug reaction states has become a weighty project. With the vast amounts of data being stored and sifted it is useful to have a powerful statistical tool for analysis. SAS® has become entrenched in the pharma world, and provides much of the horsepower needed for such a task. Powerful graphical analysis tools like JMP have not been able to successfully enter the pharmacogenomic market due to design constraints. JMP stores data sets in memory, so the size of the data set you wish to work with is limited by the memory of your system. With the large data sets typical of pharmacogenomic data, this constraint is debilitating. JMP Genomics circumvents this problem by allowing JMP to work directly with SAS data sets on disk. This enables the graphical power of JMP to directly interact with the large data set capacity inherent in SAS, making for a powerful combination capable of effectively analyzing large pharmacogenomic data sets. JMP Genomics also presents a comprehensive array of tools to effectively analyze pharmacogenomic data sets. In this paper we will discuss the unique features JMP Genomics offers and demonstrate its effectiveness in analyzing large pharmacogenomic data sets, specifically single nucleotide polymorphisms (SNPs).

Author(s): Jason B Baucom ______


Paper PR03 Audience Level: intermediate Tuesday: 5:00 - 5:30 p.m. Room: Galleria North

Title: Translation of Drug Metabolic Enzyme and Transporter (DMET) Genetic Variants into Star Allele Notation Using SAS

Conversion of genotyping results from metabolic enzyme and transporter genetic assays into the consensus star-allele nomenclature is necessary for clinical utilization of DMET genetics. The problem involves translation of variant genotypes, such as single nucleotide polymorphisms (SNPs) or small insertion-deletion events, into gene-level star allele nomenclature. Genetic variants in the DMET genes can occur in combinations. In some instances a single variant defines a star allele; more often, a combination of variants does. When and how these variants combine is often poorly understood. However, given adequate definitions, a simple algorithm based on vector addition and comparison can be easily implemented in SAS PROC IML to perform the conversion. The simplicity and transparency of this algorithm can help demystify the translation process and allow statisticians and data management experts to handle this new type of patient information.
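
To make the idea concrete, here is a toy sketch of matching an observed variant vector against star-allele definition vectors in PROC IML; the allele definitions, labels, and observed vector are hypothetical, and this is not the authors' algorithm:

   proc iml;
      /* rows = candidate star alleles, columns = variant positions (1 = variant present) */
      defs   = {0 0 0,
                1 0 0,
                1 1 0};                     /* hypothetical *1, *2, *3 definitions */
      labels = {"*1" "*2" "*3"};
      obs    = {1 1 0};                     /* observed variant vector for one subject */
      match  = loc((defs = repeat(obs, nrow(defs), 1))[, +] = ncol(obs));
      if ncol(match) > 0 then
         print (labels[match]) [label="Assigned star allele"];
   quit;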

Author(s): Mark Farmen, Sandra Close, William J Koh ______

Paper PR04 Audience Level: intermediate Tuesday: 3:30 - 4:00 p.m. Room: Galleria North

Title: Computing Hazard Ratios and Confidence Intervals for a Two-Factor Cox Model with Interaction Terms

When interaction terms are included in a Cox Proportional Hazards model, it is very tricky to compute the hazard ratio (HR) and, especially, the confidence intervals. To compute confidence intervals using PROC PHREG, we can get the variance of the interaction terms by using the estimated covariance matrix of the parameter estimator. Using PROC TPHREG, however, dummy codes can be assigned, so that the CONTRAST statement can be used to obtain the estimates and confidence intervals for the interaction terms. In this paper, we describe the SAS code used for computation of parameter estimates of interaction terms and confidence intervals using PROC PHREG. Also, we describe how the CLASS and CONTRAST statements are used to code dummy variables and obtain estimates and confidence intervals using PROC TPHREG. An example of an application of these techniques from a Phase III randomized, multi-site HIV prevention trial conducted in the U.S. from 2003 to 2007 (HPTN039) is illustrated and discussed. The code described for PROC PHREG is adaptable to any SAS version, and the code described for PROC TPHREG is adaptable to SAS version 9.1.3 or later.
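
For orientation only (not the authors' code), the sketch below shows one way to request treatment hazard ratios at each level of a second factor when an interaction is present, using the CLASS and HAZARDRATIO statements available in PROC PHREG from SAS 9.2 onward; the data set and variable names are assumptions:

   proc phreg data=hiv;
      class trt (ref='Placebo') sex (ref='F') / param=ref;
      model time*event(0) = trt sex trt*sex / ties=efron;
      /* hazard ratio for treatment, reported separately within each level of sex */
      hazardratio 'Treatment effect by sex' trt / at(sex=all) diff=ref cl=wald;
   run;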

Author(s): Jing Wang, Marla J Husnik ______

Paper PR06 Audience Level: intermediate Tuesday: 4:30 - 5:00 p.m. Room: Galleria North

Title: Stacked Cumulative Percent Plots

Longitudinal data presents analysts with the unique challenge of summarizing information over time without losing the detail of pattern changes, spikes, or other irregularities. This paper introduces a graphical method to display ordinal response trajectories that allows you to compare the percent of responses in multiple categories. As a result, researchers can not only detect symptom improvements or declines over time, but they can also visualize when and why a mean increased or decreased. The analyses and output discussed were created with Base SAS Version 9.1 and SAS/GRAPH software and will hopefully be of interest to SAS users of all skill levels.

Author(s): Theresa M Gilligan

______


Regulatory Submissions & Standards

Co-chairs: Eunice Ndungu (Merck & Co Inc) Cindy Lee Hsiu-yung (Eli Lilly) ______

Paper RS01 Audience Level: intermediate Monday: 10:30 - 11:30 a.m. Room: Broadway III/IV

Title: The Second CDISC Pilot – A Metastandard for Integrating Databases

The first CDISC SDTM/ADaM pilot project created a test submission to the FDA using CDISC data and metadata standards in order to test that these standards meet FDA requirements (as described in a paper presented at SUGI 2007). The second CDISC pilot project tests the value of CDISC standards for creating integrated databases and supporting the new FDA safety review guidelines. This presentation describes (1) the collection and use of metadata in the second pilot and (2) a very important drafted extension to the define.xml schema that supports row-level metadata and fills a gap identified in the first pilot project. Metadata for the eight studies was populated in Excel and used to describe, build, and validate the databases and to automate the creation of the define.xml file. Row-level metadata allows the description of tall-thin data structures that exist in the SDTM data standard and that are becoming more prevalent in clinical databases. Metadata is an essential element of automating data flow and data mining. The define file and the row-level metadata specify a standard set of database attributes that should be used when creating and publishing any data standard or study database requirement. The well-designed ODM/define.xml metadata standard may turn out to have the most important positive impact that CDISC has made on the industry and on drug safety and efficacy! Recently, some in the CDISC community have been considering wiki-type methods of collecting data standards that will use metadata as the foundation and standard for proposed data standards.

Author(s): Gregory Steffens ______

Paper RS02 Audience Level: beginner Monday: 9:30 - 10:30 a.m. Room: Broadway III/IV

Title: The CDISC/FDA Integrated Data Pilot: A Case Study in implementing CDISC Standards to Support an Integrated Review

In alignment with the CDISC Technical Road Map released in 2008, the CDISC/FDA Integrated Safety Data Pilot was initiated to demonstrate that CDISC standards can improve the efficiency of FDA reviewers in conducting an integrated data review across clinical trials. The mission of the pilot is to demonstrate that a patient data submission created using CDISC Harmonized Standards will meet the needs and expectations of FDA reviewers in conducting an integrated safety review of data from multiple studies and compounds. This session will provide a case study of the Pilot including an overview of the objectives and progress of the project, a summary of the challenges in implementing the newest CDISC models including SDTM, ADaM, and the define.xml, and an overview of the efficiencies and limitations identified by FDA reviewers during the review process.

Author(s): Chris Decker, Steven Hirschfeld, Yuguang Zhao ______


Paper RS03 Audience Level: beginner Monday: 11:30 a.m. - 12:00 p.m. Room: Broadway III/IV

Title: A Practical Introduction to Clinical Data Acquisition Standards Harmonization (CDASH)

On October 1st, 2008, CDASH released the first 16 common CRF streams (or domains) for use by the Pharmaceutical Industry. The goal in putting together the initial data streams was to find out which data fields are essential to the analysis of clinical data, and to collect that data, and only that data, in a standard way. Because of this goal, there are clear definitions, CDISC mapping (if relevant), CRF completion instructions and additional sponsor information for each element. The focus of CDASH is on data collection, not data reporting. In some instances the optimal data collection method conflicts with the Study Data Tabulation Model (SDTM) for reporting data. In these cases additional transformations and derivations may be needed to create the final SDTM-compliant datasets. One way to think of it is that the SDTM standard is for presenting analysis data using Clinical Data Interchange Standards Consortium (CDISC) compliant specifications, and the CDASH standard is for the collection of data, which can be collected using CDISC-compliant rules if the data is collected in the same manner as it is reported. This presentation will show the details of CDASH and talk about how these standards are mapped to CDISC and SDTM and, more importantly, what information isn’t defined in CDASH that each sponsor will have to consider.

Author(s): Jennifer Price ______

Paper RS05 Audience Level: intermediate Tuesday: 5:00 - 5:30 p.m. Room: Broadway III/IV

Title: Nice SUPPQUAL Variables to Have

A Supplemental Qualifier (SUPPQUAL) data set is a special Study Data Tabulation Model (SDTM) data set that holds non-standard variables that cannot be mapped to any existing standard domain. Although non-standard, some variables can be very useful in data monitoring and standard reporting. This paper discusses some of those variables and the benefits of using them with the perspective of implementing SDTM in a linear fashion and early in the data management process.

Author(s): Beilei Xu, Changhong Shi ______

Paper RS07 Audience Level: intermediate Monday: 8:00 - 8:30 a.m. Room: Broadway III/IV

Title: An Implementation of CDISC SDTM and ADaM Standards at MedImmune

The Clinical Informatics group within MedImmune, LLC, has embarked on an implementation project for CDISC SDTM and ADaM data submission standards, known as CDISC-Lite. The main aspect of this implementation plan is to produce submission and analysis SAS data sets for use within the organization that can easily be converted to submission-ready, CDISC-compliant data formats. While both SDTM and ADaM call for data to be stored in a vertical structure, most people in the Clinical Development process are more “at home” with a horizontal structure. In a simplified view, CDISC-Lite’s goal is to produce SDTM and ADaM data sets that are approximately “one PROC TRANSPOSE away” from the standards’ requirements. This paper will provide details on the implementation plan and provide lessons learned from the on-going pilot project.

Author(s): Alan Meier ______


Paper RS08 Audience Level: intermediate Monday: 2:30 - 3:30 p.m. Room: Broadway III/IV

Title: Creating the Case Report Tabulation (CRT) for an NDA submission at the absolute last moment – NOT

If your company is anything like ours, you’ve waited until the very last possible moment to start assembling and building your CRT. Oftentimes it is quite a flurry of activity filled with late nights spent meeting technical challenges, gathering information, and preparing the documents in order to meet the schedule specified by your regulatory or publishing department. However, by waiting until the last minute, you have ensured that you have caught all the changes that happened during ad-hoc analysis or while fixing bugs or algorithms that didn’t get recognized beforehand. The pain is worth it! But is it necessary? At our company, the CRT was rarely on the statistical programmer’s radar during the study. We were often busy writing programs supporting data cleaning, mapping data to company standards, reconciling vendor transfers, providing analyses to data safety monitoring boards (DSMBs), supporting other departments such as Clinical Pharmacology, as well as providing standard tables, listings and figures (TLFs) for the clinical study report (CSR). We began to question the submission process. Who can benefit if we start earlier? What are the components of the CRT that we need to create? When is the right time for us to start? Where can we find hints or tips to do this? How do we build the CRT? And finally, why should we develop a new process for assembling CRTs? Answers to these and other questions will be discussed in the following paper.

Author(s): Steve Wong, Amanda Tweed, Christine Connolly, Kevin King ______

Paper RS09 Audience Level: intermediate Monday: 1:30 - 2:30 p.m. Room: Broadway III/IV

Title: Implementing CDISC When You Already Have Standards

How you choose to implement CDISC will be based on current company standards and their robustness, the amount of control you have at each step of the data flow process, your time vs. resource needs, and your willingness to change. Working in a company with established standards can both help and hinder CDISC adoption. Because you have standards in place that so many people are used to working with, it can be quite a hurdle to replace them with the CDISC standards. And depending on the structure of the current standards, they may look vastly different from CDISC. However, working within a larger organization also means you have people in place whose job it is to manage those standards, and getting those folks to embrace the CDISC standard can really drive it forward. This paper will describe what we’ve done and are still doing to implement CDISC, and offer tips that would be helpful to any organization looking to implement CDISC.

Author(s): Sandra Minjoe ______


Paper RS10 Audience Level: intermediate Monday: 4:30 - 5:30 p.m. Room: Broadway III/IV

Title: Converting the define.xml to a relational database to enable printing and validation

When submitting clinical study data in electronic format to the FDA, not only information from trials has to be submitted, but also information to help understand the data. Part of this information is a data definition file, which is the metadata describing the format and content of the submitted data sets. When submitting data in the CDISC SDTM format it is required to submit the data definition file in the Case Report Tabulation Data Definition Specification (define.xml) format as prepared by the CDISC define.xml team. CDISC has made a basic style sheet available on their website to display the define.xml in a human-readable form in a web browser. However, this style sheet is not suitable to generate a paper-based presentation of the define.xml. In January 2008 CDISC published a report that summarizes the work, experiences and findings of the first CDISC SDTM/ADaM Pilot Project. The objective of the pilot project was to test how well the submission of CDISC-adherent data sets and associated metadata met the needs and the expectations of both medical and statistical FDA reviewers. During the pilot, a major issue identified by the regulatory review team was the difficulty in printing the Define file. The regulatory review team for the pilot project emphasized that the ability to print the Define file would be essential for the future use of XML files. This paper addresses the printing issue and illustrates how the Define file (define.xml) can be converted to Adobe's Portable Document Format (PDF) by using Base SAS software, and also how the define.xml can be transformed into relational SAS data sets by using SAS XML Mapper technology. Once the data is available in SAS data sets it can be presented as a PDF file with the use of SAS ODS (Output Delivery System). This PDF file will include hyperlinks and bookmarks and can be easily printed. The relational SAS datasets can also be used to validate the metadata contained in the define.xml file against the metadata in the SAS transport files and the different types of SDTM dataset metadata: metadata for domain datasets, metadata for domain content (including value-level metadata), and metadata for controlled terminology.
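
As a rough illustration of the reading step described above — turning define.xml into relational SAS data sets with an XML map — the sketch below assumes a map file has already been created with SAS XML Mapper; the file names are hypothetical:

   filename defmap "define_crtdds.map";              /* XML map built with SAS XML Mapper (assumed) */
   libname  def xml "define.xml" xmlmap=defmap access=readonly;

   proc copy in=def out=work;                        /* one SAS data set per mapped table */
   run;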

Author(s): Lex Jansen ______

Paper RS11 Audience Level: intermediate Wednesday: 8:30 - 9:30 a.m. Room: Broadway III/IV

Title: Data Conversion to SDTM: What Sponsors Can Do to Facilitate the Process

An increasing number of sponsors are submitting clinical trials data in the format of the Clinical Data Interchange Standards Consortium (CDISC) Study Data Tabulation Model (SDTM). In many cases, however, the data was not collected, stored, or extracted from the database in the SDTM format. As a result, the data must be converted. In order to be SDTM compliant, the converted data must meet a number of structure and content requirements that may be new to some sponsors. These requirements are described in the SDTM, the SDTM Implementation Guide (SDTMIG), and in the published validation checks the FDA runs. This paper discusses some of the steps sponsors can take to facilitate the conversion of data collected in their traditional format to data in the SDTM format. For maximum efficiency, many of these steps should be undertaken at study setup or during the study (prior to database lock). There are, however, some actions that sponsors can take that will facilitate even late-stage (after database lock) conversions.

Author(s): Fred Wood

______

Panel #2 Audience Level: intermediate Wednesday: 9:30 - 11:00 a.m. Room: Broadway III/IV

Title: CDISC Implementation: A Panel Discussion

Facilitator: Sandra Minjoe

Overview: CDISC (Clinical Data Interchange Standards Consortium) has been gaining ground as an industry standard, and many companies have begun implementing various components of CDISC within their work streams. This panel discussion will focus on the suite of CDISC standards from data collection (CDASH), through operational data (ODM), tabulation data (SDTM), analysis data (ADaM), and finally submission (define.xml). Panelists will share how they learned the various CDISC standards and their implementation issues and successes. All panelists are from the industry, and include both sponsor and vendor companies. If you have begun implementing CDISC or are just starting to consider how that might be done, this panel discussion is for you!

Panelists: Matt Becker, Jennifer Price, Brian Shilling, Gregory Steffens

______


SAS® Institute Papers

Paper SA-AD-01 Audience Level: beginner Monday: 9:30 - 10:30 a.m. Room: Galleria South

Title: The ODS Menu for All Appetites and Applications

This paper describes how and when to harness the power of the various ODS destinations to generate fantastic applications and reports, a power that is especially evident when generating spreadsheets. Methods will be described to generate that perfect presentational output. This will include general formatting, creating styles with the addition of templates specifically generated for Excel output, and all those little things that make for great-looking output. Other menu items will include the following: - Arranging output on the page using various approaches such as ODS LAYOUT, the HTMLPanel and TableEditor tagsets, the report-writing features of the DATA step, customizing tables, and graphics enhancing output.

- Creating HTML that adds the ability to sort, freeze, reorder, and export data dynamically, as well as other dynamic features.

- Using ODS Markup to transform and modify output.

- Modifying objects using the ODS DOCUMENT facility to generate the desired table of contents and bookmarks.

- Style enhancements using a variety of methods, and more.

Author(s): Parker, Chevell

______

Paper SA-AD-02 Audience Level: beginner Monday: 4:30 - 5:30 p.m. Room: Galleria South

Title: Clinical Trial Reporting Using SAS/GRAPH SG Procedures

Graphics are a powerful way to display clinical trial data. By their very nature, clinical trials generate a large amount of information, and a concise visual presentation of the results is essential. Information about the patient population, drug dosages, clinical responses, and adverse events must be clear. Clinical laboratory results need to be presented within the context of acceptable limits, and subtle changes over time must be highlighted. This presentation will show by example how such graphs can easily be created using the SAS/GRAPH SG procedures. The techniques that will be emphasized in this presentation include: • Creation of a dose response plot by overlaying multiple plots in one graph • Construction of a hematology panel using treatment regimen and visit numbers as the classification variables • Presentation of a matrix of liver function tests (LFTs) for at-risk patients • Aggregation of data into “on the fly” classification variables using user-defined formats • Getting the axis you want using built-in “best fit” algorithms • Generation of publication-ready graphs in color and black and white.

Author(s): Schwartz, Susan L.

______


Paper SA-HW-01 Audience Level: beginner Tuesday: 8:00 - 10:00 a.m. Room: Parlor

Title: Tips and Tricks for Creating Multi-Sheet Microsoft Excel Workbooks the Easy Way with SAS®

Transferring SAS® data and analytical results between SAS and Microsoft Excel can be difficult, especially when SAS is not installed on a Windows platform. This paper discusses using the XML support in Base SAS® 9 software to create multi-sheet Microsoft Excel workbooks (versions 2002 and later). You will learn step-by-step techniques for quickly and easily creating attractive multi-sheet Excel workbooks that contain your SAS output, and also tips and tricks for working with the ExcelXP ODS tagset. Most importantly, the techniques that are presented in this paper can be used regardless of the platform on which SAS software is installed. You can even use them on a mainframe! The use of SAS server technology is also discussed. Although the title is similar to previous papers by this author, this paper contains new and revised material not previously presented.
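
A compact sketch of the general ExcelXP pattern the paper covers; the output file, sheet options, and the PROC shown here are illustrative assumptions rather than the paper's examples:

   ods _all_ close;
   ods tagsets.excelxp file="demog_summary.xml" style=statistical
       options(sheet_name='Demographics' embedded_titles='yes');

   title 'Baseline Demographics';
   proc print data=work.demog noobs label;
   run;

   ods tagsets.excelxp close;
   ods listing;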

Author(s): DelGobbo, Vince

______

Paper SA-HW-02 Audience Level: beginner Tuesday: 10:00 a.m. - 12:00 p.m. Room: Parlor

Title: A “SAS® Programmer’s Guide” to SAS® Enterprise Guide®

You have been programming in SAS for a while... You have been told “we are moving to Enterprise Guide and removing Display Manager”. You say... I can program everything just fine myself, thank you! OR What am I supposed to do with all these windows? This paper demonstrates how SAS programmers can use SAS Enterprise Guide as their primary interface to the SAS system while maintaining the flexibility of writing their own customized code. We will look at • how to navigate the views and menus • how SAS Enterprise Guide can be used as your primary SAS editor • how you can leverage the more complex built-in capabilities available in SAS Enterprise Guide to further enhance the information you deliver • some tips and tricks to get the most out of SAS Enterprise Guide. Enterprise Guide version 4.1 will be used.

Author(s): Rupinder Dhillon

______

Paper SA-PR-01 Audience Level: beginner Wednesday: 10:00 - 11:00 a.m. Room: Galleria North

Title: SAS and Life Sciences: Trends, Capabilities and Progress

This session will update the audience on SAS’ progress with its Life Sciences solutions. New products, new releases and progress since PharmaSUG 2008 will be presented.

Author(s): Handelsman, David

______


Paper SA-RS-01 Audience Level: beginner Monday: 8:30 - 9:30 a.m. Room: Broadway III/IV

Title: CDISC Implementation Strategy: SAS® Clinical Data Integration Success Factors

This paper documents best practices for implementing data transformation tools in the Life Sciences industry, including required process changes as well as pitfalls to avoid. The primary focus is the process of incorporating SAS® Clinical Data Integration Server for managing and standardizing clinical trial data, especially SDTM and CRT-DDS, for multiple external audiences. The features of a good clinical trials data management system will also be covered, including CDISC support, standards adherence validation, submission documentation support, and the ability to allow incremental validation checks. How to manage the challenges of implementing an enterprise solution and reap the benefits of the new solution will be covered. The paper will also provide tips for a successful migration strategy, including setting expectations, managing role changes, and communicating the new process and its benefits.

Author(s): Gibson, Bill

______

Paper SA-RS-02 Audience Level: beginner Monday: 3:30 - 4:30 p.m. Room: Broadway III/IV

Title: Using SAS Clinical Data Integration Server to Implement and Manage CDISC Standards

The SAS metadata server is a core component of all SAS 9 solutions. It delivers the power to integrate, share, centrally manage and leverage metadata across entire organizations. Through these capabilities, standard data models such as the CDISC Study Data Tabulation Model (SDTM) can be deployed and leveraged by all users in your organization without the need for developing additional metadata libraries or programs. In this paper, we examine the value that the SAS open metadata architecture can bring to your organization, how the SDTM data model is implemented in the metadata server, and how the metadata can be leveraged by SAS products and solutions such as Data Integration Studio.

Author(s): Kilhullen, Michael (presented by Andrew Fagan)

______

Paper SA-RS-03 Audience Level: beginner Tuesday: 4:00 - 5:00 p.m. Room: Broadway III/IV

Title: Supporting CDISC Standards in Base SAS Using the SAS Clinical Standards Toolkit

The use of regulatory standards aimed at clinical research data and metadata has become more commonplace over recent years. In the USA, the FDA has adopted standards from the Clinical Data Interchange Standards Consortium (CDISC) for submission of tabulation data (SDTM) and the main study metadata XML file (CRT-DDS). There is continuing work on new standards and updates to the existing ones. Implementing and managing these standards places an additional burden on SAS users, which translates to higher costs for companies as the standards are adopted. This paper will explain how SAS Clinical Standards Toolkit provides a framework of macro-based functionality to define, manage and verify that standards are being correctly followed in clinical data and metadata. It will explain how users can create their own standards and use them within the framework. Finally it will explain how updates to existing models will be supported as well as incorporation of emerging standards over time.

Author(s): Villiers, Peter

______


Paper SA-SP-01 Audience Level: beginner Wednesday: 9:00 - 11:00 a.m. Room: Broadway I/II

Title: Introduction to Logistic Regression

Logistic regression is one of the basic modeling tools for a statistician or data analyst. This tutorial focuses on the basic methodology behind logistic regression and discusses parameterization, testing goodness of fit, and model evaluation using the LOGISTIC procedure. The tutorial concentrates on binary response models, but direction for handling ordinal responses is also provided. This tutorial discusses numerous ODS graphics now available with the LOGISTIC procedure, as well as newer features of SAS 9.2 such as ROC comparisons and odds ratios with interactions. The tutorial includes numerous examples.
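
A minimal illustration (assumed data set and variables, not the tutorial's example) of a binary-response PROC LOGISTIC run with the SAS 9.2 ODS graphics the abstract mentions:

   ods graphics on;
   proc logistic data=trial plots(only)=(roc effect);
      class trt (ref='Placebo') sex / param=ref;
      model response(event='Yes') = trt age sex / lackfit;   /* Hosmer-Lemeshow goodness of fit */
   run;
   ods graphics off;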

Author(s): Stokes, Maura

______

Paper SA-TT-01 Audience Level: beginner Monday: 2:00 - 3:00 p.m. Room: Galleria North

Title: Inline formatting with ODS Markup

Explore the power of inline formatting in ODS, and learn about the super powers of inline formatting with ODS Markup and ODS MEASURED; see how easy it is to make your reports look better than ever before. With the new and improved syntax, ODS inline formatting is better and more powerful than ever before. Learn how to use it and extend it to do even more than you dreamed possible.
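
For a flavor of what inline formatting looks like in practice, here is a small, hedged example; the destination, style attributes, and text are arbitrary choices for illustration, not taken from the paper:

   ods escapechar='^';
   ods rtf file="inline_demo.rtf";

   title 'Lab Shifts ^{style [color=red font_weight=bold] (Draft)}';
   footnote 'See protocol section 9.2^{super 1} for definitions';

   proc print data=sashelp.class(obs=5) noobs;
   run;

   ods rtf close;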

Author(s): Gebhart, Eric

______

Paper SA-TT-02 Audience Level: beginner Tuesday: 8:00 - 9:00 a.m. Room: Galleria North

Title: Tiptoe Through the Templates

Are you confused about the difference between style templates, table templates, tagset templates and graph templates? Do you wonder how they're all used with ODS?

This paper provides an overview of all the different template types and how they're used with the Output Delivery System. From the style and table templates that first appeared with SAS 7 to the newest graph templates that appeared with SAS 9.2, this paper will provide an overview and several concrete examples for each template type. Along the way, we'll also discuss the template garden (SASHELP.TMPLMST) where all the templates live, how to transplant your templates to a different garden and come up with your own new variety of templates (PROC TEMPLATE), and how to find your way to the new template garden (ODS PATH). New features of PROC TEMPLATE syntax (such as the IMPORT statement) will be highlighted.

Author(s): Zender, Cynthia

______


Paper SA-CC-13 Audience Level: intermediate Tuesday: 8:30 - 8:45 a.m. Room: Broadway III/IV

Title: The Care and Feeding of SAS Macro Program Parameters

Most parameterized macros that I have either written or reviewed have taken a great deal of care to correctly perform the primary programming task that they were originally designed for. In retrospect, however, I would propose that most also do not spend the same effort in checking the values passed through their parameters. To help remedy this situation, a number of generalizable parameter value processing and validation examples are presented in this paper. The examples include checking for null values, numeric values, integer values, case-specific character values, existing file names, existing libnames, valid dataset names, and validity of the syntax of SAS statements passed as parameter input. Suggestions are also made to help the macro program developer communicate the requirements of the parameters to the user.
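
Two of the simplest checks of this kind are sketched below; the macro name, parameters, and messages are hypothetical, and the paper's examples go considerably further:

   %macro report(indsn=, alpha=0.05);
      /* null-value check: a required parameter was not supplied */
      %if %length(&indsn) = 0 %then %do;
         %put ERROR: The INDSN= parameter is required.;
         %return;
      %end;
      /* existence check: the named data set must exist */
      %if not %sysfunc(exist(&indsn)) %then %do;
         %put ERROR: Data set &indsn does not exist.;
         %return;
      %end;
      proc means data=&indsn n mean clm alpha=&alpha;
      run;
   %mend report;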

Author(s): Don Boudreaux

______


Statistics & Pharmacokinetics

Co-chairs: Guowei Wu (Merck & Co Inc) Xingshu Zhu (Merck & Co Inc) ______

Paper SP01 Audience Level: intermediate Tuesday: 4:00 - 4:30 p.m. Room: Broadway I/II

Title: Modeling the Treatment Effect on a Median of a Percent Change from Baseline in a Log-normal Variable Using SAS Procedure NLMIXED

Often, when a response percent change from baseline in a clinical parameter is not normally distributed, the medical community presents the results in terms of a median percent change, describing the treatment effect by the estimate of the difference in the treatment group medians. Such estimates are typically provided based on non-parametric analysis methods. However, if the response parameter itself is log-normally distributed, additional advantages can be derived from parametric modeling. Following Wong et al. [1], the ANCOVA model can be fit to the log-transformed data, upon which the estimated treatment means will be back-transformed to become the estimates of the treatment group medians of the percent change from baseline. The confidence interval for the difference in medians can be provided using the delta method [1]. The introduction of SAS Procedure NLMIXED [2] made this approach easily accessible, as the delta method-based estimates of the confidence limits are output by the procedure, eliminating the need for tedious coding. Other capabilities of Procedure NLMIXED, such as modeling the variance of the error as a function of the response, are also very useful. Finally, modeling with Procedure NLMIXED can be extended to longitudinal percent change from baseline data, thus aligning the approaches to analysis of normal and log-normal longitudinal data. These capabilities of Procedure NLMIXED are illustrated in examples (with the SAS code provided) arising from the analysis of lipid parameters, in particular log-normally distributed triglycerides data.

Author(s): Aiming Yang, Olga M Kuznetsova, Nancy Liu ______

Paper SP02 Audience Level: intermediate Tuesday: 10:30 - 11:00 a.m. Room: Broadway I/II

Title: The Existence of Maximum Likelihood Estimates for the Logistic Regression Model

The existence of maximum likelihood estimates for the binary response logistic regression model depends on the configuration of the data points in your data set. There are three mutually exclusive and exhaustive categories for the configuration of data points in a data set: complete separation, quasi-complete separation, and overlap. For this paper, a binary response logistic regression model is considered. A 2 x 2 tabular presentation of the data set to be modeled is provided for each of the three categories mentioned above. In addition, the paper will present an example of a data set whose data points have a linear dependency. Both unconditional maximum likelihood estimation (asymptotic inference) and exact conditional estimation (exact inference) will be considered and contrasted in terms of results. The statistical software package SAS will be used for the binary response logistic regression modeling.
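
When separation makes the usual asymptotic estimates unreliable, one standard SAS fallback is exact conditional inference via the EXACT statement of PROC LOGISTIC; the sketch below is illustrative only, with assumed data set and variable names:

   proc logistic data=smallstudy;
      class trt / param=ref;
      model response(event='1') = trt dose;
      exact trt / estimate=both;    /* exact conditional parameter estimate and odds ratio for TRT */
   run;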

Author(s): William F McCarthy, Nan Guo ______


Paper SP03 Audience Level: intermediate Tuesday: 11:00 - 11:30 a.m. Room: Broadway I/II

Title: Illustrative Logistic Regression Examples using PROC LOGISTIC: New Features in SAS/STAT® 9.2

PROC LOGISTIC has many useful features for model selection and the understanding of fitted models. The standard generated output will give valuable insight into important information such as significant variables and odds ratio confidence intervals. However, proper utilization of output files, graphical displays and relevant options can further enhance justification of model choice and understanding of model fit. In this fairly general paper, a variety of logistic regression topics such as model building, model fitting and the ROC curve will be reviewed. The discussion will introduce the “PLOTS=” option, as well as the ROCCONTRAST statement, as new features which are available in SAS/STAT® 9.2.

Author(s): Robert G Downer, Patrick J Richardson ______

Paper SP04 Audience Level: intermediate Tuesday: 3:30 - 4:00 p.m. Room: Broadway I/II

Title: Using PROC IML to Generate Weighted Chi-Square Statistics

While the availability of statistical tests and computations evolves, as do the procedures in SAS/STAT, SAS programmers are often confronted with the problem of performing a statistical analysis for which SAS/STAT does not readily lend itself as a solution. Hence, we must leverage other tools in SAS to produce the results. One of those tools is PROC IML, a powerful SAS module that performs matrix operations and can read and write SAS datasets. In this paper, we will demonstrate how we implemented IML to translate mathematical and statistical specifications into SAS programming statements, in order to generate point estimates and confidence intervals for a weighted chi-square test.

Author(s): Daniel M DiPrimeo, Junming Yang, Lisa H Price, Bruce E Johnston ______

Paper SP05 Audience Level: intermediate Wednesday: 8:00 - 8:30 a.m. Room: Broadway I/II

Title: Distribution Curve Graphics with Patterned Areas between Minimum and Maximum Ranges Using SAS®


Distribution plots with patterned areas help to show quickly the difference between two compared items. Patterned areas are under-documented for PROC GPLOT, since they are mainly used in pie charts; nevertheless, we found an innovative way to produce them. This paper explains how to plot a distribution graphic comparing the values of two items, using patterned areas of a different color for each item. The difficulties encountered with the SAS/GRAPH language in producing this graphic are discussed.

Author(s): Sylvain Cadieux ______


Paper SP06 Audience Level: intermediate Tuesday: 11:30 a.m. - 12:00 p.m. Room: Broadway I/II

Title: Detection of multiple outliers in univariate datasets

A number of methods are available to detect outliers in univariate data sets. Most of these tests are designed to handle one outlier at a time. As soon as an outlier is detected it is removed from the data set and the process is repeated until no more outliers are detected. The Grubbs and Dixon tests can handle, in some cases, more than one outlier at a time. However, in general, when multiple outliers are present, a masking effect (an outlier is not detected due to the presence of other outliers) may prevent outlier detection. PROC ROBUSTREG may appear to be a useful tool to detect multiple outliers. It has four types of estimation available (M, LTS, S, MM). Performance of PROC ROBUSTREG will be compared with sequential application of the Grubbs test, the 3-sigma rule, and the Weisberg t-test. SAS macros to implement multiple outlier testing will be presented as well.
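
As a trivial baseline for comparison (not the paper's macros), a 3-sigma flag for a single variable can be computed as in this sketch; the data set and variable names are assumptions:

   proc means data=assay noprint;
      var result;
      output out=_stats mean=_mean std=_std;
   run;

   data flagged;
      if _n_ = 1 then set _stats(keep=_mean _std);
      set assay;
      outlier3s = (abs(result - _mean) > 3*_std);   /* 1 = flagged by the 3-sigma rule */
   run;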

Author(s): Marek K Solak ______

Paper SP07 Audience Level: intermediate Tuesday: 2:30 - 3:00 p.m. Room: Broadway I/II

Title: Ad Hoc Analysis of Site-by-Treatment Interaction

In order to enroll patients more quickly, and to help establish the general efficacy of a new drug, multiple investigators may join a clinical trial. Each investigator constitutes a center or site. It is expected that sites will contribute a small effect in the results so that including site as a covariate in the analysis of treatment effects is a common means of increasing the power of a multi-center clinical trial. Controlling for Site and Site by Treatment effects in a linear model increases the power of analysis when those factors are found to contribute significantly to reducing the residual error of the model. There may be factors associated with site, such as patient demographics and medical history, as well as site characteristics which may not have been otherwise accounted for, which increase or decrease a response variable for all treatments or specific treatments. ICH Guideline E9 recommends that if treatment effects vary by site, this should be explained, requiring further analyses of an ad-hoc nature. These can follow two possible strategies: looking for sites which are outliers, and looking for sites which have commonalities with other sites. By starting with residuals of the treatment-only linear model, we assume that simple treatment effects are removed. The residuals are then averaged by site and treatment and displayed graphically to look for sites which are similar. Exploratory analysis may need to find correlative variables which explain the differences between sites. Several examples using data fabricated to illustrate site by treatment interaction will be used in this presentation.

Author(s): Alan B Davis ______

Paper SP08 Audience Level: beginner Tuesday: 1:30 - 2:00 p.m. Room: Broadway I/II

Title: A Little Stats Won't Hurt You

This paper gives an introduction to some basic but critically important concepts of statistics and data analysis for the SAS programmer who pulls or manipulates data, but who might not understand what goes into a proper data analysis. We first introduce some basic ideas of descriptive statistics for one-variable data, and then expand those ideas into many variables. We then introduce the idea of statistical significance, and then conclude with how all these ideas can be used to answer questions about the data. Examples and SAS code are provided.

Author(s): Nathaniel B Derby ______


Paper SP09 Audience Level: intermediate Tuesday: 5:00 - 5:30 p.m. Room: Broadway I/II

Title: ROC analysis for the evaluation of continuous biomarkers: Existing tools and new features in SAS® 9.2

Biomarkers have become essential tools for proper diagnosis and treatment of a wide range of illnesses, including cancer, diabetes, and infectious diseases. The growing need for rigorous evaluation of new biomarkers has spurred the development and characterization of statistical methods for diagnostic accuracy. The receiver operating characteristic (ROC) curve is the standard analytical tool for evaluating diagnostic tests. However, recent studies suggest widespread use of inappropriate statistical methods for ROC analysis. In addition, while enhancements in SAS 9.2 have greatly simplified basic ROC analysis for SAS users, many common ROC techniques still require extensive additional programming. This paper provides an overview of statistical methods for evaluating continuous biomarkers using ROC analysis. For each statistical technique, existing tools available in SAS for performing the task are described, with particular emphasis on methods not addressed in previous SAS papers. The paper also introduces new features for ROC analysis that are now available as a standard component of the LOGISTIC procedure in SAS 9.2. Statistical techniques addressed in the paper include the comparison of the area under the ROC curve (AUC) of two or more biomarkers, generation of confidence intervals for sensitivity given fixed specificity, and partial AUC analysis. Data from a previously published study of serum biomarkers for pancreatic cancer (Wieand et al., 1989) are used for illustration.

Author(s): Sanghyuk Shin ______

Paper SP10 Audience Level: intermediate Tuesday: 3:00 - 3:30 p.m. Room: Broadway I/II

Title: Confidence Intervals for the Binomial Proportion with Zero Frequency

Estimating a confidence interval for the binomial proportion is a challenge to statisticians and programmers when the proportion has zero frequencies. The most widely used method, based on Wald asymptotic statistics, gives a degenerate interval, that is, (0, 0), in this case. This paper gives an overview of the statistical methods used for estimating confidence intervals, which are all available in SAS. In addition, when calculating the frequency and intervals, SAS by default does not present the missing categorical level; this level has zero frequency but is no less important than the other categorical levels. This paper also builds a macro to share tips on how to create confidence intervals with zero frequency.
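
For orientation only (not the paper's macro), an exact Clopper-Pearson interval, which remains non-degenerate when the observed proportion is zero, can be requested as sketched below; the data set, variable, and level are assumptions:

   proc freq data=adverse;
      tables aeflag / binomial(level='Y');   /* asymptotic (Wald) interval by default */
      exact  binomial;                       /* adds the exact Clopper-Pearson interval */
   run;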

Author(s): Xiaomin He, Shwu-Jen Wu ______


Paper SP11 Audience Level: intermediate Tuesday: 4:30 - 5:00 p.m. Room: Broadway I/II

Title: Fitting Compartmental Models to Multiple Dose Pharmacokinetic Data using SAS® PROC NLMIXED

Pharmacokinetic (PK) modeling uses systems of ordinary differential equations derived from biological considerations along with statistical models to model the time course of drug in the body. The statistical model requires algorithms for fitting nonlinear mixed effects models. While the NLMIXED procedure in SAS is available, it does not allow for individual subject data to affect the structural form of the model. In the case of a multiple dose study where subjects experience different dosing times, a superposition principle can be used to recursively account for each additional dose. A SAS template program was written to manipulate data and then construct mean functions for fitting pharmacokinetic data from multiple dose studies. One-compartment models for oral dose administration are considered to illustrate the methodology and challenges of fitting multiple dose data using PROC NLMIXED. The template program contains sample SAS code for fitting two types of models: a general model and a simplified model. The general structural model can handle many situations, such as when a subject has irregular dosing intervals, changes dosage during therapy, or has non-ignorable differences between actual dosing time and scheduled dosing time. The simplified model is for the most common scenario in which all subjects are constrained to have the same multiple dosing schedules with regular dosing intervals and constant doses. Under each model, we also discuss how to handle missed dose problems.
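
For context, a single-dose one-compartment oral-absorption model in PROC NLMIXED often takes the form sketched below; the starting values, variable names, and random-effect structure are assumptions, and the multiple-dose superposition the paper describes builds on this kind of mean function:

   proc nlmixed data=pk;
      parms lcl=-3 lka=0.5 lke=-2 s2b=0.1 s2=0.5;      /* assumed starting values */
      cl = exp(lcl + b1);                               /* subject-specific clearance */
      ka = exp(lka);                                    /* absorption rate constant */
      ke = exp(lke);                                    /* elimination rate constant */
      pred = dose*ka*ke / (cl*(ka - ke)) * (exp(-ke*time) - exp(-ka*time));
      model conc ~ normal(pred, s2);
      random b1 ~ normal(0, s2b) subject=subject;
   run;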

Author(s): Jing Su, Xiaodong Li, Alan Hartford ______

Paper SP12 Audience Level: intermediate Tuesday: 2:00 - 2:30 p.m. Room: Broadway I/II

Title: Concepts and Ideas for Developing and Maintaining a Utility Toolbox of SAS Statistical Procedures

Most statistical analyses of clinical trials data require that one or more SAS statistical procedures be employed to determine significance of differences either between treatment groups or between points in time for a specific endpoint measurement. Oftentimes the shell SAS code necessary to implement such procedures is provided to the programmer either in the Statistical Analysis Plan (SAP) or in an analysis-related technical document. This is not the case however for all formal study analyses, and in these cases it can be helpful for the programmer if he/she has a library or toolbox of template statistical procedures at the ready. Of course, in order to possess such a utility the programmer must first develop and maintain the toolbox, as well as understand the procedures included therein. The purpose of this paper is to present some concepts for the development and maintenance of a SAS statistical procedures toolbox and to provide some examples of typical procedural code the SAS programmer might want to include. This presentation is based on SAS version 8.2 or above, is not limited to any particular operating system, and is intended for intermediate to advanced SAS programmers who have some familiarity with rudimentary SAS statistical procedures.

Author(s): David W Carr ______


Paper SP13 Audience Level: intermediate Wednesday: 8:30 - 9:00 a.m. Room: Broadway I/II

Title: Don't Be Loopy: Re-Sampling and Simulation the SAS® Way

The most common way that people do simulations and re-sampling plans in SAS® is, in fact, the slow and awkward way. People tend to think in terms of a huge macro loop wrapped around a piece of SAS code, with additional chunks of code to get the outputs of interest and then to weld together the pieces from each iteration. But SAS is designed to work with by-processing, so there is a better way. A faster way. This paper will show a simpler way to perform bootstrapping, jackknifing, cross-validation, and simulations from established populations. It is simpler and more efficient to get SAS to build all the iterations in one long SAS data set, then use by-processing to do all the computations at once. This lets us use SAS features to gather automatically the information from all the iterations, for simpler computations afterward.
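
A compact sketch of this pattern (not the paper's code; the data set, seed, and statistic are assumptions): generate all bootstrap replicates at once, then summarize them with BY processing:

   proc surveyselect data=mydata out=boot seed=20090601
                     method=urs samprate=1 reps=1000 outhits;  /* 1000 resamples with replacement */
   run;

   proc means data=boot noprint;
      by replicate;
      var change;
      output out=bootstats mean=mean_change;
   run;

   proc univariate data=bootstats noprint;
      var mean_change;
      output out=ci pctlpts=2.5 97.5 pctlpre=p_;   /* percentile-based 95% bootstrap interval */
   run;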

Author(s): David L Cassell ______


Technical Techniques

Co-chairs: Jim Johnson (RPS) Eric Larson ______

Paper TT01 Audience Level: intermediate Monday: 4:00 - 4:30 p.m. Room: Galleria North

Title: Using SAS Formats to Generate Code and Format Your Reports

The goal of this presentation is to demonstrate how a series of two complementary formats (i.e., one containing values and one containing labels) can be used in conjunction with each other to provide the building blocks (namely macro fields) for generating code within SAS. Benefits: 1. Dynamic coding within a program (code driven by expected values of the data) 2. Centralized location for establishing calculation parameters 3. Mechanism for porting consistent code across programs

Author(s): Marc Mucatel

______

Paper TT02 Audience Level: beginner Monday: 3:00 - 4:00 p.m. Room: Galleria North

Title: Check Your Data More Efficiently

%CHKDATA is a SAS macro program designed to check data in an efficient and user-friendly way. First, the macro can check the data structure by generating three types of key information: the contents of the dataset, its associated SAS formats, and a collection of all variable names listed horizontally. Second, the macro can generate the distinct values and frequency counts for any specified variables. Third, the macro can deal with data issues: it can define any potential data issues and generate the reports. In addition, %CHKDATA has one important special feature: the macro can work on multiple datasets at the same time. When processing multiple datasets, it combines the data structure information for each input dataset and lists them side-by-side in one report, so people can easily review and even compare the information among all input datasets. This is especially helpful for people working on data integration or anything where multiple datasets are used and compared. In summary, %CHKDATA is a very useful tool for anyone who wants to review and understand data quickly and correctly.

Author(s): Jian (Daniel) Huang

______

Paper TT03 Audience Level: advanced Tuesday: 2:00 - 2:30 p.m. Room: Galleria North

Title: Windows and Unix Computers Now Have Multiple CPUs; Why Not Control Two or Three or More Parallel Executing SAS Batch Jobs from One Master Job!

This paper will show the processes required to start multiple, parallel batch copies of SAS on the SAME Microsoft Windows or UNIX based computer. It may be easy to click the SAS shortcut button or type SAS in a command prompt window a few times, but those jobs are independent of each other, and use resources to update the screen images for the operator (you). The process proposed here allows one SAS job to start others as child batch processes, with no screen update requirements and with their own parameters, and to monitor their completion. This paper exposes some of the tricks required to get two (or more) batch copies of SAS running on a Windows or Unix computer.
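
One common way to launch and monitor child batch jobs from a controlling SAS session (a hedged sketch; this is not necessarily the author's approach, and the paths and task names are assumptions) is with SYSTASK and WAITFOR:

   /* launch two child batch SAS jobs without waiting for either one */
   systask command "sas C:\project\prog\tables_ae.sas  -log C:\project\logs\tables_ae.log"
           nowait taskname=job1 status=rc1;
   systask command "sas C:\project\prog\tables_lab.sas -log C:\project\logs\tables_lab.log"
           nowait taskname=job2 status=rc2;

   waitfor _all_ job1 job2;          /* block the master job until both children finish */
   %put NOTE: child job return codes were &rc1 and &rc2;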

Author(s): William E Benjamin Jr

______

Paper TT04 Audience Level: intermediate Tuesday: 11:00 - 11:30 a.m. Room: Galleria North

Title: SAS/GRAPH® Colors Made Easy

Creating visually appealing graphs is one of the most challenging tasks for SAS/GRAPH users. The judicious use of colors can make SAS graphs easier to interpret. However, using colors in SAS graphs is a challenge since SAS employs a number of different color schemes. These color schemes include the HLS, HSV, RGB, CMY(K) and gray-scale color schemes. Each color scheme uses a different, complex algorithm to construct a given color. Colors are represented as hard-to-understand hexadecimal values. A SAS user needs to understand how to use these color schemes in order to choose the most appropriate colors in a graph. This paper describes these color schemes in basic terms and demonstrates how to efficiently apply any color and its variations to a given graph without having to learn the underlying algorithms used in SAS color schemes. Easy and reusable examples to create different colors are provided. These examples can serve either as stand-alone programs or as the starting point for further experimentation with SAS colors. SAS also provides the CNS and SAS registry color schemes, which allow SAS users to use plain English words for various variations of colors. These underutilized schemes and their color palettes are explained and examples are provided.
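For example, under the RGB scheme a color is simply a CXrrggbb hexadecimal value, and a lighter variation comes from moving each component toward FF (a hedged sketch, not code from the paper):

goptions reset=all;

pattern1 color=CX336699;   /* RGB: medium blue               */
pattern2 color=CX99BBDD;   /* lighter variation of the same  */
pattern3 color=GRAY77;     /* a gray-scale value             */

proc gchart data=sashelp.class;
  vbar sex / patternid=midpoint;   /* one pattern color per bar */
run;
quit;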

Author(s): Max Cherny

______

Paper TT05 Audience Level: beginner Tuesday: 9:00 - 10:00 a.m. Room: Galleria North

Title: The MEANS/SUMMARY Procedure: Getting Started

The MEANS/SUMMARY procedure is a workhorse for most data analysts. It is used to create tables of summary statistics as well as complex summary data sets. The user has a great many options which can be used to customize what the procedure is to produce. Unfortunately most analysts rely on only a few of the simpler basic ways of setting up the PROC step, never realizing that a number of less commonly used options and statements exist that can greatly simplify the procedure code, the analysis steps, and the resulting output. This tutorial begins with the basic statements of the MEANS/SUMMARY procedure and follows up with introductions to a number of important and useful options and statements that can provide the analyst with much needed tools. With this practical knowledge, you can greatly enhance the usability of the procedure and then you too will be doing more with MEANS/SUMMARY.
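A small getting-started example along the lines the tutorial covers: summary statistics by a CLASS variable, written both to the listing and to an output data set (the SASHELP.CLASS data set is used purely for illustration):

proc means data=sashelp.class n mean std min max maxdec=1;
  class sex;
  var height weight;
  output out=class_stats mean= std= / autoname;   /* Height_Mean, Weight_StdDev, ... */
run;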

Author(s): Art Carpenter

______

Paper TT06 Audience Level: intermediate Tuesday: 10:00 - 11:00 a.m. Room: Galleria North

Title: The MEANS/SUMMARY Procedure: Doing More

The MEANS/SUMMARY procedure is a workhorse for most data analysts. It is used to create tables of summary statistics as well as complex summary data sets. The user has a great many options which can be used to customize what the procedure is to produce. Unfortunately most analysts rely on only a few of the simpler basic ways of setting up the PROC step, never realizing that a number of less commonly used options and statements exist that can greatly simplify the procedure code, the analysis steps, and the resulting output. This tutorial covers a number of advanced and lesser known statements and options of the MEANS/SUMMARY procedure and follows up with introductions to a number of important and useful options and statements that can provide the analyst with much needed tools. With this practical knowledge, you can greatly enhance the usability of the procedure and then you too will be doing more with MEANS/SUMMARY.
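One of the lesser-known statements alluded to above is TYPES, which requests only the CLASS combinations you actually need; a brief hedged sketch using SASHELP data:

proc summary data=sashelp.class;
  class sex age;
  var height weight;
  types () sex sex*age;              /* overall, by SEX, and by SEX*AGE only */
  output out=stats
         mean(height)=mean_ht
         median(weight)=med_wt;      /* different statistics for different variables */
run;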

Author(s): Art Carpenter

______

Paper TT07 Audience Level: intermediate Tuesday: 2:30 - 3:00 p.m. Room: Galleria North

Title: Automating the pooling of variables across multiple datasets using Proc SQL and SAS Macro

Complex study designs may result in multiple raw datasets being created, with information that is normally present in one dataset spread across two or more datasets. This necessitates the creation of a common dataset. To create this common dataset, information first needs to be gathered across studies, and a spreadsheet containing the target variables can be created in EXCEL. An algorithm using Proc SQL and SAS Macro is described that reads the spreadsheet and automatically pools the data. The advantage, especially for ISS/ISE studies, is that code need not be written separately for individual studies.

Author(s): Sanjiv Ramalingam ______

Paper TT08 Audience Level: intermediate Monday: 4:30 - 5:00 p.m. Room: Galleria North

Title: Transposing Data Using PROC SUMMARY'S IDGROUP Option

When clinical data are stored with multiple observations per subject, a common task is to rearrange the data so there is only one observation per subject. That single observation contains all or part of the information previously spread over multiple observations. Such rearranging of data is commonly done with either PROC TRANSPOSE or with a data step that often contains one or more arrays. There is a little-used alternative to these methods, the IDGROUP option in PROC SUMMARY. The method is little used because the task is not made explicit in the documentation for the procedure and it has not been described in any papers that the authors could find at SUGI, SAS Global Forum, or various regional user group meetings. This paper describes several situations and shows the SAS code needed for using the IDGROUP option in PROC SUMMARY as an alternative to the more common methods for rearranging data.
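A minimal hedged sketch of the IDGROUP approach (the data set VITALS, the subject identifier USUBJID, and the assumption of at most three VALUE records per subject are all hypothetical):

proc summary data=vitals nway;
  class usubjid;
  output out=wide (drop=_type_ _freq_)
         idgroup( out[3] (value)=val );   /* creates VAL_1-VAL_3, one row per subject */
run;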

Author(s): John H King, Mike S Zdeb ______

Paper TT09 Audience Level: intermediate Monday: 1:30 - 2:00 p.m. Room: Galleria North

Title: Adverse Events of Special Interest and MedDRA Upgrades: A Dilemma and Proposed Solution

Adverse Events of Special Interest are those events thought to be [potentially] associated with the investigational compound or disease under study. Reporting on Adverse Events of Special Interest is an emerging and ever more critical aspect of characterizing the safety profile of a compound. Standard MedDRA Queries (SMQs), or groupings of terms that relate to a defined medical condition or area of interest, exist to select all events similar in some way (e.g., Malignancies). However, SMQs may be either too general or, in some cases, too specific for a project team to use ‘straight out of the box’. It is also possible that there are not enough existing MedDRA SMQs to accommodate the needs of a project team. In addition, as Events of Special Interest (ESI) lists become more numerous and detailed, the potential coding changes caused by MedDRA updates every six months necessitate that the team re-review these lists twice a year (after each upgrade) in order to ensure the ESI lists remain accurate. As a result, this paper discusses a SAS macro developed specifically to make the identification, selection and maintenance of Adverse Events of Special Interest easier. Specifically, this macro was created to (1) create a single vertical-structure SAS dataset from multiple ESI Excel worksheets, and (2) identify any MedDRA coding differences in Events of Special Interest between different versions of MedDRA.

Author(s): Todd J Case ______

______

Paper TT10 Audience Level: intermediate Tuesday: 11:30 a.m. - 12:00 p.m. Room: Galleria North

Title: Configuring SAS to work for you, Part 2: More Programming Shortcuts

Several years ago, I presented “DO NOT EDIT BELOW THIS LINE” (Configuring SAS to work for you). That paper contained a variety of tips for tweaking SAS options to change the way it works so that programmers can be more efficient. This paper presents new tips and tricks that I have learned and/or researched over the past three years. Most of these are Windows-based SAS tricks and tips. This isn’t intended to present any new or undocumented features; rather, it is a compilation of programming shortcuts and hints.

Author(s): Kim Truett ______

Paper TT11 Audience Level: intermediate Monday: 5:00 - 5:30 p.m. Room: Galleria North

Title: Multiple graphical and tabular reports on one page, multiple ways to do it

Creating different kinds of reports to present the same data sounds like a normal day job, but it becomes a tougher job to generate all of those different kinds of reports on a single page. In the world of clinical trials, it is sometimes important to have the graphical output of data sitting next to the tabular output of the same data on the same page to better understand the information captured in the data. This is generally a multiple-frames, one-window kind of report, and it is hard to get such a report through conventional SAS® programming, which usually gives listing-style reports. In such cases, the Output Delivery System (ODS) is greatly helpful in overcoming the limitations of traditional SAS output. Various features of ODS, new powerful SAS Version 9 capabilities, and the functionality of output destinations like HTML and PDF enable anyone to get multiple graphs, charts, maps, listings and tables on a single page. The use of ODS tagsets to create HTML panels on a single page with the ActiveX functionality of SAS is described in this paper in detail. The use of SAS/GRAPH® options which control the size of a graph, in combination with ODS, is also explained to insert multiple graphs on a single PDF page. Finally, the new ODS LAYOUT capability of SAS is explained to insert multiple tabular reports on a single PDF page. This paper is intended for new to intermediate level SAS users working on Windows/UNIX platforms.
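A brief hedged sketch of the ODS LAYOUT idea for PDF (LAYOUT was still pre-production in SAS 9.1/9.2, and the file name and region sizes are placeholders):

ods pdf file='combined.pdf' notoc;
ods layout start width=7in height=9in;

ods region x=0in y=0in width=7in height=4in;
proc print data=sashelp.class(obs=5) noobs;   /* tabular output in the top region */
run;

ods region x=0in y=4.5in width=7in height=4in;
goptions device=png xpixels=900 ypixels=500;
proc gchart data=sashelp.class;               /* graph in the bottom region */
  vbar age;
run;
quit;

ods layout end;
ods pdf close;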

Author(s): Niraj J Pandya ______

Paper TT12 Audience Level: beginner Tuesday: 3:00 - 3:30 p.m. Room: Galleria North

Title: Automating Section 14 in Clinical Study Reports

This paper presents an innovative approach of using ODS with the RTF destination to automatically generate a customized Table of Contents in WORD with hyperlinks to hundreds of tables from Section 14 in Clinical Study Reports (CSRs). This approach has dramatically reduced the number of errors that are inevitable in a manual process, greatly expanded the contents feature in comparison with the HTML approach, and significantly reduced the amount of time spent in CSR writing.

Author(s): Haibin Shu ______

______

Paper TT13 Audience Level: intermediate Tuesday: 1:30 - 2:00 p.m. Room: Galleria North

Title: Getting More from the Compute Block to Enhance Table and Listing Output for Clinical Trials Reporting

The REPORT procedure has become a popular tool of choice by a number of SAS programming groups in the pharmaceutical research industry for producing tables and listings (TLs). One component of PROC REPORT that can be utilized to great advantage in creating and enhancing TL output is the Compute Block. Among its many capabilities, the Compute Block can be used to create titles and footnotes, to create column headers, and to insert text strings into various sections of the report. The purpose of this paper is to present some examples, with relevant discussion points, of applying the Compute Block to enhance or help produce desired program output that follows the requirements outlined in a study’s Statistical Analysis Plan (SAP). This presentation is based on SAS version 8.2 or above, is not limited to any particular operating system, and is intended for intermediate SAS programmers who have some familiarity with the REPORT procedure.
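For example, a compute block can write header and footer text directly into the report body; this is a minimal sketch using a SASHELP data set rather than study data:

proc report data=sashelp.class nowd headline;
  column sex age height;
  define sex    / group;
  define age    / analysis mean 'Mean Age'    format=5.1;
  define height / analysis mean 'Mean Height' format=6.1;
  compute before _page_ / style={just=l font_weight=bold};
    line 'Table 14.x.x  Summary of Age and Height by Sex';
  endcomp;
  compute after;
    line 'Note: Height is reported in inches.';
  endcomp;
run;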

Author(s): David W Carr ______


Genentech Named One of FORTUNE's "100 Best Companies to Work For" for Eleventh Consecutive Year (January 22, 2009)

This year Genentech is number seven on the list. In selecting the "100 Best Companies to Work For," FORTUNE relies on two criteria: an evaluation of the policies and culture of each company, and the opinions of the company's own employees. The latter is deemed most important; two-thirds of the total score comes from employee responses to a survey created by the Great Place to Work Institute. The remaining one-third of the score comes from FORTUNE's evaluation of each company's demographic makeup, compensation, benefits programs and culture. A driving factor for the selection of companies on this year's list was that they excelled at creating jobs. Genentech is the only company in the biotech industry that has appeared on the list for eleven consecutive years.

Genentech Receives Clean Air Award

March 31, 2008 -- I'm pleased to inform you that Genentech's gRide Program has received the 2009 Clean Air Award for Transportation in recognition of our efforts to reduce dependence on fossil fuels, the generation of local air pollution and greenhouse gases. The gRide program encompasses a wide range of services including the GenenBus, DNA Shuttle and rewards programs that encourage employees to spare the air. The award is given by Breathe California, a non-profit organization that offers services addressing lung health issues in local communities. This award is a testament to the company's commitment to provide reliable and rewarding commute services to employees. Due to these efforts, Genentech continues to be a model for other companies aiming to reduce their impact on our local air quality as well as their overall carbon footprint.

Genentech Among Best Companies for Working Moms (September 24, 2008)

Working Mother magazine named Genentech one of the "100 Best Companies for Working Mothers." The list identifies companies that are "using company-wide benefits and programs to ensure the retention and advancement of working mothers." For this year's 100 Best, the magazine gave particular weight to family-friendly programs, flexibility, leave policies and benefits for part-timers. In compiling the list, the magazine's editors assessed work/life programs including: workforce profile; compensation; child care; flexibility; time off and leaves; family-friendly programs; and company culture. This is the 16th year Genentech has appeared on the list.

Tutorials

Co-chairs: Nancy Brucken (i3 Statprobe) Xiaohui Wang (Novartis) ______

Paper TU01 Audience Level: intermediate Monday: 9:00 - 10:00 a.m. Room: Broadway I/II

Title: Demystifying PROC SQL® Join Algorithms

When it comes to performing PROC SQL joins, users supply the names of the tables for joining along with the join conditions, and the PROC SQL optimizer determines which of the available join algorithms to use for performing the join operation. Attendees explore nested loop joins, merge joins, index joins, and hash join algorithms along with selective options to control processing. This presentation illustrates the various join processes including Cartesian Product joins, inner joins, and outer joins, as well as an explanation of the join algorithms.
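One easy way to see which algorithm the optimizer picks is the _METHOD option, which writes the query plan to the log (sqxjm = sort-merge join, sqxjhsh = hash join, sqxjndx = index join, sqxjsl = step-loop join); a small self-join sketch:

proc sql _method;
  create table both as
  select a.name, a.age, b.height
    from sashelp.class as a
         inner join sashelp.class as b
      on a.name = b.name;
quit;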

Author(s): Kirk Paul Lafler

______

Paper TU02 Audience Level: beginner Monday: 8:00 - 9:00 a.m. Room: Broadway I/II

Title: Sweet Smell of SucSAS - When Your Data Step Does What is Intended!!

The fundamentals of SAS programming lie in DATA step programming. The essence of DATA step programming is to understand how SAS processes the data during the compilation and execution phases. In this paper, you will be exposed to what happens “behind the scenes” while creating a SAS dataset. You will learn how a new dataset is created, one observation at a time, from either a raw text file or an existing SAS dataset, to the program data vector (PDV) and from the PDV to the newly-created SAS dataset. Once you fully understand DATA step processing, the SUM and RETAIN statements become easier to grasp. Related to this topic, this paper will also cover BY-group processing.
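A compact illustration of the SUM statement and BY-group processing described above (a generic sketch on SASHELP data, not material from the paper):

proc sort data=sashelp.class out=class;
  by sex;
run;

data totals;
  set class;
  by sex;
  if first.sex then total_weight = 0;   /* reset the accumulator at the start of each BY group */
  total_weight + weight;                /* SUM statement: the variable is implicitly RETAINed  */
  if last.sex then output;              /* keep one summary record per BY group                */
  keep sex total_weight;
run;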

Author(s): Arthur X Li

______

Paper TU03 Audience Level: intermediate Monday: 3:30 - 4:30 p.m. Room: Broadway I/II

Title: A Cup of Coffee and Proc FCMP: I Cannot Function Without Them

How many times have you tried to simplify your code with LINK/RETURN statements? How much grief have you put yourself through trying to create macro functions to encapsulate business logic? How many times have you uttered "If only I could call this DATA Step as a function"? If any of these statements describe you, then the new features of PROC FCMP are for you. If none of these statements describe you, then you really need the new features of PROC FCMP. This paper will get you started with everything you need to write, test, and distribute your own "data step" functions with the new (v9.2) PROC FCMP. This paper is intended for beginner to intermediate programmers, although anyone wanting to learn about PROC FCMP can benefit.
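A minimal SAS 9.2 sketch of the workflow: define a function with PROC FCMP, point CMPLIB= at its library, and call it from a DATA step (the function and data set names are made up for illustration):

proc fcmp outlib=work.funcs.demo;
  function bmi(weight_kg, height_cm);
    return( weight_kg / (height_cm/100)**2 );
  endsub;
run;

options cmplib=work.funcs;   /* tell the DATA step where to find the compiled function */

data check;
  bmi_val = bmi(70, 175);    /* roughly 22.9 */
run;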

Author(s): Peter W Eberhardt

______

______

Paper TU04 Audience Level: intermediate Monday: 11:00 a.m. - 12:00 p.m. Room: Broadway I/II

Title: Things Dr Johnson Did Not Tell Me: An Introduction to SAS® Dictionary Tables

SAS maintains a wealth of information about the active SAS session, including information on libraries, tables, files and system options; this information is contained in the Dictionary Tables. Understanding and using these tables will help you build interactive and dynamic applications. Unfortunately, Dictionary Tables are often considered an ‘Advanced’ topic to SAS programmers. This paper will help novice and intermediate SAS programmers get started with their mastery of the Dictionary tables.
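Two quick ways to get started, via PROC SQL against DICTIONARY.TABLES or via the corresponding SASHELP views (a generic sketch, not code from the paper):

proc sql;
  select memname, nobs, nvar
    from dictionary.tables
   where libname = 'SASHELP' and memtype = 'DATA'
   order by memname;
quit;

data work_columns;
  set sashelp.vcolumn;            /* DATA step view of DICTIONARY.COLUMNS */
  where libname = 'WORK';
  keep libname memname name type length;
run;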

Author(s): Peter W Eberhardt

______

Paper TU05 Audience Level: beginner Monday: 10:00 - 11:00 a.m. Room: Broadway I/II

Title: PROC REPORT: Compute Block Basics (Part I - Tutorial)

One of the unique features of the REPORT procedure is the Compute Block. Unlike most other SAS procedures, PROC REPORT has the ability to modify values within a column, to insert lines of text into the report, to create columns, and to control the content of a column. Through compute blocks it is possible to use a number of SAS language elements, many of which can otherwise only be used in the DATA step. While powerful, the compute block can also be complex and potentially confusing. This tutorial introduces basic compute block concepts, statements, and usages. It discusses a few of the issues that tend to cause folks consternation when first learning how to use the compute block in PROC REPORT.
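For instance, a compute block can build a column that does not exist on the input data; a small sketch on SASHELP data (the kilogram conversion is illustrative only):

proc report data=sashelp.class nowd;
  column name weight weight_kg;
  define name      / display;
  define weight    / analysis 'Weight (lb)';
  define weight_kg / computed 'Weight (kg)' format=6.1;
  compute weight_kg;
    weight_kg = weight.sum * 0.4536;   /* analysis columns are referenced as column.statistic */
  endcomp;
run;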

Author(s): Art Carpenter

______

Paper TU06 Audience Level: beginner Monday: 4:30 - 5:30 p.m. Room: Broadway I/II

Title: Building Your First Dashboard Using the SAS® 9 Business Intelligence Platform: A Tutorial

A dashboard is a visualization technique that provides an immediate view or snapshot of exactly where you are in a specific process relative to your stated goals and objectives. Visual indicators such as temperature gauges, traffic lights and speedometers help give you a real-world sense of your present progress and assist you in making decisions, adapting to current conditions or drilling into more detailed information. In a previous paper (Wright, 2008), we outlined technologies that can be used to build dashboards; here we further that discussion with information that will help you build your first dashboard. We then walk through the process of building your first dashboard step by step, beginning with design considerations and SAS options for creating dashboards, then continue by defining some key performance indicators (KPIs), connecting our data, customizing the visual indicators and learning more about ways to make the dashboard actionable through drill-down and click-through. Finally, we conclude with a discussion of additional customizations that can be performed with the SAS Enterprise Business Intelligence Platform.

Author(s): Greg Nelson

______

______

Paper TU07 Audience Level: intermediate Monday: 1:30 - 2:30 p.m. Room: Broadway I/II

Title: Building the Better Macro: Best Practices for the Design of Reliable, Effective Tools

The SAS® macro language has power and flexibility. When badly implemented, however, it demonstrates a chaos-inducing capacity unrivalled by other components of the SAS System. It can generate or supplement code for practically any type of SAS application, and is an essential part of the serious programmer's tool box. Collections of macro applications and utilities can prove invaluable to an organization wanting to routinize work flow and quickly react to new programming challenges. But the language's flexibility is also one of its implementation hazards. The syntax, while sometimes rather baroque, is reasonably straightforward and imposes relatively few spacing, documentation, and similar requirements on the programmer. In the absence of many rules imposed by the language, the result is often awkward and ineffective coding. Some amount of self-imposed structure must be used during the program design process, particularly when writing systems of interconnected applications. This paper presents a collection of macro design guidelines and coding best practices. It is written primarily for programmers who create systems of macro-based applications and utilities, but will also be useful to programmers just starting to become familiar with the language.
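In the spirit of those guidelines, a small hedged example of a documented, defensive utility macro (the macro name, parameters, and checks are illustrative and not the author's):

%macro freqreport(data=, var=, out=work.freqout);
  /*------------------------------------------------------------
    Purpose : one-way frequency table written to a data set
    Params  : DATA = input data set       (required)
              VAR  = variable to tabulate (required)
              OUT  = output data set      (default WORK.FREQOUT)
  ------------------------------------------------------------*/
  %if %length(&data) = 0 or %length(&var) = 0 %then %do;
    %put ERROR: [FREQREPORT] DATA= and VAR= are both required.;
  %end;
  %else %do;
    proc freq data=&data noprint;
      tables &var / out=&out;
    run;
  %end;
%mend freqreport;

%freqreport(data=sashelp.class, var=sex)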

Author(s): Frank DiIorio, Presented by Jeff Abolafia

______

Paper TU08 Audience Level: intermediate Monday: 2:30 - 3:30 p.m. Room: Broadway I/II

Title: PDF Can be Pretty Darn Fancy: Tips and Tricks for the ODS PDF Destination

We're not too far removed from the days when presentation-ready SAS® output meant lots of cutting and pasting or retyping. Now, the ODS PDF destination enables you to produce high quality output the first time, without other tools or applications. This paper explores a number of ODS options in general and, more specifically, their use in creating PDF output. We will cover ODS ESCAPECHAR, which allows for inline formatting of titles, footnotes and other text; ODS LAYOUT, which lets you place output wherever you want it on the page - even output from more than one procedure; inline formatting in PROC REPORT; overlaying output from SAS/Graph procedures on top of other output; and more. We'll work from real life examples and see how you can produce output that looks like it took hours to create.
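A small hedged sketch of ODS ESCAPECHAR inline formatting in a PDF (the file name and text are placeholders; the ^{style} and page functions shown are SAS 9.2 syntax):

ods pdf file='fancy.pdf' style=printer;
ods escapechar='^';

title 'Enrollment Summary ^{style [color=red font_weight=bold] (Draft)}';
footnote j=r 'Page ^{thispage} of ^{lastpage}';

proc freq data=sashelp.class;
  tables sex;
run;

ods pdf close;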

Author(s): Pete Lund

______

______

Paper TU09 Audience Level: intermediate Tuesday 9:00 – 10:30 a.m. Room: Broadway I/II

Panel Discussion: The Creation and Validation of Analysis Datasets: A Panel Discussion
Facilitator: Jeffrey Abolafia
Overview: This panel discussion will focus on the process of creating and validating analysis datasets. Analysis datasets are a key component of clinical studies. Analysis datasets are used as input to produce statistical analyses and data displays. Although not required, it is expected that analysis datasets will accompany tabulation datasets when submitting data to the FDA. Also, analysis datasets facilitate the review process by FDA statistical reviewers. Given the significance of analysis datasets, the panel discussion will focus on practices for creating and validating analysis datasets, which are key to producing statistical analyses and to a successful submission.

Topics to be discussed will include:

The rationale for creating analysis datasets. What business goals do they address? What were you hoping to improve?
Do you use analysis datasets for Tables? Listings? Patient Profiles?
How do you decide what analysis datasets to create? What is the process of designing the analysis database?
Who decides what datasets to create: programmers or statisticians?
How are the specifications for analysis datasets developed and communicated? Are specifications typically conceptual or programming statements?
Who programs analysis datasets: programmers or statisticians?
Do you use internal or external standards when creating analysis datasets?
Do you currently use (or are you currently planning to use) the ADaM standard when creating analysis datasets?
What methodologies do you use to validate analysis datasets?
Who validates analysis datasets: programmers or statisticians?

______

Paper TU10 Audience Level: beginner Tuesday: 8:00 - 9:00 a.m. Room: Broadway I/II

Title: XML for SAS® Programmers

XML (the eXtensible Markup Language) is an open standard for the definition, transmission, validation, and interpretation of data. The standard was developed by the Worldwide Web Consortium (W3C) in order to provide a simple and efficient way to manage self-documenting data files. SAS® Software includes a number of useful tools for creating and parsing XML data. These include: the XML libname engine, which is used to import and export documents in XML format into or from a SAS dataset, optionally using an XMLMap (the new XML92 engine now supports XMLMaps on output as well as input); the XML Mapper application, a graphical interface to analyze the structure of an XML document or an XML schema and generate XML syntax for the XMLMap; and the ODS MARKUP output destination, which is used to create XML from SAS output (ODS MARKUP creates but does not read XML documents). This paper will introduce these concepts and present some brief examples of each. In addition, bonus topics include using PROC CDISC, the CDISC XML formats, and creating XML files for Microsoft Office using the ExcelXP ODS tagset.
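As a quick taste of the XML libname engine described above (the file name is a placeholder and this is a generic sketch, not the author's example):

libname outxml xml 'class.xml';

data outxml.class;    /* export: SAS data set -> XML document */
  set sashelp.class;
run;

data back_in;         /* import: XML document -> SAS data set */
  set outxml.class;
run;

libname outxml clear;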

Author(s): Frederick Pratter

______

Pre-Post Conference Seminars

Seminar #1: Saturday May 30, 2009, 8:00 AM – 12:00 PM

TITLE: CDISC De-mystified: An Introduction to CDISC (Part I)

INSTRUCTOR: Greg Nelson

OVERVIEW: This workshop will guide participants through the various CDISC standards and outline what they are, what they mean (to the FDA, to BioPharmas and to CROs) and how systems, technologies, processes and roles may change as a result of these new standards.

PREREQUISITE: None

AUDIENCE/LEVEL: Statistical programmers and statisticians

DESCRIPTION: The Clinical Data Interchange Standards Consortium (CDISC) is a nonprofit organization with a mission to develop and support global, platform-independent data standards that enable information system interoperability to improve medical research and related areas of healthcare. These standards are being endorsed by regulatory organizations, such as the FDA, to use for the submission of clinical trial data.

There are a number of standards that have been developed and these include SDTM, SEND, ADaM, ODM, LAB, and CDASH. Other standards, such as BRIDG, are being developed in concert with other organizations (notably, HL7 and NCI). CDISC supports the development of controlled terminology and glossaries. In addition, there are other standards in development, including the Protocol Representation.

This workshop will guide participants through this bewildering set of standards and outline what they are, what they mean (to the FDA, to BioPharmas and to CROs) and how systems, technologies, processes and roles may change as a result of these new standards. Here, we will uncover the key principles of each of these models, how companies handle the business processes today, and discuss implications and impacts for your work as a SAS programmer, biostatistician, technologist or manager within the BioPharma ecosystem. Finally, we will end the discussion with some high level approaches that are available to SAS programmers for implementing these standards.

Additionally, we will share with you the results of a survey that was conducted across all of the major pharmaceutical and biotech companies, which talks about where they are with implementing CDISC and the lessons learned during these implementations.

The goal of this workshop is to provide a detailed examination of each of these standards and leave the participant with a firm understanding of the standards and how these fit together to help improve the overall state of clinical trials research processes within and among organizations.

BENEFITS OF TAKING THIS SEMINAR: The goal of this workshop is to provide a detailed examination of each of these standards and leave the participant with a firm understanding of the standards and how these fit together to help improve the overall state of clinical trials research processes within and among organizations.

Seminar #2: Saturday May 30, 2009, 8:00 AM – 12:00 PM

TITLE: Testing and Validating SAS Programs in an FDA Regulated Environment

INSTRUCTOR: Neil Howard & Brian Shilling

OVERVIEW: Testing and Validating SAS Programs in an FDA Regulated Environment

PREREQUISITE: Basic knowledge of SAS and clinical trials programming

AUDIENCE/LEVEL: Managers, programmers, analysts, validation programmers, statisticians who program or oversee programmers, database programmers, QC/QA staff

DESCRIPTION: This seminar will focus on testing and validating SAS programs, particularly in the context of an FDA regulated environment, with special emphasis on SAS tips and techniques to facilitate this process:

• WHY: examination of FDA regulations and guidances; exploration of reviewers’ expectations (processes and accountability); implications of audits; and discussion of client requirements and specifications
• WHAT: interpretation of the guidances; definition of the terms testing, debugging, verification and validation; test plans; and discussion of the types of things that must be validated
• WHO: accountability in pharmaceutical companies and CROs
• WHEN: planning and timing of testing and validation
• WHERE: documentation specifics and tips
• HOW: SAS and programming tips and techniques (for programmers and statisticians) for debugging, testing, and validation of production and ad hoc code for tables, listings, figures and graphs, and derived data sets; syntax, logic, requirements checking; error handling

The SAS system is easy to use and the learning curve to productivity is relatively short. But SAS is easy to abuse. Indisputable facts remain: data is seldom clean, logic is too often faulty, and fingers walk clumsily over keyboards. Condition codes and a ‘clean log’ are not accurate indicators of successful programs.

Since as much as 80% of a programmer's time is invested in testing and validation, it's important to focus on tools that facilitate correction of different types of errors in SAS programs. The workshop focuses on a variety of SAS features, tips, techniques, tricks, and system tools that can become part of your routine testing methodology consistent with 21 CFR Part 11 and other FDA guidances.

BENEFITS OF TAKING THIS SEMINAR: To get tips on creating and documenting a validation process for your organization.

Seminar #4: Saturday May 30, 2009, 1:00 PM – 5:00 PM

TITLE: Practical CDISC: Implementing CDISC with SAS (Part II)

INSTRUCTOR: Greg Nelson

OVERVIEW: Having developed a solid foundation of what CDISC is and an in-depth look at the various data stan- dards in Seminar #1, this second session covers the practical implementation of what it means to use CDISC stan- dards.

PREREQUISITE: None

AUDIENCE/LEVEL: Statistical programmers and statisticians

DESCRIPTION: Having developed a solid foundation of what CDISC is and an in-depth look at the various data standards, this second session covers the practical implementation of what it means to use CDISC standards. In the previous workshop, we talked about the macro level factors that influence your decision to use the standards and helped you decide architecturally which options make the most sense for you. In this workshop, the participants will learn how to implement the SDTM standards and the associated define.xml file with SAS. Included will be a discussion of important issues to consider when creating SDTM datasets and metadata so that harmonization with ADaM is maintained.

Here, we will review the various options that could be used to develop your strategy including: what tools are available for data mappings, how SDTM relates to your existing stat/programming environment, and how you generate a define.xml using both commercial software options and creating your own.

BENEFITS OF TAKING THIS SEMINAR: The goal of this workshop is to provide a more detailed look at SDTM and how to design and build an environment that takes advantage of the standards. Attendees will gain a firm understanding of the principles of SDTM and define.xml characteristics and gain ideas on how to implement them within their processes.

Seminar #5: Saturday May 30, 2009, 1:00 PM – 5:00 PM

TITLE: Advanced ODS

INSTRUCTOR: Chris Olinger

OVERVIEW: A half-day advanced ODS seminar covering topics relating to Styles, Tagsets, and getting the most out of ODS.

PREREQUISITE: Basic knowledge of ODS and how ODS is used is required

AUDIENCE/LEVEL: SAS Report Writers with Medium to Advanced SAS skillsets

DESCRIPTION: This half-day advanced ODS seminar will cover topics relating to getting the most out of ODS Styles, reporting techniques using the Base SAS reporting procedures, ODS Tagsets, and using SAS to interface with alternative reporting environments. It will also touch on some of the newer technology in SAS version 9.2, such as differences from and enhancements to 9.1, Statistical Graphics, and ODS Style enhancements. Course attendees will be provided with course materials including a CD of all examples.

BENEFITS OF TAKING THIS SEMINAR: The course attendee will leave the course with a greater understanding of ODS, what is available in SAS 9.2 for report writers, and an understanding of what ODS is truly useful for.


SEMINAR #6 – Saturday May 30, 2009, 1:00 PM – 5:00 PM

TITLE: SAS & XML

INSTRUCTOR: Zender, Cynthia

OVERVIEW: This seminar provides an introduction to basic XML concepts and then provides examples of how XML is used in the “real world”.

DESCRIPTION: Everybody is talking about XML and you want to know what all the buzz is about. And, you want to know what it means for you. Perhaps you’ve heard about CDISC XML, or you work for a financial institution and you send account or loan information to auditors in XML format, or maybe you’ve heard the web team talk about delivering web pages as XML. And now, you want to know more about XML and, in particular, SAS and XML.

This seminar provides an introduction to basic XML concepts and then provides examples of how XML is used in the “real world”. In addition, the methods of creating XML files with SAS will be covered, including how to create an XML file from a SAS dataset using the SAS XML Libname engine and how to create an XML file from SAS Procedure output using ODS and the default XML tagset template. In addition, two separate examples of creating custom tagset templates will be covered. One example will generate a custom tagset template for use with the SAS XML Libname Engine. The second example will create an XML file from PROC FREQ and then use a custom tagset template and an XSL file to transform the file from XML to HTML.

A brief example of PROC CDISC will be used to contrast the nature of the types of XML files that it uses with the types of XML files created by default with the LIBNAME engine and ODS. In addition, a demo of the XML Mapper application will show how to convert a non-standard XML file into SAS format or convert a SAS file to XML format.


Seminar # 7: Sunday May 31, 2009, 8:00 AM – 12:00 PM

TITLE: Advanced Reporting and Analysis Techniques Used in the Clinical Setting: It's Not Just About The PROCs!

INSTRUCTOR: Art Carpenter

OVERVIEW: Learn a variety of techniques that will make your SAS programming more productive

PREREQUISITE: Basic understanding of the DATA and PROC steps

AUDIENCE/LEVEL: Intermediate

DESCRIPTION: There are literally hundreds of techniques used on a daily basis by the users of SAS® software as they perform analyses and generate reports. Although often obscure, most of these techniques are relatively easy to learn and generally do not require specialized training before they can be implemented. Unfortunately a majority of these techniques are used by only a very small minority of the analysts and programmers. They are not used more frequently because a majority of SAS users have simply not been exposed to them. Left to ourselves, it is often very difficult to ‘discover’ the intricacies of these techniques and then to sift through them for the nuggets that have immediate value.

This half day course presents a series of those nuggets. It covers a broad range of SAS topics that have proved to be useful to the intermediate and advanced SAS programmer who is involved with the analysis and reporting of clinical data. The intended audience is expected to have a firm grounding in Base SAS. For most of the covered topics, the course will introduce useful techniques and options, but will not ‘teach the procedure’.

The course includes options and techniques associated with:

MEANS/SUMMARY – the extras
REPORT Compute Blocks
Output Delivery System (ODS)
Operating System Interfaces
DATA Step Functions and Options

BENEFITS OF TAKING THIS SEMINAR: Most of these techniques and tools will have an immediate impact on your SAS programming efforts.

SEMINAR #8 – Sunday May 31, 2009, 8:00 AM – 12:00 PM

TITLE: Best Practices in Base SAS Coding

INSTRUCTOR: Powers, Bill

OVERVIEW: ***** This seminar will be taught by SAS training staff. More details on this seminar and the instructor will be provided later. *****

DESCRIPTION: With so many techniques to accomplish the same task in SAS, how do you choose between them? We'll look at some benchmarks run on common coding techniques to help you determine which technique to use when.


SEMINAR #9 – Sunday May 31, 2009, 1:00 PM – 5:00 PM

TITLE: Leadership 101: What Everyone Should Know About Managing SAS Programmers

INSTRUCTOR: Greg Nelson

OVERVIEW: What Everyone Should Know About Managing SAS Programmers

PREREQUISITE: Interest in skills for managing programmers

AUDIENCE/LEVEL: IT, technical and business professionals, including team leaders, managers, directors and oth- ers who want to enhance their leadership skills.

DESCRIPTION: In today’s environment, we see tremendous amounts of uncertainty and change - whether that be economic, cultural or social change - and it is critical to improve yourself by increasing your effectiveness as a leader and be ready to meet the difficult challenges in your organization, your industry and your career.

While there are a plethora of leadership and management training seminars, we take a slightly different approach by focusing on the practical perspectives that a person has when thinking about leading a project, a person or a team - the people, processes and organizational characteristics that surround us at any one time. These perspectives surround us on all sides and include our ability to:

Manage upwards: Nearly everyone has to operate in an environment where you have to carry out the mission of your organization and your management. Here, we will seek to understand your role in terms of the organization’s goals, priorities, constraints, culture and measurements and how you can use that information as a leader.

Manage laterally: There are times when you have to get along with people you don’t manage but still have to get the work done. Here we will discuss how you can effectively manage people and projects, and how to build a working relationship characterized by teamwork and effective collaboration.

Manage your employees: From hiring the right people to managing poor performance, the role of managing employees is challenging and often rewarding. Here, we will cover how you can create a successful environment for you and for the employee by having good processes for team communication, problem solving, and conflict resolution.

Manage yourself: One of the most effective tools that any manager can have is a good handle on how you lead yourself. Specifically, we are talking about how you know when you’re doing a good job, whether you are on the right track, whether you are setting (and meeting) your goals, what kind of problem solver you are, and how you conduct yourself - it’s all about your character and how you show it as you lead upwards, sideways and with your employees.

BENEFITS OF TAKING THIS SEMINAR:

• Learn how to manage SAS programmers
• Practice problem solving with real-world exercises
• Take away tools that you can use to evaluate your employees’ and/or team members’ skills and interests


SEMINAR #10 – Sunday May 31, 2009, 1:00 PM – 5:00 PM

TITLE: Using SAS/GRAPH® Software in Clinical Trials

INSTRUCTOR: Art Carpenter

OVERVIEW: Learn the basic techniques needed to get started using SAS/GRAPH in the clinical setting

PREREQUISITE: Basic understanding of the DATA and PROC steps

AUDIENCE/LEVEL: Beginner

DESCRIPTION: This half day introductory workshop covers the essentials necessary for getting started with SAS/GRAPH. The workshop assumes no knowledge of SAS/GRAPH, and even SAS programmers with minimal experience with the DATA and PROC steps will benefit from the instruction.

While the topics are not limited to clinical trials studies, the topics have been selected specifically to address the fundamental issues encountered by the clinical programmer.

The course includes procedures, options, and techniques that will assist you to:
Set up your graphics environment using specific graphics options
Enhance text including titles, footnotes, legends, and labels
Create scatter plots and histograms
Create, modify, and control axes and legends
Specify plot symbols, lines, colors, and fill patterns
Combine multiple plots using templates of your own creation
Create plots specifically for publishing to the web or to a word processor

BENEFITS OF TAKING THIS SEMINAR: Bypass the sometimes long learning curve experienced by those starting out using SAS/GRAPH.

SEMINAR #11 – Wednesday June 3, 2009, 1:30 PM – 5:30 PM

TITLE: Insights into ADaM

INSTRUCTOR: Matt Becker & Susan Fehrer

OVERVIEW: This seminar will give a brief overview of the CDISC SDTM structures and move into a more in-depth review of the ADaM structure’s benefits and construct

PREREQUISITE: CDISC SDTM knowledge

AUDIENCE/LEVEL: Intermediate

DESCRIPTION: ADaM (Analysis Dataset Model) is meant to describe the data attributes, such as structure, content, and metadata, that are typically found in clinical trial analysis datasets. The ADaM models are built from the CDISC SDTM baseline. In this seminar, we will briefly cover the CDISC SDTM 3.1.1 model and then move into an in-depth review of ADaM. The goal is to provide attendees with knowledge of the ADaM model, how it relates to the CDISC SDTM base, and how it may help in reducing FDA review time.

BENEFITS OF TAKING THIS SEMINAR: The benefit of taking this seminar is an understanding of the ADaM model and how to develop ADaM-like analysis datasets for clinical trials.


SEMINAR #12 – Wednesday June 3, 2009, 1:30 PM – 5:30 PM

TITLE: Merging, Combining and Subsetting SAS Data Sets (Tricks, Traps, and Techniques)

INSTRUCTOR: Malachy (Mal) Foley

OVERVIEW: This seminar covers a wide range of topics related to data set manipulation, including over 30 common traps in manipulating files. Most of these traps produce erroneous results with no SAS message of any kind!

PREREQUISITE: Working knowledge of the SAS DATA Step

AUDIENCE/LEVEL: All Programmers who use the DATA Step to manipulate data sets

DESCRIPTION: This seminar covers a wide range of topics related to data set manipulation, such as… interleaving, subsetting, concatenations, the IN= data set option, BY groups, FIRST.variable, program data vectors (PDV), finding duplicate records, collapsing files, overlapping variables, random access, Cartesian products, one-to-one merges, match merges, and fuzzy merges. Moreover, it discusses over 30 common traps in manipulating data sets. Most of these traps produce erroneous results with no SAS message whatsoever!

The seminar starts with the basics and continues to build up to more complex examples of data set manipulation. The only prerequisite for the course is a working knowledge of the SAS DATA Step. Yet, this workshop will give intermediate and advanced programmers a great review and some surprises. Come see what mysteries lurk in manipulating SAS files!

BENEFITS OF TAKING THIS SEMINAR: At the end of the seminar, participants should be able to use the DATA Step to manipulate files with increased productivity and data integrity.

SEMINAR #13 – Thursday June 4, 2009, 8:00 AM – 5:00 PM ****** This is a day-long class ******

TITLE: Advanced Techniques in the SAS® Macro Language

INSTRUCTOR: Art Carpenter

OVERVIEW: Learn macro language techniques beyond simple code substitution and basic statements

PREREQUISITE: Attendee should have a basic understanding of simple macro language statements as well as the DATA and PROC steps.

AUDIENCE/LEVEL: Intermediate to Advanced

DESCRIPTION: This one day course is designed for students with a good understanding of the DATA and PROC steps who already understand the basic structure and syntax of the SAS Macro Language. The course will start with a short review of the macro basics and quickly move on to topics selected to improve your macro language expertise. Several key macro functions will be introduced, explained and demonstrated. Course topics include:
Macro Language Review
Macro Functions, including quoting and bridging functions
User Created Functions
Writing Dynamic Code
Controlling Your Environment
Working With SAS Data Sets
SAS Macro Libraries
Numerous Other Macro Topics


Sponsors

Many thanks to the following Corporate Sponsors. They help make this conference possible.

Platinum Sponsors

Gold Sponsors


Silver Sponsors

Bronze Sponsors

Moving Beyond the Data


Exhibitions and Demonstrations

MONDAY and TUESDAY, 9:00 AM – 12:00 PM, 1:30 PM - 5:00 PM; WEDNESDAY, 9:00 AM - 11:30 AM
Visit the exhibitors in the Grand Ballroom to see what they have to offer you and your organization. Many have give-aways at their booths and/or you can enter to win prizes. Participating exhibitors will provide live demos and descriptive literature for their products. We are grateful to this year’s exhibitors:

ASG, Inc.
Assent Consulting
Benchwokzz
ClinPlus
CLINPROB LLC
COMSYS Clinical
DataCeutics, Inc.
eClinical Solutions/Eliassen Group
Johnson & Johnson
KFORCE Clinical Research
i3 Statprobe
MAJARO InfoSystems, Inc.
MaxisIT, Inc.
MedFocus LLC
Placemart Personnel Service
Smith Hanley
TAKE Solutions Inc.
Texas A&M University

Exhibitor Logos
