Office of Sponsored Programs Simulation Study

Bartholomew, Benkula, Conley, Grossman, Hourihan, Jewell, Long
5/14/2009

Executive Summary

Background and Purpose

The University of Idaho is a research institution that receives approximately eighty million dollars a year in grants, contracts, and cooperative agreements. However, the current system that the Office of Sponsored Programs (OSP) and the Principal Investigators (PIs) use to process proposals is inefficient, which results in denied proposals. This report specifically addresses the pre-award process through simulation in Arena software. The goal of the simulation study was to provide the University of Idaho's Office of Sponsored Programs with a decision support tool to assist in evaluating the front-end proposal submission process. Specifically, the model will assist in identifying system constraints and methods of exploiting those constraints in an effort to improve customer service, provided that the PI submission process also improves.

Scope and Methods

The scope of this project was confined to researching, modeling, and simulating the current pre-award process of the Office of Sponsored Programs. Validation of the model was then conducted to ensure that the model matches what currently occurs in the pre-award process. Once the model was validated, bottlenecks were identified, and realistic changes were made in the model in an attempt to increase throughput. A presentation of the team's findings was made upon completion of the project. This is a pilot project, and it was therefore limited to the pre-award portion of the Office of Sponsored Programs' award process. The project was completed before the end of the spring 2009 semester. The methods used to analyze the pre-award process included interviews with OSP employees, data gathering at OSP, PI surveys, utilization of Arena's Output Analyzer, and other outside research.

Findings

Results from the analysis show that OSP cannot currently handle the number of proposals that flow through the pre-award process without expediting the majority of them. This situation is due to the uneven flow and quality of PI submissions and to the pre-approval process currently used. Several different models were developed based on current business rules and other possible solutions.

Recommendations/ Conclusions

Two recommendations target the current problems of late submittals and unhappy PIs. First, adding two additional SPAs would alleviate having to expedite proposals. Second, utilizing workflow software in the OSP and interlocking the PI process with the pre-approval process would decrease the process times in the pre-award process by a projected fifty percent without having to hire additional SPAs.

Table of Contents

Executive Summary
Model Assumptions and Constraints
Process Flow and Modeling
Entity Arrival
Administrative Assistant (AA) – Proposal Receiving Process
Sponsored Program Administrator (SPA) – Review Process
Polly – Approval Process
SPA – Re-review Process
SPA – Submission Process
AA – Terminating the Review Process
Resource Scheduling
Entity Types
Proposal Due Dates
Proposal Values
Process Step Completion Time Distributions
Costing
Model Verification
Base Model Outputs
Model Variations
Additional Suggestions
Further Validation
Data Discrepancies
Data Corrections
New Model Variations

Scope and Deliverables

The University of Idaho is a research institution that receives approximately eighty million dollars a year in grants, contracts, and cooperative agreements. However, the current systems and business rules employed at the University result in submittal issues. These systems include the PI process of creating the submittals and the submittal process itself. This has resulted in proposals being expedited through the office on a regular basis and in unhappy customers. This project is focused on the OSP process, as that process is in the office's direct control. Business rules relevant to the interaction between the PI process and the OSP process are also explored.

Therefore, this project will model the existing pre-award process in Arena software in an effort to spot system problems and then recommend solutions. Specifically, the model will assist in identifying system constraints and methods of exploiting those constraints in an effort to improve customer service and throughput, thus increasing the number of grants received and the funding for the University. Currently, some of the initial contributing factors identified are issues with prioritization, issues with electronic submission, and a lack of personnel relative to the current workload.

In order to fulfill the project objectives, a simulation model was developed, incorporating the actual operational data and information collected from the OSP. The simulation utilizes actual process sequences and decision logic to represent office operations accurately.

The development of this model required time, information, and work. Initially, a thorough understanding of the OSP work flow was necessary to conceptualize and develop an accurate representation of this system. To do this task completely, the inputs and outputs of the model needed to be defined and the model output needed to be verified, both that the output was as anticipated with the given model, and that the model mirrored the actual processes of OSP.

The simulation project is deemed a success if the simulation is validated and at least one realistic suggestion for improvement is identified based on changes in the simulation model. The team researched and modeled the current pre-award process that OSP utilizes. Validation of the model was then conducted to ensure that the model matched what currently occurs in the pre-award process. Once the model was validated, bottlenecks were identified, and realistic changes were made in the model in an attempt to increase successful throughput. A presentation of the team's findings was made upon completion of the project. This was a pilot project, and it was therefore limited to the pre-award process of OSP.

Proposal arrivals for the model were based on 2008 data provided by an OSP representative. Due to gaps in the data, random values were generated to appropriately fill gaps in the dates. The outputs were based on one year. The model ran through 30 replications of that year to produce 95% confidence intervals and provide greater statistical confidence in the mean number awarded, number late, and other values produced by the model. Approximately 1249 proposals flowed through the model in one year.

Model Assumptions and Constraints

In order to gain an understanding of the capabilities of the model, it is important to understand the constraints and assumptions that governed the modeling process. These assumptions allowed us to model OSP's pre-award process as accurately as possible, but may also limit the abilities of the model. By explaining the assumptions and constraints of the model, OSP will better understand the limitations of our assessment. This information is also necessary if the model is ever expanded, or if it is used for further analysis.

It was first assumed that 50% of an Administrative Assistant’s (AA’s) time was devoted to the pre-award process. The remaining 50% of the AA’s time was allotted to other duties, including tasks in the post-award process. When modeling the pre-award process, we did not utilize the AA past this 50% mark, as this represents the current amount of time the AA spends on the process. It was also assumed that only one AA worked at a time. There have been instances where multiple AAs have worked together on the pre-award process, but with the recent hiring of a full time AA, modeling one AA seemed to be the most accurate representation of the current pre-award process.

It was assumed that no pre-proposal entered the system as a draft. In the data received, there were a few instances when a pre-proposal was received as a draft. However, there were questions regarding the accuracy of these data points because the received dates for the draft of the pre-proposal, the pre-proposal, and the full proposal were all the same. Therefore, we did not model drafts of pre-proposals as entities. This assumption has no bearing on the outcomes of the model. Another assumption made was that no proposal was worked on before the Received dates listed in the provided data. This assumption was not actually correct, based on the interviews conducted with the Sponsored Program Administrator (SPA) and with the AA. There are instances when a SPA will receive and begin work on a proposal prior to the AA logging the proposal as received. The impact of our original assumption was that the distribution used to apply a due date to the entities in our model likely produces days until due that are lower than those realized at OSP. Lower due dates result in less processing time available before proposals are technically late, resulting in more late proposals than actually occur at OSP. Specific data regarding the frequency of SPAs working on proposals before they are stamped as received was not available, so the original due date distribution was used in the model.

In the original model, it was assumed that no SPA ever helped another SPA. Although SPAs do help one another, OSP has no specific business rule as to when this occurs. Due to this fact, we could not model SPA assistance in the base model. Additionally, in the model SPAs worked on a proposal until it was completed and passed on to the next process. At OSP, however, SPAs will often work on a piece of a proposal and then work on another proposal. This procedure was not included in the model, as we concluded that the process times provided by the SPAs would reflect this information (if it is faster to work on pieces of a proposal at a time, then the overall process times provided by the SPAs would be lower).

While working on the project, there were multiple data and communication issues. When requesting data from OSP, it was difficult to clearly express our needs. Part of this confusion is attributed to the fact that we did not know exactly what data was needed because of our limited experience with Arena. We also ran into some problems with the data we received. We made initial assumptions about the data received in order to insert it into the model. Initially, we incorrectly determined that the data provided regarding proposal review times included only process times. Later we learned that the data actually included wait times, process times, and times when workers were not at work (i.e. vacation, weekends, and evenings). While this problem was easily corrected, such issues made the process of gathering correct information longer than we had hoped.

Data from 2008 was used to determine that approximately four percent of all incoming proposals have already been awarded by the requesting agency. We were also informed that only 25% of the incoming proposals have a due date specified by the submitting PIs. In order to determine the due dates of the remaining 75% of proposals, the AA stamps a due date of four days in the future on the proposals when received. Although some of the proposals were given to a SPA to work on before the AA physically stamped the due date, we were not able to determine how often this happened and made the assumption that it was not significant enough to impact the model. Furthermore, from the provided 2008 data, it was determined that 80% of the proposals submitted to OSP are funded (see the Model Validation section below for further discussion of this data). This mirrors what we calculated in our original model. However, after talking with OSP staff, we learned that the funding rate is actually about 40%. This error was corrected in the updated model, which better reflects OSP's actual process.

Entity Types

Based on the data available, we identified 33 different types of proposals received by OSP. Proposals are differentiated because they go through different processes, each proposal type's process times are different, and OSP desires that they be tracked separately. We identified three categories from which the range of types was determined. The first category includes drafts, full proposals, drafts that return as proposals, pre-proposals, pre-proposals that return as proposals, and proposals that were already awarded funding before OSP received them (the PI circumvented the system). The second category distinguishes proposals that are submitted electronically from proposals that are submitted in hard copy. The final category is based on the SPAs, as each SPA processes proposals from different Colleges.

Process Flow and Modeling

Process flow identifies the process steps through which a proposal travels from start to finish. In the case of OSP, proposals arrive and the majority flow from the Administrative Assistant (AA), to the SPAs, to Polly, and back to the AA before being submitted. Any deviation from this flow is discussed in the following sections.

Entity Arrival

OSP does not receive a constant inflow of proposals. Instead, OSP's demand mirrors that of a restaurant, where there are times of low demand and times of extremely high demand. In order to reflect OSP's actual process, it was necessary to capture this unpredictability in demand. To facilitate proposal arrival, we used data provided by OSP to match the exact pattern of arrivals realized by OSP in 2008. The data provided by OSP ran from part of 2007 through the beginning of 2009 and included a column for proposal received dates. Some of the proposals in the data sheet had no received date allocated to them. To ensure that the arrival pattern was as accurate as possible, we determined the percentage of proposals received in 2008 out of the total number of proposals that had recorded received dates. We then multiplied that percentage by the number of proposals in the data that lacked received dates, giving us the number of proposals that did not have received dates but were likely received in 2008. We followed the same process to determine the spread of proposals throughout the months of 2008. Finally, we used a random number generator in Excel to determine the date and time of arrival to assign to these proposals. This process resulted in 1249 proposals being modeled as received by OSP in 2008.

In Arena, we modeled a create module that enters all 1249 proposals into the system at the beginning of the simulation run. Once created, the entities go into a hold module, where they are stored until a read-write module instructs the hold to release an entity based on the Excel arrival schedule it references. This schedule was created from the actual arrival times from 2008.
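For readers without Arena at hand, the following is a minimal sketch of the create/hold/release arrival pattern described above, written in Python with the SimPy library rather than Arena; the arrival times are placeholders, not values from the 2008 schedule.

```python
import simpy

# Placeholder arrival times in hours from the start of the year, standing in
# for the Excel schedule referenced by the model's read-write module.
arrival_times = [5.0, 26.5, 27.25, 51.0]

def release_proposals(env, times):
    """Hold all pre-created proposals and release each at its scheduled time."""
    for i, t in enumerate(sorted(times)):
        yield env.timeout(t - env.now)  # wait until the next scheduled arrival
        print(f"Proposal {i + 1} enters the pre-award process at hour {env.now:.2f}")

env = simpy.Environment()
env.process(release_proposals(env, arrival_times))
env.run()
```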

Administrative Assistant (AA) – Proposal Receiving Process

To begin the process, a proposal is received by an OSP representative. The AA date/time stamps the external support form (ESF) included in the proposal and checks to see if the proposal is in draft or completed form. If the proposal is a draft, a new file is created and it is entered into FWAPROP and OSPRE. The AA then emails the PI notification that the draft was received and routes the draft to the SPA. If it is a completed proposal, the AA identifies the due date and searches Banner for an existing entry relating to the proposal (this would occur if the proposal is being resubmitted or if it was previously submitted in draft form). If an existing entry is found, the proposal is pulled from the hard file system and then entered into FWAPROP and OSPRE to generate a proposal number. The AA then emails the PI a receipt notification and routes the proposal to the SPA. If there is no existing entry, a new paper file (purple if the proposal will be submitted to the agency in hard copy form, red if the proposal will be submitted to the agency electronically) is created, and a copy is sent to a SPA. It was determined through interviews with the AA that there is no significant difference in process times between drafts and full proposals. Based on this information, all entities were modeled to flow through the same AA process.

This process is represented in Arena as "AA Inputs." Note that it was not necessary to model the individual steps performed by the AA in this process to get a full picture of the pre-award process. We were able to model the entire step as if it were one process. As explained in the Data Inputs section, two separate process time distributions for AA Inputs were identified by the AA. To facilitate modeling this complexity, we used a sub-model to represent the AA's process. Within the sub-model we were able to utilize a decide module to make an assigned percentage of proposals go through one process time distribution and the remaining percentage go through another. One resource was assigned to these processes and was simply named "r AA" (see the Data Inputs section for information regarding the AA's schedule and pay). A subsequent model of just the AA steps can be created to facilitate process improvement of the AA's position.

Sponsored Program Administrator (SPA) – Review Process

The SPA receives the proposal from the AA and reviews it, making sure all the necessary information and signatures are included and that the budget is correct. A Request for Proposal (RFP) review is completed to ensure that all information required by the funding agency is included and completed per agency guidelines. If the University of Idaho (UI) is the subrecipient, then a UI letter of support is drafted. SPAs then check to ensure that a financial disclosure form is included if necessary. Export controls are also checked. If the proposal is to be submitted through grants.gov, the SPA checks to ensure that the formatting is done correctly. Any communication between the PI and the SPA is recorded in the proposal file. The file is prepped for approval by Polly. Like the AA process, the SPA process was modeled as one step due to the lack of data, and like the AA process, the SPA process can be modeled in detail once that data is collected.

In the model, proposals are routed to SPAs using sequencing. Sequencing is much like a decide module, but with more flexibility: it allows entities to flow to wherever a station is placed, and Arena can control which entities flow to which stations. Once entities leave the AA process, they enter a route where the sequencing commands are referenced, and the entities are sent to the station designated in those commands. If a proposal leaves a route, it will only go to the station or stations designated, and in the order designated. Sequencing is performed based on the order established for each entity. If the entity type name is FEC (full proposal, electronic submission, assigned to Cathy), then the sequencing commands will dictate that the entity flows from the AA to Cathy's review process.
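The following sketch is a hypothetical Python stand-in for this sequencing logic (the station names are illustrative, and the route is simplified to the main flow without the re-review loop):

```python
# Map the SPA code embedded in an entity type name to a review station.
SPA_STATIONS = {"C": "Cathy Review", "K": "Kelly Review", "St": "Steve Review"}

def route_for(entity_type):
    """Return the ordered station sequence for a type such as 'FEC' or 'FHSt'."""
    spa_code = "St" if entity_type.endswith("St") else entity_type[-1]
    return ["AA Inputs", SPA_STATIONS[spa_code], "Polly Approval", "Submission"]

print(route_for("FEC"))   # full proposal, electronic, Cathy
print(route_for("FHSt"))  # full proposal, hard copy, Steve
```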

OSP currently employs three SPAs in the pre-award process. Each SPA has a sub-model in the simulation, representing their individual review processes. As in the AA process described above, it was not necessary to model the individual steps performed by the SPAs in this process. Each sub-model was given the name of the SPA performing the processes included within the sub-model (Cathy, Kelly, and Steve). Data for process time distributions were provided by each SPA. A decide module was again used to dictate the percentage of proposals that were to follow the different time distributions. The team was informed that entity distribution times for this process are determined not by the type of entity (draft, pre-proposal, full proposal, etc.) but by the PI submitting the proposal. Therefore, all entities flow through this review process and are subject to process distribution assignment based on the process times provided by the SPAs.
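As an illustration of this decide logic, the sketch below (plain Python, not the Arena model itself) samples a review time using the percentages and triangular parameters reported for Steve in Table 7 later in this report. Note that Arena's TRIA takes (min, mode, max) while Python's random.triangular takes (low, high, mode).

```python
import random

def steve_review_time(rng):
    """Sample one review time in hours for a proposal assigned to Steve."""
    if rng.random() < 0.71:
        return rng.triangular(2.0, 4.0, 3.0)  # St1: tria(2, 3, 4) in Arena notation
    return rng.triangular(0.5, 5.0, 2.5)      # St2: tria(.5, 2.5, 5)

rng = random.Random(42)
times = [steve_review_time(rng) for _ in range(10_000)]
print(f"Mean review time: {sum(times) / len(times):.2f} hours")
```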

Polly – Approval Process

After the SPAs complete their reviews, the proposals must be approved for submission by Polly. If the proposal is not in compliance with the regulations set by the requesting organization, which only happens about 1% of the time, the proposal is sent back to the designated SPA. Polly will deny approval for submission if the proper signatures are not included, or if there are not enough notes in the proposal file regarding the review process. Once Polly has reviewed a proposal, one of three things can occur: the proposal can go back to a SPA for re-review, the proposal can be sent back to a SPA for electronic submission, or the proposal can be routed to the AA for hardcopy submission. Polly's utilization in this model is minimal because she is only used for this specific process; her administrative duties are not taken into account in this model. Note that, to mimic reality, drafts do not go through this process, but instead exit the system through a dispose module.

SPA – Re-review Process

To mimic reality, one percent of the proposals reviewed by Polly are sent back to the SPAs for additional work by the use of a decide module in the model. The decide module sends denied proposals to a sub-model, in which another decide module sends the denied proposal to one of the three SPAs based on the name of the entity (e.g., FHSt goes to Steve). Once the re-review process is completed, the proposals are sent back to Polly for approval.

SPA – Submission Process

The submission process is of highest priority to the SPAs. If a proposal is to be electronically submitted, then the PI will upload the files to the necessary submission site. The SPA then submits the proposal to the requesting agency. The submission process is the time that it takes the SPA to send the proposal off to sites like grants.gov. Each SPA uses the same process time distribution for this step because they all go through the same process to submit a proposal. These processes are given a high priority because a majority of proposals are expedited through the system to achieve on-time submission.

AA – Terminating the Review Process

After submission, the AA scans the External Support Form (ESF) and the submission confirmation form (if the proposal was submitted electronically). The AA emails a copy of the ESF and proposal submission confirmation to the PI, the SPA, and the department for which the PI works. All hard copy and electronic submissions are finished in FWAPROP by changing the status, the status date, and double checking all numbers. Then, the AA finishes the OSPRE file, entering any additional information. The proposal completion date is filled out on the ESF, dated, and filed.

The individual steps of this process were not modeled separately because they are standardized for all entities. Instead, a sub-model was created to decide the process time the entities would be assigned. All distribution times were verified with the AA, as there are both slow and fast process times. In the model, after the AA handles the entity, there are several decide modules that determine whether the entity goes on to the post-award process, whether it was already awarded when received, whether it was denied, whether it was late, or whether it was a pre-proposal. After the entity goes through these decides, its completion and assigned dollar value are recorded using a record module, and the entity is disposed. The AA always performs the first AA process before the second AA process.

Model Input

In order to gather accurate inputs for the model, OSP provided us with data regarding proposal receiving dates, due dates, process times, dollar values of each submission, and assignments to SPAs. Data was also gathered through personal interviews with OSP personnel and through email exchanges.

Resource Scheduling

OSP provided information regarding current schedules for all employees represented in the model. Interviews with OSP employees revealed the amount of time each employee estimates they do not work during scheduled working hours (time spent using the restroom or catching up with coworkers). This information allowed for more accurate data regarding the amount of time spent processing proposals. Employee sick and vacation leave was modeled as resource failure within the model. (Note: holidays were built into the schedule as Calendar Schedule Exceptions.) The data displayed in Table 1 below was provided regarding employee scheduling:

Table 1: Staff Schedules
Resource         In     Out    In     Out    In     Out    In     Out    Resource Failures/Year
SPA 1 (St)       7:00   11:30  12:30  14:00  14:06  16:00  -      -      15 AL, 2 SL
SPA 2 (C)        8:00   10:00  10:15  12:15  13:15  15:15  15:30  17:00  12 AL, 10 SL
SPA 3 (K)        8:00   10:00  10:12  12:12  13:12  15:12  15:24  17:00  12 AL, 10 SL
Polly            8:00   12:00  13:00  15:00  15:15  18:30  -      -      Unknown (used Kelly's)
AA (Full Time)   8:00   12:00  13:00  17:00  -      -      -      -      Unknown (used Kelly's)

Entity Types

There were various entities that OSP requested the model track. In order to model different entities entering the system, the entities pass through an assign module toward the beginning of the model, where each entity is assigned a type based on a distribution. That distribution was calculated in Excel and was based on actual data provided by OSP. For instance, the data was filtered for any proposal number ending in D, giving us all the proposals that were received as drafts. Dividing this number by the total proposals in the data sheet gave the approximate percentage of entities that OSP receives in draft form. From there it was determined which proposal numbers reentered the system as full proposals, by checking for the existence of a duplicate entry without a D. The total number of drafts returning as proposals was divided by the total number of proposals in the data set to determine the percentage of total entities that are full proposals that were once drafts. The following are the base percentages from which the 33 different entity types were derived:

Table 2: Entity Types
Proposal Type                          Label   Percentage of Total
Draft                                  D       1%
Full                                   F       90%
D>F (draft returning as full)          S       6%
Pre-proposal                           P       2%
P>F (pre-proposal returning as full)   R       1%
Electronic                             E       21%
Hard Copy                              H       79%
SPA Cathy Knock                        C       44%
SPA Kelly Morgan                       K       17%
SPA Steve Kirkham                      St      39%
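As an illustration only, the sketch below mimics the assign-module logic using the Table 2 percentages. It assumes the three categories are sampled independently, which is consistent with the FEK example in the next paragraph (90% x 21% x 17% is roughly three percent).

```python
import random

def assign_entity_type(rng):
    """Compose an entity type label such as 'FEC' from the Table 2 percentages."""
    form = rng.choices(["D", "F", "S", "P", "R"], weights=[1, 90, 6, 2, 1])[0]
    mode = rng.choices(["E", "H"], weights=[21, 79])[0]      # electronic vs. hard copy
    spa = rng.choices(["C", "K", "St"], weights=[44, 17, 39])[0]
    return form + mode + spa

rng = random.Random(11)
print([assign_entity_type(rng) for _ in range(5)])  # e.g. ['FHC', 'FESt', ...]
```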

From this data, the proposals to be produced in the model were formed. For instance, full proposals that need to be submitted electronically and reviewed by Kelly (labeled "FEK") represent about three percent of the total proposals received by OSP. After we determined the above entity variables, another entity type was discovered. Proposals that are received by OSP after already being awarded (those that circumvent the system) represent about 4% of total arriving proposals (determined by an interview with an OSP employee). To add these proposals into the model, a decide module was included through which proposals flow prior to being assigned an entity type. This module requires that about 4% of all incoming proposals re-route to another assign module, where they are assigned an "Already Awarded" label (labeled A) and routed to the SPAs for review. These needed to be included in the model because these proposals have a lower priority (based on due date) than proposals that still need to be submitted.

Proposal Due Dates

It is necessary to assign a due date to the proposals in the model so that the proposal prioritization rule (earliest due date first) is reflected in the model. From the data received from OSP, we used the columns "Received" or "Draft Received" and "Proposal Due" to calculate a distribution that would enable us to assign proposals a "days until due" or "due date" attribute. We uploaded the provided data into Arena's Input Analyzer, which gave us the following due date distribution: TNOW + (-0.001 + EXPO(148)). This expression indicates that the distribution of due dates is exponential with a mean of 148 hours. Thus, proposals are turned into OSP 3.7 days (on average) before they are due to the requesting agency. Twenty-five percent of proposals flow through the module that assigns this due date. The remaining 75% of incoming proposals are submitted without a due date. In these cases, OSP automatically assigns the proposals four days until due in order to facilitate the prioritization of their work. The model employs a decide module which sends 75% of proposals through an assign module, which assigns those proposals a due date of four days. It is important to note that the due date distribution might be incorrect. Through interviews with OSP employees, it was learned that a "Received" date is assigned to a proposal when the AA receives the completed External Support Form (ESF) from the PI. There are some instances, however, when the PI sends a proposal to a SPA without turning in the ESF to the AA. When this occurs, the days to process (the same as days until due in the model) is incorrect, as the SPA would have been processing the proposal before it technically was received by OSP.
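A plain-Python sketch of this due date logic (illustrative, not the Arena module; the exponential term is parameterized by its mean of 148 hours):

```python
import random

HOURS_PER_DAY = 24

def assign_due_date(tnow, rng):
    """Return an absolute due date in model hours for one arriving proposal."""
    if rng.random() < 0.25:  # PI supplied a due date: TNOW + (-0.001 + EXPO(148))
        return tnow + (-0.001 + rng.expovariate(1 / 148))
    return tnow + 4 * HOURS_PER_DAY  # AA stamps four days until due

rng = random.Random(0)
dues = [assign_due_date(0.0, rng) for _ in range(100_000)]
print(f"Average hours until due: {sum(dues) / len(dues):.1f}")
```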

Proposal Values

In order to facilitate a net present value analysis of the different modeling alternatives, it was necessary to assign dollar values to incoming proposals. Values were derived from the data provided by OSP in the column titled "FRBPROP_REQUESTED_AMT." In order to get the most accurate value distribution in the model, we separated proposal amounts based on which SPA was to review the proposals. Using Excel and Arena's Input Analyzer, the appropriate distributions to assign to proposals were determined. These distributions were placed in the SPA review sub-model, within multiple assign modules. Values were assigned to SPAs based on the information we had regarding SPA assignments relative to University college/department and the information about proposals received from each college/department in 2007, 2008, and 2009. The following is the data used to assign proposal values:

Table 3: Value of Proposals Completed by Kelly
% of Proposals   Range of Values    Distribution from Input Analyzer
24%              0-9999             TRIA(731, 3.07e+003, 9.93e+003)
47%              10000-140000       1e+004 + 1.18e+005 * BETA(0.567, 1.32)
16%              140001-320000      NORM(2.04e+005, 4.16e+004)
5%               320001-460000      3.25e+005 + 1.24e+005 * BETA(0.526, 0.632)
5%               460001-520000      UNIF(4.62e+005, 5.2e+005)
3%               520001-1266052     6.71e+005 + 5.95e+005 * BETA(0.0131, 0.018)

Table 4: Value of Proposals Completed by Cathy
% of Proposals   Range of Values    Distribution from Input Analyzer
14%              0-9999             535 + 9.45e+003 * BETA(0.817, 0.938)
32%              10000-40000        1e+004 + 3e+004 * BETA(0.769, 1.16)
11%              40001-60000        4.1e+004 + 1.9e+004 * BETA(1.04, 1.07)
29%              60001-300000       6e+004 + WEIB(7.38e+004, 1.05)
11%              300001-800000      3.12e+005 + WEIB(1.65e+005, 1.19)
3%               800001-3199992     8.43e+005 + WEIB(6.27e+005, 0.582)

Table 5: Value of Proposals Completed by Steve
% of Proposals   Range of Values    Distribution from Input Analyzer
13%              0-9999             800 + 8.94e+003 * BETA(0.558, 1.02)
19%              10000-40000        1e+004 + 2.9e+004 * BETA(0.822, 1.14)
8%               40001-60000        TRIA(4.07e+004, 4.75e+004, 6e+004)
17%              60001-149999       6.07e+004 + WEIB(3.6e+004, 1.12)
15%              150000-300000      1.5e+005 + 1.5e+005 * BETA(0.742, 0.768)
12%              300001-500000      3.01e+005 + 1.99e+005 * BETA(0.864, 0.828)
10%              500001-999999      5.1e+005 + EXPO(1.87e+005)
7%               1000000-16615689   1e+006 + WEIB(1.67e+006, 0.596)
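To make the piecewise structure of these tables concrete, the sketch below shows how a proposal value could be sampled in plain Python. It is illustrative only and covers just the first segments of Table 3; a BETA(a, b) term on [lo, lo + width] becomes lo + width * betavariate(a, b).

```python
import random

def kelly_proposal_value(rng):
    """Sample a proposal dollar value from the first segments of Table 3."""
    u = rng.random()
    if u < 0.24:                                   # 24%: $0 - $9,999
        return rng.triangular(731, 9.93e3, 3.07e3)        # TRIA as (low, high, mode)
    if u < 0.24 + 0.47:                            # 47%: $10,000 - $140,000
        return 1e4 + 1.18e5 * rng.betavariate(0.567, 1.32)
    return rng.normalvariate(2.04e5, 4.16e4)       # remaining mass, illustration only

rng = random.Random(1)
print(f"Sampled proposal value: ${kelly_proposal_value(rng):,.0f}")
```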

Process Step Completion Time Distributions

Originally, we hoped to determine processing times from the data provided by OSP. After analyzing the data, however, it was determined that the process times provided by the office contained wait time, process time, and time when the office is closed. In order to get more accurate process times, we interviewed each employee working in the pre-award process regarding the time it takes them to complete their pieces of the process for each proposal. Employees provided one or more sets of minimum, mode, and maximum processing times. For instance, the AA informed the team members that 90% of the proposals she works on take at least 5 minutes, with a mode of 10 minutes and a maximum of 15 minutes, while the remaining 10% take at least 16 minutes, with a mode of 3 hours and a maximum of 6 hours to process. (Note that the percentages in the following tables refer not to the portion of total entities that follow the corresponding distribution times, but to the percentage of entities going through that process which follow those times; for instance, 71% of all proposals reviewed by Steve follow the first distribution listed for Steve's review process times.)

While all of the process times were provided by OSP employees, we derived some of the percentages used to determine what portion of total proposals followed the provided process distributions. This information was calculated from the data regarding the number of proposals received from each college (as the process times provided by the SPAs seemed to change based on the college from which the proposal originated) relative to the total number of proposals received in 2007, 2008, and 2009. It is also important to note that it was very difficult for the SPAs to give us estimates of their process times, so the accuracy of the process times utilized is unknown. However, we believe the estimates err on the fast side, as the model does not show the SPAs as utilized as expected. Therefore, the process times were increased to reflect a higher utilization rate.

Table 6: AA Process Times
AA Process          Percent of Proposals   Process Time Distribution (hours)
First AA Process    90%                    Fast: tria(.0833, .1667, .25)
                    10%                    Slow: tria(.2667, 3, 6)
Second AA Process   95%                    Fast: tria(.0833, .1166, .1666)
                    5%                     Slow: tria(.1833, 1.5, 3.5)

Table 7: SPA Review Process Times
SPA             Percent of Proposals   Process Time Distribution (hours)
Steve Kirkham   71%                    St1: tria(2, 3, 4)
                29%                    St2: tria(.5, 2.5, 5)
Cathy Knock     61%                    CNR: tria(.1667, 1, 4)
                39%                    College of Ag: tria(1, 3, 6)
Kelly Morgan    6%                     Other: tria(.5, 5, 6)
                29%                    Arts & Humanities: tria(.5, 1.5, 3)
                34%                    Institutes: tria(2, 3.5, 10)
                26%                    Education: tria(3, 5, 9.5)
                6%                     Center for Adv. Energy Studies: tria(4, 4, 4)

Table 8: Polly Review Time
Resource   Process Time Distribution (hours)
Polly      tria(.01667, .03333, .50)

Table 9: SPA Re-review Times
SPA             Percent of Proposals   Process Time Distribution (hours)
Steve Kirkham   100%                   tria(.5, 1, 2)
Cathy Knock     100%                   tria(.5, 1, 2)
Kelly Morgan    100%                   tria(.5, 1, 2)

Table 10: SPA Proposal Submission Times
SPA             Percent of Proposals   Process Time Distribution (hours)
Steve Kirkham   100%                   tria(.01667, .5, 2)
Cathy Knock     100%                   tria(.01667, .5, 2)
Kelly Morgan    100%                   tria(.01667, .5, 2)

Costing

The process costs included in the model were derived from employee wage and benefits information provided by OSP. We were given annual earnings and benefit payments for each employee. From this data, an hourly wage was determined for use in the model as both the busy and idle resource rates.

Table 11: Employee Wages
OSP Employee    Hourly Wage
Cathy Knock     $28.86
Polly Knutson   $54.98
Kelly Morgan    $30.52
Steve Kirkham   $30.68
AA              $9.52

Model Verification

Verification is the process of confirming that the information put into the model is correctly directing the process as intended. The first inputs that are important to verify are the schedules, regarding both the entity arrival schedule and the resource work schedules. The entities were verified to be entering the system as designated by the schedule and not arriving based on various constraints (such as not arriving on weekends). This was done by slowing the running model down significantly and monitoring the arrival of the entities while watching Arena's date and time clock. By doing this, we were able to observe that entities were not entering the system during the weekends, nor were they being created on holidays. With regard to the scheduling of resources, a similar process of verification was employed, monitoring the model clock while including a visual cue that signaled when resources were busy and when they were idle.

Another area requiring verification was ensuring that the correct entities were being transferred to the correct SPAs based on the model sequencing. This verification was done by applying additional visual cues that were used to represent each individual entity, based on the SPA to which they were allocated. For instance, proposals processed by Steve were represented by an open envelope (proposals reviewed by the other SPAs were represented with an open folder and a closed folder). Representing entities in this manner allowed for confirmation that entities were being assigned to different SPAs and that there were no entities that entered the queue of a SPA to which they were not assigned.

Verification was also done to ensure that entities were being correctly prioritized within the model. By pausing the running model, we were able to select entities waiting in queue and examine the information about those entities, including their assigned due dates. The ability to access this information allowed us to conclude that due dates were being assigned correctly to entities and were correctly being prioritized in queue based on earliest due date first.

Finally, there were a number of decision functions placed throughout the model that were used to guide the entities in the model. These decision functions were based on the information and inputs we received. In order to verify that our inputs were correctly introduced and our decision functions were operating as intended, we compared our input data to the outputs we received. By doing this, the percentage of each entity type leaving the model was measured against the intended percentage of each entity type entering it, which confirmed that the model was performing appropriately.

Base Model Outputs

The "Model Outputs" are the average values output by the model based on the information used within the model. The actual figures are derived by averaging the total values after thirty replications of one year through the model. This was done in order to get the most accurate averages from the distributions used in the model, and to provide a long enough time period to acquire an accurate perception of the process. Table 12 presents each proposal output, including the number of entities that went to the post-award process, the number of entities that did not make it through the process before their due dates, the number of entities that were denied funding, the number of entities that were pre-determined to be awarded, the number of pre-proposals, and the number of entities that were drafts. Note that all but four of the entities that entered the system made it out into one of these categories. Based on this information, we concluded that the main constraint on the system was the input issues that make the SPA review process take longer. Table 12 also contains a fiscal representation of the average total dollar value of the entities that fell into each category. Finally, Table 13 presents the scheduled utilization of each resource. Scheduled utilization is the amount of time a resource was busy in the process over the total amount of time the resource was scheduled to work.

Table 12: Base Model Outputs
Number to Post                911
Number Late                   237
Number Denied                 19
Number Already Awarded        42
Number Pre-proposals          19
Drafts                        17
$ to Post & Already Awarded   $232,397,418
$ Late                        $64,791,325
$ Denied                      $4,425,172

Table 13: Base Model Utilizations
SPA 1 Utilization   SPA 2 Utilization   SPA 3 Utilization   AA Utilization   Polly Utilization
48.11%              71.03%              76.24%              41.49%           0.10%
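Stated as a formula (the standard definition, shown here for reference):

$$\text{Scheduled Utilization} = \frac{\text{total time the resource is busy}}{\text{total time the resource is scheduled to be available}}$$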

The utilizations above may appear low for a number of reasons. Primarily, there is not a consistent flow of entities into the system, which means there is downtime throughout the year that decreases resource utilization. It is possible that a resource is busy assisting other resources; however, one of the constraints of our model is that the resources are assumed not to assist one another. It is also possible that the process times provided by the SPAs do not accurately represent the time it takes to process proposals. If the provided process times are faster than the actual times, utilization will appear lower in the model than it is at OSP. In addition to the low utilization rates, the SPA utilizations are not consistent (SPA 1 utilization is less than the other SPA utilizations). This variation occurs because the percentage of entities that goes to SPA 1 is significantly lower, as demonstrated in the Model Inputs section of this report.

Model Validation

Validation is the process by which we confirm that the model performance and outputs accurately depict the actual process performance and outputs. In order to do this, the actual output results of OSP were compared to the model's outputs. The primary validation for the process focuses both on the SPA utilization rates (which were discussed previously) and the number of entities output to the post-award process.

Table 14: OSP Results
Status Code   Description          Count
A             awarded              448
R             in review            111
S             submitted            338
I             inactive             79
D             declined             123
P             presumed rejected    1
C             cancelled in house   6
W             withdrawn by PI      5
TOTAL                              1111

Table 15: Validating Model Results
                             Model              OSP
Total Number of Entities     1167               571
Average Number to Post       911                448
Average Number Denied/Late   255                123
Percent Awarded              78.1% (911/1167)   78.5% (448/571)

These values are derived from the data received, both the raw data that was input and the actual results that were given by OSP. It is not important that the numbers be high, but that the model numbers match up correctly with the raw data and result data. This lets us know that the model is functioning correctly based on the data. If the data needs to be updated or clarified, we now know that it can be input and processed to produce correct values that accurately depict reality.

The other information, such as the “number submitted” or “number in review”, is irrelevant to outputs, since we are unable to determine where those proposals will end up. Thus, we are constrained to using only the “number awarded” and “number denied” when focusing on the total number that were fully processed through the system.

There is one area of validation that requires explanation. The average total dollar amount awarded for a year is $232,397,418.76 based on the model. We recognize that the actual value awarded for 2008 is approximately $65,000,000. However, there are two primary factors that significantly impact this result. First, values in the model were assigned based on the actual proposal values, which significantly differ from the actual awarded values; it is not uncommon for a PI to receive only partial funding relative to what was requested. Second, the number of entities that were awarded, based on the inputs, the model outputs, and the OSP validation results, appears to be significantly higher than expected. An important factor to keep in mind, however, is that the average individual proposal value in the model output was very close to that of the raw data: the base model's average was $255,138.79, while the average proposal value from the raw data was $244,807.61.

Model Scenarios to Test Process Improvements

Once the model representing OSP's current process flow was built and verified, the next step was to alter the model to see how changes would affect OSP's pre-award process. Fourteen different scenarios were developed based on current OSP process improvement initiatives, our knowledge of the processes, and PI recommendations gathered from a survey developed by our team. The scenarios we developed sought to increase the number of proposals awarded, increase money coming into the University of Idaho, and improve PI relations. The results of our model variations were compared against the outcomes of the base model in order to gauge their effect on the OSP pre-award process.

Base with Four Day Rule

Currently OSP is trying to get an initiative passed such that any proposal received by the office with less than four days of processing time available until the proposal due date will be rejected by OSP and returned to the PI. In order to see how implementation of this rule would affect the pre-award process at OSP, a decide module was added that tests each entity's due date against the four day rule, rejecting those with less than four days until due and allowing those with four or more days to continue through the process. Then, the outcomes of this change were compared with the original outcomes of the base model.
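The rule itself reduces to a single comparison. A plain-Python sketch, with model time in hours:

```python
HOURS_PER_DAY = 24

def accept_under_four_day_rule(tnow, due_date):
    """Return True if the proposal may enter the process, False if returned to the PI."""
    return (due_date - tnow) >= 4 * HOURS_PER_DAY

print(accept_under_four_day_rule(tnow=0.0, due_date=5 * HOURS_PER_DAY))  # True
print(accept_under_four_day_rule(tnow=0.0, due_date=2 * HOURS_PER_DAY))  # False
```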

Table 16: Base with Four Day Rule
                 BASE   BASE with 4 Day Rule
Number Awarded   911    No statistically significant difference
Number Late      237    36

Originally, the base number awarded (the number submitted to the requesting agency on time) was, on average, 911. When the 4 day rule was implemented in the system, the average number awarded did not show a statistically significant difference from the base (we used Arena's Output Analyzer to determine whether there was a statistically significant difference). The rule did, however, have an impact on the number of proposals that were late (those that did not get through the process before their due dates and are assumed to be denied by the requesting agency). The number late decreased from approximately 237 to 36. The 4 day rule removed from the system the proposals that had very few days until due; these proposals had a higher likelihood of being late. In essence, the rule allows SPAs to spend their time working on proposals that have a greater likelihood of being submitted on time and being funded. On average, this rule resulted in 143 proposals being returned to the PIs. It is important to note that, while it was not modeled, we would expect the enforcement of this rule to encourage PIs to turn in proposals earlier, resulting in fewer proposals being denied by OSP and more proposals being submitted to the requesting agency.
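The Output Analyzer comparison amounts to a confidence interval on the difference between scenarios across the 30 replications. The sketch below shows the underlying paired-t calculation in plain Python, using randomly generated stand-in values rather than the actual replication outputs:

```python
import math
import random

rng = random.Random(3)
base = [rng.gauss(911, 25) for _ in range(30)]      # stand-in replication outputs
four_day = [rng.gauss(915, 25) for _ in range(30)]  # stand-in, not model results

diffs = [b - f for b, f in zip(base, four_day)]
n = len(diffs)
mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
half_width = 2.045 * sd / math.sqrt(n)  # t(0.975, 29) is approximately 2.045

# If the 95% CI on the mean difference contains zero, the scenarios do not
# differ significantly -- the conclusion reached for "number awarded" above.
print(f"95% CI for difference: {mean:.1f} +/- {half_width:.1f}")
```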

Base with Electronic ESF

In addition to the four day rule, OSP is also working on developing an electronic version of the external support form. Doing so will give OSP better control over the quality of its inputs. By forcing PIs to have all information included in the electronic ESF (E-ESF) prior to submission, the office will no longer have the problem of receiving proposals without a due date. To translate this change to our model, we changed the percentage of proposals that are received without a due date (and assigned an automatic four days until due) from 75% to 0% of all incoming proposals. All proposals now go through the due date distribution described in the Proposal Due Dates section above. This change resulted in the following output:

Table 17: Base with E-ESF
                 BASE   BASE with E-ESF
Number Awarded   911    854
Number Late      237    294

As displayed in the above table, the implementation of the electronic ESF results in fewer proposals being submitted and more being late. This may seem counterintuitive. In the base model, 75% of the incoming proposals were assigned a due date of four days. Because the electronic ESF would eliminate the need for such a business rule, 100% of incoming proposals are now assigned a due date through the distribution discussed in the Proposal Due Dates section above. As a result, proposals that were previously assigned a due date of four days now receive anywhere from 0 to 9+ days until due, with a sizable percentage assigned 0 to 3 days until due. Since the due dates are generally lower than they were in the base model, more proposals run the risk of not being processed before the due date is reached, resulting in a greater number of late proposals. While this outcome would suggest that the electronic ESF will not have a positive effect on the process, it is believed that the E-ESF has the potential to produce favorable results. One issue with the above results is that our due date distribution may be incorrect for the reasons discussed in the Proposal Due Dates section above. A correct due date distribution would likely result in model outputs similar to or better than those of the base model.

Base with Electronic ESF and Four Day Rule

To further investigate the electronic ESF and four day rule, we modeled these business rules together. It is important to note that due date distribution times were not increased in our model to reflect the four day rule. However, the team believes that implementing the four day rule would increase the number of days until due on each proposal by motivating PIs to turn in their proposals sooner so as to avoid being denied by OSP.

Table 18: Base with E-ESF and Four Day Rule
                 BASE   BASE with E-ESF   Adding 4 Day Rule
Number Awarded   911    854               599
Number Late      237    294               1

The table indicates that implementing the four day rule causes throughput to decrease. Note that there is only one proposal deemed late, whereas the Base with E-ESF has 294 late. There are so few late proposals when adding the four day rule because the proposals that would have been late, as they were in the Base with E-ESF model, were taken out before they started the pre-award process. This gives proposals with four or more days until due a better chance of making it through the system than in the Base with E-ESF model. However, overall fewer proposals get through the system, because proposals going through the due date distribution that are assigned a due date of less than four days are denied by OSP under the 4 day rule. The recommendation would be to further analyze this model, with specific emphasis on changing the due date distribution to reflect the 4 day rule.

Base with an Increase in Inputs

Next, an increase in the number of entities flowing through the pre-award process was modeled. One reason this increase could occur is an increase in funding from government entities; such an increase might occur for various reasons not delineated in this report. In addition, OSP could hire a person to research proposal opportunities, thereby increasing the number of proposals entering the system. The basic idea of this person's job would be to research proposal opportunities and to contact specific PIs through phone conversations and follow-up emails to inform them about the available opportunities. Hiring this individual has many advantages for OSP, including a possible increase in proposals entering the system. By knowing which proposals are being pursued by PIs when the PI begins the proposal writing process, SPAs can utilize demand planning. In addition, SPAs would be able to encourage PIs to get their proposals into the office for review in a timely manner. One of the possible reasons that PIs turn their proposals into OSP at the last minute is that the PIs did not hear about the funding opportunity until the proposal was almost due. If there were someone looking for opportunities for PIs, then PIs would know about these opportunities sooner and, theoretically, have more time to produce better quality proposals and to finish the proposal in plenty of time for OSP's review. All in all, this position would allow OSP to better control its inputs and reposition OSP in the minds of the PIs. Instead of seeing OSP as a hurdle to get over, PIs would see OSP as an agency that assists PIs in creating successful proposals. The results from modeling an increase in proposals entering the system are shown in Table 19.

Table 19: Base with 30-40% Increase in Inputs
                 BASE   BASE with 30-40% Increase
Number Awarded   911    598
Number Late      237    846

When the flow of entities into the base model was increased 30 to 40 percent, the number awarded decreased from 911 proposals to 598. The number late also increased from 237 to 846. The decrease in successful throughput is due to the large number of proposals with short due dates ending up late at the end of the process. These proposals necessitate a fast turnaround time, but when limited resources are used to accomplish this, other proposals draw closer to their due dates and must be expedited as well, which only compounds the problem. OSP would need to increase its capacity in order to get more proposals through the system successfully. (See the Workflow Software Implementation section.)

Base with Increase in Inputs and Four Day Rule

Since OSP is trying to get the four day rule approved for implementation, it was important to see what effect the rule would have on the system when the office experiences an increase in proposal input, as discussed in the previous section.

Table 20: Base with 30-40% Increase in Inputs and Four Day Rule
                 BASE   BASE with 30-40% Increase   Add 4 Day Rule
Number Awarded   911    598                         928
Number Late      237    846                         598

When the four day rule was added to the Base with an increase in entities, the number awarded increased from 598 to 928. The total number late also decreased from 846 to 598. This occurs for the reasons explained in the Base with Four Day Rule section above: the 4 day rule prevents entities with less than four days until due from entering the system, giving those with more than four days a better chance of making it through the system on time.

Workflow Software Implementation

After analyzing the process and modeling several variations of OSP's pre-award process with Arena, we feel that OSP's process could best be improved by purchasing a workflow system or hiring more SPAs. A workflow system is software that allows for greater ease of proposal processing. There are several workflow systems currently in existence; we researched two of them for implementation: Cayuse424 and Savvion. If a workflow system were a serious consideration for the University, we would suggest doing further research and discussing the potential of other workflow systems. The benefits of a workflow system are varied and great. First, it will decrease process times significantly for all participants, including the AA, the SPAs, and the PIs. In addition, the software has the ability to spot process errors in real time. For example, if a PI did not type in a due date for the proposal, the system would alert him that he had made an error and would not allow him to continue until it was fixed. Workflow will also increase PI satisfaction for several reasons. First, PIs will be able to submit proposals faster and more easily because of the decreased process times and the ability to more clearly understand the proposal writing and submission process as guided by the software. Also, since the SPAs see a decrease in process times due to proposals being submitted without missing information, they will be able to assist the PIs more throughout the process if necessary. Workflow will increase the transparency of the system. OSP will be able to see when a PI has started a proposal and track it through until he is finished. The converse is also true: the PI will be able to see the steps OSP is going through to submit his proposal and will thus be able to better understand what OSP does. Finally, this software is equipped with a real-time data collector, which will allow OSP to better track the information necessary to get an accurate picture of how the process is performing (i.e., proposal wait time, days until due upon receiving a proposal, etc.). It is our understanding that some workflow systems are also beneficial in the post-award process: "Cayuse424 [is designed] to supply complete proposal information to post-award accounting and reporting systems." (Cayuse)

Table 21: Base with 50% Decrease in Process Times
                 Base   Base with 50% Decrease in Process Times
Number Awarded   911    1110
Number Late      237    36

In order to demonstrate the effect of workflow software, a model was developed with a 50% decrease in process times. Representatives from Cayuse expressed that process times decrease at least 50% with the software, and we anticipate Savvion would have the same capability. When this decrease in process times was modeled, the number of proposals out increased drastically and the number of late proposals decreased, as shown in Table 21. The results of the model were as expected. However, the full potential of workflow software is more evident when more proposals are added to the system. To see how much demand a system like this could handle, an alternative model was developed. We modeled the decrease in process times concurrently with a 30-40% increase in the amount of proposals being submitted. As previously discussed, there are many reasons why this increase would occur, and it is already occurring. When this model was run, more proposals were awarded, and again few were late. The results of this model are shown in Table 22 below.

Table 22: Base with 50% Decrease in Process Times and 30-40% Increase in Inputs

                   Base    50% Decrease in Process Times    50% Decrease & 30-40% Increase in Inputs
Number Awarded     911     1110                             1452
Number Late        237     36                               67

The results show that a workflow system could handle an increase in demand with relative ease.

The two systems investigated for implementation at OSP were Cayuse424 and Savvion. Cayuse424 has a system-to-system pathway with grants.gov, which allows for easy file acceptance for grants.gov proposals. After speaking with several Cayuse employees, we learned that the software is also customizable for non-grants.gov entries: "Through a variety of product innovations, we have created a Generic Proposal workspace that allows you to upload primary documents and enter information about proposals to any federal and non-federal agency. This allows you to use Cayuse424 as a comprehensive proposal repository for managing, routing and approving all grant proposals." (Cayuse) Cayuse424 also tracks data and has an auto-fill component that fills in recurring parts of a proposal, such as the institution name, once they have been saved in the system. It can develop, validate, and route proposals and submit them to reporting, compliance, financial, and management systems. The cost to run and maintain the system is $37,000 per year. Its main advantage is speed of deployment: because the software has already been developed and adopted at a number of universities, it would be relatively easy to implement in the existing OSP environment. However, implementation will require some modification of certain OSP processes, because Cayuse424 is relatively inflexible compared to workflow systems such as Savvion.

Savvion is a more complex workflow system. It has capabilities similar to Cayuse424's; however, it has broader application across the University, as multiple departments could use it to improve their processes. It also allows OSP more flexibility in the development phase. Cayuse424 is pre-packaged software, and although it is somewhat customizable, it is unlikely to fit exactly into OSP's existing process flow. With Savvion, it is possible to develop real-time, error-catching forms customizable to each individual proposal. If this workflow system were purchased, then after an OSP employee was trained to build forms, a SPA would only need to spend a short time creating a form for a specific PI by dragging in the items the proposal requires. Savvion requires an initial investment of $200,000 plus an on-site technical support employee costing approximately $80,000 per year. Since we anticipate Savvion being used throughout the University, we estimate that OSP would only have to pay a quarter of those amounts ($50,000 initial, $20,000 per year).

To determine whether a workflow system is a financially reasonable investment for the University, and OSP specifically, we ran a net present value (NPV) analysis on both products under the currently expected 30-40% increase in incoming proposals, with both a 50% and a 30% decrease in process times. NPV is the difference between the present value of cash inflows and the present value of cash outflows; here we compare those values in the base model with the values in the altered models. It compares the value of a dollar invested today with the value of that same dollar in the future, including returns and accounting for inflation. We used a discount rate of 8%, an average return for a medium-risk asset. If OSP felt this project was riskier than those it would usually take on, the discount rate could be raised accordingly, and the NPVs would change to reflect that. We also used monthly compounding on the theoretical interest the investment would earn, and the NPV was computed over a five-year period: the amount shown is the net cash flow that would accrue over five years with the new software, discounted back to today's dollars. In capital budgeting, a prospective project with a positive NPV should be accepted. We ran NPV analyses for several situations, shown below.

Table 23: NPV Cayuse 30% Reduction in Process Times – No Change to Inputs

Mean                             $219,824.47
Median                           $232,657.91
95% Confidence Half-Width (±)    $131,543.98
Min and Max                      $88,280.49 - $351,368.44

Table 24: NPV Cayuse 50% Reduction in Process Times – No Change to Inputs

Mean                             $365,885.09
Median                           $381,341.90
95% Confidence Half-Width (±)    $132,647.28
Min and Max                      $233,237.82 - $498,532.37

Table 25: NPV Savvion 30% Reduction in Process Times – No Change to Inputs

Mean                             $234,554.91
Median                           $247,388.35
95% Confidence Half-Width (±)    $131,543.98
Min and Max                      $103,010.94 - $366,098.89

Table 26: NPV Savvion 50% Reduction in Process Times – No Change to Inputs

Mean                             $380,615.54
Median                           $396,072.34
95% Confidence Half-Width (±)    $132,647.28
Min and Max                      $247,968.26 - $513,262.81

Table 27: NPV Cayuse 50% Reduction in Process Times and 30-40% Increase in Inputs

Mean                             $1,535,317.26
Median                           $1,593,388.32
95% Confidence Half-Width (±)    $111,053.09
Min and Max                      $1,424,264.25 - $1,646,370.26

Table 28: NPV Savvion 50% Reduction in Process Times and 30-40% Increase in Inputs

Mean                             $1,423,670.15
Median                           $1,481,741.21
95% Confidence Half-Width (±)    $111,053.01
Min and Max                      $1,312,617.14 - $1,534,723.15

All of the NPVs are positive, which means that any one of these options would increase the future cash flows of OSP and, consequently, the University. We used a 95% confidence level in our analysis; at that level, the actual NPV will fall within the half-width shown in each table (roughly $111,000-$133,000) of the mean. This means that even with some deviation in the model, accepting the project remains the correct decision; the minimum values, which are all positive, confirm this. This analysis supports, with quantifiable evidence, our conviction that workflow is a viable solution to OSP's current problems, and we therefore strongly advocate implementing such a system.
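As a rough illustration of the calculation described above (a sketch, not the exact spreadsheet used in this study), the following discounts a stream of monthly net benefits at an 8% annual rate, compounded monthly, over five years. The cost figures are OSP's estimated share of Savvion from the report; the monthly benefit is a placeholder assumption:

    # Minimal NPV sketch: monthly compounding at an 8% annual discount rate
    # over a five-year horizon, mirroring the assumptions described above.
    annual_rate = 0.08
    monthly_rate = annual_rate / 12
    months = 5 * 12

    initial_cost = 50_000        # OSP's share of Savvion's initial investment (from the report)
    monthly_cost = 20_000 / 12   # OSP's share of annual support, spread monthly (from the report)
    monthly_benefit = 30_000     # placeholder: assumed monthly value of faster processing

    npv = -initial_cost
    for t in range(1, months + 1):
        npv += (monthly_benefit - monthly_cost) / (1 + monthly_rate) ** t

    print(f"NPV over 5 years: ${npv:,.2f}")  # a positive NPV means accept the project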

Additional Suggestions

In addition to the scenarios we were able to model, we developed other, more qualitative suggestions centered on business rule changes and training that might assist OSP's efforts.

• Based on information gathered from the PI surveys, it is evident that PIs are somewhat confused about OSP's exact role in the proposal funding process. Many referenced the fact that OSP does not do a good enough job of informing faculty of funding opportunities. While we recommend that OSP eventually take on this role, it is clear that OSP's current role needs to be better defined for the University community. OSP's website clearly outlines its duties and responsibilities, but that message is not being received by all PIs. One cause of this may be inconsistency in the services provided, which sends mixed signals to PIs about OSP's role. For example, one SPA stated that the office occasionally emails faculty about funding opportunities; when this occurs, PIs begin to believe that finding such opportunities is a function of OSP.

• OSP's website lists among the office's responsibilities to "Educate and train faculty, staff, and student employees regarding the management of sponsored projects." (OSP Responsibilities) We recommend that OSP use training seminars to clearly communicate its role to PIs and to educate them about the proposal submission process. We understand that OSP is currently overwhelmed with proposals and does not have time to host such seminars; if OSP were to implement a workflow system, however, the SPAs would have more time to help PIs understand their role in the proposal submission process.

• In addition to training seminars, we suggest that OSP sit down with new PIs before they write their first proposals. A PI can gain considerable confidence from one-on-one communication with a SPA and will emerge with a better understanding of the proposal writing process. This step might not be necessary if a workflow system is employed, as the proposal submission process would then be more transparent to all parties involved.

• Provide a feedback mechanism for PIs and other users of OSP. The PI survey made clear that PIs would like some sort of anonymous survey through which to voice their opinions about their experiences with OSP. Such a survey could be posted on OSP's website. In addition, the office should post responses assuring PIs that their specific frustrations and ideas have been noted and are being considered. This gives PIs an outlet to voice frustrations and gives OSP a way to recognize and correct misconceptions about its role. PIs might also have good ideas about how to improve the entire research funding process, and the survey would give OSP access to those ideas. Finally, an outlet such as this, done properly, would increase PI satisfaction with the process, making PIs more likely to work with OSP and to view it as a partner rather than a hurdle.

• We know that OSP does not control how Facilities and Administrative (F&A) funds are distributed, but we suggest that the research office reexamine this allocation. Currently, a portion of F&A is given to the college from which the proposal originated, and it is our understanding that colleges may then decide where the money is spent. Some colleges keep the funds for overhead expenses, and some give some or all of the funds to the PIs in the form of Y-accounts; there is no oversight of how these funds are spent. We believe this system was originally established to give PIs an incentive to submit more proposals, but because some PIs receive these funds and some do not, the resulting inequity actually acts as a disincentive for some PIs. We therefore suggest that if any F&A is sent to the originating college, the funds be kept at the department level to cover overhead expenses.

• OSP needs a mechanism for accurately tracking information about proposal processing. Information such as SPA processing times could lead to many process improvements. For instance, the data might reveal that proposals from a college that recently received a training seminar on the proposal submission and review process are processed faster, which would help gauge the effectiveness of the seminars. The data might also show that one SPA processes proposals much more quickly than the others; once identified, that SPA could share his or her techniques, shortening the review process for everyone. Such data collection is easily facilitated by a workflow system, which automatically records the relevant information.

• Were the information available, OSP would benefit from demand forecasting. Forecasting might reveal patterns in OSP's demand that would allow the office to plan its capacity and resource utilization accurately. For instance, demand patterns might reveal that one SPA always has lower demand in a certain month; OSP could then task that SPA with other work during that period, such as hosting a training session or assisting a SPA with high demand. A minimal sketch of such a forecast follows this list.
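As a simple illustration of the forecasting idea above, the sketch below computes per-month seasonal indices from historical proposal counts. The counts are invented for demonstration and are not OSP data:

    # Hypothetical monthly proposal counts for three past years (invented data).
    # A seasonal index above 1.0 marks a month that is busier than average.
    history = [
        [80, 60, 95, 70, 55, 40, 35, 50, 110, 90, 75, 85],   # year 1
        [85, 65, 100, 75, 60, 45, 40, 55, 120, 95, 80, 90],  # year 2
        [90, 70, 105, 80, 60, 50, 45, 60, 125, 100, 85, 95], # year 3
    ]

    n_years = len(history)
    overall_mean = sum(sum(year) for year in history) / (12 * n_years)

    for month in range(12):
        month_mean = sum(year[month] for year in history) / n_years
        index = month_mean / overall_mean
        print(f"Month {month + 1:2d}: seasonal index {index:.2f}")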

Further Validation

On May 7, 2009, we presented the information addressed up to this point in this report to four employees of the Office of Sponsored Programs. The purpose of the presentation was to explain our findings, validate our model, and present ideas for improving OSP's processes. During the presentation, multiple issues arose regarding the model's data outputs. This section clarifies those issues, explains what we have done to correct them, and presents additional process improvement scenarios based on information gathered at the meeting.

Data Discrepancies

One of the main output issues with our model was that some proposals were not making it through the system by their due dates. In reality, OSP does not have late proposals: all proposals finish review before their submission deadlines. There are a few possible causes of this discrepancy. First, our due date distribution may be inaccurate, as referenced earlier in the report; if the distribution assigns due dates earlier than OSP actually sees, the model will produce late proposals. Second, the process time distributions in the model were based on estimates provided by the SPAs, and it is highly likely that these times are not exact; process times that are too slow would also produce late proposals in the model. However, the low SPA utilization rates (discussed below) make it unlikely that the estimated times err on the slow side, so we believe an inaccurate due date distribution is the main factor contributing to this discrepancy.

Our model output showed all of the SPAs as underutilized, with utilization rates of 48%, 71%, and 76%. OSP employees have stated that this is not the case and that utilization should be 100%. The low rates are likely the result of the process times in the model being faster than they are in real life: when a SPA in the model finishes a proposal quickly, he or she must wait idle for another proposal to enter the system, lowering utilization.

Based on the data provided by OSP, we had assumed that 80% of submitted proposals were funded. This figure is not accurate; the actual rate is closer to 30-40% (or less). In addition, OSP employees confirmed that the dollar values assigned to proposals in the model were much higher than the values OSP actually sees.

Data Corrections

To correct the data issues discussed above, we made a number of adjustments to our model. First, we changed the way due dates are assigned. Originally they were drawn from a distribution; they are now set automatically to one year from the time an entity enters the system. While this does not accurately reflect what occurs at OSP, we had no way of obtaining correct due date data within the available time frame and needed to ensure that all proposals finished processing before their due dates. If this model is to be used further as a tool for adjusting OSP's processes, a correct due date distribution will need to be calculated.

After changing the due date assignment, the focus shifted to increasing utilization rates. We determined it reasonable to treat 80-85% modeled utilization as equivalent to 100% real-world utilization, since SPAs spend some part of each day on activities outside the pre-award process, such as talking with coworkers, getting coffee, or going to the restroom. For the remaining analysis we therefore equate 80-85% utilization with 100% utilization. To raise SPA utilizations, we increased each SPA's processing time distributions to those listed in Table 29.

Table 29: New Process Times

SPA             Percent of Proposals    Process Time Distribution (hours)
Steve Kirkham   71%                     St1: tria(2.8, 3.5, 3.92)
                29%                     St2: tria(0.7, 3.5, 5.32)
Cathy Knock     61%                     CNR: tria(0.23338, 1.4, 5.6)
                39%                     College of Ag: tria(1.4, 4.2, 8.4)
Kelly Morgan    6%                      Other: tria(0.7, 7, 8.4)
                29%                     Arts & Humanities: tria(0.7, 2.1, 4.2)
                34%                     Institutes: tria(2.8, 4.9, 14)
                26%                     Education: tria(4.2, 7, 13.3)
                6%                      Center for Adv. Energy Studies: tria(5.6, 5.6, 5.6)

In addition to altering the process time distributions, we added a series of hold and decide modules that delay entities before they enter the SPA review process, in order to manage the queue buildup at each SPA. The hold module releases an entity whenever any SPA's queue is shorter than two; the decide modules then determine which SPA has fewer than two proposals in queue and send the released entity to that SPA. In effect, this mimics a rule under which SPAs assist one another by reviewing proposals from colleges they do not normally work with. We also smoothed the assignment of entities by sending 33.33% of incoming entities to each SPA. Using the distributions in Table 29 together with this assistance rule, we increased SPA utilizations to 87%, 85%, and 89%. Again, if this model is to be used for further study of OSP's pre-award process, accurate data on SPA processing times will need to be gathered. A sketch of the routing rule appears below.
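The following is a minimal sketch, in plain Python rather than Arena, of the hold-and-decide routing logic described above. The queue threshold of two comes from the report; the function and variable names are our own:

    from collections import deque

    QUEUE_LIMIT = 2  # release a held proposal only when some SPA queue is below this

    # One FIFO queue per SPA; in Arena these are the review-process queues.
    spa_queues = {"Steve": deque(), "Cathy": deque(), "Kelly": deque()}
    hold_area = deque()  # the hold module: proposals waiting to be routed

    def try_release():
        """Mimic the hold + decide modules: while some SPA has fewer than
        QUEUE_LIMIT proposals queued, route the next held proposal to the
        SPA with the shortest queue."""
        while hold_area and min(len(q) for q in spa_queues.values()) < QUEUE_LIMIT:
            proposal = hold_area.popleft()
            target = min(spa_queues, key=lambda spa: len(spa_queues[spa]))
            spa_queues[target].append(proposal)

    # Example: six proposals arrive while all SPA queues are empty;
    # each SPA ends up with two proposals.
    for i in range(6):
        hold_area.append(f"proposal-{i}")
    try_release()
    print({spa: list(q) for spa, q in spa_queues.items()})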

Correcting the number of proposals denied was simply a matter of changing a decide module's percentage from 80% to 40%. Fixing the dollar values assigned to proposals was more complex. Data on actual proposal values was not available and would need to be gathered to determine an accurate distribution from which the model could assign dollar values. To facilitate a reasonable net present value analysis, the model's average total awarded output was scaled by a percentage until the average across the 30 iterations (years) equaled about $75,000,000. Since OSP has historically received $60-80 million annually, $75 million was judged accurate enough for a reliable NPV analysis. Note that this adjustment was made prior to the May 7 meeting, so the NPV analyses discussed in the previous sections of this paper already reflect it.
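A minimal sketch of the scaling step described above; the raw totals below are invented placeholders, not actual model output:

    # Calibrate the model's awarded-dollar output so the mean across
    # iterations matches OSP's historical ~$75M annual award volume.
    TARGET_MEAN = 75_000_000

    # Placeholder: average total awarded per iteration, as the raw model reported it.
    raw_totals = [180_000_000, 150_000_000, 210_000_000]  # invented numbers

    scale = TARGET_MEAN / (sum(raw_totals) / len(raw_totals))
    calibrated = [t * scale for t in raw_totals]

    print(f"scale factor: {scale:.3f}")
    print(f"calibrated mean: ${sum(calibrated) / len(calibrated):,.0f}")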

The end result of these efforts is a model that more closely resembles OSP's actual results, as represented in Table 30. The model now produces greater than 80% utilization for all SPAs, zero late proposals, and an award rate of 40% of submitted proposals (not including proposals that entered the post-award process already awarded, drafts, or preproposals).

Table 30: New Base Output Values

                     New Base
Number Late          0
Number Denied        700
Number Awarded       463
Cathy Utilization    87%
Kelly Utilization    85%
Steve Utilization    87%

New Model Variations

Additional Inputs

Once the new base model was designed, it was necessary to model the increase in incoming proposals that OSP is currently experiencing. To model this influx, a create module was added that increases the number of entities in the system by 43%.

Table 31: New Base with Additional Inputs

                     New Base    Base with 43% Increase in Proposals
Number Late          0           0
Number Denied        700         767
Number Awarded       463         506
Cathy Utilization    87%         93%
Kelly Utilization    85%         93%
Steve Utilization    87%         95%

The result was as expected: more proposals were put through the system, but SPA utilization rose beyond what the SPAs can sustain. Recall that, in the model, 80-85% utilization corresponds to full real-world utilization, so modeled rates of 93-95% exceed 100% in practice.

Additional SPAs

To return each SPA to the target utilization of approximately 80% under the increased proposal flow, we added SPAs to the system one at a time until utilization rates fell back to approximately 80%. We assumed that each new SPA's process times would be a combination of the slowest process times of the existing SPAs.

Table 32: Additional SPAs

                     New Base    Base with 43% Increase in Proposals    With 2 Additional Employees
Number Late          0           0                                      0
Number Denied        700         767                                    1009
Number Awarded       463         506                                    668
Cathy Utilization    87%         93%                                    88.20%
Kelly Utilization    85%         93%                                    81.08%
Steve Utilization    87%         95%                                    80.11%
New SPA 1            -           -                                      84.77%
New SPA 2            -           -                                      79.03%

We found that two additional SPAs are needed to bring SPA utilization back to around 80%. The NPV of the additional SPAs is positive; therefore, hiring two additional SPAs would both improve the current process and be financially feasible given the current increase in proposals flowing through the system. A rough queueing sketch of this add-one-at-a-time approach follows.

Conclusion

The University currently has a complex PI submittal and pre-award system that is operating inefficiently. To improve the system, OSP should consider implementing some or all of the recommendations above. A continuation of this project would allow for better data gathering and more reliable model outputs, and further research should be done on the available options and their feasibility at the University before any recommendations are implemented. This project was an excellent learning experience for all parties and a great opportunity for our team.

Sources:

Cayuse Inc. "Cayuse424: Frequently Asked Questions." 1 Jan. 2009. Accessed 5 May 2009. http://www.cayuse.com/solutions/product-faq.php

Office of Sponsored Programs. "OSP Responsibilities." 1 Jan. 2009. Accessed 14 May 2009. http://www.uro.uidaho.edu/default.aspx?pid=109899