The relationship between timely delivery of vocational rehabilitation services and subsequent federal disability benefit application and receipt
IZA Journal of Labor Policy volume 3, Article number: 15 (2014)
Federal/state vocational rehabilitation (VR) agencies offer services to individuals with disabilities that may help them remain in the labor force and avoid entering Social Security Administration (SSA) disability programs. We assess how the availability of VR services within an agency at the time an individual applies to receive VR services is related to subsequent application and receipt of SSA disability benefits. We find that individuals have a higher likelihood of subsequently applying for and receiving disability benefits when they apply in months that the VR agency serves a lower percentage of applicants or has a longer average wait for services.
JEL classification: H55, H75, J29
1 Introduction
Federal/state vocational rehabilitation (VR) agencies provide rehabilitation and employment-related services to individuals with significant disabilities with vocational goals. Each state has either one or two VR agencies, and in 2010, more than 1.4 million individuals received services from VR agencies (U.S. Department of Education 2012). When demand exceeds available funding, VR agencies must ration services, either by making qualified applicants wait or, in some instances, not providing services at all. Often, agencies provide services on a first-come, first-served basis. But, when agencies are severely financially constrained, they operate in an order of selection (OOS) status, meaning that they have developed a service delivery plan that prioritizes services based on applicant need and have had that plan approved by the Department of Education (Silverstein 2010). Operating in OOS is not uncommon; 41 of the 72 state VR agencies we consider in our analysis reported being in OOS during at least one year from fiscal year 2002 through 2005.
An important question is whether VR services act as an effective early intervention support to keep people in the labor force and thereby deter or delay entry into the two disability programs administered by the Social Security Administration (SSA): Social Security Disability Insurance (SSDI) and Supplemental Security Income (SSI). There is some evidence that most VR applicants are in the labor force (though not necessarily employed) and not already receiving disability benefits when they first seek VR services. For example, among a cohort of VR applicants in 2003, only 6.3 percent were SSDI beneficiaries at the time of their application (Stapleton and Martin 2012)1. However, there is no information on how many might have applied for SSDI in the absence of VR services, particularly when resources were constrained. Prior analysis has found a negative relationship between the employment outcomes of VR applicants already receiving SSDI benefits and how long they must wait to receive VR services (Honeycutt and Stapleton 2013).
This article provides a first look at the association between the availability of VR services at the time non-beneficiary applicants apply to receive them and the subsequent likelihood of applying for and receiving SSDI and SSI. We first provide descriptive statistics about the availability of services among applicants in fiscal years 2002 through 2005. Then, we examine the relationship between two ex post measures of VR service availability and four SSA outcomes 48 months after VR application (application to SSDI, application to SSI, receipt of SSDI, and receipt of SSI), using a regression model to control for individual characteristics, the state of the economy, and time-invariant factors specific to individual agencies. Our measures of VR service availability include the percentage of applicants who received a service plan from the VR agency and the mean duration from application month to service plan for those with a service plan, each defined based on the month and agency of application. We present estimates separately by OOS status, reflecting the fact that agencies in OOS face very different constraints and presumably might have very different mechanisms for deciding which applicants to serve first, if at all.
Our findings offer evidence of a strong association between VR service availability and subsequent federal disability benefit claiming patterns. In states not operating in OOS, VR applicants who apply for services in months where services are less readily available are more likely to subsequently apply for and receive disability benefits. The estimated coefficients for SSDI application and receipt are larger and more significant than those for SSI, and estimates for application are larger and more significant than those for receipt. The evidence is consistent with the hypothesis that limitations on access to VR services increase applications and entry into both SSA programs, but it is also possible that these relationships are explained by other factors.
In the next section, we explain why we might expect an association between availability of VR services and federal disability benefit claiming. In Section 3, we detail our data sources, key variables of interest in our analysis, and regression specification. Section 4 contains our findings, and Section 5 concludes with a summary of findings and discussion of policy implications.
2 The relationship between VR service access and SSA program outcomes for non-beneficiary applicants
We hypothesize that limited access to VR services by non-beneficiary applicants increases the chance that they will apply for and, in some cases, receive SSA disability benefits. The hypothesis rests on the converse premise that timely provision of VR services can help some such applicants (re)enter the labor force rather than seek SSA disability benefits. Testing this hypothesis from available data is challenging for two reasons. First, it requires development of one or more reliable measures of access to services. Second, it requires the ability to separate the effect of exogenous variation in those measures on SSA program outcomes from other factors that might account for observed relationships between the access measures and the outcomes.
Available federal data allow the construction of two monthly agency-level measures of ease of VR service access for virtually all agencies. Both are based on the signing of an individualized plan for employment (IPE), which is a documented service plan developed by VR agency staff in collaboration with an eligible applicant; a signed IPE is a formal indication that the applicant has received, at a minimum, initial assessment services. The first monthly access measure is the percentage of all non-beneficiary applicants to an agency in the month who receive an IPE before closure. Many VR applicants have their cases closed without ever generating an IPE; in our sample, 43 percent of non-beneficiary applicants from 2002 through 2005 did not have an IPE when their case was closed. The second monthly access measure is the mean duration from application date to IPE signing date among those with an IPE. In our sample, the median of this value across all agency-month combinations was 2.6 months.
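Concretely, both agency-month measures can be computed from closure records alone. The sketch below is a simplified illustration, not the authors' actual code; the record layout and the use of integer month indices are assumptions made for the example:

```python
from collections import defaultdict

def access_measures(records):
    """Compute the two agency-month access measures from closure records.

    Each record is (agency, app_month, ipe_month), where ipe_month is None
    if the case closed without an IPE, and months are integer indices.
    Returns {(agency, app_month): (pct_with_ipe, mean_wait_months)};
    mean_wait is None when no applicant in the cell received an IPE.
    """
    cells = defaultdict(list)
    for agency, app_month, ipe_month in records:
        cells[(agency, app_month)].append((app_month, ipe_month))
    out = {}
    for key, cases in cells.items():
        waits = [ipe - app for app, ipe in cases if ipe is not None]
        pct = 100.0 * len(waits) / len(cases)
        mean_wait = sum(waits) / len(waits) if waits else None
        out[key] = (pct, mean_wait)
    return out

# Hypothetical closures for agency "A", all applying in month 0:
records = [
    ("A", 0, 2),    # IPE signed 2 months after application
    ("A", 0, 4),    # IPE signed 4 months after application
    ("A", 0, None), # closed without an IPE
]
measures = access_measures(records)
# pct with IPE ≈ 66.7, mean wait = 3.0 months
```

Note that both measures are defined ex post: they can only be computed once the cases of everyone applying in the same agency-month have closed, which is why the analysis requires closure data extending well beyond the application cohorts.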
Of course, these measures do not completely capture the extent to which non-beneficiary applicants have access to VR services. Access also depends on the perceptions of potential applicants about the availability of VR services, which affects the number that will apply, and on the nature and quality of the services that are delivered after IPE completion.
Access for those who apply also depends on how VR agencies assign priority to their applicants. If an agency serves all applicants meeting minimum eligibility criteria on a first-come, first-served basis, then constraints on an agency’s resources will affect access for all applicants in essentially the same manner2. First-come, first-served may be a reasonable assumption for agencies not in OOS; we have no information that would suggest otherwise. It is clearly wrong for agencies in OOS, however. Under OOS, agencies are required to give priority to those with the most significant disabilities (including current SSDI and SSI beneficiaries) who otherwise meet the minimum eligibility criteria (Hager 2004; Silverstein 2010). In effect, they move to the front of the line. If resources are insufficient, those with less significant disabilities will not be served at all, even if they meet the minimum eligibility criteria.
The fact that some agencies operated in OOS status in at least some years of the sample period complicates the interpretation of our VR access measures, perhaps especially for those most likely to apply for SSA benefits. It is possible, for instance, that those non-beneficiary applicants most likely to qualify for SSA benefits are among those who move to the front of the line in agencies that are in OOS status. Hence, these monthly agency-level measures of access might be better measures of service access for those most likely to apply for SSA benefits when agencies are not in OOS than when they are.
Beyond the hypothesized effect of variation in VR service availability on SSA outcomes, there are numerous other reasons why the two access measures used here might be associated with SSA outcomes. Some relate directly to variation in the characteristics of the applicants in any given month. Some applicants might be ineligible for services regardless of service availability; require an extensive assessment and planning period because of the nature of their conditions or other circumstances; experience medical deterioration that prevents further pursuit of employment; find employment without VR services and no longer need them; or simply fail to stay in contact with the VR agency for a variety of possible reasons. Other factors relate to the agency’s service delivery model: some agencies choose to serve relatively few clients intensively, whereas others choose to serve more clients with less intensive services. A third set of factors relates to the economic and service environment. The strength of the economy and the availability and quality of non-VR services might affect the number of non-beneficiary applicants in each month, the state resources available to serve those applicants, and the opportunities for applicants to (re)enter the labor force rather than seek SSA benefits. These effects all induce relationships between the measures of service access and SSA outcomes that are likely to be confounded with the effect of exogenous variation in access on SSA outcomes.
There is also a countervailing hypothesis about the direction of the effect of exogenous variation in service access on SSA program outcomes for at least some non-beneficiary applicants. Individuals with significant disabilities might discover through their receipt of VR services that engaging in work at a substantial level is unattractive, or impossible, because of available positions, their own skills, or their disability. For such applicants, greater and more rapid access to VR services would increase the likelihood of subsequent SSA benefit application and receipt3. In general, we would expect VR counselors to provide assistance that is in the best interest of their clients, and in some cases that may mean guiding them toward SSDI or SSI application. Thus, in certain cases it is possible that more timely delivery of VR services hastens application to SSDI or SSI.
It is also important to distinguish between the effect of exogenous variation in access on application for SSA benefits and receipt of SSA benefits. We expect effects on receipt to be smaller than effects on applications, because not all VR applicants who are induced to apply for SSA benefits by limits on service access will be successful. The fact that they applied for VR services suggests that they are on the margin between staying in the labor force and claiming disability benefits. Hence, the likelihood that their SSA application will be successful might be lower than for the average applicant for SSA benefits.
Because of the complexity of the relationships between our measures of access and individual SSA outcomes, we are unable to rigorously estimate the causal effects of access to VR services on SSA outcomes. We are, however, able to partially net out the effects of some of the other factors that might explain these relationships, as described in the next section.
3 Data and methods
To support the analysis, we built a file using administrative data from the Rehabilitation Services Administration (RSA) and SSA, matched at the individual level. In this section, we describe these data, explain how we constructed measures of VR service availability, present our sample selection criteria, and then present the empirical model that we estimate.
3.1 Linked administrative data files
Our first task was to identify VR applicants and track them from their application date for as long as we could. We identified first-time non-beneficiary applicants for VR services from 2002 through 2005 and followed their SSDI and SSI program activity for the 48 months after VR application using a constructed dataset containing linked administrative data from the RSA Case Service Report (RSA-911) files, the SSA 831 File, and the 2009 SSA Ticket Research File (TRF, now called the Disability Analysis File).
The RSA-911 files consist of annual cohorts of VR closures, submitted by VR agencies to the RSA about all VR clients who had their cases closed by a VR agency during the fiscal year. We reoriented these closure records based on the VR application date contained in the files, combining data from fiscal year (FY) 2002 through FY 2009, the most recently available when this study began. We then identified all VR applicants in FY 2002 through FY 2005. Because we had closure data through 2009, we captured all records closed within four years of VR application as well as some records that were closed as long as seven years later. Nonetheless, a small share of records for applicants in this period is missing from our file; below, we provide additional information to gauge the sensitivity of the results to the missing records.
Using the RSA-911 files, we observed for each individual whether an IPE was developed as well as the time between the initial VR application and the date the IPE was signed. We also obtained information about VR applicants including demographics and disability status. We further identified the agency providing services as either combined (it is the only VR agency in the state), blind (it is one of two agencies in a state, offering specialized services only to blind or low vision individuals) or general (it is one of two agencies in a state and serves all individuals other than those who are blind).
We then linked records from these four annual cohorts of VR applicants to SSA administrative records to identify the timing of application for, and receipt of, SSDI and SSI4. SSA 831 files were used to determine benefit application date, and the SSA 2009 Ticket Research File (TRF09) was used to identify the receipt of disability benefits. Following VR applicants’ disability benefit applications and receipt through 2009, we constructed binary measures for (1) application for SSDI, (2) application for SSI5, (3) receipt of SSDI, and (4) receipt of SSI within 48 months of the VR application date6. We examined VR applicants’ decisions to apply for SSDI and SSI benefits separately because the nonmedical eligibility criteria differ for the two types of benefits; the former requires significant work history, whereas the latter is means tested and limited to individuals with disabilities who also have low incomes and assets. To ensure that the follow-up period was the same across the four cohorts, we truncated benefit data at 48 months after the VR application month, the period we are able to observe for the latest VR applicants in the analysis cohorts. One could ultimately extend our analysis to explore the timing of benefit application and receipt relative to VR service provision, but we confine our attention to application or receipt in any month within this 48-month window.
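The construction of the binary 48-month outcome flags can be sketched as follows. This is an illustrative simplification with integer month indices, not the actual matching code used with the SSA 831 and TRF09 files; the endpoint convention (counting events in the VR application month itself) is an assumption:

```python
def within_48_months(vr_app_month, event_months):
    """Return 1 if any SSA event (benefit application or receipt) occurs
    within 48 months of the VR application month, else 0.

    Months are integer indices (e.g., months since January 2000), and
    event_months lists the months of the relevant SSA events for one
    person. Events in the VR application month itself are counted; this
    endpoint convention is an assumption for the sketch.
    """
    return int(any(0 <= m - vr_app_month <= 48 for m in event_months))

# An SSDI application 40 months after VR application falls in the window:
assert within_48_months(10, [50]) == 1
# One 60 months later does not, so this person's flag would be 0:
assert within_48_months(10, [70]) == 0
```

Truncating every cohort at the same 48-month horizon, as the text describes, keeps the flag comparable across the FY 2002 through FY 2005 application cohorts.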
The final piece of administrative data necessary for our analysis was an estimate of whether the VR applicant met the nonmedical SSDI eligibility criterion at the time of VR application. To be eligible for SSDI, individuals must first work enough to attain “disability-insured” status; in essence, they must have substantial earnings in 5 of the last 10 years7. SSA staff supported our analysis by using SSA software designed to determine disability-insured status based on the earnings histories of the VR applicants in the sample8. Because the method used is only an approximation of insured status, some cases on the margin might be misclassified, but such errors are likely rare. Using the indicators generated allows us, in essence, to limit our analysis of SSDI application and receipt behavior to VR applicants who were disability insured when they applied to VR9.
As described previously, agencies in OOS have a formalized service delivery process that differs from agencies not facing such resource constraints, in ways that substantively affect the relationship between our variables of interest. To account for this, we identified which agencies were in OOS in a given fiscal year using an OOS identifier in RSA’s quarterly cumulative caseload reports (RSA-113). Within our sample period (2002 through 2005), this allows us to estimate models that are conditional on OOS status, using the 29 agencies that were in OOS status for the whole period (“always OOS”) and, separately, the 34 agencies that were not in OOS status during any part of the whole period (“never OOS”). For completeness, we estimate a third model for the residual group: the 10 agencies that were in OOS status in at least one year, but not all years. Of course, differences between the estimates across these groups of agencies might reflect differences other than OOS (such as the state’s economic, political, fiscal or cultural environment), though our model controls for such differences to some extent.
3.2 Subpopulation selection
Because our interest is in understanding the association between VR service accessibility and subsequent disability benefit application and award, we limited our sample to individuals who applied to VR agencies between 2002 and 2005 who had no record (according to an element in the RSA-911 data) of previously applying for VR services in the 36 months preceding the first application in the sample period. For those with multiple VR records, we kept only the earliest application record within this period. The intent of this limitation was to capture cases who were new to the service system. Because of incomplete information about past applications, some applicants in our “first-time” applicant sample might have applied to VR in earlier years, but it is very unlikely that they had applied within the previous three years. Between FY 2002 and FY 2009, 4,108,832 individuals had VR case closures and validated Social Security numbers; the latter is necessary for matching to the TRF09. We excluded 929,801 individuals because they applied to VR before FY 2002 and an additional 1,120,392 individuals who applied after FY 2005 (Table 1).
The remaining 2,058,639 applied for VR during our study period and represent 80 percent of the total 2,561,952 applicants during this period, as identified in the Department of Education’s RSA-113 data. There are two reasons that 20 percent of cases are missing. First, some individuals may take longer to close from services; based on our own comparisons of RSA-911 closures across multiple years with published numbers of applicants from the RSA-113 data, about 92 percent of VR applicants close within four years of application and 98 percent close within seven years. We currently have no information on the characteristics of the cases excluded because they had not yet closed. The duration of service receipt for these cases might mean that some have funding for postsecondary education services, some have disabilities requiring very lengthy services, and some might be “lost” to the VR agency whose staff had yet to close the case. Second, some RSA-911 records cannot be matched to the TRF because of inconsistent matching information, such as Social Security number, gender, or date of birth. The missing records are a potential source of bias in our estimates, especially if individuals whose cases are closed more than four years after application behave differently in response to changes in VR service availability than those whose cases close more quickly. We have no reason to think, however, that any bias would be substantial.
We further limited our sample to VR applicants age 18 to 64 who (a) were not receiving SSDI or SSI at the time of VR application and (b) had not applied for SSDI and SSI in the three years prior to VR application. The purpose of the latter restriction was to limit our sample to VR applicants who did not already have SSA disability benefit applications “in the pipeline” at the time they applied to VR, as SSA decisions do not consider VR service availability. This restriction also ruled out another potentially interesting group of applicants: those who were already receiving SSI when they sought VR services, but not receiving SSDI because of insufficient work history; this group could potentially use VR services to return to work and become insured for SSDI10.
More minor restrictions, detailed in Table 1, excluded individuals whose reason for VR closure was death, who resided in a U.S. territory, or who applied to the Kentucky VR agency (for a data reason)11. We also excluded 20,963 cases in which the number of applicants in a given month for an agency was fewer than three, to facilitate the calculation of our agency-month service availability variables and ensure that one’s own value did not contribute too substantially to the agency average. After applying the selection criteria described above, our final sample for the analysis of SSI outcomes consisted of 1,184,810 first-time VR applicants from 2002 to 2005. The always-OOS sample includes 451,477 non-beneficiary applicants, the never-OOS sample includes 571,252, and the intermittently OOS sample includes 162,081. Just about half of the sample was estimated to be insured for SSDI; the sample for analysis of SSDI outcomes is limited to these 595,884 individuals.
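As a quick arithmetic check, the subgroup counts reported above are internally consistent with the full SSI-analysis sample and with the statement that roughly half of the sample is disability insured:

```python
# Sample counts as reported in the text.
always_oos, never_oos, sometimes_oos = 451_477, 571_252, 162_081
total = always_oos + never_oos + sometimes_oos
assert total == 1_184_810  # the full SSI-analysis sample

# The SSDI-analysis sample is the disability-insured subset,
# which is indeed "just about half" of the total (about 50.3%).
insured_share = 595_884 / total
assert 0.50 < insured_share < 0.51
```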
3.3 Benefit application and receipt after applying for VR services
Many individuals in our sample applied for SSA benefits in the years immediately after seeking VR services. In the first 12 months after application, 14 percent of those classified as disability insured applied for SSDI benefits; this number increased steadily each year to 26 percent by month 48 after VR application (Figure 1). We also found that 8 percent of the full sample of first-time VR applicants applied for SSI benefits within 12 months of seeking VR services, and 16 percent by month 48. The corresponding percentages for benefit receipt are lower. Just 3 percent of the disability-insured VR applicants received SSDI benefits within 12 months of their VR application, increasing to 10 percent by month 48 (Figure 1). Less than 1 percent of all first-time applicants received SSI benefits within 12 months, increasing to less than 4 percent by month 48. It is important to note that the data on benefit receipt by month 48 are incomplete because some awards are made retroactively only long after the application is filed. Although our definition of benefit receipt includes retroactive benefits received during the period, it does not include retroactive benefits that had not yet been awarded when the sample was drawn.
3.4 Empirical model
For each of our four outcomes, we estimate a regression model that predicts the association between disability benefit-claiming behavior for first-time VR applicants and VR service availability. We use regression to at least partially isolate the variation in SSDI and SSI application and receipt attributable to availability of VR services, after controlling for other individual, agency, and year characteristics. Because an agency’s use of OOS likely affects the relationship between the wait time measures and SSA outcomes, we estimate the model separately for agencies never in OOS during the sample period, agencies always in OOS, and those in OOS in some years, but not all.
The regression model takes the following form:

Yiat = b0 + b1 IPEat + b2 WAITat + b3 Xi + b4 Vat + b5a + b6y + eiat
The dependent variable Yiat is binary and indicates one of our four outcomes: whether an individual i who applies for services from agency a in month t subsequently applies for or receives SSDI or SSI within the next 48 months. Our regression is specified as a linear probability model for ease of coefficient interpretation. IPEat and WAITat are the two measures of VR service availability discussed previously, defined using information from all non-beneficiary VR applicants who applied to the same agency in the same month. IPEat is the percentage of those applicants who received a formal service plan from the VR agency prior to case closure. WAITat is the mean duration, in months, from application date to IPE date for those receiving an IPE. If the control variables served to completely isolate the components of b1 and b2 that reflect the effect of access on each SSA outcome, our hypotheses predict that b1 would be negative and b2 would be positive. However, these coefficients might also be confounded by other relationships between the two measures and the SSA outcomes.
Agency fixed effects (b5a) serve to control for two sources of variation in outcomes that are likely correlated with the measures of VR service availability but not reflective of agency resource constraints. The first source involves features of the agency’s service delivery model that do not change over time but that might affect the likelihood of providing an IPE and average wait time as well as the SSA outcomes. Individuals seeking services do not have a choice of provider but rather seek services from the relevant VR agency in their state, of which there is either one (combined) agency or two (blind and general) agencies. In the case of combined agencies, agency fixed effects are equivalent to state fixed effects, whereas in the case of separate agencies, there is a fixed effect for each of the two agencies in the state. We do not know of agencies that changed their service models in significant ways during this period. The second source of variation captured by agency fixed effects is variation in environmental factors including the state’s economy, policies that affect employment among individuals with disabilities, and cultural factors that might affect mean outcomes for individuals in a state.
Because we include agency fixed effects, the coefficients of IPEat and WAITat are designed to reflect only sources of within-agency variation in these variables. We believe that the main source of this variation during this period relates to the demand for VR services relative to the agency’s resources. This could occur because of seasonal variation (such as fewer individuals applying during end-of-the-year holidays or a surge in applications at the end of the school year), periodic marketing or referral activities affecting the number of applications to an agency, or having more limited resources toward the end of the fiscal year. Other potential sources of variation include variation in VR applicant characteristics; monthly variation in the economy; and more gradual, but possibly substantial, changes in national policy. The latter include innovations arising from the Ticket to Work and Work Incentives Improvement Act of 1999, which were being implemented by both SSA and the Centers for Medicare & Medicaid Services during our study period.
The other variables in the regression control for individual characteristics (Xi), the monthly unemployment rate in the agency’s state (Vat), and year fixed effects (b6y; the index for year y is dependent on t, in that all values of t in a single fiscal year map to the same value for y). Characteristics in Xi include age at VR application (categorical), gender, race, educational attainment, employment status at the time of VR application, and primary disabling condition reported to the VR agency12.
The standard errors, associated p-values, and adjusted R-squared values are all corrected for the heteroskedasticity implicit in a linear probability model and for clustering at the agency level.
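The role of the agency fixed effects can be illustrated with the Frisch-Waugh-Lovell logic: including a dummy for each agency is equivalent to demeaning the outcome and the regressors within each agency and running OLS on the demeaned data, so the estimated coefficients reflect only within-agency variation. The single-regressor sketch below is illustrative only; the actual model includes both access measures, the individual controls, the state unemployment rate, year fixed effects, and cluster-robust standard errors:

```python
from collections import defaultdict

def within_slope(data):
    """Estimate the coefficient on a single regressor in a model with
    agency fixed effects, by demeaning y and x within each agency and
    running OLS on the demeaned data (Frisch-Waugh-Lovell).

    data is a list of (agency, x, y) tuples; think of x as IPEat and
    y as a binary SSA outcome, both hypothetical values here.
    """
    groups = defaultdict(list)
    for agency, x, y in data:
        groups[agency].append((x, y))
    sxy = sxx = 0.0
    for cases in groups.values():
        mx = sum(x for x, _ in cases) / len(cases)
        my = sum(y for _, y in cases) / len(cases)
        for x, y in cases:
            sxy += (x - mx) * (y - my)
            sxx += (x - mx) ** 2
    return sxy / sxx

# Two hypothetical agencies with different baseline outcome levels but
# the same within-agency slope of -0.01 per percentage point of IPEat:
data = [("A", 40, 0.60), ("A", 60, 0.40),
        ("B", 50, 0.30), ("B", 70, 0.10)]
slope = within_slope(data)  # -0.01
```

The point of the example is that the cross-agency level differences (agency B's lower baseline outcome) drop out entirely, which is exactly why, as the text notes, the coefficients on IPEat and WAITat reflect only within-agency variation in service availability.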
3.5 Descriptive statistics for measures of VR availability
Table 2 highlights the distribution of the VR availability measures across all application month-agency observations included in our analysis (for all first-time non-beneficiary applications in 2002 through 2005); the results for each individual year (not reported) are quite similar. Each of the 73 agencies in our analysis has 48 monthly observations (12 months a year for 4 years), excepting agency-month combinations that had fewer than 3 applicants. The leftmost column shows that in some months, all individuals who applied for services at an agency received an IPE, whereas in other months, no applicants received an IPE before closure. The median percentage receiving an IPE was 58, meaning that at least 42 percent of applicants in half of the months never received an IPE. In some months, applicants who received an IPE did not have to wait at all (their IPE was completed in the same month). The median of the mean agency-month wait time was about 3 months, but in at least one agency-month, those who applied waited an average of 22 months to receive an IPE.
Though we do not show the results in the table, we investigated the relationship between IPEat and WAITat. This analysis showed that agencies in the top quartile of offering an IPE to applicants had lower mean wait time than did agencies in the bottom quartile of offering an IPE, with a monotonic relationship across quartiles. The mean wait in the bottom quartile of IPEat was 6.4 months, compared with 2.6 months in the top quartile.
As expected, there is a strong correlation between OOS status and each of the VR availability measures. The right columns of Table 2 stratify the service delivery measures by OOS status and highlight the effect that being financially constrained has on providing services. States that never operated in OOS during our study period had a higher proportion of applicants who received an IPE, but even for this group the 75th percentile is only 74.1 percent; that is, in 75 percent of the agency-months, at least 25 percent of the applicants did not receive an IPE. The comparable figure for agencies always in OOS is 60.9 percent. Mean wait times also vary across groups as anticipated, but there is also considerable variation even within the group never in OOS, where wait times are lowest: the 25th percentile is 1.6 months and the 75th percentile is 3.4 months. In contrast, the 25th and 75th percentiles for the always in OOS group are 2.7 months and 5.7 months, respectively. Percentiles for the sometimes in OOS group are between those for the other two groups.
4 Results
In this section, we begin by describing the bivariate relationships between our measures of VR service availability and subsequent application and award of disability benefits. We then present the results of our main empirical model.
4.1 Bivariate relationships between VR service availability measures and SSA outcome measures
As we would expect if the primary factor determining the direction of the bivariate relationships is the hypothesized effect of VR access on SSA outcomes, in agencies never in OOS we find a negative relationship between agency-month values of IPEat and each of the SSA outcomes (a higher proportion served is associated with a lower likelihood of application). The relationship between WAITat and each of the SSA outcomes is less strong, though not inconsistent with our predictions. Figure 2 shows the bivariate relationship between the outcome measures (percentage applying for SSDI, receiving SSDI, applying for SSI, and receiving SSI in the 48 months following VR application) and groups of agencies never in OOS defined by quartiles of IPEat; Figure 3 shows the analogous statistics for WAITat. The SSDI application percentage declines with the IPEat quartile and increases with the WAITat quartile. The bivariate relationships between the SSDI receipt percentage and the IPEat and WAITat quartiles follow similar directional patterns, but are not monotonic and not as strong. The relationships between the SSI outcomes and IPEat are very similar to those for SSDI, but at lower levels. The relationships between the SSI outcomes and WAITat follow a U-shaped pattern, with higher SSI outcome percentages at the two WAITat quartile ends than for the middle two groups, and the relationships are considerably weaker than those observed for SSDI.
Though not shown, some of these patterns are also seen for applicants to agencies sometimes or always in OOS, but less consistently. For agencies always in OOS, the IPEat quartiles were monotonic and decreasing for SSDI and SSI applications, with no clear relationship for SSDI or SSI receipt; the WAITat quartiles were increasing for all SSA outcomes, though not always monotonically. For agencies sometimes in OOS, the patterns were inconsistent for SSDI and SSI application for both of our independent variables of interest, but the relationship for IPEat and WAITat tended to be in the expected direction for SSDI and SSI receipt.
4.2 Regression estimates of the relationships between VR service availability measures and SSDI and SSI application and receipt
Among agencies never in OOS, where we expect to find the strongest relationship between our access measures and SSA outcomes, the likelihood of a non-beneficiary VR applicant applying for SSDI within 48 months of VR application decreases with the percentage completing an IPE, after controlling for WAITat as well as for other factors that might contribute to the bivariate relationship (Table 3, “agencies never in OOS” column)13. In addition, we find a weakly significant relationship (p-value between .05 and .10) between WAITat and SSDI application, controlling for IPEat and other factors. Results for SSDI receipt, as predicted, are in the same direction as those for application, but smaller in magnitude, and the coefficient on WAITat is more strongly significant (p-value below .05). For SSI, all coefficients have the expected sign, though only two are significant and not in a consistent pattern—WAITat is significant for application, while IPEat is significant for receipt. One possible explanation for the muted and inconsistent coefficients for SSI, relative to SSDI, is that the applicant population for the SSI estimates (all non-beneficiary applicants) includes a large portion of people not eligible for SSI because of that program’s income and asset requirements.
The results for the agencies sometimes or always in OOS are different; in general, the magnitude of the coefficients tends to be closer to zero and in most cases the estimates are not statistically significant for agencies in OOS. As described previously, because these agencies are actively managing their waiting lists in a manner requiring them to serve those with the most severe disabilities, they might serve non-beneficiary applicants most likely to apply for SSDI or SSI first, and may only serve those with less significant disabilities much later, if at all. Because of this, we would expect—and find—that the results for agencies in OOS are biased against finding the hypothesized effect for IPEat and WAITat. Based on conversations with a few agency leaders, another factor that might weaken the relationship between the two access measures and SSA outcomes is that counselors are more likely to discourage potential applicants with less significant disabilities from applying when wait lists due to OOS are long, knowing they will never be served.
We include the full sample results only for completeness, as other studies on VR outcomes often present results without consideration of OOS status. Using the full sample, we find evidence of a negative relationship between IPEat percentage and SSDI and SSI application as well as SSDI receipt, but no relationship between WAITat and our outcomes of interest.
For agencies not in OOS status, we find that in a bivariate context, limited VR service availability (a lower percentage of non-beneficiary applicants to the same agency in the same month who receive an IPE and a longer mean duration from application to IPE, a measure of access delay) is associated with higher SSA program application and receipt. After controlling for several factors that might partially explain the bivariate relationships, these relationships remain. Although other factors might account for the latter results, they are consistent with the hypothesis that more timely access to VR services for applicants not already receiving SSA program benefits reduces the likelihood that they will apply for and receive those benefits in the next 48 months. The results are substantially stronger for SSDI than for SSI. The weaker results for SSI might reflect the inclusion in the SSI analysis sample of many VR applicants who do not meet the SSI means test.
Of course, the results do not prove that greater access to VR services for non-beneficiary applicants would reduce SSA program application and allowance receipt. Our statistical methods and our focus on agencies never in OOS try to isolate variations in service availability that are plausibly exogenous to individual factors, but our results nonetheless remain associations.
Parallel analyses for agencies that were in OOS during all or part of our sample period found much weaker and less consistent evidence of a negative relationship between our access measures and SSA program outcomes. It seems likely that the way these agencies provide priority to applicants with the most significant disabilities undermines the validity of our access measures for those most likely to apply for SSA benefits.
One caveat to the magnitude of our estimates is that the time period over which we were able to follow individuals necessarily led to right-censoring on the outcome measures at month 48 after VR application. Censoring might be especially important for benefit receipt. If waiting for VR services induces VR applicants to apply for SSA benefits, it seems likely that SSA applications would be filed only some months after VR application. Receipt of SSA benefits may not occur until much later. During the study period, initial SSDI decisions took approximately 6 months, and many awards were made only after the appeal of an initial denial—a process that can take years (Lindner and Burdick 2013). Though not presented, our findings were qualitatively robust to alternative specifications, including both shorter and longer time frames of application and benefit receipt and consideration of an outcome measure that accounted for application to either SSDI or SSI (rather than considering each outcome separately).
There are several limitations to the analysis. First, we did not consider two groups of VR applicants that might be deterred from leaving the labor force and entering SSDI by timely access to VR services: those who had applied for SSDI and then applied for VR services while their SSDI applications were pending, and those who were not disability insured at the time they applied for VR services but became disability insured in the near future, perhaps with the help of VR services. Both groups are problematic to identify but could be very important. The first group is important because the SSDI determination process is lengthy, so some might receive VR services and return to work while waiting for a decision from SSA. The second group is important because VR services received by those who are not disability insured might be instrumental to gaining the work experience they need to become disability insured. It seems likely that the effect of VR service availability for these groups would be in the same direction as for the disability-insured VR applicants included in our sample, but smaller.
Second, we have focused on just one aspect of how resource constraints affect VR services: timely availability. Resource constraints can also affect SSA outcomes by their effects on other aspects of service quality, but no suitable measures of quality are available. It might be that our two measures of service availability are highly correlated with other dimensions of quality that are affected by resource availability, in which case the estimated coefficients reflect effects of resource constraints on SSA outcomes via effects on unobserved dimensions of service quality and overstate the impact of the timely availability of VR services per se.
Third, our model controls only for characteristics and the environment at the time of VR application. The broader environment for workers with disabilities could have changed in the years after VR application in ways that could have affected the decision to apply for federal disability benefits. For example, during the time period of our study, states were enacting programs such as Medicaid Buy-In and making broader systems change with the help of Medicaid Infrastructure Grant funding, which aimed to promote employment. Perhaps more importantly, the 48-month window of the last annual applicant cohort (2005) includes the recession that started in the last quarter of 2007. These factors might well be important for some VR applicants, but they would have to be correlated with the service availability measures even after accounting for our control variables to affect the estimated coefficients for the service availability measures; we have no reason to think that they are.
To illustrate the magnitude of the point estimates for agencies never in OOS, we considered how modest exogenous changes in our access measures would change SSDI application and receipt if we assume that the estimated coefficients are unbiased estimates of the effect of such exogenous changes over the range of the variable considered. The specific change we simulate is an increase of IPEat in each month in which the value is below the 90th percentile of the observed distribution to the 90th percentile, and a reduction in the mean waiting time in months that are above the 10th percentile to the 10th percentile. The selection of these percentiles was arbitrary, but can be considered to reflect attainable levels of service availability as they have, in fact, been observed in a substantial share of sample months. The exercise also assumes that the effects of the two changes are additive—an assumption that is built into the regression model, but not necessarily true. Given our assumptions, these changes would together yield 9,300 fewer applicants to SSDI in the following 48 months, a reduction of 11.0 percent, and 6,900 fewer recipients, a reduction of 24 percent. As these predictions are for four annual applicant cohorts, the mean reductions per annual cohort over the sample period are 2,325 applicants and 1,725 fewer recipients. These predictions are only suggestive because of the limitations that are inherent to our analysis.
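The arithmetic connecting the reported totals, per-cohort averages, and implied baseline counts is straightforward; the sketch below simply works through the figures quoted in the text (the baseline counts are back-calculated from the reported percentage reductions, which the paper does not state directly).

```python
# Arithmetic behind the reported simulation totals (four annual cohorts).
N_COHORTS = 4

fewer_applicants = 9_300  # predicted reduction in SSDI applicants over 48 months
fewer_recipients = 6_900  # predicted reduction in SSDI recipients

# Per-cohort averages quoted in the text.
print(fewer_applicants / N_COHORTS)  # 2325.0
print(fewer_recipients / N_COHORTS)  # 1725.0

# Baseline counts implied by the reported percentage reductions.
print(round(fewer_applicants / 0.110))  # 84545 applicants at baseline
print(round(fewer_recipients / 0.24))   # 28750 recipients at baseline
```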
Even if we assume the coefficients are unbiased estimates of the additive effects of exogenous changes to the two access measures, the estimated effects of these substantial changes in access on SSA outcomes would be modest relative to the number of annual SSDI applications and awards. From 2002 to 2005, SSA averaged around 2 million applications for SSDI each year, and approximately 900,000 of those applicants eventually received benefits (Social Security Administration 2013a). It should be noted that the agencies not in OOS include only about half of all agencies, and we do not have a plausible prediction for how expanding access to their non-beneficiary applicants would affect SSA outcomes. Even if we could double or triple the projected impact for the non-OOS agencies, however, the impact would still be small relative to the scale of all applicants and awards.
Yet, if VR services can divert even a few individuals from applying for federal benefits, cost savings could be substantial. Using administrative data from all four programs, Riley and Rupp (2012) estimated the present value of federal expenditures on SSDI, SSI, Medicare, and Medicaid for the average new adult SSDI or SSI beneficiary in 2000 to be $123,000 (adjusted for inflation to 2011 dollars) through 2006 (six to seven years after award)14. As savings would continue to accrue over time, it is not hard to see that relatively few cases diverted from benefits by VR services would lead to savings rivaling the payments that SSA makes to VR agencies for providing services to existing beneficiaries under its cost reimbursement payment system (which was $73 million in 2011) (SSA 2013b).
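To put "relatively few cases" in perspective, a rough break-even count can be computed from the two dollar figures cited above; the break-even framing is ours, not the original sources'.

```python
# How many diverted cases would match SSA's annual cost-reimbursement payments?
pv_per_case = 123_000             # Riley and Rupp (2012) estimate, 2011 dollars
annual_reimbursements = 73_000_000  # SSA payments to VR agencies in 2011

breakeven_cases = annual_reimbursements / pv_per_case
print(round(breakeven_cases))  # 593 diverted cases
```

That is, diverting fewer than 600 would-be beneficiaries per year would offset the entire 2011 reimbursement outlay, even before counting savings that accrue beyond the six-to-seven-year horizon of the Riley and Rupp estimate.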
It is also possible that substantial savings could be achieved by changing the regulations and incentives under which VR agencies operate in a manner that would cause them to give greater priority to those non-beneficiary applicants who are most likely to apply for and enter SSDI or SSI. Current regulations and incentives favor serving existing beneficiaries, and even helping non-beneficiaries obtain benefits. The federal requirement to give priority to those with the most significant disabilities means that most SSA beneficiary applicants for VR services receive the highest priority. Further, SSA will pay the VR agency for serving beneficiary applicants if the applicant becomes competitively employed for nine months at the completion of VR service receipt, even if the beneficiary continues to receive benefits. The state government may also benefit when a VR applicant becomes an SSDI beneficiary, especially if the state is paying for some of the applicant’s health care under Medicaid or through other state programs. That is because the new beneficiary will eventually be entitled to Medicare, a federal program. Our estimates reflect a system that was not designed to deter entry into SSDI or SSI.
While our findings are not sufficiently strong to support a major overhaul in federal funding and regulations for VR services, they do suggest the need to develop and test approaches to determine whether VR agencies could help keep their applicants, or potential applicants, in the labor force and out of SSDI, and whether such efforts could pay for themselves through reduced benefits and, more broadly, greater productivity. For example, VR agencies might be able to reduce SSDI entry by reaching out to workers who are experiencing the onset of disability, perhaps via their employers. Delivering services to such workers before they lose their jobs might be a more effective way to keep them in the labor force. VR agencies rarely do this, however. Alabama’s Retaining a Valued Employee program is an interesting exception that has been gaining interest from other state agencies in recent years. At present, however, federal regulations and funding do not encourage such innovations, and any such innovations are not being systematically evaluated.
1VR applicants do not report pending SSDI applications, though, so presumably the percentage of VR applicants who applied for SSDI is higher than the percentage awarded SSDI.
2Minimum eligibility requirements for VR services are conceptually the same for all agencies: individuals are eligible for VR services if they have a physical or mental impairment that limits the ability to work, the individual has a vocational goal, and VR services can help the individual achieve that vocational goal. Agencies may vary in how they define eligibility and employment goals, but presumably each agency consistently applied minimum criteria that are consistent with this conceptual definition over the years studied here.
3Additionally, some individuals initially receiving SSI may be ineligible for SSDI because they lack sufficient work history, and VR service receipt might allow them to qualify for SSDI (as shown in Stapleton and Martin 2012). These individuals are excluded from our study.
4These data were accessed via an interagency agreement between the Department of Education (for whom this work was conducted) and the SSA.
5Application was based on the filing date of the first application record after the VR application date, the record identification code, and the beneficiary identification code from the 831 file.
6This approach allows us to include individuals starting in the month that SSA deemed them eligible for benefits, even if a beneficiary did not actually receive a cash payment in that month (perhaps because the disability determination occurred after the time he or she was first eligible).
7The criteria used to determine disability-insured status vary with age and are computed on a rolling basis. AARP (2009) provides a summary of the calculation.
8As described in Stapleton and Martin (2012), this measure relies on access to annual earnings data from the Internal Revenue Service (IRS), which only qualified SSA staff may access subject to section 6103 of the IRS code. Because disability-insured status is normally calculated by SSA only when an individual applies for SSDI, this measure should be thought of as an approximation of true insured status rather than a measure that SSA would use in its benefit-determination process.
9Some VR applicants—particularly young applicants, because SSDI rules allow them to complete the work history requirement over a shorter time period compared to older applicants—become disability insured after VR application (in as little as 1.5 years for those under age 24).
10Individuals who start as SSDI-only or SSI-only beneficiaries may, through their involvement with VR, become eligible for the other program. For instance, most VR applicants who were receiving SSI-only benefits at the time of their VR application eventually received SSDI benefits (Stapleton and Martin 2012). In such cases, the work experience gained as time elapsed likely allowed the individual to achieve disability-insured status, and longer wait times could have delayed his or her entry to SSDI. Conversely, individuals with SSDI-only benefits who encounter wait times at VR agencies could have increased their likelihood of accessing SSI as they spent down their resources, thereby becoming eligible for SSI. By excluding all disability beneficiaries at VR application, we focus on the application and benefit receipt patterns of only non-beneficiaries.
11Kentucky cases were omitted because in that agency, the IPE date is often coded as the closure date for those receiving services.
12More detailed information on other factors that might predict the relative likelihood of working or applying for disability benefits (for example, one’s connection to the labor force, former industry/occupation, income, marital status, or disability severity) would be desirable but are unavailable in administrative data.
13An appendix available from the authors includes complete results for each model shown in Table 3.
14The Riley and Rupp (2012) estimate is in 2006 dollars: $111,160. We inflated this value by 10.91 percent, which is the percentage increase in the annual consumer price index from 2006 to 2011.
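The footnote's inflation adjustment can be checked directly:

```python
# Adjusting the Riley and Rupp (2012) estimate from 2006 to 2011 dollars
# using the stated 10.91 percent increase in the annual CPI.
value_2006 = 111_160
cpi_increase = 0.1091

value_2011 = value_2006 * (1 + cpi_increase)
print(round(value_2011))  # 123288, reported in the text as roughly $123,000
```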
J. Schimmel Hyde and T. Honeycutt are both senior researchers at the Center for Studying Disability Policy (CSDP) at Mathematica Policy Research. D. Stapleton is a Mathematica senior fellow and the director of CSDP.
AARP: Social Security Disability Insurance: a primer. 2009. http://assets.aarp.org/rgcenter/econ/i28_ssdi.pdf. Accessed 25 Sept 2013
Hager R: Order of selection for vocational rehabilitation services: an option for state VR agencies who cannot serve all eligible individuals. 2004. http://www.edi.cornell.edu/publications/PPBriefs/PP_23.pdf. Accessed 15 Aug 2013
Honeycutt T, Stapleton D: Striking while the iron is hot: the effect of vocational rehabilitation service wait times on employment outcomes for applicants receiving Social Security disability benefits. J Vocat Rehabil 2013,39(2):137–152.
Lindner S, Burdick C: Characteristics and Employment of Applicants for Social Security Disability Insurance Over The Business Cycle. Working Paper 2013–11. Boston College Center for Retirement Research, Boston, MA; 2013.
Riley GR, Rupp K: Expenditure patterns under the four major public cash benefit and health insurance programs for working-age adults with disabilities. J Disab Pol Stud 2012. Online before print. http://dps.sagepub.com/content/early/2012/12/26/1044207312469828.full.pdf
Silverstein RA: Description and Analysis of State Policy Frameworks Regarding Order of Selection Under Title I of the Rehabilitation Act. Institute for Community Inclusion, Rehabilitation Research and Training Center for Vocational Rehabilitation, Boston, MA; 2010.
Social Security Administration: SSI Annual Statistical Report, 2012. SSA Publication No. 13–11827. Social Security Administration, Washington, DC; 2013a.
Social Security Administration: State vocational rehabilitation agency reimbursements. 2013b. http://www.ssa.gov/work/claimsprocessing.html. Accessed 2 Oct 2013
Stapleton D, Martin F: Vocational Rehabilitation on the Road to Social Security Disability: Longitudinal Statistics from Matched Administrative Data. Report Submitted to the Michigan Retirement Research Center. Mathematica Policy Research, Washington, DC; 2012.
U.S. Department of Education, Office of Special Education and Rehabilitative Services, Rehabilitation Services Administration: Annual Report, Fiscal Year 2010, Report on Federal Activities Under the Rehabilitation Act. U.S. Department of Education, Washington, DC; 2012.
We are grateful for the contributions of Max Benjamin, who provided programming assistance, and to David Wittenburg, David Neumark, and an anonymous reviewer who provided important suggestions for revisions to this manuscript. Funding for this article was made possible by the Research and Training Center on Employment Policy and Measurement Rehabilitation Research and Training Center, which is funded by the U.S. Department of Education, National Institute for Disability and Rehabilitation Research, under cooperative agreement H133B100030. The contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the federal government (Edgar, 75.620 (b)). The authors are solely responsible for any errors or omissions. This paper was submitted to the IZA Journal of Labor Policy's call for papers on "Social Security Disability Benefits: Finding Alternatives to Benefit Receipt." Two special editors, David Wittenburg and Gina Livermore, were sponsored by the University of New Hampshire’s Rehabilitation, Research, and Training Center on Employment Policy and Measurement, funded by the U.S. Department of Education (ED), National Institute on Disability and Rehabilitation Research (cooperative agreement no. H133B100030). Their comments do not necessarily represent the policies of ED or any other federal agency (Edgar, 75.620 (b)). The authors are solely responsible for all views expressed.
Responsible editor: David Neumark
The IZA Journal of Labor Policy is committed to the IZA Guiding Principles of Research Integrity. The authors declare that they have observed these principles.
Jody Schimmel Hyde, Todd Honeycutt and David Stapleton contributed equally to this work.