FCC Releases Report on Quality of Service of Incumbent Local Exchange Carriers

Released: December 24, 2009



QUALITY OF SERVICE OF INCUMBENT LOCAL EXCHANGE CARRIERS

DECEMBER 2009

Industry Analysis and Technology Division
Wireline Competition Bureau
Federal Communications Commission

This report was authored by Jonathan M. Kraushaar of the Industry Analysis and Technology Division of
the FCC’s Wireline Competition Bureau. The author can be reached at (202) 418-0947; e-mail address:
jonathan.kraushaar@fcc.gov; TTY: (202) 418-0484. This report is available for reference in the FCC's
Reference Information Center, Courtyard Level, 445 12th Street, S.W. Copies may be purchased by calling
Best Copy and Printing, Inc. at (202) 488-5300. The report can be downloaded from the Wireline
Competition Bureau Statistical Reports Internet site at http://www.fcc.gov/wcb/stats.


Quality of Service of Incumbent Local Exchange Carriers




1. Executive Summary

1.1 Overview

This report summarizes the Automated Reporting Management Information System (ARMIS)
service quality data filed by the regional Bell companies,1 Embarq2 and other price-cap regulated
incumbent local exchange carriers for calendar year 2008.3 The data track the quality of service
provided to both retail customers (business and residential) and access customers (interexchange
carriers).

The Federal Communications Commission (FCC or Commission) does not impose service
quality standards on communications common carriers. Rather, the Commission monitors quality of
service data submitted by incumbent local exchange carriers that are regulated as price-cap carriers.
The Commission summarizes these data and publishes a report on quality of service trends annually.4
The tables of this report present comparative data on key company performance indicators. These data
include several objective indicators of installation, maintenance, switch outage and trunk blocking
performance for each reporting company. The tables also present data on customer perception of
service and the level of consumer complaints. A number of indicators are charted over time to present a
multi-year view. In addition, the Commission uses statistical methods to analyze the data for long term
trends and to establish patterns of industry performance. The results of these analyses are also
contained in this report.

1 BellSouth merged with AT&T in December 2006. The charts and tables in this report continue to track
BellSouth and other regional Bell companies that have merged with AT&T as separate entities. This has been
done mainly to capture performance differences that may still exist across the former regional Bell
companies. This report identifies these entities by placing an “AT&T” in front of the regional company name
(e.g., AT&T BellSouth). Other merger activity is summarized in footnote 22.
2 In May 2006, Sprint spun off its Local Telecommunications Division as an independent entity under the name
Embarq. Embarq data are included in the tables and charts in this report. This year's report covers the period
through December 2008 and does not include information from the merger of CenturyTel and Embarq, which
occurred on July 1, 2009.
3 See Revision of ARMIS Annual Summary Report (FCC Report 43-01), ARMIS USOA Report (FCC Report 43-02),
ARMIS Joint Cost Report (FCC Report 43-03), ARMIS Access Report (FCC Report 43-04), ARMIS Service
Quality Report (FCC Report 43-05), ARMIS Customer Satisfaction Report (FCC Report 43-06), ARMIS
Infrastructure Report (FCC Report 43-07), ARMIS Operating Data Report (FCC Report 43-08), ARMIS
Forecast of Investment Usage Report (FCC Report 495A), and ARMIS Actual Usage of Investment Report
(FCC Report 495B) for Certain Class A and Tier 1 Telephone Companies, CC Docket No. 86-182, Order,
20 FCC Rcd 19377 (2005).

4 The last report, which included data for 2007, was released in March 2009. See Industry Analysis and
Technology Division, Wireline Competition Bureau, Federal Communications Commission, Quality of
Service of Incumbent Local Exchange Carriers (March 2009). That report (as a PDF file) and previous
reports can be found on the Commission's website at www.fcc.gov/wcb/stats. Source data used to prepare
this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06
tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs.



1.2 Key Findings for 2008


The quality of service report tracks large-company,5 small-company,6 and industry
performance over time on eight key quality of service indicators: average complaints per million
lines,7 percent of installation commitments met, lengths of installation intervals, lengths of repair
intervals, percentage of switches with outages, trouble report rate per thousand access lines,
percentage dissatisfied with installation, and percentage dissatisfied with repair. Since our last
report, there have been only small changes in the values of most of these indicators. However, our
analysis, which incorporated service quality data from the most recent six years, identified the
presence of statistically significant long-term upward or downward trends in some of the indicators
of industry-wide performance (i.e., with data for large and small companies combined) and in
indicators of large and small company performance, when these data were analyzed separately.8
These trends are identified below:


• Repair intervals are increasing on average 5.8% annually for the industry overall, 4.9%
annually for the larger companies, and 7.3% annually for the smaller companies.

• Percentage of customers dissatisfied with residential installations is increasing on average
7.3% per year for the larger companies.9

• Percentage of customers dissatisfied with residential repairs is increasing on average 4.3%
per year for the larger companies.

• Percentage of switches with downtime is decreasing by 0.9% annually for the large
companies.

No statistically significant long-term upward or downward trends were observed in any of
the other indicators of large-company, small-company or industry-wide performance. The
absence of a statistically significant industry trend does not, however, exclude the possibility that
individual companies have significant performance trends. Indeed, our statistical analysis also

5 The larger companies of this report are AT&T Ameritech, AT&T BellSouth, AT&T Pacific, AT&T SNET,
AT&T Southwestern, Embarq, Qwest, Verizon GTE, Verizon North, and Verizon South.
6 The smaller companies of this report are Alltel Corp, Cincinnati Bell, Citizens, Citizens Frontier, Century
Tel., Hawaiian Telecom, Iowa Telecom, and Valor. Alltel and Valor are now owned by Windstream Corp.
7 Unless otherwise stated, the term “average complaints per million lines” or “average complaint level” as used
in this report is the simple average of the residential complaints per million lines and the business complaints
per million lines filed with State or Federal regulatory agencies. This average is computed at the company
level for the charts and the study area level for the statistical analysis. When computing industry composites,
individual company data are weighted by the number of access lines.
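
As a minimal illustration of this averaging (a hypothetical sketch in Python; the figures, and the use of
Python itself, are illustrative assumptions rather than the Commission's tooling):

    # Hypothetical figures for one company, expressed per million access lines.
    residential_complaints = 40.0
    business_complaints = 60.0

    # The "average complaint level" is the simple average of the two rates.
    average_complaint_level = (residential_complaints + business_complaints) / 2.0
    print(average_complaint_level)  # 50.0

    # Industry composites weight each company's level by its access lines.
    levels = [50.0, 120.0]        # hypothetical company complaint levels
    lines = [2_000_000, 500_000]  # corresponding access line counts
    composite = sum(l * n for l, n in zip(levels, lines)) / sum(lines)
    print(composite)  # 64.0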
8 A trend is the average (or expected) annual percentage decline or increase in the value of the indicator. Our
statistical analysis shows that, for a number of the indicators, the probability that these trends occurred by
chance is very small (i.e., less than one chance in one thousand for some indicators, and less than one chance
in one hundred for others). In these cases, we say the trend is statistically significant. This year, we found
some trends were significant at the 0.001 level, while others were significant at the 0.01 level. For further
discussion of the statistical techniques employed in this report and detailed results, see infra Section 5.2.
9 The smaller companies covered in this report are not required to file data on customer dissatisfaction with
repairs and installations. These data are collected in the ARMIS 43-06 reports, filed only by the larger
incumbent local exchange carriers.


shows that both trends and performance differ significantly across companies for many of the
tracked indicators. Charts 1-8 of this report illustrate graphically how individual companies have
performed over the last six years relative to other companies in the same size class. In particular,
Chart 1A covering the average of business and residential complaints per million access lines,
Chart 5A covering residential installation intervals, and Chart 8 covering switches with downtime
provide good illustrations of apparent long-term differences in performance among the charted
companies.10

In addition to continued statistically significant long-term, industry-wide trends toward
longer repair intervals, almost all the larger companies and some of the smaller companies reported
repair-interval increases in 2008.11 However, almost all large companies showed decreases in
installation length as shown in Chart 5A.


2. Report History


At the end of 1983, anticipating AT&T's imminent divestiture of its local operating
companies, the Commission directed the Common Carrier Bureau12 to establish a monitoring
program that would provide a basis for detecting adverse trends in Bell operating company network
service quality. The Bureau subsequently worked with industry to refine the reporting requirements,
ensuring that the data were provided in a uniform format. Initially, the data were filed twice yearly.
The data collected for 1989 and 1990 formed the basis for FCC service quality reports published in
June 1990 and July 1991, respectively. These reports highlighted five basic service quality
measurements collected at that time.13


With the implementation of price-cap regulation for certain local exchange carriers, the
Commission made several major changes to the service quality monitoring program. These changes
first affected data filed for calendar year 1991. First, the Commission expanded the class of
companies required to file quality of service data to include non-Bell carriers that elected to be subject
to price-cap regulation.14 These carriers are known collectively as non-mandatory price-cap carriers,
and most of them are much smaller than the Bell operating companies. Second, the Commission
included service quality reporting in the ARMIS data collection system.15 Finally, the Commission

10 Tables 1A and 2A contain separate current complaint data categories for residential and business customers.
11 See Charts 7A and 7B.
12 As the result of a reorganization in March 2002, the Wireline Competition Bureau now performs Common
Carrier Bureau functions described in this report. In this report, references to the Common Carrier Bureau
apply to activities prior to the above date.

13 These were customer satisfaction level, dial-tone delay, transmission quality, on-time service orders and
percentage of call blocking due to equipment failure.

14 Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Second Report and
Order, 5 FCC Rcd 6786, 6827-31 (1990) (LEC Price-Cap Order) (establishing the current service quality
monitoring program and incorporating the service quality reports into the ARMIS program), Erratum, 5 FCC
Rcd 7664 (1990), modified on recon., 6 FCC Rcd 2637 (1991), aff'd sub nom., Nat'l Rural Telecom Ass'n v.
FCC, 988 F.2d 174 (D.C. Cir. 1993). The incumbent local exchange carriers that are rate-of-return regulated
are not subject to federal service quality reporting requirements.

15 LEC Price-Cap Order, 5 FCC Rcd at 6827-30. The ARMIS database includes a variety of mechanized
company financial and infrastructure reports in addition to the quality-of-service reports. Most data are
available disaggregated to a study area level, which generally represents operations within a given state.


ordered significant changes to the kinds of data carriers had to report.16 Following these
developments, the Commission released service quality reports in February 1993, March 1994, and
March 1996.

In 1996, pursuant to requirements in the Telecommunications Act of 1996,17 the Commission
reduced the frequency of ARMIS data reporting to annual submissions, and in May 1997, clarified
relevant definitions.18 The raw data are now filed in April of each year. The Commission has
summarized these data and published the quality of service report annually.19 However,
in 2008, the Commission granted forbearance from carriers’ obligations to file ARMIS Reports 43-05
and 43-06, which provide the source data for the service quality report, subject to the condition that
the carriers continue to collect service quality data and file these ARMIS reports for a two-year period
following the effective date of the forbearance order. All price-cap carriers have agreed to this
condition.20


16 Id.; Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum
Opinion and Order, 6 FCC Rcd 2974 (1991) (Service Quality Order), recon., 6 FCC Rcd 7482 (1991).
Previously the Common Carrier Bureau had collected data on five basic service quality measurements from
the Bell operating companies, described earlier.

17 Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56.

18 Orders implementing filing frequency and other reporting requirement changes associated with
implementation of the Telecommunications Act of 1996 are as follows: Implementation of the
Telecommunications Act of 1996: Reform of Filing Requirements and Carrier Classifications, CC Docket No.
96-193, Order and Notice of Proposed Rulemaking, 11 FCC Rcd 11716 (1996); Revision of ARMIS Quarterly
Report (FCC Report 43-01) et al., CC Docket No. 96-193, Order, 11 FCC Rcd 22508 (1996); Policy and
Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order,
12 FCC Rcd 8115 (1997); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD No.
95-91, Order, 12 FCC Rcd 21831 (1997).

19 Until 2003, the quality of service reports included data only from the mandatory price-cap companies and the
largest non-mandatory carrier, Sprint (now Embarq). Beginning with the December 2004 report, the
following smaller non-mandatory price-cap companies that are required to file only ARMIS 43-05 data have
also been included: Alltel Corp., Century Tel., Cincinnati Bell, Citizens, Citizens Frontier, Iowa Telecom,
and Valor Telecommunications. Alltel and Valor are now owned by Windstream Corp. The last report,
published in March 2009, included data from Hawaiian Telecom, a non-mandatory carrier, for the first time.
(Non-mandatory carriers are not required to file the customer satisfaction data that appear in the ARMIS
43-06 report.) The current report does not include data concerning facilities in Maine, New Hampshire and
Vermont that Fairpoint Communications acquired from Verizon in 2008 and operated for less than a year.
(See footnote 22.)

20 See Service Quality, Customer Satisfaction, Infrastructure and Operating Data Gathering; Petition of AT&T
Inc. for Forbearance Under 47 U.S.C. § 160(c) From Enforcement of Certain of the Commission's ARMIS
Reporting Requirements; Petition of Qwest Corporation for Forbearance from Enforcement of the
Commission's ARMIS and 492A Reporting Requirements Pursuant to 47 U.S.C. § 160(c); Petition of the
Embarq Local Operating Companies for Forbearance Under 47 U.S.C. § 160(c) From Enforcement of Certain
of ARMIS Reporting Requirements; Petition of Frontier and Citizens ILECs for Forbearance Under 47
U.S.C. § 160(c) From Enforcement of Certain of the Commission's ARMIS Reporting Requirements;
Petition of Verizon for Forbearance Under 47 U.S.C. § 160(c) From Enforcement of Certain of the
Commission's Recordkeeping and Reporting Requirements; Petition of AT&T Inc. For Forbearance Under
47 U.S.C. § 160 From Enforcement of Certain of the Commission's Cost Assignment Rules, WC Docket
Nos. 08-190, 07-139, 07-204, 07-273, 07-21, Memorandum Opinion and Order and Notice of Proposed
Rulemaking, 23 FCC Rcd 13647 (2008) (ARMIS Forbearance Order), pet. for recon. pending, pet. for review
pending, NASUCA v. FCC, Case No. 08-1353 (D.C. Cir. filed Nov. 4, 2008). However, reporting carriers
have committed to continue collecting service quality and customer satisfaction data, and to filing those data
publicly through ARMIS Report 43-05 and 43-06 filings for twenty-four months from the effective date of
this order (September 6, 2008).




3. The Data

3.1 Tables



The data presented in this report summarize the most recent ARMIS 43-05 and 43-06 carrier
reports.21 Included are data from the regional Bell companies, Embarq and all other reporting
incumbent local exchange carriers.22 Tables 1(a) through 1(e) cover data from the regional Bell
companies, or mandatory price-cap companies. Tables 2(a) through 2(c) cover data from the smaller
non-mandatory price-cap companies. These companies report quality of service data at a study area
level which generally represents operations within a given state. Although reporting companies
provide selected company aggregate data, the tables of this report contain summary data that have been
recalculated by Commission staff as the composite aggregate of all study areas for each listed entity.
This report also includes an extensive summary of data about individual switching outages, including
outage durations and numbers of lines affected, for which no company-calculated aggregates are
provided. Switch outage data have also been aggregated to the company level for inclusion in the
tables.




21 Source data used in preparing this report can be extracted from the ARMIS 43-05 and 43-06 tables on the
online database maintained on the FCC website at www.fcc.gov/wcb/eafs. The data are also available from
Best Copy and Printing, Inc. at (202) 488-5300. A number of prior-year data summary reports are available
through the FCC's Reference Information Center (Courtyard Level) at 445 12th Street, S.W., Washington,
D.C. 20554 and the Wireline Competition Bureau Statistical Reports website at www.fcc.gov/wcb/stats.

22 In February 1992, United Telecommunications Inc. became Sprint Corporation (Local Division); and in
March 1993, Sprint Corporation acquired Centel Corporation. Sprint recently spun off its local telephone
division as a new entity, Embarq, and that name is now used in the charts and tables in this report. Bell
Atlantic and NYNEX merged in August 1997, and then merged with GTE in 2000. Verizon Communications
is shown separately in our report tables as Verizon GTE, Verizon North (the former NYNEX companies), and
Verizon South (the former Bell Atlantic companies). Similarly, SBC and Pacific Telesis merged in April
1997, SBC and SNET merged in October 1998, and SBC and Ameritech merged in October 1999. SBC and
AT&T then merged at the end of 2005, and the merged company retained the name AT&T. In 2006
BellSouth merged with AT&T, which again retained the AT&T name. Data from the entities originally known
as SBC Southwestern, Ameritech, Pacific Telesis, SNET, and BellSouth are shown separately in the charts
and tables with the AT&T company name. Among the smaller companies, Windstream Corp. was
created in 2006 from the spin-off of Alltel's wireline division and a simultaneous merger with Valor
Telecommunications. Data for acquired entities are still shown separately in this report, where possible.
Hawaiian Telecom was formerly part of Verizon GTE and was spun off in April 2005; quality of service
data were first filed for Hawaiian Telecom in 2007. Similarly, in March 2008 Fairpoint Communications
acquired facilities in Maine, New Hampshire and Vermont and now operates them as a price-cap carrier.
(Fairpoint Communications, which existed as a rate-of-return company before acquiring these facilities, was
not required to file data for them for 2008, the year the transfer took place.)



The tables contained in this report cover data for 2008. Tables 1(a) and 2(a) provide
installation, maintenance and customer complaint data. The installation and maintenance data are
presented separately for local services provided to end users and access services provided to
interexchange carriers. Tables 1(b) and 2(b) show switch downtime and trunk servicing data. Tables
1(c) and 2(c) show outage data by cause. Table 1(d) presents the percentages of residential, small
business and large business customers indicating dissatisfaction with BOC installations, repairs and
business offices, as determined by BOC customer perception surveys.23 Table 1(e) shows the
underlying survey sample sizes.


The company-level quality of service data included in Tables 1(a)-1(e) and Tables 2(a)-2(c)
are derived by calculating sums or weighted averages of data reported at the study area level. In
particular, where companies report study area information in terms of percentages or average time
intervals, this report presents company composites that are calculated by weighting the percentage or
time interval figures from all study areas within that company. For example, we weight the percent of
commitments met by the corresponding number of orders provided in the filed data.24
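
As a rough illustration of this weighting (a minimal sketch in Python, with hypothetical study-area
figures; this is not the Commission's actual software):

    # Study-area records: (percent of commitments met, number of orders).
    study_areas = [
        (99.0, 120_000),
        (97.5,  30_000),
        (98.0,  50_000),
    ]

    # Weight each study area's percentage by its order count to form the
    # company composite.
    total_orders = sum(orders for _, orders in study_areas)
    composite = sum(pct * orders for pct, orders in study_areas) / total_orders
    print(f"{composite:.1f}")  # 98.5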


In the case of outage data summarized in Tables 1(b), 1(c), 2(b), and 2(c), we calculate a
number of useful statistics from raw data records for individual switches with outages lasting more
than two minutes. These statistics include the total number of events lasting more than two minutes,
the average outage duration, the average number of outages per hundred switches, the average number
of outages per million access lines, and the average outage line-minutes per thousand access lines and
per event. Outage line-minutes is a measure that combines both duration and number of lines affected
in a single parameter. We derive this parameter from the raw data by multiplying the number of lines
involved in each outage by the duration of the outage and summing the resulting values over all
outages. We then divide the resulting sum by the total number of thousands of access lines or of
events to obtain average outage line-minutes per access line and average outage line minutes per
event, respectively.
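
The outage line-minutes derivation can be illustrated with a minimal sketch (Python; the outage records
and the company line count are hypothetical assumptions for illustration only):

    # Each record: (lines affected, outage duration in minutes), for outages
    # lasting more than two minutes.
    outages = [(25_000, 5.0), (2_500, 12.0), (10_000, 3.5)]
    total_access_lines = 500_000

    # Outage line-minutes combines duration and lines affected in one number.
    line_minutes = sum(lines * minutes for lines, minutes in outages)

    per_thousand_lines = line_minutes / (total_access_lines / 1_000)
    per_event = line_minutes / len(outages)
    print(per_thousand_lines)  # 380.0 line-minutes per 1,000 access lines
    print(round(per_event))    # 63333 line-minutes per event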

3.2 Charts


This report displays data elements that have remained roughly comparable over the past few
years. Such data are useful in identifying and assessing trends. In addition to the tables, this report
contains charts that highlight company trends for the last six years. Unlike the tables, in which the
company composites are recalculated, the data in the charts are derived from company provided roll-
up or composite data.25 Charts 1 through 7 graphically illustrate trends in complaint levels, initial

23 Customer satisfaction data collected in the 43-06 report and summarized in Tables 1(d) and 1(e) are required
to be reported only by the mandatory price-cap carriers.

24 Although companies file their own company composites, we have recalculated a number of them from study
area data for presentation in the tables to assure that company averages are calculated in a consistent manner.
We weight data involving percentages or time intervals in order to arrive at consistent composite data shown
in the tables. Parameters used for weighting in this report were appropriate for the composite being
calculated and were based on the raw data filed by the carriers but are not necessarily shown in the tables.
For example, we calculate composite installation interval data by multiplying the average installation interval
at the individual study area level by the number of orders in that study area, summing the results for all study
areas, and then dividing that sum by the total number of orders.

25 Calculations to normalize data and derive percentages in charts 1, 2A, 2B and 8 in this year's report were
performed directly on company provided composite data rather than from recalculated composites in the
attached tables. Other charts contain data that were taken directly from company provided composite data.
Graphed composite AT&T data in the charts do not include data for BellSouth (which merged with AT&T at
the very end of 2006). BellSouth data are shown separately to facilitate comparisons with prior year data.


trouble reports, residential installation dissatisfaction, percent of residential installation commitments
met, residential installation intervals, residential repair dissatisfaction, and residential initial out-of-
service repair intervals, respectively. Chart 8 displays trends among the larger price-cap carriers in
the percentage of switches with outages. Data for Embarq (formerly Sprint Local Division, the largest
non-mandatory price-cap company) are included only in those charts displaying ARMIS 43-05 data
that it is required to file.


This report charts the performance of the smaller price-cap carriers only on selected quality of
service indicators. These include the average number of residential and business complaints per
million access lines, the trouble report rate per thousand lines, the lengths of repair intervals and the
lengths of installation intervals. These indicators were selected for charting because they are
generally less volatile than the others, thus allowing better comparison with similar trended data from
the larger companies. (In the cases where we chart both large and small company performance, the
larger companies are tracked on the chart with an ‘A’ designation, e.g., Chart 7A, while the smaller
companies are tracked on the chart with a ‘B’ designation, e.g., Chart 7B.) Access line counts are
used as weighting factors in calculation of weighted BOC/Embarq and small company composites
that appear in the charts. We use separate updated weights for each year. Line counts used for
weights are the most current values available corresponding to each year of data. Thus, small changes
to the reported numbers of access lines for a previous year may account for small differences in the
values of these composites from previous quality of service reports.

3.3 For More Information about the Data

More detailed information about the raw data from which this report has been developed may
be found on the Commission’s ARMIS web page cited earlier. Descriptions of the raw ARMIS 43-05
source data items from which Tables 1(a), 1(b), 1(c), 2(a), 2(b), and 2(c) were prepared can be found
in Appendix A of this report. Tables 1(d) and 1(e) were prepared from data filed only by the Bell
operating companies in the ARMIS 43-06 report. The statistics presented in Tables 1(d) and 1(e) are
straightforward and reflect the data in the format filed. Complete data descriptions are available in
several Commission orders.26


4. Qualifications


Overall, we caution readers to be aware of potential inconsistencies in the service quality data
and methodological shortcomings affecting both the collection and interpretation of the data. Some
common sources of issues are described below.

4.1 Data Re-filings


Commission staff generally screen company-filed service quality data for irregularities and
provide feedback to reporting companies on suspected problems. The reporting companies are then
given an opportunity to re-file. Re-filed data appear in this report if they are received in time to be

26 See supra n.16.



included in the Commission's recalculation of holding company totals and other data aggregates
described in Section 3.1. However, the process of data correction is expected to continue beyond
the publication date of this report as new problems are identified. Reporting companies frequently
re-file data for the current reporting period, and occasionally for previous reporting periods as
well. Hence, users of the quality of service report data may find some inconsistencies with data
extracted from the ARMIS database at a later or earlier date.

4.2 Commission Recalculation of Holding Company Aggregate Statistics


Commission staff do not typically delete or adjust company-filed data for presentation in the
tables and charts of the quality of service report, except for recalculating holding company totals and
other data aggregates as described in Section 3.1. Recalculated aggregates appear in the tables of the
quality of service report. These may not match corresponding company-filed totals and composites.27
Such inconsistencies are due primarily to differences in the way we and the reporting company derive
the data element, for example, in the use of percentages or average intervals that require weighting in
the calculations.

4.3 Company-specific Variations



Users conducting further analysis of the data should be aware that variations in service quality
measurements may occur among companies and even within the same company over time for reasons
other than differences in company performance. For example, data definitions must be properly and
consistently interpreted.28 The Commission has, on occasion, provided clarifications when it became
apparent that reporting companies had interpreted reporting requirements inconsistently.29 Changes in
a company’s internal data collection procedures or measurement technology may also result in
fluctuations in its service quality measurements over time. In some cases, procedural changes in the
data measurement and collection process may be subtle enough so that they are not immediately
noticeable in the data. However, significant changes in company data collection procedures usually
result in noticeable and abrupt changes in the data, as has been described in previous reports.30 It
appears that at least some of these changes have not been reported to the Commission. These factors
tend to limit the number of years of reliable data available to track service quality trends.




27 Data presented in the charts are company-filed composites, except where noted.
28 In Chart 5A Qwest appears to have a different but consistent interpretation from all the other larger
companies providing installation interval data.
29 For example, the Commission addressed data problems relating to subtleties in the definitions associated with
the terms “initial” and “repeat” trouble reports. See Policy and Rules Concerning Rates for Dominant
Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 12 FCC Rcd 8115, 8133, para. 40
(1997); Policy and Rules Concerning Rates for Dominant Carriers, AAD No. 92-47, Memorandum Opinion
and Order, 8 FCC Rcd 7474, 7478, para. 26, 7487-7549, Attachment (1993); Revision of ARMIS Annual
Summary Report (FCC Report 43-01) et al., AAD 95-91, Order, 12 FCC Rcd 21831, 21835, para. 10 (1997)
(introducing reporting of “subsequent” troubles). This issue was discussed at greater length in a prior
summary report. See Industry Analysis Division, Common Carrier Bureau, Federal Communications
Commission, Quality of Service for the Local Operating Companies Aggregated to the Holding Company
Level (March 1996).

30 See, for example, Quality of Service of Incumbent Local Exchange Carriers (February 2008), footnote 28.




Although the Commission has made considerable efforts to standardize data reporting
requirements over the years, given the number of changes to the reporting regimes and predictable
future changes, one should not assume exact comparability on all measurements for data sets as they
are presented year by year. In spite of all of the foregoing, deteriorating or improving service quality
trends that persist for more than a year or two usually become obvious and can provide a critical
record for state regulators and others.

4.4 Trend Analysis and Data Volatility


Because measurements of any particular quality of service indicator may fluctuate over time,
trend analysis can be an effective tool in helping to evaluate longer-term company and industry
performance. Consideration of trends may also provide insight into typical lead times that might be
needed to correct certain problems once they have been identified. In addition, adverse trends in
complaint levels of significant duration, when identified, can serve as warning indicators of problems
not included in the more specific objective measurements.31 For these reasons we identify statistically
significant trends in the data. Identification of such trends assists in evaluating the significance of
year-to-year changes in the data.


With respect to individual measures of company performance, it is our experience that
in evaluating customer satisfaction data one must consider longer term trends and take into account
the effects of filing intervals and lag times in data preparation and filing.

4.5 Interpretation of Outage Statistics

Statistics describing the impact of outages should be considered in context. Switch outage
severity is affected both by the number of lines that are out of service and by the duration of the outage.
A performance indicator that captures only the average number of lines out of service per event would
tend to favor a company with a large number of small switches and low line counts per switch over a
company with a few switches and many lines per switch, since on average, only a small number of
lines would be out of service per event. For example, using the average number of lines out of service
per event indicator, a company with one 25,000 line switch that is out of service for five minutes
(average lines out of service per event = 25,000) would appear to have poorer performance than a
company with ten 2,500 line switches that are each out of service for five minutes (average lines out
of service per event = 2,500). A statistic capturing only total minutes of outage for the same two
companies is likely to favor the company with a few larger switches. To provide a consistent basis for
comparison of performance of companies having different switch size characteristics, we present a
group of outage statistics that can capture the impact of both the number of lines affected and the
duration of the outage. These statistics include outage line-minutes per event and per 1,000 access
lines.
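
The example above can be worked through directly. The following minimal sketch (Python) assumes,
purely for illustration, that each hypothetical company serves 25,000 access lines in total:

    # Company A: one 25,000-line switch down for five minutes.
    # Company B: ten 2,500-line switches, each down for five minutes.
    a_outages = [(25_000, 5.0)]
    b_outages = [(2_500, 5.0)] * 10

    def outage_stats(outages, total_lines):
        line_minutes = sum(l * m for l, m in outages)
        return (
            sum(l for l, _ in outages) / len(outages),  # avg lines out per event
            line_minutes / len(outages),                # line-minutes per event
            line_minutes / (total_lines / 1_000),       # line-minutes per 1,000 lines
        )

    print(outage_stats(a_outages, 25_000))  # (25000.0, 125000.0, 5000.0)
    print(outage_stats(b_outages, 25_000))  # (2500.0, 12500.0, 5000.0)

Taken together, the statistics show that the two companies' outages are equally severe per access line
(5,000 line-minutes per 1,000 lines), even though either single metric alone would favor one company
over the other.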

4.6 External Factors



We note that external factors, including economic conditions, natural disasters, the level
of competitive activity, and changes in regulation, have the potential to affect the quality of service

31 Nevertheless, negative perceptions of past company performance may stimulate customers' tendency to
complain. Continuing improvements may be needed to reverse trends resulting from such perceptions. Thus
the data must be considered in context.


available in specific regions of the country or in the industry as a whole, and these effects may be
manifested in the quality of service data.32 The Commission does not currently consider these
effects in its analysis.


5. Observations and Statistical Analysis

5.1 Observations from the Current Year Summary Data


The large company average trouble report rate increased from 169.2 troubles per thousand
lines in 2007 to 176.2 in 2008, as shown in chart 2A. Similarly, the average large company
residential repair interval length increased from 26.7 hours in 2007 to 30.8 hours in 2008, with most
large entities shown in Chart 7A reporting increasing repair interval length for 2008. The average
repair interval for the smaller companies increased from 17.0 hours in 2006, to 21.5 hours in 2007 and
to 22.7 hours in 2008, as shown in chart 7B. The average large company residential customer
dissatisfaction with repairs increased from 13.1 percent dissatisfied in 2007 to 15.1 percent
dissatisfied in 2008, and is now at its highest level in the six-year period shown in Chart 6.


By way of contrast, the average length of the residential installation interval for large
companies has remained constant over the past six years, as shown in chart 5A. Nevertheless, the
large company residential installation dissatisfaction level, as shown in chart 3, increased for the third
consecutive year from 6.2 percent dissatisfied in 2005, to 6.5 percent dissatisfied in 2006, to 7.0
percent dissatisfied in 2007 and to 8.7 percent dissatisfied in 2008. For the smaller companies, the
average installation interval, as shown in chart 5B, has also continued to increase from 2.7 days in
2005, to 2.9 days in 2006, to 3.1 days in 2007, and to 3.7 days in 2008.

The weighted average number of complaints per million access lines among the large price-cap
carriers, as shown in chart 1A, increased for the fourth consecutive year from 93.9 in 2004, to 102.0 in
2005, to 119.1 in 2006, to 124.1 in 2007 and to 147.7 in 2008.33 Data from prior years' quality of
service reports show the average large price-cap carrier complaint levels peaked at more than 255 per
million lines in the year 2000.34

5.2 Statistical Analysis


The FCC’s quality of service report tracks several key indicators of industry and company
performance. The indicators currently tracked are complaints per million lines, length of
installation intervals, length of repair intervals, percent of installation commitments met, trouble
reports per thousand lines, percent installation dissatisfaction, percent repair dissatisfaction and
percent of switches with outages. In this year’s report we update the results of the statistical


32 For example, the actions of the California Public Utilities Commission to clear a complaint backlog in 2005
may have affected complaint levels in that state.
33 The increase in the overall trend is primarily due to one entity as shown in Chart 1A. Our analysis
determined, however, that the overall industry trend was not statistically significant because most entities
have not exhibited similar trends.
34 This observation includes data over the past ten years. See, for example, Quality of Service of Incumbent
Local Exchange Carriers (November 2005), Chart 1.


analysis of these indicators using raw data samples received from reporting companies.35 The
overall goals of our statistical analysis are to:

• determine if there were any discernable trends in performance as tracked by these
indicators across the years,
• determine if reporting companies performed differently from each other,
• determine whether the large reporting companies performed differently or had different
trend behavior from small reporting companies, and
• develop models of trends in performance that could be used to predict next year’s
performance.


For the purpose of our analysis, we classified companies as “large” or “small.” This
classification is largely the same as that used earlier in creating the charts -- the larger companies36
are tracked on the charts with an “A” designation (e.g., chart 2A), and the smaller companies37 are
tracked on the charts with a “B” designation (e.g., chart 2B).


We used several types of statistical techniques in analyzing the data. These included
ANOVA (Analysis of Variance), ANCOVA (Analysis of Covariance) and simple linear
regression. They allowed us to analyze small-versus-large company effects, individual company
effects, and year effects (i.e., does performance vary from year-to-year) in the performance data
for each of the key indicators. We tested for the existence of overall trends,38 trends for only the
large companies, and trends for only the small companies. If a trend existed, we then determined
its direction and magnitude. In addition, the statistical testing allowed us to determine if the trends
varied widely across companies, if there were performance differences across companies, and if
large company performance differed from small company performance.
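
To make the trend estimates concrete, the following minimal sketch fits the log-linear trend described
in footnote 38. It is written in Python purely for illustration, and for simplicity it uses the six
weighted composite values from Chart 7A rather than the unweighted study-area samples used in our
actual analysis, so the resulting figure differs from the +4.9% large-company trend reported below:

    import math

    years = [2003, 2004, 2005, 2006, 2007, 2008]
    values = [23.3, 26.7, 31.3, 29.3, 26.7, 30.8]  # repair intervals, Chart 7A

    # Ordinary least-squares fit of log(value) on year.
    n = len(years)
    x_mean = sum(years) / n
    y_mean = sum(math.log(v) for v in values) / n
    slope = (sum((x - x_mean) * (math.log(v) - y_mean)
                 for x, v in zip(years, values))
             / sum((x - x_mean) ** 2 for x in years))

    # The fitted slope converts to an expected annual percentage change.
    annual_trend = math.exp(slope) - 1.0
    print(f"{annual_trend:+.1%} per year")  # roughly +3.9% on these composites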


The table entitled “Results of Statistical Testing of Key Industry Performance Indicators”
appearing later in this section summarizes the results of our statistical analysis on data filed by
reporting companies since the year 2002, representing the most recent six-year reporting period.39
(Note that smaller non-mandatory price cap carriers are not required to file data on all performance
indicators. These are designated as “NA” in the table.)


35 We have excluded zero values from the data when performing the statistical analysis in this year's report,
except in the complaint and switch outage categories, where zero values for individual study areas are
believed to be likely. Filing companies have been instructed to report zeros only when reporting zero values,
not to record missing data values as zero.
36 See supra n.5.
37 See supra n.6.
38 A trend is the expected annual change in the value of the performance indicator. For example, a negative
trend of -5.2% means that every year the value of the indicator is expected to decrease by 5.2%. A positive
trend (e.g., +6.3%) means that every year the value of the indicator is expected to increase by 6.3%. The
magnitude and direction of the trend for a particular performance indicator is estimated by using linear
regression to fit a line to the logarithms of the values of that performance indicator reported for all study areas
in the size class for the past six years.
39 The table is based on individual raw study area samples from the ARMIS database which have not been
weighted. The trends calculated from these samples may therefore differ from composite trends calculated as
weighted company totals.



The rows of the table contain the key indicators of company performance tracked by this
report. The columns contain the effects described above. A “Yes” entry in the table means that we
have concluded with a high level of statistical confidence that the effect for which we have tested is
indeed present. A “No” entry means that the data did not support such a conclusion. For example,
we tested to determine whether large company performance differs from small company
performance on the installation intervals indicator, and we concluded with statistical confidence that
large company performance does differ from small company performance on this indicator. We
included the direction and magnitude of a trend in the table if our statistical testing indicated that
there was a low probability the trend occurred as a result of random fluctuations in the data, i.e.,
was statistically significant. A number of the trends were found significant at less than the 0.001
level, meaning there was less than one chance in 1000 that these trends occurred as a result of
random data fluctuations. However, asterisked trends were found significant at less than the 0.01
level, but not at the 0.001 level, meaning that there was a greater probability—between one chance
in 100 and one chance in 1000— that these trends happened by chance. The word “No” appearing
in any of the first three columns of the table indicates that a trend could not be established at the
0.01 level of significance. In the last three columns of the table the word “Yes” indicates that
statistically significant differences were found among companies or groups of companies, and the
word “No” indicates that such differences could not be established statistically.


Results of Statistical Testing of Key Industry Performance Indicators

                                       Trend      Trend      Trend      Trends Vary  Performance  Large Company
                                       Over All   for Large  for Small  Widely       Differences  Performance
                                       Companies  Companies  Companies  Across       Across       Differs from
                                                                        Companies    Companies    Small Companies

Average complaints per million lines   No         No         No         No           No           No
Installation intervals                 No         No         No         Yes          Yes          Yes
Repair intervals                       +5.8%      +4.9%      +7.3%*     Yes          Yes          No
Percent commitments met                No         No         No         No           No           No
Trouble report rate per 1000 lines     No         No         No         No           No           No
Percent residential installation
  dissatisfaction                      +7.3%      +7.3%*     NA         Yes          Yes          NA
Percent residential repair
  dissatisfaction                      +4.3%*     +4.3%*     NA         No           Yes          NA
Percent switches with outages          No         -0.9%      No         Yes          No           No

All results are significant at less than the 0.001 level except as noted below.
* Indicates a result which was significant at less than the 0.01 level, but not at the .001 level.


As noted earlier, a trend represents the expected or average change in the value of the
performance indicator from year to year. Considering columns 1 through 3, we note our analysis
has allowed us to conclude with a high degree of confidence that statistically significant trends do
exist in the data for some indicators of performance. Factors other than random data variability are
likely to be responsible for these trends. However, what those factors are cannot be determined
from our data alone. (Section 4 of this report discusses some factors that may impact the data in


addition to company performance.) Observed annual performance changes may not necessarily be
in a direction or magnitude consistent with trends which are estimated using the most recent six
years of data.


Column 4 indicates that we found that company trends vary widely across companies for
about half the industry performance indicators. Column 5 shows our finding that there are
statistically significant differences in company performance for about half of the tracked
indicators. Finally, column 6 shows our finding that large company performance is statistically
indistinguishable from small company performance on all common indicators, except in the
lengths of their installation intervals.


Overall, our analysis shows that there are statistically significant trends over the most recent
six-year period for some of the performance indicators. Upward trends in the length of repair
intervals, installation dissatisfaction, and repair dissatisfaction provide evidence of longer-term
declining performance in these areas.


On a more positive note, one of our indicators -- the large company switch outage percentage --
continues to show a small but statistically significant long-term trend of improving performance.


Complaint levels present a mixed picture. Although we found no long-term statistical trend
toward either improving or declining complaint performance, we note that the average large company
complaint level has increased every year beginning in 2005 and now stands at 147.7 complaints per
million access lines. Complaint levels for one large holding company more than doubled since 2003,
as shown graphically in Chart 1A, 40 but complaint levels for most of the other entities listed within
the chart have remained relatively stable with reported average complaint levels of fewer than 150
complaints per million lines in 2008. A closer look at the graph in Chart 1A reveals that for the last
two years, only one large entity had complaint levels that exceeded the large company average. These
findings demonstrate the potential for a single large company to impact the industry statistics and
serve to caution users of this data to take individual company performance into account when
interpreting our results.41


In closing, we note that because long-term trends are calculated with performance data from
the most recent six-year period, trends tend to lag current changes in performance, and it may take
several years of sustained improved performance to reverse the direction of an unfavorable trend.
Thus, the direction, magnitude and statistical significance of trends may change in the future as
companies respond or fail to respond to quality of service issues.








40 Verizon, Qwest, AT&T (without BellSouth) and AT&T BellSouth are treated as large holding companies in
our graphical analysis this year.

41 See footnote 33.


Chart 1A

Relative Complaint Levels -- Large Price-Cap Carriers
[Line chart omitted: average complaints per million access lines by year, 2003-2008, for the entities
listed in the table below.]

Average of Residential and Business Complaints per Million Access Lines
(Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
AT&T Ameritech                        13.2     11.2     12.0      8.3     13.1     17.7
AT&T BellSouth                       128.0    131.4    137.7    119.1     96.4    133.7
AT&T Pacific                          10.6     10.4     23.3     42.1     14.7     26.1
AT&T Southwestern                     13.4     21.9     21.9     14.9     24.3     26.9
AT&T SNET                             87.1     88.5     20.4     21.1      9.2     74.9
Qwest                                103.5     89.1     80.8     69.3     65.8     73.2
Verizon GTE                           79.1    104.8    161.0    171.2    160.0    167.6
Verizon North (Combined with Verizon South)
Verizon South                        190.7    184.7    191.9    266.7    315.9    386.5
Embarq (formerly Sprint)              78.9     43.3     46.0     60.6     52.9     32.3
Weighted BOC/Embarq Composite*        94.9     93.9    102.0    119.1    124.1    147.7

* Weighted composite is calculated using access line counts.

Chart 1B

Relative Complaint Levels -- Small Price-Cap Carriers
[Line chart omitted: average complaints per million access lines by year, 2003-2008, for the entities
listed in the table below.]

Average of Residential and Business Complaints per Million Access Lines
(Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
Century Tel.                         536.9    402.9    518.8    522.4    526.1    557.7
Cincinnati Bell                      246.9    374.0    173.8    179.7    176.8    188.6
Citizens                             339.7    412.5    538.1    544.2    415.0    397.8
Citizens (Frontier)                  142.6    418.7    337.5    310.5    151.2    214.5
Hawaiian Telecom                                                          57.3     41.6
Iowa Telecom                          12.5     10.5      8.3      0.0      3.2     24.2
Windstream -- Alltel                 194.7    129.8    110.1     88.6     88.3    152.6
Windstream -- Valor                  222.4     95.3    152.3    264.1    192.2    217.6
Weighted BOC/Embarq Composite*        94.9     93.9    102.0    119.1    124.1    147.7
Weighted Small Co. Composite*        263.9    316.0    311.4    317.6    233.2    255.3

* Weighted composite is calculated using access line counts.

Chart 2A

Initial Trouble Reports per Thousand Lines -- Large Price-Cap Carriers
[Line chart omitted: number of reports by year, 2003-2008, for the entities listed in the table below.]

Total Initial Trouble Reports per Thousand Lines (Residence + Business)
(Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
AT&T Ameritech                       149.7    146.2    144.3    153.8    164.7    184.2
AT&T BellSouth                       278.5    298.2    307.3    265.8    241.6    252.5
AT&T Pacific                         119.4    116.1    129.4    101.7    101.2    116.2
AT&T Southwestern                    175.4    190.5    173.3    179.8    220.8    228.6
AT&T SNET                            180.3    165.8    184.9    176.1    147.4    164.2
Qwest                                113.4    117.6    112.6    111.3    107.7    101.9
Verizon GTE                          153.0    167.2    191.7    176.7    163.4    168.4
Verizon North (Combined with Verizon South)
Verizon South                        169.4    157.8    164.1    167.4    164.3    169.3
Embarq (formerly Sprint)             192.2    216.1    221.1    220.1    184.3    167.8
Weighted BOC/Embarq Composite*       172.4    175.8    180.9    172.3    169.2    176.2

* Weighted composite is calculated using access line counts.

Chart 2B

Initial Trouble Reports per Thousand Lines -- Small Price-Cap Carriers
[Line chart omitted: number of reports by year, 2003-2008, for the entities listed in the table below.]

Total Initial Trouble Reports per Thousand Lines (Residence + Business)
(Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
Century Tel.                         266.9    265.0    231.1    213.3    180.6    197.2
Cincinnati Bell                      114.6    113.6    131.4    119.5    118.4    130.4
Citizens                             260.2    296.0    325.1    270.3    275.0    281.8
Citizens (Frontier)                  266.6    257.2    252.4    242.7    268.8    290.1
Hawaiian Telecom                                                         114.7    122.3
Iowa Telecom                         132.6    157.2    155.4    161.1    182.6    173.7
Windstream -- Alltel                 233.5    193.1    128.2    206.2    175.2    258.6
Windstream -- Valor                  368.0    422.6    479.8    506.4    287.4    302.3
Weighted BOC/Embarq Composite*       172.4    175.8    180.9    172.3    169.2    176.2
Weighted Small Co. Composite*        237.1    244.0    245.5    241.1    206.0    227.8

* Weighted composite is calculated using access line counts.

Chart 3

Residential Installation Dissatisfaction -- BOCs
[Line chart omitted: percent dissatisfied by year, 2003-2008, for the entities listed in the table below.]

Percent Dissatisfied -- BOC Residential Installations
(Using Company Provided Composites)

ARMIS 43-06 Report                    2003     2004     2005     2006     2007     2008
AT&T Ameritech                         8.1      7.6      6.7      7.4      7.5      7.9
AT&T BellSouth                         6.7      6.4      5.7      6.2      6.7      7.2
AT&T Pacific                           6.1      6.1      6.4      6.9      5.7      7.6
AT&T Southwestern                      7.9      8.4      7.1      6.6      7.2      9.0
AT&T SNET                              7.6      8.6      8.4      8.3      9.9     13.2
Qwest                                  5.5      3.9      3.7      3.8      3.8      4.5
Verizon GTE                            3.5      5.3      6.9      7.3      7.6     10.0
Verizon North (Combined with Verizon South)
Verizon South                          6.2      6.4      6.2      6.5      8.3     11.2
Weighted BOC Composite*                6.3      6.4      6.2      6.5      7.0      8.7

* Weighted composite is calculated using access line counts.

Chart 4

Percent Residential Installation Commitments Met -- Large Price-Cap Carriers
[Line chart omitted: percent of commitments met by year, 2003-2008, for the entities listed in the
table below.]

Percent Installation Commitments Met -- Residential Services
(Using Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
AT&T Ameritech                        98.9     98.6     98.6     98.6     98.4     98.5
AT&T BellSouth                        98.2     98.7     98.7     98.2     98.2     98.5
AT&T Pacific                          99.6     99.4     99.2     99.3     99.3     99.6
AT&T Southwestern                     99.1     99.0     99.1     99.3     99.2     99.0
AT&T SNET                             99.5     99.6     99.6     99.7     99.7     99.7
Qwest                                 99.7     99.7     99.6     99.6     99.7     99.8
Verizon GTE                           98.3     98.4     98.0     97.9     98.1     98.2
Verizon North (Combined with Verizon South)
Verizon South                         98.7     98.8     98.9     98.9     98.6     98.7
Embarq (formerly Sprint)              97.5     96.8     97.2     97.0     97.3     96.8
Weighted BOC/Embarq Composite*        98.8     98.8     98.8     98.7     98.6     98.7

* Weighted composite is calculated using access line counts.

Chart 5A

Residential Installation Intervals -- Large Price-Cap Carriers
[Line chart omitted: interval in days by year, 2003-2008, for the entities listed in the table below.]

Average Residential Installation Interval in Days
(Using Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
AT&T Ameritech                         1.5      1.4      1.4      1.5      1.7      1.7
AT&T BellSouth                         1.1      1.1      1.3      1.3      1.4      1.2
AT&T Pacific                           1.5      1.6      1.5      1.6      1.3      1.1
AT&T Southwestern                      1.9      2.0      2.1      1.1      1.0      0.9
AT&T SNET                              1.0      1.0      1.0      1.1      0.8      0.7
Qwest                                  0.4      0.3      0.3      0.2      0.1       **
Verizon GTE                            0.6      0.6      0.9      0.8      0.8      1.0
Verizon North (Combined with Verizon South)
Verizon South                          1.1      1.1      1.0      1.1      1.4      1.9
Embarq (formerly Sprint)               1.4      1.7      1.7      1.8      1.6      1.5
Weighted BOC/Embarq Composite*         1.2      1.2      1.2      1.2      1.2      1.2

* Weighted composite is calculated using access line counts.
** Qwest reports an average value of less than 0.1.

Chart 5B

Residential Installation Intervals -- Small Price-Cap Carriers
[Line chart omitted: interval in days by year, 2003-2008, for the entities listed in the table below.]

Average Residential Installation Interval in Days
(Using Company Provided Composites)

ARMIS 43-05 Report                    2003     2004     2005     2006     2007     2008
Century Tel.                           3.3      1.6      1.3      0.8      0.6      0.3
Cincinnati Bell                        4.5      1.7      2.1      2.0      1.9      1.8
Citizens                               5.3      4.1      2.7      4.2      3.8      6.1
Citizens (Frontier)                    4.8      5.1      5.0      4.1      3.9      4.4
Hawaiian Telecom                                                           4.2      3.5
Iowa Telecom                           1.8      1.9      1.7      1.4      1.1      2.9
Windstream -- Alltel                   1.8      1.6      2.6      2.5      3.1      3.6
Windstream -- Valor                    2.0      1.6      2.2      2.8      4.2      4.7
Weighted BOC/Embarq Composite*         1.2      1.2      1.2      1.2      1.2      1.2
Weighted Small Co. Composite*          3.9      2.9      2.7      2.9      3.1      3.7

* Weighted composite is calculated using access line counts.

Chart 6

Residential Repair Dissatisfaction -- BOCs
[Line chart omitted: percent dissatisfied by year, 2003-2008, for the entities listed in the table below.]

Percent Dissatisfied -- BOC Residential Repairs
(Using Company Provided Composites)

ARMIS 43-06 Report                    2003     2004     2005     2006     2007     2008
AT&T Ameritech                        11.4     11.0     11.1      9.5      9.2      9.0
AT&T BellSouth                        10.1     10.0     10.1      9.0      9.7     11.5
AT&T Pacific                           7.6      7.4      8.9     10.9      8.5      9.1
AT&T Southwestern                      9.9     10.4      9.2      9.5      8.5      9.6
AT&T SNET                             11.9     11.6     11.2     13.8     10.5     14.0
Qwest                                  6.5      5.9      6.2      6.4      6.8      9.2
Verizon GTE                           11.2     14.0     16.1     16.4     16.0     16.8
Verizon North (Combined with Verizon South)
Verizon South                         20.8     19.0     20.4     22.7     22.6     27.5
Weighted BOC Composite*               12.6     12.3     13.0     13.6     13.1     15.1

* Weighted composite is calculated using access line counts.

Chart 7A

Residential Initial Out-of-Service Repair Intervals
Large Price-Cap Carriers

[Line chart, 2003-2008: average initial out-of-service repair interval in hours (vertical axis, 0.0 to 50.0) for the Weighted BOC/Embarq Composite*, the Weighted Verizon Avg., AT&T BellSouth, the Weighted AT&T Avg. (excluding BellSouth), Qwest, and Embarq (formerly Sprint). The underlying data appear in the table below.]

Average Initial Out-of-Service Repair Interval in Hours -- Residential Services
(Using Company Provided Composites)
ARMIS 43-05 Report

                                              2003      2004      2005      2006      2007      2008
AT&T Ameritech                                16.8      17.2      16.3      17.3      22.3      26.4
AT&T BellSouth                                21.5      33.5      44.8      20.6      19.8      25.7
AT&T Pacific                                  25.8      28.8      45.2      52.6      32.2      32.9
AT&T Southwestern                             22.1      29.0      24.6      22.4      31.2      31.0
AT&T SNET                                     26.7      27.2      30.6      34.4      22.2      34.9
Qwest                                         14.7      16.3      18.8      18.3      17.0      18.2
Verizon GTE                                   15.7      28.9      28.5      24.2      24.1      31.5
Verizon North (Combined with Verizon South)
Verizon South                                 34.5      29.2      34.3      40.5      36.0      42.3
Embarq (formerly Sprint)                      17.3      22.6      23.8      18.8      18.0      20.1
Weighted BOC/Embarq Composite*                23.3      26.7      31.3      29.3      26.7      30.8

* Weighted composite is calculated using access line counts.

Chart 7B

Residential Initial Out-of-Service Repair Intervals
Small Price-Cap Carriers

[Line chart, 2003-2008: average initial out-of-service repair interval in hours (vertical axis, 0.0 to 40.0) for the Weighted BOC/Embarq Composite*, Windstream -- Alltel, Cincinnati Bell, Citizens, Century Tel., the Weighted Small Co. Composite*, Iowa Telecom, and Windstream -- Valor. The underlying data appear in the table below.]

Average Initial Out-of-Service Repair Interval in Hours -- Residential Services
(Using Company Provided Composites)
ARMIS 43-05 Report

                                              2003      2004      2005      2006      2007      2008
Century Tel.                                  14.9      13.9      16.4      9.5       17.5      17.0
Cincinnati Bell                               37.5      28.2      30.3      21.6      21.3      24.9
Citizens                                      16.3      16.7      18.1      17.7      18.2      23.8
Citizens (Frontier)                           28.1      22.3      17.6      17.0      17.5      25.2
Hawaiian Telecom                              --        --        --        --        43.4      35.7
Iowa Telecom                                  10.1      11.1      11.3      12.2      19.1      18.1
Windstream -- Alltel                          25.9      15.4      13.6      14.6      16.0      16.5
Windstream -- Valor                           16.8      17.3      21.1      21.9      23.7      16.1
Weighted BOC/Embarq Composite*                23.3      26.7      31.3      29.3      26.7      30.8
Weighted Small Co. Composite*                 23.1      22.2      19.2      17.0      21.5      22.7

* Weighted composite is calculated using access line counts.  Dashes indicate years for which no value was reported.

Chart 8

Percentage of Switches with Downtime
Large Price-Cap Carriers

[Line chart, 2003-2008: percentage of switches with downtime (vertical axis, 0.0 to 25.0) for the Weighted BOC/Embarq Composite*, the Weighted Verizon Avg., AT&T BellSouth, the Weighted AT&T Avg. (excluding BellSouth), Qwest, and Embarq. The underlying data appear in the table below.]

Percentage of Switches with Downtime
(Calculated Using Data from Company Provided Composites)
ARMIS 43-05 Report

                                              2003      2004      2005      2006      2007      2008
AT&T Ameritech                                1.5       1.0       0.3       0.4       0.4       0.3
AT&T BellSouth                                2.5       1.6       2.3       0.9       0.4       0.6
AT&T Pacific                                  3.3       3.7       2.3       1.9       11.2      1.0
AT&T Southwestern                             3.9       1.5       1.2       1.5       3.8       0.6
AT&T SNET                                     0.6       6.2       1.3       9.0       0.0       0.6
Qwest                                         11.1      20.0      13.7      10.0      11.9      12.7
Verizon GTE                                   2.7       1.5       1.5       3.2       2.4       3.9
Verizon North (Combined with Verizon South)
Verizon South                                 4.4       0.9       0.8       0.8       0.4       1.5
Embarq (formerly Sprint)                      3.5       7.5       13.8      8.3       10.5      0.1
Weighted BOC/Embarq Composite*                3.9       3.7       3.2       2.7       3.9       2.4

* Weighted composite is calculated using access line counts.

Table 1(a):  Installation, Maintenance, & Customer Complaints
Mandatory Price-Cap Company Comparison -- 2008

                                              AT&T      AT&T      AT&T      AT&T      AT&T                Verizon   Verizon   Verizon
                                              Ameritech BellSouth Pacific   SWBT      SNET      Qwest     North     South     GTE

Access Services Provided to Carriers -- Switched Access
Percent Installation Commitments Met          99.7      100.0     99.7      98.6      86.4      99.4      99.4      99.7      96.8
Average Installation Interval (days)          21.2      17.7      20.4      23.2      15.4      15.1      12.9      12.4      17.2
Average Repair Interval (hours)               8.3       11.1      11.8      4.3       3.1       2.6       2.5       8.5       8.4

Access Services Provided to Carriers -- Special Access
Percent Installation Commitments Met          89.4      99.4      94.6      97.5      98.4      96.5      95.0      94.7      96.3
Average Installation Interval (days)          18.0      14.2      16.3      16.0      19.3      4.3       12.9      12.9      11.8
Average Repair Interval (hours)               6.0       3.5       5.8       5.0       4.1       3.7       4.0       3.7       3.8

Local Services Provided to Res. and Business Customers
Percent Installation Commitments Met          98.4      97.3      99.5      99.1      99.7      99.7      99.0      98.3      98.0
  Residence                                   98.5      98.5      99.6      99.0      99.7      99.8      99.1      98.4      98.2
  Business                                    98.3      89.5      99.3      99.0      99.5      99.1      98.2      97.4      95.8
Average Installation Interval (days)          1.8       1.3       1.2       1.0       1.3       0.1       2.5       1.8       1.1
  Residence                                   1.8       1.2       1.1       0.9       0.7       0.0       2.6       1.7       1.0
  Business                                    1.7       1.3       1.8       1.5       3.4       0.4       1.6       2.1       2.0
Avg. Out of Svc. Repair Interval (hours)      25.5      23.6      31.7      29.5      33.9      17.5      27.3      46.2      29.0
  Total Residence                             26.4      25.7      32.9      31.0      34.9      18.2      29.4      52.2      31.5
  Total Business                              21.3      14.5      24.8      23.8      28.3      14.8      20.4      18.9      15.7

Initial Trouble Reports per Thousand Lines    184.2     252.5     116.2     228.6     164.2     101.9     181.7     161.5     168.4
  Total MSA                                   184.6     242.8     115.2     227.1     161.9     114.8     179.3     154.6     159.1
  Total Non MSA                               180.3     308.8     143.6     235.3     187.7     43.0      235.8     248.2     206.7
  Total Residence                             258.6     319.9     167.2     299.9     218.6     128.3     246.2     231.7     211.0
  Total Business                              81.1      136.5     46.4      105.2     68.8      55.5      96.0      69.4      84.8
Troubles Found per Thousand Lines             147.3     182.7     93.7      174.6     116.1     84.0      148.2     128.6     138.7
Repeat Troubles as a Pct. of Trouble Rpts.    13.4%     15.4%     9.4%      14.6%     14.1%     20.0%     17.0%     16.1%     15.2%

Residential Complaints per Million Res. Access Lines
                                              30.5      202.3     44.0      42.9      134.5     121.3     97.5      1,118.4   278.8
Business Complaints per Million Business Access Lines
                                              4.9       65.2      8.1       10.9      15.4      25.2      24.6      68.1      56.3

* Please refer to text for notes and data qualifications.

Table 1(b):  Switch Downtime & Trunk Blocking
Mandatory Price-Cap Company Comparison -- 2008

                                              AT&T      AT&T      AT&T      AT&T      AT&T                Verizon   Verizon   Verizon
                                              Ameritech BellSouth Pacific   SWBT      SNET      Qwest     North     South     GTE

Total Access Lines in Thousands               11,627    15,519    12,254    10,034    1,436     10,199    9,292     14,708    10,994
Total Trunk Groups                            763       1,299     922       623       88        1,357     633       1,011     1,504
Total Switches                                1,432     1,608     778       1,591     181       1,304     943       1,403     2,411

Switches with Downtime
  Number of Switches                          5         12        8         9         1         166       9         26        93
  As a percentage of Total Switches           0.3%      0.7%      1.0%      0.6%      0.6%      12.7%     1.0%      1.9%      3.9%

Average Switch Downtime in Seconds per Switch*
  For All Events (including events over 2 minutes)
                                              10.6      83.4      70.9      672.3     1.0       77.6      65.8      44.7      634.7
  For Unscheduled Events Over 2 Minutes       10.5      83.4      70.7      672.0     1.0       70.3      65.5      37.4      634.1

For Unscheduled Downtime More than 2 Minutes
  Number of Occurrences or Events             3.0       7.0       6.0       3.0       1.0       19.0      4.0       13.0      115.0
  Events per Hundred Switches                 0.2       0.4       0.8       0.2       0.6       1.5       0.4       0.9       4.8
  Events per Million Access Lines             0.3       0.5       0.5       0.3       0.7       1.9       0.4       0.9       10.5
  Average Outage Duration in Minutes          83.8      319.2     152.8     5,939.8   3.0       80.4      257.4     67.2      221.6
  Average Lines Affected per Event in Thousands
                                              8.7       7.0       23.1      13.2      24.8      8.6       0.5       13.3      2.2
  Outage Line-Minutes per Event in Thousands  185.9     486.6     4,430.7   7,541.0   74.4      174.2     74.5      1,319.3   219.6
  Outage Line-Minutes per 1,000 Access Lines  48.0      219.5     2,169.4   2,254.6   51.8      324.4     32.1      1,166.1   2,296.6

For Scheduled Downtime More than 2 Minutes
  Number of Occurrences or Events             0.0       0.0       0.0       1.0       0.0       3.0       0.0       1.0       1.0
  Events per Hundred Switches                 0.0       0.0       0.0       0.1       0.0       0.2       0.0       0.1       0.0
  Events per Million Access Lines             0.0       0.0       0.0       0.1       0.0       0.3       0.0       0.1       0.1
  Average Outage Duration in Minutes          NA        NA        NA        5.8       NA        5.3       NA        155.0     12.8
  Avg. Lines Affected per Event in Thousands  NA        NA        NA        8.7       NA        13.8      NA        2.3       23.2
  Outage Line-Minutes per Event in Thousands  NA        NA        NA        50.4      NA        83.6      NA        354.8     295.3
  Outage Line-Minutes per 1,000 Access Lines  0.0       0.0       0.0       5.0       0.0       24.6      0.0       24.1      26.9

% Trunk Grps. Exceeding Blocking Objectives   0.0%      12.8%     1.4%      3.9%      0.0%      29.1%     3.2%      9.4%      0.7%

* Aggregate downtime divided by total number of company switches.
Please refer to text for notes and data qualifications.

Table 1(c):  Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Mandatory Price-Cap Company Comparison -- 2008

                                              AT&T      AT&T      AT&T      AT&T      AT&T                Verizon   Verizon   Verizon
                                              Ameritech BellSouth Pacific   SWBT      SNET      Qwest     North     South     GTE

Total Number of Outages
 1. Scheduled                                 0         0         0         1         0         3         0         1         1
 2. Proced. Errors -- Telco. (Inst./Maint.)   0         1         2         1         0         0         0         3         6
 3. Proced. Errors -- Telco. (Other)          0         0         0         0         0         0         0         0         0
 4. Procedural Errors -- System Vendors       0         0         0         0         0         3         0         0         0
 5. Procedural Errors -- Other Vendors        1         0         1         0         0         0         1         0         1
 6. Software Design                           0         0         0         0         1         0         0         2         1
 7. Hardware Design                           0         0         0         0         0         0         0         0         1
 8. Hardware Failure                          0         2         3         0         0         12        2         3         39
 9. Natural Causes                            1         2         0         2         0         1         0         0         0
10. Traffic Overload                          0         0         0         0         0         0         0         0         0
11. Environmental                             0         0         0         0         0         0         0         1         6
12. External Power Failure                    0         0         0         0         0         1         0         2         54
13. Massive Line Outage                       0         0         0         0         0         0         0         0         0
14. Remote                                    0         0         0         1         0         3         0         1         1
15. Other/Unknown                             1         0         0         0         0         0         1         2         5

Total Outage Line-Minutes per Thousand Access Lines
 1. Scheduled                                 0.0       0.0       0.0       5.0       0.0       24.6      0.0       24.1      26.9
 2. Proced. Errors -- Telco. (Inst./Maint.)   0.0       5.4       15.2      40.8      0.0       0.0       0.0       25.2      211.5
 3. Proced. Errors -- Telco. (Other)          0.0       1.5       0.0       0.0       0.0       0.0       0.0       0.0       6.7
 4. Procedural Errors -- System Vendors       0.0       0.0       0.0       0.0       0.0       31.6      0.0       0.0       0.0
 5. Procedural Errors -- Other Vendors        25.4      0.0       2,069.2   0.0       0.0       0.0       1.0       0.0       8.6
 6. Software Design                           0.0       0.0       0.0       0.0       52.0      0.0       0.0       10.0      0.0
 7. Hardware Design                           0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       12.2
 8. Hardware Failure                          0.0       44.9      84.9      0.0       0.0       237.9     20.0      47.6      471.9
 9. Natural Causes                            4.9       167.6     0.0       2,213.8   0.0       47.6      0.0       0.0       0.0
10. Traffic Overload                          0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0
11. Environmental                             0.0       0.0       0.0       0.0       0.0       0.0       0.0       899.0     49.0
12. External Power Failure                    0.0       0.0       0.0       0.0       0.0       5.6       0.0       183.5     1,520.7
13. Massive Line Outage                       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0
14. Remote                                    0.0       0.0       0.0       0.0       0.0       1.8       0.0       0.0       1.4
15. Other/Unknown                             17.7      0.0       0.0       0.0       0.0       0.0       11.1      0.3       14.5

* Please refer to text for notes and data qualifications.

Table 1(d):  Company Comparison -- 2008 Customer Perception Surveys
Mandatory Price-Cap Companies

                                              AT&T      AT&T      AT&T      AT&T      AT&T                Verizon   Verizon   Verizon
                                              Ameritech BellSouth Pacific   SWBT      SNET      Qwest     North     South     GTE

Percentage of Customers Dissatisfied

Installations:
  Residential                                 7.93%     7.22%     7.64%     8.95%     13.18%    4.52%     9.31%     11.24%    10.03%
  Small Business                              10.08%    8.59%     7.78%     7.47%     11.39%    6.32%     14.44%    16.98%    13.98%
  Large Business                              NA        NA        NA        NA        NA        NA        16.11%    14.12%    13.16%

Repairs:
  Residential                                 8.97%     11.54%    9.13%     9.62%     14.00%    9.24%     18.91%    27.48%    16.79%
  Small Business                              8.02%     6.30%     6.50%     7.73%     12.21%    8.56%     14.64%    14.98%    12.29%
  Large Business                              NA        NA        NA        NA        NA        NA        9.69%     12.47%    12.57%

Business Office:
  Residential                                 12.14%    10.43%    7.63%     9.61%     12.04%    4.36%     12.39%    16.03%    15.63%
  Small Business                              7.33%     9.31%     6.08%     7.90%     12.21%    6.09%     10.78%    13.22%    13.80%
  Large Business                              NA        NA        NA        NA        11.96%    NA        27.86%    33.42%    28.82%

* Please refer to text for notes and data qualifications.

Table 1(e):  Company Comparison -- 2008 Customer Perception Surveys
Mandatory Price-Cap Companies

                                              AT&T      AT&T      AT&T      AT&T      AT&T                Verizon   Verizon   Verizon
                                              Ameritech BellSouth Pacific   SWBT      SNET      Qwest     North     South     GTE

Sample Sizes -- Customer Perception Surveys

Installations:
  Residential                                 6,075     6,077     5,958     5,989     2,428     2,669     15,916    20,149    26,087
  Small Business                              6,012     6,043     6,057     5,996     316       807       9,824     11,814    12,228
  Large Business                              0         0         0         0         0         0         267       354       190

Repairs:
  Residential                                 6,064     6,058     6,169     5,601     1,229     4,256     14,474    16,698    18,944
  Small Business                              6,068     4,265     5,984     5,805     893       2,410     9,857     11,178    11,218
  Large Business                              0         0         0         0         0         0         258       369       191

Business Office:
  Residential                                 12,409    10,380    12,045    11,979    1,761     21,439    10,349    12,776    18,234
  Small Business                              11,611    9,767     10,837    11,372    598       3,188     3,470     6,031     5,427
  Large Business                              0         0         0         0         184       0         219       325       170

* Please refer to text for notes and data qualifications.

Table 2(a):  Installation, Maintenance, & Customer Complaints
Non-Mandatory Price-Cap Company Comparison -- 2008

                                              Century   Cincinnati Citizens  Citizens  Embarq    Hawaiian  Iowa      Windstream Windstream
                                              Tel.      Bell                 Frontier            Telecom   Telecom   Alltel     Valor

Access Services Provided to Carriers -- Switched Access
Percent Installation Commitments Met          97.4      95.3      86.1      93.9      92.0      92.5      50.3      99.6      93.1
Average Installation Interval (days)          18.0      54.6      33.6      20.5      10.0      10.5      18.1      4.7       5.7
Average Repair Interval (hours)               206.0     NA        21.6      24.7      2.2       22.8      9.5       3.8       9.2

Access Services Provided to Carriers -- Special Access
Percent Installation Commitments Met          88.2      91.7      86.9      96.1      93.1      81.8      85.6      97.4      84.3
Average Installation Interval (days)          18.7      47.4      14.4      17.8      11.9      13.8      1.8       7.2       8.6
Average Repair Interval (hours)               115.5     7.2       21.1      71.4      3.8       23.5      20.3      3.6       11.2

Local Services Provided to Res. and Business Customers
Percent Installation Commitments Met          98.3      99.5      95.1      97.6      96.5      90.4      96.2      96.8      96.6
  Residence                                   99.0      99.7      95.2      97.9      96.8      91.8      96.4      97.2      96.9
  Business                                    95.7      99.1      94.5      96.2      95.2      80.8      95.3      93.5      92.7
Average Installation Interval (days)          0.5       2.2       6.1       4.6       1.6       3.5       2.9       3.6       4.6
  Residence                                   0.4       1.8       6.1       4.4       1.5       3.5       2.9       3.6       4.7
  Business                                    1.2       4.1       6.0       5.6       2.0       3.2       2.7       3.6       4.7
Avg. Out of Svc. Repair Interval (hours)      17.0      23.7      23.7      24.6      19.9      32.3      17.4      16.1      16.0
  Total Residence                             17.1      24.9      24.1      25.2      20.1      35.7      18.1      16.5      16.1
  Total Business                              16.1      15.7      20.8      21.8      18.4      24.7      11.4      13.4      15.6

Initial Trouble Reports per Thousand Lines    197.2     130.4     284.8     290.1     167.8     101.0     173.7     179.8     254.4
  Total MSA                                   176.4     130.4     NA        303.0     140.5     103.4     174.3     159.3     155.2
  Total Non MSA                               215.7     NA        284.9     278.9     226.1     96.7      173.6     197.4     330.4
  Total Residence                             241.7     178.7     329.0     329.1     215.0     122.3     203.7     258.6     302.3
  Total Business                              76.9      47.9      162.8     179.1     72.5      70.1      84.6      63.1      129.2
Troubles Found per Thousand Lines             172.5     120.6     256.8     265.4     95.0      88.1      154.8     150.1     208.5
Repeat Troubles as a Pct. of Trouble Rpts.    11.5%     10.6%     21.1%     15.1%     20.2%     11.5%     19.3%     17.4%     21.9%

Residential Complaints per Million Res. Access Lines
                                              900.1     308.2     674.8     325.4     52.3      58.3      48.4      273.9     335.1
Business Complaints per Million Bus. Access Lines
                                              215.2     69.1      120.8     103.5     12.2      24.8      0.0       31.2      100.1

* Please refer to text for notes and data qualifications.

Table 2(b):  Switch Downtime & Trunk Blocking
Non-Mandatory Price-Cap Company Comparison -- 2008

                                              Century   Cincinnati Citizens  Citizens  Embarq    Hawaiian  Iowa      Windstream Windstream
                                              Tel.      Bell                 Frontier            Telecom   Telecom   Alltel     Valor

Total Access Lines in Thousands               498       705       1,034     594       5,688     493       193       636       433
Total Trunk Groups                            299       44        244       308       243       76        53        97        253
Total Switches                                187       91        208       74        1,320     157       270       243       265

Switches with Downtime
  Number of Switches                          0         3         12        5         1         6         29        57        81
  As a percentage of Total Switches           0.0%      3.3%      5.8%      6.8%      0.0%      3.8%      10.7%     23.5%     30.6%

Average Switch Downtime in Seconds per Switch*
  For All Events (including events over 2 minutes)
                                              0.0       33.7      827.9     932.4     45.2      587.0     7,973.8   12,458.5  36,571.2
  For Unscheduled Events Over 2 Minutes       NA        NA        827.9     466.2     45.2      587.0     7,973.8   12,149.4  36,076.8

For Unscheduled Downtime More than 2 Minutes
  Number of Occurrences or Events             0.0       0.0       14.0      3.0       1.0       7.0       29.0      164.0     380.0
  Events per Hundred Switches                 0.0       0.0       6.7       4.1       0.1       4.5       10.7      67.5      143.4
  Events per Million Access Lines             0.0       0.0       13.5      5.1       0.2       14.2      150.0     257.8     877.2
  Average Outage Duration in Minutes          NA        NA        205.0     191.7     994.0     219.4     1,237.3   300.0     419.3
  Average Lines Affected per Event in Thousands
                                              NA        NA        3.4       1.9       25.8      3.4       0.6       3.0       1.2
  Outage Line-Minutes per Event in Thousands  NA        NA        643.4     436.3     25,693.9  842.9     515.4     1,147.6   408.4
  Outage Line-Minutes per 1,000 Access Lines  0.0       0.0       8,713.8   2,203.6   4,517.4   11,974.5  77,300.8  295,875.5 358,201.1

For Scheduled Downtime More than 2 Minutes
  Number of Occurrences or Events             0.0       0.0       0.0       3.0       0.0       0.0       0.0       14.0      13.0
  Events per Hundred Switches                 0.0       0.0       0.0       4.1       0.0       0.0       0.0       5.8       4.9
  Events per Million Access Lines             0.0       0.0       0.0       5.1       0.0       0.0       0.0       22.0      30.0
  Average Outage Duration in Minutes          NA        NA        NA        191.7     NA        NA        NA        86.4      160.6
  Avg. Lines Affected per Event in Thousands  NA        NA        NA        1.9       NA        NA        NA        4.4       1.4
  Outage Line-Minutes per Event in Thousands  NA        NA        NA        440.0     NA        NA        NA        306.1     100.9
  Outage Line-Minutes per 1,000 Access Lines  0.0       0.0       0.0       2,221.9   0.0       0.0       0.0       6,737.1   3,028.0

% Trunk Grps. Exceeding Blocking Objectives   2.0%      43.2%     0.0%      8.1%      17.7%     0.0%      0.0%      0.0%      0.0%

* Aggregate downtime divided by total number of company switches.
Please refer to text for notes and data qualifications.

Table 2(c):  Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Non-Mandatory Price-Cap Company Comparison -- 2008

                                              Century   Cincinnati Citizens  Citizens  Embarq    Hawaiian  Iowa      Windstream Windstream
                                              Tel.      Bell                 Frontier            Telecom   Telecom   Alltel     Valor

Total Number of Outages
 1. Scheduled                                 0         0         0         3         0         0         0         14        13
 2. Proced. Errors -- Telco. (Inst./Maint.)   0         0         1         0         0         1         6         2         3
 3. Proced. Errors -- Telco. (Other)          0         0         0         0         0         0         0         0         0
 4. Procedural Errors -- System Vendors       0         0         0         0         0         0         0         1         0
 5. Procedural Errors -- Other Vendors        0         0         2         0         0         0         5         1         3
 6. Software Design                           0         0         2         0         0         0         6         0         5
 7. Hardware Design                           0         0         0         0         0         0         0         0         0
 8. Hardware Failure                          0         0         3         0         0         3         1         89        131
 9. Natural Causes                            0         0         1         0         0         0         0         5         76
10. Traffic Overload                          0         0         0         0         0         0         0         1         0
11. Environmental                             0         0         1         0         0         0         4         1         11
12. External Power Failure                    0         0         4         3         1         3         5         7         34
13. Massive Line Outage                       0         0         0         0         0         0         1         21        30
14. Remote                                    0         0         0         3         0         0         0         14        13
15. Other/Unknown                             0         0         0         0         0         0         1         30        72

Total Outage Line-Minutes per Thousand Access Lines
 1. Scheduled                                 0.0       0.0       0.0       2,221.9   0.0       0.0       0.0       6,737.1   3,028.0
 2. Proced. Errors -- Telco. (Inst./Maint.)   0.0       0.0       83.4      0.0       0.0       344.0     1,234.5   233.0     268.2
 3. Proced. Errors -- Telco. (Other)          0.0       0.0       0.0       0.0       0.0       0.0       0.0       2,599.9   1,168.4
 4. Procedural Errors -- System Vendors       0.0       0.0       0.0       0.0       0.0       0.0       0.0       1,080.5   0.0
 5. Procedural Errors -- Other Vendors        0.0       0.0       78.3      0.0       0.0       0.0       2,596.9   0.1       5,422.0
 6. Software Design                           0.0       0.0       759.0     0.0       0.0       0.0       3,476.0   0.0       340.0
 7. Hardware Design                           0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0       0.0
 8. Hardware Failure                          0.0       0.0       1,963.6   0.0       0.0       3,672.4   2,280.0   162,733.0 46,463.4
 9. Natural Causes                            0.0       0.0       2,491.2   0.0       0.0       0.0       0.0       9,603.5   67,526.7
10. Traffic Overload                          0.0       0.0       0.0       0.0       0.0       0.0       0.0       22.7      0.0
11. Environmental                             0.0       0.0       52.0      0.0       0.0       0.0       56,230.0  43.0      2,190.0
12. External Power Failure                    0.0       0.0       3,286.1   2,203.6   4,517.4   7,958.1   11,390.0  6,286.4   7,415.2
13. Massive Line Outage                       0.0       0.0       0.0       0.0       0.0       0.0       50.1      102,386.7 126,013.4
14. Remote                                    0.0       0.0       0.0       0.0       0.0       0.0       0.0       244.3     13,807.0
15. Other/Unknown                             0.0       0.0       0.0       0.0       0.0       0.0       42.7      10,642.1  87,586.6

* Please refer to text for notes and data qualifications.



Appendix A – Description of Key Terminology in the Tables



This Appendix contains descriptions of key terms that appear in the tables and charts
of the Quality of Service Report. The data elements in the tables are derived from raw
source data for individual study areas submitted by carriers in the ARMIS 43-05 reports. A
detailed specification of each element used in the tables of this summary report follows this
general description. Data in the charts are derived from composite data provided by the
companies.

1. Percent of Installation Commitments Met


This term represents the percentage of installations completed by the date the
company promised to the customer. The associated data are presented separately
for residential and business customers' local service in the tables. These data
are also summarized in the accompanying charts.

2. Average Installation Interval (in days)

This term represents the average interval (in days) between the installation
service order and completion of installation. The associated ARMIS 43-05
report data are highlighted in the accompanying charts along with customer
installation dissatisfaction data from the ARMIS 43-06 report.

3. Average Repair Interval (in hours)


This term represents the average time (in hours) taken by the company to repair
access lines, with service subcategories for switched access, high-speed special
access, and all special access. Repair interval data are also highlighted in the
accompanying charts, along with results from company-conducted surveys of
customer repair dissatisfaction. This customer feedback is extracted from the
ARMIS 43-06 report.

4. Initial Trouble Reports per Thousand Access Lines


This term is calculated as the total count of trouble reports reported as "initial
trouble reports," divided by the number of access lines in thousands. (Note that
multiple calls within a 30-day period associated with the same problem are
counted as a single initial trouble, and that the access line count used in the
calculation is the total number of access lines divided by 1,000.)

5. Found or Verified Troubles per Thousand Access Lines


This term is calculated as 1000 times the number of verified troubles divided by
the number of access lines. Only those trouble reports for which the company
identified a problem are included.

6. Repeat Troubles as a percent of Initial Trouble Reports


This term is calculated as the number of initial trouble reports cleared by the
company that recur, or remain unresolved, within 30 days of the initial trouble
report, divided by the number of initial trouble reports as described above.

7. Complaints per Million Access Lines


This term is calculated as 1 million times the number of residential and business
customer complaints reported to state or federal regulatory bodies during the
reporting period, divided by the number of access lines.
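The four normalized rates defined in items 4 through 7 are simple ratios. The following Python sketch is our illustration of the arithmetic; the argument names are assumptions, not the ARMIS schema.

    def initial_troubles_per_thousand(initial_reports, access_lines):
        # Item 4: initial trouble reports per thousand access lines.
        return 1000.0 * initial_reports / access_lines

    def found_troubles_per_thousand(verified_troubles, access_lines):
        # Item 5: only troubles for which the company identified a problem.
        return 1000.0 * verified_troubles / access_lines

    def repeat_trouble_share(repeat_reports, initial_reports):
        # Item 6: reports that recur, or remain unresolved, within 30 days,
        # as a fraction of initial trouble reports.
        return repeat_reports / initial_reports

    def complaints_per_million(complaints, access_lines):
        # Item 7: complaints reported to regulators, per million access lines.
        return 1_000_000.0 * complaints / access_lines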

8. Number of Access Lines, Trunk Groups and Switches


These terms represent the numbers of in-service access lines, trunk groups, and
switches, respectively, as shown in the ARMIS 43-05 report. Trunk groups only
include common trunk groups between Incumbent Local Exchange Carrier
(ILEC) access tandems and ILEC end offices. When comparing current data
herein with data in prior reports, the reader should note that access lines were
reported in thousands in pre-1997 data submissions. Starting with 1997 data
submissions, access line information in the raw carrier data filings has been
reported in whole numbers.

9. Switches with Downtime


This term represents the number of network switches experiencing downtime
and the percentage of the total number of company network switches
experiencing downtime.

10. Average Switch Downtime in Seconds per Switch


This term includes (1) the total switch downtime divided by the total number of
company network switches and (2) the total switch downtime for outages longer
than 2 minutes divided by the total number of switches. Results for average
overall switch downtime are shown in seconds per switch.
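In code form, the two averages in item 10 are as follows. This is an illustrative sketch with assumed argument names; consistent with the specification tables later in this Appendix, total downtime is assumed to be given in minutes and converted to seconds.

    def avg_downtime_sec_per_switch(total_downtime_min, total_switches):
        # Total switch downtime spread over every company switch,
        # expressed in seconds per switch.
        return 60.0 * total_downtime_min / total_switches

    def avg_long_outage_downtime_sec_per_switch(downtime_over_2min_min,
                                                total_switches):
        # The same calculation restricted to outages longer than 2 minutes.
        return 60.0 * downtime_over_2min_min / total_switches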

11. Unscheduled Downtime Over 2 Minutes per Occurrence


This term presents several summary statistics: (1) the number of unscheduled
occurrences lasting more than 2 minutes, (2) the number of occurrences per
million access lines, (3) the average number of minutes per occurrence, (4) the
average number of lines affected per occurrence, (5) the average number of
line-minutes per occurrence in thousands, and (6) the outage line-minutes per
access line. For each outage, the number of lines affected is multiplied by the
duration of the outage to give the line-minutes of outage; summing these
products gives total outage line-minutes. Dividing that total by the number of
access lines yields line-minutes per access line, and dividing it by the number
of occurrences yields line-minutes per occurrence. These two normalizations
characterize the magnitude of the outages and provide a realistic means of
comparing their impact across companies. Data are also presented for each
company showing the number of outages and outage line-minutes by cause.

12. Scheduled Downtime Over 2 Minutes per Occurrence


This term is determined as in item 11, above, except that it consists of scheduled
occurrences.
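The statistics in items 11 and 12 can be summarized with the short sketch below. The record layout (duration, lines affected, scheduled flag) is our assumption for illustration, not the ARMIS file format.

    from dataclasses import dataclass

    @dataclass
    class Outage:
        duration_min: float      # outage duration in minutes
        lines_affected: float    # access lines affected by the outage
        scheduled: bool          # True for scheduled events

    def outage_summary(outages, access_lines, scheduled=False):
        # Keep only events of the requested type lasting more than 2 minutes.
        events = [o for o in outages
                  if o.scheduled == scheduled and o.duration_min > 2]
        n = len(events)
        if n == 0:
            return None
        # Line-minutes: lines affected times duration, summed over events.
        line_minutes = sum(o.duration_min * o.lines_affected for o in events)
        return {
            "events": n,
            "events_per_million_lines": 1_000_000 * n / access_lines,
            "avg_duration_min": sum(o.duration_min for o in events) / n,
            "avg_lines_affected": sum(o.lines_affected for o in events) / n,
            "line_minutes_per_event": line_minutes / n,
            "line_minutes_per_1000_lines": 1000 * line_minutes / access_lines,
        }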

13. Percent of Trunk Groups Exceeding Blocking Objectives


This term gives the percentage of trunk groups exceeding their design
blocking objectives (typically 0.5 percent for trunk groups that include feature
group D and 1.0 percent for other trunk groups) for three or more consecutive
months. The trunk groups measured and reported are interexchange access
facilities; these represent only a small portion of the total trunk groups in
service.
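A sketch of the consecutive-month screen described above (our illustration; the threshold values follow the text):

    def exceeds_blocking_objective(monthly_blocking_pct, feature_group_d=False):
        # A trunk group is counted when its measured blocking exceeds the
        # design objective for three or more consecutive months.
        objective = 0.5 if feature_group_d else 1.0
        run = 0
        for pct in monthly_blocking_pct:
            run = run + 1 if pct > objective else 0
            if run >= 3:
                return True
        return False

    # Example: blocking exceeds a 1.0 percent objective in months 4-6.
    print(exceeds_blocking_objective([0.2, 0.4, 0.9, 1.2, 1.5, 1.1,
                                      0.3, 0.2, 0.1, 0.2, 0.4, 0.3]))  # True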

Appendix A

Detailed Quality of Service Report Table Specifications


Report Tables 1(a) and 2(a) (ARMIS 43-05 data)

Statistic                                       Specification

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met          row 112 weighted by row 110 (column aa)
  Average Installation Interval (days)          row 114 weighted by row 110 (column aa)
  Average Repair Interval (hours)               row 121 weighted by row 120 (column aa)

Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met          row 112 weighted by row 110 (column ac)
  Average Installation Interval (days)          row 114 weighted by row 110 (column ac)
  Average Repair Interval (hours)               row 121 weighted by row 120 (column ac)

Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met          row 132 weighted by row 130 (column aj)
    Residence                                   row 132 weighted by row 130 (column af)
    Business                                    row 132 weighted by row 130 (column ai)
  Average Installation Interval (days)          row 134 weighted by row 130 (column aj)
    Residence                                   row 134 weighted by row 130 (column af)
    Business                                    row 134 weighted by row 130 (column ai)
  Avg. Out of Svc. Repair Interval (hours)      row 145 weighted by row 144 (column aj)
    Total Residence                             row 145 weighted by row 144 (column af)
    Total Business                              row 145 weighted by row 144 (column ai)

Initial Trouble Reports per Thousand Lines      1000 * (row 141 column aj)/ (row 140 column aj)
  Total MSA                                     1000 * (row 141 column ad + column ag)/ (row 140 column ad + column ag)
  Total Non MSA                                 1000 * (row 141 column ae + column ah)/ (row 140 column ae + column ah)
  Total Residence                               1000 * (row 141 column af)/ (row 140 column af)
  Total Business                                1000 * (row 141 column ai)/ (row 140 column ai)
Troubles Found per Thousand Lines               1000 * (row 141 column aj - row 143 column aj)/ (row 140 column aj)
Repeat Troubles as a Pct. of Trouble Rpts.      (row 142 column aj)/ (row 141 column aj)

Residential Complaints per Million Res. Access Lines
                                                (row 331 column da + row 332 column da)/ (row 330 column da)
Business Complaints per Million Bus. Access Lines
                                                (row 321 column da + row 322 column da)/ (row 320 column da)
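The "row X weighted by row Y" convention above denotes a weighted average across a company's study areas, with the paired count row (e.g., orders or repairs) as the weight. A minimal sketch of the calculation follows; the study-area records and key names are hypothetical, used only to illustrate the row pairing.

    def weighted_row(study_areas, value_key, weight_key):
        # Weighted average of a per-study-area statistic, using the
        # paired count row as the weight.
        total_weight = sum(sa[weight_key] for sa in study_areas)
        return sum(sa[value_key] * sa[weight_key]
                   for sa in study_areas) / total_weight

    # e.g., Percent Installation Commitments Met, switched access:
    areas = [{"row112_aa": 99.0, "row110_aa": 400},
             {"row112_aa": 97.0, "row110_aa": 100}]
    print(weighted_row(areas, "row112_aa", "row110_aa"))  # prints 98.6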





Appendix A

Detailed Quality of Service Report Table Specifications


Report Tables 1(b) and 2(b) (ARMIS 43-05 data)

Statistic                                           Specification

Total Access Lines in Thousands                     row 140 column aj
Total Trunk Groups                                  row 180 column ak
Total Switches                                      row 200 column an + row 201 column an

Switches with Downtime
  Number of Switches                                row 200 column ao + row 201 column ao
  As a percentage of Total Switches                 (row 200 column ao + row 201 column ao)/ (row 200 column an + row 201 column an)

Average Switch Downtime in Seconds per Switch*
  For All Events (including events over 2 minutes)  60 * (row 200 column ap + row 201 column ap)/ (row 200 column an + row 201 column an)
  For Unscheduled Events Over 2 Minutes             60 * (unscheduled events * average duration in min.)/ (row 200 column an + row 201 column an)

For Unscheduled Downtime More than 2 Minutes        items where rows 220 to 500 column t > 1
  Number of Occurrences or Events                   E = number of records in row 220 to row 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches                       100 * E/ (row 200 column an + row 201 column an)
  Events per Million Access Lines                   1,000,000 * E/ (row 140 column aj)
  Average Outage Duration in Minutes                (sum of rows 220 to 500 column x)/ E
  Average Lines Affected per Event in Thousands     (sum of rows 220 to 500 column v)/ E
  Outage Line-Minutes per Event in Thousands        (sum of rows 220 to 500 column x * column v)/ E
  Outage Line-Minutes per 1,000 Access Lines        1000 * (sum of rows 220 to 500 column x * column v)/ (row 140 column aj)

For Scheduled Downtime More than 2 Minutes          items where rows 220 to 500 column t = 1
  Number of Occurrences or Events                   E = number of records in row 220 to row 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches                       100 * E/ (row 200 column an + row 201 column an)
  Events per Million Access Lines                   1,000,000 * E/ (row 140 column aj)
  Average Outage Duration in Minutes                (sum of rows 220 to 500 column x)/ E
  Avg. Lines Affected per Event in Thousands        (sum of rows 220 to 500 column v)/ E
  Outage Line-Minutes per Event in Thousands        (sum of rows 220 to 500 column x * column v)/ E
  Outage Line-Minutes per 1,000 Access Lines        1000 * (sum of rows 220 to 500 column x * column v)/ (row 140 column aj)

% Trunk Grps. Exceeding Blocking Objectives         (row 189 column ak + row 190 column ak)/ (row 180 column ak)

Notes:
ARMIS 43-05 database rows 110-121 are contained in database table I
ARMIS 43-05 database rows 130-170 are contained in database table II
ARMIS 43-05 database rows 180-190 are contained in database table III
ARMIS 43-05 database rows 200-214 are contained in database table IV
ARMIS 43-05 database rows 220-319 are contained in database table IVa
ARMIS 43-05 database rows 320-332 are contained in database table V


Appendix A

Detailed Quality of Service Report Table Specifications


Report Tables 1(c) and 2(c) (ARMIS 43-05 data)

Total Number of Outages
  The number of rows between 220 and 500 for each value of column t, i.e., for each cause category:
  1. Scheduled
  2. Proced. Errors -- Telco. (Inst./Maint.)
  3. Proced. Errors -- Telco. (Other)
  4. Procedural Errors -- System Vendors
  5. Procedural Errors -- Other Vendors
  6. Software Design
  7. Hardware Design
  8. Hardware Failure
  9. Natural Causes
  10. Traffic Overload
  11. Environmental
  12. External Power Failure
  13. Massive Line Outage
  14. Remote
  15. Other/Unknown

Total Outage Line-Minutes per Thousand Access Lines
  1000 * (sum of rows 220 to 500 column x * column v for each value of column t)/ (row 140 column aj), broken out over the same fifteen cause categories listed above.

Notes:
ARMIS 43-05 database rows 110-121 are contained in database table I
ARMIS 43-05 database rows 130-170 are contained in database table II
ARMIS 43-05 database rows 180-190 are contained in database table III
ARMIS 43-05 database rows 200-214 are contained in database table IV
ARMIS 43-05 database rows 220-319 are contained in database table IVa
ARMIS 43-05 database rows 320-332 are contained in database table V

Appendix A

Detailed Quality of Service Report Table Specifications


Report Table 1(d) (ARMIS 43-06 data)

Percentage of Customers Dissatisfied

Installations:
  Residential                Row 40 column ac weighted by column ab
  Small Business             Row 40 column ae weighted by column ad
  Large Business             Row 40 column ag weighted by column af

Repairs:
  Residential                Row 60 column ac weighted by column ab
  Small Business             Row 60 column ae weighted by column ad
  Large Business             Row 60 column ag weighted by column af

Business Office:
  Residential                Row 80 column ac weighted by column ab
  Small Business             Row 80 column ae weighted by column ad
  Large Business             Row 80 column ag weighted by column af

Note:
ARMIS 43-06 database rows 40-80 are contained in database table I


Appendix A

Detailed Quality of Service Report Table Specifications


Report Table 1(e) (ARMIS 43-06 data)

Sample Sizes -- Customer Perception Surveys

Installations:
  Residential                Sum of Row 40 column ab
  Small Business             Sum of Row 40 column ad
  Large Business             Sum of Row 40 column af

Repairs:
  Residential                Sum of Row 60 column ab
  Small Business             Sum of Row 60 column ad
  Large Business             Sum of Row 60 column af

Business Office:
  Residential                Sum of Row 80 column ab
  Small Business             Sum of Row 80 column ad
  Large Business             Sum of Row 80 column af

Note:
ARMIS 43-06 database rows 40-80 are contained in database table I


Customer Response


Publication: Quality of Service of Incumbent Local Exchange Carriers Report (December 2009)

You can help us provide the best possible information to the public by completing this form and
returning it to the Industry Analysis and Technology Division of the FCC's Wireline Competition
Bureau.

1. Please check the category that best describes you:
   ____ press
   ____ current telecommunications carrier
   ____ potential telecommunications carrier
   ____ business customer evaluating vendors/service options
   ____ consultant, law firm, lobbyist
   ____ other business customer
   ____ academic/student
   ____ residential customer
   ____ FCC employee
   ____ other federal government employee
   ____ state or local government employee
   ____ Other (please specify)

2. Please rate the report:
                           Excellent   Good   Satisfactory   Poor   No opinion
   Data accuracy              (_)       (_)       (_)         (_)      (_)
   Data presentation          (_)       (_)       (_)         (_)      (_)
   Timeliness of data         (_)       (_)       (_)         (_)      (_)
   Completeness of data       (_)       (_)       (_)         (_)      (_)
   Text clarity               (_)       (_)       (_)         (_)      (_)
   Completeness of text       (_)       (_)       (_)         (_)      (_)

3. Overall, how do you rate this report?
                           Excellent   Good   Satisfactory   Poor   No opinion
                              (_)       (_)       (_)         (_)      (_)

4. How can this report be improved?

5. May we contact you to discuss possible improvements?
   Name:
   Telephone #:

To discuss this report, contact Jonathan Kraushaar at 202-418-0947.
Fax this response to 202-418-0520, or mail it to FCC/WCB/IATD, Washington, DC 20554.
