Perimeter rules – DCA, LGA, and the challenges of regulating both airline and passenger behavior

Recently, the Port Authority of New York and New Jersey floated the idea of eliminating LaGuardia Airport’s 1,500-mile perimeter rule. Only two major airports in the United States have perimeter restrictions that ban flights beyond a certain distance: LaGuardia and Washington National.

Both National and LaGuardia airports share a common history: both pre-date the jet age, both were constructed with the assistance of the Works Progress Administration, and both later proved too small for jet traffic and the boom in air travel, requiring the construction of newer, larger airports.

Today, the two airports also have several characteristics in common: both National and LaGuardia are governed and operated as part of an airport system (National by the Metropolitan Washington Airports Authority, which also operates Dulles International; LaGuardia by the Port Authority of NY and NJ, which also operates Newark and JFK), both are popular with business travelers, and both are subject to perimeter rules that limit the distance of scheduled flights.

The evolution of DCA perimeter restrictions. Rings around DCA show the 1965 650mi rule, the 1981 1,000mi rule, the 1986 1,250mi rule, and the current beyond-perimeter destinations. Image from the Great Circle Mapper – www.gcmap.com

The rule first appeared with the dawn of the jet age. National Airport had non-stop long-distance airline service via propeller-driven aircraft prior to the rise of jets in commercial aviation. However, DCA was not equipped to deal with the different geometry required for efficient operation of jet aircraft. Dulles International Airport, purpose-built for the jet age, opened in 1962. Noise from jet aircraft was a large part of the rationale for the perimeter rule, but the rule was also meant to drive jet traffic to Dulles.

The first version of the rule, put in place in 1965, limited flights to a 650-mile radius of Washington, DC. This range just barely includes Chicago; cities that already had non-stop service to DCA (such as Minneapolis and Denver) were granted exemptions. Long-distance flights, exploiting the rapidly growing capabilities of jet aircraft, were forced to use either Dulles or neighboring BWI airport.

The perimeter expanded to 1,000 miles in 1981, allowing non-stop service to South Florida, Kansas City, Saint Louis, and others. In 1986, the perimeter expanded again, to 1,250 miles, far enough to allow non-stop flights from Dallas and Houston.
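Functionally, each version of the rule is just a great-circle distance threshold around the airport. Here is a minimal Python sketch of that check; the coordinates are approximate and the haversine calculation is purely illustrative (the actual rule is defined in statute and slot regulations, not by a formula like this):

```python
# Minimal sketch of a perimeter-rule check: is a destination within a given
# great-circle distance of DCA? Coordinates are approximate.
from math import radians, sin, cos, asin, sqrt

def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles (haversine)."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

DCA = (38.852, -77.037)
destinations = {
    "ORD (Chicago)": (41.978, -87.905),
    "DFW (Dallas)": (32.897, -97.038),
    "PHX (Phoenix)": (33.434, -112.012),
}

for name, coords in destinations.items():
    d = distance_miles(*DCA, *coords)
    status = "inside" if d <= 1250 else "beyond"
    print(f"{name}: {d:5.0f} mi -> {status} the 1,250-mile perimeter")
```

Against the current 1,250-mile perimeter, Chicago and Dallas come out inside while Phoenix falls well beyond it – which is why Phoenix service required the exemptions described below.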

In 1999, Senator John McCain of Arizona campaigned to remove the perimeter rule entirely. As a compromise, Senator McCain’s hometown airline, America West (which later merged with US Airways and is now part of American Airlines), was granted new beyond-perimeter exemptions to serve Phoenix and Las Vegas.

In 2012, the FAA granted several new beyond-perimeter exemptions for new flights to Portland, San Juan, and Austin. Congress directed the FAA to allow these exemptions as part of the FAA’s reauthorization.

Each successive modification of the perimeter rule involved direct action from Congress. As a quirk of DC’s status as a federal enclave, both DCA and IAD (despite both being located outside of the District of Columbia) were built and operated by the Federal government, acting in its capacity as the local government for the National Capital. They were the only airports directly operated by the Federal Aviation Administration.

Since then, several conditions have changed. In 1973, Congress granted limited home rule to the District of Columbia, thereby differentiating local government services from those provided by the Federal government. In 1987, Congress created (in conjunction with DC and Virginia) the Metropolitan Washington Airports Authority to operate both National and Dulles. The federal government retains ownership of both airports.

However, despite the move for increased local control for the region’s airports, much of the regulation surrounding them is still codified in federal laws and regulations.

1500 mile perimeter around LGA, with one beyond-perimeter exception for Denver. Image from the Great Circle Mapper – www.gcmap.com

Unlike National, LaGuardia’s perimeter rule is entirely self-imposed. The Port Authority imposed LaGuardia’s 1,500-mile perimeter rule (with an exception for beyond-perimeter flights to Denver) in 1984 as a means to manage congestion at the airport and force some traffic to either EWR or JFK.

When looking into additional perimeter exemptions for DCA, the Government Accountability Office argued that the potential loss of flights from Dulles and BWI wouldn’t be catastrophic, and additional competition at the most central airport (in this case, DCA) would be good for consumers.

However, both MWAA and the Port Authority are tasked with managing an airport system, not just maximizing value at one particular airport. Data from MWAA shows a strong correlation between additional capacity for beyond-perimeter flights at DCA and reduced capacity for those same destinations at Dulles.

Dulles is now caught in a vicious cycle. To deal with growth in the mid-2000s, Dulles began a series of massive capital improvements to increase the airport’s capacity and address some of the inherent flaws in the airport’s design (e.g. replacing the plane-mate ‘moon buggies’ with the AeroTrain APM). Unfortunately, since MWAA took on these costs, domestic passenger numbers are down, thanks to the collapse of Independence Air, the Great Recession, and the merger of United and Continental (which left Dulles no longer United’s primary east coast hub). All of these factors are driving up the cost per passenger for each remaining enplanement at Dulles. Add in the increased competition from new slots at DCA, and Dulles is struggling.

MWAA, then, is dealing not only with falling traffic at Dulles, but also with DCA’s growing pains. The Authority’s new use and lease agreement with the airlines that use the airports includes a substantial capital program over the next 10 years at DCA to accommodate additional passengers. Part of the Authority’s response is to argue vociferously against any additional exemptions to the DCA perimeter rule; however, they are at the mercy of Congress.

The Port Authority might not need to protect JFK to the same extent that MWAA would like to protect its investments in Dulles, but MWAA’s current experience should provide a cautionary tale. Removal of the perimeter restrictions at LGA would certainly produce winners and losers, both among the airline tenants at each airport and among the passengers that use them; it’s also unlikely to decrease passenger loads at LGA. In fact, American Airlines’ president argues that any changes should wait until upgrades to LGA’s terminals are complete so that they can handle additional passengers.

First, it’s worth remembering the reason for the imposition of the perimeter rule in the first place: managing demand for one particular airport. True, it’s a somewhat crude tool for managing demand (many are already predicting that DCA-style exemptions to the rule are where the PA will end up), and even without the perimeter rule, there are still slot rules to contend with (another tricky subject).

A second challenge is addressing uncertainty: with airport funding dependent on revenues from airline traffic, a small change can have a big impact. Dulles’ capital program has been greatly affected by changes in traffic levels and by industry mergers that can shift the airport’s importance to its main tenant in an instant. The need for several of the projects (as well as Dulles’ unaddressed capital needs, such as a new C/D concourse) stems from the airport’s original design, which could not foresee the changes in security requirements, airline boarding practice (jet bridges instead of plane mates), or airline business models (deregulation, leading to the adoption of the hub-and-spoke model, requiring large concourses for transferring passengers). Dulles was planned and built for the jet age. The original decisions on runway geometry and airfield characteristics have proven to be very accurate; the decisions based on predictions about the behavior of both passengers and airlines have been less successful.

Finally, there’s the need to manage the behavior of two different kinds of users: passengers and airlines. Look at the comments in just about any thread about DCA’s perimeter rule and you’ll find plenty of frequent flyers arguing against the rule. Yet MWAA can’t successfully implement any changes to its airports without the cooperation of its tenant airlines, which act on their own set of incentives and preferences. Considering DCA’s ideal role in the DC region, David Alpert asks:

Should DCA be a sort of niche airport with smaller planes to many little destinations, or an airport that tries to serve as much of the travel demand, close in to the center of the region, as possible? There’s no obvious answer.

Not only is the answer not obvious, but the question itself is more complicated: an airport’s role is only as good as the service that airlines provide; the economics of the kinds of service airlines can provide at any given airport will depend a great deal on a number of factors: airport capacity, costs per enplanement, demand for travel, location/role in an airline’s network, etc.

Shifting an airport’s role can’t be imposed on the airlines; it takes a partnership.

Transit as a regulated public utility: myopic?

Cap’n Transit looks at my recent discussion of transit governance structures (summarizing a good back and forth between David Levinson and Lisa Schweitzer) and sees transportation myopia:

They were all three suffering from transportation myopia: the condition of seeing transit as a self-contained system rather than as an option in competition with private cars and other modes, and of seeing transit as an end in itself, rather than a means to an end.

The Cap’n defines transportation myopia as follows, complete with this illustration of the bigger picture:

Cap’n Transit’s virtuous cycle – a reminder of the big picture.

Essentially, transportation myopia involves people forgetting that transit competes with cars. As a result they often forget why they care about transit, and treat transit as a goal in itself.

I both agree and disagree. It can be hard not to be a bit myopic when transit operations fail to meet their potential. On the other hand, the accusation of myopia also strikes me as unfair:

What we need to talk about is how to get full cost pricing for roads, including potential challenges and ways to overcome them. But for some reason Levinson doesn’t talk about any of that, he just goes on to talk about smart cards and land value capture and bond markets.

Levinson’s initial post wasn’t an unlimited forum; he noted his word count limit in one of his blog follow-ups. He’s also written extensively on road pricing (including some really in-the-weeds stuff).

These policies did not go unmentioned. Looking to other examples of good transit governance, the cases from Germany explicitly mention the key role of policies that make car use more expensive, less convenient, and less detrimental to urban life, and that make ‘last mile’ transportation modes (e.g. biking and walking) complementary to transit. From Ralph Buehler and John Pucher:

Transport, taxation, and land-use policies at all levels of government have helped to make German public transport more attractive compared to the automobile. For example, area-wide traffic calming, car-free pedestrian zones, increased fees for car parking, and reduced parking supply slow down car travel, raise its cost, and make it less convenient. Similarly, federal taxation policies have helped make car use more expensive…

Since the 1970s, most German cities have improved conditions for cycling and walking by traffic-calming nearly all neighborhood streets to 30 km/h or less, pedestrianizing downtowns, and expanding networks of separate bike paths and lanes (Pucher and Buehler, 2008). The vast majority of German passengers access public transport by bicycle or foot…

City planners deliberately connect sidewalks, crosswalks, and bike paths and lanes with transit stops…

German land-use laws and regulations encourage dense and mixed-use settlements, which facilitate transit use…

When considering Boston, I included this parenthetical about the cause of much of the MBTA’s debt and the failures of the Massachusetts decision-makers in prioritizing a massive urban freeway undergrounding project:

(It’s worth noting the decision-making priorities involved in the Big Dig – the massive tunnelling project was only approved because of the transit mitigation projects, which transit advocates backed as a way to hitch their wagon to omnipresent highway funding – yet those projects were never fully funded and now play a large role in undermining the agency’s financial stability. Imagine a project that simply removed the Central Artery and ‘replaced’ it with the long-imagined North/South rail link instead; or where the response to the Big Dig proposal was focused on re-defining the project itself rather than just tacking on ‘mitigation’ transit expansion.)

It’s true that I could’ve put more emphasis on the complementary policies that go with good transit governance. However, that doesn’t address the broader questions of how to better govern, fund, and operate our transit systems. Looking at governance models for transit operators is certainly narrow in focus compared to debates about the bigger-picture priorities, but I don’t think it deserves the negative connotations of myopia.

That said, I still welcome the critique. In the Cap’n’s page on transportation myopia, he closes with this:

A lot of transit advocates that I know and respect have demonstrated transportation myopia. If I call you out on it, it’s nothing personal. We’re on the same side, and I’m doing it to help you accomplish a goal that we all share.

I appreciate the reminder. Seeing the forest for the trees can be a challenge, and it always helps to have a reminder about the big picture.

Governing transit: the regulated public utility

Public utilities, from Chris Potter. CC BY 2.0

The MBTA is struggling, but they’re not the only transit authority facing both near and long-term challenges. The MTA in New York is trying to find the funds for its capital plan; WMATA is facing systemic budget deficits while trying to restore rider confidence in the system.

For-profit corporations such as airlines aren’t the right answer to govern transit in an American context. So, what kind of structure could work?

Writing at Citylab, David Levinson made the case for structuring American transit operations as regulated public utilities, able to pull the best elements of private sector management and pair them with the fundamentally public purpose required for urban mass transit.

David cites seven key elements of this model:

  1. Competitive tendering for services
  2. The ability to raise fares (with regulatory approval)
  3. Using a smartcard as a common platform for fare payment
  4. Specific contracts with local governments to operate subsidized service
  5. Ability to recapture land value through land ownership and real estate development
  6. Access to private capital markets
  7. Local governance, funding, and decision-making

These elements aren’t substantively different from the elements of German public transport governance reforms outlined by Ralph Buehler and John Pucher: competitive tendering for many services, increased fares, investments in technology to improve capacity, efficiency, and revenue. Public regulation oversees these efforts to operate the core business more efficiently.


Lisa Schweitzer (a USC professor focusing on urban planning and transportation) offered extensive feedback on her blog (in several parts). All are worth reading; I’ve linked to each and included a short summary and/or quote:

1. On the regulated public utility concept: “First of all, even though quangos [a British term: quasi-autonomous non-governmental organizations – what we’d usually refer to as a public authority] are somewhat insulated from voters and politics, they still have [to] play with budgetary politics, and those games are where lots of stupid enters into transit provision.”

Schweitzer identifies three main problems with applying the concept to transit. First, unlike water or electric service, the demand for transit use isn’t universal. Aside from a few dense cities, there isn’t necessarily a built-in customer base. Second (and related to the spotty demand for transit service), some jurisdictions can and do opt out of transit service, hurting the overall network. Third, unlike water or electricity, there are many different levels of transit service.

2. The challenges of competitive tendering: the devil is in the details for how to successfully structure operations contracts: “And that’s really the key point for competitive tendering and the service quality gains you hope to achieve: if you are going to do this, you need to be clear on service expectations. The reason the cable guy gets to treat you like crap is that’s not part of the franchise agreement, which centers on channels and rights for particular sports events – not customer service response times.”

3. Farecards and technology: Schweitzer notes that most transit agencies already offer smart farecards, but perhaps a regulated utility would have more incentive to invest in technology to collect additional revenues or adopt policies (such as all-door boarding, or proof of payment) that would speed operations and improve efficiency. This is really a matter of institutional incentives rather than simply adopting farecards.

4. Capital cost recovery: While Levinson argues that new transit lines should only be built if they can break even on fare revenues and value capture from adjacent land, Schweitzer counters that this formulation depends on the mode and the type of transit line: “Right now, you have jurisdictions with people who are very avid about wanting rail transit. We must have rail now.”

“You want a train? Fine. Either let us build 70 100-story apartment complexes next to the station (if it pencils for us) or you pay whatever portion of the capital and operating costs that apartment complex would have covered for the utility. Your choice. Again, rich districts can have their single-acre lots if they want, and they can have their trains if they want them–even if nobody wants to take the train and they just use it as decoration. They just can’t stick the rest of us with the bills for those trains.”

5. Asset values and access to private capital: This isn’t exactly a silver bullet. For as well as competitive bidding worked for London’s buses, the similar deal for the Underground flopped: “The Metronet-London Underground deal came about in 1998 in part because the transit provider, Transport for London, was financially stretched and their capital stock decayed. This is a big deal: taking over large capital stocks is risky, let alone doing so because you have to bail somebody out. It means you probably have crumbling assets with an uncertain price tag to fix.”

In London’s case, one rail company delivered on their agreements while another operator came back to the public for additional funds and eventually went into bankruptcy: “While newspapers blamed the public sector partner for failing to manage the contracts properly, the public audit on the deal cited Metronet’s own corporate governance and poor management as the primary reason for the failed partnership.”

6. Local funding: While Schweitzer sees the virtues of local funding, there are risks to completely forgoing federal funds. If there is a chance to reform things, it will likely involve the feds: “If we really do believe that there are normatively better ways for cities to be, then there is a role for federal governments to play in setting standards and incentives.”


David, freed from the space constraints of Citylab when writing via his own blog, responded in depth:

1. The regulated public utility model: “I imagine like most reforms, it would be phased in, tested, refined, and revised in the various laboratories of democracy. Some city has to go first, some other city has to go second, and hopefully learn from the first, before every last city does.”

2. Competitive tendering: “…the answer is quite complicated about how to configure to maximize consumer welfare, and experimentation is probably required. Just giving the system away is certainly not the answer. Having the franchises be of a limited duration (5-7 years, e.g.) is better than a 20-30 year franchise. This is feasible for buses where the capital is the ultimate in mobile capital. It would be much harder for a traditional utility where the infrastructure is expensive, embedded in the ground, and long-lived.”

In other words, it’s a lot easier to structure a deal for competitive contracts for bus operations than it is for fixed, naturally monopolistic rail services – both in terms of structuring the deal, and in terms of attracting operators.

3. Farecards: “I would go further and say we should have pre-payment via stop-based farecard reader, i.e. all significant bus stops should have arterial BRT like payment”

4. Capital cost recovery: “Capital investments are new stocks while operating expenditures are continuing flows. From a public policy perspective, continuing with existing commitments (which may be an implied social contract) may be more important than making investments that bring about new commitments. Thus new commitments (such as new rail lines which have irreversibly embedded immobile capital) should only be undertaken if we believe at the outset (admittedly a forecast, which have problems) that they have cost recovery.”

5. Asset values: “Investing in new infrastructure is a lot riskier than investing in already built infrastructure (thus the early financiers of the Channel Tunnel got wiped out twice, similarly the Dulles Greenway and many other privately funded pieces of new infrastructure that were either more expensive than expected, or built too far in advance of demand).”


The broad concept of a regulated public utility has a lot to recommend it: it pairs the public purpose inherent to modern transit with the best elements of private enterprise and the benefits of running a service-oriented business like a business.

While demanding additional efficiency from transit operators, German public policy also worked in concert with these reforms – traffic calming, dense development around transit stations, and increased taxes and fees on car-based transport improved transit’s attractiveness and provided new revenue sources.

As Dr. Schweitzer notes, the single biggest take-away from Levinson’s article is the concept of transit as a public utility in the first place. Getting over that mental hump can open doors to plausible reforms.

What might those reforms be? In addition to Levinson’s list, Ralph Buehler and John Pucher offer their lessons from the German experience:

  • Encourage regulated competition; take advantage of private sector expertise
  • Collaboration between local governments, transit operators, and labor unions
  • Focus on profitable services – not to ignore ‘equity’ services. Jarrett Walker would refer to this as a focus on ‘ridership’ routes instead of ‘coverage’ routes – and building political consensus around this isn’t an easy task!
  • Collaborate with other transit operators; encourage easy exchanges between systems for passengers, interoperable systems, etc.
  • Improve service quality; focus on customer service.
  • Increase transit’s competitiveness with complementary public policies – for example, increased fees on driving/owning a car, encouraging dense development near stations, etc.

All in all, the list is quite similar to Levinson’s.

However, in Germany, the push towards some of these reforms came from the outside (EU regulations); existing transit operators viewed them as a threat forcing reform and a new focus on customer service, efficiency, and overall quality – all while working to reduce costs. Similar to an airline facing bankruptcy, German operators used the EU mandate to find common purpose with their unions to improve efficiency and reduce overall costs.

Both Schweitzer and Levinson sing the virtues of local funding, but reform of this magnitude might require outside stimulus. In the same vein as Schweitzer’s defense of federal experimentation in policy, the federal government is well suited to fill that role. However problematic the federal focus on streetcars may be, it has certainly shifted the attention of local governments; the TIGER grant process shook up the traditional relationship of the FTA serving a few transit authority grantees. The projects might not be the best investments in mobility, but they do reveal the potential for the feds to drive change in transit governance.

Airlines: the strengths and weaknesses of corporate transportation governance

CC image from Christian Junker.

David D’Alessandro’s review of the MBTA’s finances came to a stark conclusion: “A private sector firm faced with this mountain of red ink would likely fold or seek bankruptcy.” That red ink is thanks to a systemic operating deficit; yet as a provider of a key public service, the MBTA is also “too big to fail” and therefore cannot simply cease operations. Likewise, though municipalities and public authorities can declare bankruptcy, they seldom do.

However, there are examples of transportation operators declaring bankruptcy in the face of systemic deficits: airlines. Comparing for-profit airlines to subsidized urban transit might seem like a stretch, but consider the similarities:

  • Both provide a transportation service
  • Both require capital-intensive operations
  • Both are historically low-margin businesses; transit has been largely subsidized for generations in the US, and historic profitability for airlines is slim-to-nonexistent.
  • Labor is a significant cost for both; both feature highly unionized workforces.
  • Both are sensitive to swings in energy prices
  • Both include a high level of coordination with the government (regulations, funding for facilities, etc)

Reform proposals for the MBTA set goals for reducing operating costs, but didn’t necessarily give the MBTA the tools to reach that goal. Compare that to the major airline reform – the Airline Deregulation Act of 1978. Prior to deregulation, all air routes needed approval from the Civil Aeronautics Board (CAB). Matt Yglesias explains:

Passenger aviation clearly needs some regulation for the sake of passenger safety, pollution control, and the community impacts of airports. But in the early decades of the industry, CAB went far beyond that to regulate what fares airlines were allowed to offer and which routes they were allowed to fly. This became a classic case of regulatory capture. Airlines cared a lot about the actions of CAB while ordinary voters had bigger fish to fry. As a consequence, the board ended up creating a cozy cartel where airlines didn’t compete much and certainly didn’t compete on price. With price competition off the table, airlines invested lavishly in offering a high level of service. Labor unions got in on the act, using their clout to force managers and owners to share with workers some of the excess profits generated by CAB.

Removing regulatory approval for new routes unleashed new competition, dramatically lowering airfares for consumers. Airlines explored new route network concepts, eventually leading to the dominance of today’s hub-and-spoke system. Existing airlines still had to work within their cost structure, based on the old regulated business model. Soon, many airlines also faced a sea of red ink. Faced with the same choice David D’Alessandro saw for the MBTA, many airlines either ceased operations or entered bankruptcy.

Today, airlines use bankruptcy as a tool to lower labor costs by renegotiating contracts. Yglesias, writing about the 2011 bankruptcy of American Airlines, notes “the real aim of the filing, in the words of S&P 500 analyst Philip Baggaley is to ’emerge as a somewhat smaller airline with more competitive labor costs.’ ”

While the MBTA Forward Funding plan set goals to reduce operating costs, it did not include the tools to make those cost reductions happen. And while using bankruptcy as a tool to reduce structural costs, as airlines have done, might technically be available to a public authority like the MBTA, political pressure often prevents this course of action.

In a look at sustainable transit funding, Ralph Buehler and John Pucher study the fiscal sustainability of German public transport systems. The abstract:

Over the past two decades, Germany has improved the quality of its public transport services and attracted more passengers while increasing productivity, reducing costs, and cutting subsidies. Public transport systems reduced their costs through organizational restructuring and outsourcing to newly founded subsidiaries; cutting employee benefits and freezing salaries; increasing work hours, using part-time employees, expanding job tasks, and encouraging retirement of older employees; cooperation with other agencies to share employees, vehicles, and facilities; cutting underutilized routes and services; and buying new vehicles with lower maintenance costs and greater passenger capacity per driver. Revenues were increased through fare hikes for single tickets while maintaining deep discounts for monthly, semester, and annual tickets; and raising passenger volumes by improved quality of service, and full regional coordination of timetables, fares, and services. Those efforts by public transport agencies were enhanced by the increasing costs and restrictions on car use in German cities. Although the financial performance of German public transport has greatly improved, there are concerns of inequitable burdens on labor, since many of the cost reduction measures involved reducing wages or benefits of workers.

The outcomes aren’t all that different from those achieved by airlines utilizing bankruptcy. Unlike either US airline deregulation or the MBTA’s Big Dig deal on transit expansion as mitigation for a massive increase in urban highway capacity, German reforms also included policies aimed to shift the market in favor of public transportation. Fares and schedules are coordinated through a verkehrsverbund, or transport association.

Setting fares and coordinating routes and timetables sounds awfully similar to the Civil Aeronautics Board. The difference, however, is that air transport is expected to operate profitably and urban mass transit is not. The middle ground is a structure that can combine the best elements of a for-profit corporation (“run it like a business”) with the public purpose of a government agency or public authority. Writing at Citylab, David Levinson makes the case for governing transit as a regulated public utility, operating as a business and billing the public for the full cost of services:

Like any other enterprise, transit should be successful and cover its costs. This is entirely feasible if we change the model of transit finance from a branch of government to a regulated public utility, as is done in much of Europe and Asia. A public utility provides a service, and in exchange, it is compensated for that service. The compensation comes from consumers (e.g. users, riders), and from the public for any unprofitable services that it wishes to maintain for other (e.g. political) reasons.

Just as the public sector pays the electric utility for street lights, it should pay the transit utility for services that the government insists on but that the transit provider cannot charge users enough for.

The public utility model provides a more realistic model for mass transit than airlines do. The lack of an inherent profit motive makes the direct comparison to airline governance a mismatch; yet there are elements of the private corporation that would benefit public transit, thanks to the similar roles of airlines and transit agencies.

Lessons for transit agency funding, finance, and governance – MBTA

It’s been a rough winter for transit in Boston. The agency’s general manager resigned; they’re buried in 90 inches of snow – it’s a natural disaster in slow-motion. All of those problems are piled on top of the MBTA’s structural deficiencies, outlined in this 2009 review of the agency’s finances. The review, led by former John Hancock CEO David D’Alessandro, paints a bleak picture.

Prior to 2000, the MBTA was backward-funded – sending a bill to the state to cover the organization’s annual operating deficit. A reform program sought to make the MBTA fiscally self-sufficient by dedicating a portion of the state’s sales tax revenue to the agency in exchange for a requirement that the MBTA balance its budget every year. This requirement to balance the budget would serve as an incentive for the MBTA to control costs and grow revenues.

Often, similar conversations emerge around WMATA, noting Metro’s lack of a dedicated funding source. However, the MBTA case study shows that dedicated funding alone isn’t a silver bullet. There are other elements to the MBTA’s structural deficit beyond funding.

The MBTA blueprint for self-sufficiency was based on several bad assumptions. The plan called for the MBTA to decrease operating costs by 2% a year; in actuality, they increased by an average of 5% per year. Fuel and energy costs account for a large portion of the shortfall, as oil prices rose dramatically (and unexpectedly). Sales tax revenues were expected to grow at 3% per year; actual growth averaged 1% per year. The net impact, even with rising fare revenue, is a sea of red ink:

Cumulative impacts from the MBTA funding plan, showing large net negative impacts from the baseline.

There are two different kinds of error here: one is a failure to account for uncertainty in the forecast. Sales tax revenue is strongly influenced by the larger economy; fuel and energy prices are similarly based on much larger and unpredictable energy markets. The size of the error also increases with time from the original plan. Error in the MBTA’s fuel cost assumptions gets larger with each successive year from FY01 to FY08 – beware the cone of uncertainty.
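To see how quickly a gap between assumption and reality compounds over a planning horizon, here is a minimal Python sketch. The growth rates are the planned-versus-actual figures cited above; the starting dollar amounts are invented placeholders, not actual MBTA budget lines:

```python
# Illustrative sketch of how small annual forecasting errors compound.
# Growth rates come from the assumptions described above (planned 2%/yr cost
# reduction vs. actual 5%/yr growth; 3%/yr vs. 1%/yr sales tax growth); the
# starting figures are made-up placeholders, not MBTA budget data.

def project(start, annual_growth, years):
    """Return year-by-year projections under a constant annual growth rate."""
    return [start * (1 + annual_growth) ** t for t in range(years + 1)]

years = 8  # roughly FY01 through FY08
planned_costs = project(1_000.0, -0.02, years)     # plan: costs fall 2% per year
actual_costs = project(1_000.0, 0.05, years)       # reality: costs rise 5% per year
planned_sales_tax = project(600.0, 0.03, years)    # plan: 3% revenue growth
actual_sales_tax = project(600.0, 0.01, years)     # reality: 1% revenue growth

for t in range(years + 1):
    gap = (actual_costs[t] - planned_costs[t]) + (planned_sales_tax[t] - actual_sales_tax[t])
    print(f"Year {t}: shortfall vs. plan = {gap:7.1f}")
# The gap starts at zero and widens every year -- the cone of uncertainty in miniature.
```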

The second type of error stems from wishful thinking. While it’s nice to plan on reducing operating costs, and there’s value in budgeting accordingly in order to set a goal to do so, it’s not evident that the legislation offered any idea of how the MBTA would reduce those costs. Another analysis from the MBTA shows that binding arbitration between the MBTA and labor unions imposed substantial wage increases with no regard for the MBTA’s operating deficit. In that light, assuming the MBTA’s operating costs would decrease seems like wishful thinking at best.

The D’Alessandro review notes that the MBTA’s headcount is actually down, yet wages are up. The agency showed progress in reducing costs, but they “could not pare staff below the number needed to move hundreds of thousands of riders across hundreds of routes each workday.” Baumol’s Cost Disease in action – increasing costs without a corresponding increase in productivity.

To meet the requirement to balance its annual budget, the MBTA sought to lower its annual debt service payments by refinancing its debt, pushing principal into the out years and lowering near-term payments. Much of this refinancing simply ‘papered over’ the agency’s structural deficit. Again, the faulty assumptions of the financing plan exacerbated that structural deficit.
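The mechanics of that refinancing are simple, and a quick sketch shows why it only buys time. The figures below are invented for illustration (not MBTA debt data); the payment calculation is just the standard level-payment annuity formula:

```python
# Toy illustration (invented figures, not MBTA debt data): pushing principal
# into later years lowers the annual payment but raises the total paid.
def annual_payment(principal, rate, years):
    """Level annual payment on a fixed-rate loan (standard annuity formula)."""
    return principal * rate / (1 - (1 + rate) ** -years)

principal, rate = 1_000.0, 0.05  # e.g. $1,000 (millions) borrowed at 5%

for years in (10, 20, 30):
    pmt = annual_payment(principal, rate, years)
    print(f"{years}-year term: payment {pmt:6.1f}/yr, total paid {pmt * years:7.1f}")
# Stretching the term eases the near-term budget, but the structural deficit the
# debt service was masking is still there -- plus more total interest.
```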

The MBTA’s debt load is also a major issue, one that dates back well before the Forward Funding plan. As part of a 1991 consent decree to get approval for Boston’s Big Dig, the courts required a broad array of transit expansion projects as “environmental mitigation.” The decree did not identify any funding for those projects. Now the MBTA has a massive amount of debt, roughly two-thirds of which stems from obligations that pre-date the Forward Funding agreement or from state-mandated expansion projects.

(It’s worth noting the decision-making priorities involved in the Big Dig – the massive tunnelling project was only approved because of the transit mitigation projects, which transit advocates backed as a way to hitch their wagon to omnipresent highway funding – yet those projects were never fully funded and now play a large role in undermining the agency’s financial stability. Imagine a project that simply removed the Central Artery and ‘replaced’ it with the long-imagined North/South rail link instead; or where the response to the Big Dig proposal was focused on re-defining the project itself rather than just tacking on ‘mitigation’ transit expansion.)

D’Alessandro’s conclusion is stark: “A private sector firm faced with this mountain of red ink would likely fold or seek bankruptcy.”

Yet, at the same time, the MBTA is “too big to fail.” Transit provides a critical service for any large city’s economy. Given the subsidized nature of public transit in the US, any reform must involve the public sector.

Airlines provide an interesting point of comparison: the nature of air transport is deeply intertwined with the public sector, yet US airlines are private, for-profit corporations. Unlike the MBTA, they can seek legal protection to restructure their businesses through bankruptcy – and every major airline has done precisely that over the last decade, using bankruptcy to reduce the operating costs locked in by long-term labor agreements. German transit agencies have achieved fiscal stability using similar tools.

Unfortunately, the simplified narrative in the wake of the T’s failure to function in the face of Boston’s record snowfall has been to set up a false dichotomy between transit system expansion and system maintenance. In spite of the Big Dig deal, the real tension isn’t expansion vs. maintenance, but the mismatch between the political governance and funding mechanisms and the technical requirements to operate and maintain the system.

This political challenge isn’t limited to transit. Highway spending is overwhelmingly focused on expanding the system, at the expense of maintaining the system we already have. Angie Schmitt at Streetsblog put it bluntly: More money for transportation won’t matter if we don’t change how that money is spent.

Pop-ups – what counts as ‘reasonable?’

Beware the imperative that we have to do something.

Despite protestations from DC’s former planning director Harriet Tregoning, the plan to limit rowhouse pop-ups in DC is poised to pass on a preliminary 3-2 vote (note that two of the zoning commissioners tentatively in favor are the federal representatives to the commission; see this Washington City Paper profile of commissioner Peter May for more about the federal role in local decisions in DC).

Among the local media, the Washington Post editorial board came out against the proposed regulations. Other local papers, such as the Northwest Current, are in favor. The single biggest reason for supporting the proposed changes is that they seem ‘reasonable.’

It’s not hard to see why many DC residents are eager for ‘reasonable’ restrictions on pop-ups. There are quite a few ugly ones out there; some include suspect construction. However, the proposed changes in the zoning code won’t outlaw ugly additions and the zoning code doesn’t regulate construction methods or enforce the building code.

Part of the challenge with ‘reasonable’ restrictions on new development is that many of the impacts aren’t intuitive. Consider the aesthetics of pop-ups: Just as zoning code parking requirements won’t solve on-street parking hassles (you must manage those parking hassles directly), a small reduction in the allowable height and shifting certain elements away from by-right construction towards requiring a special exception won’t address concerns about design. Implement these changes to DC’s zoning code and many will still complain about pop-up development.

Pop-ups need not be ugly. Nor are they a new phenomenon.

Part of the concern about overly restrictive regulations is that limiting small-scale development is a serious constraint on the market’s ability to provide housing that is affordable to a wide range of incomes (here’s a perfect place to shift the narrative away from the nebulous ‘affordable housing’ and focus on providing abundant housing instead).

Still, without that background knowledge, it’s not hard to think that these restrictions won’t harm the District’s progress towards abundant housing. Proponents of allowing more growth argue pop-ups provide an opportunity for families and individuals to live in desirable neighborhoods at a lower price point. Meanwhile, the Northwest Current editorial board isn’t convinced that allowing additional housing supply helps ease the supply crunch. Instead, they wish housing prices would simply drop on their own:

[Excerpt from the Northwest Current editorial]

However, the flip side of the “we’d rather just see the existing houses priced more affordably” coin is essentially an argument to lower property values. I don’t think we’ll see such an editorial from the Northwest Current anytime soon. Why? Because I doubt either the editorial board or the paper’s readership would consider advocacy to lower property values to be ‘reasonable.’

So, what are the options for regulating pop-ups? A few ideas, keeping in mind the differing perspectives and scales:

  • Recognize the value of by-right development and the path of least resistance. Similarly, the idea of negotiating every single building project on a case-by-case basis might also seem reasonable, but beware the unintended consequences of this approach.
  • Consider a form-based approach. The Coalition for Smarter Growth suggested an approach that mandates a setback for true pop-ups (those that retain the existing facade) or some other design treatment to minimize the visual impact. The challenge for this approach would be in enforcement. The advantage is that the regulatory authorities can offer clear guidance for this form of ‘lite’ administrative design review. It also avoids the perils of full-scale design review – a process that doesn’t keep the desired outcomes on the path of least resistance.
  • Remember: one of the goals of DC’s pending zoning code re-write was to reduce the burden on the BZA’s case load. Simply adding more cases to the pool of potential special exceptions is a step in the opposite direction.
  • Build more rowhouses. Part of the rationale for regulating pop-ups is a desire not just to preserve the urban design of DC’s rowhouse neighborhoods, but also to preserve larger housing units for families. If this is indeed a goal for the city’s housing strategy (and consistent with the desires for abundant housing), then the goal shouldn’t just be about preserving rowhouses, but encouraging the construction of more of them in existing single-family detached areas. This is also consistent with the city’s goals for accessory dwelling units as a part of the zoning re-write.
  • Build more multi-family housing. Work to relieve development pressure from the other end by allowing the construction of more small-scale apartment and condo buildings. DC has many of these grandfathered into existing R-4 (rowhouse) zones. While the Comprehensive Plan does prioritize the preservation of rowhouse areas, the existing zoning clearly allows multi-unit buildings. While much of the commentary focuses on micro effects and ugly additions, lurking beneath the surface is a clear bias against additional dwelling units. This backlash mirrors other DC planning debates about accessory dwelling units and growth in general.
  • Develop a market-based housing plan for the city as a whole. Collect and distribute data on the overall housing market to better inform decisions on demand as well as new supply.
  • Shift the narrative around housing discussions away from ‘affordable housing’ and towards ‘abundant housing.’ Hopefully this shift can help avoid the counterfactual trap of new supply that is still expensive, yet cheaper than it would’ve been. Consider this: if car manufacturers could only build a limited number of cars, they would likely focus on higher-margin luxury models. The same is true of housing; yet this doesn’t disprove the impact of supply.  Just because new condos in popped-up buildings aren’t always cheap, that doesn’t mean the impact on the overall market isn’t real.

Any other ideas?

Seeing the forest for the trees, and vice versa

CC image from Vincent Ferron

As the saying goes, sometimes you can’t see the forest for the trees. You can’t focus too hard on the details of each individual tree and still get the bigger picture – all of those trees form a larger ecosystem – a forest.

The expression (almost always used negatively) only speaks to one’s perspective, however. Regardless of that perspective, there is still a forest comprised of many individual trees. The phrase targets a person’s point of view, but it also speaks to the differences in scale inherent in any given issue.

Let’s Go LA used this formulation to discuss the division between two broad schools of thought on urban housing, particularly in constrained markets with rising housing prices: those that focus on supply restrictions and those that focus on community integrity and preventing displacement.

The difference in tactics between these two groups often leaves them at odds with each other. However, these schools of thought are two sides of the same coin, with similar goals but approaching the problem from opposite ends. Call the land use liberalization advocates the “macro” view, focusing on overall regional housing supply, and the anti-displacement advocates the “micro” view, focusing on the stories of individuals affected by rapid neighborhood change.

The challenge in crafting policy is that both schools of thought have a claim to the truth. Crafting policy for a city isn’t a choice between the forest or the trees; the two approaches describe the same city at different scales.

This isn’t the only dichotomy you’ll find in a city. I’ve written previously about the tensions that rise out of the different views of real estate in cities – it is both a financial investment and a component of a city’s urban design. Tensions between these schools of thought can be exacerbated by policies that conflate the two – is the mortgage interest deduction a policy focused on housing or on real estate investment?

  • Trees v. forest
  • Micro v. macro
  • Neighborhood v. region
  • Building v. neighborhood

DC’s debate about pop-up development similarly pits two competing views about the same city against one another: is the city an urban design forest being altered by the trees of individual owners’ property rights? Do those pop-ups represent a healthy regional housing market responding to demand, a forest ecosystem regenerating itself – or a metastasizing growth that threatens ‘neighborhood character’?

Both are lenses we can use to look at the city. The challenge is finding a policy that can thread the needle without ignoring the bigger-picture goals that can be more abstract: not the forest or the trees, but a desire for a healthy environment (as an example). Let’s Go LA makes the case that finding that common ground – realizing that the trees make up the forest and the forest is comprised of the trees – is critical to moving forward:

See the forest for the trees, or see the trees for the forest.

The key is to realize that we all share a common goal – a city that is affordable and accessible to all those who want it. When land use liberalization advocates and anti-displacement advocates argue with each other, we let the truly responsible parties – wealthy neighborhoods that stifle any and all development – off the hook.

Too often, the conversation turns into a debate about which perspective is ‘right.’ The reality is that both (all?) perspectives have value. The debate can obscure areas of agreement; it can also foster a misunderstanding of how cities evolve. The only constant is change.

Forecasting uncertainty in practice: Snowperbole

Example of snow forecast communicating levels of uncertainty; image from the Capital Weather Gang

Because making accurate predictions is extremely difficult, embracing the uncertainty involved in a forecast can dramatically improve both the usefulness of the forecast and the effectiveness of communication about it. This allows decision-makers to use the information available while understanding the limits of those predictions.

Following forecasts for a “potentially historic” storm set to hit New York and New England, public officials in New York City went to great lengths to emphasize the dangers of the storm. The Governor closed down New York’s subways in anticipation of the storm (showing one of the quirks of New York’s transit governance: local transit is under state control).

There was just one problem: the storm mostly missed NYC.

In their forecast post-mortem, the Washington Post’s Capital Weather Gang highlighted the key shortcoming of the forecasts – a failure to present the level of uncertainty involved.

Why were the forecasts so bad?

It’s simple: Many forecasters failed to adequately communicate the uncertainty in what was an extremely complicated forecast. Instead of presenting the forecast as a range of possibilities, many outlets simply presented the worst-case scenario.

Especially for New York City, some computer model forecasts were extremely dire, predicting upwards of 30 inches of snow – shattering all-time snowfall records. The models producing these forecasts (the NAM model and European model) had a sufficiently good track record to take them seriously.

However, some model forecasts (e.g. the GFS model) signaled reason for caution. They predicted closer to a foot of snow.

Part of the challenge here is that most of the forecast was accurate. This was a historic storm; it simply tracked a bit further to the east. Areas like New York City were right on the margins, where a small change to the inputs can mean a large change in the outcome – and the forecast did not adequately convey that uncertainty. Add in the fact that the miss happened to fall on the largest city in the United States, and you have a very public error.

When a forecast is so sensitive to small changes (eastern Long Island, not far away, received 30-plus inches), it is imperative to loudly convey the reality that small changes could have profound effects on what actually happens.

It’s easy to second-guess public officials making key decisions like closing transit systems after the fact (and after the forecast bust), but they can only act on the information they have in front of them. It’s easy to argue that it is better to be safe than sorry (and this is certainly true), but there is a real risk of eroding public confidence in these kinds of decisions when the forecast doesn’t pan out. (It doesn’t help that despite closing the subways to passengers, the MTA’s snow plan called for trains to remain in operation without passengers to keep the tracks clear of snow.)

As some meteorologists suggest, conveying the uncertainty in their forecasts should be a larger element of both the forecast and communication. It’s not just a matter of using the best information available, but also understanding the uncertainty involved.

The cone of uncertainty

One of the elements that makes prediction difficult is uncertainty. In one of the chapters of Donald Shoup’s High Cost of Free Parking (adapted for Access here), Professor Shoup poses the question:

HOW FAR IS IT from San Diego to San Francisco? An estimate of 632.125 miles is precise—but not accurate. An estimate of somewhere between 400 and 500 miles is less precise but more accurate because the correct answer is 460 miles. Nevertheless, if you had no idea how far it is from San Diego to San Francisco, whom would you believe: someone who confidently says 632.125 miles, or someone who tentatively says somewhere between 400 and 500 miles? Probably the first, because precision implies certainty.

Shoup uses this example to illustrate the illusion of certainty present in the parking and trip generation estimates from the Institute of Transportation Engineers. Many of the rates are based on small samples of potentially unrepresentative cases – often with a very wide range of observed parking/trip generation. Shoup’s concluding paragraph states:

Placing unwarranted trust in the accuracy of these precise but uncertain data leads to bad policy choices. Being roughly right is better than being precisely wrong. We need less precision—and more truth—in transportation planning

Part of the challenge is not just knowing the limitations of the data, but also understanding the ultimate goals for policy. David Levinson notes that most municipalities simply adopt these rates as requirements for off-street parking. This translation of parking estimates to hard-and-fast regulation is “odd” in and of itself. What is the purpose of a parking requirement? To meet the demand generated by new development?

Parking demand for a given building will be a range over the course of a day and a year, and demand for any given building category will itself fall within a large range. That range is reality, but a range unfortunately doesn’t translate neatly into a simple codified requirement.
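To make the point concrete, here is a small Python sketch using invented observations (not ITE data) of how a single codified number relates to an observed range:

```python
# Hypothetical observed peak parking demand (spaces per 1,000 sq ft) for a
# handful of buildings in the same land-use category -- invented numbers,
# not ITE data.
observations = [1.2, 1.8, 2.1, 2.4, 2.6, 3.0, 3.9, 5.2]

mean_demand = sum(observations) / len(observations)
low, high = min(observations), max(observations)

# A requirement written to the top of the range guarantees mostly-empty lots at
# most buildings, most of the time; one written to the bottom risks spillover at
# a few sites. Either way, a single number hides the range.
print(f"Observed range: {low:.1f} to {high:.1f} spaces per 1,000 sq ft")
print(f"Mean: {mean_demand:.1f}; a requirement set at the max ({high:.1f}) "
      f"overshoots the mean by {high / mean_demand - 1:.0%}")
```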

In the previous post, I discussed the challenges of accurate prediction and specifically referenced Nate Silver’s work documenting the many failures and few successes in accurate forecasting. One area where forecasting has improved tremendously is meteorology – weather forecasts have been steadily improving – and a large part of that is disclosing the uncertainty involved in the forecasts. One example is hurricane forecasts, where instead of publicizing just the predicted hurricane track, forecasters also show the ‘cone of uncertainty’ where the hurricane might end up:

Example of a hurricane forecast with the cone of uncertainty – image from NOAA.

So, why not apply these methods to city planning? A few ideas: as hypothesized before, the primary goal for parking regulations isn’t to develop the most accurate forecasts. The incentives for weather forecasting are different. The shift to embrace uncertainty stems from a desire to find the most effective way to communicate the forecast to the public. There are a whole host of forecast models that can predict a hurricane track, but their individual results can be a bit messy – producing a ‘spaghetti plot,’ often with divergent results. The cone of uncertainty both embraces the lack of precision in the forecast and simplifies communication.

For zoning, a hard and fast requirement doesn’t lend itself to any cone of uncertainty. Expressing demand in terms of a plausible range means that the actual requirement would need to be set at the low end of that range – and in urban examples, the low end of potential parking demand for any given project could be zero. Of course, unlike weather forecasts, these regulations and policies are political creations, not scientific predictions.

Meteorologists also have the benefit of immediate feedback. We will know how well hurricane forecasters did within a matter of days, and along the way they have the benefit of several days of iterations to hone the forecast. Comparatively, many cities added on-site parking requirements to their zoning codes in the 1960s – regulations that often persist today. Donald Shoup didn’t publish his parking opus until 2005.

There’s also the matter of influencing one’s environment. Another key difference between a hurricane forecast and zoning codes is that the weather forecasters are looking to predict natural phenomena; ITE is trying to predict human behavior – and the very requirements cities impose based on those predictions will themselves influence human behavior. Build unnecessary parking spaces, and eventually those spaces will find a use – inducing the very demand they were built to satisfy. There, the impacts of ignoring uncertainty can be long-lasting.

Here’s to embracing the cone of uncertainty!

Prediction is hard – so why do we make key decisions based on bad information?

Comparison of USDOT predictions for Vehicle Miles Traveled, compared to actual values. Chart from SSTI.

Back in December, David Levinson put up a wonderful post with graphical representations looking to match predictions to reality. The results aren’t good for the predictors. Lots of official forecasts call for increased vehicle travel, while many places have seen stagnant or declining VMT. It’s not just a problem for traffic engineers, but for a variety of professions (I took note of similar challenges for airport traffic here previously).

Prediction is hard. What’s curious for cities is that despite the inherent challenges of developing an accurate forecast, we nonetheless bet the house on those numbers with expensive regulations (e.g. requiring off-street parking to meet demand) and projects (building more road capacity to relieve congestion) based on bad information and incorrect assumptions.

One of the books I’ve included in the reading list is Nate Silver’s The Signal and the Noise, Silver’s discussion of why most efforts at prediction fail. In Matt Yglesias’s review of the book, he summarizes Silver’s core argument: “For all that modern technology has enhanced our computational abilities, there are still an awful lot of ways for predictions to go wrong thanks to bad incentives and bad methods.”

Silver rose to prominence by successfully forecasting US elections based on available polling data. In the process, he argued the spin of pundits added nothing to the discussion; political analysts were seldom held accountable for their bad analysis. Yet, because of the incentives for punditry, these analysts with poor track records continued to get work and airtime.

Traffic forecasts have a lot in common with political punditry – many of the projections are woefully incorrect, and the methods for predicting are based more on ideology than on observation and analysis.

More troubling, for city planning, is the tendency to take these kinds of projections and enshrine them in our regulations, such as the way that the ITE (Institute of Transportation Engineers) projections for parking demand are translated into zoning code requirements for on-site parking. Levinson again:

But this requirement itself is odd, and leads to the construction of excess off-street parking, since at least some of that parking is vacant 300, 350, 360, or even 364 days per year depending on how tight you set the threshold and how flat the peak demand is seasonally. Is it really worth vacant paved impervious surface 364 days so that 1 day there is no spillover to nearby streets?

In other words, the ideology behind the requirement wants to maximize parking.

It’s not just the ideology behind these projections that is suspect; the methods are also questionable at best. In the fall 2014 issue of Access, Adam Millard-Ball discusses the methodological flaws of ITE’s trip generation estimates (Streetsblog has a summary available). Millard-Ball notes that the “seemingly mundane” work of traffic analysis has enormous consequences for the shape of our built environment, due to the associated requirements for new development. Indeed, the trip generation estimates for any given project appear to massively overestimate the actual impact on traffic.

There are three big problems with the ITE estimates. First, they massively overestimate the actual traffic generated by a new development, due to non-representative samples and small sample sizes. Second, the estimates confuse marginal and average trip generation: build a replacement courthouse, Millard-Ball notes, and you won’t generate new trips to the court – you’ll just move them. Third, the rates have a big issue with scale. Are we concerned about the trips generated to determine the impact on a local street, or on a neighborhood, or the city, or the region?
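The first problem – small, non-representative samples – is easy to illustrate. Here is a toy Python simulation with invented numbers (not ITE data) showing how a handful of auto-oriented sample sites can overstate the rate for a typical new development:

```python
# Toy simulation (invented numbers, not ITE data): if the sampled sites skew
# toward auto-oriented suburban locations, a small sample overstates the trip
# rate for a typical new development.
import random

random.seed(1)

def site_trip_rate(urban: bool) -> float:
    """Hypothetical PM-peak vehicle trips per 1,000 sq ft at one site."""
    # Made-up distributions: urban sites generate fewer vehicle trips.
    return random.gauss(mu=4.0 if urban else 9.0, sigma=1.5)

# "True" population of development sites: mostly urban/infill in this toy example.
population = [site_trip_rate(urban=(random.random() < 0.7)) for _ in range(10_000)]
true_mean = sum(population) / len(population)

# A small convenience sample drawn only from suburban sites (the easy ones to count).
sample = [site_trip_rate(urban=False) for _ in range(8)]
sample_mean = sum(sample) / len(sample)

print(f"True average trip rate:   {true_mean:.1f}")
print(f"Small biased sample rate: {sample_mean:.1f}")
# A handbook rate built from the biased sample overstates the true average -- and
# a zoning code that hard-codes it then over-requires parking and road capacity.
```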

What is clear is that these estimates aren’t accurate. Why do we continue to use them as the basis of important policy decisions? Why continue to make decisions based on bad information? A few hypotheses:

  • Path dependence and sticky regulations: Once these kinds of regulations and procedures are in place, they are hard to change. Altering parking requirements in a zoning code can seem simple, but could take a long time. In DC, the 2006 Comprehensive Plan recommended a review and re-write of the zoning code. That process started in earnest in 2007. Final action didn’t come until late in 2014, with implementation still to come – and even then, only after some serious alterations of the initial proposals.
  • Leverage: Even if everyone knows these estimates are garbage, forecasts of large traffic impacts provide useful leverage for cities and citizens to extract improvements and other contributions from developers. As Let’s Go LA notes, “traffic forecasting works that way because politicians want it to work that way.”
  • Rent seeking: There’s money to be made by consultants and others in developing these inaccurate estimates and then proposing remedies for them.