U.S. Congress

  • Media
    Are OTT Platforms Abusing Their Market Power?
    The chief executives of some of the world’s most dominant technology companies will appear before the House Judiciary Antitrust Subcommittee on July 27. Congress should press them to operate in ways that adhere to principles of neutrality and non-discrimination that have served the interests of consumers and citizens around the world.
  • U.S. Congress
    U.S. Congress Returns to Session, Schools Grapple With Reopening, and More
    Podcast
    The U.S. Congress returns to session; schools and universities struggle with reopening amid a surge in COVID-19 cases; and China, the United Arab Emirates, and the United States prepare for spacecraft launches to Mars.
  • United States
    TWE Remembers: The Pacificus-Helvidius Debate
    Original intent. The term pops up frequently in debates over how to interpret the U.S. Constitution. At its core, the concept of original intent holds that constitutional interpretation should be guided (or bound) by what the framers envisioned when they wrote the document back in 1787. But what happens to original intent when the framers themselves disagreed on how to interpret their handiwork? It’s not an academic question. Just look at the so-called Pacificus-Helvidius debate, which began 227 years ago today. The debate pitted Alexander Hamilton (writing under the pen name “Pacificus”) against James Madison (writing under the pen name “Helvidius”). Even though the two men had had more influence than anyone else on the writing and ratification of the Constitution, they sketched decidedly different views of the relative powers of Congress and the president in foreign policy. Their disagreement echoes to this day.
    The debate originated in President George Washington’s issuance of the Proclamation of Neutrality on April 22, 1793, declaring that the United States would not take sides in the war that had just erupted between France and a range of European powers, including Britain. The proclamation excited passions at home, all the more so when by happenstance a representative of the revolutionary French government, Edmond-Charles Genêt, traveled from Charleston, South Carolina, to Philadelphia whipping up pro-French sentiment. Then-Vice President John Adams would remember it later, perhaps with a touch of hyperbole, as a time “when ten thousand people in the streets of Philadelphia, day after day, threatened to drag Washington out of his house and effect a revolution in the government or compel it to declare war in favor of the French Revolution and against England.”
    As Adams’s recollection suggests, much of the opposition to Washington’s decision was on the substance. Many Americans favored siding with France. The two countries had a treaty of alliance, French support had been critical to winning the War of Independence, and neutrality would help the hated British. Indeed, in an effort to forestall these criticisms and in keeping with the recommendation of Washington’s Francophile secretary of state, Thomas Jefferson, the proclamation did not use the word “neutrality” but instead declared America’s “friendly and impartial” attitude toward all the belligerents.
    But for many critics the Proclamation of Neutrality also raised an important constitutional question: under what authority did Washington act? The Constitution said nothing about neutrality. It did, however, lodge the power to declare war with Congress. By opting for neutrality, even if he hadn’t used the term, hadn’t Washington infringed on Congress’s constitutional authority? Doubts about the propriety of Washington’s action extended to members of his own cabinet, with Jefferson writing that “my objections to the competence of the Executive to declare neutrality (that being understood to respect the future) were supposed to be got over by avoiding the use of that term.”
    The first of Hamilton’s seven essays defending Washington appeared in a Philadelphia newspaper, The Gazette of the United States, on June 29, 1793. The seventh and final one appeared on July 27. The essays clearly rankled Jefferson. He wrote to Madison several times urging him to respond to Hamilton’s “heresies”: “Nobody answers him, & his doctrines will therefore be taken for confessed. For God’s sake, my dear Sir, take up your pen, select the most striking heresies, and cut him to pieces in the face of the public. There is nobody else who can & will enter the lists with him.”
    Madison initially tried dodging his friend’s request. He ultimately relented, though he told Jefferson that “I have forced myself into the task of a reply. I can truly say I find it the most grating one I ever experienced.” Madison’s five essays were published in The Gazette of the United States between August 24 and September 18.
    Both sets of essays addressed the substantive criticisms of the Proclamation of Neutrality. But the enduring legacy of the exchange was the differing constitutional visions the two framers sketched. Hamilton argued for a broad reading of presidential power, insisting that “the general doctrine of our Constitution…is…that the executive power of the nation is vested in the President; subject only to the exceptions and qualifications which are expressed in the instrument.” In contrast, the role of the Senate in treaty-making and of Congress in declaring war were “exceptions out of the general ‘executive power’ vested in the President, they are to be construed strictly, and ought to be extended no further than is essential to their execution.” Hamilton further argued that while the “legislature have the right to declare war, it is on the other, the duty of the executive to preserve peace, till the declaration is made; and in fulfilling this duty, it must necessarily possess a right of judging what is the nature of the obligations which the treaties of the country impose on the government.”
    This vision of a powerful president armed with implied constitutional powers differed markedly from the vision of presidential authority Hamilton had laid out in the Federalist Papers. There he had painted a presidency with circumscribed powers. The president’s position as commander in chief “would amount to nothing more than the supreme command and direction of the military and naval forces,” and his authority to receive foreign ambassadors would be “more a matter of dignity than of authority.” And perhaps most notably, he argued that “the history of human conduct does not warrant that exalted opinion of human virtue which would make it wise in a nation to commit interests of so delicate and momentous a kind, as those which concern its intercourse with the rest of the world, to the sole disposal of a magistrate created and circumstanced as would be a President of the United States.”
    Madison responded that Hamilton had it backward—the balance of power between the executive and the legislature tilted toward Congress. The president’s foreign policy powers were restricted to those specifically mentioned in the Constitution, and even those did not amount to much. For instance, Madison dismissed Hamilton’s contention that the power to receive foreign ambassadors imbued the president with broader authorities by observing “that little, if anything, more was intended by that clause, than to provide for a particular mode of communication.” For Madison, the president was essentially an agent who acted on behalf of Congress and who should not act in ways that would circumscribe its freedom of action. This vision of the presidency was more pinched than the one Madison had sketched five years earlier at the Virginia ratifying convention, showing that Hamilton was not the only framer whose constitutional views shifted with time and circumstance. (Madison’s flexibility on constitutional interpretation showed up again a few years later during the debate over the Jay Treaty.)
    Madison probably had the better of the argument in terms of the system the framers who met in Philadelphia in the summer of 1787 thought they were creating. But Hamilton had the better of the argument about how the system would actually unfold. Over the next two centuries, power—both constitutional and practical—shifted toward the president. It did so in good part because of a dynamic that Hamilton had recognized in the Federalist Papers, namely, that “decision, activity, secrecy, and dispatch will generally characterise the proceedings of one man, in a much more eminent degree, than the proceedings of any greater number; and in proportion as the number is increased, these qualities will be diminished.” The shift would be helped along by the frequent willingness of members of Congress to put aside qualms about the president’s authority to act when he delivered policy outcomes they wanted.
    In their jousting, Hamilton and Madison both pointed to constitutional provisions that supported their positions. In doing so, they highlighted how the framers, perhaps unwittingly, had created a constitutional structure of separated institutions with overlapping powers. These overlapping, or concurrent, authorities have, in turn, meant, as the legal scholar Edward Corwin famously put it, that the Constitution extends “an invitation to struggle for the privilege of directing American foreign policy.” That struggle can generate a productive tension, with the two branches checking each other’s worst tendencies or bringing out each other’s best qualities. But it also raises the possibility that one branch will trump the other. As the historian Arthur Schlesinger wrote nearly a half-century ago in The Imperial Presidency, “if the President were to claim all the implications of his control of diplomacy, he could, by creating an antecedent state of things, swallow up the congressional power to authorize hostilities. If Congress were to claim all the implications of its power to authorize hostilities, it could swallow up much of the presidential power to conduct diplomacy.” I will leave it to you to decide which is the greater risk today.
    Noah Mulligan and Anna Shortridge assisted in the preparation of this post.
  • United States
    TWE Remembers: Truman’s Decision to Intervene in Korea
    Seventy years ago today, President Harry Truman ordered the U.S. military to aid South Korea in repulsing an invasion from North Korea. The decision had geopolitical consequences that are still felt to this day. It was equally momentous for its impact on America’s constitutional practice. Truman acted without seeking congressional authorization either in advance or in retrospect. He instead justified his decision on his authority as commander in chief. The move dramatically expanded presidential power at the expense of Congress, which eagerly cooperated in the sacrifice of its constitutional prerogatives.
    Truman’s decision hardly fit the framers’ vision of how the war power would be exercised. When Pierce Butler of South Carolina proposed at the Constitutional Convention to vest the war power with the president, no one seconded the motion, and a fellow delegate exclaimed that he “never expected to hear in a republic a motion to empower the Executive alone to declare war.” James Wilson, a leading voice at the convention, assured the Pennsylvania state ratifying convention that the new Constitution “will not hurry us into war; it is calculated to guard against it. It will not be in the power of a single man, or a single body of men, to involve us in such distress.” Alexander Hamilton offered similar reassurances in Federalist 69. The president’s role as commander in chief “would amount to nothing more than the supreme command and direction of the military and naval forces” while Congress would possess the powers of “DECLARING of war and…RAISING and REGULATING of fleets and armies.”
    The framers’ restricted vision of presidential war-making powers carried over into practice. In 1810, James Madison, a man who knew something about original intent, dismissed as unconstitutional a Senate-passed bill that would have delegated to him the authority to order the Navy to protect American merchant ships against attacks from British and French raiders. His reasoning? Only Congress could take the country from peace to war. Nearly forty years later, President James Buchanan wrote that “without the authority of Congress the President can not fire a hostile gun in any case except to repel the attacks of an enemy.” Just nine years before Truman unilaterally decided to defend Korea, President Franklin Roosevelt asked Congress to declare war on Japan even though the Japanese had bombed Pearl Harbor—an attack that clearly met Buchanan’s (and the framers’) standard of when a president could act without soliciting congressional approval.
    Truman clearly believed that time was of the essence and, with the memory of Munich hovering in the background, that the United States had to counter communist aggression. But he couldn’t argue that he didn’t have time to go to Congress or that Congress couldn’t act quickly. Congress was in session when North Korea invaded and almost certainly would have endorsed his decision. And Truman knew from experience that Congress could act swiftly. FDR asked for a declaration of war against Japan the day after Pearl Harbor. Congress provided it within hours.
    Truman also couldn’t argue that he hadn’t been reminded that the Constitution gave the war power to Congress. Sen. Robert A. Taft of Ohio, one of the leading Republicans on Capitol Hill at the time, took to the Senate floor on June 28 to argue that “there is no legal authority for what he [Truman] has done.”
    Nor could Truman argue that the Korean conflict didn’t constitute war in a constitutional sense, even if he did downplay the significance of his decision. (At a press conference on June 29, Truman denied the country was at war, prompting a journalist to ask, “would it be correct…to call this a police action?” Truman answered simply, “Yes.”) The framers understood the difference between full-scale and limited wars—or what they would have called “perfect” versus “imperfect” wars. Over the years, Congress had authorized many small-scale conflicts. The Supreme Court had even ruled that Congress’s war power encompassed both large and small-scale conflicts and that when Congress authorized limited wars the president could not go beyond what Congress permitted. And, of course, the Korean War was “limited” only in the sense that it was smaller in scope than the two world wars.
    Truman would argue that he was obligated to act because the UN Security Council had called on UN members to repel the attack and that his decision was in keeping with past practice. He in fact had decided to intervene in Korea before the UN Security Council voted, and he couldn’t assume the Council would vote as it did. (Had the Soviets not boycotted the meeting for unrelated reasons, they would have blocked action.) More important, he was not legally obligated or empowered to respond to the UN’s call. The Senate’s approval of the UN Charter and Congress’s subsequent passage of the UN Participation Act of 1945 were explicitly predicated on agreement that UN membership did not override Congress’s war power. (Precisely that fear had helped torpedo Senate approval of the Treaty of Versailles a quarter century before.) The list of supposed precedents of unilateral presidential actions, which were presumed somehow to have put a “gloss” on constitutional meaning, was unimpressive. As a leading legal scholar at the time noted, the list consisted of “fights with pirates, landings of small naval contingents on barbarous or semi-barbarous coasts, the dispatch of small bodies of troops to chase bandits or cattle rustlers across the Mexican border and the like.”
    Truman in the end acted because he believed, contrary to what the framers envisioned and the historical record showed, that as commander in chief he had the authority to order U.S. troops into combat. Indeed, when asked after he left office whether he still would have intervened in Korea had the UN Security Council failed to approve a response, he answered: “No question about it.”
    Truman was able to establish the precedent that presidents can take the country to war on their own, though, only because members of Congress, Taft’s complaints notwithstanding, were unwilling to defend their constitutional power from executive encroachment. Truman met with fourteen leading members of Congress on Tuesday, June 27, shortly after he ordered the U.S. military to move toward combat status. According to Secretary of State Dean Acheson’s telling, lawmakers responded to the news that the United States would come to the aid of South Korea with a “general chorus of approval” while saying nothing about taking the issue to Capitol Hill. When Truman met with congressional leaders again three days later, moments after he committed U.S. ground troops to the war, Senate Minority Leader Kenneth Wherry (R-NE), who had not attended the first meeting, argued that Truman should have gone to Congress. Senator Alexander Smith (R-NJ) then suggested, but didn’t insist, that the president still seek congressional approval. Truman promised to consider the request. As the meeting ended, Representative Dewey Short (R-MO), the ranking Republican on the House Armed Services Committee, endorsed Truman’s decision to act unilaterally.
    Acheson subsequently recommended that Congress pass a resolution to “commend” (but not “authorize”) the action that the United States, not the president, had taken in Korea. He argued, however, that Congress, rather than the president, should initiate the process. Truman raised Acheson’s recommendation and a draft resolution the State Department had prepared with Senator Scott Lucas (D-IL) in a meeting on July 3. The Senate Democratic leader had no appetite to take up any resolution. He argued that “the president had very properly done what he had to do without consulting the Congress” and that “many members of Congress had suggested to him that the president should keep away from Congress and avoid debate.” Truman gladly followed the advice.
    The refusal of Lucas and other lawmakers to force a vote was hardly the first time that Congress sacrificed its constitutional prerogatives in the service of immediate political needs. In doing so, however, it helped greatly expand the boundaries of presidential power. To be sure, Truman’s immediate successors were more impressed by how the Korean War consumed his presidency than by the authority he asserted in entering it. Dwight Eisenhower and Lyndon Johnson both saw Truman’s experience as showing the need, as the saying went, to get Congress in on the takeoffs in foreign policy if they wanted it around for the crash landings. So whether it was the crises over Dien Bien Phu and Formosa or the Gulf of Tonkin incident, their initial instinct was to turn to Congress for resolutions to bless their authority to act. (After his experience in Vietnam, Johnson lamented that he had “failed to reckon on one thing: the parachute. I got them on the takeoff, but a lot of them bailed out before the end of the flight.”)
    The fears that drove Eisenhower and LBJ eventually receded. What remained—particularly in the legal briefs prepared over the years by White House lawyers for Democratic and Republican presidents alike—was the contention that Truman showed that presidents can go to war on their own initiative. Members of Congress have long sought to put that genie back in the bottle. They have largely failed, as the Kosovo War, the Libyan intervention, and the Yemen War all attest. Powers easily given away are exceedingly difficult to reclaim.
    Noah Mulligan and Anna Shortridge contributed to the preparation of this post.
  • United States
    TWE Remembers: The Jay Treaty 
    Today marks the 225th anniversary of the Senate vote to approve the Jay Treaty, or more formally the Treaty of Amity, Commerce, and Navigation, Between His Britannic Majesty and the United States of America. You’re forgiven if you don’t have the date marked down in your calendar. Chances are your high school U.S. history class didn’t make much of the Jay Treaty. But to the Americans of the time it was as controversial and consequential as the vote on the Treaty of Versailles would be 124 years later, if not more so. The Jay Treaty didn’t just set precedents that have governed executive-legislative relations on foreign policy for more than two centuries. It also showed that even early on, politics didn’t stop at the water’s edge. It helped spur the formation of America’s first political parties, the Federalists and the Republicans, and it created an irreparable rift between the man who wrote the Declaration of Independence and the man who won the War of Independence.
    To understand the Jay Treaty, some context is necessary. The United States in 1795 was a long way from the superpower it would become. It had overthrown British rule, but it was a weak country, lacking both a standing army and significant naval forces. Its first attempt at self-government—the Articles of Confederation—had been so disastrous it was scrapped after less than a decade. Its replacement had been operating for just six years at the time of the Jay Treaty debate. Many Americans still wondered whether the new government would be any more successful than its predecessor.
    Making matters worse, Europe was at war. The French Revolution had triggered a series of military conflicts across the continent. The most important for U.S. interests came in February 1793, when France declared war on Britain. The French declaration put President George Washington in a difficult position. The United States had won its independence in good part because of its military alliance with France. But alliances run both ways. The French now expected Americans to return the favor. Secretary of State Thomas Jefferson urged Washington to do just that, arguing that fulfilling the alliance obligation was a matter of honor and interest. Secretary of the Treasury Alexander Hamilton countered that going to war with Britain would be suicide for the young nation. Not only was Britain America’s largest trading partner, but war would bring British attacks on American shipping, a possible invasion from Canada, and raids across the northwestern frontier by Native American tribes allied with Britain.
    Washington sided with Hamilton and opted for neutrality. The decision hardened the emerging dividing line in American politics between Federalists, who favored a stronger national government and reconciliation with Great Britain, and Republicans, who feared centralized national power and saw the French Revolution as fulfilling the promise of the American experiment. (Those Republicans were the antecedents of today’s Democratic Party; the modern Republican Party was founded in 1854.)
    Tempers might have cooled if neutrality had enabled the United States to avoid the fight between the two great powers. But Washington’s decision at best postponed outright war rather than prevented it. British warships soon began seizing American merchant ships entering or leaving French ports in the West Indies, violating the traditional norm that neutral ships were allowed free passage provided they were not carrying munitions. The British navy also began stopping American merchant ships and carrying off, or impressing, sailors judged to be British deserters.
    Fearful that the country was being dragged into a war it did not want and could not afford, a small group of Federalist senators approached Washington with a proposal to send an envoy to London. They hoped diplomacy could end the attacks on American shipping and settle a range of other disputes that roiled relations between the two countries. They quickly won over the president, showing that senatorial influence does not always come through floor votes.
    Washington initially wanted to send Hamilton to London to negotiate. The practice at the time, and one that would not last much beyond Washington’s presidency, was for the Senate to confirm the president’s choice of an envoy. That consent would not be easy to come by in Hamilton’s case, even though Federalists dominated the Senate. The Treasury secretary’s efforts to build a stronger national government had made him a lightning rod in American politics, and his pro-British views guaranteed that critics would question any agreement he negotiated. Hamilton made things easier for Washington by reading the room and asking not to be considered for the post. It would not be the last time an administration changed course on policy or personnel because of judgments about the traffic Congress would bear.
    The job of special envoy went instead to John Jay, the chief justice of the Supreme Court and the coauthor, along with Hamilton and James Madison, of The Federalist Papers. Jay had impeccable credentials for the position. He had been a president of the Continental Congress, served as U.S. minister to Spain, and spent five years as secretary of foreign affairs under the Articles of Confederation. His nomination nonetheless sparked its own controversy, in part because he intended to continue as chief justice. (The Supreme Court in the 1790s had neither the importance nor the workload it has today.) Critics reasonably questioned the propriety of a justice negotiating an agreement he might one day be asked to rule on. The deeper issue, though, was that Jay, like Hamilton, favored Britain over France. Republican senators feared that Jay would sell out America’s interests to the British crown.
    Just as important as who the envoy would be was what negotiating instructions he would be given. The framers of the Constitution had envisioned the Senate as an advisory council to the president on treaty-making, offering not just consent at the end of the process but advice at the beginning. Washington, who had chaired the Constitutional Convention, had put that idea into practice in his first term by formally asking the Senate to approve negotiating instructions. But the divisions within the Senate (and the country more broadly) over what could and should be gotten from Britain meant that a debate over negotiating instructions would trigger a political donnybrook. “From the Difficulty of passing particular instructions in the Senate,” a leading Federalist senator concluded, “it seems to me the most suitable that the Pr. sh. instruct, and that the Treaty sh. be concluded subject to the approbation of the Senate.” So when Republican senators moved that “the President of the United States be requested to inform the Senate of the whole business with which the proposed Envoy is to be charged,” Federalist senators voted the resolution down. In solving their immediate political problem, though, they unwittingly ceded a power the institution would never reclaim. No future president would feel compelled, as a matter of principle as opposed to expediency, to formally solicit the Senate’s collective views before opening diplomatic negotiations.
    Jay sailed for London on May 12, 1794. By November he had struck a deal. Given the dangers of mid-winter crossings of the North Atlantic, the official version of the treaty did not reach Philadelphia—then the capital of the United States—until March 7, 1795. Whether Hamilton or anyone else could have struck a better bargain is debatable. What isn’t debatable is that Jay’s deal left even his supporters disappointed. He had reduced the tension between the two countries, and with it the chance of war. And he had persuaded the British to finally fulfill their earlier pledge to dismantle forts in the Northwest Territories and to agree to arbitrate claims left over from the Revolutionary War. Beyond that, however, Jay had seemingly given up a lot in exchange for little. He had failed to persuade London to respect neutral trading rights or to end the practice of impressment. Worse yet for the Federalists, for whom merchants were a major constituency, the treaty’s Article 12 denied American shipping full access to the lucrative trade with the British West Indies.
    Fearing public reaction, Washington kept the terms of the treaty secret while waiting for the Senate to reconvene for a special session in early June. (Congress had adjourned just days before the ship carrying the treaty arrived.) When the Senate debate eventually began, critics first moved to force publication of the treaty’s terms. They calculated—for the same reasons Washington had—that making the terms public would derail Jay’s handiwork. After that move was defeated, the debate turned to Article 12. Aaron Burr—yes, that Aaron Burr—proposed shelving the treaty and directing Washington to renegotiate its terms. It was a pivotal moment. Had Burr’s motion passed, the treaty-making process might look quite different today. As one scholar summarized it: “Washington might well have considered such an act as notice that, in the future, the Senate would expect to participate in the determination of the conditions under which a proposed treaty would be signed; at the very least it would have suggested forcibly the expediency of always consulting them before opening negotiations. It might also have led the Senate to expect such consultation and thus have made it easier for Senators or groups of Senators to demand it.”
    Burr’s motion lost twenty to ten on what today would be called a party-line vote. A related effort by Southern senators to direct Washington to reopen negotiations with Britain to demand compensation for slave-owners whose enslaved people had been freed by the British during the Revolutionary War fared only marginally better.
    With the idea of shelving the treaty itself shelved, the Senate turned to the position that Federalist senators favored: conditional approval. The Senate voted—again by a margin of twenty to ten, the minimum support needed—to approve the treaty, provided that Article 12 was amended to address its concerns. It was a novel, and perhaps even presumptuous, move. Under the practice at the time, treaties were binding on countries unless it could be shown that their negotiators had strayed materially from their negotiating instructions. Here the Senate was throwing tradition aside, inserting in its place what would become the American practice of the Senate modifying treaties through reservations, understandings, and declarations.
    The Constitution, however, said nothing about conditional approval of treaties. So a question immediately arose: Would the revised treaty, if one could be had, need to be resubmitted to the Senate? Republicans certainly thought so. That would give them another opportunity to defeat a treaty they detested, or as Jefferson, who had stepped down as secretary of state at the end of 1793, put it, “give the majority an opportunity of correcting the error into which their exclusion of public light has led them.” However, the decision wasn’t Jefferson’s to make but Washington’s. After consulting with his cabinet, Washington concluded that resubmission was unnecessary.
    As Washington was sounding out his cabinet, he faced a more immediate political problem: the terms of the Jay Treaty had become public. Senator Stevens Thomson Mason of Virginia, convinced of the “importance that the People should possess a full and accurate knowledge of the subject to which their attention may be drawn, and which I think has already been improperly withheld from them,” leaked the treaty to the editor of the Aurora, a pro-Republican Philadelphia newspaper, which published it on July 1. Protests erupted “like an electric velocity” hitting “every part of the Union.” Washington’s house was “surrounded by an innumerable multitude from day to day, buzzing, demanding war against England,” while some Republicans called for a “speedy death to General Washington” and newspapers ran political cartoons showing him being taken to the guillotine. Jay, who had just stepped down as chief justice to become governor of New York, was burned in effigy in so many towns and cities that he said he could have walked the length of America by the glow from his own flaming figure. Hamilton was heckled, jeered, and then struck in the forehead with a rock when he publicly defended the treaty in New York City.
    Washington returned to Mount Vernon to decide whether to ratify the treaty. (While it is common to talk about the Senate ratifying treaties because it votes on resolutions of ratification, ratification is the step presidents take after the Senate provides its consent. And yes, presidents can and have declined to ratify treaties the Senate has approved.) Noah Webster, a Federalist and the man who gave Americans their own dictionary, described Washington’s political dilemma: “The peace of our Country stands almost committed in either event. A rejection of the Treaty leaves all the causes of hostility, unadjusted, with the double exasperation of temper. A ratification threatens evil commotions, at least in the Southern States. A rejection sacrifices Mr. Jay & perhaps many of his friends, a ratification threatens the popularity of the President, whose personal influence is now more essential than ever to our Union.”
    In mid-August, Washington made his decision: he would ratify the treaty and seek Britain’s consent to the changes the Senate had directed. He calculated that for all its flaws, the treaty would buy what the United States needed most—time. As he subsequently explained it: “Twenty years peace, with such an increase of population and resources as we have a right to expect; added to our remote situation from the jarring powers, will in all probability enable us in a just cause to bid defiance to any power on earth. Why then should we prematurely embarrass (for the attainment of trifles comparatively speaking) in hostilities, the issue of which is never certain—always expensive—& beneficial to a few only (the least deserving perhaps) whilst it [must] be distressing & ruinous to the great mass of our Citizens.”
    Perhaps to Washington’s surprise, the British readily agreed to the Senate’s changes. On February 29, 1796, Washington proclaimed the treaty publicly. Neither the Senate nor individual senators protested the decision not to resubmit the treaty to the Senate. A vigorous public relations campaign by Hamilton no doubt played a role. Over the course of the previous seven months he had written twenty-eight essays under the pen name “Camillus” defending the treaty. This industry prompted Jefferson to complain that “Hamilton is really a colossus to the antirepublican party. Without numbers, he is an host within himself. They have got themselves into a defile, where they might be finished; but too much security on the Republican part, will give time to his talents & indefatigableness to extricate them.”
    Washington’s proclamation should have ended the debate over the Jay Treaty. It didn’t. That spring, House Republicans, led by Madison, made a final run at derailing the agreement. They first called for Washington to provide the House with Jay’s negotiating instructions. In doing so, Madison asserted that the House had the right to a say in the Jay Treaty because its provisions affected the House’s role in regulating commerce with foreign nations. This ran counter to the position Madison had argued for at the Constitutional Convention. Knowing full well what the framers had discussed nine years earlier in Philadelphia, Washington believed that providing the documents would “render the Treaty making power a nullity without” the House’s consent. He denied the request, noting that the Constitution had not given the House a role in treaty-making: “To admit…a right in the House of Representatives to demand and to have as a matter of course all the papers respecting a negotiation with a foreign power would be to establish a dangerous precedent.” Washington’s refusal laid the groundwork for the modern concept of executive privilege, though as he himself noted, the requested papers had been shared with the Senate pursuant to the discharge of its constitutional duties.
    Madison and his supporters then turned their efforts to denying the funds needed to implement the Jay Treaty. The move exploited a silence in the Constitution: Could a treaty approved by only the Senate obligate Congress to appropriate funds? Oliver Ellsworth, the new chief justice of the Supreme Court, issued an advisory opinion—a practice the modern Supreme Court steadfastly avoids—saying it could and did because a treaty is an “organized & perfect compact which binds the Nation & its Representatives.” But Ellsworth’s opinion counted for far less than how House members voted. On April 30, 1796, they sided with Washington by just three votes. A diplomatic, as well as constitutional, crisis was averted.
    More than two centuries of hindsight has vindicated the Jay Treaty. As the historian Joseph Ellis has put it: “While the specific terms of the treaty were decidedly one-sided in England’s favor, the consensus reached by most historians who have studied the subject is that Jay’s Treaty was a shrewd bargain for the United States. It bet, in effect, on England rather than France as the hegemonic European power of the future, which proved prophetic. It recognized the massive dependence of the American economy on trade with England [and] linked American security and economic development to the British fleet, which provided a protective shield of incalculable value throughout the nineteenth century. Mostly, it postponed war with England until America was economically and politically more capable of fighting one.”
    Whether you accept the historians’ consensus or not, the Jay Treaty debate clearly established the outlines of the treaty-making process we know today, just as it hardened the divisions between Federalists and Republicans. It also produced a lasting break between Washington and his fellow Virginians, Jefferson and Madison. Washington never forgave Jefferson for encouraging his allies in the Republican press to vilify his character and to dismiss him as Hamilton’s pawn because of his support for the treaty. When Martha Washington later called Jefferson “most detestable,” she likely was reflecting her husband’s views. Washington’s anger at Madison, who had once been one of his closest aides, went perhaps even deeper. Washington dug up the secret minutes of the Constitutional Convention to show Madison’s duplicity in claiming a role for the House in treaty-making. America’s first president never spoke to his former aide again.
    Noah Mulligan and Anna Shortridge assisted in the preparation of this post.
  • U.S. Congress
    FISA’s Current Controversies and Room for Improvement (Part Two)
    The Foreign Intelligence Surveillance Act (FISA) has become almost impossibly political. In part two of our two-part series on FISA, former General Counsel of the National Security Agency Glenn Gerstell argues that the U.S. government needs to reimagine its approach to surveillance for intelligence purposes. 
  • U.S. Congress
    Making Sense of the Debates Over FISA (Part One)
    Somehow, FISA has become a four-letter word. In the first part of a two-part series on the current debates surrounding the Foreign Intelligence Surveillance Act, former General Counsel of the National Security Agency Glenn Gerstell explains how the act has changed since its inception. 
  • Defense and Security
    Lawmakers Should Push the Pentagon to Draw on Women’s Contributions to Security
    As Congress drafts this year’s defense spending bills, lawmakers should increase their support for a proven way to boost national security: fostering and drawing upon women’s contributions. 
  • Argentina
    Argentina’s Debt Deadline, U.S. Congress Returns, and More
    Podcast
    The deadline approaches for Argentina to restructure $65 billion in debt, the U.S. House of Representatives is set to join the Senate in getting back to work, and the spring graduation season is marked by virtual ceremonies and limited job opportunities.
  • Women and Women's Rights
    Ambassadors for Gender Equality: Who They Are, What They Do, and Why They Matter
    Since the United States appointed the first-ever Ambassador-at-Large for Global Women's Issues in 2009, ten more countries have followed, creating new posts focused on women's rights and gender equality.
  • COVID-19
    After the Pandemic: Can the United States Finally Retool for the Twenty-First Century?
    Over the more than half a century since the United States embraced its integration into the global economy, it has produced both the strongest and the weakest of the advanced economies. The strengths are obvious in the United States’ brilliant scientific establishment, its top-ranked universities, its lead in innovation, and its world-beating companies from Apple to Amazon. The weaknesses have never been more obvious than during the current outbreak of the coronavirus: among them a woefully inadequate health insurance system, a lack of paid sick leave and other basic job protections, and an unemployment insurance system that encourages companies to fire workers quickly. The virus has ruthlessly exposed the shortcomings of a country that has failed to remake itself for the world it now occupies. When the pandemic recedes, the United States will face some of the toughest questions in its history about how to retool itself for the modern world.
    In my 2016 book, Failure to Adjust: How Americans Got Left Behind in the Global Economy, I told the story of how economic globalization caught the United States off guard. For most of our history, we were a reasonably self-sufficient economy, with an expanding domestic market that was more than large enough to exploit economies of scale. So as trade, global travel, and financial integration began to grow explosively in the 1960s, the United States was slow to recognize that it needed to adapt its institutions to the new realities.
    One of the most telling examples is the program known as Trade Adjustment Assistance (TAA). It was launched by President John F. Kennedy in 1962 with the explicit goal of helping to support and retrain those who would lose jobs as a result of the coming acceleration of global competition that Kennedy and future presidents embraced. “When considerations of national policy make it desirable to avoid higher tariffs,” Kennedy said, “those injured by that competition should not be required to bear the full brunt of the impact.” Despite the soaring rhetoric, the program was stillborn: underfunded by Congress and overly restrictive from the start. When the surge in Chinese imports in the early 2000s contributed to the loss of millions of manufacturing jobs, only a small fraction of displaced workers received TAA. Far more exited the labor market entirely through programs such as Social Security disability.
    TAA is only one example of how U.S. institutions are poorly designed to deal with disruptive change, which has been accelerating over the past several decades. Whether the causes are trade competition, financial crises, job-displacing automation, or an unexpected and lethal pandemic that spreads across the world, the United States sorely lacks the capacity to help its citizens manage these shocks. Two of the most glaring deficiencies are the absence of sick leave for a significant portion of the workforce, and an unemployment benefits system that requires companies to fire their employees before those workers have access to any government aid. The first has helped undermine efforts to contain the virus, and the second means that economic recovery in the United States is likely to be especially prolonged.
    The importance of paid sick leave has never been more obvious than during this pandemic. Workers who fear losing pay, or even their jobs, if they fail to show up for work are likely to shrug off the slight cough or fever that is the first sign of infection. Yet a large portion of U.S. workers lack access to this basic right. Only 43 percent of part-time workers get paid sick leave, for example, compared with 83 percent of full-time employees. Among the top 10 percent of income earners, 93 percent have paid sick leave; for the bottom 10 percent, fewer than one in three enjoy the same right.
    The unemployment insurance system is similarly riddled with holes. Benefits kick in only after workers have been fired from their jobs, which has encouraged companies to lay off workers in droves. In the last two weeks of March, more than ten million Americans were thrown onto the unemployment rolls, more than 6 percent of the entire U.S. labor force. In contrast, Germany, the UK, Denmark, and many other European countries are supporting the wages of workers who remain employed, allowing companies to keep them on staff and resume operations quickly as the economy recovers. Many U.S. workers, by comparison, will not be rehired until the consumer economy picks up and companies regain confidence in the future. That could take several years, depending on how long it takes to develop a vaccine or other cures for the coronavirus.
    Unemployed workers also face impossible choices on health care coverage, because the United States remains the only advanced economy without some form of universal health insurance. They can maintain their former job-based plan (if they had one) only by paying the full costs through COBRA. Or they can take their chances on the ObamaCare market, where many plans come with huge deductibles.
    These issues are just the tip of a very large iceberg. Lower-income Americans are woefully unprepared for retirement, and the crash in stock markets will make it worse. A new St. Louis Federal Reserve Bank survey says that among those without a high school diploma, or with only a GED, just 22 percent had any sort of retirement savings, and among those with savings the median balance was just $35,000. Even among middle-income families, the picture is fairly bleak. The average retirement savings for couples over sixty-one, for example, is just $132,000, enough to generate only about $5,200 a year in retirement income (at a roughly 4 percent annual drawdown) on top of Social Security.
    One piece of good news is that the stimulus bill passed with strong bipartisan support in Congress was properly ambitious. The measures to increase unemployment insurance, including expanding coverage to gig economy workers, will be especially critical in helping the growing ranks of the jobless. But the bill was premised on a short-term economic shutdown of no more than a few months. If the shutdown lasts longer than that, Congress will either need to find more funding, or many Americans will run out of resources. Poorer Americans will be at the mercy of whether the two parties in Congress can continue to find ways to cooperate. If the money does not keep flowing from Washington, many Americans will find themselves unable to pay for mortgages, rent, health care bills, and other critical needs. What the country needs is not a series of short-term bailouts, but long-term plans to ensure that most Americans are protected against such crises in the future.
    Will this be the event that finally drags the United States into the twenty-first century? We certainly have the capacity to learn. The 2008 financial crisis, for example, exposed the dangerous fragility of the U.S. banking system; reforms put in place by Congress and the Obama administration in the aftermath, for all their shortcomings, left the financial system in a much stronger place to withstand the current economic shutdown.
But the lessons of the pandemic will be harder to absorb, because it has fully revealed the massive inadequacies of a social safety net designed for another era. We can learn that lesson and remake the country for the world we now inhabit. Or we can keep lurching from one crisis to the next.
  • U.S. Congress
    Andrew Yang’s Moment: The Economic Costs of the Pandemic Mean the Time for UBI Is Now
    As fears of the growing coronavirus pandemic lead to something close to a temporary shutdown of the U.S. economy, the moment has come to listen to the most important young political voice in the country: Andrew Yang. Yang’s dark horse run for the Democratic presidential nomination was based on the simplest of ideas: if Americans are poor and struggling, give them money. He took the idea of “universal basic income” (UBI) from the stuff of think tank analyses and policy books to the front pages of newspapers. Its moment has come more quickly than he could have imagined. Mitt Romney, the Utah Republican senator, has joined a growing chorus of Democrats in calling for direct cash grants of $1,000 to all American adults to help them weather the economic hit from the virus. As Congress considers additional measures to help an economy that is careening into recession, getting money quickly into the hands of struggling individuals and families must be a top priority.
    To be clear, I have not always been a fan of UBI. In our 2018 CFR Independent Task Force report on the future of work, we called for more targeted measures of the sort that are also under consideration now: extending sick leave to all working Americans, wage subsidies, increasing tax credits for lower-income workers, and strengthening unemployment insurance. In ordinary economic circumstances, such targeted measures may offer more bang for the buck. But the overwhelming virtue of UBI is its simplicity. It gets money to individuals in need, and out into the wider economy, more quickly than any other alternative. Unemployment insurance kicks in only after people lose their jobs, and does not fully cover many part-time and gig economy workers, or others who may see a temporary sharp reduction in their income during the crisis. Aid to small businesses will be critical, but the loans are complicated and often take months to disburse. A cash transfer has an immediate impact that these other measures cannot match.
    That money is going to be needed quickly. In just the past several days, governors in major states from New York to Washington have ordered the closure of bars, restaurants, gyms, and other recreational facilities. Concerts, conventions, sporting events, and other mass gatherings have been canceled. Most Americans have appropriately stopped traveling, which is pummeling the airlines and hotels. Many retail establishments from Apple to Starbucks are shutting down or reducing hours. In an economy where consumer spending drives 70 percent of economic growth, millions of American workers are going to feel the impact immediately.
    It is heartening to see shows of personal generosity, such as NBA rookie phenomenon Zion Williamson’s pledge to cover the salaries of New Orleans’s arena workers for one month. But the reality is that most Americans will have little or nothing to fall back on. Even with the solid economic growth of the past decade, some 40 percent of Americans still say they do not have the resources to cover a $400 emergency.
    The cost of UBI, of course, looks daunting. There are roughly 210 million Americans aged 18 or older, so the first round of $1,000 checks would cost the government about $210 billion. And there is no reason to think one month will be sufficient. The current closures are likely to last at least two months, and possibly much longer.
    Despite the $1 trillion budget deficit currently being run by Washington—much of it brought about by the irresponsible 2017 tax cuts for companies and wealthier Americans—there is no question that such quick relief is affordable. The Fed has now cut overnight interest rates to near zero, and in the current market chaos, investors will still want to hold Treasury debt even at very low interest rates. The money is there if Congress asks for it.
    Beyond that, who knows? Americans may find that the stability provided by a steady monthly check is exactly what they need in the current era, where the economic uncertainties of daily life are multiplying. It could mark the beginning of a long-overdue rethinking of how to help more Americans flourish in the economy of the twenty-first century. The 2008 financial crisis and the Great Recession left a poisonous political legacy in part because Americans believed that we were not all in it together. Big banks and others were bailed out, while many Americans suffered through grinding months and years of unemployment, part-time work, or unmanageable mortgage payments.
    This crisis is a chance at a do-over. All Americans must have the means to take time from work to protect their health, or the income to stay home and support their families as needed. If they don’t, the virus will likely spread more quickly and the economic pain will linger far longer. Andrew Yang is right. Give money to people. Do it now.