U.S. ADVISORY COMMISSION ON PUBLIC DIPLOMACY

Minutes and Transcript from the Quarterly Public Meeting on Public Diplomacy’s Role in Countering State-Sponsored Disinformation, based on the September 2020 ACPD special report, “Public Diplomacy and the New ‘Old’ War: Countering State-Sponsored Disinformation”

U.S. Advisory Commission on Public Diplomacy Quarterly Meeting

Wednesday, September 30, 2020 | 12:00 p.m. – 1:30 p.m.

Virtual Public Meeting via Videoconference

COMMISSION MEMBERS PRESENT:

TH Sim Farar, Chair

TH William Hybl, Vice-Chair

TH Anne Terman Wedner

COMMISSION STAFF MEMBERS PRESENT:

Dr. Vivian S. Walker, Executive Director

Mr. Shawn Baxter, Senior Advisor

Ms. Kristy Zamary, Program Assistant

MINUTES:

The U.S. Advisory Commission on Public Diplomacy (ACPD) met in an open virtual session from 12:00 p.m. to 1:30 p.m. on Wednesday, September 30, 2020, to review public diplomacy’s role in countering state-sponsored disinformation, based on the September 2020 ACPD special report, “Public Diplomacy and the New ‘Old’ War: Countering State-Sponsored Disinformation.” An expert panel of scholars and practitioners discussed opportunities and challenges for PD practitioners in responding to malign influence operations. Panelists included U.S. Ambassador (ret.) Bruce Wharton; James Pamment, Nonresident Scholar at the Carnegie Endowment for International Peace; and Graham Brookie, Director and Managing Editor of the Atlantic Council’s Digital Forensic Research Lab. ACPD Executive Director Vivian Walker opened the session, and Chairman Sim Farar provided introductory remarks. Senior Advisor Shawn Baxter moderated the Q&A, Commissioner Anne Wedner provided a discussion wrap-up, and Vice-Chairman Bill Hybl closed the meeting. The speakers took questions from the Commissioners and the online audience, as detailed in the transcript below.

AUDIENCE:

More than 100 participants joined the ACPD’s virtual public meeting, including:

  • PD practitioners and PD leadership from the Department of State, USAGM, and other agencies;
  • Members of the foreign affairs and PD think tank communities;
  • Academics in communications, foreign affairs, and other fields;
  • Congressional staff members;
  • Retired USIA and State PD officers; and
  • Members of the general public.

Vivian Walker: Hello everybody, my name is Vivian Walker, and I’m the Executive Director of the U.S. Advisory Commission on Public Diplomacy. Along with Chairman Sim Farar, Vice Chairman Bill Hybl, and Commissioner Anne Wedner, it is my pleasure to welcome all of you to today’s public meeting in fulfillment of the ACPD’s mandate to provide regular reports to the public about what the U.S. government is doing in terms of its public diplomacy activities.

Today’s meeting showcases a just-published ACPD report, “Public Diplomacy and the New ‘Old’ War: Countering State-Sponsored Disinformation.” This report, which you can now access via the ACPD website, assesses recent State Department and U.S. Agency for Global Media initiatives with respect to countering disinformation effects.

I’d like to take this opportunity to express, on behalf of the Commission, our sincere thanks to report co-author Ryan Walsh, whose subject matter expertise, meticulous research, and strong analytical skills helped to make the report what it is. In addition, we greatly appreciate the public diplomacy officers at home and abroad who contributed to this report through their experience and expertise and who, day after day, serve on the front lines of the geostrategic competition for influence.

To frame today’s discussion and to help us think through the key issues, we have a panel of experts, each of whom brings a wealth of professional expertise–both practical and scholarly–to the table:

  • James Pamment, Co-director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace;
  • Graham Brookie, Director and Managing Editor at the Atlantic Council’s Digital Forensic Research Lab; and
  • U.S. Ambassador Bruce Wharton (ret.), former acting Under Secretary of State for Public Diplomacy and Public Affairs and an accomplished public diplomacy officer in his own right.

Before we begin, I’d like to say a few words about process. Following Chairman Farar’s introductory remarks, I will provide a brief overview of the report and its top line recommendations, which, I hope, will be useful to those of you who haven’t yet had an opportunity to review it.

We will then hear from each of our panelists in succession. Please note that there will be no break between their presentations. While they are speaking, we encourage you to submit your questions for the Q&A session, moderated by ACPD Senior Advisor Shawn Baxter, via Slido at the link provided, using the access code #ACPD. As you do so, please be sure to indicate to whom the question is directed.

Finally, no need to take notes because there will be a full transcript of this event available in about a month’s time on our website and through a special email distribution. With that, I’d like to turn this over to the Chairman of the U.S. Advisory Commission on Public Diplomacy, Sim Farar.

Sim Farar: Thank you, Vivian, and all of you who joined us today from the United States and around the world. We are so pleased to welcome you to the ever-growing community of interest around public diplomacy. With me today are my distinguished colleagues from the Commission, Vice Chairman Bill Hybl, from Colorado, and Commissioner Anne Wedner, from Chicago, Illinois.

The ACPD is a bipartisan panel created by Congress in 1948 to appraise U.S. government activities intended to understand, inform, and influence foreign publics and to increase the understanding of, and support for, these same activities. The Commission assesses the effectiveness of USG public diplomacy activities across the globe and recommends changes when needed. For more than 70 years, the Commission has represented the public interest by advising on the U.S. government’s global information, media, cultural, and educational exchange programs.

The Commission is mandated by law to report its findings and recommendations to the President of the United States, Congress, the Secretary of State, and of course, the American people. In addition to our flagship annual report, the ACPD produces a series of focused studies on issues of importance to the practice of public diplomacy.

Today we are pleased to roll out our most recent special report, Public Diplomacy and the New “Old” War: Countering State-Sponsored Disinformation. U.S. government public diplomacy efforts face a sophisticated array of technology-enabled, information-based threats. Our new report assesses State Department and U.S. Agency for Global Media counter-disinformation activities and offers a set of recommendations that balance long-term resilience and capacity building measures with short-term deterrence and messaging initiatives.

My thanks to our stellar panelists, who have so generously agreed to share their insights and experiences, and to our audience members, for your continued commitment to these important issues.

Vivian, now, back to you.

Vivian Walker: Great, thank you, Sim. So, a little bit about the inception of this report and our recommendations. This report grew out of the recognition that there is an overwhelming amount of information about, and discussion of, the scope and impact of disinformation effects, as well as what constitutes appropriate countermeasures to address them. So, in the face of this wealth of information, the ACPD wanted to provide a precise accounting of just what the State Department and USAGM are doing to address this threat, as well as to provide some actionable, concrete recommendations for the future.

Our findings coalesced into three broad streams. First, we found that current public diplomacy initiatives for countering state disinformation need to be more flexibly resourced and better configured to the ever-shifting demands of the digital information space. Secondly, we saw the need for greater internal consensus about the definition of the threat and better internal coordination of countermeasures to maximize efficiency and ensure targeted impact. Finally, we detected the need for a more conscious balance between short-term deterrence measures and long-term measures of resilience to counteract disinformation effects. Our targeted recommendations fell into six categories: define, invest, compete, specialize, experiment, and evaluate. All of them are action verbs, you’ll note.

Let me quickly go through the list. What do we mean by define? Well, we saw an opportunity here for the creation of a Department-wide lexicon for disinformation. We felt that a shared definition of terms and concepts about the nature of disinformation would facilitate program coordination as well as rationalize resource distribution.

Secondly, we saw the need for even greater investment in digital capabilities, but not at the expense of the proverbial last three feet. We must continually work on that balance between short-term responsiveness and long-term engagement.

Third, we saw the need to become even more competitive in the information space by doing more to restructure overseas public diplomacy sections with teams dedicated to modern digital communications. We felt that there were some good lessons to learn from the private sector in this regard.

We also saw a need for increased professional specialization. This could take the form of, perhaps, a separate job series for mid-career Foreign Service officer counter-disinformation specialists or, failing that, integrating that specialization more fully into the Foreign Service officer career track.

Next, we saw the need for greater experimentation to look for opportunities to develop mechanisms that could rapidly redirect funding to get programs off the ground quickly, but also to pull them back just as quickly if they’re not working. In short, build more risk-taking into the program development process.

And finally, and you’ve heard this frequently from the Advisory Commission, we focused on the need for more and better evaluation, monitoring, and assessment of the impact of countering state-sponsored disinformation (CSD) programs. We all know that long-term program effectiveness requires evaluation and monitoring in order to make these programs better and more efficient in the future. Monitoring, evaluation, and assessment activities also minimize duplication of effort and maximize program effectiveness. In an era of perpetually scarce resources, this is extremely important to consider.

We have always been and always will be engaged in the defense of our national values, our interests, and our strategic priorities. The question is: how do we make the best use of our public diplomacy resources to maintain our competitive edge? To answer that question, we are very fortunate to have our panel of experts. I’d like to turn this over now to James, who is going to share his expertise with us.

James Pamment: Thank you, Vivian. Good afternoon, everyone. It’s an honor to be here and a pleasure to speak about some of the work I’ve been doing for the last five years. I’ve been based out of a university in Sweden, predominantly, as some of you may know, and my background is as a public diplomacy scholar. For the last five years, I’ve become more interested in dirty tricks in communication. As the disinformation topic became more prevalent again, particularly in my part of the world, the Baltic corner, it became quite a natural step to begin looking at some of these problems and some of these solutions.

Our first big commission was from the Swedish government. I’ll talk a bit about that project shortly, but it was to help them, essentially, protect their elections from interference. That was my first big project. This is a list of all of the organizations that have commissioned work from me or from my team over the last five years. You see, it’s quite a lot, but often quite small amounts of money. My team has averaged around $200,000 a year. So, the first thing I want to say is that throwing bags of money at this problem is not the solution. You don’t need to spend a fortune. For around $200,000 a year we’ve delivered about 200 training sessions, including tabletop exercises and tailored training to all sorts of actors. Our training programs have also been delivered by the Swedish government, so I would estimate that probably another 200 training sessions have been done using our materials.

I want to talk very briefly about four of the projects that I’ve been working on. The first was for the Swedish Civil Contingencies Agency (MSB). This was to protect the election in 2018, so we worked during most of 2017 to develop it. First of all, we developed the intellectual underpinnings for the government’s understanding of the problem, which is in the research report. It’s available on the MSB website in English. Then, we were commissioned to write a handbook for communications professionals on dealing with hostile foreign influence, which was a really interesting experiment. We think around 12,000 civil servants in Sweden were trained in this program altogether, and Finland adopted it for their election. I’m aware of at least three or four other countries that have used basically the same materials in their work. We also did some work with the foreign ministry in training embassies, and we worked quite extensively with Sweden in revisiting its psychological defense capabilities.

The second bucket of work was with the UK government’s Cabinet Office, where we did a handbook called “Resist.” I was commissioned to basically spend time hanging around the Cabinet Office and other government departments, picking up best practices learned from the Salisbury poisoning and turning them into a handbook for government departments to use in the UK. This has become part of the UK’s international outreach efforts as well as a standard training program. The Global Engagement Center (GEC) also sponsored this quite heavily. They’ve been very generous with funding to us. They’ve also translated “Resist” into a couple of different languages and supported our work with around 10-12 countries to train a mixture of government communicators, election officials, security policy people, and intelligence analysts. We put them in mixed teams and train them up on their capabilities to handle election interference. This is the kind of work we’ve been doing using “Resist” as the starting point.

The third piece of work has been on hybrid deterrence, done mainly in partnership with the Hybrid Center of Excellence in Helsinki, also with NATO and the EU. We developed a model to essentially game out a deterrence process. Unfortunately, this isn’t publicly available. But, it’s quite an interesting piece of work because it’s built on some of the recent experiences that many European institutions have had in dealing with hostile state actors in hybrid and disinformation scenarios.

Lastly, the final piece of work that I wanted to touch on is a policy piece that the European External Action Service commissioned. It’s three reports looking at the future of disinformation policy. We were asked to write a comprehensive policy in support of the European Commission’s European Democracy Action Plan. This included working with various teams to brainstorm future threats. The cornerstone of this approach is actually common definitions and how to set up EU institutions to counter disinformation.

What I want to spend the rest of my time doing is actually just talking about some of the lessons learned. The first one is, don’t let others set your agenda. I think this is a massive issue in this field. Digital and social media platforms have significant problems with disinformation, but those aren’t the same as government problems. At the moment, these platforms, rather than governments, are defining the problem set. I see that as a major hurdle because governments cannot take down disinformation in the same way that the private sector can.  These platforms can give indications that are helpful, but they’re not the answer.

Similarly, with the research community: there’s a tendency to think of research as neutral. It isn’t. Researchers have their own interests. Moreover, there’s a scramble at the moment for resources because money is available. I think we need to start asking slightly more difficult questions of the research community. The point I really want to get across is the national interest. A government is driven by its own interests. A platform shouldn’t be telling you what your interests are. Researchers shouldn’t be telling you what they are. You need to get your own house in order and decide what your priorities are.

Which leads on nicely to my next point. You know, the biggest problem I see in all these trainings that I’ve been doing around the world is that governments and institutions have to get their own houses in order. There’s no point in blaming Russia or Iran or China for disinformation if they’re exploiting vulnerabilities in your system. You need to begin by fixing your system. That means getting the policies right. I’m not sure that any country really has got it right so far. The UK approach is interesting.  The EU currently has an incredible opportunity, and I’m committed to helping develop that process. Some important questions must be addressed: what are the positions and skills needed?  Are roles and responsibilities sufficiently defined, particularly where there’s an overlap between intelligence, security policy, the civilian side, and the more secret side? Are diplomats and communicators working together? Are skill sets coordinated? Are key concepts in place?

We go back to the definitions, as Vivian was talking about. Do we even have a language to talk about this? When the EU had a code of practice, which was a voluntary agreement with the platforms, they wanted all the platforms to report back on disinformation on their services. Instead, the different platforms reported back using their terminology. So, Facebook talked about coordinated inauthentic behavior while Twitter talked about information operations. And so, you just get all of these reports talking about completely different things, loosely related to each other but not the same thing. We need those standards and definitions derived from national interests, driven by what governments require.

It’s the same with analytical methods. People sitting in different teams with different roles who see different pieces of the puzzle should be able to coordinate their work and actually share information in a reasonable and timely way. The lack of information sharing is a significant problem in this field. Then lastly, in terms of getting your house in order, we need laws that support policies. I think a lot of these problems can be solved through non-regulatory means.

Finally, you need a mixture of actor-specific and actor-agnostic responses. By this I mean that part of the problem having to do with disinformation is a democratic problem and part of it is a security problem. You solve the democratic problem with public awareness education. The security problem has to be solved within a hybrid framework, using deterrence models and the security apparatus. That difference needs to be teased out because in some cases education is the answer, while in others it’s not going to help.

And then lastly, we need to remember that public diplomacy fundamentals still apply to disinformation: understanding your audience; working with your friends; identifying the influencers and thought leaders; having very clear narratives and accurate, truthful messaging; building trust; developing a reputation; and finding the overlap between your interests and theirs. Those principles still apply here and form the basis of a public diplomacy response to disinformation.

I’ll leave it there. Thank you very much for your attention. Happy to answer any questions you might have.

Vivian Walker: Great, thank you so much for that, and, as promised, we will move directly into the next presentation. Graham, over to you.

Graham Brookie: Thank you so much. I’ll just share a few thoughts and not take up the majority of my time so that we can move on to questions and answers, which I hope would be most useful to all of you communicators out there. So, first and foremost, thank you so much to Vivian and the entire team at the ACPD. This is an extremely important element of our government. I say that having served in government prior to my current perch at the DFR Lab at the Atlantic Council. The “Public Diplomacy and the New ‘Old’ War” report is an extremely valuable piece of work. I don’t think we could have written it better ourselves at the DFR Lab.

By way of introduction, my name is Graham Brookie. I’m the Director and Managing Editor of the Atlantic Council’s Digital Forensic Research Lab. Prior to working at the DFR Lab, I served at the National Security Council at the White House for four years, and in my last role there, my job was as a strategic communications advisor. As most of you who have served in government know, the more words you have in your title, the less relevant it becomes.

I want to start by sharing a story, the origin story of the DFR Lab, which I think is pertinent to this conversation. In 2015, when I was still sitting in the Old Executive Office Building, we faced a very real public diplomacy and public communications challenge: the United States government knew that there were Russian regular troops in eastern Ukraine, but neither my colleagues nor I could call the Washington Post or the New York Times. We could not tell a journalist that “we know for a fact that there are Russian troops in eastern Ukraine. I’m not going to go on record, and I can’t tell you how I know that, but I think that you should dig into this.”  The U.S. government was backfooted from a public diplomacy and public relations standpoint.

The team that would become the DFR Lab outside of government – the investigators now at Bellingcat, some of the policymakers at the Atlantic Council, and a number of journalists – got together and said, “we’re going to start researching it, we’re going to write a report.” That report was called “Hiding in Plain Sight.” And, this matters only at the Atlantic Council and literally nowhere else, but that report did gangbusters. More people saw that report than any other report in the history of the Atlantic Council.

So, the Atlantic Council said we should probably do more of that. But what’s more important is the substance of the report. The report proved that there were Russian troops in eastern Ukraine, and the way that that report did it was by using open source research. So, rather than relying on classified intelligence that had to be cleared and then communicated into a set of talking points, the report looked at Instagram and VKontakte posts by Russian troops in eastern Ukraine and then geolocated them using Google Earth and a number of other tools that you have access to on your phone right now, which was a pretty good way to tell that story.  Very high confidence, very transparent, very engaging.

That report was a very useful engagement tool. The day after that report was released, it was easy to call up the Washington Post and the New York Times from the United States government and say: “Did you see that report? We can verify that.” It was a new way of engaging. Today, the DFR Lab is a center at the Atlantic Council that has grown quite a bit, with 30 people around the world on five continents, and the whole point is credibility and distribution in local information environments, as opposed to having a large center located in Washington, D.C., studying and then communicating with the world from a perch.

We do three basic things. We do ongoing open source research that is pretty varied. We look at human rights violations in typically closed media environments, from Russian regulars in eastern Ukraine to the use of chemical weapons in Syria or the mass graves that ISIS dug there. The way that we conduct open source research in those information environments is very different from the second thing that we do, which is looking at open information environments and how overall narratives spread across platforms, across languages, and across the world. That tactic in the open information environment focuses on sifting through things to figure out what is what – what is fact and what is fiction, what is engaging to people and what is not.

The third aspect of our research is just explaining the nuts and the bolts of the information environment. There is much that, as James mentioned, lacks strong definitions or descriptions, in part because there is very little public information available. An example of that is bots. Not everybody that you disagree with on Twitter or that disagrees with you on Twitter is a Russian bot. Explaining how bot activity and networked activity works on the internet builds up your credibility. So, that’s the first thing: research.

The second thing that we do is training, as James mentioned: training up as many people as humanly possible. The thing that we really struggle with, as a community, is the wide degree of variance. There’s a little bit of a gold rush right now in the counter-disinformation field. Everybody is rushing in, and the variance in expertise and work is pretty drastic. James and I can attest to the fact that we probably spend more time pointing out what is good research and what is bad research across the community than actually conducting research. It is important to create across-the-board training standards for those engaged in public diplomacy: especially those in government, who bear additional responsibility in democracies, but also journalists and people in the private sector, such as analysts at Facebook, Twitter, and other large social media and engagement platforms that create infrastructure for the information environment. Creating standards across this entire community will reduce variance in what expertise looks like.

Then the third is dealing with data and translating it into policy. We need to create policy solutions based on evidence of what we’re seeing from around the world rather than creating policy solutions that are in search of evidence. That applies not only to policy solutions like what we should do with the infrastructure of the Internet, but how we should be engaging from our diplomatic posts or from the seat of the United States government, including internally as well as externally with allies and partners against hostile actors. So, policy analysis and engagement is the third thing we do.

The other point that I would want to make by way of reference and framing this conversation is definitional. To the point that James made, we don’t have agreed-upon definitions. In our day-to-day work at the DFR Lab, we don’t use the term “fake news.” It turns out it’s not a particularly good research term. You can’t really measure it, and it creates a lot of connotations that aren’t very helpful around the world. So, we bundled that term up and threw it out the window. We don’t use it in our work; we don’t use it in our engagement.

The second term, which we do use, is “misinformation.” We define it as the spread of false information without intent. We all have a family member who spreads misinformation on a pretty regular basis and/or makes the Thanksgiving table uncomfortable. “Disinformation” is different from misinformation. We define disinformation as the spread of false information with intent, meaning whoever’s spreading that information means to lie to you or means to manipulate the information environment. For those of you who have worked on cyber security issues, you know that attribution is always the hardest thing, and it is even harder in the soft information environment. The definition of disinformation is extremely important in making attributions.

Another definition is one that we have stolen from the Australian government’s Department of Home Affairs, which describes “foreign interference” as activity by foreign actors that is coercive, corrupting, deceptive, or clandestine in nature, which distinguishes it from the more benign phenomenon of influence. By focusing more expressly on digital activity, this definition denotes a range of interference activities, including disinformation, media manipulation, and cyber intrusion, that are conducted by foreign actors to affect outcomes. That’s really important when we’re looking at this issue.

Also, just by way of framing, I think that it’s important to note two things. One is that we are in a truly global competition for information. As the report from the ACPD states, that’s not new, but the way that it plays out is extremely new and changing at a faster pace than ever before. That means we need to have engagement at all times. As an example, for public diplomacy, it doesn’t quite work to export baseball and Coca-Cola and drive a conversation for a decade. Try that with a basketball program in a place like the Baltics, and they’ll say something like, “We’ve seen that on YouTube. What else do you have for us?” So, the need for engagement on a faster, more consistent basis is extremely important.

How we set ourselves up to do that is an outstanding question. How do we organize our bureaucracy? How do we organize our tactics? How do we organize our infrastructure? That’s something that I think the report makes really good recommendations on, but I think it’s also important to note that strategic communications is not just a communications issue. One of the things that we see pretty consistently in government is whenever we are having a bad news cycle, the communicators look at the policy folks and say, “I think this is probably a policy problem.” And the policy folks look at the communicators and say something like, “Well, I think we have a big communications problem.” And the fact is that it’s both. Especially with disinformation, which leads to the point that disinformation is a catalytic threat. It is an ongoing vulnerability.

Disinformation and how foreign actors use disinformation is a transnational threat. Disinformation does not care about our borders, and it doesn’t care about our information environments. It is actor-agnostic. But if we don’t deal with both the catalytic threat that is disinformation as well as the vulnerabilities that it touches on, then we’re backfooted at all times. And not only that: when we talk about disinformation as a catalytic threat, the main point is that if you don’t solve for the challenge that is disinformation, then it will be extremely difficult to solve for literally anything else. That might sound hyperbolic, but I can’t think of a single policy issue in which disinformation hasn’t made it harder to make progress. This includes any of the largest diplomatic priorities of the Department of State.

If you don’t solve for disinformation, then you’re not going to be able to solve for any number of multilateral agreements. You’re not going to be able to solve for massive challenges like climate change. You’re not going to be able to solve for any of these large priorities of the American government. And that is not something that translates very well outside of the public diplomacy community. I think that everybody on this phone call or in this webinar knows that existentially thanks to their day-to-day work.

But, if you talk with one of the regional bureaus or one of the functional bureaus, when you talk about disinformation, it feels intangible. It feels like it’s messy, and, for those not directly engaged in public diplomacy, it feels like it’s not part of their core job. There needs to be a culture shift in order to be more efficient in how we conduct policy and how we compete in a global competition for information. And with that, I think that I would cede any remaining time that I might have to Q&A.  I very much look forward to it.

Vivian Walker: Thank you so much, Graham. That was really interesting, and I look forward to being able to explore some of those issues during the Q&A. But right now is our opportunity to hear from Bruce. So, Bruce Wharton, please, over to you.

Bruce Wharton: Thanks very much, Vivian. I’ve been out of the mainstream of public diplomacy for exactly three years. Today is the third anniversary of my retirement from federal service, and I am absolutely delighted to see the progress that has taken place in the critical analysis of the disinformation problem set in the last three years.

Thanks for including me today. You know, I think back to my earliest experience in the Foreign Service, and, at that point, Soviet disinformation was a very real problem. It’s something that I dealt with in my early assignments in Latin America. At that point, the narrative was the baby parts story. This was a Soviet-planted story that wealthy Americans were either kidnapping or stealing infants in developing countries to harvest their organs – an absolutely ghastly story that had no basis whatsoever in fact. The technique was that a Soviet agent would plant this story in a small provincial newspaper or radio station in a country like India, for example, and then, by pointing to the earlier news stories, would begin to build a momentum of belief in the story.

Eventually, we were challenged in places like Mexico and Argentina and Bolivia, with serious journalists beginning to think that this might be a real story. So, part of my job as a press officer in Bolivia, for example, in the late 1980s was to counter this. I did it using several tools. We had a small office in Washington at USIA that analyzed the problems and developed responses that field officers could use – a sort of history of the development of the problem and rebuttals. And then it was up to me to use the personal relationships that I had developed with editors and reporters to present the information that I had from Washington. Fortunately, thanks to educational and professional exchanges or long term contact with the United States, there was a general predisposition not to believe such an outrageous story among the people that I dealt with. So, we were fairly successful in pushing back on this narrative.

Now, today, the challenge is orders of magnitude larger and more complex, but I think that the fundamentals still apply. That is, we need people in capital cities. We need people in Washington who analyze and provide information to the field officers as well as develop a global response. We need to continue the sorts of exchanges and programs that give our important and influential interlocutors overseas knowledge of our society and of our culture that then makes it hard for them to accept crazy stories. And, we need the ability, through field officers, to apply knowledge locally through personal contacts. The long-term and short-term effects of contact with influential people are both important. We can’t divorce one from the other.

Of course, most of these recommendations, most of these ideas, are contained in the Advisory Commission’s latest report on countering state-sponsored disinformation. So, fundamentally, I agree with the recommendations in the report, and I really do commend the authors’ work. One of the things that impressed me most about the report was the broad outreach and research that has clearly gone into the report.

One of the ideas that I think is critical here is that the United States is not alone in this. Both James and Graham alluded to this. We need to work very closely with our allies, governments around the world, journalists, and civil society groups that share our values and concerns about the destructive effects of disinformation. So, let me share just a few ideas about initiatives that I think are especially critical as we move forward.

First, we really need to focus on building networks and partnerships with groups outside of the United States, as well as within the country. I think immediately of journalists. I think of people who are bloggers or otherwise influential in social media circles. I think of academic communities and civil society groups. TechCamps, a Rick Stengel initiative that I inherited and continued when I was the acting Under Secretary, was a great example. TechCamps allowed us to convene groups of people who shared our concerns and then, instead of telling them what to do, let them work on developing responses to the Russian disinformation effects that threaten their own countries and societies. Participants shared practices, and they shared information. I was hugely impressed with these groups of people and their ability to respond constructively to the threat of disinformation. Similar to the idea of “leading from the middle,” when we bring people together, we’re not telling them what to say or how to think, but we’re recognizing that we share values and looking to them to develop responses that make sense in their context. I thought that was very effective.

James spoke very clearly about the importance of sharing practices, ideas, and information with allied governments. Again, when I was in the Service, we did some of that, mostly with the UK, but I now see examples from Sweden, Australia, and other places. I think this is going to take a global network of effort to address.

When we are working in social media with our own resources, we need to adopt what the Air Force called a “train and trust mentality,” in which policymakers establish the left and right borders for what we need to do, but then we turn people loose within those parameters. Otherwise, our ability to respond quickly will be crippled by the infamous clearance process. I don’t think that works. That implies backing people up when they make mistakes, as they inevitably will. But we have to trust people: establish the mission within the left and right lines of where we need to be, but trust people to use their own judgment.

Given state-sponsored disinformation, weaponized information, and weaponized lies, at some point we need to begin to think about talking to our adversaries about a non-proliferation agreement on the use of information to threaten other countries, other governments.  I’m sure that if we had Russian and Chinese and Iranian participants in this conversation, they would point to things that the United States does – some of the things probably that I did as a public diplomacy officer – that feel threatening to them. We’re going to have to go into such a conversation with the ability to understand their points of view and find compromises. Admittedly, this idea is a little bit out in left field, but at some point, I do think that it would be useful to at least experiment with conversations about a non-proliferation agreement with adversaries on the use of information. Analytic capabilities are absolutely critical and could provide some basis for measuring nonproliferation efforts.

And then finally, there is the education piece. We’ve got to do what we can at home and abroad to “spoof proof” publics. I’ve had conversations with people in the gaming world about incorporating disinformation into the video games that kids play, in which they need to distinguish between credible and non-credible information as a skill-building tool. Most critically, though, and I think this is really foundational to all of this, we here in the United States have got to do a better job of living by our own values. And by values I really mean the Bill of Rights, the Constitution, the things that we as Americans believe are important. We’ve got to reset expectations about the American respect for science, for truth, for tolerance, for justice, for equity, for human rights, for fairness. We’ve got to build the expectation that when the United States says something, it’s credible and that we will stand behind it. I fear that we’ve lost some of that. I do see this as foundational to all of the exchange and information work. If people don’t trust us, then a lot of that work will be much more difficult. I will stop there. Thank you.

Vivian Walker: Great, thank you, so much, to our panelists for these inspiring and thought-provoking comments. We’re going to go into our Q&A session, but just before we do that I want to remind everybody out there listening to submit your questions. Please send them in via Slido. ACPD Senior Advisor Shawn Baxter will capture them and direct them to our panelists. This is an opportunity to really learn from one another today. So, I hope you’ll take advantage of that. Over to you, Shawn.

Shawn Baxter: Thank you, Vivian. Good afternoon, everyone. Before I start looking at Slido, I want to give an opportunity to one of our commissioners to kick us off and ask the first question. Please go ahead, Anne.

Anne Wedner: Hi, everyone. I thought this was an excellent presentation. I think the report is even more valuable and merits a longer read beyond today’s discussion.

Listening to James at the beginning, I felt like we’re sort of approaching this problem in a flat, old media sense. After Vivian and I made our rounds through several eastern European and central European countries, we learned that the biggest issue, and maybe the elephant in the room, is really the social media companies themselves. We found that people in government and the NGO sector were agonizing trying to figure out the algorithms used by social media companies to push different kinds of information and stories.

The other side of that which people were concerned about is the violation of privacy laws – that it is possible for social media platforms to know more about you than you know about yourself. From a strategic perspective, as we heard at the NATO Centre of Excellence in Latvia, there was real concern that even our NATO officers could be vulnerable to the weaponization of information via social media platforms. So, it’s one thing to say, “Oh, let’s educate ourselves and be a resilient society.” It’s another thing to figure out how to fend off messages that are specifically and psychologically tailored for individuals at that micro-targeting level.

James Pamment: Okay, thank you for giving me the most difficult question out there. The reason I talk about non-regulatory solutions is that regulation is necessary, and it’s inevitable at this point, but I think we need to bear in mind that spreading disinformation isn’t illegal. I’m allowed to say that I think Elvis is alive and well and living in San Diego, right? I should be allowed to say that, and, thank goodness, I can. But, actually, the platforms themselves can decide what’s permissible on their territory in the same way that a fast food restaurant can decide who gets to come in and who doesn’t.

Governments can do a lot more work with the platforms around terms of service that are derived from fundamental freedoms, such as freedom of thought and expression. If you were developing the building blocks of terms of service and trying to apply them across the board to new and old platforms, big and small, how would you develop them, building on our fundamental rights, so that they protect people from things like disinformation, particularly the more harmful kinds? I think there are tremendous opportunities, basically unexplored at this point, to start looking into that. We should consider those kinds of activities in conjunction with regulation that enables auditing in particular. I don’t think we’re going to get away from the idea that the platforms self-report, but we do need to check their reporting.

So, my point would be, yes, regulation is coming. Don’t expect access to data to solve all the problems, but a lot of these problems can be solved by working with platforms, working through the spirit of our fundamental freedoms, trying to figure out what is doable with the platforms. Then, how can these platforms adapt their promotion of content policies to support that kind of approach?

This is a great question.  I’ve only answered part of it, but maybe the other panelists will answer the other parts.

Graham Brookie: I would add one thing. I think that if we are relying on what social media companies are doing or not doing to create standards for speech or countering disinformation across democracies, then we are in an extremely bad place as a starting point. If we wait for Facebook and Twitter and Google and some other platforms to come in and save the day for how we are going to engage as democracies across the world, that’s not going to work.

Coming at this issue from the lens of a truly global competition for information, the things that make authoritarians or hostile states the most nervous about free and open societies, collectively, are privacy and free speech. We have drastically different standards for privacy and free speech across free and open societies. Europe has probably a stronger standard for privacy than the United States, and the United States has a stronger standard for freedom of expression or freedom of speech than does Europe. Until there is wide agreement or more alignment on both of those issues across democracies, as well as, frankly, penalties within purported democracies for not protecting both of those things, I don’t think that we are going to be able to give policy solutions to social media companies in a way that is clear, coherent, and collaborative. I think that we’re probably a long way off on that, even though in democracies we agree much more than we disagree on this issue with regard to technology.

Shawn Baxter: All right, thank you, Graham and James. Bruce, I don’t know if you want to chime in on that at all, or if not, we can move on to the next question.

Bruce Wharton: No, I think we’ve heard from the people who actually know that stuff. So, thank you.

Shawn Baxter: One of our first questions from Slido concerns the fact that, for many public diplomacy practitioners, this is a field-focused issue. Most of our attendees today are PD practitioners, whether they’re based in Washington or the field. How can we better support our officers in the field to counter disinformation? As a corollary to that, we also have a couple of questions on structure. The ACPD report looked at some structural changes made at the State Department and USAGM over the last three to four years. Do you see a need for more structural changes in order to effectively face the threat and support what our officers are doing in the field? I’ll throw that out to Bruce first, given his experience, and then ask James and Graham to chime in if they’d like.

Bruce Wharton: Initially, I saw the Global Engagement Center (GEC) as a back office, in essence, for the field, although with two missions. One was to develop a sort of global presence that pushed back against global disinformation activities, but the other was to be extremely responsive to field officers, because I do think that dealing with disinformation is different from country to country. Going back to the baby parts story, for example, it was very different dealing with that in Bolivia, where the rate of adoptions to the United States was very low, than dealing with it in Guatemala, where the adoption rate was high and there was a greater propensity to believe the story. So, I see GEC’s work as partly to respond to a specific request from a field officer for information or history or strategy or tactics in responding to disinformation in a way that makes sense in the local context.

Shawn Baxter: James, would you like to jump in?

James Pamment: Sure. So, my experience comes from working with the UK Foreign Office in developing their capabilities. The issues that we were facing had to do with developing situational awareness. That comes from digital monitoring capabilities: where should they be seated? Should every post have that capability, or should monitoring be centralized? The second question is how you do risk assessment. How do you assess what you’re seeing online? How do you weigh up what’s likely to have an impact, and what isn’t, against your priorities? And then thirdly, there is the question of response capability. How do you counter or ignore or do whatever it is you’re going to do with that disinformation?

I don’t think there is a single solution. I think it’s different for everyone. In the U.S., you have 3,000 to 4,000 PD officers around the world. That’s a completely different level of capability compared to any of the other governments I’ve consulted with. But I think, generally speaking, situational awareness assessments should be centralized, to review and collate all of the reports that are coming in and see if a bigger picture is emerging. Individual posts can’t do that. So, a post needs some capability, but there needs to be a central office looking at the big picture. In terms of risk analysis, I think posts also need to be able to do that. They need to assess, in their own context, what they think is problematic for their strategic objectives. And in terms of response capabilities, I think it would be best if posts were capable of doing that because timing can often be critical. Certainly, the center can provide coaching and support, but you want to have tactical capabilities at post.

Graham Brookie: A few quick thoughts. The trope with Russian strategy is that they have very clearly defined goals and then let all elements in the infrastructure achieve those by any means necessary. We have the exact opposite within our own bureaucracy, which isn’t helpful. I am a big fan of the Department of State language [media] hubs, having worked with them. I agree with James’ point about trusting posts in particular, but also having some degree of centralization. Having clear coordination with Main State is absolutely necessary. You’re never going to get rid of that, nor should you. Everything should tie back into a central strategy. But, I think it is also important to train, empower, and trust the field.

One of the things that has been interesting in my experience across government is that at any given embassy you might have a Public Affairs Officer who is good at and interested in social media, or you might have a Public Affairs Officer who couldn’t care less about what’s going on across social media or within the local information environment. Drawing down that degree of variance is really important. Knowing, and being interested in, how to engage across information environments, where people are engaging locally, is important. It may be very surprising to many of you to learn that the largest social media platform in Kenya, for instance, is LinkedIn, whereas the internet in Southeast Asia is literally Facebook. Engaging with people where they are locally is absolutely necessary.

Also, having the ability to not only engage, but to monitor and understand what is happening across the information environment is something that is understudied and under-evaluated. And then, having a degree of trust to engage on a consistent basis, engaging at the speed of information that is already traveling, and not just responding, but shaping information environments, shaping narratives. If you are responding to disinformation, in particular, you’ve already lost. So, having that ability to be proactive and to engage on a consistent basis so that when something does happen, you are already there, is incredibly important, and that requires a huge degree of trust.

Shawn Baxter: Thanks to all three of you. I think we hit a nerve there with the training question because we’ve had a number of comments and questions come in about how to better train PD officers overseas to do these kinds of things. So, if you’d like to make more comments on that later, please do. For now, let’s move on to the next question. This one concerns mistakes that the United States government has made in confronting the disinformation threat in a foreign context. What do you think that we can learn from these mistakes, assuming you believe that significant mistakes have been made? I’ll direct that to you, Graham.

Graham Brookie: It’s very hard for those of us who are current or former government employees to admit that we ever made any mistakes. But one thing that comes immediately to mind on disinformation is the failure to get ahead of disinformation rather than merely reacting to it. If you’re responding to disinformation, then you’ve already lost, for two reasons. One, you’ve accepted the premise of the disinformation narrative. Second, you give it credibility or amplification by responding to it. We face this on a consistent basis with the Russians in Syria. If you’re getting into a back and forth about who bombed what and where and which bad actors are doing what and where, you are perpetuating the narrative rather than defusing it.

I think one of the parallels that we can draw in order to make that point clearer across the bureaucracy is that responding to disinformation from a public relations standpoint is a little like negotiating with an illogical actor. For any given case of disinformation, I would compare it to trying to negotiate any kind of deal with the North Korean government. You’re not dealing with a logical actor; you’re not dealing with reason; and you’re not dealing with evidence. You’re dealing with raw emotion and narratives that you know will engage the information environment in the basest way possible.

And so, understanding that going in and being clear-eyed about it on a consistent basis, and adjusting your speed to the pace at which information flows, are two things that are existentially important. I don’t think that the U.S. government, or almost any democratic government that has dealt with this, is very good at that, with the exception of a few specific governments on very specific issues, such as the Baltics with Russia.

Shawn Baxter: James, Bruce, do you want to jump in on USG mistakes and what we can learn from them?

Bruce Wharton: I think there are too many to count at this point, and I was certainly guilty of my own share. I do remember early on that the philosophy was, “no, we won’t dignify something with a response.” That connects to Graham’s point. But it was never clear to me what we should do instead of responding directly to a false charge, other than kicking it up to a higher level. I’m not sure that was an effective approach, so I’m tempted to ask the question back to James and Graham. If you don’t respond directly to an outrageous charge, how do you avoid leaving the field open to your opponent?

James Pamment: Graham, I can see that you’re letting me take this one; that’s fine. I want to say I agree with everything Graham said, and the point you just made, Bruce, was the one I was thinking of making. But I have another point to answer your question. I do think we should ignore some things. I don’t think we should answer every piece of disinformation. I think we should weigh whether it affects our interests. If there’s a problem to take care of in the background, then we should monitor it and design our messaging with an awareness of it, but it shouldn’t become the centerpiece of the messaging.

One mistake that I think everyone’s making, not just the USG, is that we’re pumping money into this problem without really evaluating whether any of these techniques work, particularly the countermeasures. Does truth tracking work? Does debunking work? Does training communications staff work? We have good anecdotal evidence that some of the things we’ve been doing may have had an impact, but I don’t know. That’s the mistake that everyone’s making at the moment. We need to follow up and find out what actually works.

Graham Brookie: One very specific tactical thought on that: we’ve had a good amount of success with exposing disinformation, although exposure isn’t the only tool in the toolkit. How do you not directly respond to a case of disinformation? One, you call it what it is, first and foremost. Fancy academic institutions would say, and this is a technical term, take the sandwich approach: the truth, then the disinformation, then the truth again. Next, think about disinformation as a form of corruption. Explaining how that corruption works, including the supply chain and exactly how it reached that audience, is a good way to go about this and to make the point, on a more strategic level, that someone is trying to manipulate the information environment in order to manipulate or influence you. Explaining that supply chain is how you avoid responding directly while still shaping the ecosystem that produced the disinformation.

Shawn Baxter: Thank you, all. We don’t want to lose sight of the other side of the public diplomacy house, our in-person initiatives, our cultural and educational exchange programs. In the rush to digitize programs and to address challenges in the information space, where does that leave us with some of our other programs? Edward R. Murrow talked about the proverbial “last three feet.” Is that still important regarding disinformation? Could the three of you address that?

Bruce Wharton: Let me take a first shot at that. I think it remains incredibly important. One of the lessons we’re learning from social media is the power of a credible voice. If I hear information or disinformation from someone I know and trust, I’m more inclined to believe it. So the relevance of personal contact in a field setting, with government officials, with media influencers, with academics, remains vitally important. The problem, of course, is that we can’t reach out and touch seven and a half billion people personally. It comes back to analytics. We need to find a way to identify where the most influential nodes are, whether in a personal network or an online social media network, and then make sure that we develop meaningful relationships with those actors.

Graham Brookie: I think the quick point there is that the world is run by those who show up, and the in-person programs are designed to create the structure needed to show up. I say this having worked on young leaders programs across the world. These programs are extremely successful but also extremely difficult to measure from an M&E standpoint, because the whole point of a young leaders program is generational. You’ll be able to measure the success of that program 10 years from now, which is the point. The tie back to disinformation is that a big part of building resilience to disinformation or to foreign influence operations is long-term investment in consistent contact. So, when something does hit the information environment that is outright false, misleading, or harmful, there’s already a structure in place that is evidence-based, accountable, and trusted to address it.

James Pamment: I’ll just agree with the previous speakers. Past public diplomacy studies have shown that with greater distance comes less knowledge of a country, certainly less detailed knowledge. What these exchanges do is mitigate that to a certain degree: “I don’t judge you by the leader of your country, and hopefully you won’t judge me by the leader of mine, because you know enough about my culture and I know enough about yours not to do so.” Hopefully, that also applies to disinformation. If you have contacts in that country, you can call them up and say, “Have you heard about this? Is it true?” So, I do hope that those basic principles of public diplomacy apply to countering disinformation. The old exchanges are extremely valuable.

Shawn Baxter: Great, thank you. We have time for one more question, and I want to take one based on the idea of a disinformation nonproliferation agreement that Bruce raised as a possible course of action. Bruce, could you clarify what you mean by that?  And, James and Graham, do you have any comments on that idea?

Graham Brookie: I’ll say this is a hard one. I agree with Bruce, but there is a recent example with regard to U.S. elections and foreign policy that indicates we have some work to do on this idea. Last week, Vladimir Putin called for détente on interference. What he meant was: okay, we’ll essentially admit to running influence operations targeting United States elections, but if we agree to stop, then in return you must never talk to civil society in Belarus again. And that’s a bad trade-off.

So, the devil is going to be in the details regarding nonproliferation versus détente. Establishing norms for issues that currently have none is important. That probably starts with working with our allies and partners in free and open democratic societies to establish basic norms of behavior in the information space, rather than trying to launch a New START-style treaty on disinformation and social media with Russia, China, and Iran. I’m not sure that will work in the near term. One parallel is cyber norms: we are struggling in a major way just to figure out what the norms are for hard cyber activity, which has wide implications for our economic security, national security, our infrastructure, and things of that nature. It’s even harder in the soft information environment involving content.

James Pamment: I get to advertise a piece of work that actually came out today through Carnegie. It’s the third report of my European policy magnum opus, and it has a section on deterrence for influence operations. In answer to the challenge Bruce has given us: yes, you can talk about these kinds of things, but it has to be within a framework of other deterrence components. You also need sanctions and strike-back capabilities before you can start talking about making peace. That’s how deterrence works. Underlying that framework is the need for a consistent policy. The report that came out today includes an operational model for using this kind of approach, and that model describes these trade-offs.

I want to give one example as well, since Graham gave an example of Russian behavior. Spain negotiated a partnership on countering disinformation with the Russians. Pretty much before the ink was dry on the agreement, the Russians went out with their press team and announced to the world, “We’ve made a deal with Spain to do this.” Spain wasn’t informed beforehand. It was just a PR coup, and as far as I understand, nothing has come of the partnership.

Bruce Wharton: Well, like I said, it was sort of a crazy idea out of left field. I think one of the biggest problems is that with nuclear nonproliferation, we had the capability for mutually assured destruction. I hope that Western democratic nations would not respond to Russian or Chinese or Iranian disinformation in kind. That is, we would not launch our own disinformation campaigns, and we would continue to talk to civil society, to conduct exchanges, and to promote education, which is very different from disinformation. Frankly, I hope we’re never willing to go to the mat and respond to aggressive disinformation from opponent states the way they do, which puts us at a terrible disadvantage in negotiating any sort of détente.

I think that Graham’s idea of establishing norms may be a slightly more productive path. I don’t trust the Russian or the Chinese or the Iranian governments to abide by norms, but again, at some point it’s a conversation worth having. Maybe it’s in the context of spreading the gospel of liberal democracy. I don’t regret having thrown the idea out there, although I do recognize that it’s a bit far-fetched at this point.

Vivian Walker: Thank you again to our panelists for truly productive and inspiring presentations and to our audience for your informed and searching questions. To close, I’d like to turn first to Commissioner Wedner, who will give us a couple of thoughts to wrap up our discussion. Anne, over to you.

Anne Wedner: Thanks, Vivian. And thanks to everybody. I thought it was a really high-quality conversation, and there were several useful insights. First, this idea of privacy versus freedom of expression is super important, and we need to carry that forward. Then, as Graham noted, the false presumption of logical actors in this space is critical. We also had some interesting discussions on our tour about nonlinear ways of influencing through public art and humor. For example, there was a lot of interest in teaching the kind of improv we see on SNL to promote the diffusion of ideas and push back against the conditions that allow disinformation to thrive.

In any case, we are really grateful for all that you have accomplished so far, but there’s still a lot more to do in this space. Now, I’ll turn it over to Bill to conclude.

Bill Hybl: Thank you. On behalf of the Commission, I want to thank our panelists today and also those of you who have been with us throughout this event. Our next quarterly public meeting will be timed to coincide with the release of our annual report. We look forward to sharing our findings and discussing them with you. This concludes today’s meeting. Thanks again to all of you.
