Brian Fishman Transcript

This document is the transcript of an interview of Brian Fishman conducted by the House Select Committee investigating the January 6th attack on the U.S. Capitol. The interview covered Fishman's background, including his education at UCLA and Columbia, as well as his role as co-founder of Cinder and his previous role at Facebook. At Facebook, Fishman's title was Director of Public Policy, and he was responsible for policy around terrorist organizations, hate groups, and criminal groups. He reported to Monika Bickert and Neil Potts and worked closely with operations, engineering, legal, and public policy teams to develop and implement content policies at Facebook.

SELECT COMMITTEE TO INVESTIGATE THE JANUARY 6TH ATTACK ON THE U.S. CAPITOL, U.S. HOUSE OF REPRESENTATIVES, WASHINGTON, D.C.

INTERVIEW OF: BRIAN FISHMAN

Tuesday, April 26, 2022

Washington, D.C.

The interview in the above matter was held via Webex, commencing at 11:04 a.m.
Appearances:

For the SELECT COMMITTEE TO INVESTIGATE THE JANUARY 6TH ATTACK ON THE U.S. CAPITOL:

STAFF ASSOCIATE
PROFESSIONAL STAFF MEMBER
INVESTIGATIVE COUNSEL
INVESTIGATIVE ANALYST
COUNSEL
INVESTIGATIVE COUNSEL
[Redacted]. So we can go on the record at 11:04.

Good morning, everyone. This is a transcribed interview of Mr. Brian Fishman, conducted by the House Select Committee to Investigate the January 6th Attack on the United States Capitol pursuant to House Resolution 503.

At this time, I'd like to ask the witness to please state your full name and spell it for the record.

Mr. Fishman. My name is Brian Fishman, B-r-i-a-n, F-i-s-h-m-a-n.

[Redacted]. Great. Thank you.

This will be a staff-led interview, and members of course may choose to come in and ask questions, although I don't currently see any members in the Webex. If anyone joins, either [redacted] or I will announce their presence.

There may be several people asking questions throughout the deposition. Mainly, it will be myself and [redacted], but there may be a few other counsels coming in to ask questions. If you don't understand a question, please just ask one of us to repeat it.

I will note that the reporters can only record verbal responses, so it's very important to get a "yes" or a "no" answer. And we'll try to prompt you, in case you nod or shake your head, just to repeat the answer so the record can reflect it.

In the room today are myself, [redacted], investigative counsel; and [redacted], investigative analyst. We also have [redacted], professional staff member. There are currently no members in the room.

So, again, Mr. Fishman, you are here voluntarily for a transcribed interview. There is an official reporter transcribing the record of the interview. Please wait until each question is completed before you begin your response, because cross-talk is very hard for the reporters to capture, and we'll try to wait for your response to be finished before we ask our next question. And, again, the stenographers cannot record nonverbal responses.

We ask that you provide complete answers based on your best recollection, and if any question isn't clear, you can just ask for a clarification. And if you don't know the answer, simply say so.

It's important that you understand this interview is voluntary. If at any point you want to stop speaking with us, that is your choice.

Similarly, if at any point you need to discuss something -- well, I see you don't have an attorney here at present, but if you do have an attorney come in at any point in the interview, you can take a break and confer with them. And if you need to take a comfort break or a break to collect your thoughts, that's also totally fine.

This interview is not under oath, but because it's a formal select committee investigation, you are obligated under Federal law to tell the truth, the same as if you were speaking to the FBI or DOJ. And this is something that we tell all of our witnesses who are in transcribed interviews. It's unlawful to deliberately provide false information to Congress. And for this interview, providing false information could result in criminal penalties for perjury and/or false statements.

Do you understand?

Mr. Fishman. Yep.

[Redacted]. You are not obligated to keep the fact of this interview and what we discuss confidential. You are free to tell whomever you wish that we met, and you can share what we discuss today. We, however, will not share what was discussed today, and we'll keep that confidential. It's completely up to you.

And, again, please let us know if you need any breaks or would like to discuss anything throughout the interview. We can go off the record and have that conversation.

And just as a final point, if at any point you don't understand where the questions are going or you don't understand the question, just ask us to repeat it, and we'll be happy to do so.

All right. Well, with that, I think I will hand it over to [redacted] for some preliminary opening questions, and we'll be switching back and forth throughout the interview. But thank you again.

[Redacted]. Thanks, [redacted].

And thanks again, Mr. Fishman, for joining us.

EXAMINATION

BY [Redacted]:

Q Just for the record, I have a few very basic identifying questions for you. So if you could let us know, where is your current address?

A Menlo Park, California.

Q Okay. And tell me a bit about your educational background. Where did you earn your bachelor's degree?

A UCLA.

Q And any graduate or postgraduate work?

A Yeah. I did a master's at Columbia University, the School of International and Public Affairs.

Q Okay. Excellent.

And as for your work history, what is your current employment and your most recent previous employment?

A I am currently a co-founder of a company called Cinder, which is building a fully integrated stack of tools for companies to use for trust and safety purposes.

Q Okay.

A And previously I worked at Facebook.

Q Great.

And that takes me right into my next line of questioning, which is, I want to talk a little bit about your time at Facebook. What was your title while you were at Facebook? Or did it change?

A I think it was never formal. You know, names and titles at tech companies are vague and elusive things sometimes. "Director of public policy" is probably the most formal of them, but I was also the head of what we referred to as "dangerous organizations," which meant I was responsible for policy around terrorist organizations, hate groups, and large-scale criminal groups in particular.

Q And what were your responsibilities in this role?

A Yeah, that's a good question. So the core role was within what was called the product policy team. The product policy team at Facebook develops internal policy for what can be done on the platform and the various platforms, ideally -- so Facebook, Instagram, and to some extent WhatsApp.

But the way that I sort of executed that role was often serving more as a coordinator among the various elements of the company -- both to build the policy and to try and execute on those policies.

So much of what I did on a day-to-day basis was bring together the operations teams, the appropriate engineering teams, and the legal teams with the policy team, in order to try to better understand the problems, then construct policies, and then execute them to the best of our ability.

Q Okay. And to whom did you report in this role?

A I reported to Monika Bickert and Neil Potts.

Q And you said that this organization sat within the product policy team. Did you also interact with the public policy team?

A Yes, I did.

And I should, you know, just for the sake of completeness -- you guys are Congress -- when I was originally hired, I reported to Monika and Alex Stamos before he left, but that was a very brief time period, and for the most part I was reporting to Monika.

We did interact with the public policy team, and there were times when I would, you know, go talk to Members with public policy counterparts and, sort of, corresponding teams and efforts around the world, go talk to nonprofit groups and civil society, et cetera.

You know, a key part of the role was to be sort of an in-house expert on how terrorist groups, hate groups, and others operate.

Q You also mentioned engineering, operations, legal. Are there other teams that you frequently interacted with closely?

A I mean, certainly the public policy team, engineering, ops, legal. You know, occasionally the research teams. Those would be the core ones, though. I mean, there were always occasional reasons to talk to, you know, some of the sales teams or something if something strange came up, but usually that was more out of the ordinary.

Q And around the period of the 2020 election and the lead-up to January 6, 2021, did you interact with the civic integrity team at Facebook?

A Yes. We did work with the civic integrity team to some extent, but not as closely as I think many folks would assume.

But there were processes in the run-up to 2020 and afterwards that I'm sure you guys have seen. I think some of the exhibits mention the IPOC, which was, sort of, Facebook's mode for coordinating across the company on key issues. And we would participate in those processes, but usually in that sort of space.

Q And how did your work intersect with -- were there specific topics where your work intersected most closely with the civic integrity team, where you would typically engage with them?

A Well, you know, the work of my team and the work of the, sort of, dangerous organizations, quote/unquote, "XFN" -- "XFN" is the term that Facebook uses to talk about, sort of, cross-functional groups, right? So, you know, in U.S. Government terms, it would be the interagency, right?

And we were very much focused on violence -- real-world violence -- and groups, organizations that were planning, contributing to, or engaged in real-world violence, and trying to do whatever we could to keep those groups from using Facebook. And so that was our, sort of, core focus.

The civic integrity team was focused on a broader set of issues related to election security -- misinformation related to elections, voting processes, those sorts of things.

And where that would come together was, sort of, when either you'd get an effort by known hate groups or terrorist groups or something around the world that would be targeting an electoral process, which is something that happens in a variety of places, or you would get real threats of election-related violence where some of these sorts of entities would come into play.

But it's important to understand that there are a bunch of other policies that also touched on violence that we did not control as directly. So policies against violence and incitement, policies against hate speech -- my group had some influence on those, but we did not control them as directly as we did the "dangerous organizations" piece of it. And I think that's an important distinction to understand.

Q Thank you for that.

What about other integrity teams at Facebook? I understand there were also, for example, like, a core product integrity team. Civic integrity was just one integrity team. Were there other integrity teams that you interacted with?

A Yeah, I mean, we interacted with many of the integrity teams. But, to be honest, you know, internally, in a place like Facebook, sometimes those distinctions are clear, and sometimes it's all just kind of a jumble.

But, certainly, right, the core integrity team is building basic tools and basic systems that other teams would plug into and utilize. And so, in order to understand what those capabilities were and to try to influence the roadmapping processes -- which means the process by which they prioritize and build certain capabilities -- you would plug in and sort of hope that your voice was heard in those sessions.

Q And when the civic integrity team was restructured in December of 2020, what was your reaction to that announcement?

A Honestly, I didn't have much of a reaction. It was widely understood internally at Facebook that the civic integrity team was not always getting along with other similar teams. But my group didn't have enough of, sort of, an engagement to have a real strong view about it, to be totally honest.

You know, I think it was not terribly surprising, understanding how Facebook works, if you've got, sort of, enough rumor spilling around the company that there is some sort of rub between teams. Like, tech companies love to reorg, and Facebook's no different.

So I think it wasn't terribly surprising, but I honestly didn't have a strong view about whether it was a good idea or a bad idea.

Q And from your vantage point, what were those points of strife between civic integrity and other teams? With whom, and over what issues?

A You know, I hesitate to get too specific, because I think my understanding is limited. But my general impression was that the civic integrity team wanted to be more forward-leaning in terms of trying to impact, you know, some of the recommendation systems, some of the others, and that they had sort of rubbed partner organizations the wrong way because they didn't always bring those folks in -- I think that would be the critique you would hear from others.

And the civic integrity team -- the great advantage that they had, and the great advantage that any product team has at a technology company, is they've got the engineers built in. And so, you know, everybody at a tech company orbits around the engineers at the end of the day, because that's how you actually get things done when push comes to shove.

And so I think that was a great advantage that they had. That's a really good thing in some ways. But the disadvantage is that, when you've got that inherent capability within your group, you don't have an incentive to go out and collaborate, you know? And I think that there was the impression among other teams that they were not as collaborative as they should be.

But, honestly, you know, I'm relaying, effectively, hearsay when I say that, because that was not my experience. I just didn't have that much of it.

Q Were there particular teams or individuals that you've heard these complaints from?

A I don't know. It's been 18 months. I don't know. That was a grumble that I heard. Like, I really wouldn't want to throw out a name, because I'm not sure that it's accurate.

Q Okay. That's fair. It's been 18 months.

A Yeah.

[Redacted]. [Redacted], did you have any other questions in followup to any of those points?

[Redacted]. Yes. Thanks, [redacted].

First, I wanted to announce that [redacted], who's a counsel with the select committee, has joined the Webex.

BY [Redacted]:

Q And, Mr. Fishman, that was all, I think, extremely helpful context.

And I wanted to follow up on one point that you raised earlier about your role as a coordinator amongst various elements of Facebook with regards to violent organizations, and I was wondering if you could expand on that. Was that more of a coordinator role for discrete crisis events involving violent organizations or sort of a prospective policy coordinator?

A Yeah, so I think there's -- that's a good question.

When I first arrived at Facebook, there was a team called -- what's the operations team? It's amazing how quickly you forget some of these things. There was a discrete team within the operations organization that would respond to crises. Because there's constantly a crisis. On Facebook, with the global scope, there's constantly some disaster happening that needs to be investigated and understood, and there was an operations team that does that.

There was no real cross-organization crisis response team. And so, when I first arrived in 2016, we would play that role if it was a terrorist attack.

This was a bad system. It was not a good way to do things, because we were looking at longer-term changes, trying to systemically address things, trying to be subject-matter experts. And, unfortunately, on a global scale, there's very often some terrible thing happening somewhere in the world, and because of Facebook's scope, there was often some connection to Facebook -- whether it was planned on Facebook, or there was just, you know, video, or one of the attackers had an account and we needed to understand whether or not something had happened on Facebook. You know, whatever it was, there was something to be done.

And so that gets very distracting, and so the policy team built something called the strategic response team. The strategic response policy team then became sort of the central node for coordinating crises, for lack of a better term. And a lot of that is trying to ensure that all of these different organizations are able to plug in and provide a perspective and communicate up to leadership in a way that's reasonably coherent.

I don't think that Facebook, during my time period, ever got really, really good at that, but they got way better during the, sort of, 5-plus years that I was there, and that's because of that strategic response team.

So my direct role was really broader coordination among those teams on longer-term projects, and coordination in particular with the legal side on investigations that were more sensitive, where not everything would be sent up in an email or would go to a very select group.

So the other thing, though, that I want to make sure that I'm saying to you guys, because I think it's important and I hope it will help shape your understanding of the way that these policies work, is that there's a framework that social media policies operate in -- sort of an ABC: actor, behavior, or content basis.

And I see [redacted] shaking his head and nodding his head. You've heard this before.

But what's really important about the policies that I ran is that they were actor-based policies. They were built on a determination that specific actors had engaged in behavior, either on platform or off platform, such that they were so high-risk that we didn't want to allow them to be on the platform.

And, in many cases, as in the case of, like, an ISIS or the KKK or, you know, some sort of first-tier terrorist or hate organization, we wouldn't even allow them to be praised on the platform. You couldn't say nice things about them. This is an incredibly blunt policy.

Most of these other policies that we're going to talk about -- not all of them, but most of the other ones that we're going to talk about -- are content-based policies. They're policies that are enforced at the level of the content that is posted on the platform at a particular moment.

The advantage of a content-based policy is that it's enforced in a more targeted way, it's very specific, and it applies to anybody, but it's fundamentally reactive, because it can only be applied after the content has been posted on the platform, after it has been surfaced in one way or another. Whereas an actor-based policy is a blunt instrument in a lot of ways, because you're basically saying this group, this individual, you know, whatever kind of entity it is, may not be here. But it allows you to get ahead of problems.

And so there is a real trade-off between these actor- and content-based policies. And I think when you guys think about hate speech policies or violence and incitement policies or misinformation policies, for the most part, those are all content-based policies, whereas the dangerous organizations policy, for the most part, is an actor-based policy.

And that distinction is a really important one, and it shapes not just your understanding of Facebook on January 6th; I think it's an important distinction for thinking about how these policies work more generally, because they have pros and cons. And one of the biggest is that a content-based policy can be a lot more focused, a lot more targeted, but it's reactive. An actor-based policy is a blunt instrument, fundamentally.

Q And that's helpful.

If you could expand on one point you made. You said that the violent organizations policy was mainly an actor-based policy. Can you kind of explain how it can be both?

A Yeah, yeah. So, like, fundamentally, it's based on sort of a designation of various actors as falling under the policy, right? And so that's what makes it fundamentally an actor-based policy.

The reason why I say both is because of that prohibition against praise of those actors, which I think kind of bridges that gap a little bit between content and actor. Because that means you can't go -- you're not a member of ISIS, I assume, [redacted], but you can't go on Facebook and praise ISIS, right? So, you know, your content, if you were to do that, would be removed on the basis of what you had done there -- the content itself -- not on the basis of you and your affiliations in the real world. That's what I mean. Does that make sense?

Q Yes, definitely.

And I want to let [redacted] follow up on this, but I had one question on your role in crisis situations.

So, at the time we get around to the beginning of 2020, I assume the strategic response team is in full swing in responding to these situations as they arise?

A Yeah, the strategic response team is in full swing. But, by this point, there's also this IPOC process, which is sort of a standing coordinating mechanism. I mean, I forget when they took the IPOC off. You guys probably can get that from Facebook. I don't remember. Before 2020. But, I mean, it really launched, you know, a year-ish ahead of time. I don't remember exactly.

And the idea of the IPOC was, sort of, you know, standing levels of operations teams that are kind of there, ready to go, to solve problems in real time, and regular, sort of, coordinating sessions producing reports.

And so that IPOC process was sort of a more formalized and structured -- you know, think of it as kind of an ad hoc, not-physical fusion-center concept, right, where you've got people from all of these different organizations coming together, doing stand-ups -- you know, in its most intense form, doing stand-ups a couple times a day -- saying, here's where we are with this problem, here's where we are with this problem, here's where we are with that problem, et cetera.

Q So, within that structure, what was your role during crises, as head of dangerous orgs?

A Yeah. So, you know, it depends on the crisis, right? So there were IPOCs related to, like, COVID misinformation and things like that, in which I didn't really have much of a role.

But in the run-up to 2020, especially towards the end, you know, I would jump in and sit in through stand-ups when I could, and some on my team would do that as well, so that we made sure that we understood what the rest of the company was doing, and if we had something to contribute, we would contribute.

[Redacted]. We'll get into more of that later, but first I wanted to pause to see if [redacted] had any questions on these points before we move into some of the more nitty-gritty.

[Redacted]. I just have one clarifying followup question.

BY [Redacted]:

Q The strategic response team, was that the team led by Molly Cutler?

A No. Well, so, yes. Yes and no. So there's two strategic response teams at Facebook, which is worthy of satire. So one is the strategic response policy team that was built under Monika Bickert, originally by Neil Potts.

And then Molly built another one, another strategic response team, that was focused on sort of larger-picture, you know, kind of higher strategy in the sense of, like, bigger-picture company initiatives, whereas the strategic response policy team was more focused on, sort of, as a core mission, coordinating in a crisis.

And they did other initiatives as well. When, sort of, you know, key geopolitical questions and things like that would come up and it wasn't totally clear who the owner would be, they were really a useful, kind of, owner to coordinate things like that.

Q Thanks. I'm glad I asked.

A Yeah.

[Redacted]. That was the only thing I wanted to follow up on.

[Redacted]. Great. Thank you.

BY [Redacted]:

Q So now we have a few questions on the violent organizations policy, and we'll build on there.

So, if, in the first instance, you could walk us through the violent organizations policy at Facebook during your tenure and how it evolved throughout your time.

A Sure.

So I think there are -- you know, if you look at the policy today on the, sort of, community standards website, there are three tiers. When I first got there, it was really that first tier, which is terrorist groups, hate orgs, large-scale criminal groups.

And the policy there basically says you may not praise, support, or represent any of these entities on Facebook or Instagram, right? So you can't be ISIS and be on Facebook for any reason -- to talk about the weather, to talk about your dogs, to talk about what you had for lunch yesterday, whatever it is. Like, you're not allowed.

And the policy is really far-reaching in the sense that it also disallows praise. This is a really blunt instrument policy.

Now, one of the reasons it's so blunt, part of it is that Facebook made the determination that it didn't want to allow any sort of fostering of these kinds of groups on the platform, to the extent that it could. But, also, it's really hard to distinguish at scale between what constitutes praise and what constitutes support.

That may matter in the sense that, for some of these groups, there may be -- and I think it's vague, frankly, but there may be legal obligations, because of sanctions law, to remove support and representation of these groups.

And yet, sort of, neatly distinguishing between praise and support -- where that legal obligation may come into play and where it may just be somebody saying, "I like Hezbollah," or whatever it is -- this is a very difficult thing to do at scale. Like, the scale of social media platforms leads to blunt policy, because highly nuanced policy is very difficult to apply at scale.

And so this actor-based policy, that's what it was. There were processes internally for identifying different groups, going through a designation process not so different than what the State Department or the Treasury Department does, I think. And so, via those processes, that's what we did.

I'm sure you guys have seen, you know, the leaked version of that list that came out, that The Intercept got their hands on. Be a little bit careful with that. I don't think it's totally comprehensive, but it gives you the idea.

So that's one.

Then there was this point where there were a number of actors that were pretty problematic. They were doing things that were pretty hateful on the platform, they qualified as hate actors under the policy, but they were also really part of a political debate. And so there was an effort to try to identify some of those folks, remove them from the platform, but allow some praise for them.

And so this is folks like Alex Jones, Laura Loomer, people like that, where there is a social, a civic discussion about these people. The decision was made not to allow them onto the platform. And I actually was on paternity leave when the Alex Jones decision was made, so I can't speak to that one specifically, but I think it's the important one.

But the effort there was to try to take this very blunt instrument of a policy, which is the dangerous organizations policy, and give it a little bit more nuance, right? To say, look, these are people that we don't want to allow on the platform. They're doing all sorts of nasty stuff. They're being very careful in some cases to try to walk right inside of the policy lines, but they're doing really nasty things off the platform, and so we don't have a lot of confidence that they will not. And so the effort was made to designate those folks. They aren't allowed to use the platform, but supporters may praise them, et cetera. Because you don't want to have the broader impact.

And then there is, sort of, the last effort, which was during 2020, and this was an effort to do two things. One was really to define two new categories of entities that could be designated under the broader dangerous organizations policy.

One was militarized social movements. And militarized social movements have two components. One was, sort of, organized militias. The second was groups that were organizing violence in the midst of protests. And so, you know, that wound up catching groups like the Oath Keepers, for example, in addition to a bunch of other, sort of, militia groups in the United States and some, you know, even less specific kinds of organizations that were advocating certain militarized tactics in the midst of protests.

The important thing to understand about this process is that those policies did not remove every individual member of those organizations. They were designed to disrupt the ability of such organizations to organize on Facebook. They were not designed to remove every individual member of those organizations.

So that is a different policy, a different approach, than, like, what happens to the KKK or even the Proud Boys, which was designated years earlier as a hate org, or ISIS, for example.

And one of the reasons there is that, with many of those militia groups, it was clear that the threats of violence were increasing as the summer of 2020 went on, but it was also clear that many members of these groups had never participated in acts of political or social violence. And we did not want these groups to be able to operate, to organize on the platform; at the same time, we didn't want to wipe off a bunch of folks that, in many cases, we were not aware had broken a law, who individually hadn't threatened violence, and who, in some cases, didn't think of their organizations as preparing for violence. And so we built a new category that was slightly different than some of the preexisting ones.

The other piece that was established in the summer of 2020 was the so-called violence-inducing conspiracy network policy. The violence-inducing conspiracy network policy was ultimately what allowed Facebook to address QAnon more substantively than it had in the past. And this was similar. It didn't endeavor to take down every member of QAnon, everybody that said something nice about QAnon. It was focused on disrupting the organization of that movement.

And it required a new standard for designating these kinds of entities, because, you know, QAnon was particularly difficult because it's not a formal organization, and so there were a lot of efforts to try to understand, you know, how the -- I almost said "group," but I think it's explicitly not a group -- but how the movement operates and connects with each other. And so we were able to disrupt it.

Now, the thing that I think is important when you think about these policies that were instituted in, you know, sort of -- when? -- August 2020, is that it's not that those were the first time that Facebook had taken action against these movements. That was the first time the dangerous organizations team and an actor-based approach had been used against these movements.

Violence and incitement policies, hate speech policies, all of those other kinds of coordinated harm policies -- all of those other policies applied to them previously. The challenge was misinformation policies in the case of QAnon. And that's how Facebook attempted to deal with those groups to that point. The problem is that those policies, because they are content-based, were fundamentally reactive, and they were not having a significant impact.

And so, when push comes to shove, you know -- and I'm very proud of that team of people -- the team that gets things done at Facebook is the dangerous organizations team. And when threats really get serious, that's who you would call.

Q Thank you for that walkthrough.

I think I'll take the general platform policy areas in the order that you first brought them up.

So, first, I guess, my question is, how did you go about choosing which groups to designate as falling under the general baseline dangerous orgs policy?

A Well, I mean, you set up a set of criteria, and then you apply that criteria. And that criteria is all based around violence, you know. So, you know, you can look at Facebook's definition of terrorism. I don't remember it exactly off the top of my head, but it's a pretty standard, academic-y definition of terrorism, for example. And you break that down into a series of discrete signals. You assess organizations based on those signals, and if they meet the signals, you designate. And same thing with hate orgs; same thing with criminal orgs.

The hard part is, there are so many of these entities in the world, right? Like, Facebook has designated far more organizations than the U.S. Government -- far more White supremacist organizations than all of the Western governments combined. And so there is a dynamic here where, you know, that effort moves really -- in governmental terms, it moves pretty fast. In, like, civil society terms, I think, you know, civil society folks would say it moves pretty slow.

But that process is designed to try to keep people's personal views out of it. It's designed to identify a set of criteria and assess those groups as honestly and rigorously as possible -- which is sometimes frustrating, because there are times when, you know, you're pretty sure a group falls into this category, but you don't actually have the evidentiary basis for it, and so you can't make a decision on those grounds.

And those evidentiary standards that Facebook uses are not as strict as what you'd have in a Federal courtroom, but they are real, because those decisions are real and they matter. And so I do think people take it seriously.

Q And so this process would start with your team?

A For a designation of, like, a terrorist group or a hate group?

Q Yes.

A Not the entire time. No, not the entire time. Like, that responsibility -- we fully took control of that in 2019? Yeah. Before that, it was a group called the organic content policy team, which was sort of a sister team to ours.

Q So, from 2019 onward, if it started with you, where would it go after your team had suggested a group?

A So we had a sort of formal process with committees where we would sort of nominate and prioritize groups to look at, generally based on, you know, what we estimated was the threat of forward-going violence, right?

Because ultimately what you're doing here -- the thing that's different about the dangerous organizations group is that they're fundamentally indexed on the real possibility of real-world violence, not what is the nasty thing somebody's going to do on the internet.

And so, when you're prioritizing -- I mean, like, of course you're interested in, like, do they have a big presence on Facebook, right? Like, that raises the, sort of, prioritization as well. But you're also really looking at, like, am I really afraid that this group is going to be engaged in violence soon? Right? Like, do we need to go after these guys soon because we're really worried about the possibility that they may leverage Facebook for violence in the real world? And, you know, I'm not going to suggest that's a perfect process. It's not. There are some judgment calls there on how you try to prioritize those things. It's hard.

But, yes, from some point in 2019 forward, we ran that process. But we still included a number of other, sort of, decision-making bodies from across the organization.

Q And my last question on this --

A Yeah.

Q -- first bucket of policy: Once you had taken action against a group, how did you follow up to measure efficacy of the ban, decrease in content?

A Yeah. It's hard. And I don't think we ever fully did a great job of this when it comes to, like, group by group.

Q Okay.

A We certainly had a sense overall of -- and, you know, you can look at the prevalence measures of dangerous organizations content on Facebook. But I think that's the closest that we ever really got.

And that's one of the -- you know, I mean, I think those were the kinds of things that the data science and the central integrity teams would put together, these high-level, really important sort of measurement efforts, but they don't give you the granular information that you really want. You know, oftentimes, in more ad-hoc ways, we would try to understand, you know, how frustrated were these groups after we had removed them, you know, could they come back, how evasive were they having to be over time, and pieces like that.

Q Okay.

A But it was not as scientific as you'd like it to be.

Q So that's helpful.

And I guess I'd like to understand better, how did you identify signals that groups were using to express their frustration or their ability to organize?

A We were almost always looking off of Facebook.

You know, when I say that the dangerous organizations team was looking at these kinds of entities, one of the fundamental insights -- right? So, when I got to Facebook, the original mission was deal with ISIS, which was still very much in its heyday and all over the internet.

The thing about ISIS is that -- and what people misunderstand, I think, a lot of the time when they talk about Facebook in particular is, they think this is where organizations organize specifically. And that's not always true. It's certainly not the core organizational forum for groups like ISIS and now for many of the White supremacist groups. Oftentimes that's Telegram. And certainly for ISIS it was Telegram.

And so, sort of, our key assumption within the dangerous organizations realm was, if you want to understand the threat to Facebook, you have to look off of Facebook, because we're oftentimes not upstream of everything else. Oftentimes we're downstream, right? So, if you see something on Telegram, it may not be on Facebook yet, but it will be by tomorrow, right?

And so there is a notion of kind of an intelligence model of understanding the actual full-scope threat environment in order to understand the risk that these groups pose. And that's the approach that we took within the dangerous organizations world. That's the approach that I think some of the other teams that are focused on, like, state espionage and cybersecurity, things like that, use.

But that is not a standard way of thinking for most kinds of integrity teams at Facebook or elsewhere in industry. That's a pretty unique way of thinking about these problems. And we really sort of, you know, I think, helped to really think about that.

But that notion of looking elsewhere to understand what the threat is going to be, not just what it is right now, so you can try to get ahead of things rather than be reactive, was a really key insight for us, and it informed not just the, like, operations of the team but, you know, the ethos of the team.

And so we would look at -- we had vendors that were looking at these things and reporting stuff in. And, you know, we didn't do a lot of direct collection ourselves, which hurt my old academic soul, you know, who used to chase jihadists around the internet. But I think, you know, we found ways to do it, and it really informed the way that we would prioritize and think about these problems.

Q So I'm grateful for that explanation.

And I wonder what other platforms you were looking at. You mentioned Telegram. What other platforms were you looking at to mine threats as they were coming down the pike?

A Yeah, I mean, Telegram became an increasingly important one, but, you know -- you know, I know you guys are actually focused on the run-up to January 6th. Vendors were providing information about a wide range of other platforms -- you know, Telegram, Parler, Gab, stuff that was happening on Twitter, you know, things that were getting posted on YouTube, et cetera, and, you know, TheDonald.win, et cetera. And those were all things that we were aware of and really concerned about.

Q How about 8kun?

A Oh, sure, in some cases. Yeah. Yeah.

Q And in terms of TheDonald.win, did you trace the rise of -- were you actively tracing the rise of that platform following the --

A No, I wouldn't go that far. I mean, I think we were -- you know, I'm listing off those various platforms, but really what I'm doing is sort of trying to give you a representative sample. What I'm getting to is a point that I want to make to you guys -- we would get around to the question, so I'll just say it to you -- which is: When I became really concerned about the prospect of violence on January 6th -- you know, I think I probably said this to the committee when I worked for Facebook, but I'll say it to you again, because it was true then and it's true now -- it was mostly because of content that was not on Facebook, and it was because of the content that we were seeing other places.

The difference there is that, coming from the dangerous organizations perspective, we had come to assume that if we started to see it in other places, it posed a direct threat to our platform. And that is an important difference in perspective when you think about, like, the kinds of metrics -- really valid, important, useful metrics -- that were used to understand when do you turn off the break-the-glass measures. Right? Because there you've got really thoughtful people dealing with very difficult tradeoffs, but basing those decisions primarily on what's happening on Facebook. And, from my perspective, you can't just look at what's happening on Facebook.

And so that's the point that I think is really important here. But it's not an easy one, and it's a really tricky one and a difficult one. But I think that that is where -- you know, that history and that framework of assessing threats based on wherever you can get information about them, that's how the dangerous organizations team approached this. That's how I certainly approached these kinds of things. That is not a standard way of approaching these things. But I think it gives you a better perspective and a better way to, sort of, measure risk more generally.

Q Thank you.

We're jumping ahead a little bit, but you brought it up.

A Yeah.

Q So when did you see this content off site that was getting you concerned about the violence on January 6th, the potential for violence?

A I think this -- well, the story of the run-up to January 6th, both I think in the real world but also on Facebook, starts with the Virginia Citizens Defense League's Lobby Day -- on January 19th, 20th, 21st, somewhere around there -- in 2020 in Richmond, Virginia.

That was a day when we were really concerned about the possibility of violence. A bunch of the Boogaloo groups were chattering really nastily. You know, it's a gun rights lobbying day at the Virginia statehouse. And we went on high alert, because we were really concerned about it, mostly because of some of the Boogaloo movements.

And nothing wound up happening, though the FBI wound up arresting a couple of guys from a White supremacist group called the Base that allegedly were planning to go to this event and fire off some shots in order to start a shoot-out between police and, you know, the assembled crowd.

That, to me, was the sort of hallmark. And we thought about that a lot going through the year of 2020. We thought about that incident quite a bit. So it started then: like, hey, what kind of threat does a really angry crowd that's potentially armed pose, and what about small groups of people that want to incite that crowd?

At the same time, you're balancing that against the notion that -- this is the United States of America. Free speech matters. People have the right to lobby their government. It is legal to open carry in front of the Richmond statehouse and lobby on behalf of gun rights. And the, sort of, social responsibility of a tech company is to empower speech, civil speech. These are really difficult, kind of, balances to strike.

But I think that, you know, that certainly shaped my thinking. It was the beginning of how I was really thinking about 2020. That was the kickoff. And -- you know, I mean, I could google it; I don't remember the exact date -- but it was almost a year exactly from the ultimate inauguration of Joe Biden. And so I think that was a key one.

Moving forward, I think we were tracking very closely some of the efforts around the election, specifically where there was an incident with some folks that had gone to Philadelphia that posed a potential violent threat the night of the election. We understood the range of different arrests of various figures up to that point. Obviously, we were aware of the, sort of, Boogaloo-related arrests and others earlier in the year in 2020.

And then the violence in D.C. on December 12th, you know, was nothing like what happened on January 6th, but it was certainly something that we were aware of. And we'd been tracking, you know, the Proud Boys in particular, because of a range of violence in Portland and the Pacific Northwest through that year. So we knew that anytime there were real calls for lots of these folks to get together, that was sort of concerning.

But we also knew that, when it came to the most acute of the groups that were gathering on January 6th, we had policies in place that were causing them a lot of trouble, right? We were constantly taking down Proud Boys. They were mad. The Oath Keepers -- some of the individuals were still there, but they were not using Facebook as a primary coordinating mechanism in the way that they had.

And so, you know, there was some comfort in that, especially from the dangerous organizations perspective, where we're looking at the worst -- you know, the core organizations that may cause violence -- not the more, sort of, general misinformation-driven, you know, frustration with the election.

But immediately after the first of January, when it became very clear that the rhetoric had changed across the web, in my mind, several of us coordinated and said, look, this looks really concerning.

And so, you know, I called several meetings with the team outside of the IPOC process to really make sure that we had everything buttoned up, that we were coordinated with law enforcement, that we were having those kinds of conversations, because we wanted to be ready. We didn't know; you never know. We'd sort of gone into overdrive for the Lobby Day a year prior, and then nothing happened. But we wanted to be ready.

Q So, when you convened -- this is all really helpful.

When you convened the meeting outside of IPOC, that was early January 2021, when you thought the violence potential for January 6th specifically was worth considering?

A Yeah, so we did two things. One is, we called sort of a separate meeting, mostly with the legal team, to make sure that we were coordinated with law enforcement. And --

Q Who else was involved in that meeting?

A Oh, I don't remember. It was a small group. It was mostly me and the legal team. Look, those folks are pros. They really are. They're people with backgrounds in the FBI and everything else. Like, I had no concerns that they weren't coordinating with law enforcement, but, you know, it's my job to ask, right, and to make sure, and so I did.

But the other thing we did was we reached out to the IPOC folks -- because, remember, January 5th was the Georgia special election, right? And the IPOC was really built around the elections, right? And so, like, there were levels to the IPOC, and there was a high-level, like, alert for January 5th. But we had to go to them and say, "Hey, you need to be on high alert for January 6th too." And they did. We got them to go on high alert.

But that's what I mean. Like, from the dangerous organizations perspective, because we're looking off of Facebook, we understood more intrinsically the way that the threats of violence were manifesting around January 6th. And that's not something that was as obvious if you were just looking at Facebook.

And I know some people will say, "No, it was obvious," and they'll point to specific pieces of content. But there's so much of that crap on Facebook all the time that it's hard to pick out the noise -- you know, the wheat from the chaff. But we were able to do that because we were concerned, and primarily driven by the concerns of what we saw off the platform.

Q So first question: Do you happen to recall a date for that initial meeting with the dangerous orgs team and legal?

A When -- well, I mean, the thing is, we'd been meeting, you know, twice a week since May, you know, around all of these issues. You know, maybe it was April, I don't remember. So it wasn't like this was the first time. But I had called a special session with the legal team to make sure that we were coordinated in the days ahead of January 6th.

Q And what was the nature of your conversation about communication with law enforcement? Did legal let you know that law enforcement had been in contact with them by the 6th, or was there kind of --

A No. What you want to make sure in a situation like that -- no, I don't know that law enforcement had reached out to us. But what I wanted to make sure was, if we get emergency law enforcement requests about January 6th, are we ready to get them? Are we going to make sure they don't slip through the cracks? Are we going to be able to respond? If we see something that looks like the clear planning of violence that we need to get to law enforcement, you know, are we ready to do that?

And those kinds of systems exist at a company like Facebook. They're going to exist at big tech companies -- Facebook, Google, Microsoft, Apple, et cetera. But in a situation like that, where the stakes are high, my job was to check, so I checked.

And they assured me that we were good, which I really never had any doubts about, because those folks, A, had been doing it really intensely for a year, basically, through what was a really -- you know, I don't have to tell you guys -- what was a really intense year in American history, 2020. But they're also people that understood the sort of concerns that we all had about some of the intelligence that we were seeing, and so they were ready.


32

2 [12:05 p.m.]

3 BY-

4 Q Following that meeting, is that the point at which you tried to get IPOC

5 involved in the 6th as well as the 5th?

6 A That is when we -- no, I don't think it was -- I don't remember the order, to

7 be totally honest with you. I remember the -- I remember the -- there was a weekend.

8 I think the 6th was a Wednesday if I'm remembering that correctly.

9 Q Yeah.

10 A I remember the weekend prior to that, looking at some of the material that

11 was off platform, and saying this has really materially changed. The threat is higher than

12 we understood. And we need to make sure that we are really ready. And so, I reached

13 out to the folks running the IPOC. I don't remember exactly what that process was like,

14 and said, Hey, whatever we're doing on the 5th, we should do it on the 6th. Be ready.

15 And really that's a staffing question, right? It's like, you know, it's how many

16 people are going to be on call, who's gonna be there, you know? And as I recall

17 we -- you know, they made the change and kept a bunch of folks on call for the 6th

18 because of those concerns.

19 Q So that would have been around the January 1st/January 2nd timeframe?

20 A Something like that, yeah. I think there was a real -- there was shifts in the

21 plans for the actual protest itself. I think they were going to move from the -- did they

22 move from the east side to the west side of the Capitol? There were permitting

23 changes, right? And there was a shift in tone, and a specificity about the Capitol that we

24 started to see, you know, online. And by on line, I mean, I really -- I know it may sound

25 strange, but when I say on line, I don't mean Facebook specifically. But that raised my
33

1 level of concern. And we'd been concerned throughout December especially after the

2 12th, but my concern went up after seeing that stuff.

3 Q And we touched on this earlier, but were there particular platforms that you

4 remember raising your concern especially?

5 A I remember the stuff from Donald.Win. I remember even some of the

6 rhetoric that was being -- you know, I think there was decent work being done by some of

7 the researchers. And I don't always agree with all of the researchers, but I think there

8 was a number of researchers tracking some of this stuff, you know, that were publishing

9 on it, that those things concerned us. But I remember the Donald.Win. I remember

10 some of the stuff we saw on Parler.

11 We also saw some of the stuff Stewart Rhodes was putting out on the Oath

12 Keepers site that had a little bit of a different tone, suggested some planning and

13 organization that was concerning. But that's the thing, right? Like, you know, you're

14 trying to measure risk in these situations, and you have to make judgment calls. And

15 this was really about being prepared. You know, how high of an alert do we need to go

16 on? And so, we wound up going on a higher alert, and it was clearly the right decision.

17 Q Was it your hope that some of the information you were seeing would be

18 given to law enforcement after your meeting with legal, or did you understand them to

19 already have that information?

20 A No. I mean, I don't think -- I mean, all this stuff was just on the internet,

21 you know? Like none of this was the kind of material that law enforcement needed to

22 get from Facebook, you know? Like, there are things -- there are plenty of instances

23 where Facebook comes across something on Facebook that is clearly a threat, and gets

24 that stuff to law enforcement. Like, those processes are well-worn.

25 But that's not really the dynamic that was going on here. What was happening
34

1 here was the kinds of stuff that the -- you know, all sorts of researchers were publishing

2 on. I don't think this was really special knowledge. It was awareness, and our sort of

3 predilection for looking outside of the platform to understand general levels of risk.

4 Q That's helpful.

5 I wanted to zoom back a few days because you mentioned your risk level kind of

6 being pinged around the New Year, and then in between the December 12th event in D.C.

7 and the New Year.

8 On December 19th, President Trump tweeted about the January 6th rally for the

9 first time.

10 Does that -- do you recall that being an important moment in your risk calculus?

11 A I mean, I remember that we -- we were concerned by all of the -- by all of the

12 rhetoric from various organizers, and sort of provocateurs, including President Trump,

13 that really seemed to really want to rile people up. But -- because we saw the level of

14 anger and the frustration and confusion, you know, genuine confusion by supporters of

15 the former President who, you know, genuinely believed that the election was stolen, and

16 genuinely believed they were losing their democracy. And so, we were certainly

17 worried about that. But I don't remember that to be in particular -- which, you know,

18 again, it was 18 months ago. So some of these things stick in my mind. Some do not.

19 Q Just figured I would ask.

20 A Yup.

21 Q When you did get IPOC, or seek to get IPOC involved in January 6th, could

22 you tell us some of the individuals who were involved in that process or involved in IPOC

23 at the time?

24 A I mean, I don't remember. You know, Molly was certainly a key driver of

25 the IPOC process. And Molly runs a really good meeting in those situations. She's
35

1 really calm and cool.

2 There were lots of folks, you know, lower-level operational folks who, you know,

3 were on sort of -- they were on call, right? And so, they were the ones that were sort of

4 running the actual operation on a moment-to-moment basis for some set period of time,

5 and then they would hand it over.

6 You know, there were like a bunch of those. Rosa Birch, Rosa was doing some of

7 the work for Molly. You know, lots of those folks. I don't remember exactly the

8 process we went to when we said, you know, we needed to -- like we wanted to make

9 sure we were on high alert on the 6th itself. But we submitted it in. We said, Look,

10 there's a bunch of crazy stuff going on. We need to be really ready. It wasn't that the

11 IPOC was going to be gone. It was just gonna be on a lower level of alert after the 5th.

12 We made -- at least that's my recollection of it. And, you know, we endeavored to keep

13 them on a higher level of alert.

14 Q That makes sense. I want to pause here because I've been monopolizing

15 the exchange to see if - have any questions.

16 Thanks, -.

17 BY -:

18 Q Could we back up a little bit, and when you say that you wanted to keep the

19 IPOC on the same level of alert, to make sure their alert level didn't recede -- what is the

20 practical effect of having the IPOC on alert? If they've been alerted on the 5th, what can

21 they do on the day of the 6th to make a difference within that time period?

22 A Yeah. I think what it really comes down to is staffing, because you've got

23 more folks. You know, it's like any -- I don't know if you guys have ever been out to

24 NCTC or seen an ops center, or a military ops center, or something like that, you know, the

25 practical reality is you've got a lot of folks just sitting around a lot of the time, right?
36

1 They're kind of twiddling their thumbs, and they're getting ready, and they're making sure

2 their systems work.

3 But the reality is, they're there for a crisis, which means between crises, there's

4 some slack. Now, in a situation like Facebook, that's -- nobody was really sitting around,

5 because there's always something going on, right? It's just so big, and like, the scope is

6 so wide. But, you know -- so I really do not want to imply some of those folks

7 working -- I mean, they worked their butts off, like, especially the lower-level folks.

8 They really, really work hard, and I'm grateful to them, and I hope other people are as

9 well.

10 But what it really was is staffing, is like how many people you have there ready

11 assigned to this in case something goes wrong, that are able to quickly investigate and

12 triage if a question comes in from, you know -- if there is an emergency request from law

13 enforcement, how many people are ready to go chase it down?

14 If there is a violation on the platform that looks like there's a network associated

15 with it that can chase that down and figure out what's going on, things like that. None

16 of those things are automatic, even though those systems are better than they used to

17 be, they still require people to make smart judgment calls to deal with glitches in the

18 software, to try to communicate that up to leadership in a way that is comprehensible to

19 people that don't understand how all the internal tools work. And so, there's a lot of

20 room there for error.

21 And I think there is a perception outside of Silicon Valley that like tech companies

22 have everything totally wired, and all you have to do is press a button on, like, you know,

23 a black screen with green letters, and all of a sudden all sorts of perfect, sexy information

24 drops out of the sky. But that's not the case.

25 What really happens is a bunch of 25-year-olds go chase down that information


37

1 and query four or five different systems and try to compile it in a way that is reasonable

2 for their leadership so they can make good decisions, and it's hard. And so, you need

3 people on call to do it. I think that's -- you know, you could ask Facebook. I'm sure

4 they would be able to give a better explanation of exactly the differences of the IPOC

5 levels. But what we pushed forward was just a higher alert on the 6th itself.

6 Q So the types of things they would have been able to act on in the moment of

7 a crisis, it seems like we're talking about things like, you know, if there's an act of violence

8 that's being live-streamed, if there's evidence of coordinated inauthentic activities, if

9 there's sort of an outbreak of violative content in response to the events of the day and

10 action needs to be taken quickly.

11 Are those the types of things they would be there to handle in the office?

12 A Yeah.

13 Q Am I missing any kind of obvious things they're trained or prepped for?

14 A No. Those are the big ones. But I think -- remember that this is like a

15 fusion center, right? So there's some people there, their responsibility is to take on the

16 most acute problems; but they also are kind of the tips of the spear for larger

17 organizations that are still doing their things, right? So there's a larger operational

18 organization. There's a larger legal organization that's dealing with law enforcement

19 requests. You know, there's -- you know, there are larger entities out doing their

20 everyday work that may be sort of redirected towards -- certainly towards an event like

21 January 6th.

22 So, you know, I don't want to give the impression that all of the activity that

23 Facebook or a company like it does would occur within that fusion center, right? You

24 know, some of it does. But some of it is also there to represent -- just to share

25 communications with other elements in common.


38

1 Q This might be a good time to move to the events of the day itself in more

2 granular detail.

3 So starting with the morning of January 6th, can you describe sort of the

4 environment inside Facebook, its posture, its readiness?

5 A Yeah. Everyone was really nervous. I think that was -- at least I was really

6 nervous. The -- no, I think there was a real sense of concern over how this would go,

7 because we had seen -- but there was also that sort of moment especially when you are,

8 you know, when you're at a frankly more senior level, and you're in a coordinating role

9 when something starts to happen, you know, you just don't want to get in everybody's

10 way, right?

11 You've sort of like -- okay, there's a bunch of folks whose job it is to deal with this,

12 you want to be there if they need you, but you don't want to get in their way. So I

13 remember the 6th, I basically just cleared my schedule on the 6th to, you know, to make

14 sure that I was there to help. But, you know, I remember being -- there's sort of that

15 moment when you are in a leadership role of any kind where, when things start to

16 happen, you've got to let people go do it, and it can be very frustrating, because you want

17 to go -- you know, you feel a little bit useless for a moment.

18 So I remember watching. I was watching the live-stream on YouTube or

19 somewhere, and the -- I was sitting right here where I am now. And I remember very

20 clearly, at one point, I had a phone call with my colleague, Nathaniel Gleicher. We had a

21 sort of standard Wednesday call with each other, and we're chatting, and so, we had a

22 phone call. And I thought I'd go for a walk around the block. And, you know, I got

23 halfway down the block, and Nathaniel was still at his house or something, and he's like,

24 oh God, they're attacking the Capitol. So I immediately came back. And I remember it

25 very, very clearly, obviously. Before that began to happen, it was rhetoric, it was
39

1 volatile, but it wasn't violent. And then it was, obviously.

2 Q Who was your first call after you got back to your computer,

3 after you got back to your house?

4 A I don't think it was a call. It probably would have been a chat across, you

5 know, to a bunch of -- a chat with a bunch of the cross-functional folks to say, Hey, what

6 the hell is going on?

7 Q You don't have to worry so much about the chronology or the exact order,

8 but who would you have reached out to first? Like which cross-functional teams would

9 you have chatted?

10 A Well, certainly my policy team, you know, and the folks that were tracking

11 some of these things closely, our legal team. And -- you know, to see whether we were

12 engaged with law enforcement, whether we were getting stuff in, whether we were

13 seeing practical threats that we needed to get out the door, and then the operations

14 team, you know. Certainly, at that point, the IPOC, because the IPOC was a central

15 coordinating body. So if something was coming in somewhere weird, like, the hope

16 would be it would make it to the IPOC.

17 Facebook -- you know, tech companies don't like to believe that they're

18 bureaucracies, but they are. So stuff comes up in one part of the company, and you

19 may not hear about it, right? So the intent of something like the IPOC is to try to limit

20 that, right? But where you -- if somebody has something important to say, you get it in

21 there and then it's shared, right? So I don't remember -- I'm sure there was sort of an

22 emergency IPOC convened, you know, stand up, but I don't remember specifically.

23 Q This might be retreading some of the things Mr. - asked about, but who

24 were the immediate points of contact for the IPOC? Who was involved in that that

25 would have taken your -- who would have received your message?
40

1 A I don't know. It would have been whoever was on call at that time. I

2 don't remember.

3 Q What was your message to your team? What did you tell them to do, or

4 what did you ask them?

5 A Well, I think what you ask, you know, the folks that -- like, in a situation like

6 that, you know -- right, there were at least two things happening on the Mall that day,

7 right? One was you had a mass group of folks that had been subject to misinformation

8 around the election itself that were angry, that were there, that were, on some level,

9 primed for violence. And then you had more organized groups that had shown up with,

10 according to various indictments now, with intent to commit acts of violence and

11 prepared on some level for those acts of violence.

12 When you break that down, it was those groups that are more sort of the ones

13 that I would be thinking about more than the masses of people informed by sort of

14 misinformation. Now, obviously, there was some crossover, and we should talk about

15 that crossover and where that gets difficult, but those organizations, you know it's like,

16 Hey, what are they doing?

17 Like, it's not rocket science, right? You know, it's -- we'd been following these

18 things before. We knew the Proud Boys were going to be there. We knew the Oath

19 Keepers were going to be there. So it's like, Hey, what do we know about what these

20 groups are doing now as best you can, and oftentimes, you don't know much, right?

21 You wish you knew more, but especially because we had tried to chase these groups off

22 of Facebook. You know, we didn't always have the most tangible information on those

23 entities themselves. And so, you run around. You chase it down. You see what other

24 researchers are doing.

25 You reach out to vendors and say, Hey, do you guys have anything? And that
41

1 process is pretty hit and miss, you know, especially in a really fast-moving crisis where

2 you're trying to understand what's going on. Of course, you're watching it on TV, trying

3 to see what other folks are, you know -- what's coming up. So, you know, I think that

4 that process is, you know, is a really, you know, dangerous and tricky one. I mean,

5 dangerous is the wrong word. That process is fraught because the information coming

6 in is spotty. It's not always clear how to analyze it. It's not always clear what's

7 important, and you have to sort of whittle that down as best you can.

8 But we wanted to make sure, like, you know, are we missing something here with

9 Oath Keepers on platform, where they're coordinating? Or have we missed something

10 about Proud Boys on platform because you know these groups you kicked off are coming

11 back, right? They always do. And so, you know that there is, as much as you've tried

12 to sort of prevent these things, you know you've not done it perfectly. And you want to

13 see if there is harm that is being facilitated on the platform. And so, you want to talk to

14 the legal teams and make sure that as requests are coming in -- and they started coming

15 in that day from law enforcement agencies -- that we are in a position to triage them, to

16 respond, to provide, you know, to make good decisions about how we respond, because

17 sometimes, you know, sometimes even in an environment like that, requests will come in

18 that are wildly off base, and you want to make sure that everybody in the organization

19 with their individual set of responsibilities is doing them. And in many ways, that was

20 my sort of job in that setting is within the dangerous org writ, are we all chasing these

21 things down? Are the folks that are responsible for gathering intelligence, gathering

22 that intelligence and getting it over to us? Are the folks that are dealing with law

23 enforcement, dealing with law enforcement? And you know, do we have any true crises

24 that we need to, like, throw more resources at? So those are the kinds of things that

25 you're thinking about.


42

1 Q Do any -- so as you have this real-time sort of intelligence gathering going on

2 in response to the crisis, do any -- in retrospect, do any pieces of evidence stand out in

3 your memory? Is there anything important that you discovered on that day that you

4 remember as being important now?

5 A It all kind of blends together. I don't think so on that day, you know? I

6 mean, look, you know, I think, you know, the indictments tell the story now, which is that,

7 you know, many of the people that were there that participated in the attack were

8 informed by misinformation. Facebook was a vector, like many others, for that

9 misinformation. That sort of misinformation piece was not squarely in my wheelhouse.

10 There were some activities by Oath Keepers and Proud Boys on Facebook.

11 don't think we discovered that stuff that day. I don't remember to be honest. But

12 those activities were pretty limited for the most part. And they were in Messenger

13 traffic I think. So things like some of the pre-stuff about where to stay in hotels, all that

14 stuff's in indictments now, you know, that are publicly available. That's the kind of stuff

15 we surfaced about the operations of those groups.

16 And, you know, I think, you know, what's more important was, you know, efforts

17 to try to like, Hey, if we've got people that are claiming --you know, that are posting

18 videos and pictures of themselves inside the Capitol that day, you know, these are

19 things -- these are things we want to probably hang on to, and understand that there's

20 going to be law enforcement requests about that, you know? But a lot of that is, like,

21 kind of the masses that were there. And, you know, as you know, there have been lots

22 of indictments and lots of stuff that's come in, and Facebook provided information in

23 response to many of those requests.

24 Q I have a list of, sort of, individuals I'm just wondering if you could -- I could

25 run through it with you and we could talk about any interactions or recollections you
43

1 have about their -- any interactions you had with them on the day of 1/6, or the

2 immediate days afterward.

3 And I know we've been going for, gosh, an hour and a half. Is that right? So we

4 could look at taking a break after that, if you like, that could be helpful?

5 A Yeah. That would be great. Thank you.

6 Q Okay. So the first on my list is Samidh Chakrabarti, the head of the civic

7 integrity team.

8 What do you remember about him from 1/6 or the days immediately after?

9 A Very little.

10 Q Very little. You said your teams didn't overlap much?

11 A We did sometimes, but not a ton. I think, you know, the civic team was

12 more focused on efforts like break-the-glass measures, and frankly, the sort of really

13 important work to try to protect elections generally, misinformation and things like this.

14 That's really -- you know, I don't want to give the impression at all that just because I

15 wasn't focused on it doesn't mean that it wasn't really important. That stuff's really

16 important, but our focus was violence, like, real-world violence.

17 And I do think that, you know -- and I do think that when you get to those

18 more -- and those groups, and like, so, these acute organized groups that try to use the

19 internet, they are -- they know that you're trying to get them online, so they're trying to

20 evade you, you know? So it's not the same story as, like, the misinformation story of,

21 you know, our recommendations doing this. It's they're trying to use private spaces and

22 do things in ways that are evasive. They're not trying to make a big splash all the time.

23 That was certainly the case with, you know, the Proud Boys and the Oath Keepers

24 and even QAnon, by that point, which was still moreover, but you know, much of QAnon

25 had been -- the really overt stuff had been kind of pretty dramatically disrupted in August
44

1 and October in the run up to the election, and they were still around of course. None of

2 this ever goes away, but it's -- but they were less overt. And so I think that

3 changed -- you know, we were -- there's a very different perspective between a group

4 that is like the civic integrity group and the broader integrity efforts of Facebook, and not

5 just civic, but like, where you're really trying to move big metrics over time, right?

6 Because you're dealing with these mass movements all the time. And a group like Mon,

7 which is very unique, that's focused on very specific small acute, really, you know, the

8 most acute problems. And that is a -- you know, that's why there's -- we weren't always

9 overlapping because we were solving different problems in a lot of ways, or trying to

10 address different problems.

11 Does that make sense?

12 Q That makes sense, and it's useful. I have two follow-ups on that, which you

13 don't have to answer in too much length, but just to help me understand the points of

14 overlap.

15 Were they involved at all in the IPOC, and did your team have any kind of input, or

16 did you give any advice on the break-the-glass measures?

17 A So I -- yeah. I'm sure the civic team was involved in the IPOC. And I don't

18 remember when -- I guess in December at some point, the reorg of the civic integrity

19 team occurred. I don't remember the exact timeline. I'm sure they were involved.

20 The -- we had some input on the break-the-glass measures. But there

21 were -- you know, in the dangerous orgs world, we were already doing a lot of that stuff.

22 If you -- stuff like -- you know, a lot of those things were stuff we already did. So, like, if

23 you were a, you know, user that had, you know, that had posted some dangerous orgs

24 content or things, you were already subject to these kinds of restrictions, in many cases.

25 So, you know, like the break-the-glass -- the dangerous ergs approach was the blunter
45

1 instrument, right? And it's a blunter instrument, but it was applied to a narrower set of

2 users, right? I mean, it's much more narrow. But a lot of those kinds of things -- we

3 started doing that stuff in 2017, and you know, so those kinds of things I think were -- we

4 did have some input on it, but it was less critical for us.

5 Now, where we wanted, you know -- I think the danger is that we recognized that

6 many of the sort of, you know -- the danger is that what we saw, and what we've seen, as

7 a steady movement -- and this is a problem that's going to confound, not just your

8 investigation of January 6th, but it's going to confound social media, it's going to

9 confound everybody that has been a steady movement, is that many of these violent

10 networks are really -- operate now more as networks, right?

11 And they operate using symbols and sort of, you know, in the case of Boogaloo,

12 like, aesthetics as a way to unite themselves rather than organizational structures, that

13 creates real problems for applying policies like the dangerous orgs policies. So you wind

14 up using these other kinds of policies.

15 The challenge there is that those other policies are reactive, right? And that's a

16 real challenge. That's a real problem.

17 So you can try to create, sort of, creative responses to that, like the

18 violence-inducing conspiracy network policy, but, you know, you're dealing with really

19 difficult questions at that point, and extending that in a really broad sense I think is a very

20 difficult thing to do. Dealing with dangerous organizations online is hard conceptually.

21 Dealing with misinformation is way harder, conceptually. But that's kind of where we're

22 headed.

23 And one of the reasons we're headed there is because of the 6th -- you know, it's

24 not the only reason. But one of the reasons we're headed there is because of the

25 success of things like actor-based policies, because they made life more difficult for folks
46

1 to operate in sort of more structured ways. But people aren't stupid, you know, and so

2 they recognize that and they adapt, and they try to evolve around it. And I think part of

3 what we saw, you know -- actually, I don't even want to say in the run-up just to January

4 6th, but I think part of what we're seeing right now generally in the world, and certainly

5 online, is a range of different organizations, not just in the United States, but globally, that

6 are recognizing those challenges, and they are adapting to them. And you know, if

7 anybody says they know how to solve that problem perfectly, they are full of it. But I

8 think, you know, you're certainly pointing to something that's very real.

9 Q I have a lot of questions about what you just talked about that I want to

10 table for later. But certainly, the point about the evolution from organized groups to

11 loser networks and the challenges you highlighted are things that are on my mind and I

12 do want to get to.

13 I suspect, just really quickly, that the next individual on my list, Guy Rosen,

14 probably similarly low interaction?

15 A No. I worked with Guy on a number of different things. So, you know,

16 like Guy was one of those -- Guy was top of the integrity organization. And you know,

17 Guy is somebody who really understood I think the different perspective of the dangerous

18 ergs team. And so, when push came to shove and something was broken

19 bureaucratically, Guy was somebody who could help break those logjams sometimes.

20 Q Did you interact with him on the 6th?

21 A Oh, I don't remember. It wouldn't surprise me. We probably were in a

22 meeting together, but I don't remember specifically.

23 Q Do you remember anything about what he was focused on or --

24 A No, not specifically.

25 Q Okay. Next person I think would be you might have more recollection of,
47

1 Monica Bickert. What were your interactions with Monica like?

2 A Well -- so I worked with Monica -- I reported to Monica for years, and even

3 afterwards, I reported up to Monica. I don't remember. I'm sure I talked to Monica on

4 the 6th. I'm sure we were -- again, I'm sure we were sort of pinging back and forth and

5 in the meetings together, but I don't remember specifically what she was focused on.

6 Q Do you remember communicating any of your communications to her?

7 A Not specifically. You know, I'm sure we communicated, but I don't have

8 any specific recollection.

9 Q How closely did Monica oversee your team? Would you describe her as a

10 hands-on manager?

11 A I think less hands-on as time went on. Mostly because she sort of

12 developed trust, you know? And, you know, a lot of the things that I wound up doing

13 were things that she did before I got there. She was [inaudible] former Federal

14 prosecutor. So she -- part of -- like there is correctly kind of a line between the sort of

15 legal operations of a big tech company, and the sort of community standards, terms of

16 service. Most of the, like, terroristy stuff Facebook removes is not a threat in the real

17 world. It just violates Facebook's rules, right?

18 So the folks that are enforcing that kind of stuff, they don't need to know about

19 the legal wrangling and the sort of sharper tip of the spear where things like that

20 manifest. And so where that stuff comes together is in a few people. I was one of

21 those people. Monica was one of those people. And where you're trying to make sure

22 that there is communication across, and lessons learned across and ways that are safe

23 and that are privacy protective and all of those sorts of things. So she would still do that

24 kind of stuff, but less nitty gritty as time went on.

25 Q Who else did Monica oversee? Like, was most of her writ related to
48

1 dangerous orgs? What were the other areas that she had insight into?

2 A No. Monica's writ was across all of product policy. So that included, you

3 know, everything from information operations and state espionage, child safety, to all of

4 the sort of content-focused teams. And teams -- ultimately teams that were building

5 sort of product-focused recommendations like, you know, what you can now see as

6 the -- what do they call them? The content distribution guidelines, which are now

7 online, which are kind of more generalized principles about content ranking and things

8 like that based on policy principles. So Mon led all product policy, so that's all of that

9 stuff.

10 Q What about after 1/6 when the smoke cleared, and you and Monica were

11 maybe able to have a more focused conversation about what happened, do you recall

12 kind of your next sitdown with her?

13 A I don't -- I mean, I don't. At some point, Mon had some health issues, and I

14 don't remember exactly when those occurred. The -- I don't remember the next precise

15 sitdown. I mean, you know, we would chat regularly, you know, both in formal, sort of,

16 meeting settings but also, if I just needed to let her know about something. I'm sure

17 that I was sending her notes about some of the things we were getting from our legal

18 team, and updates on that kind of stuff. I'm sure that those updates were going at

19 various points, but those communications weren't always terribly formal, to be honest. I

20 mean, sometimes they were. Sometimes they were not.

21 Q Did you ever have a sort of retrospective discussion on what went wrong in

22 the run-up to the 6th, or what could have been done better?

23 A I'm sure generally. But that was not something that Monica would have

24 run specifically, you know? Like there were certainly broader company efforts to try to

25 understand what happened.


49

1 Q Maybe we can return to that. I want to make sure we get through some of

2 these names before -- I'd like to break around 1 o'clock.

3 What about Joel Kaplan?

4 A Yeah. Joel -- so similarly, I'd be surprised -- I'm sure that I must have

5 communicated with Joel that day on some level, or been in meetings with Joel or been

6 on, you know, emails or something with Joel, you know, because we were certainly

7 plugging in what we knew to the IPOC process, and sending updates, you know, and -- but

8 I don't recall a specific conversation with Joel about, you know, the day of or something

9 like that.

10 Q And did Monica report to Joel?

11 A Yeah.

12 Q She did. Okay.

13 What about Molly Cutler?

14 A Yeah. Similar. I mean, you know, Molly didn't always run sort of the IPOC

15 stand-ups. Sometimes she would deputize others for that. But again, I don't know. I

16 don't remember specifically. But like, when it was something really important, Molly

17 would oftentimes run those meetings herself, and she was very good at that. And so, I

18 would have to -- my working assumption, which is really an assumption rather than a

19 recollection, is that she probably ran some of those on the 6th, and immediately

20 afterwards.

21 Q So the IPOC process, was that overseen by Molly's team?

22 A Yeah.

23 Q Okay. They administered those meetings?

24 A Yeah. Yeah.

25 Q On more of the product side, what about John Hagerman?


50

1 A Very little. Yeah. Very little. I mean, sometimes, but, like, not a lot.

2 Q Tom Allison?

3 A Who?

4 Q Tom Allison?

5 A Again, it's a name I know. I know we had some interactions, but not a lot

6 of detail.

7 Q What about Sheryl Sandberg? Do you have a sense of her activities on 1/6?

8 A I'm -- I would be surprised if there -- I mean, certainly, there were

9 communications and updates that would go up to Sheryl and Mark that I probably was

10 CC'd on because we plugged in some information. I don't recall those specifically, but

11 that would have been standard practice.

12 Q And what about Mark Zuckerberg?

13 A Yeah. Same. The -- I mean, it would be pretty standard in any kind of

14 major event like that where there's any nexus to Facebook, updates to very senior

15 leadership that, on something like this, I assume I was CC'd. I don't remember, but

16 the -- like, in many cases, the distribution of such messages would not go very wide.

17 You know, but on a topic like that, I would assume I was CC'd on these kinds of things.

18 But you got to -- I mean, you know -- yeah, I don't recall specifically.

19 Q So there must have been, I think, a series of high-level decisions that needed

20 to be made in the days following 1/6 by people like Mark Zuckerberg, Sheryl Sandberg,

21 Joel, Monica.

22 Do you have a sense of, again, once maybe starting the evening of the 6th going

23 into the 7th, what their priorities were? What were their -- the sort of priority zero

24 things, or priority 1 things that they saw as having to be decided or done?

25 A In those -- you know, again, I don't recall specifically. I think in those


51

1 scenarios, the overriding P0, generally speaking, was to cooperate with law enforcement

2 and make sure that we were providing information that could protect lives. That

3 commitment, you know, like that commitment from senior leadership at Facebook is

4 pretty good. It really is. And people I think -- I've got my gripes about decisions made

5 at Facebook, but on that count, I think they really held right. And they did it right.

6 But, you know, the thing that everybody always wants to know is, like, what the

7 hell happened on Facebook, right? That's what the questions are always. Like, what is

8 the nexus on Facebook? Was this planned on Facebook? Was this organized on

9 Facebook? How was this organized on Facebook? To what extent? Et cetera. And

10 then identifying and removing, you know, violative material that relates to the event,

11 right?

12 So there were decisions made about some of the remnant Stop the Steal groups,

13 and groups like that that were removed in the wake of January 6th. There was the

14 removal of the Patriot Party Network, which was essentially a mirror de facto of some of

15 the Stop the Steal movement.

16 And, you know, I remember the outlines of those decisions. I don't remember

17 the exact timeline, but in the wake of -- in the wake of January 6th the -- but I think, you

18 know, there is this sort of deal with law enforcement, real-world harm, and then there's

19 what is the Facebook implication, like how do we clean this up? And then, of course,

20 there is the, like, how do we deal with all the questions we're going to get? Like what

21 do we know?

22 That is, you know -- at a big company like Facebook, that is both a question of,

23 like, what is our communication stance, like how do we protect the brand, but also, how

24 do we answer questions when we don't know all the answers yet in a way that's not

25 gonna turn out to be erroneous, right? You know, that is -- that takes up a lot of time
52

1 inside of Facebook, is like how do we make sure we've got all of the information so we

2 don't say something we're going to find out was wrong in a month even though when we

3 said it, we thought it was right, you know?

4 Q Is my recollection correct, correct me if I'm wrong, that Stop the Steal was

5 sort of banned -- that type of content was banned from Facebook in the days after 1/6?

6 A So the -- so I think -- this is an interesting one, I think when you think about

7 Stop the Steal, people use that phrase to mean lots of different things. And it's

8 important, I think, to be specific to about them. One, there's the real-world

9 organization Stop the Steal led by Ali Alexander and others, right? Real world, 501{c)3

10 or 501{c)4, whatever it is. There is that sort of, you know, specific group. There was

11 an original Stop the Steal group on Facebook that sort of showed up in the days after the

12 election that was actually removed for -- I forget what it was removed for -- misinformation

13 or coordinated harm or something, I don't remember exactly.

14 Then you've got a bunch of other groups on Facebook that were kind of called

15 Stop the Steal or sort of a derivative and maybe they were actually organized by the same

16 people. Some of them probably weren't, but they used -- sort of named themselves

17 that way. Then you've got the, like, phrase, Stop the Steal, right, which lots of different

18 groups used.

19 Then you've got this sort of more general belief that the election was stolen, and

20 people sort of getting at that idea using a lot of different sort of language and

21 terminology, including Stop the Steal.

22 And so, I do think that when people say Stop the Steal, and they talk about Stop

23 the Steal in the world more broadly, and when they talk about Stop the Steal on

24 Facebook, they're not specific about what they mean. That lack of specificity leads to

25 confusion sometimes. People say, Well, Facebook, why didn't you take down Stop the
53

1 Steal, and Facebook will say, Well, we took down the one Stop the Steal group, but not all

2 these other Stop the Steal groups, not necessarily the phrase, not necessarily targeting Ali

3 Alexander and all of his manifestations, et cetera.

4 So, I do think that there were efforts to take down some of these things. Some

5 of the other groups came down because they had other violations, like violence and

6 incitement, and misinfo and hate speech and other sorts of things. But the sort of more

7 aggressive effort against Stop the Steal, more generally, did happen after January 6th

8 where it hit a much wider swath of the movement, but even then, it didn't take down all

9 misinformation related to the election.

10 And I do think that this is an important thing for folks to understand. Which is, I

11 thought Facebook should be more aggressive in taking down Stop the Steal stuff prior to

12 January 6th. But I don't think that that would have prevented the protest on the Mall.

13 I don't think it would have prevented violence on January 6th. And I think that that's

14 really, really important. I think there were some groups that probably should have

15 come down that didn't.

16 But this sentiment of the election being stolen was being promulgated by so many

17 sources across the MAGA movement, both on Facebook and off, that if Facebook

18 were to have tried to take that off generally, it would have had to take down much of the

19 conservative movement on the platform, far beyond just groups that said Stop the Steal,

20 mainstream conservative commentators. And without taking really dramatic action like

21 that, I don't think Facebook would have -- any action Facebook would have taken would

22 have limited the protest on the Mall.

23 And I think that that's really important, because such action would have been so

24 sweeping, that I think it would be extremely difficult for Facebook to defend, both on its

25 existing policy grounds, but also more generally. So while I think Facebook should have
54

1 been more aggressive, I don't think that the sort of -- the steps that it might have taken in

2 scope would have prevented the violence on January 6th. In fact, I don't think it could

3 have taken any steps that would have prevented the violence on January 6th.

4 Q Did Facebook have a policy against election delegitimization for 1/6?

5 A Facebook had policies against misinformation around voting and things like

6 that. But it was focused on voting, not sort of ex post facto delegitimization if I'm

7 remembering correctly.

8 Q So, you laid out the different varieties of Stop the Steal groups. Was Stop

9 the Steal, the real organization, ever designated as a dangerous org, a violent conspiracy

10 network, any of the types of entities you covered?

11 A No. It would not have qualified except perhaps after January 6th because it

12 hadn't induced violence to that point.

13 Q You mentioned earlier in our conversation that sometimes for the category

14 of violence inducing conspiracy narratives and militarized social movements, that you

15 would look to offline signals to see if an entity was on the verge of violence, and that, in

16 some cases, then you would take action. Is that correct?

17 A Sort of. Like, we would -- so, for the militarized social movements, for the

18 militias, we would look to see whether their sort of sole purpose, their purpose was to

19 organize for the purposes of violence, right? And I forget the exact criteria, but that was

20 the gist of it, right? You're looking for things like, is this an armed group? Are they

21 preparing in, sort of, militarized fashion? Do they state they're there to assert control in

22 lieu of law enforcement or something like that, right?

23 In violence-inducing conspiracy networks, you're looking for conspiracy that is

24 bound by specific symbols, or, you know, has some kind of boundaries, and you're

25 looking for a track record of violence associated with that conspiracy that is directly
55

1 attributable to those concepts. And the -- you know, QAnon met that standard. Stop

2 the Steal, prior to January 6th, I think, would not meet that standard.

3 Q What about some of the incidents that you mentioned between the election

4 and January 6th, things like the Million MAGA March, the violence around State houses,

5 in Philadelphia and Arizona, some of those events? Did you understand those to be

6 related to Stop the Steal in any way?

7 A Well, I think the -- I mean, I think the Million -- is that December 12th?

8 Q I believe so. It was in December.

9 - That one's November 14th, but then there's also the event on

10 December 12th.

11 Mr. Fishman. You know, keeping track of them all is difficult. I think for some

12 of those, we had already identified that smaller groups and more discrete organizations were

13 involved. So some of the militias -- there was some stuff at the State House in Arizona

14 that we were able to -- that were -- you know, that we addressed via some of the Arizona

15 militias that were not allowed there. December 12th, it was, you know, the Proud Boys,

16 which had been a hate group since 2018. So there were -- you know, so we already had

17 sort of policy pieces in place around some of those events.

18 - did you have any follow-up questions on that?

19 - Yeah. I had a couple.

20 BY -:

21 Q Mr. Fishman, this is, I think, a good way for us to get a better understanding

22 of what you saw the threat landscape as, and Facebook's response leading up to the 6th.

23 I'm a little curious to dig in more about how you think Facebook could have, I guess, in

24 your words, not done anything more in advance of the 6th to prevent some of that

25 violence. And I know we were discussing sort of the nebulousness of Stop the Steal at
56

1 sort of varying layers of organizations?

2 But at any point, was there a conversation about the need to curtail particular

3 narratives of election fraud that would be tied to this kind of violence?

4 A Yes. Yeah. There were discussions around that, and there were

5 discussions around taking broader action against elements in the Stop the Steal network.

6 I argued for some of those. And I think, you know, and I think Facebook should have

7 taken some of them. You know, I certainly was not the only one. Other people were

8 making those cases as well.

9 But the -- but, I think even if those steps were taken -- the issue is, I think if you

10 are a citizen, or you are a company, you have a responsibility to take actions, even -- so

11 your tools are not abused, even if that's not going to prevent the final outcome. And I

12 felt like there were steps we could have taken against a broader swath of the Stop the

13 Steal network on Facebook, that, in my mind, would have been appropriate. But I don't

14 think that that would have changed the final outcome. I think they should have -- those

15 are steps that should have been taken anyway, because I think it's the right thing to do.

16 But I think it's a big leap to suggest that would have changed the final outcome.
57

2 [1:04 p.m.]

3 BY -:

4 Q That point taken.

5 Can you describe some of the steps that you were advocating?

6 A Yeah. I mean, we were pushing for broader removals of some of the other

7 Stop the Steal groups, you know, that, you know, under coordinating harm policies or -- I

8 guess the coordinating harm policy was sort of the key one.

9 I mean, that's the thing, is, like, dangerous organizations policies are, like, last

10 resort, you know? And, frankly, many of us were uncomfortable with the way that we

11 had to apply that even to QAnon, because we felt like there were other policies that

12 should've had actor elements. And when we had to come in and resolve that issue in

13 the end, like, you know, I certainly felt some frustration that it hadn't been addressed

14 more effectively earlier using some of the other policy tools that were available.

15 So I think there was -- you know, there was clearly a decision made at the

16 company to try to address Stop the Steal through some of these other content-based

17 policies. Because an actor-based policy is a really big, blunt object, and there's just no

18 way to do it without collateral damage, even if you try to focus it, right? Because we

19 talked about trying to -- when you add nuance, you know -- like, nuance at scale winds up

20 leading to mistakes. It just does.

21 And so, you know, I thought we should take more aggressive action against some

22 of those Stop the Steal groups based on other policies, but I also -- I want to be really

23 clear that, like, I don't think that would've changed the outcome on January 6th.

24 Q Yes. And --

25 A We should've done it because it would've been the right thing to do.


58

1 Q Right. Right.

2 And so just trying to distill that --

3 A Yeah.

4 Q -- a lot of what it seems like you were advocating for might've been more on

5 the actor policy side, and the response you got from Facebook was that the preferred

6 route was to stick with some of the content moderation policies, content-based policies?

7 A Yeah. In general, that's a way to be more focused and targeted.

8 Q Okay. Yeah.

9 A And actor-based policies do get blunt. I mean, you know, like, you

10 know -- but there are things you can do. Like, if you see that, like, the refugees from a

11 group that gets disabled sort of re-coalesce in another group or something like that, you

12 can understand it essentially as a recreation of that other group, things like that.

13 But those are places where, you know, the area gets gray. And, in general, I

14 think moderating groups, in particular, is a very difficult thing to do, because oftentimes

15 it's not the administrator of a group that is doing the, like, worst things within that group;

16 it's other accounts. And so I think, you know, that's a tricky thing for policymakers to

17 deal with -- "policymakers" meaning at tech companies, not policymakers in the D.C.

18 sense.

19 But I do think that there should've been stronger action against some of them, but

20 I'm skeptical that it would've changed the outcome.

21 Q That's all really interesting.

22 And one thing that's occurred to me as you're talking is the interplay between

23 some of these groups that were more comfortably in the dangerous orgs

24 universe -- Proud Boys; I guess Oath Keepers was somewhere in the middle -- and then

25 also Stop the Steal.


59

1 And I wonder how the increasing intersection of those various groups -- for

2 example, on December 12th, you have a lot of Stop the Steal folks speaking at the same

3 rally as Stewart Rhodes, basically all reading from the same playbook. And I wonder

4 how that confluence affected your view of what should be done and how that was

5 received by others at the company.

6 A It was certainly something that concerned me, right? Because when you've

7 got a movement like Stop the Steal that is, sort of, aligning itself with groups that you

8 have made a determination are dangerous, then that certainly increases your concern

9 about those groups.

10 And we absolutely felt that. That's a pretty typical way of thinking if you are

11 focused on organizations and you're thinking about the way that influence is transmitted

12 within a political movement.

13 At the same time, you have to be really careful about guilt by association.

14 And some of those principles are difficult.

15 Now, my job was to be paranoid, and I did it pretty well. And so we looked at

16 those relationships and saw danger.

17 You know, when you think about -- you know, the difference between that Lobby

18 Day, the Virginia Civil Defense Lobby Day, a year prior was that the organizers of that

19 event said, "Hey, we do not want violence." They were very clear about it. "We're

20 here to talk. We're here to yell. We're here to complain and scream and maybe

21 intimidate a little bit, but yell and scream at our lawmakers to get them to do what we

22 want. We don't want anybody showing up and shooting anybody."

23 But you didn't hear that from the Stop the Steal guys. You know, they stood up

24 next to folks that we knew had a track record of violence, and they weren't trying to draw

25 that line about, "Hey, this is actually just a peaceful protest. That's all this is. Because
60

1 we want to express our frustration with" -- you know, based on an erroneous belief that

2 the election was stolen.

3 And that's a very reasonable judgment that I'm describing, a process of making a

4 judgment call about risk and perspective. It is a much harder one to build into a

5 replicable policy that you're going to make judgments about over and over again.

6 And I think that when you are grounded in -- fundamentally grounded in fear of

7 real-world violence, it's easier to sort of say, we need to make some judgment calls

8 sometimes. And I think, in this case, we were right. But I also think it's a very difficult

9 call.

10 And I also think that even under more aggressive actions against Stop the Steal I

11 don't think it would've changed things. I think that this messaging was so prevalent

12 across the conservative media ecosystem, both on Facebook in groups that never said

13 "Stop the Steal" or they said "Stop the Steal" deep in groups, you know, in comment

14 sections and things, and across the broader, you know, radio/TV/digital ecosystem, that

15 those folks would've been on The Mall that day regardless.

16 And if they hadn't attacked the Capitol, then they just would've been wrong.

17 And so, in my mind -- you know, which is sort of a basic thing to say. But when I think

18 about what really led to violence that day, it was a bunch of very confused and misled

19 people informed by misinformation that they got on Facebook and they got on Twitter

20 and that they got on FOX News and they got in other places, coupled with these more

21 acute organizations that showed up looking to incite.

22 And that's a kind of dynamic that we'd seen the possibility of with the Virginia Civil

23 Defense League's protest a year prior. It didn't actually happen then, but it did happen

24 on January 6th.

25 Q Yeah.
61

1 I know our break is long overdue, but I had a couple followups on that.

2 You mentioned a couple times the crowd being informed by misinformation that

3 they got on Facebook and other places. And I'm curious about how Facebook's posture

4 towards the underlying misinformation about the election changed as you saw more

5 potentially violent organizing by groups like the Oath Keepers and organizing that toed

6 the line between not being violent but also not condemning it more broadly.

7 A Well, you know, I don't know. I mean, it's a tricky one to answer, -.

8 I think where that question came to the fore was the decision to roll back some of

9 the break-the-glass measures. That decision was informed by real data on platform, on

10 Facebook, that suggested that various kinds of concerns were being reduced.

11 So it's not as if, you know, Facebook made that decision willy-nilly. Now, again,

12 like, this wasn't the decision I would've made were I king, but they made that decision

13 based on that kind of data. And they made that decision because all of those

14 break-the-glass measures -- and, you know, I looked at -- you guys sent over one of the

15 documents describing some of the break-the-glass measures. I don't know if that's all of

16 them. I don't know, you know, what version of that document that is.

17 Many of those measures come with, you know, I think they probably called it the

18 "collateral damage," or, you know, they come with false positives, where you're

19 impacting perfectly legitimate speech because you're making decisions based on

20 classifiers. And machine-learning classifiers always have errors. They have false

21 negatives, and they have false positives. You can lower the thresholds at which you

22 make a decision using that, which is largely what the break-the-glass measures did, but

23 that meant that lots of legitimate speech was getting suppressed.

24 And that's the kind of decision that you make when you're really, really concerned

25 about real-world violence or some other kind of tremendous danger. But there are real
62

1 costs to it. Now, I would've accepted some of those costs for a bit longer because of the

2 concerns that I had about the possibility of violence on the 6th and all the way up through

3 January 20th, frankly. But there was no decision that wouldn't carry important costs.

4 That's an important cost that would've come with some false positives.

5 And what I really hope is that you guys reflect -- even in situations where I

6 disagree with the ultimate decision that Facebook made, I think it's really important to

7 reflect the tradeoffs that they were facing and the -- because it's going to happen again.

8 These questions are going to come up again, and those tradeoffs are going to come up

9 again. And Facebook, other social media companies, policymakers in government that

10 ultimately are going to be using artificial intelligence more often going forward, they're

11 going to be facing those tradeoffs.

12 And if there's one thing we need to learn from this, it's that those tradeoffs are

13 real. Measuring them and balancing them against each other is hard. And hopefully at

14 some point we can come, you know, to some broader agreement about how you weigh

15 those tradeoffs in making these decisions. Like, that would be a -- I don't know if it's a

16 good outcome, but it would be a piece of a good outcome for some of the work that you

17 guys are doing. And I hope that we get there, which is why I'm going on this soliloquy to

18 make the point, because it's something that I think really matters.

19 And that algorithmic policymaking is -- the social media companies are canaries in

20 the coal mine. And this is the future, and those decisions are not easy and they're not

21 good. And so they come with hard tradeoffs.

22 And so, even though I disagree, I would've been more conservative in the sense of

23 concern about the possibility of real-world violence and accepting of some of the false

24 positives, I think that it is a -- and I would've done that primarily because of my concern

25 about real-world violence based on off-platform information that I don't think Facebook
63

1 as a whole adequately incorporates. But I also think that the course of action that I'm

2 recommending -- that I recommended and that I still think was probably the right decision

3 is not an easy one to make because of the false-positive risk.

4 Mr.- Well, I know Mr.- has a followup.

5 Mr. Fishman. Guys, can we take a 5-minute break?

6 Mr.- That's what I was going to recommend. Well, we can take a longer

7 break. I think -- yeah, we should take a 20-minute break for lunch. Does that sound

8 good, everyone?

9 Mr.- I think 20 minutes sounds good. It is a little past due our

10 lunchtime here, Mr. Fishman. You might want to refresh your coffee or something as

11 well.

12 So let's take 20. Maybe we can reconvene at 1:40. That's 22 minutes. 1:40

13 eastern time.

14 Mr. Fishman. Okay, sounds good. And I'll just stop video and mute for now?

15 Does that sound right?

16 Mr.- Perfect. We'll do the same.

17 And we're going off the record at 1:18.

18 [Recess.]

19 Mr.- We can go back on the record at 1:42 p.m.

20 And thank you for coming back for another session, Mr. Fishman. And I think, at

21 this point, I'll hand it over to Mr.- who had some more questions for you.

22 Mr.- Thank you.

23 BY MR.-

24 Q And thank you again, Mr. Fishman, for your time today.

25 If I could get exhibit C up? Or, that's exhibit No. 5. We changed our
64

1 nomenclature.

2 Okay. So you should be able to see on your screen this copy of an internal

3 Facebook report with the title "Stop the Steal and Patriot Party: The Growth and

4 Mitigation of an Adversarial Movement." Is that coming through?

5 A Yep.

6 Q Okay. Are you familiar with this report?

7 A It certainly is generally familiar, but I wouldn't say that I -- you know, I

8 suspect that I read it at some point, but I don't remember it in detail.

9 Q Were there many such reports? My understanding is that Facebook

10 operates often in a decentralized manner. Were there a lot of retrospectives like this?

11 A Yes, there are a lot -- I mean, it's -- yes.

12 Saying that Facebook operates in a decentralized manner is a very understated

13 way of putting it. Every organization and function has ways of, and desires to, sort of

14 learn lessons. It's a really good thing about the culture in a lot of ways, but it also means

15 that those retrospectives oftentimes don't actually speak directly to each other, because

16 they define terms differently, they gather slightly different data. And that can be really

17 problematic.

18 And it's something that I warn folks -- and I warn not only you guys, but I warn

19 others that are looking at some of the leaked Haugen documents, just to be really careful

20 in understanding how those terms are defined, where the data came from, et cetera.

21 Because it doesn't necessarily match up with what would be seen as canon inside the

22 company and certainly at the leadership level.

23 And I'm not saying that's the case here. I just -- as a general concern.

24 Q That makes sense, though, and thank you for clarifying it. It's not always

25 transparent to outsiders, and it is an important point.


65

1 I wanted to start by asking you about one point this report makes that I thought

2 might touch on your area of expertise.

3 The authors claim that one of the most effective and compelling things that they

4 did when studying Stop the Steal was to look for overlaps in the observed Stop the Steal

5 networks with militias and hate organizations.

6 A Yep.

7 Q Do you have any insight into what they meant when they wrote that?

8 A Well, I don't know for sure. It may have been -- membership overlap I think

9 is probably the most natural. Now, what I couldn't tell you is whether -- sometimes

10 you'd look at membership overlap and you'd look at including deleted accounts. So just

11 because there was membership overlap or something like that here doesn't mean

12 that -- you know, it may mean that, like -- and this is purely hypothetical, but it might

13 mean that there was a Stop the Steal group and then there was some Proud Boys group,

14 the Proud Boys group came down, a bunch of the Proud Boys -- like, identified Proud Boys

15 within the group came down, but there was membership overlap between the two

16 groups, something like that, right? Those are the kinds of things that you might look for.

17 And I do think that one of the, sort of, adversarial pieces that you see is, you

18 know, accounts that are part of a network but that can't be removed because they don't

19 meet some policy threshold, they'll oftentimes try to come back within a setting or

20 operate within a larger setting where they feel like they're safer for some reason. And

21 that might be a smaller group, but it also might be a group that for whatever reason isn't

22 coming down for policy purposes.
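[A minimal sketch of the membership-overlap check described above; the account IDs and groups are hypothetical, and a real investigation would also fold in records of already-removed accounts:]

    # Overlap between a movement group's member list and a designated
    # group's member list (including members whose accounts were deleted).
    movement_group = {"acct_01", "acct_02", "acct_03", "acct_04", "acct_05"}
    designated_group = {"acct_04", "acct_05", "acct_99"}  # e.g., a removed militia group

    overlap = movement_group & designated_group
    print(f"{len(overlap)} shared members, "
          f"{len(overlap) / len(movement_group):.0%} of the movement group")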

23 Q When a violent or dangerous org gets designated and groups and pages

24 come down, what is the threshold for taking accounts down?

25 A Administrators or moderators would come down as soon as a group or page did,


66

1 but not necessarily every member of the group. And you can imagine, in many cases

2 you've got journalists, researchers, you know, in other settings that are not actually

3 members of the organization. So the, you know, membership in the group itself is not

4 always indicative of membership in their, like, actual, real organization. And so it really

5 depends.

6 Oftentimes what you'll do is you'll go through and review the users of that group

7 to see if they themselves indicate that they're members of an organization or something

8 along those lines.

9 Q And then when you do that review, is it then sort of the next step to remove

10 them? Or is there a series of, sort of, policy checkpoints that have to be met before

11 removal of confirmed members?

12 A It depends, right? So, you know -- and, again, like, I can't speak to

13 Facebook's policy today, right? But a user that is a member of a tier-one hate group, for

14 example, or ISIS, right, if you get a member of ISIS on Facebook and they say, "I'm a

15 member of ISIS," their account comes down. Right? Like, you can't be ISIS on the

16 platform. You can't be the KKK on the platform. And so, if you are representing

17 yourself as such, then you would come down.

18 The other thing that happens sometimes is they don't represent themselves as

19 such but, because of who they are, they will have content violations on their profile.

20 And so sometimes that fan-out, you know, manifests in different ways depending on the

21 circumstance.

22 Q But this wasn't the case for T3 organizations, which included MSMs [militarized social movements] and

23 violence-inducing conspiracy networks. Is that right?

24 A That's correct. Just to be clear, that was disclosed publicly when the policy

25 was announced.
67

1 Q Right, yes. That's still available on the website today, I think.

2 A Yeah, I think so.

3 Q Yeah.

4 So are you generally -- at any point after 1/6, are you aware of an intersection

5 between known militia and conspiracy groups and either the organizers or membership

6 of the Stop the Steal groups?

7 A I mean, there was certainly -- there was certainly, you know, membership

8 overlap to varying extents with some of those Stop the Steal groups, you know, and the

9 Patriot Party groups. And I believe this document details -- you know, as you suggested,

10 this document details some of that.

11 And those are the kinds of things that, to us, were quite concerning from a

12 dangerous organizations perspective.

13 Q When you said earlier that dangerous organizations is an actor-level policy, I

14 understand what you mean by that, but I wondered if that ever extended, then, to the

15 implication that accounts are a form of actor that needed to be sort of monitored and

16 enforced against. Did Facebook have an inclination on that question?

17 A Yes. I mean, so there are definitely mechanisms where, you know, if

18 accounts represent themselves as -- well, so this manifests a couple different ways.

19 One is that sometimes an individual is designated under the policy, right, like

20 Hitler or the Unabomber or somebody like that, right, where they're not clearly

21 associated with an organization but individuals can be designated.

22 And then there are other folks that were sort of brought under but in less, sort of,

23 extreme cases, the Alex Joneses of the world, where there's a lot of nastiness but not

24 necessarily a direct track record of violence.

25 And then you've got situations where you've got leaders of designated groups or
68

1 members of designated groups, and if they are understood to be that, then they can

2 come down as well, right?

3 And, now, as a practical matter, right, like, if you are a low-level member of many

4 different kinds of organizations, that's not something that Facebook is going to know

5 unless they say it, you know, generally on Facebook, right?

6 Now, if it's a prominent leader of an organization, if it's, you know, Ayman

7 al-Zawahiri, you know, you know who they are, you can look out for that kind of thing.

8 And folks representing themselves as Ayman al-Zawahiri -- even though, you know, I don't

9 think he personally had a Facebook account, right? But if it's a, you know, Joe Schmoe

10 that's a member of, you know, even a noxious organization like the KKK, if they're not,

11 sort of, putting that front and center, you know, Facebook's not going to know. How

12 would they?

13 Q Was recidivism a problem when accounts were taken down?

14 A Recidivism is always a problem. Like, recidivism is a huge problem. It's a

15 huge problem not just for dangerous organizations but for everybody else, right?

16 And so there are efforts to prevent recidivist accounts. Dangerous organizations

17 definitely used some of those. Some of those things were developed originally by us.

18 You know, I think spam and some of the more sophisticated state-actor teams developed

19 some of those kinds of techniques that were then borrowed and adapted.

20 But the thing that I would say is that very, you know, smart, resilient actors that

21 want to beat that stuff, they're going to find ways to do it. Like, anti-recidivist efforts

22 are good, they raise the bar, they make life more difficult, they can make it seem like it's

23 not worth it to operate on a platform and drive people elsewhere, but they are

24 incomplete solutions.

25 Q Who was the first line of defense against recidivism, at least in your policy
69

1 area? I mean, was that your team? Was there someone proactively looking for it, or

2 was it handled on a sort of as-it-arises basis?

3 A No. I mean, we -- there were some standardized anti-recidivism things that

4 had been built. Like, we built some in the dangerous orgs world years back.

5 And they were kind of -- you know, the challenge is, when you build something,

6 you're -- like, when you're a subject-matter-focused team, like the dangerous orgs

7 network, you can build something, but oftentimes you're sort of holding it together with

8 duct tape, with digital duct tape, right? And if you really want to build something that's

9 scalable, that is flexible, that is easy to update, do those kinds of things, then you really

10 need to push those things into more centralized engineering teams that oftentimes take

11 longer to build something, but they'll build something that's more robust. And those

12 are the kinds of things that the central integrity team would build. And so we would do

13 that.

14 But, also, we ran a number of SNDs, strategic network disruptions, against some

15 of these groups, right? Proud Boys, Oath Keepers, a range of others. You know, we

16 ran them against ISIS. And Facebook announces those occasionally. But those kinds of

17 efforts would be more targeted investigations of a network, attempt to take the network

18 down as a whole, or as much of it as you could, you know, sort of gather in and

19 understand as part of the network. And then you can, sort of, enroll some of those in

20 more targeted recidivism programs that some of the more specialized teams built.

21 And those are good. They're still not perfect. And, you know, if somebody

22 decides they're going to go get a new device, for example, they can make another

23 account, you know? And there's not a lot Facebook can do about that. So recidivism is

24 a real problem.

25 Q Were any of those strategic network disruptions ever targeted toward any of
70

1 the actors involved with the initial Stop the Steal groups or the others that percolated

2 after that takedown?

3 A I don't recall. They certainly -- there were network disruptions dealing with

4 the Proud Boys, the Oath Keepers, other militia groups, Three Percenters. I don't recall

5 on Stop the Steal.

6 Q Okay.

7 Earlier, you talked about the tradeoffs related to false positives. I was

8 wondering, when you do enforce against accounts in this space, are there ever

9 discussions about tradeoffs related to voice, to preserving an individual's voice on the

10 platform?

11 A Always. Yeah. Of course. It's one of your great fears when you're

12 dealing with these kinds of situations, is that, like, you know, the purpose of a social

13 media platform, whether it's Facebook or Twitter or whatever else it is, is to, you know,

14 maximize voice. That's one of -- has always been one of Facebook's values. You want

15 to empower people to speak. It's one of the great promises of the internet, is that it

16 gives people that don't necessarily have a prominent platform a platform.

17 The dangers that come along with that are manifest, right? That some of those

18 people are going to use that platform for real harm. And drawing the line between

19 those that, you know, may have contentious ideas and those that are going to use a

20 platform to advance real harm is not always easy. And so, you know, all of these

21 policies are efforts to try to do that, but they are imperfect.

22 Q And what were the views of the senior leadership that you interacted with

23 on this question? Did they ever -- did they justify any decisions during the election

24 period, 2020 into 1/6, based on these concerns about voice, this purpose?

25 A Sure. Yeah. You know, I mean, voice is a core value of Facebook.


71

1 And so, you know, this is one of the risks of keeping the break-the-glass measures going,

2 for example, because they reduce voice for some, and they don't just reduce it for the

3 bad guys, right? That's the problem, is, when you've got algorithmically adjudicated

4 policies, you are, by definition, reducing voice not just for the ones that you want to get

5 but for other people who are not necessarily doing anything wrong.

6 And that is a -- you know, that's a very, very difficult -- you don't want to make

7 that choice, but that is the choice you have to make at scale. And it's the kind of choice

8 you make when you feel like there is acute threat.

9 But certainly that's the kind of thing that -- you know, Facebook doesn't want to

10 reduce the voice of people that are expressing -- I shouldn't characterize. I don't speak

11 for Facebook anymore. My view is that leadership of Facebook doesn't want to

12 suppress anybody's voice, so long as they are expressing views in a reasonably

13 constructive way and aren't, sort of, coordinating real-world harm and real-world

14 violence.

15 And I think the trick here is that, clearly, the Stop the Steal movement did

16 contribute to real-world harm and real-world violence, but they were also reflecting a

17 genuine, misinformed view by lots of Americans that the election had been stolen, and

18 expressing that frustration, as crazy as it was, is protected speech. And I think that that

19 puts that narrative in a very difficult place for leaders of a social media platform.

20 Now, you know, when you put that all together, from the perspective that I'm

21 coming from, I think, you know, there was real possibility of violence, and that's how it

22 played out. But I do think that it's a tricky judgment.

23 Q So what you said makes sense in the instance of account takedowns, right?

24 You don't want to necessarily remove someone's voice from the platform.

25 Many of the break-the-glass measures have been characterized as soft


72

1 interventions, as more about things about how Facebook steers content around the

2 platform.

3 Did Facebook make a distinction in its commitment to voice in areas about the

4 speech acts of individuals versus areas in which Facebook demoted or amplified certain

5 points of view, certain narratives, certain pieces of content?

6 A Can you repeat the question, I'm not sure I totally understand the

7 distinction that you're making.

8 Q I'm asking you to characterize how Facebook leadership applied this

9 principle of voice when thinking about soft interventions, about things like, say, reducing

10 or, let's say, demoting content that's likely to be hate speech or likely to be violent

11 incitement in the news feed, or --

12 A Right.

13 Q -- making groups less findable or recommendable or discoverable because of

14 the content there, while not taking the content down.

15 A Yeah.

16 Q So the hard versus soft intervention distinction, how did Facebook apply

17 voice there?

18 A Yeah. So I think that -- I mean, this is a very tricky question. All social

19 media platforms are wrestling with it, to some extent.

20 I think what they were trying to do, what social media platforms in general are

21 trying to do, is serve up content in an environment where a user -- there's no way a user

22 can see everything on the platform. It has to be organized somehow. On Facebook,

23 it's primarily organized by your friend network, the groups that you've joined, the pages

24 you've liked, the stuff you've searched for, right, by signals that users are putting in.

25 But all of that does get sort of mediated through an algorithm that has to select
73

1 which posts from your friends, which posts from the groups that you like, all of that kind

2 of stuff, and you're probably not going to see everything. And so there is an effort to try

3 to rank what's going to be most useful for somebody.

4 This is the same instinct that drives -- you know, people say, well, it should just be

5 reverse chronological order. And the assumption there is, stuff that's more recent is

6 more relevant. And what Twitter and Facebook and YouTube and everybody else are

7 doing is they're saying, well, how do we get better than that? Recency is not the only

8 thing that's relevant. It's also who are your friends, et cetera, and then something about

9 the content of those messages.

10 And so what Facebook released are these, you know, content distribution

11 guidelines, which I think you guys have probably seen. They're pretty high-level. But

12 they basically say two things. They say that, at some level, on a classifier score,

13 if something is, like, borderline violating, right, if we think it's close to violating, we may

14 down-rank it. So it might not be hate speech, but it's pretty nasty stuff; we're going to

15 try to make sure that we don't, you know, promote that stuff, you know, more than we'd

16 like. So that's one of the things that they've said.

17 And then the second thing they say, --, is what you pointed to, which is, our

18 classifier says there is some percentage likelihood that this is hate speech, right, and

19 we're going to down-rank it, you know, based -- we're going to down-rank content that

20 hits that likelihood level, right? And that's what was changed oftentimes in the

21 break-the-glass measures, is the likelihood.

22 Now, the problem is, if you're saying, we're going to down-rank something if it's

23 50 percent likely that it is hate speech, that it is the thing that we want to

24 down-rank -- because we don't have the capacity to review it all to see if it's hate speech,

25 and we're worried about the damage that it might do, so we're going to reduce its
74

1 visibility, et cetera -- that means that half the things you're down-ranking aren't hate

2 speech at that confidence level.

3 And that has an impact on voice. Because that other half, it might not be close

4 to that. It might just be -- that's the problem with algorithms, is that sometimes when

5 an algorithm is wrong -- like, when human beings make mistakes -- because human

6 beings make mistakes when they review stuff all the time, right? Everyone always

7 blames the algorithms, but human beings make mistakes too. But human beings tend to

8 make mistakes in ways that sort of make sense, right? Like they missed satire or they

9 were moving too fast or they just pressed the wrong button, you know, like we make

10 mistakes. Sometimes the algorithms make mistakes in ways that are really hard to sort

11 of understand what led it to a bad decision.

12 And so, if you're going to take action against those things based on that

13 probability, you're going to be -- fundamentally, you're going to be down-ranking material

14 that in an ideal world you would not be down-ranking, you would not be sanctioning in

15 any way. And there's no way not to, at scale, when you're using an algorithm.

16 And I think that's the tension with the break-the-glass measures, is that that

17 problem is always there. It's inherent in using algorithmic mechanisms to down-rank

18 content.

19 What happened with the break-the-glass measures is that, because of the general

20 circumstance, Facebook's leadership said, you know, we're going to accept more false

21 positives during this particularly concerning time period, but that we want to get back

22 and don't want to have that impact on voice over a longer period of time.

23 And so, you know, I think the context suggested that the risk remained during that

24 period, but I think that basic tradeoff is a really, really difficult one. And, frankly,

25 nobody knows exactly -- like, nobody knows exactly where to draw those lines.
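[A worked version of the arithmetic above, under the assumption of a perfectly calibrated classifier; all scores and thresholds are hypothetical:]

    # Down-ranking at a score threshold: with calibrated probabilities, a post
    # scored 0.50 is non-violating half the time, which is the "half the things
    # you're down-ranking aren't hate speech" point made in the testimony.
    scores = [0.95, 0.85, 0.70, 0.55, 0.50, 0.35, 0.20]  # P(violating), per post

    def downrank_stats(scores, threshold):
        acted_on = [s for s in scores if s >= threshold]
        expected_false_positives = sum(1 - s for s in acted_on)
        return len(acted_on), expected_false_positives

    for threshold in (0.90, 0.70, 0.50):  # break-the-glass measures lower this number
        n, fp = downrank_stats(scores, threshold)
        print(f"threshold {threshold:.2f}: {n} posts down-ranked, "
              f"~{fp:.2f} expected to be non-violating")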
75

1 Q In your view, how accurate are those classifiers? What would be a

2 reasonable threshold?

3 A Oh, that's -- I mean, I don't know, --. Like, I think we need -- you guys

4 should -- there should be a congressional commission looking at classifiers, not just social

5 media classifiers but, like, AI generally.

6 Like, I mean, I really mean this. There -- like, we -- maybe not a congressional.

7 It shouldn't be congressional, actually. But we need a national

8 civil-society-government-industry effort to define ethical standards for the use of artificial

9 intelligence. Like, we really need this.

10 And it should not just be about social media. Like, social media is one way we

11 should think about this, but it should be, you know, facial recognition by police

12 departments; it should be targeting decisions by autonomous weapons; it should be,

13 what do we do for real estate classification and its impact on civil rights.

14 Like, there are so many areas where we don't have ethical guidelines for the use

15 of artificial intelligence when there are inherently going to be false positives and false

16 negatives, and every industry has to make it up for themselves with no sort of

17 overarching framework.

18 Now, there are people out there thinking about this stuff, but I just -- this is a

19 debate that is going to increasingly enter the policy arena. It has done so through the

20 lens of social media is where it gets attention. But this is part of the 21st century

21 and -- or the 20th -- what century are we in? -- right? It is part of our policymaking

22 reality going forward, not just in the realm of social media.

23 And I just -- like, I could pull some numbers out, but, honestly, I just think that the

24 takeaway is that we actually need a much deeper investigation and discussion about this,

25 where, like, a lot of stakeholders come to the table, so that, in the future, social
76

1 media -- in part so that social media companies can say, look, we are abiding by the

2 guidelines that have been set by some commission, rather than just we made it up

3 ourselves. Like, in the most well-intentioned way, they still made it up themselves, and

4 that worries people.

5 Q As part of this multistakeholder process, which is speculative, does Facebook

6 have a responsibility, then, to be transparent about the classifiers, how they're applied,

7 their impact, any kind of assessment? Is there a need to sort of pull back the curtain on

8 these kinds of questions? Which, to date, I don't think they have really done in a

9 meaningful way.

10 A I think that -- I think social media companies do have a responsibility to be

11 more transparent about what these, you know, classifiers do. But, at the same time, I

12 think that responsibility needs to be applied broadly across industries. And I don't think

12 that it's right to say that Facebook specifically has that obligation, distinct from other

14 platforms.

15 Q Do you think the platforms, broadly speaking, have that responsibility?

16 A Yes, I do.

17 You know, I don't think that we should expect every little detail. I mean, you

18 know, Elon Musk says he's going to open-source the Twitter algorithm, and I think what's

19 going to happen if they do that, it's going to be totally anticlimactic and uninteresting.

20 And I think that's mostly the case here for Facebook, with the exception of, you know,

21 what are the guidelines and the circumstances in which you're going to alter things

22 because the context is so different?

23 And I think those are kinds of things that platforms ought to be transparent about.

24 What are the factors that really go into these algorithms, and what are the situations

25 where platforms make changes, like the break-the-glass measures? How are they
77

1 assessing those, whether those changes are effective?

2 And they should be transparent about when they turn them on and when they

3 turn them off and that decision process. But they should be able to do that in a way

4 within some boundaries that we have collectively sort of resolved on in society that are

5 reasonable, right, like, these are reasonable places, as opposed to the companies making

6 it up themselves.

7 Because I think that's -- you know, I think -- I just -- people don't want the

8 companies to make those decisions on their own. We, as a society, I think can help

9 provide some ethical guidelines that would be good for the companies and, frankly, good

10 for everybody.

11 Q I want to move back up -- this has been really useful, but I want to get out of

12 this segue a little bit to move back to the exhibit and ask: Were you aware of any

13 studies like this on the growth of Stop the Steal, any monitoring efforts or investigations,

14 before 1/6?

15 A Yeah, I'm sure there was something. I'm sure there was something. I

16 mean, there -- but I couldn't point specifically. I mean, it was certainly a movement that

17 we were aware of and are aware of, you know, other elements of it that were not those

18 that were taken down. So I imagine somebody was investigating. But I don't

19 remember a specific document or anything to that effect.

20 Q Do you remember any conversations about holding investigations or

21 conducting studies?

22 A I remember discussion about Stop the Steal and whether or not we should

23 take more aggressive action against that, sort of, manifestation on the platform.

24 Q Uh-huh.

25 A And because the original group did come down, you know -- and, again, I
78

1 don't remember exact -- I think it was coordinating harm. I don't remember exactly

2 why. And so there was discussion about, you know, what to do with the rest

3 of the -- you know, with other parts that were reflecting similar sort of perspectives.

4 And, inevitably, there'd be some analysis that went along with that, but I don't remember

5 precisely.

6 Q Who would've made the call to take that original group down?

7 A Monika, Joel, probably. Maybe Nick. I don't remember.

8 Q Were they typically involved in taking enforcement actions like the deletion

9 of groups?

10 A Not at scale. I mean, like, you know -- you know, untold numbers of

11 Facebook groups will be deleted, you know, in the next 20 minutes, you know, via rote

12 actions, right? And they will probably not be involved in any of those.

13 But certainly when you get to, like, really, you know, either very difficult,

14 borderline calls, areas where the policy may have, sort of -- you know, the language they

15 use is "spirit of the policy" versus "letter of the policy." I'm not sure that's the best

16 language, but that's often what's used -- and things that are going to be scrutinized

17 highly, like a decision like that, you know, those are the kinds of things that would be

18 elevated up the chain.

19 Q What was the controversy around the takedown of Stop the Steal that

20 necessitated it being elevated to them?

21 A I mean, I don't remember the exact details, to be honest, --. I think it

22 inevitably would be two things.

23 One is that the group was -- such groups were primarily distributing

24 misinformation about the election. And so there is a -- you know, the erroneous

25 assertion that the election had been stolen. And this was before the election was
79

1 certified, I think, that it came down, right? So there is, you know, some -- you know,

2 removing that stuff is easier after the election is certified and, you know, including after

3 Congress certifies it, right, on the 6th itself.

4 So there's that, right? So, like, this is misinformation. It's a content-based

5 assessment, you know, but you're bringing in other perspectives. Like, maybe there's

6 overlap with, you know, some militia groups that have come down or something like that.

7 And so, you know, you're operating in a little bit of a gray area there.

8 But it's also a dynamic where this is a high -- like, you know, could not be a more

9 highly politicized environment, obviously. And so, you know, senior leadership, that's

10 the kind of thing that they would want to be aware of and want to understand and, you

11 know, potentially shape, but certainly you'd want to be able to explain any decision in

12 very clear terms. And that gets tricky when it's a gray area, right?

13 Q Was this something that they discussed, this highly politicized nature of the

14 takedown? This was something they were worried about?

15 A Well, I wouldn't say that the takedown -- I mean, maybe I did say that, and I

16 didn't mean to. I wouldn't say that the decision itself was highly politicized. I said that,

17 like, you could -- you know, Facebook really tries to make these decisions without, sort of,

18 reference to, like, political perspective. But, obviously, the impact would be, this would

19 be treated as a political decision, externally.

20 And I'm sure that that was discussed. I mean, how could it not be, you know,

21 when there was, you know, a huge -- you know, the MAGA movement was roiling for a

22 fight at that point, and Facebook is obviously one of the targets of that movement.

23 Q Were there instances where takedowns would be escalated to them but

24 they weren't around civic content or they weren't around politicized content? Were

25 there other borderline cases they ruled on?


80

1 A For the most part, that's where things get really tricky. I mean, sometimes

2 there'd be harder ones, if it's a bullying case or, you know, certainly some of the

3 misinformation, medical misinformation stuff, which, you know, clearly has a civic

4 component but is a little bit different. Those sorts of things.

5 You know, there were cases -- you know, there are edge cases with all of them. I

6 think most of the time things would go up when there's a political dynamic, but there are

7 edge cases in all of these policies, including things like child safety, right?

8 And, you know, once you get to the edge of the policy, it's hard and it's tricky, and

9 there's -- you know, you may be impacting somebody's livelihood. And so it may not be

10 a big civic/political question, but it is a question that may matter in the real world, and if

11 you're going to take action, you've got to try to think through the consequences, the

12 second- and third-order consequences.

13 So I would say that it's, you know, mostly the civic stuff and the political stuff, but

14 not exclusively.

15 Q Do you remember the arguments for and against taking down the initial

16 groups?

17 A I don't. You know, I'd have to characterize, but I worry that I'd just be sort

18 of making some assumptions at this point.

19 Q Okay.

20 Moving on from that decision, obviously, Stop the Steal groups continued to

21 bubble up through November and December.

22 A Yep.

23 Q What was the ongoing debate like? I mean, who were the players in the

24 broader conversation after the initial takedown?

25 A I don't remember the ongoing debate there. I mean, I remember -- like,


81

1 you know, I remember that the decision was made to take down the core group and not a

2 wider set of the movement. And then much of the conversation moved on.

3 There was a discussion about the break-the-glass stuff and what to do with that.

4 There was a ton of work still ongoing within the dangerous orgs world about specific

5 kinds of threats that we were seeing in various places. You guys mentioned earlier

6 some of the protests around capitol buildings, and those were -- you know, various

7 elements were investigating those.

8 And then I think there was, you know, how to prepare for some of the -- the

9 Georgia special election, which was obviously a big one. And I think much of the, sort

10 of, civic team and others were really focused on making sure that that got the same kind

11 of protection, because it was so critical -- you know, everyone, of course, understands the

12 context -- and that disruptions there, you know, were just as likely, perhaps, as they were

13 in the Presidential election.

14 But I don't remember, sort of, specific debates about other pieces. I mean, like,

15 we were doing a lot of work trying to, you know, chase down the Proud Boys. And we

16 did some network disruptions in December, if I recall correctly -- I'm pretty sure that we

17 did -- of militias and maybe the Proud Boys again. We'd done a bunch of that stuff.

18 But it was not -- I don't remember the, sort of, ongoing debate about Stop the

19 Steal. I'm sure that it lingered to some extent, but it was sort of a decision that was

20 made where we took down the main group, we'll leave up some of the others.

21 And there may have been subsequent debates about specific groups that had

22 come up for one reason or another: There was a lot of hate speech or there was a lot of

23 violence and incitement or something. I don't remember exactly.

24 Q So, going back to the exhibit, this report describes a pattern in which a small

25 number of connected individuals drove a great deal of the Stop the Steal activity,
82

1 including invitations and content violations.

2 I think one particular statistic I'll quote is that 0.3 percent of the members were

3 responsible for 30 percent of the invitations sent from this group.

4 A Yes.

5 Q And up to two-thirds of the invitations were sent by people who sent, I think,

6 more than 100 invites.

7 I'm wondering, is that typical of MSM and violence-inducing conspiracy network

8 groups?

9 A So that model -- so everyone talks about, like, the algorithm providing

10 recommendations and that's how groups grow. I definitely think that is garbage. The

11 way that we saw groups grow was through invites.

12 That's how -- you know, I know you guys have another exhibit with -- is it Carol's

13 Journey? Am I getting her name right? Or, she's not a -- I guess -- anyway, the

14 persona. And that's sort of telling a story through recommendations, right, that it is

15 possible to rabbit-hole. And I think it is possible -- and, especially, this was, you know, in

16 the middle of 2019. Some of those other protections were not probably in place yet.

17 But, in 2020, after we took action against QAnon and with Stop the Steal, I think

18 the way that those groups grew after that was through invites. It was through

19 invitations, you know?

20 And that's not surprising, right? That really shouldn't be surprising, because,

21 quote/unquote, "grassroots organizations" always have organizers. Whether these

22 organizations, whether these movements are built for, you know, "let's build a new

23 library in our community" or they're built around "let's call into question the Presidential

24 election on totally unfounded grounds," there's always an organizer.

25 And that kind of perspective, though, of, like, go find the person, go find the
83

1 organizer, go find the real-world thing, inside of a place like Facebook, that attitude was

2 very, sort of, dangerous orgs' thing. Find the real-world thing.

3 And that's not exclusive. Like, other people -- I think some of the great

4 researchers put together some of these documents that have now been leaked, and they

5 were clearly coming to some of those findings also.

6 But that notion of "go find the real-world thing that's driving this" is just not how

7 social media has thought about these things. They've thought about this stuff as just

8 sort of, it's a systemic thing, that it all comes together. And, you know, honestly, some

9 of that is true, but at the end of the day, push comes to shove, there are still oftentimes,

10 in my mind, people back there behind this trying to pull the strings, trying to make these

11 things happen, trying to do it. And I think some of the discussion about social media,

12 frankly, misses that.
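[A minimal sketch of how the invite-concentration figures quoted above could be computed from an invite log; the data layout and field names are hypothetical:]

    # Share of invitations sent by the most active inviters, from
    # (inviter, invitee) records.
    from collections import Counter

    invites = [("a", 1), ("a", 2), ("a", 3), ("a", 4), ("b", 5), ("c", 6)]  # toy log
    counts = Counter(inviter for inviter, _ in invites)
    total = sum(counts.values())

    top_n = max(1, round(0.003 * len(counts)))          # the top 0.3 percent of inviters
    top_share = sum(n for _, n in counts.most_common(top_n)) / total

    heavy_share = sum(n for n in counts.values() if n > 100) / total  # 100+ senders
    print(f"top {top_n} inviter(s): {top_share:.0%} of invites; "
          f"100+ senders: {heavy_share:.0%}")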


84

2 [2:25 p.m.]

3 BY MR.-

4 Q And during the period after the election but before 1/6, there was no study,

5 to your knowledge, no assessment that would have identified these super inviters and

6 attempted to link them back to offline organizations, individuals, activities?

7 A You know, I don't want to -- I don't want to categorically say it wasn't there,

8 but I don't remember it, yeah.

9 Q Within your -- to your knowledge?

10 A Yeah. Yeah, I don't remember it. We certainly did some of that stuff

11 where we saw on QAnon prior to the election, we understood the dynamics. There

12 were a bunch of inviters on Q that were -- some of those groups that were coming back

13 after we took them down. They would come back, and they would try to rebuild, and it

14 was that invite kind of functionality, which said to me that somebody had lists of names

15 and somebody had lists of, you know, Facebook handles or somebody had email

16 addresses that they were keeping somewhere trying to bring those things back and

17 recreate those communities. That's how I interpreted it, but I don't actually have

18 evidence of that.

19 Q This report actually alleges that that is how these groups reconstituted

20 themselves, that there were backup groups made in preparation for takedowns.

21 Was that in their playbook, in your view?

22 A Oh, yeah, backup groups is definitely a thing. I mean, all of the things we

23 saw these folks do, everybody does. And, you know, I'm not relating these folks -- this is

24 not a direct comparison to some of the more extreme groups I'm about to mention, in

25 terms of like who they are and what they believe. It's different. But when groups get
85

1 pressure by a social media company, they take standard behavioral adaptations. One is

2 they make backup groups. They make backup accounts. They create ways to get back

3 on. They keep lists of their friends. They create a safe account where they -- a fake

4 safe account so that they can coordinate there but not actually do any of the nasty stuff

5 there, but they can figure out where they're going to plan and recreate, right? And ISIS

6 has done all of these things. Hamas has done all of these things. Patriot Front did a lot

7 of these things. Proud Boys did some of these things, right?

8 And so, these groups that were doing nasty stuff and facing that kind of pressure

9 through a variety of different policies on Facebook, they started to do the same thing, the

10 same thing you or I would do, right, were we in that situation. They are just trying to

11 adapt and avoid that kind of pressure.

12 Q Was part of this tool kit the use of multiple accounts, single-user multiple

13 account activity?

14 A Yeah, in many cases.

15 Q Yeah. Is that something -- if you're caught using multiple accounts, that's a

16 violation of Facebook terms and service? Is that right?

17 A That's right.

18 Q Is that a bannable offense? Are people removed from the platform for

19 that?

20 A Yeah, if you're using fake accounts like that, you can be. You can be

21 banned. But usually what would happen is -- I think where that would happen more

22 often is if one of those accounts was taken down for something, would you fan out to the

23 SUMAs.

24 Q Sorry. To clarify, would you do that if one of those accounts was taken

25 down, would you then fan out to the SUMAs?


86

1 A Dangerous orgs you did.

2 Q In dangerous orgs you did. But for other policy violations, say, for

3 violations of -- you know, for someone who runs a group that's highly toxic that has high

4 incidences of hate speech, something like that?

5 A I don't know. I'm not exactly sure what all the details were. This is

6 where, like, things get complicated, and I'm just not sure across the board.

7 Q Sure, sure. You have a particular window, and I appreciate that.

8 So the individuals who are these super inviters, the report also says that they

9 tended to interact with one another a lot. They messaged each other, commented on

10 each other, tagged each other in posts. Is that a signal that you would look for to

11 identify something like a dangerous org or a network of people engaged in activity that

12 could be concerning?

13 A An investigator might use that as a way to feed an investigation, but it

14 wouldn't necessarily be sufficient. People tag each other all the time.

15 Q That's a way to describe me in my friend group as well, I suppose.

16 A Right.

17 Q Yeah. Do you have any insight into who these super inviters actually were?

18 I mean, was there ever, even after 1/6, an effort to identify the offline groups of

19 individuals, if not their individual identity, but the offline network pushing these groups?

20 A I don't know if that question was ever fully answered. It's an interesting

21 one. It's certainly one that we were interested in. I don't know if it ever really got

22 answered. I don't know.

23 Q Did you attempt to answer it? Was there an effort?

24 A There were attempts to answer a lot of questions about January 6th. I

25 don't know if that specific question was addressed. I just don't know. I don't think -- I
87

1 mean, I don't think we ever really knew who, like, the offline people were that were

2 behind those specific accounts. I don't think we ever found out.

3 Q Did they tend to not use their real name? I mean, I have to use an email

4 address to sign up for a Facebook account, and I guess I could create, you know, a false

5 email address, but did they tend to obscure their identity? Do the type of people who

6 do this tend to obscure their identity?

7 A I think, in many cases, people tend to obscure their identity. I don't

8 know -- I mean, I don't know. There may be more in that document. I just sort of

9 skimmed it this morning. I don't know what their practice was in this case, you know.

10 It's possible that I once knew with more specificity than I do now, but it's been a while.

11 Q Did you feel like you had runway to investigate who they were? I mean,

12 was there an issue of priorities or did you get any guidance from leadership on what you

13 should or shouldn't be looking into?

14 A No. I never felt like I was getting leadership direction on that question.

15 We were more -- you know, the question that I was most interested in was how much of

16 the violence on January 6th was organized on Facebook. That's a very dangerous orgs way of

17 looking at it, right?

18 Q Of course, these super inviters were organizing on Facebook?

19 A Oh, absolutely, they were organizing on Facebook. But the most acute

20 versions of organization for violence was not happening in the Stop the Steal groups,

21 right? It was -- that's where people were certainly using some violent rhetoric and doing

22 those kinds of things, spreading misinformation related to the election to get people onto

23 the Mall.

24 But when you look at the planning that was done by the Proud Boys, the Oath

25 Keepers, other organized groups, like I said, there was a little bit of that in terms of, like,
88

1 logistics that occurred on Facebook, and it's come out in the indictments. But most of

2 that was occurring on other platforms.

3 Q But there was a membership overlap between those organizations, the

4 Proud Boys, the Oath Keepers, the types of militias you just mentioned, and the

5 memberships of these groups?

6 A I'm sure. But the membership overlap is going to be relatively -- right?

7 The overall militia movement, yes. But, you know, the Proud Boys and the Oath Keepers

8 are relatively small organizations. Some of these Stop the Steal groups were massive.

9 And so, yes, I'm sure, but at the same time, to my knowledge, those groups, those

10 real-world groups, meaning Proud Boys, Oath Keepers, et cetera, their actual logistical

11 planning for activities on January 6th was not happening in those groups.

12 Q But, again, it seems to me like the super inviters were engaged in sort of a

13 logistical activity of manipulating the groups feature. Wouldn't you have wanted to

14 know if part of the overlap was between the organizers of those groups and the

15 organizers of the dangerous folks offline? Wouldn't that have been a pertinent

16 question?

17 A I think it is, and I think it's a fascinating question. I think it's an important

18 question. And, to my knowledge, there is -- like, I am not aware of that overlap, which

19 doesn't mean it doesn't exist. It just means I'm not aware of it.

20 Q But you said it was something that you were interested in. It doesn't seem

21 to have been a question that was answered. Why wasn't this -- why wouldn't this have

22 been a priority to understanding whether or not January 6th was organized on Facebook

23 which was, I think, the key question, as you phrased it?

24 A Well, look, because -- you know, the folks that we were looking at, primarily

25 as organizers of violence, were operating in other -- primarily in other settings, and that's
89

1 where we were investigating those levels of violence.

2 Now, I presume that if we had seen that they were leading one of the, you know,

3 major Stop the Steal groups that's something like -- that's a finding that I suspect would

4 have been surfaced to me. I don't remember that finding being surfaced to me.

5 And so, like, you could read into that whatever you like. Did we not look as

6 much as we should have? I don't think that's the case. But on the other hand, it's not

7 definitive evidence that that question was answered in the negative either. You know,

8 my -- well, I'm not going to make a presumption. I don't know. But we certainly

9 endeavored to understand the operations of the Proud Boys, the Oath Keepers, what

10 they were doing on platform. This is not something that was raised to me as a core -- as

11 something that they were doing, or I do not recall it.

12 Q If this report had been available to you, let's say, December 1st, if this -- that

13 may be too early, but let's say sometime in December, you have the insights about the

14 way this network is constituted. This report also mentions, I should say, that these

15 groups had -- and these individuals had much higher prevalence of violent incitement,

16 hate speech, racism, and election misinformation.

17 If this data had been available in real time in December, would you have

18 advocated for a change in policy, either, you know, that maybe election delegitimization

19 needs to be an explicit Facebook policy, or maybe these groups need to be considered

20 violence-inducing conspiracy networks? Did this information change opinions at

21 Facebook?

22 A I think that this -- and I did make those arguments in December, you know.

23 You know, like, I did make the argument that we needed to address a broader range of

24 the Stop the Steal groups. I did make -- because I thought that there was evidence that

25 they were being sort of constructed, and that there was overlap with militia
90

1 organizations -- unit groups, and things like that. I mean, I did make that argument.

2 I think all of those arguments were happening, were occurring in policy gray areas,

3 and that's why it was tricky. The -- so -- but I did. I mean, I thought we should be more

4 aggressive against the Stop the Steal movement.

5 The -- I still -- and I just can't say this enough. I don't think that that would have

6 changed things on January 6th because I think that this sentiment was so widespread, like

7 so widespread across the MAGA movement, on Facebook, off Facebook, et cetera, and

8 through a range of different, sort of, entities beyond just groups like Stop the Steal. In

9 fact, it's not clear to me at all that those were the most important groups driving the

10 people to the Mall that day. And I think they've gotten the most attention because

11 they've got a sexy slogan. Doesn't mean they were the most important, in my view.

12 So I think that sentiment was really, really widespread. But I did argue that we

13 should take a stronger action against a wider range of those groups because of that -- just

14 for the same reasons you're talking about, because of the overlap with militias, because

15 of the prevalence of hate speech and violence and incitement.

16 I don't think that we knew those things in as polished a way as that report, you

17 know, puts them together in a pretty coherent way. We didn't know all of that stuff, but

18 we had hints of those kinds of things as time was going on. And, you know -- and I

19 thought we should take action as a result of it. But I also think that some of this was a

20 gray area. We didn't know who these people necessarily really were in the offline

21 world. And I do think that, you know, it would have been -- and I am skeptical that even

22 the steps that I advocated and that I still think would have been the right thing to do

23 would have had an impact on the outcome.

24 Q What about phrases that were also used, like "Storm the Capitol," which is a

25 crime, and it's an act of violence. It would be difficult to storm a Federal Building I think
91

1 without being an act of violence. If that had been the main slogan -- it was certainly

2 present. I was able to find an example of a Facebook group with that as its banner just

3 the other day.

4 A Yeah.

5 Q But if that had been the main slogan, would the decisions have been

6 different?

7 A I don't know. So a phrase like "Storm the Capitol," I think it's possible that

8 that would have been implicated the violence and the incite policy a little bit more,

9 because it's a bit more specific about an action and a place. And usually with the V&I

10 policy, a lot of times it's an action, a place, and a time. You know, if you want to get real

11 specific, and you would know January 6th.

12 I think the question that would come up there is, you know, how much of this is a,

13 you know, heated, political rhetoric versus an operation and --

14 Q And when you made these arguments, to whom did you make them?

15 A Oh, I don't remember specifically, but, you know, there were many

16 discussions about this through November and December. So those discussions

17 happened within the policy team, you know, within the product policy team and then

18 beyond.

19 Q What other teams were involved in those discussions?

20 A I mean, it would have been the usual suspects, so it would have been the

21 policy team, you know, product policy, some of the public policy folks at a senior level, I

22 assume some of the strategic response folks like Molly, and key engineering leadership

23 like Guy.

24 Q How would you characterize the stances of those -- of the leaders of those

25 teams? I mean, people like Guy Rosen, Molly Cutler, and Joel Kaplan, where did they
92

1 come down on these questions?

2 A You know, it's a fair question. I don't remember where everybody was, you

3 know. I didn't keep detailed notes on, you know, where everybody was in the debate.

4 You know, what I would say is, that I think everyone was falling in slightly different places

5 in terms of, you know -- characterize it all you want -- paranoia, caution about the

6 possibility of violence at that point. I think it's fair to say that the dangerous ergs world,

7 we were generally on the more paranoid, cautious side of things, and -- but everybody

8 recognized the balance and the tradeoffs and the risks between, you know, as you're

9 attempting to limit the possibilities of violence. And, you know, you can never prevent

10 it, but you can mitigate it, and I think companies do have an obligation, even when they

11 can't necessarily prevent something entirely, to mitigate risk in reasonable ways. And I

12 think we could have done more during that time period, but I don't think we -- you know,

13 but I'm skeptical that it would have had an impact on what actually occurred on

14 January 6th. And I think that that's part of it is, you know, that other folks certainly I

15 think were reticent to disrupt, you know, an emotional and political sort of cry of

16 frustration which was based on, sort of, like wanton misinformation, but was still

17 genuinely held by a big chunk of the electorate and many of whom, you know, were going

18 to gather on the Mall on the 6th to express themselves in that way.

19 Now, I was concerned, as were others, about the possibility that that would spill

20 into violence, and it did, but I think, you know, it was -- you know, even the most

21 concerned of us didn't know for a fact that was going to happen. We were -- you know,

22 we understood that there was risk, and we were -- and we thought we should take steps

23 to mitigate it, but we couldn't say definitively this is going to happen.

24 Q And the information -- the misinformation that compelled many of these

25 individuals to go to the Mall was the kind of thing that was circulating Stop the Steal
93

1 groups? It was from social media in part?

2 A I think it was the belief that the election had been stolen, that there had

3 been fraud somewhere in the process. And I think that that misinformation was spread

4 by the President. It was spread by broadcasts and cable news, and it was -- not so much

5 broadcasts, I guess -- mostly cable news, and it was spread online, including on Facebook.

6 Q You said earlier that it would have been difficult to shut down these groups,

7 this narrative, this movement without shutting down a large part of the conservative

8 news media ecosystem on Facebook. Is that right? Is that accurate?

9 A I do think that. I don't think it would have -- I think it would have been -- I

10 think that that narrative was so closely entwined in the larger conservative media

11 ecosystem that to -- so to address that narrative in any substantive way would have

12 meant the widespread removal of the conservative media ecosystem on Facebook.

13 Q Do you think that was a major factor for some of the individuals that we just

14 discussed in its decision-making process?

15 A I don't know. I mean, like, of course, the -- if you're going to take a decision

16 that's going to impact, you know, 50 million Americans, that's something that people are

17 going to think hard about, right? You know, I'm making up numbers. But I think

18 that -- yeah, of course you have to understand when a decision is that significant, I think,

19 of course, people are going to be thinking about the scale of the potential impact, but I

20 also think the reality -- you know, some of that is that, you know, much of that was an

21 expression of protected political speech. And, you know, connecting that to the direct

22 risk of violence was something that I was more willing to do. But even I wasn't arguing

23 that we should take down the entire ecosystem, right? Like, I wasn't. And I think that

24 would have been a really dramatic step.

25 I think the possibilities for the further delegitimization of the election among
94

1 those people would have been high, and I think even then the impact on January 6th is

2 really up for debate. I don't know at all that that would have limited the amount of

3 people that showed up on the Mall, and I think there's a real possibility that they might

4 have showed up even angry.

5 So, you know, I think that's -- the only way that that notion, that idea would have

6 been dramatically reduced in its prevalence on Facebook, would have been that very,

7 very extraordinary step, and I'm not at all convinced that that would have resulted in a

8 better real-world outcome, even though, like, as I said, I argued for more aggressive

9 measures than the ones we took. But I just think that when the President and leading

10 media influencers are pushing an erroneous narrative, throwing doubt on an election,

11 companies like Facebook can do better, but they can't be ultimately the gatekeepers.

12 They can't hold that back. And I think Facebook could have done better, but I don't

13 think they could have realistically held it back.

14 Q The exhibit mentions that most, if not all, of the fastest growing civic groups

15 on Facebook during this period were Stop the Steal groups. The groups, as a feature of

16 Facebook, facilitate the spread of this narrative. I mean, what is the culpability of the

17 groups feature in the prevalence of this narrative on Facebook? Let's keep it bounded

18 to on-platform consequences.

19 A Yeah. I think groups are oftentimes tricky and problematic, right? And

20 it's not all groups. There are public groups. There are private groups. There are

21 secret groups. They all pose different kinds of worries, right? You know, public

22 groups, obviously are, in some ways, more important for the proliferation of

23 misinformation at wide scale. Secret groups were more often used by -- you know, by

24 more private entities, right? So a lot of the militias used secret groups, for example.

25 And because they're not trying to broadcast everything, they're trying to create a
95

1 tighter-knit group and community. And those dynamics are really different.

2 So I do think that, you know, any time that you sort of can create that kind of

3 environment, you know, there's risk of abuse. That said, like what people have got to

4 understand -- and one of the great lessons of studying terrorism over the years is you

5 just -- you can't approach any community or any group by their absolute worst people

6 and expect to understand the wider population, right? So you can't say, you know what,

7 I'm going to understand, you know, Iraqis via the lens of ISIS. Like, that's wrong. It

8 doesn't work that way. Are there Iraqi members of ISIS? There are. But that's not

9 how you understand Iraq. That's not how you understand Iraqis.

10 And anytime we approach these kinds of risks, like, it's -- you know, just recognize

11 and be wary of the danger, the analytical danger of saying, Look, there are some of these

12 bad groups; therefore, all the groups are bad when we know many of these groups are,

13 you know, community groups or whatever else it is.

14 Those things are real, and I think sometimes they're easy to dismiss in debates

15 about the downsides of social media, but we have to remember them. And look, I am

16 not some -- you know, I am not pollyannish about the role of social media in society by

17 any stretch of the imagination, but I do think that it's easy to forget that reality, that

18 those same tools that are abused are used by people that are doing genuine and good

19 things.

20 That said, the challenge with groups is that it's hard to figure out who's going to

21 be responsible, right? When does the group go bad? Does the group go bad when the

22 administrator of the group is setting it up for a bad purpose, right? Does the group go

23 bad when a bunch of users in the group use it for a bad purpose and the administrator

24 does or does not do something about it?

25 How do you set up accountability within that entire structure? When is it that
96

1 administrator's responsibility to just govern the group and make sure that they're kicking

2 out people that are doing nasty things? When is it Facebook's responsibility to take

3 action against it?

4 And I do think that that sort of notion of who's going to be accountable, and who's

5 responsible in that sort of larger chain is one that is harder to resolve with a

6 product like groups, where you do have users that are in an authoritative position,

7 but do have the ability to moderate content on their own independently of Facebook, and

8 I think that that sort of dichotomy sometimes led to paralysis inside Facebook, because

9 the idea was, Hey, we just need to provide that administrator more tools. And

10 sometimes these groups, they were -- they were, like, a conservative political group, and

11 the administrator didn't set it up to be a place where people are fomenting violence on

12 January 6th, but a bunch of people showed up there and that's what they're using it for,

13 right? And so that's not totally -- that's not hypothetical, right? Those are real things.

14 At the same time, you've got groups set up by somebody for the express purpose

15 of having a discussion about how the election was stolen, we need to do something about

16 it, let's all get riled up, show up on January 6th and maybe -- and, you know, we're raring

17 to go if the Proud Boys start knocking down police barricades, you know.

18 And all of those things are true at the same time. There's a tremendous amount

19 of variation in the use of these groups, but fundamentally, the challenge from a

20 moderation standpoint is that accountability and authority within those groups is

21 federated out. Some of it is at the company level, some of it is down at the

22 administrator level, and some of it is at the user level. And I do think that how those

23 sanctions get applied and how those rules are going to operate is trickier with a product

24 like that than it is with something more straightforward like an account, for example.

25 Q Was it the view before the election of senior Facebook leadership that the
97

1 best way to assess a group for harm was the intent of the administrator?

2 A I think oftentimes.

3 Q Oftentimes?

4 Mr. - Okay. I want to make sure, -, if you had any points for

5 follow-up. I want to pass the mic over to Mr. -

6 Mr. - I think what might make sense is to take a break for 5 minutes, and

7 then we can come back for the last hour, if that's all right with everyone?

8 Mr. Fishman. Sure.

9 Mr. - Fine with me.

10 Mr. - So let's go off the record, and we'll come back on the record at 3:00

11 p.m.

12 Mr. Fishman. Sounds good.

13 Mr. - And we're off the record at 2:55.

14 [Recess.]

15 Mr. - So we'll resume the record at 3:03.

16 And I want to note for the record that -, a staffer on the select

17 committee, has left the interview for another commitment.

18 And now I want to turn it over to Mr. • who has some more questions, and

19 we'll try to be out of here at 4 o'clock as promised.

20 Mr. • .

21 Mr. • Great. Thank you, Mr. -

22 And thank you, Mr. Fishman, for bearing with us.

23 BY MR. •

24 Q So I think what I would like to do is run through a series of questions about

25 Facebook's prepared response and interactions with law enforcement leading up to


98

1 January 6th. And some of this has already been covered in our conversations so far

2 today, so if that's the case, if you'll briefly restate or point us back to what we were

3 discussing earlier, that's completely fine.

4 A Sure.

5 Q We can dive in, and then I think we'll hand it back to Mr. - for some

6 more questions which will hopefully take us to the end.

7 So, first, can you walk us through what Facebook's operation looked like in the

8 lead-up to the election and how that changed after the election with regards to potential

9 violence and misinformation?

10 A You mean from the legal side specifically, or more generally?

11 Q I guess we're mainly interested in how the legal side would have interacted

12 with your team.

13 A Oh. Well, so, you know, the dangerous organizations XFN at Facebook was

14 one of the closest, you know, working across different teams at the company. And so,

15 historically, we would get together -- you know, we would meet weekly. Leadership

16 would meet weekly as well separately from across those different organizations. You

17 know, prior to the pandemic, we met, you know, quarterly to do planning as an entire

18 XFN. And the idea there was to make sure that we were pushing road maps, each of the

19 various teams would roadmap, which in tech company terms is, you know, build a plan

20 independently, but we would meet across our issue areas to make sure that we were

21 collectively pushing things simultaneously, and in conjunction with one another, things

22 like that. That included the legal team, which was closely integrated.

23 Starting in the spring of 2020 -- I don't remember exactly when it was; April, May,

24 somewhere in there -- we began a pace of, you know, several additional weekly meetings.

25 Those began with a lot of the work up around the Boogaloo movement, and I think
99

1 they -- if I remember correctly, they slowed -- we took them down to, like, one a week

2 because we were hoping things would calm down, and then they didn't, and so then they

3 went back up.

4 And so those became -- we also set up other meetings that my team ran that was

5 focused on what we call time-sensitive threats. And so, a lot of that was sharing focused

6 on various upcoming real-world events where there might be violence. Even if they

7 weren't sort of planned, it wasn't specifically organized, but things that we needed to be

8 focused on.

9 Those are the kinds of things that the dangerous orgs XFN ran, you know, my team

10 in conjunction with other parts of the XFN. And some of that was stuff that we would

11 take action on directly, but some of it was really just to make sure that there was good

12 information sharing around those kinds of incidents and potential incidents for other

13 kinds of teams, right? Because sometimes an operations team wouldn't be looking off

14 platform, so they wouldn't know that the Proud Boys had been stating on Telegram that

15 they're going to show up at some event, right? And they wouldn't know that that was

16 part of the context that they needed to have when they were dealing on the day with

17 that sort of thing. So we built systems to try to facilitate that kind of work.

18 Those all continued from the dangerous orgs' perspective after the election. We

19 did not -- we understood that there was still a risk of violence. Those pieces continued.

20 The time-sensitive threats meeting continued. All of that continued. And, to my

21 knowledge, they're still doing some of that today, although I don't think they're doing

22 those two extra meetings, which is really about the 2020 cycle at that point, but, you

23 know, I hope they do it again now.

24 And all of that, from a dangerous orgs' perspective, those sort of institutional

25 pieces continued after the election.


100

1 Q Great. That's helpful context.

2 So for the pre-election period, up to and including the election, was there a

3 designated channel that Facebook was using to communicate threats to law enforcement

4 or receive threat information from law enforcement?

5 A Yeah. So Facebook has ways to communicate with law enforcement.

6 There are portals -- Facebook has portals, and I imagine that other large tech companies

7 do too where law enforcement agencies can put in requests. But they also have

8 telephones, can make phone calls to say, Hey, we're going to send something. This is

9 really important, you know.

10 And Facebook actually has programs through something -- they call it the legal

11 outreach team to train law enforcement so that those requests come in in a format that

12 aligns with the law, that they're checking the boxes correctly, that they're making sure

13 that they submit things in a way that a company can respond to, you know, fulsomely and

14 effectively. And so, those kinds of ongoing programs, you know, are out there as well.

15 Q Were there ever any conversations between law enforcement and multiple

16 platforms that you were aware of to, sort of, have broader threat landscape discussions?

17 A There absolutely are sometimes those kinds of discussions, either from law

18 enforcement or from, you know, DHS or NCTC sometimes. DHS did a lot of work on the

19 information security side, not so much with my side, but with other -- with Nathaniel

20 Gleicher and his team, looking at, you know, threats of foreign interference, and things

21 like that, to make sure that channels for that are open. But those are a little bit different

22 than -- you know, those are broader sorts of threat environments than the FBI, you know,

23 providing a warrant, you know, process saying, Look, we need disclosure of something

24 about this because there's a real-world threat of violence or something like that.

25 And there also are procedures there in case of -- you know, Facebook comes
101

1 across something that looks like a threat that law enforcement is unaware of where it is a

2 real-world danger, that Facebook can proactively provide that information to law

3 enforcement.

4 Q And those broader-level conversations -- were those happening related

5 to the election in October, November?

6 A I think there were some real high-level conversations. It was not as

7 well-established, at least for me, as what was happening on the information security side.

8 It was not as well-structured, but I do think that -- again, like, specific on the law

9 enforcement side, we're plugging in with our legal team, with the Facebook legal team,

10 and they were communicating about one thing or another all the time. I mean, there's

11 always something, you know. And, you know, so I think those avenues were wide open.

12 Q How about after the election?

13 A I don't remember exactly, to be honest. I don't remember sort of specific

14 cross-cutting kinds of meetings, but I know that -- you know, I mean, I just can't

15 overemphasize how regularized some of those communications are, because so many

16 cases and investigations now have, you know, an online digital component as evidence.

17 And so, companies are really well-practiced at communicating with the -- you know, big

18 companies. Silicon Valley, as a whole, still has a ways to go, in terms of, like, recognizing

19 that you can talk to the FBI without -- you're not -- it doesn't mean you're going to jail,

20 right? And I'm making a little bit of a joke, but, like, I think historically in Silicon Valley

21 that's sort of the fear, right? Like, Oh, the Feds called, you know.

22 But that's not where Facebook is at. Facebook is a much more mature

23 organization now than it was a decade ago. They've got folks with background in law

24 enforcement, in those kinds of agencies. They know how to have those conversations.

25 They know how to push back when a request is not sort of constructed in a way that
102

1 is -- you know, aligns with the law. But they also know how to provide that guidance so

2 that law enforcement agencies are providing process in the appropriate ways as well.

3 And I think -- you know, I just -- those avenues are wide open.

4 Q Got it.

5 So moving forward past the election, again, you mentioned this, that the pace of

6 meetings had slowed down because you had hoped that it would be calmer. We talked

7 about this a little bit earlier, but could you walk us through how the posture of your team

8 and Facebook more generally changed after the election, and when did it start to ratchet

9 back up again?

10 A Yeah, that's a good question. Like, my memory is imperfect at this point.

11 I distinctly, however, remember having the impression that we in the dangerous orgs

12 world were still terrified, and everyone else wasn't. Everyone was terrified in the run-up

13 to the election. We remained terrified, and I don't think that that was as widespread

14 after the election.

15 And, you know, I don't -- like, I don't remember the details over the sort of the

16 meeting cadence, you know. We certainly still had the time-sensitive threats, which

17 turned out to be a great thing. It wasn't my idea. Some of the other folks had it.

18 They made it happen. You know, for some of these, my job was to drive it personally.

19 For some of it, my job was to just sort of sponsor and say I think this is a good idea and

20 create some space in people's schedules.

21 I think we probably went down to 1 day a week on these really special sort of

22 coordination kind of meetings, at least for part of it. I don't remember exactly, exactly

23 how that worked, but I distinctly remember the concern that we were really -- we were

24 really worried.

25 And largely -- you know, I know I said this at the beginning -- a big part of the
103

1 reason why the dangerous orgs team was so worried was because we were looking at

2 things off of Facebook. And that was not a natural way for many teams to think about

3 threats on Facebook, but it was a very natural way for the dangerous orgs team to be

4 thinking about things off of Facebook.

5 Q What was that experience like for your team in the weeks after the election,

6 being the only team that was terrified?

7 A Well, I mean, we were kind of used to that idea. Like, this was a

8 team -- this is the team of people whose job it is to be terrified, right? It's the team of

9 people whose job it is to be paranoid and worried about stuff, right? So on some level I

10 think there was just some general consistency. If you went to, you know, Facebook on

11 any day of the week and said, Who are the people that are most terrified, you know, it

12 would be the dangerous orgs people, because they are the ones looking at the most acute

13 kinds of threats, you know, unless you go talk to the child safety folks or, you know,

14 they're looking at different kinds of terrible things, right? You know, and so on some

15 level, that is just, you know, you stand where you sit kind of dynamic, right?

16 But I also think that the more important lesson for the social media companies is

17 that you have to combine -- in order to understand risks holistically, it's really important

18 to combine data science approaches to understand what's happening on your platform at

19 scale with intelligence style approaches to understand what's the connections between

20 your platform and things that are happening elsewhere in the world, both in the real

21 world and on other platforms.

22 And that is a jump that I just don't think -- I think all companies probably can learn

23 from that and get better at that. I definitely think Facebook needs to get better at that.

24 I don't think that they've nailed that across the board. I don't think we did it perfectly in

25 dangerous orgs, but I really -- but I do think that the dangerous orgs team does that
104

1 better than most. And I think that, you know -- and I think where that comes to the fore

2 most is in understanding, you know, what the risk environment was in relationship to the

3 break-the-glass measures, because those efforts were -- you know, and it's really easy to

4 criticize after the fact, to say that they got that wrong.

5 But they had a bunch -- they had a series of metrics that they were looking at.

6 They were not crazy metrics, but they were all on file, and those metrics did go down.

7 Things did change. It did look like things were calming down. They were not making

8 this up and being irresponsible. From my -- they were not.

9 From my perspective, though, that perspective was too focused on the things that

10 they could see, and it's like using a single source of intelligence in the IC, right? If you

11 use a single source, you get -- you may get one answer. You may get a bias from that

12 source. And if you use multiple sources, you can triangulate. You may get a different

13 perspective on things. And I just don't think -- I don't think that companies have really

14 figured out how to do that very well. Part of the reason is that sometimes that means

15 you're sort of using intelligence judgments, right? Because you're looking at things off

16 platform. You can't measure it precisely. It's not the kind of thing that speaks to an

17 engineering brain, to a certain extent, whereas metrics that just tick up and down do.

18 That's how platforms want to operate. That's how founders that are engineers want to

19 operate. It's totally understandable. It's not evil, but it's limited.

20 And I think one of the things that platforms need to get better at is combining

21 these sources of information so that they can make good judgments, and then be

22 transparent about those kinds of decisions, as we discussed previously more -- you know,

23 earlier. And I think that Facebook -- I don't remember what the communications

24 strategy was around the break-the-glass measures, but I do think that Facebook had

25 announced that they were being put into place and that they were coming off, and I don't
105

1 remember exactly. And that's good. Those things are good. Now, maybe there

2 should be more precision about that. There should be more detail. I think there

3 probably should.

4 But, I think, most importantly, there needs to be ways to measure the risk

5 environment that doesn't just look at the obvious signals on a platform in a quantitative

6 way. You've got to look at the wider environment, especially if you're a big platform

7 where every bad thing is going to show up there eventually, even if it's not there yet.

8 And that's Facebook. That's YouTube. That's even Twitter, to some extent. It's

9 certainly TikTok now. You know, all of those platforms, just because you don't have it

10 today, if it's on the internet, it's going to be there. It might be tomorrow, it might be

11 next week, but it's going to get there. And so some of those things you can get ahead if

12 you keep your eye elsewhere.

13 This is how we dealt with ISIS, you know, in the early days, and it largely worked.

14 Q Right. So are there ways that you wish Facebook's -- some of this talk and

15 information that you were seeing on your team had been different, specifically in the

16 December -- November to January 2020 to 2021 period?

17 A Um, I don't know. I'm sure there are things that we could have done to be

18 more effective at getting that message out. We tried. You know, inevitably there are

19 better ways -- there would have been better ways to do it. I think the structural -- like, I

20 think the structural problem is that you don't want -- is that social media companies,

21 including Facebook, need real -- people aren't going to like to hear this, but, you know,

22 especially in Silicon Valley, but you need an intelligence function, and I don't think that

23 Facebook really has that.

24 Because what you don't want -- and there's all sorts of literature about this.

25 Anybody studying security studies at Georgetown or Johns Hopkins, or wherever, can


106

1 read about it all day, about how you don't want policy people doing the intelligence

2 analysis. And you want -- those are separate functions. You want -- and I don't -- and I

3 think -- and I don't think that distinction has occurred enough at Facebook, because the

4 fear is, for a decision-maker, whoever the decision-maker is in the chain, if you've got a

5 policy person that's also doing the intelligence analysis, you're afraid that they're skewing

6 the intelligence, and they might be skewing the intelligence to get what they want.

7 They might be doing it on purpose. They might be doing it on accident. They might

8 just have gotten lazy when they got the answer that they wanted.

9 And so, if you're the decision-maker, you're always worried about that. I think it

10 would be easier to get better decisions if companies structured themselves so that there

11 were bodies that were strictly doing intelligence analysis of risk so that you could make

12 better decisions and you weren't -- and so you wouldn't have a team like mine sort of

13 consolidating information and trying to serve it up, and then also making a policy

14 recommendation on something. That's not how you want to do it. We did it that way

15 because we had to. I think we did a pretty honest and good job of it, but that's not

16 actually how you want to structure it. The way you want to structure it is you want to

17 have a body whose job it is to measure risk, assess that risk, to provide an impartial

18 description of that risk, and then you want somebody else making the case, Well, what do

19 we do about it right now? Because if you combine those things together, then the

20 decision-maker has to question and has to wonder, Where is the bias coming from?

21 And, like, there's a million -- like, this is not a new idea. There are a million studies

22 like it.

23 You know, I'm not saying anything terribly exciting. I'm just adapting a

24 discussion that comes from intelligence studies and security studies programs all over the

25 place to the social media environment that has to make similar kinds of decisions now.
107

1 And I don't think that social media companies have fully incorporated that idea, and they

2 need to do it better.

3 Q I think - brought up something about the post-election landscape.


4 When the break-the-glass measures were rolled back in December, did

5 that include some of the limits that were placed on groups? Do you recall?

6 A I don't recall specifically. I presume so, but I don't recall specifically.

7 Q Thank you. That's helpful.

8 In your broader conversation about the issue of intelligence, and sort of separate

9 chain of intelligence collection as opposed to social media policymakers, are there -- why

10 do you think the reasons -- what do you think the reasons are that that hasn't been set

11 up? Is it a lack of awareness of the importance of that separate apparatus, or

12 is it sort of another incentive that's keeping companies like Facebook from doing that?

13 A So I think it's a great question. I think the overriding reason is a lack of

14 awareness, and sort of focus on doing it. I think it would -- there will be bureaucratic

15 pushback from various existing stakeholders, right, that don't -- wouldn't want to give

16 things up. So I think that's a big one.

17 But I think another one is everybody is afraid of the word "intelligence." It

18 sounds spooky. Everyone is mad at social media companies already. Nobody wants to

19 hear that they have an intelligence function, right? It sounds creepy. It's scary. You

20 know, potentially it is ripe for abuse. But it would improve the kinds of information flow

21 to senior decision-makers, and put them in the position where they could, I think, more

22 easily trust that data coming in the door, because right now, if they're honest with

23 themselves, like I'm sure this is true at anybody above my level in that organization that

24 has to make a hard decision, when information comes up, they're going to be asking

25 themselves, Is the raw data that I'm getting being construed in a way to advance the
108

1 policy perspective of whoever is bringing it to me? Because they're also bringing me a

2 recommendation.

3 And in government we would say, Yeah, that's a problem. Let's separate those

4 two things out. Let's separate them, you know. And social media companies need to

5 start thinking that way too, especially the big ones that can afford it. Like, this is not a

6 thing that smaller companies would be able to do, even a Twitter, something like that,

7 right? That just doesn't have the same resources. But big companies have the ability

8 to do that. They do have, you know, intelligence functions that do more tactical stuff,

9 that are doing things -- you know, that are doing investigations on foreign influence

10 operations, that are, you know, doing the detailed dives for network disruptions and

11 doing those sorts of things.

12 But in terms of that macro risk analysis so that you can make good decisions about

13 when you want to adjust things, like the break-the-glass measures and that kind of stuff,

14 those are the kinds of changes that I would want to see. And I think that it's sort of

15 strange, but, like, this is actually -- this is what I would want for senior decision-makers.

16 This is what I wish all of those people in the chain that you mentioned -- names that you

17 mentioned earlier, I wish they had access to that, you know, and they'd get better

18 information if they had that kind of thing. And they wouldn't have to question -- it

19 would be painful at first. It's bureaucracy. But I think over the long run, there's a

20 reason why government after government tries to structure these things this way.

21 There's a million case studies of why it works better when information flows that way to

22 senior decision-makers. And I just think -- look, again, would this have fixed -- would

23 this have changed what happened on January 6th? I don't think it would have. But I

24 do think that this is an illustration of a broader challenge, and there will be other really

25 hard decisions that social media companies have to make going forward, and I hope that
109

1 they learn this lesson.

2 Q Got it.

3 After January 6th, the [inaudible] was deferred for a number of months following

4 the attack as well, and I'm curious about how you saw the violent orgs -- dangerous orgs

5 policy change after the attack, particularly with regard to the kinds of far-right militarized

6 groups that were first included just a few months prior.

7 Did monitoring increase? Did some of the criteria that you were using get

8 sharper? Was there more interest from other areas of the company and the work that

9 you were doing domestically? Walk us through this.

10 A Yeah. I mean, I think there's -- I think there are -- what's the right word?

11 So I think there was an effort that was not led by my team, but it was led by a partner's

12 team to develop something called the coordinated social harm policy that was utilized, to

13 my knowledge, I think the only time, with a group in Germany called Querdenken, which

14 is sort of a -- effectively, it's a German, you know, version of QAnon. But it's a little bit

15 of an odd group. And so the coordinated social harm policy tries to get at the same sort

16 of problem as the violence-inducing conspiracy network policy, but doing so through a

17 more behavioral lens, like, you know, how are they -- you know, how are they structuring

18 and coordinating in order to push messages tied to violence offline?

19 And, you know, I don't know where that policy sits today. You know, I've been

20 out of Facebook for 6 months. I'm just not sure. But that was one effort. And, again,

21 these are efforts to try to think creatively about how you get at these more amorphous

22 entities, right? Because the world is moving in that direction.

23 I think there were efforts to go back, and I don't know where -- again, I don't know

24 where these stand today, but to think through some of the policy changes that we put in

25 2020 and make sure that we got them right. You know, there was certainly an effort in
110

1 2020 to change the external facing representation of the dangerous organizations policy.

2 The tiered structure that you see on their website today went up sometime in the

3 summer, I think, of 2021 -- I forget when -- because we made some of these changes, but

4 we just, you know, were running so fast, they weren't, sort of, publicly enumerated in the

5 way that they should be. And so there were a lot of efforts to try to do that to make

6 sure that it was clear to people.

7 There continued to be enforcement against militias and QAnon. I think it

8 was -- there was a while there where there was a really intense focus on that, of course, and

9 their sort of hyper prioritization, I think, probably changed and reduced as the rest of the

10 world -- you know, you've got to sort of balance resources and, you know, resources is

11 always an issue in any organization.

12 So I think there were a range of those different kinds of efforts, but, you know, the

13 policy is in place now, as far as I know, and being enforced. I don't know all of the

14 mechanisms that have been put in place to enforce it.

15 I think the larger issue, from my perspective, is that there has been this shift to the

16 place where civic violence now doesn't just come from organizations, right? It is driven

17 by a range of other pieces. And, in my mind, social media companies need to put in

18 front and center the risk of civic and political violence, and make sure that there are

19 policies that are unified from content-focused policies, behavioral policies, and

20 actor-based policies, so that you don't have situations where content-focused approaches

21 don't get it done for a long time, and then you need to come in with an actor-focused

22 hammer at the end.

23 And I don't think that that kind of restructuring has occurred, at least it hadn't

24 when I left. And, to my knowledge, it hasn't since but, you know, my specific knowledge

25 is now out of date by 6 months.


111

2 [3:33 p.m.]

3 BY MR. -

4 Q That's real helpful.

5 Mr. - I think Mr. - has some follow-up on the policy, so I'll hand it over

6 to him.

7 Mr. - Thanks, Mr. •

8 And thanks, Mr. Fishman.

9 BY MR. -

10 Q Yeah. I did want to ask about the coordinated social harm policy. So that

11 wasn't developed under your team; that was developed under a different team?

12 A No. That was developed primarily by Nathaniel Gleicher's team. I mean,

13 my team sort of advised and coordinated on it, but that team did a lot of work with

14 behavioral-based policies. And this was an effort -- so it was really a collaborative effort.

15 I mean, those two teams worked very closely together. But they had a lot of expertise

16 in behavioral-based approaches, and so the idea was they would take the lead on it and

17 we'd sort of play the supporting role.

18 Q That was the team that would've focused on coordinated inauthentic

19 behavior. Is that correct?

20 A That's correct.

21 Q Yeah.

22 Was that policy a direct outgrowth of anything that happened during the election

23 or January 6th?

24 A I don't know if I would say direct outgrowth. I think it was definitely a

25 response -- I think it was definitely another way to try to get at this problem that we all
112

1 recognized. But building a firm policy foundation for dealing with a range of different

2 organizations is hard, right?

3 The violence-inducing conspiracy network policy is a good one. It has real

4 boundaries and a set of standards that works, but it has its limits, you know? And one

5 of the limits is that you really need clear evidence that a group has done a bunch of

6 violence you know, or that members of that group have done a bunch of violence that

7 aligns specifically with the ideological perspective of that conspiracy, right?

8 And, you know, what you're trying to do is find ways where you can define very

9 high-risk behavior that has a very, like -- that risks real-world violence and harm in a

10 pretty serious way without dramatically spilling over and capturing all sorts of things that

11 don't really, you know, lead to violence.

12 And so the CSH effort was another way to do that. And I don't know where -- I

13 mean, I just don't know where that stands today.

14 Q We have an exhibit; it's exhibit No. 3. It's the "Problematic Non-Violating

15 Narratives" report.

16 A Uh-huh.

17 Q I am wondering if those themes are related, if that report was part of the

18 development of that policy.

19 A You know, I skimmed it this morning. Do you mind putting it up?

20 Mr. - It's exhibit 3, if you're --

21 Mr. Fishman. Yeah. I skimmed it this morning. I don't -- well, let me take

22 another look at it. I don't want to accidentally say something inaccurate.

23 Mr. - Thank you, -.

24 BY MR. -

25 Q And take your time, Mr. Fishman.


113

1 A Yeah.

2 Yeah, I don't remember this document, which doesn't mean that it wasn't part of

3 the process somewhere in the line, you know? But I don't remember this document,

4 personally.

5 You know, I think a lot of folks recog- -- you know, I mean, it's not rocket science.

6 Everybody recognized this idea of these really difficult, sort of, diffuse movements that,

7 you know, likely have nasty players back there somewhere pulling the strings, but those

8 nasty players are hard to put your finger on and enforce against because they're playing it

9 smart. And so you want to try to deal with that somehow without having a dramatic,

10 terrible impact on voice and without making, sort of, decisions willy-nilly without a clear

11 evidentiary basis and strong preexisting rationale.

12 And so, you know, I think there were a lot of efforts internally to try to figure that

13 out. The VICN was something that we came up with in 2020, and the CSH is another

14 effort. I don't know if this got plugged into that or if this was -- you know, I just don't

15 know -- or if this was somebody else that was wrestling with similar ideas and coming up

16 with something else, you know?

17 Q What were the -- let's just take these themes, the ones we've just been

18 discussing. What were the conversations at Facebook like after the election, in

19 your recollection?

20 You said that your team was terrified, but the other teams, maybe their level of

21 terror diminished following 1/6. This was an environment in which it seems like a lot of

22 reflection and studying and policy development was going on. I know that it's always

23 going on. But some of it is very interesting, for the reasons we've been discussing.

24 What were those conversations like?

25 A So I think there was -- you know, look, Facebook had some wins here that
114

1 shouldn't be diminished, right? Like, around the election itself, I think there was -- you

2 know, there wasn't a, sort of, tsunami of misinformation about voting places and things

3 like that. There were a bunch of pieces in all sorts of different places. There was, of

4 course, misinformation, including, and I think more acutely, in Spanish, you know, related

5 to the election.

6 So, you know, it wasn't perfect, by any stretch of the imagination. But, at the

7 same time, I think there was a ton of work that went into it, and worst-case scenarios

8 around the actual election did not manifest.

9 And so I think there were efforts to try to understand what did we do right, what

10 did we do wrong. And some of that was related to some of these extraordinary product

11 measures, you know, that are encapsulated in a break-the-glass sort of concept, but I

12 don't remember, sort of, precise lessons learned there.

13 You know, I think some of those, sort of, civic product efforts -- I mean, this is one

14 of the challenges, right? In some of those civic product efforts, we as the dangerous

15 ergs team were not always central to. And that was sometimes frustrating. Because I

16 think, in some cases, those efforts did not directly incorporate the risk of violence. They

17 were focused on more traditional forms of election disruption.

18 And I think there were elements of that even -- you know, even after January 6th,

19 we sort of had to say, "Look, this is still a thing, this is going to happen again," right?

20 And some of that is just bureaucracy, right? Like, that's why you have different

21 bureaucracies, and you need somebody standing up to be the one that says that, right?

22 But I do think that that's one of the risks here, when you're talking about -- that

23 perspective, the focus is very much on there's sort of the sanctity of the election, rather

24 than, like, okay, what happens if people just decide to blow up the whole thing?

25 Which, effectively, is what the efforts to delegitimize the election after the fact
115

1 were. They were efforts to say, "Look, we're not just going to delegitimize and try to get

2 a few people to vote in the wrong place. We're just going to say that everybody that

3 voted in Pennsylvania's vote doesn't count." Right? Which is a -- like, it's just a

4 different level of disruption that I think was, you know -- and we're going to physically

5 stop the Congress from certifying the election. Right? That is not, I think -- you know,

6 that's the kind of paranoia that, you know, in that physical, real-world-violence frame,

7 that's not where you're thinking of if you're coming from a traditional election protection

8 perspective, you know?

9 And so I think you need those two things together. And, you know, over time,

10 unfortunately, I worry that we're in a place where this kind of thing is going to be more

11 common, both at a Federal level and a State and local level.

12 Q I have just, like, three more questions for you, and then I know Mr. - has

13 a concluding question, and then I think we can wrap up. I want to be respectful of your

14 time. You've been very generous.

15 I know you haven't seen this document. It does have some interesting

16 recommendations, and I want to just quickly run them by you. And I'll group them

17 together and see if -- I'd like to know if, in your opinion, these would have helped limit the

18 kinds of narratives that build up into something that eventually manifests in offline

19 violence.

20 So one of the recommendations is: more proactive removal of platform entities;

21 better soft, targeted interventions -- better soft interventions and better targeted

22 interventions; and then a focus on the highest percentile harmful groups, networks, and

23 accounts. It notes that you can often segment harmful spaces and users on Facebook in

24 ways that make it clear that some actors are much worse than others, and that can help

25 isolate bad actors.


116

1 Are those types of interventions things that you think would be worthwhile

2 tradeoffs?

3 A If they're targeted effectively, I think they can help. Which is a big "if,"

4 right? That's a pretty strong conditional statement.

5 So I think, yeah, like, better proactive removal of bad actors is something we

6 struggled with over and over and over again at dangerous orgs. And sometimes we did

7 it well, and sometimes we were behind in chasing.

8 So I think -- yes, I think, you know, understanding and defining categories of actors

9 that are doing things so anathema to civic functioning and democracy, that maybe they

10 should be removed. I think it's something that should be considered.

11 I also think that's a really dangerous thing to do. Actor-based policies are

12 dangerous. They're right, they work, they're the way you get ahead of problems, but

13 they're dangerous. And we should just -- you've got to look that in the eye. It is.

14 Like, it's dangerous. Because you're saying, these folks have done something in the

15 world so much that we're just not going to allow them to use these products at all or

16 we're going to deliberately limit their access to certain features. And that precedent is a

17 dangerous one, because it can be abused.

18 And I think better soft interventions -- I think soft interventions are good. I think

19 they come with real tradeoffs, because what they do is they hit a wider range of things.

20 They're often contingent on AI, which means that they have some false positives, and

21 that can be problematic.

22 I think what's important about that is, you know, in those soft interventions,

23 everyone always thinks it's the algorithm, it's the algorithm, it's the algorithm.

24 Sometimes it's the algorithm. Sometimes it's people inviting thousands of people to

25 their group, right?


117

1 So, as long as we're understanding that soft interventions means more than just

2 the algorithm, which, frankly, I think gets more attention than it deserves as a driver of

3 problems, and what gets less attention is that there are those few people, like you noted,

4 that are doing, like, an outsized number of invitations, right?

5 What I worry about is everybody's public pressure on companies around the

6 algorithm distracts them from getting better at identifying those kind of nefarious actors,

7 and I would rather they spend more time on those nefarious actors.

8 And then that gets to the focus on the highest percentile accounts, which is sort of a

9 bucket that I think gets at that issue. And that, to me, is really it. Like, how do you

10 work backwards from those accounts that are really finding ways to make things happen?

11 Because they will abuse any platform. They will find the gaps. They will try again.

12 They will wait until they'll get disabled 10 times and then they'll create another account

13 and, you know, know the right path to go to avoid detection. They will do all of these

14 things. It's not just, like, they manifest on the platform and it just does exactly what

15 they want. It's adversarial.

16 And, I think, what I hope is that, you know, as folks think about what platforms

17 can do better, it's not just reiterating what I think has been an excessive focus on the

18 algorithm and the business model related to ads and more on how do you get better at

19 understanding the discrete networks that are doing harm and the discrete accounts and

20 networks and actors behind them doing harm.

21 Because I think -- and I think you guys asked good questions about those things.

22 You know, I didn't have great answers on some of them, because I didn't know the

23 answers. Those are things that platforms should get better at, is being able to do those

24 kind of things in real-time. And I think that is just as important, if not more so, than this

25 discussion that has gotten so much attention in the media around the algorithms, the
118

1 algorithms, the algorithms.

2 Q Do those actors, do those networks -- are they presented greater

3 opportunity in a highly polarized media ecosystem? And is there a way in which social

4 media contributes to that?

5 I mean, this is a "step back" kind of question, like, a sort of bigger-picture

6 question. But do you think those things --

7 A I think the more that our politics use violent, militarized language, the harder

8 it is to distinguish a real threat from an exhortative political claim.

9 I think that the unwillingness of political leadership to set reasonable boundaries

10 on what is acceptable political speech and what is not defers some of that responsibility

11 to social media companies in ways that we just should never want to give it to them.

12 Even the best-intentioned companies are not the people -- are not the institutions that

13 we want drawing those normative lines, if we can help it.

14 That doesn't excuse those companies, because when that responsibility falls to

15 them, they have a responsibility to do what they can. But our democracy would

16 function better if political leaders, civic leaders were drawing those lines in normative

17 ways, not in legal ones, more aggressively than they do.

18 And we've blown through some of those lines. And I think it is -- you know, we

19 can hope that social media will fill that gap, but it won't. It can't fill that gap. And, you

20 know, even if they act perfectly, they don't have the legitimacy. They're not

21 democratically elected. We don't want them setting those norms. We want our

22 political leaders setting those norms; we want our civic leaders setting those norms.

23 And, you know, whatever you think about Mark Zuckerberg, that's not why he

24 signed up to make Facebook, you know? And, you know, I don't agree with every

25 decision he's made, but, like, you know, that's not why he wanted to do it. It's his job
119

1 today, whether he likes it or not, but that's not why he signed up to do this.

2 And we shouldn't want to imbue that kind of authority in him or Elon Musk or

3 Sundar, right? None of these people. Those aren't the people that we want making

4 those decisions. We want our political and civic leaders making them. We want them

5 drawing those ethical lines. And I don't think they've done that nearly well enough.

6 Q I think that is a profoundly important point.

7 I do want to play off against it the revelations that in 2018 or 2019 political party

8 leaders did come to Facebook and say that the way that the platform had changed

9 political communications incentives had led directly to changes in party communications

10 online and party platforms in some cases.

11 Is there a way in which we should be thinking more about that, the incentives and

12 the communications environment that have changed since the advent of social media?

13 A Yeah, I do -- I mean, so I think you're speaking -- you know, I'm

14 remembering, I think it was one of the other Haugen docs, right, about a -- it was like a

15 Polish political party or something, right? Yeah. So, you know, I wasn't aware of that,

16 you know, when I was at Facebook.

17 But I do think, like, you know, obviously, not just political leaders but, you know,

18 businesses and everybody else is optimizing for eyeballs and engagement online, right?

19 They are doing search engine optimization. They are trying to game various algorithms

20 to get surfaced and get attention. And I think that is a noxious game.

21 I also think that any usable, scaled social media platform is going to have an

22 algorithm that ranks posts so that people can try to see things that are relevant to them,

23 right? An unranked algorithm on Facebook would show us, you know, I guess in reverse

24 chronological order, every post made on the platform. It'd be totally unusable. So

25 you're going to sort things by friends and the groups you're in. You're going to do some
120

1 of that, just inherently. Because that's what people are signing up for.

2 I do think that this notion that more, you know, noxious things get more

3 engagement and that that cycle leads to even more engagement is a dangerous one and

4 is a worrisome one. I don't claim to understand it fully, in part because, you know, I

5 think the data is sort of mixed. I wish that data was out there and more available so

6 that we could analyze it better.

7 But I also think that during my time period at Facebook they took real efforts to

8 try to improve those things, you know, and the content distribution guidelines are an

9 effort to try to do things like that.

10 So I don't really know what the, sort of, status quo today is versus 2018, 2019.

11 That was a long time ago. And, you know, I agree wholeheartedly that companies that

12 are doing these sorts of things ought to be more transparent around them. And I think

13 that, in the long run, will be good for them and will be good for society writ large.

14 I do think it's really important that those calls don't just focus on Facebook, that

15 they move across the social media ecosystem. Because I do think that there is a sense,

16 you know, that is not helpful for anyone that leadership, you know, and folks inside of

17 Facebook think that they get unfairly ganged up on.

18 And I think, you know, Facebook's made plenty of mistakes. It's the

19 biggest -- well, I guess TikTok's bigger now, but it is one of the largest. It deserves a hell

20 of a lot of scrutiny. At the same time, it is a wounded whale that everybody wants to

21 get in a fight with because it benefits them politically. And that focus has led to, I think,

22 an intransigent view inside of the company that isn't, you know, totally wrong that they

23 get picked on.

24 And with the solution here for society -- if our goal is to improve society, we need

25 to make sure that these standards apply across the board, that everybody is living up to
121

1 them, and including Facebook. But narrowly focusing on that platform I think is a

2 mistake, because these problems exist all over the place, and, you know, none of these

3 platforms are forever.

4 Mr. - Mr. Fishman, I want to make sure you don't have a hard stop at

5 4:00 or that there's no reason we couldn't go 10 or so minutes over. I know Mr. -

6 has a final question, and if we do need to stop, I want to give him the chance to ask it.

7 I also want to make sure the court reporters are okay if we stay.

8 Mr. Fishman. I can go until 10 after, but I should stop then, because I've got to

9 get ready for another call at my 1:30.

10 Mr. - That is fine with me. And as long as there's no reason we can't do

11 that, maybe we'll use even less time than that.

12 BY MR. -

13 Q I wanted to go way back. There was something I just wanted to clear up

14 from your earlier answers.

15 When you said that you were advocating for a more robust response to the Stop the

16 Steal groups after the election in part because of the overlap with militias, but it seemed

17 like no comprehensive study had been done about those groups and their membership.

18 I wondered, I just wanted to know if there was any intelligence you were acting on

19 that suggested that. You said you didn't have a crystal-clear picture, but, like, what

20 informed your view?

21 A Well, you know, I don't remember all of the details. What I recall is

22 understanding that a number of those groups had significant overlap. And I think we

23 saw this with the original group, that some of those groups had significant overlap with

24 groups that had been -- and, you know, by "significant," I'm not talking 80 percent; I'm

25 talking, you know, a much smaller percentage -- but that there was overlap with some of
122

1 those militia groups, that there was a high prevalence of hate speech and violence and

2 incitement, sort of, violations inside the group, and that they were manifesting in ways

3 very similar in intent, apparent intent, as the Stop the Steal group that did come down.

4 So, you know, all of those things in aggregate, in my mind, represented sort of

5 frameworks that could be removed under the coordinating harm policy.

6 But these were things that were operating in sort of gray areas. And I thought

7 we should be more aggressive against some of the groups anyway, where a lot of the

8 harm was happening, just generally, a lot of harm was happening in the group. I think

9 that there was responsibility on the part of the group owner and administrator

10 themselves, and I thought we could be more aggressive against some of those kinds of

11 entities.

12 But I do want to -- like, I said it before, and I'll say it again, that I do not think there

13 was an action that Facebook could've taken that would've demonstrably changed the

14 presence of that many angry people on the Mall on January 6th. I think it was going to

15 happen. And I think it was going to happen because of the organization off of Facebook,

16 and I think it was going to happen because of the, sort of, pushing of it by the President

17 and by big elements of conservative media.

18 And, you know, I don't even want to say everybody, because I think there were

19 plenty of responsible folks and there were plenty of conservatives and Republicans that

20 were upset because they lost an election but took it like good citizens, which we all do

21 sometimes.

22 Q Okay. Thank you so much.

23 Mr. - I'm going to cede the rest of my time to Mr. • just so he can get

24 his final questions in and so you can get out of here in time for your next call.

25 BY MR. -
123

1 Q Thank you, Mr. Fishman.

2 I just have one more question, and it's admittedly broad, but are you concerned

3 about the potential for violence in the 2024 election cycle, given what we saw and how

4 you saw these groups mobilizing online?

5 A Yes. I am. I'm concerned about 2024. I'm also concerned about 2022.

6 And, you know, I think that, unfortunately, in the United States, we have now

7 crossed into a time period in which political violence has been normalized for a significant

8 element of the electorate, and there are people out there that want to instrumentalize

9 that political violence to achieve politically and strategically.

10 And that's why the connection between, you know, sort of, delaying the

11 congressional certification and having new slates of electors is so concerning. You guys

12 probably understand if there's a link between those things better than I do. But I think

13 the broad, sort of, facilitation of that and, sort of, entrance of that into our political

14 bloodstream suggests to me that it is going to get played out when we are certifying

15 congressional elections and State senate and assembly elections and possibly again when

16 we elect a new President next time.

17 So I am very concerned about it. And I think that those threats are -- you know,

18 we're rightly going to look at how are elections counted, you know, what are we doing to

19 support secretaries of state as they try to do this responsibly.

20 But what we've seen is that there is a component of the electorate that is going to

21 use whatever means they have at their disposal -- which, Facebook is one of those tools.

22 Twitter is one of those tools. Telegram, WhatsApp are some of those tools. Bespoke

23 websites like TheDonald.win are just a tool. All of these things are just tools. And

24 they're going to use them to incite and inspire folks to disrupt in ways that they think are

25 going to give them some sort of political advantage.


124

1 And so I think it's incumbent upon Americans not to allow themselves to be

2 abused and manipulated in that way. It's also really incumbent upon political leaders,

3 whether in Congress, the White House, or at State levels, local levels, to stand up and say

4 that it's unacceptable. It's important for business leaders, like those at social media

5 companies and elsewhere, to say that it's unacceptable; civic leaders to stand up and say

6 that it's unacceptable.

7 Because once it's gone, it's gone. And I really do worry that we've crossed a line

8 that's going to be very hard to get back across. It's going to take some time. And it's

9 going to take demonstration of more political leadership than I see right now.

10 Mr. - Well, thank you.

11 I think on that note we'll call it a day, unless you have anything more you wanted

12 to add. But you've been extremely generous with your time, and thank you again for

13 coming to speak with us. And it's been really helpful for our context.

14 Mr. Fishman. Great. Thanks very much.

15 Mr. - Thanks. And we'll --

16 Mr. Fishman. Good luck with the hard work. I'm sure you guys have a lot of

17 other things going on.

18 Mr. - Thank you.

19 Mr. - Thank you, Brian.

20 And I think we have to close the record. Can we --

21 Mr. - Yeah, we --

22 Mr. - At 4:05?

23 Mr. - We'll go off the record at 4:05.

24 [Whereupon, at 4:05 p.m., the interview was concluded.]


125

1 Certificate of Deponent/Interviewee

4 I have read the foregoing _ _ pages, which contain the correct transcript of the

5 answers made by me to the questions therein recorded.

10 Witness Name

11

12

13

14 Date

15
