Brian Fishman Transcript
WASHINGTON, D.C.
20 The interview in the above matter was held via Webex, commencing at 11:04 a.m.
2 Appearances:
9 STAFF ASSOCIATE
11 INVESTIGATIVE COUNSEL
12 INVESTIGATIVE ANALYST
13 COUNSEL
14 INVESTIGATIVE COUNSEL
4 conducted by the House Select Committee to Investigate the January 6th Attack on the
6 At this time, I'd like to ask the witness to please state your full name and spell it
10 This will be a staff-led interview, and members of course may choose to come in
11 and ask questions, although I don't currently see any members in the Webex. If anyone
14 Mainly, it will be myself and -- but there may be a few other counsels
15 coming in to ask questions. If you don't understand a question, please just ask one of us
16 to repeat it.
17 I will note that the reporters can only record verbal responses, so it's very
18 important to get a "yes" or a "no" answer. And we'll try to prompt you in case you nod
19 or shake your head just to repeat the answer so the record can reflect it.
23 So, again, Mr. Fishman, you are here voluntarily for a transcribed interview.
24 There is an official reporter transcribing the record of the interview. Please wait until
25 each question is completed before you begin your response, because cross-talk is very
1 hard to have the reporters capture, and we'll try to wait for your response to be finished
2 before we ask our next question. And, again, the stenographers cannot record
3 nonverbal responses.
4 We ask that you provide complete answers based on your best recollection, and if
5 any question isn't clear, you can just ask for a clarification. And if you don't know the
7 It's important that you understand this interview is voluntary. If at any point
9 Similarly, if at any point you need to discuss something -- well, I see you don't
10 have an attorney here at present, but if you do have an attorney coming up at any point
11 in the interview, you can take a break and confer with them. And if you also need to
12 take a break to have a comfort break or collect your thoughts, that's also totally fine.
13 This interview is not under oath, but because it's the formal select committee
14 investigation, you are obligated under Federal law to tell the truth, the same as if you
15 were speaking to the FBI or DOJ. And this is something that we tell all of our witnesses
16 who are in transcribed interviews. It's unlawful to deliberately provide false information
17 to Congress. And for this interview, providing false information could result in criminal
19 Do you understand?
21 - You are not obligated to keep the fact of this interview and what we
22 discuss confidential. You are free to tell whomever you wish to discuss it with that we
23 met and you can share what we discuss today. We, however, will not share what was
24 discussed today and we'll keep that confidential. It's completely up to you.
25 And, again, please let us know if you need any breaks or would like to discuss
1 anything throughout the interview. We can go off the record and have that
2 conversation.
3 And just as a final point, if at any point you don't understand where the questions
4 are going or you don't understand the question, just ask us to repeat it, and we'll be
5 happy to do so.
6 All right. Well, with that, I think I will hand it over to -- for some
7 preliminary opening questions, and we'll be switching back and forth throughout the
9 Thanks, --.
11 EXAMINATION
12 BY
13 Q Just for the record, I have a few very basic identifying questions for you. So
16 Q Okay.
17 And tell me a bit about your educational background. Where did you earn your
18 bachelor's degree?
19 A UCLA.
22 Public Affairs.
23 Q Okay. Excellent.
24 And as for your work history, what is your current employment and your most
2 fully integrated stack of tools for companies to use for trust and safety purposes.
3 Q Okay.
5 Q Great.
6 And that takes me right into my next line of questioning, which is, I want to talk a
8 What was your title while you were at Facebook? Or did it change?
9 A I think it was never formal. I think it was probably the -- you know, names,
10 titles at tech companies are vague and elusive things sometimes. I was a director of
11 public policy, is probably the most formal of them, but I was also the head of what we
15 A Yeah, that's a good question. So the core role was within what was called
16 the product policy team. The product policy team at Facebook develops internal policy
17 for what can be done on the platform and the various platforms, ideally -- so Facebook,
19 But the way that I sort of executed that role was often serving more as a
20 coordinator among the various elements of the company in order to execute -- both to
22 So much of what I did on a day-to-day basis was bring together the operations
23 teams, the appropriate engineering teams, the legal teams, with the policy team in order
24 to try to better understand the problems and then construct policies and then execute
3 Q And you said that this organization sat within the product policy team. Did
5 A Yes. I did.
6 And I should, you know, just for the sake of completeness -- you guys are
7 Congress -- when I was originally hired, I reported to Monika and Alex Stamos before he
8 left, but that was a very sort of brief time period, and for the most part I was reporting to
9 Monika.
10 We did interact with the public policy team, and there were times when I would,
11 you know, go talk to Members with public policy counterparts and, sort of, corresponding
12 teams and efforts around the world, go talk to nonprofit groups and civil society, et
13 cetera.
14 You know, a key part of the role was to be sort of an in-house expert on how
16 Q You also mentioned engineering operations, legal. Are there other teams
18 A I mean, certainly public policy team, engineering ops, legal. You know,
19 occasionally the research teams. Those are the -- those would be the core ones though.
20 I mean, there were always occasional reasons to talk to, you know, some of the sales
21 teams or something if something strange came up, but usually that was more out of the
22 ordinary.
23 Q And around the period of the 2020 election and the lead-up to January 6,
24 2021, did you interact with the civic integrity team at Facebook?
25 A Yes. The -- and we did work with the civic integrity team to some extent,
2 But there were processes in the run-up to 2020 and afterwards that I'm sure you
3 guys have seen. I think you guys have some of the -- some of the exhibits mention the
4 IPOC, which was, sort of, Facebook's mode for coordinating across the company on key
5 issues. And we would participate in those processes, but usually in that sort of space.
6 Q And how did your work intersect with -- were there specific topics where
7 your work intersected most closely with the civic integrity team where you would
9 A Well, you know, the work of my team and the work of the, sort of,
10 dangerous organizations, quote/unquote, "XFN" -- "XFN" is the term that Facebook uses
11 to talk about, sort of, cross-functional groups, right? So, you know, in U.S. Government
13 And we were very much focused on violence, real-world violence, and groups,
14 organizations that were planning, contributing to, engaged in real-world violence, and
15 trying to do whatever we could to keep those groups from using Facebook. And so that
17 The civic integrity team was focused on a broader set of issues related to election
19 And where that would come together was in, sort of, when either you'd get an
20 effort by known hate groups or terrorist groups or something around the world that
22 places, or you would get real threats of election-related violence where some of these
24 But it's important to understand that there are a bunch of other policies that also
25 touched on violence that we did not control as directly. So policies against violence and
1 incitement, policies against hate speech, those were -- my group had some influence on
2 those, but we did not control them as directly as we did the "dangerous organizations"
5 What about other integrity teams at Facebook? I understand there were also,
6 for example, like, a core product integrity team. Civic integrity was just one integrity
7 team. Were there other integrity teams that you interacted with?
9 honest, you know, internally, in a place like Facebook, sometimes those distinctions are
11 But, certainly, right, the core integrity team is building basic tools and basic
12 systems that other teams would plug into and utilize. And so, in order to understand
13 what those capabilities were and to try to influence the roadmapping processes -- which
14 means the process by which they prioritize and build certain capabilities -- you would plug
15 in and sort of hope that your voice was heard in those sessions.
16 Q And when the civic integrity team was restructured in December of 2020,
18 A Honestly, I didn't have much of a reaction. The civic -- and it was widely
19 understood internally at Facebook that the civic integrity team was not always getting
20 along with other similar teams. But we didn't -- my group, we didn't have enough of,
21 sort of, an engagement to have a real strong view about it, to be totally honest.
22 You know, I think it was not terribly surprising, in understanding how Facebook
23 works, if you've got, sort of, elements where there's sort of, you know, enough rumor
24 spilling around the company that there is some sort of rub on these kinds of -- you know,
25 between teams. Like, that's -- tech companies love to reorg, and Facebook's no
1 different.
2 So I think it wasn't terribly surprising, but I didn't have a strong -- I honestly didn't
3 have a strong view about whether it was a good idea or a bad idea.
4 Q And from your vantage point, what were those points of strife between civic
5 integrity and other teams? With whom, and over what issues?
7 limited. But I think that my general impression was that the civic integrity team wanted
9 recommendation systems, some of the others, and that they had sort of rubbed partner
10 organizations the wrong way because they didn't always bring those folks in, I think
12 And product organizations -- the civic integrity team, the great advantage that
13 they had and the great advantage that any product team has at a technology company is
14 they've got the engineers built in. And so, you know, everybody at a tech company
15 orbits around the engineers at the end of the day, because that's how you actually get
17 And so I think that was a great advantage that they had. That's a really good
18 thing in some ways. But the disadvantage is that, when you've got that internal
19 inherent capability within your group, you don't have an incentive to go out and
20 collaborate, you know? And I think that there was the impression among other teams
22 But, honestly, that's -- you know, I'm relaying, effectively, hearsay when I say that,
23 because that was not my experience. I just didn't have that much of it.
25 complaints from?
1 A I don't know. It's been 18 months. The -- I don't know. That was a
2 grumble that I heard. Like, I really don't -- I'm not -- I wouldn't want to throw out a
5 A Yeah.
7 points?
8 - Yes. Thanks,
11 BY-:
12 Q And, Mr. Fishman, that was all, I think, extremely helpful context.
13 And I wanted to follow up on one point that you raised earlier about your role as a
15 and I was wondering if you could expand on that. Was that more of a coordinator role
16 for discrete crisis events involving violent organizations or sort of a prospective policy
17 coordinator?
19 When I first arrived at Facebook, there was a team called -- what's the operations
20 team? It's amazing how quickly you forget some of these things. There was a discrete
21 team within the operations organization that would respond to crises. Because there's
22 constantly a crisis. On Facebook, with the global scope, there's constantly some disaster
23 happening that needs to be investigated and understood, and there was an operations
25 There was no real cross-organization crisis response team. And so, when I first
2 This was a bad system. It was not a good way to do things, because we were
4 subject-matter experts. And, unfortunately, on a global scale, there's very often some
5 terrible thing happening somewhere in the world, and because of Facebook's scope,
6 there was often some connection to Facebook, whether it was planned on Facebook or
7 there was just, you know, video or one of the attackers had an account and we need to
8 understand, you know, whether or not there was something happened on Facebook, and
10 And so that gets very distracting, and so the team built a -- so the policy team built
11 something called the strategic response team. The strategic response team, policy
12 team, then became sort of the central node for coordinating crises, for lack of a better
13 term. And a lot of that is trying to ensure that all of these different organizations are
16 I don't think that Facebook, during my time period, ever got really, really good at
17 that, but they got way better during the, sort of, 5-plus years that I was there, and that's
20 longer-term projects and coordination in particular with the legal side on investigations
21 that were more sensitive, where not everything would be sent up in an email or to a very
22 select group.
23 So the other thing, though, that I want to make sure that I'm saying to you guys,
24 because I think it's important and it shapes -- I hope it will help shape your understanding
25 of the way that these policies work -- is, there's a framework that social media policies
2 And I see -- shaking his head and nodding his head. You've heard this before.
3 But what's really important about the policies that I ran is that they were
4 actor-based policies. They were built on a determination that specific actors had
5 engaged in behavior, either on platform or off platform, such that they were so high-risk
7 And, in many cases, as in the case of, like, an ISIS or the KKK or, you know, some
9 praised on the platform. You couldn't say nice things about them. This is an incredibly
10 blunt policy.
11 Most of these other policies that we're going to talk about -- not all of them, but
12 most of the other ones that we're going to talk about are content-based policies.
13 They're policies that are enforced at the level of the content that is posted on the
16 way, it's very specific, it applies to anybody, but it's fundamentally reactive, because it
17 can only be applied after the content has been posted on the platform, after it has been
19 lot of ways, because you're basically saying this group, this individual, you know,
20 whatever kind of entity it is, may not be here. But it allows you to get ahead of
21 problems.
22 And so there is a real trade-off between these actor- and content-based policies.
23 And I think when you guys think about hate speech policies or violence and incitement
24 policies or misinformation policies, for the most part, those are all content-based policies,
25 whereas the dangerous organizations policy, for the most part, is an actor-based policy.
1 And that distinction is a really important one, and it shapes not just for your
2 understanding of Facebook on January 6th, but I think it's an important distinction for
3 thinking about how these policies work more generally, because they have pros and cons.
4 And one of the biggest is that a content-based policy can be a lot more focused, a lot
6 fundamentally.
8 If you could expand on one point you made. You said that the violent
9 organizations policy was mainly an actor-based policy. Can you kind of explain how it
10 can be both?
12 various actors as falling under the policy, right? And so that's what makes it
14 The reason why I say both is because of that prohibition against praise of those
15 actors, which I think kind of bridges that gap a little bit between content and -- you know,
16 because that means you can't go -- you're not a member of ISIS, I assume, -- but you
17 can't go on Facebook and praise ISIS, right? So, you know, your content, if you were to
18 do that, would be removed on the basis of what you had done there, the content itself,
19 not on the basis of you and your affiliations in the real world. That's what I mean.
21 Q Yes, definitely.
22 And I want to let -- follow up on this, but I had one question on your role in
23 crisis situations.
24 So, by the time we get around to the beginning of 2020, I assume the strategic
1 A Yeah, the strategic response team is in full swing. But, by this point, there's
2 also this IPOC process, which is sort of a standing coordinating mechanism. I mean, I
3 forget when they took the IPOC off. You guys probably can get that from Facebook. I
4 don't remember. The -- before 2020. But, I mean, it really launched, you know, a
6 And the idea was -- the idea of the IPOC was, sort of, you know, standing levels of
7 operations teams that are kind of there, ready to go, to solve problems in real-time and
9 And so that IPOC process was sort of a more formalized and structured -- you
11 you've got people from all of these different organizations coming together, doing
12 stand-ups -- you know, in its most intense form, doing stand-ups a couple times a day,
13 saying, here's where we are with this problem, here's where we are with this problem,
15 Q So, within that structure, what was your role during crises, as head of
16 dangerous orgs?
17 A Yeah. So, as -- you know, within the -- I mean, it depends on the crisis,
18 right? So there were IPOCs related to, like, COVID misinformation and things like that in
20 But in the run-up to 2020, especially towards the end, you know, I would jump in
21 and sit in through stand-ups when I could, and some on my team would do that as well,
22 so that we made sure that we understood what the rest of the company was doing, and if
24 - We'll get into more of that later, but first I wanted to pause to see if
25 -- had any questions on these points before we move into some of the more
1 nitty-gritty.
4 Q The strategic response team, was that the team led by Molly Cutler?
5 A No. Well, so, yes. Yes and no. So there's two strategic response teams
6 at Facebook, which is worthy of satire. So one is the strategic response policy team that
8 And then Molly built another one, another strategic response team, that was
9 focused on sort of larger-picture, you know, kind of higher strategy in the sense of, like,
10 bigger-picture company initiatives, whereas the strategic response policy team was more
12 And they did other initiatives as well. When, sort of, you know, key geopolitical
13 questions and things like that would come up and it wasn't totally clear who the owner
14 would be, they were really a useful, kind of, owner to coordinate things like that.
16 A Yeah.
19 BY-:
22 So, if, in the first instance, you could walk us through the violent organizations
23 policy at Facebook during your tenure and how it evolved throughout your time.
24 A Sure.
25 So I think there are -- you know, if you look at the policy today on the website, on
1 the, sort of, community standards website, there are three tiers. When I first got there,
2 it was really that first tier, which is terrorist groups, hate orgs, large-scale criminal groups.
3 And the policy there basically says you may not praise, support, or represent any
5 Facebook for any reason, to talk about the weather, to talk about your dogs, to talk about
6 what you had for lunch yesterday, whatever it is. Like, you're not allowed.
7 And the policy is really far-reaching in the sense that it also disallows praise. This
9 Now, one of the reasons it's so blunt, part of it is that Facebook made the
10 determination that it didn't want to allow any sort of fostering of these kinds of groups on
11 the platform, to the extent that it could. But, also, it's really hard to distinguish at scale
13 That may matter in the sense that, for some of these groups, there may be -- and I
14 think it's vague, frankly, but there may be legal obligations, because of sanctions law, to
16 And yet, sort of, neatly distinguishing between praise and support, where that
17 legal obligation may come into play and where it may just be somebody saying, "I like
18 Hezbollah," or whatever it is, this is a very difficult thing to do at scale. Like, the scale of
19 social media platforms leads to blunt policy, because highly nuanced policy is very difficult
20 to apply at scale.
21 And so this actor-based policy, that's what it was. There were processes
22 internally for identifying different groups. Going through a designation process is not so
23 different than what the State Department or the Treasury Department does, I think.
25 I'm sure you guys have seen, you know, the leaked version of that list that came
1 out, that The Intercept got their hands on. Be a little bit careful with that. It's not
2 as -- I don't think it's totally comprehensive, but it gives you the idea.
3 So that's one.
4 Then there was this point where there were a number of actors that were pretty
5 problematic, they were doing things that were pretty hateful on the platform, they
6 qualified as hate actors under the policy, but they were also really part of a political
7 debate. And so there was an effort to try to identify some of those folks, remove them
9 And so this is folks like Alex Jones, Laura Loomer, people like that, where there is a
10 social, there is a civic discussion about these people. The decision was made not to
11 allow them onto the platform. And I actually was on paternity leave when the Alex
12 Jones decision was made, so I can't speak to that one specifically, but I think it's the
13 important one.
14 But the effort there was to try to take this very blunt instrument of a policy, which
15 is the dangerous organizations policy, and give it a little bit more nuance, right? To say,
16 look, these are people that we don't want to allow on the platform, they're doing all sorts
17 of nasty stuff, they're being very careful in some cases to try to walk right inside of the
18 policy lines, but they're doing really nasty things off the platform, and so we don't have a
19 lot of confidence that they will not. And so the effort was made to designate those
20 folks. They aren't allowed to use the platform, but supporters may praise them, et
22 And then there is, sort of, the last effort, which was during 2020, and this was an
23 effort to do two things. One was to define two new categories -- really to define two
24 new categories of entities that could be designated under the broader dangerous
25 organizations policy.
1 One was militarized social movements. And militarized social movements have
2 two components. One was, sort of, organized militias. The second was groups that
3 were organizing violence in the midst of protests. And so, you know, that wound up
4 catching groups like the Oath Keepers, for example, in addition to a bunch of other, sort
5 of, militia groups in the United States and some, you know, even less specific kind of
6 organizations that were advocating certain kinds of, like, tactics in the midst of protests,
8 The important thing to understand about this process was that those policies did
9 not remove every individual member of those organizations. They were designed to
10 disrupt the ability of such organizations to organize on Facebook. They were not
13 the KKK or even the Proud Boys, which was designated years earlier as a hate org, or ISIS,
14 for example.
15 And so one of the reasons there is that many of those militia groups -- it was clear
16 that the threats of violence were increasing as the summer of 2020 went on, but it was
17 also clear that many members of these groups had never participated in acts of political
18 or social violence. And we did not want these groups to be able to operate, to organize
19 on the platform; at the same time, we didn't want to wipe a bunch of folks off that, in
20 many cases, we were not aware that they had broken a law, they individually hadn't
21 threatened violence, they didn't think -- in some cases, they didn't think of their
22 organizations as preparing for violence. And so we built a new category that was slightly
24 The other piece that was established in the summer of 2020 was the so-called
1 policy was ultimately what allowed Facebook to address QAnon more substantively than
2 it had in the past. And this was similar. It didn't endeavor to take down every member
3 of QAnon, everybody that said something nice about QAnon. It was focused on
5 And it required a new standard for designating these kinds of entities, because,
6 you know, QAnon was particularly difficult because it's not a formal organization, and so
7 there were a lot of efforts to try to understand, you know, how the -- I almost said
8 "group," but I think it's explicitly not a group -- but how the movement operates and
10 Now, the thing that I think is important when you think about these policies that
11 were instituted in the, you know, sort of -- when? -- August 2020 is that it's not that those
12 were the first time that Facebook had taken action against these movements. That was
13 the first time the dangerous organizations team and an actor-based approach had been
15 Violence and incitement policies, hate speech policies, all of those other kinds of
16 coordinated harm policies, all of those other policies applied to them previously. The
17 challenge was misinformation policies in the case of QAnon. And that's how Facebook
18 attempted to deal with those groups to that point. The problem is that those policies,
19 because they are content-based, were fundamentally reactive, and they were not having
20 a significant impact.
21 And so, when push comes to shove, you know -- and I'm very proud of that team
22 of people -- the team that gets things done at Facebook is the dangerous organizations
23 team. And when threats really get serious, that's who you would call.
25 I think I'll take the general platform policy areas in the order that you first brought
1 them up.
2 So, first, I guess, my question is, how did you go about choosing which groups to
4 A Well, I mean, you don't -- you set up a set of criteria, and then you apply that
5 criteria. And that criteria is all based around violence, you know. So, you know, you
6 can look at Facebook's definition of terrorism. I don't remember it exactly off the top of
7 my head, but it's a pretty standard, academic-y definition of terrorism, for example.
8 And you break that down into a series of discrete signals. You assess organizations
9 based on those signals, and if they meet the signals, you designate. And same thing
11 The hard part is, there are so many entities of these in the world, right? Like,
12 Facebook has designated far more organizations than the U.S. Government, far more
13 White supremacist organizations than all of the Western governments combined. And
14 so there is a dynamic here where it is -- you know, that effort moves really -- you know, in
15 governmental terms, it moves pretty fast. I think in, like, civil society terms, I think, you
17 But that process is designed to try to keep people's personal views out of it. It's
18 designed to identify a set of criteria, assess those groups as honestly and rigorously as
19 possible -- which is sometimes frustrating, because there are times when, you know,
20 you're pretty sure a group, you know, falls into this category but you don't actually have
21 the evidentiary basis for it, and so you can't make a decision on those grounds.
22 And I don't think that those, sort of, evidentiary standards -- those evidentiary
23 standards that Facebook uses are not as strict as what you'd have in a Federal courtroom,
24 but they are real, because those decisions are real and they matter. And so I do think
3 Q Yes.
4 A Not the entire time. No, not the entire time. Like, that responsibility, we
5 fully took control of that in 2019? Yeah. Before that, it was a group called the organic
7 Q So, from 2019 onward, if it started with you, where would it go after your
10 nominate and prioritize groups to look at, generally based on, you know, what we
12 Because ultimately what you're doing here -- the thing that's different about the
14 possibility of real-world violence, not what is the nasty thing somebody's going to do on
15 the internet.
16 And so, when you're prioritizing, you're not prioritizing -- I mean, like, of course
17 you're interested in, like, do they have a big presence on Facebook, right? Like, that
18 raises the, sort of, prioritization as well. But you're also really looking at, like, am I really
19 afraid that this group is going to be engaged in violence soon? Right? Like, do we
20 need to go after these guys soon because we're really worried about the possibility that
21 they may leverage Facebook for violence in the real world? And, you know, I'm not
22 going to suggest that's a perfect process. It's not. There are some judgment calls
24 But, yes, from some point in 2019 forward, we ran that process. But we still
25 included a number of other, sort of, decision-making bodies from across the organization.
2 A Yeah.
3 Q -- first bucket of policy: Once you had taken action against a group, how
5 A Yeah. It's hard. And I don't think we ever fully did a great job of this
7 Q Okay.
8 A We certainly had a sense overall of -- and, you know, you can look at the
11 And that's one of the -- you know, I mean, I think -- like, those were the kinds of
12 things that the data science and the central integrity team would put together, are these
13 high-level, really important sort of measurement efforts, but they don't give you the
14 granular information that you really want. You know, oftentimes, in more ad-hoc ways
15 we would try to understand, you know, how frustrated were these groups after we had
16 removed them, you know, could they come back, how evasive were they having to be
18 Q Okay.
20 Q So that's helpful.
21 And I guess I'd like to understand better, how did you identify signals that groups
24 One of the -- you know, when I say that the dangerous organizations team was
25 looking at these kinds of entities, one of the fundamental insights -- right? So, when I
1 got to Facebook, the original mission was to deal with ISIS, which was still very much in its
3 The thing about ISIS is that -- and what people misunderstand, I think, a lot of the
4 time when they talk about Facebook in particular is, they think this is where organizations
5 organize specifically. And that's not always true. It's certainly not the core
6 organizational forum for groups like ISIS and now for many of the White supremacist
7 groups. Oftentimes that's Telegram. And certainly for ISIS it was Telegram.
8 And so, sort of, our key assumption within the dangerous organizations realm was,
9 if you want to understand the threat to Facebook, you have to look off of Facebook,
11 downstream, right? So, if you see something on Telegram, it may not be on Facebook
14 actual full-scope threat environment in order to understand the risk that these groups
15 pose. And that's the approach that we took within the dangerous organizations world.
16 That's the approach that I think some of the other teams that are focused on, like, state
18 But that is not a standard way of thinking for most kinds of integrity teams at
19 Facebook or elsewhere in industry. That's a pretty unique way of thinking about these
20 problems. And we really sort of, you know, I think, helped to really think about that.
21 But I think that is a -- that notion of looking elsewhere to understand what the
22 threat is going to be, not just what it is right now, so you can try to get ahead of things,
23 rather than be reactive, was a really key insight for us and informed not just the, like,
24 operations of the team but, you know, the ethos of the team.
25 And so we would look at -- we had vendors that were looking at these things and
1 reporting stuff in. And we were, you know -- like, we didn't do a lot of direct collection
2 ourselves, you know, which hurt my old academic soul, you know, who used to chase
3 jihadists around the internet. But I think, you know, we found ways to do it, and it really
4 informed the way that we would prioritize and think about these problems.
6 And I wonder what other platforms you were looking at. You mentioned
7 Telegram. What other platforms were you looking at to mine threats as they were
10 know -- you know, I know you guys are actually focused on the run-up to January 6th.
11 Vendors were providing information about a wide range of other platforms -- you know,
12 Telegram, Parler, Gab, stuff that was happening on Twitter, you know, things that were
13 getting posted on YouTube, et cetera, and, you know, TheDonald.win, et cetera. And
14 those were all things that we were aware of and really concerned about.
17 Q And in terms of TheDonald.win, did you trace the rise of -- were you actively
19 A No, I wouldn't go that far. I mean, I think we were -- you know, I'm listing
20 off those various platforms, but really what I'm doing is sort of trying to give you a
21 representative sample of -- what I'm getting to is a point that I want to make to you guys
22 and that I think you'll -- we would get around to the question, so I'll just say it to you,
23 which is: When I became really concerned about the prospect of violence on
24 January 6th -- you know, I think I probably said this to the committee when I worked for
25 Facebook, but I'll say it to you again, because it was true then, it's true now -- it was
1 mostly because of content that was not on Facebook, and it was because of the content
4 perspective, we had come to assume that if we started to see it on other places that it
5 posed a direct threat to our platform. And that's important when you think
6 about -- that is an important difference in perspective than when you think about, like,
7 the kinds of metrics, really valid, important, useful metrics, that were used to understand,
8 when do you turn off the break-the-glass measures? Right? Because there you've got
9 really thoughtful -- dealing with very difficult tradeoffs, but basing those decisions
10 primarily on what's happening on Facebook. And, from my perspective, you can't just
12 And so that's the point that I think is really important here. But it's not an easy
13 one, and it's a really tricky one and a difficult one. But I think that that is where -- you
14 know, that history and that framework of assessing threats based on wherever you can
15 get information about them, that's how the dangerous organizations team approached
16 this. That's how I certainly approached these kinds of things. That is not a standard
17 way of approaching these things. But I think it gives you a better perspective and a
19 Q Thank you.
21 A Yeah.
22 Q So when did you see this content off site that was getting you concerned
24 A I think this -- well, the story of the run-up to January 6th, both I think in the
25 real world but also on Facebook, starts with the Virginia Civil Defense League's Lobby Day
1 on January 19th, 20th, 21st, somewhere around there, in 2020 in Richmond, Virginia.
2 That was a day when we were really concerned about the possibility of violence.
3 A bunch of the Boogaloo groups were chattering really nastily. You know, it's a gun
4 rights lobbying day at the Virginia statehouse. And we went on high alert, because we
5 were really concerned about it, mostly because of some of the Boogaloo movements.
6 And nothing wound up happening, though the FBI wound up arresting a couple of
7 guys from a White supremacist group called the Base that allegedly were planning to go
8 to this event and fire off some shots in order to start a shoot-out between police and, you
10 That, to me, was the sort of hallmark. And we thought about that a lot, going
11 through the year of 2020. We thought about that incident quite a bit. So it started
12 then, of, like, hey, what does a really angry crowd that's potentially armed and what do
13 small groups of people that want to incite that crowd, what kind of threat does that
14 pose?
15 At the same time, you're balancing that against the notion that people -- this is the
16 United States of America. Free speech matters. People have the right to lobby their
18 and lobby on behalf of gun rights. And the, sort of, social responsibility of a tech
19 company is to empower speech, civil speech. These are really difficult, kind of, balances
20 to strike.
21 But I think that that -- you know, that certainly shaped my thinking. It was the
22 beginning of how I was really thinking about 2020. That was the kickoff. And it was
23 almost -- you know, I mean, I could google it; I don't remember the exact date, but it was
24 almost a year exactly from the ultimate inauguration of Joe Biden. And so I think that
1 Moving forward, I think we were tracking very closely some of the efforts around
2 the election, specifically where there was an incident with some folks that had gone to
3 Philadelphia that posed a potential violent threat. The night of the election, we
4 understood the range of different arrests of various figures up and to that point.
5 Obviously, we were aware of the, sort of, Boogaloo-related arrests and others earlier in
7 And then the violence in D.C. on December 12th, you know, was nothing like what
8 happened on January 6th, but it was certainly something that we were aware of. And
9 we'd been tracking, you know, the Proud Boys in particular because of a range of violence
10 in Portland and the Pacific Northwest through that year. So we knew that this was, you
11 know, a -- anytime where there are real calls for lots of these folks to get together was
12 sort of concerning.
13 But we also knew that, when it came to the most acute of the groups that were
14 gathering on January 6th, we had policies in place that were causing them a lot of trouble,
15 right? We were constantly taking down Proud Boys. They were mad. The Oath
16 Keepers, some of the individuals were still there, but they were not using Facebook as a
18 And so, you know, there was some comfort in that, especially from the dangerous
19 organizations' perspective, where we're looking at the worst, you know, the core
20 organizations that may cause violence, not the more, sort of, general
22 But immediately after the first of January, when it became very clear that the
23 rhetoric had changed across the web, in my mind, several of us coordinated and said,
25 And so, you know, I called several meetings with the team outside of the IPOC
1 process to really make sure that we had everything buttoned up, that we were
2 coordinated with law enforcement, that we were having those kinds of conversations,
3 because we wanted to be ready. We didn't know; you never know. We'd sort of gone
4 into overdrive for the Lobby Day a year prior, and then nothing happened. But we
5 wanted to be ready.
7 When you convened the meeting outside of IPOC, that was early January 2021
8 when you thought the violence potential for January 6th specifically was worth
9 considering?
10 A Yeah, so we did two things. One is, we called sort of a separate meeting,
11 mostly with the legal team, to make sure that we were coordinated with the law
12 enforcement. And --
14 A Oh, I don't remember. It was a small group. It was mostly me and the
15 legal team. It was basically I just wanted -- look, those folks are pros. They really are.
16 They're people with backgrounds in the FBI and everything else. Like, I had no concerns
17 that they weren't coordinating with law enforcement, but, you know, it's my job to ask,
19 But the other thing we did was we reached out to the IPOC folks -- because,
20 remember, January 5th was the Georgia special election, right? And the attitude of
21 the -- the IPOC was really built around the elections, right? And so, like, there were
22 levels to the IPOC. And so high-level, like, alert for January 5th. But we had to go to
23 them and say, "Hey, you need to be on high alert for January 6th too." And they did.
25 But that's what I mean. Like, the dangerous organizations' perspective, because
1 we're looking off of Facebook, we understood more intrinsically the way that the threats
2 of violence were manifesting around January 6th. And that's not something that was as
4 And I know some people will say, "No, it was obvious," and they'll point to specific
5 pieces of content. But there's so much of that crap on Facebook all the time that it's
6 hard to pick out noise from the -- you know, wheat from the chaff. But we were able to
7 do that because we were concerned and primarily driven by the concerns of what we saw
9 Q So first question: Do you happen to recall a date for that initial meeting
11 A When -- well, I mean, the thing is, we'd been meeting, you know, twice a
12 week since May, you know, around all of these issues. You know, maybe it was April, I
13 don't remember. So it wasn't like this was the first time. But I had called a special
14 session with the legal team to make sure that we were coordinated in the days ahead of
15 January 6th.
16 Q And what was the nature of your conversation about communication with
17 law enforcement? Did legal let you know that law enforcement had been in contact
19 A No. What you want to make sure in a situation like that -- no, I don't know
20 that law enforcement had reached out to us. But what I wanted to make sure was, if we
21 get emergency law enforcement requests about January 6th, are we ready to get them?
22 Are we going to make sure they don't slip through the cracks? Are we going to be able
23 to respond? If we see something that looks like the clear planning of violence that we
25 And those kinds of systems exist at a company like Facebook. They're going to
1 exist at big tech companies -- Facebook, Google, Microsoft, Apple, et cetera. But in a
2 situation like that, where the stakes are high, my job was to check, so I checked.
3 And they assured me that we were good, which I really never had any doubts
4 about, because those folks, A, had been doing it really intensely for a year, basically,
5 through what was a really -- you know, I don't have to tell you guys -- what was a really
6 intense year in American history, 2020. But they're also people that -- they understood
7 the sort of concerns that we all had about some of the intelligence that we were seeing,
2 [12:05 p.m.]
3 BY-
4 Q Following that meeting, is that the point at which you tried to get IPOC
6 A That is when we -- no, I don't think it was -- I don't remember the order, to
7 be totally honest with you. I remember the -- I remember the -- there was a weekend.
9 Q Yeah.
10 A I remember the weekend prior to that, looking at some of the material that
11 was off platform, and saying this has really materially changed. The threat is higher than
12 we understood. And we need to make sure that we are really ready. And so, I reached
13 out to the folks running the IPOC. I don't remember exactly what that process was like,
14 and said, Hey, whatever we're doing on the 5th, we should do it on the 6th. Be ready.
15 And really that's a staffing question, right? It's like, you know, it's how many
16 people are going to be on call, who's gonna be there, you know? And as I recall
17 we -- you know, they made the change and kept a bunch of folks on call for the 6th
19 Q So that would have been around the January 1st/January 2nd timeframe?
20 A Something like that, yeah. I think there was a real -- there was shifts in the
21 plans for the actual protest itself. I think they were going to move from the -- did they
22 move from the east side to the west side of the Capitol? There were permitting
23 changes, right? And there was a shift in tone, and a specificity about the Capitol that we
24 started to see, you know, online. And by online, I mean, I really -- I know it may sound
25 strange, but when I say online, I don't mean Facebook specifically. But that raised my
1 level of concern. And we'd been concerned throughout December especially after the
3 Q And we touched on this earlier, but were there particular platforms that you
6 rhetoric that was being -- you know, I think there was decent work being done by some of
7 the researchers. And I don't always agree with all of the researchers, but I think there
8 was a number of researchers tracking some of this stuff, you know, that were publishing
on it, that those things concerned us. But I remember TheDonald.win. I remember
11 We also saw some of the stuff Stewart Rhodes was putting out on the Oath
12 Keepers site that had a little bit of a different tone, suggested some planning and
13 organization that was concerning. But that's the thing, right? Like, you know, you're
14 trying to measure risk in these situations, and you have to make judgment calls. And
15 this was really about being prepared. You know, how high of an alert do we need to go
16 on? And so, we wound up going on a higher alert, and it was clearly the right decision.
17 Q Was it your hope that some of the information you were seeing would be
18 given to law enforcement after your meeting with legal, or did you understand them to
20 A No. I mean, I don't think -- I mean, all this stuff was just on the internet,
21 you know? Like none of this was the kind of material that law enforcement needed to
22 get from Facebook, you know? Like, there are things -- there are plenty of instances
23 where Facebook comes across something on Facebook that is clearly a threat, and gets
25 But that's not really the dynamic that was going on here. What was happening
1 here was the kinds of stuff that the -- you know, all sorts of researchers were publishing
2 on. I don't think this was really special knowledge. It was awareness, and our sort of
3 predilection for looking outside of the platform to understand general levels of risk.
4 Q That's helpful.
5 I wanted to zoom back a few days because you mentioned your risk level kind of
6 being pinged around the New Year, and then in between the December 12th event in D.C.
8 On December 19th, President Trump tweeted about the January 6th rally for the
9 first time.
10 Does that -- do you recall that being an important moment in your risk calculus?
12 rhetoric from various organizers, and sort of provocateurs, including President Trump,
13 that really seemed to really want to rile people up. But -- because we saw the level of
14 anger and the frustration and confusion, you know, genuine confusion by supporters of
15 the former President who, you know, genuinely believed that the election was stolen, and
16 genuinely believed they were losing their democracy. And so, we were certainly
17 worried about that. But I don't remember that to be in particular -- which, you know,
18 again, it was 18 months ago. So some of these things stick in my mind. Some do not.
20 A Yup.
21 Q When you did get IPOC, or seek to get IPOC involved in January 6th, could
22 you tell us some of the individuals who were involved in that process or involved in IPOC
23 at the time?
24 A I mean, I don't remember. You know, Molly was certainly a key driver of
25 the IPOC process. And Molly runs a really good meeting in those situations. She's
2 There were lots of folks, you know, lower-level operational folks who, you know,
3 were on sort of -- they were on call, right? And so, they were the ones that were sort of
4 running the actual operation on a moment-to-moment basis for some set period of time,
6 You know, there were like a bunch of those. Rosa Birch, Rosa was doing some of
7 the work for Molly. You know, lots of those folks. I don't remember exactly the
8 process we went to when we said, you know, we needed to -- like we wanted to make
9 sure we were on high alert on the 6th itself. But we submitted it in. We said, Look,
10 there's a bunch of crazy stuff going on. We need to be really ready. It wasn't that the
11 IPOC was going to be gone. It was just gonna be on a lower level of alert after the 5th.
12 We made -- at least that's my recollection of it. And, you know, we endeavored to keep
14 Q That makes sense. I want to pause here because I've been monopolizing
16 Thanks,_
17 BY
18 Q Could we back up a little bit, and when you say that you wanted to keep the
19 IPOC on the same level alert, to make sure their alert level didn't recede. What is the
20 practical effect of having the IPOC on alert? If they've been alerted on the 5th, what can
21 they do on the day of the 6th to make a difference within that time period?
22 A Yeah. I think what it really comes down to is staffing, because you've got
23 more folks. You know, it's like any -- I don't know if you guys have ever been out to
24 NCTC or seen an ops center, or military ops center, or something like that, you know, the
25 practical reality is you've got a lot of folks just sitting around a lot of the time, right?
1 They're kind of twiddling their thumbs, and they're getting ready, and they're making sure
3 But the reality is, they're there for a crisis, which means between crises, there's
4 some slack. Now, in a situation like Facebook, that's -- nobody was really sitting around,
5 because there's always something going on, right? It's just so big, and like, the scope is
6 so wide. But, you know -- so I really do not want to imply some of those folks
7 working -- I mean, they worked their butts off, like, especially the lower-level folks.
8 They really, really work hard, and I'm grateful to them, and I hope other people are as
9 well.
10 But what it really was is staffing, is like how many people you have there ready
11 assigned to this in case something goes wrong, that are able to quickly investigate and
12 triage if a question comes in from, you know -- if there is an emergency request from law
14 If there is a violation on the platform that looks like there's a network associated
15 with it that can chase that down and figure out what's going on, things like that. None
16 of those things are automatic, even though those systems are better than they used to
17 be, they still require people to make smart judgment calls to deal with glitches in the
19 people that don't understand how all the internal tools work. And so, there's a lot of
21 And I think there is a perception outside of Silicon Valley that like tech companies
22 have everything totally wired, and all you have to do is press a button on a like you know,
23 a black screen with green letters, and all the sudden all sorts of perfect, sexy information
1 and query four or five different systems and try to compile it in a way that is reasonable
2 for their leadership so they can make good decisions, and it's hard. And so, you need
3 people on call to do it. I think that's -- you know, you could ask Facebook. I'm sure
4 they would be able to give a better explanation of exactly the differences of the IPOC
5 levels. But what we pushed forward was just a higher alert on the 6th itself.
6 Q So the types of things they would have been able to act on in the moment of
7 a crisis, it seems like we're talking about things like, you know, if there's an act of violence
9 there's sort of an outbreak of violative content in response to the events of the day and
11 Are those the types of things they would be there to handle in the office?
12 A Yeah.
14 A No. Those are the big ones. But I think -- remember that this is like a
15 fusion center, right? So there's some people there, their responsibility is to take on the
16 most acute problems; but they also are kind of the tips of the spear for larger
17 organizations that are still doing their things, right? So there's a larger operational
18 organization. There's a larger legal organization that's dealing with law enforcement
19 requests. You know, there's -- you know, there are larger entities out doing their
20 everyday work that may be sort of redirected towards -- certainly towards an event like
21 January 6th.
22 So, you know, I don't want to give the impression that all of the activity that
23 Facebook or a company like it does would occur within that fusion center, right? You
24 know, some of it does. But some of it is also there to represent just to share
1 Q This might be a good time to move to the events of the day itself in more
2 granular detail.
3 So starting with the morning of January 6th, can you describe sort of the
5 A Yeah. Everyone was really nervous. I think that was -- at least I was really
6 nervous. The -- no, I think there was a real sense of concern over how this would go,
7 because we had seen -- but there was also that sort of moment especially when you are,
8 you know, when you're at a frankly more senior level, and you're in a coordinating role
9 when something starts to happen, you know, you just don't want to get in everybody's
10 way, right?
11 You've sort of like -- okay, there's a bunch of folks whose job it is to deal with this,
12 you want to be there if they need you, but you don't want to get in their way. So I
13 remember the 6th, I basically just cleared my schedule on the 6th to, you know, to make
14 sure that I was there to help. But, you know, I remember being -- there's sort of that
15 moment when you are in a leadership role of any kind where, when things start to
16 happen, you've got to let people go do it, and it can be very frustrating, because you want
19 somewhere, and the -- I was sitting right here where I am now. And I remember very
20 clearly, at one point, I had a phone call with my colleague, Nathaniel Gleicher. We had a
21 sort of standard Wednesday call with each other, and we're chatting, and so, we had a
22 phone call. And I thought I'd go for a walk around the block. And, you know, I got
23 halfway down the block, and Nathaniel was still at his house or something, and he's like,
24 oh God, they're attacking the Capitol. So I immediately came back. And I remember it
25 very, very clearly, obviously. Before that began to happen, it was rhetoric, it was
2 Q Who was your first call after you got back to your computer
4 A I don't think it was a call. It probably would have been a chat across, you
5 know, to a bunch of -- a chat with a bunch of the cross-functional folks to say, Hey, what
7 Q You don't have to worry so much about the chronology or the exact order,
8 but who would you have reached out to first? Like which cross-functional teams would
10 A Well, certainly my policy team, you know, and the folks that were tracking
11 some of these things closely, our legal team. And -- you know, to see whether we were
12 engaged with law enforcement, whether we were getting stuff in, whether we were
13 seeing practical threats that we needed to get out the door, and then the operations
14 team, you know. Certainly, at that point, the IPOC, because the IPOC was a central
15 coordinating body. So if something was coming in somewhere weird, like, the hope
17 Facebook -- you know, tech companies don't like to believe that they're
18 bureaucracies, but they are. So stuff comes up in one part of the company, and you
19 may not hear about it, right? So the intent of something like the IPOC is to try to limit
20 that, right? But where you -- if somebody has something important to say, you get it in
21 there and then it's shared, right? So I don't remember -- I'm sure there was sort of an
22 emergency IPOC convened, you know, stand up, but I don't remember specifically.
23 Q This might be retreading some of the things Mr. -asked about, but who
24 were the immediate points of contact for the IPOC? Who was involved in that that
25 would have taken your -- who would have received your message?
1 A I don't know. It would have been whoever was on call at that time. I
2 don't remember.
3 Q What was your message to your team? What did you tell them to do, or
5 A Well, I think what you ask, you know, the folks that -- like, in a situation like
6 that, you know -- right, there were at least two things happening on the Mall that day,
7 right? One was you had a mass group of folks that had been subject to misinformation
8 around the election itself that were angry, that were there, that were, on some level,
9 primed for violence. And then you had more organized groups that had shown up with,
10 according to various indictments now, with intent to commit acts of violence and
12 When you break that down, it was those groups that are more sort of the ones
13 that I would be thinking about more than the masses of people informed by sort of
14 misinformation. Now, obviously, there was some crossover, and we should talk about
15 that crossover and where that gets difficult, but those organizations, you know it's like,
17 Like, it's not rocket science, right? You know, it's -- we'd been following these
18 things before. We knew the Proud Boys were going to be there. We knew the Oath
19 Keepers were going to be there. So it's like, Hey, what do we know about what these
20 groups are doing now as best you can, and oftentimes, you don't know much, right?
21 You wish you knew more, but especially because we had tried to chase these groups off
22 of Facebook. You know, we didn't always have the most tangible information on those
23 entities themselves. And so, you run around. You chase it down. You see what other
25 You reach out to vendors and say, Hey, do you guys have anything? And that
1 process is pretty hit and miss, you know, especially in a really fast-moving crisis where
2 you're trying to understand what's going on. Of course, you're watching it on TV, trying
3 to see what other folks are, you know -- what's coming up. So, you know, I think that
4 that process is, you know, is a really, you know, dangerous and tricky one. I mean,
5 dangerous is the wrong word. That process is fraught because the information coming
6 in is spotty. It's not always clear how to analyze it. It's not always clear what's
7 important, and you have to sort of whittle that down as best you can.
8 But we wanted to make sure, like, you know, are we missing something here with
10 about Proud Boys on platform because you know these groups you kicked off are coming
11 back, right? They always do. And so, you know that there is, as much as you've tried
12 to sort of prevent these things, you know you've not done it perfectly. And you want to
13 see if there is harm that is being facilitated on the platform. And so, you want to talk to
14 the legal teams and make sure that as requests are coming in -- and they started coming
15 in that day from law enforcement agencies -- that we are in a position to triage them, to
16 respond, to provide, you know, to make good decisions about how we respond, because
17 sometimes, you know, sometimes even an environment like that, requests will come in
18 that are wildly off base, and you want to make sure that everybody in the organization
19 with their individual set of responsibilities is doing them. And in many ways, that was
20 my sort of job in that setting is within the dangerous org writ, are we all chasing these
21 things down? Are the folks that are responsible for gathering intelligence, gathering
22 that intelligence and getting it over to us? Are the folks that are dealing with law
23 enforcement, dealing with law enforcement? And you know, do we have any true crises
24 that we need to, like, throw more resources at? So those are the kinds of things that
2 in response to the crisis, do any -- in retrospect, do any pieces of evidence stand out in
3 your memory? Is there anything important that you discovered on that day that you
5 A It all kind of blends together. I don't think so on that day, you know? I
6 mean, look, you know, I think, you know, the indictments tell the story now, which is that,
7 you know, many of the people that were there that participated in the attack were
8 informed by misinformation. Facebook was a vector, like many others, for that
10 There were some activities by Oath Keepers and Proud Boys on Facebook. I
11 don't think we discovered that stuff that day. I don't remember to be honest. But
12 those activities were pretty limited for the most part. And they were in Messenger
13 traffic I think. So things like some of the pre-stuff about where to stay in hotels, all that
14 stuff's in indictments now, you know, that are publicly available. That's the kind of stuff
16 And, you know, I think, you know, what's more important was, you know, efforts
17 to try to like, Hey, if we've got people that are claiming -- you know, that are posting
18 videos and pictures of themselves inside the Capitol that day, you know, these are
19 things -- these are things we want to probably hang on to, and understand that there's
20 going to be law enforcement requests about that, you know? But a lot of that is, like,
21 kind of the masses that were there. And, you know, as you know, there have been lots
22 of indictments and lots of stuff that's come in, and Facebook provided information in
24 Q I have a list of, sort of, individuals I'm just wondering if you could -- I could
25 run through it with you and we could talk about any interactions or recollections you
1 have about their -- any interactions you had with them on the day of 1/6, or the
3 And I know we've been going for, gosh, an hour and a half. Is that right? So we
4 could look at taking a break after that, if you like, that could be helpful?
6 Q Okay. So the first on my list is Samidh Chakrabarti, the head of the civic
7 integrity team.
8 What do you remember about him from 1/6 or the days immediately after?
9 A Very little.
11 A We did sometimes, but not a ton. I think, you know, the civic team was
12 more focused on efforts like break-the-glass measures, and frankly, the sort of really
13 important work to try to protect elections generally, misinformation and things like this.
14 That's really -- you know, I don't want to give the impression at all that just because I
15 wasn't focused on it doesn't mean that it wasn't really important. That stuff's really
17 And I do think that, you know -- and I do think that when you get to those
18 more -- and those groups, and like, so, these acute organized groups that try to use the
19 internet, they are -- they know that you're trying to get them online, so they're trying to
20 evade you, you know? So it's not the same story as, like, the misinformation story of,
21 you know, our recommendations doing this. It's they're trying to use private spaces and
22 do things in ways that are evasive. They're not trying to make a big splash all the time.
23 That was certainly the case with, you know, the Proud Boys and the Oath Keepers
24 and even QAnon, by that point, which was still more overt, but you know, much of QAnon
25 had been -- the really overt stuff had been kind of pretty dramatically disrupted in August
1 and October in the run up to the election, and they were still around of course. None of
2 this ever goes away, but it's -- but they were less overt. And so I think that
3 changed -- you know, we were -- there's a very different perspective between a group
4 that is like the civic integrity group and the broader integrity efforts of Facebook, and not
5 just civic, but like, where you're really trying to move big metrics over time, right?
6 Because you're dealing with these mass movements all the time. And a group like mine,
7 which is very unique, that's focused on very specific small acute, really, you know, the
8 most acute problems. And that is a -- you know, that's why there's -- we weren't always
12 Q That makes sense, and it's useful. I have two follow-ups on that, which you
13 don't have to answer in too much length, but just to help me understand the points of
14 overlap.
15 Were they involved at all in the IPOC, and did your team have any kind of input, or
17 A So I -- yeah. I'm sure the civic team was involved in the IPOC. And I don't
18 remember when -- I guess in December at some point, the reorg of the civic integrity
19 team occurred. I don't remember the exact timeline. I'm sure they were involved.
21 were -- you know, in the dangerous orgs world, we were already doing a lot of that stuff.
22 If you -- stuff like -- you know, a lot of those things were stuff we already did. So, like, if
23 you were a, you know, user that had, you know, that had posted some dangerous orgs
24 content or things, you were already subject to these kinds of restrictions, in many cases.
25 So, you know, like the break-the-glass -- the dangerous orgs approach was the blunter
45
1 instrument, right? And it's a blunter instrument, but it was applied to a narrower set of
2 users, right? I mean, it's much more narrow. But a lot of those kinds of things -- we
3 started doing that stuff in 2017, and you know, so those kinds of things I think were -- we
4 did have some input on it, but it was less critical for us.
5 Now, where we wanted, you know -- I think the danger is that we recognized that
6 many of the sort of, you know -- the danger is that what we saw, and what we've seen, as
7 a steady movement -- and this is a problem that's going to confound, not just your
8 investigation of January 6th, but it's going to confound social media, it's going to
9 confound everybody that has been a steady movement, is that many of these violent
11 And they operate using symbols and sort of, you know, in the case of Boogaloo,
12 like, aesthetics as a way to unite themselves rather than organizational structures, that
13 creates real problems for applying policies like the dangerous orgs policies. So you wind
15 The challenge there is that those other policies are reactive, right? And that's a
17 So you can try to create, sort of, creative responses to that, like the
18 violence-inducing conspiracy network policy, but, you know, you're dealing with really
19 difficult questions at that point, and extending that in a really broad sense I think is a very
20 difficult thing to do. Dealing with dangerous organizations online is hard conceptually.
21 Dealing with misinformation is way harder, conceptually. But that's kind of where we're
22 headed.
23 And one of the reasons we're headed there is because of the 6th -- you know, it's
24 not the only reason. But one of the reasons we're headed there is because of the
25 success of things like actor-based policies, because they made life more difficult for folks
46
1 to operate in sort of more structured ways. But people aren't stupid, you know, and so
2 they recognize that and they adapt, and they try to evolve around it. And I think part of
3 what we saw, you know -- actually, I don't even want to say in the run-up just to January
4 6th, but I think part of what we're seeing right now generally in the world, and certainly
5 online, is a range of different organizations, not just in the United States, but globally, that
6 are recognizing those challenges, and they are adapting to them. And you know, if
7 anybody says they know how to solve that problem perfectly, they are full of it. But I
8 think, you know, you're certainly pointing to something that's very real.
9 Q I have a lot of questions about what you just talked about that I want to
10 table for later. But certainly, the point about the evolution from organized groups to
11 looser networks and the challenges you highlighted are things that are on my mind and I
13 I suspect, just really quickly, that the next individual on my list, Guy Rosen,
15 A No. I worked with Guy on a number of different things. So, you know,
16 like Guy was one of those -- Guy was top of the integrity organization. And you know,
17 Guy is somebody who really understood I think the different perspective of the dangerous
18 orgs team. And so, when push came to shove and something was broken
19 bureaucratically, Guy was somebody who could help break those logjams sometimes.
25 Q Okay. The next person, I think, would be one you might have more recollection of,
47
2 A Well -- so I worked with Monica -- I reported to Monica for years, and even
4 the 6th. I'm sure we were -- again, I'm sure we were sort of pinging back and forth and
5 in the meetings together, but I don't remember specifically what she was focused on.
7 A Not specifically. You know, I'm sure we communicated, but I don't have
9 Q How closely did Monica oversee your team? Would you describe her as a
10 hands-on manager?
11 A I think less hands-on as time went on. Mostly because she sort of
12 developed trust, you know? And, you know, a lot of the things that I wound up doing
13 were things that she did before I got there. She was [inaudible] former Federal
14 prosecutor. So she -- part of -- like, there is correctly kind of a line between the sort of
15 legal operations of a big tech company, and the sort of community standards, terms of
16 service. Most of the, like, terroristy stuff Facebook removes is not a threat in the real
18 So the folks that are enforcing that kind of stuff, they don't need to know about
19 the legal wrangling and the sort of sharper tip of the spear where things like that
20 manifest. And so where that stuff comes together is in a few people. I was one of
21 those people. Monica was one of those people. And where you're trying to make sure
22 that there is communication across, and lessons learned across and ways that are safe
23 and that are privacy protective and all of those sorts of things. So she would still do that
25 Q Who else did Monica oversee? Like, was most of her writ related to
48
1 dangerous orgs? What were the other areas where she had insight?
2 A No. Monica's writ was across all of product policy. So that included, you
3 know, everything from information operations and state espionage, child safety, to all of
4 the sort of content-focused teams. And teams -- ultimately teams that were building
5 sort of product-focused recommendations like, you know, what you can now see as
6 the -- what do they call them? The content distribution guidelines, which are now
7 online, which are kind of more generalized principles about content ranking and things
8 like that based on policy principles. So Mon led all product policy, so that's all of that
9 stuff.
10 Q What about after 1/6 when the smoke cleared, and you and Monica were
11 maybe able to have a more focused conversation about what happened, do you recall
13 A I don't -- I mean, I don't. At some point, Mon had some health issues, and I
14 don't remember exactly when those occurred. The -- I don't remember the next precise
15 sitdown. I mean, you know, we would chat regularly, you know, both in formal, sort of,
16 meeting settings but also, if I just needed to let her know about something. I'm sure
17 that I was sending her notes about some of the things we were getting from our legal
18 team, and updates on that kind of stuff. I'm sure that those updates were going at
19 various points, but those communications weren't always terribly formal to be honest.
21 Q Did you ever have a sort of retrospective discussion on what went wrong in
22 the run-up to the 6th, or what could have been done better?
23 A I'm sure generally. But that was not something that Monica would have
24 run specifically, you know? Like there were certainly broader company efforts to try to
1 Q Maybe we can return to that. I want to make sure we get through some of
4 A Yeah. Joel -- so similarly, I'd be surprised -- I'm sure that I must have
5 communicated with Joel that day on some level, or been in meetings with Joel or been
6 on, you know, emails or something with Joel, you know, because we were certainly
7 plugging in what we knew to the IPOC process, and sending updates, you know, and -- but
8 I don't recall a specific conversation with Joel about, you know, the day of or something
9 like that.
11 A Yeah.
14 A Yeah. Similar. I mean, you know, Molly didn't always run sort of the IPOC
15 stand-ups. Sometimes she would deputize others for that. But again, I don't know.
16 I don't remember specifically. But, like, when it was something really important, Molly
17 would oftentimes run those meetings herself, and she was very good at that. And so, I
19 recollection, is that she probably ran some of those on the 6th, and immediately
20 afterwards.
22 A Yeah.
24 A Yeah. Yeah.
1 A Very little. Yeah. Very little. I mean, sometimes, but, like, not a lot.
2 Q Tom Allison?
3 A Who?
4 Q Tom Allison?
5 A Again, it's a name I know. I know we had some interactions, but not a lot
6 of detail.
7 Q What about Sheryl Sandberg? Do you have a sense of her activities on 1/6?
9 communications and updates that would go up to Sheryl and Mark that I probably was
10 CC'd on because we plugged in some information. I don't recall those specifically, but
14 major event like that where there's any nexus to Facebook, updates to very senior
15 leadership that, on something like this, I assume I was CC'd. I don't remember, but
16 the -- like, in many cases, the distribution of such messages would not go very wide.
17 You know, but on a topic like that, I would assume I was CC'd on these kinds of things.
18 But you got to -- I mean, you know -- yeah, I don't recall specifically.
19 Q So there must have been, I think, a series of high-level decisions that needed
20 to be made in the days following 1/6 by people like Mark Zuckerberg, Sheryl Sandberg,
21 Joel, Monica.
22 Do you have a sense of, again, once maybe starting the evening of the 6th going
23 into the 7th, what their priorities were? What were their -- the sort of priority zero
1 scenarios, the overriding P0, generally speaking, was to cooperate with law enforcement
2 and make sure that we were providing information that could protect lives. That
3 commitment, you know, like that commitment from senior leadership at Facebook is
4 pretty good. It really is. And people I think -- I've got my gripes about decisions made
5 at Facebook, but on that count, I think they really held right. And they did it right.
6 But, you know, the thing that everybody always wants to know is, like, what the
7 hell happened on Facebook, right? That's what the questions are always. Like, what is
8 the nexus on Facebook? Was this planned on Facebook? Was this organized on
9 Facebook? How was this organized on Facebook? To what extent? Et cetera. And
10 then identifying and removing, you know, violative material that relates to the event,
11 right?
12 So there were decisions made about some of the remnant Stop the Steal groups,
13 and groups like that that were removed in the wake of January 6th. There was the
14 removal of the Patriot Party Network, which was essentially a de facto mirror of some of
16 And, you know, I remember the outlines of those decisions. I don't remember
17 the exact timeline, but in the wake of -- in the wake of January 6th the -- but I think, you
18 know, there is this sort of deal with law enforcement, real-world harm, and then there's
19 what is the Facebook implication, like how do we clean this up? And then, of course,
20 there is the, like, how do we deal with all the questions we're going to get? Like what
21 do we know?
22 That is, you know -- at a big company like Facebook, that is both a question of,
23 like, what is our communication stance, like how do we protect the brand, but also, how
24 do we answer questions when we don't know all the answers yet in a way that's not
25 gonna turn out to be erroneous, right? You know, that is -- that takes up a lot of time
52
1 inside of Facebook, is like how do we make sure we've got all of the information so we
2 don't say something we're going to find out was wrong in a month even though when we
4 Q Is my recollection correct, correct me if I'm wrong, that Stop the Steal was
5 sort of banned -- that type of content was banned from Facebook in the days after 1/6?
6 A So the -- so I think -- this is an interesting one, I think when you think about
7 Stop the Steal, people use that phrase to mean lots of different things. And it's
9 organization Stop the Steal led by Ali Alexander and others, right? Real world, 501(c)(3)
10 or 501(c)(4), whatever it is. There is that sort of, you know, specific group. There was
11 an original Stop the Steal group on Facebook that sort of showed up in the days after the
12 election that was actually removed for -- I forget what it was removed -- misinformation
14 Then you've got a bunch of other groups on Facebook that were kind of called
15 Stop the Steal or sort of a derivative and maybe they were actually organized by the same
16 people. Some of them probably weren't, but they used -- sort of named themselves
17 that way. Then you've got the, like, phrase, Stop the Steal, right, which lots of different
18 groups used.
19 Then you've got this sort of more general belief that the election was stolen, and
20 people sort of getting at that idea using a lot of different sort of language and
22 And so, I do think that when people say Stop the Steal, and they talk about Stop
23 the Steal in the world more broadly, and when they talk about Stop the Steal on
24 Facebook, they're not specific about what they mean. That lack of specificity leads to
25 confusion sometimes. People say, Well, Facebook, why didn't you take down Stop the
53
1 Steal, and Facebook will say, Well, we took down the one Stop the Steal group, but not all
2 these other Stop the Steal groups, not necessarily the phrase, not necessarily targeting Ali
4 So, I do think that there were efforts to take down some of these things. Some
5 of the other groups came down because they had other violations, like violence and
6 incitement, and misinfo and hate speech and other sorts of things. But the sort of more
7 aggressive effort against Stop the Steal, more generally, did happen after January 6th
8 where it hit a much wider swath of the movement, but even then, it didn't take down all
10 And I do think that this is an important thing for folks to understand. Which is, I
11 thought Facebook should be more aggressive in taking down Stop the Steal stuff prior to
12 January 6th. But I don't think that that would have prevented the protest on the Mall.
13 I don't think it would have prevented violence on January 6th. And I think that that's
14 really, really important. I think there were some groups that probably should have
16 But this sentiment of the election being stolen was being promulgated by so many
17 sources across the MAGA movement, both on Facebook and off, that if Facebook
18 were to have tried to take that off generally, it would have had to take down much of the
19 conservative movement on the platform, far beyond just groups that said Stop the Steal,
20 mainstream conservative commentators. And without taking really dramatic action like
21 that, I don't think Facebook would have -- any action Facebook would have taken would
23 And I think that that's really important, because such action would have been so
24 sweeping, that I think it would be extremely difficult for Facebook to defend, both on its
25 existing policy grounds, but also more generally. So while I think Facebook should have
54
1 been more aggressive, I don't think that the sort of -- the steps that it might have taken in
2 scope would have prevented the violence on January 6th. In fact, I don't think it could
3 have taken any steps that would have prevented the violence on January 6th.
5 A Facebook had policies against misinformation around voting and things like
6 that. But it was focused on voting, not sort of ex post facto delegitimization if I'm
7 remembering correctly.
8 Q So, you laid out the different varieties of Stop the Steal groups. Was Stop
9 the Steal, the real organization, ever designated as a dangerous org, a violent conspiracy
11 A No. It would not have qualified except perhaps after January 6th because it
13 Q You mentioned earlier in our conversation that sometimes for the category
14 of violence-inducing conspiracy narratives and militarized social movements, that you
15 would look to offline signals to see if an entity was on the verge of violence, and that, in
17 A Sort of. Like, we would -- so, for the militarized social movements, for the
18 militias, we would look to see whether their sort of sole purpose, their purpose was to
19 organize for the purposes of violence, right? And I forget the exact criteria, but that was
20 the gist of it, right? You're looking for things like, is this an armed group? Are they
21 preparing in, sort of, militarized fashion? Do they state they're there to assert control in
24 bound by specific symbols, or, you know, has some kind of boundaries, and you're
25 looking for a track record of violence associated with that conspiracy that is directly
55
1 attributable to those concepts. And the -- you know, QAnon met that standard. Stop
2 the Steal, prior to January 6th, I think, would not meet that standard.
3 Q What about some of the incidents that you mentioned between the election
4 and January 6th, things like the Million MAGA March, the violence around State houses,
5 in Philadelphia and Arizona, some of those events? Did you understand those to be
7 A Well, I think the -- I mean, I think the Million -- is that December 12th?
9 That one's November 14th, but then there's also the event on
10 December 12th.
11 Mr. Fishman. You know, keeping track of them all is difficult. I think for some
12 of those, we had already identified that smaller groups and more discrete organizations were
13 involved. So some of the militias -- there was some stuff at the State House in Arizona
14 that we were able to -- that were -- you know, that we addressed via some of the Arizona
15 militias that were not allowed there. December 12th, it was, you know, the Proud Boys,
16 which had been a hate group since 2018. So there were -- you know, so we already had
20 BY --:
21 Q Mr. Fishman, this is, I think, a good way for us to get a better understanding
22 of what you saw the threat landscape as, and Facebook's response leading up to the 6th.
23 I'm a little curious to dig in more about how you think Facebook could have, I guess, in
24 your words, not done anything more in advance of the 6th to prevent some of that
25 violence. And I know we were discussing sort of the nebulousness of Stop the Steal at
56
2 But at any point, was there a conversation about the need to curtail particular
4 A Yes. Yeah. There were discussions around that, and there were
5 discussions around taking broader action against elements in the Stop the Steal network.
6 I argued for some of those. And I think, you know, and I think Facebook should have
7 taken some of them. You know, I certainly was not the only one. Other people were
9 But the -- but, I think even if those steps were taken -- the issue is, I think if you
10 are a citizen, or you are a company, you have a responsibility to take actions, even -- so
11 your tools are not abused, even if that's not going to prevent the final outcome. And I
12 felt like there were steps we could have taken against a broader swath of the Stop the
13 Steal network on Facebook, that, in my mind, would have been appropriate. But I don't
14 think that that would have changed the final outcome. I think they should have -- those
15 are steps that should have been taken anyway, because I think it's the right thing to do.
16 But I think it's a big leap to suggest that would have changed the final outcome.
57
2 [1:04 p.m.]
3 BY
5 Can you describe some of the steps that you were advocating?
6 A Yeah. I mean, we were pushing for broader removals of some of the other
7 Stop the Steal groups, you know, that, you know, under coordinating harm policies or -- I
8 guess the coordinating harm policy was sort of the key one.
9 I mean, that's the thing, is, like, dangerous organizations policies are, like, last
10 resort, you know? And, frankly, many of us were uncomfortable with the way that we
11 had to apply that even to QAnon, because we felt like there were other policies that
12 should've had actor elements. And when we had to come in and resolve that issue in
13 the end, like, you know, I certainly felt some frustration that it hadn't been addressed
14 more effectively earlier using some of the other policy tools that were available.
15 So I think there was -- you know, there was clearly a decision made at the
16 company to try to address Stop the Steal through some of these other content-based
17 policies. Because an actor-based policy is a really big, blunt object, and there's just no
18 way to do it without collateral damage, even if you try to focus it, right? Because we
19 talked about trying to -- when you add nuance, you know -- like, nuance at scale winds up
21 And so, you know, I thought we should take more aggressive action against some
22 of those Stop the Steal groups based on other policies, but I also -- I want to be really
23 clear that, like, I don't think that would've changed the outcome on January 6th.
24 Q Yes. And --
1 Q Right. Right.
3 A Yeah.
4 Q -- a lot of what it seems like you were advocating for might've been more on
5 the actor policy side, and the response you got from Facebook was that the preferred
6 route was to stick with some of the content moderation policies, content-based policies?
8 Q Okay. Yeah.
9 A And actor-based policies do get blunt. I mean, you know, like, you
10 know -- but there are things you can do. Like, if you see that, like, the refugees from a
11 group that gets disabled sort of re-coalesce in another group or something like that, you
12 can understand it essentially as a recreation of that other group, things like that.
13 But those are places where, you know, the area gets gray. And, in general, I
14 think moderating groups, in particular, is a very difficult thing to do, because oftentimes
15 it's not the administrator of a group that is doing the, like, worst things within that group;
16 it's other accounts. And so I think, you know, that's a tricky thing for policymakers to
17 deal with -- "policymakers" meaning at tech companies, not policymakers in the D.C.
18 sense.
19 But I do think that there should've been stronger action against some of them, but
22 And one thing that's occurred to me as you're talking is the interplay between
23 some of these groups that were more comfortably in the dangerous orgs
24 universe -- Proud Boys; I guess Oath Keepers was somewhere in the middle -- and then
1 And I wonder how the increasing intersection of those various groups -- for
2 example, on December 12th, you have a lot of Stop the Steal folks speaking at the same
3 rally as Stewart Rhodes, basically all reading from the same playbook. And I wonder
4 how that confluence affected your view of what should be done and how that was
6 A It was certainly something that concerned me, right? Because when you've
7 got a movement like Stop the Steal that is, sort of, aligning itself with groups that you
8 have made a determination are dangerous, then that certainly increases your concern
10 And we absolutely felt that. That's a pretty typical way of thinking if you are
11 focused on organizations and you're thinking about the way that influence is transmitted
13 At the same time, you have to be really careful about guilt by association.
15 Now, my job was to be paranoid, and I did it pretty well. And so we looked at
17 You know, when you think about -- you know, the difference between that Lobby
18 Day, the Virginia Citizens Defense League Lobby Day, a year prior was that the organizers of that
19 event said, "Hey, we do not want violence." They were very clear about it. "We're
20 here to talk. We're here to yell. We're here to complain and scream and maybe
21 intimidate a little bit, but yell and scream at our lawmakers to get them to do what we
23 But you didn't hear that from the Stop the Steal guys. You know, they stood up
24 next to folks that we knew had a track record of violence, and they weren't trying to draw
25 that line about, "Hey, this is actually just a peaceful protest. That's all this is. Because
60
1 we want to express our frustration with" -- you know, based on an erroneous belief that
3 And that's a very reasonable judgment process that I'm describing, making a
4 judgment call about risk and perspective. It is a much harder one to build into a
5 replicable policy that you're going to make judgments about over and over again.
6 And I think that when you are grounded in -- fundamentally grounded in fear of
7 real-world violence, it's easier to sort of say, we need to make some judgment calls
8 sometimes. And I think, in this case, we were right. But I also think it's a very difficult
9 call.
10 And I also think that even under more aggressive actions against Stop the Steal I
11 don't think it would've changed things. I think that this messaging was so prevalent
12 across the conservative media ecosystem, both on Facebook in groups that never said
13 "Stop the Steal" or they said "Stop the Steal" deep in groups, you know, in comment
14 sections and things, and across the broader, you know, radio/TV/digital ecosystem, that
16 And if they hadn't attacked the Capitol, then they just would've been wrong.
17 And so, in my mind -- you know, which is sort of a basic thing to say. But when I think
18 about what really led to violence that day, it was a bunch of very confused and misled
19 people informed by misinformation that they got on Facebook and they got on Twitter
20 and that they got on FOX News and they got in other places, coupled with these more
22 And that's a kind of dynamic that we'd seen the possibility of with the Virginia Citizens
23 Defense League's protest a year prior. It didn't actually happen then, but it did happen
24 on January 6th.
25 Q Yeah.
61
1 I know our break is long overdue, but I had a couple followups on that.
2 You mentioned a couple times the crowd being informed by misinformation that
3 they got on Facebook and other places. And I'm curious about how Facebook's posture
4 towards the underlying misinformation about the election changed as you saw more
5 potentially violent organizing by groups like the Oath Keepers and organizing that toed
6 the line between not being violent but also not condemning it more broadly.
7 A Well, you know, I don't know. I mean, it's a tricky one to answer.
8 I think where that question came to the fore was the decision to roll back some of
9 the break-the-glass measures. That decision was informed by real data on platform, on
10 Facebook, that suggested that various kinds of concerns were being reduced.
11 So it's not as if, you know, Facebook made that decision willy-nilly. Now, again,
12 like, this wasn't the decision I would've made were I king, but they made that decision
13 based on that kind of data. And they made that decision because all of those
14 break-the-glass measures -- and, you know, I looked at -- you guys sent over one of the
15 documents describing some of the break-the-glass measures. I don't know if that's all of
16 them. I don't know, you know, what version of that document that is.
17 Many of those measures come with, you know, I think they probably called it the
18 "collateral damage," or, you know, they come with false positives, where you're
20 classifiers. And machine-learning classifiers always have errors. They have false
21 negatives, and they have false positives. You can lower the thresholds at which you
22 make a decision using that, which is largely what the break-the-glass measures did, but
24 And that's the kind of decision that you make when you're really, really concerned
25 about real-world violence or some other kind of tremendous danger. But there are real
62
1 costs to it. Now, I would've accepted some of those costs for a bit longer because of the
2 concerns that I had about the possibility of violence on the 6th and all the way up through
3 January 20th, frankly. But there was no decision that wouldn't carry important costs.
4 That's an important cost that would've come with some false positives.
5 And what I really hope is that you guys reflect -- even in situations where I
6 disagree with the ultimate decision that Facebook made, I think it's really important to
7 reflect the tradeoffs that they were facing and the -- because it's going to happen again.
8 These questions are going to come up again, and those tradeoffs are going to come up
9 again. And Facebook, other social media companies, policymakers in government that
10 ultimately are going to be using artificial intelligence more often going forward, they're
12 And if there's one thing we need to learn from this, it's that those tradeoffs are
13 real. Measuring them and balancing them against each other is hard. And hopefully at
14 some point we can come, you know, to some broader agreement about how you weigh
15 those tradeoffs in making these decisions. Like, that would be a -- I don't know if it's a
16 good outcome, but it would be a piece of a good outcome for some of the work that you
17 guys are doing. And I hope that we get there, which is why I'm going on this soliloquy to
18 make the point, because it's something that I think really matters.
19 And that algorithmic policymaking is -- the social media companies are canaries in
20 the coal mine. And this is the future, and those decisions are not easy and they're not
22 And so, even though I disagree, I would've been more conservative in the sense of
23 concern about the possibility of real-world violence and accepting of some of the false
24 positives, I think that it is a -- and I would've done that primarily because of my concern
25 about real-world violence based on off-platform information that I don't think Facebook
63
1 as a whole adequately incorporates. But I also think that the course of action that I'm
2 recommending -- that I recommended and that I still think was probably the right decision
6 Mr. --. That's what I was going to recommend. Well, we can take a longer
7 break. I think -- yeah, we should take a 20-minute break for lunch. Does that sound
8 good, everyone?
10 lunchtime here, Mr. Fishman. You might want to refresh your coffee or something as
11 well.
12 So let's take 20. Maybe we can reconvene at 1:40. That's 22 minutes. 1:40
13 eastern time.
14 Mr. Fishman. Okay, sounds good. And I'll just stop video and mute for now?
18 [Recess.]
20 And thank you for coming back for another session, Mr. Fishman. And I think, at
21 this point, I'll hand it over to Mr. --, who had some more questions for you.
23 BY MR. --:
24 Q And thank you again, Mr. Fishman, for your time today.
25 If I could get exhibit C up? Or -- that's exhibit No. 5. We changed our
64
1 nomenclature.
2 Okay. So you should be able to see on your screen this copy of an internal
3 Facebook report with the title "Stop the Steal and Patriot Party: The Growth and
5 A Yep.
10 operates often in a decentralized manner. Were there a lot of retrospectives like this?
13 way of putting it. Every organization and function has ways of, and desires to, sort of
14 learn lessons. It's a really good thing about the culture in a lot of ways, but it also means
15 that those retrospectives oftentimes don't actually speak directly to each other, because
16 they define terms differently, they gather slightly different data. And that can be really
17 problematic.
18 And it's something that I warn folks -- and I warn not only you guys, but I warn
19 others that are looking at some of the leaked Haugen documents, just to be really careful
20 in understanding how those terms are defined, where the data came from, et cetera.
21 Because it doesn't necessarily match up with what would be seen as canon inside the
23 And I'm not saying that's the case here. I just -- as a general concern.
24 Q That makes sense, though, and thank you for clarifying it. It's not always
1 I wanted to start by asking you about one point this report makes that I thought
3 The authors claim that one of the most effective and compelling things that they
4 did when studying Stop the Steal was to look for overlaps in the observed Stop the Steal
6 A Yep.
7 Q Do you have any insight into what they meant when they wrote that?
8 A Well, I don't know for sure. It may have been -- membership overlap I think
9 is probably the most natural. Now, what I couldn't tell you is whether -- sometimes
10 you'd look at membership overlap and you'd look at including deleted accounts. So just
11 because there was membership overlap or something like that here doesn't mean
12 that -- you know, it may mean that, like -- and this is purely hypothetical, but it might
13 mean that there was a Stop the Steal group and then there was some Proud Boys group,
14 the Proud Boys group came down, a bunch of the Proud Boys -- like, identified Proud Boys
15 within the group came down, but there was membership overlap between the two
16 groups, something like that, right? Those are the kinds of things that you might look for.
17 And I do think that one of the, sort of, adversarial pieces that you see is, you
18 know, accounts that are part of a network but that can't be removed because they don't
19 meet some policy threshold, they'll oftentimes try to come back within a setting or
20 operate within a larger setting where they feel like they're safer for some reason. And
21 that might be a smaller group, but it also might be a group that for whatever reason isn't
23 Q When a violent or dangerous org gets designated and groups and pages
1 but not necessarily every member of the group. And you can imagine, in many cases
2 you've got journalists, researchers, you know, in other settings that are not actually
3 members of the organization. So the, you know, membership in the group itself is not
4 always indicative of membership in their, like, actual, real organization. And so it really
5 depends.
6 Oftentimes what you'll do is you'll go through and review the users of that group
9 Q And then when you do that review, is it then sort of the next step to remove
10 them? Or is there a series of, sort of, policy checkpoints that have to be met before
12 A It depends, right? So, you know -- and, again, like, I can't speak to
13 Facebook's policy today, right? But a user that is a member of a tier-one hate group, for
14 example, or ISIS, right, if you get a member of ISIS on Facebook and they say, "I'm a
15 member of ISIS," their account comes down. Right? Like, you can't be ISIS on the
16 platform. You can't be the KKK on the platform. And so, if you are representing
18 The other thing that happens sometimes is they don't represent themselves as
19 such but, because of who they are, they will have content violations on their profile.
20 And so sometimes that fan-out, you know, manifests in different ways depending on the
21 circumstance.
22 Q But this wasn't the case for T3 organizations, which included MSMs and
24 A That's correct. Just to be clear, that was disclosed publicly when the policy
25 was announced.
3 Q Yeah.
4 So are you generally -- at any point after 1/6, are you aware of an intersection
5 between known militia and conspiracy groups and either the organizers or membership
7 A I mean, there was certainly -- there was certainly, you know, membership
8 overlap to varying extents with some of those Stop the Steal groups, you know, and the
9 Patriot Party groups. And I believe this document details -- you know, as you suggested,
11 And those are the kinds of things that, to us, were quite concerning from a
14 understand what you mean by that, but I wondered if that ever extended, then, to the
15 implication that accounts are a form of actor that needed to be sort of monitored and
19 One is that sometimes an individual is designated under the policy, right, like
20 Hitler or the Unabomber or somebody like that, right, where they're not clearly
22 And then there are other folks that were sort of brought under but in less, sort of,
23 extreme cases, the Alex Joneses of the world, where there's a lot of nastiness but not
25 And then you've got situations where you've got leaders of designated groups or
1 members of designated groups, and if they are understood to be that, then they can
3 And, now, as a practical matter, right, like, if you are a low-level member of many
4 different kinds of organizations, that's not something that Facebook is going to know
7 al-Zawahiri, you know, you know who they are, you can look out for that kind of thing.
8 And folks representing themselves as Ayman al-Zawahiri -- even though, you know, I don't
9 think he personally had a Facebook account, right? But if it's a, you know, Joe Schmoe
10 that's a member of, you know, even a noxious organization like the KKK, if they're not,
11 sort of, putting that front and center, you know, Facebook's not going to know. How
12 would they?
15 huge problem not just for dangerous organizations but for everybody else, right?
17 definitely used some of those. Some of those things were developed originally by us.
18 You know, I think spam and some of the more sophisticated state-actor teams developed
19 some of those kinds of techniques that were then borrowed and adapted.
20 But the thing that I would say is that very, you know, smart, resilient actors that
21 want to beat that stuff, they're going to find ways to do it. Like, anti-recidivist efforts
22 are good, they raise the bar, they make life more difficult, they can make it seem like it's
23 not worth it to operate on a platform and drive people elsewhere, but they are
24 incomplete solutions.
25 Q Who was the first line of defense against recidivism, at least in your policy
1 area? I mean, was that your team? Was there someone proactively looking for it, or
4 had been built. Like, we built some in the dangerous orgs world years back.
5 And they were kind of -- you know, the challenge is, when you build something,
6 you're -- like, when you're a subject-matter-focused team, like the dangerous orgs
7 network, you can build something, but oftentimes you're sort of holding it together with
8 duct tape, with digital duct tape, right? And if you really want to build something that's
9 scalable, that is flexible, that is easy to update, do those kinds of things, then you really
10 need to push those things into more centralized engineering teams that oftentimes take
11 longer to build something, but they'll build something that's more robust. And those
12 are the kinds of things that the central integrity team would build. And so we would do
13 that.
14 But, also, we ran a number of SNDs, strategic network disruptions, against some
15 of these groups, right? Proud Boys, Oath Keepers, a range of others. You know, we
16 ran them against ISIS. And Facebook announces those occasionally. But those kinds of
17 efforts would be more targeted investigations of a network, attempt to take the network
18 down as a whole, or as much of it as you could, you know, sort of gather in and
19 understand as part of the network. And then you can, sort of, enroll some of those in
20 more targeted recidivism programs that some of the more specialized teams built.
21 And those are good. They're still not perfect. And, you know, if somebody
22 decides they're going to go get a new device, for example, they can make another
23 account, you know? And there's not a lot Facebook can do about that. So recidivism is
24 a real problem.
25 Q Were any of those strategic network disruptions ever targeted toward any of
1 the actors involved with the initial Stop the Steal groups or the others that percolated
3 A I don't recall. They certainly -- there were network disruptions dealing with
4 the Proud Boys, the Oath Keepers, other militia groups, Three Percenters. I don't recall
6 Q Okay.
7 Earlier, you talked about the tradeoffs related to false positives. I was
8 wondering, when you do enforce against accounts in this space, are there ever
10 platform?
11 A Always. Yeah. Of course. It's one of your great fears when you're
12 dealing with these kinds of situations, is that, like, you know, the purpose of a social
13 media platform, whether it's Facebook or Twitter or whatever else it is, is to, you know,
14 maximize voice. That's one of -- has always been one of Facebook's values. You want
15 to empower people to speak. It's one of the great promises of the internet, is that it
17 The dangers that come along with that are manifest, right? That some of those
18 people are going to use that platform for real harm. And drawing the line between
19 those that, you know, may have contentious ideas and those that are going to use a
20 platform to advance real harm is not always easy. And so, you know, all of these
22 Q And what were the views of the senior leadership that you interacted with
23 on this question? Did they ever -- did they justify any decisions during the election
24 period, 2020 into 1/6, based on these concerns about voice, this purpose?
1 And so, you know, this is one of the risks of keeping the break-the-glass measures going,
2 for example, because they reduce voice for some, and they don't just reduce it for the
3 bad guys, right? That's the problem, is, when you've got algorithmically adjudicated
4 policies, you are, by definition, reducing voice not just for the ones that you want to get
5 but for other people who are not necessarily doing anything wrong.
6 And that is a -- you know, that's a very, very difficult -- you don't want to make
7 that choice, but that is the choice you have to make at scale. And it's the kind of choice
9 But certainly that's the kind of thing that -- you know, Facebook doesn't want to
10 reduce the voice of people that are expressing -- I shouldn't characterize. I don't speak
13 constructive way and aren't, sort of, coordinating real-world harm and real-world
14 violence.
15 And I think the trick here is that, clearly, the Stop the Steal movement did
16 contribute to real-world harm and real-world violence, but they were also reflecting a
17 genuine, misinformed view by lots of Americans that the election had been stolen, and
18 expressing that frustration, as crazy as it was, is protected speech. And I think that that
19 puts that narrative in a very difficult place for leaders of a social media platform.
20 Now, you know, when you put that all together, from the perspective that I'm
21 coming from, I think, you know, there was real possibility of violence, and that's how it
23 Q So what you said makes sense in the instance of account takedowns, right?
24 You don't want to necessarily remove someone's voice from the platform.
1 interventions, as more about things about how Facebook steers content around the
2 platform.
3 Did Facebook make a distinction in its commitment to voice in areas about the
4 speech acts of individuals versus areas in which Facebook demoted or amplified certain
6 A Can you repeat the question, I'm not sure I totally understand the
9 principle of voice when thinking about soft interventions, about things like, say, reducing
10 or, let's say, demoting content that's likely to be hate speech or likely to be violent
12 A Right.
15 A Yeah.
16 Q So the hard versus soft intervention distinction, how did Facebook apply
17 voice there?
18 A Yeah. So I think that -- I mean, this is a very tricky question. All social
20 I think what they were trying to do, what social media platforms in general are
21 trying to do, is serve up content in an environment where a user -- there's no way a user
23 it's primarily organized by your friend network, the groups that you've joined, the pages
24 you've liked, the stuff you've searched for, right, by signals that users are putting in.
25 But all of that does get sort of mediated through an algorithm that has to select
1 which posts from your friends, which posts from the groups that you like, all of that kind
2 of stuff, and you're probably not going to see everything. And so there is an effort to try
4 This is the same instinct that drives -- you know, people say, well, it should just be
5 reverse chronological order. And the assumption there is, stuff that's more recent is
6 more relevant. And what Twitter and Facebook and YouTube and everybody else are
7 doing is they're saying, well, how do we get better than that? Recency is not the only
8 thing that's relevant. It's also who are your friends, et cetera, and then something about
10 And so what Facebook released are these, you know, content distribution
11 guidelines, which I think you guys have probably seen. They're pretty high-level. But
12 they basically say two things. They say that, at some level, on a classifier score,
13 if something is, like, borderline violating, right, if we think it's close to violating, we may
14 down-rank it. So it might not be hate speech, but it's pretty nasty stuff; we're going to
15 try to make sure that we don't, you know, promote that stuff, you know, more than we'd
17 And then the second thing they say, --, is what you pointed to, which is, our
18 classifier says there is some percentage likelihood that this is hate speech, right, and
19 we're going to down-rank it, you know, based -- we're going to down-rank content that
20 hits that likelihood level, right? And that's what was changed oftentimes in the
22 Now, the problem is, if you're saying, we're going to down-rank something if it's
23 50 percent likely that it is hate speech, that it is the thing that we want to
24 down-rank -- because we don't have the capacity to review it all to see if it's hate speech,
25 and we're worried about the damage that it might do, so we're going to reduce its
1 visibility, et cetera -- that means that half the things you're down-ranking aren't hate
3 And that has an impact on voice. Because that other half, it might not be close
4 to that. It might just be -- that's the problem with algorithms, is that sometimes when
5 an algorithm is wrong -- like, when human beings make mistakes -- because human
6 beings make mistakes when they review stuff all the time, right? Everyone always
7 blames the algorithms, but human beings make mistakes too. But human beings tend to
8 make mistakes in ways that sort of make sense, right? Like they missed satire or they
9 were moving too fast or they just pressed the wrong button, you know, like we make
10 mistakes. Sometimes the algorithms make mistakes in ways that are really hard to sort
12 And so, if you're going to take action against those things based on that
14 that in an ideal world you would not be down-ranking, you would not be sanctioning in
15 any way. And there's no way not to, at scale, when you're using an algorithm.
16 And I think that's the tension with the break-the-glass measures, is that that
18 content.
19 What happened with the break-the-glass measures is that, because of the general
20 circumstance, Facebook's leadership said, you know, we're going to accept more false
21 positives during this particularly concerning time period, but that we want to get back
22 and don't want to have that impact on voice over a longer period of time.
23 And so, you know, I think the context suggested that the risk remained during that
24 period, but I think that basic tradeoff is a really, really difficult one. And, frankly,
25 nobody knows exactly -- like, nobody knows exactly where to draw those lines.
2 reasonable threshold?
3 A Oh, that's -- I mean, I don't know, --. Like, I think we need -- you guys
4 should -- there should be a congressional commission looking at classifiers, not just social
6 Like, I mean, I really mean this. There -- like, we -- maybe not a congressional.
10 And it should not just be about social media. Like, social media is one way we
11 should think about this, but it should be, you know, facial recognition by police
13 what do we do for real estate classification and its impact on civil rights.
14 Like, there are so many areas where we don't have ethical guidelines for the use
15 of artificial intelligence when there are inherently going to be false positives and false
16 negatives, and every industry has to make it up for themselves with no sort of
17 overarching framework.
18 Now, there are people out there thinking about this stuff, but I just -- this is a
19 debate that is going to increasingly enter the policy arena. It has done so through the
20 lens of social media is where it gets attention. But this is part of the 21st century
21 and -- or the 20th -- what century are we in? -- right? It is part of our policymaking
23 And I just -- like, I could pull some numbers out, but, honestly, I just think that the
24 takeaway is that we actually need a much deeper investigation and discussion about this,
25 where, like, a lot of stakeholders come to the table, so that, in the future, social
1 media -- in part so that social media companies can say, look, we are abiding by the
2 guidelines that have been set by some commission, rather than just we made it up
3 ourselves. Like, in the most well-intentioned way, they still made it up themselves, and
6 have a responsibility, then, to be transparent about the classifiers, how they're applied,
7 their impact, any kind of assessment? Is there a need to sort of pull back the curtain on
8 these kinds of questions? Which, to date, I don't think they have really done in a
9 meaningful way.
11 more transparent about what these, you know, classifiers do. But, at the same time, I
12 think that responsibility needs to be applied broadly across industries. And I don't think
13 that it's right to say that Facebook specifically has that obligation, distinct from other
14 platforms.
16 A Yes, I do.
17 You know, I don't think that we should expect every little detail. I mean, you
18 know, Elon Musk says he's going to open-source the Twitter algorithm, and I think what's
19 going to happen if they do that, it's going to be totally anticlimactic and uninteresting.
20 And I think that's mostly the case here for Facebook, with the exception of, you know,
21 what are the guidelines and the circumstances in which you're going to alter things
23 And I think those are kinds of things that platforms ought to be transparent about.
24 What are the factors that really go into these algorithms, and what are the situations
25 where platforms make changes, like the break-the-glass measures? How are they
2 And they should be transparent about when they turn them on and when they
3 turn them off and that decision process. But they should be able to do that in a way
4 within some boundaries that we have collectively sort of resolved on in society that are
5 reasonable, right, like, these are reasonable places, as opposed to the companies making
6 it up themselves.
7 Because I think that's -- you know, I think -- I just -- people don't want the
8 companies to make those decisions on their own. We, as a society, I think can help
9 provide some ethical guidelines that would be good for the companies and, frankly, good
10 for everybody.
11 Q I want to move back up -- this has been really useful, but I want to get out of
12 this segue a little bit to move back to the exhibit and ask: Were you aware of any
13 studies like this on the growth of Stop the Steal, any monitoring efforts or investigations,
14 before 1/6?
15 A Yeah, I'm sure there was something. I'm sure there was something. I
16 mean, there -- but I couldn't point specifically. I mean, it was certainly a movement that
17 we were aware of and are aware of, you know, other elements of it that were not those
18 that were taken down. So I imagine somebody was investigating. But I don't
21 conducting studies?
22 A I remember discussion about Stop the Steal and whether or not we should
23 take more aggressive action against that, sort of, manifestation on the platform.
24 Q Uh-huh.
25 A And because the original group did come down, you know -- and, again, I
1 don't remember exact -- I think it was coordinating harm. I don't remember exactly
2 why. And so there was discussion about, you know, what to do with the rest
3 of the -- you know, with other parts that were reflecting similar sort of perspectives.
4 And, inevitably, there'd be some analysis that went along with that, but I don't remember
5 precisely.
6 Q Who would've made the call to take that original group down?
8 Q Were they typically involved in taking enforcement actions like the deletion
9 of groups?
10 A Not at scale. I mean, like, you know -- you know, untold numbers of
11 Facebook groups will be deleted, you know, in the next 20 minutes, you know, via rote
12 actions, right? And they will probably not be involved in any of those.
13 But certainly when you get to, like, really, you know, either very difficult,
14 borderline calls, areas where the policy may have, sort of -- you know, the language they
15 use is "spirit of the policy" versus "letter of the policy." I'm not sure that's the best
16 language, but that's often what's used -- and things that are going to be scrutinized
17 highly, like a decision like that, you know, those are the kinds of things that would be
19 Q What was the controversy around the takedown of Stop the Steal that
23 One is that the group was -- such groups were primarily distributing
24 misinformation about the election. And so there is a -- you know, the erroneous
25 assertion that the election had been stolen. And this was before the election was
1 certified, I think, that it came down, right? So there is, you know, some -- you know,
2 removing that stuff is easier after the election is certified and, you know, including after
5 assessment, you know, but you're bringing in other perspectives. Like, maybe there's
6 overlap with, you know, some militia groups that have come down or something like that.
7 And so, you know, you're operating in a little bit of a gray area there.
8 But it's also a dynamic where this is a high -- like, you know, could not be a more
9 highly politicized environment, obviously. And so, you know, senior leadership, that's
10 the kind of thing that they would want to be aware of and want to understand and, you
11 know, potentially shape, but certainly you'd want to be able to explain any decision in
12 very clear terms. And that gets tricky when it's a gray area, right?
13 Q Was this something that they discussed, this highly politicized nature of the
15 A Well, I wouldn't say that the takedown -- I mean, maybe I did say that, and I
16 didn't mean to. I wouldn't say that the decision itself was highly politicized. I said that,
17 like, you could -- you know, Facebook really tries to make these decisions without, sort of,
18 reference to, like, political perspective. But, obviously, the impact would be, this would
20 And I'm sure that that was discussed. I mean, how could it not be, you know,
21 when there was, you know, a huge -- you know, the MAGA movement was roiling for a
22 fight at that point, and Facebook is obviously one of the targets of that movement.
24 they weren't around civic content or they weren't around politicized content? Were
1 A For the most part, that's where things get really tricky. I mean, sometimes
2 there'd be harder ones, if it's a bullying case or, you know, certainly some of the
3 misinformation, medical misinformation stuff, which, you know, clearly has a civic
5 You know, there were cases -- you know, there are edge cases with all of them. I
6 think most of the time things would go up when there's a political dynamic, but there are
7 edge cases in all of these policies, including things like child safety, right?
8 And, you know, once you get to the edge of the policy, it's hard and it's tricky, and
9 there's -- you know, you may be impacting somebody's livelihood. And so it may not be
10 a big civic/political question, but it is a question that may matter in the real world, and if
11 you're going to take action, you've got to try to think through the consequences, the
13 So I would say that it's, you know, mostly the civic stuff and the political stuff, but
14 not exclusively.
15 Q Do you remember the arguments for and against taking down the initial
16 groups?
17 A I don't. You know, I'd have to characterize, but I worry that I'd just be sort
19 Q Okay.
20 Moving on from that decision, obviously, Stop the Steal groups continued to
22 A Yep.
23 Q What was the ongoing debate like? I mean, who were the players in the
1 you know, I remember that the decision was made to take down the core group and not a
2 wider set of the movement. And then much of the conversation moved on.
3 There was a discussion about the break-the-glass stuff and what to do with that.
4 There was a ton of work still ongoing within the dangerous orgs world about specific
5 kinds of threats that we were seeing in various places. You guys mentioned earlier
6 some of the protests around capitol buildings, and those were -- you know, various
8 And then I think there was, you know, how to prepare for some of the -- the
9 Georgia special election, which was obviously a big one. And I think much of the, sort
10 of, civic team and others were really focused on making sure that that got the same kind
11 of protection, because it was so critical -- you know, everyone, of course, understands the
12 context -- and that disruptions there, you know, were just as likely, perhaps, as they were
14 But I don't remember, sort of, specific debates about other pieces. I mean, like,
15 we were doing a lot of work trying to, you know, chase down the Proud Boys. And we
16 did some network disruptions in December, if I recall correctly -- I'm pretty sure that we
17 did -- of militias and maybe the Proud Boys again. We'd done a bunch of that stuff.
18 But it was not -- I don't remember the, sort of, ongoing debate about Stop the
19 Steal. I'm sure that it lingered to some extent, but it was sort of a decision that was
20 made where we took down the main group, we'll leave up some of the others.
21 And there may have been subsequent debates about specific groups that had
22 come up for one reason or another: There was a lot of hate speech or there was a lot of
24 Q So, going back to the exhibit, this report describes a pattern in which a small
25 number of connected individuals drove a great deal of the Stop the Steal activity,
2 I think one particular statistic I'll quote is that 0.3 percent of the members were
4 A Yes.
5 Q And up to two-thirds of the invitations were sent by people who sent, I think,
8 groups?
10 recommendations and that's how groups grow. I definitely think that is garbage. The
12 That's how -- you know, I know you guys have another exhibit with -- is it Carol's
13 Journey? Am I getting her name right? Or, she's not a -- I guess -- anyway, the
14 persona. And that's sort of telling a story through recommendations, right, that it is
15 possible to rabbit-hole. And I think it is possible -- and, especially, this was, you know, in
16 the middle of 2019. Some of those other protections were not probably in place yet.
17 But, in 2020, after we took action against QAnon and with Stop the Steal, I think
18 the way that those groups grew after that was through invites. It was through
20 And that's not surprising, right? That really shouldn't be surprising, because,
22 organizations, whether these movements are built for, you know, "let's build a new
23 library in our community" or they're built around "let's call into question the Presidential
25 And that kind of perspective, though, of, like, go find the person, go find the
1 organizer, go find the real-world thing, inside of a place like Facebook, that attitude was
2 very, sort of, dangerous orgs' thing. Find the real-world thing.
3 And that's not exclusive. Like, other people -- I think some of the great
4 researchers put together some of these documents that have now been leaked, and they
6 But that notion of "go find the real-world thing that's driving this" is just not how
7 social media has thought about these things. They've thought about this stuff as just
8 sort of, it's a systemic thing, that it all comes together. And, you know, honestly, some
9 of that is true, but at the end of the day, push comes to shove, there are still oftentimes,
10 in my mind, people back there behind this trying to pull the strings, trying to make these
11 things happen, trying to do it. And I think some of the discussion about social media,
2 [2:25 p.m.]
3 BY MR. --
4 Q And during the period after the election but before 1/6, there was no study,
5 to your knowledge, no assessment that would have identified these super inviters and
7 A You know, I don't want to -- I don't want to categorically say it wasn't there,
10 A Yeah. Yeah, I don't remember it. We certainly did some of that stuff
11 where we saw on QAnon prior to the election, we understood the dynamics. There
12 were a bunch of inviters on Q that were -- some of those groups that were coming back
13 after we took them down. They would come back, and they would try to rebuild, and it
14 was that invite kind of functionality, which said to me that somebody had lists of names
15 and somebody had lists of, you know, Facebook handles or somebody had email
16 addresses that they were keeping somewhere trying to bring those things back and
17 recreate those communities. That's how I interpreted it, but I don't actually have
18 evidence of that.
19 Q This report actually alleges that that is how these groups reconstituted
20 themselves, that there were backup groups made in preparation for takedowns.
22 A Oh, yeah, backup groups is definitely a thing. I mean, all of the things we
23 saw these folks do, everybody does. And, you know, I'm not relating these folks -- this is
24 not a direct comparison to some of the more extreme groups I'm about to mention, in
25 terms of like who they are and what they believe. It's different. But when groups get
1 pressure by a social media company, they take standard behavioral adaptations. One is
2 they make backup groups. They make backup accounts. They create ways to get back
3 on. They keep lists of their friends. They create a safe account where they -- a fake
4 safe account so that they can coordinate there but not actually do any of the nasty stuff
5 there, but they can figure out where they're going to plan and recreate, right? And ISIS
6 has done all of these things. Hamas has done all of these things. Patriot Front did a lot
8 And so, these groups that were doing nasty stuff and facing that kind of pressure
9 through a variety of different policies on Facebook, they started to do the same thing, the
10 same thing you or I would do, right, were we in that situation. They are just trying to
12 Q Was part of this tool kit the use of multiple accounts, single-user multiple
13 account activity?
17 A That's right.
18 Q Is that a bannable offense? Are people removed from the platform for
19 that?
20 A Yeah, if you're using fake accounts like that, you can be. You can be
21 banned. But usually what would happen is -- I think where that would happen more
22 often is if one of those accounts was taken down for something, would you fan out to the
23 SUMAs.
24 Q Sorry. To clarify, would you do that if one of those accounts was taken
2 Q In dangerous orgs you did. But for other policy violations, say, for
3 violations of -- you know, for someone who runs a group that's highly toxic that has high
5 A I don't know. I'm not exactly sure what all the details were. This is
6 where, like, things get complicated, and I'm just not sure across the board.
8 So the individuals who are these super inviters, the report also says that they
9 tended to interact with one another a lot. They messaged each other, commented on
10 each other, tagged each other in posts. Is that a signal that you would look for to
11 identify something like a dangerous org or a network of people engaged in activity that
12 could be concerning?
14 wouldn't necessarily be sufficient. People tag each other all the time.
16 A Right.
17 Q Yeah. Do you have any insight into who these super inviters actually were?
18 I mean, was there ever, even after 1/6, an effort to identify the offline groups of
19 individuals, if not their individual identity, but the offline network pushing these groups?
20 A I don't know if that question was ever fully answered. It's an interesting
21 one. It's certainly one that we were interested in. I don't know if it ever really got
25 don't know if that specific question was addressed. I just don't know. I don't think -- I
1 mean, I don't think we ever really knew who, like, the offline people were that were
3 Q Did they tend to not use their real name? I mean, I have to use an email
4 address to sign up for a Facebook account, and I guess I could create, you know, a false
5 email address, but did they tend to obscure their identity? Do the type of people who
8 know -- I mean, I don't know. There may be more in that document. I just sort of
9 skimmed it this morning. I don't know what their practice was in this case, you know.
10 It's possible that I once knew with more specificity than I do now, but it's been a while.
11 Q Did you feel like you had runway to investigate who they were? I mean,
12 was there an issue of priorities or did you get any guidance from leadership on what you
14 A No. I never felt like I was getting leadership direction on that question.
15 We were more -- you know, the question that I was most interested in was how much of
16 the violence on January 6th was organized on Facebook. That's a very dangerous way of
19 A Oh, absolutely, they were organizing on Facebook. But the most acute
20 versions of organization for violence was not happening in the Stop the Steal groups,
21 right? It was -- that's where people were certainly using some violent rhetoric and doing
22 those kinds of things, spreading misinformation related to the election to get people onto
23 the Mall.
24 But when you look at the planning that was done by the Proud Boys, the Oath
25 Keepers, other organized groups, like I said, there was a little bit of that in terms of, like,
1 logistics that occurred on Facebook, and it's come out in the indictments. But most of
4 Proud Boys, the Oath Keepers, the types of militias you just mentioned, and the
7 The overall militia movement, yes. But, you know, the Proud Boys and the Oath Keepers
8 are relatively small organizations. Some of these Stop the Steal groups were massive.
9 And so, yes, I'm sure, but at the same time, to my knowledge, those groups, those
10 real-world groups, meaning Proud Boys, Oath Keepers, et cetera, their actual logistical
11 planning for activities on January 6th was not happening in those groups.
12 Q But, again, it seems to me like the super inviters were engaged in sort of a
13 logistical activity of manipulating the group's feature. Wouldn't you have wanted to
14 know if part of the overlap was between the organizers of those groups and the
15 organizers of the dangerous folks offline? Wouldn't that have been a pertinent
16 question?
17 A I think it is, and I think it's a fascinating question. I think it's an important
18 question. And, to my knowledge, there is -- like, I am not aware of that overlap, which
19 doesn't mean it doesn't exist. It just means I'm not aware of it.
20 Q But you said it was something that you were interested in. It doesn't seem
21 to have been a question that was answered. Why wasn't this -- why wouldn't this have
22 been a priority to understanding whether or not January 6th was organized on Facebook
24 A Well, look, because -- you know, the folks that we were looking at, primarily
25 as organizers of violence, were operating in other -- primarily in other settings, and that's
2 Now, I presume that if we had seen that they were leading one of the, you know,
3 major Stop the Steal groups that's something like -- that's a finding that I suspect would
4 have been surfaced to me. I don't remember that finding being surfaced to me.
5 And so -- like, you could read into that whatever you like. Did we not look as
6 much as we should have? I don't think that's the case. But on the other hand, it's not
7 definitive evidence that that question was answered in the negative either. You know,
8 my -- well, I'm not going to make a presumption. I don't know. But we certainly
9 endeavored to understand the operations of the Proud Boys, the Oath Keepers, what
10 they were doing on platform. This is not something that was raised to me as a core -- as
12 Q If this report had been available to you, let's say, December 1st, if this -- that
13 may be too early, but let's say sometime in December, you have the insights about the
14 way this network is constituted. This report also mentions, I should say, that these
15 groups had -- and these individuals had much higher prevalence of violent incitement,
17 If this data had been available in real time in December, would you have
18 advocated for a change in policy, either, you know, that maybe election delegitimization
21 Facebook?
22 A I think that this -- and I did make those arguments in December, you know.
23 You know, like, I did make the argument that we needed to address a broader range of
24 the Stop the Steal groups. I did make -- because I thought that there was evidence that
25 they were being sort of constructed, and that there was overlap with militia
1 organizations -- unit groups, and things like that. I mean, I did make that argument.
2 I think all of those arguments were happening, were occurring in policy gray areas,
3 and that's why it was tricky. The -- so -- but I did. I mean, I thought we should be more
5 The -- I still -- and I just can't say this enough. I don't think that that would have
6 changed things on January 6th because I think that this sentiment was so widespread, like
7 so widespread across the MAGA movement, on Facebook, off Facebook, et cetera, and
8 through a range of different, sort of, entities beyond just groups like Stop the Steal. In
9 fact, it's not clear to me at all that those were the most important groups driving the
10 people to the Mall that day. And I think they've gotten the most attention because
11 they've got a sexy slogan. Doesn't mean they were the most important, in my view.
12 So I think that sentiment was really, really widespread. But I did argue that we
13 should take a stronger action against a wider range of those groups because of that -- just
14 for the same reasons you're talking about, because of the overlap with militias, because
16 I don't think that we knew those things in as polished a way as that report, you
17 know, puts them together in a pretty coherent way. We didn't know all of that stuff, but
18 we had hints of those kinds of things as time was going on. And, you know -- and I
19 thought we should take action as a result of it. But I also think that some of this was a
20 gray area. We didn't know who these people necessarily really were in the offline
21 world. And I do think that, you know, it would have been -- and I am skeptical that even
22 the steps that I advocated and that I still think would have been the right thing to do
24 Q What about phrases that were also used, like "Storm the Capitol," which is a
25 crime, and it's an act of violence. It would be difficult to storm a Federal Building I think
1 without being an act of violence. If that had been the main slogan -- it was certainly
2 present. I was able to find an example of a Facebook group with that as its banner just
4 A Yeah.
5 Q But if that had been the main slogan, would the decisions have been
6 different?
7 A I don't know. So a phrase like "Storm the Capitol," I think it's possible that
8 that would have implicated the violence and incitement policy a little bit more,
9 because it's a bit more specific about an action and a place. And usually with the V&I
10 policy, a lot of times it's an action, a place, and a time. You know, if you want to get real
12 I think the question that would come up there is, you know, how much of this is a,
14 Q And when you made these arguments, to whom did you make them?
15 A Oh, I don't remember specifically, but, you know, there were many
17 happened within the policy team, you know, within the product policy team and then
18 beyond.
20 A I mean, it would have been the usual suspects, so it would have been the
21 policy team, you know, product policy, some of the public policy folks at a senior level, I
22 assume some of the strategic response folks like Molly, and key engineering leadership
23 like Guy.
24 Q How would you characterize the stances of those -- of the leaders of those
25 teams? I mean, people like Guy Rosen, Molly Cutler, and Joel Kaplan, where did they
2 A You know, it's a fair question. I don't remember where everybody was, you
3 know. I didn't keep detailed notes on, you know, where everybody was in the debate.
4 You know, what I would say is, that I think everyone was falling in slightly different places
5 in terms of, you know -- characterize it all you want -- paranoia, caution about the
6 possibility of violence at that point. I think it's fair to say that the dangerous orgs world,
7 we were generally on the more paranoid, cautious side of things, and -- but everybody
8 recognized the balance and the tradeoffs and the risks between, you know, as you're
9 attempting to limit the possibilities of violence. And, you know, you can never prevent
10 it, but you can mitigate it, and I think companies do have an obligation, even when they
11 can't necessarily prevent something entirely, to mitigate risk in reasonable ways. And I
12 think we could have done more during that time period, but I don't think we -- you know,
13 but I'm skeptical that it would have had an impact on what actually occurred on
14 January 6th. And I think that that's part of it is, you know, that other folks certainly I
15 think were reticent to disrupt, you know, an emotional and political sort of cry of
16 frustration which was based on, sort of, like wanton misinformation, but was still
17 genuinely held by a big chunk of the electorate and many of whom, you know, were going
19 Now, I was concerned, as were others, about the possibility that that would spill
20 into violence, and it did, but I think, you know, it was -- you know, even the most
21 concerned of us didn't know for a fact that was going to happen. We were -- you know,
22 we understood that there was risk, and we were -- and we thought we should take steps
25 individuals to go to the Mall was the kind of thing that was circulating Stop the Steal
2 A I think it was the belief that the election had been stolen, that there had
3 been fraud somewhere in the process. And I think that that misinformation was spread
4 by the President. It was spread by broadcasts and cable news, and it was -- not so much
5 broadcasts, I guess -- mostly cable news, and it was spread online, including on Facebook.
6 Q You said earlier that it would have been difficult to shut down these groups,
7 this narrative, this movement without shutting down a large part of the conservative
9 A I do think that. I don't think it would have -- I think it would have been -- I
10 think that that narrative was so closely entwined in the larger conservative media
11 ecosystem that to -- so to address that narrative in any substantive way would have
12 meant the widespread removal of the conservative media ecosystem on Facebook.
13 Q Do you think that was a major factor for some of the individuals that we just
15 A I don't know. I mean, like, of course, the -- if you're going to take a decision
16 that's going to impact, you know, 50 million Americans, that's something that people are
17 going to think hard about, right? You know, I'm making up numbers. But I think
18 that -- yeah, of course you have to understand when a decision is that significant, I think,
19 of course, people are going to be thinking about the scale of the potential impact, but I
20 also think the reality -- you know, some of that is that, you know, much of that was an
21 expression of protected political speech. And, you know, connecting that to the direct
22 risk of violence was something that I was more willing to do. But even I wasn't arguing
23 that we should take down the entire ecosystem, right? Like, I wasn't. And I think that
25 I think the possibilities for the further delegitimization of the election among
1 those people would have been high, and I think even then the impact on January 6th is
2 really up for debate. I don't know at all that that would have limited the amount of
3 people that showed up on the Mall, and I think there's a real possibility that they might
5 So, you know, I think that's -- the only way that that notion, that idea would have
6 been dramatically reduced in its prevalence on Facebook, would have been that very,
7 very extraordinary step, and I'm not at all convinced that that would have resulted in a
8 better real-world outcome, even though, like, as I said, I argued for more aggressive
9 measures than the ones we took. But I just think that when the President and leading
11 companies like Facebook can do better, but they can't be ultimately the gatekeepers.
12 They can't hold that back. And I think Facebook could have done better, but I don't
14 Q The exhibit mentions that most, if not all, of the fastest growing civic groups
15 on Facebook during this period were Stop the Steal groups. The groups, as a feature of
16 Facebook, facilitate the spread of this narrative. I mean, what is the culpability of the
17 groups feature in the prevalence of this narrative on Facebook? Let's keep it bounded
18 to on-platform consequences.
19 A Yeah. I think groups are oftentimes tricky and problematic, right? And
20 it's not all groups. There are public groups. There are private groups. There are
21 secret groups. They all pose different kinds of worries, right? You know, public
22 groups, obviously are, in some ways, more important for the proliferation of
23 misinformation at wide scale. Secret groups were more often used by -- you know, by
24 more private entities, right? So a lot of the militias used secret groups, for example.
25 And because they're not trying to broadcast everything, they're trying to create a
1 tighter-knit group and community. And those dynamics are really different.
2 So I do think that, you know, any time that you sort of can create that kind of
3 environment, you know, there's risk of abuse. That said, like what people have got to
4 understand -- and one of the great lessons of studying terrorism over the years is you
5 just -- you can't approach any community or any group by their absolute worst people
6 and expect to understand the wider population, right? So you can't say, you know what,
7 I'm going to understand, you know, Iraqis via the lens of ISIS. Like, that's wrong. It
8 doesn't work that way. Are there Iraqi members of ISIS? There are. But that's not
9 how you understand Iraq. That's not how you understand Iraqis.
10 And anytime we approach these kinds of risks, like, it's -- you know, just recognize
11 and be wary of the danger, the analytical danger of saying, Look, there are some of these
12 bad groups; therefore, all the groups are bad when we know many of these groups are,
14 Those things are real, and I think sometimes they're easy to dismiss in debates
15 about the downsides of social media, but we have to remember them. And look, I am
16 not some -- you know, I am not pollyannish about the role of social media in society by
17 any stretch of the imagination, but I do think that it's easy to forget that reality, that
18 those same tools that are abused are used by people that are doing genuine and good
19 things.
20 That said, the challenge with groups is that it's hard to figure out who's going to
21 be responsible, right? When does the group go bad? Does the group go bad when the
22 administrator of the group is setting it up for a bad purpose, right? Does the group go
23 bad when a bunch of users in the group use it for a bad purpose and the administrator
25 How do you set up accountability within that entire structure? When is it that
1 administrator's responsibility to just govern the group and make sure that they're kicking
2 out people that are doing nasty things? When is it Facebook's responsibility to take
4 And I do think that that sort of notion of who's going to be accountable, and who's
5 responsible in that sort of larger chain is one that is harder to resolve with the
6 a product like groups where you do have users that are in an authoritative position,
7 but do have the ability to moderate content on their own independently of Facebook, and
8 I think that that sort of dichotomy sometimes led to paralysis inside Facebook, because
9 the idea was, Hey, we just need to provide that administrator more tools. And
10 sometimes these groups, they were -- they were, like, a conservative political group, and
11 the administrator didn't set it up to be a place where people are fomenting violence on
12 January 6th, but a bunch of people showed up there and that's what they're using it for,
13 right? And so that's not totally -- that's not hypothetical, right? Those are real things.
14 At the same time, you've got groups set up by somebody for the express purpose
15 of having a discussion about how the election was stolen, we need to do something about
16 it, let's all get riled up, show up on January 6th and maybe -- and, you know, we're raring
17 to go if the Proud Boys start knocking down police barricades, you know.
18 And all of those things are true at the same time. There's a tremendous amount
19 of variation in the use of these groups, but fundamentally, the challenge from a
22 administrator level, and some of it is at the user level. And I do think that how those
23 sanctions get applied and how those rules are going to operate is trickier with a product
24 like that than it is with something more straightforward like an account, for example.
25 Q Was it the view before the election of senior Facebook leadership that the
1 best way to assess a group for harm was the intent of the administrator?
2 A I think oftentimes.
3 Q Oftentimes?
4 Mr. •. Okay. I want to make sure, •, if you had any points for
6 Mr. •. I think what might make sense is to take a break for 5 minutes, and
7 then we can come back for the last hour, if that's all right with everyone?
10 Mr. •. So let's go off the record, and we'll come back on the record at 3:00
11 p.m.
14 [Recess.]
16 And I want to note for the record that a staffer on the select
18 And now I want to turn it over to Mr. • who has some more questions, and
20 Mr. • .
23 BY MR. •
1 January 6th. And some of this has already been covered in our conversations so far
2 today, so if that's the case, if you'll briefly restate or point us back to what we were
4 A Sure.
5 Q We can dive in, and then I think we'll hand it back to Mr. • for some
7 So, first, can you walk us through what Facebook's operation looked like in the
8 lead-up to the election and how that changed after the election with regards to potential
11 Q I guess we're mainly interested in how the legal side would have interacted
13 A Oh. Well, so, you know, the dangerous organizations XFN at Facebook was
14 one of the closest, you know, working across different teams at the company. And so,
15 historically, we would get together -- you know, we would meet weekly. Leadership
16 would meet weekly as well separately from across those different organizations. You
17 know, prior to the pandemic, we met, you know, quarterly to do planning as an entire
18 XFN. And the idea there was to make sure that we were pushing road maps, each of the
19 various teams would roadmap, which in tech company terms is, you know, build a plan
20 independently, but we would meet across our issue areas to make sure that we were
21 collectively pushing things simultaneously, and in conjunction with one another, things
22 like that. That included the legal team, which was closely integrated.
23 Starting in the spring of 2020 -- I don't remember exactly when it was; April, May,
24 somewhere in there -- we began a pace of, you know, several additional weekly meetings.
25 Those began with a lot of the work up around the Boogaloo movement, and I think
1 they -- if I remember correctly, they slowed -- we took them down to, like, one a week
2 because we were hoping things would calm down, and then they didn't, and so then they
4 And so those became -- we also set up other meetings that my team ran that was
5 focused on what we call time-sensitive threats. And so, a lot of that was sharing focused
6 on various upcoming real-world events where there might be violence. Even if they
7 weren't sort of planned, it wasn't specifically organized, but things that we needed to be
8 focused on.
9 Those are the kinds of things that the dangerous orgs XFN ran, you know, my team
10 in conjunction with other parts of the XFN. And some of that was stuff that we would
11 take action on directly, but some of it was really just to make sure that there was good
12 information sharing around those kinds of incidents and potential incidents for other
13 kinds of teams, right? Because sometimes an operations team wouldn't be looking off
14 platform, so they wouldn't know that the Proud Boys had been stating on Telegram that
15 they're going to show up at some event, right? And they wouldn't know that that was
16 part of the context that they needed to have when they were dealing on the day with
17 that sort of thing. So we built systems to try to facilitate that kind of work.
18 Those all continued from the dangerous orgs' perspective after the election. We
19 did not -- we understood that there was still a risk of violence. Those pieces continued.
21 knowledge, they're still doing some of that today, although I don't think they're doing
22 those two extra meetings, which is really about the 2020 cycle at that point, but, you
24 And all of that, from a dangerous orgs' perspective, those sort of institutional
2 So for the pre-election period, up to and including the election, was there a
3 designated channel that Facebook was using to communicate threats to law enforcement
6 There are portals -- Facebook has portals, and I imagine that other large tech companies
7 do too where law enforcement agencies can put in requests. But they also have
8 telephones, can make phone calls to say, Hey, we're going to send something. This is
10 And Facebook actually has programs through something -- they call it the legal
11 outreach team to train law enforcement so that those requests come in in a format that
12 aligns with the law, that they're checking the boxes correctly, that they're making sure
13 that they submit things in a way that a company can respond to, you know, fulsomely and
14 effectively. And so, those kinds of ongoing programs, you know, are out there as well.
15 Q Were there ever any conversations between law enforcement and multiple
16 platforms that you were aware of to, sort of, broader threat landscape discussions?
17 A There absolutely are sometimes those kinds of discussions, either from law
18 enforcement or from, you know, DHS or NCTC sometimes. DHS did a lot of work on the
19 information security side, not so much with my side, but with other -- with Nathaniel
20 Gleicher and his team, looking at, you know, threats of foreign interference, and things
21 like that, to make sure that channels for that are open. But those are little bit different
22 than -- you know, those are broader sorts of threat environments than the FBI, you know,
23 providing a warrant, you know, process saying, Look, we need disclosure of something
24 about this because there's a real-world threat of violence or something like that.
25 And there also are procedures there in case of -- you know, Facebook comes
1 across something that looks like a threat that law enforcement is unaware of where it is a
2 real-world danger, that Facebook can proactively provide that information to law
3 enforcement.
4 Q Except for those broader level conversations, were those happening related
7 well-established, at least for me, as what was happening on the information security side.
8 It was not as well-structured, but I do think that -- again, like, specific on the law
9 enforcement side, we're plugging in with our legal team, with the Facebook legal team,
10 and they were communicating about one thing or another all the time. I mean, there's
11 always something, you know. And, you know, so I think those avenues were wide open.
14 cross-cutting kinds of meetings, but I know that -- you know, I mean, I just can't
16 funds and investigations now have, you know, an online digital component as evidence.
17 And so, companies are really well-practiced at communicating with the -- you know, big
18 companies. Silicon Valley, as a whole, still has a ways to go, in terms of, like, recognizing
19 that you can talk to the FBI without -- you're not -- it doesn't mean you're going to jail,
20 right? And I'm making a little bit of a joke, but, like, I think historically in Silicon Valley
21 that's sort of the fear, right? Like, Oh, the Feds called, you know.
22 But that's not where Facebook is at. Facebook is a much more mature
23 organization now than it was a decade ago. They've got folks with background in law
24 enforcement, in those kinds of agencies. They know how to have those conversations.
25 They know how to push back when a request is not sort of constructed in a way that
1 is -- you know, aligns with the law. But they also know how to provide that guidance so
2 that law enforcement agencies are providing process in the appropriate ways as well.
3 And I think -- you know, I just -- those avenues are wide open.
4 Q Got it.
5 So moving forward past the election, again, you mentioned this, that the pace of
6 meetings had slowed down because you had hoped that it would be calmer. We talked
7 about this a little bit earlier, but could you walk us through how the posture of your team
8 and Facebook more generally changed after the election, and when did it start to ratchet
9 back up again?
11 I distinctly, however, remember having the impression that we in the dangerous orgs
12 world were still terrified, and everyone else wasn't. Everyone was terrified in the run-up
13 to the election. We remained terrified, and I don't think that that was as widespread
15 And, you know, I don't -- like, I don't remember the details over the sort of the
16 meeting cadence, you know. We certainly still had the time-sensitive threats, which
17 turned out to be a great thing. It wasn't my idea. Some of the other folks had it.
18 They made it happen. You know, for some of these, my job was to drive it personally.
19 For some of it, my job was to just sort of sponsor and say I think this is a good idea and
21 I think we probably went down to 1 day a week on these really special sort of
22 coordination kind of meetings, at least for part of it. I don't remember exactly, exactly
23 how that worked, but I distinctly remember the concern that we were really -- we were
24 really worried.
25 And largely -- you know, I know I said this at the beginning -- a big part of the
1 reason why the dangerous orgs team was so worried was because we were looking at
2 things off of Facebook. And that was not a natural way for many teams to think about
3 threats on Facebook, but it was a very natural way for the dangerous orgs team to be
5 Q What was that experience like for your team in the weeks after the election,
7 A Well, I mean, we were kind of used to that idea. Like, this was a
8 team -- this is the team of people whose job it is to be terrified, right? It's the team of
9 people whose job it is to be paranoid and worried about stuff, right? So on some level I
10 think there was just some general consistency. If you went to, you know, Facebook on
11 any day of the week and said, Who are the people that are most terrified, you know, it
12 would be the dangerous orgs people, because they are the ones looking at the most acute
13 kinds of threats, you know, unless you go talk to the child safety folks or, you know,
14 they're looking at different kinds of terrible things, right? You know, and so on some
15 level, that is just, you know, you stand where you sit kind of dynamic, right?
16 But I also think that the more important lesson for the social media companies is
17 that you have to combine -- in order to understand risks holistically, it's really important
19 scale with intelligence style approaches to understand what's the connections between
20 your platform and things that are happening elsewhere in the world, both in the real
22 And that is a jump that I just don't think -- I think all companies probably can learn
23 from that and get better at that. I definitely think Facebook needs to get better at that.
24 I don't think that they've nailed that across the board. I don't think we did it perfectly in
25 dangerous orgs, but I really -- but I do think that the dangerous orgs team does that
1 better than most. And I think that, you know -- and I think where that comes to the fore
2 most is in understanding, you know, what the risk environment was in relationship to the
3 break-the-glass measures, because those efforts were -- you know, and it's really easy to
4 criticize after the fact, to say that they got that wrong.
5 But they had a bunch -- they had a series of metrics that they were looking at.
6 They were not crazy metrics, but they were all on file, and those metrics did go down.
7 Things did change. It did look like things were calming down. They were not making
9 From my perspective, though, that perspective was too focused on the things that
10 they could see, and it's like using a single source of intelligence in the IC, right? If you
11 use a single source, you get -- you may get one answer. You may get a bias from that
12 source. And if you use multiple sources, you can triangulate. You may get a different
13 perspective on things. And I just don't think -- I don't think that companies have really
14 figured out how to do that very well. Part of the reason is that sometimes that means
15 you're sort of using intelligence judgments, right? Because you're looking at things off
16 platform. You can't measure it precisely. It's not the kind of thing that speaks to an
17 engineering brain, to a certain extent, whereas metrics that just tick up and down do.
18 That's how platforms want to operate. That's how founders that are engineers want to
19 operate. It's totally understandable. It's not evil, but it's limited.
20 And I think one of the things that platforms need to get better at is combining
21 these sources of information so that they can make good judgments, and then be
22 transparent about those kinds of decisions, as we discussed previously more -- you know,
23 earlier. And I think that Facebook -- I don't remember what the communications
24 strategy was around the break-the-glass measures, but I do think that Facebook had
25 announced that they were being put into place and that they were coming off, and I don't
1 remember exactly. And that's good. Those things are good. Now, maybe there
2 should be more precision about that. There should be more detail. I think there
3 probably should.
4 But, I think, most importantly, there needs to be ways to measure the risk
5 environment that doesn't just look at the obvious signals on a platform in a quantitative
6 way. You've got to look at the wider environment, especially if you're a big platform
7 where every bad thing is going to show up there eventually, even if it's not there yet.
8 And that's Facebook. That's YouTube. That's even Twitter, to some extent. It's
9 certainly TikTok now. You know, all of those platforms, just because you don't have it
10 today, if it's on the internet, it's going to be there. It might be tomorrow, it might be
11 next week, but it's going to get there. And so some of those things you can get ahead if
13 This is how we dealt with ISIS, you know, in the early days, and it largely worked.
14 Q Right. So are there ways that you wish Facebook's -- some of this talk and
15 information that you were seeing on your team had been different, specifically in the
17 A Um, I don't know. I'm sure there are things that we could have done to be
18 more effective at getting that message out. We tried. You know, inevitably there are
19 better ways -- there would have been better ways to do it. I think the structural -- like, I
20 think the structural problem is that you don't want -- is that social media companies,
21 including Facebook, need real -- people aren't going to like to hear this, but, you know,
22 especially in Silicon Valley, but you need an intelligence function, and I don't think that
24 Because what you don't want -- and there's all sorts of literature about this.
1 read about it all day, about how you don't want policy people doing the intelligence
2 analysis. And you want -- those are separate functions. You want -- and I don't -- and I
3 think -- and I don't think that distinction has occurred enough at Facebook, because the
4 fear is, for a decision-maker, whoever the decision-maker is in the chain, if you've got a
5 policy person that's also doing the intelligence analysis, you're afraid that they're skewing
6 the intelligence, and they might be skewing the intelligence to get what they want.
7 They might be doing it on purpose. They might be doing it on accident. They might
8 just have gotten lazy when they got the answer that they wanted.
9 And so, if you're the decision-maker, you're always worried about that. I think it
10 would be easier to get better decisions if companies structured themselves so that there
11 were bodies that were strictly doing intelligence analysis of risk so that you could make
12 better decisions and you weren't -- and so you wouldn't have a team like mine sort of
13 consolidating information and trying to serve it up, and then also making a policy
14 recommendation on something. That's not how you want to do it. We did it that way
15 because we had to. I think we did a pretty honest and good job of it, but that's not
16 actually how you want to structure it. The way you want to structure it is you want to
17 have a body whose job it is to measure risk, assess that risk, to provide an impartial
18 description of that risk, and then you want somebody else making the case, Well, what do
19 we do about it right now? Because if you combine those things together, then the
20 decision-maker has to question and has to wonder, Where is the bias coming from?
21 And, like, there's a million -- like, this is not a new idea. There are a million studies
22 like it.
23 You know, I'm not saying anything terribly exciting. I'm just adapting a
24 discussion that comes from intelligence studies and security studies programs all over the
25 place to the social media environment that has to make similar kinds of decisions now.
1 And I don't think that social media companies have fully incorporated that idea, and they
2 need to do it better.
5 that include some of the limits that were placed on groups? Do you recall?
8 In your broader conversation about the issue of intelligence, and sort of separate
9 chain of intelligence collection as opposed to social media policymakers, are there -- why
10 do you think the reasons -- what do you think the reasons are that that hasn't been set
12 is it sort of another incentive that's keeping companies like Facebook from doing that?
14 awareness, and sort of focus on doing it. I think it would -- there will be bureaucratic
15 pushback from various existing stakeholders, right, that don't -- wouldn't want to give
18 sounds spooky. Everyone is mad at social media companies already. Nobody wants to
19 hear that they have an intelligence function, right? It sounds creepy. It's scary. You
20 know, potentially it is ripe for abuse. But it would improve the kinds of information flow
21 to senior decision-makers, and put them in the position where they could, I think, more
22 easily trust that data coming in the door, because right now, if they're honest with
23 themselves, like I'm sure this is true at anybody above my level in that organization that
24 has to make a hard decision, when information comes up, they're going to be asking
25 themselves, Is the raw data that I'm getting being construed in a way to advance the
2 recommendation.
3 And in government we would say, Yeah, that's a problem. Let's separate those
4 two things out. Let's separate them, you know. And social media companies need to
5 start thinking that way too, especially the big ones that can afford it. Like, this is not a
6 thing that smaller companies would be able to do, even a Twitter, something like that,
7 right? That just doesn't have the same resources. But big companies have the ability
8 to do that. They do have, you know, intelligence functions that do more tactical stuff,
9 that are doing things -- you know, that are doing investigations on foreign influence
10 operations, that are, you know, doing the detailed dives for network disruptions and
12 But in terms of that macro risk analysis so that you can make good decisions about
13 when you want to adjust things, like the break-the-glass measures and that kind of stuff,
14 those are the kinds of changes that I would want to see. And I think that it's sort of
15 strange, but, like, this is actually -- this is what I would want for senior decision-makers.
16 This is what I wish all of those people in the chain that you mentioned -- names that you
17 mentioned earlier, I wish they had access to that, you know, and they'd get better
18 information if they had that kind of thing. And they wouldn't have to question -- it
19 would be painful at first. It's bureaucracy. But I think over the long run, there's a
20 reason why government after government tries to structure these things this way.
21 There's a million case studies of why it works better when information flows that way to
22 senior decision-makers. And I just think -- look, again, would this have fixed -- would
23 this have changed what happened on January 6th? I don't think it would have. But I
24 do think that this is an illustration of a broader challenge, and there will be other really
25 hard decisions that social media companies have to make going forward, and I hope that
2 Q Got it.
3 After January 6th, the [inaudible] was deferred for a number of months following
4 the attack as well, and I'm curious about how you saw the violent orgs -- dangerous orgs
5 policy change after the attack, particularly with regard to the kinds of far-right militarized
7 Did monitoring increase? Did some of the criteria that you were using get
8 sharper? Was there more interest from other areas of the company and the work that
10 A Yeah. I mean, I think there's -- I think there are -- what's the right word?
11 So I think there was an effort that was not led by my team, but it was led by a partner's
12 team to develop something called the coordinated social harm policy that was utilized, to
13 my knowledge, I think the only time, with a group in Germany called Querdenken, which
14 is sort of a -- effectively, it's a German, you know, version of QAnon. But it's a little bit
15 of an odd group. And so the coordinated social harm policy tries to get at the same sort
17 more behavioral lens, like, you know, how are they -- you know, how are they structuring
19 And, you know, I don't know where that policy sits today. You know, I've been
20 out of Facebook for 6 months. I'm just not sure. But that was one effort. And, again,
21 these are efforts to try to think creatively about how you get at these more amorphous
23 I think there were efforts to go back, and I don't know where -- again, I don't know
24 where these stand today, but to think through some of the policy changes that we put in
25 2020 and make sure that we got them right. You know, there was certainly an effort in
1 2020 to change the external facing representation of the dangerous organizations policy.
2 The tiered structure that you see on their website today went up sometime in the
3 summer, I think, of 2021 -- I forget when -- because we made some of these changes, but
4 we just, you know, were running so fast, they weren't, sort of, publicly enumerated in the
5 way that they should be. And so there were a lot of efforts to try to do that to make
8 was -- there was a while there where there was a really tense focus on that, of course, and
9 their sort of hyper prioritization, I think, probably changed and reduced as the rest of the
10 world -- you know, you've got to sort of balance resources and, you know, resources is
12 So I think there were a range of those different kinds of efforts, but, you know, the
13 policy is in place now, as far as I know, and being enforced. I don't know all of the
15 I think the larger issue, from my perspective, is that there has been this shift to the
16 place where civic violence now doesn't just come from organizations, right? It is driven
17 by a range of other pieces. And, in my mind, social media companies need to put in
18 front and center the risk of civic and political violence, and make sure that there are
19 policies that are unified from content-focused policies, behavioral policies, and
20 actor-based policies, so that you don't have situations where content-focused approaches
21 don't get it done for a long time, and then you need to come in with an actor-focused
23 And I don't think that that kind of restructuring has occurred, at least it hadn't
24 when I left. And, to my knowledge, it hasn't since but, you know, my specific knowledge
2 [3:33 p.m.]
3 BY MR. [Redacted]
5 Mr. [Redacted]. I think Mr. [Redacted] has some followup on the policy, so I'll hand it over
6 to him.
9 BY MR. [Redacted]
10 Q Yeah. I did want to ask about the coordinated social harm policy. So that
11 wasn't developed under your team; that was developed under a different team?
13 my team sort of advised and coordinated on it, but that team did a lot of work with
14 behavioral-based policies. And this was an effort -- so it was really a collaborative effort.
15 I mean, those two teams worked very closely together. But they had a lot of expertise
16 in behavioral-based approaches, and so the idea was they would take the lead on it and
20 A That's correct.
21 Q Yeah.
22 Was that policy a direct outgrowth of anything that happened during the election
23 or January 6th?
25 response -- I think it was definitely another way to try to get at this problem that we all
1 recognized. But building a firm policy foundation for dealing with a range of different
4 boundaries and a set of standards that works, but it has its limits, you know? And one
5 of the limits is that you really need clear evidence that a group has done a bunch of
6 violence you know, or that members of that group have done a bunch of violence that
8 And, you know, what you're trying to do is find ways where you can define very
9 high-risk behavior that has a very, like -- that risks real-world violence and harm in a
10 pretty serious way without dramatically spilling over and capturing all sorts of things that
12 And so the CSH effort was another way to do that. And I don't know where -- I
15 Narratives" report.
16 A Uh-huh.
17 Q I am wondering if those themes are related, if that report was part of the
21 Mr. Fishman. Yeah. I skimmed it this morning. I don't -- well, let me take
23 Mr. [Redacted]. Thank you, [Redacted].
24 BY MR. [Redacted]
1 A Yeah.
2 Yeah, I don't remember this document, which doesn't mean that it wasn't part of
3 the process somewhere in the line, you know? But I don't remember this document,
4 personally.
5 You know, I think a lot of folks recog- -- you know, I mean, it's not rocket science.
6 Everybody recognized this idea of these really difficult, sort of, diffuse movements that,
7 you know, likely have nasty players back there somewhere pulling the strings, but those
8 nasty players are hard to put your finger on and enforce against because they're playing it
9 smart. And so you want to try to deal with that somehow without having a dramatic,
10 terrible impact on voice and without making, sort of, decisions willy-nilly without a clear
12 And so, you know, I think there were a lot of efforts internally to try to figure that
13 out. The VICN was something that we came up with in 2020, and the CSH is another
14 effort. I don't know if this got plugged into that or if this was -- you know, I just don't
15 know -- or if this was somebody else that was wrestling with similar ideas and coming up
17 Q What were the -- let's just take these themes, the ones we've just been
18 discussing. What were the conversations at Facebook like after the election, in
19 your recollection?
20 You said that your team was terrified, but the other teams, maybe their level of
21 terror diminished following 1/6. This was an environment in which it seems like a lot of
22 reflection and studying and policy development was going on. I know that it's always
23 going on. But some of it is very interesting, for the reasons we've been discussing.
25 A So I think there was -- you know, look, Facebook had some wins here that
1 shouldn't be diminished, right? Like, around the election itself, I think there was -- you
2 know, there wasn't a, sort of, tsunami of misinformation about voting places and things
3 like that. There were a bunch of pieces in all sorts of different places. There was, of
4 course, misinformation, including, and I think more acutely, in Spanish, you know, related
5 to the election.
6 So, you know, it wasn't perfect, by any stretch of the imagination. But, at the
7 same time, I think there was a ton of work that went into it, and worst-case scenarios
9 And so I think there were efforts to try to understand what did we do right, what
10 did we do wrong. And some of that was related to some of these extraordinary product
11 measures, you know, that are encapsulated in a break-the-glass sort of concept, but I
13 You know, I think some of those, sort of, civic product efforts -- I mean, this is one
14 of the challenges, right? In some of those civic product efforts, we as the dangerous
15 orgs team were not always central to. And that was sometimes frustrating. Because I
16 think, in some cases, those efforts did not directly incorporate the risk of violence. They
18 And I think there were elements of that even -- you know, even after January 6th,
19 we sort of had to say, "Look, this is still a thing, this is going to happen again," right?
20 And some of that is just bureaucracy, right? Like, that's why you have different
21 bureaucracies, and you need somebody standing up to be the one that says that, right?
22 But I do think that that's one of the risks here, when you're talking about -- that
23 perspective, the focus is very much on there's sort of the sanctity of the election, rather
24 than, like, okay, what happens if people just decide to blow up the whole thing?
25 Which, effectively, is what the efforts to delegitimize the election after the fact
1 were. They were efforts to say, "Look, we're not just going to delegitimize and try to get
2 a few people to vote in the wrong place. We're just going to say that everybody that
3 voted in Pennsylvania's vote doesn't count." Right? Which is a -- like, it's just a
4 different level of disruption that I think was, you know -- and we're going to physically
5 stop the Congress from certifying the election. Right? That is not, I think -- you know,
6 that's the kind of paranoia that, you know, in that physical, real-world-violence frame,
7 that's not where you're thinking of if you're coming from a traditional election protection
9 And so I think you need those two things together. And, you know, over time,
10 unfortunately, I worry that we're in a place where this kind of thing is going to be more
12 Q I have just, like, three more questions for you, and then I know Mr. [Redacted] has
13 a concluding question, and then I think we can wrap up. I want to be respectful of your
15 I know you haven't seen this document. It does have some interesting
16 recommendations, and I want to just quickly run them by you. And I'll group them
17 together and see if -- I'd like to know if, in your opinion, these would have helped limit the
18 kinds of narratives that build up into something that eventually manifests in offline
19 violence.
21 better soft, targeted interventions -- better soft interventions and better targeted
22 interventions; and then a focus on the highest percentile harmful groups, networks, and
23 accounts. It notes that you can often segment harmful spaces and users on Facebook in
24 ways that make it clear that some actors are much worse than others, and that can help
1 Are those types of interventions things that you think would be worthwhile
2 tradeoffs?
3 A If they're targeted effectively, I think they can help. Which is a big "if,"
6 struggled with over and over and over again at dangerous orgs. And sometimes we did
8 So I think -- yes, I think, you know, understanding and defining categories of actors
9 that are doing things so anathema to civic functioning and democracy, that maybe they
11 I also think that's a really dangerous thing to do. Actor-based policies are
12 dangerous. They're right, they work, they're the way you get ahead of problems, but
13 they're dangerous. And we should just -- you've got to look that in the eye. It is.
14 Like, it's dangerous. Because you're saying, these folks have done something in the
15 world so much that we're just not going to allow them to use these products at all or
16 we're going to deliberately limit their access to certain features. And that precedent is a
18 And I think better soft interventions -- I think soft interventions are good. I think
19 they come with real tradeoffs, because what they do is they hit a wider range of things.
20 They're often contingent on AI, which means that they have some false positives, and
22 I think what's important about that is, you know, in those soft interventions,
23 everyone always thinks it's the algorithm, it's the algorithm, it's the algorithm.
24 Sometimes it's the algorithm. Sometimes it's people inviting thousands of people to
1 So, as long as we're understanding that soft interventions means more than just
2 the algorithm, which, frankly, I think gets more attention than it deserves as a driver of
3 problems, and what gets less attention is that there are those few people, like you noted,
6 algorithm distracts them from getting better at identifying those kind of nefarious actors,
7 and I would rather they spend more time on those nefarious actors.
8 And then that gets to the focus on the highest percentile accounts, which is sort of a
9 bucket that I think gets at that issue. And that, to me, is really it. Like, how do you
10 work backwards from those accounts that are really finding ways to make things happen?
11 Because they will abuse any platform. They will find the gaps. They will try again.
12 They will wait until they'll get disabled 10 times and then they'll create another account
13 and, you know, know the right path to go to avoid detection. They will do all of these
14 things. It's not just, like, they manifest on the platform and it just does exactly what
16 And, I think, what I hope is that, you know, as folks think about what platforms
17 can do better, it's not just reiterating what I think has been an excessive focus on the
18 algorithm and the business model related to ads and more on how do you get better at
19 understanding the discrete networks that are doing harm and the discrete accounts and
21 Because I think -- and I think you guys asked good questions about those things.
22 You know, I didn't have great answers on some of them, because I didn't know the
23 answers. Those are things that platforms should get better at, is being able to do those
24 kind of things in real-time. And I think that is just as important, if not more so, than this
25 discussion that has gotten so much attention in the media around the algorithms, the
3 opportunity in a highly polarized media ecosystem? And is there a way in which social
7 A I think the more that our politics use violent, militarized language, the harder
10 on what is acceptable political speech and what is not defers some of that responsibility
11 to social media companies in ways that we just should never want to give it to them.
12 Even the best-intentioned companies are not the people -- are not the institutions that
14 That doesn't excuse those companies, because when that responsibility falls to
15 them, they have a responsibility to do what they can. But our democracy would
16 function better if political leaders, civic leaders were drawing those lines in normative
18 And we've blown through some of those lines. And I think it is -- you know, we
19 can hope that social media will fill that gap, but it won't. It can't fill that gap. And, you
20 know, even if they act perfectly, they don't have the legitimacy. They're not
21 democratically elected. We don't want them setting those norms. We want our
22 political leaders setting those norms; we want our civic leaders setting those norms.
23 And, you know, whatever you think about Mark Zuckerberg, that's not why he
24 signed up to make Facebook, you know? And, you know, I don't agree with every
25 decision he's made, but, like, you know, that's not why he wanted to do it. It's his job
1 today, whether he likes it or not, but that's not why he signed up to do this.
2 And we shouldn't want to imbue that kind of authority in him or Elon Musk or
3 Sundar, right? None of these people. Those aren't the people that we want making
4 those decisions. We want our political and civic leaders making them. We want them
5 drawing those ethical lines. And I don't think they've done that nearly well.
7 I do want to play off against it the revelations that in 2018 or 2019 political party
8 leaders did come to Facebook and say that the way that the platform had changed
11 Is there a way in which we should be thinking more about that, the incentives and
12 the communications environment that have changed since the advent of social media?
14 remembering, I think it was one of the other Haugen docs, right, about a -- it was like a
15 Polish political party or something, right? Yeah. So, you know, I wasn't aware of that,
17 But I do think, like, you know, obviously, not just political leaders but, you know,
18 businesses and everybody else is optimizing for eyeballs and engagement online, right?
19 They are doing search engine optimization. They are trying to game various algorithms
20 to get surfaced and get attention. And I think that is a noxious game.
21 I also think that any usable, scaled social media platform is going to have an
22 algorithm that ranks posts so that people can try to see things that are relevant to them,
23 right? An unranked algorithm on Facebook would show us, you know, I guess in reverse
24 chronological order, every post made on the platform. It'd be totally unusable. So
25 you're going to sort things by friends and the groups you're in. You're going to do some
1 of that, just inherently. Because that's what people are signing up for.
2 I do think that this notion that more, you know, noxious things get more
3 engagement and that that cycle leads to even more engagement is a dangerous one and
4 is a worrisome one. I don't claim to understand it fully, in part because, you know, I
5 think the data is sort of mixed. I wish that data was out there and more available so
7 But I also think that during my time period at Facebook they took real efforts to
8 try to improve those things, you know, and the content distribution guidelines are an
10 So I don't really know what the, sort of, status quo today is versus 2018, 2019.
11 That was a long time ago. And, you know, I agree wholeheartedly that companies that
12 are doing these sorts of things ought to be more transparent around them. And I think
13 that, in the long run, will be good for them and will be good for society writ large.
14 I do think it's really important that those calls don't just focus on Facebook, that
15 they move across the social media ecosystem. Because I do think that there is a sense,
16 you know, that is not helpful for anyone that leadership, you know, and folks inside of
18 And I think, you know, Facebook's made plenty of mistakes. It's the
19 biggest -- well, I guess TikTok's bigger now, but it is one of the largest. It deserves a hell
20 of a lot of scrutiny. At the same time, it is a wounded whale that everybody wants to
21 get in a fight with because it benefits them politically. And that focus has led to, I think,
22 an intransigent view inside of the company that isn't, you know, totally wrong that they
24 And with the solution here for society -- if our goal is to improve society, we need
25 to make sure that these standards apply across the board, that everybody is living up to
1 them, and including Facebook. But narrowly focusing on that platform I think is a
2 mistake, because these problems exist all over the place, and, you know, none of these
4 Mr. [Redacted]. Mr. Fishman, I want to make sure you don't have a hard stop at
6 has a final question, and if we do need to stop, I want to give him the chance to ask it.
7 I also want to make sure the court reporters are okay if we stay.
8 Mr. Fishman. I can go until 10 after, but I should stop then, because I've got to
10 Mr. [Redacted]. That is fine with me. And as long as there's no reason we can't do
12 BY MR. [Redacted]
15 When you said that you were advocating for more robust response to the Stop the
16 Steal groups after the election in part because of the overlap with militias, but it seemed
17 like no comprehensive study had been done about those groups and their membership.
18 I wondered, I just wanted to know if there was any intelligence you were acting on
19 that suggested that. You said you didn't have a crystal-clear picture, but, like, what
21 A Well, you know, I don't remember all of the details. What I recall is
22 understanding that a number of those groups had significant overlap. And I think we
23 saw this with the original group, that some of those groups had significant overlap with
24 groups that had been -- and, you know, by "significant," I'm not talking 80 percent; I'm
25 talking, you know, a much smaller percentage -- but that there was overlap with some of
1 those militia groups, that there were high prevalence of hate speech and violence and
2 incitement, sort of, violations inside the group, and that they were manifesting in ways
3 very similar in intent, apparent intent, as the Stop the Steal group that did come down.
4 So, you know, all of those things in aggregate, in my mind, represented sort of
6 But these were things that were operating in sort of gray areas. And I thought
7 we should be more aggressive against some of the groups anyway, where a lot of the
8 harm was happening, just generally, a lot of harm was happening in the group. I think
9 that there was responsibility on the app for the group owner and administrator
10 themselves, and I thought we could be more aggressive against some of those kinds of
11 entities.
12 But I do want to -- like, I said it before, and I'll say it again, that I do not think there
13 was an action that Facebook could've taken that would've demonstrably changed the
14 presence of that many angry people on The Mall on January 6th. I think it was going to
15 happen. And I think it was going to happen because of the organization off of Facebook,
16 and I think it was going to happen because of the, sort of, pushing of it by the President
18 And, you know, I don't even want to say everybody, because I think there were
19 plenty of responsible folks and there were plenty of conservatives and Republicans that
20 were upset because they lost an election but took it like good citizens, which we all do
21 sometimes.
23 Mr. [Redacted]. I'm going to cede the rest of my time to Mr. [Redacted] just so he can get
24 his final questions in and so you can get out of here in time for your next call.
25 BY MR. [Redacted]
2 I just have one more question, and it's admittedly broad, but are you concerned
3 about the potential for violence in the 2024 election cycle, given what we saw and how
5 A Yes. I am. I'm concerned about 2024. I'm also concerned about 2022.
6 And, you know, I think that, unfortunately, in the United States, we have now
7 crossed into a time period in which political violence has been normalized for a significant
8 element of the electorate, and there are people out there that want to instrumentalize
10 And that's why the connection between, you know, sort of, delaying the
11 congressional certification and having new slates of electors is so concerning. You guys
12 probably understand if there's a link between those things better than I do. But I think
13 the broad, sort of, facilitation of that and, sort of, entrance of that into our political
14 bloodstream suggests to me that it is going to get played out when we are certifying
15 congressional elections and State senate and assembly elections and possibly again when
17 So I am very concerned about it. And I think that those threats are -- you know,
18 we're rightly going to look at how are elections counted, you know, what are we doing to
20 But what we've seen is that there is a component of the electorate that is going to
21 use whatever means they have at their disposal -- which, Facebook is one of those tools.
22 Twitter is one of those tools. Telegram, WhatsApp are some of those tools. Bespoke
23 websites like TheDonald.win are just a tool. All of these things are just tools. And
24 they're going to use them to incite and inspire folks to disrupt in ways that they think are
2 abused and manipulated in that way. It's also really incumbent upon political leaders,
3 whether in Congress, the White House, or at State levels, local levels, to stand up and say
4 that it's unacceptable. It's important for business leaders, like those at social media
5 companies and elsewhere, to say that it's unacceptable; civic leaders to stand up and say
7 Because once it's gone, it's gone. And I really do worry that we've crossed a line
8 that's going to be very hard to get back across. It's going to take some time. And it's
9 going to take demonstration of more political leadership than I see right now.
11 I think on that note we'll call it a day, unless you have anything more you wanted
12 to add. But you've been extremely generous with your time, and thank you again for
13 coming to speak with us. And it's been really helpful for our context.
16 Mr. Fishman. Good luck with the hard work. I'm sure you guys have a lot of
21 Mr. [Redacted]. Yeah, we --
22 Mr. [Redacted]. At 4:05?
1 Certificate of Deponent/Interviewee
4 I have read the foregoing _ _ pages, which contain the correct transcript of the
10 Witness Name
11
12
13
14 Date
15