The IT role in Russiagate: Part III – Internet maneuvers, an Obama-linked trust group, and DARPA

Information maneuvers in the dark.

Part I is here.  Part II is here.

Part II concluded with a discussion of Rodney Joffe’s private ISP, an arrangement he spoke of himself in a June 2015 PR release on an award from the tech organization Messaging, Malware and Mobile Anti-Abuse Working Group (M3AAWG).

Part III continues with a listen to the echoes of the Conficker worm (a high-profile project for Rodney Joffe); a most remarkably timed formation of a new trust group with extensive Obama links – and one link in particular to the 2016 DNC intrusion; and the link – because of course there is one – to the DARPA project in which John Durham is investigating Georgia Tech’s participation.

Previous plot outline?

One penultimate set of facts sheds some clarifying light on what we’ve tallied up already.  It relates to Rodney Joffe’s participation in the IT working group that formed in earnest on 28 January 2009 to fight the Conficker worm, which had hit the Internet on 21 November 2008, shortly after the U.S. election.  (The main source for this section is the report linked below for DHS/Air Force Research Lab.)

It’s an interesting feature of Conficker that it’s still out there, infecting mainly – but not entirely – legacy systems running older OSes.  In 2020 it was still being reported to the tune of about 20,000 instances a month worldwide; apparently it has been detected trying to establish itself on Windows 10 systems as well as the older ones.

It’s also interesting that it came and went mysteriously in 2008-2009, with no firm attribution ever made.  (Naturally, a speculative attribution to a Russian is favored by many commentators.)

Speculation at the time was that fighting back against it, with the Conficker Working Group, discouraged those using it from continuing their fell campaign.

Most interesting of all, from my perspective, is that the method used by the Conficker Working Group was basic and rather brute-force.  The working group essentially predicted where the worm-deployers were going to create new domains next to deploy the worm, and labored to beat them to the punch.


In other words, the Conficker Working Group registered and took control of a slew of new domains to preempt the online movements of the worm-deployers.

To do this, in fact, Rodney Joffe got ICANN to waive fees for domain registration, and the group went on a registration spree.  I haven’t seen a number attached to that – how many new domains were registered by the group for the sole purpose of sprinting ahead of Conficker – but it had to be a tidy few.

We could think statically about this, and in one dimension, but we don’t have to.  The first questions I have are who owned those domains afterward and what the status of each domain is today.

But the next thought is that this is a ponderous but real form of “Internet maneuver” operations.  We could merely take at face value the point that its purpose was to outmaneuver Conficker.  But if that was a form of defensive maneuver, it’s not a stretch to extend the concept to offensive maneuver, using some of the Internet’s most basic features.

We could imagine – without claiming we’ve proved anything – that a group like the Conficker Working Group could make up new domains and register them at Mach 2 for other purposes.  Having a lot of domains under the control of a group working together would create a powerful tool, not only for moving information but for corrupting, blocking, and spoofing it.

The key to the opportunity in this case (i.e., to register a whole bunch of domains for free, without raising eyebrows) was having a crisis that seemed to justify industry, quasi-government entities, and government overseers in loosening the rules.

The specific pattern here (racing the Conficker worm with new domains) is not what’s of concern.  What’s of concern is the mindset of assembling SME (subject matter expert) groups to operate with unusual access to resources and abnormal latitude for out-of-the-box behavior.

The definition of a “crisis” may have been different in the summer of 2016, but it’s interesting – that word again – that some of the same key players were involved.  Besides Rodney Joffe, David Dagon was in the Conficker working group and in the loop on the Joffe team’s Alfa Bank labors.

And when the working group needed a place to park the massive flood of data they were working with, the parking lot they chose was – you guessed it – Georgia Tech.

Tatooine satellite location in Augusta, Georgia. Image: Georgia Cyber Center

With that as an introduction, hear with your ears a few passages from a report compiled in 2010 for DHS and the Air Force Research Laboratory on the Conficker Working Group’s activities.  This one deals with the odd nature and unexplained purpose of the worm (p. 13; all pages given as numbered in the original document).

Since the discovery of the Conficker worm researchers have debated the intent of Conficker. The worm was not designed to promote a specific type of attack (the way Srizbi would send spam). It essentially allowed the author to virtually “put his foot in the door” and wait for the right time to use the growing botnet.

A popular theory about the purpose of Conficker is that the worm would be used to spread other malware. … However, to some it seemed a mundane and inelegant use for such an exceptional botnet, leading analysts to question whether Conficker E was a diversion to draw attention away from its true purpose. …

[…]

Some suggested that the author may never have intended to utilize Conficker and the entire botnet was a feint or a “head-fake.” Among those with this theory, one suggested Conficker was used to distract the security community from other malware such as Zeus and Torpig, which continue to reap large profits for criminals. …

While the view that Conficker was a ruse and not a legitimate threat is not the prevailing view, it does come up in questions of why Conficker was never used for anything more devious than scareware. It is likely that the Conficker Working Group effort to counter the spread did make it more difficult for the author to act with impunity, but the author did not seem to have tried his or her hardest.

Curiously, the essence of Conficker as a self-propagating botnet is what seemed to mandate the response of trying to get ahead of it by preempting new deployments with domain-grabbing.  In retrospect, that defensive move is quite as interesting as the question of what the worm was created for, given that Conficker never seemed to do anything.

A bit of history (p. 17):

The large-scale coordination began in the final days of January and first days of February 2009. Throughout January, security researchers, registries, Microsoft and the Shadowserver foundation discussed the potential for managing the worm. On January 28, Shadowserver set up the Conficker email listserve. The initial membership of the listserve was small and nearly everyone knew each other.

In late January, T.J. Campana at Microsoft contacted Rodney Joffe of Neustar, the registry operator that manages .biz domains. Microsoft wanted Neustar’s assistance to register or block .biz domains that would be contacted by Conficker-infected computers. Joffe requested that ICANN waive their mandatory registration fees with the domains as the issue was related to the security of the DNS system. According to ICANN, this was the first time they had received such a request. ICANN agreed to waive the fee and later agreed to waive all fees related to registering Conficker domains. Since that time, ICANN has instituted a formal process for registry operators to request a fee be waived when dealing with an attack on the DNS system. … Most registrars cooperating with the Conficker Working Group did not charge the group for registering the domains.

Rodney Joffe. Neustar video, YouTube

Domain takeovers and the sinkholing (basically, sequestration) of data (bold in original):

Early on, several researchers were paying for and registering the vulnerable domains by hand, one-by-one. Some were discussing the possibility of doing so in a comprehensive way. Others were getting access to domains so they could sinkhole the data and learn more about the infection. (p. 16)

Sinkholing of Data. As domains were registered, they were pointed at six sinkhole servers to collect information about the scope and spread of the malware. Originally, a number of individual groups and organizations ran sinkhole servers. In early February, the group decided to centralize the data at Georgia Tech, which offered server space to hold the data and bandwidth to manage it. This was seen as a neutral site, where companies could share data and have access controlled. Various access agreements were granted with some companies placing restrictions on the usage of their data. (pp. 17-18)
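For readers who want the mechanics, here is a minimal, hypothetical sketch of what a sinkhole amounts to: once a rendezvous domain resolves to a server the defenders control, that server simply accepts whatever connections infected hosts make and logs who called in.  (The port, log format, and behavior below are illustrative assumptions, not a description of the actual Conficker sinkholes.)

```python
# Minimal, hypothetical sinkhole sketch (illustrative only): listen behind a
# domain the defenders pre-registered and record which hosts phone home.
import socket
from datetime import datetime, timezone

def run_sinkhole(host: str = "0.0.0.0", port: int = 8080,
                 logfile: str = "sinkhole.log") -> None:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(128)
    with open(logfile, "a") as log:
        while True:
            conn, (peer_ip, peer_port) = srv.accept()
            # The caller's address and the time are the collected "data" --
            # the sort of thing that, in 2009, got centralized for analysis.
            log.write(f"{datetime.now(timezone.utc).isoformat()} "
                      f"{peer_ip}:{peer_port}\n")
            log.flush()
            conn.close()  # serve nothing; the point is only to observe

if __name__ == "__main__":
    run_sinkhole()
```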

Registering domains (p. 18):

Technologically, pre-registering (infected, affected, suspicious?) domains was not that hard. Once the malware code was reverse engineered, they were able to replicate the domain generation algorithm. From there, members of the Conficker Working Group could create lists of domains that must be registered and get them to the appropriate registries or authorities. Those registries learned how to automate the process. The difficulty lied [sic] in the coordination of efforts and associated legal frameworks, as well as research of domains already registered and double-checking of the lists.

So grabbing and manipulating fundamental elements of the Internet’s architecture, while time-consuming and requiring a few excuses, was basically pretty easy.  Proprietary and privacy concerns weren’t really an issue; from what I could tell reading the entire report, the group wasn’t hindered by them.
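To make the “replicate the domain generation algorithm” step concrete, here is a minimal sketch of the idea.  This is not Conficker’s actual algorithm – the hashing scheme, domain count, and TLD list are illustrative assumptions – but it shows how defenders who have reverse engineered a date-seeded DGA can compute the day’s candidate domains exactly as the malware will, and then race to register them.

```python
# Hypothetical DGA-replication sketch: derive the day's candidate domains from
# a date-based seed, as a defender who has reverse engineered the algorithm
# would.  Hashing scheme, domain count, and TLD list are illustrative.
import hashlib
from datetime import date

TLDS = [".com", ".net", ".org", ".biz", ".info"]

def domains_for_day(day: date, count: int = 250) -> list[str]:
    domains = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.sha256(seed).hexdigest()
        # Map the hash digest to a lowercase label and rotate through TLDs.
        label = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:10])
        domains.append(label + TLDS[i % len(TLDS)])
    return domains

if __name__ == "__main__":
    # A defender runs the same computation the worm will run, then races to
    # register (or ask registries to block) the resulting names.
    for d in domains_for_day(date.today())[:5]:
        print(d)
```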

Listed participants in a February 2009 meeting on coordinating the anti-Conficker campaign:

Report on Conficker Working Group by The Rendon Group for DHS/Air Force Research Laboratory. Link in text. pp. 18-19

Many of our Internet sleuths will see names they know.

Notes on coordination (p. 19):

There were few formal contacts with the US government as an institution, but a large number of connections through personal channels. Several researchers within the Conficker Working Group, without coordinating with others, communicated through their own social networks with the FBI, DHS, DoD and various intelligence agencies.

Treatment of confidentiality and group ethics (p. 23):

There was an understanding that the Working Group was reaching into unprecedented territory in terms of cooperation and they wanted credit as a group, not as individual organizations.

Maintaining discipline in a group in which everyone is a volunteer is not easy or even realistic. A number of individuals spoke with the media. Others felt mildly offended, as if those speaking to the media after people had agreed to avoid it had violated their trust.

There’s a lot more.  It’s well worth the read; the story of how heroic this was, and what a good idea, sounds very different today from the way it sounded in 2010.

The point here is not that the Conficker Working Group set a precedent for misuse of data.  The point is that pushing the envelope on manipulating Internet activities and their appearance, and sharing and using data – a dataset that was proprietary for many tech companies and personal for millions of end-users – was tacitly made a guiding principle because of a perceived crisis.

And again, in this case, the actual crisis was a worm that propagated all over the lot as a botnet, but otherwise never did anything.

In 2022, and especially with members of the Conficker cast showing up later in Alfa-gate, the whole episode makes you go, Hmmm.

Back to 2015 and 2016, in trust group history

On Thursday 10 March 2022, Margot Cleveland again reported some intriguing information:  that Special Counsel John Durham is apparently investigating Georgia Tech’s involvement in a DOD study of the intrusion suffered by the DNC in 2016.

This update doesn’t surprise me.  A key reason is that in doing the research for this article, I came across a curious set of dates attaching to a fascinating spinoff from the M3AAWG angle (see Part II).

We’ve seen that much of the Alfa-gate cast intersects through M3AAWG.  It’s to be expected that Joffe and Neustar would do so, at the very least.  It’s not surprising that Georgia Tech would, though that moves to a different level, helping to clarify the kind of player Georgia Tech was likely to be.  The date of the university’s affiliation with M3AAWG – November 2015 – can’t help being interesting as well.

But when Listrak shows up there too, that starts to tell us something about the expectations and purposes some M3AAWG members may have – though, again, not necessarily Listrak.  (It looks to me like Listrak, a mature but small company when it joined M3AAWG in 2008, has been punching above its weight to raise its industry profile, and there’s nothing wrong with that.  At this point, I’d judge that the only main chance Listrak had its eye on was growing its reputation and customer base.)

Checking out M3AAWG, I ran across an announcement from the organization, dated 4 May 2016, that M3AAWG was partnering with a group called the Global Cyber Alliance (GCA) to “push the security community to more quickly adopt concrete, quantifiable practices that can reduce online threats.”

Said M3AAWG:  “The non-profit GCA has joined the Messaging, Malware and Mobile Anti-Abuse Working Group, which develops anti-abuse best practices based on the proven experience of its members, and M3AAWG has become a GCA partner for the technology sector.”

As noted in the announcement, the GCA was formed only a few months earlier, in – get this – September 2015.  It’s a partnership sponsored by (then) New York County DA Cyrus Vance, Jr. that includes the U.S.-based Center for Internet Security (CIS) and the City of London Police.  (Go figure.  Of interest as a side note, Vance launched GCA with $25 million in proceeds from a sanctions-violation settlement with global bank BNP Paribas – part of a widening pattern of slush-fund use at the federal and blue-state levels during the Obama administration.)

Cyrus Vance, Jr. explains why he didn’t prosecute Harvey Weinstein in 2015. PBS video, YouTube

The arresting aspect of this notice was a specific interest ascribed in it to the GCA.  In a recent announcement by GCA, said M3AAWG, the Alliance had “revealed that its first strategic area of concentration will be phishing with a focus on two solutions shown to be effective at combatting it: implementation of DMARC to limit spoofing of email and secure DNS practices to minimize the effect of phishing and other attacks.”
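For readers outside the email-security world, “implementation of DMARC” comes down to a domain publishing a policy record in DNS and receiving mail servers checking it.  A minimal sketch of that lookup, assuming the third-party dnspython library and an illustrative domain name:

```python
# Minimal sketch of checking a domain's published DMARC policy.
# Assumes the third-party "dnspython" package; the domain is illustrative.
import dns.resolver

def fetch_dmarc_policy(domain: str) -> str | None:
    """Return the DMARC TXT record for a domain, if one is published."""
    try:
        answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return None
    for rdata in answers:
        txt = b"".join(rdata.strings).decode()
        # A DMARC record looks like: "v=DMARC1; p=reject; rua=mailto:..."
        if txt.startswith("v=DMARC1"):
            return txt
    return None

if __name__ == "__main__":
    print(fetch_dmarc_policy("example.com"))
```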

Phishing and spoofing of email was certainly a timely topic.  The date of GCA’s announcement was 19 April 2016, and if I spot you one name, readers who know the Spygate saga will immediately recognize its significance in the timeframe (19 April and 4 May 2016).  The name is Shawn Henry.

Yes, the one-time FBI official, then (and now) CrowdStrike executive, who was called in when the DNC discovered the intrusion in its system on 29 April 2016, has been a board member of GCA since October 2015.

Screen cap by author March 2022

It also turns out that other senior executives with GCA, and with partner group CIS, have deep cyber-professional backgrounds at federal agencies including DOD and DHS.  There would be nothing surprising about that, of course.  But it’s interesting how many of them were on a commission set up in 2008 to propose cyber policy to President-Elect Barack Obama, and equally interesting to see the particular jobs a couple of them held in the federal cabinet departments.  (I promise, you want to know this.)

The commission in question continued to produce reports for President Obama through 2014.  Among those whose biographies list the commission – created in the presidential transition of 2008 – are GCA cofounder and first CEO William Pelgrin; CIS founder and founding chair Franklin Reeder, who served in the Clinton administration as Director of the Office of Administration for the EOP – the EOP entity that procured IT infrastructure for the White House; and CIS President and CEO John Gilligan, who was CIO for the U.S. Air Force and the Department of Energy, and was the Program Executive Officer for the Air Force’s Battle Management and Command & Control system (over the past 30 years a high-powered role in each of the armed services).

Senior GCA and CIS executives with particularly relevant federal department experience include the current President and CEO of GCA, Philip Reitinger, who was appointed by Obama as the Deputy Under Secretary for the National Protection and Programs Directorate and the Director of the National Cyber Security Center in the Department of Homeland Security.

The NPPD and National Cyber Security Center were predecessors to the current Cybersecurity and Infrastructure Security Agency (CISA).  They notably sat throughout the period from 2009 to December 2015 (when Reitinger left to become GCA’s CEO) on a growing mountain of detailed and sensitive industry information about all 16 sectors (including tech/telecoms) of DHS’s designated critical national infrastructure.

Dmitri Alperovitch at an RSA Conference in 2020. RSAC video, YouTube

This is as good a place as any to post a reminder that Dmitri Alperovitch of CrowdStrike and an official from that DHS branch, Phyllis Schneck, were at Georgia Tech together years ago and have a long history, written about in an Esquire article on Alperovitch in October 2016.  (More on Alperovitch and CrowdStrike below.)  The mention is opportune, although neither individual holds a position with GCA, because of Ms. Schneck’s DHS position:  Deputy Under Secretary for Cybersecurity and Communications.

That’s interesting in part because CIS executive Roberta G. “Bobbie” Stempfley, from our list of GCA-linked personalities, was at DHS as Deputy Assistant Secretary for Cybersecurity and Communications, as well as at DOD as Chief Information Officer of the Defense Information Systems Agency (DISA). 

Meanwhile, Jack Arthur, a board member at CIS (and Emeritus Director of the board), was previously Associate Director of the Office of Administration for the EOP, and also has the distinction of having been a consultant to the government of Qatar, as one-time Executive Vice President (now Emeritus) at Octo Consulting Group.

This is the group Shawn Henry of CrowdStrike was running with, as a member of the GCA board, when he was called in on the DNC intrusion at the end of April 2016, exactly at the time GCA and M3AAWG decided to partner on the threat of phishing and email security.

That had all unfolded for me over the last two weeks.  Then Margot Cleveland’s latest little bomblet dropped about Georgia Tech and the DARPA project, which appears to be what John Durham is investigating.

Enter DARPA

It took no more than half an hour to connect some big dots on that one.  Twitter sleuth Hans Mahncke spotted us a Georgia Tech email with the essential clue:  the abbreviation “EA.”

Georgia Tech apparently got in on the DARPA project to research “Enhanced Attribution,” which was announced to interested parties – Holy Gregorian Calendars, Batman – on 22 April 2016.

Here is Broad Agency Announcement DARPA-BAA-16-34.  (The PDF can be downloaded from the webpage; scroll down.)

The definition of the basic research task is a bit vague (see the Program Structure on p. 5):  “The goal of the Enhanced Attribution program is to develop technologies for generating operationally and tactically relevant information about multiple concurrent independent malicious cyber campaigns.  The objective is to not only collect and validate this pertinent information, but to create the means to share such information with any of a number of interested parties without putting at risk the sources and methods used for collection.”


But the description of the EA program at the DARPA website is more specific:

Malicious actors in cyberspace currently operate with little fear of being caught due to the fact that it is extremely difficult, in some cases perhaps even impossible, to reliably and confidently attribute actions in cyberspace to individuals. … The identities of malicious cyber operators are largely obstructed by the use of multiple layers of indirection. …

The Enhanced Attribution program aims to make currently opaque malicious cyber adversary actions and individual cyber operator attribution transparent by providing high-fidelity visibility into all aspects of malicious cyber operator actions and to increase the government’s ability to publicly reveal the actions of individual malicious cyber operators without damaging sources and methods.

Two features of Spygate help clarify why that’s a clincher here.

Rounding third base…

One relates to CrowdStrike, the originator of the attribution theory of the DNC intrusion; i.e., Russia dunnit.

Long-time readers may remember that when all of this first came in for the intensive scrubbing by OSINT sleuths, back in early 2017, one of the most significant characteristics of CrowdStrike was the company’s sloganeering commitment to attribution.  CrowdStrike, they told customers, wasn’t going to just find forensic evidence of threat activity.  It was going to identify the source, and help customers know who was on the attack and hence which direction to beef up their shields.

Consider again the words of CrowdStrike cofounder George Kurtz in 2013 (citation source at link above):  “We’ve built a platform that can identity [sic] the kind of attack that is being used, but we can also determine who’s behind it and what their motivation is.”

Said the Wall Street Journal’s All Things D blogger Arik Hesseldahl:  “The firm uses a big data and analytics platform to keep track of hacking groups around the world, and Kurtz said it can usually figure out who’s behind an attack, and shut it down.”

And Esquire quoted Dmitri Alperovitch in a profile in 2016:  “You don’t have a malware problem” – what CrowdStrike tells customers – “you have an adversary problem.”

As my 2017 article noted, however, much of the industry thinks – at least thought at the time – that such an aspiration was unrealistic and couldn’t be reliably backed up.  Divining the motives of malicious cyber actors is particularly questionable.

But it probably isn’t necessary to point out how closely this mirrors the construct for the Alfa-gate operation, which sought to impute human identity and motive to a series of DNS lookups.  Industry experts have been largely skeptical about that as well (even before analysis done for Alfa Bank’s lawsuit suggested the DNS lookups were manufactured deliberately).
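It’s worth pausing on how little a DNS lookup record actually contains.  Here is a minimal sketch – the log format is entirely hypothetical – of the kind of tallying such analysis rests on; note that nothing in these fields identifies a human being, let alone a motive:

```python
# Sketch of tallying DNS lookups from a (hypothetical) passive-DNS log whose
# lines are "timestamp client_ip queried_name".  Nothing in these three fields
# carries the identity or intent of whoever (or whatever) triggered the query.
from collections import Counter
from typing import Iterable

def tally_lookups(lines: Iterable[str]) -> Counter:
    counts: Counter = Counter()
    for line in lines:
        parts = line.split()
        if len(parts) != 3:
            continue  # skip malformed lines
        _timestamp, _client_ip, qname = parts
        counts[qname.lower().rstrip(".")] += 1
    return counts

if __name__ == "__main__":
    sample = [
        "2016-05-04T12:00:01Z 203.0.113.7 mail1.example-bank.com.",
        "2016-05-04T12:03:44Z 203.0.113.7 mail1.example-bank.com.",
        "2016-05-04T12:07:02Z 198.51.100.9 spam-filter.example.org.",
    ]
    print(tally_lookups(sample).most_common())
```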

At any rate, hindsight makes this an even more interesting strategic approach to ponder.  I expressed it this way at the time:  “[S]elling its services on the basis of understanding the attacker and his motives is exactly how CrowdStrike markets itself.  CrowdStrike is effectively, in its own right, a narrative-concocting security company – and that should color everything we think about its testimony.”

Now, if you will, grab your Spygate calendar and listen with your ears one more time as we review elements of DARPA’s desired performance parameters for the proposed research.  The three Technical Areas of the project are enumerated on page 6.

DARPA project announcement DARPA-BAA-16-34 dated 22 April 2016, p. 6

Read TA1 and TA2 at your leisure.  Here are some verbal gooses from TA3, starting at the bottom of p. 7:

Validation and Enrichment (TA3):

TA3 performers will enrich the knowledge base with additional sources of information, including publicly available data (e.g., WHOIS records), commercial data (e.g., threat intelligence feeds), and Government-only data, to complement TA1 data toward developing a broader picture of malicious activity and identifying weaknesses in their TTPs, tools, and infrastructure. DARPA anticipates that the ingestion of such data would be of limited scope, based on the data’s relationship with the ground truth data that TA1 generates. Some examples of cyber operations information of interest include, but are not limited to:

  • Network-based information (e.g., PCAP, NetFlow, firewall and IDS logs)

  • Host-based information (e.g., antivirus logs, resident files, registry settings)

  • Network Presence Information (e.g., WHOIS and DNS records)

… and here’s home plate

TA3 proposers should include the following topics among those discussed in their proposal:

  1. The goal of Enhanced Attribution is to compose an explanatory and operationally useful narrative of malicious cyber operator actions that can be publicly disclosed in order to expose the actions of individual malicious cyber operators without damaging sources and methods. Performers should discuss how their solutions would expose known but hidden structure of cyber actions leveraging information that can be publicly disclosed.

In other words, the goal of DARPA-BAA-16-34 was to write something very much like the Alfa-gate narrative packaged by Team Joffe, for Michael Sussmann to shop to federal agencies, while billing the professional time to Hillary Clinton’s campaign and the DNC.

Presumably DARPA’s project announcement wasn’t asking to have cyber evidence manufactured as the basis for such a narrative.  But otherwise, the DARPA project announced on 22 April 2016 was a pretty good description of what Alfa-gate turned into.

No wonder Durham wanted to investigate Georgia Tech’s role in the DARPA project.  Stay tuned.

Feature image:  The Battle of Austerlitz, 2 December 1805 (detail), François Gérard via Wikimedia Commons. Collateral images Pixabay.
