Candid Thoughts on US Leadership in the World


There’s an elephant in the room. And it’s past time to let it out.

Those of you who’ve met ERI’s leadership and analyst cadre probably recognized right away that we, like most intelligence professionals, are opinionated individuals who are comfortable speaking our minds in private settings. Divergent in opinion as we often are (which we see as a strength of our business and analysis), we take exceptional pride in training our analysts – and other companies’ analysts – to work diligently to ensure that analysis is anything but opinion. There is, after all, a sharp division between editorial and analysis – something that is not always immediately understood by those new to the field. Our product is intelligence analysis. By definition, analysis is as objective as humanly possible. It considers only facts, trends and potential outcomes. It does not serve a specific agenda, other than to support business and executive decision-making with the best and most relevant information available, in the hope that its consideration leads to decisions that produce the most effective outcomes possible. That is the role of proper analysis. These premises underpin our tradecraft.

That’s a long-winded way of saying that we find it challenging to suggest that we, as intelligence analysts, need to explore head-on whether the ship that is the world’s superpower has lost its leadership heading – and if so, what the impacts have been and how we might predict future outcomes. (Elephant exits the room.) It’s past time to look at this situation objectively, accounting for bias (yes, we are ALL biased) and approaching our tradecraft as the expert analysts we are. As for what follows, agree or disagree. Or ask more questions. This is not intended as political commentary, but it is also not analysis. Rather, it’s thought – or better yet, a challenge to all of us to think beyond our politics, whatever they are, and to explore whether bias is creeping into our own work when we sit down with our cup of coffee and keyboard each morning. How would we treat this situation as a US company? Would we, or do we, treat it differently as an organization outside the US?

Leadership Shifts

Since taking office, the administration has engineered a marked shift in US leadership. A move away from accepted norms and redlines of political power, along with changes in perceptions of and dealings with US allied and non-allied countries, has brought us to a new and very different foreign policy than the US has followed under previous leaders. Not all of it has been negative; in fact, there is an argument to be made that some of these changes were badly needed to break stalemates over old issues. But the president of the world’s current superpower has also been accused of opening himself up to the prospect of blackmail by adversarial foreign powers, spoken derisively of perceived political challengers on both sides of the aisle, engaged in name-calling with foreign leaders and alternately praised authoritarian rulers while speaking, at times wistfully, of the authoritarian controls they enjoy. Where diplomacy and backroom discussions between the US and world leaders previously led to restraint from human rights abuses – or at least significantly lower-profile abuses – those discussions, or perhaps the lack of them, now seem to have led us to a place of emboldened dictatorial behavior. Does this lie in the more transactional approach of current US leadership? Where previously US foreign policy and aid money came attached to more conditions and more diplomacy, today it appears less interventionist and attached to only a few, inconsistent ideologies.

Over the course of its post-World War II reign as one of the world’s two superpowers – and as the only superpower in the post-Soviet world order since the 1990s – embracingly, reluctantly or both, the US has often been viewed as the world’s police force and center of gravity for good governance. Successfully and unsuccessfully, the nation has been a key provider of aid; it has (yes, selectively) contained dictators, spoken out against human rights abuses and borne a disproportionate share of the burden in defense alliances such as NATO. Its domestic market has buoyed the global economy, and its role in the advent of the internet has connected industries, academia and people from major population centers to far-flung rural corners of developing nations, bringing to bear today’s new, data-driven industrial revolution. As a result, foreign leaders have often been forced to consider the consequences of their actions, whether against their own populations or their neighbors. The US could not be everywhere at once, nor did every malicious action interfere directly with US interests. But the consequences it could impose were at least a consideration, thanks to US interventionist policy that projected hard and soft power far beyond US borders.

Today, how serious are these considerations for foreign leaders? Are they still a serious concern that imposes a deterrent effect? And even if we disregard a direct effect, how do these events – seemingly ordered by governments that are often our business and government partners in various endeavors – impact our investments? Our ability to continue to work in these countries, with these government partners? Will this new world order endanger our personnel by giving them less protection from corrupt governments and security forces?

Let’s simply take a look at some of the facts.

  • In March, Russia resumed targeted attacks on opponents in the UK, when alleged Russian GRU officers poisoned former Russian spy Sergei Skripal and his daughter Yulia in Salisbury. The last time Russia carried out such an attack was in 2006, while the US government and populace were consumed by a global war on terror and deeply committed to ongoing wars in Iraq and Afghanistan.
  • In the last year, China has approximately doubled the size of the internment camps where it incarcerates its ethnic Uighur population in an effort to conduct “re-education,” attempting to convert Chinese Muslims from Islam to atheism and Chinese Communist Party-endorsed values.
  • Earlier this month, Meng Hongwei, the president of Interpol and a Chinese citizen, returned to China and has since disappeared. Meng’s disappearance, though suspected to be the result of ongoing corruption investigations against Chinese officials, draws a bright spotlight on China’s secret detention program, used in this case against China’s most internationally visible law enforcement official. Meng’s detention follows the four-month detention of high-profile Chinese actress Fan Bingbing on tax evasion allegations.
  • In early October, Saudi journalist and prominent critic of the ruling family Jamal Khashoggi walked into the Saudi Consulate in Istanbul, Turkey, and was killed by Saudi officials. The killing, responsibility for which has been not-so-deftly deflected by Saudi rulers, comes during a period when US-Saudi relations have been as close as they have ever been.
  • In the Middle East, Asia and Africa, dictatorial leaders have increased their war on journalists and press outlets, decrying news that portrays them in a negative light as “fake,” and in many cases jailing them or worse.
  • Last week in Nigeria, the Nigerian military posted a video of the US president discussing shooting people at the border who throw rocks at US border patrol agents. The video was posted in response to criticism over the military’s decision on Monday of that week to fire on Shiite protesters in the capital, killing six.
  • Meanwhile, countries across Europe – the UK, Germany, Poland, Sweden and others – are experiencing a rise in the influence of far-right political groups, activists and extremists that are gaining ground through parliamentary elections. Much of the recent rise of nationalism across Europe has correlated with the rise of similar sentiment in the US, suggesting at minimum a tangential relationship.

Are these actions the result of foreign leaders’ perceptions of changed US leadership? Or would they have happened anyway? If they are, how does this change our assessment or prediction of future foreign leadership decisions and the resulting risks to our organizations? While there was a time when organizations may have said “that has nothing to do with us,” most companies can no longer say that – and, importantly, most don’t want to. If the Khashoggi case illuminated anything for the business world, it was the tangible political and financial risk associated with gross human rights violations, even when organizations have no direct involvement and would never sanction such actions.

So, is the world today really experiencing more of an “each to their own” climate than it has in recent history? And how much of that is down to changes in US leadership? Historian Robert Kagan calls this the international “jungle” – the world before the “liberal world order,” pre-World War II, where leadership based on good governance is absent and strongmen prevail. If the US no longer wants to take on that leadership role, who takes it up? Or do we return to an era of every nation for itself? And if so, how does that change the boundaries of countries’ behavior and the impact on our people and organizations?

As we have previously posited, when writing on domestic politics it is sometimes useful to distance oneself from the most polarizing aspects of the issue – in this case, that often means speaking less directly about polarizing personalities, whose mention can distract us from the information and analysis we need to provide to decision makers. But in this new international order – with US decision-making less predictable and in some cases more immediately impactful on other global leaders’ decisions – it is important to ask what we may be leaving out of our analysis, and leaving to chance, if we are tiptoeing around the topic or ignoring it altogether. If it weren’t our country, would we leave it out of our analysis as a major data point? How do we approach it with as little bias as possible? How do we look objectively at an administration that virtually no one on the planet is ambivalent about?

It’s a tough nut to crack, and we suggest doing it with structured thinking, with analytic tools that can help you understand your own bias, and alongside counterparts whose perspectives differ markedly from your own. But to leave these questions unanswered is to create a serious blind spot for the future of our organizations and the world.

Meredith Wilson and Brady Roberts are the CEO and COO of Emergent Risk International, respectively.

Emergent Risk Summer Seminar Series – Chicago and Detroit


Join us for a free concise writing seminar for corporate intelligence professionals.

This seminar will focus on helping intelligence professionals of all levels of expertise hone their business writing skills. We will work through several interactive exercises and discuss editing tips, tone, word choice and ways to write on politically polarizing and sensitive topics.

ERI Seminar at Tufts University Political Risk Conference, April 2016

Date: August 2, 2018

Where: Chicago, IL

Time: 1:30 – 4:00

Or

Date: August 7, 2018

Where: Detroit

Time: 1:30 – 4:00

 

Both trainings will be followed by an informal Happy Hour at a location TBD from 4:30 – 6:30.

To register, please send the name, title and email of the desired participant via the Contact Us link on this website. We have a limited number of seats, so please send your registration right away. Participation is limited to corporate, government and NGO intelligence professionals.

Calling all interns!!!


Intelligence Analysis Intern

Emergent Risk International is hiring interns! ERI is a Dallas-based risk and intelligence advisory firm. We help companies use intelligence and analysis to better drive their business. We focus on three primary activities:

  • Assessment and Analysis: We provide bespoke geopolitical and threat intelligence products to address issues of concern to our clients. These products range from those developed for regular distribution to in-depth new market entry assessments.
  • Training: We train intelligence analysts to provide intelligence analysis in a business environment focusing on tradecraft and tools that drive more efficient and effective analysis. We offer a range of in-house and open trainings to address specific levels of experience and need.
  • Consulting: We help companies develop and improve their intelligence programs, providing end-to-end support – from assessing needs and providing analysis to recruiting and hiring highly qualified candidates.

Intelligence Analysis Interns will be responsible for helping ERI with a range of research, technology, analytic and administrative tasks to better serve its clients. Interns will contribute to research projects, products and services and will have a role in developing new products for the company.

Primary responsibilities:

  • Assist analysts by providing structured country research on issues of importance to ERI clients
  • Assist in building data visualizations and social media posts
  • Assist in developing business leads
  • Develop and maintain an awareness of relevant global issues impacting ERI’s primary client base
  • Administrative tasks as necessary

Remuneration: This role is unpaid and runs from the beginning of October through December (candidates interested in continuing through the spring semester will also be considered). Interns will receive intelligence analysis training, start-up experience and exposure to other critical professional skill sets. This internship can also be taken for course credit with the permission of a candidate’s institution. Successful candidates will work out of our offices in downtown Dallas 15-20 hours per week. Some exceptional candidates may be considered for remote work.

Experience: The right candidate will possess most or all of the following qualifications:

  • Excellent writing skills
  • In the final year of an undergraduate program, or holding a completed BA/BS degree
  • Foreign language capability
  • International relations, political economy, economics, development, political science or other related fields of study are preferred
  • Working knowledge of information technology, social media and tech related trends
  • Experience living or studying abroad
  • Strong academic record
  • Self-motivated and able to manage time effectively
  • Strong work ethic and commitment

Application Deadline is September 30. Please send your resume, cover letter, and a recent writing sample to: eriteam@emergentriskinternational.com

The Geopolitics – Cyber Nexus


 

We live in a world exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology.

Carl Sagan

When I talk to students or my interns about careers, one thing I tell them all is that they must possess an above-average understanding of cyber and technology. An inability to process the meaning and implications of technology for the geopolitical landscape will make it impossible to grasp the future trajectory of global security, inter-state relations, economics and politics.

For years, tech-oriented analysts warned of a quickening convergence between cyberthreats and traditional security threats – be it crime or the convergence of geopolitics, war, diplomacy and cyber warfare. Today, cyber and physical threats are becoming indivisible. Consider the rise of ISIS, which would not have taken shape as quickly or broadly without the internet – and especially social media. Or the protests of the Arab Spring, which, aided by social media, rose faster and pushed the Middle East further in a shorter time frame than anyone would have thought possible. And criminality, from human trafficking to drug smuggling to the most basic of crimes: credit card fraud. You are more likely to have someone steal your credit card information virtually (or purchase it from a dark web site) than you are to be mugged on the street.

Cyberwarfare and technology form a major facet of the strategy of state and non-state actors. “Cyber” is nothing more than another tool used by a range of geopolitical and criminal actors to influence an outcome by force. When we discuss North Korea, for example, most reporting highlights developments in its missile program and the threat of traditional war. Much less is said about the covert cyberwarfare that has purportedly been aimed at North Korea’s nuclear program for several years now. Even less is said of North Korea’s cyber capability and the attacks it has reportedly carried out against governments and major multinational companies, or of the country’s purported role in the WannaCry virus that surfaced earlier this year, fashioned in part out of leaked NSA cybertools. None of this is a secret, but it is repeatedly left out of geopolitical analysis that examines how scenarios between North Korea and the rest of the world might unfold.

We cannot credibly assess future scenarios without taking the cyber capabilities of the actors – state or non-state – into account. Likewise, it is increasingly hard to do the opposite: credible cyberthreat analysis cannot be undertaken without consideration of the geopolitical and security aims of the actor. Is the attack part of a broader strategy to attack our country or organization? Is it being carried out by a state or non-state actor? Is there a motive that stretches beyond money? Who are they connected to? And yet we continue to separate our analysis into the cyber and the non-cyber – almost guaranteeing that we are not appreciating the whole picture.

Ukraine is the most obvious example of how cyberwarfare has become an integrated part of political and conflict strategy. It has become a virtual blueprint for modern hybrid warfare – the future of conflict – yet it remains relatively unappreciated outside a few well-informed circles of the geopolitical and cyber community. The Petya/“NotPetya” attacks in late June are one of the most recent examples. While ground warfare continues in and near separatist-controlled Ukraine, a disinformation campaign has persisted for years, and critical infrastructure has been hit repeatedly – not by bombs (though that has also happened), but by cyberattacks. The NotPetya attack, while not spreading as far as WannaCry, impacted trains, airports, banks, electricity and several other types of critical infrastructure simultaneously. Until that point, no malicious cyber actor had ever accomplished a simultaneous attack on several types of critical infrastructure. The attack, delivered through a compromised accounting software update and disguised as ransomware, was actually designed to wipe hard drives with no prospect of recovery. In addition to taking the intended victim’s critical infrastructure offline, it spread to global businesses, bringing supply chains and logistics to a grinding halt. Many affected businesses still have not fully recovered and will report significant losses from recovery expenses in Q2 of this year.

While the probable state actor was identified quickly – given that Ukraine has suffered virtually non-stop cyberattacks for the past four years – it might have taken longer to make this connection had the attack targeted a country or company with a less clear-cut adversary, potentially making mitigation and defensive measures harder to implement.

In order to improve the way we understand the world, we have to better understand those things that may lie outside our expertise or comfort zone. Facebook’s Chief Security Officer, Alex Stamos, made a similar point to the cyber community at the 20th anniversary of Black Hat, an annual conference dedicated to hacking. In Stamos’ words, the hacking community has to become more diverse and inclusive – not just for diversity’s sake, but in order to better understand the implications of hacking, who it harms and how it harms them. It is estimated that the world will need around 2 million more cybersecurity experts in the next two years alone. Meanwhile, academics, intelligence professionals, business leaders – and especially governments – need to get much smarter about cyber, rather than dismissing it. Until we do, our analysis will be incomplete.

Disinformation is changing the way we view the world

  •  Disinformation is proliferating and intensifying threats, seen and unseen.
  • False narratives propagated by state and non-state actors utilizing disinformation can ruin corporate and personal reputations, threaten business continuity, cause or intensify conflict and drive deep ideological fault lines.
  • A recent study of computer-generated propaganda and disinformation revealed that disinformation is having a strong impact on public opinion in some countries.
  • A lack of public understanding of the depth and sophistication of disinformation operations is leaving governments, organizations, and individuals vulnerable to cybercrime and false narratives that undermine sound decision-making.
  • Tools, information, and counter-measures are emerging, but education is the most effective tool we have against disinformation.

Back in January, not long after the new administration took office, Starbucks made a pledge to hire 10,000 refugees. The move quickly became a political issue, with people on one side applauding the effort and those on the other angry about it.

What drove some of the anger, however, was at best misinformation (unintentionally misleading information) and possibly disinformation (information intended to manipulate public opinion and perceptions), spread via several viral memes. One of the most visible memes said: “Hey Starbucks, instead of hiring 10,000 refugees, why don’t you hire 10,000 veterans?” But here’s the thing: Starbucks has had a goal of hiring 10,000 veterans through its very successful military partners program since 2013. And while there were other factors at play too, the disinformation amplified the issue, which quickly lodged itself in a partisan fault line. Starbucks handled the issue in part by drawing attention to its long-running veterans program, and veterans who work with Starbucks came to its defense. But the story is illustrative of a growing problem. Fake news and disinformation travel at the speed of a text message or a Twitter post, and their impact can damage reputations – and worse – catalyze security and major geopolitical events.

As we’ve highlighted in a previous article, disinformation is NOT new. But it is an increasingly potent force for driving public opinion and catalyzing crises, amplified by its use alongside other mediums such as social media, malware and increasingly popular alternative media outlets. Technology, speed and open access to low-cost tools for disseminating information instantly and globally allow anyone – state and non-state actors, politicians, companies and ordinary people – to spread and amplify disinformation with the right combination of tools.

Disinformation is shaping public opinion

A new study from Oxford University discusses how technology and social media contribute to the spread of disinformation and propaganda. The Computational Propaganda Research Project found that propaganda and disinformation spread via social media are being utilized in multiple countries around the world to successfully shape public opinion on political issues, especially related to referendums and elections. Moreover, it found that highly automated social media accounts (also known as “bots”) were responsible for as much as 45% of all Twitter traffic in some countries. The study looked at recent events in Canada, Russia, Brazil, China, the US and several other countries to understand the impact of social media – especially automated social media – on public opinion. Particularly interesting findings from the paper included:

  • In Brazil, “bot networks and other forms of computational propaganda were active in the 2014 presidential election, the constitutional crisis, and the impeachment process.” These included “highly automated accounts supporting and attacking political figures, debate issues such as corruption, and encouraging protest movements.”

  • In the US, “Twitter bots … reached highly influential network positions within retweet networks during the 2016 US election. The botnet associated with Trump-related hashtags was 3 times larger than the botnet associated with Clinton-related hashtags.”
  • In Russia, over 45% of all social media activity is generated by botnets. Nearby, “Ukraine is the frontline of experimentation in computational propaganda, with active campaigns of engagement between Russian botnets, Ukraine nationalist botnets, and botnets from civil society groups.”

In simple terms, bots are being used to pump out information, usually false or misleading, at alarmingly high volumes in some countries. The use of bots to increase the visibility of disinformation across the internet plays into several biases, including availability bias and confirmation bias, and has proven highly effective at shaping public opinion. That impact affects everything from referendums and elections, to social and political stability, to consumer brand preferences, to whom we trust online with sensitive information, to our reputation and security.
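To make the idea of a “highly automated account” concrete, the sketch below (ours, not the study’s) flags accounts purely on posting frequency – a simplified version of the kind of heuristic such research describes. The 50-posts-per-day cutoff and the `Account` structure are illustrative assumptions only; real classifiers weigh many more signals, such as content similarity, timing patterns and network structure.

```python
# Minimal sketch: flag "highly automated" accounts by average posting frequency.
# The 50-posts-per-day threshold is illustrative, in the spirit of heuristics used
# in academic studies of computational propaganda; it is not a definitive detector.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Account:
    handle: str
    post_timestamps: list  # observed post times as datetime objects


def looks_automated(account: Account, threshold_per_day: float = 50.0) -> bool:
    """Return True if the account's average posting rate exceeds the threshold."""
    if len(account.post_timestamps) < 2:
        return False
    first, last = min(account.post_timestamps), max(account.post_timestamps)
    days_observed = max((last - first).total_seconds() / 86400, 1.0)
    return len(account.post_timestamps) / days_observed > threshold_per_day


if __name__ == "__main__":
    # Example: an account posting every 7 minutes (~200 posts/day) gets flagged.
    start = datetime(2017, 6, 1)
    noisy = Account("suspect_account",
                    [start + timedelta(minutes=7 * i) for i in range(600)])
    print(looks_automated(noisy))  # True
```

Even a crude frequency filter like this illustrates why bot-driven amplification is detectable in aggregate but hard to police account by account: individual thresholds are easy to evade, which is why researchers combine them with content and network analysis.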

Disinformation is driving geopolitical events

Disinformation has the potential to catalyze diplomatic tensions into much larger geopolitical situations. Particularly in countries where hair-trigger responses to threats are common (Israel/Palestine, North Korea, South Korea, etc.), disinformation at the right tempo and time could touch off a major conflict before there is time to set the record straight. Two recent examples include planted disinformation in a Bahraini newspaper that catalyzed the still-festering Qatar blockade, and a fake article about Israel and Pakistan.

In Bahrain, the offending article may have been a catalyst (and a convenient excuse) for GCC countries to cut off diplomatic relations with Qatar, though it was by no means the cause of the dispute. The crisis has created expensive logistical problems for businesses in the region and substantially complicated transnational relationships across the Gulf, US regional relationships and counter-terrorism efforts there.

In another recent incident, the Pakistani Foreign Minister tweeted a nuclear threat at Israel after reading a fake news piece on Israel’s response to reports that Pakistan was getting involved in Syria.

Disinformation is driving real world security events

Disinformation, misinformation, and outright fake news are causing real-world security problems. Most people will now be familiar with the fabricated news story that circulated in the final days of the election about a certain candidate running nefarious activities out of the basement of a pizza restaurant in Washington, DC, Comet Ping Pong. The fake story, which originated on Twitter and quickly spread to Reddit, back to Twitter and on to Facebook, became the basis for a real-world security event when a man named Edgar Welch showed up at the pizzeria with a rifle in hand. Only then did he discover that the restaurant didn’t HAVE a basement, let alone any politicians involved in human trafficking there.

Long before the election, similar fake stories were generated in an attempt to create more panic over the Ebola crisis and Ferguson, along with a completely made-up event about a chemical plant owned by Columbian Chemicals exploding in Louisiana on 9/11.

The Columbian Chemicals case, in particular, demonstrated how such a hoax can disrupt a company and local emergency services and propagate disinformation through the use of botnets (though this particular attempt was not as successful as the perpetrators intended).

These cases show obvious links between disinformation and the resulting events, but more can undoubtedly be found by digging into the annals of radicalization. Cases exist across the political and ideological spectrum highlighting the selective disinformation and propaganda that individuals have consumed in the process of becoming radicalized.

Disinformation can lead to major IT security threats

Malicious links are often found in hyper-partisan click-bait publications that trade in fake news, disinformation, and hyperbolic and polarizing headlines, or in advertisements created by botnets. In a recent case, a Defense Department employee clicked on a link in his Twitter feed about vacation packages tailored to his interests. The link downloaded malware that allowed a server in Russia to take control of his computer and Twitter account. The same malware was sent to at least 10,000 DoD employees.

Like phishing attacks, these campaigns trick users into clicking on them based on a sense of trust in the friend who shared the post or on the targeted nature of the information, which is tailored to their specific social media behavior. According to government sources, Twitter carries the most malicious links embedded in disinformation and other types of posts, due to the large number of botnets active on the site sharing the information at high volume. The problem has also been identified, in smaller volume, on Facebook, which has been taking public steps to address disinformation and fake news proliferating on the site.
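On the defensive side, one simple control is to resolve a link to its final destination and check the landing domain before anyone follows it. The hypothetical snippet below sketches that idea; the blocklist entries and function name are our own illustrative assumptions, and production defenses rely on curated threat-intelligence feeds and sandboxing rather than a static list.

```python
# Illustrative sketch only: resolve a (possibly shortened) URL and check the
# final domain against a small blocklist. The domains here are placeholders.
from urllib.parse import urlparse

import requests

BLOCKLISTED_DOMAINS = {"malicious-example.com", "fake-news-example.net"}  # placeholders


def is_link_suspicious(url: str, timeout: float = 5.0) -> bool:
    """Follow redirects with a HEAD request and flag the link if the final domain is blocklisted."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=timeout)
        final_domain = urlparse(resp.url).netloc.lower()
    except requests.RequestException:
        return True  # treat unreachable or malformed links as suspicious by default
    return any(final_domain == d or final_domain.endswith("." + d)
               for d in BLOCKLISTED_DOMAINS)
```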

Disinformation may be driving some of your most important decisions

Distinguishing between what is real and what is fake is getting more difficult. Even more worrying is the gray area in between, where information is intentionally altered, taken out of context, or twisted to support a particular narrative. Over time these misconceptions become embedded in political discourse and become part of our basic assumptions. This drives poor decision-making and, in the extreme, dangerous rhetoric that dehumanizes and encourages violence against people who do not agree with a particular point of view – whether it is viral memes suggesting that police officers are violent or media outlets that portray peaceful protests as being carried out by “violent criminals.” In extreme scenarios, disinformation and propaganda fed numerous genocides during the 20th century, whether religiously, ethnically, or politically driven.

In a less dramatic – but worrying – scenario, Russia has targeted multiple demographics in the US including US military personnel, through the use of fake “patriotic” websites and Facebook groups that plant disinformation and fake news to influence their behavior and loyalties.

In aggregate, continued exposure to disinformation affects consequential decisions, from split-second battlefield judgments to major policy initiatives fed by hyper-partisan narratives. But it also feeds everyday decisions, like whom we choose to hire, whether we interpret a person’s actions as a threat based on the way they look or some aspect of their character, and how we respond to that threat.

Falling behind on information warfare

Disinformation – though as old as time – has been harnessed in the information age to amplify social issues, increase distrust among disparate groups of people and drive wedges between previously unified groups. While some countries have been wise to this for many years, in the US the phenomenon has caught many off guard, leaving us more vulnerable than many realize. The US public had more awareness of disinformation operations during the Cold War, but the fall of the Soviet Union and the shift in security focus to terrorism and threats from non-state actors dulled the public’s understanding of the topic. Meanwhile, disinformation tradecraft (particularly in Russia and Eastern Europe) became more sophisticated, automated and easier to exploit. This was most dramatically seen in the recent US election cycle, but it has been occurring throughout the last decade, and particularly since the US and EU levied sanctions against Russia in 2014 following its annexation of Crimea.

Managing Pandora’s Box

As with every problem that vexes our society, there is innovation and ingenuity at work on solutions. And though there is no way to close Pandora’s box again, there are ways to lessen the impact of disinformation. Key among these is educating ourselves, and those around us, about how our biases make us prone to falling prey to disinformation. In addition, there is a growing body of literature and study on the topic that can help us understand what is occurring, when it is occurring and how to address it. We can also familiarize ourselves with some of the tools emerging to address this problem, from Facebook and Google among others – from initiatives to develop artificial intelligence solutions, to academic research, to the use of trusted data sources to refute questionable claims. There are good reasons to be hopeful about our ability to manage the threat that disinformation creates, but the fight against the use of technology to mislead, misinform, and drive wedges between people is particularly urgent, especially in an age when mass communication happens at the touch of a button.

Meredith Wilson is the Founder and CEO of Emergent Risk International, LLC. Find out more or subscribe to our newsletter here.
