The Geopolitics – Cyber Nexus



We live in a world exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology.

Carl Sagan

When I talk to students or my interns about careers, one thing I tell them all: they must possess an above-average understanding of cyber and technology. An inability to process the meaning and implications of technology on the geopolitical landscape will make it impossible to grasp the future trajectory of global security, inter-state relations, economics and politics.

For years, tech-oriented analysts warned of a quickening convergence between cyberthreats and traditional security threats – from crime to the blending of geopolitics, war, diplomacy and cyber warfare. Today, cyber and physical threats are becoming indivisible. Consider the rise of ISIS, which would not have taken shape as quickly or broadly without the internet – and especially social media. Or the protests of the Arab Spring, which, aided by social media, rose faster and pushed the Middle East further in a shorter time frame than anyone thought possible. And criminality, from human trafficking to drug smuggling to the most basic of crimes: credit card fraud. You are more likely to have someone steal your credit card information virtually (or purchase it from a dark web site) than you are to be mugged on the street.

Cyberwarfare and technology form a major facet of the strategy of state and non-state actors. “Cyber” is nothing more than another tool used by a range of geopolitical and criminal actors to influence an outcome by force. When we discuss North Korea, for example, most reporting highlights developments in its missile program and the threat of traditional war. Much less is said about covert cyberwarfare that has purportedly been aimed at North Korea’s nuclear program for several years now. Even less is said of North Korea’s cyber capability and the attacks it has reportedly carried out against governments and major multi-national companies. Or the country’s purported role in the WannaCry virus that surfaced earlier this year, fashioned out of leaked NSA cybertools. None of this is a secret, but it is repeatedly left out of geopolitical analysis that examines how scenarios between North Korea and the rest of the world might unfold.

We cannot credibly assess future scenarios without taking the cyber capabilities of the actors – state or non-state – into account. Likewise – it is increasingly hard to do the opposite as well. That is, credible cyberthreat analysis cannot be undertaken without consideration of the geopolitical and security aims of the actor. Is the attack part of a broader strategy to attack our country or organization? Is it being carried out by a state or non-state actor? Is there a motive that stretches beyond money? Who are they connected to? And yet, we continue to separate our analysis into the cyber and the non-cyber – almost guaranteeing that we are not appreciating the whole picture.

Ukraine is the most obvious example of how cyberwarfare has become an integrated part of political and conflict strategy. It has become a virtual blueprint for what modern hybrid warfare looks like. It is the future of conflict, but it remains relatively unappreciated outside a few well-informed circles of the geopolitical and cyber community. The Petya/NotPetya attacks in late June are one of the most recent examples. While ground warfare continues in and near separatist-controlled Ukraine, an ongoing disinformation campaign has persisted for years. And critical infrastructure has been hit repeatedly, not by bombs (though that has also happened), but by cyberattacks. The NotPetya attack – while not spreading as far as WannaCry – impacted trains, airports, banks, electricity and several other types of critical infrastructure simultaneously. No malicious cyber actor had ever before managed a simultaneous attack on so many types of critical infrastructure. The attack, delivered as an accounting software update and disguised as ransomware, was actually built to wipe hard drives with no possibility of recovery. In addition to taking the intended target’s critical infrastructure offline, it spread to global businesses, bringing supply chains and logistics to a grinding halt. Many affected businesses still have not fully recovered and will report significant losses from the recovery expense for Q2 this year.

The probable state actor was identified quickly, given that Ukraine has suffered virtually non-stop cyberattacks for the past four years. Had the attack targeted a country or company with a less clear-cut adversary, making that connection could have taken far longer, potentially making mitigation and defensive measures harder to implement.

In order to improve the way we understand the world, we have to better understand the things that lie outside our expertise or comfort zone. Facebook’s Chief Security Officer, Alex Stamos, made a similar point to the cyber community at the 20th anniversary of Black Hat, an annual conference dedicated to hacking. In Stamos’ words, the hacking community has to become more diverse and inclusive. This is not just for diversity’s sake, but in order to better understand the implications of hacking, who it harms and how it harms them. It is estimated that the world will need around 2 million more cybersecurity experts in the next two years alone. Meanwhile, academics, intelligence professionals, business leaders – and especially governments – need to get much smarter about cyber, rather than dismissing it. Until we do, our analysis will be incomplete.

Disinformation is changing the way we view the world

  •  Disinformation is proliferating and intensifying threats, seen and unseen.
  • False narratives propagated by state and non-state actors utilizing disinformation can ruin corporate and personal reputations, threaten business continuity, cause or intensify conflict and drive deep ideological fault lines.
  • A recent study of computer-generated propaganda and disinformation revealed that disinformation is having a strong impact on public opinion in some countries.
  • A lack of public understanding of the depth and sophistication of disinformation operations is leaving governments, organizations, and individuals vulnerable to cybercrime and false narratives that undermine sound decision-making.
  • Tools, information, and counter-measures are emerging, but education is the most effective tool we have against disinformation.

Back in January, not long after the new administration took office, Starbucks made a pledge to hire 10,000 refugees. The move quickly became a political issue, with folks on one side applauding the effort and folks on the other angered by it.

What drove some of the anger, however, was at best misinformation (unintentionally misleading information) and possibly disinformation (intentionally misleading information meant to manipulate public opinion and perceptions), spread via several viral memes. One of the most visible memes said: “Hey Starbucks, instead of hiring 10,000 refugees, why don’t you hire 10,000 veterans?” But here’s the thing: Starbucks has had a goal of hiring 10,000 veterans through its very successful military partners program since 2013. And while there were other factors at play too, the disinformation amplified the issue, which quickly lodged itself in a partisan fault line. Starbucks handled the issue, in part, by drawing attention to its long-running veterans program. In addition, veterans who work with Starbucks came to its defense. But the story is illustrative of a growing problem: fake news and disinformation travel at the speed of a text message or a Twitter post, and their impact can damage reputations – and, worse, catalyze security and major geopolitical events.

As we’ve highlighted in a previous article, disinformation is NOT new. But it is an increasingly potent force driving public opinion and catalyzing crises, amplified by its use alongside other mediums like social media, malware and increasingly popular alternative media outlets. Technology, speed and open access to low-cost tools for instant, global dissemination allow anyone – state and non-state actors, politicians, companies and ordinary people – to spread and amplify disinformation.

Disinformation is shaping public opinion

A new study from Oxford University discusses how technology and social media contribute to the spread of disinformation and propaganda. The Computational Propaganda Research Project found that propaganda and disinformation spread via social media are being utilized in multiple countries around the world to successfully shape public opinion on political issues, especially related to referendums and elections. Moreover, it found that highly automated social media accounts (also known as “bots”) were responsible for as much as 45% of all Twitter traffic in some countries. The study looked at recent events in Canada, Russia, Brazil, China, the US and several others to understand the impact of social media – especially automated social media – on public opinion. Particularly interesting findings from the paper included:

  • In Brazil, “bot networks and other forms of computational propaganda were active in the 2014 presidential election, the constitutional crisis, and the impeachment process.” These included “highly automated accounts supporting and attacking political figures, debate issues such as corruption, and encouraging protest movements.”

  • In the US, “Twitter bots … reached highly influential network positions within retweet networks during the 2016 US election. The botnet associated with Trump-related hashtags was 3 times larger than the botnet associated with Clinton-related hashtags.”
  • In Russia, over 45% of all social media activity is generated by botnets. Nearby, “Ukraine is the frontline of experimentation in computational propaganda, with active campaigns of engagement between Russian botnets, Ukraine nationalist botnets, and botnets from civil society groups.”

In simple terms, bots are being utilized to pump out information, usually false and misleading, in alarmingly high volumes in some countries. The use of bots to increase the visibility of disinformation across the internet plays into several biases, including availability bias and confirmation bias, and has proven highly effective at shaping public opinion. That impact affects everything from referendums and elections, to social and political stability, to consumer brand preferences, to whom we trust online with sensitive information, to our reputation and security.
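The automated amplification described above can be illustrated with a deliberately crude heuristic: flag accounts whose posting rate far exceeds what a human could plausibly sustain. This is a minimal sketch with entirely hypothetical account data and thresholds; real bot-detection research uses hundreds of behavioral features.

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    posts_last_24h: int
    distinct_sources_shared: int  # how many different outlets the account links to

def bot_likelihood(acct: Account) -> float:
    """Crude 0-1 score: very high posting volume combined with a narrow
    set of shared sources is a weak signal of automation."""
    volume = min(acct.posts_last_24h / 144.0, 1.0)  # 144/day = one post every 10 minutes
    narrowness = 1.0 if acct.distinct_sources_shared <= 2 else 0.3
    return round(volume * narrowness, 2)

# Hypothetical accounts: an ordinary user vs. a high-volume amplifier.
accounts = [
    Account("news_fan_01", posts_last_24h=8, distinct_sources_shared=6),
    Account("amplifier_bot", posts_last_24h=900, distinct_sources_shared=1),
]
flagged = [a.handle for a in accounts if bot_likelihood(a) > 0.8]
```

Thresholds like these are easy for sophisticated operators to evade, which is why volume-based flags are only a starting point for analysis, not a verdict.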

Disinformation is driving geopolitical events

Disinformation has the potential to catalyze diplomatic tensions into much larger geopolitical situations. Particularly in countries where hair-trigger responses to threats are common (Israel/Palestine, North Korea, South Korea, etc.), disinformation at the right tempo and time could touch off a major conflict before there is time to set the record straight. Two recent examples: planted disinformation in a Bahraini newspaper article that catalyzed the still-festering Qatar blockade, and a fake article about Israel and Pakistan.

In Bahrain, the offending article may have been a catalyst (and a convenient excuse) for GCC countries to cut-off diplomatic relationships with Qatar, though it was by no means the cause of the dispute. The crisis has created expensive logistical problems for businesses in the region and substantially complicated transnational relationships across the Gulf, US regional relationships and counter-terrorism efforts there.

In another recent incident, the Pakistani Foreign Minister tweeted a nuclear threat at Israel after reading a fake news piece on Israel’s response to reports that Pakistan was getting involved in Syria.

Disinformation is driving real world security events

Disinformation, misinformation, and outright fake news are causing real-world security problems. Most people will now be familiar with the fabricated news story that circulated in the final days of the election about a certain candidate running nefarious activities out of the basement of a pizza restaurant in Washington DC, Comet Ping Pong. The fake story, which originated on Twitter, quickly spread to Reddit, back to Twitter and on to Facebook, and became the basis for a real-world security event when a man named Edgar Welch showed up at the pizzeria, rifle in hand. Only then did he discover that the restaurant didn’t HAVE a basement, let alone any politicians involved in human trafficking there.

Long before the election, similar fake stories were generated in an attempt to create more panic over the Ebola crisis and Ferguson, along with a completely made-up event: a chemical plant owned by Columbian Chemicals exploding in Louisiana on 9/11.

The Columbian Chemicals case, in particular, demonstrated how such a hoax can disrupt a company and local emergency services and propagate disinformation through the use of botnets (though this particular attempt was not as successful as the perpetrators intended).

These cases show obvious links between disinformation and the resulting event, but more can undoubtedly be found by digging into the annals of radicalization. Cases across the political and ideological spectrum highlight the selective disinformation and propaganda that individuals have consumed in the process of becoming radicalized.

Disinformation can lead to major IT security threats

Malicious links are often found in hyper-partisan click-bait publications that trade in fake news, disinformation, and hyperbolic, polarizing headlines or advertisements created by botnets. In a recent case, a Defense Department employee clicked on a link in his Twitter feed about vacation packages tailored to his interests. The link downloaded malware that allowed a server in Russia to take control of his computer and Twitter account. The same malware was sent to at least 10,000 DoD employees.

Like phishing attacks, these campaigns trick users into clicking, exploiting either trust in the friend who shared the post or the targeted nature of the information, which is based on their specific social media behavior. According to government sources, Twitter carries the most malicious links embedded in disinformation and other posts, due to the large number of botnets active on the site sharing them at high volume. The problem has also been identified, at smaller scale, on Facebook, which has been taking public steps to address the disinformation and fake news proliferating on the site.
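Basic link hygiene blunts some of these campaigns. A minimal sketch of the idea, using hypothetical blocklist entries and example domains; real defenses rely on commercial reputation services, sandboxed detonation and user training rather than a hard-coded set.

```python
from urllib.parse import urlparse

# Hypothetical threat-intel data; a real deployment would query a
# reputation service rather than maintain static sets like these.
BLOCKED_DOMAINS = {"evil-vacations.example", "free-prizes.example"}
URL_SHORTENERS = {"bit.ly", "t.co", "tinyurl.com"}

def link_risk(url: str) -> str:
    """Classify a shared link as 'block', 'expand-and-recheck', or 'allow'."""
    host = (urlparse(url).hostname or "").lower()
    if host in BLOCKED_DOMAINS:
        return "block"
    if host in URL_SHORTENERS:
        # Shorteners hide the real destination; resolve it before judging.
        return "expand-and-recheck"
    return "allow"
```

The "expand-and-recheck" branch matters most in the social media context described above, because nearly all links shared on Twitter pass through a shortener that conceals the final destination.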

Disinformation may be driving some of your most important decisions

Distinguishing between what is real and what is fake is getting more difficult. Even more worrying is the gray area in between, where information is intentionally altered, taken out of context, or twisted to support a particular narrative. Over time these misconceptions become embedded in political discourse and part of our basic assumptions. This drives poor decision-making and, in the extreme, dangerous rhetoric that dehumanizes and encourages violence against people who do not share a particular point of view – whether it is viral memes suggesting that police officers are violent or media outlets that portray peaceful protests as the work of “violent criminals.” In the most extreme scenarios, disinformation and propaganda fed numerous genocides during the 20th century, whether religiously, ethnically, or politically driven.

In a less dramatic – but worrying – scenario, Russia has targeted multiple demographics in the US including US military personnel, through the use of fake “patriotic” websites and Facebook groups that plant disinformation and fake news to influence their behavior and loyalties.

In aggregate, continued exposure to disinformation affects consequential decisions, from split second battlefield decisions to major policy initiatives fed by hyper-partisan narratives. But it also feeds everyday decisions, like who we choose to hire, whether we interpret a person’s actions as a threat based on the way they look, or some aspect of their character, and how we respond to that threat.

Falling behind on information warfare

Disinformation – though as old as time – has been harnessed in the information age to amplify social issues, increase distrust among disparate groups of people and drive wedges between previously unified groups. While some countries have been wise to this for many years, in the US the phenomenon has caught many off-guard, leaving us more vulnerable than many realize. The US public had more awareness of disinformation operations during the Cold War, but the fall of the Soviet Union and the shift in security focus to terrorism and threats from non-state actors dulled the public’s understanding of the topic. Meanwhile, disinformation tradecraft (particularly in Russia and Eastern Europe) became more sophisticated, automated and easier to exploit. This was most dramatically seen in the recent US election cycle, but has been occurring throughout the last decade and particularly since the US and EU levied sanctions against Russia in 2014 following the annexation of Crimea.

Managing Pandora’s Box

As with every problem that vexes our society, there is innovation and ingenuity at work here. And though there is no way to close Pandora’s Box again, there are ways to lessen the impact of disinformation. Key among these is educating ourselves, and those around us, about how our biases make us prone to falling prey to disinformation. In addition, there is a growing body of literature and study on the topic that can help us understand what is occurring, when it’s occurring and how to address it. We can also familiarize ourselves with some of the tools (Facebook, Google) that are emerging to address this problem, from initiatives to develop artificial intelligence solutions, to academic research, to the use of trusted data sources to refute questionable claims. There are good reasons to be hopeful about our ability to manage the threat that disinformation creates, but the fight against the use of technology to mislead, misinform, and drive wedges between people is particularly urgent – especially in an age when mass communication happens at the touch of a button.

Meredith Wilson is the Founder and CEO of Emergent Risk International, LLC. Find out more or subscribe to our newsletter here.

Intelligence Analysis in the Age of Disinformation


Sometimes the US intelligence community gets it wrong, but often, they get it right. And sometimes, they get it really right. We featured the excerpt below in our summary piece on the ODNI threat assessment back in February – highlighting its importance for those who provide analysis and information for decision-makers.

“Future cyber operations will almost certainly include an increased emphasis on changing or manipulating data to compromise its integrity (i.e., accuracy and reliability) to affect decision-making, reduce trust in systems, or cause adverse physical effects. Broader adoption of IoT (internet of things) devices and AI—in settings such as public utilities and health care—will only exacerbate these potential effects… cyber actors, who post disinformation on commercial websites, might seek to alter online media as a means to influence public discourse and create confusion… (pg. 2)” 

As the perfect storm that is the US presidential election bears down on us, we face a global problem of how to assess the veracity and credibility of sources – not just for analysts and researchers but for the general public, who ultimately make the voting decisions that set the course of US foreign policy toward the rest of the world.

The proliferation of information on the internet has given rise to an avalanche of disinformation. At the same time, disparate groups across political, cultural and ideological divides increasingly disagree on what constitutes credible information. This creates problems for analysts and decision-makers as we grapple with utilizing the information that most accurately reflects the reality facing our business or organization. Below we outline some thoughts and resources for addressing disinformation within your organization, analysis and decision-making:

Agree on a common understanding of credibility.

Like many terms in the geopolitical-security world, we sometimes assume that when we discuss credible information, we have a common understanding and acceptance of what constitutes a credible source. Even within small organizations, opinions on credibility often vary markedly, informed by each individual’s life experiences and personal biases. For example, one analyst could be assigning high credibility only to sources that are politically left-leaning, while another may be doing the same with right-leaning media. (In reality, we hope analysts are looking at sources across the ideological spectrum to understand what their customers may be reading too.) Having team discussions about how your team defines credibility – and even utilizing a simple credibility ratings system for information and human sources, like the one used by the US intelligence community – can go a long way toward ensuring good communication, use of high-integrity sources and internal understanding of what’s credible and what may not be.
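The rating system used by the US intelligence community, often called the Admiralty or NATO system, grades a source's reliability (A–F) separately from the credibility of the specific piece of information (1–6), so a usually reliable outlet can still carry an uncorroborated claim. A minimal sketch of how a team might record such ratings:

```python
from dataclasses import dataclass

RELIABILITY = {  # source reliability, per the NATO/Admiralty scale
    "A": "Completely reliable", "B": "Usually reliable", "C": "Fairly reliable",
    "D": "Not usually reliable", "E": "Unreliable", "F": "Reliability cannot be judged",
}
CREDIBILITY = {  # credibility of the specific report
    1: "Confirmed by other sources", 2: "Probably true", 3: "Possibly true",
    4: "Doubtful", 5: "Improbable", 6: "Truth cannot be judged",
}

@dataclass
class SourceRating:
    reliability: str  # A-F
    credibility: int  # 1-6

    def __post_init__(self):
        # Reject ratings outside the two defined scales.
        if self.reliability not in RELIABILITY or self.credibility not in CREDIBILITY:
            raise ValueError("use A-F for reliability and 1-6 for credibility")

    def label(self) -> str:
        return (f"{self.reliability}{self.credibility}: "
                f"{RELIABILITY[self.reliability]} / {CREDIBILITY[self.credibility]}")

# A usually-reliable outlet reporting something not yet corroborated:
rating = SourceRating("B", 3)
```

The two-axis design is the point: recording "B3" instead of a single gut-feel score forces the analyst to say why a report is uncertain – the source, or the claim itself.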

Emphasize the importance of integrity of reporting and information over speed of reporting. 

While there are notable exceptions to this rule, in general we should rarely be focused on being the first to report breaking news to our leadership. There are many free and paid services that do this already – and some that do it really well. The role of the intelligence and analysis function is to make sure that the organization has the most organization-specific, relevant, high-integrity information and analysis to ensure high-quality decision-making. Some good resources for understanding the integrity of online sources and journalistic guidelines and standards can be found here, here, here and here. Teams can learn a lot by reviewing these materials – both about how they can better assess the integrity of sources, and about how to evaluate other organizations and the sources of information they provide.

Develop an understanding of disinformation. 

While Russian propaganda is receiving the most attention at present, it should be noted that disinformation campaigns are as old as espionage itself and have been utilized across the world. What has changed is how disinformation is disseminated and its exceptionally wide and quick propagation across the internet – which often makes it difficult to counter before it has become accepted as true. The EU, and increasingly, the US, have been especially affected by propaganda campaigns since sanctions were levied against Russia in 2014 over Ukraine. The EU has even created a task force responsible for educating the public about disinformation and highlighting disinformation in the press.

Exercise extreme caution in utilizing leaked materials of any kind. 

Leaked documents, such as those released by Wikileaks, Edward Snowden and other actors, can provide insight into how individuals and organizations communicate. Unfortunately, it is also exceptionally difficult to identify whether the information in these documents has been tampered with, is outright false or taken out of context. In our work, the most important thing we can do related to leaked materials is ensure that our personnel and organizations are not in danger as a result of information in purported leaked documents.

Understand the WHY of fake information. 

While much of the above-discussed disinformation is designed to sow confusion and divisive politics, there are many types of disinformation and fake information out there today, and they do not all serve the same purpose. For example, a large body of fake information is driven by the pursuit of web advertising revenue: fake news sites that publish alarmist headlines in an effort to get readers to click on the story. Many of these are so poorly written that their veracity is easily questioned – but not all. There are also fake news sites generally devoted to entertainment (also chasing advertising revenue, but overtly). The best known among these is The Onion, but lesser-known satire sites can get picked up by less savvy readers and quoted as fact. By learning the motives behind fake information, analysts can more easily separate fact from fiction and good sources from bad ones.

Stay on top of the changing information landscape

Finally, in an information environment that is changing by the minute, ensure that your team’s remit includes regular reviews of their sourcing choices and an assessment of their personal biases toward information. Keep them focused on utilizing high-quality information from credible government sources, think tanks, academia, and news sites with a record of integrity and accurate reporting. Ensure an understanding of best practices in the use of social media and ground-level sources. And insist on independent source verification of all reported information, including that which comes in from information vendors every day. After all, our analysis is only as good as the credibility of the information it is based upon.
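The independent-verification rule above can be made mechanical: don't treat a claim as reportable until it is carried by sources that do not share a common origin. A toy sketch with hypothetical report data; "origin" here stands in for whatever provenance your team can actually establish.

```python
def independently_verified(claim_reports, min_sources=2):
    """A claim counts as verified only when its reports trace back to at
    least `min_sources` distinct origins. Syndicated copies of one wire
    story share an origin, so they count once."""
    origins = {report["origin"] for report in claim_reports}
    return len(origins) >= min_sources

# Hypothetical reports on a single claim: two outlets republishing the
# same wire story are NOT independent of each other.
reports = [
    {"outlet": "Daily A", "origin": "wire-service-x"},
    {"outlet": "Daily B", "origin": "wire-service-x"},  # same wire copy
    {"outlet": "Daily C", "origin": "on-the-ground-stringer"},
]
verified = independently_verified(reports)
```

Counting origins rather than outlets is the whole trick: a botnet or a single planted article can manufacture dozens of apparently corroborating outlets that all collapse to one origin.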

Don’t wait until October to think about surprises


“The world is much more uncertain and volatile than it has ever been before. And that is because of some factors coming together now that have never come together before. And they amplify each other. You have a totally different world order and we struggle with that enormously.” – Paul Polman, CEO, Unilever, in interviews for Thinking the Unthinkable: A New Imperative for Leadership in the Digital Age

Intelligence professionals need to be fundamentally forward-looking in how we approach our roles as guardians of our organization’s resilience – especially as it relates to major external shocks. With potentially disruptive US elections approaching and a host of hot-button geopolitical issues on the table, the economy and global security environment are ripe for shocks. Setting aside the political risk created by the elections themselves, most of these issues would exist even without the US presidential elections as a factor. What makes them especially worthy of our attention right now is their potential to catalyze actors into provocative actions intended to influence outcomes, capitalize on US distraction with domestic politics, or intimidate allies and other foreign actors. All of these may increase political and security risk for multi-national organizations as foreign governments react to, or seek to hedge, their risk based on candidates’ foreign policy and trade platforms.
While it can be hard (and politically touchy) to train leadership attention on the strategic impacts of these hot-button issues, prepping decision-makers for possible shocks and discussing mitigation measures in the event of a major unanticipated event is a key reason our roles exist. This is an opportunity for risk intelligence teams to add real value and depth to their organization’s risk management capacity.
With that in mind, here are just a few of the issues we’re addressing with our clients as the fall approaches:
Political risk from US elections impacting US multi-nationals overseas
For the first time in decades, political scientists and financial markets anticipate substantial political risk associated with the upcoming US elections. For multi-national organizations, the risk associated with foreign government perceptions, potential sharp changes in US foreign policy, and foreign reactions to perceived risks to their interests is sizable. From the impact on defense and trade agreements, to increasingly punitive regulatory environments, to rising anti-American sentiment, it’s important to consider which way the winds are blowing and begin thinking about ways to mitigate these risks in this increasingly unpredictable election year.
Simmering geopolitical challenges
Last week China announced it would deploy its first nuclear submarine, days after the US announced it was lifting a decades-old arms embargo on Vietnam. In a climate of continual small escalations in the South China Sea, this was not as alarming as perhaps it should have been. Meanwhile, North Korea continues its bellicose rhetoric and aggressive weapons technology development. Russia’s unpredictable foreign policy may also worsen in response to the US missile shield in Europe or in the likely event that sanctions are extended in July. Further agitations – designed to provoke a response from presidential candidates, shift the general tenor of the election, or capitalize on an anticipated muted response from a distracted US political establishment – should be expected on at least one of these fronts. While these are high-level geopolitical issues, all of them have real-world impacts on the security, political and regulatory environment for US companies operating abroad.
Global economic shocks
The 2008 financial crisis completely changed the trajectory of the political conversation ahead of the 2008 election and shifted the fortunes of millions of people around the world. As the US elections approach, increasingly erratic US and overseas markets may become the norm, depending on who is polling ahead in the race, whose voice is heard the loudest and how that bodes for economic growth at home and abroad. With respect to Chinese markets, no matter how we dice it, an enormous share of the global economy – and, by extension, global risk – is wrapped up in a market that is anything but transparent. Election season could exacerbate this risk, or a market collapse could impact the outcome of the elections. While the question of how a collapse in the Chinese market would reverberate is an overwhelming one, we need to seriously consider US business resilience to such an event.
Terrorism, extremism, and technology enabled catastrophic events
As has occurred in the past, interest in pulling off major coordinated attacks ahead of elections is likely to rise within extremist organizations, possibly with intent to influence the outcome. While the nature of terror attacks makes them hard to plan for, it is worth the time to assess organizational readiness and resilience to catastrophic attacks in the coming months. Particular attention should be paid to large-scale events – especially technology-enabled events – that would disrupt critical infrastructure such as electricity, water, food supply, or major logistical hubs. A chaotic election year offers a potentially irresistible stage for those who wish to do catastrophic damage and disrupt major economies.
It’s not what you know, it’s how you analyze it
Our approach to these problems – because they so dramatically impact the analytic outcome – is as important as analyzing the issues themselves. Here are a few thoughts on ways to approach what may be a very touchy political topic within your organization.
(1) Choose the most relevant issues
A critical examination of strategic issues that may impact your organization shouldn’t be an individual scattershot exercise, nor should it be based on gut instinct. Take a systematic approach to the risks most relevant to your short- and medium-term business model, together with other company professionals in complementary roles. It may be helpful to draw up a list of company-relevant issues and actors that are sensitive to major changes in the US electoral environment – including countries that have already weighed in on the US elections, extremist actors looking for a high-profile stage or with a definable interest in seeing one candidate win over the other, and actors with sizable economic interests at stake.
(2) Introduce issues gradually
Rather than writing a paper on potential shocks with no previous introduction to the issue, start sensitizing your audience to potential strategic issues gradually through other mediums first. A daily or weekly product is a great place to begin introducing issues of concern and briefly highlighting potential risks for the organization. Another method is one-on-one discussion with others who may have an overlapping area of responsibility and a different point of view on the subject. Depending on your organization’s receptiveness, you could also conduct an internal poll on how people in your company view the risk associated with these issues – gauging both awareness and internal concern.
(3) Consider your bias
Consider the organization’s and your own biases about potential outcomes and look for ways to correct for them. On election-related issues, our biggest blind spot may boil down to an inability to be objective about the candidate field. Beyond personal feelings about the candidates, US-based risk intelligence analysts are often not used to looking at political and geopolitical risk associated with US politics. As a result, our bias may be higher and our analysis more inclined to downplay potential outcomes than it might be if we were looking at a similar issue in another country. The first step is to be aware of this bias and willing to challenge your own assumptions. Then there are many ways to help reduce bias, including (shameless plug alert!) some that we will be working through at our July mini-training.
(4) Make sure you’re asking the right question 
Too often companies approach geopolitical issues with only the immediate impact in mind, and if they don’t discern a direct impact, they move on. But serious business continuity disruptions and consequences may not be apparent from the simple question: how does this impact us? In Asia, for example, secondary and tertiary supply chain risks are acute, with so many companies’ supply chains inextricably tied to the region. So what is the right question? Is it: how would a Chinese economic crash impact our company? Or is it something else, like: how resilient is our supply chain to a major economic crash in China and beyond? Or: how many single-source suppliers do we have in this region, and what major shocks could impact our business resilience here?
Keep your mind in a strategic place
Businesses are so busy doing their business that, without leaders who are intentional about examining strategic risks, many of these issues go unnoticed until mitigation is no longer an option. Thought leadership on strategic shocks is a perfect role for risk intelligence professionals. Regardless of the other curveballs this election season throws us, it will pay dividends to get ahead of the issues. Now is the time: our ability to add value through intelligence and analysis will decline as the elections near and the opportunity for building resilience and mitigation strategy disappears altogether.

Sign up for our Newsletter

Sign up for our newsletter to receive our monthly intelligence briefing, infographics, and updates on the latest events. (And we promise we don’t do spam).