Here's a transcript of the speech I gave at the Abuja launch for our Chatham House report, Collective Action on Corruption in Nigeria on 17/05/17. For the transcripts of other speeches given at the same event by British High Commissioner H.E. Paul Arkwright CMG and my co-author Leena Koni Hoffman, please click here.
So the report is called ‘Collective Action on Corruption in Nigeria: A Social Norms Approach to Connecting Society and Institutions’. I just want to start off by talking about what we mean by the social norms approach.
I think the key question motivating the report – and I think it’s one that policymakers, including all of you, really care about – is why do corrupt practices persist, and how do we change them? And the structure of this question is quite like other questions that face policymakers in their daily work – why do pernicious practices (like child marriage, open defecation, domestic violence, and so on) in general persist, and how do we change them?
What’s integral to our approach is that these two questions are intimately linked. We need to know why a particular practice persists in order to understand the kind of policy intervention we ought to design to induce behavioural change. And what the social norms approach emphasizes – and I’m going to lay my cards on the table here – is the set of social expectations that underlie behaviour. So we need to figure out what those expectations are – and it’s important to remember that they might be social norms, but they might not; more on that later – before we can devise policy solutions.
Another important aspect of our approach is that we frame corruption as a set of interactions amongst people, amongst individuals – that is, real-life, flesh-and-blood human beings. In other words, we try to understand it as a social phenomenon driven by a set of beliefs. We think corruption is stable because the beliefs that support it are self-reinforcing and resistant to change.
It’s worth mentioning at this point, I think, that we take a neutral social scientific approach to this work. We’re not interested in moralizing, or playing the blame game, or disparaging any particular sectors or institutions found in Nigerian society. We’re not interested in talking about metaphorical beasts of corruption either.
And I’ll tell you why we’re not interested in playing the blame game. Corruption is so frustrating precisely because it represents a situation where everyone (or almost everyone) realizes that corruption is a problem. And everyone (or almost everyone) also has a preference to live in a society that is free of corruption, or at the very least has comparatively less corrupt behaviour than the status quo. I want to suggest that even the front-line bureaucrat who is asking for a bribe in whatever context would probably prefer to live in a high-honesty, low-corruption society than a low-honesty, high-corruption one. And that’s what makes this whole thing so frustrating: if it’s true that everyone agrees there is a problem, and everyone has a preference to change, why hasn’t it happened yet?
Well, our approach sheds some light on that problem. Corruption is a stable phenomenon whereby everybody has an incentive to continue to engage in it because of a set of interdependent beliefs (even if some of those beliefs are false, which is what our evidence suggests). Social expectations of this sort are hard to shift precisely because each person’s behaviour is conditioned on what everyone else believes and does.
But I do want to push a kind of cautious optimism here, because I want to stress the idea that just because social expectations are resistant to change doesn’t mean they can’t change. Corruption, on our view, isn’t driven by evil actors or unchangeable conditions, but by social expectations. And you can change social expectations. In fact, we give you some tools to do just that in the policy recommendations section of our report.
So the takeaway of the social norms approach is to find out what types of beliefs drive a practice, and then to devise a policy response to attempt to change them. But you might ask: how exactly do you find out what types of beliefs underlie the behaviour you’re trying to change?
Well, you go out into the world and try to find them, try to measure them. That’s what we did – we did a specialized social norms survey. We worked with around seven Nigerian university departments and organizations, including the National Bureau of Statistics, and carried out around 4,000 surveys in six states and the FCT. We also did interviews around the country.
We didn’t use the word ‘corruption’ in the survey because we didn’t want to prime the respondents into giving us responses that they thought we wanted to hear. We wanted to get at their true beliefs so we tried to be as morally neutral as possible in the framing of the questions. We also asked them about their factual beliefs, moral beliefs, and legal knowledge surrounding the practice in question. Here are some pictures to partially prove to you that we really did go out there and do these surveys.
I think it’s important that we be clear about what exactly we mean by ‘social expectations.’ In the study, we use Cristina Bicchieri’s definition of social norms, which are made up of two different kinds of expectations. These are: (1) an empirical expectation, which is basically an expectation about what other people do in a given situation; and (2) a normative expectation, which is basically an expectation about what other people expect you to do in a given situation. The thought is that if you hold these two expectations about a given behaviour, you will conform to that behaviour – or, in other words, you will conform to that norm. And if you don’t conform to that norm, you will be sanctioned in some way (sanctions can be innocuous, like idle gossip, or they can be extreme and violent).
Consider an example that I was told about just the other day. Apparently there’s a social norm in Nigeria about handing things to people with your right hand. If you give something to someone with your left hand, then you may get sanctioned – that is, the person might say, ‘hey, don’t give that to me with your left hand. Give it to me with your right hand.’ I probably wouldn’t violate this norm because I’m right-handed, but say I did – say I gave something to someone with my left hand and they sanctioned me in some way. They told me not to do that again. That might put the thought in my head that other people expect me to give them things with my right hand and not my left hand – that would be a normative expectation in the sense described above. And if I observe other people doing that, namely, I see that other people also consistently give things to people with their right hand, then that might give me the empirical expectation. And I will probably start giving things to people with my right hand. All you left-handed people have probably learnt this by now.
Another example is tipping in the United States. In the US, you’re expected to tip the waiter at least 15–20% after a meal. And if you don’t, then you might get sanctioned. If I were with a close friend who did not tip after a meal, I might say something to them like ‘hey, you should really leave a tip.’ Or if I was with an acquaintance, maybe I’ll gossip about them behind their back to my other close friends like, ‘I was at dinner with this cheapskate the other day.’ Maybe I wouldn’t. But you know who definitely would? The waiter. In fact, if you did that consistently, I bet the staff at that restaurant would know who the cheapskate was. They’d say, ‘here comes the cheapskate.’ Or ‘here comes the European.’ Or ‘here comes the English guy.’ Because you might be a cheapskate, or just ignorant of the norm (like Europeans, like an English person), or you might be all three.
I also want to stress one last distinction before moving onto the findings. So the kinds of behaviour I just described are driven by what we might call interdependent beliefs – that is, they are beliefs that are dependent on what other people think and do.
But it’s really important, from a policy perspective, to figure out if what’s driving a particular behaviour is independent or interdependent. So you might have beliefs that drive behaviour that aren’t dependent on what other people think or do. For example, using an umbrella when it’s raining outside is like this, as is brushing your teeth. I don’t care if you all use an umbrella or don’t when it rains outside – I will use my umbrella because I don’t want to get wet. And I don’t care if you brush your teeth or not, I value having healthy gums and a clean mouth, so I will brush my teeth. Well I might care if you don’t brush your teeth and you talk too close to me, but even so, that won’t govern my choices regarding the brushing of my own teeth!
Moral rules and moral convictions also work like this – if you have a strong moral conviction about not killing people or about not eating meat, then you shouldn’t care about whether other people kill others or eat meat. You will not kill people or eat meat because of a personal moral conviction.
This matters because whether a behaviour is independent or interdependent will govern the kind of policy response required to change it.
What our findings suggest is that there are social norms governing the solicitation of bribes amongst law enforcement officers in Nigeria, but there are empirical expectations governing the giving of bribes. So law enforcement officers might ask for bribes because they have pressures from within their relevant reference networks to do so, but people generally give bribes because they see other people doing it, or they expect other people to do it, or because it’s just a way to get out of administrative hurdles.
On to our findings. So, as I just mentioned, social norms of corruption seem to be limited to specific contexts and sectors in Nigeria, such as law enforcement.
Second, if the environment or options are changed, behaviour will change. So people give bribes in many instances because it might just be an efficient way to circumvent inefficient rules and administrative hurdles. It’s quicker and cheaper to give a bribe than it is to go through official processes, which take a while. Also, bribes are usually smaller than the official fines, making it cheaper to give a bribe.
We also found that collective action is impeded because in some places people systematically make mistakes about what other people think.
To return to the point about social norms amongst law enforcement officers, our research suggests that there seem to be both upward and downward pressures to engage in corruption amongst law enforcement. So senior law enforcement officers expect lower-ranking officers to solicit bribes from the public, and lower-ranking officers expect senior officers to do so.
What’s interesting here is that there is moralistic and value-laden language surrounding non-compliance, which indicates a social norm at work – so you have this odd situation where typical moral judgments seem like they’ve been flipped upside down. Those who do what is in fact the morally correct thing – standing by their moral convictions and the law by not engaging in bribery – are the ones who are called evil and wicked, while those who do the morally questionable thing, like asking for bribes, are not. Being called these things, and the loss of privilege and status that these names indicate, by people in your reference network creates strong social pressures to conform to a norm.
We also found what seems to be a case of people being systematically mistaken about other people’s beliefs – indeed, other people who are in their community. For example, in Enugu, around 9 out of 10 people said that it was wrong and illegal for a police officer to ask for a direct payment for a traffic violation instead of going through the official process, but they also thought that 5 out of 10 of their fellow citizens thought that the officer should ask for a bribe. So you have a situation where people are systematically making mistakes about the beliefs of their fellow citizens in a way that makes collective action hard. This is because it’s exactly these kinds of false beliefs that give rise to the fatalism and inevitability surrounding corruption, that make it seem like such a hard problem to overcome. Of course you would think it impossible to overcome corruption if you thought half of the people in your community think that law enforcement officers should ask for bribes while you personally think, at the same time, that it is wrong and illegal for law enforcement officers to do so. These false beliefs need to be dispelled and the illusion lifted to even begin to facilitate anti-corruption collective action.
The second behaviour we looked at was regarding a government health facility employee asking for a payment for a hospital bed – a bed that you should legally be entitled to for free.
On the diagram, you can see the shorter, lighter blue bar in the middle of the other two. This bar represents responses to the question: ‘Do you think it is wrong for a health facility employee to ask for payments for a hospital bed?’ I think Adamawa came in at the highest for a yes, but that’s still relatively low, at just over 20%. But then you have the other two bars – do you think a government health facility employee should ask for a payment? – and a high percentage of respondents across the board said yes.
What’s really interesting here is that when asked whether it was illegal for a government health facility employee to ask for a payment for a hospital bed, lots of respondents thought it was illegal. That’s the yellowish bar. So here you have a situation where most respondents didn’t think it was wrong – in fact, they thought the employee should ask for a payment – but nevertheless they thought or knew it was illegal.
So what’s going on here? Despite the fact that many respondents thought that asking for a payment for a bed was illegal, they found it less objectionable than bribery at the checkpoint. Presumably this is because they can see themselves as funding an underfunded service as opposed to being extorted at a traffic checkpoint. The experiences must feel different.
So people view this as making a private transfer for something that ought to be a public transfer. They’re funding a government institution and they can see where their money is going.
Even so, though, we should remember that asking for a payment for a hospital bed basically amounts to a regressive tax because it puts the burden on those who can bear it the least, that is, the poorest people. This is presumably because wealthier people can already afford to go to private hospitals where they pay for healthcare as it is.
So I want to end on that, and hand it over to my colleague Leena to go through the rest of the findings and the policy recommendations. Thank you.
The uneasiness of cyberspace
The recent Sony hacking scandal brought one important policy question to light: To what extent should the US government be involved in the cybersecurity affairs of private citizens and businesses? Answering this question is difficult: the issues are highly complex, there are epistemic barriers to fully appreciating the risks that need to be managed, and it’s a platitude amongst cybersecurity professionals that cyberspace is notoriously murky. But getting the answer right is imperative because of what’s at stake – privacy, the scope of government, and even national security might hang in the balance.
The task of a good cybersecurity policy is to help us navigate through these complexities and mitigate risk. In my mind, there are at least four reasons why the government will become increasingly involved in the cybersecurity affairs of private citizens and businesses:
(1) Hackers from so-called closed societies are sanctioned (either implicitly or explicitly) by their respective states: In closed-information societies, such as North Korea and China, hackers may operate with the blessing of the government. Implicit backing from a government ‘allows’ hacker activity to go unchecked. For example, many hackers in North Korea or China must operate at least with the tacit knowledge of the government, since the Internet is so closely monitored in those countries. My suggestion here is that the ‘permission’ to allow hackers to operate in an otherwise controlled environment constitutes an implicit endorsement of hacker activity. Of course, this claim rests on the assumption that these governments have knowledge of the hacker activity; surreptitious hacker activity in controlled information environments does not have the implicit endorsement of those who are controlling the environment.
This implicit endorsement can be contrasted with the explicit endorsement of a government. In this case, a government maintains formal ties with hackers—for example, the People’s Liberation Army (PLA) Unit 61398 is China’s ‘cyber unit’, apparently responsible for numerous attacks on US-based public and private targets. These hackers could be part of the formal apparatus of the government (such as PLA Unit 61398) or be an independent hacking group that receives resources from a government.
That’s not to say that this kind of implicit acknowledgment but formal distance cannot characterize the relationship between the state and hackers in more open societies as well. Nevertheless, there does, prima facie, seem to be a salient distinction to be made between tacit acknowledgment in societies that straightforwardly monitor and censor the Internet, and tacit acknowledgement in societies that are comparatively ‘open’ in the relevant respect.
The upshot is this: What may look like lone hackers may not be lone hackers, particularly if they are operating in a controlled-information environment. So, for example, what may look like criminal cyber attacks for financial gain might instead be a form of sophisticated economic espionage, or worse.
(2) Information and resource asymmetries: This point is related to (1). If the pertinent cyber hacking groups are getting help from states as large as China, or states willing to spend as much money on their military as North Korea, then private cybersecurity resources will probably not be enough.
Asymmetries of information and resources are particularly pronounced in cyberspace. Indeed, even small businesses are worthy targets for hackers. For example, consider the fictitious Mr. and Mrs. Kim, small business owners who don’t know a thing about cybersecurity, but whose computer at the front desk of their small independent motel holds thousands of customers’ credit card numbers and personally identifiable information (PII). Or say Mr. and Mrs. Kim have a franchised (but still small) hotel – a Best Western, for example. They still don’t know anything about cybersecurity, but their front desk is connected to Best Western International’s central information hub. This might give hackers access to millions of credit cards and terabytes of PII. The point is this: it’s unfair to expect Mr. and Mrs. Kim to develop cybersecurity practices that will keep them safe from hackers with the backing of the Chinese or North Korean governments. Similarly, it might also be unfair to expect larger businesses to develop cybersecurity measures that will protect against, say, PLA Unit 61398.
But we might say that it’s not unfair to expect large multinational corporations to take drastic cybersecurity measures, as they have resources comparable to or larger than countries like North Korea. For example, WalMart’s revenue in 2013 was almost 40 times North Korea’s GDP. But asymmetry issues remain: North Korea spends a significant amount of resources on its military, and presumably quite a bit of that money goes towards gaining information superiority in cyberspace. From defector accounts, for example, North Korea has a specific program to train ‘home-grown’ cyber-warriors. WalMart, however, does not have strong enough incentives to pour resources into maintaining a WalMart Liberation Army Unit 61398.
It’s important to stress that the information asymmetry stems from an incentive asymmetry. North Korea, China, and indeed all states have strong incentives to put resources into gaining information superiority because the relative gains far outweigh the costs. An excellent military cyber unit, for example, can not only obtain sensitive and classified national security information, but it can also cause physical damage to ‘smart’ systems or systems otherwise reliant on network infrastructure (see (4) below). Hackers can also cause significant psychological and economic damage. For example, convincing New Yorkers that there is a nuclear bomb in Manhattan (say, by controlling information flows to the city) would shut the city down and in the process economically paralyze the Northeastern United States. These kinds of attacks can be executed from a computer at no threat to personnel, unlike traditional warfare.
(3) Incentives for secrecy: Private actors have a strong incentive to cover up cyber attacks on their systems. This is because consumers place trust in the cybersecurity infrastructure that lies behind much of their face-to-face activity with businesses. When you deposit money into a bank, you expect that nobody can simply hack into your account and take your money – importantly, you trust that the bank will ensure that nobody is able to do that.
You place a similar trust that nobody can access your information when you rent a room from Mrs. Kim, or when you buy something from Target.
But what incentive would the bank, Mrs. Kim, or Target have to tell you that your credit card information was, say, part of a large package of information taken from their systems? None (notwithstanding legal compliance). In fact, their incentives run the other way: If the bank lets it be known that someone has hacked its systems and has access to its accounts, it risks a bank run. There are similarly significant costs for Mrs. Kim and large retail stores as well – a loss of trust means a loss of business.
These incentives might run directly against national security interests. It’s important to recognize that what may look like discrete cybersecurity incidents might be part of a broad and sophisticated attack. Obtaining information about cyber attacks is not only prudent for privacy and financial reasons, then, but also for gathering intelligence and developing a robust cybersecurity posture. My view is that a mature cybersecurity posture is not only about ‘keeping out’ who we want to keep out or protecting information, but also about gathering information as well – what are the hackers looking for? Why would they possibly be looking for this or that information set? Is there a discernible pattern to their activities?
(4) Networked infrastructure: Industrialized societies are embedded with networked infrastructure, which means that industrialized societies are embedded with cyber risk. Sometimes industry doesn’t follow best practices, as in the case of the German steel mill that didn’t keep an ‘air gap’ between its networks and the public Internet. A cyber attack in 2014 caused physical damage to the steel mill, and even shut down one of its blast furnaces. But cyber attacks can cause damage even when there is an air gap between the public Internet and closed networks. Consider the infamous Stuxnet – a computer worm – that was introduced to the closed environment of Iran’s nuclear industrial control systems through an infected USB drive. Stuxnet shut down almost a fifth of Iran’s nuclear centrifuges before it was discovered.
These worries are only set to intensify with the growth of smart technologies and smart cities, as more critical infrastructure becomes part of the ‘Internet of Things’. Cyber risk to power grid networks, water delivery systems, and transportation increases with the increased connectivity in smart cities. Moreover, various third party contractors (with perhaps varying cybersecurity postures) are typically involved in the running of this critical public infrastructure, increasing overall vulnerability.
These are but some of the issues we ought to keep in mind as we decide on an appropriate cybersecurity policy for a changing information environment. As cybersecurity professionals, decision-makers, and policy analysts know, cyberspace is an especially messy area when it comes to discerning the proper role of government. But even amidst all of this messiness, one thing is clear: we have a formidable task ahead of us in navigating these murky waters.