The Future of Governance: A Choice and a Question

Governance structures are under increasing stress in a world that is rapidly evolving toward greater complexity, and they are compelled to adapt. In parallel, social media has become a key new arena for influence. The trend is irreversible, and governance structures will need to adapt to it as well. Yet online social platforms are ill-equipped to host healthy debate and need to be redesigned with governance in mind. From there, we see two trends emerging: first, a centralized algorithmic governance with social media as an instrument of control; second, a European declaration of intent that emphasizes data ownership and sovereignty. In the wake of European ideals, we open a discussion on a distributed human governance that aligns with this intention. However, this theoretical model challenges current governance structures. More than a technical implementation, it is also a political choice and a philosophical question. Whatever the outcome, it is our belief that the future of governance lies in human-computer interaction.

1. Facing a new threshold of complexity

Throughout the modern industrial era, social systems have been structured by top-down command-and-control mechanisms in which social trust could only solidify along vertical lines of hierarchical responsibility. With the Internet, however, came pervasive connectivity and a web of intricate relational interdependencies that challenge these traditional social dynamics. This transformation in structural arrangement raises the question of the viability of a new social architecture in which relationships could be organized around a positive dimension of horizontal trust and accountability. Both from a structural and a societal point of view, it is our suggestion that governance structures will have to reinvent themselves thoroughly if they are to navigate successfully this paradigm shift, which is starting to weigh heavily on them.

Undoubtedly, the surge in connectedness that came with the Internet propelled our societies across a new threshold of complexity. In addition, the emergence and combination of disruptive technologies exponentially increase the field of uncertainty. As a result, governance structures now have to deal with a wide array of challenges - both local and global - in a world where change is the only constant and unpredictability the only certainty. In order to navigate effectively the waves of change that are forming, social organizations in general and governance structures in particular will need to express greater degrees of agility. Most likely, they will be forced to join transversal participatory networks and collaborate with stakeholders who escape their traditional chains of command and control. On the one hand, new societal models would need to include an improved capacity for resilience; on the other hand, they would need to exercise trust horizontally as a fundamental element of social relationships. Given that centralized social organizations have an upper limit on what they can structurally handle, it is time for them to be fundamentally redesigned for organic adaptability and individual reflexivity.

Moreover, as society transitions from the modern to the postmodern era, new values such as cultural diversity, gender equality, and improved transparency and accountability emerge and have to be taken into consideration by its leadership. Indeed, this contemporary set of values exerts additional pressure on governance structures that were accustomed to working behind closed doors. Doing business as usual is thus becoming increasingly hazardous in terms of social perception. In fact, ‘social justice’ movements use the social web to disrupt organizations from within by leaking confidential documents and practices. They are also able to challenge their stability and security from the outside by triggering leaderless protests that go viral on networks (e.g. the French ‘yellow vests’).

Unfit to cope with the multiplicity of desires and intentions now being expressed on social media, nested governance structures find it increasingly difficult to translate their leadership into a society which - in turn - does not feel adequately taken into account (e.g. Extinction Rebellion). As a result, we witness a global erosion of social trust, and we may further expect a persistent trust gap between citizens and their institutions [1]. The centralized architecture, or the ‘one-to-many’ model, that used to be effective in previous times is now reaching its structural limits. Democratic institutions have to factor in these new variables and take appropriate action if they want to stay relevant in the times to come. That is to say, systemic challenges require systemic solutions.

2. Social media, the new arena for influence

With the steady transfer of political discourse and influence onto social media, the world of governance and leadership has changed - forever. Spearheaded by political figures and social media enthusiasts, leadership can no longer separate itself from online debates. Besides, online social platforms have already become the primary public sphere of interest for digital natives (i.e. Gen Z). We believe that it is time to move with that curve, as social media will likely overdetermine who gains influence in the political arena from now on. Online social platforms are already helping governance structures reconnect with the social fabric as an interface for innovative participatory governance models. As a matter of fact, social media sees itself as ‘the critical infrastructure for modern-day democracy’ [2].

However, social media is plagued by questionable data privacy practices, hate speech, election meddling, misinformation, and filter bubbles, which provide little help for users to develop trust and confidence among themselves. Without trusted information and the means to decide whom to trust, viable thresholds of cooperation are barely reached and hardly maintained. While social technologies have enabled hyper-social interaction with great connectivity, discoverability, and outreach, accessing reliable information has become - between fake news and fake friends - tremendously challenging. Certainly, with data corruption and with rankings and ratings based on prediction algorithms that create feedback loops such as reputation inflation, second-hand information degrades to the point that it becomes increasingly difficult to separate the wheat from the chaff and the signal from the noise. As a result, though Internet users are well connected to share information, they are not well equipped to inform meaningful decision making.
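To make the kind of feedback loop we have in mind concrete, the following toy simulation - our own minimal sketch, not any platform’s actual ranking code - shows how exposure-weighted ratings can inflate the reputation of early winners largely independently of underlying quality:

import random

random.seed(0)
quality = {f"post-{i}": random.random() for i in range(5)}  # intrinsic quality, hidden from the ranking
score = {post: 1.0 for post in quality}                     # visible reputation score

for _ in range(1000):
    # Exposure is proportional to current score: items already ranked high get shown more often.
    shown = random.choices(list(score), weights=list(score.values()))[0]
    if random.random() < quality[shown]:
        score[shown] += 1  # a positive rating compounds the advantage of whoever got shown first

print(sorted(score.items(), key=lambda kv: -kv[1]))

Because exposure feeds back into score, the final ranking reflects early luck as much as quality - which is precisely why second-hand signals of this kind are hard to trust.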

We argue that for social media to function as a legitimate environment for participatory governance, it will need to reinvent itself too. By all means, social media is inherently driven by algorithms that relentlessly pursue the growth of a business model based on user data. This ‘growth hacking’ conflicts with ethical principles for the management of user data, since it incentivizes people to connect, post, and share ever more at the expense of privacy, trust, and meaning. While geographical location is no longer a strong barrier to interaction, trust - or the lack thereof - remains an issue. As a result, the ‘many-to-many’ platforms increasingly generate social distrust as they aimlessly connect everyone to everything with pervasive and persuasive technologies that influence individuals’ actions.

Unsurprisingly, we witness the rise of legitimate concerns from all spheres of society, best summarized by the following assessment: ‘Democracy has a Facebook problem’ [3]. Under a new and more constraining regulatory framework (i.e. the GDPR), this model is now seriously challenged by institutional powers [4]. From here onward - and with the aim of maintaining social stability amid growing complexity - we envision the two most probable scenarios for the future of governance. Both scenarios lie in the realm of human-computer interaction and envision social technologies as the main interface for decision making. The question is which of the two - a centralized or a distributed approach to governance - will overrule the other. Though we advocate for the latter, we aim to outline some of the benefits and challenges constitutive of these two approaches, which are poised to mingle nonetheless.

Algorithmic governance can be approached in either a distributed or a centralized fashion. Popularized by Bitcoin, Distributed Ledger Technology (DLT) - popularly known as the ‘blockchain’ - allows consensus to be reached on the basis of a cryptographic proof instead of trust in a third party or central authority [5]. Yet, when the ‘rule of code’ fails, algorithmic or ‘on-chain’ governance falls short. For instance, in the infamous case of the DAO hack, humans were required to step in to regain control of the system [6]. From these early attempts, it is our opinion that on-chain governance is rigid and therefore has intrinsic, hardwired limits. Being rigid, algorithmic governance can naturally be applied to centralized systems: “code is law is a form of regulation whereby technology is used to enforce existing rules [and where] code is used by platforms and by the public sector as a regulatory mechanism with the property of automating the law and enforcing rules and regulations a priori through smart contracts with a guarantee of execution” (De Filippi, 2017).
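As a minimal sketch of what ‘consensus based on a cryptographic proof’ means in practice - assuming a toy proof-of-work scheme rather than Bitcoin’s actual implementation - the following Python fragment shows a proof that is costly to produce but that any peer can verify without trusting a central authority:

import hashlib

def proof_is_valid(block_header: bytes, nonce: int, difficulty: int) -> bool:
    # A block is accepted when its hash meets the difficulty target,
    # so any participant can check it independently of any authority.
    digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).hexdigest()
    return digest.startswith("0" * difficulty)

def mine(block_header: bytes, difficulty: int) -> int:
    # Searching for a valid nonce is expensive; verifying one is cheap.
    nonce = 0
    while not proof_is_valid(block_header, nonce, difficulty):
        nonce += 1
    return nonce

nonce = mine(b"block-42|previous-hash|tx-root", difficulty=4)
print(nonce, proof_is_valid(b"block-42|previous-hash|tx-root", nonce, 4))

Peers then converge on the chain carrying the most accumulated proof, which is what removes the need for a trusted intermediary - as long as the ‘rule of code’ holds.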

In contrast to traditional legal rules, regulation by code is enforced ex ante (e.g. the No Fly List), so that there is less need for ex post enforcement and judicial arbitration [7]. The counterpart is that - without human intervention - algorithmic governance may outline the contours of a dystopian future run by Artificial General Intelligence (AGI) and challenged by Generative Adversarial Networks (GAN). This is of particular concern since algorithms are never neutral and are therefore inherently political [8]. Moreover, their design depends on engineers and platform operators. As such, they escape public scrutiny and any sort of democratic oversight. Also, as immutable, incorruptible, self-executing, and self-enforcing algorithmic rules are the very foundation of smart contracts, it might become difficult to make course corrections in complex situations that are constantly evolving. Machine learning can address some of these issues by adding some level of adaptivity. Yet machine learning has already proved to be implicitly biased against minorities, undermining the principles of universality and non-discrimination [9].
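To make the ex ante / ex post distinction concrete, here is a deliberately simplified sketch - hypothetical names, not any real system - in which the rule is checked by code before the action can occur, leaving nothing to be arbitrated afterwards:

# Hypothetical watch list, for illustration only.
BLOCKED_PASSENGERS = {"passenger-123"}

def issue_boarding_pass(passenger_id: str) -> str:
    # Enforcement happens a priori: the code refuses the action outright,
    # so no ex post judgment or discretionary appeal takes place at this step.
    if passenger_id in BLOCKED_PASSENGERS:
        raise PermissionError(f"{passenger_id} is denied boarding by rule")
    return f"boarding-pass-for-{passenger_id}"

print(issue_boarding_pass("passenger-456"))

The same logic, inscribed in a smart contract, executes with a guarantee and cannot easily be amended once deployed - which is precisely the rigidity discussed above.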

Like distributed algorithmic decision making, centralized algorithmic decision making also has serious practical limitations and ethical drawbacks. A human-computer interaction designed to “enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs” [10] could theoretically better support a robust distributed participatory governance model. To do so, we would need to come up with a new business model for online social platforms, and algorithmic decision making would need to be complemented by an ‘off-chain’ human governance that could operate at scale by being modelled on distributed principles [11]. Improvements in off-chain governance are recognized as an important part of the future of blockchain governance [12]. By definition, off-chain governance includes human interactions, and human interactions require social trust to reach collective decisions. Therefore, the ‘trustless’ protocols of crypto-governance cannot avoid the social trust involved in decision making beyond code.

3. A distributed model for human governance

A distributed human governance (DHG) model is a bold and somewhat exotic proposal positing that zones of influence are emergent within given sets of relationships. DHG is not meant to replace existing governance structures but to complement them as an additional layer of human interactions. Though within that framework decision making can be formalized by [smart] contracts, the default modus operandi is the free association of people who express their preferential attachment by ‘voting with their feet’. As such, they remain free to engage in and disengage from any social interaction at any time, for any reason - thus affecting the zone of influence one may have. This model represents a shift from contractual obligation to consensual interaction, from monolithic to granular governance, and from the idea of a dissonant leadership that is able to impose its will regardless of the consent of the participating parties to the idea of a resonant leadership that is required to generate social recognition in order to cast its influence. It transitions from rigidity to agility and from static hierarchies to dynamic ‘heterarchies’. As such, the model intends to transcend the dichotomies between centralized and distributed modes of operation and between on-chain and off-chain governance. Indeed, it is less about ‘this or that’ than about ‘this and that’. Inspired by the science of complexity, the proposed model refers to the concept of ‘chaordic organizations’ first devised and applied by Dee Hock, founder and chairman of Visa International [13].

Human society is inherently complex, i.e. composed of a large number of interacting individuals. The science of complexity studies such ‘complex adaptive systems’, typically in the framework of ecosystems [14]. The interest in ecosystems stems from their inherent adaptability, which allows them to reach an optimal state within a ‘window of vitality’ as a balance between efficiency and resilience [15]. Learning from their dynamics, we aim to investigate how complexity theory could help bring an emergent coherent structure to human society outside the chains of command and control. Yet, though human society is a complex system in and of itself, “[...] there is a seeming impossibility or at least an extreme difficulty in utilizing the logic of complex adaptive systems to aid in restructuring social systems capable of navigating the dynamics of the information age” [16]. We identify our main challenge as overcoming the ‘tragedy of the commons’, where some individuals benefit at the expense of others. At first sight, aligning personal and collective interests among equipotent individuals who may be pursuing different agendas seems to be wishful thinking. We propose to tackle the challenge with the notion of social trust. Here, we do not define trust as a security problem, which merely underlines a state of distrust, nor do we define it as a naive belief in the benevolence of others. We define trust as a root factor in complex interdependencies.

Indeed, trusting others is a fragile experience that can have dire consequences for those involved. This risk is of special significance in the context of social networks, where a breach of confidence coupled with the distributed memory of interactions across the digital social fabric can have long-lasting and devastating effects. Therefore, and generally speaking, trusting others prompts responsible behavior, and the risk is - by virtue and by necessity - only shared with a restricted set of peers. Since the peculiarity of trust is that it must be earned as well as given, reciprocal behaviors anchored in words and in deeds are paramount to building trusted relationships. Trusting others also implies some level of emotional attunement, as emotional dissonance acts as a strong social signal that a breach of confidence is somewhere in sight. Though trusting others is a fragile experience, it is the basis for strong social relationships. The model is therefore not ‘risk-averse’ but ‘risk-aware’.

With insights from complexity theory, we propose to develop a network architecture that would scale, from the local to the global, the trust, resonance, and reciprocity experienced by communities with the highest levels of engagement (e.g. community groups, startups, sports teams, music bands, churches, special forces). Taking the structure of biological systems as a cue, we can see that in order for a large number of cells to function as an organism - or for a large number of organisms to function as an ecosystem - the connections are not ‘many-to-many’ as we are currently connected via the Internet. Instead, they are ‘few-to-few’, in a fractal and heterarchical network that encompasses the entire organism without any absolute center or apex of authority. Whereas hierarchies involve relations of dependence and markets [and blockchain governance] involve relations of independence, heterarchies involve relations of interdependence [17].
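As an illustration of what a ‘few-to-few’ topology could look like - a minimal sketch under our own assumptions about cluster size and overlap, not a prescription - the following Python fragment connects people densely only within small overlapping clusters, with no global hub:

from itertools import combinations

def few_to_few(members, cluster_size=5, overlap=1):
    # Partition members into small clusters that share `overlap` bridging members,
    # so dense trust exists only inside each cluster while bridges weave them together.
    clusters, step = [], cluster_size - overlap
    for start in range(0, len(members) - overlap, step):
        clusters.append(members[start:start + cluster_size])
    edges = set()
    for cluster in clusters:
        edges.update(combinations(cluster, 2))  # dense links only inside each small cluster
    return clusters, edges

members = [f"person-{i}" for i in range(1, 14)]  # 13 hypothetical participants
clusters, edges = few_to_few(members)
print(len(edges), "trust links instead of", len(members) * (len(members) - 1) // 2)

With thirteen members, the sketch yields 30 trust relationships instead of the 78 a fully connected ‘many-to-many’ graph would require, while the shared ‘bridge’ members keep the clusters connected without any apex of authority.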

The relevance of using trust as the foundation for a distributed ‘social ecosystem’ is that, by sharing the risk of trusting others within small social clusters that overlap each other, peer pressure incentivizes participants to take the collective interest into account. Indeed, any relationship eroding trust would diminish the collective potential, and any relationship expressing resonance would lead to increased synergies, attractiveness, and influence at the collective level. The pursuit of self-interest at the micro or individual level is therefore balanced by the pursuit of collective fitness at the macro or group level. Trust - as an expression of interdependence applied to online social platforms - might thus play the role of ‘social glue’ and resolve the question of an emergent structure with a community-centric architecture. Beyond the vertical relationships of top-down and bottom-up, this distributed model would initiate a new relationship between the individual and the collective, and a shift from political influence anchored in nominal status to political influence rooted in social recognition.

4. A political choice and a philosophical question

Most assuredly, we are now at a crossroads where “acceptable ideas are competent no more and competent ideas are not yet acceptable” [18]. In a world of tremendous complexity and uncertainty, centralized solutions won’t work anymore unless they impose a hyper-rigid governance system that leaves little room for social dissent [19]. Wherever such a system is dismissed in favor of democratic principles, increased complexity will call for distributed solutions. Distributed solutions won’t work without some form of distributed human governance. Distributed human governance will, by design, challenge the nominal power of static hierarchies with the social power of dynamic heterarchies.

From a European perspective, “We find ourselves stuck between two dominant models: the monopolistic corporate-led internet of Silicon Valley and large-scale government surveillance systems of Beijing. Can we now come up with a third narrative, where citizens and communities are in control and can determine their own future?” [20]. A centralized algorithmic governance will be detrimental to democratic principles. Yet it will provide a cheap and effective way to enforce stability without challenging institutional powers. In contrast, a distributed human governance could be seen as an emergent and challenging power that could take down the old system - especially if framed in the context of encrypted networks and ‘self-sovereign identities’ [21]. Faced with the reality of societal transformation, static hierarchies might be reluctant to live up to their intentions. Still, riding the ‘edge of chaos’ is the difficult art that today’s leadership needs to master.

In the EU, ‘open government’ is seen as the way forward. The shift from a ‘government-centric’ to ‘citizen-centric’ and ‘community-centric’ models is prompted as governance structures enter multi-stakeholder knowledge networks. This trend undoubtedly paves the way for a participatory ‘open governance’. Such an open governance would put the functional value of traditional governance structures under threat, potentially raising the question of a new social contract. Nevertheless - and given that such distributed models would satisfy GDPR requirements [22] - they could address scenarios that require trust to be propagated horizontally within, across, and beyond social organizations. Being purpose-agnostic, robust, adaptive, and scale-independent, they could enable participatory governance at scale. As such, they would help to tackle exponential challenges with exponential participation. “[..] The real disruption taking place is not technology; it’s a trust shift that will open the doors to new and sometimes counterintuitive ways of designing systems that will change human behavior on a large scale” (Botsman 2016).

With 5G and the IoT on the doorstep, a new framework for interaction is emerging and a new threshold of complexity is upon us. Human-computer interaction is most assuredly the future of governance, and that future is being engineered today. The choice is between a rigid governance that does not require cognitive engagement and an agile governance that will prompt everyone to exercise their responsibility. The question is this: will we commit - as human beings - to govern ourselves, or will we let algorithms govern us?


5. References

[1] Edelman. (2019). 2019 Edelman Trust Barometer. [online] Available at: https://bit.ly/2TVprXB [Accessed 17 Mar. 2019].

[2] Price, R. (2019). Car-bomb fears and stolen prototypes: Inside Facebook's efforts to protect its 80,000 workers around the globe. [online] Business Insider. Available at: https://read.bi/2TRSg77 [Accessed 17 Mar. 2019].

[3] American congresswoman Alexandria Ocasio-Cortez on Twitter, Mar 12, 2019. Available at: https://bit.ly/2Hu03lK [Accessed 17 Mar. 2019].

[4] European Commission, Next Generation Internet Initiative. [online] Available at: https://bit.ly/2GXIBRT [Accessed 17 Mar. 2019].

[5] Nakamoto, S. (2008). Bitcoin: A Peer-to-Peer Electronic Cash System. [ebook] Available at: https://bitcoin.org/bitcoin.pdf [Accessed 17 Mar. 2019].

[6] Siegel, D. (2016). Understanding The DAO Attack. [online] CoinDesk. Available at: https://bit.ly/2Jb2OKC [Accessed 17 Mar. 2019].

[7] De Filippi, P., & Hassan, S. (2016). Blockchain technology as a regulatory technology: From code is law to law is code. First Monday, 21 (12).

[8] Hodson, H. (2019). DeepMind and Google: the battle to control artificial intelligence. [online] 1843. Available at: https://bit.ly/2HxnTw9 [Accessed 17 Mar. 2019].

[9] Hardt, M., Price, E. and Srebro, N. (2016). Equality of Opportunity in Supervised Learning. [online] Cornell University. Available at: https://arxiv.org/abs/1610.02413 [Accessed 17 Mar. 2019].

[10] Licklider, J. (1990). Man-computer symbiosis. [ebook] Palo Alto: Systems Research Center of Digital Equipment Corporation. Available at: http://memex.org/licklider.pdf [Accessed 17 Mar. 2019].

[11] Rosenberg, L. (2015). Artificial Swarm Intelligence, a Human-in-the-loop approach to A.I.. [ebook] San Francisco: UnanimousAI.com. Available at: https://bit.ly/2F38j8E [Accessed 17 Mar. 2019].

[12] Ehrsam, F. (2017). Blockchain Governance: Programming Our Future. [online] Available at: https://bit.ly/2zO45ST [Accessed 17 Mar. 2019].

[13] Hock, D. (1999). Birth of the chaordic age. San Francisco, CA: Berrett-Koehler.

[14] Cilliers, P. (2002). Complexity and Postmodernism: Understanding Complex Systems. [ebook] London and New York: Routledge. Available at: https://bit.ly/2JkY9px [Accessed 17 Mar. 2019].

[15] Goerner, S., Lietaer, B. and Ulanowicz, R. (2009). Quantifying economic sustainability: Implications for free-enterprise theory, policy and practice. Ecological Economics, 69, pp. 76-81. doi: 10.1016/j.ecolecon.2009.07.018. Available at: https://bit.ly/2UvQ6dN [Accessed 17 Mar. 2019].

[16] Last, C., Van Weyenbergh, G. and Werner, B. (2018). Transformative Social Ecosystem Dynamics: A psychological architecture of emotional trust. 1st ed. [ebook] Brussels: Meoh ASBL. Available at: https://bit.ly/2IOPhDb [Accessed 17 Mar. 2019].

[17] Wagner, R. (2018). Is heterarchy the answer to the crisis of hierarchy?. [online] Available at: https://bit.ly/2UdLg5u [Accessed 17 Mar. 2019].

[18] Gharajedaghi, J. (2006). Systems thinking: Managing Chaos and Complexity. Amsterdam [etc.]: Butterworth-Heinemann.

[19] En.wikipedia.org. (2019). Social Credit System. [online] Available at: https://bit.ly/2eHPomc [Accessed 17 Mar. 2019].

[20] Nesta. (2019). 30 years of the web: where do we go next?. [online] Available at: https://bit.ly/2CnScCo [Accessed 17 Mar. 2019].

[21] Sovrin. (2019). Sovrin Network – an open source project creating a global public utility for self-sovereign identity. [online] Available at: https://sovrin.org/ [Accessed 17 Mar. 2019].

[22] Blockchain and the GDPR. (2018). [ebook] Brussels: The European Union Blockchain Observatory and Forum. Available at: https://bit.ly/2CriGn0 [Accessed 17 Mar. 2019].