Governance structures are under increasing stress in a world that is rapidly evolving toward greater complexity, and they are compelled to adapt. In parallel, social media has become a key new arena for influence. The trend is irreversible, and governance structures will need to adapt to it as well. Yet online social platforms are ill-equipped to host healthy debates and need to be redesigned with governance in mind. From there, we see two trends emerging for the future of governance: first, a centralized, algorithm-assisted governance with social media as an instrument of control; second, a European declaration of intent that emphasizes data ownership and sovereignty. In the wake of these ideals, we open a discussion of distributed and agile human governance that aligns with this intention. However, this theoretical model challenges current governance structures. Whatever the outcome, it is our belief that the future of democratic governance lies in human-centered computer interaction.
1. Facing a new threshold of complexity
Throughout the modern industrial era, social systems have been structured by top-down command-and-control mechanisms in which social trust could only solidify along vertical hierarchies of responsibility. With the Internet, however, came pervasive connectivity and a web of intricate relational interdependencies that challenge these traditional social dynamics. This structural transformation raises the question of the viability of a new social architecture in which relationships could be organized around a positive dimension of horizontal trust and accountability. From both a structural and a societal point of view, we suggest that governance structures will have to reinvent themselves thoroughly if they are to navigate a paradigm shift that is already weighing heavily on them.
Undoubtedly, the surge in connectedness that came with the Internet propelled our societies across a new threshold of complexity. In addition, the emergence and combination of disruptive technologies exponentially increase the field of uncertainty. As a result, governance structures now have to deal with a wide array of challenges - both local and global - in a world where change is the only constant and unpredictability the only certainty. To navigate the forming waves of change effectively, social organizations in general, and governance structures in particular, will need to express greater degrees of agility. Most likely, they will be forced to join transversal participatory networks and collaborate with stakeholders who escape their traditional chains of command and control. On the one hand, new societal models would need to include an improved capacity for resilience; on the other, they would need to exercise trust horizontally as a fundamental element of social relationships. Given that centralized social organizations have an upper limit on what they can structurally handle, it is time for them to be fundamentally redesigned for organic adaptability and individual reflexivity.
Moreover, as society transitions into the postmodern and transmodern eras, new values such as cultural diversity, gender equality, transparency, and accountability emerge and have to be taken into consideration by its leadership. Indeed, this contemporary set of values exerts additional pressure on governance structures that were accustomed to working behind closed doors. Business as usual is thus becoming increasingly hazardous in terms of social perception. In fact, ‘social justice’ movements use the social web to disrupt organizations from within by leaking confidential documents and practices. They are also able to challenge organizations’ stability and security from the outside by triggering leaderless protests that go viral on networks (e.g. the French ‘yellow vests’).
Unfit to cope with the multiplicity of desires and intentions now expressed on social media, nested governance structures find it increasingly difficult to translate their leadership into a society which, in turn, does not feel adequately taken into account (e.g. Extinction Rebellion). As a result, we witness a global erosion of social trust, and we may expect a persistent trust gap between citizens and their institutions. The centralized, ‘one-to-many’ model that was effective in previous times is now reaching its structural limits. Democratic institutions have to factor in these new variables and take appropriate action if they want to stay relevant in the times to come. That is to say, systemic challenges require systemic solutions.
2. Social media, the new arena for influence
With the steady transfer of political discourse and influence onto social media, the world of governance and leadership has changed - forever. Spearheaded by political figures and social media enthusiasts, leadership can no longer separate itself from online debates. Besides, online social platforms have already become the primary public sphere for digital natives (i.e. Gen Z). We believe it is time to move with that curve, as social media will likely overdetermine who gains influence in the political arena from now on. Online social platforms are already helping governance structures reconnect with the social fabric as an interface for innovative participatory governance models. As a matter of fact, social media sees itself as ‘the critical infrastructure for modern-day democracy’.
However, social media is plagued by invasive data practices, hate speech, election meddling, misinformation, and filter bubbles, which provide little help for users to develop trust and confidence among themselves. Without trusted information and the means to decide whom to trust, viable thresholds of cooperation are barely reached and hardly maintained. While social technologies have enabled hyper-social interaction with great connectivity, discoverability, and outreach, accessing reliable information has become - between fake news and fake friends - tremendously challenging. Certainly, with data corruption, and with rankings and ratings based on prediction algorithms that create feedback loops such as reputation inflation, the value of second-hand information is such that it becomes increasingly difficult to separate the wheat from the chaff and the signal from the noise. As a result, though Internet users are well connected to share information, they are not well equipped to inform meaningful decision making.
We argue that for social media to function as a legitimate environment for participatory governance, it will need to reinvent itself too. By all means, social media is inherently driven by algorithms that relentlessly pursue the growth of a business model based on user data. This ‘growth hacking’ conflicts with ethical principles about the management of user data, since it incentivizes people to connect, post, and share ever more at the expense of privacy, trust, and meaning. While geographical location is no longer a strong barrier to interaction, trust - or the lack thereof - remains an issue. As a result, ‘many-to-many’ platforms increasingly generate social distrust as they aimlessly connect everyone to everything with pervasive and persuasive technologies that influence individuals’ actions.
Unsurprisingly, we witness the rise of legitimate concerns from all spheres of society, best summarized by the following assessment: ‘Democracy has a Facebook problem’. Under a new and more constraining regulatory framework (i.e. the GDPR), this model is now seriously challenged by institutional powers. From here onward - and with the aim of maintaining social stability amid growing complexity - we envision the two most probable scenarios for the future of governance. Both scenarios cast social technologies as the main interface for decision making. The question is whether a centralized or a distributed approach to governance will prevail. Though we advocate for the latter, we aim to outline some of the challenges constitutive of these two approaches, which are poised to mingle nonetheless.
Algorithmic governance can be approached in either a distributed or a centralized fashion. Popularized by Bitcoin, Distributed Ledger Technology (DLT) - popularly known as the ‘blockchain’ - allows consensus to be reached on the basis of cryptographic proof instead of trust in a third party or central authority. Yet when the ‘rule of code’ fails, algorithmic or ‘on-chain’ governance falls short: in the infamous case of the DAO hack, for instance, humans were required to step in to regain control of the system. From these early attempts, it is our opinion that on-chain governance, being rigid, has intrinsic and hardwired limits. Yet it is poised to be applied to centralized systems: “code is law is a form of regulation whereby technology is used to enforce existing rules [and where] code is used by platforms and by the public sector as a regulatory mechanism with the property of automating the law and enforcing rules and regulations a priori through smart contracts with a guarantee of execution” (De Filippi, 2017).
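The core mechanism - replacing trust in a central authority with cryptographic proof that any participant can verify - can be pictured with a minimal hash-chained ledger. The sketch below is our own illustration of the principle, not the protocol of Bitcoin or any particular blockchain:

```python
import hashlib
import json

def block_hash(body: dict) -> str:
    # Deterministic serialization, then SHA-256: the 'cryptographic proof'.
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    # Each new block commits to the hash of its predecessor.
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"index": len(chain), "data": data, "prev_hash": prev}
    chain.append({**body, "hash": block_hash(body)})

def verify(chain: list) -> bool:
    # Anyone can re-check the whole chain; no central authority is consulted.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, "Alice pays Bob 5")
append_block(chain, "Bob pays Carol 2")
assert verify(chain)
chain[0]["data"] = "Alice pays Bob 500"  # tampering with history...
assert not verify(chain)                 # ...is detected by any verifier
```

Because every block embeds the hash of the previous one, altering any past entry invalidates all subsequent links, which is why tampering is detectable without appeal to a trusted third party.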
In contrast to traditional legal rules, regulation by code is enforced ex ante (e.g. the No Fly List), so that there is less need for ex-post enforcement and judicial arbitration. The counterpart is that, without human intervention, algorithmic governance may outline the contours of a dystopian future run by Artificial General Intelligence (AGI). This is especially concerning since algorithms are never neutral and are therefore inherently political. Moreover, their design depends on engineers and platform operators; as such, they escape public scrutiny and any sort of democratic oversight. Also, since immutable, incorruptible, self-executing, and self-enforcing algorithmic rules are the very foundation of smart contracts, it might become difficult to make course corrections in complex situations that are constantly evolving. Admittedly, machine learning can address some of these issues by adding some level of adaptivity. Yet machine learning has already proved to be implicitly biased against minorities, undermining the principles of universality and non-discrimination.
Like distributed algorithmic decision making, centralized algorithmic decision making has serious practical limitations and ethical drawbacks. A human-computer interaction that would “enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs” could theoretically better support a robust distributed participatory governance model. To do so, we would need a new business model for online social platforms, and algorithmic decision making would need to be complemented by an ‘off-chain’ human governance that could operate at scale by being modeled on distributed principles. Improvements in off-chain governance are recognized as an important part of the future of blockchain governance. By definition, off-chain governance includes human interactions, and human interactions require social trust to reach collective decisions. Therefore, the ‘trustless’ protocols of crypto-governance cannot avoid the social trust of decision-making beyond code. “As a society, and as technologists and entrepreneurs in particular, we’re going to have to get good at cooperating — at building trust, and, at being trustworthy. Instead of directing resources to the elimination of trust, we should direct our resources to the creation of trust—whether we use a long series of sequentially hashed files as our storage medium or not.”
3. Principles for distributed participatory governance
We define governance as the way those with authority express that authority. We define participatory governance as the way those without formal authority or nominal power express theirs. We define distributed participatory governance as the ecosystem of decision making that takes place in multi-stakeholder informal networks. A distributed participatory governance model is a bold and somewhat exotic proposal positing that zones of influence are not predetermined but instead emerge from a multitude of interactions. This proposal is not meant to replace existing governance structures but to complement them as an additional layer of human interactions between centralized and decentralized systems, and between humans and machines. The default modus operandi is the free association of people who continuously express their preferential attachment by ‘voting with their feet’. Before ‘explicit statements’ such as decision-making per se take place, ‘implicit statements’ constitute the basis of interactions, because ‘actions speak louder than words’. Implicit statements postulate that people remain free to engage in, and disengage from, any social interaction at any time for any reason. Implicit statements thus directly affect the zone of influence one may have.
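The dynamic of ‘voting with one’s feet’ can be pictured with preferential attachment, a standard model from network science in which newcomers tend to associate with participants who are already well connected, so that zones of influence emerge from local choices rather than being assigned. The sketch below is our own illustration; its parameters and setup are arbitrary:

```python
import random

def grow_network(n: int, seed: int = 42) -> dict:
    """Simulate free association: each newcomer attaches to an existing
    participant with probability proportional to that participant's
    current number of connections (preferential attachment)."""
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}  # start with two connected participants
    targets = [0, 1]       # multiset: each entry is one endpoint of a link
    for newcomer in range(2, n):
        chosen = rng.choice(targets)  # probability proportional to degree
        degree[newcomer] = 1
        degree[chosen] += 1
        targets += [newcomer, chosen]
    return degree

degree = grow_network(1000)
print("five largest zones of influence (degrees):",
      sorted(degree.values(), reverse=True)[:5])
```

No participant is designated as a hub in advance; the skewed distribution of connections, and hence of influence, is an emergent property of many local acts of association.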
This model represents a shift from contractual obligation to consensual interaction, from monolithic to granular governance, and from the idea of a dissonant leadership able to impose its will regardless of the consent of participating parties to the idea of a resonant leadership required to generate social recognition in order to maintain and expand a zone of influence. The model transitions from rigidity to agility, from static hierarchies to dynamic ‘heterarchies’, and from boundaries that separate to boundaries acting as interfaces that “intimately connect the system with its environment”. As such, it intends to transcend the dichotomy between centralized and distributed modes of operation, as it is not so much for the system to scale up as it is for it to scale across. Indeed, it is less about ‘this or that’ than about ‘this and that’.
Inspired by the science of complexity, the proposed model refers to the concept of ‘chaordic organizations’ first devised and applied by Dee Hock, founder and CEO of Visa International. Human society is inherently complex, i.e. composed of a large number of interacting individuals. The science of complexity studies such ‘complex adaptive systems’, typically in the framework of ecosystems. This interest in ecosystems is due to their inherent ability to reach an optimal state within a ‘window of vitality’, a balance between efficiency and resilience. Yet, though human society is a complex system in and of itself, “[...] there is a seeming impossibility or at least an extreme difficulty in utilizing the logic of complex adaptive systems to aid in restructuring social systems capable of navigating the dynamics of the information age”. We aim to investigate how complexity theory could help bring an emergent, coherent structure to human society outside the chains of command and control.
Indeed, core to the science of complexity is the idea of emergence, where structure emerges over time out of a multitude of local interactions and without central control. Transposing the notion of emergence to the human fabric, we naturally identify our main challenge as overcoming the ‘tragedy of the commons’, where individuals pursuing their own interest benefit at the expense of the collective. At first sight, aligning personal and collective interests among equipotent individuals who may pursue different agendas, and without central control, seems to be wishful thinking. Beyond swarm behaviour, we propose to tackle the challenge of emergence with the notion of social trust. Here, we do not understand social trust as a security problem, which merely underlines a state of distrust; neither do we understand it as a naive belief in the benevolence of others. We understand social trust as a willingness to enter a position of vulnerability in order to achieve a positive outcome. Social trust is therefore a root factor in complex interdependencies that could help develop interpersonal coherence at scale by providing a new set of relational dynamics between the individual and the collective.
The relevance of using trust as the foundation for a distributed ‘social ecosystem’ is that, by sharing the risk of trusting others within small, overlapping social clusters, peer pressure incentivizes participants to take the collective interest into account. Indeed, any relationship eroding trust would diminish the collective potential, while any relationship expressing resonance would lead to increased synergies, attractiveness, and influence at the collective level. The pursuit of self-interest at the micro or individual level is therefore balanced by the pursuit of collective fitness at the macro or group level. Trust - as an expression of interdependence applied to online social platforms - might thus play the role of ‘social glue’ and resolve the question of an emergent structure with a community-centric architecture. Beyond the vertical relationships of top-down and bottom-up, this distributed model would initiate a new relationship between the individual and the collective, and a shift from political influence anchored in nominal status to political influence rooted in social recognition.
In our view, trusting others is directly linked to reciprocal relationships and emotional resonance. Indeed, the peculiarity of trust is that it must be earned as well as given, and reciprocal behaviors anchored in words and deeds are paramount to building trusted relationships. Trusting others is also a fragile experience that can have dire consequences for those involved: when we trust others, we expect a positive outcome, yet we put ourselves in a position of vulnerability. This risk is of special significance in the online world, where a breach in confidence, coupled with the distributed memory of interactions across social media, can have long-lasting and devastating effects. Generally speaking, then, trusting others prompts responsible behavior, and the risk is - by virtue and by necessity - only shared with a restricted set of peers. Trusting others thus implies some level of emotional attunement, and emotional dissonance acts as a strong social signal that a breach in confidence may be near. Though trusting others is a fragile experience, it is the basis for strong social relationships. A model for governance based on trust is therefore not ‘risk-averse’ but ‘risk-aware’.
With insights from complexity theory, we propose to develop a network architecture that would scale from the local to the global the trust, resonance, and reciprocity experienced by communities with the highest levels of engagement (e.g. community groups, startups, sport teams, music bands, churches, special forces). Yet, since the peculiarity of trust is that it is limited to small social clusters, how can relationships be scaled into a continuum of trust? First, the transitivity of trust postulates that if X trusts Y, and Y trusts Z, then X has an inclination to trust Z. Second, taking the structure of biological systems as a cue, we can see that for a large number of cells to function as an organism - or for a large number of organisms to function as an ecosystem - the connections are not ‘many-to-many’ as we are currently connected via the Internet. Instead, they are ‘few-to-few’, in a fractal and heterarchical network that encompasses the entire organism without any absolute center or apex of authority. Because trusting others expresses a willingness to put oneself in a position of vulnerability, trusting others is the ultimate expression of social interdependence. Whereas hierarchies involve relations of dependence and markets [and blockchain governance] involve relations of independence, heterarchies involve relations of interdependence.
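The transitivity just described can be sketched as trust propagation over a graph of ‘few-to-few’ direct relationships, where confidence decays with every hop so that trust never travels far from its source. The function, weights, and hop limit below are our own illustrative assumptions, not an established trust metric:

```python
def transitive_trust(direct: dict, source: str, target: str,
                     max_hops: int = 3) -> float:
    """Best achievable trust from source to target through chains of
    direct trust. Trust along a path is the product of its links, so it
    decays with each hop: longer chains inspire less confidence."""
    best = {source: 1.0}   # full trust in oneself
    frontier = {source}
    for _ in range(max_hops):
        next_frontier = set()
        for node in frontier:
            for peer, weight in direct.get(node, {}).items():
                score = best[node] * weight
                if score > best.get(peer, 0.0):
                    best[peer] = score
                    next_frontier.add(peer)
        frontier = next_frontier
    return best.get(target, 0.0)

# X trusts Y at 0.9 and Y trusts Z at 0.8,
# so X has an inclination to trust Z at about 0.72.
direct = {"X": {"Y": 0.9}, "Y": {"Z": 0.8}}
print(transitive_trust(direct, "X", "Z"))
```

Note that trust here is directional and non-symmetric: Z has no derived trust in X unless Z holds direct relationships of its own, which mirrors the claim that trust must be earned as well as given.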
4. A political choice and a philosophical question
Most assuredly, we are now at a crossroads where “acceptable ideas are competent no more and competent ideas are not yet acceptable”. In a world of tremendous complexity and uncertainty, centralized solutions will no longer work unless they impose a hyper-rigid governance system that leaves little room for social dissent. Wherever such a system is dismissed in favor of democratic principles, increased complexity will call for distributed solutions. Distributed solutions will not work without some form of distributed human governance. Distributed human governance will, by design, challenge the nominal power of static hierarchies with the social power of dynamic heterarchies.
From a European perspective, “We find ourselves stuck between two dominant models: the monopolistic corporate-led internet of Silicon Valley and large-scale government surveillance systems of Beijing. Can we now come up with a third narrative, where citizens and communities are in control and can determine their own future?”. A centralized algorithmic governance will be detrimental to democratic principles, yet it will provide a cheap and effective way to enforce stability without challenging institutional powers. In contrast, a distributed participatory governance could be seen as an emergent social power challenging the old system - especially if framed in the context of encrypted networks and GDPR compliance, which does in fact require defense-level security. This challenge might even be reinforced by the rapid development of digital self-sovereign identities driven by the willingness to track climate refugees. Faced with the reality of societal transformations, static hierarchies might be reluctant to live up to their intentions. Still, riding the ‘edge of chaos’ is the difficult art that today’s leadership needs to master.
In the EU, ‘open government’ is seen as the way forward. The shift from a ‘government-centric’ to ‘citizen-centric’ and ‘community-centric’ models is promoted as governance structures enter multi-stakeholder knowledge networks. The trend undoubtedly paves the way for a participatory ‘open governance’, especially as public services start to experiment with participatory budgeting. Such an open governance would put the functional value of traditional governance structures under threat, eventually raising the question of a new social contract. Nevertheless, more distributed models that increase interpersonal coherence at scale could help propagate trust horizontally within, across, and beyond social organizations. As such, they could be the only way to tackle exponential challenges with exponential participation. “[..] The real disruption taking place is not technology; it’s a trust shift that will open the doors to new and sometimes counterintuitive ways of designing systems that will change human behavior on a large scale” (Botsman, 2016).
With the alarming potential of 5G and the IoT at the doorstep, a new framework for interaction is emerging and a new threshold of complexity is upon us. The future of governance is being engineered today. The choice is between a rigid governance that requires no cognitive engagement and an agile governance that prompts everyone to exercise their responsibility. The question is this: will we commit - as human beings - to govern ourselves, or will we let algorithms govern us?
 Edelman. (2019). 2019 Edelman Trust Barometer. [online] Available at: https://bit.ly/2TVprXB [Accessed 17 Mar. 2019].
 Price, R. (2019). Car-bomb fears and stolen prototypes: Inside Facebook's efforts to protect its 80,000 workers around the globe. [online] Business Insider. Available at: https://read.bi/2TRSg77 [Accessed 17 Mar. 2019].
 American congresswoman Alexandria Ocasio-Cortez on Twitter, Mar 12, 2019. Available at: https://bit.ly/2Hu03lK [Accessed 17 Mar. 2019].
 European Commission, Next Generation Internet Initiative. [online] Available at: https://bit.ly/2GXIBRT [Accessed 17 Mar. 2019].
 Nakamoto, S. (2008). Bitcoin: A Peer-to-Peer Electronic Cash System. [ebook] Available at: https://bitcoin.org/bitcoin.pdf [Accessed 17 Mar. 2019].
 Siegel, D. (2016). Understanding The DAO Attack. [online] CoinDesk. Available at: https://bit.ly/2Jb2OKC [Accessed 17 Mar. 2019].
 De Filippi, P., & Hassan, S. (2016). Blockchain technology as a regulatory technology: From code is law to law is code. First Monday, 21 (12).
 Hodson, H. (2019). DeepMind and Google: the battle to control artificial intelligence. [online] 1843. Available at: https://bit.ly/2HxnTw9 [Accessed 17 Mar. 2019].
 Hardt, M., Price, E. and Srebro, N. (2016). Equality of Opportunity in Supervised Learning. [online] Cornell University. Available at: https://arxiv.org/abs/1610.02413 [Accessed 17 Mar. 2019].
 Licklider, J. (1990). Man-computer symbiosis. [ebook] Palo Alto: Systems Research Center of Digital Equipment Corporation. Available at: http://memex.org/licklider.pdf [Accessed 17 Mar. 2019].
 Rosenberg, L. (2015). Artificial Swarm Intelligence, a Human-in-the-loop approach to A.I.. [ebook] San Francisco: UnanimousAI.com. Available at: https://bit.ly/2F38j8E [Accessed 17 Mar. 2019].
 Ehrsam, F. (2017). Blockchain Governance: Programming Our Future. [online] Available at: https://bit.ly/2zO45ST [Accessed 17 Mar. 2019].
 Stinchcombe, K. (2018). Blockchain is not only crappy technology but a bad vision for the future. [online] Available at: https://bit.ly/2GFw32m [Accessed 2 Jun. 2019].
 Cilliers, P. (2001). Boundaries, Hierarchies and Networks in Complex Systems. [ebook] International Journal of Innovation Management, Vol. 5, No. 2 Imperial College Press, pp. 135–147. Available at: https://bit.ly/2VD0SLD [Accessed 17 Mar. 2019].
 Hock, D. (1999). Birth of the chaordic age. San Francisco, CA: Berrett-Koehler.
 Cilliers, P. (2002). Complexity and Postmodernism: Understanding Complex Systems. [ebook] London and New York: Routledge. Available at: https://bit.ly/2JkY9px [Accessed 17 Mar. 2019].
 Goerner, S., Lietaer, B. and Ulanowicz, R. (2009). Quantifying economic sustainability: Implications for free-enterprise theory, policy and practice. Ecological Economics, 69, pp. 76-81. doi:10.1016/j.ecolecon.2009.07.018. Available at: https://bit.ly/2UvQ6dN [Accessed 17 Mar. 2019].
 Last, C., Van Weyenbergh, G. and Werner, B. (2018). Transformative Social Ecosystem Dynamics: A psychological architecture of emotional trust. 1st ed. [ebook] Brussels: Meoh ASBL. Available at: https://bit.ly/2IOPhDb [Accessed 17 Mar. 2019].
 Wagner, R. (2018). Is heterarchy the answer to the crisis of hierarchy?. [online] Available at: https://bit.ly/2UdLg5u [Accessed 17 Mar. 2019].
 Gharajedaghi, J. (2006). Systems thinking: Managing Chaos and Complexity. Amsterdam [etc.]: Butterworth-Heinemann.
 En.wikipedia.org. (2019). Social Credit System. [online] Available at: https://bit.ly/2eHPomc [Accessed 17 Mar. 2019].
 Nesta. (2019). 30 years of the web: where do we go next?. [online] Available at: https://bit.ly/2CnScCo [Accessed 17 Mar. 2019].
 Blockchain and the GDPR. (2018). [ebook] Brussels: The European Union Blockchain Observatory and Forum. Available at: https://bit.ly/2CriGn0 [Accessed 17 Mar. 2019].
 Stokkink, Q. and Pouwelse, J. (2018). Deployment of a Blockchain-Based Self-Sovereign Identity. [ebook] Delft University of Technology. Available at: https://bit.ly/2IVmBwA [Accessed 2 May 2019].