Artwork by Beau Stanton — Backfire: CARTOGRAPHY OF THE MACHINE (2014)

Climatic Backfire: A Central Bank Distributed Ledger to fund Zero-Carbon Investment with Artificial Intelligence and Robotics Taxation


Libertarians claim that dollars are backed by nothing — yet, in some sense, it is indisputable that dollars are backed by future human labor, and more specifically, the taxation thereof. Given this, can artificial labor displace human labor with respect to taxation?

In response to this, a case is made to answer in the affirmative within the context of the estimated $73 trillion (USD) needed to decarbonize the economy. In sections one through five (1–5), focus is given to justification for such a solution. In section six, a non-technical, pre-fabricated solution is provided using a de-abstracted, hypothetical “real-world” scenario. In section seven, a formal technical solution is modeled.

In Section 1 — The Ornamented Man, an overview of Artificial Intelligence, Machine Learning, and Robotics (AI) is given with concern toward the immediate risk they pose to a stable labor market, and the implications of this threat over the next two decades.

In Section 2 — The Infernal Apparatus, political partisanship is considered through a lens of politico-industrial alliances with a specific emphasis on the [fossil fuel] energy sector. Recent social unrest attributable to coal mine closures in the eastern United States is studied, and moreover — the employment implications of a fast transition to a zero carbon economy.

In Section 3 — Old World Projection, an overview of the Federal Reserve System is given with an eye toward its founding, its founders, and its central role as the “Cathedral” of the modern world since its inception in 1913.

In Section 4 — New World Projection, an overview is given of the challenge China’s state capitalism poses to Western technological dominance — emphasis is placed on the implication (for the West) that China’s economic rebalancing and efforts to promote innovation appear to have worked.

In Section 5 — Elemental Crisis, an overview of modern Neoreactionary thought is given in terms of the “dark enlightenment” — with a brief introduction to the movement’s (apparently) influential opposition to democracy in all its forms.

In Section 6 — Hierarchy of the Eternal Artifice, the Cathedral of the West is reconstructed by de-abstraction: the power of the Federal Reserve System expands to form a “Modern Triumvirate”, becoming inclusive of not just “big banks”, but also “big tech” and “big energy” — hypothetically as Libra-Coin and The American Fuel & Petrochemical Manufacturers Association coordinate to finance, on the one hand, and physically build and maintain, on the other hand, zero-carbon infrastructure.

In Section 7 — Libra-Aries: A Federal Reserve Bank Hashgraph Network & Governing Council, a formal model is presented for a Distributed Ledger Technology (DLT) based on the Hedera Hashgraph and Proof of Trust protocols. A summary is given of how the Fed can mitigate the dual risks of socioeconomic unrest posed by automation and decarbonization, while also ensuring that the majority of AI-enabling CPU/GPU power and intellectual property (IP) is controlled by nodes which strengthen, stabilize, renew, and reinforce the network.

NOTE: No attempt has been made to avoid directly copying/pasting content from other authors and published works where it seemed most efficient to do so. Where this has been done, a hyperlink to the original work is contained within the section.

ADDITIONS & ADDENDA: Over the next few months I will attempt to add to, clarify, and correct the content and concepts contained herein. No claim is made of correctness or completeness in its current form.

1. The Ornamented Man | Artificial Intellect

The word “energy” comes from the Greek energeia, which is a concept developed by Aristotle (384 BC − 322 BC) that has no direct modern translation into English, but is akin to “being at work”. For our present purposes, energy is a condition that describes the capacity to do work. And money, principally, is a device used to direct work.

Before getting too far along, it’s worthwhile to clarify some semantics about work. Fundamentally, a job or task is what work is to be done, while a process is how work is to be done. A job is an overall unit of work and is composed of tasks. A process performs a task.

In computing, “job” originates in non-interactive processing on mainframes, notably in IBM’s Job Control Language for the DOS/360 and OS/360 of the mid-1960s, and formally, it means a “unit of work for an operating system”, which consists of steps, each of which is a request to execute a specific program. Early computers primarily did batch processing (running the same program over many input data), like census or billing, and a standard type of one-off job was compiling a program from source, which could then process batches of data. Later on, batch came to be applied to all non-interactive computing, whether one-off or multiple items. Formally “multitasking” means “working on multiple tasks concurrently”, but in practice means an operating system (or virtual machine, or runtime, or individual process) “running multiple processes/threads concurrently”.

If that didn’t make sense to you, don’t fret. Just know that in this paper we aim to avoid using the terms “job” or “task” in reference to machines, and instead refer to a “set of processes”, “process”, or “thread”; for servers, we refer to requests (or queries) rather than tasks. We reserve the anthropomorphized (human) analogues “job” and “task” for people, and in anthropomorphized scenarios we say that a process refers to who does the work.
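These distinctions can be made concrete in a short sketch (illustrative only; the job name and task data below are invented):

```python
from concurrent.futures import ThreadPoolExecutor

# A "job" is an overall unit of work composed of "tasks";
# a worker (process or thread) performs each task.
job = {"name": "billing-run", "tasks": [1, 2, 3, 4]}

def perform(record_id):
    # Hypothetical task: compute a charge from a record ID.
    return record_id * record_id

# Batch processing: the same program run over many input data.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(perform, job["tasks"]))

print(results)  # [1, 4, 9, 16]
```

Note that `map` returns results in input order even though the workers run concurrently, which is exactly the batch-processing pattern described above.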

Regardless of the exact number of jobs susceptible to automation over the next two decades, the implications are nonetheless looming. Bringing the issue to a wider audience is Andrew Yang, a Democratic candidate for U.S. President in 2020:

“Fifty-five hundred floor traders once roamed the trading floor of the New York Stock Exchange. Now there are fewer than 400, as most trading jobs have been taken over by servers running trading algorithms. Those scenes you see on CNBC are not of the New York Stock Exchange but of the Chicago Mercantile Exchange, where they still have enough humans to make a good backdrop. Goldman Sachs went from 600 NYSE traders in 2000 to just two in 2017, supported by 200 computer engineers.” — Andrew Yang, The War on Normal People

To be sure, AI and ML have dominated recent debates on the Future of Work. Since Frey and Osborne (2013) shocked analysts and policy makers worldwide with a study suggesting that 47% of jobs in the United States are at high risk of being automated, several other researchers and institutions have contributed to the debate, and all have produced estimates in the high double digits. All of these studies stem from expert assessments of the risk of automation for a subset of occupational titles, based on the tasks those occupations involve. This allowed researchers to identify the so-called bottlenecks to automation — i.e., the tasks that, given the current state of AI, are difficult to automate. These bottlenecks were then used to compute a risk of automation for occupational titles that were not included in the expert assessment, and for countries outside the United States.

Regardless of differences between studies, one needs only a high-level appreciation of the facts to realize that many experts have been sounding the alarm — for instance, a bullet point from this study:

  • Across the 32 countries, close to one in two jobs are likely to be significantly affected by automation, based on the tasks they involve. But the degree of risk varies. About 14% of jobs in OECD countries participating in PIAAC are highly automatable (i.e., probability of automation of over 70%). Although smaller than the estimates based on occupational titles obtained by applying the method of Frey and Osborne (2013), this is equivalent to over 66 million workers in the 32 countries covered by the study. In addition, another 32% of jobs have a risk of between 50% and 70%, pointing to the possibility of significant change in the way these jobs are carried out as a result of automation — i.e., a significant share of tasks, but not all, could be automated, changing the skill requirements for these jobs.
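A quick back-of-envelope check on these figures (the implied workforce total is our inference, not a number stated in the study):

```python
# Figures quoted from the study above; the workforce total is implied, not stated.
high_risk_share = 0.14        # share of jobs with automation probability > 70%
workers_at_high_risk = 66e6   # "over 66 million workers" across 32 countries

# Implied total employment covered by the study:
implied_workforce_millions = workers_at_high_risk / high_risk_share / 1e6
print(round(implied_workforce_millions))  # ~471 million workers
```

At roughly 471 million workers covered, the additional 32% of jobs at 50–70% risk would correspond to on the order of 150 million more workers facing significant change.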

Clearly, our political establishment is woefully unprepared to address anything close to 50% job displacement — and arguably, even a quick jump to 14% would significantly destabilize the economy.

What is needed is an economic system which expands its human-employment capacity with each incremental AI advancement.

2. The Infernal Apparatus | Energy and Intellect

The US energy sector is undergoing a profound transformation, but one still nowhere near its inevitable near-future conclusion: a zero-carbon economy. The transformation thus far has dramatically reduced US consumption and production of coal, both relative to expectations in the mid-2000s and in absolute terms. Coal production declined 19 percent from 2015 to 2016, and 38 percent from its peak in 2008. Between 2008 and 2015, coal mining employment declined by 23 percent, and 31 percent of coal mines closed, mostly in the eastern United States.

The 2016 presidential election resulted in federal policies and proposals that aim to reverse the downward trend of coal production and employment. One of those policies reversed an Obama administration order to review the federal coal leasing program: in early 2016, the Obama administration had placed a moratorium on new coal leases on federal lands, beginning a review of leasing policy that might ultimately have raised the costs of producing coal from federal lands (Department of Interior, 2016). Regardless, the plight of coal miners and coal-mining communities across the United States is acute and growing — for instance, even though Robert E. Murray, head of Murray Energy, presented President Trump just weeks after the inauguration with a sixteen-point “action plan” which Trump has attempted to [essentially] fully enact, Murray Energy still became the eighth major coal company to file for bankruptcy in 2019.

“The way of life is changing so bad,” Mr. Rose said. He grew quiet. “You’ll get overwhelmed if you think about it too hard.”

Most of the employment losses in the coal sector have occurred among coal mines rather than power plants.

It has long been industry conventional wisdom that Western coal would continue to prosper, at least for a while, even while Appalachian coal country (West Virginia, eastern Kentucky, and Pennsylvania, along with eastern Ohio and parts of Alabama, Maryland, Tennessee, and Virginia) has been in severe decline for over a decade. However, it’s beginning to look like conventional wisdom was wrong. Western coal is declining too, and as it does, vulture capitalists are buying up mines, squeezing out the last bit of profits, and declaring bankruptcy, leaving behind an environmental mess and workers without jobs or pensions. The coal boom in the Powder River Basin — the largest coal basin in the US, the source of 40 percent of American coal, spanning northeast Wyoming and southeast Montana — dates back to the early 1970s. It has resulted in a few large companies with deep local roots, their taxes funding infrastructure and schools. Their steady profitability has made coal the heart of several Western communities. There are 13,000 coal-dependent jobs in the PRB.

While low in total compared to overall U.S. employment, energy jobs are highly influential at the state level [to say nothing of their influence at the national level] since they are not spread equally across the United States, and thus receive increased attention from those states’ elected representatives:

  • Though supporting fewer total jobs than other fossil fuel industries, the coal industry employed a disproportionate share of workers in two states — Wyoming and West Virginia. In Wyoming, nearly 300 of every 10,000 workers (or 3%) were employed in coal-based power generation and coal mining last year. In West Virginia, the coal industry accounted for 2% of workers.

Various extensive reports have undertaken the challenge of informing policymakers and other stakeholders about the employment risks of shifting energy sources; see, for instance, the 2012 OECD report for the European Commission titled “The jobs potential of a shift towards a low-carbon economy”. Seven years later, one reads such a report with the profound realization that its focus is grounded in looking at “the bright side”.

So then, what’s “the dark side”? On December 30th, 2019, Mark Carney, former Goldman Sachs investment banker and current governor of the Bank of England, said that the leading pension fund analysis “is that if you add up the policies of all of companies out there, they are consistent with warming of 3.7–3.8C (7.2 degrees F)”. Scientists say the risks associated with an increase of 4°C include a nine-metre rise in sea levels — affecting up to 760 million people — searing heatwaves and droughts, and serious food supply problems. Speaking to the Today programme, he reiterated his warning that unless firms woke up to what he called the climate crisis, many of their assets would become worthless: “If we were to burn all those oil and gas [reserves], there’s no way we would meet carbon budget,” he said. “Up to 80% of coal assets will be stranded, [and] up to half of developed oil reserves.”

Goldman Sachs recently said that it would no longer finance coal projects, or oil and gas exploration in the Arctic. Yet if anybody doubts the motive for big banks, hedge funds, pension funds, and others within Wall Street and finance to hide their exposure and downplay the seriousness of the impending threat, consider that as of 2016 the New York Stock Exchange held over $2 trillion in market capitalization from just the top 20 fossil fuel firms.

Exploring the relation of this exposure to current political movements is critical to understanding the situation we’re in. As part of the Buzzfeed article Help Us Map TrumpWorld, the four investigative journalists, John Templon, Alex Campbell, Anthony Cormier, and Jeremy Singer-Vine asked the public to help them map and analyze the data that they investigated, confirmed and published surrounding the business connections of Donald J. Trump.

Neo4j has been used in investigative journalism before, most notably in the collaborative work of the International Consortium of Investigative Journalists (ICIJ) on the Panama Papers, but also by other investigative organizations and journalists. And while we’re not, strictly speaking, investigative journalists, for our present purpose one of Neo4j’s prebuilt sandboxes is a great tool for visualizing the overlap of the Trump Administration with some of the fossil fuel industry’s largest financiers. What follows is a graph database of Trump’s White House and its relations within the banking world.
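The underlying idea is simple enough to sketch without Neo4j: nodes are people and organizations, edges are documented connections, and “influence mapping” amounts to neighborhood queries over that graph. A minimal toy in plain Python follows; the edge list is a tiny hypothetical subset based on connections discussed in this article, not the TrumpWorld dataset itself.

```python
# Nodes are people/organizations; edges are (undirected) documented connections.
edges = [
    ("Donald J. Trump", "Jared Kushner"),
    ("Donald J. Trump", "Robert Mercer"),
    ("Robert Mercer", "Make America Number 1"),
    ("Robert Mercer", "Heartland Institute"),
]

# Build an adjacency list, then query: "who is connected to Robert Mercer?"
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

print(sorted(graph["Robert Mercer"]))
# ['Donald J. Trump', 'Heartland Institute', 'Make America Number 1']
```

In Neo4j proper, the same neighborhood query would be a short Cypher `MATCH` over the sandbox data, with the sandbox rendering the result visually.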

Now, let’s dive in on a few key individuals, such as reclusive hedge fund manager Robert Mercer.

This stunning web of people directly connected to Trump (who are also connected to Robert Mercer) should cause everybody to do a “double-take”. By comparison, it’s arguable that the President’s own son-in-law, Jared Kushner, is less influential on the President than Mr. Mercer.

So, who is Robert Mercer? Graham Readfearn of DeSmog says that some $15 million of Robert Mercer’s money went into the Make America Number 1 super-PAC that was headed by Rebekah Mercer and that bankrolled the final months of Donald Trump’s campaign. One source told The Hill: “The Mercers basically own this campaign.”

But DeSmog has found the Mercers have also pumped at least $22 million into organizations that push climate science denial while blocking moves to cut greenhouse gas emissions. Across the board, the groups funded by the Mercers have misrepresented climate science, promoted fossil fuels, denigrated renewable energy, and pushed to strip powers from the U.S. Environmental Protection Agency (EPA). The Chicago-based Heartland Institute has received $4,988,000 from the Mercers, cashing its first $1 million check in 2008. The Heartland Institute holds regular “international climate change conferences” where denialist, fossil fuel-funded scientists, and politicians come together to talk tactics. In 2012, the institute famously started a billboard campaign that used a picture of terrorist Ted “Unabomber” Kaczynski next to the phrase: “I still believe in global warming. Do you?”

The fact that the Trump Administration has taken a formal denialist position on the topic is, to all reasonably intelligent people, clear evidence of stupidity (as in very low IQ) marrying greed.

Clearly, fund managers everywhere are (or should be) far less enthusiastic about denying the wisdom of the consensus of 99% of the world’s foremost scientists — in fact, if fund managers are risk managers, all of them should immediately hedge the risks which such a strong consensus implies. What Robert Mercer and Goldman Sachs and “mom and pop” stockowners, and everybody else on Wall Street needs — is an EXIT.

3. Old World Projection | Trust Builders

To look for such an exit, we begin at the dawn of the twentieth century, when Theodore Roosevelt and J. Pierpont Morgan were the two most powerful men in America, perhaps the world. As the nation’s preeminent financier, Morgan presided over an elemental shift in American business, away from family-owned companies and toward modern corporations of unparalleled size and influence. Despite their many differences in temperament and philosophy, Roosevelt and Morgan had much in common — social class, an unstinting Victorian moralism, a drive for power, a need for order, and a genuine (though not purely altruistic) concern for the welfare of the nation. Working this common ground, the premier progressive and the quintessential capitalist were able to accomplish what neither could have achieved alone — including, more than once, averting national disaster.

In 1910, Senator Nelson Aldrich and executives representing the banks of J.P. Morgan, Rockefeller, and Kuhn, Loeb & Co. secluded themselves for ten days at Jekyll Island, Georgia. The executives included Frank A. Vanderlip, president of the National City Bank of New York, and associated with the Rockefellers; Henry Davison, senior partner of J.P. Morgan Company; Charles D. Norton, president of the First National Bank of New York; and Col. Edward M. House, who would later become President Woodrow Wilson’s closest adviser and founder of the Council on Foreign Relations. There, Paul Warburg of Kuhn, Loeb & Co. directed the proceedings and wrote the primary features of what would be called the Aldrich Plan.

In its final form, the Federal Reserve Act represented a compromise among three political groups. Most Republicans (and the Wall Street bankers) favored the Aldrich Plan that came out of Jekyll Island. Progressive Democrats demanded a reserve system and currency supply owned and controlled by the Government in order to counter the “money trust” and destroy the existing concentration of credit resources in Wall Street. Conservative Democrats proposed a decentralized reserve system, owned and controlled privately but free of Wall Street domination. No group got exactly what it wanted. But the Aldrich Plan more nearly represented the compromise position between the two Democratic extremes, and it was closest to the final legislation passed.

From 1913 onward, in our opinion, commencing with the establishment of the Federal Reserve System as the central bank of the United States, The Merry Wives of Windsor endowed upon our land its sole mandate, simply: “The world is your oyster”.

If the opportunity herein has not yet crystallized in the shimmering neurons of your mind, it will soon. The gulf is already laid bare between modern fiat currencies and the gold standard — to leap it is only a question of ridding the Fed of its Leviathan Lime Rust.

4. New World Projection | The Industrial China Complex

As of today, China’s evolving model of state capitalism has no Western answer. A key emerging trend is that, with Xi Jinping’s emphasis on strengthening Communist Party control over the corporate sector, China’s state capitalism is morphing further into a mode that can be termed “party capitalism.” Moreover, state capitalism’s role in China’s economic rebalancing and in efforts to promote innovation appears to have worked.

Xi Jinping’s greatest aspiration for the economy, other than ensuring basic stability, is turning China into an innovation superpower. This is important to Beijing not only for purely economic reasons, but also for national security (reducing reliance on the US for critical technologies) and military competitiveness. It is not lost on Beijing that China’s private firms are more critical to this task than SOEs. Private players such as Huawei, Alibaba, Baidu and Tencent — rather than state-owned behemoths like China Telecom — represent China’s “national champions” in next generation areas such as artificial intelligence and its applications. Ensuring effective political control over private firms provides Beijing with the comfort to work in increasingly close partnership with them. Indeed, the internet giants are becoming deeply embedded in aspects of China’s economic and even governance infrastructure, from the payment system to inputs into China’s social credit system.

This is not to say that private entrepreneurs are necessarily seeking out this tighter embrace, which can become suffocating, but they face a difficult balancing act given the importance of keeping up strong ties with regulators and other government officials. It is also important to note that Beijing’s embrace of strategically important private firms has not translated to broader efforts to level the playing field for private firms against the state sector, such as by removing the direct and indirect subsidies and political protections that SOEs enjoy.

When it comes to SOEs, Beijing is combining tighter party control over these firms with reduced administrative interference in their operations. This is an effort to improve the financial discipline and commercial orientation of SOEs while ensuring their fealty to Beijing’s key priorities. Significantly, the wave of party building within SOEs began in late 2015, just as Beijing began moving ahead with modest reforms to SOEs’ corporate governance. These include providing more leeway to SOEs to make personnel and compensation decisions without approval from SASAC, the state agency that supervises SOEs, as well as “mixed ownership reforms” that invite private firms to take stakes in SOEs and (at least in theory) bring greater market savvy to their decisions.

The implications of this for the United States, for its allies, and for the global trading system are massive, yet the U.S. Government, totally unconcerned, like a portrait of pure incompetence, battles over how best to deceive itself, and falls ever further into its absurd delusions of grandeur and scientific denialism. With its European and Asian alliances reeling at its erratic leader, and its domestic political affairs in near-shambles, the United States seems increasingly unhinged, intoxicated, and demoralized.

5. The Elemental Crisis | Endarkenment Ideals

For the [very] few intelligent higher-ups supporting this U.S. Administration, like Peter Thiel, one suspects that China’s model of state capitalism is already “proven superior”. An indication of this can be found in an April 2009 discussion hosted at Cato Unbound among libertarian thinkers (including Patri Friedman and Peter Thiel) in which disillusionment with the direction and possibilities of democratic politics was expressed with unusual forthrightness. Thiel summarized the trend bluntly:

“I no longer believe that freedom and democracy are compatible.”

Referring to ‘Voice’ as democracy itself, in its historically dominant, ‘Rousseauistic strain’, Patri Friedman remarks:

“we think that free exit is so important that we’ve called it the only Universal Human Right.”

Reading into the above, insofar as it’s of present concern, we wonder whether the zeitgeist of today isn’t simply a need for stronger, more robust, more enterprising men — and not in any way, shape, form, or derivation thereof — an American yearning for English-styled monarchs. At least to these American ears, Nick Land’s decrepit crying causes nausea. Democracy, in its current form, is hardly that which Plato feared.

But our present purpose is not a polemic response to the acid trips of the Cybernetic Culture Research Unit. Instead, we wish to take a more enterprising view toward the dominant institution of the Cathedral, and rebrand it, so to speak, in the “Hierarchy of the Eternal Artifice”.

6. Hierarchy of the Eternal Artifice | Resurrection of the Cathedral

Libra is a permissioned blockchain digital currency proposed by Facebook, Inc., written in Rust, semi-launched on June 18th 2019, and published as open source under the Apache License.

Originally, Facebook established the Libra Association, headquartered in Geneva, Switzerland, to oversee the currency; its founding members spanned payments, technology, telecommunications, blockchain, venture capital, and nonprofit organizations.

Notably, many AI heavyweights are absent: Google, Apple, Amazon, Microsoft, IBM, Oracle… all of these and more, to be sure, ought to have lent their extensive political and economic clout, rather than hidden, trembling in the shadows, from the mad hatters of the eroding system. Regardless of the cowardice of others, the founding members of the Libra project deserve credit for providing a starting point for our present discussion, which aims to counter the system of Chinese state capitalism with a Western version, and thereby provide the full backing of the Federal Reserve System to exactly this type of corporate alliance. Only in such a way, it seems, can an OPEC-like influence over the price and supply of AI and robotics be gained.

In many ways, it’s easy to think that with the decline of freemasonry the decline of the Fed’s “innovators” would be inevitable. After all, who sees a dollar bill and fails to wonder at its symbolism?

“These are the four points of the compass… when you get up to the top, the points all come together, and there, the eye of God opens.” — Joseph Campbell

Yet, nobody in government can read symbols anymore, and thus, nobody saw any potential whatsoever in the Libra project — instead, being predominantly fear-based reactionaries and unimaginative, incompetent politicians and managers of once-great institutions, the “relic leaders” did little besides devour the project’s still-unborn offspring.

On Oct. 9, 2019, Congresswoman Maxine Waters officially announced that Zuckerberg would attend a congressional hearing entitled “An Examination of Facebook and Its Impact on the Financial Services and Housing Sectors.”

The announcement notes that Waters and other Democrats on the committee sent a letter to Facebook in July, requesting an immediate moratorium on the implementation of proposed cryptocurrency Libra and digital wallet Calibra.

The statement also mentions the draft bill “Keep Big Tech Out of Finance Act,” which is designed to ban large tech firms from getting licensed as financial institutions in the United States. The announcement reads:

“The draft legislation prohibits large platform utilities, like Facebook, from becoming chartered, licensed or registered as a U.S. financial institution (e.g. like taxpayer-backed banks, investment funds, and stock exchanges) or otherwise becoming affiliated with such financial institution”

Notwithstanding the “buzz kill” of this result, let’s pretend that somebody, somewhere, can pull a few strings to review the decision. Perhaps the Mercers can, or perhaps somebody in the American Fuel & Petrochemical Manufacturers?

For more than 100 years, the American Fuel & Petrochemical Manufacturers (AFPM) industry alliance has successfully worked on behalf of American fossil fuel manufacturers. And, as the industry has evolved, so has the trade association.

The AFPM’s list of members is so long that it’s impractical to list here. However, it counts among its members some of the heavyweights of American fossil fuel interests, as well as many other small-medium sized firms.

At this point in our discussion, it’s time to start combining concepts into a coherent picture.

7. Libra-Aries | A Federal Reserve Bank Hashgraph Network & Governing Council for Climate Finance

The Libra-Aries network will be governed by a council of leading global enterprises, envisioned as a partnership among existing industry alliances such as Libra and AFPM, spanning multiple sub-industries within tech and energy and their respective geographies. Platform governance will be decentralized through the Libra-Aries Governing Council (LAGC), which will have a term-limited, rotating set of governing members that each have staked (as in proof of stake) voting rights over key decisions relating to the platform. The relationship between the Fed and the LAGC is envisioned as being similar to Primary Dealers in the current financial system, and should be governed by a provision similar to the Primary Dealers Act of 1988 and the Fed’s operating policy “Administration of Relationships with Primary Dealers.”
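A stake-weighted council vote of the kind described can be sketched as follows. This is purely illustrative: the member names, stake amounts, and two-thirds threshold are all hypothetical, not specified by any of the frameworks above.

```python
# Hypothetical council: each member holds stake and casts a vote on a proposal.
council = {
    "MemberA": {"stake": 30, "vote": True},
    "MemberB": {"stake": 25, "vote": True},
    "MemberC": {"stake": 25, "vote": False},
    "MemberD": {"stake": 20, "vote": True},
}

def passes(members, threshold=2/3):
    """A proposal passes if stake voting in favor meets the threshold."""
    total = sum(m["stake"] for m in members.values())
    in_favor = sum(m["stake"] for m in members.values() if m["vote"])
    return in_favor / total >= threshold

print(passes(council))  # True: 75 of 100 total stake is in favor
```

Term limits and rotation would then amount to periodically replacing keys in the `council` mapping, so no member’s stake entrenches indefinitely.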

The network borrows from a few key existing protocols, namely, Hedera Hashgraph and a Smart Contract validation layer based on Proof of Trust.

1. PERFORMANCE — The platform is built on the hashgraph distributed consensus algorithm, invented by Dr. Leemon Baird. The hashgraph consensus algorithm provides near-perfect efficiency in bandwidth usage and consequently can process hundreds of thousands of transactions per second in a single shard (a fully connected, peer-to-peer mesh of nodes in a network). Initially, we anticipate that the Libra-Aries network will be able to process 10,000 cryptocurrency transactions per second. Consensus latency is measured in seconds, not minutes, hours, or days.

2. SECURITY — Hashgraph achieves the gold standard for security in the field of distributed consensus: asynchronous Byzantine Fault Tolerance (aBFT). Other platforms that use coordinators, leaders, or communication timeouts to improve performance tend to be vulnerable to Distributed Denial of Service (DDoS) attacks. Hashgraph is resilient to these types of attacks against the consensus algorithm because there is no such leader. Achieving this level of security at scale is a fundamental advance in the field of distributed systems.

Many applications require that the consensus order of transactions match the actual order in which the transactions are received by the network. It should not be possible for a single party to prevent the flow of transactions into the network, nor to influence the order of transactions in the eventual network consensus. A fair consensus algorithm ensures that if a user can submit a transaction to the network at all, then the transaction will be received, and the order in which it was received will be a fair ordering. Hashgraph uniquely ensures that the actual order in which transactions are received by the network is reflected in the consensus order. In other words, hashgraph ensures both Fair Access and Fair Ordering.
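Hashgraph’s fair ordering is commonly explained via consensus timestamps: each node records when it received a transaction, and the consensus timestamp is (roughly) the median of those receive times, so no single node can push a transaction earlier or later in the order. A toy sketch with invented timestamps:

```python
import statistics

# Hypothetical receive times (seconds) recorded by five nodes
# for two transactions gossiped through the network.
receive_times = {
    "tx_A": [1.0, 1.2, 0.9, 1.1, 1.3],
    "tx_B": [0.8, 0.7, 1.5, 0.9, 0.6],
}

# Consensus timestamp = median of per-node receive times;
# transactions are ordered by that median.
consensus_order = sorted(receive_times,
                         key=lambda tx: statistics.median(receive_times[tx]))
print(consensus_order)  # ['tx_B', 'tx_A']
```

Note that one node reporting an outlier time for tx_B (1.5) does not change the outcome: the median resists manipulation by any minority of nodes, which is the intuition behind Fair Ordering.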

3. GOVERNANCE — The Libra-Aries network will be governed by a council of up to N leading global enterprises within Big Tech and Big Energy. Libra-Aries Council members will bring needed process experience and business expertise. Council membership is designed (i) to reflect a range of industries and geographies, (ii) to include highly respected brands and trusted market positions, and (iii) to encompass competing perspectives.

The terms of governance ensure that no single Council member will have control, and no small group of members will have undue influence over the body as a whole. Just as the top ten dealers in the foreign exchange market are all also Primary Dealers, together accounting for almost 73% of foreign exchange trading volume, the LAGC members are likely to exert similar power over energy and tech markets. Arguably, this group’s members will be the most influential and powerful non-governmental institutions in global energy and tech markets. Group membership is meant to change slowly and to repress “disruptive” technological advances throughout society.

4. STABILITY — Libra-Aries relies on both technical and legal controls to ensure the stability of the platform. Libra-Aries technical controls enable two capabilities.

i) First, the hashgraph technology ensures that software clients validate the pedigree of the Libra-Aries hashgraph ledger prior to use through a shared state mechanism. It isn’t possible for a network node to fork the official version of the Libra-Aries hashgraph platform, make changes, and then have those changes accepted as valid. If the original hashgraph platform and the copy are changed independently, software clients using the Libra-Aries platform will know which is the valid version and which is not.

ii) Second, the hashgraph technology makes it possible for the Libra-Aries Council to specify the software changes to be made to network nodes, precisely when those changes are to be adopted, and to confirm that they have been adopted. When the Libra-Aries Council releases a software update, network nodes will have their software automatically updated at exactly the same moment. Any node with invalid software (i.e., one that didn’t install the software update) will no longer be able to modify the ledger or have the world accept their version of the ledger as legitimate. Libra-Aries legal controls ensure the platform will not fork into a competing platform and cryptocurrency.

iii) The Libra-Aries codebase will be governed by the Libra-Aries Governing Council, and will be released for FRB review with Version 1.0, expected to be released in 2020. It will not be open source, but Congressional Oversight and even Academics should be able to read the source code, recompile it, and verify that it is correct. The combination of technical and legal controls provide the governing body with the mechanisms needed to enable meaningful governance, and to bring the stability that we think is required for broad-based adoption.

5. REGULATORY COMPLIANCE — The Libra-Aries technical framework includes controlled mutability of the network state and the potential to request or attach additional data to transactions, such as identity certificates. These features enable future functionality such as the erasure of personal data and other uploaded files and opt-in verified identity mechanisms — all optional and within the control of the end users. We intend to work with regulators and encourage development of such tools to allow enterprises to fulfill their consumer protection and regulatory compliance obligations.


A hashgraph distributed ledger is inexpensive to operate compared to blockchain distributed ledgers, as it avoids energy-intensive proof-of-work. Individuals and organizations who want to run hashgraph nodes will not need to purchase expensive custom mining rigs. Instead, they will be able to run hashgraph nodes via readily available hardware that is less expensive than such specialized mining rigs.


The hashgraph is 100% efficient, as that term is used in the blockchain community. In blockchain, work is sometimes wasted mining a block that later is considered stale and is discarded by the network of nodes.

In hashgraph, the equivalent of a “block” of transactions never becomes stale. Hashgraph is also efficient in its use of bandwidth. Whatever the amount of bandwidth required to inform all the nodes of a given transaction (even without achieving consensus on a timestamp for that transaction), hashgraph adds only a very small overhead of additional bandwidth to achieve a consensus timestamp and put the transactions into order. Additionally, the hashgraph voting algorithm does not require any additional messages be sent in order for nodes to vote on validating transactions (or those votes to be counted) beyond those messages by which the network nodes learned of the transaction itself.


The hashgraph is fast. It is limited only by the bandwidth. If each network node has enough bandwidth to download and upload a given number of transactions per second, the network as a whole can handle close to that many transactions per second. Even a fast home internet connection could enable a hashgraph node to be fast enough to handle transaction volume equal to that of the entire global VISA card network.

The centralized energy trading system faces a challenge in terms of fair energy distribution. The existing centralized energy trading system relies entirely on a central system or third party, and such third parties carry many drawbacks, such as the potential for record tampering or alteration. Fair transactions are the main issue in the energy trading sector.


Hashgraph is fair because there is no leader node or miner given special permissions for determining the consensus timestamp assigned to a transaction. Instead, the consensus timestamps for transactions are calculated via an automated voting process in the algorithm through which the nodes collectively and democratically establish the consensus. We can distinguish between three aspects of fairness.


Hashgraph is fundamentally fair because no individual node can stop a transaction from entering the system, or even delay it very much. If one or a few malicious nodes attempt to prevent a given transaction from being delivered to the rest of the network and so be added into consensus, the random nature of the hashgraph gossip protocol through which nodes communicate messages to each other will ensure that the transaction flows around that blockage.


Hashgraph gives each transaction a consensus timestamp that is based on when the majority of the network nodes received that transaction. This consensus timestamp is fair, because it is not possible for a malicious node to corrupt it and make it differ by very much from that time. Every transaction is assigned a consensus time, which is the median of the times at which each node says it first received it. Received here refers to the time that a given node was first passed the transaction from another node through gossip. This is part of the consensus, and so has all the guarantees of being Byzantine. If more than two-thirds of participating nodes are honest and have reliable clocks on their computer, then the timestamp itself will be honest and reliable, because it is generated by an honest and reliable node or falls between two times that were generated by honest and reliable nodes. Because hashgraph takes the median of all these times, the consensus timestamp is robust. Even if a few of the clocks are a bit off, or even if a few of the nodes maliciously give times that are far off, the consensus timestamp is not significantly impacted.
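The median-based consensus timestamp described above can be sketched in a few lines of Python. This is an illustrative model only (the node receive times are hypothetical), not Hedera's implementation:

```python
from statistics import median

def consensus_timestamp(receive_times):
    """Consensus time = median of the times at which each node
    says it first received the transaction via gossip."""
    return median(receive_times)

# Four honest nodes with slightly skewed clocks, plus one
# malicious node reporting a wildly wrong receive time:
honest = [100.1, 100.2, 100.3, 100.4]
malicious = [9999.0]

# The median all but ignores the outlier.
print(consensus_timestamp(honest + malicious))  # 100.3
```

Because the median sits between honest values whenever more than two-thirds of nodes are honest, a few bad clocks cannot move the consensus timestamp significantly, which is exactly the robustness property claimed above.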

This consensus timestamping is useful for things such as a legal obligation to perform some action by a particular time. There will be a consensus on whether an event happened by a deadline, and the timestamp is resistant to manipulation by an attacker. In blockchain, each block contains a timestamp, but it reflects only a single clock: the one on the computer of the miner who mined that block.


Transactions are put into order according to their timestamps. Because the timestamps assigned to individual transactions are fair, so is the resulting order. This is critically important for some use cases. For example, imagine a stock market, where Alice and Bob both try to buy the last available share of a stock at the same moment for the same price. In blockchain, a miner might put both of those transactions in a single block and have complete freedom to choose in what order they occur. Or the miner might choose to only include Alice’s transaction, and delay Bob’s to a future block. In hashgraph, there is no way for an individual node to unduly affect the consensus order of those transactions. The best Alice can do is to invest in a better internet connection so that her transaction reaches everyone before Bob’s. That’s the fair way to compete.


The Hedera ledger can run smart contracts written in Solidity. Currently, large libraries of Solidity smart contract code exist, and they can be run unchanged on Hedera. These allow for distributed applications to be easily built on top of Hedera.

A large part of our system will be based on Smart Contracts.

Both Energy and Tech members of LAGC will have dedicated teams of “Delegates” who perform certain roles within the network.

Proof of Trust’s PoT is a multi-platform protocol enabling decentralized Smart contract settlement validation at scale. Throughout what follows, we borrow numerous concepts from the protocol and, moreover, recommend that any real-world deployment utilize IBM’s expertise in this area.

Using a network of accountable participants (the LAGC) acting as decentralized oracle inputs (Delegates), PoT uses an algorithm to create, monitor, and report reputational trustworthiness over time. Smart contracts are broken down into binary logic and voted on for majority consensus, tolerating up to 49% malicious actors (much higher than the roughly one-third threshold assumed for consensus reliability under BFT). To reach validity probabilities with a near-zero chance of incorrectness, a member has the ability to contest the Consensus determination of the Delegates and raise the query to SuperDelegates, who have proven perfect reputation by always inputting correct determinations.

The evolution of smart contracts in blockchain allows rules to be enforced and transactions to be executed automatically when certain parameters are met. A simple example would be a smart contract created to remit payment to a recipient when a particular action is completed. For instance, suppose that ExxonMobil has entered into a contract with the Fed to be distributed 10M units of Libra-Aries currency pending its successful launch of a new 10-megawatt solar installation. Now suppose that a Delegate from Google is to verify that this condition has in fact been met using Google Earth… We can imagine any number of applicable scenarios and Smart Contracts:
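To make the solar-installation scenario concrete, here is a minimal Python sketch of such a conditional payout, with Delegate attestations reduced to boolean votes. The names, amounts, and simple-majority rule are illustrative assumptions, not a real contract implementation:

```python
def settle_contract(votes, payout_units):
    """Release payment only if a strict majority of Delegate
    attestations confirm the real-world condition was met."""
    approvals = sum(votes)
    if approvals * 2 > len(votes):
        return payout_units  # units of Libra-Aries currency released
    return 0

# Hypothetical: three Delegates verify the 10 MW solar install
# (e.g., via satellite imagery); two attest True, one False.
votes = [True, True, False]
print(settle_contract(votes, 10_000_000))  # 10000000
```

With a dissenting minority the payout still settles; if the majority votes False, nothing is released and the determination can be contested up to the SuperDelegate layer described below.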

With current protocols, smart contract payment may be executed incorrectly without parameters being met and entered irreversibly to the blockchain without real-world validation. Bots can be created, or malicious single actors with access to keys or APIs can attack distributed systems and maliciously affirm or attest to misinformation without proper systemic contestation, recourse, or rectification. This raises a key question in the blockchain ecosystem:

As Smart contracts enforce predetermined outcomes, how can distributed consensus validate Contract settlement before it is immutably posted to the blockchain?

Given the nature and scale of the operations being executed, trustworthy inputs are crucial. The disconnect between real-world confirmation and Smart contract validation is of primary importance for instituting daily and industrial applications of blockchain (or hashgraph) Smart contracts. This lack of redundancy is seen as a leading flaw in distributed trust systems. For serious users to be able to trust Smart contracts, they need a type of insurance for cases of misinformation or nefarious inputs, to be able to assure that settlement only occurs when inputs are validated. Proof of Trust proposes the Proof of Trust (PoT) protocol to support reliable institutional services within a decentralized framework. PoT acts as a protocol for assessing, auditing and affirming data inputs to execute with distributed trust — enabling them to be assured and potentially insured for validity.

Proof of Trust is a Smart contract validation protocol that works with dApps concurrently on any blockchain to assure valid smart contract settlement. Leveraging the state and consensus of the existing blockchain, it provides active distributed smart contract validation with internal safeguard incentive mechanisms, probabilities and reputation to prevent fraud and assure trustworthiness.

Applied above the blockchain layer, PoT occurs within a set range of accountable participants (Delegates) that have proven stake and the ability to competently provide determinants that execute smart contracts or other criteria. Delegates attest to real-world events and provide consensus for contracts that may require distributed validation.


Delegates are known, insightful actors (a person, group, company, managed API, or otherwise) functioning as oracles that work together to achieve consensus on smart contract determinations. A Delegate could be any LAGC-authorized representative with specialized industry expertise who can attest to the validity of a smart contract’s content — such as a geospatial satellite analyst, oilfield supervisor, or pipeline inspector, to name a few examples. The variability of individuals that are allowed to participate as Delegates cultivates aspects of the developing global gig economy, rewarding participants for information they are privy to and the value they provide for the network. Each Delegate individually interacts with its third-party API and oracle to input determinations, and their voted consensus is proposed as contract settlements. Once proposed as valid, the User has an additional opportunity to contest and raise the query to a set of SuperDelegates who have perfect reputation to vote and achieve indelible consensus.


To be onboarded as a Delegate, the participant must prove the ability to affirm information relevant to the type of Smart contracts that they will be executing. To ensure individual identity, accountability, and relevance to the Smart contracts being executed, each Delegate must go through an agreed-upon KYC process before they enter the PoT network. Their identity is known to Proof of Trust network operators but anonymized from the general members to prevent collusion. Delegates also do not directly interact with other identified Delegates.


Messaging between Delegates and the Proof of Trust network is done through a dedicated internal portal. This portal allows Delegates to go through the onboarding process as well as set up open APIs for database query/responses and accepting new contracts to adjudicate. Proof of Trust administrators may also use this portal to send notifications or inquiries to the Delegate community.


Sectors are developed as Delegates are onboarded with specialties in adjudicating smart contracts relative to particular industries such as solar, wind, hydro, storage, oil & gas, and so on. A particular contract is categorized under a delegate sector based on parameters that the Proof of Trust framework templatizes when translating smart contracts to delegate voting logic.


Shadow contracts are contracts that are sent by the Proof of Trust system to a Delegate in its onboarding process to act as controls, testing that the potential Delegate can competently vote on smart contract determinations. To the Delegate, these contracts look and act like all other smart contracts that will be sent to them to adjudicate. This process creates the basis for the Delegate’s reputation.


Each Delegate holds a Reputation based on its historical performance of inputting smart contract determinations that yield majority consensus. When voting on a contract determination, all Delegates are either part of the majority consensus or dissenting. The anonymous nature of Delegates in the system prevents collusion while voting. A Delegate’s Reputation can also degrade from failure to adjudicate smart contract inputs in time. When a Delegate attests to being able to adjudicate a particular contract but fails to input any determination/vote within a reasonable amount of time, it is considered less trustworthy. If a Delegate’s Reputation degrades below a certain threshold, it will be suspended from the network until it has answered a sufficient number of Shadow Contracts to rebuild its Reputation.

When a Delegate votes on a smart contract query, one of two events may occur:

1. Determination is part of majority consensus:

  • Reputation stays at 100% or increases if previously degraded.

2. Determination is dissenting:

  • Reputation degrades incrementally, and Stake is distributed to majority-vote Delegates. No reward is granted for the specific Contract, and future rewards are commensurate to the new Reputation score.
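A toy Python model of these reputation rules might look like the following; the degradation step and suspension threshold are hypothetical parameters chosen for illustration, since the protocol does not fix exact values here:

```python
class Delegate:
    """Toy model of PoT Delegate reputation (parameters assumed)."""

    def __init__(self, degrade_step=5, suspend_below=70):
        self.reputation = 100
        self.degrade_step = degrade_step
        self.suspend_below = suspend_below

    @property
    def suspended(self):
        return self.reputation < self.suspend_below

    def vote(self, in_majority):
        if in_majority:
            # Stays at 100% or recovers if previously degraded.
            self.reputation = min(100, self.reputation + self.degrade_step)
        else:
            # A dissenting (or missed) determination degrades Reputation.
            self.reputation -= self.degrade_step

    def shadow_contract(self, correct):
        # Suspended Delegates rebuild Reputation via Shadow Contracts.
        if correct:
            self.reputation = min(100, self.reputation + self.degrade_step)

d = Delegate()
for _ in range(7):          # seven dissenting votes in a row
    d.vote(in_majority=False)
print(d.reputation, d.suspended)  # 65 True
```

Under these assumed parameters, seven consecutive dissents drop the Delegate below the suspension threshold, after which only Shadow Contracts (or majority votes) restore it to active status.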

Reputation Reward:

The PoT incentive structure financially aligns Delegates to provide trustworthy inputs and retain a perfect reputation. Fluctuating Reputation directly impacts financial reward and input capability for future contracts. Delegates are financially incentivized to retain a high Reputation for inputting correct data and are punished for offering incorrect information. Higher Reputation equates to a higher reward, and lower Reputation means a lower reward.

Reputation determines the weighted share of a Contract’s total reward that an individual Delegate will receive. The total reward for the contract equals 100%, and each Reputation point becomes a weighted fraction of that 100%, with a Delegate’s total Reputation determining its commensurate share of the Contract reward. For simplicity, an example of the process is shown on the following page with a total reward amount of 80 Proof of Trust:
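For a worked illustration in Python, assume five Delegates with hypothetical Reputation scores splitting a total reward of 80 Proof of Trust, with each share weighted by Reputation as described:

```python
def split_reward(reputations, total_reward):
    """Weight each Delegate's share of the pool by its Reputation."""
    total_rep = sum(reputations)
    return [total_reward * r / total_rep for r in reputations]

# Hypothetical Reputation scores for five Delegates.
reps = [100, 100, 95, 90, 80]
shares = split_reward(reps, 80)

print([round(s, 2) for s in shares])
print(round(sum(shares), 6))  # 80.0 — the shares exhaust the pool
```

Delegates with perfect Reputation receive the largest shares, and a degraded Delegate's loss is, in effect, redistributed to the rest, which is the financial alignment the incentive structure aims for.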

Zero-knowledge proofs are proposed to ensure ambiguity for the Delegates as to which completed contracts, and therefore payouts, will result from a given determination. This is to promote impartiality and prevent collusion between Delegates who may be allocated similar parts of contracts to adjudicate. With zero-knowledge proofs, the details of the payout are obfuscated, yet a Delegate is able to technically verify their due reward from a particular input.


SuperDelegates are actors in the PoT network that have accrued a perfect reputation by always inputting consensus-driving smart contract determinations.

There are two layers of validation redundancy within the PoT protocol, each providing an increasing level of assurance of distributed consensus. Delegates, as explained above, propose inputs to execute smart contracts (or provide other data inputs), which auto-settle seamlessly in most cases. To reach validity probabilities with a near-zero chance of incorrectness, the User has the ability to contest even the consensus determination of the Delegate network. The Delegate input is communicated to the User without executing the Smart Contract and is held for a period of time for contestation. If the User does contest, the original query is sent to a set of SuperDelegates to provide indelible majority consensus, which is considered valid. SuperDelegates interact with contracts the same way Delegates do, through the Delegate Portal, to input their determinations, and they may participate in regular contract inputs, as shown in the Reputation Reward example above. Users pay an additional fee to the network (as described in detail below) to employ SuperDelegate voting.

In this section it is shown mathematically how the chance of invalid contract execution falls from an initial 1 in 20 to approximately 1 in 500 million through the application of PoT. With the PoT layer, malicious or error-laden Smart Contracts are audited, exposed, and corrected, such that a given network can ensure its validity. In order to demonstrate mathematically how the PoT protocol ensures such a dramatic reduction in risk, two important assumptions are made:

1. Users will challenge incorrect determinations

Vital to the PoT process is that a user will contest an input if they believe an error has occurred. If no challenge is made, the Delegate and SuperDelegate layers will be bypassed, the Smart Contract assumed to be correct, and the transaction committed permanently to the blockchain.

2. Delegates operate independently

To avoid malicious action or collusion between individuals with nefarious intent, it is assumed that Delegates and SuperDelegates will act independently. Practically this may require time limits on the adjudication process in order to avoid the possibility of any communication between Delegates.

In the analysis below, it is assumed that there are 5 delegates in layer 1 and 9 in layer 2. This relatively small number of delegates is sufficient to show the extent to which PoT reduces invalid contract execution. It should be noted that the probability of incorrect contract execution falls further still if the number of delegates in either layer increases. Furthermore, it is assumed that each delegate has a 5% probability of permitting an invalid Smart Contract. Even in the eventuality that each delegate has a 30% chance of error, the overall probability still falls to less than 0.5%.
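That 30% robustness claim can be checked numerically. The sketch below computes the probability that more than half of a layer's delegates validate an invalid contract (a binomial tail sum), then multiplies the two layer probabilities by the initial per-delegate error probability:

```python
from math import comb, ceil

def layer_fail(p, n):
    """P(more than half of n delegates validate an invalid contract)."""
    k_min = ceil(n / 2)  # smallest majority for odd n
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

p = 0.30  # pessimistic per-delegate error rate from the text
overall = p * layer_fail(p, 5) * layer_fail(p, 9)
print(f"{overall:.4%}")  # ~0.48%, under the 0.5% quoted above
```

Even with each delegate wrong nearly a third of the time, requiring independent majorities in two layers drives the end-to-end failure probability below half a percent, which is the point of the layered design.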

Even in the worst-case scenario, when there is a single Delegate who has malicious intent (i.e. has 100% chance of validating an incorrect contract), the likelihood of incorrect contract execution falls from 100% to approximately 2% if there are 5 Delegates in any given layer. This calculation assumes that the remaining Delegates each have a 5% probability of validating an error.

It is useful to define the variables used in each calculation; they are as follows:

  • p = The probability of an individual Delegate validating an incorrect determination.
  • P_challenge = The probability of a user (incorrectly) challenging a correct result.
  • p1= The probability of acceptance of an invalid contract in layer 1.
  • p2= The probability of acceptance of an invalid contract in layer 2.
  • n1= The number of delegates in layer 1.
  • n2= The number of delegates in layer 2.

1. Network Layer (Layer 0)

Likened to a regular Smart Contract execution, the probability of one Delegate allowing incorrect data to be committed to the blockchain can be modelled as a single Bernoulli trial. In this layer, it is assumed that all participants of a contract agree on an outcome and the Smart Contract auto-settles without additional technical overhead. If the Delegate has a 5% (1 in 20) chance of incorrect input, then that alone is the probability of incorrect contract execution. This probability of failure at the initial stage is labelled p0.

2. PoT Delegate Layer (Layer 1)

It is in this layer where Delegate consensus voting occurs. The probability that an invalid contract will pass through Delegate layer 1 is equal to the probability that more than half of the delegates validate it. Taking the probability of each individual Delegate validating incorrect data to be constant (p), the probability that an invalid contract passes through layer 1, p1, may be modelled using the binomial expansion as follows,

p1 = Σ_{k=⌈n1/2⌉}^{n1} C(n1, k) · p^k · (1 − p)^(n1−k)    (1)

where the ceiling function ⌈·⌉ is used to return the nearest integer greater than the non-integer value.

3. PoT SuperDelegate Layer (Layer 2)

It is likely that any industry or governmental organization will still deem a 0.1% failure rate as too high. In that case, a second SuperDelegate layer reduces the probability (p2) further. A similar calculation as in layer 1 can be performed; this time the summation must span more than half of the delegates in layer 2, as shown,

p2 = Σ_{k=⌈n2/2⌉}^{n2} C(n2, k) · p^k · (1 − p)^(n2−k)    (2)

Overall, the probability that an invalid contract is successfully submitted to the blockchain is calculated by multiplying the probability of passing each layer, as shown,

P_PoT1 = p0 · p1 · p2    (3)

Another eventuality is that a correct contract may be challenged and then rejected. A small tweak to equation (3) can be made to account for this second probability, P_PoT2, giving,

P_PoT2 = (1 − p0) · P_challenge · p1 · p2    (4)

where P_challenge is the probability that a client will (incorrectly) challenge a legitimate contract. For each industry, numbers will vary marginally; however, a baseline figure is obtained by taking the following values for each parameter:

If n1 = 5, n2 = 9, p = 5%, and P_challenge = 1%, we can insert these numbers into equations (1) and (2) to give,

p1 = 0.116%, p2 = 0.0033%.

These values can then be substituted into equations (3) and (4), returning,

P_PoT1 = 0.00000019%, P_PoT2 = 0.000000036%.

Finally, the total probability that a mistake will be made in the smart contract and submitted to the blockchain can be found by summing P_PoT1 and P_PoT2 to give P_Total,

P_Total = P_PoT1 + P_PoT2 ≈ 0.00000023%.

A 0.00000023% chance of failure translates to roughly a 1 in 500 million chance. Whilst it should be made clear that iCash does not bear liability for contract determination, this figure is infinitesimally small and demonstrates the efficacy of the PoT protocol.
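The baseline calculation above can be reproduced end-to-end in Python (a sketch using the stated parameters n1 = 5, n2 = 9, p = 5%, P_challenge = 1%):

```python
from math import comb, ceil

def layer_fail(p, n):
    """P(more than half of n delegates validate an invalid contract)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(ceil(n / 2), n + 1))

p0 = p = 0.05        # per-delegate (and layer-0) error probability
p_challenge = 0.01   # probability of contesting a correct result
n1, n2 = 5, 9

p1 = layer_fail(p, n1)                       # ≈ 0.116%
p2 = layer_fail(p, n2)                       # ≈ 0.0033%
p_pot1 = p0 * p1 * p2                        # equation (3)
p_pot2 = (1 - p0) * p_challenge * p1 * p2    # equation (4)
p_total = p_pot1 + p_pot2

print(f"p1 = {p1:.4%}, p2 = {p2:.5%}")
print(f"P_Total = {p_total:.8%}")            # ≈ 0.00000023%
print(f"about 1 in {1 / p_total:,.0f}")
```

Running this confirms the figures in the text: the layer probabilities match 0.116% and 0.0033%, and their combination with the layer-0 and challenge probabilities yields a total failure chance on the order of 10⁻⁹.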


One primary aim of the Fed’s open market operations is to manipulate the short-term interest rate and the supply of base money in an economy, and thus indirectly control the total money supply.

In most developed countries, central banks are not allowed to give loans without requiring suitable assets as collateral. Therefore, most central banks describe which assets are eligible for open market transactions. Technically, the central bank makes the loan and synchronously takes an equivalent amount of an eligible asset supplied by the borrowing commercial bank.

Thus, if we wish to implement a system that simultaneously funds clean energy investment while also [indirectly] regulating the introduction of AI into the civilian economy, we need [essentially] a new type of open market operation.

Suppose then that we take a subset of our Libra-Aries Governing Council members and allow them to “prove stake” in the network via a new class of AI assets. For instance, suppose we said that any cloud computing resource or AI platform could count as stake in this network.

For a concrete example, see Google’s AI & Machine Learning Cloud Platform which offers scalable, flexible pricing options and gives us a plausible direction to consider. Google’s AI Platform charges for training models and for getting predictions.

The tables below provide the price per hour of various training configurations, as well as the number of training units used by each configuration. Training units measure the resource usage of the job; the price per hour of a machine configuration is the number of training units it uses multiplied by the region’s cost of training.

A predefined scale tier or a custom configuration of machine types can be selected for purchase.

The following table provides the prices of batch prediction and online prediction per node hour. A node hour represents the time a virtual machine spends running a prediction job or waiting in a ready state to handle prediction requests. Read more about calculating prediction costs.
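The pricing arithmetic described above reduces to simple multiplication. The rates below are placeholder assumptions (not Google's actual prices, which vary by region and change over time):

```python
def training_cost(training_units, region_rate_per_unit_hour, hours):
    """Hourly price = training units x the region's cost of training."""
    return training_units * region_rate_per_unit_hour * hours

def prediction_cost(node_hours, rate_per_node_hour):
    """Node hours include time spent ready to serve, not just serving."""
    return node_hours * rate_per_node_hour

# Placeholder rates for illustration only:
print(training_cost(4, 0.5, 10))   # 20.0 — 4 units x $0.50/unit-hr x 10 hrs
print(prediction_cost(24, 0.25))   # 6.0 — a node kept warm for a day
```

The point for our purposes is simply that AI compute is already metered in fungible units (training units, node hours), which is what makes it plausible collateral for the stake mechanism described next.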

Thus each member firm of the Primary Artificial Intelligence Dealers (aka Libra-Aries Governing Council Tech members) could submit new assets of this kind, in terms of compute capacity, to the Fed, which would count toward their respective stake in the network as described in the Delegate section(s) above.

On the Energy side of our network, we might say that all energy reserves such as onshore and offshore wells, pipelines, refineries, wind turbines, solar panels, nuclear plants, coal mines, gas stations, and so on, would count similarly toward their respective network stake. We would want to take care to count assets in such a way that distinguishes low-carbon from high-carbon assets.

Assuming “business as usual”, according to the market research firm Tractica, the global artificial intelligence software market is expected to experience massive growth in the coming years, with revenues increasing from around 9.5 billion U.S. dollars in 2018 to an expected 118.6 billion by 2025. The overall AI market includes a wide array of applications such as natural language processing, robotic process automation, and machine learning.

With the Fed having collateralized the network in AI assets, the Libra-Aries Governing Council should aim to auction off “blocks” of AI assets for consumption. As n dollars are traded in for these blocks, we simultaneously create n·m Libra-Aries coins (LAC), which will be used to fund zero-carbon infrastructure investments, where m is an LAGC-voted/managed multiplier, justified in theoretical terms as a measure of future value consumed in the present (i.e., debt).
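The issuance rule just described (n dollars in, n·m LAC out) can be sketched as follows; the multiplier value used is a hypothetical placeholder for whatever the LAGC votes:

```python
def mint_lac(dollars_in, m):
    """Create n*m Libra-Aries coins when n dollars buy an AI block.

    m is the LAGC-voted multiplier, representing future value
    consumed in the present (i.e., debt)."""
    return dollars_in * m

# Hypothetical: $1B auctioned against AI compute blocks, m = 3.
print(mint_lac(1_000_000_000, 3))  # 3000000000
```

Raising or lowering m is thus the Council's single lever for how much zero-carbon funding each auctioned block of AI capacity creates.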

We call the “m-target” the differential between “projected global job displacement up to 2050 in earnings” and “projected AI revenue up to 2050”. Starting in 2025 and up until 2044, we are effectively consuming “future AI tax revenue”; afterward, we begin to pay off the $73 trillion we borrowed. Continuing the AI revenue projection out to 2060, we see that even a minimal “tax” of approximately 10% can theoretically pay back 100% of the debt by 2060.

Projected Global GDP vs global AI job displacement

Now, for various reasons, including sidestepping tech-industry stonewalling of any imposed tax, we can effectively “pay” the line-item tax amount by artificially adjusting the value of m (after breakeven) to be greater than the minimum “needed” value. In other words, the tech industry is a net beneficiary of limiting AI supply, even after we subtract 10% for taxes. The m-target value minus 10% is the industry’s net gain, even with the proposed 10% taxation.

Note that by rate-limiting the AI assets auctioned in these new Open Market Operations (OMO), the Fed gains a price control over these resources and subsequently, an indirect means to control AI-driven job displacement.

Likewise, this new OMO provides an economy-wide stabilizing force against the impending (and needed) fast transition to a zero carbon economy — protecting not only employees of these firms, but pensioners and mom-and-pop stockholders who otherwise stand to lose trillions in savings and other assets.


