Monthly Features: What's New? | Guest Speaker | Real-Time Chronicles
A new guest appears frequently, and the prior wisdom is
archived in the Library.
|Featured Guest Speaker
Posted: April 14, 2005
William L. Livingston, PE
Compliance Practitioner, aka Bete Noire Engineering
Livingston's very cogent comments are relevant to the crafting of any policy, procedure, or regulation that expects to be useful in the uncertain future it intends to affect - whether done by a regulatory body or an organization wishing to influence internal behavior. Here is a glimpse at agile procedure crafting. Part 1 and Part 2 were published in prior months.
William Livingston is author of Friends in High Places, New Plague, and Have Fun At Work; with a book in process tentatively called Amicus Rex, about the engineering/law relationship. He is a Professional Engineer with more than 100 patents to his name, and has extensive background in the Energy and Utility sector.
From: William Livingston
Sent: March 21, 2005
Mr. Chairman and Honorable Commissioners:
You invited comments on proposed rulemaking for SROs, Release No. 34-50699; File No. S7-39-04; RIN 3235-AJ33. Your proposal [www.sec.gov/rules/proposed/34-50699.htm] covers multiple issues regarding SRO operations, including governance, ownership and reporting. This comprehensive regulatory initiative is attended by 166 questions regarding rule design specifics. The commentary you have received from SRO veterans addresses those issues.
The purpose of this commentary, supplementing my two previous submissions, is to clarify the technological platform of “transparency.” There is strong tension in your rule proposal between subjective attributes of SRO governance, such as independence, and the frequent calls for objectivity and transparency. The great majority of the proposed rules concern subjective, personality elements of governance.
Likewise, the commentary posted responds primarily to the subjective elements of your proposal. When you regulate by a system based on faith in particular people doing the “right” things, subjectivity and personal characteristics are what you wallow in. The strong themes of objectivity and transparency as your regulatory objectives, sprinkled throughout your proposal, are, of course, incompatible with subjective attributes. The very definition of transparency excludes all subjective elements.
The issue you habitually face is that the mixtures of subjectivity and objectivity you submit are immediately interpreted in completely subjective terms by the regulated – as exemplified here in the SRO proposal and the commentary received. The SEC’s own literature shows little faith that rules for subjective attributes will deliver the Commission’s stated purpose – preventing foreseeable stakeholder damage.
Subjectivity reigns supreme in every institution. Invoking a rich system of meaning and interpretation, humans explain themselves to themselves in terms of what they believe and wish. It is a world suspended in space by the divine chatter of human voices expressing judgments. Knowing what judgments do tells you what judgments are. Paths through the communication traffic are always circular, never dissective. Circulating power deals with meaning until the energy supply runs out. We live amidst the infernal clutter of opinions and feelings.
Brute facts rationalized by formal logic, i.e., objectivity, are a proper and practical characteristic of SRO governance, and when transparency is attained, the entire litany of subjective elements within its sphere instantly evaporates. Transparency is the elimination of meaning. Remember that almost transparent, in this business, is opaque.
Transparency is the algorithm, computer executable, used by the institution to seek its desired future. The process of producing transparency is transparent. Objectivity does not derive from quantities of subjectivity. Objectivity comes from imposing structure onto disorder. Only objectivity, executed by masterless men, can produce transparency.
The loud SEC call for transparency, erroneously attended with adjectives and adverbs, is the call for the displacement of subjectivity with objectivity, even if you don’t mean it. Transparency is the regulatory panacea - representing and conveying no meaning. It is an unsigned work providing equal opportunity to manage forward. Transparency may be attained in several ways but the method used to obtain transparency is not detectable in the product. Transparency is a competency, subjectivity is obedience. As the dynamic model for dealing with the future, transparency solves the stop rule problem. No further explanation is possible.
In transparency, things move dissectively down the inferential staircase towards fundamental mathematical physics, their basic properties and laws. The domain thus revealed is meaningless, its fundamental laws controlling a vast but inaccessible arena. Down there in the engineer’s sanctuary of natural law, no human voices may be heard.
The calculus serves the imperial vision of mathematical physics. The world that emerges from this description is indifferent to human desires. The great era of mathematical physics, the effort to represent the world in mathematical terms, is now over. The world the algorithm makes possible is retrograde in its nature to the engineering world of natural law. Its elements are symbols and not quarks. Algorithms are human artifacts.
As transparency is the absence of meaning, it is abundantly clear that the only possible source of indifferent objective building blocks is natural law. There is no meaning to natural law, and it cannot be corrupted. Just as meaning is alien to mathematical physics, reality has the capacity to resist false interpretations. The cause-effect process of thinking in natural law eliminates personality. Genuine transparency is a concept, not a thing, which begins only when all subjectivity ends. In the end, transparency is justified by itself as the elimination of meaning or it is not justified at all.
Transparency is free passage of information about the workings of the regulated institution going forward, to a regulator. It is clear procedures for decision making that are scrutable and subject to viable methods of challenge. It is open, clear, measurable and verifiable. Transparency is a task in robotics. Knowledge development is used to form an effective algorithm using control theory that can be handled by available information processing power.
The natural law origin for transparency
The fundamental natural laws, radically simple, describe the primordial soup from which institutional complexity ultimately arises. Theories of change may always be expressed in terms of the conservation laws. Where there is conservation, there is symmetry and coherency. The fundamental laws of physics achieve their effect by aligning with symmetry. Natural law exhibits a shapeliness of mathematical form and a compactness of expression that itself cannot be improved in favor of anything shapelier or more compact. Natural laws capture the world’s patterns by capturing the play of its symmetries.
At the conceptual basement, the engineer’s bedrock sanctuary from which he need not further retreat, no further explanation is possible. No one has to account for their complexity. Natural laws are incompressible and so short they may be handled by computer. At the same time they are material; however, they must be complete, explaining everything that is complex. That is why they are useful.
Transparency is algorithm
There is a world of difference between the character of natural laws and the computations required to breathe life into them. The laws are infinite and continuous, tying description ultimately to the calculus. Algorithmic computations are finite and discrete – providing a series of stylized snapshots.
The algorithm is the quintessential device engineers use for the mastery of time. An algorithm is a finite procedure written in a fixed symbolic vocabulary, governed by precise instructions, moving in discrete steps, whose execution requires no insight, cleverness, intuition, intelligence or perspicuity, and that sooner or later comes to an end.
An algorithm is an effective procedure to get something done in a finite number of discrete steps. It is a prescription for action. An algorithm creates a world. This world is entirely accessible, its laws perfectly lucid, and the processes that they describe entirely subservient to the laws that describe them.
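The definition above can be made concrete with the textbook specimen of an algorithm. Euclid's procedure for the greatest common divisor is not drawn from the commentary; it is offered here only as a minimal illustration of a finite procedure whose execution needs no insight and is guaranteed to end.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite procedure in a fixed symbolic vocabulary.

    Each step is purely mechanical; because the remainder strictly
    decreases, the procedure sooner or later comes to an end.
    """
    while b != 0:
        a, b = b, a % b  # one discrete step: replace (a, b) with (b, a mod b)
    return a
```

Executing it requires no cleverness at all; the intelligence, such as it is, resides entirely in the rule of construction.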
An algorithm is an abstract instrument of coordination, supplying procedural means to various ends – residing in a world beyond time. Almost any institution can execute an algorithm and so come to command some form of intelligence. What is an institution but a social organization that has undertaken the execution of complicated algorithms?
An algorithm is an alien entity in this world, discrete, finite, moving crabwise through its appointed steps and forever bearing the mark of its human creator. An algorithm is an ambidextrous artifact, residing at the heart of both human intelligence and intelligence amplification. Algorithms are instruments that convey information.
The triple concepts of algorithm, information and symbol lie at the prime mover of reality. In executing algorithms, time loses its pliant seamlessness and moves forward in finite, jerky, integral steps. Engineers devise systems of symbols gaining their point from having their point made by the glue of natural law.
To the engineering practitioner, complexity is subject to a principle of conservation, much like angular momentum. Complexity has to be either in the immediate material configuration or in the laws governing its formation and interaction. In this way, complexity is connected to a form of intelligence - intrinsically connected to natural law.
Investigating a plausible future
The only way time is going to be the engineer’s faithful friend is when he has made a previous inspection trip. Natural law just sits in the shedding sun, mute, inglorious and unrevealing. Reality appears only at the conclusion of the process of engineering. And what a process it is.
The explanation of the right now on the basis of natural law, the fundamental laws of physics and chance, involves a congeries of computational schemes, algorithms, specialized programming languages, techniques for numerical integration, huge canned programs for modular modeling, computer graphics, interpolation methods, computer-theoretic shortcuts and strenuous efforts to convert the data of dynamic simulation into coherent patterns. Validation looks to the program it instantiates, the design that it implements, and the intelligence that it represents.
To investigate the future to build transparency, the engineer takes existing reality of right now as the starting point. The present is always coherent and complete and, accordingly, can be described in mathematical physics. Then, using the transfer function of reality operating on the present instant, time is moved forward in discrete steps respecting the laws of conservation. Recursion then takes the now for its ride into the future on the back of this algorithm according to the script of imposed disturbances. This exercise investigates a single future path possibility. Investigations are then done, always starting from the right now, for a different selection of imposed disturbances. This cycle is continued until the tracks through the future of interest have reached the designer’s specified stop rule. This is the engine of pragmatic foresight to the stringent standards of transparency. From the natural law platform, transparency is constructed by algorithm.
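The forward march described above can be sketched in a few lines. The names below (`transfer`, `disturbances`, the toy decay function) are illustrative assumptions, not anything from the commentary: a state of "right now" is advanced in discrete steps by a transfer function under a script of imposed disturbances, until the designer's stop rule is reached, and the exercise is repeated for each disturbance selection.

```python
def simulate_path(x0, transfer, disturbances, stop_steps):
    """Advance the 'right now' state forward in discrete steps.

    transfer: function (state, disturbance) -> next state, standing in
    for the transfer function of reality; stop_steps is the designer's
    stop rule. All names are hypothetical illustrations.
    """
    path = [x0]
    x = x0
    for t in range(stop_steps):
        x = transfer(x, disturbances[t % len(disturbances)])
        path.append(x)
    return path

# Each selection of imposed disturbances investigates one future track.
decay = lambda x, d: 0.9 * x + d  # a toy transfer function
tracks = [simulate_path(100.0, decay, ds, 5)
          for ds in ([0.0], [5.0], [-5.0])]
```

Each element of `tracks` is one thread through a plausible future; the ensemble of threads, not any single one, is what transparency puts on display.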
The engineer always faces this dilemma. Without the triple concepts (algorithms, symbols, information), natural law knowledge points nowhere; but with the triple concepts in place, the laws are no longer simple. Intelligence is necessary to construct the inferential staircase, scrutable connectivity. Complexity is transferred from theories to facts and back again to theories. Ultimately, things are as they are for no better reason than they are what they are.
Equations that cannot be analytically solved may nonetheless be simulated. This is done by invoking a case-specific algorithm by purely numerical, purely discrete, purely finite techniques of calculation. The difference between an analytic and an algorithmic solution to an ordinary differential equation is sharp and it is inescapable. An analytic solution returns a differential equation to a continuous world; an algorithmic solution to a world that is discrete. Analytic solutions must be discovered; an algorithmic solution, executed.
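The analytic/algorithmic distinction can be seen in miniature. The equation dy/dt = y has the discovered analytic solution y(t) = e^t; the Euler scheme below, a standard numerical technique offered here only as an illustration, is merely executed, marching through purely finite, purely discrete steps.

```python
import math

def euler(f, y0, t_end, n):
    """Algorithmic solution of dy/dt = f(t, y): finite, discrete steps."""
    h = t_end / n
    y, t = y0, 0.0
    for _ in range(n):
        y += h * f(t, y)  # one stylized snapshot to the next
        t += h
    return y

# dy/dt = y, y(0) = 1: the analytic solution is e^t, continuous and
# discovered; the algorithmic solution below is simply executed.
approx = euler(lambda t, y: y, 1.0, 1.0, 10000)
exact = math.e
```

The discrete answer approaches the continuous one only in the limit of ever-finer steps; the two worlds never quite coincide.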
Recursion is an expression in mathematical terms of the very essence of the algorithm - a finite rule of construction. Functions take objects to other objects, forming associations and binding elements of one set to elements of another. The primitive recursive functions are precisely those arithmetical functions that may be derived from recursion’s core by a finite number of specified mechanical operations. Recursion allows you to reach stunning goals by means of simple steps repeated over and over again. No thought is involved in execution. Recursion takes an infinitely large complexity and handles it in a rule of construction. It is the engine of intelligence amplification.
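As a small illustration of recursion's reach (again an example supplied here, not drawn from the commentary), addition itself can be built by primitive recursion from nothing more than the successor step:

```python
def add(m: int, n: int) -> int:
    """Addition by primitive recursion: add(m, 0) = m,
    add(m, n + 1) = add(m, n) + 1.

    A simple rule repeated over and over; no thought is involved
    in the execution.
    """
    if n == 0:
        return m
    return add(m, n - 1) + 1
```

From this core, multiplication, exponentiation and beyond follow by the same mechanical pattern – intelligence amplification by a finite rule of construction.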
Building institutional transparency
The institution faces challenges that are far from well understood. They are too complex and too contingent to make definitive predictions. In the presence of deep uncertainty, the machinery of prediction and decision making seizes up. As you know, efforts are allocated to the clear parts and the rest is ignored.
You can follow an example of the transparency process, building threads through a plausible future on a grand scale, in the April 2005 issue of Scientific American – “Shaping the Future” by Popper, Lempert and Bankes (RAND Corp.).
The engineer has rigorous, systematic methods for dealing with deep uncertainty. He does not seek precise prediction, but frames strategies that are competent over a wide range of plausible futures. Working on the design basis scenarios highlights the exposures so that ways to control the excursions can be developed. The engineer explores a set of what-if scenarios specified by the institution.
A public example of design basis scenario selection is the 1995 convening of the Global Scenario Group by the Stockholm Environment Institute. The most hazardous plausible scenarios form the platform of design.
Engineers look for robust control strategies. The goal is to perform well when compared with alternatives over a wide range of plausible futures. It is not optimal, but it yields acceptable outcomes in both the likely and the unexpected futures. The designer stress-tests candidate solutions, finding the conditions that defeat the controls. The issue to resolve is what actions today will best shape the future to the specified criteria.
The desktop machine is used to generate multiple diverse paths into the future, spanning the diversity of those scenarios which might occur consistent with natural law. “Exploratory modeling” software is now available to test control strategies. The engineer invents a control strategy and for each scenario in the design basis ensemble the computer generates the outcome. The result is control means that can play out adaptively no matter whose subjective view of the future proves correct.
Engineering cannot change the future’s fundamental unpredictability. The quest is to find the internal control strategy to install today that can best usher in a desirable future. Engineers search for plausible futures in which the control strategy could fail and then devise means to avoid those potential adverse outcomes.
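The robust-strategy search described above reduces to a simple selection rule. The sketch below is a toy illustration under assumed names (`most_robust`, the scenario labels and payoffs are all hypothetical, not RAND's exploratory-modeling software): each candidate control strategy is played out against the whole design basis ensemble, and the one with the best worst-case outcome is kept.

```python
def most_robust(strategies, scenarios, outcome):
    """Pick the strategy with the best worst-case outcome across the
    design-basis ensemble: robust rather than optimal.
    """
    return max(strategies,
               key=lambda s: min(outcome(s, sc) for sc in scenarios))

# Toy ensemble: the hedged strategy has the best worst case, even
# though it wins in no single scenario.
scenarios = ["boom", "bust"]
payoff = {("aggressive", "boom"): 10, ("aggressive", "bust"): -8,
          ("cautious", "boom"): 2,   ("cautious", "bust"): 1,
          ("hedged", "boom"): 6,     ("hedged", "bust"): 2}
best = most_robust(["aggressive", "cautious", "hedged"], scenarios,
                   lambda s, sc: payoff[(s, sc)])
```

Here the aggressive strategy is optimal in the likely future but catastrophic in the unexpected one; the robust choice yields acceptable outcomes in both.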
A splendid and complete example of operational transparency is provided by Burt Rutan and his prize-winning SpaceShipOne project to ferry people to space and back to earth. Burt’s innovation was expressed in a dynamic model and flown into space using flight simulation for everyone to see. The test pilots were part of the scenario investigations from the start. This achievement via transparency is available for independent validation to this day. Burt’s operation welcomes auditors.
Note carefully that when the artifacts delivered by transparency to natural law completed the mission as intended, all subjectivity evaporated. While Burt got some restrained credit, he completely avoided social punishment. There will be more design innovations from the Rutan operation and each will follow the same script. There will be media attention for the maiden flight, but when the objective is attained, none will hang around to learn how transparency pulled it off. NASA audited the flight, but ignored the process that made it possible.
When transparency is attained, as defined, obligations reverse. Transparency is ultimate SEC compliance. There can be nothing higher. It allows anyone to see plausible futures of particular interest. Tracking of actual compared to predicted performance is continuous. The “flight simulator” is made available to everyone.
With opacity, the institution retains responsibility for outcomes regarding stakeholders’ interests. Opaque systems run on faith. To be allowed to operate opaquely, the institution is entrusted with safeguarding those diverse interests. The stakeholders eventually see outcomes and react accordingly. Before there is corrective action, there is damage.
With transparency, outcome responsibility abruptly shifts to the stakeholders. As opaque institutions, they have been given the means to protect their own interests before damage is done. Since the stakeholders are opaque, the regulated institution can hardly be expected to know what might be considered damage by a stakeholder. In a reversal, the defendant can show how the plaintiff could have taken the flight simulator and foreseen the outcome he is now complaining about.
What is seen via transparency may be interpreted and given meaning in various ways but there can be no debate over what is being seen and that it contains no meaning itself. The more subjectivity can be displaced with objectivity, the less charisma matters. Transparency, like all algorithms, exhibits no personality at all.
Getting rid of personality in the product requires all sorts of idiosyncratic talents to complete the intermediate steps. Compared to business as usual, the mission to transparency involves individual freedom of thought beyond the chain of command at one end, and reaches the goal line with no need for management buy-in at the other. Transparency needs no judgment or explanation.
We now know the market cares little about materially defective internal control and corporate accounting. No penalty for bad control amounts to a penalty for good accounting. Transparency eliminates the whole regulatory mess exposed here by making stakeholders accountable for their own futures. No yelling after damage is done, when the means to avoid the damage was provided. The apathy of the market for integrity of financial reporting is a major obstacle for the SEC in designing regulation. What is the point of more rules to engineer an outcome deemed insignificant by your clients? The only way around this thwart is to go for the transparency reversal of responsibilities.
The engineering speak of this commentary about transparency as algorithm may not be familiar lingo within the SEC, but the principles recited here are very easy to validate by sources you trust. Independent validation, audit and examination are highly encouraged. Transparency is the process of engineering applied to the task of system control. You will find a direct government parallel in the folly of mixing subjectivity with objectivity with the NIST Malcolm Baldrige National Quality Award. To a professional engineer, its 2005 “Criteria for Performance Excellence” booklet reads exactly like your SRO rule proposal. If we had to deal with system control as the SEC is dealing with SRO governance and NIST deals with Quality, there would be no cell phone.
Engineering competency advances always mean change to how business gets done. Better means for getting work done has a way of displacing lesser means, no matter how hard the institution resists. New means requires new work habits and the old ones fade away by disuse. Improving technology and methodology always bring the displacement of subjective means with objective means.
No doubt the standard of care for pragmatic foreseeability, i.e., transparency, enforced through tort, escalates in step with advances in competency. Sooner or later, stakeholders will wonder why the damage raining down on them from institutions protected by the business judgment rule continues. Why is the same pragmatic foresight, transparency, required of the professional engineer not appropriate to managing institutional affairs so as to avoid pernicious consequences to the stakeholders?
The SEC is commended for providing this convenient means for submitting comments.
William L. Livingston, PE
|Would you like to offer some advice or add to the dialog?
Sending a comment automatically grants us permission to edit and post at our discretion. Comments will be forwarded to the author.