Friday, November 18, 2016

Science and Sense-making and Hype and Promises

I was reading the slides from Dave Snowden's (@snowded) talk at KMWorld this week.  Unfortunately I could not make the show and his keynote.

In his talk two slides stand out for me. The first is titled “The nature of the system constrains how we can act in it.”  In this slide, Dave describes an ontologically based set of principles for working with Ordered, Chaotic and Complex systems.  With regard to the Complex domain, one of the principles is “Real time feedback for control via modulators.”

I think we need to carefully consider the wisdom Dave shares.  Modulators are not algorithms. Modulators are people who have the experience, gained by praxis, to disintermediate the contextual data.  From an anthropocentric standpoint, algorithms cannot sensibly switch the variable links.  Here sensible means sense-making, in the manner that Dave describes in the slide titled “How do we avoid the hype and the false promises.”



I’ve annotated the quadrants with my understanding of philosophy, which is nowhere near the caliber of Dave’s.  The annotation in the lower right (Prediction & high risk Scaling) is drawn from contemporary events. Readers might recognize it more readily by the old adage “History is written by the victors” (Walter Benjamin).  

I also think there is a spiritual dimension in sense-making that needs to be included, for it is the spiritual connections between us that most strongly influence the promises we give and receive.  However, I am not at all qualified to describe where the spiritual lies in this framework.

Sunday, July 10, 2016

Thinking about Governance and Managing Constraints

I follow Dave Snowden's writings on a regular basis.  The following tweet by Dave grabbed my attention, as it is in line with my evolving ideas on complexity in bureaucratic systems.
“We need to stop talking about governance and start talking about constraint management”
I was about to reply with a quick question on governance, but didn't. In pausing I gave the matter some further consideration. In the shower I asked myself, what other practices do we need to consider?   The answer I found is posted here.



Basically what we have is a 2×2 matrix of Managed Constraints, Algorithms, Instinct and Governance. These correspond with the Cynefin domains' ideas of practice:

  • Managing Constraints is emergent, and thus locked to the Complex domain. 
  • Algorithms are Complicated, but note that there is a penchant for local optimization. As a result, we miss the black swans. 
  • Governance is overly constrained and thus fixed in the Obvious domain. 
  • Instinct is inherently Chaotic as the combination of individuals and circumstances is path dependent (see slide 4).
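As a minimal sketch, the bullet mapping above can be written down as a simple lookup table. The domain and practice names come straight from the list; encoding them this way is my own illustration, not part of the Cynefin framework itself.

```python
# Sketch only: the four practices from the list above, keyed by their
# Cynefin domain. The names are taken directly from the bullets; the
# table form is an illustration, not an established model or API.
CYNEFIN_PRACTICE = {
    "Complex":     "Managing Constraints",  # emergent practice
    "Complicated": "Algorithms",            # prone to local optimization
    "Obvious":     "Governance",            # over-constrained, fixed
    "Chaotic":     "Instinct",              # path-dependent
}

print(CYNEFIN_PRACTICE["Complex"])  # Managing Constraints
```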
Further consideration of this matrix reveals the characteristics of the system (slide 6):

  • In the complex domain we design rituals to modulate decision processes.  See the Wikipedia entry to gain a sense of  how rituals are enacted into practice.
  • Treating governance as programmable, as suggested by Daniel Kahneman's 'Strategy firms can think smarter', is just the latest idea in a long string of algorithmic management approaches dating back to Taylorism, McGregor's Theory X, and others. 
  • In the obvious domain we design rules to modulate behaviors. It is easy, nay obvious, to prohibit bad behavior, and most rules are written accordingly.  In government every incident of fraud, waste and abuse seems to result in another rule, the consequence of which is bureaucracy by straitjacket: an employee's hands are tied behind their back and the public is left wondering why service is so poor.
  • In both the Algorithmic and Governance quadrants the activities we choose - local optimization and a penchant for stability - serve to hide key aspects of social systems. Intermediation is amplified in these domains.
  • Engagement with the subject in both Managing Constraints and Instinct are by definition means of disintermediation. They differ in their approach. Dave recommends probes and safe-to-fail practices along with other ideas; please read his blog.
  • Instinct is far from perfect; witness the many types of cognitive bias that have been cataloged. And after all, the future is unknown. There is a particular class of these unknowns that is important here in the Chaotic domain: the subset of unknown-unknowns we recognize as unintended consequences. The result of these consequences is damaged relationships and a loss of potential relational energy.
No decision is consequence free. However, how we approach decisions matters. There is a significant difference between the probing methods of Dave's safe-to-fail and the brute force (fail-safe?) or totally naive practices that people engage in.


Sunday, February 1, 2015

A Decision Lens for Complexity

This slide deck is a study of management and managing which is then set up in i-space, Max Boisot's classic reference for knowledge management. The practice of managing is drawn from Art Kleiner's "Core Group" theory. I demonstrate how knowledge work (the social learning cycle) is tangential to managing, and present a way that this can be studied using Cognitive Edge methods and tools.


A decision lens for complexity v10 from tony1234

A small repair

For some time my blog has been dysfunctional. All of the links were dead, the embedded slideshares were likewise merely static images. I have tried on a couple of occasions to adjust the template, without success. I've sent a couple of messages to the admins; with no feedback I have been at a loss for what to do. Today I tried again, changing to a new template, and suddenly my blog is working. Success, relief, and it is a sign to pick up the writing pace. For some reason, the fourth time was a charm.

Tuesday, December 9, 2014

A study of the Cynefin Framework


I had hoped to make the KM World & Taxonomy World meetings this year but couldn't manage it due to pressing tasks at work. I did meet up with a couple of friends and managed to meet one or two new people. In the course of conversations over what we have each been up to, I mentioned this study of Cynefin that I have been working on. This is a deep dive into the transition spaces between domains which leads to some interesting conclusions.




Direct link is www.slideshare.net/tony1234/exploring-cynefin-transitions-v14 (as the slideshare frame is not functioning in some browsers)

Sunday, November 30, 2014

A Look at Software Development in Government


I have been working out this thesis for a year or so and have finally wrestled it into a working draft.  The study examines DOD software development practices centered around the term "requirements" and the different contexts that we use the term in.  I use the Cynefin framework as an organizing tool, which leads to some interesting conclusions.




Direct link is www.slideshare.net/tony1234/brownfields-agile-draft-v11 (slideshare frame is not functioning in some browsers)

Monday, November 12, 2012

Exploring Transitions


I've been thinking about the new picture that covers the transitory domain since it was first posted at http://cognitive-edge.com/blog/entry/5734/...-to-give-birth-to-a-dancing-star/ and in particular I’ve been focused on the "Duffer zone" for which Dave Snowden has said:
… when you deliberately remove all constraints with no idea whatsoever about what you will do.  You deserve to die.
This does not square with what is a crucial association of this transitory framework - our awareness of the dynamic of a situation.  There is in this sub-model an area of deliberation upon the unknown and the impossible, which we attend to because we must when we find ourselves in certain situations.  As a result, I think this zone may be far richer than a first glance indicates.

The background for this discussion is a “system-of-meaning,”  where we have taxonomy and typology locked into conflict.  Taxonomies are information lattices formed under high constraints, while typologies are theories that can be developed when there is an abundance of information and constraints are low.  From my post on Between Taxonomy and Typology:

(3.6)    We see the conflicts more clearly because the boundaries of systems are more visible than the rules and principles of behavior.  Thus transgressions of the rules may be more shocking because they occur in unexpected contexts, nearer the heart (center) than the edge.
(3.7)    Fragments slip through these boundaries. 
To understand this sub-model we need to look at both the high and low constraints and their interactions.

1.  High Constraints


If we consider our starting point to be the lower right corner “Deliberate Awareness of the Dynamic,” we will find in most circumstances that there are considerable constraints in place and our options for engagement and activity are rather limited.  This is an area of high constraints where only incremental change can be exercised.  There is no easy outcome in these sorts of activities or games, and our choice of moves is rather limited.

Deterrence Theory, which I encountered in classes on Political Science and Statecraft, provides a concise example.  When we consider the strategies through which a state can exercise its power in a bipolar relationship, we can construct movements from one cell to the next along the following lines:


    (1.1)    The desired outcome of bargaining is coercion, wherein the target party is moved to act through force that is real or implied.

    (1.2)    If the target does not yield, it is because one side has settled upon a strategy of escalation. This is brinkmanship wherein “…the threats involved might become so huge as to be unmanageable at which point both sides are likely to back down.” (ibid)

    (1.3)    A state can pursue an indirect strategy of diplomacy or soft power.  Outcomes are ones of mutual benefits.  Outcomes may also be the de-escalation of demands and other levers of explicit force.  Diplomacy expands the scope of engagements between the parties.

   (1.4)    The last available movement in deterrence is the opposite of brinkmanship, namely deception, propaganda or bluffing.  One party has hidden information that, when later revealed, underscores the weakness of the position.

These movements are sensible when placed within the sub-model of the transitory domain.  In the highly constrained situation these moves are all below the diagonal line.  The diagonal represents the Cynefin boundary between the complex domain and disorder, proceeding from disorder through ambiguity towards coherence as it rises to the right.

We cannot breach this line as we cannot design a strategy that moves from the deliberate and impossible to the easy and unexpected.  Such a move would require a radical change of constraints and that reconstitutes the problem into one in a different domain.  If we reframe the problem, then we are no longer oriented towards the complex domain – reframing reduces the situation to an ordered solution which by definition means the complicated or simple domain.  In similar fashion, the revelation of hidden information changes the context in the same way.  The transitory domain is fluid and we cannot fashion a deliberate strategy to “stay in place” for very long.

We can construct similar moves from contracting and negotiations:
    (1.1)    Demands and conditions;
    (1.2)    Chicken or defense-in-depth; also see the Chicken game;
    (1.3)    Integrative negotiation, the so called “win-win”; and
    (1.4)    Various forms of misdirection.

From certain forms of negotiations and Game Theory there is one final applicable move:
 
    (1.5)    This is the roll of the dice, the deadline of the clock, and the so-called “moves by nature.” This is the limit of the unknown, when a move, any move, is forced upon us. This last case is necessarily ambiguous and fits in the center cell of the sub-model. 




2.  Low Constraints


Having fleshed out the incremental moves available under high constraints, we need to examine the zone labeled “Exploit Occurrence” in a similar vein.  The critical characteristic of the Easy corner (upper left) is that when there are few constraints the people or agents in the system can readily – freely – move.


The baseline example for this scenario is the Technology Adoption Lifecycle model.  With the first move already provided and oriented towards coherence and the complex domain, we can easily plot out the moves:

    (2.1)    Innovators and early adopters accept the risks to seize the rewards that are promised. This is deliberate awareness, even though the risks (and failures) are not clearly apparent.

    (2.2)    The early and late majorities are the mainstream; they follow as more information becomes apparent.  A decision can be made on the basis of risks and rewards; this decision is rational or emotional and therefore the plausible movement case.

    (2.3)    We then find the laggards who avoid making the transition.  As they wait the circumstances change, until they face an abrupt transition whose consequences are unknown.

In marketing and technology diffusion we can design our (pricing) strategies towards the early adopters, the mainstream, or the late-comers.  And it is from marketing, rather than the technology adoption model, that we can discover the last move within this region:

    (2.4)    The final case is one of deliberate bad acts to achieve disorder.  These are the Luddites, copycats and intellectual property thieves, spammers and more.  The actor’s objective is to upset the normal system and introduce disorder which refashions the dynamics of the system in unexpected ways.

These moves are all above the line.  Curiously, in a free-movement system I haven’t found a move into the center ambiguous square.  By way of explanation, we cannot subdivide the laggards (the unknown and unexpected) in any meaningful way.
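The adoption-lifecycle moves above can be sketched as a small lookup along the adoption curve. Note the percentage cut-offs are Rogers' conventional figures for adopter categories (roughly 2.5% innovators, 13.5% early adopters, 34% each majority, 16% laggards), which are an assumption layered onto the text, as is the pairing of categories to moves.

```python
# Sketch: Rogers' standard adopter categories, keyed by cumulative
# adoption fraction, paired with the moves (2.1)-(2.3) above.
# The cut-offs are Rogers' conventional figures, not from this post.
ADOPTER_BANDS = [
    (0.025, "innovators",     "2.1 deliberate awareness"),
    (0.16,  "early adopters", "2.1 deliberate awareness"),
    (0.50,  "early majority", "2.2 plausible movement"),
    (0.84,  "late majority",  "2.2 plausible movement"),
    (1.00,  "laggards",       "2.3 abrupt transition"),
]

def adopter_move(cumulative_share: float):
    """Return (category, move) for a point on the adoption curve."""
    for cutoff, category, move in ADOPTER_BANDS:
        if cumulative_share <= cutoff:
            return category, move
    raise ValueError("share must be in [0, 1]")

print(adopter_move(0.30))  # ('early majority', '2.2 plausible movement')
```

Move (2.4), the deliberate bad actor, sits outside the curve entirely, which is consistent with its origin in marketing rather than the adoption model.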


3.  Misinterpretation


While we cannot design a strategy that moves across the center line, we can all too easily mistake a situation and naively apply a strategy for a highly constrained situation to one of low constraints.  Or vice versa.

If we choose for instance to apply strategies for an open (low constraint), free-movement situation to a highly constrained, incremental movement situation, we may well end up in the “Duffer zone” suffering from Dave’s comment about “no constraints … deserve to die.”

On the other hand, we may attempt to apply high constraint strategies to a free-movement situation (or system).  The outcomes that succeed or fail in this case are likely to be more random than they are deliberate.  We will chalk up these outcomes to surprise, or intervention, or luck in post-hoc rationalization (retrospective coherence).  This case is probably a return into the chaotic domain vice an ascent into the complex realm.

4.  Collaboration


Collaboration also has a role to play in this transitory framework.  We can see a ready alignment between early adopters and the dominant power in the easy—deliberate cell (upper right).  We also can see a natural alignment among bad actors in the impossible—unexpected cell (lower left).  These two collaborations have no predictive power as the system will stabilize and form or collapse without either coalition’s contribution.

The horizontal axis of the unknowns is a peculiar case.  There would not normally be a confluence of interest between Brinkmanship and Laggards, as the zealots pushing their increasingly radical proposals are avoided and ignored by the late adopters.  I suspect this dynamic tends to reinforce the status quo, especially when the laggards are a large fraction of the population.

The vertical axis of the plausible is the most interesting combination.  Here we find in diplomacy and integrative negotiation a predisposition towards engagement and the expansion of boundaries in “win-win” agreements.  We also have in social networks, which are open, free-movement systems, the known accomplishments of swarming and crowdsourcing.  The combination of these two interests may well represent a significant predisposition in this framework.

In summary this transitory framework is quite rich and complex. It contains a number of surprising connections and is asymmetrical enough that it defies reduction into 2 x 2 forms.

Sunday, October 14, 2012

Between Taxonomy and Typology


1   The War Between Taxonomy and Typology

David Weinberger presents a conundrum in a brief piece in KMWorld magazine, “Who Cares About Knowledge?”  Is knowledge distinct or is it indeterminate?  Is it orderly or messy? Is it true belief, or just ideas and opinions?  Is knowledge embedded in content?  Or is it free floating in a web of human relationships? Weinberger concludes that “knowledge is becoming an old-fashioned term.”

Is he implying that we don’t need to talk about knowledge anymore?  That it is a term of art, of use only to specialists (aka knowledge managers)?  Perhaps some other idea, say “memes,” will supplant knowledge?  I’m sorry but I think I’ve heard this argument before.  This is just another skirmish in a long running battle. To use another old-fashioned term, it is the latest round in a war between Taxonomy and Typology.

2   The Knowledge Battleground

The starting point for this thesis is a blog post by Dave Snowden titled “Typology or Taxonomy.”  Snowden cites as a reference an excellent paper by Kevin Smith, “Typologies, taxonomies, and the benefits of policy classification” in the September 2002 edition of the Policy Studies Journal.  Snowden describes a situation from his knowledge management consultancy work at IBM where adherence to an ingrained taxonomy led the company astray.  He concludes: 
The message is very simple, rigid boundaries have huge value in static situations so taxonomies work.  But where things are subject to rapid change and the possibility of encountering novelty is high, they [taxonomies] are plain dangerous.  However we do need constructs to make sense of the world and that is where conceptual frameworks, or typologies, come into their own.
It is clear that Snowden favors typology and is exceedingly reluctant to accept taxonomy as a guide.  He consigns taxonomies to the Simple domain, and finds them unsuitable for the rapid change of the Chaotic and the novelty of the Complex.
I’m not sure that I can agree with this proposition, as knowledge forms from the constant interplay between structured information and the amorphous, even formless, mass of data we are immersed in.  If we are to accept that ambiguity is a principal component of fragmented knowledge, then we must take a closer look at taxonomies and typologies and their interactions.  The patterns that are found in one frame of reference are different from patterns that are prevalent in the other.

3   Systems of Meaning

It is tempting to consider that our patterns are degrees of order, and to try and map them through the Cynefin framework. That approach leads us into the "I-space" described by Max Boisot.  However, "I-space" is too confining, and does not account for certain attributes of complex knowledge that are essential to understanding this puzzle.

(3.1)    Human systems are mutable – they are stable for a long while, until they suddenly change.  Stability is a property of the system, which suggests predictable behaviors in the ordered domain (simple and complicated) and probable ones in the unordered complex domain.  We will see that the form of the system is an emergent property based on how much information was available at its origin.

(3.2)    As we construct a taxonomy or a typology, we create a system of meaning.  As a novice, by the time we can talk cogently about a subject we are enmeshed in a taxonomic system of meaning.  The same holds true for experts constructing theories, for all theories are typologies.


(3.3)    So what we face is the continuous interplay of systems of meaning.  Patterns are more apparent when contrasts are strong, that is, when conflicts occur vice agreements.  When conflict happens, it will most often occur at the boundaries.  When we look at and think about human systems, what we see most often are the borders of systems.  That systems are constrained by their boundaries is inherent in the definition of all systems.

(3.4)    This isn’t to say that there are no conflicts over principles, which are the operating rules of a system of meaning.  There are plenty of conflicts over principles!  Still, it is genuinely hard to tell which type of conflict we are seeing: edge or inner.  The disagreements on the edges, the rubbing of boundaries, are conflicts that won’t change the running system inside.  The result is lots of noise and smoke, and infrequent change, which is what we see in all communities and social networks.

(3.5)    Human systems are consistent, although that consistency may or may not be coherent.  Each transaction that emerges from disorder drives the system towards consensus or coherence.  The transition itself forms a dampened oscillation, along with a shift in the community towards an empirical proof, or else towards a satisfaction that is fallible, the new consensus.  See Rotate-45-degrees-and-think-anew for an illustration of this process.

(3.6)    We see the conflicts more clearly because the boundaries of systems are more visible than the rules and principles of behavior.  Thus transgressions of the rules may be more shocking because they occur in unexpected contexts, nearer the heart (center) than the edge.

(3.7)    Fragments slip through these boundaries.  Stories move without difficulty as carriers of fragments and the providers of context.  In this sense of carrying, stories are dis-intermediated.  However, stories are more limited than fragments that are obviously “good” or “evil” or noticeable in some way.  Stories are filtered by culture, and what types of stories are acceptable are dependent on the norms of storytelling and the rituals of listening.  There are many more constraints on stories than on fragments.

4   Taxonomy

Taxonomy starts with scarce information and some observations or empirical evidence. We then search for some alignment of the evidence, producing an ordered set, which is new information.  When we build out taxonomies we aggregate information, and leave behind the scarcity that we started with. 

A taxonomy … classifies things based on clear empirical characteristics and will have rules that allow determination of location. They have clear boundaries … On the downside, once a taxonomy is established if something does not fit, it will be made to fit as the taxonomy itself creates a filtering mechanism through which we filter observable characteristics.  Kevin Smith






(4.1)    We can see the emergence of a system of meaning from the praxis of a profession.  Practice is experience – an increase in knowledge – and eventually there is recognition, with or without titles, that you have knowledge and are an expert in some domain of information.  Within an organization, work leads to the formalization of a role, or even an office (organizational unit), as the system.  These patterns persist in the norms and cultures long after the condition of sparse information has disappeared.  What has emerged is a system of meaning where identity and knowledge grow together.  This is how the practice of “knowledge management” formed and how it grew into a recognizable domain of information.

(4.2)    Management is in its essence the making do with limited resources, inadequate time, and other severe constraints.  To talk of managing is to use language and metaphors of making do and accomplishing things when resources are limited.  It makes sense to talk of knowledge management when information is hard to come by, and when knowledge is scarce.  As we depart the condition of scarcity, we can still talk with our companions in the organization with the shared management concepts and terms that we grew up with.  KM makes sense in scarcity.  KM doesn’t work in a framework of abundance because management does not have terms and language for a world without constraints.

(4.3)    An autopoietic system is a system that grows and renews. (wikipedia/wiki/Autopoietic) We have just seen how taxonomy forms such systems from praxis.  Without the impetus of growth, the system destabilizes, turns chaotic, and decays into the confusion of disorder.  The system may also devolve into a simple and stable form that is superstition.  These systems of meaning are self-sustaining, unchanging, and have legitimacy in an information-is-sparse sense.  Conspiracy theories are one example of the devolved form of taxonomy.

5   Typology

Typology is possible when we have ample information. Typology also holds when there is more than enough and often far too much information.  We construct theories of various combinations of information and then filter by some measure of importance.  Filtering discriminates and excludes information, eventually reaching a balance that is the theory. 

In a typology the dimensions represent concepts, they do not necessarily exist in physical reality (although they can). As such typologies generate heuristics which are more adaptive under changing circumstances. On the downside the concepts can be arbitrary, may not be exhaustive and can easily be subject to clashes of interpretation.  Kevin Smith
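The filtering described above can be loosely sketched in code: from an abundance of observations, score by some measure of importance and keep a small subset that plays the role of the theory. The scoring function and cut-off below are placeholder assumptions of mine, not anything drawn from Smith or Snowden.

```python
# Loose sketch of typology-building as filtering: rank abundant
# observations by an importance measure and discard the rest.
# The importance measure and cut-off are hypothetical placeholders.
def build_typology(observations, importance, keep=5):
    """Keep the `keep` most important observations as the working theory."""
    ranked = sorted(observations, key=importance, reverse=True)
    return ranked[:keep]

facts = ["a", "bb", "ccc", "dddd", "eeeee", "ffffff", "ggggggg"]
# Hypothetical importance measure: longer string counts as more important.
theory = build_typology(facts, importance=len, keep=3)
print(theory)  # ['ggggggg', 'ffffff', 'eeeee']
```

The point of the sketch is Smith's downside: whatever `importance` measure is chosen is arbitrary, and a different measure yields a different, clashing theory from the same observations.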







The typology scenario differs from taxonomy because of the condition of ample information where disruption is far more pronounced.  I’m certain that systems of meaning will form, but the nature of these systems is not as clear.  It is much harder to isolate persistent patterns, and we will find a variety of system forms when there is an abundance of information.

(5.1)    For typology in the extreme, the condition is that there is too much information – a super-abundance.  Discordant information is the norm.  Snowden’s property of coherence is a necessary condition to attaining stable patterns. It isn’t a sufficient condition, as these patterns may be more like islands in the stormy sea than emergent, self-sustaining social systems.

(5.2)    We need look no further than the IT Department to find one example of the organizational form.  The practices of work may change fairly quickly, but the norms and culture of the group serve as memory that evens out the information flows.  When there is too much information, we must filter to reduce the level of discordance.  Filtering avoids disruption by creating compartmentalization or specialization.

(5.3)    Memory fails in larger groups as communications media do not scale effectively.  Diversity is lost because it is outvoted, it is not in the mainstream, and it is not loud enough to be heard in the Echo Chamber of the media.  Even with the best social media technologies, loud voices and hyper partisanship will drown out the diverse and obscure.  Consequently what forms is the notorious silo.

(5.4)    Silos exist to preserve order, because they are social structures that make knowledge simple and complicated (the ordered domain).  The primary characteristic of order is equifinality, therefore silos are open systems. (wikipedia/wiki/Equifinality)

(5.5)     Without silos the knowledge that the organization holds is at best complex and chaotic (the unordered domain).  It is an open question as to whether we can have an organization that is completely disordered and thus holds no knowledge.  I expect that any organization would collapse and disband well before the truly disordered stage, as smart people decamp into other organizations or communities.

(5.6)    An open system can easily devolve into conflict.  The conflict pattern is a situation where opposition becomes the operating principle for the social system.  The focus of the system becomes obstruction in all things, to the detriment of coherence. On the surface the conflict pattern seems similar to the silo and its equifinality, in so far as conflict is the emergent property.  Conflict breeds conflict; consider the so-called Law of the Jungle: “kill or be killed.”  On deeper inspection we will find that the language of the conflict pattern is one of provocation and escalation, and the conflict pattern hardens into a conflict system of meaning that is autopoietic, not open.

(5.7)    When there is too much information available, a group can reach consensus without coherence.  They can cherry-pick from the facts and construct the simple theories that we call fantasy.  This is another variation of the open (equifinal) system of meaning.

6   Balancing

To distinguish system forms in the case of ample information we need to examine an organization’s culture and its reaction to incoming (additional) information.  How does it react?  Does it avoid discordant incoming information?  Avoidance shows us a closed system.  Redirection and deflection are likely the hallmarks of a complex adaptive system - we can see that some change or some work (information processing) actually occurs in the handoff.  As noted in the definition of systems of meaning above, it isn’t clear whether any transaction impacts the system at a larger scale.  However, if we consider the interaction of information and constraints that bear on the system of meaning, we have illuminated the field where the patterns can play out.

(6.1)    When there is too much information, we can see in the equifinal characteristic of open systems the motivation to maintain the status quo.  Here we can see the roots of the antagonism of certain typologists who hunt the KM zombies and the other undead fantasists.

(6.2)    There is the reinforcing pattern of “remembering” that Patrick Lambe has described from his taxonomy work (http://wiki.sla.org/download/attachments/54264068/Taxonomy_KM_Lambe.pdf).  We filter naturally and we down-select based on preferences and aptitudes. Consequently, systems of meaning emerge.  Lambe’s remembering suggests that to change the form of the system there is a rigid set of bad habits to overcome.  We will find there is a particular rhythm that must develop to achieve effective remembering.  It is something that is uniquely found in story circles, campfires, and skilled tellers of stories.

(6.3)    Organizations consist of two or more silos.  Disorder occurs between the silos when they cannot agree and create a common meaning.  Where the silos do agree we will find a complex adaptive system where dialogue occurs.  The CAS system of meaning is one where taxonomy and typology are balanced and rapid change is possible.

(6.4)    There is one other case to include: exaptation.  Bricolage is less ordered, and less structured, than an open system; it takes more energy to cope with the abundance of information.  Bricolage is less safe than an open system, as the primary safety net of equifinality is missing.  But safety is not the operative motivation here, for if you want safety then fantasy is the easiest world to reach and inhabit.  Remember fantasy is the devolution of typology, and superstition is what devolves from taxonomy.  Bricolage is of course more familiar as pragmatism, the seeking of small gains from the system that one resides within.  Dave Snowden suggests that diversity may be a key characteristic of the abductive reasoning of exaptation; see http://cognitive-edge.com/blog/entry/5575/exaptation-managed-serendipity-ii.  Also see Yiannis Gabriel’s discussion of bricolage at http://www.yiannisgabriel.com/2012/08/on-paragrammes-theory-of-organizations.html.


Emergent System of Meaning Types

    Taxonomy                                       Typology
    Praxis and learning (growth, autopoietic)      Silo (equifinal, open system)
    Confusion (disordered, not a stable pattern)   Conflict (growth, autopoietic)
    Superstition (equifinal, open system)          Fantasy (equifinal, open system)
    Dialogue (complex adaptive)                    Bricolage (complex adaptive)


7   Knowledge Redux

By the time we can talk cogently about searching or filtering, we are already embedded in an information system.  It may be a system of learning or remembering, or one of collaboration; these are stable patterns of information as we can see from Patrick Lambe’s taxonomy practice.  Other patterns of information which we can discover from following Dave Snowden’s principles of coherence are systems of power, of privilege, and of ignorance, both willful and blind.  These too are coherent systems of meaning.

(7.1)    What lies between taxonomy and typology is not a difference of information or even ideas.  What we are really seeing is the clash of systems of meaning in opposition.  Systems have boundaries and that is where the conflict occurs.  We do not often see the conflicts at the center, the heart, which is the ground from which the system forms.

(7.2)    In the center of information space is a perpetual collision of systems. Concepts compete at the boundaries, but fragments easily slip through. 

(7.3)    It is only after we have recovered a perspective of the scarcity or abundance that once was in force, that we can finally talk about understanding and knowledge.  This perspective is what I believe Dave Snowden describes as cognition.

(7.4)    Other systems – of economics or of power – may have stronger effect than systems of meaning, and pure examples of systems of meaning may be hard to find.

(7.5)    We do not have a language for superabundance, nor do we have much language for low constraints – when there is little to hold us back from hurting others as we tell “our truths.”  I suspect this is the basis of David Weinberger’s complaint.

(7.6)    Principles may be in conflict, but if so, how can we tell them apart from the conflict over boundaries?  Complex knowledge emerges as the balancing along borders of systems of meaning.  Systems are both constrained and coupled as shown in this framework.  



(7.7)    If we balance well we have satisfied the conditions for social change; if poorly then we create the persistent resistance that we so often see.

(7.8)    Abductive reasoning is tightly constrained within certain dimensions and loosely constrained along others.  Contrast this with deductive reasoning, which is uniformly constrained, and inductive reasoning, which is generally loosely constrained.  Think of “best practice” and perfection as exclusivity-seeking knowledge searches.  They focus on narrow criteria and as a result limit the possibility space they search.  On the other hand, “good enough” and novel need to be inclusive, and thus they expand the search space of probabilities.