Heuristics for Building Great Products – Gordon Bell

One of the great entrepreneurs of the 20th Century died in 2011 – Ken Olsen, who founded Digital Equipment Corporation (DEC).  For 23 years, Gordon Bell served as the Executive Vice President for Research and Development (both hardware and software), working closely with Ken Olsen to generate innovative hardware and software systems.  I had the privilege of learning from both men during my years at DEC building ALL-IN-1.  One of the joys of being on email in the early 1980s was getting messages like the following from Gordon Bell.  It is a tribute to Gordon that most of these recommendations are as fresh today as they were thirty years ago.

INTEROFFICE MAIL TO KEN OLSEN FROM GORDON BELL
Dated Sunday 15 March 1981

Gordon Bell

Product goodness is somewhat like pornography: it can’t fully be described, but we’re told people know it when they see it. There are lots of heuristics in the book Computer Engineering. Since quality and competitive products must be our number one focus in these next generations, these heuristics are intended to help us. Only the following four need be attended to:

  • A responsible, productive and creative engineering group
  • Understanding the design constraints
  • Knowing when to create new direction, when to evolve, and when to break with the past
  • Ability to get the product built and sold

ENGINEERING GROUP
As a company whose management includes mostly engineers, we encourage engineering groups to form and design products. With this right of organizing, there are some responsibilities.

  • Having leadership who understands the product space and who has engineered successful products.
  • Having skills and disciplines required in the respective product area, e.g., ergonometrics, acoustics, radiation, microprogramming, data bases, security, reliability.
  • Having skills on board to make the proposal so that we adhere to the cardinal rule of Digital, “He Who Proposes, Does”. Approving a plan based on no implementers violates this.
  • Having openness, external reviews, clearly written descriptions of the product for inspection.
  • As a corollary of being prepared with leadership and skills, we occasionally enter very new areas, requiring research and advanced development; product commitment should not be made until fully operational breadboards exist.
  • As a corollary, start-up groups with no previous track record, or a poor one, may need review.

PRODUCT METRICS
Since most of our products are evolutionary, engineering is responsible for knowing their product area, in terms of:

  • Major competitor cost, performance and functions together with what they will introduce over the next 5 years.
  • Leading edge, innovative small company product introductions.

DESIGN CONSTRAINTS
Design constraints such as acoustics and radiation are basically useful because they limit choice on often trivial design decisions. We should meet the following design constraints, and if one is unacceptable, go about an orderly change:

  • DEC Engineering practice for producibility. These assimilate the critical external standards such as VDE and FCC as rapidly as possible.
  • Information processing and communications standards, such as COBOL, Codasyl, IEEE 488 and EIA.
  • Information processing standards as determined by the key supplier, such as IBM SNA. For example, all eight versions of VISICALC we are implementing should be compatible with external VISICALCs.
  • The architecture of existing DEC products. For example, future editors should be compatible with the past editors, unless it can be shown experimentally that there is a significant (x2) benefit to change. These include:
    • ISPs of the PDP-8s, several PDP-11s, VAX-11, 8048, 8080, and these are likely to include a 16-bit micro.
    • Physical busses for interconnect. Fundamentally this ensures that future products can evolve.
    • File, command language, human interface, calling sequence, screen/form management, keyboard, etc.
    • We must not be undone by historically poor standards which constrain us to poor products. Currently, the 19″ rack and the metal boxes we put in it, and then ship on pallets to our customers, act as constraints on building cost-effective PDP-11 Systems. The “mind-set” standard is impeding our ability to produce products that meet the 20% cost decline. A target should be the shipment of systems in cardboard boxes which the customer assembles.
  • Ability to be implemented easily in each natural language, given that we are selling products in all countries.

WHEN TO CREATE A NEW PRODUCT DIRECTION OR WHEN TO EVOLVE THE OLD

Given all the constraints, can we ever create a new product, or is everything just an evolutionary extension of the past? Also do we know or care where product ideas come from? There are a whole set of places to look for products, but that’s another set of heuristics, and the object of these heuristics is simplicity. The important aspect about product ideas is:

  • Ideas must exist to have products!

It is hard to determine whether something is an evolution or just an extension. If you look at our family tree of products, like the one for our computing systems (which every product group should have and maintain), the critically successful products all occur the second time around. Some examples: 6, KA, KI, KL, 2080; TOPS-10, Tenex, 20; 5, 8, 8S, 8I/L, 8E/F/M; OS8-RT11; 11-20, 40, 34; RSX-A…M; TSS-8, RSTS; various versions of FORTRAN, COBOL and BASIC all follow this; LA30, 36, 120; VT05, 50/52, 100; RK05, RL01/2.

Some heuristics in designing good products:

  • All products, whether they be revolutionary (we have yet to have any that are really in this category), or creating a new base, or evolutionary, should:
    • Offer at least a factor of two in terms of cost-effectiveness over a current product. If we build unique products that do not compete with ourselves, then we will have funds to build really good products.
    • Be based on an idea which will offer a distinguishing attribute, or set goals and constraints. For VAX these included a factor of two in algorithm encoding, and the ability to write a single program in multiple languages. The VT100 got distinction by going to 132 columns and doing smooth scrolling.
    • Build in generality and extensibility. We have not historically been sufficiently able to predict how applications will evolve, hence generality and extensibility allow us and our customers to deal with changing needs. We have built several dead-end products with the intent of lowering product cost, only to find that no one wants the particular collection of options. In reality, even the $200 calculators offer a family of modular printer and mass storage options. For example, our 1-bit PDP-14 had no ability to do arithmetic or execute general-purpose programs. As it began to be used, ad hoc extensions were installed to count, compare, etc., and it evolved into a digital computer.
    • Build complete systems, not piece parts. The total system is what the user sees. A word processing system, for example, includes: mass storage, keyboard, tube, modems, CPU, documentation including how to unpack it, the programs, table (if there is one; if not, then the method of using it at the customer’s table), and shipping boxes.
  • A new product base, such as a new ISP, physical interconnection specification, Operating System, approach to building Office Products must:
    • start a family tree on which we expect significant evolution to occur; otherwise the investment is in a point product, so short-term that it is not likely to pay off. In every case where we have successful evolutionary products, the successors are more successful than the first member of the family.
  • A product family can evolve several ways as described on page 10 of Computer Engineering. The evolutionary paths are lower cost and relatively constant performance, constant cost and higher performance, and higher cost and performance. In looking at our successful evolutions:
    • Lower cost products can’t get by without adding functionality too, as in the VT100.
    • Constant cost, higher performance products are likely to be most useful, as economics of use are already established and a more powerful system such as the LA 120 will allow more work to get done.
  • A product evolution is likely to need termination after successive implementations because new concepts in use have obsoleted its underlying structure. All structures decay with evolution, and the trick is to know what the last member of a family is, such as the 132-column card, and then not build it. This holds for physical components, processors, terminals, mass storage, operating systems, languages and applications. Some of the signs of product obsolescence:
    • It has been extended at least once and future extensions render it virtually unintelligible. (For example, PDP-8 memory addressing and ISP were extended three times.)
    • There are significantly better products available using another base.

SELLING AND BUILDING THE PRODUCT
Buy-in for the product can come at any time. However, even if all the other rules are adhered to, there is no guarantee that it will be promoted, or that customers will find out about it and buy it. Some rules about selling it:

  • It has to be producible and work. This seemingly trivial rule is often overlooked when explaining why a product is good or not.
  • There should have been a business plan that several different marketing groups have contributed to in terms of ordering and selling. Just as it is unwise to depend on a single opinion in engineering for design and review, it is even more important that several different groups are intending to sell the product. Individual marketers are just as fallible as unchecked engineers.
  • Never build a product for a single customer, although a particular customer may be used as an archetype user. Predicating a product on a sale is the one sure way to fail!
  • It should be done in a timely fashion according to the committed schedule, at the committed price and with the committed functions.

Now isn’t it clear why building great products should be so easy?

Are there any heuristics that should be added? Any that are patently wrong? Or that need clarification?

Comments please!

The first paragraph with the four points says it all, but in case there’s need for detail, there are another 30 or so which follow… In the words of Mies van der Rohe, “God is in the details.”


Too Much to Know – The Death of the Long Form Book?

At dinner the other evening at Crush with Katherine James Schuitemaker, my valued colleague in all things marketing and branding, I shared that I had finally produced a draft of the book on Attenex Patterns I’ve wanted to write for a long time. She patiently listened without interrupting as I energetically talked about the topics and ideas I wanted to highlight.

When I finished and took a deep, expectant breath, I asked “so what do you think?”

Providing the gift that only long time colleagues have permission to do, she looked at me and then said “Skip, that is so old school.  You’ve waited so long to publish your first book that the world of book publishing has passed you by.  Toss the book idea out and start developing the iPad app that both of us really want.”

While this was not the comment or pat on the back I was looking for, I knew I was about to get something better.  So of course I had to ask “what do you think that app looks like?”

Katherine was at her most eloquent, software conceptualization best as she launched with the synthesis of threads we’ve talked about for twenty years since we first met at Aldus (now Adobe).  Energized, she leaned across the table and lamented “I am so tired of the linear book.  I am so tired of reading books and making notes in them that become completely inaccessible.  What I want is to have a tool that is the combination of the two tools we built at Attenex – Structure for authoring and Patterns for making sense of all the reference materials.”

“I want you to provide the same content that you were going to put in your book but now do it in app form.  But most importantly, I want that app to be the starting point of what I need.  I need to be able to put in a current project that I am working on and have your application point out the gaps between your framework and what I am doing.  I don’t want more information in the form of static content.  I want dynamic, connected knowledge that is ‘news I can use’ when I need it and in the context of what I need.”

“Skip, you have to go back to your original vision at Attenex of connecting authoring (Structure) with discovering (Patterns).  Stop with this book nonsense.  This is your legacy that only you can do.  The previous forty years are all prelude to preparing you for this killer app.”

Well, she had me now.  Legacy.  That was really unfair, to entice me with the thought of producing a legacy.

While one part of my brain knew that she was on to something important, I couldn’t let go of the idea of writing a book now that I finally had the energy, motivation and stamina to do the writing.  With my high tolerance for ambiguity, I looked her straight in the eye and said, “I’m going to be incongruent for a bit.  My gut tells me that you are right on.  Yet my analytic brain is fighting your idea something fierce.  So I’m going to let my analytic self argue with you for a half hour, and then I am going to agree with you and change course in some fundamental ways.”

Katherine was very patient with me for the next half hour as I served up objection after objection.  She did her best not to laugh as we’d played this game many times before.  Finally, as my “objection energy” ran out, I said “OK.  New game.  How do we marshal the resources to make it happen?”

As we parted, Katherine turned to me and commanded “Skip, free us from the tyranny of the linear book!”

My test for any good idea is how much energy I have for the idea when I get up the next morning.  Based on the frenetic writing that occurred over the next couple of days and the meetings I set up to corral the resources, this idea was clearly the right one.  I sent this email message to Katherine the next morning:

Katherine,

I don’t even know where to begin.

You have such a wonderful way of hearing, synthesizing, guiding, shaping and blowing my mind.

I knew there was a reason I’ve been procrastinating in writing the book.  The main reason I bought the iPad at launch was that in Steve Jobs’s announcement he talked about the future of the iPad as combining video and books – the Vook.  That’s what I wanted to get experience with – to learn how to author a Vook or something beyond it.  I knew it couldn’t be linear, but 60 years of reading linear books blinded me.  Yet I’ve been disappointed in all the attempts that I’ve seen (the many Vooks I’ve bought), Flipboard, The Daily, and the Business Model Generation iPad app.

Using the iPad for several hours every day has transformed the way I work, play and think.  But not far enough.  Last night you moved the needle far beyond what I’m experiencing.

As a starting point, I’m attaching where I’ve gotten so far in authoring what I want to say about Patterns for the iPad.  After last night this is a start, but there needs to be so much more.

And my mind wouldn’t shut off last night.

In retrospect, the seeds of your insights last night were planted in this memo to the Attenex team.  While I kept coming back to these thoughts over the years, I clearly didn’t understand the implications of the last couple of paragraphs even though these ideas spawned the personal patterns work that Eric Robinson and I iterated through.  I was blinded by Patterns as a discovery and review tool.  Yet all the puzzle pieces are there in Patterns when we added meta-data tagging.

Email Message from Skip to Attenex Staff:  August 31, 2001

In life there are little things and big things.  In the context of business, August 15, 2001, was a “big thing” day for me.

In 1968 I was fortunate to get a job in a psychophysiology research lab at Duke Medical Center at the start of my sophomore year in college.  We ran experiments on human subjects looking at their physiological responses to behavior modification therapies and to different psychiatric drugs.  To better deal with experimental control and real time data analysis of EEGs and EKGs, we purchased a Digital Equipment PDP-12 (the big green machine).  It had a mammoth 8000 bytes of memory and two pathetic tape drives that held 256,000 bytes of storage.

Embedded in the rack of the computer was a big green CRT which could display wave forms as well as text.  A simple teletype device served as the keyboard.  While we were controlling the experiments, we displayed in real time the wave forms from the physiological data of the human subjects.  We experimented with multi-dimensional displays of EKG vs EEG vs the user task analysis.  It was so fun to get lost in “data space.”  [A former HCDE student, Denise Bale, calls this “dating her data.”]

Along with doing all the programming for the lab experiments, I got to use the machine to play my first computer game (Spacewar).  It was so cool being able to control a space ship in the solar system and have it affected by the gravity of the planets on the CRT.  There was no mouse at that time, but we used several potentiometers and toggle switches to control the X, Y and Z coordinates along with the firing of guns.  Controlling green phosphor objects was a real feat for those of us who have no hand eye coordination.

One semester, while procrastinating on writing several term papers, I wrote a text formatting application called Text12, which was modeled on Text360 for the large IBM mainframes of the time.  The formatting commands were eerily similar to the HTML format that we know today.  The result was that I could enter and edit the text of my papers and then print them out on a letter-quality device.  It eliminated all the messiness of using a manual typewriter and white-out.   Several times at 2 a.m. I hallucinated about the combination of Spacewar, Complex Wave Form Pattern Detection and Text12 providing the ability to take the electronic texts that I was creating, analyze them, and display them in three-dimensional spaces by the relatedness of the concepts within the papers.  I got carried away thinking of a new document being indexed and “blasting” links throughout the galaxy of documents.  I could almost feel the gravitational attraction of the important documents.

Over the next 10 years, as computer processing power grew from the PDP-12 to the PDP-11 to the DEC VAX computers (wow – 4 megabytes of virtual memory space for a program and 60 megabyte hard disks), I would periodically do a midnight coding project to try and bring my hallucinations from 1968 into reality.  Nice idea, but there were never enough algorithms, CPU power, or memory.  And there were precious few electronic text sources available to actually index, unless I wanted to type them in myself.

As I became a manager and began to acquire research budgets, I would squirrel away a little money each year to see if the technology was ready to tackle the vision.  The technology was never ready and there was relatively little research into the indexing and display of document collections until the early 1990s.  The other side of the coin was that there was no clear idea of the business value of such a tool.  We’d use these prototypes to try and impress internal funders to create some larger research projects.  But nobody ever funded us beyond the prototypes.

During this time I hooked up with Russ Ackoff of the Wharton School at the University of Pennsylvania.  One of the many “idealized designs” that he worked on was a distributed National Library System that he published a book about.  This design called for all the texts to be in electronic format and available for searching.  A key feature of the system was to generate “Invisible Universities”.  That is, using the reference lists of published papers and books, find out who references whom.  This system could then create influence diagrams of idea evolutions.  I was really hooked then on the possibilities.

One of the many reasons I joined Primus a couple of years ago was to bring this vision to reality using the Primus Knowledge Engine as a foundation.  We even licensed the Inxight ThingFinder software to help us do the indexing we needed to automatically author “solutions” for our knowledge base.  We got started but it became clear that we had no visualization talent within the engineering department and no clear idea of the business driver for such a technology.

Which brings us to Preston Gates and Ellis (now K&L Gates) and Attenex.  Thanks to Marty Smith who connected this semantic indexing and visualization with the electronic discovery problem, we now had the baseline tool to see the dream come true.  Thanks to the efforts of Eric, last week we were able to connect the indexing capabilities of Microsoft tools so that we could inhale MS Office documents into the document analysis tool and generate concepts from Word, Powerpoint, Excel, HTML, and Adobe PDF documents.  Then, we were able to load an Attenex Patterns Document Mapper database with my research papers from the last several years about customer profiles, document visualization and knowledge management.

Then Kenji and Dan figured out how to cluster long documents and normalize the frequencies of the concepts.  And Lynne added the final layer of being able to add a document viewing window for the multiple formats along with cleaning up the interaction with the concept window panes on the right side of the Patterns display.

At 5PM yesterday, I saw my 30 year dream come alive.  I was able to display my research papers.  I navigated around the clusters and the concepts.  And then when I selected a document, whether it was MS Word or a PDF, up it would pop in its own document viewer.  Unbelievable.  The only thing missing is the ability to index the books that I have in my home library.

But synchronicity strikes again.  Just this week, Amazon.com started selling electronic versions of the popular management texts that are a core part of my library.  They come in either Microsoft reader or Adobe eBook format.  I quickly bought ebooks in each of the formats to see if we could index them.  Of course they are protected from that.  So close, so far.  But then it occurs to me, books are intellectual property.  I bet that someone in the Intellectual Property Practice at K&L Gates was involved in negotiating the licenses for some of the book properties.  Sure enough several folks in the group were.  So hopefully the last step in the journey of the dream is close at hand, the ability to not only pour my own writings and email, research reports, and published papers into the Attenex Patterns document database, but we can also get full length books indexed.

Now I will be able to SEE the idea and concept relationships between all these wonderful publications that I can only fuzzily keep in my human memory today.  I can’t wait to glean new insights as I index more documents and as I use the re-cluster on anchor documents to see relationships I’ve never been able to see before.  I look forward to being able to publish meta-data about a corpus of documents and open up a whole new field of Document Mining.

As a researcher, teacher, and business person, yesterday was the happiest day of my professional life.  My heartfelt thanks to all of you who’ve helped bring these concepts to life.

Katherine, your comment about authoring the Patterns story in an iPad version that the reader could add to in interesting ways reminded me of this slide – my content, our content, their content.

In working with clients the last couple of weeks I’ve been adding to the above and then making it a mirror image with the author on one side and the reader on the other.

The author’s side goes something like this:

  • Collect
  • Annotate
  • Curate
  • Distribute
  • Engage – in the fullest sense of social media and Cluetrain Manifesto
  • Recycle

The reader’s side goes something like this:

  • Collect
  • Understand
  • Relate to current situation
  • Relate to other information and signals I’m getting
  • Engage
  • Act on the information

The Aggregage website looks at this phenomenon from a marketing perspective and then lays out a number of tools that help marketers cope with information overload.

Another term for this is transactive content.

The last step is monetization – the ways that you can monetize the content. While these methods are somewhat specific to social media, they provide a good range of the monetization models open to you:

Top 10 Monetization Trends for Social Media and Microcommunities

“When it comes to savvy, proven, and incredibly successful tech investors, Ron Conway is a legend.  He has a gift or an uncanny sense of shrewdness, or a fusion of both, to identify the real opportunities that will transform into successful exits and also fuel and inspire aggressive innovation in the process.

“To help entrepreneurs, startups, and industry leaders capitalize on the tremendous opportunity that social media presents, Conway offers his vision for the top 10 ways to monetize real-time conversations:

  1. Acquiring followers
  2. Advertising – Context and display ads
  3. Syndication of new ads
  4. User-authentication; verifying accounts
  5. Commerce
  6. Payments
  7. Enterprise CRM
  8. Analytics; analyzing the data
  9. Coupons
  10. Lead generation”

Even the original vision for Attenex had the two pieces I felt were critical to a viable company – the authoring piece (Attenex Structure) and the reviewing/discovery piece (Attenex Patterns).  While they were sort of an integrated whole in my head, they always appeared as two separate pieces to everyone else.  Again, another thread of ideas dropped in the process of business focusing.  It really is the combo of Structure and Patterns and then going way beyond with what devices like the iPad allow for.

As a result, I never have seen Patterns as an authoring tool – it’s a discovery tool.  Even our core slide on what visual analytics means (courtesy of Sean McNee) has no authoring component to it.

So the Tom Sawyer part of me wants to kick this off by getting all of the participants in Attenex Patterns over the years to contribute to a Wiki-like environment to begin the profiling.

  • Employees
  • Preston Gates Contributors – the board, Kim Church and her IT crew
  • Customers – Jones Day …
  • Channel Partners – FTI, SPI, Forensics Consulting, KPMG, Strategic Discovery
  • Competitors – Applied Discovery, Dolphinsearch, Stratify, Recommind …
  • Consultants – George Socha (EDRM), Geoff Bock, Patrick Inouye (Attenex patent attorney) …
  • Influencers – Monica Bay at Legal Technology News, Sedona Group, Judges (Shira Scheindlin)

In an ideal world these folks would also become part of the startup community to enhance the personal patterns.

As part of joining the network, each person must profile himself/herself (setting up data for template use, or as a predictor for your stuff):

  • Myers-Briggs Score
  • Social Styles Score
  • Educational Background
  • Work experience (pointer to LinkedIn profile)
  • Number of years involved with eDiscovery
  • Role in relationship to Attenex Patterns
  • Authored Stuff
    • Favorite memory of Patterns
    • Tell me a story of your involvement
    • What role did the product play in your life
    • Key events in time
    • Artifacts
      • People photos
      • Screen shots
      • Important emails
      • Use Cases

The above information would be used to build out the content as well as create the fodder for the semantic networks, social networks, and event networks.  The meta tags for each of the authored content giblets and artifacts would place the content into the appropriate cluster or spine.  And unlike Patterns we would allow content to be in multiple clusters.
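
As a hedged sketch of that last difference – the giblet names and tags here are invented, nothing from the real system – multi-cluster membership is just an inverted index over meta tags:

    # A hypothetical sketch: each content giblet carries meta tags and, unlike
    # Patterns' single-cluster assignment, joins every cluster whose tag it matches.
    giblets = {
        "launch_email": {"patterns", "preston-gates", "launch"},
        "ui_screenshot": {"patterns", "ui"},
        "kim_story": {"preston-gates", "people"},
    }

    clusters = {}
    for name, tags in giblets.items():
        for tag in tags:
            clusters.setdefault(tag, []).append(name)  # one giblet, many clusters

    print(clusters["patterns"])       # ['launch_email', 'ui_screenshot']
    print(clusters["preston-gates"])  # ['launch_email', 'kim_story']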

From our conversation last night, what really hit home is the comparison in eDiscovery between linear review and what we created with Attenex (conceptual review and now automated review – predictive coding).  I hadn’t made the leap from the linear book to the conceptual book or resource or dynamically mapped content or maybe the key term content in context.

And then the other things you suggested were to start the iPad App out with the story and then go through layers (overlaid by):

  • The “Make Sense of My Stuff” layer – the ability to add your stuff to the core book (like Tableau Public)
    • For my own learning
    • To see patterns across lots of other authors’ work
    • To profile the project much like we are profiling the contributors

Now the next set of thinking is to follow through on the thread of how this transforms reading and writing (in the fullest sense of multiple mixed media – text, photos, video, audio, simulations…), learning, and publishing.  It is learning for the Facebook generation.

This “content in context” meme is clearly in the air.  Both Amazon with their new Kindle Publishing format and Apple with their iPad iBooks textbooks announcement are creating more flexible formats for thinking outside the linear book.  Easy to use toolsets are emerging from Vook and Pugpig.

Amazon Children's book example

The Wall Street Journal weighed in with their “Blowing Up the Book” article on the new eBook formats.

The Novel remixed: Chopsticks Children's book

So Katherine, many thanks for knowing me better than I know myself, and pointing me in a new “legacy” direction.

Peace,

Skip


One of the problems with powerful ideas and paradigm shifts is that once they get in your mind, you see the world through that lens. When I met recently with Duke Professor Kate Hayles, she kindly shared several chapters of her new book How We Think: Digital Media and Contemporary Technogenesis. As I flew back to Seattle, I immersed myself in these Adobe PDF chapters on my iPad. I loved the insights and implications of what she was describing for the new forms of literature in digital media.  I particularly liked her pointers to the Digital Humanities Manifesto 2.0 to describe the first two waves of the new field:

“Like all media revolutions, the first wave of the digital revolution looked backward as it moved forward. Just as early codices mirrored oratorical practices, print initially mirrored the practices of high medieval manuscript culture, and film mirrored the techniques of theater, the digital first wave replicated the world of scholarly communications that print gradually codified over the course of five centuries: a world where textuality was primary and visuality and sound were secondary (and subordinated to text), even as it vastly accelerated the search and retrieval of documents, enhanced access, and altered mental habits. Now it must shape a future in which the medium‐specific features of digital technologies become its core and in which print is absorbed into new hybrid modes of communication.

“The first wave of digital humanities work was quantitative, mobilizing the search and retrieval powers of the database, automating corpus linguistics, stacking hypercards into critical arrays. The second wave is qualitative, interpretive, experiential, emotive, generative in character. It harnesses digital toolkits in the service of the Humanities’ core methodological strengths: attention to complexity, medium specificity, historical context, analytical depth, critique and interpretation. Such a crudely drawn dichotomy does not exclude the emotional, even sublime potentiality of the quantitative any more than it excludes embeddings of quantitative analysis within qualitative frameworks. Rather it imagines new couplings and scalings that are facilitated both by new models of research practice and by the availability of new tools and technologies.”

Yet, it was with great pain that I read Hayles in the “old school” long form book in the new digital iPad medium.  Every paragraph pointed to interesting sounding research.  Now I was going to have to wait until I was back at my desk to chase all these links to go to the source material.  Along with what Kate was describing I really wanted a new form of Attenex Patterns so I could view these documents and concepts in relationship to each other – through semantic links, social network links and event networks.

Social Network and Semantic Network Views in Attenex Patterns

I just wanted to scream as I realized the “purgatory” I am now in between the old school and the awaited next generation digital media.

As I finished up with Kate’s chapters, I turned my attention to David Weinberger’s latest book Too Big to Know: Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts are Everywhere, and the Smartest Person in the Room is the Room. Imagine my continued pain when I came across this passage from Weinberger:

“I am aware that it is at best ironic, and at worst hypocritical, that I have written a long-form book, available only on paper (or on paper’s disconnected electronic simulacrum), that is arguing for the strengths of networks over books. My apology is of the unfortunate sort that does not justify the action so much as humiliate the perpetrator. And so: I am sixty years old as I write this, and am of a generation that takes the publication of a book as an achievement—my parents would have been proud. It’s also not irrelevant to me that book publishers still pay advances. Beyond these primordial and pathetic motivations—seeking money and Mommy’s approval—there are some other factors that mitigate the irony. I’m not saying “Books bad. Net good.” The privilege of holding the floor for the length of 70,000 words can allow ideas to develop in useful ways; if this book spends more time discussing networks than books, it’s because its author assumes that the case for books is made implicitly by every schoolroom with bookshelves, every paragraph of flap copy, and every public library. Further, for the past fifteen years I’ve been working in a hybrid mode that is not inappropriate to the transformation we’re living through: I have been out on the Web with the ideas in this book since before the book was conceived, and have profited greatly from the online conversations about them. (Thank you blogosphere! Thank you commenters!) Still, not only is the irony/hypocrisy of this book inescapable, it is so familiar in this time of transition that I wish someone would write a boilerplate paragraph that all authors of nonpessimistic books about the Internet could just insert and be done with.”

While I continue to enjoy Weinberger’s long form book and the transformation it enables in my understanding of knowledge, the echo of Katherine James Schuitemaker’s plea reverberates in my mind:

“Skip, free us from the tyranny of the linear book!”

It is long past time to go build some innovative software once again – my life pattern that repeats.


The Other 90% of Software Product Development

So you’ve just finished your alpha software product and you are ready to release it to the world to get some feedback.  Congratulations.  Now you are ready for the next 90% of the software development effort – RAAMPUSS.

In a previous post, I talked about the competing product design centers.  One of those design centers is what some pundits call the “-ilities” for the suffix that goes on so many of the categories – reliability, availability…

One of the fundamental challenges in new product development is balancing the drive for ever more useful functionality for customers with the need for a very high quality product.  This post gives an overview of the framework that I created and used at Primus Knowledge Solutions (acquired by Art Technology Group, which was in turn acquired by Oracle) to improve the less-than-stellar quality of our products and make sure that we met the critical requirements of some of the hardest customers to satisfy – those providing world-class customer support for their own products.

RAAMPUSS is an acronym that is shorthand for eight aspects of a quality product:

  • Reliability
  • Availability
  • Administratability
  • Maintainability
  • Performance
  • Usability
  • Scalability
  • Security

As the volume of product sales increases and/or the size of customer deals increases, RAAMPUSS becomes more important to the organization than incremental functionality, because the product becomes mission critical for the customers.  The diagram below looks at the natural progression of a product from initial idea to something that is used across several enterprise-scale corporations.  During the first three stages, the importance of functionality – and of whether the idea will work in a real-world setting – overwhelms the need for high quality.  But once an idea has proven itself in a pilot project, most customers want to leap to enterprise-scale deployment.

However, the engineering team is so busy trying to generate functionality and find the sweet spot for their product in a market niche that they are ill prepared for the sudden volume and scale demands on their product.  Life for the customer and the engineering team becomes quite difficult at this point, because it is very hard to reengineer the product for RAAMPUSS while the customer is screaming for immediate action to fix their crashing software.  One customer like this would be bad enough, but often there are 5-100 customers clamoring for attention with severe problems in multiple parts of the product.

RAAMPUSS can be both a diagnostic tool to assess where problems are in existing products and a development strategy for those just embarking on product development or trying to figure out how to prioritize activities for the next release of a product.

The point at which RAAMPUSS becomes critical in the lifecycle of a product business corresponds with the chasm that Geoff Moore describes in his books.  The development process that works on the left side of the chasm no longer works on the right side.  Similarly, other functions of a company, most particularly the sales function, encounter this same difficulty.  Oftentimes the managers that are great for working with early adopters are incapable of developing the right stuff for the early majority, and vice versa.

Crossing the Chasm

Reliability

Does the product work according to its specification?  For the user, reliability means that they can use the product to produce work consistently.  The most obvious reliability failure in software is an application or system crash.  Less obvious is whether the product works the same way each time it is used – both day in and day out and through upgrades.   Another form of reliability is not corrupting data, losing data, or computing results incorrectly.  Yet another example is when a product does the same thing inconsistently, like trying to indent bullets within Microsoft Word; the inconsistency is both within a document and from release to release.

Examples of reliability issues Primus had with version 3.1 include the intermittent errors experienced by EMC, Compaq and Novell, where the Versant OODB crashed every time the online backup program was run.  The regularity of the crashes (1-5 times daily), along with the time it took to bring the systems back up (EMC client PCs took 30 minutes), led to irate customers.  More subtle are the problems related to search, where the returned results of an indexed solution vary depending on what phrases are used in what order.

The more stable the system, the higher the expectations for increased reliability.

Availability

Closely related to reliability is availability.  If something happens to the application, how fast and easy is it to recover?  Does it take 20 seconds to reboot (the Microsoft three-finger salute), or do I require eight hours to reload a database?  When service packs are installed, do I have to bring the system down for several days?

In the end, availability is about uptime – 7x24x365, times the number of users.  Problems with availability can be due either to unscheduled outages as the result of bugs, or to scheduled outages required to install new software, bring on new users, or reindex or convert a database.  The goal is for the application to be up all the time.
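
To make the uptime goal concrete, here is a back-of-the-envelope downtime budget for a few availability targets (my arithmetic for illustration, not a Primus service-level commitment):

    # Minutes of downtime per year implied by an availability target.
    MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

    for target in (0.99, 0.999, 0.9999):
        budget = (1 - target) * MINUTES_PER_YEAR
        print(f"{target:.2%} availability allows {budget:,.1f} minutes down per year")

Even "three nines" leaves under nine hours a year for every service pack, reindex and reboot combined, which is why scheduled outages count against availability just as much as crashes do.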

Availability is also related to investments in hardware to achieve fault tolerance with no single point of failure.  While the goal should be for our software not to fail, it is also important for the software to degrade gracefully.  Better to lose a single user than to lose the whole system, bringing all the users down.

Administratability

With complex application software, one of the most expensive activities for a customer is the time they must spend administering the system.  It starts with what is required to install a system.  It extends to how difficult it is to reconfigure the different tiers of a system.  At the other end of the spectrum is how easy or difficult it is to add or subtract a user.  Additional items are keeping track of and maintaining dictionaries, reporting on usage, setting security levels, and ensuring integrity of the data or knowledge.  Where is administration performed – does it occur at the server computer, or can it be done remotely?  Are log files kept of all changes, and/or can the system be easily rolled back to some previous environmental state?

One of the major complaints about the Primus eServer products was our inability to administer application functions remotely.  A great deal of effort went into eServer 4.0 to provide a Web interface to all ADMIN functions.  Another major benefit of eServer 4.0 was the automatic installation of client software, or the elimination of it with a more robust, Internet Explorer-based product.

Maintainability

The essence of maintainability is, when something does go wrong with the software, how easily the problem can be recognized as a software (versus hardware) problem, identified as being in our software or some other vendor’s, located in the code, genuinely fixed, and its resolution distributed to all those affected by it.  Ideally, fixes come in three flavors:

  • What can be done to get the system back up;
  • Workaround to guard against the problem recurring or data being lost/corrupted;
  • Permanent fix to ensure that the problem doesn’t happen again.

In parallel with the immediate fixing of the problem is a Root Cause Analysis (RCA) to determine whether this is an isolated fault or a symptom of a bigger design or architecture error.  The RCA then kicks off the fixing of the problem as well as the fixing of the software development process that allowed the error to creep in.  That is, how can our development process be improved so that this class of errors never happens again?

Maintainability is primarily about processes and tools.   Processes range all the way from the back end – how does a problem manifest itself, how does a customer report it, how does support deal with it, how is the problem escalated into engineering, how is the problem fixed and the fix transmitted back to the customer, and how is the customer kept informed throughout this process – to the front end: how the software is designed and architected, how it is implemented, how it is tested, and how it is delivered to the customer.

During the eServer 3.1 unstable phase, one of the major observations was that our error-logging code was of very little use for high-priority problems.  Thus, one of the key tasks of the Tiger Team created to stabilize 3.1 was to define a set of tools that we could put into eServer 4.0 to increase maintainability.
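
To illustrate the kind of logging such a team might standardize on – a hypothetical sketch, not the actual eServer tooling – structured records carrying a correlation id make a high-priority problem traceable across tiers:

    import json
    import logging
    import time
    import uuid

    class StructuredFormatter(logging.Formatter):
        """Emit one JSON object per record so support can filter and correlate."""
        def format(self, record):
            return json.dumps({
                "ts": time.time(),
                "level": record.levelname,
                "component": record.name,
                "request_id": getattr(record, "request_id", None),
                "message": record.getMessage(),
            })

    handler = logging.StreamHandler()
    handler.setFormatter(StructuredFormatter())
    log = logging.getLogger("eserver.search")  # component name is illustrative
    log.addHandler(handler)
    log.setLevel(logging.INFO)

    # The same request_id travels with a request through every tier,
    # so support can reconstruct one user's failing session from the logs.
    log.info("index query started", extra={"request_id": str(uuid.uuid4())})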

Performance

For our purposes, we will define performance as what the user experiences.  While there are many ways to improve the perception of performance, it comes down to how long it takes to perform the routine tasks that the individual user works with.  While all of us would like performance to always be instantaneous, we’ve come to expect different levels of performance depending on the specific function.  For example, typing response time should be instantaneous (the echoing of characters or clicks).  Screen pops should be barely noticeable.  Searches of several seconds can be tolerated.  Printing can take longer, although if very long, the user would like to see the printing become a batch job.

The second aspect of performance is how much the user’s perceived performance varies as environmental conditions change.  The ideal is for perceived performance not to change, but some variance is tolerated.  Environmental changes that can affect performance are:  speed of the hardware (client, server, network), number of users on the system (stand-alone, average, peak), size of the database or knowledge base, and complexity of the task.  A particularly irritating aspect of performance is if it degrades from release to release.  Nothing is worse than a user having to fight the double whammy of new functionality and degraded performance that often comes with new releases.  This last aspect was one of the major reasons we focused on performance for the eServer 4.0 release: in previous releases the perceived performance got worse.  Early reports indicated that users were quite pleased with 4.0’s performance, because it was dramatically improved at the integration point with call-tracking software, and similar or better search performance was experienced.
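
One way to keep release-to-release performance honest is to script the routine user tasks and track latency percentiles across builds.  A minimal sketch – the task and the budgets are placeholders, not Primus’s actual test suite:

    import statistics
    import time

    def measure(task, runs=50):
        """Median and 95th-percentile latency for one routine user task."""
        samples = []
        for _ in range(runs):
            start = time.perf_counter()
            task()
            samples.append(time.perf_counter() - start)
        p95 = statistics.quantiles(samples, n=20)[-1]  # last cut point = 95th percentile
        return statistics.median(samples), p95

    # Compare against per-class budgets (e.g. echo < 0.05s, screen pop < 0.5s,
    # search < 3s) and flag the release if any budget regresses from the last one.
    median, p95 = measure(lambda: sum(range(100_000)))  # stand-in for a real search call
    print(f"median {median:.4f}s, p95 {p95:.4f}s")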

Usability

This component focuses on the usability of the product.  Today, usability is often referred to in the larger context of User Experience (UX). Typically, this function studies the user interface for good design principles and the excess of things like too many changes of context, too many key clicks, or uncertainty on the users part as to what to do next.  While human centered design is more of a front end process looking at the context of the user in the real world, usability looks at the actions of the user in relation to the actual software.  As a result, usability is often done at the end of a project when there is a functioning product to work with and to study.

Scalability

We define scalability in terms of the purchaser rather than the user.  This function defines how many users can expect reasonable performance given a particular hardware/software environment.  Ultimately this translates into the system cost per user.  Scalability tests involve running loads against standard configurations for 100, 200, 400, 600 users and so on.  The ideal is that for a given performance level there is a specified hardware configuration that can handle the tested number of users.
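
A toy sketch of such a stepped load test, assuming a do_request callable that stands in for a real login or search against the system under test:

    import concurrent.futures
    import time

    def simulate_user(do_request, requests=20, think_time=0.5):
        """One simulated user: issue requests with think time, record latencies."""
        latencies = []
        for _ in range(requests):
            start = time.perf_counter()
            do_request()
            latencies.append(time.perf_counter() - start)
            time.sleep(think_time)  # model the pause between user actions
        return latencies

    def load_test(do_request, user_counts=(100, 200, 400, 600)):
        for users in user_counts:
            with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
                per_user = list(pool.map(lambda _: simulate_user(do_request), range(users)))
            flat = [t for user in per_user for t in user]
            print(f"{users} users: mean latency {sum(flat) / len(flat):.3f}s")

Holding the hardware configuration fixed while stepping the user count is what yields the cost-per-user curve the purchaser cares about.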

Security

Almost every day we read in the news of another security breach at a well-known company.  One of the more recent large security breaches, one that went on for a long time, was with Sony’s PlayStation Network.  Computer and application security involves multiple aspects of protecting information and property from threats like theft, corruption, or natural disaster.  For any organization that has personal information or critical data, the recommended process is to have both an in-house security team and a contract with external privacy experts who are well versed in how to “hack” into application systems and databases.

Test Driven Development

As a development manager begins to understand the depth of critical processes necessary to continue improving one’s RAAMPUSS quality, it is a good time to look at Test-Driven Development (TDD) methods.  While TDD does not solve all of the world’s ills, it goes a long way towards achieving RAAMPUSS goals.
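
As a minimal illustration of the TDD rhythm – the tests are written first and fail until the code exists – here is the downtime-budget helper from the Availability section grown test-first (an invented example, not a Primus test):

    import unittest

    def downtime_minutes_per_year(availability):
        """Minutes of allowed downtime per year for an availability target."""
        if not 0 <= availability <= 1:
            raise ValueError("availability must be between 0 and 1")
        return (1 - availability) * 365 * 24 * 60

    class TestDowntimeBudget(unittest.TestCase):
        # These tests were (notionally) written before the function existed.
        def test_three_nines(self):
            self.assertAlmostEqual(downtime_minutes_per_year(0.999), 525.6)

        def test_rejects_bad_input(self):
            with self.assertRaises(ValueError):
                downtime_minutes_per_year(1.5)

    if __name__ == "__main__":
        unittest.main()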


Competing Product Design Centers

Traditionally, product planning is the realm of the software engineering team, represented by a program manager or engineering manager, and the marketing team, represented by a product manager or product marketing manager.  Often these activities become exercises in “list management”: long lists of features accumulate for inclusion in future releases, and product planning consists of a prioritization exercise.  In addition, software engineers rarely have a good view of the context of the feature decisions, which makes deciding which way to go in developing the code a challenge when there are tradeoffs.

An excellent resource for the strategic and tactical aspects of product management and marketing is Pragmatic Marketing.  Their framework illustrates the range of tasks that a product management team needs to consider:

Pragmatic Marketing Framework

Good product development teams are fortunate to have several different viewpoints represented: very direct voices of the customer, informed voices of human centered design, technology insights, and market research insights.  Sorting through all this relevant research, we identify six categories of design input, or design centers, that provide strong, weak or implicit voices in the product planning process.  These design centers are:

  • Technology Centered Design
  • Human Centered Design
  • Customer Centered Design
  • Machine Learning Centered Design
  • Productivity Centered Design
  • RAAMPUSS Centered Design

The goal in moving forward with product planning for software products is to make explicit each of these design centers so that we can consciously and collectively arrive at defined goals for the product as a whole, a roadmap for the product, and a prioritized set of goals for each product release.

Technology Centered Design

As a software development company, the technology of the products is a key component of design.  This design center encompasses a range of design issues that are critical to a market:

  • Software Product Platform – this aspect looks at what platforms we will build our product on.  Examples of platforms include:  Microsoft Windows, Sun Unix, Linux, MS SQL, Oracle Database, IIS Web Server, Apache Web Server, Netscape Browser, Internet Explorer, Amazon Web Services (or other variants of the cloud), HTML5, and mobile platforms (iOS, Android).  Making a decision around each of these platforms determines what skills we will need and to some extent also determines the size of the market we will be going after.
  • Hardware Product Platform – what kind of hardware (server, desktop, laptop, tablet, mobile) will we require our customers to purchase?  Options range across the speed of the CPU, the amount of memory, the type of disk storage, and network capabilities.
  • Programming Language and Tools – defining the standard(s) for what language we use helps determine our ability to balance hiring profiles, what kind of performance we can expect from our products, and what kind of environment our customers will need to run.
  • Application Framework – should we have a proprietary application framework that crosses all of our products to increase the reuse of our components, or should we go with commercial or open source frameworks?

If we look at research as helping to determine what is possible, the technology centered design process focuses on the art of the practical, that is, what will work reliably for thousands to millions of customers working 24×7 around the world.

Human Centered Design

Human Centered Design (HCD) is concerned with the needs of all of the people who will be using our products in one form or another.  These may range from knowledge professionals to data preparation clerks to database administrators to managers to clients.  The focus of this design center is understanding the needs of the user, their points of pain and then designing interactions that will make their lives better.  A key part of this design center is the observation of users as they go about their daily lives in relationship to the opportunities that we are trying to design solutions for.  Users are quite inventive in how they solve many problems and often our best path forward is understanding their inventive solutions.  To paraphrase Ed Lazowska, the goal here is to “understand the misunderstandings” that keep users from creating the results that they want.  The core elements of the HCD process are illustrated below:

Human Centered Design Process

Throughout the HCD process, the designer is constantly iterating through the criteria of:

  • What is desirable to users?
  • What is possible with technology?
  • What is viable in the marketplace?

Customer Centered Design

Geoffrey Moore

While Human Centered Design focuses on the user, Customer Centered Design focuses on the purchaser and the influencer (see the post on Words Mean Something).  The purchaser is someone who actually buys the product.  They are usually a combination of the business manager, the IT manager and the procurement manager.  They have needs very different from the user’s, as they are looking at the business implications of the purchase and the business relationship with Attenex.   An influencer is a person either inside or outside the organization who helps set the context for the purchaser as to why a particular class of solution is important and who the key suppliers are for a given solution.  Geoff Moore has brought the interactions with purchasers and influencers alive in Crossing the Chasm and his many follow-on books.   The Tipping Point by Malcolm Gladwell does a particularly good job of describing influencers and how to influence them.

One of the best processes for getting at the voice of the customer (influencer and purchaser) was developed by Katherine James Schuitemaker with her Value Exchange Relationship framework.  This process focuses on the power of the brand to create value with influencers and purchasers by establishing the context for funneling customers into the sales efforts.  Launch customers can provide a strong customer voice and a good collaborating partner.  Channel partners should become a good set of collaborating business builders.

Machine Learning Centered Design

In order to keep from becoming a one-product wonder of a company, it is important at the early stage of development to invest precious resources in research.  This design center looks at what kinds of algorithms, computational linguistics, modeling and prototyping can help us stay ahead of our competitors.  The core of this design center is to capture data on every aspect of whom the software deals with and profile them in great depth (with permission, of course).  To further productivity, mathematicians need to bring powerful algorithms to aid us with unsupervised and supervised learning for very high dimensional data spaces.  Then this mathematics needs to be combined with equally powerful visualizations and interaction designs to ensure that productivity gains are realized for the users.  This group is also responsible for looking at the next big areas of potential automation.  A good research team will constantly look for patterns in the data to implement Slywotzky’s knowledge imperatives (a small clustering sketch follows the list below):

  • Move from guessing what customers want to knowing their needs;
  • Move from getting information in lag time to getting it in real time;
  • Move from burdening talent with low-value work to gaining high talent leverage.
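
As promised above, a compact sketch of the unsupervised side – clustering documents in a very high dimensional term space.  This assumes scikit-learn is available, and the four-line corpus is a stand-in, not Attenex data:

    from sklearn.cluster import KMeans
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["the backup crashed the database again",
            "online backup failed and corrupted the database",
            "search results vary with the phrasing of the query",
            "indexed search returns different results for the same terms"]

    # TF-IDF puts each document in a high-dimensional term space; KMeans groups them.
    X = TfidfVectorizer(stop_words="english").fit_transform(docs)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    for label, doc in sorted(zip(labels, docs)):
        print(label, doc)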

Productivity Centered Design

Oftentimes productivity is equated with “do it faster.”  At the heart of how a software product team should prioritize its research and development efforts is finding and solving those problems where we can achieve at least ten-times productivity increases.  Productivity is a complex interaction of “better, faster, cheaper” with ever-increasing quality (six sigma) and improved business relationships (customer, supplier, partner).

To improve productivity, it is important to have key metrics that are measurable and can be made visible to all parties.  We want to ensure that each feature that we add to our products improves the overall measures of productivity for our users, purchasers and influencers.  Productivity increases will involve complex balancing of machine improvements and user-level improvements that oftentimes are non-obvious.  As an example at Attenex, we thought deeply about whether we should spend more machine time on identifying near-duplicate emails (reducing our throughput) in order to reduce the number of documents that an attorney has to look at (decreasing human labor).  Identifying key metrics, and then making it painless to track the metrics and identify patterns, is the focus of this design center.
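
The near-duplicate tradeoff can be made concrete with a shingle-overlap similarity check – a sketch of one standard technique, not a description of Attenex’s actual algorithm:

    def shingles(text, k=3):
        """Set of overlapping k-word shingles; near-duplicates share most shingles."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

    def jaccard(a, b):
        """Shingle-set overlap: 1.0 means identical, 0.0 means nothing shared."""
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    # Emails scoring above some threshold could be grouped so an attorney
    # reviews one representative instead of every copy.
    print(jaccard("Please review the attached contract before Friday.",
                  "Please review the attached contract before Monday."))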

RAAMPUSS Centered Design

While it is functionality and our selling/marketing process that get our products into early adopter customers, it is our ability to continuously improve at RAAMPUSS that both keeps us installed and improves our reputation with our most important customers.  The components are:

  • Reliability
  • Availability
  • Administratability
  • Maintainability
  • Performance
  • Usability
  • Scalability
  • Security

The goal of this design center is to prioritize for each release which elements we will be focusing on and then to establish clear goals for that release to meet (a sketch of such goals follows below).  From a productivity standpoint, the effect of these elements shows up in how much labor and cost a company bears in support of its product, or in the additional sales costs to overcome objections or a poor reputation in any of the categories.  In addition to the development costs, several of these components also affect the Total Cost of Ownership metrics of our customers.  As a company matures with a product and a customer base, these functions become even more important when balanced against new functionality.  Part of the development risk equation is that new functionality increases the risk of destabilizing one or more of these components.
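One lightweight way to make those per-release commitments concrete is to record each chosen RAAMPUSS element with a measurable target, then gate the release on the measurements.  The elements chosen and the numbers below are invented for illustration, not actual Attenex release criteria.

    # Illustrative sketch: per-release RAAMPUSS goals as measurable targets.
    release_goals = {
        "reliability": {"metric": "mean_hours_between_failures", "target": 1000, "higher_is_better": True},
        "performance": {"metric": "p95_query_latency_seconds", "target": 2.0, "higher_is_better": False},
        "scalability": {"metric": "documents_per_matter", "target": 5_000_000, "higher_is_better": True},
    }

    def release_ready(measured):
        """True only if every chosen element meets its target for this release."""
        for element, goal in release_goals.items():
            value = measured[element]
            ok = value >= goal["target"] if goal["higher_is_better"] else value <= goal["target"]
            if not ok:
                return False
        return True

    print(release_ready({"reliability": 1200, "performance": 1.4, "scalability": 6_000_000}))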

Criteria for Prioritizing Clusters of Features

As part of moving from what is possible or desirable to build to what we will actually build, we need to establish criteria for the selection of a feature.  Examples of criteria for prioritizing are (a simple weighted-scoring sketch follows the list):

  • How does this feature reduce mean time to revenue?
  • How does this feature increase the productivity of a user?
  • How does this feature increase revenue for our customer, our channel partner or our company?
  • How does this feature reduce costs for our customer or for our operations?
  • How does this feature contribute to our core mission, vision and strategic intent?
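Here is the promised sketch: a simple weighted-scoring pass over candidate features.  The weights and the 1-5 scores are illustrative assumptions, not a prescribed rubric; the value is in making the trade-offs visible and arguable.

    # Illustrative sketch: weighted scoring of candidate features against the
    # criteria above. Weights and 1-5 scores are invented for illustration.
    criteria_weights = {
        "mean_time_to_revenue": 0.30,
        "user_productivity": 0.25,
        "customer_revenue": 0.20,
        "cost_reduction": 0.15,
        "mission_fit": 0.10,
    }

    candidate_features = {
        "near-duplicate detection": {
            "mean_time_to_revenue": 3, "user_productivity": 5,
            "customer_revenue": 4, "cost_reduction": 5, "mission_fit": 4,
        },
        "custom report themes": {
            "mean_time_to_revenue": 2, "user_productivity": 2,
            "customer_revenue": 2, "cost_reduction": 1, "mission_fit": 2,
        },
    }

    def weighted_score(scores):
        return sum(criteria_weights[c] * s for c, s in scores.items())

    for name, scores in sorted(candidate_features.items(),
                               key=lambda kv: weighted_score(kv[1]), reverse=True):
        print(f"{weighted_score(scores):.2f}  {name}")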

While no single framework or process can guarantee success, the combination of the above product design centers ensures that the needs of the customer (influencers, purchasers and users) will be heard.

Posted in Content with Context, Human Centered Design, Intellectual Capital, Knowledge Management, Learning, organizing, User Experience, Working in teams | 2 Comments

Digital Humanities – Really?

Russ Ackoff shared that the best knowledge system he knew was to have an intelligent set of graduate students who knew him.  In 1985 when we were meeting regularly, he described the joy every morning of coming in and finding 2-3 journal articles taped to his office door that his students thought were relevant to him in the moment.  He pointed out that the students knew his interests and his current projects and would look out for material they knew Russ would be interested in.  Russ chuckled and shared “graduate students are much better than any search engine could ever be.”

To Russ’s observations I would add that colleagues and professors who know me are also a great source of knowledge pointers, if I just remember to include them in what I am up to.

Cathy Davidson

Kate Hayles

I mentioned to my colleagues at UW Bothell who are working on the future designs for innovative universities that I was headed back to Durham, NC, where I hoped to meet with Cathy Davidson.  Gray Kochhar-Lindgren suggested that I also try to meet with Kate Hayles while I was at Duke.  Both professors were available and I looked forward to the meetings.

As I prepared for the meetings, I remembered another conversation with Russ Ackoff where he talked about his favorite design for a graduate seminar with his second and third year PhD students.  The class had only one assignment – each student had to teach Russ something that he didn’t already know.  With his impish grin, Russ described how much fun the first couple of weeks of the seminar were as the students went from thinking this class was a breeze to it dawning on them how hard it was going to be to figure out what Russ already knew.  He enjoyed the different strategies the students employed to “discover” what he already knew.

Russ delighted in the new things that he learned each semester.  However, he particularly loved how much he was able to impart to the students without ever having to lecture.  The students had to learn a large portion of what he already knew (which in my limited life experience was huge as Russ was the best systems thinker and synthesist I’ve ever encountered).

If you had asked me two months ago whether I was interested in learning anything new about the digital humanities, the answer would have been an emphatic “No.”  Yet after spending time with Alan Wood, a Chinese History professor, Susan Jeffords, an English professor (now UW Bothell Vice Chancellor), Gray Kochhar-Lindgren, a philosophy professor, and Jan Spyridakis, a technical communications professor (now Human Centered Design and Engineering Department Chair), my exposure to the humanities had increased by light years compared to the previous forty years of professional life.  The “Ah Hah!” moment when I realized I needed to spend some serious time understanding the digital humanities came at the recent Modern Language Association meeting in Seattle, where two English professors talked about Big Data and two computer scientists talked about the need for digital storytelling to go with their worlds of Big Data.  The world it is a shiftin’.

I was familiar with Cathy Davidson’s work through my research over the last two months, but I was unfamiliar with Kate Hayles’s work.  So I went to Amazon to see if Kate had written any books and out popped a list of several interesting titles.  I didn’t recognize any of them, but before I ordered them, I checked my Kindle library (nothing) and went to Librarything to see if I had any of her books.  Sure enough, I’d ordered and read Writing Machines.  One of these days I’m going to have to do a better job of remembering authors’ names.  So I ordered several of Kate’s books (How We Became Posthuman, My Mother Was a Computer, and Electronic Literature).  Two of the books were on the Kindle so I could scan through them pretty quickly for the key themes.

As I made my way to the Smith Warehouse where Cathy has her office, I marveled at how much the Duke campus had changed over time.  When I went to Duke (1967-1971), the Smith Warehouse was literally a tobacco warehouse.  Any time you came near the building you were assaulted with the cloying smell of tobacco leaves being aged and dried.  Now it was a beautifully remodeled space of bricks and 100-year-old wooden beams and floors.  I was reminded of Stewart Brand’s How Buildings Learn: What Happens After They’re Built.

The primary topic I wanted to explore with both professors was what qualities they thought were important for an idealized design of an innovative university.  Each professor was quite articulate about their ideas for the key qualities of the new university or the new humanities department.  The short version of these qualities is:

  • Collaboration and Collaboration by Difference
  • Provide flexible spaces for collaboration that can be easily re-configured
  • Rethink the curriculum to be multi-disciplinary and jettison many of the ossified department structures
  • Shorten the formal school year to end in March with the rest of the second semester spent in community based projects where professors, graduate students and undergraduate students from multiple disciplines team up with community members to work on important local problems.

Cathy emphasized many of the issues she raises in her books and her consulting with corporations.  She shared “we have to move from an educational model which is based on testing and mastering content to a learning model that is focused on process, collaboration and learning to learn.”  She quoted from sources that describe that the average college graduate will change careers 4-6 times during their lifetime.  Not just change jobs, but change careers.  She described how every time she talks to corporate groups, the business executives demand that we change the way we teach.  Most of these business people say some variant of “it takes us two years with recent college graduates to break them of their pursuit of individual mastery and being scared of being wrong to getting them comfortable with not knowing so that they can collaborate with a diverse group of professionals with different skills.”  Their plea is to stop turning out students with skills that business doesn’t need.

The more I talked with Cathy, the more I wondered how I had missed this transition in the humanities departments from being book based to being digitally based.  I finally asked Cathy how long this transition had been going on.  She reflected that it was about five years ago that humanities professors started paying serious attention to how computing could help their research and pedagogy.

I shared with Cathy that I was going to a Duke basketball game that evening with my nephew.  Cathy immediately used that topic as a springboard to what she had learned from the social environment of Duke basketball games and how she changed her class structure.  “Did you know that each year there is a student governance committee for Krzyzewskiville that takes the rules that the university mandates and then turns them into that year’s constitution for K-ville?  Can you believe that this system has worked since 1986?  Think about all of the issues of students camping out and the nature of 17-21 year olds potentially getting into fights and having drugs and alcohol.  There is no way it should have worked even for one year, let alone since 1986.  If you look at the ‘constitutions’ that are generated each year, they are far more comprehensive and restrictive than what Duke University requires.  So I decided to do that with my class.”

I loved the turn this conversation was taking.  I asked “I’ve tried to read everything you’ve written including your more recent voluminous Tweets and blog entries and I don’t remember seeing a discussion about starting your class with a constitution development. How long does it take?”

Cathy realized that she had not written more than a paragraph about this process and made a note to herself to write a blog entry about it.  Upon reflection, she shared “it usually takes between one and two class periods with a lot of homework crafting the Google Doc that has the constitution.  These are class periods where we’d be discussing the syllabus anyhow, but now it becomes the students’ syllabus.  The students always require more work than I would require.  And in the process about 20% of the registered students drop out, but that is OK as there is always a long waiting list.”

I can’t wait to see the blog entry and see some examples of both the K-ville constitutions and the course constitutions.  I will be interested to see how these constitutions relate to what Jim and Michelle McCarthy are trying to do with their Core Protocols for producing great teams who produce great products.

As I walked out of the Smith Warehouse and started my walk to Duke’s East Campus to meet with Kate Hayles my head was hurting with the implications of Cathy’s research and observations for the future of business as well as the future of the university.  I was shaking my head wondering how I’d missed this transition in the humanities to a digitally based paradigm.

Then I remembered that I’d glimpsed this world when I came across Franco Moretti’s Graphs, Maps, Trees: Abstract Models for Literary History (and the recent response to it – Reading Graphs, Maps, Trees – critical responses to Franco Moretti).  I bought the book more for its collection of visualizations in a topic area I wasn’t familiar with.  I was fascinated with the notion of “distant reading” that the author espoused.  Yet, like a lot of other concepts I’ve encountered over the years, I did not do much with it.

With my trusty iPhone 4S smart phone I was able to navigate my way to Kate’s office. What did we ever do to make our way in the world without these amazing devices?

I was less prepared for my meeting with Kate Hayles than I like to be.  However, she was very gracious and engaging and asked for some of the background on why I wanted to meet with her.  I described a little bit of my background and that Gray Kochhar-Lindgren had suggested that I meet with her so that we could gain her insights on how to design the idealized university.

As we talked she made notes of some of her books that might be of interest to my intellectual pursuits.  We started with a discussion of what her proposal for the restructuring of digital humanities looked like:

  1. Restructure humanities as a comparative study rather than organizing it by nationality (American, British, French), by genre, or by century. She suggests that epochs now be defined by their medium (oral, print, digital) as the lines between previous ways of characterizing the humanities are quite blurred.
  2. Shift how we think.  Digital humanities is shifting not just the answers but also the questions.  Digital humanities is a “technogenesis” as we are co-evolving along with the media.  Technology has changed how we read and we are changing neurologically as we read and use technology differently.
  3. Understand that electronic literature is different from print literature.  Computation is now a theoretical issue for the humanities, not just for the sciences.

The core of the transformation to digital humanities is understanding that the overwhelming focus on print as the medium of the last 300 years has given way to a new digital medium.

Since I am mostly a bottom up kind of thinker, I asked Kate to give me an example of what she meant.  She pointed to an example in the print world of a shift in media.  When William Blake first published his poems he wanted complete control of the publishing process as he wanted his poetry “read” surrounded by appropriate artwork (see William Blake Archive).

The reader of the original poetry would have a very different experience than a reader of a more modern print edition of The Poems of William Blake, which presents just the plain text:

Similarly, Kate pointed out that when print “texts” are translated into the digital medium they become different.  They are “read” differently.

A recent Wall Street Journal article describes this shift in media as “Blowing Up the Book”. One of the adaptations to eBook format mentioned in the article is T.S. Eliot’s “The Wasteland.”  This iPad app “includes a facsimile of the manuscript with edits by Ezra Pound, readings by Eliot recorded in 1933 and 1947 and a video performance of the poem by actress Fiona Shaw.”

If you compare the print version of the poem with the enhanced version, you come away with a very different understanding of the poem in the electronic version than in just the “plain text.”  The following three screen shots give you a sense of the richness of the electronic version:

Table of Contents of the iPad app "The Wasteland"

Eliot Scholars commenting on "The Wasteland"

My favorite “digital media” variant within the app is Fiona Shaw performing the poem while the poem’s text is also presented on screen with the current line of the poem she is speaking highlighted in blue.

Fiona Shaw performing "The Wasteland"

Slowly but surely I was beginning to get a sense of what Kate was describing. This discussion started reminding me of the philosophical question “Can you step into the same river twice?”

As I looked at these new forms of digital text where the text is embedded in art, I reflected on a recent conversation with Jim and Michelle McCarthy where they showed me examples of reports they gave to clients.  These reports were fragments of text placed on top of the team art that was generated during one of their Bootcamp weeks.  Both of these discussions reminded me of Nick Bantock’s series of books that started with Griffin & Sabine: An Extraordinary Correspondence.  Bantock created a book as a series of illustrated postcards and letters for the reader to “experience” the correspondence.

Illustrated Book as Postcard Correspondence

Given the power of the inclusion of team art on the Bootcamp weekend, I wondered if we should be doing that with our emails.  Instead of sending plain text emails, we should surround our text with appropriate art to reinforce our message. Stan Davis in The Art of Business suggests that the absence of art in the workplace was one of the explanations for the lack of creativity and innovation.

Then, Kate hit me with the real paradigm shift here.  Along with comparing “texts” across different media, she is using literary critique skills to critique code.  She described this emerging field of critical code studies.  I wasn’t sure I had really heard what she just said so I asked for a specific example.

Kate explained “We are now as interested in critiquing the software as we are in critiquing the text.  There are several efforts under way to have side by side displays of the ‘digital text’ and the software that implements the digital text.”  Now I knew that I had just fallen down Lewis Carroll’s Alice’s Adventures in Wonderland rabbit hole.

“Let me see if I understand this right,” I asked.  “You mean to tell me that humanities students are both interested in software and have the ability to critique and write software in a humanities course?”

Kate looked at me a bit like I was a freshman student, and patiently explained “of course, this current generation is interested in software.  This is the digital native generation and they are eager to do the software explorations.  They are frustrated with those of us from the old school who only want to focus on print.”

“Let me try one more time.  There are not any humanities majors I know (including one of my children) who have the least bit of interest in computing.  They chose the humanities so they could stay away from science, math, and computation,” I asserted.

Kate just smiled and suggested that I ought to sit in on one of her classes where they do exactly what she is describing – study comparative literature by creating and critiquing software.  Kate said that given this turn in the conversation she would send along a couple more chapters from her latest book.

I knew that I needed more grounding in what Kate was describing so I asked for some specific examples.  She pointed me to Mark Marino’s Critical Code Studies to give me an overview of this simultaneous critique of the text and the code.  Another researcher in this area is Alan Liu with his Research-oriented Social Environment (RoSE) project.  She suggested I look at John Cayley.  I particularly liked Cayley’s Zero-Count Stitching or generative poetry (I wonder if somebody will combine Zero-Count Stitching with the generative poetry on Sifteo Cubes).

However, the example that really grounded me in what Kate was trying to articulate was Romeo and Juliet: A Facebook Tragedy.  This research project involved a group of three students translating Romeo and Juliet into Facebook.  The students described their results:

“Reading the story as we have created it requires users to navigate through various Facebook features such as the “Wall,” “Groups,” “Photos,” and “Events.” Following the story in this way is similar to a work of hypertext fiction. However, the advantage offered by Facebook is that the interactions are ordered and timestamped, allowing for users to more easily discern which interactions come first in which progressions. We feel that this means of presenting a story offers a benefit of hypertext, forcing users to interact with the text, but at the same time it cuts down on much of the confusion by clearly communicating the progression of the overall plot.

Character interaction map from Facebook version of Romeo and Juliet

“Manufacturing character profiles based on the limited information in the text was difficult. We relied on individual interpretation and key themes surrounding each character. We supplied interests, books, movies, music, etc. that individuals with those character traits and personality types would be likely to enjoy. Character development was further facilitated by use of various applications and groups which we had characters add or join in order to reflect what we interpreted as their key traits. We feel we have provided somewhat more complete profiles for each character, hopefully to the aim of making them more relatable and providing more depth.

“The project also unexpectedly became an exploration of how virtual role-playing could potentially produce a simulation or model of events in plays and/or novels. Despite the limited nature of the character profiles offered in the original text, enough detail was present to conclude certain character types and the ways that certain characters would act. With close reading, we were presented with certain constraints (ie: character traits, personalities, and relationships), and we had to make sure that these constraints were incorporated into our simulation aside from use in the creation of the profiles. For example, Tybalt in the play is an angry character. Therefore, he was permitted to only perform angry actions and have angry interactions with others on the site. With these constraints, group members attempted to play out the rest of the story while keeping in mind that certain actions or interactions needed to occur for the plot to move forward.”

I was really hooked now and could not wait until I got back to a computer to go online and explore these links and examples. What wonderfully creative ways to learn both narrative structures and programming.  I know I have just found an important source of inspiration for the next generation of “content with context” software I want to build.
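The students’ structural insight, that ordered, timestamped interactions tame the confusion of free-form hypertext, is easy to see in miniature.  A hypothetical sketch (the events and dates are invented, not taken from the actual project):

    # Hypothetical sketch: a plot rendered as ordered, timestamped interactions,
    # the structural idea behind the Facebook adaptation. Events are invented.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Interaction:
        when: datetime
        character: str
        feature: str  # "Wall", "Event", "Group", "Photos", ...
        content: str

    plot = [
        Interaction(datetime(1597, 7, 21, 23, 30), "Juliet", "Wall", "wherefore art thou Romeo?"),
        Interaction(datetime(1597, 7, 20, 21, 0), "Romeo", "Event", "RSVPs to the Capulet masquerade"),
        Interaction(datetime(1597, 7, 23, 15, 0), "Tybalt", "Wall", "challenges Romeo to a duel"),
    ]

    # Unlike free-form hypertext, sorting by timestamp always recovers the plot order.
    for event in sorted(plot, key=lambda e: e.when):
        print(event.when, event.character, "on", event.feature, "->", event.content)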

“One of the most important vehicles for the digital humanities is to create projects.  An example of a project at Duke is The Haiti Lab (Cathy Davidson also used this example). The project focuses on a wide range of topics associated with Haiti including art, demographics, and epidemiology.  The project members provide a vertical integration with undergraduates, graduate students, post doctoral researchers, and professors,” she elaborated.

Our time was almost up and I knew I could find out more about these topics from her books, her chapters that she would send, and poking around her online references.  So I moved on to ask her if she were doing an idealized design of a university, what were some of the qualities that she would want embraced in the design.  Kate shared her top three qualities:

  1. Collaboration.  The focus of everything in the new university has to be collaboration.  There is just too much for any one person to master.  We have to prepare students for the way of the world now – collaboration.
  2. Flexible spaces.  Space is more bitterly fought over within the university than any other resource.  Yet, our facilities are designed almost exclusively for lecture-based classes.  We need spaces that are open and can be reconfigured quickly with no fixed seating.  We need spaces where work can be left on the walls or partitions so that it can be seen and commented on by others.
  3. Rethinking the curriculum. We need to jettison the categories and departments that don’t make sense anymore.  So many of the departments within the university are ossified and self-perpetuating.  The sciences are much better about regularly revisiting the curriculum than the humanities.  The curriculum has to be multi-disciplinary.

I thanked Kate as we finished up and asked her if she would invite me back in the fall to sit in on one of her courses that explored humanities students creating the software for “digital texts.”  She kindly thought that would be a great idea.

The next morning the three chapters she had promised from her forthcoming book How We Think: Digital Media and Contemporary Technogenesis showed up in my inbox.  On my flight back to Seattle, I read these chapters along with finishing up David Weinberger’s Too Big to Know.  Unintentionally reading these two documents simultaneously shed even more light on the challenge of “networked knowledge structures,” which require collaboration and storytelling to make meaning.  Kate shared that the timeless questions from her perspective are:

  • How to do?
  • Why we do?
  • What it means to do?

She points out that the latter two questions are what the humanities are really good at understanding. My focus is on the first two questions.  I guess we will meet in the middle.

While I was very appreciative of the work that Kate and her fellow travelers were creating, it never dawned on me that from completely different directions we might be developing the same types of tools.  In Chapter 2 of How We Think, Kate pointed to the Digital Humanities Manifesto 2.0 to describe the first two waves of the new field:

“Like all media revolutions, the first wave of the digital revolution looked backward as it moved forward. Just as early codices mirrored oratorical practices, print initially mirrored the practices of high medieval manuscript culture, and film mirrored the techniques of theater, the digital first wave replicated the world of scholarly communications that print gradually codified over the course of five centuries: a world where textuality was primary and visuality and sound were secondary (and subordinated to text), even as it vastly accelerated the search and retrieval of documents, enhanced access, and altered mental habits. Now it must shape a future in which the medium‐specific features of digital technologies become its core and in which print is absorbed into new hybrid modes of communication.

“The first wave of digital humanities work was quantitative, mobilizing the search and retrieval powers of the database, automating corpus linguistics, stacking hypercards into critical arrays. The second wave is qualitative, interpretive, experiential, emotive, generative in character. It harnesses digital toolkits in the service of the Humanities’ core methodological strengths: attention to complexity, medium specificity, historical context, analytical depth, critique and interpretation. Such a crudely drawn dichotomy does not exclude the emotional, even sublime potentiality of the quantitative any more than it excludes embeddings of quantitative analysis within qualitative frameworks. Rather it imagines new couplings and scalings that are facilitated both by new models of research practice and by the availability of new tools and technologies.”

As I eagerly read more, I realized that the tools of the first wave of digital humanities were trying to recreate what we built with Attenex Patterns.  The second wave of digital humanities was partially implemented in Attenex Patterns and is extended through what I’ve been calling “content with context” as I research and design this tool set for applications like patent analytics and loyalty marketing.  This new tool set provides the visual analytics needed for semantic networks, social networks, event networks, and geographical networks.  Who would have believed that I would find such great resources in a distant context – digital humanities – to extend my design to include features like curated storytelling.

Once again I am reminded of the old proverb “When the student is ready the master will appear.”  Like Russ Ackoff, I am grateful to a collection of students and colleagues who with gracious synchronicity point me to the human talent that I need when I need it.

Digital Humanities – Really?  Yes, Really!

Posted in Content with Context, ebook, Human Centered Design, Idealized Design, Intellectual Capital, iPad, Knowledge Management, Learning, Relationship Capital, Russ Ackoff, social networking, Teaching, University, WUKID | 13 Comments

Cameron Crazie for a Night

I am an over-the-top obnoxious Duke Men’s Basketball fan.  I have to be, as the rest of my siblings and my wife and her siblings are Carolina graduates (now there is an oxymoron).  Ever since I entered the hallowed halls of Cameron Indoor Stadium on the Duke West Campus as a freshman, I have cheered the Blue Devils through good years and bad.  Until Coach K came along there were a lot more bad years than good.

As the gods would ordain it, I was in Asheville, NC for a family life event last week.  When I realized that I would be so near Durham, NC, I decided to spend a couple of extra days and see if I could get a meeting with a recent addition to my “invisible university,” Duke professor Cathy Davidson.  I arranged for the meeting and called my sister in Chapel Hill, NC to see if I could spend the night with her family.  She was overjoyed and reminded me that it was my nephew Ross’s 16th birthday.

At our family gathering, I asked my nephew Abe, whose son had just turned 16, what I could get Ross for his birthday.  He laughed and said “If you are not giving him a car, 16 year old boys don’t seem to care about much else.”  A car was out of the question, but I got to wondering if Duke might have a home game on Thursday night.  As luck would have it, Duke had a home game with Wake Forest.  So I went to Stubhub to see if there were any tickets available.  Eureka.  There were.  However, as I checked out the prices, they spiraled upward to over $200 a seat as game time approached.  As much as I would like to irritate my sister, these prices were a little rich for me.

My sister suggested that the Carolina Hurricanes hockey team might be in town and we could go as a family group.  Indeed the Hurricanes were in town so that was a viable option.  However, I really wanted to see if I could get into Cameron Indoor Stadium and irk my sister something precious.  So I checked the Duke official ticket website and I couldn’t believe it.  There were lower bowl tickets for $65 apiece.  It never occurred to me that I could get a seat at floor level in Cameron.  I wasted no electronic time ordering the tickets.

I picked up my nephew, gave him his official Duke basketball shirt (angering my sister in her UNC sweatshirt mightily) and we headed to Durham for the game.  When I’d picked up the tickets from Will Call earlier in the day, the agent said that I should get there at 4pm to line up as the tickets weren’t for reserved seats.  As we hurried to the game, I made sure that I took Ross through Krzyzewskiville to see the camping students.  We were able to get in line about 5:30pm and we were only 40 fans back.  Earlier in the day, Cathy Davidson had talked about the amazing K-ville constitution that the students develop and ratify each year.  This model of student governance is what led her to have the students develop a constitution in her class.

The doors opened at 6pm and we crowded in.  And what to my wondering eyes should appear: we were channeled into the Duke student section.  We were actually going to be able to stand on the top row of the student section bleachers directly across from the Duke bench.  I had truly died and gone to Duke Blue Heaven.  Ross had an ear-to-ear grin on his face and was distracted in every direction looking at the goofy costumes and painted students and the never-ceasing cheers.

The last forty years of living melted away and I was that goofy teenager attending my first Duke basketball game in 1967.  The mind is an amazingly plastic set of memories, allowing me in a matter of a few minutes to go from an accomplished professional to a young college student cheering his brains out.

As I scanned the stadium’s rafters, there were lots of wonderful additions in the form of four National Championship banners.  The newest addition was a big banner celebrating Coach Mike Krzyzewski’s becoming the NCAA’s winningest men’s basketball coach.  Lots of the attendees were wearing T-shirts proclaiming “903 wins and counting.”

Then the memories returned one by one.  I remembered sitting under the Duke basket one night and having future Senator and NBA Hall of Famer Bill Bradley of Princeton land in my lap after being fouled by Mountain Mike Lewis from Missoula, Montana.  I remembered the feverishly awaited showdown with the University of South Carolina when they were in the ACC.  We Cameron Crazies were out early that night for blood and couldn’t wait to swear at full voice at John Roche and Bobby Cremins and the rest of the “dirty” Gamecock players coached by the get-all-red-in-the-face New York Irishman, Frank McGuire.  Coach McGuire was doubly hated because he had coached successfully at UNC Chapel Hill for many years, winning a National Championship in 1957.

Since the students could be right behind the visiting team’s bench, we were all wondering how McGuire was going to get any coaching in with the shouted profanities that would be going on all night.  Imagine how we all were rolling on the floor laughing when we walked in and saw eight New York Irish Catholic priests sitting in the bleacher row right behind the South Carolina bench.  Assistant Coach Hubie Brown had an uncle who was a priest and he’d paid for his uncle and seven of his colleagues to come down and “protect” one of their own in Frank McGuire.  I don’t know that any cameras captured Coach McGuire’s face as he broke up in laughter when he saw his protective phalanx of Irish priests.  Frank eagerly went over and shook the hands of all the priests and thanked them for being his guardian angels that night.

I have no idea who won the game, but I do remember that strangely enough there was no profanity in the air.

When I started at Duke, freshmen were not allowed to play on the varsity team so there was a separate freshman basketball team.  Two of the team members lived in our dormitory.  I can remember many nights coming back to the dorm and having to slither my way through the narrow, short hallway where three 6′ 10″ outsized human beings were gathered around, hunched over to avoid hitting their heads on the ceiling.  I didn’t even come up to their belt buckles.  I can remember sitting in the stands at their games and looking at the point guard, Dick DeVenzio, and marveling at what a shrimp he was.  Then my roommate would laugh and remind me that Dick was taller than I was.

It is raining threes at Cameron

Before I knew it the Duke-Wake Forest game started, and there I was with the most unexpected seat in the house.  I was back in the student section, the Cameron Crazies’ hallowed ground.  I was taking pictures in every direction and emailing my brother and sister, taunting them about how great Duke is.  Their taunts came flowing right back.  All is right in the world.

As expected, the Cameron Crazies’ cheers were as inventive as ever.  So many of them never make it to TV.  I particularly loved the faint murmuring of “four, four, four” accompanied by the waving of our fingers at the player whenever a Wake Forest player received their fourth foul.  Yet, I was bummed as I never heard the cheer I really wanted to hear.  Oh ye of little faith.  Wait for it … In the last thirty seconds of the game, the Crazies cranked it up: “Go to Hell Carolina!  Go to Hell!”  Yes, all is really right with the world now.

I sent photos out to my children, and my sports fanatic lawyer daughter, Maggie, immediately replied asking what kind of dad I was since I’d never taken her to a Duke basketball game.  Now that I know I can get Duke basketball tickets, I look forward to fixing that oversight.

To put a delightful stamp on the night, Duke beat Wake Forest, 91-73.  What a joy to be 17 years young again and remind myself about where and how my life’s journey started – growing up in Cameron Indoor Stadium.

For those of you not familiar with the Duke-Carolina rivalry, I am told there is a wonderful book about it – To Hate Like This is to be Happy Forever:  A Thoroughly Obsessive, Intermittently Uplifting, and Occasionally Unbiased Account of the Duke-North Carolina Basketball Rivalry.  Even though I buy hundreds of books a year, I’ve never felt the need to buy this book – I’ve lived it for forty years.

Shortly after I shared this post with my brother, he sent this photo to remind me that there is another side to this story:

Since posting this blog entry, several articles have shown up on the decline of student interest in Duke Basketball.  Their boredom was my gain in being able to relive the Duke Basketball experience.

Posted in Sports, Travel, University, User Experience | 3 Comments

Beautiful Day at Duke University

What a quick way to drop away forty years of my life as I revisited Duke University this week.  As I walked around the quadrangles, so many memories from my four years as an undergraduate came flooding back.  Thoughts I had not remembered for a long time seemed like just yesterday.

The sky was a deep “Duke Blue” today, not that Carolina faux blue.  Enjoy a couple of pictures from West Campus and East Campus.

Duke Chapel on West Campus

Duke East Campus

What made the trip special was meeting with Cathy Davidson and Kate Hayles in the afternoon to get as much insight into the digital humanities as I could absorb.

Synchronicity struck as the Duke Men were playing Wake Forest in basketball and I got to take my 16-year-old nephew to his first Duke Basketball game.  We became Cameron Crazies for a night.

Posted in Learning, Photos, Travel, University | Leave a comment

Envisioning the Visual Analytics Future – circa 1986

The Fantastic Voyage – Computerworld, November 24, 1986 (John Kirkley)

The following article appeared in Computerworld and described a talk I gave about a potential vision for a powerful visual analytics user interface.  I had not remembered this article until finding it while preparing a series of blog posts on the Making of ALL-IN-1.  As I re-read this article, I realized how much of this vision we captured when we created the Attenex Patterns product for legal electronic discovery in 2000.  While my conscious mind had forgotten all about these ideas, the thoughts were clearly wired into my thinking process.

“Skip Walter clapped his hands together loudly, startling several people in the front of the meeting room.

“Walter, Digital Equipment Corp.’s manager of business office services and applications and the “father” of All-In-1, was making a point.

“He was telling the attendees at a recent industry executive forum about some of his explorations into the nature of communications, explorations that could eventually lead to the design of new and radically different office information systems.

Sophisticated methods

“Right now, he said, the computer human interface is primarily visual and character-based. It works fairly well. But every day, in the simplest person-to-person interchanges, human beings use far more sophisticated methods of assimilating, storing and communicating information.

“To illustrate, Walter recalled a financial officer at DEC who used an interesting analogy to explain why substantial cash reserves were necessary for a fast-growing company.

“It is like driving down a highway, the man said, in a car marked Income. Right behind you is another car with Expenses painted in big red letters on the side.

“Now you know, for safety’s sake, the distance between your car and the car behind you — the safety zone. That is your cash reserve.

“Now if you’re driving at a sedate 10 mph, you don’t need much space between vehicles. But here you are, clipping along the thruway at 60 mph in your souped-up, fuel-injected Income sports car. If you’re without adequate cash reserves, you’re racing down the highway with that Expenses car a mere four inches from your rear bumper. If your income falters for even an instant, what happens? Wham! And here Walter clapped his hands together, making his listeners jump.

“The story, he explained, took a dry accounting idea and made it understandable and memorable by appealing to all the senses we use to assimilate information — the visual (you can see the cars), the auditory (the hand clap) and the kinesthetic (you can feel yourself speeding down the road and sense in your gut the wrenching impact of two crashing vehicles).

“What people want, Walter explained, is communication, not information. ‘I receive 100 mail messages a day,’ he said. ‘That’s 600 to 700 pages of information. I can’t physically scan, much less read to understand, all the trade publications and books I need to. Or talk with all the people I need to. I don’t need more information; what I need is a way to communicate with others and with myself about the meaning of facts — not moving these facts back and forth.'”

“Walter was delving into difficult questions. He was probing that often explored but little-understood arena where people, processes and technology combine to form what he characterizes as a living, intelligent structure.

“William Wordsworth, when writing about the modern scientific reductive method, which attempts to understand a process by chopping it up into separate pieces, said, ‘We murder to dissect.’ The point is that the intelligence of the structure cannot be isolated: It is enmeshed in the total structure itself.

“To communicate knowledge across this living network of people, processes and technology, new and innovative methods of presenting information must be developed. They must involve our visual, auditory and kinesthetic senses.

“To illustrate his point, Walter unveiled some proprietary research on which his group is working. He showed several short videotapes about a mundane subject — data base design and information retrieval.

“But the tapes were far from mundane. The attendees saw the data elements in three dimensions and in color. Elastic connectors, fine white filaments, stretched between the data elements, visually indicating the web of relationships. It was reminiscent of the film Fantastic Voyage, in which the characters, miniaturized by technology, enter a man’s body and use a tiny submarine to sail through the uncharted regions of his body.

Shifting relationships

“In the DEC video, you move in three dimensions among the data, changing it, rearranging it, retrieving it, observing in real time how the relationships between elements shift.

“More important, because of the way the data is presented, you are able to bring your intuitive faculties to bear as you roam this digitized landscape.

“The videotapes were rudimentary, but the possibilities are fascinating. Imagine adding sound and a joy stick. You could zoom among the towering structures that you have built like an intergalactic fighter pilot from Star Wars. Others could join you in this network of information and ideas, and, like explorers mapping uncharted territory, you together discover new relationships, new roads to explore. Unlike real life, if you fall off a cliff, it’s not fatal; you simply push the reset button and try again.

“As Walter sees it, the next step is deceptively simple but hard to realize: the design of human interfaces that use sound, pictures and movement. As this approach develops, we will be making the first tentative steps toward tapping the tremendous capabilities latent in the partnership between man and technology.”
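Reading that description today, the “elastic connectors” between data elements sound like an early spring-embedder, what we now call a force-directed graph layout.  Here is a minimal sketch of the idea in Python with NetworkX; the graph and its labels are invented, and none of this is the original DEC demo code.

    # Minimal sketch of an "elastic connector" (force-directed) layout in NetworkX.
    # The graph and its labels are invented; this is not the 1986 DEC demo.
    import networkx as nx

    # Data elements and the web of relationships between them.
    graph = nx.Graph()
    graph.add_edges_from([
        ("customer", "order"), ("order", "invoice"),
        ("order", "product"), ("product", "supplier"),
        ("invoice", "payment"),
    ])

    # The spring layout treats each edge as an elastic connector and lets the
    # nodes settle into positions that reflect the relationship structure.
    positions = nx.spring_layout(graph, seed=42)

    for node, (x, y) in positions.items():
        print(f"{node:>8}: ({x:+.2f}, {y:+.2f})")

Add a renderer and interaction handling, and you have the rudiments of roaming the digitized landscape those videotapes showed.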

As I get ready to develop the next generation visual analytics software, it is a delight to see how much of this thinking has been a part of my conscious and unconscious processes for forty years.

What else should we be thinking about as we launch the development of “content with context?”

Posted in ALL-IN-1, Content with Context, eDiscovery, Human Centered Design, Idealized Design, User Experience | Leave a comment

Ode to Steve Jobs

On the way to somewhere else in preparing the last couple of blog posts, I came across a reflection document I prepared in 1990, “ALL-IN-1 Ten Years Later.”  Buried in the document was an article that caught my eye about Steve Jobs’s vision for NeXT Computer.  Now that we are twenty years farther along, I found it interesting to reflect on Steve Jobs’s vision and the ALL-IN-1 vision.

In the January 29, 1990, Businessweek, an article appeared about Steve Jobs’s vision of what the NeXT computer meant for business.  NeXT is about making electronic mail systems happen.  Excuse me.  Maybe timing is everything, but I thought that was our vision for ALL-IN-1 in the 1980s.  Then I got to thinking that for his target market – PC users upgrading to workstations and Local Area Networks – electronic mail that automates workflow processes (like expense reports, capital appropriation requests, etc …) probably is a far-out vision.  I also remembered a conversation with the DuPont account team as DuPont was doing a review of their return on investment for their ALL-IN-1 systems.  DuPont realized they had only put the infrastructure in; they had not implemented the customized workflows of the original plan.
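Those customized workflows are, at heart, small routing state machines: a document is submitted, routed for approval, checked against a budget, and only then acted on.  Here is a minimal sketch of the idea in Python; the states, names, and budget figure are invented for illustration and are not the actual ALL-IN-1 (or NeXT) code.

    # Minimal sketch of an approval workflow: submit, route, check budget, approve.
    # States, names, and the budget figure are invented for illustration.
    from dataclasses import dataclass, field

    BUDGET_REMAINING = 5000.00  # illustrative department budget

    @dataclass
    class Requisition:
        requester: str
        amount: float
        state: str = "draft"
        history: list = field(default_factory=list)

        def submit(self):
            self.state = "awaiting_approval"
            self.history.append(f"{self.requester} submitted for ${self.amount:.2f}")

        def approve(self, approver):
            if self.state != "awaiting_approval":
                raise ValueError("can only approve a submitted requisition")
            if self.amount <= BUDGET_REMAINING:
                self.state = "approved"  # only now would a paper purchase order print
                self.history.append(f"approved by {approver}")
            else:
                self.state = "rejected_over_budget"
                self.history.append(f"rejected by {approver}: over budget")

    req = Requisition("skip", 1200.00)
    req.submit()
    req.approve("finance_manager")
    print(req.state, req.history)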

The following article represents a view from the 1980s of the VISION of what office automation is (or could be) about.

The Third Wave According to Steve Jobs

“What’s good for Next Inc. will be good for Corporate America. That’s how Chairman Steven P. Jobs sees it. “We put a NeXT on each desk, and it changed the company in ways I never expected,” he says. Those computers, networked and programmed with sophisticated electronic messaging software, have nipped the incipient bureaucracy that can slow even a 300-employee startup. Once refined, Jobs adds, such systems will launch a third wave in PCs – “far bigger than spreadsheets and desktop publishing.”

“Jobs says that the third wave will raise productivity by doing away with paper memos, forms, and even phone calls. At NeXT, for instance, purchase requisitions are written on a computer, passed along the network for approvals, and checked against the budget. Only then is a paper purchase order printed. Check requests work the same way. Schedules, notices, and announcements are posted electronically. And because electronic mail lets participants prepare better, meetings are more productive. “They’ve been cut in half, and more people get involved in key decisions,” Jobs says.

“Detractors note that this is not unique and can be accomplished with cheaper computers, such as IBM PCs. “I don’t see where it’s all that different,” says John R. Lynch, director of business markets for NeXT rival Sun Microsystems Inc.

“But Jobs insists that built-in features, such as software that lets the Next computer do several things at once, will make it a networking standout. PCs have options for handling digital sound for voice mail, but sound is standard on Next, as is voice-mail software. Today, Next uses custom software to exploit such functions. When third-party companies develop special software for that – later this year – then Corporate America can try the third wave, too.”

Rest in peace, Steve Jobs.

Posted in ALL-IN-1, Knowledge Management, User Experience | 2 Comments

Good Software Never Dies – ALL-IN-1 becomes Enterprise Vault

In 1979, John Churin and I created an enterprise Office Automation product called ALL-IN-1.  I left the full-time management of the project in 1986 and then left Digital Equipment Corporation in 1990.  For some 18 years, ALL-IN-1 generated $1 billion in sales for DEC.  I next encountered ALL-IN-1 while consulting with Health Partners in 1999.  I was pretty confident that the software wouldn’t last beyond 2000 because of several Y2K issues I knew were buried in the software.

By 2000, I was enmeshed in yet another startup, creating Attenex Patterns for eDiscovery. In 2007 we lured Greg Buckles (now of eDiscovery Journal) away from Symantec where he was a senior product manager for the Enterprise Vault product.  Greg has an impish sense of humor and was constantly dropping random hints that we had a very interesting professional connection.  Even when he talked about spending a lot of time in Reading, England when he was at Symantec, I still didn’t catch on.

One afternoon, in a bar of course, at the St Paul Hotel in St Paul, MN, with our colleagues from a recently completed EDRM meeting, we were challenging each other with past war stories. Present at the table were George Socha (founder of EDRM), Laura Kibbe and Kevin Esposito (formerly Pfizer’s Directors of eDiscovery), Greg and me.  Greg suddenly announced “There is somebody at this table that has created the two highest revenue generating products in the eDiscovery industry.  Can you guess who?”

I knew that I had created one of them in Attenex Patterns.  Yet, as I looked around the table, I wasn’t aware that any of my colleagues had ever created even one software product.  Greg elaborated further “And Pfizer uses both products every day.”

Now we were all confused. Then with a big Cheshire Cat grin, Greg shared “Skip, it is you.”

I responded “What are you talking about Greg?  I created Attenex Patterns, but not any other eDiscovery products.”

Greg’s grin got even wider when he shared “You never knew that Symantec’s Enterprise Vault product was really ALL-IN-1.”

I had no idea.  So for the next half hour, Greg shared the story of how when DEC was sold to Compaq, Nigel Dutt, one of the UK ALL-IN-1 developers, bought ALL-IN-1 and turned it into today’s Symantec Enterprise Vault (after some intermediary mergers and acquisitions).  I was stunned.

A month later on one of his visits to Seattle, Greg came over to our house for a wine and glass tasting with some wonderful Archery Summit Pinot Noir.  He was kind enough to share the evolutionary story of ALL-IN-1 with my wife.  She pleaded with him to write the story so that she could share it with our children.  Greg agreed.

A few days later this wonderful fairy tale showed up from Greg:

“Once upon a time, in the old kingdom of DEC, there lived a wizard named Skip. This wizard belonged to a cabal that was building a special spell, called the All-In-One Spell. The cabal worked long and hard to make the spell that would store whatever you needed. Then the evil Compaq Compact plotted the overthrow of the kingdom of DEC. The wizard Skip escaped the invasion, but many of the cabal were captured and forced to work on the spell, turning it into a spell vault to hold all the whisperings within a kingdom safely locked away.

“But the evil Compaq could not control the cabals, wizards and spells that they had taken, so many wizards escaped and took the spells with them. The wizard Nigel escaped with the Vault and ran back to the same castle in Reading where the spell had first been born. He gathered as many of the original cabal as he could find and they cast the spell for many kingdoms.

“The Vault spell was so powerful that almost all of the greedy gnomes with banks along the great Street of the Wall used the spell to listen to the whispers of all of their trader gnomes. But the trader gnomes got greedy and began to steal from everyone. So the great sheriff Spitzer declared war upon the merchant gnomes and all the other great houses of trade. He demanded all of their hoarded whispers to find the bad gnomes.

“In the far west, the House of El Paso struggled to fend off the attacks of the sheriffs. They hired a young wizard named Greg to find all the whispers and deliver them to the sheriffs, but he needed the Vault spell to do it. He made another spell to work with the Vault. A spell that saved his House and many other Houses under assault.

“The time of troubles passed, but Kingdoms, Counties and Houses across the world wanted the Vault and the new Discovery spell to protect themselves. Great bags of gold came to the Knowledge Vault Sorcerers and they grew rich. So rich that great kingdoms vied to buy the secret code of the Vault. The kingdom of Symantec bought the spells and many of the Cabal went forth to start new cabals that created spells to understand all of the whispers captured in all of the Vaults.

“Wizard Skip had created many spells since his time with the kingdom of DEC. His reputation had grown and his spell was the strongest spell for understanding the Patterns in the whispers. The House of Ten Times was known far and wide for the power of their spells.

“Always seeking to work with the best spells in all the lands, wizard Greg joined the House.  He knew that wizard Skip had helped create the great Vault spell, but did not reveal his own spells to Skip.  For many moons they worked together, but the House of Ten Times had become too complacent with the success of the Patterns spell.  Each wizard decided that it was time to leave.  Only then did the younger wizard show to his elder how his code had grown and what it had become.

“Small spells may become great spells and great spells may give birth to small spells. The wheels turn, but the Patterns remain the same.”

What amazes me some 32 years later is how a software product two of us started in a tiny office in Charlotte, NC, is still alive and well and generating more than $250M a year in revenue.

Posted in ALL-IN-1, Content with Context, Relationship Capital, social networking, User Experience, Value Capture, Wine | 8 Comments