ALL-IN-1 Philosophy

In the process of describing the Making of Enterprise Software – ALL-IN-1, I came across the ALL-IN-1 Philosophy we published in 1982.  I was impressed by how much I still adhere to this philosophy thirty years later.  This philosophy was the product of my collaboration with John Churin.  It is interesting to reflect on it as we trace the evolution of office automation systems from ALL-IN-1 to Lotus Notes and on to the present day of Microsoft Office and Exchange, Google Mail and Apps.

Behind the development of any major software product is a philosophy of why the product was developed and a model of the environment in which the product is expected to perform. Underlying the philosophy are also the business requirements of the organization funding the product development. This chapter (circa 1982) provides the major elements that went into the development of the Digital Equipment Corporation ALL-IN-1 product.

The following is a list of the major philosophical points:

  1. Provide automated information access to the widest possible base of users.
  2. Understand customer needs by focusing on solving today’s key business problems with today’s technology.
  3. The needs of the office worker are evolutionary.
  4. An automated office information system must be flexible to easily adapt to multiple user types, changing needs of the users, multiple input/output devices, and changing technology.
  5. The system should appear as an integrated whole to the end user.
  6. Office information systems tie into or touch all parts of an organization.
  7. An office system should be self-contained.
  8. The individual user should feel ownership of the system.
  9. The user should be rewarded or acknowledged as an individual by the system.
  10. The value of an automated system is only understood through demonstration of how functions are used to solve current key business problems.
  11. The product development efforts should be directly funded by the customer base, on the direct merits of the software.

1. Provide automated information access.

The goal of office information systems is to provide an information access utility for the widest possible population of users. The utility should provide the ability to access data no matter which computer system it resides in, either internal or external to an organization, so long as appropriate security conditions are met. The utility should then provide the tools for the user to turn the data into information. Once the information is created, it should be communicated to the appropriate people in an automated fashion. What is one person’s information may be another’s data.

This philosophy can be summarized by: getting the right information to the right person at the right time.

The user base should be thought of in its largest possible sense: not just the members of one’s own organization but also the other organizations that must be communicated with. Examples would be the exchange of paperwork that accompanies the exchange of goods and services between two organizations (purchase orders, status replies, invoices). In addition, information utilities are increasingly being shared by several organizations (Dow Jones, The Source, and CompuServe).

2. Understand customer needs.

The key to success is the understanding that office information systems are needed to solve the key business problems within an organization. The focus of product development and installation within the office environment should be on the business problem, not the available technology and what COULD be done. More than enough technology exists today to solve today’s problems; what does not exist is an understanding of how to apply the technology to specific customer needs.

Understanding customer needs is difficult to do in practice. Much time needs to be spent listening to the people who will use and have used office systems to understand their needs. Then “working models” need to be built quickly to determine whether the users’ needs were really understood. In general, office users are not very good at stating their business problems in a way that lets us build solutions. However, these users can very quickly observe a working model and determine whether it meets their needs.

The key to success for the installer of an office information system is to find the key business problems within their organization and then use today’s products to start solving those problems. The key to success of the product developer is to understand enough different specific customer environments, abstract to the general solution for these problems, and then build a product which can adapt to specific user needs.

Early on in the development of systems which would meet customer needs, we found that Digital had a superb set of tools which could be applied to the office environment. However, these tools were designed for programmers, a very small portion of the overall user community. Our task was then to figure out how to take these general-purpose tools and apply them to non-programmers for specific business problems. This approach of using existing tools was born of necessity. We did not have the time, resources, or funding to invent the perfect solution. We spent our time understanding available products and applying them to current problems.

3. The needs of the office worker are evolutionary.

The information access needs of the office worker are dynamic. Therefore the system must evolve to meet these needs. Our philosophy has been that the customer is pursuing a journey and not a destination. Office automation is really a process not a single product.

The process of accessing information is a practical one; it is accomplished through trial and error, as there is no theoretical base for how an office worker performs their job.

Office systems need to be available at all times. This capability is actually an extension of the current environment, where the office is accessible only when one is physically in it. The logical extension of an automated system is that it should be available wherever and whenever the office worker is. Reliability is implicit.

Users will use the system in very different ways than ever intended by the implementor. User needs change rapidly over time.

Functionality before efficiency. This philosophy falls out of the rapid growth in features that most users find they want once exposed to a good system. These needs are more important than the relative efficiency of the process. This philosophy assumes that others will provide the needed efficiencies, primarily in the base hardware and software.

The system should evolve by adapting to the work patterns of the user. A logging mechanism is critical for this and many of our other philosophical tenets. The system should allow the user to modify the application to adapt to his work flow, and where necessary the resulting procedure should become a permanent part of that user’s view of the system.
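To make the logging idea concrete, here is a minimal modern sketch of how a command log could be mined for a repeated work pattern worth saving as a personal procedure. This is an illustrative assumption in today's terms, not ALL-IN-1's actual mechanism; all command names are hypothetical.

```python
# Illustrative sketch (assumed design, not ALL-IN-1 internals): log each
# command a user issues, then detect a frequently repeated sequence that
# could be offered back to the user as a permanent personal procedure.
from collections import Counter

# A hypothetical command log for one user session.
log = ["read_mail", "file_memo", "print_memo",
       "read_mail", "file_memo", "print_memo",
       "edit_doc",
       "read_mail", "file_memo", "print_memo"]

# Count every 3-command window in the log.
windows = Counter(tuple(log[i:i + 3]) for i in range(len(log) - 2))
sequence, count = windows.most_common(1)[0]

if count >= 3:
    # The system proposes turning the pattern into a saved procedure.
    print("Suggest saving as a procedure:", " -> ".join(sequence))
```

The same log could also drive the training suggestions described under point 9 below, by looking for functions that never appear in it.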

4. The automated system should be flexible and adaptable.

The automated office system should have a flexible structure and be able to adapt to a wide range of users and a wide range of input/output devices. The types of users that the system must adapt to range from novice to expert, secretary to professional to manager. Input may come from a terminal, OCR, a touch panel, a touch-tone phone, or bar code readers. Output may go to other applications, printers, terminals, graphics, or voice output.

Since there is such a variety of users and devices that the system must interact with, general purpose tools were used as the starting point. These tools allow the application programmer to build working models quickly as well as modify them to suit different users or different input/output devices.

Most vendors are supplying the same types of base functions: word processing, electronic mail, calendar, desk management, data processing, and graphics. Each vendor can pull out an impressive feature and function list. However, most of the applications were not designed to work together and the user has to manually move from application to application. Since these systems generally hard code the user interface in each application, the customer is unable to modify the interface to the function or modify how the function performs.

What attracts most customers to ALL-IN-1 is the ability to have a user interface which ties very diverse applications together. At the same time, this interface may be customized to suit the local users’ needs. With this interface, programs and applications which were once used only by programmers can be extended to office users.

5. System should appear as integrated whole.

To provide a system which is easy to understand and use, the system should achieve a high level of integration. The individual components should appear to work together and be part of the same structure.

Integration can be viewed as a spectrum:

<— Features Only —— Common Structure —— Fully Integrated System —>

At the Features Only end, applications are developed as independent entities. The user must come back to a command level before accessing the next application. Data or information is not automatically passed from one application to the next. A fully integrated system, at the other end, would be developed from scratch with all applications tightly coupled to one another.

There is a natural conflict between features only and full integration. It is very easy to develop features only and extremely complicated to develop an integrated system. Further, the integrated system is very difficult to modify, as a change in one application could potentially affect all other applications.

Due to the dynamics of the office environment, we felt that a fully integrated system was not possible. Rather, a common structure should be built of independent modules that would have a common flow control for moving information between the independent applications without the user at the terminal having to do so manually.
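The middle-ground "common structure" can be sketched, in modern terms, as a shared dispatcher that routes information between otherwise independent applications. All names here are hypothetical illustrations, not the actual ALL-IN-1 interfaces.

```python
# Hypothetical sketch of the "common structure" idea: independent
# applications registered with a shared flow-control routine that moves
# information between them, so the user never shuttles data by hand.

def mail_app(payload):
    # An independent application: wraps text as a memo.
    return {"kind": "memo", "text": payload["text"]}

def filing_app(payload):
    # Another independent application: files whatever it is handed.
    return {"kind": "filed", "document": payload}

# The common structure: a registry of modules plus one flow controller.
APPS = {"mail": mail_app, "filing": filing_app}

def run_flow(flow, payload):
    """Pass the output of each independent application to the next."""
    for app_name in flow:
        payload = APPS[app_name](payload)
    return payload

result = run_flow(["mail", "filing"], {"text": "Q3 status report"})
print(result["kind"])  # filed
```

The point of the design is that either application can be replaced or a new one registered without touching the others, which a fully integrated, tightly coupled system would not allow.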

6. Office systems touch everything within an organization.

Since data arises in almost every part of an organization and outside of it, office systems must be able to access the data to turn it into information and then be able to communicate with all other systems. Likewise, just about every data processing product ever developed interacts with an office information system at some point.

The office environment will be multi-vendor so it is vital that a full function communication architecture be available that allows for easy information transfer between systems.

7. System should be self-contained.

As far as the end user is concerned, the office system should be self-contained. All documentation, help, key concepts, and computer-based instruction should be available through the user’s terminal. The user should not need a mass of documentation to operate the system, nor extensive training away from her work environment. The system should be usable immediately. Thus, our philosophy pushed us to develop all collateral material in machine-readable format for inclusion with the system.

8. Individual should feel ownership of the system.

The users of the system should feel like they own it, in much the same way that they take ownership of and personalize their own desk environment. Further, the individual should have ultimate control of what happens within his environment. Other users should not be able to invade his environment without his permission. For example: incoming mail should not go into the user’s file cabinet until he says to file it, and meetings should not be permanently scheduled unless the schedulee is aware of and agrees to the meeting.

9. User should feel sense of reward.

The user should have built-in rewards from the system. The system should acknowledge that there is an individual behind the terminal. Examples of acknowledgment: the user’s main menu should grow from a single entry for Computer Based Instruction (CBI) to the total list of functions he has gained an understanding of by going through the appropriate CBI (avoiding the ALL-AT-ONCE syndrome). The system should analyze the log files periodically to detect work flow patterns or the need for more training in areas the user has not exercised. If a user is entering orders, the system might offer an acknowledgment at suitable intervals (say, every million dollars) or let the user know how many clean orders they entered.

10. System value understood only through demonstration.

People rarely think about what they do in their current office environment. We have found it extremely difficult to talk about what an office information system is or how it should be used. Users only understand the value once they see a demonstration of the capabilities. An analogy is how we learn to drive a car: we don’t do it by reading about it, we do it by watching and then practicing.

11. Product development should be directly customer funded.

The development efforts for office applications should be directly funded by customers. This philosophy was born of necessity but has been key to the motivation of those involved. It places an emphasis on using existing products instead of inventing new ones, and keeps the development focus on solving real problems instead of inventing solutions no one wants. A fallout of this philosophy is a built-in marketing test: if one customer is willing to pay for the development, chances are many more customers will be, since generic office needs are quite similar across large organizations.


The Making of Enterprise Software – ALL-IN-1

In the early 1980s I was part of Digital Equipment Corporation’s (DEC) Software Services group in Charlotte, NC.  The unit I was a part of consisted of 10 software specialists and a manager.  Due to the nature of our remote work at customer sites, we rarely saw each other.  Therefore, we always enjoyed getting together for our monthly unit meetings.  Yet, we were always disappointed when each meeting turned into yet another exercise in administrivia.  What should have been a great forum for sharing our learnings about our products and customer needs over the last month, shared problem solving, and new service creation brainstorming, always turned into going over the guidelines for filling out the latest paper form that had come down from the bureaucrats on high.

My officemate, John Churin, and I got so fed up after one meeting that we decided to design a solution to the administrivia problem.  Anything that would ease our pain would be greatly appreciated by others as well.  We determined that what we needed was a way to move the form filling from paper to networked computer systems along with a rich electronic mail and help system that would have more explanation about the forms than you could ever want.  We figured that if we could move the routine communications to email, our manager would know that we’d received and read the material and then we could use face to face time for learning and really creative stuff.  As luck would have it, John and I both had a couple of free days between customer engagements.

We spent the next two days in design mode rapidly iterating between the needs our organization had and the skills that existed between the two of us (real time processing, electronic forms design, database management, transaction processing, and networking).  We also looked at the expense side of what it was costing the company to do things manually with a horrendous amount of expensive multi-part paper forms.  We quickly estimated that it was costing $200 per week per specialist just to do the basic time reporting and expense reporting to support our revenue stream.  These costs included the paper forms, data entry personnel, maintenance programmers, and computer systems.  Not included in these costs were the management and facilities costs that a robust analysis would include.  We were excited just to find that our own unit had $100,000 of expenses associated with time reporting during the year.
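The back-of-the-envelope figures above can be checked directly; the per-week cost and unit size come from the text, while the 50-week working year is an assumption for illustration.

```python
# Sanity check of the administrative-cost estimate described above.
cost_per_specialist_per_week = 200   # dollars: basic time and expense reporting
specialists = 10                     # size of the unit
weeks_per_year = 50                  # assumed working weeks in a year

unit_annual_cost = cost_per_specialist_per_week * specialists * weeks_per_year
print(unit_annual_cost)  # 100000 -- matches the $100,000 figure in the text
```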

While the expense side was one part of the value equation, unfortunately that expense was in another part of the organization.  The only offer that would really fly in our department would be increasing revenue, either in the form of generating more systems integration projects or selling large amounts of DEC hardware.  We were quite pleased with our analysis of the needs and our initial design which would be based on electronic mail.  We realized that if you looked at email as more than just pushing text around, particularly adding a robust electronic forms capability, you could solve a wide range of our administrative needs.

On the “building it” side, we examined DEC’s product pipeline which consisted of large timesharing systems (DECsystem 10s and 20s), small real time computers (PDP-11s), and the recently introduced VAX systems.  While there was only one model of the VAX at the time (VAX 11/780) it was clear that this was the system of the future.  More engineering dollars were going into the VAX hardware and software platform than the other two combined.  We knew that there would be lower end models and higher end models so that configurations of very small and very large networks would be possible in the next two years.  While there wasn’t a lot of application software on the VAX, most of the PDP-11 software already ran on VAX systems.

As we emerged from looking at our own needs, we became aware that what we were really working on was what the marketplace started calling office automation (OA).  The components of office automation were: word processing, calendaring, electronic mail, spread sheets, electronic forms, and data management.  These systems required both local and wide area networking.  Early competitors in the space were Wang, Datapoint, and IBM with a mainframe adaptation (PROFS).  At the time DEC was primarily known for its capabilities in the scientific/technical, medical and manufacturing environments.  However, the company realized that its next level of growth would come from the commercial markets.

Office automation provided a nice entry point in the commercial market because you didn’t have to go up against IBM’s mainframe database processing stronghold.  What was common with all the competitive products was that they had fixed functionality – what you saw was what you got.  Our own needs and a cursory look at other needs showed that no two office automation problems looked alike.  You had to take a customizing and component approach in order to be successful. Since the hardware was relatively inexpensive, you couldn’t require a large custom project to get the system going.

We spent some time discussing what our intent should be.  Did we want to just solve our unit’s administrative problems or did we want to go after something bigger?  If we were interested in building a product, should we form a new company, stay at Digital, or try and transfer up to the central engineering group responsible for building application products?  As we worked through these issues John and I realized that first and foremost we wanted to build a successful product.  We each had built what we thought were products before coming to DEC, but they hadn’t succeeded because we only looked at the coding part of the product effort.  We knew that to be successful, you had to think of a whole product (see diagram) and all the functions required to create the product, get it to market and sold, get it installed, and have it be supported.  We knew that the likelihood of the two of us doing it on our own was slim, but if we could use DEC’s resources then we had a chance of being very successful.  So we decided that our intent would be to build an office automation product within DEC, preferably within our systems integration group so that we didn’t have to relocate.

Reality set in after the two days, when John and I were pulled back to our customer commitments and our support of the sales representatives.  Yet there was a subtle shift for both of us.  Where before any customer project was as good, bad, or indifferent as the next, now we were actively looking for customers who matched our intent: building an office automation product.

The design of our product concept changed our worldview.  Now almost every customer or sales situation we touched seemed like an office automation opportunity.  We talked about our ideas to anyone who would listen.  While none of the sales reps or customers looked at us as a first choice, several now looked to us as a last resort rather than giving up on a piece of business or project.

Our first break came at RJ Reynolds Industries (RJR).  They were looking for a way to replace their aging paper-tape Telex systems that were cumbersome to use and expensive to operate.  As RJR was expanding their number of office and manufacturing sites, the speed with which they could move information was becoming increasingly important.  We did a half-day analysis and realized that our preliminary design would nicely match their needs, since this was primarily an electronic mail application.  Then the client gave us a rude awakening.   They liked our ideas, but IBM had agreed to give them a systems analyst and a corporate telecommunications consultant for a month to analyze their needs.  We knew we could not match that offer but got the customer to agree to give us a chance to bid on the results of the IBM systems analysis.

A month later we got called back in and given a copy of the IBM analysis.  My spirits soared.  The IBM folks had just drawn a few illustrations and copied some brochures.  I knew that we could do better in a few short hours.  We had recently installed one of our new word processors, so we could turn out a nice-looking document in short order.  I asked if we could come back the next afternoon with the analysis and proposal that we had been working on for the last month (a small white lie, but the work that we would do that evening would look like several months’ work compared to the IBM analysis).  The client agreed and we hurried back from Winston-Salem, NC to Charlotte, NC.  I phoned ahead to John to get him started cleaning up some of the architecture diagrams that we had created previously.

Drawing on the early design work, we created a 20-page analysis with several diagrams and a three-page consulting contract to design their system for real, for a mere $50,000.  We took the proposal back the next afternoon and the customer was most impressed.  They never expected DEC to upstage IBM, and to do a free analysis in the process.  The customer agreed to our proposal and the next day sent us the approved purchase order.  This was a first for our region: getting a paid project just to do a specification.  We were off and running.

While John and I were the primary consultants on the project, having real customer dollars allowed us to tap into expertise around the country that we didn’t have.  Also, under the guise of project reviews, we received great guidance and criticism of the completeness of our designs.  We went back with a specification of well over 100 pages and a twenty-page proposal for the next phase of the project.  Once again, the customer was most impressed, but then gave us the bad news that RJR was reorganizing and that this project was cancelled.  While disappointed, we now had a very complete specification that we’d been paid for.  We had received real customer dollars without requiring our own company to invest.

In parallel, we worked on consulting projects for Milliken in Spartanburg, SC, a very large textile manufacturer.  DEC was the major supplier for their plant industrial automation. Milliken realized that as they automated processes like the creation of fibers and the making of carpets, an incredible amount of paperwork followed the products around.  They were starting to do the prototyping for a plant office automation system and were working with Datapoint on the prototype.  As they showed us what they liked about the Datapoint offering, we realized that with our specification and a little bit of rapid prototyping we could provide a much better system.

While Milliken was not willing to pay for our time to develop the demonstration, the District Sales Manager realized that he could sell them triple the amount of hardware systems if they bought our approach to office automation.  Further, he could keep an unwelcome competitor out of the environment.  So he funded our next activity which was to take our specification and create a demonstration.  We estimated that we could do that in three short weeks.

The other requirement the customer had, which no other competitor could deal with, was the necessity for their large engineering staff to be able to add to the base capabilities of the system and to customize them.  So an important part of our demonstration was illustrating the rate at which new functionality could be added.  Therefore, part of our offer and needs analysis was to meet with the client every three days to demonstrate progress and gain critical feedback on our approach and our understanding of their problem.  In three short weeks we had a demonstration of our office automation system, and the customer was quite impressed.  They were ready to buy.  However, we still had some important lessons to learn about marketing and sales proposals.

In parallel with our demonstration project, the senior management team of Milliken was meeting with the top management of Digital for their quarterly meeting.  The Milliken upper management knew that the plant office automation team had selected Digital as the top vendor, but none of us had thought to brief top DEC management on our capabilities.  So when Roger Milliken, CEO, asked Win Hindle, DEC Executive VP, about DEC’s office automation products and whether they could see a demonstration, Hindle replied that we had no products in that area.  Further, he told them that he expected it would be two years before we would have any Office Automation products, and that there was nothing to demonstrate.  Talk about snatching defeat from the jaws of victory.  No amount of lobbying on our part could overcome the authority of our own top management.  While we didn’t get the Milliken office automation project, we were selected as the hardware vendor of choice and the customer elected to develop their own proprietary system.

We now had a demo of our capabilities, and we took every opportunity to demonstrate that DEC and the VAX computer systems were the best platform for office automation.  In short order, the DuPont sales team from Wilmington, DE, contacted us to propose our system to DuPont.  The DuPont productivity team wanted to do a demonstration project of what OA technology could mean for a Fortune 50 company.  We analyzed their pilot needs, recast our specification and demo, and made a proposal for a project to transform the demo into a scalable product for $100,000.  The sales team liked it and we started the multi-layer sales process required in old-line hierarchical companies.  Finally, we made it to the top of the chain to present to Ray Cairns, DuPont Information Systems VP.

As part of the presales process, the sales rep and I went to lunch with Ray in the magnificent Hotel DuPont, and we talked quite a bit about the history of our efforts and the capabilities of our ideas.  He was an active questioner and probed far and wide about where we’d been and where we expected to go with the product.  He knew that we were developing this capability in conjunction with our customers and then would move it into DEC central engineering.  They liked our unique offer that for the cost of the project, they would have an unlimited use license for the software within DuPont.  Then, if they liked the tools, they would have to buy licenses for the next version of the product.  This offer allowed them to amortize the cost of the project over quite a few hardware systems which made the costs appealing to their financial analysts.  We appeared to be giving up quite a bit of future software revenue, but we were betting that we would have a new version of the product well before they were ready to deploy the software across a lot of systems.  This offer was win-win for both parties.

I relaxed and felt quite good that the decision maker would decide in our favor.  Little did I know what I was in for with the formal agenda with Ray and meeting all twelve of his direct reports.  We had professionally designed 35mm slides to present our story and product ideas.  At the end of the presentation, Ray asked several warm up questions and then hit me with the question that stood me on my heels: “how has this product helped impact Digital’s bottom line, either positively or negatively?”  He knew from our lunch time conversation that the product didn’t even exist, so that it couldn’t have much of an impact.  I knew he wasn’t a stupid man, so what was going on here?

In a flash it came to me that he wasn’t really asking about DEC; he was using me as a convenient foil to get critical education across to his management team.  I mumbled a few things about our unique approach to developing application software in conjunction with a customer.  Then I turned the question around and asked the DuPont management team how they thought this product might affect DEC’s bottom line.  It is much easier to speculate about cause and effect in someone else’s organization, where you are at a level of optimal ignorance, than about your own.  What ensued was a great one-hour conversation about the implications of such a product and technology on a large, complex organizational system like a Fortune 50 company.

After the meeting, we were awarded the order, and we now had the funding to take the ideas of our specification and our demonstration into a full-blown product.  The learning for me in this meeting was quite revealing.  Our way of selling our ideas to individual contributors and middle managers was the traditional sales pitch of features and benefits.  What Ray Cairns made clear is that at a certain level of management the rules change, and the offer must move from features and benefits to second-order implications.  In our case: what effect would buying our software have on both the revenue side and the expense side of DuPont?  To answer that kind of question you have to move from the product under study to the system under study.  In particular, you have to look at the interactions of an entity with its environment.  In later years I would come across third-order thinking: how will this product or service change a company’s valuation, either positively or negatively?

Gordon Bell, DEC Senior VP of Engineering at the time (now at Microsoft Research), observed a heuristic: no product ever made it out the door if it required more than three new insights or inventions.  During the specification and rapid prototyping stage we hit on two core insights – that email was the core engine, and that email should carry multiple data types, not just text.  Yet there was still something missing as we moved into the implementation phase.  We couldn’t figure out how to implement our ideas without rewriting parts of the VAX/VMS operating system, which neither of us wanted any part of.  The whole excitement of the VAX architecture was that we could build as complete an application as we were contemplating without any systems work.

One day a happy accident occurred when, all of a sudden, the contents of my VT100 screen changed.  I was editing a file and a data entry form appeared on my screen.  It was eerily like Alexander Graham Bell yelling “Watson, come here!” through his new invention – the phone.  The joy was realizing the power of the messaging architecture already built into the VMS operating system.  While a bit esoteric, what it meant was that our system could be built on messaging from top to bottom and be fully recursive.  With a few architectural rules, the system could generate a wide range of custom applications.
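The "fully recursive" point can be illustrated with a toy example: if a message body may itself be another message, then routing wrappers, mail, and data-entry forms can all travel through one mechanism. This is a modern sketch under that assumption, not ALL-IN-1 or VMS code.

```python
# Toy illustration of "messaging from top to bottom, fully recursive":
# a message body may be text, a form, or another message, so one delivery
# code path handles every layer of the system.
from dataclasses import dataclass

@dataclass
class Message:
    msg_type: str
    body: object  # plain text, a form, or another Message

def unwrap(msg):
    """Return the chain of message types, unwrapping nested messages
    with the same code path at every level."""
    chain = [msg.msg_type]
    if isinstance(msg.body, Message):
        chain.extend(unwrap(msg.body))
    return chain

# A routing wrapper carrying a mail message, which carries a form.
routed = Message("route", Message("mail", Message("form", "expense report")))
print(unwrap(routed))  # ['route', 'mail', 'form']
```

Because every layer is just another message, adding a new application type means adding a new message type, not a new transport mechanism.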

We expanded our team by three and developed the code, installation procedures, documentation and training for the pilot site at DuPont.  During the process, I learned several invaluable lessons about engineers and product quality.  Any problem an engineer experiences directly gets fixed immediately.  The difficulty is that the most critical problems in an application tend to live in functions that a software engineer rarely uses.  So the engineers never see the problems, and the problems never get fixed.  Not only did I have to see the product opportunity at DuPont and develop it, but I was also the person who went to install the software and train the users.  The things that worked fine on our development computer didn’t work fine on their computer system.

No amount of asking, cajoling, screaming or yelling could get the engineers to fix the problems.  Unfortunately, the problems were in an area of the code that I was unfamiliar with.  Through sheer desperation, I had John make the next trip to Wilmington, DE, to install the latest update.  Imagine how I had to keep from laughing (or crying) when he came back after the trip and asked me why I hadn’t told him about all the problems.  Magically, they were all fixed within a couple of hours.  Direct observation of problems or activities is worth far more than an abstract narrative.

Shuttling back and forth got old pretty quickly, so we created the next product support innovation – a direct connection of our development network with their pilot prototype network (remember, this was 1980).  The connection improved the reporting of problems, the suggestion of improvements, and the quick delivery of new software and documentation.  The beauty was that we were able to charge more for this service, as it enhanced both companies’ capabilities.

We were really moving now.  We had a product (in today’s terms it would be a beta of a product), we had a paying customer, and we had a development staff.  Life couldn’t be better.  However, our intent was to build a successful product business.  John and I had gotten this far before.  How did we get farther?  It was time to put a distribution channel in place and find a set of complementors.  Little did we know we’d find them in the same place.  About this time, the Corporate Software Services Organization made a proposal to the corporation that we could grow from a $60 million business to a $1 billion business in less than five years.  One of the core assumptions was that there were already systems integration projects that DEC retained the intellectual property rights to and therefore could turn them into application products.  The ALL-IN-1 product effort became the test case for this idea.

The first thing we needed to do was to train some of our best consultants on the product.  The training was scheduled, but little time was allocated to prepare materials for the class.  On the Sunday night before the training started, I was looking at only four hours’ worth of material for a five-day class.  I had no idea as I started lecturing Monday morning how I was going to make it through the week.  Through desperate acts, brilliance emerges.  During that first morning’s presentation, I kept getting questions about whether the product had this feature or that feature.  Could the product be used to solve this customer need or that customer need?  As I started to answer “No” to each of these and feel guilty that we didn’t have those features or capabilities in our product, I realized that each of the things they were asking for was easy to produce with the core functionality that we did have.  So now the answer to each question became “No, we don’t have that capability yet, but you now have your homework assignment for the afternoon.”

As they broke for the afternoon laboratory, we had more than enough enhancement requests to go around to each of the “students.”  The students were all excited because they were getting to work on something that they cared about and in most cases they knew exactly which of their customers would be interested in that capability.  Some of the features required the core ALL-IN-1 engine to change.  So John would work with those students to figure out a good operational split between what the application engine would do and what the ALL-IN-1 applet should do.  They worked together to define what the interface or syntax between them should be.  With 20 students and a week’s worth of exercises that all arose from their questions, the product grew by leaps and bounds before our eyes.  This was our first experience with the power of an open architecture and a large group of collaborators.

The second order of brilliance was letting the students know that we would include all of their functionality in the next product distribution tape and we would give attribution to each of them for the parts that they contributed.  With that simple context switch, the afternoon exercises took on new significance.  Knowing that their work would be distributed, they didn’t just do something that demonstrated that the function could be done.  They switched into engineering mode and thought through the general case, implemented the functionality, tested it under a variety of customizations, and documented the features.

The next week we were teaching a group of 20 DuPont systems analysts who would be moving ALL-IN-1 beyond the pilot environment.  We taught the class in the same format.  In the morning, we lectured and explained the different capabilities of the system.  As questions arose about whether the system could do this or that kind of application, those questions became the person’s laboratory assignment for the afternoon.  Very quickly several different departmental systems emerged – Human Resources, Plant Office Automation, Order Processing, Strategic Planning – and we made the same offer to them.  We would include their efforts on the distribution tape internal to DuPont and external to DuPont if the feature were generic and give attribution to the person who created it.  Unknowingly what we did was create 20 disciples who couldn’t wait for the next Monday morning to demonstrate “their” creation to their peers or their internal clients.

Within three years, largely through the efforts of the 50 DuPont analysts who went through these courses, DEC installed $450 million of ALL-IN-1 systems in the Wilmington, Delaware area.  Largely through the efforts of the first 100 DEC software specialists we trained in this manner and the sales representatives they converted, ALL-IN-1 systems became a $1 billion a year business for DEC for 18 years in succession.  When I visited Health Partners in Minneapolis, MN, in 1999 and found that they were still using ALL-IN-1 as their primary corporate electronic mail system, I was filled with awe and much trepidation (the system was quite dated, having been in maintenance mode for the previous 14 years).

The last of the core lessons we learned was the power of tailoring far beyond what we’d anticipated.  I was asked to speak at the DEC European Software Services Meeting in Majorca in 1982 to describe what we were doing in the U.S. with Office Automation.  I gave an overview of the product and the kinds of customer solutions we were generating AND the revenue we were creating.  This got their attention, as they were going through a revenue shortfall at the time.  The immediate set of questions I got were about translating the product into each European language.  It was an “oh, of course” type of question for me, but it was unheard of in products of the early 1980s, where the user interfaces were hardwired into the software code.  I shared, “Sure, it should be no problem.  If you want to localize the interfaces, just change the ALL-IN-1 interface forms using the standard VMS tools.  All of the messages that we present on the screen are forms; just go ahead and change them.  And we’ve adhered to all the VAX/VMS coding conventions, so all the National Replacement character sets should work.”

It was like I had ignited a bomb.  Everyone came out of their seats and started throwing a hundred questions at me.  Seizing the moment, David Stone, DEC’s European Software Services VP, called a half-hour break so that the attendees could get their questions answered informally and the country management groups could caucus on what this meant for their business.  He then called the whole group back to order and gave them a challenge: based on what they had heard, did they think they could make up any part of the $100 million revenue bookings shortfall Europe was projected to have for the year?

He first asked each country group to gather together and develop a straw plan for using the ALL-IN-1 product to address the revenue shortfall.  In the process, the groups were to identify any issues, questions or concerns that they had and they could pose those to me in their group presentations.  Each group went off and spent ninety minutes discussing the opportunity.  Then each of the ten country groups made their reports and identified their issues.  As each group reported, I went through my large library of 35mm slides to find the slides that addressed each issue or question.  I got David to stall for a few minutes after the last group presented so I could arrange the slides in some semblance of presentation order.

I then stood up and gave a “custom” presentation addressing all of their issues.  The management team was really excited now.  Until that moment I had not realized the power of a “custom” presentation to go along with our easily tailorable product.  The group correctly sensed that this product was quite real and could be adapted to each country’s unique business needs.  Until then, Europe had had to take a one-size-fits-all set of products that were English-language and US-culture centric.  Long delays would occur to get even minimal localization done for each country.  Now they’d found a product that could be introduced Europe-wide in each natural language at the same time as the US introduction.

David Stone then asked the country groups to get together with this new information and come back with a “committed” forecast of how much of the revenue shortfall they could make up with this product and what they needed to do to make this program happen in Europe.  The groups quickly formed and in 15 minutes came back with their commitments.  The commitments totaled $120 million.  David scaled the numbers back to $100 million to mitigate the exuberance factor.  I was blown away and now fearful for my life.  I wondered what would happen if they all woke up and decided they’d been railroaded into a commitment in the bliss of the moment.  I figured they’d shoot the messenger – me.

The group then started to work on the marketing program to make their commitments happen.  Using the natural creative competitiveness of the countries, Stone broke them into their country units to develop logos for the “$100 million in 100 days” campaign.  Several of the teams found a local T-shirt design shop and had their logos emblazoned on T-shirts for the attendees.  Then he organized cross-functional and multi-country groups to develop:

  • the 20 page newsletter that would go out to all sales and software personnel in Europe,
  • the training program for sales and software personnel,
  • the localization teams to translate ALL-IN-1 into each country’s natural language,
  • other applications that could be combined with ALL-IN-1 in each geography.

By the end of the four-day meeting, the Europeans had a major new product that was their own, a marketing program that they could roll out, and a newfound respect from their sales peers.  I was asked to stay over another week to train the software consultants who would be doing the translations.  My ask of them was that each country supply a software consultant to work with our development team for a month at a time, to make the core ALL-IN-1 engineers aware of multi-cultural needs.  Within the month they had ALL-IN-1 translated into 10 different languages and cultures.  Within the 100 days they’d exceeded their goals and in fact did $120 million of additional business.

In David Stone, I saw and appreciated a master at management leadership and motivation.  For the next year I spent as much time in Europe as I could to learn about changing the behavior of a large organization.  I asked David how he knew that my presentation would set off so much energy.  He laughed and said that he had no idea that it would.  “What I do is schedule an agenda that has as much informational diversity as possible running the gamut from product information, service information, corporate strategy, engineering strategy, organizational behavior change stuff, and management education.  I never know which of these will cause an energy hit with these 150 managers, but I’m confident that one of them will.  When that energy resonation occurs between the audience and the speaker then I go into action.  I’m not good at generating that energy.  However, I know how to move energy into action; that’s what I’m really good at.”

We succeeded so well in those first several months at promoting and selling our version of Office Automation that the company now had a dilemma.  Taking a portfolio approach, DEC had six different central engineering groups working on office automation projects.  The corporate powers that be met to decide which group would win and be anointed as THE office automation product.  There were pluses and minuses to each product and the capabilities that each would add to the VMS product line and the push into the commercial market.  But in the end, there was no clear winner at the features and benefits level.

Peter Christy was chartered by DEC’s executive management to evaluate all of the different DEC OA options.  Here are some excerpts from his evaluation.  Peter would have made a good economist, with a report of the “on the one hand, on the other hand” variety.  What I would have given for a one-handed Peter Christy at this juncture.

“The VAX/VMS office automation system developed by the Charlotte, NC, District Software Services office has been studied. Under the goals set forth in the “Grey Saturday” meeting, the Charlotte software is not an alternative to EMS/VMS. On the other hand, the Charlotte software deserves careful examination on its own merit. It offers new ideas on the design and marketing of office automation into large enterprises (e.g. Fortune 100 companies). . .

Recommendations:  The Charlotte software seems like an excellent office automation system, both to those of us that went and looked at it, and to the customers that the Charlotte people have been dealing with. Unfortunately, there is no simple reason to bring the software in and base product development on it. The most reasonable approaches today seem to be to begin a parallel development based on the software (which requires additional resources) or to continue to use the software for leveraged field consulting, with some effort expended to assure reasonable communication between field and home developers. Most critically, we need a broadly accepted decision on how to position this software.  The software is too attractive to just forget; it’s too dangerous to treat passively.”

However, only one of the six product alternatives had generated any revenue for the company – ALL-IN-1.  That made the decision quite easy in the end.

The implications of the decision were not so easy for me.  I was asked to manage all the groups that “lost” and meld them into one group producing the next version of ALL-IN-1.  Suddenly, I would be moving from managing 10 people to managing 250 people in five different organizations, in three sites, in two countries.  But that is a story for another time – global product management.

Gordon Bell was instrumental in getting ALL-IN-1 accepted as the primary office automation product development effort.  He sent the following memo in 1982:

SUBJECT: THANKS FOR ALL-IN-1 DEMO. LET’S BE #1 IN THE OFFICE BY SALES

“I thoroughly enjoyed the demo and interaction on ALL-In-1, and anxiously await to use it on our own VAX. It’s an impressive set of tools for the office and the basis for others to build additional tools that work together in a system.

“It is most urgent to get books out on it of the form:

1. A user’s manual that has the whole thing including DECmail for the heavy duty office environment for professional and secretary.

2. A tutorial on office automation where we feature it as being “this is what office automation is.”

“I hope that we might be able to help with the writing somehow.

“It should be noted that you folks have created a second generation office automation system. Because you adhere to the principles of VAX/VMS regarding providing an extendable fully compatible environment, versus just having a set of independent tools, you have the second generation.

“WE SHOULD ALL NOTE THAT A SECOND GENERATION SYSTEM IS PROBABLY NOT BUILDABLE ON ANY OF THE EARLIER SYSTEMS (eg. IBM 370, Prime, HP) . . . or for that matter, RSTS, RSX and Tops due to the difficulty of addressing and lack of common data dictionaries, etc. From my perspective, this is one of the few applications I’ve seen that begins to utilize VAX the way it was supposed to be.

“My only concern now is getting it marketed. This is a great product to go cream the IBM System 38, Wang and other parts of IBM with and to become number one. We must get market share while it is possible and the other folks get their products.

“As Pogo said: ‘We’ve met the enemy and he is us.'”

The exciting part of being selected to build ALL-IN-1 was that we quickly went from nowhere in the OA space to being one of the front runners.  Patty Seybold of the PSGroup put us on the front cover of her monthly report in 1983:

“As we move into the ‘big league,’ analyzing some of the product offerings which might be appropriate as the basis for the implementation of a large-scale organization-wide office system (such as Honeywell’s OAS, IBM’s PROFS, Wang’s Alliance/VS, DG’s CEO, Prime’s OAS, as well as the offerings from Datapoint and HP), we begin to look at a different set of criteria from those we used heretofore in evaluating smaller, dedicated systems. An organization-wide office system must, above all, excel in the areas of data communications and networking. Ideally, it should integrate well with the existing data processing, data base, and management information systems that are already in place within an organization. And it now appears that such large-scale integrated office systems must also interface serenely with the desk-top tools that managers and professionals will be using: personal and professional computers.

“Frankly, it was this third aspect—customizability—that attracted us to DEC’s All-in-1 system. We believe strongly that every office environment is different, since every business has different problems to solve. We also feel that each user of an office system should be able to mold the capabilities of the system to his own personality. DEC’s All-in-1 offering purports to do just that. We wanted to find out if it really did. . .

“What does the future hold?  We think that DEC will ‘pull it off’ – become a major office systems supplier, on a par with Wang and IBM, at least in part because DEC customers will insist that the vendor take an active, leading role in helping them determine their own strategic directions.”

During the process of developing, marketing and selling ALL-IN-1, I met so many wonderful professionals like John, David, Gordon, Peter and Patty.  The eight year ride of creating an innovative Office Automation System remains one of my top three innovation experiences.


What if Business were Art Making?

At the UW Bothell Bootcamp Part 1 facilitated by Michele and Jim McCarthy, a powerful part of the process was the making of team art.  The supplies were all there and we were encouraged to “make art” throughout the weekend.  A wonderful stream of paintings flowed from our interactions.

Professor David Socha captured this collage of the paintings in his blog after the weekend.

UW Bothell Bootcamp Part 1 Team Art

What I had missed until that weekend was that creating great teams who create great products includes bringing the making of art into the workplace.

The process of art making fulfills the vision that Stan Davis created in his book The Art of Business.

What would business look like if art making were an integral part of work?


Life Goes On

As I was watching an episode of Body of Proof, the main character, Megan Hunt, in comforting the wife of a policeman who was killed in the episode shared this quote from the poet Thomas Campbell:

“To live in hearts we leave behind is not to die.”

As our extended family mourns the passing of my wife’s mother, Barbara Cassat Keleher, this one line is somehow comforting.  Barbara’s obituary describes her long and happy life.

The full Thomas Campbell poem is:

“What’s hallowed ground? Has earth a clod
Its Maker meant not should be trod
By man, the image of his God,
Erect and free,
Unscourged by Superstition’s rod
To bow the knee?

That’s hallowed ground where, mourned and missed,
The lips repose our love has kissed;—
But where’s their memory’s mansion? Is’t
Yon churchyard’s bowers?
No! in ourselves their souls exist,
A part of ours.

A kiss can consecrate the ground
Where mated hearts are mutual bound:
The spot where love’s first links were wound,
That ne’er are riven,
Is hallowed down to earth’s profound,
And up to heaven!

For time makes all but true love old;
The burning thoughts that then were told
Run molten still in memory’s mould;
And will not cool
Until the heart itself be cold
In Lethe’s pool.

What hallows ground where heroes sleep?
‘Tis not the sculptured piles you heap!
In dews that heavens far distant weep
Their turf may bloom;
Or Genii twine beneath the deep
Their coral tomb.

But strew his ashes to the wind
Whose sword or voice has served mankind,—
And is he dead, whose glorious mind
Lifts thine on high?—
To live in hearts we leave behind
Is not to die.”

Rest in peace, Mom Keleher.


Looking Out my Window

On a grey winter day, it is the little things in life that make a difference in the dreary Northwest.  As I was heads down focusing on my virtual screen to the world, I looked up and there was an aircraft carrier cruising by my window.  I immediately raced for my camera to record its passage and brighten up my day.

Focusing the camera, I was busy trying to capture the carrier with the Bremerton Ferry in the background. Fortunately, I also caught a container ship in the far left background, anchored near the docks.  It wasn’t until I loaded the photo onto my computer that I realized that those weren’t planes on the deck, but rather cars.

Thanks to the wonders of the internet I looked up US Navy Aircraft Carrier number 76 and found out it was the USS Ronald Reagan (CVN-76).  The Wikipedia entry is so good that it let me know that the USS Ronald Reagan was transferred to the Puget Sound Naval Shipyard.

As I filed the photo away in my very disorganized library, I happened upon a similar photo taken last summer of the Washington State Ferry passing my window.

What a difference the summer sunshine makes on a photo, and on my mood.  I can’t wait for the warm sun to come back to the Puget Sound.

As the silly season of presidential politics heats up, the passing of the USS Ronald Reagan reminds me of the gift we get every day for the sacrifices that our military folks make so that I have the freedom to do what I choose to do.

Having a great day in the Pacific Northwest looking out my window.  Being. Here. Now.


Hassle Maps and Theory of Constraints

In the midst of teaching my human centered design course last fall, I came across Adrian Slywotzky’s latest book Demand: Creating What People Love Before They Know They Want It.  I was delighted to find his discussion of the use of Hassle Maps to drive demand. For years, I’ve used Eli Goldratt’s Theory of Constraints as a way to make sense of complex enterprise workflows.  However, that approach is overkill for making sense of consumer needs.  Hassle Maps fit the consumer user research task.

In getting ready to give a seminar lecture this week on Hassle Maps and the Theory of Constraints and how they affect demand, I came across the wisdom of Calvin and Hobbes.

In the book Demand, I liked the example of the user observation and human centered design explanations of what it takes to drive demand.  I decided to use the Zipcar example for the lecture:

The founders of Zipcar were committed to a new, green economic model for personal transportation that could eliminate the need for owning a car.  However, the company struggled for a number of years, until a new CEO came in and required a fresh look at what users really wanted.  As it turns out, while most users were philosophically aligned with Zipcar, their key desire was to walk less than five minutes to get to a car.  Zipcar experimented with this solution which worked and demand for their product took off far beyond their competition.

As I switched to describing the Theory of Constraints, I realized I needed a bridge slide between the two topics.  So I borrowed another example from Slywotzky and a diagram of the Theory of Constraints and put them together.  The “Aha” moment came when I stared at the two maps side by side.  I suddenly realized that it isn’t a matter of “either/or” when it comes to Hassle Maps and the Theory of Constraints; they complement each other.  In order to solve the consumer’s hassle map, the enterprise has to examine all of its value-adding workflows and find the bottlenecks keeping it from fixing the consumer’s hassle map.  Hassle Maps and the Theory of Constraints are mirror images of each other.
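
The Theory of Constraints side of that bridge reduces to a simple computation: a workflow's throughput is capped by its slowest stage. As a toy illustration (the stage names and capacities below are invented, loosely inspired by the Zipcar example), finding the constraint behind a consumer hassle is just finding the minimum-capacity stage:

```python
# Hypothetical enterprise workflow behind the consumer hassle
# "I want a car within a five-minute walk."
# Values are stage capacities (cars placed per week, made up for the sketch).
workflow = {
    "acquire_vehicles": 40,
    "secure_parking_spots": 12,   # slowest stage -- the constraint
    "install_access_hardware": 30,
    "list_in_app": 100,
}

# Theory of Constraints: system throughput equals the capacity of
# the bottleneck stage, so that is where improvement effort goes first.
bottleneck = min(workflow, key=workflow.get)
throughput = workflow[bottleneck]
print(bottleneck, throughput)  # secure_parking_spots 12
```

The mirror-image relationship in the text falls out naturally: the consumer's hassle map names the unmet need, and the bottleneck search names the single internal stage whose elevation most relieves it.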

When I took a break from preparing this lecture, I read a transcript that my colleague Alan Wood sent along as part of our innovation and the university working group.  The transcript was titled “Don’t Lecture Me: Rethinking the Way College Students Learn” from American RadioWorks.  The thesis of the research professors is that lecturing is a very poor way to transfer learning to students.  Wonderful.  So now all my thoughts will be on how to change from a traditional lecture to using “peer instruction.”  Not.  The thoughts will be there, but I don’t have the time to do it right.  Oops.  That is the other main point of the article.

Once again I find myself on the horns of a dilemma.


The Cricket

After an intense weekend of group process at a McCarthy bootcamp, I wanted to spend some time reflecting on the process and the insights.  As part of searching while “On the way to Somewhere Else” to aid those reflections, I came upon the following story that was in my account of the Japan Study Mission I participated in while at Digital Equipment Corporation in the late 1980s.

Once we arrived in Japan, we continued our education in the Japanese culture.  Our first formal meeting was with Jean Pearce, a columnist for the English language version of the Japan Times.  As part of her show and tell for examples of the differences between Japan and America, she pulled out a little cage for keeping bugs in the house so that one can hear the song of summer. The cage is arranged so that you feed the bug cucumbers or watermelon. I asked her later about these cages and she pointed me to the following short story in Use Both Sides of Your Brain by Tony Buzan.

Kusa-Hibari

“His cage is exactly two Japanese inches high and one inch and a half wide: its tiny wooden door, turning upon a pivot, will scarcely admit the tip of my little finger. But he has plenty of room in that cage – room to walk, and jump, and fly, for he is so small that you must look very carefully through the brown-gauze sides of it in order to catch a glimpse of him. I have always to turn the cage round and round, several times, in a good light, before I can discover his whereabouts, and then I usually find him resting in one of the upper corners – clinging, upside down, to his ceiling of gauze.

“Imagine a cricket about the size of an ordinary mosquito – with a pair of antennae much longer than his own body, and so fine that you can distinguish them only against the light. Kusa-Hibari, or ‘Grass-Lark’ is the Japanese name of him; and he is worth in the market exactly twelve cents: that is to say, very much more than his weight in gold. Twelve cents for such a gnat-like thing! . . . By day he sleeps or meditates, except while occupied with the slice of fresh egg-plant or cucumber which must be poked into his cage every morning. . .to keep him clean and well fed is somewhat troublesome: could you see him, you would think it absurd to take any pains for the sake of a creature so ridiculously small.

“But always at sunset the infinitesimal soul of him awakens: then the room begins to fill with a delicate and ghostly music of indescribable sweetness – a thin, silvery rippling and trilling as of tiniest electric bells. As the darkness deepens, the sound becomes sweeter – sometimes swelling till the whole house seems to vibrate with the elfish resonance – sometimes thinning down into the faintest imaginable thread of a voice. But loud or low, it keeps a penetrating quality that is weird . . .All night the atomy thus sings: he ceases only when the temple bell proclaims the hour of dawn.

“Now this tiny song is a song of love – vague love of the unseen and unknown. It is quite impossible that he should ever have seen or known, in this present existence of his. Not even his ancestors, for many generations back, could have known anything of the night-life of the fields, or the amorous value of song.

“They were born of eggs hatched in a jar of clay, in the shop of some insect-merchant: and they dwelt thereafter only in cages. But he sings the song of his race as it was sung a myriad years ago, and as faultlessly as if he understood the exact significance of every note. Of course he did not learn the song. It is a song of organic memory – deep, dim memory of other quintillions of lives, when the ghost of him shrilled at night from the dewy grasses of the hills. Then that song brought him love – and death. He has forgotten all about death: but he remembers the love. And therefore he sings now – for the bride that will never come.

“So that his longing is unconsciously retrospective: he cries to the dust of the past – he calls to the silence and the gods for the return of time. . .Human lovers do very much the same thing without knowing it. They call their illusion an Ideal: and their Ideal is, after all, a mere shadowing of race-experience, a phantom of organic memory. The living present has very little to do with it. . .Perhaps this atom also has an ideal, or at least the rudiment of an ideal; but, in any event, the tiny desire must utter its plaint in vain.

“The fault is not altogether mine. I had been warned that if the creature were mated, he would cease to sing and would speedily die. But, night after night, the plaintive, sweet, unanswered trilling touched me like a reproach – became at last an obsession, an affliction, a torment of conscience; and I tried to buy a female. It was too late in the season; there were no more kusa-hibari for sale – either males or females. The insect-merchant laughed and said, ‘He ought to have died about the twentieth day of the ninth month.’ (It was already the second day of the tenth month.) But the insect merchant did not know that I have a good stove in my study, and keep the temperature at above 75 degrees F. Wherefore my grass-lark still sings at the close of the eleventh month, and I hope to keep him alive until the Period of the Greatest Cold. However, the rest of his generation are probably dead; neither for love nor money could I now find him a mate. And were I to set him free in order that he might make the search for himself, he could not possibly live through a single night, even if fortunate enough to escape by day the multitude of his natural enemies in the garden – ants, centipedes, and ghastly earth-spiders.

“Last evening – the twenty-ninth of the eleventh month – an odd feeling came to me as I sat at my desk: a sense of emptiness in the room. Then I became aware that my grass-lark was silent, contrary to his wont. I went to the silent cage, and found him lying dead beside a dried-up lump of egg-plant as gray and hard as a stone. Evidently he had not been fed for three or four days; but only the night before his death he had been singing wonderfully – so that I foolishly imagined him to be more than usually contented. My student, Aki, who loves insects, used to feed him; but Aki had gone into the country for a week’s holiday, and the duty of caring for the grass-lark had devolved upon Hana, the housemaid. She is not sympathetic, Hana the housemaid. She says that she did not forget the mite – but there was no more egg-plant. And she had never thought of substituting a slice of onion or of cucumber! . . . I spoke words of reproof to Hana the housemaid, and she dutifully expressed contrition. But the fairy music had stopped: and the stillness reproaches; and the room is cold, in spite of the stove.

“Absurd!. . . I have made a good girl unhappy because of an insect half the size of a barley-grain! The quenching of that infinitesimal life troubled me more than I could have believed possible. . .Of course, the mere habit of thinking about a creature’s wants – even the wants of a cricket – may create, by insensible degrees, an imaginative interest, an attachment of which one becomes conscious only when the relation is broken. Besides, I had felt so much, in the hush of the night, the charm of the delicate voice – telling of one minute existence dependent upon my will and selfish pleasure, as upon the favour of a god – telling me also that the atom of ghost in the tiny cage, and the atom of ghost within myself, were forever but one and the same in the deeps of the Vast of being. . .And then to think of the little creature hungering and thirsting, night after night and day after day, while the thoughts of his guardian deity were turned to the weaving of dreams!. . .How bravely, nevertheless, he sang on to the very end – an atrocious end, for he had eaten his own legs!. . .May the gods forgive us all – especially Hana the housemaid!

“Yet, after all, to devour one’s own legs for hunger is not the worst that can happen to a being cursed with the gift of song. There are human crickets who must eat their own hearts in order to sing.”

As I strive for the virtue of “being present” that I committed to with my bootcamp colleagues, I try to be aware of the songs around me and to seek out the songs that I don’t hear.


Bootcamping with the McCarthys

I am spending the weekend getting booted at a McCarthy bootcamp sponsored by my colleague, Professor David Socha at UW Bothell.  It is an exciting way to learn new things about myself and to meet a wonderful collection of students, faculty, and community members.  However, the greatest joy is running into professionals who are using the session to kickstart their idea-stage startups.  What fun it is to listen to their insights and ideas about unmet needs and the solutions for them.  I look forward to engaging with the two different groups to see their ideas turn into reality.

And just in time, Hugh MacLeod offers up his unique insight into doing a startup.


Orbiting the Giant Hairball

It’s been one of those weeks.  Today feels like the wonderful image from a book by Gordon MacKenzie called Orbiting the Giant Hairball: A Corporate Fool’s Guide to Surviving with Grace.  Too many projects and too many new and wonderful people have shown up in my life.  Instead of a somewhat organized network structure, it just feels like a hairball at the moment.

I look forward to navigating the loops and whorls to untangle the hairball into a network.


The Future of Higher Education – MLA Seattle

Action-driving tweets – who knew that so few characters of text could drive action that leads to engaging learning?

A few days ago, Cathy Davidson tweeted that she was headed to Seattle, WA to the Modern Language Association Conference where she was participating in a panel discussion on “The Future of Higher Education.”  What a delight that I would be able to hear and experience Cathy in person, not just through her books and video snippets.  Even better, Ed Lazowska from the UW Computer Science Department would be on the panel as well. And still better, the panel session was open to the public.

This duo of synchronicity and hyper-locality, started by a simple tweet, moved me to rearrange my schedule to attend the session.

I had high expectations for the session given the quality of the participants, and those expectations were far exceeded.  I was treated to four superb intellects sharing their passions from very different points of view across the sciences and the humanities, and yet a powerful vision of what could be emerged from the talks.

The styles of presentation were also interesting and formed a spectrum of ways to shed light on a critical topic.

Kathleen Woodward did a wonderful job presiding over the session and making all of us feel at home.

Sidonie Ann Smith started the panel talks with “Emergent Projects, Processes, and Stories.”  Clearly, this professor is a wonderful writer, and her medium is text.  Her few slides were all text, sharing her outline and a few quotes.  In a more traditional lecture style, she read from her prepared paper.  She spoke at a hundred miles per hour – so fast that my traditional note taking couldn’t keep up.  She provided several wonderful turns of phrase like “the new dissertation – thinking outside the proto-book.”

Smith presented four Macro-Narratives to describe the state of higher education:

  • Macro-Narrative 1: Declining state support for public higher education.
  • Macro-Narrative 2: Redefinition of the “institution” of higher education.
  • Macro-Narrative 3: Re-conceptualization of knowledge and knowledge production.
  • Macro-Narrative 4: The emergent scholar.

Smith then followed her macro-narratives with her suggested action plan:

  1. Transform doctoral education
  2. Forge a new ethics and praxis of scholarly communication
  3. Rethink our relationship to scholarship
  4. Re-conceptualize our scholarly collaboration – faculty, students, community, world
  5. Update our narrative of the humanities – from the singular word to the expanse of Big Data

As much as I believe that I read broadly and keep up with a wide range of topics, I never thought I would hear an English professor talking about Big Data.  Did I miss that there was a harmonic convergence in Seattle this week?

Next to speak was Curtis Wong from Microsoft on the topic of “Learning Collaboratories, Now and in the Future.”  From the world of macro-narratives, Curtis transported us millions of years into the universe with his WorldWide Telescope project (WWT).  The context for Wong’s talk was the notion of a collaboratory.  His vision for WWT is a prototypical collaboratory offering not only a visualization capability but also the ability to develop curated journeys through the vast data.  Curtis shared another nice turn of phrase – “education could be a collection of curated journeys.”

As Curtis toured us around several curated journeys, he made clear that storytelling is the passport to education.  I am a little slow on the uptake, but here was the union of science and the humanities – storytelling and narrative providing a context for “big data.”  Professor Smith presented a suggestive narrative primarily in words.  Wong presented a visually animated narrative of what the union of the humanities and science could be.  The power of the visual map of the sky illustrates so clearly where there is scientific work going on AND where there isn’t – inviting the community at large (professional and amateur) to contribute to our knowledge of the universe, and to tell stories about their discoveries.

What I especially liked about the ability of the WWT to host meta-data, text, video, and audio as a curated journey is that there was always a smooth transition between content and context.  With a blog entry or a Vook (video book), when you follow a link you are jarred from your current context as you leap from one disparate piece of content to another.  WWT seamlessly integrates the multiple media into a single visual space.

As an innovator in the development of visual analytics capability through our work in creating Attenex Patterns to exponentially increase the productivity for legal electronic discovery, I wanted to run to the podium and meet Curtis to explore how far his WWT engine could be pushed into the realm of document visualizations.

However, I calmed down and looked forward to Ed Lazowska’s presentation on big data – “It’s the Data Stupid!”  Ed is one of those wonderful gifts to the computer science community and to the state of Washington.  He is a tireless advocate for the importance of research and high technology.  Whether in a public presentation or in a small group meeting with Ed, I always come away smarter.  My favorite learning from Ed has affected my pedagogy ever since he advised, “Never answer a student’s question directly.  Always seek first to understand the misunderstanding that caused the question to be asked in the first place.”  Forgetting this advice leads to poor mentoring on my part when interacting with students or colleagues.

Ed started his talk off with a wonderful hook: “Let’s look at four things that happened in 1969 – man walked on the moon, Woodstock happened, the Mets won the World Series, and the first data packet was sent over the Arpanet (precursor to the Internet).  Forty years later, which of these four events had the most impact?”  Clearly, all of us answered the Internet.

Ed then went through the evolution of approaches to science:

  • Theory
  • Experiment
  • Observation
  • Computational Science – the world of simulation
  • Today:  eScience – the dawn of data driven science

Connecting with Wong’s talk, Ed pointed out that the Sloan Digital Sky Survey, which created the data source for the WWT, collected 80 Terabytes (TB) over seven years.  This volume of data precipitated the shift toward public sharing of data – a big leap from the earlier practice of hoarding data, a practice rooted in how expensive data was to collect and curate.  We are seeing the democratization of science through this sharing of big data.

Current desktop gene sequencers generate 17 TB of data per day.  Imagine the amount of data that a roomful of these desktop gene sequencers create – every hour, every day, 24/7.
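Ed’s numbers invite a quick back-of-envelope calculation.  Here is a minimal sketch in Python; the 17 TB/day and 80 TB figures are from the talk, while the 20-machine “roomful” is purely my hypothetical assumption:

```python
# Back-of-envelope arithmetic for the data volumes in Ed's talk.
# 17 TB/day per sequencer and the Sloan survey's 80 TB are from the talk;
# the roomful of 20 machines is a hypothetical assumption.
TB_PER_SEQUENCER_PER_DAY = 17
MACHINES_IN_ROOM = 20  # assumed

daily_tb = TB_PER_SEQUENCER_PER_DAY * MACHINES_IN_ROOM  # 340 TB/day
yearly_pb = daily_tb * 365 / 1000                       # ~124 PB/year

# The Sloan survey's entire 80 TB -- seven years of work -- would be
# matched by this hypothetical room in a matter of hours.
hours_to_match_sloan = 80 / daily_tb * 24

print(f"{daily_tb} TB/day, about {yearly_pb:.0f} PB/year; "
      f"Sloan's 80 TB in {hours_to_match_sloan:.1f} hours")
```

The point of the arithmetic is the scale mismatch: one hypothetical room of sequencers would out-produce a seven-year sky survey before lunch.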

In his wonderfully fact-based presentation style, Ed shared several observations on how these advances in big data are destroying the economics of the university.  Lazowska connected nicely with the talks from Smith and Wong by arguing convincingly for the connection of the humanities to big data to aid in telling the stories buried in the data.

With three talks completed and one to go, I knew I was in the presence of a special event.  Each speaker had a unique style and a unique point of view, yet they prepared and shared bridges and connections to the views of the other speakers.  As a regular attendee and presenter at academic and business conferences, I find it a rare occurrence for a group of panelists to coordinate their messages so seamlessly.

Now it was Cathy’s turn to share her thoughts on “How to Crowdsource Thinking”.  Over the last month, Cathy’s books, videos and prolific tweets entered my invisible university of thought leaders.  Of course, Cathy being a professor at Duke University, my alma mater, and sharing anecdotes about Shane Battier, one of my favorite Duke basketball players, helped a lot.

Cathy’s recent book Now You See It is an inspiration at so many different levels.  At the top level, her insights give me a completely different way to see both the student/teacher relationship as well as the manager/employee relationship.  As a practitioner and teacher of human centered design, I spent the last four years trying to improve the learner centeredness of my courses in the UW HCDE Department and in the UW Foster Business School.  While I made some improvements, I was operating without a framework or theory or evidence based method for teaching AND assessment.

Through Davidson’s research and innovation in the classroom, I now have several frameworks to use to change the learning dynamics in the classroom and “in the wild.”  I loved Cathy’s description of her charter while she was Vice Provost at Duke “break things and make things.”

Cathy issued several interesting questions and observations as she began her talk:

  • If <1% of college students go on to become tenure track professors, why is an English department structured for the 1% rather than the 99%?
  • Quoting Clay Shirky “institutions tend to preserve the problem they were designed to solve,” Professor Davidson asked “why do we have to keep preserving the institution of higher learning?”
  • Why are we letting the industrial-educational complex drive scientific labor management into scientific learning management?
  • Why do we continue to use the “A, B, C, D” grading system that even the American Meat Packing Association rejected within months of starting its use?

Why indeed?

As she described in her book, Cathy’s talk was from the union of head and heart as she did not use a prepared paper and had relatively few slides.  As an audience member, it was easy to feel that I was in a conversation, not a lecture.

The Q&A session was lively and gave the panelists an opportunity to share an even wider range of thoughts.  As I left the room, I had to chuckle at the sign on the back table.

As I walked from the Ballroom at the Seattle Sheraton back to the ferry terminal for the ride home to Bainbridge Island, I was flooded with imagery of a software application that I would really love from today’s panel discussion.  I imagined that prior to the panel discussion, all of the books, publications, videos, and reference pointers from the speakers were loaded into a visually rich environment like Wong’s WWT engine.  And the panelist presentations and Q&A session were recorded (video and audio) and loaded into the WWT engine.  Then, after the session, each of the panelists would curate a journey through this n-dimensional media space and capture their reflections on what they took away from the event and the ideas shared by the other panelists.  Afterward, all of us as participants could start adding our reflections and relevant references to the compendium and curate our own journeys through this rich topic space.  What a conversation that would be.

This kind of tool would truly be “content with context.”

The lasting benefit of the panel discussion today is that along with Cathy and Ed, I am adding three more professors to my invisible university.
