Business Intelligence: Sometimes a problematic term

I often find myself between the world of military language and the quite different language used in the information technology domain. Naturally it did not take long before I understood that term mapping, or translation, was the only way around it, and that I can often act as a bridge in discussions: understanding that when one side says something, it needs to be translated or explained to make sense in the other domain.

To an intelligence officer, the term Business Intelligence is of course extremely problematic. The CIA has a good article that dives into the importance of defining intelligence, but also some of the problems. In short, I think the definition used in the Department of Defense (DoD) Dictionary of Military and Associated Terms illustrates the core components:

The product resulting from the collection, processing, integration, evaluation, analysis, and interpretation of available information concerning foreign nations, hostile or potentially hostile forces or elements, or areas of actual or potential operations. The term is also applied to the activity which results in the product and to the organizations engaged in such activity (p.234).

The important thing is that in order to be intelligence (in my area of work) it both has to have gone through some sort of processing and analysis AND cover only things foreign – that is, information of a certain category.

When I first encountered the term business intelligence at Lund University in southern Sweden, it represented activities done in a commercial corporation to analyse the market and competitors. It almost sounded like a way to take the methods and procedures from military intelligence and just apply them in a corporate environment. Still, it was not at all focused on structured data gathering and statistics/data mining.

So when speaking about Business Intelligence (BI) in a military or governmental context it can often cause some confusion. From an IT perspective it involves a set of technical products doing Extract-Transform-Load (ETL) and Data Warehousing, as well as the front-end products used by analysts to query and visualise the data. Here comes a first, more philosophical, issue in the light of the definition of intelligence above. As long as the main output is gathered data visualised through Enterprise Reporting or Dashboards directly to the end user, it is a grey area whether I would consider it processed at all. In that use case Business Intelligence sometimes claims to be more (in terms of analytical ambition) than a person with an intelligence background would expect.
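To make the IT perspective concrete: the Extract-Transform-Load chain can be sketched in a few lines. This is a minimal, hypothetical sketch – the data, table and column names are invented, and an in-memory SQLite database stands in for the data warehouse:

```python
import sqlite3

# Hypothetical export from an operational system (the "extract" source).
raw_rows = [
    {"region": " North ", "sales": "1200"},
    {"region": "south",   "sales": "800"},
    {"region": "North",   "sales": "300"},
]

def extract():
    # In a real pipeline: read from files, APIs or source databases.
    return raw_rows

def transform(rows):
    # Clean and normalise: trim whitespace, unify case, cast types.
    return [(r["region"].strip().title(), int(r["sales"])) for r in rows]

def load(rows, conn):
    # Load into the warehouse table that the front-end tools query.
    conn.execute("CREATE TABLE sales_fact (region TEXT, sales INTEGER)")
    conn.executemany("INSERT INTO sales_fact VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")  # stand-in for the data warehouse
load(transform(extract()), conn)

# A dashboard-style query against the warehouse:
for region, total in conn.execute(
        "SELECT region, SUM(sales) FROM sales_fact "
        "GROUP BY region ORDER BY region"):
    print(region, total)
```

Everything after the load step is what the end user sees – and note that nothing in this chain, by itself, involves much analysis.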

OK, so just displaying some data is not the same thing as doing in-depth analysis of it, using statistics and data mining to find patterns, correlations and trends. One of the major players in the market, SAS Institute, has seen exactly that and has tried to market its offering as something more than ”just” Business Intelligence by renaming it Business Analytics. The idea is to achieve ”proactive, predictive, and fact-based decision-making”, where the important word, I believe, is predictive. That means that Business Analytics claims not just to visualise historic data but also to make predictions about the future.
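The difference between describing history and claiming to predict it can be made concrete with the simplest possible example: fitting a least-squares trend line and extrapolating one step ahead. The numbers are made up, and real predictive analytics is of course far more sophisticated than this sketch:

```python
# Made-up monthly figures; descriptive BI would simply chart these.
history = [100, 110, 125, 130, 145, 150]

# "Predictive" analytics, at its most naive: fit a trend and extrapolate.
n = len(history)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(history) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# The predictive claim: a value for a month that has not happened yet.
forecast = intercept + slope * n
print(round(forecast, 1))
```

The leap from the chart to the forecast is exactly the leap from visualising to predicting – and it is a claim, not a guarantee.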

An article from BeyeNETWORK also highlights the problematic nature of the term business intelligence, because it is so often tied to data warehousing technology and, more importantly, because only part of an organisation’s information is structured data stored in a data warehouse. Coming from the ECM domain I completely agree, but it says something about the problem of assuming both that BI covers all the data we need to work with and that BI is all we need to support decision-makers. The article also discusses what analysis and analytics really mean. Wikipedia says this about data analysis:

Analysis of data is a process of inspecting, cleaning, transforming, and modeling data with the goal of highlighting useful information, suggesting conclusions, and supporting decision making.

The question is then what the difference is between analysis and analytics. The word business appears in both terms because a common application of business intelligence is measuring performance throughout an organisation, through processes that are being automated and are therefore measurable to a larger degree. The BeyeNETWORK article suggests the following definition of business analytics:

“Business analysis is the process of analyzing trusted data with the goal of highlighting useful information, supporting decision making, suggesting solutions to business problems, and improving business processes. A business intelligence environment helps organizations and business users move from manual to automated business analysis. Important results from business analysis include historical, current and predictive metrics, and indicators of business performance. These results are often called analytics.”

Looking at the suite of products covered under the BI umbrella, that approach downplays the fact that these tools and methods have applications beyond process optimisation. In law enforcement, intelligence, pharmaceuticals and other fields there is huge potential to use these technologies not only to understand and optimise internal processes but, more importantly, the world around them that these organisations are trying to understand: seeing patterns and trends in crime rates over time and geography, using data mining and statistics to improve understanding of a conflict area, or making sense of the results of years of scientific experiments. Sure, there are toolsets marketed more along the lines of statistics for use in economics and political science, but those applications could really use the capabilities of the BI platform rather than something run on an individual researcher’s notebook.
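As a toy illustration of the ”patterns over time and geography” point: the same aggregation machinery a BI platform provides can surface diverging trends per district. All data below is invented for the example:

```python
from collections import defaultdict

# Invented incident reports: (month, district).
incidents = [
    (1, "North"), (1, "North"), (1, "South"),
    (2, "North"), (2, "South"), (2, "South"),
    (3, "South"), (3, "South"), (3, "South"),
]

# Aggregate incident counts by district and month.
counts = defaultdict(int)
for month, district in incidents:
    counts[(district, month)] += 1

# A very simple "trend": month-over-month change per district.
for district in ("North", "South"):
    series = [counts[(district, m)] for m in (1, 2, 3)]
    deltas = [b - a for a, b in zip(series, series[1:])]
    print(district, series, deltas)
```

Even this trivial example shows one district cooling down while the other heats up – the kind of signal that only appears once the data is aggregated over both time and geography.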

In this article from Forbes it seems that IBM is also using business analytics instead of business intelligence to move from simple dashboard visualisations towards predictive analytics. This can of course be related to IBM’s acquisition of SPSS, which is focused on exactly that area of work.

(Image from the book by Davenport and Harris, 2007.)

However, neither the notion of Business Intelligence nor that of Business Analytics says anything about what kind of data is actually being displayed or analysed. From a military intelligence perspective this means that BI/BA tools and methods are just one of many analytical methods employed on data describing ”things foreign”.

In my experience, misunderstandings can come from the other end as well. Consider a military intelligence branch using – here it comes – BI software to analyse incoming reports. From an outsider’s perspective it can seem as if what makes their activity into (military) intelligence is that they use some form of BI tools and then present graphs, charts and statistical results to the end user. As a result, I have heard over and over again that we should also ”conduct intelligence” on, for instance, our own logistics systems to uncover trends, patterns and correlations. That is wrong, because intelligence specialists are skilled both in analytical methods (in this case BI) and in the area or subject they are studying. However, since these tools are called Business Intelligence, the risk of confusion is of course high, simply because of the word intelligence in there. What such a person means is of course that BI/BA tools seem useful for analysing logistics data as well as data on ”things foreign”. A person analysing logistics should of course be a logistics expert rather than an expert on insurgency activities in failed states.

So let’s say that what we currently know as the BI market evolves even further and really lays claim to being predictive – a logical argument at the executive level, since the investment must provide something more than just self-service dashboards. From a military intelligence perspective that becomes problematic, since not all of those activities need to be predictive. In fact it can be very dangerous if someone is led to believe that everything can be predicted in contemporary, complex and dynamic conflict environments. The smart intelligence officer rather needs to understand when to use predictive BI/BA and when she or he definitely should not.

So Business Intelligence is a problematic term because:

  • It is a very wide term for both a set of software products and a set of methods
  • It is closely related to data warehousing technology
  • It includes the term intelligence which suggests doing something more than just showing data
  • Military Intelligence only covers ”things foreign”.
  • The move towards expecting prediction (by renaming it to Business Analytics) is logical but dangerous in a military domain.
  • BI can still be a term for open-source analysis of competitors in commercial companies.

I am not a native English speaker, but I do argue that we must be careful to use such a strong word as intelligence only when it is really justifiable. Of course it is too late for that, but it is still worth reflecting on.


More Presentation Support Tools but fewer (PowerPoint) slide shows

In a recent article called ”We Have Met the Enemy and He Is PowerPoint”, Elisabeth Bumiller describes a big outcry to stop using PowerPoint because it is said to make us worse at decision-making. I agree, and can just reiterate a quote from the top US intelligence official in Afghanistan, Maj Gen Michael Flynn, in the report ”FIXING INTEL: A BLUEPRINT FOR MAKING INTELLIGENCE RELEVANT IN AFGHANISTAN”:

“The format of intelligence products matters. Commanders who think PowerPoint storyboards and color-coded spreadsheets are adequate for describing the Afghan conflict and its complexities have some soul searching to do.”

These are quite hard words directed at his commanders in ISAF and the US component in Afghanistan, but I think he is right. The underlying issue, however, is a desire to simplify things that should not be simplified, combined with a lack of vision when it comes to tool support for higher levels of military command. Basically, the tools supposed to support that kind of planning are either general-purpose tools like Microsoft Office or highly specialised military applications that exist in their own stove-pipes.

Oversimplifications
With PowerPoint comes a method, and that method mainly consists of boiling information down to single bullets. Perfect for fine-tuned marketing messages that want to leave just a few critical words or terms in the heads of the recipients. Not so good for complex reasoning about complex issues like modern conflicts. PowerPoint sets out to convey a message, when we should instead focus on improving our understanding of the situation.

Static representations
Most PowerPoint presentations are very static in nature. They usually represent a manually crafted snapshot of a given situation, which means they can become outdated very quickly. As time goes on there are more and more static presentations that should be regularly updated but usually never are. Either they disappear on the file shares, if the organisation lacks an Enterprise Content Management system, or there is no process for monitoring which presentations need to be updated – usually because all traceability was lost when they were created. Some companies have implemented dynamic areas in their presentations, where for instance weekly sales figures are refreshed when the presentation opens, but that is far from keeping track of the origins of each bullet, diagram and image.

Labour-intensive work
As described in the article, quite a few junior officers spend their time collating information and transforming it into presentations. To start with, there is much to be done to support this kind of ”research work”, where users are navigating and searching for relevant pieces of information. After the information has been collated, the next part of the work starts: transforming it into presentations using a template of some kind. Decision-makers usually have opinions about how they want their presentations set up, so that they recognise the structure of the information from one time to the next. Add to that the fact that most organisations have a graphical profile to adhere to, which dictates a common styling and formatting of the content. To me, all this calls for a more semi-automated way of compiling the information. I am not saying that all content can be templated, far from it, but where it is possible it would save a lot of time. Hopefully time that could be spent thinking instead of searching and formatting in PowerPoint.

Lack of interactivity
Another problem with these static representations is that, since they usually take hours to compile, the flexibility in the actual briefing situation is low. If the decision-maker suddenly asks to filter the information from another perspective, in say a graph, the unfortunate answer will be: ”We will get back to you in half an hour or so”. Not exactly the best conditions for inspiring reflections that put complex problems in a new light. Spotfire has even written a paper about this, called ”Minority Reports – How a new form of data visualization promises to do away with the meetings we all know and loathe”. The ability to introduce dynamic, interactive data can give us a new environment for meetings, especially if we also have access to large multi-touch walls that invite more than one person to easily manipulate and interact with the data.

Format matters
The General is right: format matters. There is a need for several different formats of the same information. Maj Gen Flynn calls for more products based on writing, which allows people to follow a more complex line of reasoning. That tackles the simplification aspect of the problem. However, there is still a need to do things together in a room, and handing out written reports in 12-point Times New Roman is not the answer. In fact we need a revolution in the visualisation of all the information we have decided to store digitally, especially since we are increasingly able to provide structure to unstructured information with metadata, and to collect data in XML-based data structures. We need more presentation and visualisation support to work productively with our information – and less PowerPoint, because it is a very time-consuming way to do things that can be done much better with another set of tools. Multi-channel publishing is an established concept in marketing, meaning that the same content can be repurposed for print, web, mobile phones and large digital signage screens. We need to think in a similar way about what we use PowerPoint for today. There are even complete toolsets, such as EMC Document Sciences, which – surprise – are based on templates for producing customised market communications where static content meets dynamic content from databases, in this case built around common design tools such as Adobe InDesign.

The Space Shuttle Columbia experience
One example where the use of PowerPoint was a contributing factor is the tragic loss of Space Shuttle Columbia. The Columbia Accident Investigation Board (CAIB) enlisted Professor Edward Tufte of Yale University to analyse the communication failure that in the end left NASA unaware of the seriousness of the foam strike. The board made the following finding, well in line with General Flynn’s observations:

At many points during its investigation, the Board was surprised to receive similar presentation slides from NASA officials in place of technical reports. The Board views the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.

Tufte goes on to argue that the low resolution of the PowerPoint slides forced technical terms to be abbreviated, adding ambiguity, and that the usual large font size in headlines also forced shortening. He also notes that the typography and hierarchies imposed by the bullet organisation added confusion, and that in NASA’s case more advanced typographic features for handling math and other technical formatting were needed.

During the Return to Flight work later on this was further emphasized with the following statement:

”Several members of the Task Group noted, as had CAIB before them, that many of the engineering packages brought before formal control boards were documented only in PowerPoint presentations,”

Unfortunately, this is something I can relate to in my line of business. The main form of documentation is slide shows being emailed around. Since you know they will be emailed around without you being there to talk them through, I believe many people add a lot of extra text, which turns the slides into some kind of in-between creatures: neither slide shows nor reports. At least the added words hopefully reduce ambiguity to some degree. I have now started recording my presentations with my own voice to mitigate this.

The Physical Resolution is usually too low
To further add to the Columbia findings, I have serious issues with how briefing rooms are usually set up today. They usually have only one projector, with a resolution between 1024×768 and 1280×1024. Many laptops today have widescreen displays, which when used in ”clone mode” on a 4:3 projector makes the image look really skewed. And when projectors do handle widescreen formats, especially at higher resolutions, those are never used because:

  • Users are given computers with underpowered graphics cards that cannot really handle full HD (1920×1080) resolution.
  • Users don’t know anything other than ”cloning” their screen: what you see on the laptop is what you see on the projector. That in essence limits the projector’s resolution to whatever the laptop handles – again because users have been given cheap computers.
  • The resolution has to be turned down from the highest one ”because everything became too small to see”. The reason is that the physical screen is too small, which makes the projector sit too close and the actual pixels too small to see from most of the room.

Combine that with PowerPoint templates with big font sizes and we have a situation where not a lot of information can be displayed, which I think also adds to the oversimplification problem.

Why the Afghan ”Spaghetti image” is actually rather good

The NYT article contains an image from the Afghanistan conflict with hundreds of nodes connected by arrows in different colours, given as an example of the problems of using PowerPoint. To start with, I am not even sure the image was made in PowerPoint, at least not originally. I think a more likely candidate is Consideo, which is a MODELLING tool, not a PRESENTATION tool. The problem with the image is that once it enters the PowerPoint world it is static, with no connection to underlying data. Imagine instead that the image was a dynamic and interactive visualisation of objects, with relationship objects powering the lines. Metadata would allow filtering based on object and relationship attributes. Suddenly the image is just one of almost endless perspectives on the conflict. Imagine if all these nodes were also connected to underlying data such as reports and written analyses. Then it would become easier even for an outsider to start understanding the image. We also need to understand that some visualisations are not intended for the decision-maker; sometimes, in order to understand them, you need to have been in the room most of the time, so that you know how the discussions went. So this image is potentially rather good, because it does not contain oversimplified bullets but is instead something you could probably stare at for hours while reflecting. However, it MUST NOT be an image that is manually updated in PowerPoint – it has to be a generated visualisation on top of databases.
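To sketch what ”one of almost endless perspectives” could mean in practice: if nodes and relationships carry metadata, each image becomes a filtered query rather than a hand-drawn artefact. Everything below – node names, attributes, relationship kinds – is invented for illustration:

```python
# Nodes and relationships with metadata, instead of a static drawing.
nodes = {
    "tribal_leaders": {"domain": "social"},
    "poppy_trade":    {"domain": "economic"},
    "ied_attacks":    {"domain": "security"},
    "road_network":   {"domain": "infrastructure"},
}
edges = [
    ("poppy_trade",  "tribal_leaders", {"kind": "funds"}),
    ("poppy_trade",  "ied_attacks",    {"kind": "funds"}),
    ("road_network", "ied_attacks",    {"kind": "enables"}),
]

def view(domains=None, kinds=None):
    """One perspective: only the nodes and relationships matching the filters."""
    keep = {name for name, attrs in nodes.items()
            if domains is None or attrs["domain"] in domains}
    return [(a, b, e["kind"]) for a, b, e in edges
            if a in keep and b in keep
            and (kinds is None or e["kind"] in kinds)]

print(view())                  # the full "spaghetti" picture
print(view(kinds={"funds"}))   # just the financial flows
```

With each node also linked to its underlying reports, the drill-down asked for above comes almost for free.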

Still valid for marketing
The almighty master of presentations, Steve Jobs, who actually uses Apple Keynote instead of PowerPoint, will most likely continue using that format. He delivers a very precise marketing message with slides that contain very little text. The rest of us, who are not selling iPads, need to start figuring out a smarter way to do business. Newer versions of an ever more complex MS PowerPoint are simply not the answer; it is so general-purpose that it no longer fits anyone – at least if you care about your own time and data quality. It helps to some degree that both Keynote and PowerPoint use XML today, which means that using them as just a front-end is technically possible. The real issue has to do with information architecture and usage.

Conclusion
Oh, so how to do this, then? Use Enterprise Content Management systems to manage your content and move to a concept where content is handled in XML, so it can be reused and repurposed while preserving traceability. Have a look at my other blog post on ”The Information Continuum” to get an idea of how. Since we do store all of our information digitally, there is a need for much more in terms of visualisation and presentation support tools – not less. However, we need to find a way to present lines of reasoning with a capability to drill down, utilising that traceability. Maybe presentations will to some degree become renditions with links back to text, data, graphs, images or whatever. We need to accept that in many cases it isn’t realistic to boil everything down to summaries, and instead be able to explore the data ourselves. Now, let us set up our mindset, software and meeting rooms to do just that!
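A minimal sketch of the XML idea, with invented element names: one source document, several renditions, and ids preserved so that every bullet can be traced back to its origin:

```python
import xml.etree.ElementTree as ET

# One XML source; each statement keeps an id so renditions stay traceable.
source = """
<report>
  <finding id="f1">Supply routes shifted north in March.</finding>
  <finding id="f2">Local support for the programme is growing.</finding>
</report>
"""

root = ET.fromstring(source)

def as_slide(root):
    # Rendition 1: terse bullets for the briefing room, ids kept for drill-down.
    return ["* {} [{}]".format(f.text, f.get("id")) for f in root.iter("finding")]

def as_document(root):
    # Rendition 2: running text for the written report, same underlying content.
    return " ".join(f.text for f in root.iter("finding"))

print("\n".join(as_slide(root)))
print(as_document(root))
```

Change the source once and every rendition follows – the opposite of today’s manually maintained slide snapshots.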


EMC World 2010: My presentation on using Documentum in a SOA platform

Yesterday, on Monday May 10 at 11 am, I gave a talk at the Momentum 10 conference here at EMC World 2010 in Boston. The presentation focused on our experiences of building an experimentation platform for next-generation information and knowledge management (IKM) for a large operational-level military HQ. Contemporary conflicts are complex and dynamic in character and require a new approach to IKM in order to handle all those complexities, based on sound management of our digital information. At the core of our platform is EMC Documentum, integrated over an Enterprise Service Bus (ESB) from Oracle. The goal is to maintain access to and traceability of the information while removing stove-piped systems.

I have received quite a few positive reactions from both customers and EMC people after the session, which of course is just great. For instance, see these notes from the session. All the presentations will be available for download for all participants, but that will most likely take some time, so in the meantime you can download my presentation here instead:

Presentation at EMC World 2010 in Boston

Looking forward to comments and reflections. The file is quite big, but that is because my presentation is heavy on screenshots, and downsampling them to save file size would make it too hard to see what they are showing. Try zooming in to see the details.


Can BPM meet Enterprise 2.0 over Adaptive Case Management?

The project that I am running at JCDEC involves a lot of internal ”marketing”, targeting both end users and the people in charge of our IT projects. Lately I have found myself explaining the difference between workflow processes using Documentum Process Engine and Taskspace on the one hand, and EMC’s new clients Centerstage Pro and Media Workspace on the other. My best argument so far has been that BPM/workflow is well suited for formal, repeatable processes in the HQ, while Enterprise 2.0 clients take care of ad-hoc and informal processes. Keith Swenson explains Taylorism-based Scientific Management as the foundation of Business Process Management in this blog post in a good way. He goes on to provide a bridge over to the ad-hoc work that nowadays is done by the so-called Knowledge Worker. Documentum Centerstage is a tool intended for the Knowledge Worker, which can also be seen as the Enterprise 2.0 way of working.

However, Keith then steers us towards a concept called Adaptive Case Management, which is supposed to address those more agile and dynamic ways of working, in contrast to the slow-changing, well-defined business processes deployed in traditional BPM systems. As I understand it, this focuses a lot on the fact that users themselves (instead of a process designer) need to be able to control templates, process steps and various other things in order to support more dynamic work such as criminal investigations or medical care.
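The contrast can be sketched in a few lines: a traditional BPM process is fixed at design time, while in an adaptive case the case owner adds and reorders steps at runtime. All class and step names below are invented for illustration:

```python
# A traditional BPM process: the sequence is fixed by a process designer.
FORMAL_PROCESS = ["register", "review", "approve", "archive"]

class AdaptiveCase:
    """A case whose steps the knowledge worker shapes while working."""

    def __init__(self, template=None):
        self.steps = list(template or [])  # optionally start from a template
        self.done = []

    def add_step(self, step, position=None):
        # The case owner, not a process designer, changes the process.
        if position is None:
            self.steps.append(step)
        else:
            self.steps.insert(position, step)

    def complete_next(self):
        self.done.append(self.steps.pop(0))

case = AdaptiveCase(template=["open investigation", "collect evidence"])
case.complete_next()
case.add_step("interview witness", position=0)  # new step discovered mid-case
case.complete_next()
print(case.done, case.steps)
```

The template gives the formal starting point; the runtime changes are what makes the case ”adaptive” rather than a predefined workflow.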

Adaptive Case Management is also (as I understand it) a central concept in the book ”Mastering the Unpredictable”. The idea is to focus on the unpredictable nature of some work situations, but also to reflect on the degree to which things really are unpredictable. In this presentation by Jacob Ukelson, the argument is that the main bulk of work is unpredictable, which also means that process modelling using traditional BPM most likely won’t work for it.

Some people argue that there is no need to redefine BPM and that all these three-letter acronyms do not contribute much to the understanding of either the problem or the solutions. I think I disagree, and the reason is that there are no silver-bullet products covering everything you need. Most organisations start somewhere and roll out systems based on their most pressing needs. I believe these systems have some similarities in what they are good and bad at; having bought an ECM, BI, CRM or ERP system usually says something about which business problems have been addressed. As SOA architectures mature and the ambition to reduce stove-pipes increases, the complementary character of these systems starts to matter. It also matters which of these vendors you choose, because the consolidation into a few larger vendors means choosing between different approaches.

To me, all of this means an opportunity to leverage the strong points of different kinds of platforms. Complex, sure, but if you have the business requirements it is probably better than building from scratch. So I think that when companies quickly roll out Enterprise 2.0 platforms from smaller startup vendors, they soon discover that they risk creating yet another stove-pipe, in this case consisting of social information. Putting E 2.0 capabilities on top of an ECM platform then makes a lot of sense, in order to integrate social features with existing enterprise information. The same most likely goes for BI, CRM and so on.

When it comes to BPM, the potential lies in extending formal processes with social and informal aspects. However, it is likely that E 2.0-style capabilities will make new ways of working evolve and emerge. Sooner or later these need to be formalised, maybe into a project or a community of interest. Being able to leverage the capabilities of the BPM platform for monitoring, and some kind of best practice in the form of templates, is not far-fetched. To some degree I believe Adaptive Case Management solutions should sometimes be used instead of just a shared Centerstage Space, because you need these added formal aspects but still want to retain some flexibility. Knowledge Worker-style work can then be done on top of a BPM infrastructure, while at the same time utilising the ECM infrastructure for all content objects involved in the process. Having a system like Documentum, which is good at content-centric human workflow processes, makes a lot of sense here.

So is Documentum xCP a way to address this middle ground between process-modelling-based processes and the Knowledge Worker-style support in CenterStage? The mantra is ”configure instead of code”, which implies a much more dynamic process. I have not played around with xCP yet – so far we have only deployed processes developed from scratch, instead of trying out the case management templates that come with the download.

Not all companies will want to do this, but I think some will soon see the merits of integrating ECM, BI, E 2.0 and BPM/ACM solutions using SOA. The hard part, I believe, is finding software and business-method support for the agile and dynamic change management of these systems. The key is to support various degrees of ad-hoc work, where at one end the user does everything herself and at the other a more traditional developer codes modules. Being able to dynamically change, model and remodel not only processes but also the data model for content types in Documentum is a vital capability for responding to business needs in a way that maintains trust in the system. This is not a task for IT, but something done by some kind of Information and Knowledge Management (IKM) specialist, who can get proper means of doing their work through this SOA-based integration of different sets of products.

So employ E 2.0-style features in task management clients, and make sure that E 2.0-style clients include tasks from BPM/ACM in their activity streams or unified inboxes. Make sure that all of this is stored in an ECM platform with full auditing capabilities, and that the audit data is off-loaded to a data warehouse so it can be dynamically analysed using interactive data visualisation, statistics and data mining. I hope we can show a solution for that in our lab soon.


The Long Tail of Enterprise Content Management

Question: Can we expect a much larger share of the available content to be consumed, or used by at least a few people, in our organisations?

Shifting focus from bestsellers to niche markets
In 2006 the editor-in-chief of Wired magazine, Chris Anderson, published his book ”The Long Tail – Why the Future of Business is Selling Less of More”. Maybe the text printed at the top of the cover, ”How Endless Choice is Creating Unlimited Demand”, is the best summary of the book. This might have been said many times before, but I felt a strong need to put my reflections into text after reading it, since it put a vital piece of the puzzle in place regarding our efforts to implement Enterprise 2.0 within an ECM context.

Basically, Chris Anderson sets out to explain why companies like Amazon, Netflix, Apple iTunes and several others make a lot of money selling small amounts of a very large set of products. It turns out that even among millions of songs/books/movies, nearly all of them are rented or bought at least once. What makes this possible comes down to three things:

– Democratization of production, which means that the tools and means to produce songs, books and movies are available to almost everybody at a relatively low cost.
– Democratization of distribution, where companies can broker large amounts of digital content because the cost of keeping a large stock of digital content is very low compared to real products on real shelves in real warehouses.
– Connecting supply and demand, so that all this created content meets its potential buyers; the tools for that are search functions, rankings and collaborative reviews.

What this effectively means is that the hit culture, where everything is focused on a small set of bestsellers, is replaced with vast amounts of small niches. That probably has an effect on society as a whole, since the time when a significant share of the population was exposed to the same thing at the same time is over. This is also reflected in the explosion of specialised TV channels and TV/video-on-demand services that let viewers choose not only which show to watch but also when to watch it.
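A toy calculation shows why the tail matters. Assume (purely hypothetically) Zipf-like demand, where item number k sells in proportion to 1/k; with a large enough catalogue, everything outside the top 100 jointly outweighs the bestsellers:

```python
# Hypothetical Zipf-like demand over a very large catalogue.
N = 100_000                              # catalogue size: the endless shelf
weights = [1 / k for k in range(1, N + 1)]
total = sum(weights)

hits = sum(weights[:100]) / total        # share taken by the top 100 hits
tail = sum(weights[100:]) / total        # share taken by everything else
print(round(hits, 2), round(tail, 2))
```

Under this assumption the tail accounts for more demand than the hits – which is only worth serving when stocking it costs next to nothing.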

Early Knowledge Management and the rise of Web 2.0
Back in the late 1990s, Knowledge Management efforts thrived, with great aspirations of getting a grip on the knowledge assets of companies and organisations. Although there are many views and definitions of Knowledge Management, many of them focused on increasing the capture of knowledge, on the premise that applying that captured knowledge would lead to greater efficiency and better business. However, partly because of technical immaturity, many of these projects did not reach their ambitious goals.

Five or six years later the landscape had changed completely with the rise of YouTube, Flickr, Google, Facebook and many other Web 2.0 services. They provided a radically lowered threshold for contributing information, and the whole web shifted from a focus on consuming information to producing and contributing it. This was in fact just the democratization of production again, but this time covering not only products to sell but information of all kinds.

Through the large-scale hubs of YouTube, Flickr and Facebook, the distribution aspect of the Long Tail was covered as well, since all this new content was spread in clever ways to friends in our networks, or to niche ”consumers” finding it via tagging and recommendations. Maybe my friend network on Facebook is, in essence, a representation of a small niche market interested in following what I am contributing (doing).

Social media goes Enterprise
When this effect started spreading beyond the public internet into corporate networks, the term Enterprise 2.0 was coined by Andrew McAfee. Inside the enterprise, people were starting to share information on a much wider scale than before, and in some respects the old KM dreams finally came into being. This time it was not driven by formal management plans but by social factors and networking that genuinely inspired people to contribute.

From an Enterprise Content Management perspective, this also means that if we can put all this social interaction and generated content on top of an ECM infrastructure, we can achieve far more than just supporting formal workflows, records management and retention demands. The ECM repository has the potential to become the backbone that provides all kinds of captured knowledge within the enterprise.

The interesting question is whether this also marks a cultural change in what types of information people devote their attention to. One could argue that traditional ECM systems encourage a limited, ”hit-oriented” consumption of information. The absence of good search interfaces, recommendation engines and collaboration features probably left most of the information unseen.

Implications for Enterprise Content Management
The social features of Enterprise 2.0 change all that. Suddenly the same exposure effect can be seen for enterprise content as we have seen for consumer goods. There is no shortage of storage space today. The number of stored objects is already large and will grow substantially, since contributing is so much easier. Social features expose content based on interests, competencies and networks rather than on what management wants to push. People interested in learning have somewhere to go, even for niche interests, and those wanting to share get affirmation when their content is read and commented on by others, even if only by a few. Advanced search and the exploitation of social and content analytics can create personalised mashup portals and push notifications of interesting content or people.

Could this Long Tail effect make a difference to the knowledge management perspective as a whole? This time not from the management side, but from the learning side. Can we expect a much larger share of the available content to be consumed or used by at least a few people in the organisation? Large organisations have a fairly large number of roles and responsibilities, so there must reasonably be great differences in what information people need and with whom they need to share it. The Long Tail effect, in ECM terms, could be a way to illustrate how a much larger percentage of the enterprise content is used and reused. More information is not necessarily better, but this can mean more of the right information reaching more of the right people. Add to that the creative effect of being constantly stimulated by ideas and reflections from others around you, and it could be a winning concept.

Sources

Anderson, Chris, ”The Long Tail – Why the Future of Business is Selling Less of More”, 2006
Koerner, Brendan I., ”Driven by Distraction – How Twitter and Facebook make us more productive workers”, Wired Magazine, March 20
