Category: ECM

iPhone/iPad and mobile access to ECM


Inspired by my recent discovery of a Documentum client for iPhone and iPad by Flatiron Solutions, I wanted to do some research into what is going on when it comes to mobile access to Enterprise Content Management systems using iPhone OS. It turned out that there are a few solutions out there, but first I would like to dwell a little on the rationale for all of this.

First of all, we are of course going more and more mobile. Sales of laptop computers are increasing at the expense of stationary ones. Wireless high-speed internet is no longer just available as AirPort/WiFi but also as 3G/4G connections using phones and dongles for laptops. Nothing new here. Another recent change is Web 2.0 and its work-related counterpart Enterprise 2.0, which is now gaining a lot of traction among companies and organisations. It is all about capitalising on the Web 2.0 effects but in an Enterprise context: a lower threshold to produce information, and an even lower one to participate with comments and ratings based on relationships to people. All this drives consumption of information even more, as the distance between producer and consumers is shorter than ever before.

Here comes the new smartphone (basically following the introduction of the iPhone), where it actually makes sense to use it for a number of different tasks which previously were possible but not very pleasant to do. The bigger form factor of the iPad to me opens even more possibilities where mobile access meets E 2.0 based on ECM. Not only does the device make sense to use on the move, but it also has really good support for collaboration and sharing on the move.

It seems the open-source community is doing well here. Alfresco is an open-source ECM system created by the founders of Documentum and Interwoven, and there are actually a few solutions for accessing Alfresco on the iPhone. This SlideShare presentation outlines one solution:

iPhone Integration with Alfresco – Open Source ECM

Another is Fresh Docs for the iPhone, developed by Zia Consulting. The company also seems to have presented a Fresh Docs for FileNet iPad application at the IBM IOD (Information on Demand) Conference in Rome, Italy, May 19–21. It is open source and can be downloaded at Google Code.
Yet another product that provides iPad access is the open-source Saperion ECM. Open Text Social Media also provides an iPhone app for their platform. Another company that seems to have an iPhone app in the works is Nuxeo.
Cara for iPhone is also available from Generiscorp – an application that uses CMIS to connect to repositories with CMIS support, which includes both Documentum and Alfresco.
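Part of what makes a CMIS-based client like Cara attractive is that one SQL-like query language works against any compliant repository, whether it is Documentum or Alfresco behind the endpoint. A small sketch of building such a query (the helper function is my own illustration; a real client would use a CMIS library against an actual repository URL):

```python
def cmis_query(doc_type: str, terms: list) -> str:
    """Build a CMIS SQL query full-text searching for the given terms.

    In the CMIS text search expression, space-separated terms inside
    CONTAINS() are combined with AND by default (OR is an explicit keyword).
    """
    expression = " ".join(terms)
    return (f"SELECT cmis:objectId, cmis:name "
            f"FROM {doc_type} WHERE CONTAINS('{expression}')")

# The same query string can be sent to any CMIS-compliant repository.
print(cmis_query("cmis:document", ["budget", "2010"]))
```

The repository-neutral `cmis:` property names are what let a single client codebase stay unaware of which ECM system it is talking to.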
In our application mobile access is of somewhat less importance, but the iPad changes that to some degree. Even if you maybe can't offer mobile over-the-air access, enabling users to have large-screen multi-touch interfaces like the iPad is of course very interesting. From a Documentum perspective, the only thing we have seen in the mobile area from EMC itself is a Blackberry client for Centerstage (check p. 22 in the PDF) (there is also a Blackberry client available for IRM). I understand that Blackberry is popular in the US, but in terms of being visionary, having a nice iPhone OS app is important I think. As I said before, there are many similarities between how information is handled on the iPad and how an ECM system like Documentum handles information. It is all about metadata.

In light of the fact that Flatiron's iPhone app iECM is so far said to be not a product for purchase but rather a proof of concept, I wonder whether EMC itself or some partner would be the best way to provide a long-term iPhone OS app for Documentum.


EMC World 2010: Next-generation Search: Documentum Search Services

Presented by Aamir Farooq

Verity: Largest index 1 M docs

FAST: Largest index 200 M docs

There are challenging requirements today that all require tradeoffs. Instead of trying to plug in third-party search engines, EMC chose to build an integrated search engine for content and case management.

Flexible scalability is being promoted.

Tens to Hundreds of Millions of objects per host

Routing of indexing streams to different collections can be made.

Two instances can be up and running in less than 20 min!

Online backup/restore is possible with DSS, instead of just offline as with FAST.

FAST only supported Active/Active HA. DSS offers more options:

Active/Passive

Native security: ACLs and groups are replicated to DSS.

All fulltext queries leverage native security

Efficient deep facet computation within DSS with security enforcement. Security in facets is vital.

Enables effective searches on large result sets (underprivileged users are not allowed to see most hits in the result set)

Without DSS, facets are computed over only the first 150 results pulled into client apps

100x more with DSS
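As a toy illustration of why computing facets with security enforcement matters, here is a sketch in Python: facet counts are taken only over documents the querying user may actually read, rather than over the raw result list. The ACL model here is vastly simplified and invented for illustration; it is not how DSS actually represents security.

```python
from collections import Counter

# Toy documents, each with an ACL (groups allowed to read)
# and a facetable attribute.
docs = [
    {"id": 1, "format": "pdf",  "acl": {"staff", "managers"}},
    {"id": 2, "format": "docx", "acl": {"managers"}},
    {"id": 3, "format": "pdf",  "acl": {"managers"}},
    {"id": 4, "format": "pdf",  "acl": {"staff"}},
]

def facet_counts(docs, user_groups, facet):
    """Count facet values only over documents the user may read."""
    readable = [d for d in docs if d["acl"] & user_groups]
    return Counter(d[facet] for d in readable)

# Two users see different facet counts over the same result set.
print(facet_counts(docs, {"staff"}, "format"))
print(facet_counts(docs, {"managers"}, "format"))
```

If the counts were computed before the security trim, an underprivileged user could infer the existence of documents they are not allowed to see, which is exactly what security-enforced facets prevent.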

All metrics for all queries are saved and can be used in analytics. Run reports in the admin UI.

DSS Feature Comparison

DSS supports 150 formats (500 versions)

The only thing lacking now is thesaurus support (coming in v1.2)

Native 64-bit support for Linux and Windows (core DSS is 64-bit)

Virtualisation support on VMware

Fulltext Roadmap

DSS 1.0 GA is compatible with D 6.5 SP2 or later. Integration with CS 1.1 for facets, native security and XQuery.

Documentum FAST is in maintenance mode.

D6.5 SP3, 6.6 and 6.7 will be the last releases that support FAST

From 2011 DSS will be the search solution for Documentum.

Index Agent Improvements

Guides you through reindexing or simply processing new indexing events.

Failure thresholds: configure how many error messages you allow.

One Box Search: as you add more terms it does OR instead of AND between terms

Wildcards are not enabled OOTB, but this can be changed.
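The one-box behaviour described above can be mimicked with a small sketch; the function and its defaults are my own illustration, not DSS's actual query pipeline:

```python
def one_box_query(terms, operator="OR", wildcards=False):
    """Sketch of how a 'one box' search might expand user terms.

    Mimics the reported DSS defaults: terms are ORed together and
    wildcards are off out of the box, with both being configurable.
    """
    if wildcards:
        terms = [t + "*" for t in terms]
    return f" {operator} ".join(terms)

print(one_box_query(["contract", "2010"]))                       # contract OR 2010
print(one_box_query(["contract"], operator="AND", wildcards=True))  # contract*
```

The OR default trades precision for recall as the user types more terms, which is a deliberate relevance-ranking choice rather than a limitation.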

Recommendations for upgrade/migration

  • Commit to Migrate
  • No additional license costs – included in Content Server
  • Identify and Mitigate Risks
  • 6.5 SP2 or later supported
  • No change to DQL – Xquery available.
  • Points out that both xDb and Lucene are very mature projects
  • Plan and analyze your HA and DR requirements

Straight migration: build indices while FAST is running, then switch from FAST to DSS when indexing is done. Does not require multiple Content Servers.

Formal Benchmarks

  • Over 30 M documents spread over 6 nodes
  • Single node with 17 million documents (over 300 GB index size)
  • Performance: indexing 6 M documents with FAST took two weeks; 30 M with DSS also took two weeks, but with a lot of stops.
  • Around 42% faster ingest for a single node compared to FAST

The idea is to use XProc to do extra processing of the content as it comes into DSS.

Conclusion

This is a very welcome improvement for one of the few weak points in the Documentum platform. We were selected to be part of the beta program, so I would have loved to tell you now how great an improvement it really is. However, we were forced to focus on other things in our SOA project first. Hopefully I will come back in a few weeks or so and tell you how great the beta is. We have an external Enterprise Search solution powered by Apache Solr, and I often get the question whether DSS will make that unnecessary. For the near future I think it will not, and that is because the search experience is also about the GUI. We believe in multiple interfaces targeted at different business needs and roles, and our own Solr GUI has been configured to meet our needs from a browse-and-search perspective. From a Documentum perspective, the only client today that will leverage the faceted navigation is Centerstage, which is focused on asynchronous collaboration and is a key component in our thinking as well, but for different purposes. Also, even though DSS is based on two mature products (as I experienced at Lucene Eurocon this week), I think the capabilities to tweak and monitor the search experience, at least initially, will be much better in our external Solr than in the new DSS Admin Tool, although it seems like a great improvement from what the FAST solution offers today.

Another interesting development will be how the xDB inside DSS will relate to the "internal" XML Store in terms of integration. Initially they will be two servers, but maybe in the future you can start doing things with them together. Especially if next-generation Documentum replaces the RDBMS, as Victor Spivak mentioned as a way forward.

In the end, having a fast search experience in Documentum from now on is very important!

Further reading

Be sure to also read the good summaries from Technology Services Group and Blue Fish Development Group about their take on DSS.


More Presentation Support Tools but fewer (Powerpoint) slide shows

In a recent article called "We Have Met the Enemy and He Is PowerPoint", Elisabeth Bumiller describes a big outcry to stop using Powerpoint because it supposedly makes us worse at decision-making. I agree, and can just reiterate a quote from the top US intelligence official in Afghanistan, Maj Gen Michael Flynn, in the report "FIXING INTEL: A BLUEPRINT FOR MAKING INTELLIGENCE RELEVANT IN AFGHANISTAN":

“The format of intelligence products matters. Commanders who think PowerPoint storyboards and color-coded spreadsheets are adequate for describing the Afghan conflict and its complexities have some soul searching to do.”

These are quite hard words directed towards his commanders in ISAF and the US component in Afghanistan, but I think he is right. However, the underlying issue is a desire to simplify things which should not be simplified, combined with a lack of vision when it comes to tool support for higher levels of military command. Basically, the tools supposed to support that kind of planning are either general-purpose tools like Microsoft Office or highly specialised military applications which exist in their own stove-pipes.

Oversimplifications
With Powerpoint comes a method, and that method mainly consists of boiling information down to single bullets. Perfect for fine-tuned marketing messages that want to leave just a few critical words or terms in the heads of the recipients. Not that good for complex reasoning around complex issues like modern conflicts. Powerpoint sets out to convey a message, when we instead should focus on improving our understanding of the situation.

Static representations
Most Powerpoint presentations are very static in nature. They usually represent a manually crafted snapshot of a given situation, which means that they can become outdated very quickly. As time goes on there are more and more static presentations that should be regularly updated but usually never are. Either they disappear on the file share, if the organisation lacks an Enterprise Content Management system, or there is no process for monitoring which presentations need to be updated, usually because all the traceability from when they were created is lost. Some companies have implemented dynamic areas in their presentations, where for instance weekly sales figures are updated when the presentation opens, but that is far from keeping track of where the origins of each bullet, diagram and image are.

Labor-intensive work
As described in the article, there are quite a few junior officers who spend time collating information and transforming it into presentations. To start with, there is much to be done to support this kind of "research work", where users are navigating and searching for relevant pieces of information. However, after the information has been collated, the next part of the work starts, which is to transform it into presentations using a template of some kind. Decision-makers usually have an opinion of how they want their presentations set up, so that they recognise the structure of the information from time to time. Add to that the fact that most organisations have a graphical profile to adhere to, which suggests common styling and formatting of the content. To me, all this really calls for a more semi-automated way of compiling this information. I am not saying that all content can be templated, far from it, but where it is possible it would save lots of time. Hopefully time that could be spent thinking instead of searching and formatting in Powerpoint.

Lack of interactivity
Another problem with these static representations is that since they usually take hours to compile, the flexibility in actual briefing situations is usually low. If the decision-maker suddenly asks to filter the information from another perspective, in say a graph, the unfortunate answer will be: "We will get back to you in half an hour or so". Not exactly the best conditions to inspire reflections that put complex problems in a new light. Spotfire has even written a paper around this, called "Minority Reports – How a new form of data visualization promises to do away with the meetings we all know and loathe". The ability to introduce dynamic, interactive data can bring us a new environment for meetings, especially if we also have access to large multi-touch walls that invite more than one person to easily manipulate and interact with the data.

Format matters
The General is right: format matters. There is a need for several different formats of the same information. Maj Gen Flynn calls for more products based on writing, which allows people to follow more complex reasoning. That tackles the simplification aspect of the problem. However, there is still a need to do things together in a room, and handing out written reports in 12-point Times New Roman is not the answer. In fact, we really need a revolution in terms of visualisation of all that information we have decided to store digitally. Especially since we are increasingly able to provide structure to unstructured information with metadata, and also able to collect data in XML-based data structures. We really need more presentation and visualisation support to be able to work productively with our information. However, we need less Powerpoint, because it is a very time-consuming way to do things which can be done much better with another set of tools. Multi-channel publishing is an established concept in marketing, meaning that the same content can be repurposed for print, web, mobile phones and large digital signage screens. We need to think in a similar way about what we use Powerpoint for today. There are even complete toolsets, such as EMC Document Sciences, which, surprise, are based on templates in order to do customised market communications where static content meets dynamic content from databases – in this case built around common design tools such as Adobe InDesign.

The Space Shuttle Columbia experience
One tragic example where the use of Powerpoint was a contributing factor was the loss of Space Shuttle Columbia. The Columbia Accident Investigation Board (CAIB) took the help of Professor Edward Tufte of Yale University to analyse the communication failure that in the end kept NASA from being aware of the seriousness of the foam strike. The board makes the following finding, which is very much in line with General Flynn's observations:

At many points during its investigation, the Board was surprised to receive similar presentation slides from NASA officials in place of technical reports. The Board views the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.

Tufte continues to make the argument that the low resolution of the Powerpoint slides used forces technical terms to be abbreviated, thus adding ambiguity, and that the usual large font size in headlines also forces shortening. He also notes that the typography and hierarchies provided by the bullet organisation added confusion, and that in the case of NASA some more advanced typographic features to handle math and other technical formatting are needed.

During the Return to Flight work later on this was further emphasized with the following statement:

“Several members of the Task Group noted, as had CAIB before them, that many of the engineering packages brought before formal control boards were documented only in PowerPoint presentations,”

Unfortunately, this is something I can relate to in my line of business. The main form of documentation is slide shows being emailed around. Since you know that they will be emailed around without you being there to talk people through them, I believe many add a lot of extra text, which turns them into some kind of in-between creatures: neither slide shows nor reports. At least these added words hopefully reduce ambiguity to some degree. I have now started to record my presentations with my own voice to help mitigate this.

The physical resolution is usually too low
To further add to the Columbia findings, I have serious issues with how briefing rooms are usually set up today. They usually have only one projector, with a resolution between 1024×768 and 1280×1024. Many laptops today have widescreen displays, which when used in "clone mode" make the image on a 4:3-format projector look really skewed. When projectors do handle widescreen formats, especially at higher resolutions, they are never used that way because:

  • Users are given computers with sub-par graphics cards that really don't handle full HD (1920×1080) resolution.
  • Users don't know anything else but to "clone" their screen. What you see on the laptop is what you see on the projector, thus in essence limiting the resolution of the projector to whatever the laptop handles. Again, because users have been given cheap computers.
  • The resolution has to be turned down from the highest one "because everything became too small to see". The reason for this is that the physical screen size is too small, which makes the projector sit too close and the actual pixels too small to see from most of the room.

Combine that with Powerpoint templates with big font sizes, and we have a situation in which not a lot of information can be displayed for us, which I also think adds to the oversimplification problem.

Why the Afghan “Spaghetti image” is actually rather good

The NYT article contains an image from the Afghanistan conflict with hundreds of nodes connected by arrows in different colors, and this is given as an example of the problems of using Powerpoint. To start with, I am not even sure that the image was made in Powerpoint, at least not from the beginning. I think a likely candidate instead is Consideo, which is a MODELING tool, not a PRESENTATION tool. The problem with that image is that when it enters the Powerpoint world it is static, with no connections to underlying data. Imagine instead that the image were a dynamic and interactive visualisation of objects, with relationship objects powering the lines. Metadata allows for filtering based on object and relationship attributes. Suddenly that image is just one of almost endless perspectives on the conflict. Imagine if all these nodes were also connected to underlying data such as reports and written analysis. Then it becomes easier even for an outsider to start understanding the image. We also need to understand that some visualisations are not intended for the decision-maker. Sometimes, in order to understand them, you need to have been in the room most of the time, so that you understand how the discussions went. So this image is potentially rather good, because it does not contain oversimplified bullets but instead is something you probably could stare at for hours while reflecting. However, it MUST NOT be an image that is manually updated in Powerpoint – it has to be a generated visualisation on top of databases.
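To make the idea concrete, here is a toy sketch of treating such a diagram as filterable data rather than a drawing. The nodes, relationships and their attributes are entirely invented for illustration; the point is only that metadata filters turn one static picture into many perspectives.

```python
# Toy model of the "spaghetti" diagram: nodes and attributed
# relationships instead of a hand-drawn image.
nodes = {
    "tribal_leaders": {"domain": "social"},
    "opium_trade":    {"domain": "economic"},
    "security":       {"domain": "military"},
}
edges = [
    {"src": "opium_trade",    "dst": "security",       "influence": "negative"},
    {"src": "tribal_leaders", "dst": "security",       "influence": "positive"},
    {"src": "opium_trade",    "dst": "tribal_leaders", "influence": "negative"},
]

def perspective(edges, nodes, domain=None, influence=None):
    """Return only the relationships matching the requested metadata filters."""
    out = []
    for e in edges:
        if influence and e["influence"] != influence:
            continue
        if domain and nodes[e["src"]]["domain"] != domain \
                  and nodes[e["dst"]]["domain"] != domain:
            continue
        out.append(e)
    return out

# Two of almost endless perspectives on the same underlying model:
print(perspective(edges, nodes, influence="negative"))
print(perspective(edges, nodes, domain="social"))
```

Each filtered view would then be rendered as its own visualisation, with every node linking back to the underlying reports and analysis.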

Still valid for marketing
The almighty master of presentations, Steve Jobs, who actually uses Apple Keynote instead of Powerpoint, will most likely continue using that format. He delivers a very precise marketing message with slides that do not contain very much text at all. The rest of us, who are not selling iPads, need to start figuring out a smarter way to do business. Newer versions of ever more complex MS Powerpoint are simply not the answer. It is so general-purpose that it doesn't fit anyone any longer, at least if you care about your own time and data quality. It helps to some degree that both Keynote and Powerpoint use XML today; that means the technical ability to use them as just a front-end is there. The real issue has to do with information architecture and usage.

Conclusion
Oh, so how to do this, then? Use Enterprise Content Management systems to manage your content, and move to a concept where content is handled in XML so it can be reused and repurposed while preserving traceability. Have a look at my other blog post around "The Information Continuum" to get an idea of how. Since we do store all of our information digitally, there is a need for much more in terms of visualisation and presentation support tools – not less. However, we need to find a way to present lines of reasoning with a capability to drill down, to utilise the traceability aspect. Maybe presentations will to some degree be more in the form of renditions with links back to text, data, graphs, images or whatever. We need to accept that in many cases it isn't realistic to try to boil everything down to a summary, and instead be able to explore the data ourselves. Now, let us set up our mindset, software and meeting rooms to do just that!

Interesting thoughts around the Information Continuum

In a blog post called "The Information Continuum and the Three Types of Subtly Semi-Structured Information", Mark Kellogg discusses what we really mean by unstructured, semi-structured and structured information. In my project we have constant discussions around this, and around the whole aspect of chunking content down into reusable pieces which in themselves need some structure in order to be just that – reusable. At first we were ecstatic over the metadata capabilities in our Documentum platform, because we had made our unstructured content semi-structured, which in itself is a huge improvement. However, it is important to see this as a continuum instead of three fixed positions.

One example is of course the PowerPoint/Keynote/Impress presentation, which actually is not one piece. Mark Kellogg reminded me of the discussions we have had around slides being bits of content in a composite document structure. It is easy to focus on the more traditional text-based editing that you see in technical publications and forget that presentations already have that aspect in them. To be honest, when we first got Documentum Digital Asset Manager (DAM) in 2006 and saw the Powerpoint assembly tool, we became very enthusiastic about content reuse. However, we found that feature a little bit too hard to use and it never really took off. What we see in Documentum MediaWorkSpace now is a very much revamped version of that, which I look forward to playing around with. I guess the whole thing comes back to the semi-structured aspect of those slides: in order to facilitate reuse, they somehow need to get additional metadata and tags. Otherwise the sheer number of slides available will easily be too much, if you can't filter them down based both on how they are categorised and on who created them.

Last year we decided to take another stab at composite document management, to be able to construct templates referring to both static and dynamic (query-based) pieces of content. We have made ourselves a rather cool dynamic document composition tool on top of our SOA platform with Documentum in it. It is based on DITA, and we use XMetaL Author Enterprise as the authoring tool to construct the templates; the service bus resolves the dynamic queries, and Documentum stores and transforms the large DITA file into a PDF. What we quickly saw was yet another aspect of semi-structured information, since we need a large team to be able to work in parallel to "connect" information into the finished product. Again, there is a need for context, in terms of metadata, around these pieces of reusable content that will end up in the finished product based on the template. Since we depend on using a lot of information coming in from outside the organisation, we can't strictly enforce the structure of the content. It will arrive in Word, PDF, text, HTML, PPT etc. So there is a need to transform content into XML, chunk it up into reusable pieces and tag it, so we can refer to it in the template or use queries to include content with a particular set of tags.
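As a rough illustration, such a template could be expressed as a DITA map in which static chunks are ordinary topic references and the dynamic pieces are placeholders that the service bus resolves into query results at assembly time. The `query:` reference convention below is our own invention for the sketch, not standard DITA:

```xml
<map title="Weekly report template">
  <!-- Static, reusable chunk stored in the repository -->
  <topicref href="chunks/background.dita"/>
  <!-- Dynamic part: resolved by the service bus before rendition.
       The query syntax is illustrative only. -->
  <topicref href="query:tag=logistics&amp;period=last-week"
            format="dita" scope="external"/>
</map>
```

After resolution, the assembled map is handed to Documentum for storage and transformation into the PDF rendition.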

This of course brings up the whole problem with the editing/authoring client. The whole concept of a document is being questioned, as it in itself is part of this continuum. Collaborative writing in the same document has been offered by CoWord, TextFlow and the recently open-sourced Google tool Etherpad, and will now be part of the next version of Microsoft Office. Google Wave is a bit of a disrupting force here, since it merges the concepts of instant messaging, asynchronous messaging (email) and collaborative document editing. Based on the Google Wave Federation protocol, it is also being implemented in enterprise applications such as Novell Pulse.

So why don’t just use a wiki then? Well, the layout tools is nowhere as rich as what you will find in Word processors and presentation software and since we are dependent on being able to handle real documents in these common format it becomes a hassle to convert them into wiki format or even worse try to attach them to a wiki page. More importantly a wiki is asynchronous in nature and that is probably not that user friendly compared to live updates. The XML Vendors have also went into this market with tools like XMetaL Reviewer which leverages the XML infrastructure in a web-based tool that almost in real-time allow users to see changes made and review them collaboratively.

This leads us to the importance of the format we choose as the baseline for both collaborative writing and the chunk-based reusable content handling that we would like to leverage. Everybody I talk to is pleased with the new Office XML formats, but says in the next breath that the format is complex and a bit nasty. So do we choose OpenOffice, DITA or what? What we choose has some real impact on the tool end of our solutions, because you probably get the most out of a tool when it is handling its native format, or at least one it is certified to support. Since it is all XML, we can always transform back and forth using XSLT or XProc.

OK, so we have the toolset and some infrastructure in place for that. Now comes my desire not to stove-pipe this information in some closed system only used to store "collaborative content". Somehow we need to be able to "commit" those "snapshots" of XML content that to some degree constitute a document. Maybe we want to "lock it down" so we know what version of all of that has been sent externally, or just to know what we knew at a specific time – very important in military business. That means it must be integrated into our Enterprise Content Management infrastructure, where it in fact can move on the continuum towards being more unstructured, since it could even be stored as a single binary document file. Still, we need to be able to keep the traceability, so you know which versions of specific chunks were used and who connected them into the "document". Again, just choosing something like TextFlow or Etherpad will not provide that integration. MS Office will of course be integrated with Sharepoint, but I am afraid that implementation will not support all the capabilities in terms of traceability and visualisation that I think you need to make the solution complete. Also, XML content actually likes to live in XML databases such as Mark Logic Server and Documentum XML Store, so that integration is very much needed, more or less out of the box, in order to make it possible to craft a solution.

We will definitely look into Documentum XML Technologies more deeply to see if we can design an integrated solution on top of that. It looks promising, especially since an XProc pipeline for DITA is around the corner.


EMC World 2010: Chiming in with Word of Pie about the future of Documentum

We now have a written reaction to Mark Lewis' keynote held at EMC World 2010 in Boston. I too feel and have passion around Enterprise Content Management, and it is great that Laurence Hart spent so much time and effort talking to people to craft this post. Someone needs to say these things, even if they are not always easy to hear. So I will try not to repeat what he said, but rather provide my perspective, which comes from what I have learned about Information and Knowledge Management over the past years. ECM and Documentum are very critical components in moving that IKM vision from the Powerpoint stage into reality – in our case, an experimentation platform that lets us turn our ideas for improving the "business" of staff work in a large military HQ into something people can try, learn and be inspired from. Also, this turned out to be a long blog post, which calls for a summary on top:

The Executive Summary (or message to EMC IIG) of this blog post:

  • Good name change, but make sure you live up to your name.
  • A greater degree of agility is very much needed, but do not simplify the platform so much that implementing an ECM strategy becomes impossible.
  • Case Management is not the umbrella term; it is just one of many solutions on top of Documentum xCP.
  • The whole web has gone Social Media and Rich Media. The Enterprise is next. Develop what you have and stay relevant in the 2010s!
  • Be more precise when it comes to the term "collaboration". There is a whole spectrum to support here.
  • Be more bold and tell people that Documentum offers a unique architectural approach to information management – stop comparing clients.
  • Tell people that enabling Rich Media, Case Management, E 2.0 and (Team) Collaboration on one platform is both important and possible.
  • I am repeating myself here: you want to sell storage, right? Make sure Video Management is really good in Documentum!

The name change

Before I start, I just need to reflect on the name change from Content Management and Archiving to Information Intelligence Group (IIG). I agree with Pie… the name had to be changed to make it more relevant in 2010, and a focus on information (as in information management, which is more than storage ILM) is the right way to go. The intelligence part of it is of course a bit fun because of my own profession, but still it implies doing smart things with information, and that should include everything from building context with Enterprise 2.0 features to advanced content and information analytics. You have the repository to store all of that – now make sure you continue to invest in analytics engines to generate structure, and in visualisation toolkits to make use of all the metadata and audit trails. Maybe do something with TIBCO Spotfire.

Documentum xCP – lowering the threshold and creating a more agile platform

Great. Documentum needs to be easier to deploy, configure and monitor. That is needed to get new customers on board more easily and to let existing ones do smarter things with it in less time. However, it is easy to fall into the trap of simplifying things too much here. To me there is nothing simple about implementing Enterprise Content Management (ECM) as a concept and as a method in an organisation. One major problem with Sharepoint and other solutions is that they are way too easy to install, so people are actually fooled into skipping the THINKING part of implementing ECM and believe it is just "next-next-finish". All ECM systems need to be configured and adapted to fit the business needs of the organisation. Without that they will fail. xCP can offer a way to do that vital configuration (preceded by THINKING) a lot more easily, and also more often. We often stress how important it is to have the technical configuration follow any changes in Standard Operating Procedures (SOP) as closely as possible. If generals want to change the way they work and the software does not support it, they will move away from using the software. Agility is the key.

In our vision the data model needs to be much more agile. Value lists need to be updated often – sometimes based on ad hoc folksonomy tagging. Monitoring the use of metadata and tags will drive that. Attributes or even object types need to be updated more often. Content needs to be ingested quickly, with structure provided later on (think XML Store with new schemas here). xCP is therefore a welcome thing, but make sure it does not compromise the core of what makes Documentum unique today.

The whole Case Management thing

Probably the thing most of us reacted against in the Mark Lewis keynote was the notion that ECM-people have in reality just been doing Case Management all along. I recently spent some time reflecting on that in another blog post here called “Can BPM meet Enterprise 2.0 over Adaptive Case Management?“. There is clearly a continuum here between supporting very formal process flows and very ad-hoc Knowledge Worker-style work. The two ends are clearly different, and while they may well meet over Adaptive Case Management, to me it makes no sense to have that term cover the whole spectrum – even for EMC Marketing 🙂

I immediately saw that Public Sector investigative work is often used as an example of Case Management. Case Management as done by law enforcement agencies in particular is fundamentally different from work done by intelligence agencies, because in case-based police investigations there is usually a legal requirement NOT to share information between cases unless authorised by managers. This is of course not the case (!) for all Case Management applications, but from a cultural perspective it is important that Case Management work by the Police is not used as an example of information sharing. The underlying concept is actually at odds with any unified enterprise content management strategy where information should be shared. That is why workgroup-oriented tools such as i2 Analyst’s Workstation have become so popular there.

The point here is that it is important not to disable sharing at the architectural level, because again, part of what constitutes a good ECM-system is that content can be managed in a unified way. Don’t be fooled by requirements to the contrary – use the powerful security model to make it possible. Then Law Enforcement Agencies can use it as well. However, there must be more to ECM than Case Management – as Word of Pie suggests, it is just ONE of many solutions on top of the Documentum xCP platform. A platform which is agile enough to quickly build advanced ECM solutions on top of.

Collaboration vs Sharing and E2.0

So, Collaboration is used everywhere now but its real meaning actually varies a bit. First, there are two kinds of collaboration modes:

  • Synchronous (real-time)
  • Asynchronous (non-real time – “leave info and pick up later”)

Obviously neither Documentum nor Sharepoint is in the real-time part of the business. For that you will need Lotus Sametime, Office Communications Server, Adobe Connect Pro or similar products. However, Google Wave adds a bit of confusion here since it integrates instant messaging and collaborative document editing/writing.

However, I am a bit bothered by the casual labelling as a collaboration tool that anything like Sharepoint, and for that matter eRoom, is getting. To break this down further, I believe there is a directness factor in collaboration. Team collaboration has a lot of directness, where you collaborate on a given task with colleagues. That is not the same as many of the Social Media/Enterprise 2.0 features, which do not have a clear recipient of the thing you are sharing. And sharing is the key, since you are basically providing a piece of information in case anyone wants or needs it. That is fundamentally different from sending an email to project members or uploading the latest revision to the project’s space. Andrew McAfee has written about this and uses a bullseye of strong and weak ties to illustrate the effect.

My point is that it is important that tools for team collaboration can, from an information architecture standpoint, become part of the weaker, more indirect sharing concept. That is the vehicle for utilising the Enterprise 2.0 effect in a large enterprise. Otherwise we have just created another set of stove-pipes or bubbles of information restricted to team members. I am not saying that all information should be this transparent, but I will argue that, based on a “responsibility to provide” concept (see the US Intel Community Information Sharing Policy), restricting the sharing of information should be the exception – not the norm.

Sure, as Word of Pie points out in his article “CenterStage, the Latest ex-Collaboration Tool from EMC”, there are definitely things missing from the current Centerstage release compared to both Sharepoint and EMC’s old tool eRoom. However, as Andrew Goodale points out in the comments, I also think the comparison is a bit unfair, because both eRoom and at least previous versions of Sharepoint (which many are using) actually lack all these important social media features that serve to lower the threshold and increase participation by users. They also provide critical new context around information objects that was not available before in DAM, Webtop or Taskspace. Centerstage also provides a way to consume them in terms of activity streams, RSS-feeds and faceted search. Remember that Centerstage is the only way to surface those facets from Documentum Search Server today.

So, I am also a bit disappointed that things are missing from Centerstage that should be there, and I really want to stress the importance of putting resources into that development. Those features are critical for any serious implementation of an ECM-strategy, and the power of Documentum is that they all sit in the same repository architecture with a service layer to access them. Maybe partner with Socialcast to provide a best-practice implementation of a more extensive profile page and microblogging. Choose a partner for Instant Messaging in order to connect the real-time part of collaboration to the platform. Again, use your experience from records management and retention policies to make those real-time collaboration activities saved and managed in the repository.

Be bold enough to say you are a Sharepoint alternative – but for the right reasons

I’m not an IT-person; I came into this business with a vision to change the way a military HQ handles information, so I see Enterprise Content Management more as a concept than a technology platform. However, when I have tried to execute our vision it has become very clear that there is a difference between technology vendors, and I like to think that difference comes from the internal culture, experience and vision of the company. It is the “why” behind why the platform looks the way it does and has the features it has. So as long as you are not building everything from scratch yourself, it actually matters a lot which company you choose to deliver the platform that makes your ECM vision happen. That means there IS a difference between Documentum and Sharepoint in the way the platform works, and we need to be able to talk about that. However, what I see now is that most people focus on the client side and embrace Sharepoint as a popular collaboration tool. Note that I say tool – not platform. All of that focuses on the client side, where the simplified requirement is basically a need for a digital space to share some documents in. However, the differentiator is not whether Centerstage or Sharepoint meets that requirement – both do. The differentiator is whether you have a conceptual vision of how to manage the sum of all information an organization has, and to what degree those concepts can be implemented in technology. That is where the Documentum platform is different from other vendors and why it is different from Sharepoint. Sharepoint is sometimes a little bit too easy to get started with, which unfortunately means there is no ECM-strategy behind the implementation, and when the organisation has thousands of Sharepoint sites (silos) after a year or so, that is when the choice of platform really starts to matter.

This week at EMC World has been a great one as usual, and there is no shortage of brilliant technical skills and feature development in the platform. What I guess bothers me and some other passionate ECM/Documentum-people is the message coming out of the executive level at IIG. In the end, that is where the strategic resource decisions are made and where the marketing message is constructed. I think there is now a lot more to do at the vision and marketing level than actually needs to be done on the platform itself. The hard part seems to be to be proud of what the platform is today, realize its potential to remain the most capable and advanced on the market, and use that to stay relevant in the many applications of ECM – not just Case Management.

Rich Media – A lot of content to manage and storage to sell

One of the strong points of Documentum is that it can manage ALL kinds of content in a good way, and that of course includes rich media assets such as photos, videos and audio files. Don’t look upon this as some kind of specialised market only needed by traditionally “creative” industries. This is something everybody needs now. All companies (and military units for that matter) have an abundance of digital still and video cameras, producing a massive amount of content that needs to be managed just like all the rest. There is a need for platform technologies that actually “understand” that content and can extract metadata from it, so that it can be navigated and found easily. It is also important to assist users in repurposing this content so it can be displayed without consuming all the bandwidth and easily be included in presentations and other documents. This is also very relevant from a training and learning perspective, where screencams and recorded presentations have so much potential. It does not have to be a full Learning Management System, but at least an easy way to provide it. Maybe have a look at your dear friend Cisco and their Show and Share application. Oh, it is marketed as a Social Video System – the connections to Centerstage (and not just Media Workspace) are a bit too obvious. Make sure you can provide Flickr and Youtube for the Enterprise real soon. People will love it. Again, on one very capable platform.

Media Workspace is a really cool application now. Even if it does not have all the features of DAM yet (either), it is such a sexy interface on top of Documentum. The new capabilities for handling presentations and video are just great. Be sure to look more at Apple iPhoto and learn how to leverage (and create) metadata to support management of content based on locations, people and events. A piece of cake on top of a Documentum repository. Right now it is a bit stuck with the Cabinet/Folder hierarchy as the main browsing interface.

Summary

I agree with Word of Pie that there is a lack of vision – an engaging one that we can all buy into and sell back home to our management. In my project we seem to have such a vision, and for us Documentum is a key part of it. I just wish EMC IIG shared it to a greater degree. From the responses back home in Sweden and here at EMC World, people seem to both want and like it (have a look at my EMC World presentation and see what you think). We can do seriously cool and fun stuff that will make management of content so much more efficient, which should be of critical importance for every organisation today. In the military, at least, one thing is for sure: we won’t get more people. We really have to work smarter, and that is what a vision like this provides a roadmap towards.

So be proud of what you do best, EMC IIG, and make sure to deliver INTEGRATED solutions on top of that. For those who care, that will mean a world of difference in the long run and will gather looks of envy from those who did not get it.

With Jamie Pappas in the Blogger’s Lounge at EMC World 2010

The Blogger’s Lounge is a great watering hole to stop by for a really good latte, but of course also to sit down in nice chairs and sofas, with power outlets on the floor, to blog and tweet about experiences at EMC World 2010 in Boston. Today I stopped by in the morning to have my photo taken with Jamie Pappas, who is Enterprise 2.0 & Social Media Strategist, Evangelist & Community Manager at EMC. Be sure to visit her blog and follow her on Twitter. My dear Canon EOS 5D managed to capture the nice lighting in the lounge, I think.

EMC World 2010: What is New and What’s Coming in Documentum xCP?

This session was presented by John McCormick on Tuesday morning.

The three pillars are:

  • Information Governance
  • xCP
  • Information Access

EMC wants to help customers get maximum leverage from their information and deliver the leading application composition platform for information management and case processing.

Intelligent Case Management:

Data, People, Content, Collaboration, Reporting, Policies, Events, Communication, Process

Case Management: he argues that it is a discipline of information management which is:

  • Non-deterministic
  • Driven by Human Decision-making
  • Driven by Content status

xCP Product Principles

  • Enable Intelligent business decisions (content and business process analytics)
  • Composition and configuration over coding
  • Enable performance through responsiveness and usability
  • Delight application builders and systems integrators
  • Beyond Documents: People, process and information in context
  • Leverage the private cloud
  • Build a future-proof product (move to declarative composition model)

The goal is to collapse all the existing products that make up xCP into fewer ones.

It is about reusable components, composition tools and xCelerators.

Reusable components:

  • Activities (templates)
  • Forms
  • UI

Tools:

  • Process Builder
  • Forms Builder
  • Taskspace for the UI

What is coming next…

There are different version numberings for xCP and the Documentum platform, and this is how they relate:

  • xCP 1.5 – D 6.6 (June 2010?)
  • xCP 1.6 – D 6.7
  • xCP 2.0 – D7 (next-gen Case Management)

Focus for Documentum 6.6

  • Real-world performance testing
  • Composer 6.6 (dependency checking, simplification)
  • Taskspace is getting better in 6.6
  • Improved manageability (workflow agents behave more gracefully)
  • Forms Enhancement (conditional required fields, better relationship management)
  • ATMOS Integration

Documentum 6.7

  • Final release of D6 family (Q1 2011)
  • Licence Management improvements
  • Improved Search (integration of DSS)
  • Public Sector Readiness (Section 508 improvements for Taskspace)
  • Composer improvements (xCP application support, no manual installs and version ingestions)

6.5 SP2/SP3 and 6.6 ready for Documentum Search Server (DSS)

Integration of cloud storage ATMOS D 6.6

As soon as DSS is out the whole platform is supported on a virtualized environment.

vSphere integration & Certification (D 6.7)

Documentum 7 (xCP 2.0) Sneak Peek – Increased Business Agility

  • Composition is simpler
  • Deployment is faster
  • Case workers are more productive

Improving the tooling

  • Single Composition Tool – xCP Composition Tool probably based on Eclipse
  • Modeling view
  • Compose a page/screen

Deployment is Faster

  • Leverage the private cloud
  • Everything is virtualized
  • Deploy directly from the xCP composition tool to a VMware instance in an already installed environment

User Experience

  • Better insights into cases
  • Better viewing experience
  • Integrated capture
  • There will be a new Web Services based UI
  • Easy to search and add content to a case
  • Easier inline viewing

EMC World 2010: Customizations of Centerstage

The session was presented by Andrew Goodale, who is the architect behind Centerstage. I am not a developer, but to me this session was very important because I believe that the level of customization possible greatly influences the potential for a successful Centerstage deployment. A lot of the power of enterprise systems lies in the possibility to adapt them to business needs.

He started by exploring the Services SDK and outlined that the architecture is set up with a Direct Web Remoting (DWR) library doing the magic between web browsers and the Web Services WSDL.

Using DFS Types for the Data Model where appropriate

  • ObjectIdentity
  • DataObject
  • PropertySet
  • TypeInfo

Simplification was needed to support broader language adoption, because it is hard to call the services from anything other than Java and .Net.

Trouble calling them from Flash

  • No use of abstract XML Types
  • Minimize the number of XML namespaces
  • Need to support invocation from untyped languages (e.g. JavaScript)

Interface Design

  • Restricted set of data types

The DFS error handling is fine for programmatic access, but showing a progress dialogue required new data structures. If you copy 200 files, some of them could raise an error, and it is important that these are handled gracefully – failing to import file number 53 should not break the whole import. Instead, give the user more extensive information about what happened and what went wrong, and carry on after the error.
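That “keep going after item 53 fails” idea – a per-item status instead of one exception aborting the batch – can be sketched roughly as below. Note that `OperationStatus` and `importOne` are stand-ins for illustration, not the actual Centerstage data structures:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class BatchImportSketch {
    // Stand-in for a per-item status record (the real SDK uses richer types).
    static class OperationStatus {
        final String item; final boolean ok; final String message;
        OperationStatus(String item, boolean ok, String message) {
            this.item = item; this.ok = ok; this.message = message;
        }
    }

    // Import every file, recording success or failure per item instead of
    // letting one exception abort the whole batch.
    static List<OperationStatus> importAll(List<String> files) {
        List<OperationStatus> statuses = new ArrayList<>();
        for (String file : files) {
            try {
                importOne(file);
                statuses.add(new OperationStatus(file, true, "imported"));
            } catch (Exception ex) {
                // Record the failure and continue with the next file.
                statuses.add(new OperationStatus(file, false, ex.getMessage()));
            }
        }
        return statuses;
    }

    // Placeholder for the real repository call.
    static void importOne(String file) throws Exception {
        if (file.contains("bad")) throw new Exception("unsupported format");
    }

    public static void main(String[] args) {
        List<OperationStatus> result = importAll(Arrays.asList("a.doc", "bad.tmp", "c.doc"));
        long failed = result.stream().filter(s -> !s.ok).count();
        System.out.println(result.size() + " processed, " + failed + " failed");
        // → 3 processed, 1 failed
    }
}
```

The status list is exactly what a progress dialogue needs: it can show which items succeeded and present the error messages for the ones that did not.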

Foundation Services

  • Create blogs, wikis,
  • Manage spaces
  • Templates

Application Services

Overview

  • Provide the “guts” of Centerstage
  • Capture application logic that is UI-agnostic

Basic Content Services (Create, Checkout, Checkin, Copy, Move, Delete and the Properties dialog)

  • Icon
  • Lists (Grid data sources – a declarative mechanism for creating queries, handling sorting, pagination and caching)
  • Permissions (simplified permission levels on top of standard dm_acl)
  • Search (knows about CS artefacts and integrates CIS entities with facets)
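The grid data source idea – describing a list query declaratively and letting one component handle sorting and pagination – can be illustrated with a tiny sketch. The names `Query` and `run` here are mine for illustration, not the Centerstage API:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class GridDataSourceSketch {
    // A declarative description of what the grid wants: sort order,
    // page size and page number. The component, not the caller, applies it.
    static class Query {
        final Comparator<String> sort; final int pageSize; final int page;
        Query(Comparator<String> sort, int pageSize, int page) {
            this.sort = sort; this.pageSize = pageSize; this.page = page;
        }
    }

    // One generic place that handles sorting and pagination for every list.
    static List<String> run(List<String> rows, Query q) {
        return rows.stream()
                .sorted(q.sort)
                .skip((long) q.page * q.pageSize)  // skip earlier pages
                .limit(q.pageSize)                 // one page of results
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> docs = Arrays.asList("report.doc", "agenda.doc", "minutes.doc", "budget.xls");
        // Page 0, two rows per page, sorted by name.
        System.out.println(run(docs, new Query(Comparator.naturalOrder(), 2, 0)));
        // → [agenda.doc, budget.xls]
    }
}
```

In the real SDK the same declaration would presumably also drive caching and be translated into a repository query rather than an in-memory stream.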

DFS Core Services

Possible to use them to modify CS artifacts

– for example ObjectsService.copy to copy a wiki page

Copy things, add things to a page etc

Our DOF modules will enforce data constraints, which for instance means that you can’t copy a page object without copying the page content.

Deploying the SDK

  • A zip file containing binaries and javadocs
  • Centerstage Services are added to core SDK
  • “remote” jars only – a deployed Centerstage server is needed

Setup

– Unzip the SDK

For Java your classpath should include

– DFS runtime, JAX-WS, JAXB

Java: centerstage-foundation-remote.jar and centerstage-application-remote.jar

.Net requires the 3.0 SDK for WCF (Visual Studio optional)

Samples in both Java and .Net

Creating a Space

Uses the Blank Template which ships with Centerstage

  • An identity qualification shows how to pick a specific template
  • Using a template guarantees that the space will be Centerstage-compatible
  • Space needs a home page

Returns an OperationStatusSet

  • The standard return type for creates, updates
  • Allows validation errors to be returned.

Creating a Wiki – child pages to the wiki can be added in the same way

An activity template can create a space and send an invitation email to everyone.

Java samples can be built with Ant 1.7 and Java 1.5. There is no IDE requirement – Eclipse will work fine.

Sample: Wiki to eBook sample

Goal:

  • Given the URL to a Centerstage wiki, create an ePub book
  • Each wiki page becomes a chapter in the book
  • Blogs and Discussions can also be converted
  • High fidelity (the rich text in CS is XHTML in the repository)
  • Page links are preserved

What it shows

  • PageService
  • Fetch wiki home page

They used a Java library that builds ePub books, hosted on Google Code and contributed by Adobe:

http://code.google.com/p/epub-tools/
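The container side of this sample is easy to picture: an ePub file is just a ZIP archive with a fixed layout, where the `mimetype` entry must come first and be stored uncompressed. Here is a minimal sketch using only `java.util.zip` – the chapter content is a placeholder for the XHTML the sample would fetch from the wiki pages, and a real book would also need the `OEBPS/content.opf` package document:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class EpubSketch {
    public static void main(String[] args) throws Exception {
        File out = new File("wiki-book.epub");
        try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(out))) {
            // "mimetype" must be the first entry and STORED (uncompressed),
            // which requires setting its size and CRC up front.
            byte[] mime = "application/epub+zip".getBytes(StandardCharsets.US_ASCII);
            ZipEntry mimeEntry = new ZipEntry("mimetype");
            mimeEntry.setMethod(ZipEntry.STORED);
            mimeEntry.setSize(mime.length);
            CRC32 crc = new CRC32();
            crc.update(mime);
            mimeEntry.setCrc(crc.getValue());
            zip.putNextEntry(mimeEntry);
            zip.write(mime);
            zip.closeEntry();

            // container.xml tells readers where the package document lives.
            zip.putNextEntry(new ZipEntry("META-INF/container.xml"));
            zip.write(("<?xml version=\"1.0\"?>\n"
                    + "<container version=\"1.0\" xmlns=\"urn:oasis:names:tc:opendocument:xmlns:container\">\n"
                    + "  <rootfiles>\n"
                    + "    <rootfile full-path=\"OEBPS/content.opf\" media-type=\"application/oebps-package+xml\"/>\n"
                    + "  </rootfiles>\n"
                    + "</container>\n").getBytes(StandardCharsets.UTF_8));
            zip.closeEntry();

            // One XHTML file per chapter – in the real sample this would be
            // the rich text fetched from each Centerstage wiki page.
            zip.putNextEntry(new ZipEntry("OEBPS/chapter1.xhtml"));
            zip.write(("<html xmlns=\"http://www.w3.org/1999/xhtml\">"
                    + "<body><h1>Wiki page 1</h1></body></html>").getBytes(StandardCharsets.UTF_8));
            zip.closeEntry();
        }
        System.out.println("wrote " + out.getName());
    }
}
```

The high-fidelity point from the session follows directly: since CS already stores rich text as XHTML, each wiki page can be dropped into a chapter file more or less as-is.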

Centerstage Mini – Demo calling services from JavaScript

Goal:

  • Build an HTML page that shows Centerstage data
  • Pure AJAX Technologies

What it shows

  • How to call services from JavaScript
  • How data is marshaled

The demo showed Recent Activity in an external native ExtJS grid.

List Services

The SDK for CS is not separately licensed…you basically need a CS license to use it.

eBook Sample is available on ECN

The SDK will be in GA in the July timeframe.

To me, it seems more powerful than I thought: it is now possible to programmatically set up a Centerstage Space and modify existing ones. That gives us an opportunity to create “templates” for common things the business needs to do using E 2.0 features. Instead of relying on users being aware of all the possibilities and executing them manually, we can now have quick buttons for them, or use workflows or external systems to trigger these actions.

EMC World 2010: There is an App for Documentum now (iPhone OS)

Flatiron Solutions delivers an iPhone OS App for Documentum

So, finally I got to see it. Documentum on iPhone OS, running on both the iPhone and the iPad. I have said it before and will say it again: from an information management perspective it makes so much sense to combine the intuitive interface of the iPhone OS with the power that lies in a Documentum repository. Make use of all the metadata around content objects, and exploring information becomes a breeze on a multi-touch device.

It is a company called Flatiron Solutions that brought this to market. You can download a version of it from the iTunes App Store. In order to connect to your own repository you will need a server component that sits between the iPhone OS App and the Documentum repository.

Download the App from iTunes

I had a chance to try it out on both the iPhone and the iPad in their booth at the Solutions Pavilion last night, and it was so much fun. I really want this in our Battle Lab. A very sexy interface for Documentum!