Month: May 2010

EMC World 2010: There is an App for Documentum now (iPhone OS)

Flatiron Solutions delivers an iPhone OS App for Documentum

So, finally I got to see it: Documentum on iPhone OS, running on both the iPhone and the iPad. I have said it before and I will say it again: from an information management perspective it makes so much sense to combine the intuitive interface of the iPhone OS with the power that lies in a Documentum repository. Make use of all the metadata around content objects, and exploring information becomes a breeze on a multi-touch device.

The company that brought this to market is Flatiron Solutions. You can download a version of it from the iTunes App Store. In order to connect to your own repository you will need a server component that sits between the iPhone OS app and the Documentum repository.

Download the App from iTunes

I had a chance to try it out on both the iPhone and the iPad at their booth in the Solutions Pavilion last night and it was great fun. I really want this in our Battle Lab. A very sexy interface for Documentum!

EMC World 2010: My presentation around using Documentum in a SOA-platform

Yesterday, Monday May 10 at 11 am, I gave a presentation at the Momentum 10 conference here at EMC World 2010 in Boston. The presentation focused on our experiences of building an experimentation platform for next-generation information and knowledge management (IKM) for a large operational-level military HQ. Contemporary conflicts are complex and dynamic in character and require a new approach to IKM in order to handle all those complexities, based on sound management of our digital information. At the core of our platform is EMC Documentum, integrated over an Enterprise Service Bus (ESB) from Oracle. The goal is to maintain access to and traceability of the information while removing stove-piped systems.

I have received quite a few positive reactions, both from customers and EMC people, after the session, which of course is just great. For instance, see these notes from the session. All the presentations will be available for download for all participants, but that will most likely take some time. So in the meantime you can download my presentation here instead:

Presentation at EMC World 2010 in Boston

Looking forward to comments and reflections. The file is quite big, but that is because my presentation is heavy on screenshots, and downsampling them to save file size would make it too hard to see what they are showing. Try zooming in to see details.

EMC World 2010: At Blogger’s Lounge

Sitting at the lounge now, relaxing after another cup of great latte and after what felt like a really good presentation earlier today at EMC World 2010. Responses so far have been very positive, which of course feels great. We think we have so many cool ideas, and it is great to be able to show them off to people with a deep interest in Enterprise Content Management.

Alexandra Blogger's lounge at EMC World 2010

Now it is soon time for the keynote by Mark Lewis, who seems to be in charge of the newly renamed Information Intelligence Group (formerly the Content Management & Archiving Division).

EMC World 2010: DFS Real World Examples, Best Practices

I had planned to go to a session around the Documentum Roadmap, but it was totally full so we had to go to other sessions. We split up and went to BPM Fundamentals and the Documentum Foundation Services (DFS) Best Practices session by Michael Mohen instead. I am not a developer, so this is a little from the 500 ft level.

He started by discussing the complementary nature of DFS and CMIS, depending on whether development is focused only on Documentum or not. CMIS is of course the new standard recently approved by OASIS. He argued that some applications, like Records Management, are still best done using DFS, but I guess that also has to do with how people want CMIS to develop. As I understand it, CMIS is not intended to contain the complete feature set of every ECM system, but rather focuses on the interoperability aspect of building ECM apps based on multiple repositories.

When it comes to content transfer using DFS, the key considerations are latency, file size, formats and caching needs. Some of the ways to do content transfer are:

  • HTTP
  • Base64
  • UCF
  • MTOM

Most use UCF or MTOM, but it is important to remember that BOCS/ACS requires UCF to work. The message is: don’t be afraid to mix HTTP, MTOM and the others. In our solution we do use a mix, but because we sometimes have rather large content sizes this is of course an issue. A sketch of how the transfer mode is selected follows below.
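
For reference, here is a minimal sketch of how the transfer mode is selected in the DFS Java Productivity Layer. This is written from memory against the 6.x SDK, so the package names, enum values, repository name and service URL are assumptions to verify against your version:

    import com.emc.documentum.fs.datamodel.core.content.ContentTransferMode;
    import com.emc.documentum.fs.datamodel.core.context.RepositoryIdentity;
    import com.emc.documentum.fs.datamodel.core.profiles.ContentTransferProfile;
    import com.emc.documentum.fs.rt.context.ContextFactory;
    import com.emc.documentum.fs.rt.context.IServiceContext;
    import com.emc.documentum.fs.rt.context.ServiceFactory;
    import com.emc.documentum.fs.services.core.client.IObjectService;

    // Build a service context with credentials for the repository.
    IServiceContext context = ContextFactory.getInstance().newContext();
    RepositoryIdentity identity = new RepositoryIdentity();
    identity.setRepositoryName("myrepo");   // hypothetical repository name
    identity.setUserName("user");
    identity.setPassword("password");
    context.addIdentity(identity);

    // Ask for MTOM content transfer; UCF would be required for BOCS/ACS.
    ContentTransferProfile transferProfile = new ContentTransferProfile();
    transferProfile.setTransferMode(ContentTransferMode.MTOM);
    context.setProfile(transferProfile);

    // Get a remote object service that uses this context.
    IObjectService objectService = ServiceFactory.getInstance().getRemoteService(
            IObjectService.class, context, "core", "http://dfs-host:8080/services");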

Notable changes in D6.5/D6.6

  • JBoss 4.2.0 is the new Java Method Server
  • Apache Tomcat support
  • Aspect support
  • LwSO (lightweight sysobject) support
  • Native 64-bit support and UCF improvements
  • Kerberos support is coming in D6.6

Java provides both remote and local calls; .NET provides only remote calls.
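
In the Java Productivity Layer the difference is just which factory method you call. A sketch, assuming a service context set up as in the example above (method signatures from memory, so verify against your SDK version):

    // Local, in-process call (Java only; DFS runs in the same JVM).
    IObjectService localService = ServiceFactory.getInstance()
            .getLocalService(IObjectService.class, context);

    // Remote web service call (the only option from .NET).
    IObjectService remoteService = ServiceFactory.getInstance()
            .getRemoteService(IObjectService.class, context, "core",
                    "http://dfs-host:8080/services");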

There are some applications that customers may not be aware of, such as DFS Utilities, developed by John Sweeney of EMC, and DFSX (Extension):

  • Provides utility classes
  • Based on the DFS Object Model
  • Java-based (1.5 or greater)
  • Only EAR files today

The Test Harness is a JMeter extension with a custom JMeter Sampler built to invoke DFS using the Java Productivity Layer (see the sketch after the list below).

Response times were collected for:

  • Create object
  • Get object
  • Checkout object
  • Checkin object
  • Delete object
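
A minimal sketch of what such a custom sampler could look like. The JMeter plumbing is the standard AbstractJavaSamplerClient API; the DFS call itself is reduced to a hypothetical helper method:

    import org.apache.jmeter.protocol.java.sampler.AbstractJavaSamplerClient;
    import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
    import org.apache.jmeter.samplers.SampleResult;

    // Custom JMeter sampler that measures one DFS operation per iteration.
    public class DfsCreateObjectSampler extends AbstractJavaSamplerClient {

        @Override
        public SampleResult runTest(JavaSamplerContext ctx) {
            SampleResult result = new SampleResult();
            result.setSampleLabel("DFS CreateObject");
            result.sampleStart();            // start the response timer
            try {
                createObjectViaDfs();        // hypothetical helper wrapping
                                             // the Java Productivity Layer call
                result.sampleEnd();
                result.setSuccessful(true);
            } catch (Exception e) {
                result.sampleEnd();
                result.setSuccessful(false);
                result.setResponseMessage(e.getMessage());
            }
            return result;
        }

        private void createObjectViaDfs() {
            // Call IObjectService.create(...) or similar here (omitted).
        }
    }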

Over a WAN, DFS sped things up compared to DFC, especially when you have 300-400 ms ping times. Use DFS because it is stateless; this is relevant when using satellite links and the like.

A Sizing Calculator will soon be available for DFS. It is an Excel spreadsheet based on WSDL and SOAP, so if you are using other designs the results may of course vary.

In a speed test between UCF and MTOM, upload speeds for content under 50 MB were similar; UCF was, however, slightly faster. The cool part of UCF is that it is asynchronous, which for instance means that you can show one page of a document and continue loading the rest of it.

When it comes to ESB implementations, the message was that the majority of implementations are point-to-point for client apps. However, some add SAML for extra security in their ESB implementation, which affects speed a bit.

It seems that DFS is used a lot in .NET environments and together with SharePoint.

MOSS and DFS Examples

  • .Net 3.3
  • SDF and xCP
  • A web part with an inbox rendered and an XForm inside SharePoint

Another example is the use of DFS with Windows Explorer, where some want a custom integration for the Windows desktop that essentially provides something like the old Documentum Desktop client. It is called DFS Explorer.

DFS Adobe Flex Example

There is a white paper available to provide a quick start. Read more about the session at the community page.

Adobe Flex does not talk directly to DFS but goes through Java. A RESTful interface would be much easier to use for Flex, as well as for most AJAX implementations.

Best Practices

  • Leverage the SDK (.NET/Java interop layers)
  • Use UCF for BOCS/ACS
  • If you expect your query to exceed 500 results, you must cache and page through them (see the paging sketch after this list)
  • DFS is better on a WAN with poor latency
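
Here is a minimal sketch of that paging pattern using the DFS query service. QueryExecution’s startingIndex and maxResultCount are as I remember them from the 6.x SDK, so treat the names as assumptions:

    import com.emc.documentum.fs.datamodel.core.DataPackage;
    import com.emc.documentum.fs.datamodel.core.query.PassthroughQuery;
    import com.emc.documentum.fs.datamodel.core.query.QueryExecution;
    import com.emc.documentum.fs.datamodel.core.query.QueryResult;
    import com.emc.documentum.fs.services.core.client.IQueryService;

    // queryService is obtained via ServiceFactory, as in the earlier sketches.
    IQueryService queryService = ServiceFactory.getInstance().getRemoteService(
            IQueryService.class, context, "core", "http://dfs-host:8080/services");

    PassthroughQuery query = new PassthroughQuery();
    query.setQueryString("select r_object_id, object_name from dm_document");
    query.addRepository("myrepo");   // hypothetical repository name

    int pageSize = 500;
    int startingIndex = 0;
    boolean morePages = true;

    while (morePages) {
        // Fetch one page of at most pageSize results.
        QueryExecution execution = new QueryExecution();
        execution.setStartingIndex(startingIndex);
        execution.setMaxResultCount(pageSize);

        QueryResult result = queryService.execute(query, execution, null);
        DataPackage page = result.getDataPackage();

        // ... process page.getDataObjects() here ...

        morePages = page.getDataObjects().size() == pageSize;
        startingIndex += pageSize;
    }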

A feature which is not well documented is that you can set requiresAuthentication=”false” on your annotated service implementation to allow browsing through repositories and basic information, such as the data dictionary, without logging in.
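
In code I understand this to mean something like the sketch below for a custom DFS service. The class, package and namespace are hypothetical; the requiresAuthentication attribute on the service annotation is the feature mentioned above:

    import com.emc.documentum.fs.rt.annotations.DfsPojoService;

    // Hypothetical custom DFS service that may be called without logging in,
    // e.g. to list repositories or expose basic data dictionary information.
    @DfsPojoService(targetNamespace = "http://example.com/services",
            requiresAuthentication = false)
    public class RepositoryBrowseService {

        // Keep the operations read-only, since callers are anonymous.
        public String[] listRepositoryNames() {
            // Implementation omitted.
            return new String[0];
        }
    }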

There is also a lesser-known Services Catalog Viewer, which is an optional install:

  • Explore the services available on the network
  • DSCR is the registry used for consumer discovery
  • UDDI v2 standard
  • Standard web app
  • Default port is 9010
  • Based on jUDDI, the open source UDDI implementation

You can also compare this with the notes from last conference by Word of Pie.

Next stop: EMC World 2010 in Boston

It is time again to enjoy the company of fellow ECM people at EMC’s conference, which is in Boston, MA this year. Although most of the conference is focused on their storage hardware, there is a good “sub-conference” called Momentum where all the Documentum people gather to share experiences. I have said this before, but this has so far been by far the best tech conference I have attended. Most sessions are very interesting, and EMC is a fairly open company, so you usually leave with a decent idea of where they are going for the next year. For us this is critical, because sometimes what is in the next release dictates what kind of experiments we can run in our Battle Lab at the Joint Concept Development and Experimentation Centre (JCDEC) back in Sweden.

I will try to blog and twitter as usual, and I am registered at the Blogger’s Lounge this year as well. Looking forward to some great vanilla latte there while trying to scribble down the latest from the sessions. At this wiki you can see who will be blogging from EMC World this year. Be sure to check it out, because social media is a great tool for getting not only facts but also comments from people in the business. I guess the ECN Online Documentum community will also be a good place to find news from the conference.

And finally, I will be speaking about our experiences of integrating Documentum in a SOA architecture to support an operational-level military HQ. The talk will be at 11 am on Monday. Please stop by and say hi if you can!

Reflections on our project from an Enterprise Search perspective

In a blog post called “Knowledge management: retrieve, visualize and communicate!” on their Findability blog, the company Findwise reflects on the approach to Knowledge Management taken by the Knowledge Support project at JCDEC. It was inspired by a recent article in Metro Teknik, one of Sweden’s biggest technology magazines.

Simon Sinek’s Golden Circle highlights the “why”…

Martin Sjöman at the Design Planner blog has an interesting post reflecting on why the motivations behind a company’s strategy are so important. Since it is no secret that I am a devoted Mac user, I always find it interesting to discuss the unique nature of Apple and their strategy for both design and marketing. I can really identify with Apple’s conviction in realizing a vision for computing, which makes me relate to WHY Apple is doing things the way they do. That is so much stronger than both HOW and WHAT they are doing. Something to think about in internal marketing efforts, where I hope that the passion we have will be noticed and bear fruit because people can relate to our conviction around Information and Knowledge Management.

Can BPM meet Enterprise 2.0 over Adaptive Case Management?

The project that I am running at JCDEC involves a lot of internal “marketing” targeted both at end users and at the people in charge of our IT projects. Lately I have found myself explaining the difference between workflow processes using Documentum Process Engine and TaskSpace on the one hand, and EMC’s new clients CenterStage Pro and Media WorkSpace on the other. My best argument so far has been that BPM/workflow is well suited for formal, repeatable processes in the HQ, while Enterprise 2.0 clients take care of ad-hoc and informal processes. Keith Swenson explains Taylorism-based Scientific Management as the foundation of Business Process Management in a good way in this blog post. He goes on to provide a bridge over to the ad-hoc work that nowadays is done by the so-called knowledge worker. Documentum CenterStage is a tool intended for the knowledge worker, which can also be seen as the Enterprise 2.0 way of working.

Keith then steers us over to a concept called Adaptive Case Management, which is supposed to address those more agile and dynamic ways of working, in contrast to the slow-changing, well-defined business processes that are deployed in traditional BPM systems. To my understanding this focuses a lot on the fact that the users themselves (instead of a process designer) need to be able to control templates, process steps and various other things in order to support more dynamic work such as criminal investigations or medical care.

Adaptive Case Management is also, as I understand it, a concept in the book “Mastering the Unpredictable”. The idea is to focus on the unpredictable nature of some work situations, but also to reflect a bit on the degree to which things are unpredictable or not. In this presentation by Jacob Ukelson, the argument is that the main bulk of work is unpredictable, which also means that process modeling using traditional BPM most likely won’t work.

Some people are of the opinion that there is no need to redefine BPM and that all these three-letter acronyms do not contribute much to the understanding of the problem and the solutions. I think I disagree, and the reason is that there are no silver-bullet products that cover everything you need. Most organisations start somewhere and roll out systems based on their most pressing needs. I believe these systems have some similarities in what they are good and bad at. Having bought an ECM, BI, CRM or ERP system usually says something about which business problems have been addressed. As SOA architectures mature and the ambition to reduce stove-pipes increases, the complementary character of these systems actually starts to matter. It also matters which of these vendors you choose, because the consolidation into a few larger vendors means choosing between different approaches.

To me all of this means an opportunity to leverage the strong points of different kinds of platforms. Complex, sure, but if you have the business requirements it is probably better than building everything from scratch. So I think that when companies quickly roll out Enterprise 2.0 platforms from smaller startup vendors, they soon discover that they risk creating yet another stove-pipe, in this case consisting of social information. Putting E 2.0 capabilities on top of an ECM platform then makes a lot of sense, in order to be able to integrate social features with existing enterprise information. The same most likely goes for BI, CRM etc.

When it comes to BPM, the potential lies in extending formal processes with social and informal aspects. However, it is likely that E 2.0-style capabilities will make new ways of working evolve and emerge. Sooner or later they need to be formalised, maybe into a project or a community of interest. Being able to leverage the capabilities of the BPM platform in terms of monitoring, and some kind of best practice in the form of templates, is not far-fetched. To some degree I believe that Adaptive Case Management solutions should sometimes be used instead of just a shared CenterStage Space, because you need these added formal aspects but still want to retain some flexibility. Knowledge worker-style work can then be done on top of a BPM infrastructure while at the same time utilising the ECM infrastructure for all content objects involved in the process. Having a system like Documentum that is good at content-centric human workflow processes makes a lot of sense.

So is Documentum xCP a way to address this middle ground between process modeling-based processes and the knowledge worker-style support in CenterStage? The mantra is “configure instead of code”, which implies a much more dynamic process. I have not played around with xCP yet – so far we have only deployed processes developed from scratch instead of trying out the case management templates that come with the download.

Not all companies will want to do this, but I think some will soon see the merits of integrating ECM, BI, E 2.0 and BPM/ACM solutions using SOA. The hard part, I believe, is to find software and business-method support for the agile and dynamic change management of these systems. The key is to be able to support various degrees of ad-hoc work, where at one end of the scale the user does everything herself and at the other end a more traditional developer codes modules. Being able to more dynamically change, model and remodel not only processes but also the data model for content types in Documentum is a vital capability for responding to business needs in a way that maintains trust in the system. This is not a task for IT but something done by some kind of Information and Knowledge Management (IKM) specialist, and this SOA-based integration of different sets of products can give them proper means of doing their work.

So employ E 2.0-style features in task management clients, and make sure that E 2.0-style clients include tasks from BPM/ACM in their activity streams or unified inboxes. Make sure that all of this is stored in an ECM platform with full auditing capabilities, off-loaded to a data warehouse so it can be dynamically analysed using interactive data visualisation, statistics and data mining. I hope we can show a solution for that in our lab soon.