At one of the meetings we had with EMC Managers I was asked to participate in their little video project called “A moment at Momentum” and of course I accepted. It is great to do stuff to strengthen the Momentum (Documentum) community at EMC World.
The Blogger’s lounge is a great watering hole where you can stop by for a really good latte, but also sit down in nice chairs and sofas with power outlets on the floor to blog and tweet about experiences at EMC World 2010 in Boston. Today I stopped by in the morning to have my photo taken with Jamie Pappas, who is Enterprise 2.0 & Social Media Strategist, Evangelist & Community Manager at EMC. Be sure to visit her blog and follow her on Twitter. My dear Canon EOS 5D camera managed to capture the nice lighting in the lounge, I think.
This session was presented by John McCormick on Tuesday morning.
The three pillars are:
- Information Governance
- Information Access
EMC wants to help customers get maximum leverage from their information and deliver the leading application composition platform for information management and case processing.
Intelligent Case Management:
Data, People, Content, Collaboration, Reporting, Policies, Events, Communication, Process
Case Management is argued to be a discipline of information management which is:
- Driven by human decision-making
- Driven by content status
xCP Product Principles
- Enable Intelligent business decisions (content and business process analytics)
- Composition and configuration over coding
- Enable performance through responsiveness and usability
- Delight application builders and systems integrators
- Beyond Documents: People, process and information in context
- Leverage the private cloud
- Build a future-proof product (move to declarative composition model)
The goal is to collapse all the existing products that make up xCP into fewer ones.
It is about reusable components, composition tools and xCelerators:
- Activities (templates)
- Process Builder
- Forms Builder
- Taskspace for the UI
What is coming next…
xCP and the Documentum platform use different version numbering, and this is how they relate:
- xCP 1.5 – D 6.6 (June 2010?)
- xCP 1.6 – D 6.7
- xCP 2.0 – D7 (next-gen Case Management)
Focus for Documentum 6.6
- Real-world performance testing
- Composer 6.6 (dependency checking, simplification)
- Taskspace is getting better in 6.6
- Improved manageability (workflow agents behave more gracefully)
- Forms Enhancement (conditional required fields, better relationship management)
- ATMOS Integration
- Final release of D6 family (Q1 2011)
- Licence Management improvements
- Improved Search ( integration of DSS)
- Public Sector Readiness (Section 508 improvements for Taskspace)
- Composer Improvements (xCP application support – no manual installs and version ingestions)
6.5 SP2/SP3 and 6.6 ready for Documentum Search Server (DSS)
Integration of ATMOS cloud storage in D 6.6
As soon as DSS is out, the whole platform is supported in a virtualized environment.
vSphere integration & Certification (D 6.7)
Documentum 7 (xCP 2.0) Sneak Peek – Increased Business Agility
- Composition is simpler
- Deployment is faster
- Case workers are more productive
Improving the tooling
- Single Composition Tool – xCP Composition Tool probably based on Eclipse
- Modeling view
- Compose a page/screen
Deployment is Faster
- Leverage the private cloud
- Everything is virtualized
- Deploy to an already installed environment directly from xCP composition tool to a VMWare instance
- Better insights into cases
- Better viewing experience
- Integrated capture
- There will be a new Web Services based UI
- Easy to search and add content to a case
- Easier inline viewing
The session was presented by Andrew Goodale, who is the architect behind Centerstage. I am not a developer, but to me this session was very important because I believe that the level of customization possible greatly influences the potential for a successful Centerstage deployment. A lot of the power of enterprise systems lies in their ability to adapt to business needs.
He started by exploring the Services SDK and outlined that the architecture is set up with a Direct Web Remoting (DWR) library doing the magic between web browsers and the Web Services WSDL.
Using DFS Types for Data Model where appropriate
Simplification was needed to support broader language adoption because it is hard to call the services from anything other than Java and .Net. There was trouble calling them from Flash, for instance.
- No use of abstract XML Types
- Minimize the number of XML namespaces
- Restricted set of data types
The DFS error handling is fine for programmatic access, but to show a progress dialogue new data structures were needed. If you copy 200 files, some of them could trigger an error, and it is important that these are handled so that an error on, say, file number 53 does not abort the rest of the import. Instead, give the user more extensive information about what happened and what went wrong, without breaking the import after the error.
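The pattern described here – recording a per-item status instead of aborting the whole batch – can be sketched like this (a minimal Python illustration of the idea, not the actual DFS API; `copy_file` and the file names are hypothetical):

```python
def copy_batch(files, copy_file):
    """Copy each file, collecting per-item results instead of aborting on error."""
    statuses = []
    for name in files:
        try:
            copy_file(name)
            statuses.append((name, "ok", None))
        except OSError as exc:  # one failure must not stop the batch
            statuses.append((name, "failed", str(exc)))
    return statuses

def copy_file(name):
    """Stand-in for the real copy operation; fails for one file."""
    if name == "file53":
        raise OSError("permission denied")

results = copy_batch(["file52", "file53", "file54"], copy_file)
failed = [r for r in results if r[1] == "failed"]
# The batch completes, and the caller gets a full report to show the user.
```

The returned status list plays the same role as an OperationStatusSet: every item is accounted for, so a progress dialog can report exactly which files failed and why.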
- Create blogs and wikis
- Manage spaces
- Provide the “guts” of Centerstage
- Capture application logic that is UI-agnostic
Basic Content Services (Create, Checkout, Checkin, Copy, Move, Delete and the Properties dialog)
- Lists (Grid data sources – a declarative mechanism for creating queries, handling sorting, pagination and caching)
- Permissions (simplified permission levels mapped to standard dm_acl)
- Search (Knows about CS Artefacts and Integrates CIS entities with facets)
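A grid data source of the kind described – a query whose sorting, pagination and caching are handled for the client – might look roughly like this (a hand-rolled Python sketch of the concept, not Centerstage's actual mechanism):

```python
def grid_page(rows, sort_key, page, page_size, _cache={}):
    """Sort once, cache the sorted result, and slice out the requested page."""
    key = (id(rows), sort_key)
    if key not in _cache:  # avoid re-sorting on every page request
        _cache[key] = sorted(rows, key=lambda r: r[sort_key])
    start = page * page_size
    return _cache[key][start:start + page_size]

# Hypothetical repository rows; field names are invented for illustration.
docs = [{"name": "c"}, {"name": "a"}, {"name": "b"}]
first = grid_page(docs, "name", page=0, page_size=2)
second = grid_page(docs, "name", page=1, page_size=2)
```

The point of the declarative mechanism is that the caller only states *what* to show (source, sort key, page); the sorting and caching happen behind the scenes.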
DFS Core Services
Possible to use them to modify CS artifacts
– for example ObjectsService.copy to copy a wiki page
Copy things, add things to a page etc
Our DOF modules will enforce data constraints which for instance means that you can’t copy a page object without copying the page content
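The constraint enforcement mentioned above – a page cannot be copied without its page content – can be illustrated with a toy object store (pure illustration; the object model and `copy_object` helper are invented, not the real DOF API):

```python
def copy_object(store, obj_id):
    """Copy an object; a page is always copied together with its content object."""
    obj = store[obj_id]
    new_id = f"{obj_id}-copy"
    copy = dict(obj)
    if obj["type"] == "page":
        # Enforce the data constraint: the page content travels with the page.
        copy["content_id"] = copy_object(store, obj["content_id"])
    store[new_id] = copy
    return new_id

# Hypothetical repository state: a wiki page and its content object.
store = {
    "page1": {"type": "page", "content_id": "content1"},
    "content1": {"type": "content"},
}
new_page = copy_object(store, "page1")
```

In the real system this kind of rule lives in server-side modules, so every client (SDK, Centerstage UI, or a custom app) gets consistent behavior for free.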
Deploying the SDK
- A zip file containing binaries and javadocs
- Centerstage Services are added to core SDK
- “remote” jars only – a deployed Centerstage server is needed
– Unzip the SDK
For Java your classpath should include
– DFS runtime, JAX-WS, JAX-B
Java: centerstage-foundation-remote.jar and centerstage-application-remote.jar
.Net requires the 3.0 SDK for WCF (Visual Studio optional)
Samples in both Java and .Net
Creating a Space
Uses the Blank Template which ships with Centerstage
- Identify qualification shows how to pick a specific template
- Using a template guarantees that the space will be Centerstage-compatible
- Space needs a home page
Returns an OperationStatusSet
- The standard return type for creates, updates
- Allows validation errors to be returned.
Creating a Wiki – child pages to the wiki can be added in the same way
An activity template can create a space and send an invitation email to everyone.
Java samples can be built with Ant 1.7 and Java 1.5. There is no IDE requirement – Eclipse will work fine.
Sample: Wiki to eBook
- Given the URL to a Centerstage wiki, create an ePub book
- Each wiki page becomes a chapter in the book
- Blogs and Discussions can also be converted
- High-fidelity (the rich text in CS is XHTML in the repository)
- Page links are preserved
What it shows
- Fetch wiki home page
They used a Java library from Google Code that builds ePub books, contributed by Adobe.
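The page-to-chapter mapping the sample performs can be sketched in a few lines (an illustrative Python sketch; the data shapes are invented and no real ePub library is used):

```python
def wiki_to_book(wiki):
    """Map each wiki page, in order, to one chapter of a book structure."""
    chapters = []
    for number, page in enumerate(wiki["pages"], start=1):
        chapters.append({
            "number": number,
            "title": page["title"],
            "xhtml": page["body"],  # CS rich text is already XHTML, so it maps directly
        })
    return {"title": wiki["title"], "chapters": chapters}

# Hypothetical wiki fetched from the server.
wiki = {"title": "Project Wiki", "pages": [
    {"title": "Intro", "body": "<p>Hello</p>"},
    {"title": "Design", "body": "<p>Details</p>"},
]}
book = wiki_to_book(wiki)
```

Because the stored rich text is XHTML, the "conversion" is mostly a structural re-wrapping rather than a format translation, which is what makes it high-fidelity.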
- Build an HTML page that shows Centerstage data
- Pure AJAX Technologies
What it shows
- How data is marshaled
Demo showed the Recent Activity in an external native ExtJS Grid
The SDK for CS is not separately licensed… you basically need a CS license to use it.
The SDK will be in GA in the July timeframe.
To me this seems more powerful than I thought: it is now possible to programmatically set up a Centerstage Space and modify existing ones. That gives us an opportunity to create “templates” for common things the business needs to do using E 2.0 features. Instead of relying on users being aware of all the possibilities and executing them manually, we can now have quick buttons for that, or use workflows or external systems to trigger these actions.
So, finally I got to see it. Documentum on iPhone OS, running on both the iPhone and the iPad. I have said it before and will say it again: from an information management perspective it makes so much sense to combine the intuitive interface of the iPhone OS with the power that lies in a Documentum repository. Make use of all the metadata around content objects, and exploring information becomes a breeze on a multi-touch device.
It is the company called Flatiron Solutions that brought this to market. You can download a version of it from the iTunes App Store. In order to connect to your own repository you will need a server component that sits between the iPhone OS app and the Documentum repository.
I had a chance to try it out on both the iPhone and the iPad in their booth at the Solutions Pavilion last night and it was so much fun. I really want this in our Battle Lab. A very sexy interface for Documentum!
Yesterday, on Monday May 10 at 11 am, I gave a speech at the Momentum 10 conference here at EMC World 2010 in Boston. The presentation focused on our experiences of building an experimentation platform for next-generation information and knowledge management (IKM) for a large operational-level military HQ. Contemporary conflicts are complex and dynamic in character and require a new approach to IKM in order to handle all those complexities, based on sound management of our digital information. At the core of our platform is EMC Documentum, integrated over an Enterprise Service Bus (ESB) from Oracle. The goal is to maintain access to and traceability of the information while removing stove-piped systems.
I have had quite a few positive reactions from both customers and EMC people after the session, which of course is just great. For instance, see these notes from the session. All the presentations will be available for download for all participants, but that will most likely take some time. So in the meantime you can download my presentation here instead:
Looking forward to comments and reflections. The file is quite big, but that is because my presentation is heavy on screenshots, and downsampling them to save file size would make it too hard to see what they are showing. Try zooming in to see details.
Sitting at the lounge now, relaxing after another cup of great latte and what felt like a really good presentation earlier today at EMC World 2010. Responses so far have been very positive, and of course it feels great. We think we have so many cool ideas, and it is great to be able to show them off to people with a deep interest in Enterprise Content Management.
Now it is soon time for the keynote by Mark Lewis, who seems to be in charge of the newly renamed Information Intelligence Group (formerly the Content Management & Archiving Division).
I had planned to go to a session about the Documentum Roadmap but it was totally full, so we had to go to other sessions. We split up and went to BPM Fundamentals and the Documentum Foundation Services (DFS) Best Practices session by Michael Mohen instead. I am not a developer, so this is a little from the 500 ft level.
He started by discussing the complementary nature of DFS and CMIS, depending on how focused development is on Documentum alone. CMIS is of course the new standard recently approved by OASIS. He argued that some applications, like Records Management, are still best done using DFS, but I guess that also has to do with how people want CMIS to develop. As I understand it, CMIS is not intended to contain the COMPLETE set of features of all ECM systems but rather focuses on the interoperability aspect of building ECM apps based on multiple repositories.
When it comes to content transfer using DFS, the key considerations are latency, file size, formats and caching needs. Most use UCF or MTOM, but it is important to remember that BOCS/ACS requires UCF to work. The message is: don’t be afraid to mix HTTP, MTOM and others. In our solution we do use a mix, but because we sometimes have rather large content sizes this is of course an issue.
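A hypothetical chooser for the transfer modes discussed might look like this (the thresholds are invented for illustration and are not EMC guidance; only the BOCS/ACS-requires-UCF rule comes from the session):

```python
def pick_transfer_mode(size_mb, uses_bocs_or_acs, high_latency):
    """Pick a content transfer mode; thresholds here are illustrative only."""
    if uses_bocs_or_acs:
        return "UCF"      # BOCS/ACS requires UCF to work
    if size_mb < 1 and not high_latency:
        return "Base64"   # tiny payloads can ride inline in the message
    return "MTOM"         # larger files stream more efficiently

# Our scenario: large content over a high-latency link, no BOCS/ACS.
mode = pick_transfer_mode(size_mb=200, uses_bocs_or_acs=False, high_latency=True)
```

The useful idea is simply that the mode is a per-transfer decision driven by the stated considerations (size, latency, infrastructure), rather than one global setting.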
Notable changes in D6.5/D6.6
- JBoss 4.2.0 is the new method server
- Apache Tomcat support
- Aspect Support
- LWSO (lightweight sysobject) support
- Native 64-bit support and UCF Improvements
- Kerberos is coming in D6.6
Remote and local calls are available in Java – .Net only provides remote calls.
There are some applications that customers may not be aware of, such as DFS Utilities (developed by John Sweeney, EMC) and DFSX (Extensions):
- Provides utility classes
- Based on the DFS Object Model
- Java-based 1.5 or greater
- Only EAR-files today
The test harness is a JMeter extension with a custom JMeter Sampler built to invoke DFS using the Java Productivity Layer.
Response times were collected for:
- Get Object
- Checkout object
- Check in Object
- Delete Object
Over a WAN, DFS was faster than DFC, especially with 300–400 ms ping times – use DFS because it is stateless. This is relevant when using satellite links and such.
A sizing calculator for DFS will soon be available. It is an Excel spreadsheet based on WSDL and SOAP, so if we are using other designs, results may of course vary.
In a speed test between UCF and MTOM, upload speeds for files under 50 MB were similar; however, UCF was slightly faster. The cool part of UCF is that it is asynchronous, which for instance means that you can show one page of a document and continue loading the rest of it.
When it comes to ESB implementations, the message was that the majority of implementations are point-to-point for client apps. However, some have SAML for added security in their ESB implementation, which affects speed a bit.
It seems that DFS is used a lot in .Net environments and together with SharePoint.
MOSS and DFS Examples
DFS and xCP
A web part with an inbox and an XForm rendered inside SharePoint.
Another example is the use of DFS with Windows Explorer, where some customers want a custom integration for the Windows desktop, essentially providing something like the old Documentum Desktop client. It is called DFS Explorer.
DFS Adobe Flex Example
Adobe Flex does not talk directly to DFS but goes through Java. A RESTful interface would be much easier to use from Flex, as well as from most AJAX implementations.
- Leverage the SDK (.Net/Java interop layers)
- Use UCF for BOCS/ACS
- If you expect your query results to exceed 500 items, you must cache and cycle through results.
- DFS is better on WAN with poor latency.
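The advice about queries exceeding 500 results – cache and cycle through pages – can be sketched like this (generic paging logic in Python, not DFS-specific code; `run_query` is a stand-in for the real query call):

```python
def fetch_all(run_query, page_size=500):
    """Cycle through results page by page so no single query exceeds the cap."""
    results, offset = [], 0
    while True:
        page = run_query(offset, page_size)
        results.extend(page)
        if len(page) < page_size:  # short page means we reached the end
            break
        offset += page_size
    return results

# Stand-in data source simulating 1200 matching objects.
data = list(range(1200))
def run_query(offset, limit):
    return data[offset:offset + limit]

everything = fetch_all(run_query)
```

The same loop shape works whether the cap is enforced by the server or chosen by the client; only the `run_query` implementation changes.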
A feature which is not well documented is setting requiresAuthentication=”false” on your annotated services implementation to allow browsing repositories and basic info such as the data dictionary.
There is also a lesser-known Services Catalog Viewer, which is an optional install:
- Explore services available within the internet
- DSCR is a registry for consumer discovery.
- UDDI v2 standard
- Standard Web app
- Default port is 9010
- Based on Judy, an open source UDDI implementation
You can also compare this with the notes from last conference by Word of Pie.
It is time again to enjoy the company of fellow ECM people at EMC’s conference, which is in Boston, MA this year. Although most of the conference is focused on their storage hardware, there is a good “sub-conference” called Momentum where all the Documentum people gather to share experiences. I have said this before, but this has so far been by far the best tech conference I have attended. Most sessions are very interesting and EMC is a fairly open company, so you usually leave with a decent idea of where they are going for the next year. For us this is critical, because sometimes what is in the next release dictates what kind of experiments we can run in our Battle Lab at the Joint Concept Development and Experimentation Centre (JCDEC) back in Sweden.
I will try to blog and tweet as usual, and I am registered at the Blogger’s Lounge this year as well. Looking forward to some great vanilla latte there while trying to scribble down the latest from the sessions. At this wiki you can see who will be blogging from EMC World this year. Be sure to check it out, because social media is a great tool to get not only facts but also comments from people in the business. I guess the ECN Online Documentum community will also be a good place to find news from the conference.
And finally, I will be speaking about our experiences of integrating Documentum in a SOA architecture to support an operational-level military HQ. The speech will be at 11 am on Monday. Please stop by and say hi if you can!
The project that I am running at JCDEC involves a lot of internal “marketing” targeted at both end users and the people in charge of our IT projects. Lately I have found myself explaining the difference between workflow processes using Documentum Process Engine and Taskspace and what EMC’s new clients, Centerstage Pro and Media Workspace, provide. My best argument so far has been that BPM/workflow is well suited for formal, repeatable processes in the HQ, while Enterprise 2.0 clients take care of ad-hoc and informal processes. Keith Swenson explains the Taylorism-based Scientific Management concept as the foundation of Business Process Management in this blog post in a good way. He continues by providing a bridge over to the ad-hoc work that nowadays is done by what is called the Knowledge Worker. Documentum Centerstage is a tool intended for the Knowledge Worker, which can also be seen as the Enterprise 2.0 way of working.
However, Keith continues to steer us toward a concept called Adaptive Case Management, which is supposed to address those more agile and dynamic ways of working, in contrast to the slow-changing, well-defined business processes that are deployed in traditional BPM systems. To my understanding, this focuses a lot on the fact that the users themselves (instead of a process designer) need to be able to control templates, process steps and various other things in order to support more dynamic work such as criminal investigations or medical care.
Adaptive Case Management is also a concept (as I understand it) in the book called “Mastering the Unpredictable”. The idea is to focus on the unpredictable nature of some work situations, but also to reflect a bit on the degree to which things are unpredictable or not. In this presentation by Jacob Ukelson, the argument is that the main bulk of work is unpredictable, which also means that process modeling using traditional BPM most likely won’t work.
Some people argue that there is no need to redefine BPM and that all these three-letter acronyms do not contribute much to the understanding of the problem and the solutions. I think I disagree, and the reason is that there are no silver-bullet products that cover everything you need. Most organisations start somewhere and roll out systems based on their most pressing needs. I believe these systems have some similarities in what they are good and bad at. Having bought an ECM, BI, CRM or ERP system usually says something about which business problems have been addressed. As SOA architectures mature and the ambition to reduce stove-pipes increases, the complementary character of these systems actually matters. It also matters which of these vendors you choose, because the consolidation into a few larger vendors means choosing between different approaches.
To me all of this means an opportunity to leverage the strong points of different kinds of platforms. Complex, sure, but if you have the business requirements it is probably better than building it from scratch. So I think when companies quickly roll out Enterprise 2.0 platforms from smaller startup vendors, they soon discover that they risk creating yet another stove-pipe, in this case consisting of social information. Putting E 2.0 capabilities on top of an ECM platform then makes a lot of sense, in order to integrate social features with existing enterprise information. The same most likely goes for BI, CRM etc.
When it comes to BPM, the potential lies in extending formal processes with social and informal aspects. However, it is likely that E 2.0-style capabilities will make new ways of working evolve and emerge. Sooner or later these need to be formalised, maybe into a project or a community of interest. Being able to leverage the capabilities of the BPM platform, in terms of monitoring and some kind of best practice in the form of templates, is not far-fetched. To some degree I believe that Adaptive Case Management solutions should sometimes be used instead of just a shared Centerstage Space, because you need these added formal aspects but still want to retain some flexibility. Knowledge Worker-style work can then be done on top of a BPM infrastructure, while at the same time utilising the ECM infrastructure for all content objects involved in the process. Having a system like Documentum that is good at content-centric human workflow processes makes a lot of sense.
So is Documentum xCP a way to address this middle ground between process-modeling-based processes and Knowledge Worker-style support in CenterStage? The mantra is “configure instead of code”, which implies a much more dynamic process. I have not played around with xCP yet – we have so far only deployed processes developed from scratch instead of trying out the case management templates that come with the download.
Not all companies will want to do this, but I think some will soon see the merits of integrating ECM, BI, E 2.0 and BPM/ACM solutions using SOA. The hard part, I believe, is finding software and business-method support for the agile and dynamic change management of these systems. The key is to support various degrees of ad-hoc work, where at one end the user does everything herself and at the other a more traditional developer codes modules. Being able to dynamically change/model/remodel not only processes but also the data model for content types in Documentum is a vital capability for responding to business needs in a way that maintains trust in the system. This is not a task for IT but something done by some kind of Information and Knowledge Management (IKM) specialist. With this SOA-based integration of different sets of products, they can get proper means of doing their work.
So employ E 2.0-style features in task management clients, and make sure that E 2.0-style clients include tasks from BPM/ACM in their activity streams or unified inboxes. Make sure that all of this is stored in an ECM platform with full auditing capabilities, which can then be off-loaded to a data warehouse so it can be dynamically analysed using interactive data visualisation, statistics and data mining. I hope we can show a solution for that in our lab soon.