
EMC World 2010: Next-generation Search: Documentum Search Services

Presented by Aamir Farooq

Verity: largest index 1 M docs

FAST: largest index 200 M docs

Today's search requirements are challenging and all involve tradeoffs. Instead of trying to plug in third-party search engines, EMC chose to build an integrated search engine for content and case management.

Flexible scalability is a key selling point.

Tens to Hundreds of Millions of objects per host

Indexing streams can be routed to different collections.

Two instances can be up and running in less than 20 min!

Online backup and restore is possible with DSS, whereas FAST only supported offline backups.

FAST only supported active/active HA. DSS offers more options, including active/passive.

Native security: ACLs and groups are replicated to DSS.

All fulltext queries leverage native security

Efficient deep facet computation within DSS with security enforcement. Security in facets is vital.

Enables effective searches on large result sets (even when underprivileged users are not allowed to see most hits in the result set).

Without DSS, facets are computed over only the first 150 results pulled into client apps.

With DSS, facets are computed over roughly 100x more results.
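To make the facet-plus-security point concrete, here is a minimal, purely illustrative Java sketch (hypothetical types, not the DSS API): facet counts are computed only over hits the current user is actually allowed to read, and across the whole result set rather than just the first page of results.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Purely illustrative -- hypothetical Doc type, not the DSS API.
public class SecureFacetSketch {

    public static class Doc {
        final String format;          // e.g. "pdf", "msw12"
        final boolean readableByUser; // result of the ACL check for the current user
        public Doc(String format, boolean readableByUser) {
            this.format = format;
            this.readableByUser = readableByUser;
        }
    }

    // Counts a "format" facet over every readable hit in the full result set,
    // so an underprivileged user never sees counts for documents hidden from them.
    public static Map<String, Integer> facetByFormat(List<Doc> allHits) {
        Map<String, Integer> counts = new HashMap<String, Integer>();
        for (Doc d : allHits) {
            if (d.readableByUser) {
                Integer current = counts.get(d.format);
                counts.put(d.format, current == null ? 1 : current + 1);
            }
        }
        return counts;
    }
}
```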

Metrics for all queries are saved and can be used for analytics; reports can be run in the admin UI.

DSS Feature Comparison

DSS supports 150 formats (500 versions)

The only thing lacking now is thesaurus support (coming in v1.2).

Native 64-bit support for Linux and Windows (core DSS is 64-bit).

Virtualisation support on VMware.

Fulltext Roadmap

DSS 1.0 GA is compatible with D 6.5 SP2 or later, with integration with CS 1.1 for facets, native security and XQuery.

Documentum FAST is in maintenance mode.

D6.5 SP3, 6.6 and 6.7 will be the last releases that support FAST.

From 2011 DSS will be the search solution for Documentum.

Index Agent Improvements

The improved index agent guides you through reindexing or simply processing new indexing events.

Failure thresholds: configure how many error messages you allow.

One Box Search: as you add more terms, they are combined with OR rather than AND.

Wildcards are not allowed OOTB, but this can be changed.

Recommendations for upgrade/migration

  • Commit to Migrate
  • No additional license costs – included in Content Server
  • Identify and Mitigate Risks
  • 6.5 SP2 or later supported
  • No change to DQL – XQuery is also available (a quick sketch follows this list)
  • Points out that both xDB and Lucene are very mature projects
  • Plan and analyze your HA and DR requirements
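Since DQL stays unchanged, a minimal DFC sketch like the one below should keep working as-is whether FAST or DSS serves the query. This is only an illustration: it assumes a valid IDfSession obtained elsewhere, and the query text is just an example.

```java
import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.common.DfException;

public class FulltextQuerySample {

    // Runs an unchanged DQL full-text query; which engine serves it (FAST or DSS)
    // is transparent to the client code.
    public static void printMatches(IDfSession session) throws DfException {
        IDfQuery query = new DfQuery();
        query.setDQL("SELECT r_object_id, object_name FROM dm_document "
                + "SEARCH DOCUMENT CONTAINS 'migration plan' "
                + "ENABLE (RETURN_TOP 100)");
        IDfCollection results = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (results.next()) {
                System.out.println(results.getString("r_object_id") + " : "
                        + results.getString("object_name"));
            }
        } finally {
            results.close();
        }
    }
}
```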

Straight migration. Build indices while FAST is running. Switch from FAST to DSS when indexing is done. Does not require multiple Content Servers.

Formal Benchmarks

  • Over 30 M documents spread over 6 nodes
  • Single node with 17 million documents (over 300 GB index size)
  • Performance: indexing 6 M documents with FAST took two weeks; 30 M with DSS also took two weeks, and that was with a lot of stops.
  • Around 42% faster ingest on a single node compared to FAST

The idea is to use XProc to do extra processing of the content as it comes into DSS.

Conclusion

This is a very welcome improvement for one of the few weak points in the Documentum platform. We were selected to be part of the beta program, so I would have loved to tell you how great an improvement it really is. However, we were forced to focus on other things in our SOA project first. Hopefully I will come back in a few weeks or so and tell you how great the beta is.

We have an external Enterprise Search solution powered by Apache Solr, and I often get the question whether DSS will make that unnecessary. For the near future I think it will not, because the search experience is also about the GUI. We believe in multiple interfaces targeted at different business needs and roles, and our own Solr GUI has been configured to meet our needs from a browse-and-search perspective. From a Documentum perspective, the only client today that will leverage the faceted navigation is Centerstage, which is focused on asynchronous collaboration and is a key component in our thinking as well, but for different purposes. Also, even though DSS is based on two mature products (as I experienced at Lucene Eurocon this week), I think the capabilities to tweak and monitor the search experience will, at least initially, be much better in our external Solr than in the new DSS admin tool, although it seems like a great improvement over what the FAST solution offers today.

Another interesting development will be how the xDB inside DSS relates to the “internal” XML Store in terms of integration. Initially they will be two separate servers, but maybe in the future you can start doing things with them together – especially if next-gen Documentum replaces the RDBMS, which Victor Spivak mentioned as a way forward.

In the end, having a fast search experience in Documentum from now on is so important!

Further reading

Be sure to also read the good summary from Technology Services Group and Blue Fish Development Group about their take on DSS.


More Presentation Support Tools but fewer (PowerPoint) slide shows

In a recent article called ”We Have Met the Enemy and He Is PowerPoint”, Elisabeth Bumiller describes a big outcry to stop using PowerPoint because it is supposed to make us more stupid in decision-making. I agree, and can just reiterate a quote from the top US intelligence official in Afghanistan, Maj Gen Michael Flynn, in the report “FIXING INTEL: A BLUEPRINT FOR MAKING INTELLIGENCE RELEVANT IN AFGHANISTAN”:

“The format of intelligence products matters. Commanders who think PowerPoint storyboards and color-coded spreadsheets are adequate for describing the Afghan conflict and its complexities have some soul searching to do.”

These are quite hard words directed towards his commanders in ISAF and the US component in Afghanistan, but I think he is right. However, the underlying issue is a desire to simplify things which should not be simplified, combined with a lack of vision when it comes to tool support for higher levels of military command. Basically, the tools supposed to support that kind of planning are either general-purpose tools like Microsoft Office or highly specialised military applications which exist in their own stove-pipes.

Oversimplifications
With PowerPoint comes a method, and that method mainly consists of boiling information down to single bullets. That is perfect for fine-tuned marketing messages that want to leave just a few critical words or terms in the heads of the recipients, but not that good for complex reasoning around complex issues like modern conflicts. PowerPoint sets out to convey a message, when we instead should focus on creating a situation focused on improving our understanding.

Static representations
Most PowerPoint presentations are very static in nature. They usually represent a manually crafted snapshot of a given situation, which means they can become outdated very quickly. As time goes on there are more and more static presentations that should be regularly updated but usually never are. Either they disappear on the file shares if the organisation lacks an Enterprise Content Management system, or there is no process for monitoring which presentations need to be updated – usually because all the traceability from when they were created is lost. Some companies have implemented dynamic areas in their presentations where, for instance, weekly sales figures are updated when the presentation opens, but that is far from keeping track of where the origins of each bullet, diagram and image are.

Labour-intensive work
As described in the article, quite a few junior officers spend time collating information and transforming it into presentations. To start with, there is much to be done to support this kind of ”research work” where users are navigating and searching for relevant pieces of information. However, after the information has been collated, the next part of the work starts, which is to transform it into presentations using a template of some kind. Decision-makers usually have an opinion of how they want their presentations set up, so that they recognise the structure of the information from time to time. Add to that the fact that most organisations have a graphical profile to adhere to, which prescribes a common styling and formatting of the content. To me all this really calls for a more semi-automated way of compiling this information. I am not saying that all content can be templated, far from it, but where it is possible it would save lots of time – hopefully time that could be spent thinking instead of searching and formatting in PowerPoint.

Lack of interactivity
Another problem with these static representations is that, since they take hours to compile, the flexibility in the actual briefing situation is usually low. If the decision-maker suddenly asks to filter the information from another perspective, in say a graph, the unfortunate answer will be: ”We will get back to you in half an hour or so”. Not exactly the best conditions to inspire reflections that put complex problems in a new light. Spotfire has even written a paper around this called ”Minority Reports – How a new form of data visualization promises to do away with the meetings we all know and loathe”. The ability to introduce dynamic, interactive data can bring us a new environment for meetings, especially if we also have access to large multi-touch walls that invite more than one person to easily manipulate and interact with the data.

Format matters
The General is right, format matters. There is a need for several different formats of the same information. Maj Gen Flynn calls for more products based on writing, which allows people to follow a more complex line of reasoning. That tackles the simplification aspect of the problem. However, there is still a need to do things together in a room, and handing out written reports in Times New Roman 12 points is not the answer. In fact we really need a revolution in terms of visualisation of all the information we have decided to store digitally, especially since we are increasingly able to provide structure to unstructured information with metadata, and also able to collect data with XML-based data structures. We really need more presentation and visualisation support to be able to work productively with our information. However, we need less PowerPoint, because it is a very time-consuming way to do things that can be done much better with another set of tools. Multi-channel publishing is an established concept in marketing, where the same content is repurposed for print, web, mobile phones and large digital signage screens. We need to think in a similar way about what we use PowerPoint for today. There are even complete toolsets, such as EMC Document Sciences, which, surprise, are based on templates in order to do customised market communications where static content meets dynamic content from databases – in this case built around common design tools such as Adobe InDesign.

The Space Shuttle Columbia experience
One tragic example of where the use of PowerPoint was a contributing factor was the loss of Space Shuttle Columbia. The Columbia Accident Investigation Board (CAIB) took the help of Professor Edward Tufte from Yale University to analyse the communication failure that in the end left NASA unaware of the seriousness of the foam strike. The board makes the following finding, which is all in line with General Flynn’s observations:

At many points during its investigation, the Board was surprised to receive similar presentation slides from NASA officials in place of technical reports. The Board views the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.

Tufte goes on to argue that the low resolution of the PowerPoint slides forces technical terms to be abbreviated, thus adding ambiguity, and that the usual large font size in headlines also forces shortening. He also notes that the typography and hierarchies provided by the bullet organisation added confusion, and that in the case of NASA more advanced typographic features to handle math and other technical formatting are needed.

During the Return to Flight work later on this was further emphasized with the following statement:

“Several members of the Task Group noted, as had CAIB before them, that many of the engineering packages brought before formal control boards were documented only in PowerPoint presentations,”

Unfortunately, this is something I can relate to in my line of business. The main form of documentation is slide shows being emailed around. Since you know they will be emailed around without you being there to talk through them, I believe many people add a lot of extra text, which turns the slides into some kind of in-between creatures: neither slide shows nor reports. At least these added words hopefully reduce ambiguity to some degree. I have now started to record my presentations with my own voice narration to help mitigate this.

The Physical Resolution is usually too low
To further add to the Columbia findings, I have serious issues with how briefing rooms are usually set up today. They usually have only one projector, with a resolution of 1024×768 or 1280×1024. Many laptops today have widescreen displays, which, when used in “clone mode”, makes the image on a 4:3 projector look really skewed. Even when projectors handle widescreen formats, especially at higher resolutions, those modes are never used because:

  • Users are given computers with under-powered graphics cards that can’t really handle full HD (1920×1080) resolution.
  • Users don’t know anything else but to “clone” their screen: what you see on the laptop is what you see on the projector, in essence limiting the resolution on the projector to whatever the laptop handles – again because users have been given cheap computers.
  • The resolution has to be turned down from the highest one “because everything became too small to see”. The reason for this is that the physical screen size is too small which makes the projector sit too close and the actual pixels too small to see from most of the room.

Combine that with PowerPoint templates with big font sizes and we have a situation where not a lot of information can be displayed, which I think also adds to the oversimplification problem.

Why the Afghan “Spaghetti image” is actually rather good

The NYT article contains an image from the Afghanistan conflict with hundreds of nodes connected by arrows in different colors, and this is given as an example of the problems of using PowerPoint. To start with, I am not even sure that the image was made in PowerPoint, at least not from the beginning. I think a more likely candidate is Consideo, which is a MODELING tool, not a PRESENTATION tool. The problem with that image is that when it enters the PowerPoint world it becomes static, with no connections to underlying data.

Imagine instead that the image was a dynamic and interactive visualization of objects, with relationship objects powering the lines. Metadata allows for filtering based on object and relationship attributes. Suddenly the image is just one of almost endless perspectives on the conflict. Imagine if all these nodes were also connected to underlying data such as reports and written analysis. Then it becomes easier even for an outsider to start understanding the image. We also need to understand that some visualizations are not intended for the decision-maker; sometimes, in order to understand them, you need to have been in the room most of the time so you understand how the discussions went. So this image is potentially rather good, because it does not contain oversimplified bullets but instead is something you probably could stare at for hours while reflecting. However, it MUST NOT be an image that is manually updated in PowerPoint – it has to be a generated visualisation on top of databases.

Still valid for marketing
The almighty master of presentations, Steve Jobs, who actually uses Apple Keynote instead of PowerPoint, will most likely continue using that format. He delivers a very precise marketing message with slides that do not contain very much text at all. The rest of us, who are not selling iPads, need to start figuring out a smarter way to do business. Newer versions of an ever more complex MS PowerPoint are simply not the answer. It is so general purpose that it doesn’t fit anyone any longer – at least if you care about your own time and data quality. It helps to some degree that both Keynote and PowerPoint use XML today; that means it is technically possible to use them as just a front-end. The real issue has to do with information architecture and usage.

Conclusion
Oh, so how to do this, then? Use Enterprise Content Management systems to manage your content and move to a concept where content is handled in XML so it can be reused and repurposed while preserving traceability. Have a look at my other blog post around “The Information Continuum” to get an idea of how. Since we do store all of our information digitally, there is a need for much more in terms of visualisation and presentation support tools – not less. However, we need to find a way to present lines of reasoning with a capability to drill down and utilise the traceability aspect. Maybe presentations will to some degree become more of a rendition with links back to text, data, graphs, images or whatever. We need to accept that in many cases it isn’t realistic to try to boil everything down to a summary, and instead we need to be able to explore the data ourselves. Now, let us set up our mindset, software and meeting rooms to do just that!

Interesting thoughts around the Information Continuum

In a blog post called “The Information Continuum and the Three Types of Subtly Semi-Structured Information”, Mark Kellogg discusses what we really mean by unstructured, semi-structured and structured information. In my project we have constant discussions around this, and around the whole aspect of chunking content down into reusable pieces that in themselves need some structure in order to be just that – reusable. At first we were ecstatic over the metadata capabilities in our Documentum platform, because we had made our unstructured content semi-structured, which in itself is a huge improvement. However, it is important to see this as a continuum instead of three fixed positions.

One example is of course the PowerPoint/Keynote/Impress presentation, which actually is not one piece. Mark Kellogg reminded me of the discussions we have had around those slides being bits of content in a composite document structure. It is easy to focus on the more traditional text-based editing that you see in technical publications and forget that presentations already have that aspect in them. To be honest, when we first got Documentum Digital Asset Manager (DAM) in 2006 and saw the PowerPoint assembly tool, we became very enthusiastic about content reuse. However, we found that feature a little bit too hard to use and it never really took off. What we see in Documentum MediaWorkSpace now is a very much revamped version of that, which I look forward to playing around with. I guess the whole thing comes back to the semi-structured aspect of those slides, because in order to facilitate reuse they somehow need to get some additional metadata and tags. Otherwise the sheer number of slides available will easily be too much if you can’t filter them down based on how they are categorised rather than just who created them.

Last year we decided to take another stab at composite document management, to be able to construct templates referring to both static and dynamic (query-based) pieces of content. We have made ourselves a rather cool dynamic document composition tool on top of our SOA platform with Documentum in it. It is based on DITA, and we use XMetaL Author Enterprise as the authoring tool to construct the templates; the service bus resolves the dynamic queries, and Documentum stores and transforms the large DITA file into a PDF. What we quickly saw was yet another aspect of semi-structured information, since we need a large team to be able to work in parallel to “connect” information into the finished product. Again, there is a need for context in terms of metadata around these pieces of reusable content that will end up in the finished product based on the template. Since we depend on a lot of information coming in from outside the organisation, we can’t have strict enforcement of the structure of the content. It will arrive in Word, PDF, text, HTML, PPT etc. So there is a need to transform content into XML, chunk it up into reusable pieces and tag it, so we can refer to it in the template or use queries to include content with a particular set of tags.

This of course brings up the whole problem with the editing/authoring client. The whole concept of a document is being questioned, as it is itself part of this continuum. Collaborative writing in the same document has been offered by CoWord, TextFlow and the recently open-sourced Google tool Etherpad, and will now be part of the next version of Microsoft Office. Google Wave is a bit of a disrupting force here, since it merges the concepts of instant messaging, asynchronous messaging (email) and collaborative document editing. Based on the Google Wave Federation protocol, it is also being implemented in enterprise applications such as Novell Pulse.

So why not just use a wiki then? Well, the layout tools are nowhere near as rich as what you will find in word processors and presentation software, and since we depend on being able to handle real documents in these common formats, it becomes a hassle to convert them into wiki format or, even worse, try to attach them to a wiki page. More importantly, a wiki is asynchronous in nature, which is probably not that user-friendly compared to live updates. The XML vendors have also gone into this market with tools like XMetaL Reviewer, which leverages the XML infrastructure in a web-based tool that lets users see changes and review them collaboratively almost in real time.

This leads us to the importance of the format we choose as the baseline for both collaborative writing and the chunk-based reusable content handling that we want to leverage. Everybody I talk to is pleased with the new Office XML formats, but says in the next breath that the format is complex and a bit nasty. So do we choose OpenOffice, DITA or what? What we choose has some real impact on the tool end of our solutions, because you probably get the most out of a tool when it is handling its native format, or at least one it is certified to support. Since it is all XML, we can always transform back and forth using XSLT or XProc (see the sketch below).
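As a trivial illustration of that last point, here is a minimal Java sketch using the standard javax.xml.transform API to apply an XSLT stylesheet. The file names are hypothetical placeholders – any source XML plus a stylesheet mapping it to the target vocabulary would do.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XmlTransformSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical file names: a source XML document and an XSLT stylesheet
        // that maps it into the chosen baseline format (e.g. DITA).
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer =
                factory.newTransformer(new StreamSource("office-to-dita.xsl"));
        transformer.transform(new StreamSource("chapter.xml"),
                new StreamResult("chapter.dita"));
    }
}
```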

OK, so we have the toolset and some infrastructure in place for that. Now comes my desire not to stove-pipe this information in some closed system only used to store “collaborative content”. Somehow we need to be able to “commit” those “snapshots” of XML content that to some degree constitute a document. Maybe we want to “lock it down” so we know what version of all of that has been sent externally, or just to know what we knew at a specific time – very important in military business. That means it must be integrated into our Enterprise Content Management infrastructure, where it in fact can move along the continuum towards being more unstructured, since it could even be stored as a single binary document file. Somehow we need to be able to keep the traceability so we know which versions of specific chunks were used and who connected them into the “document”. Again, just choosing something like TextFlow or Etherpad will not provide that integration. MS Office will of course be integrated with SharePoint, but I am afraid that implementation will not support all the capabilities in terms of traceability and visualisation that I think you need to make the solution complete. Also, XML content actually likes to live in XML databases such as MarkLogic Server and Documentum XML Store, so that integration is very much needed, more or less out of the box, in order to make it possible to craft a solution.

We will definitely look into the Documentum XML technologies more deeply to see if we can design an integrated solution on top of them. It looks promising, especially since an XProc pipeline for DITA is around the corner.


With Jamie Pappas in the Blogger’s Lounge at EMC World 2010

The Blogger’s Lounge is a great watering hole to stop by for a really good latte, but of course also to sit down in nice chairs and sofas with power outlets on the floor to blog and tweet about experiences at EMC World 2010 in Boston. Today I stopped by in the morning to have my photo taken with Jamie Pappas, who is Enterprise 2.0 & Social Media Strategist, Evangelist & Community Manager at EMC. Be sure to visit her blog and follow her on Twitter. My dear Canon EOS 5D camera managed to capture the nice lighting in the lounge, I think.

EMC World 2010: What is New and What’s Coming in Documentum xCP?

This session was presented by John McCormick on Tuesday morning.

The three pillars are:

  • Information Governance
  • xCP
  • Information Access

EMC wants to help customers get maximum leverage from their information and deliver the leading application composition platform for information management and case processing.

Intelligent Case Management:

Data, People, Content, Collaboration, Reporting, Policies, Events, Communication, Process

Case Management: Argues that it is a discipline of information management which is:

  • Non-deterministic
  • Driven by Human Decision-making
  • Driven by Content status

xCP Product Principles

  • Enable Intelligent business decisions (content and business process analytics)
  • Composition and configuration over coding
  • Enable performance through responsiveness and usability
  • Delight application builders and systems integrators
  • Beyond Documents: People, process and information in context
  • Leverage the private cloud
  • Build a future-proof product (move to declarative composition model)

The goal is to collapse all the existing products that make up xCP into fewer ones.

It is about reusable components, composition tools and xCelerators.

Reusable components:

  • Activities (templates)
  • Forms
  • UI

Tools:

  • Process Builder
  • Forms Builder
  • Taskspace for the UI

What is coming next…

xCP and the Documentum platform have different version numbering; this is how they relate:

  • xCP 1.5 – D 6.6 (June 2010?)
  • xCP 1.6 – D 6.7
  • xCP 2.0 – D7 (next-gen Case Management)

Focus for Documentum 6.6

  • Real-world performance testing
  • Composer 6.6 (dependency checking, simplification)
  • Taskspace is getting better in 6.6
  • Improved manageability (workflow agents behave more gracefully)
  • Forms Enhancement (conditional required fields, better relationship management)
  • ATMOS Integration

Documentum 6.7

  • Final release of D6 family (Q1 2011)
  • Licence Management improvements
  • Improved Search (integration of DSS)
  • Public Sector Readiness (Section 508 improvements for Taskspace)
  • Composer Improvements (xCP application support, no manual installs and version ingestions)

6.5 SP2/SP3 and 6.6 are ready for Documentum Search Services (DSS)

Integration of ATMOS cloud storage in D 6.6

As soon as DSS is out, the whole platform is supported in a virtualized environment.

vSphere integration & Certification (D 6.7)

Documentum 7 (xCP 2.0) Sneak Peek – Increased Business Agility

  • Composition is simpler
  • Deployment is faster
  • Case workers are more productive

Improving the tooling

  • Single Composition Tool – xCP Composition Tool probably based on Eclipse
  • Modeling view
  • Compose a page/screen

Deployment is Faster

  • Leverage the private cloud
  • Everything is virtualized
  • Deploy to an already installed environment, directly from the xCP composition tool to a VMware instance

User Experience

  • Better insights into cases
  • Better viewing experience
  • Integrated capture
  • There will be a new Web Services based UI
  • Easy to search and add content to a case
  • Easier inline viewing

EMC World 2010: Customizations of Centerstage

The session was presented by Andrew Goodale, who is the architect behind Centerstage. I am not a developer, but to me this session was very important because I believe that the level of customization possible greatly influences the potential of a successful Centerstage deployment. A lot of the power of enterprise systems lies in the possibility to adapt them to business needs.

He started by exploring the Services SDK and outlined that the architecture is set up with a Direct Web Remoting (DWR) library doing the magic between web browsers and the Web Services WSDL.

Using DFS Types for the Data Model where appropriate (a rough sketch follows this list):

  • ObjectIdentity
  • DataObject
  • PropertySet
  • TypeInfo
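For reference, describing a new object with these DFS data-model types looks roughly like the Java sketch below. This is based on my reading of publicly documented DFS usage; exact package names, constructors and setter overloads may differ between DFS versions, so treat it as an assumption rather than verified API.

```java
import com.emc.documentum.fs.datamodel.core.DataObject;
import com.emc.documentum.fs.datamodel.core.ObjectIdentity;
import com.emc.documentum.fs.datamodel.core.properties.PropertySet;

public class DfsDataModelSketch {

    // Builds a DataObject describing a new dm_document in the given repository.
    // Sketch only: constructors and package names are assumed from DFS samples
    // and may vary by DFS version.
    public static DataObject describeNewDocument(String repositoryName, String name) {
        ObjectIdentity identity = new ObjectIdentity(repositoryName);
        DataObject dataObject = new DataObject(identity, "dm_document");
        PropertySet properties = dataObject.getProperties();
        properties.set("object_name", name);
        properties.set("title", "Created via DFS");
        return dataObject;
    }
}
```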

Simplification was needed to support broader language adoption, because it is hard to call the services from anything other than Java and .NET.

Trouble calling them from Flash

  • No use of abstract XML Types
  • Minimize the number of XML namespaces
  • Need to support invocation from un-typed languages (e.g. Javascript)

Interface Design

  • Restricted set of data types

The DFS error handling is fine for programmatic access, but when you want to show a progress dialogue you need new data structures for that. If you copy 200 files, some of them could raise an error; it is important that these are handled so that, for instance, the import does not simply stop at file number 53… Instead, give more extensive information to the user about what happened and what went wrong, without breaking the import after the error.

Foundation Services

  • Create blogs, wikis,
  • Manage spaces
  • Templates

Application Services

Overview

  • Provide the “guts” of Centerstage
  • Capture application logic that is UI-agnostic

Basic Content Services (Create, Checkout, Checkin, Copy, Move, Delete and the Properties dialog)

  • Icon
  • Lists (Grid data sources – a declarative mechanism for creating queries, handling sorting, pagination and caching)
  • Permissions (simplified permission levels mapped to standard dm_acl)
  • Search (Knows about CS Artefacts and Integrates CIS entities with facets)

DFS Core Services

Possible to use them to modify CS artifacts

– for example ObjectsService.copy to copy a wiki page

Copy things, add things to a page etc

Our DOF modules will enforce data constraints, which for instance means that you can’t copy a page object without copying the page content.

Deploying the SDK

  • A zip file containing binaries and javadocs
  • Centerstage Services are added to core SDK
  • “remote” jars only – a deployed Centerstage server is needed

Setup

– Unzip the SDK

For Java your classpath should include

– DFS runtime, JAX-WS, JAXB

Java: centerstage-foundation-remote.jar and centerstage-application-remote.jar

.NET requires the 3.0 SDK for WCF (Visual Studio optional)

Samples in both Java and .Net

Creating a Space

Uses the Blank Template which ships with Centerstage

  • An identity qualification shows how to pick a specific template
  • Using a template guarantees that the space will be Centerstage-compatible
  • Space needs a home page

Returns an OperationStatusSet

  • The standard return type for creates, updates
  • Allows validation errors to be returned.

Creating a Wiki – child pages to the wiki can be added in the same way

An activity template can create a space and send an invitation email to everyone.

Java samples can be built with Ant 1.7 and Java 1.5. There is no IDE requirement – Eclipse will work fine.

Sample: Wiki to eBook sample

Goal:

  • Given a URL to a Centerstage wiki, create an ePub book
  • Each wiki page becomes a chapter in the book
  • Blogs and Discussions can also be converted
  • High fidelity (the rich text in CS is XHTML in the repository)
  • Page links are preserved

What it shows

  • PageService
  • Fetch wiki home page

They used a project from Google Code – a Java library that builds ePub books, contributed by Adobe:

http://code.google.com/p/epub-tools/

Centerstage Mini – Demo to call services from JavaScript

Goal:

  • Build an HTML page that shows Centerstage data
  • Pure AJAX Technologies

What it shows

  • How to call services from JavaScript
  • How data is marshaled

Demo showed the Recent Activity in an external native ExtJS Grid

List Services

The SDK for CS is not licensed separately… you basically need a CS license to use it.

eBook Sample is available on ECN

The SDK will be in GA in the July timeframe.

To me this seems more powerful than I had thought: it is now possible to programmatically set up Centerstage spaces and modify existing ones. That gives us an opportunity to create “templates” for common things the business needs to do using E2.0 features. Instead of relying on users being aware of all the possibilities and executing them manually, we can now have quick buttons for that, or use workflows or external systems to trigger these actions.

EMC World 2010: There is an App for Documentum now (iPhone OS)

Flatiron Solutions delivers an iPhone OS App for Documentum

So, finally I got to see it: Documentum on iPhone OS, running on both the iPhone and the iPad. I have said it before and will say it again: from an information management perspective it makes so much sense to combine the intuitive interface of the iPhone OS with the power that lies in a Documentum repository. Make use of all the metadata around content objects, and exploring information becomes a breeze on a multi-touch device.

It is the company Flatiron Solutions that brought this to market. You can download a version of it from the iTunes App Store. In order to connect to your own repository you will need a server component that sits between the iPhone OS app and the Documentum repository.

Download the App from iTunes

I had a chance to try it out on both the iPhone and the iPad in their booth at the Solutions Pavilion last night, and it was great fun. I really want this in our Battle Lab. A very sexy interface for Documentum!

EMC World 2010: My presentation around using Documentum in a SOA-platform

Yesterday, Monday May 10 at 11 am, I gave a talk at the Momentum 10 conference here at EMC World 2010 in Boston. The presentation focused on our experiences of building an experimentation platform for next-generation information and knowledge management (IKM) for a large operational-level military HQ. Contemporary conflicts are complex and dynamic in character and require a new approach to IKM in order to handle all those complexities, based on sound management of our digital information. At the core of our platform is EMC Documentum, integrated over an Enterprise Service Bus (ESB) from Oracle. The goal is to maintain access to and traceability of the information while removing stove-piped systems.

I have had quite a few positive reactions from both customers and EMC people after the session, which of course is just great. For instance, see these notes from the session. All the presentations will be available for download for all participants, but that will most likely take some time, so in the meantime you can download my presentation here instead:

Presentation at EMC World 2010 in Boston

Looking forward to comments and reflections. The file is quite big, but that is because my presentation is heavy on screenshots, and downsampling them to save file size would make it too hard to see what they are showing. Try zooming in to see the details.

EMC World 2010: At Blogger’s Lounge

Sitting in the lounge now, relaxing with another cup of great latte after what felt like a really good presentation earlier today at EMC World 2010. Responses so far have been very positive, and it feels great of course. We think we have so many cool ideas, and it is great to be able to show them off to people with a deep interest in Enterprise Content Management.

Alexandra Blogger's lounge at EMC World 2010

Now it is soon time for the keynote by Mark Lewis, who seems to be in charge of the newly renamed Information Intelligence Group (formerly the Content Management & Archiving Division).