
Beyond the Fat Wire

Tuesday, November 30, 2004

Post-sessions, on Tuesday

Following the last session was a reception in the exhibits area. (of course, throughout the conference, if you want the free food and drink, you have to mingle in the vicinity of the vendors)

At lunchtime, I avoided the vendors, but I decided to take the plunge during the reception. FatWire and RedDot are both here ... I didn't stop to chat. I did talk to this company, Serena -- web content management, Oracle/DB2/SQL Server based content repository, browser-based author interface, LDAP/Active Directory aware, and I'm guessing it runs on a J2EE app server (I didn't ask, but I saw "servlet" in a URL from the browser-based author interface). They seem to be focusing on the education market -- their demo screen was for a university site, with content quoting something from Educause, and they have a couple of school districts in Texas as clients, plus a handful of smaller universities. Seems like a nice product, but I still think Contribute will do everything Serena will do (minus the db-based content repository). I think Contribute will give us better control over author permissions and roles, and also over the use of styles.

There's another funky web content management vendor here called Hot Banana, but the rep was busy so I haven't talked to them yet.

Most of the other vendors seem to be a mix of content management and XML stuff. There's one that says it does metadata and taxonomy management -- SchemaLogic -- but I haven't talked to them yet. Oh, and Vivisimo is also here, for site and enterprise search.

The one other vendor I did spend some time with was IXIAsoft, which has a product called TextML Server. Along with the IXIAsoft guys was Mark Ludwig, a librarian at SUNY Buffalo -- SUNY Buffalo has used TextML Server to create an XML-based web library catalog from their NOTIS MARC records. Here's a Library Journal article. Anyway, he has some grant and is looking for other libraries to join -- I don't have the details, but I believe it involves FRBR-ized XML catalog repositories. I have his card to share with Bob and Ramona in case they're interested. Mark Ludwig also did a session I wasn't able to attend. Here's his presentation (PDF).

Tomorrow I'll have to decide between competing sessions. The CMS implementation sessions will continue, but at the same time there will be CMS implementation case studies. These case studies are done under the rubric of Content Technology Works -- a Gilbane initiative to document content management best practices and success stories.

On the other hand, the morning panel tomorrow is on CTW (Content Technology Works), and the CTW case studies are on the web, so maybe that will be enough.

The rest of Tues afternoon, part 2

(notes continued)

Bob Boiko says, in a discussion with the w3c guy:
Just because it's XML doesn't mean 2 docs will be interoperable ... and if 2 docs are already interoperable, it doesn't matter if they're XML or not

But then again, he didn't address the question of 2 docs that need to be interoperable now, even though they began life in non-interoperable formats.
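To make that concrete with a toy example of my own (not something the panel showed): here are two perfectly well-formed XML records describing the same book. Both are XML, and neither system can use the other's record until somebody writes the crosswalk between the two vocabularies.

    <!-- system A's idea of a book record -->
    <book>
      <title>Content Management Bible</title>
      <creator>Boiko, Bob</creator>
    </book>

    <!-- system B's idea of the same record: also well-formed XML, but a different vocabulary -->
    <item>
      <name>Content Management Bible</name>
      <author><last>Boiko</last><first>Bob</first></author>
    </item>

The XML gets you a parser for free; it doesn't get you the agreement about what <creator> versus <author> means.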

My thought was: are these guys hooked into the library data preservation community? I haven't dug very far into the digital preservation issues myself, but my sense is that the library digital preservation people would probably have something to contribute to this discussion.

Then, Bob Boiko had the opportunity to provide the diatribe he'd been waiting all day to give: basically, that the semantic web will be all about:
  • society
  • language
  • world view
in other words, getting people to agree on the ways and means of expressing ... and not ultimately about data format, transfer protocol, or the details of standards

In the audience Q/A, one question had to do with dealing with users (read: web authors) who are unable to separate the display of their content from the content itself. The answer was: don't even try. For "regular" people, the presentation of content is PART of the content. This is true for those who valiantly learned HTML in 1997 and still think part of making a web page is choosing font size 7 for a page headline. But it's also true for those whose authoring is just writing copy and presenting it to an audience ... part of the communicative process is the display (as we well know), and, for these authors, separating the display-based communication from the word-based communication just is not part of their repertoire of expertise ... which is fine ... everybody has a repertoire of expertise, and no one's repertoire is anywhere close to comprehensive.

So, what we can do is "deliver semantic payload under the covers" ... that is, give authors a chance to feel as though they are controlling display, even though the design specialists are the ones who control the details of what that display is. And give them opportunities to preview their (content) creation in a context that is as real as possible so that they have what they need to evaluate the communicative effectiveness of what they have written.

And so ... we create CSS styles that are as much about semantics as they are about layout, overall visual design, etc.
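A made-up illustration of what that might look like (my sketch, not anything shown in the session): two stylesheet rules that render identically, but only one of them tells the CMS, the search engine, or next year's redesign team what the text actually is.

    /* named for how it looks -- authors will reach for this for anything that "needs to pop" */
    .red14bold { color: #c00; font-size: 14px; font-weight: bold; }

    /* same rendering, but the name carries the meaning; the design group can change the look later without touching the content */
    .closing-alert { color: #c00; font-size: 14px; font-weight: bold; }

The author just picks "closing alert" from the styles menu in whatever the authoring tool turns out to be, and still feels like they're controlling the display.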


At the end of the session -- final thoughts from each of the panelists:

Bob Boiko went first: quoting a little ditty he attributed to some uncle or someone, that started with "nibble nibble little sheep". (I told you he was a hoot) Basically, he is saying, do whatever works and don't worry about it

Next was Tony (with no rhyming couplets, he apologized, explaining he's not that fast): keep it simple, don't be afraid to phase in, don't get sucked into an over-engineered, overly fine-grained solution to anything

Matt (the w3c guy) offered a haiku:
Less complexity
if you want people to read
x h t m l

Jon: "the network is the document"

Tuesday afternoon, part two

This one is on the topic of document types:

"Open" Document Formats, XHTML vs. HTML, XSL vs. CSS & Other Industry Debates

Jon Udell, Lead Analyst, InfoWorld
Bob Boiko (see below)
Tony Byrne, Editor of CMSWatch, author of The CMS Report
Matt May, Web Accessibility Specialist, W3C

This conference has a nice approach to panels. Each speaker gets 3-5 minutes to make points, and then the floor is open for discussion and Q and A -- audience, panel members, everybody can talk. It's a good format.

European Union - WordML and Open Office's version of native XML
  • what about when an XML document flows through channels that are disconnected from its schema?
  • should we really have to worry about WordML, etc?
  • Do you need XML if you can create extremely disciplined XHTML? Or, rather, what is the point where disciplined XHTML is not enough?
This is a very interesting discussion ... no way I can keep up with notes on this.

"the gravity of the EU talking about this topic" -- the w3c guy says -- in the environment of the EU preparing its constitution., and there are all these issues of text, semantic markup, standards for data and documents. On one hand, with Word so ubiquitous, you have to deal with WordML in some way, but it's not right to just accept WordML as a standard simply because of ubiquity.

Tony Byrne is bringing the discussion back down to earth, I think ... so far, it's been pretty abstract. Tony is saying, we can't forget that, no matter the format, it's going to be the document structure and metadata that will let the documents be long-lived, reusable, extensible, etc.

But then the w3c guy is pointing out: if the EU rejects all of WordML ... will there be some poor guy who has to go home to Microsoft in Seattle and admit he lost the European MS Office contract ... ?

Tuesday Afternoon, part one

This session is called "Implementing Content Management Systems - Next Steps" with some familiar faces on the panel. Ann Rockley and Bob Boiko, both of whom were speakers at the 2004 IA Summit, are there. The third person is Ben Martin, who I haven't heard of.

Ann Rockley says -- get presentation here (PDF)
  • XML is your friend
  • IA for the content authors / separate from IA for site visitors
  • Beware the content review cycle
    • Avoid unnecessary multiple reviews
    • Make sure roles and workflows make sense
    • Don't let authors and reviewers get bogged down in minutiae
  • Content audit = an accounting of the information in your organization
    • what do we have
    • how much of it is there
    • how is the content used, reused, and delivered to various audiences
    • how can the content be unified -- how to create a unified content strategy?
  • Decide the structure of your content repository (beyond the file/folder structure)
  • The first mistake is to choose a tool before you understand your requirements

Bob Boiko (who is a hoot) says:
  • A new edition of his book is coming out soon
  • what are good requirements?
    • important = things with "must" and "should"
    • durable
    • doable
    • just specific enough = with solid direction, but not so restrictive that you'll get into trouble
  • What sort of requirements?
    • the ones people give you = i.e., the stakeholders. however, they usually don't know. actually, you're lucky if they even have an idea of what content management is (and probably, they won't have a clue)
    • the ones you figure out yourself = when you get tired of people giving you crazy, un-doable ideas
      • because, people usually can't tell you what you need to know
      • they have their pet ideas, they have what maybe somebody "cool" told them
      • they give you opinions that may or may not be based in knowledge
      • the topics they will raise with you will be "hit and miss", often will be irrelevant, often will be contradictory
    • you still have to talk to people, but talk to them for the skeleton of what you need to know, don't expect them to give you everything you need to know
  • How do you fill in the blanks yourself?
    • what are the goals of the system? what are we trying to accomplish? how will we know we are successful?
    • who is the audience? (are the audiences)
    • what publication(s) is/are best for those goals and audiences? what content do those people need? what will communicate with them the most effectively?
    • what methods of transmission do these people want? web? handouts?
    • who will be supplying the content? (who are the authors -- what do those authors need? what goals do the authors have?)
    • what will the access method be? what is needed to make that method happen?
    • access structures mean metadata and IA as well as delivery methods
  • The detailed requirements process -- collect the requirements, manage the requirements, publish the requirements
    • Documentation .... don't neglect this step. it's how you will communicate to your stakeholders
  • Understand the expertise of your authors
    • technical expertise (or not)
    • content expertise (if not, you're in trouble)
  • Map requirements to features

Ben Martin, of Industrial Wisdom, LLC, ex of JD Edwards, says in "Multilingual Content Management: Taming of the Shrew" -- get presentation here (PDF)
  • "shrew" -- in the sense of the animal that is so voracious that it will eventually eat itself rather than starve to death
  • the web is not a US, English-speaking entity anymore
Ben Martin is really talking to people whose sites are global, where localization is an issue. This is not our situation -- some would say that, as a library at a US university (where admission requires a minimum TOEFL score), we shouldn't even need to consider issues related to multilinguality, localization, etc.

However, I believe we can't forget that a sizeable portion of our audience belongs to that non-US, non-native-English-speaking community. We might not offer translated, localized sites, but we do have to be aware of what will and will not communicate to that community.

"Translation memories" .. a database that holds phrasal level translations
"Up chunking" .. dividing up your content into chunks, so that it can be reused, repurposed, and served up again in different contexts, formats, etc

Basically though:
  • Tame the source
  • Tame the change process
  • Integrate translation processes
  • Centralize, but leave enough flexibility for localization needs

One of the panelists in the opening session was a financial analyst/Wall Street guy who specializes in CMS and IT companies. He thinks this is the time for CMSs. Apparently, content management is now being seen as the place where companies can grow their business (I suppose because other areas are tapped out?). He also reads the fact that Oracle, IBM, and Microsoft are entering the CMS space as an indicator that CMS is where those three think they can make some money.

Anyway, this is the middle of the 2-hour lunch break. The breaks are long -- the morning one was an hour -- so we have all this free time to get hit up by the vendors.

There are about 250 or so attendees at this conference, best I can guess. Maybe 30 or so vendors. Here's the exhibitor list.

Notes, cont.

  • Content migration is one of the biggest costs, and one of the most underestimated
  • Metadata management -- another area where, typically "they all fall down"
  • ROI estimates with respect to web content management are not to be believed

Conference Presentations

Conference presentations are here

More notes

On organizations:
  • Your organization is dysfunctional. A CMS will not fix that.
  • Are you unable to manage your current web site? If so, what makes you think you'll be able to manage a CMS implementation? (if you have non-content issues, a CMS will not fix them)
  • You have to have the ownership, responsibility, communication chains among web content producers before you can implement a CMS
  • You have to reconcile a good workflow/business process with one that will actually work in the real world.
  • The only "best of breed" software is the one that matches your organization

Items from this morning

This morning's presentations included:

Other things from this morning:

1. Imagining that it's possible to implement any Enterprise solution (e.g., WebLogic, ContentServer, etc.) without using professional services is living in a fantasy land. Apparently, the correct budget strategy is to expect to spend at least 50% of the price of the software on the vendor's professional services.

("Professional services" means buying consulting from the vendor to get their software to work the way you want it to)

2. "Enterprise Content Management" is a myth. There are no integrated content management systems. Documentum is supposed to be farther ahead with integrationg, but it's not there yet either.

Rather, if there is anything like ECM, then it should be viewed as a vision, or as a purchasing strategy. Not as a software solution.

3. "Multi-channel communication" -- we need to think about web content from the point of view of information consumption, not information delivery. We need to consider this also in the context of nomadicity - (PDF - 61 KB)

4. Trends in how people work -- currently, about 50% of workers list collaboration as a primary mode of working. The projection is that a year from now, 60+ % of workers will describe their mode of working as collaborative.

It may not be our job as a library to *promote* this, but we must be in a position to enable collaboration and accommodate nomadicity.

Westin Copley Place

The venue is a really nice hotel, by the way ... too bad it's not affordable if you're not attending a conference!

From the Gilbane Conference on Content Management Technologies

The second session of the day is about to start, so for now, just a word:

Twice this morning in the opening keynote (more a panel than a keynote), two different people used the "L" word.

According to those analysts of trends in the content management space, the future is in Information Architects and Librarians (or at least, those with "librarian skills," as they termed it).

And this audience is not IAs or librarians. I am amazed.