Envisioning and shaping the future of work and business.

Friday, November 30, 2007

Rules are meant to be broken

9:53:00 AM Posted by Oscar Berg, 1 comment
Rules are meant to be broken, we all know that. End users of IT products and solutions seem particularly motivated to break rules and policies when they experience them as limiting, as obstacles that keep them from doing their jobs as efficiently (or conveniently) as possible. I only have to look at myself. Ease of use is my primary concern with the applications and tools that I use - besides the obvious fact that they should help me achieve my goals.

It is my belief that corporate policies and rules for how to use IT resources must always be weighed against usability in order to find a balance that keeps them from being broken. Defining policies and rules must always be done in dialog with the users. Or, to put it in other words, it must be done in a tight dialog with the business. Organizations should focus on working with soft factors such as people’s attitudes and values just as much as they work on defining policies and rules.

A goal that is idealistic but nonetheless worth striving for is to eliminate the need for explicit governance. This can only be done by working with the attitudes and values of individuals within the organization. And by making it as easy as possible for them to do right, allowing them to see and take credit for their contributions and results. Such a system will be almost self-governing. If your neighbor sees that you have broken the socially established and silently enforced community rules, he will react and ask you to respect the community and the rules that exist for its common good.

It could be an idea for the policy and rule makers to analyze why Wikipedia works. Wikipedia is really easy to use, largely because few obstacles have been built into the Wikipedia user interface to prevent policies and rules (conventions) from being broken. This means it is just as easy to break the rules as it is to follow them. But, as is stated on Wikipedia:

"There is no need to worry about accidentally damaging Wikipedia when adding or improving information, as other editors are always around to advise or correct obvious errors, and Wikipedia's software, known as MediaWiki, is carefully designed to allow easy reversal of editorial mistakes."

Since the general attitude and values enforced by the Wikipedia community are that quality is everybody’s responsibility, the community is more or less self-governing. And isn’t this what Web 2.0 is all about? Ease of use. Trust. Openness. Collective social governing. All based on the belief that we will all benefit from high-quality content and experiences on the web. What if that could be a natural thing within an enterprise context as well?

Tuesday, November 27, 2007

Estimating the Value of Content

8:46:00 AM Posted by Oscar Berg, No comments
If you empty your pockets and there's a lighter, a pen, the keys to your home and your wallet - what would you care about putting in a safe deposit box when you go swimming?

Estimating the value of content is key to efficient content management. Depending on how valuable content is, it needs to be managed differently. Content that is estimated to be very valuable to the business should be treated as an asset and managed with the same care as other kinds of assets. Content that is more or less worthless should be terminated - if it is worth the effort to terminate it. Simply put.

Seth Gottlieb shares some very sharp insights on this subject in his post "CMS Business Case":
"There has been an enormous amount of writing and discussion about building a business case for a CMS and I don't have much to add other than to say that most of what I have heard is totally wrong ...//...In my opinion, the business case discussion should be around the content itself - not the technology used to manage it"

"At cmf2007, Bob Boiko's keynote talked about how we are not yet in the information economy because we have a hard time determining the value of content and the markets for trading information are primitive. Content managers are put into the subservient role of having to post everything that they are given. I would tend to agree with him. I do not feel like companies are any better at deciding what content to keep than the parent of a prolific three year old artist. In fact, I feel like the parent has the edge because he has a finite amount of refrigerator door space."

Very well said by both Seth and Bob Boiko.


Who owns your "IT projects"?

8:07:00 AM Posted by Oscar Berg, 4 comments
Recently, two of my colleagues and I put together a PPT presentation for a potential customer. We reused a presentation we had used earlier and basically stripped it down so that it focused on our main message and sales arguments to the customer. When walking through the presentation, we realized that we had no slide about information technology other than a couple of slides that mentioned the interaction between the business and its IT systems at a very high level. Being IT Management consultants - management consultants focusing on how to use IT to improve businesses - we felt a need to add one slide about the need to address the complexity of the IT legacy, just to balance the presentation a little. How often does that happen when you make a presentation about something that largely has to do with IT?

At my company we have a saying that there are no IT projects, only business projects involving more or less IT. Every IT initiative should be owned and driven by the business. Some might object to this way of seeing things. I would expect IT people who want to keep their influence and power, and business people not interested in IT, to belong to this category. Upgrading the mail server software to the latest version because the current version will no longer be supported by the vendor might be used as an example of a "pure" IT project that should be owned by the IT department. But it should not. It would only be a pure IT project if the mail server was not used in the business at all, which would then call its existence into question and cause it to be terminated.

The main point is that someone in the business should always own initiatives involving IT, even those needs and initiatives that have been identified and requested by the IT department (such as upgrading mail server software because the current version will no longer be supported by the vendor). With the ownership follows funding and taking responsibility for the business results of the initiative and the consequences for the business if it fails (such as costs related to the inability to communicate via e-mail). If there is no ownership in the business, then the initiatives that are in fact owned by the business will always have higher priority than the initiatives owned by the IT department. So, upgrading the mail server software will sit at the bottom of the priority list among other IT-owned initiatives, and the mail server won't be upgraded until it breaks down and hurts the business (unless someone in the IT department moves it up the priority list without telling the business people that it got higher priority than their initiatives).

To repeat – there are no IT projects, only business projects involving more or less IT. These include even "boring" stuff like upgrading the mail server software.

Thursday, November 22, 2007

Will Web 2.0 Drive Knowledge Management?

Knowledge Management has a history. I wrote my first report on this subject back in 1996. Knowledge Management was then defined as a systematic approach to managing corporate knowledge to achieve business value. It is a general definition that still has merit. Some research and practices back then focused on managing knowledge assets with information technology, others on the dynamics of organizational collaboration.

Common practices to create, manage and transfer knowledge have been:

  • Communities: Collaborative groups that span organizational boundaries.
  • Best practices: Reusing knowledge via work descriptions, offerings and the like.
  • Knowledge maps: Mapping knowledge to specific work processes or situations.
  • Knowledge profiles: Describing knowledge workers’ roles and resources.

The above practices have often been enabled by means of content and collaboration technologies such as messaging, e-mail, document management, portals, enterprise content management, search and the like.

Content can be seen as a seed of knowledge. But extracting the knowledge and acting upon it requires, first, that people interpret the content to understand the intended information and, second, that they ponder the information for knowledge to emerge. As argued in a former post:

“Content can be managed with the means of (information) technology, but we cannot manage information and knowledge with technology alone since information and knowledge are created and exist only in the heads of humans.” (Back to Basics - Defining Data, Content, Experience, Information And Knowledge)

For Knowledge Management to succeed in an enterprise it is, for the above reason, essential that appropriate roles, cultures, incentives and so on are in place. This will encourage the knowledge-sharing environment (a knowledge market) necessary for better innovation, smarter services, increased learning, higher productivity and more.

So, what can Web 2.0 technologies and practices add to Knowledge Management?

Web 2.0 is fostered in an agile, open and distributed atmosphere. Recent social trends embrace more open collaboration where content is created and shared in self-organizing networks and communities. This attitude may be what is missing in many failed Knowledge Management initiatives, where company workers have been reluctant to join forces and share what they know.

Web 2.0 technologies for user-generated content (e.g. wikis and blogs) and metadata (e.g. social tagging and bookmarks) will simplify the production and consumption of content. Other technologies such as feeds, mashups, web services and Ajax will have a role in developing a more flexible and richer web user experience, more suitable to the needs and preferences of knowledge workers.

Social interaction (e.g. profiles and social networks) has possibly the largest potential in adding something innovative to Knowledge Management. Knowledge workers may market their knowledge and interests and passively or actively strengthen their relationships across company borders.

I think it is safe to conclude that Web 2.0 will drive Knowledge Management to another level.

Does the democratization of IT force the CIO role to change?

11:12:00 AM Posted by Oscar Berg, No comments
Information Week has an interesting article ("The Evolution Of The CIO" by John Soat) about how the democratization of technology and Web 2.0 put pressure on the role of the CIO to change.

"End-user-driven technologies such as software as a service, social networking, mashups, and wikis are contributing to what the University of Michigan's Krishnan calls 'the democratization of technology,' shifting IT responsibilities to business units and pressuring the CIO position to change. Rogow hits on an important point: There's a perception that IT departments in general, and CIOs in particular, are at best order takers and at worst control freaks."

"The irony is that for years, IT managers have been trying to get business decision-makers more engaged in technology. Now that it's happening, many want to shut it down."

"Savvy business execs increasingly are aware of new technology trends and eager to have their companies embrace them. If there isn't a focal point for that change--i.e., the CIO--change will happen ad hoc: marketing guys looking at marketing solutions, finance guys looking at finance solutions. All those disparate systems will generate important corporate data that's spread across various business units"

Tuesday, November 20, 2007

Links 2007-11-20

6:32:00 AM Posted by Oscar Berg, No comments
Andy Mulholland asks what's new in collaboration with Web 2.0 in his post "Mesh Working rather than Matrix Working" on the Cap Gemini CTO blog. Here are some excerpts:

"...Strong collaboration is people centric and based upon a ‘pull’ model of finding, and asking, experts to work with you in providing answers. By contrast Weak collaboration is data centric relying on a push model to send content to those who need it. In weak collaboration we attempt to collaborate with people using email as an example, but our closed close coupled environment only allows us to contact the people we directly know exist and believe may be able to help."

"Competitive advantage is shifting from the cost management of transactions in the back office to business optimisation in the front office and the external market. Globalisation is forcing all enterprises to compete in this space so ultimately Mesh working is being driven as a necessary response to a changing Business world. It’s a World that takes us way beyond internal agility, and flexibility, through Matrix working, and into external responsiveness through Mesh working."

Another interesting post is "Why IT Operations hate SaaS" by David Linthicum, The Intelligent Enterprise Weblog. He provides an analysis of why the increasing use of SaaS "creates headaches for IT operations folks":

"The dilemma is that while IT operations wants to continue to control all applications, including SaaS, there is little they can actually do to resolve issues. Or, is that completely true?"

"...with SaaS you're limited to the network and the applications, typically only understanding whether it's working, or not (outages). SaaS providers understand this issue and do open up points of management where IT operations can check on the status of their SaaS-delivered IT resources. You can monitor availability, performance and even the provider's ability to live up to Service Level Agreements (SLAs). It's just a matter of pointing your management infrastructure at the SaaS provider, and thus having a holistic look at all critical systems, internal and external."

Sunday, November 18, 2007

Is the IT dept not interested in business innovation with IT?

Today, most enterprises would agree that IT is an integral part of their business. Business processes and sometimes even business models are based on the use of information technologies. What is more, much of the business innovation based on IT does not happen within the IT department – it happens on “the other side”, at the individual and group level in the business. Business people who (rightly or not) were collectively perceived as computer illiterate by the IT people are now sometimes one or even two steps ahead of the IT department when it comes to seeing and acting on how modern information technologies can help support and innovate the business. Business innovation with IT more often comes from trial and error and ad hoc adoption of new technologies and new ways of working by individuals on the business side. And it comes from breaking policies and rules that were defined without seeing the need for, and creating the necessary space for, innovation.

At the same time, some IT departments see as their main mission not only to support the business but also to govern it. Well, they see as their mission to govern the IT-based information system and all resources related to it, but in modern enterprises this is practically the same as governing the business. Despite initiatives such as service orientation and SOA, this poses a potential threat to how enterprises are able to use IT to innovate their business. Service-oriented architecture does not automatically increase the ability of an enterprise to innovate its business with IT. SOA is there to help the IT department react faster to business changes and thereby close the gap that currently exists between the IT systems and how the business needs to be designed and operate. In short, SOA is about keeping the IT-based information system only one step behind the business instead of two steps or more.

I am not saying that business innovation with IT must or even should come from the IT department. In fact, it is a sound development that innovation actually happens more and more in the business, by the people that are close to the customers and revenues. And I am not saying that the IT department should not govern its resources. What I am saying is that the IT department must not forget to create space for innovation with IT regardless of where it happens. Top management and the IT department need to take trends such as the consumerization of IT and Web 2.0 seriously and look closer at what they mean for the business as well as for the IT department when taking the journey from Enterprise 1.0 to Enterprise 2.0.

Wednesday, November 14, 2007

When to SOA or not to SOA is no longer a question

The concept of SOA has been around for a long time and originates from the IT side as a way to architect and design information systems. Recent years of initiatives to extend enterprise information systems beyond firewalls out onto the Internet, and the need to expose and access functionality and content in legacy systems, have made SOA the dominant approach to IS architecture. Enterprises have come to realize that SOA is a necessity if their IT-based information systems are to be able to stretch out onto the Internet – but also to respond quickly to rapid business changes.

Implementing SOA is, however, no guarantee that the IT systems will support the business as required. Successful service orientation of a business and its information systems is tightly coupled to the ability to see the customers and respond to their needs. Regardless of whether the customer is internal or external to the enterprise, the customer is not interested in the service itself, but rather in the resources (informative content or other types of content) it provides him with – and in whether and how they help him achieve his goals. Simply put, the customer does not consume the service itself; he consumes the outcome of the service.

Even thinking in terms of “services” might not be enough to be able to see and understand user needs. We must always remind ourselves that the services exist only to serve customer needs. Let’s take an example.

Physically handicapped people don’t have a need for wheelchairs. Their need is to move as freely and unhindered as possible despite their handicap. A wheelchair is a means of helping them do that. But there are other means too, and someone will probably invent even better means than wheelchairs in the future. A product-oriented wheelchair manufacturer that does not understand the needs of its customers lacks the ability to innovate. When the innovation eventually comes, something that satisfies the needs of its customers better than its wheelchairs, it will probably go on making wheelchairs until it is either forced to change its business or shut it down.

The point is that we must always define and design services in the light of real customer needs. That also applies to IS services in a service-oriented IS architecture. Otherwise, we might end up thinking that a service has its own reason for existence. That typically happens in engineering-minded businesses that develop products or services based on their own ideas rather than on customer needs. They often end up asking themselves why their products aren’t as successful as they deserve to be and throw the hot potato over to the marketing department: “How can we convince the customer that he needs our product?” At that stage, it is probably a little too late to tell the engineers to start all over and go to the customer to ask (or observe) what he really needs.

Sunday, November 11, 2007

A real world case of semantic dissonance

7:10:00 PM Posted by Oscar Berg, No comments

My previous post stated that semantic dissonance is the real challenge of content integration, and in this post I will try to illustrate this with a real-world case.

Semantic dissonance between two content sources is a typical silo effect, something that happens when two or more information systems with more or less overlapping content have been developed in isolation from each other, typically to support specific functions within an enterprise. Semantic (as well as structural) dissonance can be expected when information systems developed in two organizations are to be integrated after a merger or an acquisition. But it is also a common scenario within a single organization.

A couple of years ago, I was preparing a new version of an e-commerce site for a global corporation. The site had been in a status quo condition for a couple of years and was now to be redesigned and equipped with a new user experience, with more interactive features and richer content. The site was fed with product information such as product structures, descriptions, images, prices and stock quotes from a couple of back-end systems, one of which provided the basic product information. The integration with the back-end systems had always been a problem child and had not worked properly since it was first designed and implemented several years earlier. The back-end systems were custom-developed and the e-commerce platform used for the web site was a standard product.

When we dug a bit deeper into the design and implementation of the current site and how it used the product catalog that was part of the e-commerce platform, it turned out that there was semantic dissonance between the product catalog and the back-end system that provided master product information. To put it short, the definition of a product in the e-commerce platform did not match the definition of a product in the back-end systems. Despite the dissonance, the e-commerce site was fully functioning, but mostly thanks to a number of workarounds. What didn’t work was administering the products and related content in the administration tool that came with the e-commerce platform. Nor was it possible to use the features that came out-of-the-box with the e-commerce platform – we would have to build them from scratch. Interestingly, the first version of the e-commerce site was designed and implemented by the same company that provided the e-commerce platform.

This insight put us on a long journey where we had to change not only the public e-commerce front-end and how we used the product catalog, but also the integration solution for the back-end systems. We had to convince the business people that this was absolutely necessary in order to develop the e-commerce site in the direction they wanted.

The tool that helped us succeed was an information model that let us distance ourselves from the data models and identify and resolve the dissonance. We (re)defined the term "product" and identified the matching entities in each of the two systems. Looking only at the implementation, it now seemed as if we had mapped two different entities to each other. But what differed were only the terms, not the concepts behind them. The information model clarified that the two different terms actually meant the same thing and thus guided the redesign of the e-commerce site and the integration solution.

The trick was how we created the information model. We did not bang the drums and announce to each and everyone that we would now need to redefine the term product and some related terms. That kind of approach would certainly have met resistance, because everybody naturally protects “their” definitions. To suggest redefining a term such as “product” in a successful business that was entirely built on its products would probably have been political suicide. Instead, we started off by discussing the existing and somewhat inconsistent and confusing product definitions. We did not attack the flaws. We simply said that we needed to understand these terms and their definitions better. So, we modeled and discussed them together in a series of workshops. And behind the scenes, we created the information model piece by piece. It might seem cunning, and it was.
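To make the idea concrete, here is a minimal sketch in Python of the kind of mapping such an information model captures. All system, entity and field names are hypothetical illustrations, not the actual systems from the case.

```python
# Sketch of an information model that resolves semantic dissonance:
# two systems use different terms (and data models) for what the model
# defines as one canonical concept, "Product". All names are hypothetical.

CONCEPT_MAP = {
    "backend_master": {"entity": "Article", "key_field": "article_no"},
    "ecommerce_catalog": {"entity": "CatalogItem", "key_field": "item_id"},
}

def to_canonical(system: str, record: dict) -> dict:
    """Translate a system-specific record into the canonical Product concept."""
    mapping = CONCEPT_MAP[system]
    return {
        "concept": "Product",
        "source_entity": mapping["entity"],
        "product_id": record[mapping["key_field"]],
    }

# Two differently named entities turn out to denote the same concept:
a = to_canonical("backend_master", {"article_no": "P-1001"})
b = to_canonical("ecommerce_catalog", {"item_id": "P-1001"})
assert a["product_id"] == b["product_id"]
```

The mapping table makes explicit what the workshops uncovered: only the terms differ, not the concepts behind them.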

Friday, November 9, 2007

Thinking of building an electronic archive?

6:45:00 PM Posted by Tommy, No comments
From a governmental perspective, not only is more and more information created by or sent to our governments electronically; the share of content corresponded between governments and citizens on paper is also decreasing. Content that is created by or received by a Swedish government agency is a public record, and hence the need for stable archive solutions is increasing as well. How do we make sure that the information stays "in shape" for the remainder of time? There are many issues to take into consideration, and there are many models to follow to get help.

During my participation in projects concerning electronic archives, I have found, though, that there are few good implementations to look at for guidance. And as archive solutions display their full functionality and benefits only once some time has passed and the theories have been tested in real life, we are still at the beginning of the lifecycle of these solutions. Good practice isn't really in place yet, so to speak. There are, however, some findings that should be considered if you are planning to start a project for e-archiving your information. In my experience, these are the three most important findings for a project to take into consideration.

1. Run the archive project "by the book"
I mean this in every conceivable way: documentation, modelling, requirements specification, project staffing, budgeting and so on. For example, there has to be a solid design model as a foundation upon which the information model and data model are defined. Information and data will change over time, while the foundations of the business and the business rules are less often subject to change. An example of the difficulties is the entity "client". Often, different departments within the same organisation have different views on what a client is. For the design model to be solid, it should only take into consideration the relations to other entities. Skipping this in order to cut a corner will create problems further on.

2. Develop and adopt a method for how you manage new content
In order to keep the information model and the data model as solid as possible, you should develop a method for how to take new content into the archive. From the example above, how do we manage the entity "client" for a new system that has a different view on what a client is than any of our other systems? Well, with a solid design model and a method for defining the properties of the "new" client as part of our already developed information model, the new system should be easier to fit into your solution. The thing to realize here, however, is that there is no perfect model, only ways of putting square pegs into round holes without using too much force. But remember, the design model has to be solid.

3. Implement a cross functional information process and a process owner
Since the archiving process begins when content is created by or received by an organisation, the archive function has to be part of a function that looks at information throughout its lifecycle and not through organisational constructs. The archive function has to be regarded as a keystone in the information lifecycle and the information process. Today, however, archives are often regarded as what happens to information when the business doesn't need it anymore. Archiving issues are hence often considered when it's too late.

The Real Challenge of Content Integration

2:40:00 PM Posted by Oscar Berg, No comments

Many integration problems cannot be solved by designing computer algorithms. For structural dissonance this is possible, but not if there is some kind of semantic dissonance between (what appears to be) the same content in two different content sources. The term might be the same, but its meaning might differ. Dissonance typically occurs:

  • When translating real world observations or abstract concepts to information (creating the message)

  • When encoding the information into digital content such as text and images (creating the content)

  • When transferring content between one source and another where information models do not match (integrating content)

Experience shows that semantic dissonance is common in most enterprises (a silo effect). Still, many choose to put their trust and hopes in integration software, believing that IT alone will solve their integration problems. The reality is, of course, that any semantic dissonance needs to be resolved first, before content sources are integrated technology-wise.
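A tiny, entirely hypothetical illustration of why integration software alone cannot fix this: two sources may share a field name while meaning different things by it, so the semantics must be reconciled before any technical merge makes sense.

```python
# Hypothetical example: both sources use the field name "price",
# but one price excludes VAT and the other includes it.
source_a = {"sku": "P-1001", "price": 100.00}   # price excluding 25% VAT
source_b = {"sku": "P-1001", "price": 125.00}   # price including VAT

# A purely technical merge sees a conflict (or, worse, silently picks one):
assert source_a["price"] != source_b["price"]

# Resolving the semantics first makes the two records comparable:
VAT = 0.25
normalized_a = source_a["price"] * (1 + VAT)
assert normalized_a == source_b["price"]
```

No integration tool can discover on its own that the two identical field names carry different meanings; that knowledge has to come from the people who own the content.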

Much more is to be said about this subject, and I will return with discussions and real-world examples of how information modeling can help enterprises overcome semantic dissonance.

Thursday, November 8, 2007

Interesting Readings

9:22:00 AM Posted by Oscar Berg, No comments
"IBM, Microsoft, SAP lag behind on Web 2.0" by Jon Brodkin, NetworkWorld.com:

"IBM, Microsoft and SAP are taking a charge at the business Web 2.0 market, but the big vendors still lag behind smaller rivals who have developed far more innovative technology with quicker release cycles, according to a Forrester analyst"

"Half-Baked or Mashed: Is Mixing Enterprise IT And The Internet A Recipe For Disaster?" by Andy Dornan, InformationWeek:

"Enterprise mashup tools are the long tail of SOA, letting ordinary employees build applications that aren't on IT's radar screen. But what about the risks?"

"Microsoft Expands Enterprise Search Offering, Introduces Search Server 2008 Express", Microsoft PressPass:

"In delivering Search Server Express, Microsoft has taken the enterprise class search capabilities of Microsoft Office SharePoint Server 2007 and made them available as a stand-alone server for free"

"AMC 2007: 'Don't Put Content on Lockdown' Google Exec Urges Mag Eds" by Noah, FishBowlNY:

"Search is a proxy for a brand's vitality," Google media platforms director Eileen Naughton asserted to a roomful of magazine executives at AMC today in Boca Raton. "Search is a core consumer behavior that defines our times; it's an activity at the essence of what it is to be human," she said in her keynote talk entitled "Insights From Google"

"Make sure search engines can find your stuff," she urged. "Don't put it behind paid walls, don't put it on lockdown. Tag your story archives, photos, video clips and make them freely available."

Sunday, November 4, 2007

Basic Social Services for basic enterprise social software

9:54:00 PM Posted by Oscar Berg, No comments
Basic Content Services (BCS) focus on the most basic services that an enterprise needs to manage its content, such as library services (version control, check-in/out), access control, metadata management and search. BCS can be seen as a less complex and less expensive alternative to ECM suites, which are typically too expensive to be rolled out to all employees. A similar approach could be interesting for enterprise social software.

There are a number of social services that are absolutely necessary for efficient communication and collaboration and which must be considered basic:
  • Presence - Presence information communicates the availability of an individual – whether the individual currently can or wants to communicate (synchronously) with other individuals. A presence service enables authorized users to create, modify and/or view the presence information of an individual.

  • Profiles - A profile is information associated with a specific individual user or group. The information in the profile is supplied and managed by the user (or group). A profile service enables authorized users to create, modify and/or view profiles.

  • Social Networks - A social network is a set of relationships (ties) between individuals and/or groups (nodes). The relationships (ties) can be of various kinds. A social network service enables authorized users to create, modify and/or view nodes and ties in a social network.
These services should ideally be a part of the enterprise IT infrastructure and possible to integrate in any application. The services would typically build upon and extend the information about the users and groups that are stored in a central directory such as Microsoft’s Active Directory.
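The three basic services above can be sketched as minimal interfaces. This is a hedged illustration only; all class and method names are hypothetical, not an existing enterprise API, and real implementations would of course add authorization, persistence and directory integration.

```python
# Minimal sketch of the three basic social services described above.
# All names are hypothetical illustrations, not a real product API.
from dataclasses import dataclass, field

@dataclass
class PresenceService:
    """Create, modify and view the presence information of individuals."""
    _status: dict = field(default_factory=dict)

    def set_presence(self, user: str, status: str) -> None:
        self._status[user] = status  # e.g. "available", "busy", "offline"

    def get_presence(self, user: str) -> str:
        return self._status.get(user, "unknown")

@dataclass
class ProfileService:
    """Profile information supplied and managed by the user or group."""
    _profiles: dict = field(default_factory=dict)

    def update_profile(self, user: str, **attributes) -> None:
        self._profiles.setdefault(user, {}).update(attributes)

    def get_profile(self, user: str) -> dict:
        return self._profiles.get(user, {})

@dataclass
class SocialNetworkService:
    """Relationships (ties) between individuals and/or groups (nodes)."""
    _ties: set = field(default_factory=set)

    def add_tie(self, a: str, b: str) -> None:
        self._ties.add(frozenset((a, b)))

    def contacts(self, user: str) -> set:
        return {p for tie in self._ties if user in tie for p in tie} - {user}

presence = PresenceService()
presence.set_presence("alice", "available")
profiles = ProfileService()
profiles.update_profile("alice", title="Information Architect")
network = SocialNetworkService()
network.add_tie("alice", "bob")
assert network.contacts("alice") == {"bob"}
```

The point of keeping the interfaces this small is that, as infrastructure services, they could be consumed by any application rather than living inside one monolithic suite.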

Friday, November 2, 2007

Basic Content Services And Web 2.0

Consumers today are the first to get cutting-edge technology. Enterprise technology adoption is therefore increasingly driven by people outside the IT department. Changes in social practices also call for a more agile way of exploiting new technology.

Basic Content Services (BCS) and Web 2.0 technologies follow these trends. They are both responses to a general call for improved communication and collaboration but are at the same time simpler in functionality, easier to deploy and available at lower cost. But where do they differ?

Seen from an enterprise perspective they target slightly different types of workers, namely:

  • The information worker: Characterized by ad hoc or semi-structured team collaboration
  • The social worker: Characterized by personal relationships, knowledge transfer and participatory communities

BCS meets the needs of low-end and enterprise-wide Document Management, meaning support for the creation, management (library services) and sharing of office documents by information workers. The documents produced are often relatively static and long-lived, intended for consumption within a controlled context.

Web 2.0 technologies for user-generated content (e.g. wikis and blogs) and metadata (e.g. social tagging and bookmarks), along with social interaction (e.g. profiles and social networks), meet the needs of low-end and enterprise-wide Content and Knowledge Management, meaning the creation, management (aggregation) and sharing of content snippets by social workers. The content produced is relatively more interactive and short-lived, intended for consumption and reuse in an open context.

These two areas will be integrated in the near future, and their combined service offerings will be a threat to established monolithic and top-down oriented Enterprise Content Management systems.