Reliability – an argument against using Web 2.0 services in learning? Probably not.

When you talk to anyone in an “organisational” position (e.g. IT, or perhaps some leadership positions) within a university about using external “Web 2.0” tools to support student learning, one of the first complaints raised is

How can we ensure its reliability, its availability? Do we have as much control as if we own and manage the service on our servers? Will they be as reliable and available?

My immediate response has been, “Why would we want to limit them to such low levels of service?”. Of course, it’s a little tongue in cheek and, given my reputation in certain circles, not one destined to win friends and influence people. There is, however, an important point underpinning the snide, flippant comment.

Just how reliable and available are the services owned and operated by universities? My anecdotal feeling is that they are not that reliable or available.

What about web 2.0 tools?

Paul McNamara has a post titled “Social network sites vary greatly on availability, Pingdom finds” that points to a Social network downtime in 2008 PDF report from Pingdom. The report discusses uptime for 15 social network tools.

A quick summary of some of the comments from the report

  • Only 5 social networks managed an overall uptime of 99.9% or better: Facebook (99.92%), MySpace (99.94%), Classmates.com (99.95%), Xanga (99.95%) and Imeem (99.95%).
  • Twitter – 99.04% uptime
  • LinkedIn – 99.48% uptime
  • Friendster – 99.5% uptime
  • Reunion.com – 99.52% uptime
  • Bebo – 99.56% uptime
  • Hi5 – 99.75% uptime
  • Windows Live Spaces – 99.81% uptime
  • LiveJournal – 99.82% uptime
  • Last.fm – 99.86% uptime
  • Orkut – 99.87% uptime
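To put those percentages in perspective, here is a quick sketch (plain arithmetic, not taken from the Pingdom report itself) converting an uptime figure into hours of downtime per year:

```python
def annual_downtime_hours(uptime_percent: float) -> float:
    """Convert an uptime percentage into hours of downtime per year."""
    hours_per_year = 365 * 24  # 8760, ignoring leap years
    return (1 - uptime_percent / 100) * hours_per_year

# The gap between 99.04% and 99.95% is bigger than it looks:
for name, uptime in [("Twitter", 99.04), ("Facebook", 99.92), ("Xanga", 99.95)]:
    print(f"{name}: about {annual_downtime_hours(uptime):.0f} hours of downtime per year")
```

Twitter’s 99.04% works out to roughly 84 hours a year; the “three nines” services lose well under 10.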

Is it then a problem?

The best you can draw from this is that if you’re using one of the “big” social network tools then you are probably not going to have too much of a problem. In fact, I’d tend to think you’re likely to have much more uptime than you would with a similar institutional system.

The social network tool is also going to provide you with a number of additional advantages over an institutionally owned and operated system. These include:

  • A much larger user population, which is very important for networking tools.
  • Longer hours of support.
    I know that my institution struggles to provide 10 or 12 x 5 support. Most big social network sites would do at least 10 or 12 x 7 and probably 24×7.
  • Better support
    Most institutional support folk are going to be stretched trying to maintain a broad array of different systems. Simply because of this spread, their knowledge is going to be weak in some areas. The support for a social network system is targeted at that system; they should know it inside and out. Plus, the larger user population is also going to be a help. Most of the help I’ve received using WordPress.com has come from users of the service, not the official support.
  • Better service
    The design and development resources of the social network tool are also targeted at that tool. They aim to be the best they can, their livelihood is dependent upon it in a way that university-based IT centres don’t have to worry about.

The gulf between users and IT departments

Apparently Accenture have discovered “user-determined computing” and associated issues.

The definition goes something like this

Today, home technology has outpaced enterprise technology, leaving employees frustrated by the inadequacy of the technology they use at work. As a result, employees are demanding more because of their ever-increasing familiarity and comfort level with technology. It’s an emerging phenomenon Accenture has called “user-determined computing.”

This is something I’ve been observing for a number of years and am currently struggling with in terms of my new job, in a couple of different ways. In particular, I’m trying to figure out a way to move forward. In the following I’m going to try and think/comment about the following

  • Even though “Web 2.0 stuff” seems to be bringing this problem to the fore, it’s not new.
  • The gulf that exists between the different ends of this argument and the tension between them.
  • Question whether or not this is really a technology problem.
  • Ponder whether this is a problem that’s limited only to IT departments.

It’s not new

This problem, or aspects of it, have been discussed in a number of places. For example, CIO magazine has a collection of articles it aligns with this issue (Though having re-read them, I’m not sure how well some of them connect).

The third one seems the most complete in its coverage of this topic. I highly recommend a read.

The gulf

Other earlier work has suggested that the fundamental problem is that there is a gap or gulf, in some cases a yawning chasm, between the users’ needs and what’s provided by the IT department.

One of the CIO articles above puts it this way

And that disconnect is fundamental. Users want IT to be responsive to their individual needs and to make them more productive. CIOs want IT to be reliable, secure, scalable and compliant with an ever increasing number of government regulations. Consequently, when corporate IT designs and provides an IT system, manageability usually comes first, the user’s experience second. But the shadow IT department doesn’t give a hoot about manageability and provides its users with ways to end-run corporate IT when the interests of the two groups do not coincide.

One of the key points here is that the disconnect is fundamental. The solution is not a minor improvement to how the IT department works. To some extent the problem is so fundamental that people’s mindsets need to change.

Is this a technology problem?

Can this change? Not sure it can, at least in the organisations where all that is IT is to be solved by the IT department. Such a department, especially at the management level, is manned (and it’s usually men, at least for now) by people who have lived within IT departments and succeeded, so that they now reside at the top. In most organisations the IT folk retain final say on “technical” questions (which really aren’t technical questions) because of the ignorance and fear of senior management about “technical” questions. It’s too easy for IT folk to say “you can’t do that” and for senior management not to have a clue that it is a load of bollocks.

Of course I should take my own advice: look for incompetence before you go paranoid. Senior IT folk, as with most people, will see the problem in the same way they have always seen it. They will always seek to solve it with solutions they’ve used before, because that’s the nature of the problem they see. One of the “technical” terms for this is inattentional blindness.

A fundamental change in approach is not likely. Dave Snowden suggests that the necessary, but not sufficient, conditions for innovation are starvation, pressure and perspective shift. Without that perspective shift, the gulf will continue to exist.

It’s not limited to IT

You can see evidence of this gulf in any relationship between “users” and a service group within an organisation (e.g. finance, Human Resources, quality assurance, curriculum design etc.) – especially when the service group is a profession. The service group becomes so enamoured of its own problems, due to pressure from the organisation, the troubles created by the “users” and the distance (physical, temporal, social, mental etc.) between the service group and the “users”, that it develops its own language, its own processes and tasks, and starts to lose sight of the organisation’s core business.

The most obvious end result of the gulf is when the service department starts to think it knows best. Rather than respond to the needs, perceived and otherwise, of the “users”, the service department works on what it considers best. Generally something that emphasises the importance of the service division and increases its funding and importance within the organisation. You can see this sort of thing all the time with people who are meant to advise academics about how to improve their learning and teaching.

IT is just the easiest and most obvious target for this because IT is now a core part of life for most professions, most organisations continue to see it as overhead to be minimised rather than an investment to be maximised, and the on-going development of IT is changing the paradigm for IT departments.

From scarcity to over abundance – paradigm change for IT departments (and others)

Nothing all that new in this post, at least not that others haven’t talked about previously. But writing this helps me think about a few things.

Paradigms, good and bad

A paradigm can be/has been defined as a particular collection of beliefs and ways of seeing the world. Perhaps as the series of high-level abstractions which a particular community creates to enable very quick communication. For this purpose a common paradigm/collection of abstractions is incredibly useful, especially within a discipline. It provides members of a community spread throughout a wide geographic area with a shared language which they can use.

It also has a down side, paradigm paralysis. The high level abstractions, the ways of seeing the world, become so ingrained that members of that community are unable to see outside of that paradigm. A good example is the longitude problem where established experts ignored an innovation from a non-expert because it fell outside of their paradigm, their way of looking at the world.

Based on my previous posts it is no great surprise to find out that I think that there is currently a similar problem going on with the practice of IT provision within organisations.

What’s changed

The paradigm around organisational IT provision arose within a context that was very different. A context that has existed for quite some time, but is now undergoing a significant shift caused by (at least) three factors

  1. The rise of really cheap, almost ubiquitous computer hardware.
  2. The rise of cheap (sometimes free), easy to use software.
  3. The spread of computer literacy beyond the high priests of IT.

The major change is that what was once scarce and had to be managed as a scarce resource (hardware, software and expertise) is now available in abundance.

Hardware

From the 50s until recently, hardware was really, really expensive, generally under-powered and consequently had to be protected and managed. For example, in the late 1960s in the USA there weren’t too many human endeavours that would have had more available computing power than the Apollo 11 moon landing. And yet, in modern terms, it was a pitifully under-resourced enterprise.

Mission control, the folk on earth responsible for controlling/supporting the flight, had access to computing power equivalent to (probably less than) the MacBook Pro I’m writing this blog entry with. The lunar module, the bit that took the astronauts from lunar orbit down to the surface and back again, is said to have had less power than the digital watch I am currently wearing.

Moore’s law means that computing power increases exponentially, with a corresponding fall in price.
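As a rough illustration of that exponential growth (assuming the commonly quoted two-year doubling period, which is itself debated):

```python
def moore_growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """How many times capacity multiplies over a span of years, assuming
    one doubling every `doubling_period` years (one common reading of
    Moore's law)."""
    return 2 ** (years / doubling_period)

# From Apollo 11 (1969) to when this was written (roughly 2008):
print(f"~{moore_growth_factor(2008 - 1969):,.0f}x")  # ~741,455x
```

Even if the doubling period is off by a factor of two, the conclusion is the same: what was scarce then is absurdly abundant now.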

Software

Software has traditionally been something you had to purchase. Originally, only from the manufacturer of the hardware you used. Then software vendors arose, as hardware became more prevalent. Then there was public domain software, open source software and recently Web 2.0 software.

Not only was there more software available in these alternate approaches, this software became easier to use. There are at least half a dozen free blog services and a similar number of email services available on the Web. All offering a better user experience than similar services provided by organisations.

Knowledge and literacy

The primitive nature of the “old” computers meant that they were very difficult to program and support. But since their introduction, the ability to maintain and manipulate computers in order to achieve something useful has become increasingly easy. Originally, it was only the academics, scientists and engineers who were designing computers who could maintain and manipulate them. Eventually a profession arose around the maintenance and manipulation of computers. As the evolution continued, teenage boys of a certain social grouping became extremely proficient, through to today, when increasing numbers of people (but still not the majority) are able to maintain and manipulate computers to achieve their ends.

At the same time the spread of computers meant that more and more children grew up with computers. A number of the “uber-nerds” that grew up in the 60s and 70s had parents who worked in industries that enabled the nascent uber-nerds to access computers. To grow up with them. Today it is increasingly rare for anyone not to grow up with some familiarity with technology.

For example, Africa has the fastest growing adoption rate of mobile phones in the world. I recently read that the diffusion of mobile phones in South Africa was put at 98%.

Yes, there is still a place for professionals. But the increasing power and ease of use of computers means that their place is increasingly not about providing specialised services for a particular organisation, but instead providing generalised platforms which the increasingly informed general public can manipulate and use without the need for IT.

For example, there’s an increasingly limited need (not quite no need) for an organisation to provide an email service when there are numerous free email services that are generally more reliable, more accessible and provide greater functionality than internal organisational services.

From scarcity to abundance

The paradigm of traditional IT governance etc is based around the idea that hardware, software and literacy are scarce. This is no longer the case. All are abundant. This implies that new approaches are possible, perhaps even desirable and necessary.

This isn’t something that just applies to IT departments. The line of work I’m in, broadly speaking “e-learning”, is also influenced by this idea. The requirement for universities to provide learning management systems is becoming increasingly questionable, especially if you believe this change from scarcity to abundance suggests the need for a paradigm change.

The question for me is what will the new paradigm be? What problems will it create that need to be addressed? Not just the problems caused by an old paradigm battling a new paradigm, the problems that the new paradigm will have. What shape will the new paradigm take? How can organisations make use of this change?

Some initial thoughts from others – better than free.

A related question is what impact will this have on the design of learning and teaching?

Dealing with “users”, freedom and shadow systems

Apparently Accenture have discovered “user-determined computing” and associated issues.

The definition goes something like this

Today, home technology has outpaced enterprise technology, leaving employees frustrated by the inadequacy of the technology they use at work. As a result, employees are demanding more because of their ever-increasing familiarity and comfort level with technology. It’s an emerging phenomenon Accenture has called “user-determined computing.”

It’s not new

This problem, or aspects of it, have been discussed in a number of places. For example, CIO magazine has a collection of articles it aligns with this issue

This has connections to the literature on workarounds and shadow systems: practices by which people within organisations work around the official organisational systems or hierarchies and do things their own way.

This is not a problem limited to IT departments. I work within a group responsible for curriculum design, e-learning and materials development at a University. We’re a provider of services for academic staff. Those staff can and do workaround the services we provide.

The question is, what should we do? How should we handle this?

Reactions from IT folk

I find it interesting that a common knee-jerk reaction from IT folk tends towards the negative and/or aggressive. Check out some of the comments on this blog post or the “Time to rethink your relationship with end-users” CIO article.

This is often seen in the official reaction of IT departments to shadow systems. “SHUT THEM DOWN!!!!”. It’s a discourse that has been circulating at my institution in recent times.

Having been a creator and heavy user of shadow systems it’s not an approach which I believe is productive. In fact, some colleagues and I have argued that there is a much better approach. From the abstract

Results of the analysis indicate that shadow systems may be useful indicators of a range of problems with enterprise system implementation. It appears that close examination of shadow systems may help both practitioners and researchers improve enterprise system implementation and evolution.

The gulf

The users who know too much CIO article puts it this way

And that disconnect is fundamental. Users want IT to be responsive to their individual needs and to make them more productive. CIOs want IT to be reliable, secure, scalable and compliant with an ever increasing number of government regulations. Consequently, when corporate IT designs and provides an IT system, manageability usually comes first, the user’s experience second. But the shadow IT department doesn’t give a hoot about manageability and provides its users with ways to end-run corporate IT when the interests of the two groups do not coincide.

Other earlier work has suggested that this gap or gulf, in some cases a yawning chasm, is created by a number of different factors.

Perhaps it is the fundamental nature of some of the factors that create the gap which contribute to the negative reactions. The perspectives creating the gap are so fundamental that the people holding them never question them. They don’t see that their view is actually counter-productive (in some situations) or that there are alternatives. They simply can’t understand the apparent stupidity of the alternate perspective and the hugely negative ramifications.

Super-rational versus complexity

One of the fundamental outlooks which contribute to this gap is that most IT, and most organisations, are based on the ideal of top-down design (teleological design). I’ve written about this previously.

That previous writing includes one of the more interesting characterisations of the difference in these two fundamentally different perspectives. I’ve included it as an mp3. It’s by Dave Snowden, and is an excerpt from a presentation he gave in Helsinki on sense-making and strategy. In the excerpt he describes two approaches to organising a child’s birthday party. One based on traditional top-down approaches and another based on complexity.

What should we do?

This is a real problem which we have to address. How do we do it?

The users who know too much CIO article suggests the following principles as starting points

  1. Find out how people really work
    This connects with ideas in our earlier articles. Look at the shadow systems people are using and understand the factors leading them to use them. We need to know much more about how and why staff are doing curriculum design, e-learning etc.
  2. Say yes to evolution
    On reading the article I wonder if “don’t say no” might not be a better name for this principle. One of the nice quotes in the article is “No one will jump through hoops. They’ll go around them.”. We have to make it easy and safe for folk to do their own thing. Not just understand what they are doing, but allow them to evolve and do different things and keep an eye on why, what and how they do it.
  3. Ask yourself if the threat is real
    There is often a reason why IT believes a shadow system is bad – security, inefficiency etc. This principle suggests spending a lot of time considering whether or not this is really a big problem. In our line of work that might be equated to telling an academic that a particular learning/teaching approach is less than good.

    Another quote from the article: “When a CIO… is setting himself up as a tin idol, a moral arbiter. That’s a guaranteed way to antagonize users. And that’s never a good idea.”

  4. Enforce rules, don’t make them.
    Some recent local experience reinforces the importance of this. It’s not the support group saying no. It’s the rules that were created by the appropriate folk within the business. As an addition to this I would suggest: “Make sure everyone knows who made the rules.”.
  5. Be invisible.
    This principle relates to the “important things” a service division should do. For example, an IT department is responsible for ensuring security of important data. The processes used to do that should be invisible. It shouldn’t cause the users grief in order to be secure. It should just happen.
  6. Messy but fertile beats neat but sterile.
    It’s not included in the article as one of the principles, but it is used as the closing section and I think it deserves to be included. Too much of what goes on in organisations is based on the idea of having tidy diagrams and one way to do something, of being neat and sterile. “Messiness isn’t as bad as stagnation” and “If you want to be an innovator and leverage IT to get a competitive advantage, there has to be some controlled chaos.”

Another approach

Nicholas Carr argues for one response in terms of IT departments.

SaaS, Consumer products, shadow systems and e-learning

In a recent post I commented on the trend around how consumer driven computing is driving the development of “software as a service”. In particular, pointing to an article from the Economist that talked about how Arizona State University was using Google Apps to host email accounts for their students.

What I want to do here is to link this trend back to individual academics and how they may use these services to work around institutional structures and approaches to develop shadow systems.

A quote I recently came across from Somekh (2004) summarises why, and links nicely with some work some colleagues and I have done previously.

But activity theory goes further to explain the way that institutional structures within national systems, with functions as diverse as education and the postal service (Engeström & Escalante, 1997), construct and constrain the interrelationship of humans and ICTs in mediated activity.

In that work we built on Behrens and Sedera (2004) and examined how a particular shadow system rose and fell (it has since risen a bit further, but may well fall again). Behrens and Sedera (2004) generated a small framework which explains the factors that cause shadow systems to be created. It’s shown in the following figure. We attempted to show that, rather than being objects of scorn, things to destroy, shadow systems are actually important indicators, canaries in the coal mine, demonstrating that there is a gap between user requirements and the service being provided.

Factors causing shadow systems in an ERP context

Given the diversity in universities there will always be a gap between what academics want to do with technology and what can be provided by the support divisions at universities. More importantly, there may well be an even larger gap between what universities provide and what students want. Particularly given the recent rhetoric around the net generation.

So, what’s new? That gap has always been there.

The difference is the growing influence of consumer computing, software as a service, websites/services such as YouTube, Google Video and the rest of the Web 2.0 avalanche, the increasing ease-of-use of personal computers and applications to create and manage multimedia resources and the growing capability of people to use these tools.

This is a significant change. In the early days of the Internet it was the universities that led the charge, that developed the innovations. Jones and Johnson-Yale (2005) give a brief overview of this in their introduction. An interesting aspect is that initially it was the staff driving the innovation; they then talk about the students going out and driving it further (e.g. Shawn Fanning and Napster, Brin and Page at Google).

John Pederson expressed this in a different, graphical form which was picked up by Scott McLeod which includes a Jack Welch quote.

All these changes increase the “resource” and “support” intervening conditions in the above figure. i.e. they make it easier for the staff and students to “fill the gap” between university provided IT services and what they want to do. University IT is no longer the only shop in town. Increasingly academics and students will ignore University IT services.

This is a change that has to be responded to. Initially, the university hierarchy will probably see this change as a “threat” and counter it by banning use of services. I can see university technical staff playing around with routing tables and firewalls to prevent use of sites. This will be counter-productive. Obviously not an option that I would suggest.

Instead, university IT and related services should recognise that this change is coming and, instead of banning it, get on board. Recognise the potential benefits that this change may bring and figure out how they will operate in this new world.

This is especially important as this change may offer an opportunity to address the problem of funding, which is rated #2 in the Top 10 issues facing university IT support.

Potential University Responses

There are numerous approaches a university could take in responding to shadow systems and this particular issue. The following is a start of a spectrum of options:

  • Outlaw it
    All such systems are bad and should be crushed.
  • Ignore it.
    Shadow system, what shadow system?
  • Piecemeal adoption
    Some sections adopt and use it.
  • Adopt it
    Investigate and potentially adopt and support it centrally.

The missing ground rule for Enterprise 2.0

In his MIT Sloan Management Review article, “Enterprise 2.0: The Dawn of Emergent Collaboration”, Andrew McAfee cites two “intelligent ground rules” that people building Enterprise 2.0 technologies are following

  1. Making sure the applications are easy to use.
  2. Avoiding any preconceived notions about categories or structure by building tools that let these aspects emerge.

I see some connections between these ground rules and the concept of rapid incrementalism that has been talked about by John Seely Brown. In this post John Hagel talks briefly about rapid incrementalism as one of the responses to the “IT Doesn’t Matter” discussion kicked off by Nick Carr.

The two Johns’ position is that the economic impact of IT comes from incremental innovations. Rapid incrementalism “enhances learning potential and creates opportunities for further innovations”.

This sounds very much like the type of emergence the second ground rule is aiming for.

The design theory I have formulated from Webfuse includes a heavy emphasis on emergence. But it also combines it with aspects of Rogers’ diffusion theory to address one of the challenges to Enterprise 2.0 identified by Andrew McAfee

The first is that busy knowledge workers won’t use the new technologies, despite training and prodding. Most people who use the Internet today aren’t bloggers, wikipedians or taggers. They don’t help produce the platform — they just use it. Will the situation be any different on company intranets? It’s simply too soon to tell.

Actually, I don’t think it is too soon to tell. The “build it and they will come” approach doesn’t work with knowledge workers. There’s got to be something in it for them. This is the 9X email problem which Andrew McAfee has talked about.

If you draw on the information systems literature you come to TAM (I talked about this before), which posits two main factors influencing adoption of technology

  1. Perceived ease of use
  2. Perceived usefulness

Perceived usefulness is defined as

The degree to which a person believes that using a particular system would enhance his or her job performance.

So I think there’s a ground rule missing for Enterprise 2.0 applications. They must not only be easy to use, they must be useful.

In an Enterprise 2.0 world I believe the role of an organisation’s IT/IS people must change from supporting the technology or specific business processes to continually being on the lookout for how to leverage the technology to increase the perceived usefulness of the systems.

This is how you encourage adoption and use by knowledge workers. By creating a trust that the information systems for an organisation are being developed to be useful to them.
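As a toy illustration only (the weights and numbers below are invented for this sketch, not taken from the TAM literature), you can think of the two factors as combining into an overall adoption score, with usefulness carrying more weight:

```python
def adoption_intention(perceived_usefulness: float,
                       perceived_ease_of_use: float,
                       w_usefulness: float = 0.6,
                       w_ease: float = 0.4) -> float:
    """Toy TAM-style score on a 0-1 scale. The weights are illustrative;
    TAM itself only suggests that usefulness tends to dominate."""
    return w_usefulness * perceived_usefulness + w_ease * perceived_ease_of_use

# Easy but useless loses to useful but clunkier:
print(round(adoption_intention(0.2, 0.9), 2))  # 0.48
print(round(adoption_intention(0.9, 0.5), 2))  # 0.74
```

The point of the sketch is the missing ground rule: polishing ease of use alone only moves the smaller term.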

When will enterprises truly embrace Enterprise 2.0 applications?

The following reflection (you may prefer drivel, but each to their own) has been sparked by a comment by Susan Scrupski on a previous post. It’s also been driven along by some of the reading I’ve been doing in the blogosphere in the last week or so.

The whole point is to attempt to answer the question: “When will enterprises truly embrace Enterprise 2.0 applications?”

Susan says she’s been struggling with it so I’m guessing that the following is going to contain more of my own struggling and not that many answers.

The factors which I think need to be considered in answering this include:

  • The trouble with generalisations.
  • Who makes the decisions in organisations.
  • The 9X email problem – for IT folk.
  • If it ain’t broke, don’t fix it
  • The 9X email problem – for users.
  • TAM, diffusion and perceived effectiveness and perceived ease of use.
  • Shadow systems – the user revolution.

The trouble with generalisations

I’m increasingly thinking that many of the problems of the IT industry are due to the tendency of IT folk to prefer abstractions and generalisations. From Wikipedia

In computer science, abstraction is a mechanism and practice to reduce and factor out details so that one can focus on a few concepts at a time.

You can see this in the old computer science adage: “Any problem can be solved by adding a layer of abstraction”.
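A trivial sketch of what that adage means in practice (all the names here are invented for the example): callers get one high-level operation, and the messy details are factored out behind it.

```python
def send_message(recipient: str, body: str) -> str:
    """The abstraction: one high-level operation for callers to use."""
    return _format_and_route(recipient, body)

def _format_and_route(recipient: str, body: str) -> str:
    # The factored-out detail: encoding, routing, retries etc. would live
    # below the abstraction, invisible to callers.
    return f"to={recipient}|{body.strip()}"

print(send_message("kim", "  hello "))  # to=kim|hello
```

Useful for focus, but it is exactly this habit that lets the details (and the users living among them) drift out of view.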

Enterprises are incredibly diverse. The characteristics of these different enterprises make a single answer to the above question difficult, if not impossible. It might be better to ask what factors or characteristics will encourage enterprises to or hinder enterprises from embracing Enterprise 2.0 applications.

Beyond that point, most of my following ruminations are based on the experience I have with enterprises. Which is mostly Universities. So take any generalisations I make below with a grain of salt.

Who makes the decisions in organisations

Kathy Sierra has a recent post about knocking the exuberance out of employees – about what organisations prefer in their employees and, consequently, what organisations tend to encourage in (and do to) their employees.

It really strikes a chord with my experience. Many employees, especially IT employees, are robots. Frightened to rock the boat, to break with tradition, think outside the box and many more tired cliches. IT employees are especially prone to this as IT is continually seen as overhead, a cost to be minimised rather than a strategic advantage to be maximised – more on this below.

The next obvious step beyond Kathy’s post is that, if organisations do this to their employees, then which employees rise to management positions? The most successful robots. The employees most indoctrinated into the current status quo, the ones least likely to look for something new and challenging.

So, in many organisations you end up with senior and line managers who actively battle against new ideas like Enterprise 2.0.

The 9X Email problem – IT staff

Andrew McAfee has recently blogged about the 9X email problem, in which he suggests that Enterprise 2.0 applications will need to be 9 times better than the applications they are replacing. Otherwise they will not be adopted.
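As I understand the Gourville argument McAfee draws on, the 9X figure is simple arithmetic: users overweight what they already have by roughly a factor of three, while developers overweight their new product by roughly a factor of three.

```python
# Back-of-envelope behind the "9X" figure (my reading of the argument,
# not McAfee's exact numbers):
user_endowment_bias = 3      # users inflate the value of the incumbent
developer_overconfidence = 3 # developers inflate the value of the replacement
mismatch = user_endowment_bias * developer_overconfidence
print(f"net mismatch: ~{mismatch}x")  # net mismatch: ~9x
```

The multiplication matters: neither side is being nine times unreasonable, yet the gap between their valuations is still ninefold.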

This doesn’t apply to just the users. It also applies to IT staff. In many organisations IT is a cost. IT staff are like the emergency services – Police, Fire, Ambulance/Paramedics – you only ever see them when there’s a problem. When the system is down, or you can’t figure out how to use it. Management want to minimise the problems caused by IT.

Within an enterprise this message is continually sent to IT staff via numerous direct and indirect mechanisms until the exuberance is entirely knocked out of them. To quote Kathy Sierra:

If we knock out their exuberance, we’ve also killed their desire to learn, grow, adapt, innovate, and care.

Implementing Enterprise 1.0 applications has, for many IT staff, been so difficult and traumatic that once a system is in they are very reluctant to let go and try something new. Especially when skills around Enterprise 1.0 applications are still highly prized in the job market.

If it ain’t broke, don’t fix it

There exists a huge distance between most IT staff and the people they are supporting. Standard best practice for IT service management in large organisations, such as ITIL, actively encourages greater separation between the people using the systems and the people actually responsible for supporting and fixing them.

Most such frameworks include some concept of a “user group” as a means of reducing this distance. But the tendency is that it’s Kathy Sierra’s “robots” who attend these meetings. The people who actually see the limitations and problems with Enterprise 1.0 applications realise that attending such meetings is a waste of time or, even worse, are actively kept away from them by the hierarchy.

All of this leads many IT staff to the conclusion that it ain’t broke, so why would we fix it?

Perhaps that sounds a bit like I’m trying to denigrate IT staff. That’s not the intent. The intent is to illustrate that the environment in which they work is such that this is the almost inevitable end result.

The 9X email problem – for users

Most other staff suffer the same problem. For many employees their job description is tied directly to a specific set of processes and activities. Any potential change is challenging and, at worst, could mean losing their job.

Wanda Orlikowski has done some seminal work in the information systems literature on the implementation of “old-style” groupware technology (e.g. Notes) into organisations. There’s a working paper of this work – it’s the one I’m aware of; I’m sure there are many others.

A quote from the working paper that is relevant to Enterprise 2.0

In those organizations where the premises underlying groupware are incongruent with those of the organization’s culture, policies, and reward systems, it is unlikely that effective cooperative computing will result without a change in structural properties. Such changes are difficult to accomplish, and usually meet with resistance. Without such changes, however, the existing structural elements of the firm will likely serve as significant barriers to the desired use of the technology.

It’s all about the perceptions of the users. Which are created and influenced by the culture, policies, reward systems and other characteristics of the specific organisation.

TAM, diffusion and perceived usefulness and perceived ease of use

Some work that a couple of colleagues and I have done over recent years has focused on the use of diffusion theory and the technology acceptance model (TAM).

Concentrating just on TAM: it suggests that how useful, and how easy to use, a user perceives an information system to be will directly influence their decision to make use of that system.

Both TAM and diffusion theory have resonances with the 9X email problem and other aspects I’ve mentioned above.

The propositions that arise from all of this include:

  • Employees will use Enterprise 2.0 applications when those applications are seen to be very useful and very easy to use within the organisational context to which the employees belong
  • It is unlikely that this will be enabled by existing IT staff or management
  • Organisations with exceptional IT staff or management may get there early
  • Such organisations will have to figure out how best to encourage perceptions of usefulness and ease of use within their organisational context
  • Many other organisations may face a user revolt.

Shadow Systems – the users revolt

In a recent post Susan has talked about Enterprise 2.0 being driven by a revolution – a self-help revolution.

Here at CQU I’ve been heavily involved in the development of shadow systems.
Shadow systems seem to be another phrase for Susan’s self-help systems.

My wife has done some research and publication around shadow systems and why they appear. Some colleagues and I drew on her work for a paper about the role which we think shadow systems play in the enterprise space.

A quick summary

  • Shadow systems are inevitable in any reasonably complex organisation
  • The combination of increasing change in business context, increasing ease-of-use of information systems development tools, and increasing computer literacy will only increase the prevalence of shadow systems.
  • The current view of shadow systems as something to be eliminated is short-sighted.
  • Shadow systems are a useful indicator of potential problems with the implementation of Enterprise Systems.
  • Shadow systems should be encouraged and enabled.

Which, at least to me, sounds very similar to the notion within Enterprise 2.0 of enabling emergent structures rather than imposed ones.