Thursday, March 25, 2010

A Philosophy of Cloud - understand the trends, leverage the technologies - cross post from Citrix Community

posted by Michael Harries

Understand the underlying trends, but leverage the technologies

Cloud computing is about making enterprise quality technology available to every company. It's about reducing costs. It's about increasing agility. It's about consumer ease of use. It's about making computing into a utility. It's about outsourcing. It's about dynamic capacity.

Well yes, but ...

Cloud computing is best understood not by talking about SaaS, PaaS, IaaS, or any other *aaS (footnote 1). Ultimately, cloud computing is a transformation driven by several key technological shifts, bringing about new ways to achieve old goals and enabling some new solution categories. It's a computing transformation that is only just beginning.

Cloud computing is best understood as a "phase shift" or industry-wide disruption, emerging from combinations of a small number of key technological factors. To explain, I'll use Professor W. Brian Arthur's framework from "The Nature of Technology". Arthur is a complexity economist noted for his seminal works "studying the impacts of positive feedback or increasing returns in economies, and how these increasing returns magnify small, random occurrences in the market place" (Wikipedia). These principles are especially significant in technology-intensive industries. Arthur's most recent work, "The Nature of Technology", looks at how technology changes over time; how it 'evolves'. Here's a small (paraphrased) portion of his framework.

  • Technology: includes physical devices, processes, and organizational structures. Technologies are overwhelmingly made up of other technologies as sub-components. The PC, for example, has various modular pieces of hardware that are technologies in and of themselves, such as power supplies, hard drives, graphics cards, CPUs, etc. Each of these can again be considered as having sub-components - less easily swapped, but sub-components (technologies) nonetheless that can be replaced at design time (such as a new bus architecture for a CPU).
  • Domains: Technologies can be thought of as occurring in groups or domains - electronic, quantum, etc. This is related to the leveraging of common phenomena, but also to the nature of human specialization. Self-similar types of components will be clustered. Examples of such specializations include the car mechanic, the computer engineer, or the robotics engineer. Generally speaking, technologies will be created from components that arise from skills and knowledge in a given domain.
  • Technological change: does not occur like animal evolution - with random changes to the chromosome or random combinations of pairs of technology characteristics. Rather, engineers and innovators aim to achieve some goal and to improve the state of the art. Much of this change comes about through steady improvement in the components making up a given technology.
  • Rebasing: Rapid change (sometimes called disruptive change) occurs when solutions for a given problem are radically improved by replacing components of a technology with components from an entirely different domain. One example from computing is the way the mainframe was replaced by the micro-computer: relatively quickly, the whole world of IT underwent a major change in cost, complexity, and capability. Similarly, the airplane was transformed by the jet engine, and the camera by digital recording.


Cloud Computing Drivers

Cloud computing fits the definition of rebasing described by Arthur. A number of major shifts (technology rebasing) are impacting the world of computing simultaneously. These include:

  1. The Internet
  2. Commoditization
  3. Virtualization (all flavors)

Let's look at these in turn.

  1. The Internet: we're still seeing the impact of the internet rippling through our computing landscape. Adapting to computing shifts can take time, and requires levels of familiarity and trust to be built, as well as an understanding of the broader opportunities that arise. Internet cost models, architectures, and business models are becoming understood, but are still rapidly evolving. The ubiquity of the Internet is the major driver for cloud computing.
  2. Commoditization: is part of the natural life cycle of technology. In this case, two major commoditization factors affect cloud.
    • IT as a business commodity: IT is no longer a business advantage, but a cost of entry. In other words, "IT doesn't matter" (Nicholas Carr (footnote 2)). Whether this is absolutely true is very arguable; however, there is no argument that business IT is a maturing industry. In such a world, competition moves toward doing the same old things, but more efficiently. For example, Software as a Service is, by and large, about addressing known jobs (such as sales force automation) with some changes in capability, flexibility, and price. (Of course there are exceptions.)
    • The x86/Windows monoculture: Intel and Microsoft have driven an extremely successful ecosystem of high-volume, low-cost chips and computers that has enabled an enormous part of our computing world today. "Monoculture produces great yields by utilizing plants' abilities to maximize growth under less pressure from other species and more uniform plant structure" (Wikipedia). Monocultures are also susceptible to shocks, so the right (or wrong) push can have a huge impact across the whole monoculture. That is to say, the ubiquity of a given architecture raises the potential impact of certain technological innovations.
  3. Virtualization: including desktop virtualization, presentation virtualization, application virtualization, storage virtualization, server virtualization, and many more. These technologies provide a way to separate one "level" of an environment from another; they provide an 'abstraction' layer (see the sketch just below this list). Abstraction and virtualization are a consistent theme through the history of computing - think about virtual memory, virtual users (in which each user appears to have a whole computer to themselves), virtual disks, and many, many more. While none of today's batch of virtualization technologies is strictly new, the fact that they are being rapidly adopted by the (monoculture) market is very telling.
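
To make the 'abstraction layer' idea concrete, here's a minimal sketch in Python. The VirtualDisk interface and its backing store are illustrative names of my own, not any real virtualization API; the point is only that callers program against the layer, so whatever sits beneath it can change freely.

```python
# Toy sketch: virtualization as an abstraction layer. Callers see only
# the VirtualDisk interface and never learn what actually stores the bytes.
from abc import ABC, abstractmethod


class VirtualDisk(ABC):
    @abstractmethod
    def read(self, block: int) -> bytes: ...

    @abstractmethod
    def write(self, block: int, data: bytes) -> None: ...


class InMemoryDisk(VirtualDisk):
    """One possible backing store; swap it without touching callers."""

    def __init__(self):
        self._blocks = {}

    def read(self, block: int) -> bytes:
        return self._blocks.get(block, b"")

    def write(self, block: int, data: bytes) -> None:
        self._blocks[block] = data


def copy_block(src: VirtualDisk, dst: VirtualDisk, block: int) -> None:
    # This caller is insulated from every storage decision below it.
    dst.write(block, src.read(block))


disk_a, disk_b = InMemoryDisk(), InMemoryDisk()
disk_a.write(0, b"hello")
copy_block(disk_a, disk_b, 0)
print(disk_b.read(0))  # b'hello'
```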

Together, these factors lead to a dramatic rebasing of our computing environment, at an unprecedented scale. While cloud computing is the term du jour, it is representative of a shift in all of our computing toolkits.

Cloud is an ephemeral term, representative of a particular point in a larger-scale rebasing of Information Technology to fully take advantage of the global Internet, a primed state of commoditization, and the broad adoption of a wide range of virtualization technologies.

Understand the underlying trends, but leverage the technologies

To anyone who makes it through this large blog post: I'd love to hear what you think,
Michael
__
Dr Michael Harries, Senior Director Strategy, Citrix Labs
@michaelharries

NOTES
1. This is seen in the overlapping and merging of these categories (SaaS, IaaS, PaaS), and in the fact that much is missed (Desktop, Storage, Network, etc.). It would be far more appropriate to move to terminology like 'software', 'platform', and 'infrastructure'.
2. It's not surprising that Nicholas Carr followed "Does IT Matter" with his cloud book "The Big Switch", given that the commoditization of IT is such a significant driver for "cloud".

Finally, here's a great quote on Complexity Theory from W. Brian Arthur:

"Complexity theory is really a movement of the sciences. Standard sciences tend to see the world as mechanistic. That sort of science puts things under a finer and finer microscope. In biology the investigations go from classifying organisms to functions of organisms, then organs themselves, then cells, and then organelles, right down to protein and enzymes, metabolic pathways, and DNA. This is finer and finer reductionist thinking. The movement that started complexity looks in the other direction. It's asking, how do things assemble themselves? How do patterns emerge from these interacting elements? Complexity is looking at interacting elements and asking how they form patterns and how the patterns unfold. It's important to point out that the patterns may never be finished. They're open-ended. In standard science this hit some things that most scientists have a negative reaction to. Science doesn't like perpetual novelty."

Extract from Wikipedia

This is relatively dense, but the message is simple. Everything we are hearing with the word 'cloud' in it is merely shorthand for a particular point in the evolution of the Internet, combined with a firestorm of virtualization technologies, giving us a rapid shift in what's practical for mainstream adoption of IT.

Posted via web from _technoist_

Ada Lovelace, the story - Prezi - for schools (and the rest of us) - created by @andragy - Happy #ald10

Posted via web from _technoist_

Hyperreality or cyberspace - real social media experts (philosophers) (essay from @andragy)

"Baudrillard's hyperreality is not a map at all, but a participative process that may shape us, or may allow us to shape our surroundings. There is no other controller. Governance is our own hands but is set to mass agendas. The cybernetic loop has closed on postcapitalist society and cyberspace."

This is dense, but well worth the read, and fascinating material to consider in greater depth. It provides a perspective well beyond the technologically utopian view that pervades so very much of our industry.

(I also love the perspective of our society as a cybernetic loop -- it fits very well with the dynamics of the IT industry, and provides an interesting counter-perspective on notions of 'the singularity', suggesting that many elements of 'merging with the machine' are already at hand.)

Posted via web from _technoist_

The Mediator Pattern for Desktop and App Virtualization » copy of my Citrix Community post

posted by Michael Harries

What does a pattern mean to you?

The modern software idea of Design Patterns comes from Christopher Alexander's work building out a language of patterns for building architecture. Its application to software was pioneered by the 'Gang of Four', and it has been a profound influence on software design approaches.

Pattern thinking has also been applied to enterprise architecture, and to broader notions of IT design and management. For example, "Architecture and Patterns for IT Service Management, Resource Planning, and Governance: Making Shoes for the Cobbler's Children" provides actionable patterns for treating IT like a business (and is highly recommended).

I would like to propose an important pattern that sits between enterprise architecture and application architecture for virtual desktops and virtual applications.

MOTIVATION: In today's era of rapid change and consumerization, all users have growing expectations of their enterprise computing systems. This raised expectation is generally portrayed as affecting only end users, but the reality is that these users include the business owner (aka the guy who pays the IT bills). This raised business expectation is much like when the microcomputer revolution hit mainframe/mini IT shops (and it is equally tempting for the business to 'go around' IT - at the time by purchasing PCs, now by moving to SaaS). We have an environment with ever-higher expectations of IT.

At the same time, there are ever more desktop and hand-held device types. We are amidst a transition from a period of relative end-point homogeneity (Windows, Windows, and Windows) to a period with multiple desktop environments (increasing numbers of Macs and Linux machines) in which most internet access will occur from a heterogeneous range of hand-held mobile devices. This is a new game, a new paradigm, and one that is not going away any time soon. (I've talked about this elsewhere.)

IT is facing the dual challenges of increased agility expectations and increased device heterogeneity. This is a problem because traditional end-point management leads to 'installation inertia'. That is, each carefully crafted desktop image or application install becomes yet another point that must be rebuilt with every application or architectural change. Like a poorly lubricated engine, the whole system is impaired and in many cases can all but grind to a halt. This friction costs you, and your company, money every day.

(Application virtualization and desktop virtualization in all its variants (e.g. FlexCast) act to reduce this friction. This is a key attribute of all types of virtualization - they reduce the friction, the inter-linkages, between layers.)

The Desktop and App Virtualization Mediator Pattern

THE PATTERN: So, to come back to the notion of patterns ... In traditional software pattern terminology, there is a pattern called a mediator. This pattern acts as a way to manage communication between large numbers of frequently changing objects. The idea is that, by having a module or program act as the connector between these different components, we avoid propagating changes from one object through all the others it touches: the changes are isolated in the mediator object.
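
Here's a minimal sketch of the mediator in Python. To echo the analogy in this post I've used hypothetical Device and App classes of my own invention; the GoF pattern itself is not tied to any particular names.

```python
# Minimal mediator sketch: devices and apps never reference each other
# directly. All interaction flows through the Broker (the mediator), so
# either side can change or be swapped without touching the other.

class App:
    def __init__(self, name):
        self.name = name

    def render_for(self, capabilities):
        return f"{self.name} rendered at {capabilities['resolution']}"


class Device:
    def __init__(self, resolution):
        self._resolution = resolution

    def capabilities(self):
        return {"resolution": self._resolution}


class Broker:
    """Mediator: the only component that devices and apps know about."""

    def __init__(self):
        self._apps = {}

    def register(self, name, app):
        self._apps[name] = app

    def launch(self, app_name, device):
        # Changes are isolated here: a new device type or a rebuilt app
        # alters nothing outside this single point of contact.
        return self._apps[app_name].render_for(device.capabilities())


broker = Broker()
broker.register("crm", App("CRM"))

# Swapping the end-point (a new tablet, say) touches no application code:
print(broker.launch("crm", Device("1024x768")))
print(broker.launch("crm", Device("2048x1536")))
```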

This is exactly what we achieve with desktop and application virtualization. Changes can occur in any application without affecting end user devices. At the same time end user devices can be completely changed without affecting the applications. Hence, the Desktop and Application Virtualization Mediator Pattern is the right way to manage the device/application nexus in your IT infrastructure. It avoids the installed application inertia trap. It reduces friction and increases your Enterprise Architecture agility.

WRAP: This pattern fits between enterprise architecture and application architecture, and it matters. (Hat tip to Michael Keen, who has been talking about enterprise architecture/application architecture as strategy for some time.)

Let me know what you think.

There seem to be a great many 'design pattern' books on the market. I thought it would be interesting to look at how they could be used to motivate the use of a desktop/application virtualization strategy.

Posted via web from _technoist_

The future of mobile devices, the internet, and us

Great Danah Boyd Interview (at SxSW) - realities of digital native, privacy and location

Great interview with Danah Boyd on her work at Microsoft Research - a nice primer on the realities of the web/social net for the "digital native", and a couple of pointers on how location changes a lot of social games. If you prefer text, check out her articles at http://www.danah.org/.

Posted via web from _technoist_