B. Backgrounds to the phenomena: five contexts

In Appendix A, open sourcing while private sourcing was described as a phenomenon in seven cases. Over a decade, these cases had sometimes sequential and sometimes nearly contemporaneous events, where learnings in one could influence another. In addition, the individuals involved in depth in one case might be influencers or participants in another. In this Appendix, five overarching contexts are described as the background for open sourcing while private sourcing between 2001 and 2011:

  1. IBM's senior managers, from 2001, advancing strategic bets on future drivers of industry, business, computing and marketplace;
  2. IBM employees, since 1996, engaging online with w3 intranet platforms for global knowledge exchanges and social sharing;
  3. IBM consultants, from 2004, probing to confirm business priorities through industry-based executive studies;
  4. IBM researchers, from 2004, exploring social changes influencing new organizational and technological opportunities on a longer horizon; and
  5. Businesses, creatives, governments, makers and academics at large, from 2000, taking up open sourcing.

These contexts coincided with an atmosphere of positivity and collaborative teaming amongst IBMers for a decade. More detail is provided in each of the sections that follow.

B.1 IBM senior managers, from 2001, advancing strategic bets

The network-centric computing vision presented by IBM CEO Lou Gerstner did not get the attention of Wall Street analysts in 1994, nor did it resonate at Comdex presentations in 1995 (IBM 2011b). When the idea was refined less technically and enterprise customers started to appreciate what the Internet might do, e-business became the galvanizing mission for IBM in 1999.

We infused it into everything—not just our advertising, product planning, research agendas, and customer meetings, but throughout our communications and operations—from my e-mails, broadcasts, and town hall visits to the way in which we measured our internal transformation. It provided a powerful context for all of our businesses. It gave us both a marketplace-based mission and a new ground for our own behaviors and operating practices—in other words, culture.

Most important, it was outward-facing. We were no longer focused on turning ourselves around. We were focused on setting the industry agenda again. We shifted the internal discussion from “What do we want to be” to “What do we want to do” (Gerstner 2002, 320).

The e-business vision not only presented a future for enterprise-scale customers, but for the ways that IBM itself would change (Sager 1999). In business and the Internet, the “real revolution” would be when “big business turns e-business” (The Economist 1999).

In 2000 and 2001, confidence in Internet-based businesses shifted from buoyant to shaken. The NASDAQ index peaked in March 2000, and the dot-com bubble burst into a dot-bomb shakeout of new e-commerce startups (Madslien 2010; The Economist 2000). By March 2001, the NASDAQ market had lost 60% of its value, and the NYSE, European and Japanese markets had fallen to lows of 2 to 3 years earlier (BBC News 2001). After the terrorist attacks of 9/11, stocks fell further (Ulick 2001).

The IBM 2001 annual report, published in April 2002, reflects much of the context for the industry in the prior decade. The cover featured a letter from outgoing CEO Louis Gerstner that acknowledged recent challenges, yet foreshadowed a future with even greater change for the industry:

I want to use this occasion to offer a perspective on what lies ahead for our industry. To many observers today, its future is unclear, following perhaps the worst year in its history. A lot of people chalk that up to the recession and the “dot-com bubble.” They seem to believe that when the economies of the world recover, life in the information technology industry will get back to normal.

In my view, nothing could be further from the truth.

Louis V. Gerstner (IBM 2001).

Inside the annual report, the shift that was foreseen was “that customers are finally driving the direction of the information technology industry”.

IBM's direction, looking forward, was described as a “handful of strategic bets on the future drivers of industry”.839 The four drivers emphasized were:

1. The New Industry Model: Innovate or Integrate

2. The New Business Model: Services-Led

3. The New Computing Model: Infrastructure Plus Ubiquity

4. The New Marketplace Model: An Open Playing Field

Prior to 2001, IBM had participated in the rise of the Internet, and in open standards. The execution of the strategic bets would be left in March 2002 to the new CEO Sam Palmisano, as Lou Gerstner stepped down (Kirkpatrick 2002). This 2002 formal statement of strategy set a foundation for strategies that included open sourcing while private sourcing. The implications of each of the four drivers are described below.

B.1.1 IBM would lead the industry by both innovating and integrating

The IBM 2001 annual report saw industry survival as either innovating or integrating, but industry leadership as both innovating and integrating. This was the first strategic bet on the future drivers of the information technology industry.

Innovation that in the 1960s and 1970s had come from vertically integrated technology companies …

… had given way by the early 1990s to a dizzying array of “pure play” companies (specialists in PC, databases, application software and the like). This explosion of entrepreneurial and technical creativity was, on the one hand, a testament to our industry’s enduring power. […]

As I/T moves out of the back office and into the executive suite, value and growth in our industry are driven less than they used to be by technical innovation or product excellence, as necessary as those remain (IBM 2001, 3).

IBM could approach executive suite leaders with a message of innovating for value and growth, whereas niche companies would approach functional managers with the promise of technical innovation or product excellence.

The fragmentation of technologies and initiatives led to the need for integration work to be contracted with professional services companies. With technology and services coming from two different sources, the weight of transformation had to be borne by customers.

What matters most today is the ability to integrate technology into the lifeblood of business. The people who help customers apply technology to transform their businesses have increasing influence over everything from architecture and standards to hardware and software choices and partners (IBM 2001, 3).

IBM's strengths had traditionally been as a full-service, one-stop provider. The strategy going forward would be for IBM both to innovate and to integrate, both in the technological and organizational senses.

One way in which IBM would simultaneously innovate and integrate is by open sourcing while private sourcing. Innovating through open sourcing features divergent and convergent phases, but often lacks the credibility to break through to enterprise-scale implementations. Private sourcing appeals to a customer preferring predictability and low risk in the technologies that the organization practically adopts and maintains. IBM could make open sourcing credible, complementing its traditional private sourcing offerings. Integrating the mix of open sourcing and private sourcing components would be a responsibility that IBM would take on for enterprise-scale companies. Working with industry committees on open standards would allow customers the possibility of alternative providers, allaying fears of technology lock-in. Both innovating and integrating restored an emphasis on the business value of technology, counter to the leaps of faith leading to the dot-com crash.

B.1.2 IBM would evolve e-business from services-led to on demand

The IBM vision of e-business was essentially the adoption of Internet technologies inside enterprise-scale organizations. The heritage of IBM relied on revenue streams from the 1980s on mainframe computing, and from the 1990s on client-server systems. In 2001, the success in professional services was heralded with IBM Global Services as “the world's largest and most innovative consultancy, systems integrator and strategic outsourcing leader”. However, the 2001 annual report foreshadowed a further disruption beyond selling hardware, software and IT services:

… utility-like delivery of computing—from applications, to processing, to storage. We see the beginnings of this trend in Web hosting and our own “e-business on demand” offerings, where customers don’t buy computers, but acquire computing services over the Net, on a pay-for-use basis (IBM 2001, 3).

This second strategic bet represented a new business model, not just in professional services, but also in web technology services between machines (via machine-readable formats, e.g. XML and JSON). This vision of e-business evolving to computing as a utility would require the development of virtualization technology and new business models that would be known as cloud computing after 2006.
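As a minimal sketch of what “machine-readable formats” means in practice (the order record and its field names here are hypothetical, chosen only for illustration), the same data can be serialized either as JSON or as the XML favoured by the web-services standards of the period:

    import json
    import xml.etree.ElementTree as ET

    # A hypothetical record, as one service might hand it to another machine.
    order = {"id": "A-1001", "item": "middleware licence", "quantity": "2"}

    # JSON serialization: compact key-value text, parseable by any client.
    print(json.dumps(order))
    # {"id": "A-1001", "item": "middleware licence", "quantity": "2"}

    # XML serialization of the same record, in the style of early-2000s
    # web services (e.g. SOAP payloads).
    root = ET.Element("order")
    for key, value in order.items():
        ET.SubElement(root, key).text = value
    print(ET.tostring(root, encoding="unicode"))
    # <order><id>A-1001</id><item>middleware licence</item><quantity>2</quantity></order>

Either form can be produced and consumed without human intervention, which is what allows services to be composed between machines.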

In 2002, the IBM messaging on e-business shifted to an e-business on demand model described with three major components:

  • IBM infrastructure on demand: core services (i.e. processing, storage and bandwidth);
  • IBM business process on demand: pre-integrated software from IBM and IBM Business Partners encompassing horizontal and vertical business processes; and
  • Know-how: worldwide consulting support from IBM and IBM Business Partners to ensure best practices integral to every solution (IBM Global Services 2002).

For customers to migrate from in-house provisioning of computing resources, implementations by a variety of information technology providers would have to coevolve with emerging industry standards. This would be a race of open sourcing while private sourcing, where the value of a reliable utility would be challenged by disruptive innovations. The “services-led” vision of e-business on demand was organizationally positioned at IBM Global Services (and in the Strategic Outsourcing unit, in particular) as well as the extended IBM business partner community. The transformation from a product orientation towards a services and software business would continue under Ginni Rometty, who followed Sam Palmisano as IBM CEO in 2012.

B.1.3 IBM would invest in enterprise systems, integrating middleware, and specialized high-value components

Under Lou Gerstner, IBM's investments had followed a “barbell” pattern: the highest value was seen at the two ends, in electronic components and in services; lower value would come from the computer manufacturing businesses in the middle. This led to IBM entering joint partnerships to develop and produce PowerPC chips for Apple Macs and Sony PlayStations.840

Gerstner ... sketches his vision on the easel in his office: a vertical barbell. The big weight at the bottom is components. At the top end is services. The skinny bar in the middle is everything else: PCs, servers, network gear.

Profits are moving to the ends of the barbell, Gerstner says. The companies in the middle? “They’re becoming assemblers. The value is being pulled down to the people who have the real underlying assets.” Look at Intel and its 23% aftertax margin: “They’re at the bottom of the chain–and they make all the money.”

The barbell is especially tough on IBM. In the one market it still dominates–mainframes–prices drop so fast that IBM had to sell twice the horsepower in the first half to muster a 10% rise in revenue. IBM’s services business is booming ..., but profit margins are lower than in hardware (Lyons 1999).

The barbell pattern would lead IBM to de-emphasize some industry segments to the point of effectively exiting them. As examples, IBM left enterprise application software to companies like SAP and left networking hardware to companies like Cisco, while providing customers with middleware software to integrate across information systems.

In the 2001 annual report, the third strategic bet was on “a new computing model: infrastructure plus ubiquity”. The rise of the Internet would see computing workloads moving back to industrial-scale server technologies. Mobile phones, videogame consoles and television set-top boxes would soon decline in price to within the reach of consumers. The pervasive computing trend that would become the Internet-of-Things would take hold a decade later.

By 2002, e-business was getting traction as a trend. It was a vision moving away from client-server computing based on personal computers that relied on private sourcing connections to specific server hubs. It was a vision moving towards network-centric distributed computing where intelligent devices and computers could connect in a variety of ways through middleware based on open standards.

This meant that, on one end of the scale, the workload was moving back to the infrastructure—to industrial-strength servers, storage, databases and transaction-management systems. On the client end, it has spawned a proliferation of network-connected devices of all kinds: PDAs, cell phones, videogame systems, set-top boxes and beyond—to the whole pervasive-computing world of embedded components in everything from household appliances, to medical devices, to cars. And tying it all together was an emerging category of software with a wonderfully descriptive name, which hardly anybody had heard of five years ago -- middleware (IBM 2001, 4).

In the decade 2001-2011, the technologies of an e-business on demand vision would include (i) virtualization and cloud computing platforms, where operating systems would run on software emulators across private and public domains rather than on bare metal physical machines; (ii) new devices with embedded technologies, such as smartphones and tablets; and (iii) middleware in a service-oriented architecture that rationalized application functions into reusable and more manageable web service components.
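As a minimal sketch of the third strand (in Python, with hypothetical names; an illustration of the service-oriented idea, not IBM's middleware), a single application function can be exposed as a web service component over HTTP, so that any caller speaking the open protocol can reuse it:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def check_inventory(item_id):
        # The rationalized application function; the backing data is hypothetical.
        stock = {"A-1001": 42}
        return {"item": item_id, "on_hand": stock.get(item_id, 0)}

    class InventoryService(BaseHTTPRequestHandler):
        # Wraps the function as a reusable web service component:
        # GET /A-1001 returns the inventory record as JSON.
        def do_GET(self):
            body = json.dumps(check_inventory(self.path.lstrip("/"))).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), InventoryService).serve_forever()

Because the interface is a protocol rather than a library linkage, the same component can serve many kinds of callers, which is the reuse and manageability that service-oriented architecture sought.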

IBM would continue to focus on being a provider to businesses rather than to end consumers. As the technology evolved, the company would have to simultaneously advance its investments in private sourcing, while participating with open source communities on evolving standards.

B.1.4 IBM would turn toward open architectures and common standards

Until its termination, announced in 1996, IBM leadership was conscious of a 1956 consent decree with the U.S. Department of Justice (Passell 1994; U.S. Department of Justice 1996). The decree, a settlement that ended an antitrust action, required IBM to sell its machines as well as to lease them, and to continue to provide services and parts for computers no longer owned by IBM. The provisions were terminated for (i) personal computers and workstations by January 1996, (ii) midrange computers (e.g. AS/400 products) by 2000, and (iii) mainframe computers (e.g. S/390 products) by 2001. The weight of the consent decree, said Lou Gerstner in his outgoing chairman's statement in the 2001 annual report, had left IBM “so acutely aware of the siren call of proprietary control that we have learned to resist it”.

The fourth strategic bet was on “an open playing field” in a networked world of computing. Private sourcing would not be the only way IBM would deal with customers. Participating through open sourcing would enable both IBM and other technology providers to have interoperable platforms for the benefit of their clientele.

In a customer-driven world, open architectures and common standards are inevitable.

Today, we are focusing all our technical expertise and marketing energy—previously devoted to creating and marketing self-sufficient systems—toward reimagining and rebuilding them for open platforms. We now share our emerging software products with the developer community; license our technology and patents; and champion common standards at all levels, from Linux, to Java, to Web services. Most important of all was the work we undertook to open up our technical architectures. Absolutely every piece of IBM hardware and software today is a fundamentally different beast (and a more socialized one) than it was ten years ago (IBM 2001, 7).

IBM already had a strong history of working with industry standards groups and customer councils. The alignments reflected IBM's strong installed base of heritage IBM offering lines (e.g. the z/Architecture mainframes, iSeries midrange servers, pSeries Power Unix servers, xSeries Intel-based servers) targeted for different conditions and purposes. Under Gerstner's tenure, the feasibility of installing the same IBM software product across the variety of platforms improved. In the age of the Internet, however, the popularity of Linux and Unix servers would lead to an interoperable open sourcing orientation.

Evolving from the e-business vision centered on the Internet towards a broader open source perspective had been under discussion by IBM's leaders since 1999:

In March 1999, a report was prepared summarizing our findings and presented to the Corporate Technology Council, a management group that governs key IBM strategy and business decisions. It was well-received although a number of questions were raised, particularly about business considerations, which involved more homework and analysis. Ultimately, a small group was set up within the IBM Software Group organization to oversee IBM’s open source activities, formalize the goals, create educational materials and provide training, and manage the day-to-day aspects of our activities. This included making sure that appropriate approvals were granted before any IBM team externally participated in an open-source activity and that team members received appropriate education (Capek et al. 2005, 253).

The Open Source Steering Committee (OSSC), an IBM internal board, was formed to oversee open-sourcing activities and review all planned external uses of open source.

Since their establishment during 1999, IBM’s strategic goals for open source have remained consistent. They are:

  • To support rapid adoption of open standards by facilitating easy access to high quality open-source implementations of open standards in order to speed industry adoption. A primary goal is to encourage open-source implementation of open standards and thus use open source as a way to support our business and strategic goals.
  • To use open source as a business tool by keeping the platform open and taking advantage of new business opportunities. By creating more open opportunities, we encourage choice and flexibility in responding to customers’ needs in typically heterogeneous environments.
  • To enhance IBM mind share, creating a preference for IBM brands by associating them with successful OSS projects and building relationships with a broad spectrum of developers. We contribute to key OSS projects that are functionally connected with some of our key products. The joint participation of commercial developers and independent OSS developers creates a synergy that enhances the open-computing “ecosystem.”

To summarize these goals, IBM views open source as a tool or technique to be used, where it makes sense to do so, to enhance our business and that of our customers (Capek et al. 2005, 253–254).

At the beginning of 2000, resources from the pioneering Internet division were migrated into a new Linux team (Wladawsky-Berger 2006). In February 2000, IBM made a variety of announcements in a keynote presentation at LinuxWorld, and then demonstrated Linux running on “everything from laptops to an IBM S/390 mainframe” (Orzech 2000). In December 2000, IBM announced that it had spent $1 billion on Linux that year, "and you can expect that to grow in 2001" (Wilcox 2000).

With the elevation of Sam Palmisano from president to CEO in 2002, IBM was well-positioned for organization-building. Under Lou Gerstner, the company had been revitalized. IBM was financially strong, and seen as a leader in an information technology sector with many dot-bomb collapses. Further, the attacks on the United States on September 11, 2001 would predispose the world to prefer stable institutions like IBM. Both the evolved business direction and stronger leaders at all levels of the company were in place to engage open sourcing as a complement to the tradition of private sourcing.

B.1.5 Through 2009, IBM reiterated open source and open standards

In the eight years following Palmisano becoming CEO, themes from the IBM 2001 annual report would be repeated again and again.

In the 2002 Annual Report, the computing model had evolved to become an “On Demand Operating Environment”, where IBM would continue to lead open technical standards and platforms.841

In the 2003 Annual Report, the next wave was described as “On Demand Integration” with open standards.842

The 2004 Annual Report reiterated “the architecture and technologies for the On Demand Operating Environment, based on open standards”.843

In the 2005 Annual Report, the rise of service-oriented architecture (SOA) was built on “open, standards-based middleware”.844

In 2006, CEO Sam Palmisano wrote about a new business model beyond the multinational corporation, as the Globally Integrated Enterprise, observed “within IBM and among our clients”. As part of the systemic change:

New forms of collaboration are everywhere: from increasingly complex intercompany production networks to the open-source software movement, which has helped transform the traditional model of innovation. Today, innovation is not led by lone inventors in their garrets but is the product of a collaborative process that also combines technological and marketing expertise. And such open approaches affect far more than software and IT: they also apply to education, governance, and many industries (Palmisano 2006).

In the Annual Report 2006, the language of “open standards” was complemented by the addition of “open, modular systems”, expanding from the domain of technology into the broader domain of business processes.845

By the Annual Report 2007, open sourcing had become a behaviour that IBM had practised for some time, and would continue. The IBM Strategy was explicitly described as delivering value through three strategic priorities: (i) focus on open technologies and high value solutions, where:

The company continues to be a leading force in open source solutions to enable its clients to achieve higher levels of interoperability, cost efficiency;

(ii) deliver integration and innovation to customers; and (iii) become the premier globally integrated enterprise (IBM 2007b, 18). “Open standards” and “open source” were phrases used over and over again in the detailed Management Discussion.846

In November 2008, the primary leadership message from CEO Sam Palmisano shifted to “A Smarter Planet”, as the Globally Integrated Enterprise was having a broader impact on society. Potential was seen in “infusing intelligence into the way the world literally works—the systems and processes that enable physical goods to be developed, manufactured, bought and sold… services to be delivered… everything from people and money to oil, water and electrons to move… and billions of people to work and live”. The possibilities were being enabled, as “First, our world is becoming instrumented”, “Second, our world is becoming interconnected”, and “Third, all things are becoming intelligent”. This meant that the “digital and physical infrastructures of the world are converging” (Palmisano 2008).

The IBM Annual Report 2008 acknowledged that “the global economy is experiencing profound disruption”.847 The chairman's letter from Sam Palmisano described IBM as “well positioned to continue delivering strong results” and “positioned to lead in the new era that lies on the other side of the present crisis”. The “new computing model” described in 2001 was amended to include “a new platform for global economy and society”, in a world that was increasingly instrumented, interconnected and intelligent.848 The three strategic priorities in the 2008 report were worded identically to 2007.

By the Annual Report 2009, open source was described as part of the “IBM's track record” in positioning of new areas for growth.849 Investments were then focused on four high-potential opportunities: (i) growth markets; (ii) analytics; (iii) cloud and next-generation data center; and (iv) Smarter Planet. Open source was explicitly linked to the third investment, and implicitly to the fourth.

The 2010 annual report celebrated IBM's centennial in June 2011. Looking towards its “second century”, the IBM business model “based on continuous forward motion” included “a commitment to research”: “pioneering breakthroughs, advancing technologies and helping define open standards” (IBM 2010a, 14). In the 2011 annual report, with Ginni Rometty as President and CEO, and Sam Palmisano as Chairman of the Board, “open standards” and “open source” were so ingrained into the company that they merited mention only in the management discussion (IBM 2011a, 23-24).

From the writing crafted in IBM annual reports, open source and open standards could be seen as new in 2001, core by 2006, and a natural part of the way of doing business by 2009.

B.2 IBM employees, from 1996, engaging globally online

IBM has appreciated operating as a multinational corporation since the founding of the IBM World Trade Corporation in 1949 (IBM 2011c). Beyond ethnocentric and polycentric designs, geocentric organizations involve “a collaborative effort between subsidiaries and headquarters to establish universal standards and permissive local variations, to make key allocational decisions on new products, new plants, new laboratories” (Perlmutter, 1969, p. 13).

Jacques Maisonrouge, the French-born president of IBM World Trade [from 1967, and member of the IBM board of directors from 1983 until his retirement in 1984], understands the geocentric concept and its benefits. He wrote recently:

“The first step to a geocentric organization is when a corporation, faced with the choice of whether to grow and expand or decline, realizes the need to mobilize its resources on a world scale. It will sooner or later have to face the issue that the home country does not have a monopoly of either men or ideas …

“I strongly believe that the future belongs to geocentric companies …. What is of fundamental importance is the attitude of the company's top management. If it is dedicated to 'geocentrism', good international management will be possible. If not, the best men of different nations will soon understand that they do not belong to the 'race des seigneurs' and will leave the business” (Perlmutter, 1969, pp. 16–17).

One way in which employees can interact with their peers on a worldwide basis is online, through information and communications technologies (ICT). Beyond point-to-point e-mail transmissions, IBM has had a long history in online platforms for open engagement amongst its employees on a world scale. Some of the platforms have included (i) online forums, in section B.2.1; (ii) the w3 intranet, in section B.2.2; (iii) alphaWorks, in section B.2.3; (iv) pooled non-commercial source internally, in section B.2.4; (v) Jams, in section B.2.5; (vi) the Technology Adoption Program, in section B.2.6; (vii) Social Computing Guidelines, in section B.2.7; and (viii) the Greater IBM Community, in section B.2.8.

B.2.1 From 1981, IBMers conferenced on IBMPC, then IBM Forums

IBM began online computer conferencing in the age of the mainframe, predating the advent of the Internet.850 In October 1981, the IBMPC Forums were initiated as a modification of the TOOLS file sharing technology on the mainframe (i.e. under the VM operating system) on the internal IBM network (i.e. VNET) (Chess and Cowlishaw 1987). TOOLS maintained a persistent record of files with contributions added as “appends”, so that the content could be read as a temporal transcript of conversations between participants. The IBMPC Forum was a platform developed by scientists from IBM Watson Research Center Yorktown and the IBM UK Scientific Centre to support discussion about the IBMPC that was introduced in August 1981. By 1990, IBMPC had 1500 active conferences, over 1000 contributions a day from over 10,000 contributors around the world, and readers on the order of 100,000 IBMers (Foulger 1990).
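As a rough illustration of the append model (a sketch only, in Python; the actual TOOLS and VM file formats are not reproduced here), each conference can be thought of as a single file to which stamped contributions are appended, so that reading the file front to back replays the conversation in time order:

    from datetime import datetime, timezone

    # Sketch of an append-only conference file in the spirit of TOOLS:
    # every contribution is an "append" stamped with author and time, and
    # the file itself is the temporal transcript of the discussion.
    # The file name and entry layout here are hypothetical.

    def append_entry(path, author, text):
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        with open(path, "a", encoding="utf-8") as forum:
            forum.write(f"--- {stamp} {author}\n{text}\n")

    def read_transcript(path):
        with open(path, encoding="utf-8") as forum:
            return forum.read()  # appends are already in temporal order

    append_entry("IBMPC.FORUM", "author1", "Has anyone configured the new emulator?")
    append_entry("IBMPC.FORUM", "author2", "Yes -- notes to follow in the next append.")
    print(read_transcript("IBMPC.FORUM"))

Because nothing is rewritten in place, the record persists and can be read by later generations of participants, which is how the forum content survived migrations from VM through NNTP to Connections.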

In 2001, the online conferencing became known as IBM Forums, with the infrastructure moved to an NNTP (Network News Transfer Protocol) application more consistent with Internet technologies.851 Despite having been the worldwide online platform of choice for over 20 years, the IBM Forums were still primarily operated by IBM Research, rather than the office of the CIO.852

By 2007, 23,000 unique authors had contributed 188,000 posts, with an average of 14,980 posts per month. In 2007, the application was moved from IBM Research into production support by the TWE (Total Workplace Experience) Center of Excellence, as a customization of Jive Forums 5.05 on a zLinux server with four WebSphere Application Servers and a DB2 server.853

On June 2, 2012, the complete content from the IBMPC and IBM Forums legacies was migrated to an intranet installation of the IBM (Lotus) Connections program product.

While many companies have struggled with communicating over national boundaries, the IBM Forums have been an everyday part of the internal open sourcing culture for IBM employees since the 1980s. Questions or observations that started a thread sparked responses and clarifications through which engaged IBM employees could think together. The original form of communications was more rudimentary than today's Internet technologies enable, but the knowledge captured in text has endured since its beginning, beyond generations of IBMers who have retired or left the company.

B.2.2 From 1996, IBMers got connected to the Internet and w3 intranet

On their desks, IBM employees would graduate from 3270 terminals first introduced in 1971, through the PC/XT in 1983, the PC/AT in 1984, and the PS/2 in 1987 (with the OS/2 operating system). By 1993, virtual offices had become common for customer-facing employees and consultants.854 Mobile workers graduated to laptops: a ThinkPad 700C from 1992, a ThinkPad 755C from 1994, a ThinkPad 600 from 1998 and a ThinkPad T20 by 2000. IBM employees in the field became accustomed to connecting to each other electronically through the IBM Global Network Dialer, which became the AT&T Network Dialer after the 1998 acquisition and outsourcing deal.

After the 1994 Winter Olympics in Lillehammer, Irving Wladawsky-Berger and John Patrick were shown by David Grossman that Sun Microsystems was publishing IBM's raw data feeds, and presenting them in a way that IBM was not (Hamel 2000). This led to the development of a primitive corporate intranet, and the “Get Connected” manifesto of six ways IBM could leverage the web:

  1. Replace paper communications with e-mail.
  2. Give every employee an e-mail address.
  3. Make top executives available to customers and investors on-line.
  4. Build a home page to better communicate with customers.
  5. Print a Web address on everything, and put all marketing on-line.
  6. Use the home page for e-commerce.

This led to the launch of www.ibm.com on May 24, 1994 (Ransdell 1997). In 1995, a cross-IBM Internet Division drawing from the software and services business units was formed (Wladawsky-Berger 2005). IBM gained experience with public Internet sites for the 1995 U.S. Open and Wimbledon tennis tournaments, the 1996 chess match between Garry Kasparov and Deep Blue, and the 1996 summer Olympics in Atlanta. By at least 1996, a ShopIBM link had appeared on the IBM web site, as had paths to a variety of product support resources.855 Microsites for investors, developers and IBM business partners were also available by late 1996. As early as 1997, the complete directory of every IBM employee was accessible on the Internet at whois.ibm.com.856

In the first half of the 1990s, IBM employees used a combination of PC-based productivity tools (e.g. Lotus SmartSuite on OS/2 and Windows 95) and mainframe-based collaboration (i.e. PROFS on 3270 emulators to VM/CMS).857 In 1996, e-mail and document collaboration moved to Lotus Notes from PROFS.858 Lotus Notes databases became a standard way of sharing documents and discussions, with master copies on networked servers and local replicas on laptops.

In late 1996, the IBM w3 intranet was rolled out. In the following year, the web search would handle 2 million hits per day (IBM 2008a).859 By 2001, IBM employees would rank the w3 intranet equally with co-workers as the most credible or useful source of information, above the news media, executive memos or managers (Stellin 2001).

Starting with 8000 intranet sites, 11 million web pages and 5600 domain names, the company standardized and integrated its IT systems so that the organizational boundaries for information sources became transparent (Smeaton 2002). In 2000, the capability for each employee to personalize his or her homepage to prioritize the most common job role needs (e.g. managing projects, employees, or teams) was added. By 2004, a redesign of the enterprise portal refined tabs for (i) home, (ii) work, (iii) career and (iv) life. The BluePages expertise locator and employee directory was enhanced to show information on mouseover, improving the immediacy of connecting to colleagues (Pernice, Schwartz, and Nielsen 2006).

The functionality of the w3 intranet has followed the rise of open standards, with IBM WebSphere portal technology on servers. Desktops based on the Client for e-Business (C4EB) first relied on Internet Explorer 6 bundled with the Windows XP operating system. IBM was a strong supporter of the Mozilla Firefox browser introduced in 2003, and made it the preferred browser for the company.860 Employees would continue to use, every day, both a browser and the Lotus Notes version 8.5.1 “fat client” released in 2009. New mobile applications for smartphones would gradually adopt more standards (e.g. HTML5), but intensive knowledge work continues to be better on a computer with a physical keyboard.

The w3 intranet home page is the portal into the company for each and every one of IBM's 300,000-some employees. The portal does not host all of the content, but accesses all of the information sources indexed within the company. While much of the information would be official announcements by company executives, the search engines would crawl every forum back to the IBMPC days, and include wiki and blog pages as those new technologies developed.

B.2.3 From 1996, IBMers shared emerging technologies on alphaWorks

IBM has had a long history in emphasizing quality.861 The development of commercial products at IBM has traditionally followed a structured Integrated Product Development (IPD) process with stage-gated decision reviews (Grzinich, Thompson, and Sentovich 1997).862 Before a product plan could be created, the concept would first have to be completely specified. This practice is consistent with the 1960s IBM hardware development process that separated the design team from the manufacturing team, with the release of specification documentation.863 With the rise of agile practices in software development, the gates evolved to become known as alpha test (of units or modules within a system) and beta test (an initial test of an integrated system).864 For software releases, the rapid pace for new features on “Internet time” was exemplified in the development of the Netscape browser from 1994 through to its open sourcing in January 1998, when beta versions were being released bimonthly or quarterly (B. Wilson 2000).865

In the IBM Internet Division, there was a frustration that “there was a new class of applications possible that were a lot less industrial strength, more user-friendly using Web front ends, browsers and things like that”, with a recognition that “development cycles were all geared to back-end commercial applications ... taking too long” (Wladawsky-Berger, Smith, and Poole 2006). The idea came that “maybe what we need is to put out alpha versions of our stuff out there. Maybe what we need is alphaWorks” (IBM alphaWorks 2006).

alphaWorks was first exhibited at the Fall Internet World '96 in December 1996. It was intended as an “online laboratory” and a “web site that demos new Web technologies months before they become products or services” (Toporek 1997). In addition to technologies developed by IBM Research, non-IBM technologies were welcomed on the site. These offerings were deemed insufficiently mature to be labelled as beta versions. The alphaWorks license agreement included a “term and termination” clause: the license “will terminate ninety (90) days after the date on which you receive the Software. Upon such termination you will delete or destroy all copies of the Software”.

In its first year, the web site hosted 28 early-stage technologies, attracting 60,000 users to the community. From those technologies, five were commercialized as products that year (Ransdell 1997).

In June 2006, IBM complemented the alphaWorks download site with a new alphaWorks Services site (Kerner 2006). This would enable the addition of web-based services, such as browser-based development tools. After 10 years of operating, alphaWorks claimed to have introduced almost 700 new technologies, 129 of which found their way into IBM products.

alphaWorks became a way in which IBM researchers could interact directly with university faculty and students. By 2008, many downloadable technologies were offered to the academic community with more open terms and evaluation periods (Bridgwater 2008). The graduation path from alphaWorks has been developerWorks, where a broader audience could engage with the technology. In 2008, over 200 technologies were available for download, and 40% of the assets posted on the alphaWorks web site had been incorporated into IBM products.

The design of alphaWorks as a community site enables both IBMers and non-IBMers to share and try out technologies that may or may not become products and/or standards. Alternative paths of graduation or termination within a short horizon limit the investment of resources in dead ends, while providing a stage on which early implementations can be sampled.

B.2.4 From 2000, IBMers have pooled source code in repositories

The IBM Internal Open Source Bazaar (IIOSB) is “a free service to promote Open Source style development internally at IBM”.866 The project with the earliest registration date, May 3, 2000, is the “Linux Client for e-business” (C4EB), which has enabled a standard installation of operating system and applications on computers issued to IBM employees, as an alternative to the Windows XP platform.

The IIOSB was first implemented on the IBM intranet in 2000, based on the Gforge software developed by VA Linux Systems as the foundation of Sourceforge on the Internet (McMillan 2000).

The IIOSB was originally developed, and continues to be funded, by the IBM Linux Technology Center. This space allows IBM employees to create projects where they can share code they have written, which might also include open source code (e.g. under a GPL, LGPL, Apache, BSD, IPL or CPL license) and internal-use source code, with others in the company. Only non-confidential materials were to be hosted in the IIOSB, with confidential materials to be moved to the IBM Community Source server (which later evolved into the IBM Community Development Platform, also called CSNext). Source code written by IBMers and posted to the IIOSB is copyrighted to IBM, while third-party copyrights and licenses (e.g. GPL, LGPL, Apache, BSD, CPL, etc.) are respected and retained. Registration on the IIOSB reinforced the licensing terms and conditions with an explicit agreement to be electronically signed within a month.867 Links point to the online education on the IBM Open Source Participation Guidelines that describe the spirit in which open source is to be interpreted.868

The repository is partitioned into licensing zones, with project counts as at May 2010:

  • IBM Internal zone, for 750 projects with no plans to externalize the source code, or an unknown disposition, with a potential for release either as an IBM product or open source;
  • IBM Internal / Mixed OSS zone for 381 projects with no plans to externalize the source code, embedding Open Source Software that could be used only internally;
  • Apache / BSD / MIT zone for projects covered under an Apache Software License (43), a BSD License (11) or an MIT License (13) that might eventually be released externally following an approval process;
  • GPL / LGPL zone for source code covered under the GNU General Public License (146) or the GNU Lesser General Public License (20) that might eventually be released externally following an approval process;
  • IPL / CPL / EPL zone for projects using the IBM Public License (58), the Common Public License (58) and Eclipse Public License (5), that might be released externally following an approval process; and the
  • Other License zone for the miscellany in which legal counsel would have to be sought.

At May 2010, 1,493 projects were hosted, with 12,865 registered users. The IIOSB listed 425 projects by development status: 49 were in planning, 49 in pre-alpha, 64 in alpha, 118 in beta, 129 in production/stable, 4 in mature, 3 in moved, and 9 in end of life.869

Benefits of the internal community source model are seen as two-sided. Producers (i.e. authors) get improved collaboration, broader testing and tuning, and more developers using their code. Consumers (i.e. beneficiaries) get faster access to company capabilities, the ability to tune an existing investment to a unique specification, and broader testing and tuning (Sabbah 2005, 16).

In 2005, the IIOSB community open to any IBM employee was complemented by the IBM Community Source repository.870 “The IBM Community Source web site is the place internally to practice open source behavior with IBM product code”. The motivation for these professional developers was easier collaboration, leading to increased efficiency and a speed-to-market improvement of 30% (Graham 2005). In contrast to the IIOSB, IBM Community Source targets the programmers of the Software Group division, across 40 labs. The assets “are not open source” in licensing, but are open sourcing in behaviour. Community source “represents a shift in the traditional development model ... that is required to realize the component model of building IBM products”.871 At May 2010, 1,747 projects were hosted, with 33,580 registered users.

Both the IIOSB and IBM Community Source are programs where procedures for categorizing open source assets are rigorously managed. This provides clarity for reuse of the contents in a variety of contexts, for both non-commercial and commercial reuse.

B.2.5 From 2001, IBMers have collaborated on global online jam events

In IBM, a jam is a massive online collaboration whereby participants meet around the globe, around the clock for three days (IBM 2004c, 41). Leaders can use a jam to “start a movement” in an organizational transformation.872

In 1999, employee opinion surveys uncovered that the intranet was a more useful, credible and reliable source of information than both managers and coworkers. This raised a question as to whether an intranet could be used for culture change (Wing 2005b). The pragmatic core for a jam was to capture best practices from individuals on clever, common-sense ways to get work done and produce value, in spite of the complexity of operating in such a large and diverse enterprise as IBM.

“It was a best practices surfacing effort,” says Mike Wing, IBM's director of worldwide intranet strategy and programs. “It was not a suggestion box or a free-form chat. This was not to propose things that management should do. This was very much E to E — employee to employee. We chose topics deliberately so that the ideas that they generated would be things people could go implement, not things that required large capital expenditures or policy decisions.

“We didn't necessarily want the executives who were responsible for X, Y, or Z topic [to respond to problems],” he adds, “because we precisely wanted to come at things from some unusual angles, encouraging people who would bring a different intellectual frame of reference to the topic” (Andelman 2001).

After nine months of preparation, World Jam was scheduled May 21 to 24, 2001, 72 hours around-the-clock (Feder 2001). The web-based platform was intended as a medium to help IBM employees talk with and help each other, shaping a different kind of computer mediated communication while striving to provide mutual awareness. The World Jam process had three stages:

  1. Drawing people in, with headline articles on the main intranet page, and e-mail through managers and from the CEO, leading to a description of the World Jam concept and a list of the fora;
  2. Choosing where to go, with ten forum topics divided into five areas (new relationships, new ideas; travelling without a map; managing an e-worklife; managing the matrix (or in spite of it); talent and quality), of which three topics were related to individuals within the company, and seven were related to working within and around company issues; and
  3. Participating in one of the fora, contributing to an asynchronous conversation thread and rating posts for usefulness, overseen by a team of 3 to 4 moderators on duty 24 hours per day, who would nominate ideas to be voted on for the list of 10 Great Ideas.

Over the three days, there were 52,595 unique logons (one-sixth of IBM's 300,000 workforce), with 1700 participants posting at least once, totalling 6048 postings (Halverson et al. 2001). Topic moderators had assembled a “board of advisors” to provide reference materials and participate online. Moderators and facilitators communicated through an IBM Research prototype, Babble, to engage in joint problem solving on a back channel (Thomas 2001).

Findings from the WorldJam research were positive in three ways: (i) the event accelerated expertise engagement, with 2000 new contacts met; (ii) participants showed trust, with 68% observing constructive criticism of each other; and (iii) the event was repeatable, as 85% said they would engage again in a similar event and 62% said they learned new things (Kellogg 2005).873

Following this success, other jams for IBM employees were planned: ManagerJam in July 2002; ConsultantJam in February 2003; On Demand IT Jam in April 2003; ValuesJam in July 2003; and WorldJam 2004 (Wing 2005a).

ManagerJam was scheduled on July 9 and 10, 2002, for 48 hours. The central idea was to invite 32,000 managers at all levels worldwide to offer practical grassroots solutions to everyday IBM management challenges (Dorsett, Fontaine, & O’Driscoll, 2002). Forum topics included (i) translating strategy into results; (ii) building careers; (iii) fostering innovation; (iv) managing performance; (v) the human face; and (vi) the new customer landscape. JSP servlets were used to integrate the discussion forums, which were hosted on an NNTP server, with the message ratings on a DB2 server. The project and facilitation team used a TeamRoom Plus online collaboration environment (Millen and Fontaine 2003). During the event, 8123 managers (25% of the population) participated, with 22% posting (more than double the rate in WorldJam 2001), contributing 4554 comments and replies.

ConsultantJam, in February 2003, was conducted within 45 days of 30,000 PricewaterhouseCoopers consultants coming on board following IBM's acquisition of PwC Consulting. Conversations included what it means to be an IBMer, what PwC brings to the table, what's best and what doesn't work in our cultures, what we can learn from each other, and how we can work with each other. The event ran for 96 hours, involving 8560 participants discussing 2960 ideas, with JamAlyzer real-time text mining and theme analysis (Birkinshaw & Crainer, 2007; Wing, 2005b).

The On Demand IT Jam in April 2003 was designed to uncover ideas and solutions to further IBM's on demand transformation. The four forum topics were: (i) business transformation enablement, how the way business is done could be changed; (ii) development environment, leveraging on demand technologies; (iii) operating environment, changing data centers; and (iv) e‑business on demand for employees, changing the way work was done. Over 72 hours, 9793 participants discussed 2963 ideas, and established management accountabilities for follow-up (Wing 2005a).

Values Jam, held from July 29 to August 1, 2003, for 72 hours, was a landmark for the company. With Lou Gerstner as CEO from 1993 to 2002, the focus in the company had been to survive and turn its fortunes around. When Sam Palmisano took over as CEO in April 2002, it was the first time since the early 1990s that IBM grew more than the IT industry.874 Having been an IBM employee for 30 years, he believed that the Basic Beliefs -- originally established by Thomas Watson Jr. in 1969 -- could continue to be a foundation for the company, but that a discussion on a new set of corporate values should include which aspects were worth preserving and which aspects would need change in the new century.875 Research involving 300 senior executives and then a survey of over 1000 employees developed propositions that could be discussed across the whole company (Palmisano, Hemp, and Stewart 2004).

Values Jam invited all employees to discuss what defines IBM and IBMers. The four forums initiated discussions on (i) Company Values (i.e. Do company values exist? What would a company look like that truly lived its beliefs?); (ii) A First Draft (i.e. Consider “1. Commitment to the customer; 2. Excellence through innovation; 3. Integrity that earns trust”. Is there a nuance missing?); (iii) A Company's Impact (i.e. Is there something about our company that makes a unique contribution to the world?); and (iv) The Gold Standard (i.e. When is IBM at its best?). Moderators, with the help of text mining, surfaced emergent themes and updated links on the home page to point specifically at hot topics.

After 72 hours and 1.25 million views of the Values Jam web site, 22,007 participants had written 9337 posts and replies. Executives then reviewed the JamAlyzer e-classifier analysis and the pre- and post-jam surveys, and read the raw transcripts. The revised set of new corporate values became: (i) Dedication to every client's success; (ii) Innovation that matters – for our company and for the world; and (iii) Trust and personal responsibility in all relationships. These were published on the company intranet in November 2003, leading to 200,000 downloads by employees within ten days, a flood of postings on the intranet, and a thousand e-mails sent directly to Palmisano. Many of the reactions were positive towards the new values, but said that the company wasn't close to actually living them.

World Jam 2004, in October, came out of the responses to the Values Jam in 2003. The intent of this jam was to tease out what could be done operationally, both in the daily work of individual employees and at the policy level. The focus was to identify actionable ideas to accelerate profitable growth, unleash innovation and drive internal productivity, consistent with the new IBM values (Wing 2005a). Six discussion forums were moderated by 18 senior executives. On the value of client success were forums on “Making IBM Work for Each Client” (i.e. How can we make IBM easier to do business with?) and “Delivery Excellence” (i.e. How can we get better at delivering what the customer expects, and more?). On the value of innovation that matters were forums on “For the World” (i.e. How can we see and seize new growth opportunities?) and “For our Company” (i.e. Where and how can we innovate on IBM itself?). On the value of trust and personal responsibility were forums on “Managers” (i.e. What do our strategies and values imply for the job of the first-line manager?) and “Every IBMer” (i.e. What do our values imply for each of us -- in our jobs, and in our careers?).

For 52 hours (including a 6-hour extension) between October 26 and 28, 2004, with 2.4 million page views, 56,870 unique participants wrote 32,662 posts. The results were analyzed, based on priority and impact to the business, and distilled into 191 ideas across the six discussion forums. A “Rate the Ideas” follow-up web site became active for 7 days between November 30 and December 6, inviting every employee to log in and individually rank the 191 ideas. Senior management then committed to 35 of the top-rated ideas. There was overwhelming support for two ideas: (i) an employee survey on people manager effectiveness; and (ii) consolidation and alignment of back-office sales support functions. Three major categories emerged from 26 of the top-rated ideas: (i) lowering the center of gravity, improving cross-unit integration for client success; (ii) helping my manager to become a better manager; and (iii) enabling innovation and growth.

The success of jams internally at IBM, and the publicity in the business press, led to events for external parties.

Habitat Jam, in December 2005, was the first event external to IBM using the Jam technologies and facilitation techniques. This internet conference was jointly sponsored by the United Nations Human Settlements Programme (UN-Habitat), the Government of Canada and IBM (Debbe Kennedy 2006). The approach was to use Jam technology, an inherently global medium, to drive ideas to action, in preparation for the third session of the World Urban Forum scheduled for June 2006 in Vancouver, Canada. The promise was to have a truly democratic event, where people who would not otherwise engage in a United Nations event could join the discussion. The participation of 400 organizations around the world drew in 39,000 people from 158 countries. Slum dwellers and villagers joined architects, planners and activists in discussions, sometimes employing translators and screen reading technologies. Discussion was channelled into seven forums: (i-ii) Improving the lives of people living in slums; (iii) Sustainable access to water in our cities; (iv) Environmental sustainability in our cities; (v) Finance and governance in our cities; (vi) Safety and security in our cities; (vii) Humanity: the future of our cities. The jam resulted in 4000 pages of discussion and ideas leading to the generation of 600 ideas. From that output, 70 actionable ideas were chosen, researched and summarized into a workbook and CD for the World Urban Forum III (IBM, 2006f).

In 2006, Innovation Jam was a significant shift from prior events, in breadth and scale. In addition to IBM employees, invited participants included family members, and representatives from 67 organizations spanning business partners, customers and university researchers.876 Weighted more towards developing better ideas than towards immediate commercialization, the jam formula was broadened to incorporate major market trends as well as internal inputs.877 Unlike prior jams that did not require advance preparation, participants were asked to familiarize themselves with some 25 emerging technologies, in six broad groupings of (i) embedded intelligence; (ii) extracting insight; (iii) global collaboration for individuals; (iv) global collaboration for companies; (v) practical supercomputer; and (vi) intelligent information technology systems (Gryc et al. 2009; Bjelland and Wood 2008, 34).

Between July 24 and 27, the first phase of Innovation Jam ran for 72 hours with forums on (i) Going Places, transforming travel, transportation, recreation and entertainment; (ii) Finance & Commerce, the changing nature of global business and commerce; (iii) Staying Healthy, the science and business of well-being; and (iv) A Better Planet, balancing economic and environmental priorities. This led to 57,000 people logging in, and 37,000 posts.

In the five weeks that followed, a multidisciplinary cross-IBM team analyzed the Phase 1 outputs, resulting in 31 “big ideas”.878 These 31 big ideas became the focus for refinement in the Innovation Jam Phase 2, September 12-14. Participants were asked to flesh out the proposals, and rate each one on business impact, market readiness and societal value, leading to 9000 posts.

The output from Phase 2 was analyzed by the cross-IBM team, working up specific proposals around 10 finalists.879 On November 14, 2006, Palmisano announced that IBM would invest $100 million over the next two years to pursue these ten new businesses, partnering with multiple clients and universities to bring the ideas to market quickly (IBM 2006f). Five years later, those ten new businesses were estimated to have returned $700 million to IBM in revenue (L. Cleaver and Euchner 2011, 17).

By 2007, the IBM Jam Program Office within the Office of the CIO had been organized to facilitate jams for customers as Jam Consulting Services, contractable through IBM Global Services (L. Cleaver 2010).

  • In March 2007, the Automotive Supplier Jam was conducted with the Original Equipment Suppliers Association (OESA) (IBM 2007l; Guerrera 2007).
  • In May 2007, Nokia held the Nokia Way Jam, focused on the question “What does it take to be an Internet company?” as part of shifting Nokia's business and strategy.
  • In December 2007, Eli Lilly had a Vision Jam to generate practical ideas through an increased understanding of the company's new vision.
  • In January 2009, the UK Foreign and Commonwealth Office held a jam on One Team, Many Voices, Our Future, to model behaviour for the future organization and to break down divisions between the home office and overseas, and between career diplomats and the civil service.
  • In early 2010, Royal Dutch Shell Projects and Technologies had a My P&T Jam to create an affiliation across a new organization of 8200 globally dispersed employees.
  • In February 2010, the European Union and NATO had a Security Jam focused on the changing nature of the 21st century security landscape.
  • In March 2010, USAID and the White House had the USAID Global Pulse 2010 Jam to share ideas and creative, innovative solutions to social issues, informing U.S. foreign assistance and diplomatic strategies.
  • In June-July 2010, the UK Coventry City Council had CovJam, to engage in dynamic conversations with constituents about future direction.
  • In late 2010, Citibank Global Transaction Services had a GTS Jam to enable employees to engage directly with senior management on key initiatives for future growth.

In addition, the American Council on Education and the Kresge Foundation sponsored a Veteran Success Jam in May 2010, to address issues for returning veterans with colleges, universities and employers.880

Innovation Jam 2008 was an IBM-sponsored event with an external emphasis. The jam was structured to follow findings from the IBM CEO Study released in May 2008, where 1100 CEOs shared their visions of the Enterprise of the Future (IBM Global Business Services 2008). The four main discussion areas would be: (i) built for change -- adapting to thrive; (ii) customers as partners -- the new intelligence; (iii) globally integrated -- navigating to a flat, smart world; and (iv) the planet and its people -- moving well beyond green.

The jam ran for 90 hours from October 5th through 9th. Nearly 90,000 logins generated 32,000 posts. Participants from over 1000 companies across 20 industries included thousands of IBMers, as well as subject matter experts from Mars Incorporated, Eli Lilly and Company, Citigroup and Boston College. Jammers read 1.5 million pages in total, with the average jammer reading 76 pages and spending just under 2 hours participating (IBM 2008d).

Jammers concluded that the enterprise of the future has to do three things: (i) embrace a new level of transparency for itself and across the systems we are seeking to make smarter, allowing customers and partners to engage more intimately, and on a variety of levels; (ii) increase efficiency in every aspect of its business operations, eliminate waste, and employ new and powerful monitoring and measuring techniques to make better business decisions; and (iii) adopt corporate stewardship as a core business function, working closely with the public sector to build sustainable business practices that will improve global living conditions and drive positive social change.

In April 2009, IBM University Programs sponsored a Smarter Planet University Jam. Nearly 2000 students and faculty from 200 universities in 40 countries participated with IBMers and IBM business partners. The jammers saw the need for a new model of university education around smarter campuses, contributed 100 examples and ideas of how universities could “go green”, and identified a variety of opportunities in water, healthcare, grid technologies and cities (IBM 2009e).

In January 2010, the Institute for Business Value unit of IBM Global Business Services sponsored the Eco-efficiency Jam. Eco-efficiency “is about embedding sustainability and resource conservation in every facet of an organization”, together with government regulations and incentives. Participants included 1600 leaders, journalists and experts from more than 60 countries. Three best practice recommendations emerged: (i) “green” infrastructures would overlay the physical infrastructure with digital intelligence; (ii) sustainable solutions would promote resource efficiency while reducing the environmental and social impact of operations; and (iii) intelligent systems would use open standards for real-time information on infrastructure (IBM Institute for Business Value 2010).

In October 2010, IBM Corporate Citizenship & Corporate Affairs sponsored Service Jam. More than 15,000 people from 119 countries – including former U.S. presidents, German professors and South African tutors – discussed practices that “elevate the effectiveness and impact of volunteering, public services, social entrepreneurship and other forms of service”. The four key systems of service that presented the greatest challenge and held the most opportunity were: (i) service learning; (ii) volunteer management; (iii) partnership; and (iv) measuring impact (IBM 2010f).

In February 2011, IBM Software Group sponsored a Social Business Jam. The event brought together 2700 participants from 80 countries for 72 hours on five major themes: (i) building the social business of the future; (ii) developing participatory organizations through social adoption; (iii) using social media to understand and engage with customers; (iv) determining what social means for IT functions; and (v) identifying risks and establishing governance. The jam yielded over 2600 discussion posts and more than 600 tweets, leading to a series of insights. Over 25% of organizations had low levels of adoption of social business practices; ways of calculating return on investment and the cultural change required of the human resources function were concerns. Social technologies raised issues around the changing role of middle managers, the inclusion of lurkers who don't contribute, reconciling personal openness with business privacy, and the importance of soft infrastructure (i.e. people, processes and problem solving). Engaging customers with social technologies was seen as shifting from a vendor model to a service provider model, at the risk of overwhelming the business with the wealth of information. The IT function would need to respond more quickly, since social tools were rapidly released as consumer applications on previously unsupported smartphones and tablets. Risks had not yet been identified, nor governance put in place, for social business, with content streaming in from a new variety of sources, and employees interacting directly with customers in a mix of professional and personal spheres (IBM Software Group 2011).

In November 2011, the World Wide Web Consortium (W3C), an international community that develops open standards to ensure the long-term growth of the web, hosted the W3C Social Business Jam (World Wide Web Consortium (W3C) 2011). For this technically sophisticated audience, IBM provided the platform for a mini-jam.881 Six aspects of social technology were covered: (i) identity management for social; (ii) mobile and social; (iii) information management; (iv) business process meets social; (v) seamless integration of social; and (vi) metrics for social business. The jam had 1073 unique registrants, from 20 different industries.882 On the day of the jam, each of the six topics had posted times with hosts and special guests.883 The primary recommendation from the jam was to “form a W3C Social Business Community Group in order to develop a number of customer-driven strategic use-cases for standardizing the Social Web”. These activities evolved, with the W3C launching its Social Web Activity in July 2014 with two groups:

The Social Web Working Group, which defines the technical standards and APIs to facilitate access to social functionality as part of the Open Web Platform.

The Social Interest Group, which coordinates messaging around social at the W3C and is formulating a broad strategy to enable social business and federation (Jacobs 2014).

In December 2014, the OpenSocial Foundation transferred its specifications and assets to the W3C, moving its standards work there and integrating into the W3C legal entity.884
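As a concrete illustration of what “standardizing the Social Web” came to mean, the sketch below shows a minimal activity in the JSON-based vocabulary style that the Social Web Working Group went on to standardize as ActivityStreams 2.0. The actor, object and timestamp values are hypothetical examples for illustration only, not drawn from the specification.

```python
import json

# A minimal ActivityStreams 2.0-style activity: an actor creating a note.
# The @context identifies the W3C ActivityStreams vocabulary; the remaining
# values are hypothetical, illustrative data.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": {"type": "Person", "name": "An IBMer"},
    "object": {"type": "Note", "content": "A jam discussion post"},
    "published": "2011-11-08T12:00:00Z",
}
print(json.dumps(activity, indent=2))
```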

Jams started as massively collaborative internal IBM events in 2001 with World Jam, were externalized in 2005 with Habitat Jam, were offered commercially from 2007 through Jam Consulting Services, and were used for external studies starting in 2008. The collaboration has always had an open sourcing style, with individuals able to voice their views at a peer-to-peer level. The planning, operation and reporting of such large undertakings has followed a private sourcing style, with committed resources managed on a worldwide schedule.

B.2.6 From 2005, IBM early adopters have collaborated on innovations via the Technology Adoption Program

In 2005, the IBM CIO (Chief Information Officer) recognized that, outside of research and development labs, innovation can happen anywhere in the enterprise. While labs have resources to progress work from the design phase into a feature of an offering, individuals without such a mandate in their job roles lacked access to comparable infrastructure, so potential innovations could go undeveloped, forgotten and/or unrealized. The Technology Adoption Program (TAP) was designed to nurture self-selecting innovators dispersed across the enterprise. This required a shift in the role of the CIO from operations manager to innovations manager (Chow et al. 2007, 640–641).

The Technology Adoption Program was launched in August 2005.885 TAP began as a two-page proposal by Sandesh Bhat, Director of Technology and Innovation in the Office of the CIO, with the idea: “What if 30,000 employees were always running the n+1 version of the w3/Intranet?”886 The first projects mentioned were CEWL (Client for Enterprise Web Services for Blackberry smartphones), Tommy! (a Firefox extension that would augment person-identifiers with tagging) and Dogear (a Firefox extension for enterprise social bookmarking).887 These, and subsequent projects, have been described as crowdsourcing in intra-enterprise communities: instead of assigning responsibility for early adoption development to a specific employee, volunteers donate their time to provide feedback (Alkalay et al. 2009, 3). Even projects with negative feedback were seen to provide value.888

The staff assigned to TAP was relatively small, starting in 2005 with an initial core team of four. IBM management consciously allowed TAP to evolve organically, responsive primarily to community priorities rather than corporate constraints. In 2007, as staff was added to manage the large number of TAP offerings, the emphasis shifted towards accelerating the rate of graduation from early adopters to production support organizations. At the close of 2007, there were 143 TAP offerings and over 100,000 registered TAP users -- about 28% of IBM's total worldwide workforce (Alkalay et al. 2009).

The life cycle process for a TAP offering was structured as three major phases:

  1. In an innovation proposal phase of 1 or 2 weeks, the TAP web application would support (i) offering proposal submission; (ii) proposal review; (iii) completion of a boarding questionnaire; (iv) proposal evaluation; (v) a boarding welcome call to understand how the TAP team might help; and (vi) early adoption and prototype deployment, including promotion through TAP channels.
  2. In the offering phase of about 9 months, assessment and evaluation by early adopters, through both active data collection (e.g. surveys, defect tracking) and passive data collection, would track usage, interest (as buzz or hype), level of satisfaction, and potential innovation value.
  3. At graduation, an offering would be (i) returned to development for refinement and a second round through TAP; (ii) retired; or (iii) moved to a production environment, e.g. product development through IBM Software Group, asset commercialization through IBM Global Business Services or IBM Technology Services, alphaWorks, or the internal Applications Hosting Environment (AHE).

The TAP community was seen as early adopters releasing trial offerings to other early adopters, so a bounded offering time would be sufficient to assess adoption or lack of interest.
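Read as a state machine, the lifecycle above has two transitions and three terminal outcomes. The following sketch is purely illustrative, assuming hypothetical class names and thresholds; it is not drawn from TAP's actual tooling.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Phase(Enum):
    PROPOSAL = auto()    # 1-2 weeks: submission, review, boarding
    OFFERING = auto()    # ~9 months: early adopters trial the offering
    GRADUATED = auto()   # bounded offering time has ended

class Outcome(Enum):
    REFINE = auto()      # back to development for a second TAP round
    RETIRE = auto()      # insufficient interest or value
    PRODUCTION = auto()  # e.g. Software Group, services, alphaWorks, AHE

@dataclass
class Offering:
    name: str
    phase: Phase = Phase.PROPOSAL
    ratings: list = field(default_factory=list)  # early-adopter feedback

    def board(self) -> None:
        """Proposal accepted after review and the boarding welcome call."""
        self.phase = Phase.OFFERING

    def graduate(self, min_adopters: int = 100) -> Outcome:
        """Pick an outcome when the bounded offering time ends."""
        self.phase = Phase.GRADUATED
        if len(self.ratings) < min_adopters:
            return Outcome.RETIRE  # lack of interest
        average = sum(self.ratings) / len(self.ratings)
        return Outcome.PRODUCTION if average >= 4.0 else Outcome.REFINE

# Example: an offering that attracted strong feedback moves to production.
dogear = Offering("Dogear")
dogear.board()
dogear.ratings.extend([5, 4, 5] * 40)  # 120 adopters, high satisfaction
print(dogear.graduate())               # Outcome.PRODUCTION
```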

In February 2009, TAP was organizationally moved into a new Innovation Programs department that also managed BizTech and Thinkplace Next.889 BizTech was a program focused on forming virtual teams of early-tenure employees of one to five years. A project sponsor would back a day-per-week effort for nine to twelve months. Expected results could include cost savings, project improvements, or the launch of a new technology (Alkalay et al. 2009, 14). Thinkplace was an ideation management forum originally developed by IBM Research. Ideas could be suggested, commented upon, rated, sorted and routed. Thinkplace Next was an evolution of the Thinkplace collaborative tool, combined with social networking tools, i.e. Beehive (a social networking site) and Small Blue (a social network analysis tool) (Majchrzak, Cherbakov, and Ives 2009, 105–107).

TAP continued to evolve, with an aim to embed innovation into every employee's workday. By 2010, the Innovation Carousel would enable three types of projects from TAP to be selected to appear on each employee's personalized intranet home page. Social networking features would go beyond searching TAP for projects of interest, surfacing what colleagues with close social ties were following.890

The open sourcing style of collaboration in TAP was not only encouraged by the Office of the CIO, but also tracked by IBM Research to learn about new ways of working over the Internet. In 2011, the experiences gained from TAP were directly credited in the development of the private sourcing commercial Lotus Connections offering (Naone 2011).

B.2.7 From 2005, IBMers wikied guidelines and grew social computing

Since 1956, IBM has had Business Conduct Guidelines in place, describing appropriate behaviour for IBM employees.891 Since 1995, employees have followed the IBM Internet Usage Guidelines, including an Acceptable Use Policy on the gateways from the intranet to the outside world (Patrick and Trio 1995).

By May 2005, the rapid rise of blogging by IBM employees on the open Internet, as well as on the intranet -- “to just shy of 9,000 registered users spanning 65 countries, 3,097 individual blogs, 1,358 of which are considered active, with a total of 26,203 entries and comments” in just 18 months -- led to IBMer bloggers coming up with their own core principles. Developed over ten days on an internal wiki, “this isn't a policy that IBM is imposing upon us -- it is a commitment that we all have entered into together” (Snell 2005b).

The IBM Blogging Guidelines were posted on the public IBM web pages, and led other enterprises to reflect on their attitudes towards social interaction over the Internet.892 The guidelines recognized that employees are individuals: “You must make it clear that you are speaking for yourself and not on behalf of IBM”. The detailed discussion reiterated that “IBM supports open dialogue and the exchange of ideas”. At the same time, it cautioned: “Don't forget your day job. You should make sure that blogging does not interfere with your job or commitments to customers.” In addition to the guidelines, a directory of “IBMers' blogs” on the open Internet was provided.

In July 2007, the rise of immersive technologies (e.g. Second Life) led to the publication of the IBM Virtual World Guidelines (Reynolds 2007). By May 2008, a group again working on a wiki on the intranet generalized the prior writings into the IBM Social Computing Guidelines (Piper 2008). The revision was published on the public IBM web site, deprecating the earlier works. In 2010, these guidelines were subtly evolved.893

At the end of 2007, the IBM Social Software Enablement Program formed a BlueIQ Ambassadors Community as a peer-to-peer network of volunteers who would become social software evangelists. The community intranet web site described the aims:

BlueIQ Ambassadors are social software enthusiasts who help IBM individual employees, teams and communities with using social software. We seek to build a worldwide community of social software evangelists who are passionate and want to learn more about social networking, and who can volunteer their time and talent to energize and enable every IBM employee in order for him/her to benefit from using social software, both internally and externally (Tuutti 2010, 34–35).

Beyond the early adopters, the goals of enabling client-facing employees to (i) leverage the collective intelligence of IBM, (ii) improve productivity, and (iii) serve clients more effectively attracted the sponsorship of the Senior VP of IBM Software Group. To support the global IBM base of 400,000 IBM employees across the wide range of job roles, geographies and product lines, 9 people were funded to operate the program (Murray and Shah 2010). The purpose of the initiative, described on the intranet wiki site, was to:

  • Showcase the business benefits of IBM social software, in both internal and external use, to help employees learn about it, get productive with it, connect to communities with it, and share it with other users, clients, partners, and press.
  • Operate as a living lab filled with the latest social software tools and programs, education and advice, marketing materials, and success stories.
  • Offer a starting point for quickly and easily making the most of social software -- and sharing best practices and success stories -- as an individual, a member of a collaborative team or social-networking community, or a BlueIQ ambassador (Tuutti 2010, 34).

By 2011, the BlueIQ Ambassador community had grown to 1600 volunteers. They had held 160 events to educate each other on how to collaborate, and then 50 events to engage other IBMers. The ambassadors are credited with spreading the philosophy and practices of social business, with adoption growing from 11% of the sales force in 2009 to over 40% in 2011. Over 66% of sales roles were estimated to be using the environment (Shah, Murray, and Overly 2011).

Leading up to the 2011 IBM Centennial Program, the Senior VP of IBM Marketing and Communications sponsored the Social Business @ IBM program starting in 2010. This made 29,000 IBM experts visible on the external IBM web site, extending the BlueIQ program (R. Shah, Murray, and Overly 2011). The crossover to IBM's commercial offerings was complemented by a web site and publication on how Social Business could be a game changer, with references on how other enterprises had effectively networked their people (Hassell et al. 2011).

In hindsight, the progress on Enterprise Social Strategy has been traced from 2003 to 2013, summarized in Table B.1. Four stages of maturity have been mapped over five periods (Emerick 2013).

Table B.1 Enterprise social strategy (adapted from Emerick 2013)

Ad-hoc experimentation / Discovery (2003-2009)
  • Characteristics: no engagement; lack of cultural readiness; restricted (i.e. legal, regulated industry); later in the period, limited complexity; free tools / tools evaluation; experimental; people working in their spare time
  • Leadership: management skeptical of business value; taking steps to build skills and culture
  • Business value: brand becoming irrelevant to an increasingly social society; unable to control risk; no influence over earned media; later, some improved customer understanding; some brand surveillance; cost reduction

Sponsored exploration (2009-2010)
  • Characteristics: stovepiped investments; decisions by BU functions; ownership disputes
  • Leadership: no overarching vision; under-developed coordination
  • Business value: improved customer insight; new customer touchpoints; predictive modeling

Business unit engagement (2011-2012)
  • Characteristics: business targets for social; comms or marketing straining to scale; growing effort to coordinate; some data integration
  • Leadership: successful transformation of major enterprise processes
  • Business value: strengthened productivity through worker enablement; increased employee engagement; enhanced customer acquisition

Enterprise engagement (2012-2013)
  • Characteristics: business targets for social; business process focus; enterprise standards and best practices; data integrated into business processes; cost efficiency improvements
  • Leadership: strong overarching vision and culture; good governance; digital initiatives generating measurable business value
  • Business value: socially modified business models; open innovation; top line growth; new revenue streams

The years 2003 to 2009 were a stage of ad-hoc experimentation and discovery. It started with no engagement, a lack of cultural readiness, legal restrictions, skeptical management and an inability to control risks to business value. By the end of the period, people had tried the free tools in their spare time, leadership saw that skills and culture should be built, and customers began to have some understanding that an enterprise social strategy should be pursued.

By 2009 to 2010, sponsored exploration saw some decisions autonomously made by business unit functions, in the absence of an enterprise vision and coordination. Ways in which customers could interact with the enterprise improved.

In 2011-2012, business unit engagement saw social business targets emerging, with leaders transforming some major processes. Productivity amongst enabled workers strengthened, and impacts on customers were noticed.

Only by 2012-2013 did enterprise engagement occur. Social business work on business processes, standards and data was tracked for cost efficiency and investment sharing. A strong vision and governance guided digital initiatives with a promise of generating business value. A modification of the business model towards open innovation would produce growth and new revenue streams.

The six years from 2003 to 2009 can be seen as individuals and communities learning open sourcing within a private sourcing context. Expanding this to the enterprise level would take about 4 years, from 2009 to 2013. Social business would be largely a formalization of ways of interacting that had developed slowly and informally from a grassroots level.

B.2.8 From 2006, IBM alumni connect via the Greater IBM Connection

In June 2006, a Google Group was formed by IBM U.S. Strategic Communications to open discussions on forming a Greater IBM group. Complemented by a WordPress blog, this initiative was:

... part of our larger strategy to build a community of IBM veterans, whether retired or still working, as a global innovation community. The goal is nothing less than to reimagine what several hundred thousand IBMers beyond the company’s active workforce can accomplish by having more interaction with the company and current IBMers.

A virtual team was formed to plan, develop and drive the success of an IBM alumni network:

There are an estimated 800,000 to a million former IBMers worldwide. They include traditional retirees; long-term employees who’ve pursued other opportunities; people who spent a few years with the company and have long careers ahead of them elsewhere; even former interns.

The "Greater IBM Community" is new model for lifelong affiliation that encompasses individuals in all of these categories, as well as the company itself and our intersecting circles of friends, colleagues, business partners, clients and advocates (Goodson 2006).

At that time, a variety of motives for a corporate alumni network were cited: 20 to 25% of professional hires are former employees; they are more productive on reboarding; they stay longer the second time; and a recruitment cost of 20% to 30% of an annual salary can be saved.894 Generation Y young adults, expecting high employee turnover in the 21st century, often stay connected to their networks.

In October 2006, a virtual block party was held for the Greater IBM Connection in Second Life. On Almaden Island, 43 attendees joined from 11 countries (Greater IBM 2006). In addition to the Second Life virtual online meeting room where groups could communicate via text chat, a teleconference line for voice was also provided (Mariacher 2006).

By December 2006, the Greater IBM Connection was on the two largest business social networking web sites: Xing and LinkedIn (Suarez 2006). A site was launched on Facebook in 2007, and a Twitter identity was created in July 2008.

In December 2008, a virtual holiday party was held in Second Life for the Greater IBM Connection (Debbe Kennedy 2008).

By 2012, the online community had 135,000 members. The benefits of participating in the Greater IBM Connection were found to be (i) staying connected with former colleagues; (ii) finding job opportunities, with an interest in rehiring; and (iii) keeping abreast of thought leadership and initiatives coming from IBM (Swenson 2012).

The Greater IBM Connection has been cited as an example of a new network-driven approach to human resources, where former employees may have opportunities to consult on projects, mentor, or participate in engagement or development events (Kwan 2013).

B.3 IBM consultants, from 2004, focused on priorities of business leaders through industry-based executive studies

The IBM Institute for Business Value was formed in 2002, within IBM Business Consulting Services, with a charter to develop primary research.895 Towards anticipating the issues that top business leaders were facing, a series of global executive studies prioritized themes around which IBM could coalesce its resources and position its offerings. The primary research began with the Global CEO Study in 2004, complemented from 2005 with studies of functional leaders, later to be known as the Global C-suite Studies. Outside IBM, the global executive studies showed that IBM was listening across a broad range of customers, and provided a platform for discussion with business leaders. Inside IBM, the global executive studies set contexts through which leaders and employees could assess their alignment with marketplace changes. All of these studies were funded by IBM at the corporate level, and involved hundreds of interviews with executives in the public and private sectors.

B.3.1 From 2004, IBM consultants surveyed priorities on innovation and strategic change with Global CEO Studies

In early 2004, the first Global CEO Study was published with the title Your Turn: CEOs across the world are renewing their organizations for growth. Are you? It reported on interviews with 456 CEOs “to understand current strategic issues, ambitions and concerns affecting the world’s CEOs” (IBM 2004d, 7). Key findings included that (i) four of five CEOs believed that revenue growth was the most important path to boosting financial performance, following some years of emphasis on cost containment; (ii) CEOs placed customer responsiveness high on their agendas, while rating their ability to respond to changing market conditions and risks low; and (iii) growth and differentiation come through people, but deficiencies in skills – both inside organizations and across the wider labour force – call for re-education and retention, with a need for managerial leadership.

In March 2006, the second Global CEO Study was titled Expanding the Innovation Horizon. It reported on interviews with 765 CEOs -- 80% face-to-face -- to capture views on innovation. Innovation was defined as “using new ideas or applying current thinking in fundamentally different ways to effect significant change” (IBM 2006c, 3). Findings included that (i) business model innovation was much higher on CEOs' priority lists than expected, although this did not negate the need for innovation in products, services, markets and operations; (ii) external collaboration with business partners and customers ranked as the top source of innovative ideas, well above research and development, with an admission that organizations were not collaborating enough; and (iii) innovation has to be fostered by the CEO, orchestrating teams, rewarding individuals and better integrating business and technology.

In April 2008, the third Global CEO Study was titled Enterprise of the Future. It reported on interviews with 1130 CEOs, more than 95% face-to-face, on the enterprise of the future (IBM 2008f). Findings included (i) eight of 10 CEOs felt bombarded by change, with the gap between expected change and the ability to manage it having tripled since the 2006 study; (ii) more demanding customers were seen not as a threat, but as an opportunity to differentiate; (iii) two-thirds of CEOs were adapting their business models, with 40% changing to be more collaborative; (iv) aggressive movement towards global business designs, changing capabilities and partnering more extensively; and (v) financial outperformers making bolder plays.

In April 2010, the fourth Global CEO Study was titled Capitalizing on Complexity. With three prior studies as a foundation, this edition “also sought to understand differences between financial standouts and other organizations” (IBM 2010b). It reported on interviews with 1541 CEOs in 60 countries and 33 industries. In addition, a subset of the questions was also asked of 3619 students from 100 major universities around the world, to “provide insight into the views of future leaders”. Findings included that (i) 79% of CEOs expected greater complexity ahead, but 51% doubted their ability to manage it; (ii) 60% said embodying creative leadership would be the most important leadership attribute over the next five years, with 52% mentioning integrity; (iii) 88% planned to get closer to their customers, reinventing relationships to co-create products and services and to integrate customers into core processes; and (iv) 61% of standouts intended to build operating dexterity by simplifying operations, compared to 47% of others, with an expectation that 20% more future revenue would come from new sources.

In May 2012, the fifth Global CEO Study was titled Leading Through Connections. The focus was on “the complexity of increasingly interconnected organizations, markets, societies and governments”, also called “the connected economy”. It reported on 1709 CEOs across 64 countries (IBM 2012b). Findings included that (i) 75% of CEOs sought collaboration as the number one trait in their employees; (ii) 70% were investing in customer insights, above operations, competitive intelligence, financial analysis and risk management; and (iii) more than half were partnering extensively to drive innovation.

Subsequent CEO studies were not packaged independently, but were integrated with the C-suite Studies.

B.3.2 From 2005, IBM consultants surveyed functional executives with additional C-suite studies

Before the global executive studies started in 2004, the Institute for Business Value published a survey in September 2003, titled CFO Survey: Current state and future direction. It reported on interviews with 450 CFOs from 35 countries, representing global enterprises with average annual revenues of US$8.4 billion (IBM 2003b). Over the past 5 to 7 years, CFOs had seen organizations shifting to a “new, more efficient and effective model”, including three transformations: (i) in role, from “policeman” to strategic business partner; (ii) in cost base, from 3% to 1% of revenue; and (iii) in activity focus, from transaction processing to decision support and control. The model of the future, for an on-demand world, would see (i) the emergence of a Chief Focus Officer proactively driving business model design and portfolio configuration; (ii) responsive business management architectures; (iii) resilient governance structures; (iv) competitive cost structures across the finance network; and (v) CFOs becoming integrators of process, technology and people.

In September 2005, the Global Human Capital Study was titled The Capability Within. It reported on input from 320 organizations, including interviews with 106 CHROs (Chief Human Resource Officers), with organizations ranging in size from 1200 to over 25,000 employees (IBM 2005c). The study confirmed “the findings of the IBM Global CEO Study 2004”, with fewer than half of participants seeing their organization as “adequately equipped to respond to the growth and responsiveness priorities set out by their CEOs”. Findings included (i) CHROs describing their businesses as “cycling downhill” in either “maturing” or “declining” markets, where HR strategies originally in place to drive growth had shifted to institutionalizing and consolidating processes; (ii) the question of talent seen as a buy-build balance, where higher investments in the development of middle managers brought higher profits per FTE (full-time equivalent), at the risk of enabling higher voluntary turnover of staff; (iii) the challenge of retaining people in a fast-moving world, where more than 50% of CHROs believed their organizations were “doing alright” with work/life programs and flexible hours, fewer than 50% had adopted a relaxed dress code, and only 30% of organizations had implemented child-friendly policies; (iv) organizations falling short on knowing which areas need to be measured, with only 25% measuring return on investment in human capital, and few incorporating human capital measures into their leadership rewards; and (v) significant differences across regional geographies challenging a unified corporate culture, with the Asia-Pacific region having advantages in cheap labor, and Latin America volatile in layoffs and a relatively young workforce.

In December 2005, the Global CFO Study was titled The Agile CFO: Acting on business insight. In cooperation with the Economist Intelligence Unit, 889 CFOs were interviewed (IBM 2005b). The CFOs' top areas of importance (performance, growth and risk) were thought to track with the 2004 Global CEO Study's top agenda items of growth and responsiveness. The most effective finance organizations were addressing issues of (i) structural complexity and (ii) fragmented information.

In October 2007, the Global CFO Study was titled Balancing Risk and Performance with an Integrated Finance Organization, in cooperation with the Economist Intelligence Unit and Wharton School professors. The preface by Mark Loughridge, CFO of IBM, cited the Palmisano 2006 “vision for the 21st century successor to the multinational corporation, the globally integrated enterprise”, and asked “what does it mean for the Finance discipline?” to enable this innovation (IBM 2007c). The primary survey instrument was delivered to more than 1200 CFOs in 79 countries, with 619 surveys conducted in person and 611 via an online survey tool. Findings included that (i) 62% of enterprises with revenue over US$5 billion had had material risk issues in the prior three years, with 42% saying they were not prepared, while 46% of enterprises with revenues under US$5 billion had had a major risk event, and 39% were not well prepared; (ii) IFOs (Integrated Finance Organizations) had been adopted by only 1 in 7 enterprises over US$1 billion in revenue, yet their revenue growth rate of 18% over five years was almost double the 10% of non-IFOs; (iii) 69% of finance executives believed that greater integration was an imperative, yet no significant progress had been made in the past 3 years; and (iv) risk management was immature, with only 42% doing historic comparisons, 32% setting specific risk thresholds and 29% creating risk-adjusted forecasts and plans. A roadmap to mature into an IFO was proposed.

In September 2009, the first Global CIO Study was titled The New Voice of the CIO. Participants included 2598 CIOs in 78 countries and 19 industries (IBM 2009b). CIOs were found to spend (i) 55% of their time on activities that spur innovation, including generating buy-in for innovative plans, implementing new technologies and managing non-technology business issues; and (ii) 45% of their time on managing the ongoing technology environment, including reducing IT costs, mitigating enterprise risks and leveraging automation to lower costs elsewhere in the business. At any time, a CIO could be described as (i) an insightful visionary and able pragmatist; (ii) a savvy value creator and relentless cost cutter; and (iii) a collaborative business leader and an inspiring IT manager. The three seemingly opposing mindsets would see the CIO lead initiatives to (i) make innovation real; (ii) raise the ROI of IT; and (iii) expand business impact.

In March 2010, the Global CFO Study was titled The New Value Integrator. Participants included more than 1900 CFOs from 81 countries and 32 industries, 75% interviewed face-to-face by IBM executives, and the remaining surveyed by the Economist Intelligence Unit (IBM 2010d). Two primary capabilities were identified: (i) finance efficiency, as the degree of process and data commonality across Finance; and (ii) business insight, as the maturity level of Finance talent, technology and analytical capabilities dedicated to providing optimization, planning and forward-looking insights. This led to four segments: (a) scorekeepers, with low efficiency and low insight, primarily focused on reporting results and ensuring regulatory compliance, struggling with speed and consistency because of insufficient standards and automation; (b) constrained advisors, with low efficiency and high insight, developing strong analytical capabilities but working with incomplete and inconsistent information requiring reconciliation and manual intervention; (c) disciplined operators, with high efficiency and low insight, competent in financial control and reporting activities in a highly automated, efficient manner, but lacking the capabilities to provide cross-functional analyses and assist with strategic operational decisions; and (d) value integrators, with high efficiency and high insight, with heightened interest in using technology to improve data accuracy, streamline information delivery and developing a richer base of information and deeper insights. Value integrators were found to outperform all other enterprises with more than 20 times EBITDA, 49% more revenue, and 30% more ROIC.
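The segmentation amounts to a two-by-two classification on the two capabilities. A minimal sketch of that mapping, with hypothetical boolean inputs standing in for the study's maturity assessments:

```python
def cfo_segment(high_efficiency: bool, high_insight: bool) -> str:
    """Map the two Finance capabilities onto the study's four segments."""
    if high_efficiency and high_insight:
        return "value integrator"      # outperforms on EBITDA, revenue, ROIC
    if high_efficiency:
        return "disciplined operator"  # strong control, limited analysis
    if high_insight:
        return "constrained advisor"   # strong analysis, inconsistent data
    return "scorekeeper"               # reporting and compliance focus

print(cfo_segment(True, True))   # value integrator
```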

In October 2010, the second Global CHRO Study was released, titled Working Beyond Borders. Participants included 707 CHROs, with almost 600 interviewed face-to-face (IBM 2010g). The report found that the rationale behind workforce investment had changed. The traditional pattern of companies in mature markets seeking operational efficiency through headcount growth in emerging economies no longer held, with CHROs in growth markets (e.g. China and India) increasing their workforce presence in North America, Western Europe and other mature markets. This shift led to three new requirements: (i) cultivating creative leaders who could provide direction, motivate, reward and drive results from an increasingly dispersed and diverse customer base; (ii) mobilizing for speed and creativity through simplifying processes and providing fast, adaptive workforce solutions; and (iii) capitalizing on collective intelligence, tapping into institutional knowledge with new ways to connect people both internally and externally.

In December 2010, the first Chief Supply Chain Officer Study was titled The Smarter Supply Chain of the Future. Participants included 393 executives located in 25 countries and serving 29 industries. They were compared to top supply chains, with 17 listed in “The AMR Research Supply Chain Top 25 for 2008” (IBM 2010e). Chief Supply Chain Officers were dealing with volatility by coping with (i) cost containment, keeping up with frenetic shocks of wage inflation, spikes in commodity prices and sudden credit freezes; (ii) visibility, with organizational silos too busy to share information or not believing that collaborative decision-making is important; (iii) risk, with 69% of respondents formally monitoring risk, but only 31% managing performance and risk together; (iv) customer intimacy, with 80% of respondents designing products jointly with suppliers, but only 68% doing so with customers; and (v) globalization, with issues reported in unreliable delivery (65%), longer lead times (61%) and poor quality (61%), while nearly 40% reported improved margins through increased sales rather than greater efficiency.

In May 2011, the second Global CIO Study was released, titled The Essential CIO. This was the first in the program now called the C-suite Studies. Participants included 3018 CIOs, spanning 71 countries and 18 industries (IBM 2011d). Aligning with the 2010 Global CEO Study vision to increase competitiveness, CIOs cited plans including business intelligence and analytics (83%), mobility solutions (74%), virtualization (68%), cloud computing (60%, a 45% jump since the 2009 study) and business process management (60%). Four distinct CIO Mandates were identified, based primarily on how each organization views the role of IT: (i) the mandate to leverage, where IT was a provider of fundamental technology services, and operations were being streamlined for greater organizational effectiveness; (ii) the mandate to expand, where the CIO led IT operations that helped expand organizational capabilities by refining business processes and enhancing enterprise-wide collaboration; (iii) the mandate to transform, where IT would provide industry-specific solutions to the value chain, enhancing relationships with customers, citizens, partners and internal clients; and (iv) the mandate to pioneer, where IT was a critical enabler, radically re-engineering products, markets and business models.

In October 2011, the first Chief Marketing Officer Study was published, titled From Stretched to Strengthened. Participants included 1734 CMOs in 19 industries and 64 countries, interviewed face-to-face. Respondents included 48 of the top 100 brands in the Interbrand rankings (IBM 2012a). Compared to the 2010 CEO Study, CMOs were aligned on the top external forces affecting their organizations: (i) market factors and (ii) technology factors. Four challenges were seen as pervasive, universal game changers: (i) the data explosion, with 70% of CMOs not fully prepared to deal with the impact; (ii) social media as a completely different tool from traditional channels; (iii) the proliferation of channels and device choices; and (iv) shifting consumer demographics, with 63% seeing a significant impact on the marketing function while only 37% were prepared to deal with the shift. Outperforming organizations addressed these challenges differently from other CMOs. The three areas where marketing needed to improve were: (i) delivering value to empowered customers; (ii) fostering lasting connections; and (iii) capturing value and measuring the results of their efforts.

The most intensive research in the C-suite Studies appears to have ramped down after the ten-year period ending in 2013.896 In October 2013, “the first study of the entire C-suite” was titled The Customer Activated Enterprise. This drew on cumulative data from 23,000 interviews stretching back to 2003 (IBM 2013c). The combined study cited meetings with 4183 top executives – 884 CEOs, 576 CFOs, 342 CHROs, 1656 CIOs, 524 CMOs and 201 CSCOs – between February and June 2013. Separate reports for each of six executive roles were created:

  • The CEO insights released in November 2013 were described in Reinventing the Rules of Engagement. While CEOs since 2004 had consistently identified market forces as the biggest driver of change, they placed technology at the top of the list for the first time in 2012 (IBM 2013b).
  • The CIO insights were released in November 2013 in Moving from the Back Office to the Front Lines (IBM 2013a).
  • The evolution of the CFO perspective was released in February 2014 in Pushing the Frontiers (IBM 2014d).
  • The CHRO insights were released in March 2014 in New Expectations for a New Era (IBM 2014b).
  • The changing world of the CMO was reported in March 2014 in Stepping Up to the Challenge (IBM 2014e).
  • The ways that the CSCOs were preparing for the future were reported in May 2014 in Orchestrating a Customer Activated Supply Chain (IBM 2014c).

The Final Chapter was released in June 2014, not as a hardcopy publication, but as an app downloadable to a tablet or viewable on the web, Exploring the Inner Circle (IBM 2014a). This summarized the ten years, 17 studies, and 23,000 face-to-face executive interviews.

The program of understanding customer executive perspectives from 2003 through 2013 represents a significant investment by a private company, not only to guide its own interests, but also to share openly in potential future directions.

B.4 IBM researchers, from 2004, led studies on longer horizon opportunities for social impact

IBM set up an independent research division in 1956, having temporarily set up the Watson Scientific Computing Laboratory at Columbia University in 1945 (IBM 2011e). It is now a worldwide organization, with the most recent locations being the China Research Laboratory in 1995, the India Research Laboratory in 1998, IBM Research Ireland and IBM Research Australia in 2011, and IBM Research Africa in 2013. The IBM Research Division continues to provide “a dynamic mix of long-range scientific research with work that feeds cutting-edge innovations to the company’s business units”.

IBM Research has always had a longer horizon of five to ten years, as compared to the hardware and product divisions with development plans of three to five years, and the sales and distribution division looking out one to three years. IBM has always had strong ties with universities, dating back to hosting the first meeting of the precursor to the Association for Computing Machinery (ACM) at Columbia University in September 1947.897 With the rise of the Internet, the Faculty Portal and Student Portal hosted by IBM University Programs would expand into the IBM Academic Initiatives in 2003.898 With business success by the early 2000s, the Research division invested in two further initiatives that opened up the company to a broader range of stakeholders: (i) the Global Innovation Outlook; and (ii) Service Science, Management, Engineering and Design.

B.4.1 Since 2004, IBM researchers led the Global Innovation Outlook

IBM Research has, since 2000, annually published a Global Technology Outlook for internal use as part of its forecasting processes, based on trends that might be disruptive or harbingers of change.899 In 2004, the Global Innovation Outlook was initiated as a complement, with two differences: (i) for interdisciplinary collaboration, the viewpoints of clients, partners (both academic and business), thought leaders and proponents of change were to be included; and (ii) the scope would be aspects of life where quality of life could be changed over a horizon of five to ten years, with the disciplines and specialties that might bring innovation to bear on them as a subsequent consideration (IBM 2004b, 8–9).

The Global Innovation Outlook would release reports from 2004 to 2008. In hindsight, there would be a distinct focus each year:

  • GIO 1.0 focused on healthcare, government and work-life balance;
  • GIO 2.0 focused on future of the enterprise, environment and transportation;
  • GIO 3.0 focused on media and content, and on Africa; and
  • GIO 4.0 focused on security and society, and water and oceans.

In contrast to the typical IBM process, the reports were issued without form numbers, and later editions carried Creative Commons licenses.900 IBM sponsored the dialogues on innovation, business transformation and societal progress, in collaboration with a global ecosystem of experts.

The motivation described for the Global Innovation Outlook program was that it was “not just our understanding of innovation that needs adjusting”, but that “innovation itself is changing in at least three major ways” in the 21st century:

one: It is occurring more rapidly — barriers of geography and access have come down, enabling shorter cycles from invention to market saturation.

two: It requires wider collaboration across disciplines and specialties — where until recently, people hunkering down in a garage could create a new technology that would sweep the world, many challenges are now too complex to be solved by individual pockets of brilliance, let alone brilliant individuals. Combinations of technologies, expertise, business models and policies will now drive innovation.

three: The concept of intellectual property is being reexamined in the light of these collaborative demands. Increasingly, entities that treat intellectual assets more like capital — something to be invested, spread, even shared to reap a return, not tightly controlled and hoarded —will find the clearest paths to success (IBM 2004b, 5).

While IBM leaders thought that IBM had excellent methods for examining technology and business trends, the company had no single integrated view of innovation. Two challenges were: (i) working across disciplines outside the company's borders; and (ii) the “invent first, apply second” bias that had crept into modern thinking. Developing the GIO thus led to two major goals:

FIRST, extend the integration of our business insight and technology expertise beyond our company’s borders to include the best thinkers from academia, our clients and partners, and other leaders in areas critical to innovation.

SECOND, follow a different path to discovery: begin with several areas critical to society over the next five to ten years, then consider implications for businesses and other integral components of society, finally considering what technologies or solutions might need to be developed (IBM 2004b, 9).

In addition, the process to produce the GIO “would be quick and spontaneous, and would maneuver around the boundaries of normal business practice”, conditions under which innovation thrives.

In 2004, GIO 1.0 convened 10 “deep dive” sessions in New York, Shanghai, Washington D.C. and Zurich over 24 days. Three broad societal themes were discussed: (i) healthcare; (ii) government and its citizens; and (iii) the business of work and life. One hundred ecosystem members from 96 organizations joined discussions with 100 IBMers, complemented by 25 additional interviews with global thought leaders. In the report released in November 2004, three consistent themes emerged from the wide range of ideas: (i) the need for standard ways of exchanging information between members of each ecosystem (and across ecosystems); (ii) the need for more open collaboration between ecosystem members (even, at times, among competitors); and (iii) the primacy of the individual as a focal point for innovation (IBM 2004b, 14).

On healthcare, three areas emerged that held significant promise for innovation:

  • integrated health records;
  • the implications of new delivery models designed to meet the needs of underserved populations; and
  • implications of a deeper understanding of ourselves (IBM 2004b, 24).

On government and its citizens, significant ways the relationship would evolve were noted as inevitable:

  • Governments will have to become more efficient and integrated across agencies and ministries.
  • Governments will become subject to new kinds of influence and pressure due to novel uses of communications technology — “they will not be able to escape the bloggers,” said one participant.
  • Governments will compete and cooperate with each other more on the basis of virtual factors (skills, expertise, infrastructure and productivity) than on simple, traditional geographic advantages.
  • In some ways, governments may behave more like businesses (tying plans to budgets and strictly measuring results and return on investment to society), but in others, they will need to remember what makes them fundamentally different: they cannot pick and choose “clients,” since they should act on behalf of all their citizens (IBM 2004b, 41).

On the business of work and life, the factory's redefinition of work through the industrial revolution as “industrial indenture” was being reshaped by knowledge workers:

  • Finding the off switch in an always-on world reflected “a return to earlier, pre-industrial models where work performed at home (in the fields, at the hearth) was not thought of as something entirely distinct from life”. This changes the ways that a workforce is managed, as well as potentially portending a trickle-up into larger structures of society.
  • For workers, work becomes academic, as “First, workers will no longer be able to rely on expertise (including university degrees) earned early in life to keep them at the front of the skills queue. Second, it will be unlikely that universities and other educational institutions trying to keep abreast of the dynamic nature of work will be able to do so”. This could lead to tighter collaboration between academia and industry, or even “a time when leading companies join the ranks of universities in being accredited to offer advanced degrees”.
  • Corporate culture catches up to the knowledge age, where retirement programs that “pressure knowledge workers to leave at a set age with no accommodation for an ongoing association may be wasting their best assets” and a “normalizing culture that allows interaction and collaboration” across cultural differences that are “national, ethnic, linguistic, educational, expertise- or skill-related” (IBM 2004b, 55–67).

The 2004 report closed by saying the exploration paving the way for change was not a conclusion, but a way to “stimulate concrete action”. Projects sparked by GIO 1.0 had already started.

Building on that first outlook, the view on innovation evolved, so that:

...innovation is no longer invention in search of purpose, no longer the domain of a solitary genius looking to take the world by storm. Instead, innovation is increasingly:

Global. The widespread adoption of networked technologies and open standards is removing barriers of geography and accessibility. Anyone and everyone can participate in the innovation economy.

Multidisciplinary. Because the challenges before us are more complex, innovation now requires a diverse mix of talent and expertise.

Collaborative and open. More and more, innovation results from people working together in new and integrated ways. Within this collaborative environment, notions of intellectual property are being re-examined. And those entities that view intellectual assets as “capital” to be invested and leveraged— rather than “property” to be owned and protected—will likely reap the greatest returns (IBM 2006d, 2).

This theme of innovation as “open, collaborative, multidisciplinary and global” would be consistently presented by IBM executive leaders beginning in 2005, both internally and externally.901 The pattern of open sourcing while private sourcing is implicit in a world where innovation is open, collaborative, multidisciplinary and global.

Looking to produce a second Global Innovation Outlook, a survey of the contributors to the first GIO gave some direction: 90% of respondents “suggested that issues related to the environment and energy would benefit most from a GIO-style investigation”. In keeping with the focus on near-term potential for technology and business innovation, the challenges of massive urbanization trends in the developing world and stresses on existing infrastructure surfaced (IBM 2006d, 48–49). The 15 deep dive sessions conducted in fall 2005 would focus on (i) the future of the enterprise, (ii) transportation and (iii) the environment. Participants included 248 thought leaders from 33 countries and 178 organizations.

In March 2006, the GIO 2.0 report was released. The patterns from GIO 1.0 -- the need for standards, the trend towards open IP and collaboration, and the primacy of the individual -- continued to resonate. In addition, some new patterns emerged:

  • The power of networks, with the “evolution of social structures” transcending physical and geographic borders; a unifying notion of “the endeavor” in a common set of interests, goals or values that could redefine “the enterprise”, “employer” and “employee”; a “collaborative, contribution-based” environment where the traditional enterprise could shift to orchestration and facilitation of endeavors; “reputation capital” as a kind of accumulated trust, in a standard of accountability that enables diverse people to strike partnerships; and a “complex set of causes and effects” as boundaries dissolve and more fluid relationships form.
  • Line of sight shaping “decision making”, asking whether understanding and anticipating the full consequences of one's actions might inspire “different choices”, with opportunities to capitalize on advances in computing power, networked infrastructure and data intelligence.
  • Flipping the equation, applying intellectual energy to areas opposite current focuses for new breakthroughs and advancements, requiring “moving beyond 'either/or' thinking” (IBM 2006d, 9–12).

The patterns were tied to the Innovation that Matters message in 2006, which would broaden the dialogue beyond On Demand.

On the future of the enterprise, the idea of a joint endeavor or undertaking could change leadership and the managing and motivating of global talent. Insights included:

  • Forget about free enterprise. Think enterprise-free: where the endeavor was described as the glue between individuals or entities, relegating the traditional organization to orchestration and facilitation, likened to the fluidity of the Hollywood studio system with its rotating rosters of affiliated talent, each an “aggregation of specialized entities”.
  • Talking 'bout my reputation: where brand promise is challenged to maintain integrity across workers and partners, and “reputation capital” of non-affiliated contributors might adopt a currency with a “trustmark”.
  • A small world after all, where highly-specialized businesses of 25, 10 or even five employees can conduct business on a global scale, and large businesses may emulate “smaller is often better” tailoring of products and services.
  • Success will depend on how well you play the game -- literally, where “studying the qualities of leaders” has suggested that the next generation of leaders could be the outliers at the “polar opposite from command and control management systems” in massively multiplayer online games (MMOGs) where “the connective tissues of this collaboration” is the normalizing culture, and “flexible contextual learning models” allow people to develop new skills as needed.
  • Rewriting the employer-employee 'contract', inventing the exchange of value with reward systems “beyond stock options, bonuses and retirement plans”, with disaggregation where “social networks could provide a stabilizing force” to reduce individual elements of risk, and “mobility for a common set of employees” such as retirees.
  • Insight as a mindset, not a department, where the best ideas from around the world are exchanged dynamically, such as “sensing hubs” in emerging markets to seek out innovation components and ready receivers for existing ideas (IBM 2006d, 14–23).

On transportation, eased geopolitical borders enabled people and freight to move greater distances with more frequency, but once they get there, congestion on streets and in ports was seen as taking a toll. Insights included:

  • Grow, but with flow, would see cities or regions dealing with congestion, noting that “mobility increases market areas”, expanding options for access to goods, conferring competitive advantage, and attracting new business investment and a higher-caliber workforce.
  • Headlights into the system, with sensing and computing devices to reduce vehicular congestion through “more holistic approaches to understanding and managing urban traffic flows” with the challenge of “are individuals willing to cede such control?”, particularly in developing economies.
  • Playing 'leapfrog' to move forward, with emerging economies rejecting existing paradigms to embrace new approaches to manage the boom in personal vehicles, e.g. alternative-energy cars or car-sharing, where a rising “middle class is spiking demand”.
  • New paths for public transportation, in better coordination and integration across different modes of public transit, including smart cards as “a common currency”, “integrating the information” to push out to riders via mobile devices or street-side kiosks, and “swarms of smaller, more mobile, more flexible vehicles” that could dynamically re-route themselves based on need.
  • Services on the go, with connected vehicles in “a new breed of services”, including embedded technologies, and services that “fundamentally change the relationship” among drivers, passengers, manufacturers and third-party service providers.
  • Shoring up shipping to eliminate paper documents associated with customs policies, manual processes and increased global trade with “standardization and integration”, and “traffic management as a huge differentiator” for economic advantage in ports (IBM 2006d, 24–35).

On the environment, a world could be imagined where environmental protection and economic prosperity are not only compatible but simultaneously attainable. Insights included:

  • All's well that ends well found that back end decomposition could provide the richest opportunities for breakthrough thinking by “flipping the equation” to explore innovative new ingredients, products and processes.
  • The reverse supply network extended the idea of “reverse supply chains” with the possibility of massive waste reduction through new collaborative relationships to send used components and manufacturing by-products back and forth to one another.
  • Regulation: innovation's friend or foe was seen by some as driving innovation around product composition and decomposition, while others thought “regulation may actually impede innovation” by encouraging manufacturers to simply comply with minimal standards rather than rewarding exemplary performance.
  • From trash to treasure considered that landfills might be viewed instead as “above-ground mines” to recover copper and metal alloys.
  • Seeing is behaving with individuals and businesses having a clearer and continual “line of sight” into the consequences of their actions about energy and natural resource consumption.
  • Mighty micropower through small-scale energy sources such as wind and solar, often considered “the best energy solution for rural areas”.
  • Troubled waters? found GIO participants across the board concurring that water is possibly the number one issue of concern to the world's population in the 21st century, with a role for the private sector in helping to address the waste and misuse of available resources (IBM 2006d, 36–47).

The GIO 2.0 report of 2006 would be the last integrated document published. Subsequent reports would be issued as in-depth studies on specific topics.

One freestanding complementary GIO 2.0 report would be released in February 2007: Virtual Worlds, Real Leaders. Inspired by findings from GIO 2.0 published in 2006, IBM sponsored a study led by Byron Reeves of Stanford University and Seriosity Inc., and Thomas Malone at the MIT Sloan School, on “whether real business lessons can be learned from observing leadership on online games” (IBM 2007g). The model selected to guide the analysis was the Sloan Leadership Model, which breaks leadership qualities and actions into four parts: (i) visioning; (ii) sense-making; (iii) relating; and (iv) inventing.902 A team from the IBM Institute for Business Value then built on the research to survey IBM's Virtual Universe Community. The study found that, given the right tools in the right circumstances, leadership can emerge:

Online gaming environments facilitate leadership through:

  1. Project-oriented organization
  2. Multiple real-time sources of information upon which to make decisions
  3. Transparent skills and competencies among co-players
  4. Transparent incentive systems
  5. Multiple and purpose-specific communications mediums (IBM 2007g, 17).

The iterative nature of online games presents many opportunities to lead. While there is an overriding goal for the group, a series of “raids” or missions spreads leadership around, with no expectation of permanence in the leadership role. Tools can make leadership easier, with the skills and competency levels of a member of the guild readily apparent, and real-time risk assessment tools with visible “incentive systems”. Finally, guild leaders would mediate conflict and maintain relationships as a natural part of their roles.

Three GIO-inspired works would have direct policy implications: (i) Peer-to-Patent Project, leading to the Building an IP Marketplace GIO 2.0 report; (ii) The Inventors' Forum for smaller enterprises; and (iii) the Standards for Standards wiki collaboration.

Independent of the main GIO 2.0 themes, a “Peer to Patent” Community Patent Proposal was co-led by the New York Law School, IBM, and the U.S. Patent and Trademark Office (USPTO) in 2006.903 One of the difficulties of working across open sourcing and private sourcing was that the USPTO struggled to recognize open source software as prior art in applications for patents.904 Bringing together the open source development community with the USPTO was controversial, because many developers thought “patents were evil”.905 At a meeting on May 12, 2006, a proposal to develop the Peer to Patent Project on a wiki was tabled.906 IBM continued discussions with the USPTO subsequently, leading to IBM making some patent applications in progress available for the pilot:

In a private meeting with the USPTO (after the public meeting), we brainstormed out a list of some of the items that need to be addressed for the pilot. Those include advisory/steering committee, technology infrastructure, communications plan (announcement, education, code of conduct, etc), patentee consent to have certain published patent applications commented on by third parties, participants to review and comment on published patent applications, incentives to participate, USPTO rule changes that might be required, and more. Consensus seemed to be that 200-300 published patent applications would be needed for the pilot. IBM announced during the public meeting that we would consent to make some of our published patent apps available for the pilot. We hope that others will also contribute some of their own published patent apps, technical support, engineers to help provide comments, funding, etc. (Schecter 2006)

The work on a wiki in May and June 2006 evolved into a Building an IP Marketplace report issued in September 2006 as part of the GIO series. The ties between the nature of innovation and intellectual property were described in the foreword:

The very nature of innovation is changing as economic activity shifts from physical to intellectual assets. Products of the mind are often patented, making patents a key currency in the 21st century knowledge-based economy. Many of the world’s patent systems were developed decades or even centuries ago to promote invention of physical goods, and have not evolved to include mechanisms needed to support this expanded role.

While emphasis on patenting proprietary invention continues to intensify, so does the adoption of open standards and collaborative business models. Organizations endeavor to find the ideal balance on this continuum of innovation (IBM 2006b, 1).

The debate on the wiki led to a “collaboratively written manifesto” that “establishes the foundations of a functioning marketplace for the creation, ownership, licensing and equitable exchange of intellectual property”.

On behalf of the contributors across institutions -- and published under Creative Commons licensing -- the GIO special report prescribed:

In order for innovation to flourish in a global knowledge-based economy, a new set of principles guiding the creation, ownership and equitable exchange of intellectual goods should include the following tenets:

  1. Inventors file quality patent applications for novel and non-obvious inventions of certain scope.
  2. Patent ownership is transparent.
  3. Market participants act with integrity.
  4. IP value is fairly established based on the dynamics of an open market.
  5. Market infrastructure provides flexibility to support differing forms of innovation.
  6. Realistic introductory levels of global consistency exist for all of the above (IBM 2006b, 6–7).

Towards developing global consistency, the report released in September 2006 detailed issues and paths forward on five areas:

  • Patent quality could be improved by making searches for prior art easier, reducing the instances in which a patent is granted for ideas that are neither new nor distinct from prior work.
  • Transparency in the true identity of the rights holders, whether the patent is subject of legal conflict or dispute, and the terms under which the patent might be licensed could speed up further innovation.
  • Integrity would preclude manipulative behaviours that might damage the brand or reputation of a business, or cause difficulty in business relationships, and could deter “trolls” that produce neither products nor services, and have no customers.
  • Valuation tools that would help determine the fair price of viable knowledge assets do not exist for IP investments in the same way that they do in the financial industry.
  • Flexibility in accommodating the intangible assets of software, services and business methods hasn't kept up with open software standards, where royalty-free licenses of inventions are required, but then derivatives are patented in some jurisdictions but not others.

While the GIO 2.0 report captured the collective view of the contributors, IBM independently spoke to the press on the corporate position:

According to Ari Fishkind, spokesperson for IBM Technology and Intellectual Property, these are the core tenets of IBM's new initiative:

  1. Patent applicants should be responsible for the quality and clarity of their applications,
  2. patent applications should be available for public examination,
  3. patent ownership should be transparent and easily discernible, and
  4. pure business methods patents without technical merit should not be patentable. [….]

“The patent policy we announced is a broad framework for everything related to how we handle intellectual property. The common denominator is that in many cases, we will exceed what the law requires,” said Fishkind. “Some aspects of our policy are the notions of transparency and quality. Allowing the USPTO [U.S. Patent and Trademark Office] to publish all of our patents, and allowing other community members to provide detailed feedback about our patents, will exemplify our drive for transparency and quality in the industry” (Dames 2006).

Since IBM has had a long record of annually being the world's largest recipient of patent grants, the weight of these reforms is significant.

The Peer-to-Patent program officially started in June 2007. In the first year to 2008, there were “over 2000 registered users and 173 items of prior art submitted on 40 applications by participants from 140 countries” (Center for Patent Innovations 2008). By the second year to 2009, 2600 people had registered to become peer reviewers, with 187 participating patent applications (Center for Patent Innovations 2009). Applications had “been submitted by GE, HP, IBM, Intel, Microsoft, Cisco, Disney, eBay, Novell, Red Hat, Sun, Xerox, and Yahoo, as well as by smaller firms and individuals” (Schecter 2009). While the goals of the program were exceeded, the economic downturn led the USPTO to place a moratorium on extending the pilot past June 2009, until a full evaluation of the impact could be completed. A full report was subsequently written (Center for Patent Innovations 2012). Of more practical significance, however, was the June 2009 nomination by the Obama administration of David Kappos, the VP for Intellectual Property at IBM (and an early proponent of Peer-to-Patent), for Under Secretary of Commerce for Intellectual Property and Director of the USPTO (Noveck 2009). In addition, Beth Noveck was appointed by Obama to lead the Open Government Initiative (Hansell 2009).

In January 2007, IBM announced that it would “develop and host the Inventors’ Forum, an online initiative to share and debate ideas on how smaller enterprises view patent systems and can contribute to reform efforts such as improved patent quality” (IBM 2007g). While the U.S. Small Business Administration reported that small companies earn nearly 15 times the number of patents per employee as large enterprises, small companies were seen to lack the resources to obtain a patent, maintain ownership, and then convert the patent into marketable products and services.

In December 2007, The Inventors Forum report was released as part of the GIO series, under a Creative Commons license. More than 400 participants -- smaller companies and their larger partners, attorneys and IP experts, government officials, economists and academics -- conversed over 12 weeks (IBM 2007e). The group identified a number of issues, including:

  • Education, at university engineering and technical undergraduate levels, did not focus future inventors and business leaders on patents and intellectual property management, further exacerbated by small businesses lacking in-house IP counsel.
  • Patent offices could better exploit technology to help small businesses navigate the complexity of global patent systems more cost effectively.
  • The patent reform legislation pending in the U.S. Congress was viewed positively for improving patent quality and aligning the U.S. with other countries, but sometimes negatively for rule changes on how damages could be awarded and patents challenged after they had been granted.
  • A “Soft IP” system where a patent owner voluntarily foregoes injunctions and accepts some form of compensation for permitting the use of the patented invention would facilitate innovation on complex systems leveraging multiple inventions.
  • Effective IP management where patents and IP are strategic business assets rather than byproducts of other activities (IBM 2007e, 6–7).

The conversation began with an emphasis on the ways smaller entities could work more effectively with patent offices and legislators. As the dialogue shifted to areas for improvements that would effect the most significant change, the benefits were considered for the system and its participants as a whole.

In 2008, a similar wiki-based collaboration was used to establish “standards for standards”. This did not officially appear as a GIO report -- probably because the GIO initiative was winding down by late 2008 -- but did have an influence both on IBM policy and on recommendations to the White House.

This collaboration occurred subsequent to the approval of OOXML as the ECMA-376 standard in December 2006, and the ISO/IEC DIS 29500 standard in April 2008, driven by Microsoft.907 Some of the issues were expressed in an August 2007 blog post on “Standards and Quality” by Bob Sutor, IBM VP of Open Source and Standards. The post drew comparisons to independent assessments of quality in other domains: Consumer Reports ratings from an independent nonprofit organization, Amazon product ratings from purchasers, and home inspectors ranking the current state of a house under consideration.

We have nothing like this for standards.

What would it possibly mean for something to be a “one star standard” versus something that is a “five star standard”?

We have folks who do standards compliance testing for a business, but this is not evaluating the quality of the standards themselves. I will note that we do have the Web Services Interoperability Organization which looks at existing standards and best practices in using them, and then recommends both profiles for deploying the standards well and future changes to the standards that will improve them.

We have thousands of standards and no clear way to decide which of them are good and which are not. Instead, we more or less go by the organizations that create the standards, whether we are actually required to implement them (say, by law or customer requirements), or if the market leaders use them.

I’m going to tackle the issue of quality and standards organizations in a future entry, but let me say that

  • Standards organizations are not all equal in quality, though it doesn’t seem that everyone knows that.
  • A given standards organization can produce two standards of wildly divergent quality.
  • In my opinion, the key measurement of a standards organization is not the quantity of standards produced but the quality of standards produced.

As a disclaimer, I’m very aware that when IBM is involved in the creation of a standard, we probably want people to use that. The same goes for everybody else.

In some cases there may be only one standard for a particular purpose. Do we just accept that or can we apply some set of metrics to it to help the maintainers evolve it into something better? (Sutor 2007)

In April 2008, the OOXML standardization by the ISO led Sutor to the opinion that something needed to be done:

When I was in Geneva in February, I found myself saying something like the following to those who asked me how I thought the OOXML/DIS 29500 vote was going to turn out.

“If the ballot fails, we will have seen that a historic change has occurred. If it passes, we will see that historic change is needed.”

Evidently, we’re in the latter case. In spite of having significant problems and intellectual property gaps, enough countries have changed their votes from the September ballot to allow the specification to move forward into the publication preparation phase with JTC1 (ISO/IEC).

So is that it? Of course not. The process of international standards making has been laid bare for all to examine. [….]

While fully cognizant of these current results, I’m energized to take the bigger fight for openness to the next level with the thousands of individuals who are now convinced that the standards system needs fixing, and soon. I hope you’ll take part (Sutor 2008a).

In summer 2008, IBM facilitated an online wiki conversation on whether standards-setting bodies had kept pace with commercial, social, legal and political realities, and how transparency, fairness and quality could be improved in standards. Participants included 70 independent forward-thinking experts. The online forum was divided into five topics: (i) transparency and accountability; (ii) standards quality and creation; (iii) policy and society; (iv) intellectual property; and (v) rating and accreditation. By the end of the wiki conversation, suggestions had been made on (i) government; (ii) standards development organizations; (iii) standards community; (iv) quasi-government and non-governmental agencies; (v) international organizations; (vi) intellectual property; and (vii) academia (IBM 2008e). The wiki recommendations were published on the IBM web site, complemented with translations into Japanese and Chinese.

In September 2008, learning from the discussion on the wiki, IBM announced a new corporate policy to formalize the company's behaviour when collaborating to create open technical standards.

The tenets of IBM's new policy are to:

  • Begin or end participation in standards bodies based on the quality and openness of their processes, membership rules, and intellectual property policies.
  • Encourage emerging and developed economies to both adopt open global standards and to participate in the creation of those standards.
  • Advance governance rules within standards bodies that ensure technology decisions, votes, and dispute resolutions are made fairly by independent participants, protected from undue influence.
  • Collaborate with standards bodies and developer communities to ensure that open software interoperability standards are freely available and implementable.
  • Help drive the creation of clear, simple and consistent intellectual property policies for standards organizations, thereby enabling standards developers and implementers to make informed technical and business decisions (IBM 2008g).

This new policy could lead to IBM withdrawing from a standards body. It was interpreted by the press as “intended to pressure organizations such as the ISO and ECMA, an industry-led standards organization, into rethinking their procedures” (Kirk 2008). In response to the publicity, Bob Sutor blogged:

With this principle, IBM is saying that it will increasingly look more closely at issues like the openness and transparency of a standards organization, as well as the modernness and consistency of the processes and intellectual property rules. IBM did so before, but it will do more in the future. IBM will sharpen and communicate its criteria to those involved in a cooperative manner (Sutor 2008b).

Withdrawing from a standards body would be a last resort, if the situation became dire.

In November 2008, the wiki recommendations became an input into the Standards for Standards Summit at Yale Law School. The day's discussion was divided into three working groups: (i) Standards and the role of Government; (ii) Quality and Creation of Standards; and (iii) Standards and Intellectual Property (Yale Information Society Project 2008). The working groups each summarized current problems, drafted recommendations to the Obama administration, and suggested some next steps.

In March 2009, the Information Society Project at Yale Law School presented Technical Standards Recommendations for the Obama Administration. Perceived as consistent with the technology policy directions of the Obama administration, the recommendations were to:

  1. Develop a Government Open Standards Strategy.
  2. Form a United States Standards Advisory Council.
  3. Strengthen International Standards Collaboration.
  4. Encourage the Formation of a Global Multi-stakeholder Standards Advocacy Group (Yale Information Society Project 2009).

These actions would represent a national standards strategy, enabling both economic innovation and a connected democracy focused on openness, transparency, and direct civic engagement. These activities would contribute, in part, to the America Invents Act, whose passage in September 2011 marked “the most extensive and important update to the U.S. patent system in nearly 60 years”.908

The research for GIO 3.0 started in 2007.909 Two reports would be released within the year: one focused on media and content, and the other focused on Africa.

In October 2007, the first GIO 3.0 report was released, titled The New New Media. The sentiment was to explore opportunities for innovation in the newly amorphous market segment of “media, content, branding and messaging” -- beyond the traditional view of the media industry. Over 60 days, deep dives were conducted in seven cities: Helsinki, London, Los Angeles, Mumbai, New York, Seoul and Shanghai (IBM 2007f). Seven themes emerged, the first four as primary.

  • Authenticity: Viral anti-marketing, brand loyalty and listening without fear. Companies face a challenge of “walking the fine line of viral marketing” in delivering their brand messages. Authenticity comes not from a monologue, but through dialogue, particularly with “the voice of youth”. Beyond recognition, more substantial forms of compensation may be needed in “the currency of contribution”.
  • The Digital Persona: Who should own your online identity? As media producers and advertisers in the digital age aim for personalization in an elusive “market of one”, consumers may want to take back control of their personal information.
  • Context is King: In the age of free content, the future (and the money) is in the context. As the value of content -- e.g. a newspaper article, movie, song, or piece of market research -- diminishes, the value of context, as the “when”, “where” and “how” that adds value to a piece of content, becomes a target for data analytics.
  • Going Mobile: Can wireless communications bridge the digital and economic divides? Affordable mobile platforms have become a way for the urban poor to connect to the Internet with a simpler “iconic literacy”, and the infrastructure is easier to build.
  • Content Bill of Rights: Universal standards for content usage rights could save the digital media business. Usage rights, e.g. for digital music, could be attached to a piece of content rather than the device used to consume it, or the mechanism used to deliver it.
  • Regional Spotlight: China: With the eyes of the world upon it, China aims to tell its story. Not only will the volume of Chinese language content on the Internet increase, but a widespread sense in China that content should have educational and cultural value could challenge the Hollywood culture coming from the United States.
  • Virtual Uncertainty: The impact of virtual worlds on the business landscape. The 3D Internet could be to the Internet what silent movies were to Hollywood, with early experiences seeming to be less about consumerism and more about expressionism.

The topics of media, content and messaging had important regional differences apparent during the GIO process: in Shanghai, the need for content to be of both educational and cultural value surfaced; in New York, the deep dive went into topics of piracy and the impact on the established media industry; and in Mumbai, there was optimism about the role that India will play in the future of content creation and distribution.

In November 2007, the second GIO 3.0 report was released, titled Africa. The GIO meetings brought together 123 ecosystem partners from 24 countries, with deep dives in Atlanta, Beijing, Cape Town, Dakar, Lisbon, Nairobi and Paris (IBM 2007a). Africa represents 1/7 of the planet's population, with 43% under the age of 15. While China and giant multinational firms were pouring billions into Africa, the African nations have colonial legacies that could be instructive for the future. Eight factors critical to the continent's future were presented:

  • Skills: Unlocking a powerful 21st-century work force in Africa demands the engagement of the private sector. The education system may not have had significant investment in nearly 40 years, so reform and overhaul to realign to the needs of the private sector are needed, and many students want to learn to start their own businesses to employ other Africans.
  • Value Chain: Africans are beginning to capture more value from the continent's vast resources. Ghana has invested in cocoa-processing facilities, Rwanda rebuilt infrastructure to deliver high-end specialty coffee, Botswana and Namibia are cutting and polishing diamonds, Uganda is producing T-shirts from local organic cotton, and Mauritius is becoming an offshore banking center.
  • Infrastructure: Internet access and communications technology will spark the African services economy. Only Mauritius ranks in the top 70 countries in ICT access, with the vast majority of Africa in the “low-access” category, in anticipation of the East African Submarine Cable System (EASSy) [completed in 2010].
  • Wireless: Mobile technology is transforming Africa in unforeseen ways. The boom from 10 million wireless subscribers in 2003 to 200 million in 2007 has led to the M-Pesa money-transfer service in Kenya, and Motorola using wind turbines and photo-voltaic cells to power base stations throughout Namibia.
  • Informal economies: Tapping the economic power of Africa's thriving informal sector. Shadow markets and the traditional sector account for almost ¾ of non-agricultural employment in sub-Saharan Africa, and unregulated economic activity constitutes as much as 40% of GDP in some African nations; this has been described as a poverty trap without job protection or employee benefits, contributing no taxes that could be used in critical infrastructure improvement projects.
  • Women: Women are the driving force behind Africa's entrepreneurial spirit. While accounting for more than 60% of the rural labour force and contributing up to 80% of food production, female entrepreneurs still find it difficult to obtain the “mesofinancing” they need to grow their businesses.
  • Finance: Access to capital hinges on new definitions of risk and collateral. Few Africans are landowners or have assets or credit, and there are few traditional lenders; microfinance has been a revolution, while small to medium-sized enterprises lack the mesofinancing to grow their businesses.
  • Non-Governmental Organizations: Is foreign aid helping or hurting African economies? NGOs providing free money demonstrated little in economic stimulation, and collaboration between NGOs and private enterprises in Africa was in its early stages.

IBM had already been doing business in Africa for more than half a century, and the reality of global integration was changing the way the company thought about opportunities for growth and progress.

In September 2008, the first of the GIO 4.0 series was published on Security and Society. Starting in April 2008, six deep dives with participants from 93 organizations were conducted in Moscow, Berlin, Taipei, Tokyo, Vancouver and Chicago. The conversations were on security as a foundation of society, with recent trends in globalization, interdependence and digital technologies creating opportunities in business models and lifestyles, but also attracting criminal elements and other destabilizing forces (IBM 2008e). The idea of distributed security emerged from the conversations:

  • The Network Effect: Tapping into the power of distributed security. The harmfulness of a single threat is exponentially proportional to the number of people exposed to it, so “to fight a network, you need a network”. Adaptive security intelligence could be moved out to the edges of the network in “distributed security”, through “community-based security” where groups with common interests police themselves, “wireless watchdogs” that enable citizens to record audio and video of suspicious activities, and “the secure supply chain” with transparency from start to finish so that each link in the chain could shoulder a proportionate load.
  • The New Roles: How governments and businesses must adapt to the new security reality. Security is not seen “solely as a government enterprise”, so partnerships of “good security, good business” enable economic globalization while political systems are still nationalistic and oriented towards domestic affairs; prosecuting international crimes of a digital nature faces “the legal vacuum” where cyber criminals have little fear of consequences, and where lobbying for legislation on digital crime could be aided by education from the private sector; and “built-in security” is embedded in products and services, with careful consideration of convenience and cost to consumers.
  • Best Behavior: Using incentives to change bad habits. Marketing security could be “strictly business”, as manufacturers ensure the products on store shelves are legitimate and not counterfeit; soft incentives against negative work-related events preempt “the threat within” of insider attacks; and any way to alter security behavior for the better becomes a “convenient truth” through biometric data or prescreening.
  • Getting to Know You: The evolving relationship between security and privacy. “Informational self-determination” sees the information communicated about an individual in the online world as no different from the physical world, where “cancelable biometrics” might enable a “master token” to be enrolled as an identifier that could be reissued if stolen or compromised; “peer-to-peer based online rating systems” enable a “reputation reconnaissance” for trusting parties; and “data tethering” enables an identity “reclamation project” to track who is using a personal record.

The movement towards distributed security tends to focus less on hardening perimeters and securing boundaries, and more on resiliency as an ability to absorb and respond to attacks.

In February 2009, the last of the GIO 4.0 series -- and the last of the GIO reports -- was on Water. This study involved participants from 122 organizations, across management, business, infrastructure, food and energy (IBM 2009d). Human beings have “survived knowing very little about our water systems. We have always known where to find it and how to use it, but we never gained an intimate understanding of how to preserve or sustain these systems” (IBM 2009c, 5). In addition to relying on water for simple sustenance, “We generated power from it. We transport people and goods through it. We grow our crops with it. And we use it to cultivate medicine and manufacture products. In fact, every time a good is bought or sold there is a virtual exchange of water”, with embodied water (IBM 2009d, 3). The study led to insights in five areas:

  • Data Drought: Informing a new era of water management. There is a lack of understanding about water, even to “the scientists, academics, businesspeople and policymakers who study water for a living”. “Measuring the oceans” has been done with ambitious projects, e.g. the Global Ocean Observing System (GOOS), but the programs are “big, international and expensive” with governments “only in power for about four years”, so the private sector could be involved with revenue-generating, self-sustaining business models. “Sharing the wealth” could collect and coordinate data on water that already has been collected, e.g. in the Beverage Industry Environment Roundtable. “Making data pay” could help narrow the gap between corporate strategy and social good, e.g. SmartBay Galway, where a steady stream of real-time data on water quality, aquaculture, chemical content, wave energy and tidal movement supports both policy decisions and industry around Ireland.
  • The Business of Water: Valuing the world's most precious commodity. The economics of water can be complex and confounding to business people. In “the paradox of value”, there are costs associated with procuring, distributing and treating water, but the resource itself has no price, so that issues of wastefulness, pollution and scarcity are not mitigated by a monetary incentive. Technologies focus on increasing the supply of water, leaving an opportunity for “adventure capital” to decrease demand, through smarter water systems. For “industry and water”, public companies could soon be required to disclose water efficiency in their annual reports.
  • The Infrastructure Imperative: Managing water for the next generation. Water infrastructure is easily ignored, with 15% to 35% loss to leaks in cities, and it's three times more expensive to build and maintain than electricity infrastructure. In “urban outfitting”, retrofitting infrastructure should see electricity, telephone, gas and water utilities working in concert and sharing costs. Leak detection and automatic repair of water pipes could bring “sense and respond” capabilities to retrofitted infrastructure, as has been done in the oil and gas industry for years. With half of the world's population living in low-lying coastal areas and cities vulnerable to river flooding and storm surge, “climate proofing” has been a strategy where the Netherlands has led as a country.
  • Food, Energy and Water: Understanding a delicate global balance. Water, energy and agriculture should be appreciated from a broader perspective, e.g. integrated resource management. With global agriculture wasting nearly 60% of the water it uses, “more crop per drop” encourages farmers to match their crops to their climate at a local level. There are huge opportunities in “ocean source” for energy and aquaculture, but we must be careful not to change the chemistry of the ocean. With no global market and little international exchange of water, “think globally, act locally” would require a shift in practices and governance at the regional level.
  • Perception is Reality: Building global water awareness. With a hypothetical question of “If you had $10 billion to invest in any water-related startup, what would it be?”, the answer from participants was a “massive public relations effort”, as people think that water is cheap and abundant. To draw attention to the trade of water embodied in products, “virtual water” has been proposed as a basic measurement. Increasing the accountability of a nation, business or individual for responsible water use, “water footprints” could track water consumed during production and manufacturing and the supply chain.

The release of this final GIO report became coupled with the “Let's Build a Smarter Planet” idea that began in fall 2008. In 2009, the Smarter Cities campaign would be initiated, with 100 Smarter Cities Forums run around the world.910

The Global Innovation Outlooks enabled IBM to engage in conversations across national borders, with governments, not-for-profit organizations, and commercial businesses. The topics were guided not altruistically, but with an orientation towards the differences that private enterprise could make. In hindsight …

… the GIO was never a philanthropic endeavor. In fact, it was not even part of IBM’s community engagement strategy. The GIO was a business program, expected to identify real business opportunities that led to real profit. It just so happens that IBM has always believed that addressing the most pressing needs of society are where opportunity has always been found. As our former CEO, Thomas Watson, Jr. said, “Corporations prosper only to the extent that they satisfy human needs… Profit is only the scoring system… The end is better living for us all” (Briody 2010).

The time horizons beyond 3 to 5 years reflect the scale of challenge that IBM could take on, to which only a very few global multinational businesses could even aspire.

B.4.2 Since 2005, IBM researchers have led the Services Science, Management, Engineering and Design initiative

In 2002, the idea of a Human Sciences Research area in IBM Research became realized as Almaden Services Research with an initial staff of seven (Spohrer and Motahari-Nezhad 2015). In September 2004:

Jim Spohrer, who was starting up the IBM Research Service Research department, was on the line with Henry Chesbrough, a professor of business and innovation at the University of California at Berkeley. Spohrer complained to Chesbrough that he was having trouble finding job candidates who had the right mix of knowledge, including computer science, engineering, management and social science.

Chesbrough pointed out that in the 1940s and ’50s IBM had boosted the development of computer science as a discipline by donating computers to universities and then helping them create curricula for teaching students how to use the machines. “IBM started computer science. You should start service science,” Chesbrough told Spohrer. The two men were so excited at the prospect that they immediately dialed in Paul Horn, then director of IBM Research, who blessed the idea (IBM 2011f).

By July 2006, the ideas had been published in “A Research Manifesto for Services Science” (Chesbrough and Spohrer 2006). In October 2006, a Service Science, Management and Engineering (SSME) Summit included 254 people, representing 21 countries across government, industry and academia (IBM Almaden Services Research 2006). The community seemed to be developing from five clusters of intellectual impetus: (i) operations research / mathematics / optimization; (ii) industrial engineering / systems engineering; (iii) computer science / information technology / information management; (iv) process formalization / physics / complexity; and (v) business / organizational sciences / social sciences. Themes identified across the presentations and discussions included: (i) aspects of services that include social interaction and relationship management; (ii) multidisciplinarity as expertise that can bridge across science, engineering, social science, management and ethics; (iii) challenges for higher education in silos and tenure processes within a single discipline; and (iv) acceleration factors for higher education, where there are already established centers of study and development of new programs.

In 2007, the scholarly community was energized with the formation of a Service Science Section within INFORMS.911 The first issue of the Service Science journal was published in March 2009.912

In summer 2007, “a consortium of technology companies, government agencies and universities dedicated to fostering advancements in service research” named The Services Research and Innovation Initiative was announced (SRII 2007). “Co-founded by IBM, Oracle, the Technology Professional Services Association (TPSA), and the Service & Support Professionals Association (SSPA), SRII was formed to increase the amount of funded service research, development and innovation in the technology industry”. “Members include Association For Services Management International (AFSMI), Cisco, EMC, HP, Microsoft, Sun Microsystems, Unisys, and Xerox. Academic participants include top researchers from UCLA; Cranfield School of Management; Indian Institute of Management, Bangalore, India; Wharton School of Business; Arizona State University; University of Maryland; and University of California Silicon Valley Center at Santa Cruz. Government and research institutions include the European Commission and the Fraunhofer Institute in Germany”.

A July 2007 symposium on Service Science, Management and Engineering at University of Cambridge led to a 2008 discussion document on “Succeeding through Service Innovation: A Service Perspective for Education, Research, Business and Government” (IfM and IBM 2008). This report made recommendations for education to enable graduates to become T-shaped professionals; for research to establish service system and value proposition as foundational concepts; for business to establish employment policies and career paths for T-shaped professionals; and for government to promote service innovation and provide funding for SSME education and research.

By 2008, a new book series, Service Science: Research and Innovations in the Service Economy, had been established with Springer Science. Proceedings from the SSME Summit were published in July 2008 as Service Science, Management and Engineering: Education for the 21st Century. Significant additional volumes were published in 2011, including The Science of Service Systems, and the Handbook of Service Science.913

For IBM, service science research was cited as having significant internal impact. “By creating reusable software assets, improving business processes and automating elements of services with software, the company improved the pre-tax profit margins for its services business from 6.7 percent in 2005, to 14.1 percent in 2009” (IBM 2011g).

In June 2012, the SRII would be partially superseded by the International Society of Service Innovation Professionals (ISSIP) as a “democratically-run non-profit organization” where individual and institutional members work together to expand career options for service innovators while impacting business and society through new and improved service innovations.914 The initial board of directors included representatives from IBM, Cisco, HP, San Jose State University, and Virginia Tech. By the end of 2014, ISSIP would report 675 individual members, across dozens of major companies and 150 universities across 40 countries.915

B.5 At large, from 2000, businesses, creatives, academics, governments and makers, taking up open sourcing

Laypeople not immersed in information technology commonly associate open source with technology. The term became formalized when the Open Source Initiative was founded in 1998, and gained greater prominence only beginning in 2000 when IBM made major investments in the Linux project.916 The IT industry had presumed that the norms of “lock-in, network effects, de facto standards” were the only way to compete.917 Open sourcing, as a social behaviour -- by IBM and by a variety of other companies -- shifted thinking, demonstrating that commercial propositions with customers did not have to rely only on private sourcing.

Beyond IBM, open sourcing rose as a social behaviour alongside new Internet technologies. The new phenomenon of an openly editable encyclopedia in Wikipedia challenged ideas about the public sharing of knowledge, particularly around the formation of the non-profit Wikimedia Foundation in 2003.

A brief history of open sourcing behaviours by organizations and by individuals follows, to complete the background for changes impacting the period of research between 2001 and 2011.

B.5.1 From 2000, private sourcing businesses explored commercial options in open sourcing through new communities and institutions

The open source movement has benefited from corporate contributions that are “free as in liberty” as well as “free as in gratis”. The February 2000 LinuxWorld announcement about IBM's investments into Linux was a landmark. By 2002, key leading companies open sourcing software included IBM, Hewlett-Packard, Compaq, SAP, Computer Associates, Hitachi, Sun Microsystems and Cadence Design Systems (International Institute of Infonomics and Berlecon Research GmbH 2002, 12–15). By 2005, in a study of 50 projects, 99% of vendor investments went into a “money-driven cluster” of 18 projects, with the remainder as volunteered effort from employees in a “community-driven cluster” of 32 projects. The motivated vendors in the “money-driven cluster” included IBM (estimated revenue of $4.5 billion related to open source software), Hewlett-Packard (estimated $2.9 billion), Oracle (estimated $1.2 billion) and Red Hat (estimated $280 million) (Iansiti and Richards 2006). By 2011, industry analysts estimated that 80% of all commercial software solutions could involve elements of open source.918

The resources put into open sourcing by corporate businesses are not insignificant. In the Linux project, the estimated development value of Linux release 2.6.30 in 2008 was over €1 billion, with an estimated annual R&D investment of €228 million (García-García and Alonso de Magdaleno 2010). The Eclipse Foundation doesn't provide hard numbers, but estimates that its ecosystem generates revenue in the range of a billion dollars.919
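
Valuations like these are typically derived from source-lines-of-code effort models in the COCOMO family. The following is a minimal sketch of that arithmetic in Python, not the cited study's method: the coefficients are the textbook basic-COCOMO organic-mode values, and the line count and monthly loaded cost are illustrative assumptions only.

  # Sketch of SLOC-based development-value estimation (basic COCOMO).
  # Coefficients (2.4, 1.05) are the textbook organic-mode values; the
  # SLOC count and cost per person-month are illustrative assumptions.

  def cocomo_effort_person_months(sloc: int, a: float = 2.4, b: float = 1.05) -> float:
      """Estimated effort in person-months for a project of `sloc` lines."""
      kloc = sloc / 1000.0
      return a * kloc ** b

  def development_value(sloc: int, cost_per_person_month: float = 10_000.0) -> float:
      """Rough replacement value: effort times an assumed monthly loaded cost."""
      return cocomo_effort_person_months(sloc) * cost_per_person_month

  # Illustrative only: a kernel-sized codebase of ~12 million lines.
  print(f"{development_value(12_000_000):,.0f}")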

The rise of open sourcing in business has been marked by (a) the contributing of assets to open source organizations, and (b) the exploring of business models to enable open sourcing into private sourcing commercial offerings.

(a) One way of measuring contributions by organizations to open source communities is as assets counted as lines of code (Asay 2009). As of 2009, the Linux kernel had received 1.4 million lines of code from Red Hat, and 725 thousand from IBM.920 Sun had contributed 6.5 million lines of code to Java, 2 million lines of code to Solaris, and 10 million lines of code to OpenOffice. On Eclipse, IBM had contributed 12.5 million lines of code. Google estimated 10 million lines of code for Android, and 2 million lines of code for Chrome.
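
How such counts are produced varies by project. As one hedged illustration -- the attribution-by-email-domain heuristic here is an assumption for sketching, not the method behind the figures above -- added lines per organization can be approximated from a git repository's history:

  # Sketch: approximate per-organization line contributions to a git project
  # by attributing each commit's added lines to the author's e-mail domain.
  import subprocess
  from collections import Counter

  def added_lines_by_domain(repo_path: str) -> Counter:
      # "--format=@%ae" emits one "@author@example.com" marker per commit,
      # followed by numstat lines of the form "added<TAB>deleted<TAB>path".
      log = subprocess.run(
          ["git", "-C", repo_path, "log", "--numstat", "--format=@%ae"],
          capture_output=True, text=True, check=True,
      ).stdout
      totals: Counter = Counter()
      domain = "unknown"
      for line in log.splitlines():
          if line.startswith("@"):
              domain = line[1:].rsplit("@", 1)[-1] or "unknown"
          elif line and line[0].isdigit():  # skips binary-file "-" entries
              totals[domain] += int(line.split("\t")[0])
      return totals

  if __name__ == "__main__":
      for domain, lines in added_lines_by_domain(".").most_common(10):
          print(f"{domain}: {lines} added lines")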

In digital artifacts, only 15% of the content released by open source developers is computer source code. There's four times as much content released as scripts, markup language files, graphics images and documentation.921 Beyond the donations of artifacts, organizations commit resources to ensure vitality in the open source movement.

Another way of measuring contributions is through the assignment of human resources. In 2007, 31% of administrators and users of 409 Sourceforge.net projects declared that one or more firms were somehow involved. In 68% of the cases, firms supported non-development activities (e.g. testing, animating forums, documentation, financial and logistic support), and in 30% they contributed code (Capra et al. 2009). In 2009, 48% of Eclipse developers were allowed by their companies to use software and contribute back to open source communities, up from 37% in 2007 (Eclipse Foundation 2009, 18).

Sponsorship of organizations independent of single vendor control has enabled open sourcing to grow. The Linux Foundation lists Fujitsu, Hitachi, IBM, Intel, NEC, and Oracle as platinum members; AMD, Cisco, ETRI, Google, HP, Motorola, NetApp, Nokia and Novell as gold members; and 44 additional companies as silver members.922 The Eclipse Foundation lists Cisco, Motorola and Blackberry as enterprise members; Actuate, Brox, CA, Cloudsmith, Genitec, IBM, Innoopract, Itemis, Nokia, Obeo, Oracle, SAP, Sonatype, and Sopera as strategic members; 72 organizations as solution members; and 71 organizations as associate members.923

All of these contributions demonstrate ongoing open sourcing behaviour while the sponsoring organizations have simultaneously pursued private sourcing initiatives commercially. The simultaneity positions an organization to benefit by being on the “right side” of platform decisions endorsed as an open standard, while not restricting private sourcing extensions as features attractive to the customers they target and serve.

(b) While small scale open sourcing can be viable through individuals volunteering personal time and effort, ongoing large scale open sourcing occurs only through injections of capital. Governments sometimes redistribute wealth towards open sourcing (e.g. university research) to influence regional policy, with private sourcing antithetical to transparent political processes.924 Free enterprises can redistribute wealth into open sourcing while creating greater wealth through private sourcing to a selected customer set. Industry collaboration in open sourcing projects can be structured so that organizations can still compete on independently funded private sourcing complements.

A three-way categorization of business models associated with software products sees (i) pure open source models; (ii) hybrid open source/commercial licensing models; and (iii) embedded open source.925 More generally, the Open Source Definition has always recognized -- if not encouraged -- the benefits of commercial development and motivations with entrepreneurism. Silicon Valley is a hotbed for startup companies initially funded with venture capital, with a promise of business models with eventual financial viability. An offering could include open sourcing in some elements of software, services and/or hardware, while private sourcing in others. When open sourcing while private sourcing was new in 2004, seven alternative strategies enabling sustainable business models were frequently cited:

  1. An optimization strategy: In a layered software stack, commodity layers enable optimization where pricing power can be applied in interdependent and/or application layers (e.g. Oracle).
  2. A dual license strategy: Free use for software with some limitations, with fees for more features and/or commercial distribution rights (e.g. MySQL).
  3. A subscription strategy: Charge for maintenance, entitling configuration support, updates and upgrades (e.g. Novell, Red Hat, SpikeSource, SourceLabs, JBoss, Sun, Zend).
  4. A consulting strategy: Make money on the customization of enterprise solutions, where implementation costs are 70% of commercial fees (e.g. 10X, SpikeSource, SourceLabs, JBoss).
  5. A patronage strategy: Drive standards adoption to crack entrenched markets with a commoditized layer, and then offer value higher up in the stack (e.g. IBM, HP, Intel).
  6. A hosted strategy: Rather than selling software, provide software services using GPL software without being subject to redistribution restrictions (e.g. eBay, Amazon, Salesforce.com, Google).
  7. An embedded strategy: Extend hardware platforms with an open operating system, accelerating delivery at a lower cost (Neoteris, Tivo, IBM) (Koenig 2004).

In a 2005 IBM-internal study, an eighth was added: an onramp / loss leader strategy, gaining a strong affinity between open source and commercial software options. This strategy was observed for IBM, CA and Sun.926

In the subscription strategy, the consulting strategy and the hosted strategy, customers choose to pay a vendor for complements to the software rather than to do it themselves. These reflect that software only has value when packaged with other components, and that a vendor may gain economies of expertise as a provider to multiple customers. If we look at similes where vendors cook meals for people who might otherwise cook for themselves, (i) some vendors serve meals-on-wheels, (ii) some offer cooking classes where the student prepares food, and (iii) some have restaurants with everyday recipes and ingredients (e.g. a bacon and egg breakfast).

In the dual license strategy and the onramp / loss leader strategy, some customers choose to directly pay for software that they might otherwise get for free. The dual license strategy is a value-based customer segmentation. Lightweight users might not place a higher value on one provider over another (e.g. a student learning about how a relational database works, or a small business with few transactions). When an application becomes mission-critical and resource-intensive, the paid option may be an economical alternative to obtain features that optimize performance. In the onramp / loss leader strategy, sunk costs are relevant. A customer may want the advantage of working with a specific vendor in the longer term, but be insignificant in its current state. That customer might choose a free version with limitations (e.g. a “community edition”) while in startup mode, then migrate to a paid version (e.g. an “extended edition”) when revenue flows become sustainable. The transition from the free version to the paid version is a simple matter of a changed license key, and no effort is wasted, as sketched below.
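
A minimal sketch of that design choice, in Python: entitlements are gated by the license key alone, so an upgrade swaps the key without reinstalling or migrating anything. The key format and feature names here are hypothetical, not any vendor's actual scheme.

  # Hypothetical edition gating by license key (illustrative only; a real
  # scheme would use cryptographically signed keys). Upgrading from
  # "community" to "extended" changes entitlements without touching the
  # installed software or its data.
  FEATURES = {
      "community": {"core"},
      "extended": {"core", "performance_tuning", "priority_support"},
  }

  def edition_from_key(license_key: str) -> str:
      # Assumed key format "EDITION-XXXX"; unknown editions fall back to community.
      edition = license_key.split("-", 1)[0].lower()
      return edition if edition in FEATURES else "community"

  def is_enabled(license_key: str, feature: str) -> bool:
      return feature in FEATURES[edition_from_key(license_key)]

  assert not is_enabled("COMMUNITY-1234", "performance_tuning")
  assert is_enabled("EXTENDED-5678", "performance_tuning")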

The trend towards open sourcing requires companies to adjust their private sourcing strategies in the long term, if not in the short term. Three ways in which managers and practitioners might adapt have been suggested: (1) since open sourcing moves appropriability from code secrecy (i.e. private sourcing) to licensing, larger efforts could be placed on patenting algorithms that have superior performance;927 (2) mixing open sourcing components with private sourcing assets and investments could be an attractive packaging that enables profitability; and (3) hardware manufacturers could bundle open sourcing software, while specialized software suppliers change their value propositions or target customers (Fosfuri, Giarratana, and Luzzi 2008, 302–303). A larger perspective would suggest that the space for open sourcing while private sourcing is a non-zero-sum game, if the market opportunity is reframed from just information technologies to larger social and economic realms.

Companies with large portfolios of software patents have been key participants in open sourcing projects. However, not all companies are equally enthralled with the potential to commercialize open sourcing. Open source alternatives can be seen as a cannibalization threat, with a potential to devalue a brand name and reputation for quality. Commoditization in one technology may or may not be offset by complementarities (e.g. software offered gratis could be bundled with hardware or services for a fee) (Fosfuri, Giarratana, and Luzzi 2008, 301–302). Open sourcing is a new phenomenon that has to coexist with a legacy of private sourcing. Companies working together have faced court challenges charging collusion and antitrust violations.

In March 2003, the SCO Group (formerly Caldera) filed suit against IBM, alleging that IBM had incorporated code from AT&T's UnixWare (the private sourcing derivative that SCO claimed to have acquired through Novell, and under which IBM's private sourcing AIX product was licensed) into the Linux project (Shankland 2003). In a related lawsuit, SCO also attacked the Free Software Foundation (FSF) over the GPL (the license chosen by Linus Torvalds for Linux) as unconstitutional under the Intellectual Property Clause, and in violation of federal antitrust laws (Ake 2007). In 2006, Judge Brooke Wells granted, in part, IBM's motion to limit SCO's claims due to lack of specificity (P. Jones 2006b). The trial date later that year was vacated pending the resolution of the SCO v. Novell lawsuit. In 2007, Judge Dale Kimball ruled that Novell owned the original UnixWare copyrights (Jones 2007a). Novell stated that it didn't “believe that there is Unix in Linux”, and that the company was “not interested in suing people over Linux”, seeing no value in legal proceedings against IBM (Montalbano 2007). With SCO in Chapter 11 bankruptcy, the case was then administratively closed (P. Jones 2007b). In 2013, SCO secured approval to reopen the case against IBM. At the end of 2014, Judge David Nuffer dismissed the charges, ruling that IBM had licensed the source code from Novell, and that SCO was bound by the Novell judgement (Sharwood 2014).

Beginning in April 2005, Daniel Wallace filed pro se (i.e. without legal counsel) lawsuits in Indiana against the Free Software Foundation. The complaint was amended four times to address mistakes, before dismissal by Judge John Daniel Tinder.928 In June 2006, a second lawsuit was launched against IBM, Red Hat and Novell.929

The plaintiff in the case, Daniel Wallace, wanted to compete with Linux by offering a derivative work or by writing an operating system from the ground up. He argued that he was barred from doing so, while Linux and its derivatives could be obtained at no charge. He asserted that IBM, Red Hat and Novell had conspired to eliminate competition in the operating-system market by making Linux available at an "unbeatable" price: free.

The court found Wallace's theory to be "faulty substantively." The decision pointed out that "the goal of antitrust law is to use rivalry to keep prices low for consumers' benefit." Here, the court concluded that Wallace sought to employ "antitrust law to drive prices up," which would "turn (antitrust law) on its head" (Sinrod 2006).

Later in 2006, Wallace appealed to the U.S. Court of Appeals. The three-judge panel upheld the decision of the lower court (Broache 2006).

These landmark decisions have become precedents, finding that open sourcing can be beneficial to society and is not in conflict with the principles on which antitrust laws were founded.

With open sourcing changing the legal landscape, IBM proactively (i) made intellectual property pledges and (ii) encouraged other industry leaders to follow suit in creating commons. The difference between a patent pledge and a patent commons has been contrasted:

A patent pledge can take various forms but it is basically a public commitment from a patent owner not to sue one or more parties for infringement, typically, in support of a specific usage. This is usually done by companies like IBM in support of specific technologies, standards, or particular industry trends, such as the open source with the goal to facilitate adoption of a specific technology, standard, or software.

Wikipedia defines commons as a word “used in the sense of any sets of resources that a community recognizes as being accessible to any member of that community.” In the case of patent commons, the resources made accessible are patents. Like patent pledges, patent commons are typically created in support of a specific goal. The major difference between patent pledges and patent commons is that while pledges can be done unilaterally, commons by nature require the creation of a community, a set of identified intellectual property owners who agree to respect the rules set by the community (Lehors 2009a).

Towards protecting both contributors and users of assets declared in the commons, legacy procedures for intellectual property ownership could be adapted.

At LinuxWorld in August 2004, “IBM pledged not to assert any of its patents against the Linux kernel” (IBM 2005d). In January 2005, that pledge was made concrete by naming 500 IBM software patents as open access “to any individual, community, or company working on or using software that meets the Open Source Initiative (OSI) definition of open source software now or in the future”.930 That pledge was intended “to form the basis of an industry-wide "patent commons" in which patents are used to establish a platform for further innovations in areas of broad interest to information technology developers and users”.931 In addition to fostering continued innovation, the pledged patents would “contribute to open standards and broader interoperability between applications by providing open source developers with a solid base of innovation they can use and share”. The proposal to create a patent commons was widely covered in the press. While criticism from some notable individuals ranged from “too little” to “too much” (e.g. Bruce Perens counting the pledge as small relative to the total IBM portfolio (Bednarz 2005), and Bill Gates describing “some new modern-day sort of communists” (Andrews 2005)), the proposal was generally well-received (B. Jones 2005). While copyright and patent systems were originally developed to protect innovation and invention, they were increasingly becoming a detriment to scientific, technological and creative advancements.932

In November 2005, the Open Invention Network (OIN) was founded as a patent commons by IBM, Novell, Philips, Red Hat and Sony, with a charter to acquire patents and offer them royalty-free, provided that licensees would not assert their own patents against Linux (P. Jones 2005d). In addition to the Linux-related content, initial assets also included web service patents from Commerce One, which had filed for bankruptcy court protection in 2004 and whose patents were acquired by a Novell subsidiary. In March 2007, Oracle would license patents from the OIN, protecting components in MySQL and PostgreSQL that compete with Oracle's database products (Shankland 2007).933 In August 2007, Google also joined the OIN, as Linux is core to its search engine, web indexing and analysis (Babcock 2007).

In late 2005, IBM started pledging technology specifications, acting unilaterally without partners. An open source analyst explained this as a way of removing bureaucracy for industry standards committees.934 Anyone -- whether commercial or non-commercial, open sourcing or private sourcing -- is free to implement a listed standard, without having to research relevant patents or document conditions on royalty-free licenses. This effectively positioned interface specifications as open sourcing, while putting IBM on the same footing as everyone else for private sourcing implementations (released under either commercial or non-commercial conditions). These pledges included patent shields, implying that anyone attacking IBM on a patent would be met with a counterattack drawing on IBM's rather extensive patent portfolio. The overall effect would be to reduce the need for legal counsel for all.

In October 2005, the first specifications pledged by IBM were selected open healthcare and educational software standards built around web services, electronic forms and open document formats (IBM 2005f). While that year had seen a lot of issues around open document standards and interoperability, little work had been done on industry verticals. In a federated healthcare system, medical information encumbered by proprietary formats controlled by a vendor can slow down integration. IBM named 20 working groups and technical committees in 6 established healthcare and education standards organizations that had done little work on web services and security. In a forward-looking announcement, “if these designated groups build their next generation of healthcare and education standards on web services, electronic forms, and open document standards, and they do so within rules of maintaining compatibility and interoperability, then IBM will not assert any of our patents on implementers of these new healthcare and education standards” (Sutor 2005b). The intent was to promote the use of core underlying standards in a collection of next generation frameworks, as a global initiative and not one specific to North America. The IBM pledge preceded a report on “Ending the Document Game: Connecting and Transforming Your Healthcare Through Information Technology”, in which stories (e.g. filling out contact and insurance information seven times) were presented to the U.S. Congress by the federal Commission on Systemic Interoperability.935

In July 2007, IBM pledged universal and perpetual access to patents on 150 core software interoperability standards (IBM 2007v). Motivations were clarified on a page of Frequently Asked Questions.936 The royalty-free non-assert pledge carried the condition that implementers reciprocate by also not asserting. Like an open source license, no communication to IBM is required. The motive for the pledge was explained as encouraging broad adoption of open specifications for software interoperability, which could see multiple organizations offering a variety of implementations. The list of 150 standards cited on the IBM public web pages included a broad assortment of XML-based technologies.937 Subsequently, additional pledges were made in July 2009 and December 2011.938

In January 2008, the World Business Council for Sustainable Development announced the Eco-Patent Commons. This had been initiated by IBM, who sought a neutral host, following the 2007 Global Innovation Outlook conference.939 The initial founders were IBM, Nokia, Pitney Bowes and Sony, each pledging environmentally responsible patents to the public domain. Examples of environmental benefits expected for pledged patents included:

  • energy conservation or improved energy or fuel efficiency;
  • pollution prevention (source reduction, waste reduction);
  • use of environmentally preferable materials or substances;
  • water or materials use reduction; and
  • increased recycling opportunity (IBM 2008h).

The number of members gradually grew: Bosch, DuPont and Xerox joined in September 2008; Ricoh and Taisei joined in March 2009; and Dow Chemical and Fuji Xerox joined in October 2009. Examples of pledged patents include:

  • transforming old mobile phones into new products, e.g. digital cameras or other electronic devices (from Nokia);
  • converting certain non-recyclable plastics into beneficial fertilizer (from DuPont); and
  • substituting a toxic solvent used in quantum computing with a mixture of alcohol and water (from IBM) (Lehors 2009b).

By 2009, over 100 patents had been pledged.940

IBM was not the only corporation to make patent pledges. It may, however, have been the company that was least criticized in its legal approach.

In January 2005, shortly after IBM's pledge of 500 patents, Sun Microsystems pledged 1600 patents (Sun Microsystems 2005). This gave open source developers free access to Sun OpenSolaris-related patents under the CDDL. A journalist asking Jonathan Schwartz why Sun “wouldn't do what IBM did” got a response that he couldn't “justify to his shareholders opening up all of their IP” (P. Jones 2005b). This was criticized as “clumsy dodging”, with patents available “only for signed-up licensees of the CDDL”, which wouldn't apply to using the patents in Linux (under GPL), or any other open source licenses.941 Under a headline suggesting “patent use would be okay beyond Solaris project”, the head of Sun Solaris marketing responded that “Clearly we have no intention of suing open-source developers," and added, "We haven't put together a fancy pledge on our Web site" to that effect (Shankland 2005). A deeper analysis, seeing the incompatibility of CDDL-licensed OpenSolaris and GPL-licensed Linux, concluded that “Sun prevented its nemeses Red Hat and IBM from implementing those patents in Linux in a way that's harmful to Sun (especially considering the damage that Linux has already done to Sun)” (Berlind 2005a). By 2007, the conditions hadn't changed.942

In September 2006, Microsoft would declare an “Open Specification Promise” (Microsoft 2006c). This promise was criticized as “worse than useless”, as Microsoft explicitly reserved the right to change the terms at any time in the future.943 This did not mean that Microsoft would not work with open source communities -- cooperation on the Apache POI (Java API for Microsoft Documents) project was cited -- but did mean that legal counsel would have to be involved in each and every patent licensing decision (Oliver 2008).

In March 2013, Google would announce an “Open Patent Non-Assertion” (OPN) Pledge, whereby the company promised “not to sue any user, distributor or developer of open-source software on specified patents, unless first attacked” (Warren 2013). The first 10 patents released focused on MapReduce, a programming model for handling large datasets.944 In August 2013, Google would pledge 79 additional patents associated with cloud and big data, having acquired them from IBM and CA Technologies (Lardinois 2013). In August 2014, Google would pledge another 152 patents associated with back-end technologies, encryption and prefetching, and XML parsing and validation (Lardinois 2014).

From 2001 through 2007, open sourcing while private sourcing by corporations could be seen as pioneering, within the constraints of the legal system, particularly in the United States. By 2008, with several legal challenges flattened and a new style of government encouraged by the newly elected Obama administration, open sourcing while private sourcing became an acceptable, albeit uncommon, way of doing business.

B.5.2 From 2002, Creative Commons has standardized open licensing

While some altruistic individuals are willing to participate in open sourcing without concerns about ownership or liability, the pragmatic are more cautious. From the December 2002 release of the Creative Commons 1.0 license, individuals and organizations have been able to easily declare ways in which others are privileged to reuse and/or derive content, without having to engage in case-by-case negotiation.945 Copyright licensing in a broader range of domains was formalized, inspired by the experiences of the open source movement.946

[The] cc licence is not designed for software, but, rather, for other kinds of creative works: websites, texts, courseware (these are all considered literature), music, film, photography, etc.947

The original conditions of “some rights reserved” were expressed as combinations of four dimensions:

Attribution [by]: You let others copy, distribute, display, and perform your copyrighted work — and derivative works based upon it — but only if they give credit the way you request.

Share Alike [sa]: You allow others to distribute derivative works only under a license identical to the license that governs your work.

Non-Commercial [nc]: You let others copy, distribute, display, and perform your work — and derivative works based upon it — but for non-commercial purposes only.

No Derivative Works [nd]: You let others copy, distribute, display, and perform only verbatim copies of your work, not derivative works based upon it.948

Six combinations of these conditions describe increasing strengths of copyright assertion, abbreviated as (i) BY, (ii) BY-SA, (iii) BY-ND, (iv) BY-NC, (v) BY-NC-SA and (vi) BY-NC-ND.949 For each work released, a content creator can override the standard copyright terms in a jurisdiction by declaring a Creative Commons license. Enabling a wide variety of license choices follows the philosophy of choice espoused by the Open Source Initiative.950
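
Why only six combinations? Attribution (BY) is common to all of the licenses, and Share Alike cannot be combined with No Derivative Works, since a condition governing derivatives is meaningless when derivatives are forbidden. As a minimal sketch (an illustration, not drawn from Creative Commons tooling), the combinations can be enumerated:

    # Enumerate the six coherent Creative Commons license combinations.
    # BY (attribution) is common to all; SA and ND cannot be combined,
    # because SA constrains derivative works while ND forbids them.
    from itertools import product

    def cc_licenses():
        licenses = []
        for sa, nc, nd in product([False, True], repeat=3):
            if sa and nd:
                continue  # incoherent: SA governs derivatives, ND forbids them
            parts = ["BY"] + [tag for flag, tag in ((nc, "NC"), (sa, "SA"), (nd, "ND")) if flag]
            licenses.append("-".join(parts))
        return licenses

    print(cc_licenses())
    # ['BY', 'BY-ND', 'BY-NC', 'BY-NC-ND', 'BY-SA', 'BY-NC-SA']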

In December 2007, two additional legal tools were announced: CC+ (i.e. CC Plus) and CC0 (i.e. CC Zero).

CC+ is a protocol to enable a simple way for users to get rights beyond the rights granted by a CC license. For example, a Creative Commons license might offer noncommercial rights. With CC+, the license can also provide a link to enter into transactions beyond access to noncommercial rights — most obviously commercial rights, but also services of use such as warranty and ability to use without attribution, or even access to physical media. [….]

CC0 is a protocol that enables people to either (a) ASSERT that a work has no legal restrictions attached to it, or (b) WAIVE any rights associated with a work so it has no legal restrictions attached to it, and (c) SIGN the assertion or waiver (Steuer 2007).

The CC+ license was developed in cooperation with commercial rights agencies and some pioneering CC-enabled businesses. The CC0 license is similar to a public domain dedication, enabling a future platform for reputation systems to judge the reliability of a copyright status, depending on the certifier.

Refinement of the Creative Commons licences has continued: work on version 4.0 was initially launched in September 2011, and the version was published in November 2013. With CC 4.0 licenses, porting a generic license across the variety of (60) jurisdictions is no longer necessary.951

The Creative Commons has advanced the letter of the law in defining a remix or read/write culture, in contrast to a permission or read-only culture (Lessig 2008). Licensing is a parallel activity, whereby the lawyers have been catching up to the practices exhibited in open source communities.952

The rise of digital photography has led to a domain where new licensing for content has become popular. Photographers capture more images digitally than they did with film cameras, and the Internet makes sharing (privately with friends or family, or with the public) easy.953 Organizing digital images on a web platform can make finding a specific photograph easier, not only for non-creators, but also for the photographer himself or herself.954

Flickr launched as a web-based image hosting platform in February 2004. In June 2004, Flickr announced the feature to choose a Creative Commons license on uploading new images, either as a batch or per-photo (Butterfield 2004). With the rise of blogging, hosting images on Flickr was an easy way of managing digital photographs. Within the first year, 10 million photographs were published under the six Creative Commons licences. Yahoo acquired the company in March 2005. After 5 years, in 2009, there were 100 million photographs, free to download, print and distribute (Thorne 2009). Most photographs were licensed restrictively (33% BY-NC-ND, and 29% BY-NC-SA), yet 24% (i.e. 24 million photos) allowed commercial use with minimal restrictions. Case law has demonstrated that Creative Commons licenses on Flickr images have been enforced in a variety of jurisdictions.955 In March 2015, Flickr added options to tag works no longer in copyright as Public Domain, or for complete copyright releases under CC0 (Vaidyanathan 2015).

Creating and sharing derivative works of digital images can complicate their licensing. In November 2006, DeviantArt included Creative Commons licensing as an alternative to traditional copyright, as part of the normal workflow of uploading artistic content to its web site (Ellwood 2006). On DeviantArt, founded in 2000, members “deviate” animations, photographs, web skins, films and literature, and share them on the web.

In July 2004, Google acquired Picasa as an image organizer and viewer. With the September 2008 announcement of Picasa 3.0 and Picasa Web Albums, Google announced the option to choose one of the six CC licenses on each image, and/or set the “Photo Usage and Licensing” as a default (Benenson 2008; Horowitz 2008).

Sharing digital images through blogging has become even more popular with “retweeting” or “liking”. Creative Commons licensing by the original photographer legalizes the conditions for resharing the content. Technically, the terms of service for each web hosting service often restrict sharing to only that provider, and not other web services. Normal practices of resharing content in social media could thus frequently represent violations of copyright, though these are rarely enforced.956

Sharing music over the Internet has had a legal chill since the 2001 enforcement of the Digital Millennium Copyright Act (DMCA) on Napster.957 This could be resolved by separating Creative Commons licensed digital music from the traditional commercial channels of distribution.

In January 2005, Jamendo was launched as a website for musicians, complementing peer-to-peer networks (e.g. eMule, BitTorrent and Kazaa) with a legal service allowing artists to choose one of the Creative Commons licenses for their works.958 Jamendo also proposed a system of direct and voluntary compensation from listeners to the artists, in the form of donations or sponsorship, passing through 90% to 100% of the payment (Roelants 2005). By December 2006, the web site listed 2000 albums (Zimmer 2006). The company received first-round venture capital funding in July 2007, and was bought out in April 2008.959 By October 2010, Jamendo claimed that the web site had 400,000 tracks of free music from 30,000 artists in 150 countries, with additional licensing for film, television, public places and games (Jamendo Team 2013).

Spoken word podcasting saw some early adopters of Creative Commons licensing. On May 3, 2005, the first Radio Open Source podcast interview was released, based on a collaboration between two fellows of the Berkman Center for Internet & Society: radio journalist Christopher Lydon and software developer Dave Winer (Walsh 2011).960 The founding of Radio Open Source followed a 2001 dispute on rebroadcast rights on Lydon's prior show, The Connection.961 Production of the podcast show has continued with Creative Commons licensing, with audio content preserved on the Internet since 2005. In November 2013, an agreement was made by WBUR to rebroadcast Radio Open Source podcasts on weekends (Kahn 2013).

Video sharing web sites were slower in working through copyright options, as the popularity of camera phones rose.

Blip.tv, since the early days of its beta test in July 2006, was the first to require that all video content uploaded to its web site be licensed under Creative Commons.962 This feature was a major differentiator amongst video sharing web sites for many years. Blip.tv made downloading video content easy for remixing, whereas alternatives would not do likewise for many years.963

In July 2010, Vimeo announced that they were “launching a new Settings option that allows you to add one of several Creative Commons licenses to your videos” (Verdugo 2010). They re-emphasized a “golden rule” that “you may not upload content that you did not create yourself”, and that permissions could not be granted for others to use.

In June 2011, YouTube announced that content owners could mark their videos with Creative Commons CC-BY licenses upon uploading (Peterson 2011). Additionally, YouTube started a new Creative Commons library of 10,000 videos from C-SPAN, Voice of America, Al Jazeera and others. This was described as “a big deal” for remix culture, as Creative Commons-licensed videos became readily available to YouTube's video editor for mashing up with other clips and synchronizing with music. The YouTube product manager described it “as if all the Creative Commons videos were part of your personal library” (Roettgers 2011).

The sharing of text had a legacy in GNU licences (e.g. for documentation) that was eventually updated for larger-scale collaboration. Between May and August 2009, Wikipedia amended the licensing on its web sites to enable dual licensing under the original GFDL (GNU Free Documentation License) and a new Creative Commons CC-BY-SA license. This was started in December 2007, with a request from the Wikimedia Foundation to the Free Software Foundation to provide a migration path in the GFDL 1.3 license. The Wikipedia founder, Jimmy Wales, said Creative Commons licensing might have been preferred if it had been available when the web site was first launched.

When I started Wikipedia, Creative Commons did not exist. The Free Documentation License was the first license that demonstrated well how the principles of the free software movement could be applied to other kinds of works. However, it is designed for a specific category of works: software documentation. The CC-BY-SA license is a more generic license that meets the needs of Wikipedia today, and I'm very grateful that the FSF has allowed this change to happen. Switching to CC-BY-SA will also allow content from our projects to be freely mixed with CC-BY-SA content. It's a critically necessary change for the future of Wikimedia (Wikimedia Foundation 2009).

The Free Software Foundation released GFDL 1.3 in November 2008.964 After the May passage of the dual licensing vote by the Wikimedia Foundation, the FSF gave permission to transition from the GFDL by August 1, 2009.965 After that date, all content on Wikipedia was to be licensed as Creative Commons CC-BY-SA.

By June 2011, the total number of CC-licensed works on the Internet was estimated at 400 million works, “from music and photos, to research findings and entire college courses” (Creative Commons Corporation 2011). Stories of success were related in an online book, The Power of Open, published in 9 languages. TED Talks has had free and open distribution of its videos since June 2006 under a CC BY-NC-ND license. British photographer Jonathan Worth followed Cory Doctorow's example of giving his book away and making money from it, with an experiment of CC BY licensing of free high-resolution copies online while selling exclusive signed prints.966 Nina Paley released her 2008 animated movie “Sita Sings the Blues” under CC BY-SA, claiming that she has a higher profile, doesn't spend anything on promotion, and fans buy merchandise. The Open University chose CC licenses on its new OpenLearn website in 2005, preempting £100,000 in legal fees. Khan Academy has licensed its videos under the BY-NC-SA license since 2004, and has received funding from the Bill & Melinda Gates Foundation. The Public Library of Science (PLoS) has published with a CC BY license since 2003, and has open access journals recognized as having impact. A total of 30 stories were published in the book.

The sharing of content -- as information, ideas, or creative works -- has been defined in four ways:

Pay to view sharing is making content available to paying customers (e.g. paywalls).

Read only sharing is granting free access to read content (for the vast majority of content published, this is the type of sharing involved).

Copy only sharing is giving other people the right to actually move and share your content around the web.

Remix sharing means giving other people rights to remix and build upon your content (Pearson 2015).

The Internet has made sharing easy, with (i) abundant information and creative works online; (ii) free copies of even locked-up content easy to find; and (iii) people going beyond just consuming to participate in creating culture. Thus, Creative Commons licences have been used (i) as a publicity vehicle; (ii) to build community; (iii) to leverage outside ideas; (iv) for the social good; and (v) to build new kinds of “open” companies with transparency and a participative work culture. By 2014, the number of Creative Commons-licensed works was estimated at 882 million.967 The presentation of these ideas led to the August 2015 Kickstarter funding of a book, Made with Creative Commons: A Book on Open Business Models (Stacey 2015). The project has a goal to begin answering the question “how do creators make money to sustain what they do when they are letting the world reuse their work?” (Creative Commons 2015).

B.5.3 From 2005, open government data cooperated with citizens

Transparency of government and public access to information are not new ideas. Ready sourcing of evolving digital datasets through open interfaces, so that the information can be analyzed, contextualized and mashed up, has advanced at various rates in a variety of jurisdictions worldwide.

In December 2007, 30 American open government advocates met for a weekend to create a list of Open Government Data Principles.968 Resulting from the meeting were 8 principles and 3 definitions, requesting open comment:969

Government data shall be considered open if it is made public in a way that complies with the principles below:

  1. Complete: All public data is made available. Public data is data that is not subject to valid privacy, security or privilege limitations.
  2. Primary: Data is as collected at the source, with the highest possible level of granularity, not in aggregate or modified forms.
  3. Timely: Data is made available as quickly as necessary to preserve the value of the data.
  4. Accessible: Data is available to the widest range of users for the widest range of purposes.
  5. Machine processable: Data is reasonably structured to allow automated processing.
  6. Non-discriminatory: Data is available to anyone, with no requirement of registration.
  7. Non-proprietary: Data is available in a format over which no entity has exclusive control.
  8. License-free: Data is not subject to any copyright, patent, trademark or trade secret regulation. Reasonable privacy, security and privilege restrictions may be allowed.

Compliance must be reviewable.

Definitions

1. “public” means:

The Open Government Data principles do not address what data should be public and open. Privacy, security, and other concerns may legally (and rightly) prevent data sets from being shared with the public. Rather, these principles specify the conditions public data should meet to be considered “open.”

2. “data” means:

Electronically stored information or recordings. Examples include documents, databases of contracts, transcripts of hearings, and audio/visual recordings of events.

While non-electronic information resources, such as physical artifacts, are not subject to the Open Government Data principles, it is always encouraged that such resources be made available electronically to the extent feasible.

3. “reviewable” means:

A contact person must be designated to respond to people trying to use the data.

A contact person must be designated to respond to complaints about violations of the principles.

An administrative or judicial court must have the jurisdiction to review whether the agency has applied these principles appropriately.970

These 2007 principles have since been adopted, to a greater or lesser degree, across jurisdictions at the federal, regional and municipal levels.

By October 2013, the Open Data Barometer would rank the UK as the most advanced country for open data readiness, implementation and impact, scoring above the USA, Sweden, New Zealand, Denmark and Norway, as shown in Table B.2. The leading developing country was Kenya (21st), ranking higher than rich countries such as Ireland (29th) and Belgium (31st).

Table B.2 Open Data Barometer, Top Global Ranking, from (Davies 2013)

Rank  Country          Readiness   Implementation   Impact      ODB
                       Sub-Index   Sub-Index        Sub-Index   Overall

1     United Kingdom   100.00      100.00            79.91      100.00
2     United States     95.26       86.67           100.00       93.38
3     Sweden            95.20       83.14            71.95       85.75
4     New Zealand       81.88       65.49            88.81       74.34
5     Norway            91.88       70.98            46.15       71.86
5     Denmark           83.54       70.20            55.73       71.78
7     Australia         87.88       64.71            51.19       67.68
8     Canada            79.11       63.92            51.59       65.87
9     Germany           74.50       63.14            53.81       65.01
10    France            79.39       64.31            39.07       63.92
10    Netherlands       85.92       67.06            21.42       63.66

The Open Data Barometer is structured in three sub-indices: (i) readiness, identifying how far a country has put in place the political, social and economic foundations for realizing the potential benefits of open data; (ii) implementation, identifying the extent to which government has published a range of key datasets to support innovation, accountability and improved social policy; and (iii) emerging impacts, identifying the extent to which open data has been seen to lead to positive political, social, environmental and economic change.

While the UK scored the highest in readiness and implementation, the United States scored highest in impact. Sampling was conducted in 77 countries. Much of the emphasis in the report was on appreciating regional (i.e. continental) trends, and on recognizing the strong relationship between levels on the Human Development Index and the diffusion of open government data policy and practice. Most countries had open government data initiatives at the national level, with a few exceptions led by sub-national jurisdictions (e.g. Edo State ahead of Nigeria as a country).

With the focus of this book on the period between 2001 and 2011, an outline of progress in the leading two countries follows: the United Kingdom, and the United States. The UK's evolution partially coincided with a 2003 directive from the European Union, culminating in a 2010 action plan by Prime Minister David Cameron.971 The U.S. activity was led by citizen dissatisfaction with scandals from 2005 through to the Obama administration taking office in 2009.

In the UK, policy changes would originate first from outside the country, alongside grassroots activities from citizens associated with academic institutions.

In November 2003, the EU passed a Directive on the Re-use of Public Sector Information (PSI). It established “a minimum set of rules governing the re-use and the practical means of facilitating re-use of existing documents held by public sector bodies of the Member States” (European Parliament 2003). The general principle was that “these documents shall be re-usable for commercial or non-commercial purposes”, and “where possible, documents shall be made available through electronic means”. By July 2009, all member states had implemented the Directive, although only four met the original deadline of July 2005 (European Commission 2009). The UK was one of those four (Minister for the Cabinet Office 2005). The European Commission would open 18 infringement cases against member states, and the European Court of Justice would deliver 4 judgements for failure to implement the Directive.972 The 2009 report assessed that “progress had been made”, but implementation in member states was “uneven”.

In May 2004, the Open Knowledge Foundation was incorporated in the UK, led by Rufus Pollock, an economist at the University of Cambridge.973 While its initial initiatives at December 2004 did not focus on open government data, the original purpose would eventually include that as well:

The Foundation exists to promote the openness of all forms of knowledge. We work in three particular areas:

  1. To promote freedom of access, creation and dissemination of knowledge.
  2. To develop, support and promote projects, communities and tools that foster and facilitate the creation, access to and dissemination of knowledge.
  3. To campaign against restrictions both legal and non-legal on the creation, access to and dissemination of knowledge.974

The falling costs of computing and the Internet were seen as an opportunity for a knowledge society, while the strengthening of intellectual property law was seen as a threat.975

In January 2005, the Freedom of Information Act 2000 came into force across the whole United Kingdom. This act of parliament originated as a white paper in 1997, with a schedule of compliance around timelines in different jurisdictions. The 2000 UK Act applied to public bodies in England, Wales and Northern Ireland. The Scottish Act, with almost identical requirements, was passed in 2002.

In March 2005, Demos, a cross-party think tank in Britain, published Wide Open: Open source methods and their future potential (Mulgan, Steinberg, and Salem 2005). They suggested three broad categories of activity observed in projects inspired by open source ideas, at least partially transferable to non-software areas: (i) open knowledge, projects where knowledge is provided freely, shaped, vetted and used by a wide community of participants; (ii) open team working, projects that merge semi-open teams rooted in organizations into loose communities of interest; and (iii) open conversations online, with more participants than before.

In 2003, the mySociety project, led by Tom Steinberg, revived the UK Citizens Online Democracy (UKCOD) originally founded in 1996.976 In June 2006, the TheyWorkForYou web site -- aggregating content from Hansard records of the House of Commons, House of Lords, Scottish Parliament and Northern Ireland Assembly to track votes and speeches of Members of Parliament since 2004 -- was adopted by mySociety.

In February 2007, an independent review was chartered “to explore new developments in the use of citizen- and state-generated information in the UK, and to present an analysis and recommendations to the Cabinet Office Minister as part of the Policy Review” (Mayo and Steinberg 2007, 7). This was supported by the Strategy Unit of Prime Minister Tony Blair, with Tom Steinberg of mySociety as the primary author, and with Ed Mayo of the National Consumer Council in a rapid review. The final report of The Power of Information was published in June 2007, with 15 recommendations.

June 2007 was also the month when the Labour Party leadership would transition from Tony Blair to Gordon Brown in an uncontested election. The popularity of the Labour Party would decline in the recession of 2008, and the party saw a catastrophic loss of seats in the 2010 general election. With the Conservative Party having the largest number of seats in a hung parliament, a coalition between the Labour Party and the Liberal Democrats was insufficient to rule, and Gordon Brown resigned. In May 2010, the government would change to coalition of the Conservative Party led by David Cameron and the Liberal Democrats led by Nick Clegg.

In an open sourcing mode, the Open Knowledge Foundation would develop the ideas and infrastructure for the Comprehensive Knowledge Archive Network (CKAN), which would become the foundation for the data.gov.uk initiative, and subsequently data.gov in the United States. In September 2006, the OKF published version 1.0 of the Open Knowledge Definition.977 A work is defined as open “if its manner of distribution satisfies” conditions on (i) access; (ii) redistribution; (iii) reuse; (iv) absence of technological restriction; (v) attribution; (vi) integrity; (vii) no discrimination against persons or groups; (viii) no discrimination against fields of endeavor; (ix) distribution of license; (x) license must not be specific to a package; and (xi) license must not restrict the distribution of other works. Licenses conformant with the open knowledge definition include the MIT Database License; the Creative Commons Attribution License (cc-by) and Attribution Share-Alike License (cc-by-sa); the GNU Free Documentation License (GFDL); and the UK PSI (Public Sector Information) Click-Use License. Features that would make a license non-conformant include no-derivatives and non-commercial clauses. In October 2014, the Open Definition would be revised as version 2, in simpler language: “Open data and content can be freely used, modified, and shared by anyone for any purpose”.978 The revision was in review for a year prior to release, with “input from experts involved in open access, open culture, open data, open education, open government, open source and wiki communities”.

OKCon 2007, the first Open Knowledge Conference, was held in London in March 2007, with panels on open media, open geodata and open scientific and civic information.979 OKCon 2008 was held one year later, on the theme “Applications, Tools and Services”.980 OKCon 2009 in March 2009 focused on “open knowledge and development” and on “the semantic web and open data”.981 Subsequent conferences have spread to other geographies.982

In July 2007, the OKF announced the launch of the open sourcing Comprehensive Knowledge Archive Network (CKAN), after a year of prior development (Pollock 2007b). CKAN is a registry of open knowledge packages and products, a place to search for resources as well as to register them.983 CKAN did not replace local technologies, and recommended “side by side” integration with existing public data platforms.984
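
A registry like CKAN also makes its catalogue machine-processable. As a minimal sketch, assuming a public instance exposing the version 3 action API of later CKAN releases (the base URL below is only a placeholder), a dataset search can be scripted in a few lines:

    # Query a CKAN registry through its standard action API (version 3).
    # BASE_URL is a placeholder; substitute any public CKAN instance.
    import json
    import urllib.parse
    import urllib.request

    BASE_URL = "https://demo.ckan.org"

    def search_packages(query, rows=5):
        """Search the catalogue for dataset packages matching a query."""
        params = urllib.parse.urlencode({"q": query, "rows": rows})
        url = "%s/api/3/action/package_search?%s" % (BASE_URL, params)
        with urllib.request.urlopen(url) as response:
            payload = json.load(response)
        if not payload.get("success"):
            raise RuntimeError("CKAN API call failed")
        return payload["result"]

    result = search_packages("health")
    print(result["count"], "matching datasets")
    for package in result["results"]:
        # Each package carries a short machine name and a readable title.
        print("-", package["name"], ":", package.get("title", ""))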

In September 2009, the Cabinet Office announced an early preview of data.gov.uk, inviting the developer community to give feedback on 1000 existing data sets from 7 departments (Taylor 2009). The backend repository for data.gov.uk was a private beta of the CKAN technology, with the packages promised to show up on the main CKAN web site when it became public (Pollock 2009).

In January 2010, the public beta of data.gov.uk was announced (Data.gov.uk Team 2010). Improvements over the four months included more datasets, plus online browsing of data and tags, a wiki, and a forum (powered by Drupal). The OKF was a subcontractor to the primary contractor, initially the National Archives and then the Central Office of Information (CKAN 2011).

In May 2010, Prime Minister David Cameron announced a radical plan to open up government data to the public, establishing a Public Service Transparency Board under Minister of the Cabinet Office Francis Maude, and appointing Tom Steinberg, one of the UK's leading experts on data transparency, to the board (Cameron 2010).

In July 2010, the Cabinet Office promoted populating data.gov.uk, with an article titled “Tell us which datasets you want released”.985

In September 2010, the information covered by Crown copyright and database rights was relicensed under a new UK-wide Open Government License. From version 1, this new license was aligned to be compatible with the Creative Commons Attribution license.986

In May 2011, the Cabinet Office appointed Beth Noveck as an advisor on open source policy making, based on her experience in the U.S. She was to complement (i) Tom Steinberg; (ii) Tim Kelsey (seconded from McKinsey to direct national policy on transparency, becoming the full-time Executive Director of Transparency and Open Data in January 2012); and (iii) Martha Lane Fox (an Internet entrepreneur previously appointed in 2009 as the digital inclusion champion towards bringing poorer families online) (Osborne 2011).

The OKF worked on data.gov.uk for its first two years. In early 2010, the Open Government Data web site, wiki and mailing list were started by the Open Knowledge Foundation. The initial vision was “mapping out open government data initiatives from around the world”.987 By fall 2010, CKAN instances internationally included governments and institutions in the UK, Norway, the Netherlands, Helsinki and the International Aid Transparency Initiative, as well as communities operating at national, provincial and municipal levels.988 “In early 2012 the UK government took its CKAN work in-house, but they continue to work closely with the CKAN team and make regular code contributions back to CKAN” (CKAN 2013). By the relaunch of data.gov.uk in June 2012, CKAN was found to provide a better web interface, and those functions were migrated away from the Drupal modules (Acuna 2012).

Americans came to open government digital data from a different direction. In 2005, dissatisfaction with multiple corruption scandals in Washington D.C., combined with the rise of social media, brought together citizens interested in a more open and accountable government.989

In April 2006, the Sunlight Foundation was founded, with three priorities: “digitizing data, building tools and the sites for easy access to it, and developing communities to support and help carry on its work” (Sunlight Foundation 2010). The name of the group reflects the dissatisfaction with government transparency at that time.

Publicity is justly commended as a remedy for social and industrial diseases. Sunlight is said to be the best of disinfectants; electric light the most efficient policeman (Brandeis 1914, 92).990

This group was instrumental in convening an Open Government Working Group meeting in October 2007, to develop the list of eight principles for open government data (Malamud 2007).

These citizen-led activities shaped the guidelines under which development by government managers progressed. The Federal Web Managers Council is an interagency group, sponsored by the GSA's Office of Citizen Services, that collaborates to improve the online delivery of U.S. government information and services.991 In June 2008, the implementation guidelines on publishing data were updated in response, incorporating citizen feedback based on the Open Government Data Principles (Tauberer 2008).

On January 21, 2009 -- the first day in office for a new administration -- the White House issued a Memorandum on Transparency and Open Government, outlining three principles: (i) government should be transparent; (ii) government should be participatory; and (iii) government should be collaborative. The president directed the CTO, in coordination with the director of the OMB and the administrator of General Services, to produce recommendations for an Open Government Directive within 120 days (Obama 2009).992 Beth Noveck was appointed as Director of the White House Open Government Initiative in January 2009.993 In analysis by citizens, this memorandum listed deadlines at 45 days, 60 days, 90 days, 120 days, 1 year and 2 years (Schuman 2009).

The energy from the new administration sparked public collaborations. In February 2009, a TransparencyCamp meeting -- an “unconference” inviting government representatives, technologists, developers, NGOs, wonks and activists led by the Sunlight Foundation -- convened in Washington D.C.994 Presentations, notes and audio recordings were shared on a wiki openly on the Internet, with videos following soon after.995 A TransparencyCamp West was convened in August 2009, with a better-organized web site and microblogging.996 This series matured with a March 2010 event in a larger DC venue, and video recordings following.997 TransparencyCamp has become an annual event, with strong support for local communities to host their own.998

In addition to individuals immersed in face-to-face meetings, efforts toward understanding the importance of open government data have been targeted at larger audiences. Joshua Tauberer has described Open Data as “Civic Capital”, reducing costs in the redistribution of government information and strengthening governance through educating citizens and reducing the need for government regulation (Tauberer 2009a). In the business media, Tim O'Reilly projected a “Government 2.0” whereby government becomes “an open platform that allows people inside and outside” to innovate, in a similar way to how “Web 2.0” reshaped business models in old media and software companies (O'Reilly 2009b).

Inside the U.S. government, the policy setting has trickled down to work towards changing practices. Beginning November 2009, monthly inter-agency collaborative events have been organized as an Open Government Directive Workshop Series, each hosted by a different agency. Online social media tools visible to the public have been used to coordinate these events, with presentations and notes available for public viewing.999 The White House formally directed executive departments and agencies, on December 8, 2009, to take steps towards the goal of creating a more open government, including (i) publishing government information online; (ii) improving the quality of government information; (iii) creating and institutionalizing a culture of open government; and (iv) creating an enabling policy framework for open government (Orszag 2009).

Evaluations of the U.S. government activities towards a more open government have generally been favourable. A review of the Open Government Directive of December 8, 2009 found that it addressed “nearly all of the open government data principles that have been put forward, and even [added] two of its own: being pro-active about data release and creating accountability by designating an official responsible for data quality” (Tauberer 2009b). In an April 2010 audit of Open Government Plans, findings scored 6 agencies as “strong”, 16 as in the “middle ground”, and 5 with “weak plans” (OpenTheGovernment.org 2010). On a multi-year time horizon, progress was being made, and the public visibility of activities towards open government through periodic reviews would help to maintain momentum.

At the international level, the Open Government Partnership “is a multilateral initiative that aims to secure concrete commitments from governments to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance”.1000 The partnership launched on September 20, 2011, with eight founding governments (Brazil, Indonesia, Mexico, Norway, the Philippines, South Africa, the United Kingdom and the United States) endorsing the Open Government Declaration.1001 To be accepted, “OGP participating countries will co-create a National Action Plan (NAP) with civil society. Action plans should cover a two-year period and consist of a set of commitments that advance transparency, accountability, participation and/or technological innovation”. By 2013, 57 additional governments had joined the Partnership.1002

The uptake of open government data as an idea rose circa 2007-2009, in a variety of jurisdictions.1003 Citizen engagement, as distinct from official government pronouncements, is sometimes more difficult to discern.

As an example, the Canadian government espoused joining the Open Government Partnership in April 2012 (Lithwick and Thibodeau 2012). Within Canada, democracy advocates criticized the Conservative government for failing to meet the requirements of a National Action Plan.1004 In April 2014, a Progress Report on Canada for 2012-2013 was issued (Francoli 2014a). While the government of Canada had a highlight in adopting an Open Government License, the report criticized a lack of ambition in driving an open agenda.1005 This could be interpreted as accepting the licensing of open source, while skirting the adoption of behaviours associated with open sourcing. This led to the federal government conducting consultations for a second Action Plan on Open Government, and recognizing multi-jurisdictionality (Francoli 2014b).

At the municipal level in Canada, citizen activity was better welcomed by local governments. ChangeCamp was initiated in January 2009 in Toronto, and has spread to other Canadian cities.1006 In April 2009, the mayor of the City of Toronto announced the OpenTO initiative, with an official launch in November 2009.1007 In May 2009, the City of Vancouver council endorsed principles of open data, standards and source, with an official launch in September 2009.1008

In the UK in 2012, Tom Steinberg resigned after 5 years of advisory roles in Westminster, frustrated “partly due to the dull tribalism”. He publicly posted policy papers written for both Tory and Labour politicians and analysts, expressing regret at the way his name was being used (Steinberg 2012).

The progress in government on open source licensing, around the world, can be evaluated as great. The progress in government on open sourcing, as a behaviour, is highly variable.

B.5.4 From 2005, open source hardware rose with the maker movement

While open sourcing originated with non-material artifacts, the impact on the material world of physical objects was beginning to gain traction by 2007. Physicality can create issues in a mix of property domains:

  • hardware designs can fall under patent law;
  • software code can fall under copyright law; and
  • symbols, words and phrases identifying a source of goods or services can fall under trademark (or service mark) law.

The traditional pure private sourcing style would see enforcement of (i) patents on a hardware design; (ii) copyrights on software code; and (iii) trademarks on distribution.1009 Open sourcing while private sourcing may change those positions, to (i) pledging non-assertion of patent rights on hardware designs and (ii) publishing open source licenses on software code, while (iii) operating commercially and pursuing infringements of trademarks on packaging and delivery of products, and/or service marks on the identification of the offering.

Two domains where open sourcing while private sourcing received notoriety have been in (i) the maker movement, centered particularly around the popularization of the Arduino ecosystem, and (ii) fashion apparel, where intellectual property protection has been relatively uncommon.

A label of maker resonates with many who exhibit behaviours of democratizing innovation (von Hippel 2005).1010 These ideas contrast with the role-partitioned thinking where (i) users provide needs; and (ii) manufacturers develop products and services (in the style this book calls private sourcing). In user-centered innovation, individuals are involved in customizing and/or extending functions, attributes or features of products and services. The involvement can occur either independently of the manufacturer, or in cooperation.1011 Beyond open sourcing software, the development of physical products for kitesurfing by an MIT student (Saul Griffith at Zero Prestige) in 2001 became an early foundational case study for von Hippel. In 2005, Eric Wilhelm would partner with Saul Griffith to cofound Squid Labs (an engineering design firm), and then spin off four companies, including Instructables. Instructables has become “the world's biggest show and tell”, offering free step-by-step instructions on how to make things (McCluskey 2008).

The rise of the maker movement coincides with Dale Dougherty coining the term.1012 Dougherty proposed Make magazine -- pitched as “Martha Stewart for geeks” -- and launched the first issue in January 2005, not only as a web site, but also as a physical publication with few advertisers at a $15 cover price (McCracken 2015). The term “maker” was a vague way of describing the target audience of individuals with a sense of curiosity, adventure and intellectual engagement in learning-by-doing. In April 2006, the first Maker Faire at the San Mateo County Event Center attracted 100 exhibiting makers. The second Maker Faire was held in San Mateo in May 2007, and a third in October 2007.1013 In 2010, two new Maker Faires were started in Detroit and New York City (Hague 2014). By 2014, Maker Faires reached 100 events globally, with 530,000 attendees (Maker Faire 2014). The White House would host a mini Maker Faire on June 18, 2014.

In winter 2005, the low-cost Arduino microcontroller board was introduced as a tool for students of Massimo Banzi at the Interaction Design Institute Ivrea (IDII) (Kushner 2011). A microcontroller board, when combined with a programming language, enables novices to create interactive devices and robots with motors, sensors, lights and sounds. Prior to 2002, the most popular platform had been the Stamp, manufactured by Parallax Inc. The Stamp was programmable with a dialect of BASIC on Windows-based personal computers. The IDII students were challenged to design interactive prototypes, with about 30 days to learn electronics. The Stamp had obstacles, including a $100 hardware cost, an underpowered processor for the projects they proposed, and the lack of programming tools on the Mac computers the students preferred.1014 One student, Hernando Barragán, wrote a new prototyping platform in 2003 called Wiring, including both a user-friendly Integrated Development Environment (IDE) and a ready-to-use circuit board.1015 To complement that software, a core product team of five targeted a $30 budget.1016 The first implementation saw 300 blank printed circuit boards given to IDII students, with a directive to look up assembly instructions online.1017 By 2005, the first simple prototype board was created, eventually given the Arduino name by Banzi.
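
Part of the appeal of such platforms is how little code an interactive prototype requires. The historical Wiring/Arduino environment used a C++ dialect; purely as an illustrative sketch in a later idiom, the canonical first program (blinking an LED, here assumed to be wired to pin 2, which varies by board) can be written in MicroPython as:

    # Blink an LED: the canonical first program on a prototyping board.
    # MicroPython is shown for illustration; the pin number is an assumption.
    from machine import Pin
    import time

    led = Pin(2, Pin.OUT)   # configure pin 2 as a digital output
    while True:
        led.value(1)        # LED on
        time.sleep(0.5)
        led.value(0)        # LED off
        time.sleep(0.5)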

The Arduino project was apprehensive that IDII might run out of operating funds, and decided to ensure that the assets would remain as accessible as possible. The design of the board, as CAD (computer-aided design) files, was licensed under Creative Commons. Components could be added onto the basic board, and numerous input and output pins were provided.1018 The Arduino brand was trademarked, enabling alternative implementations to be developed commercially while retaining a distinct identity. The original Arduino Uno was eventually complemented by the more powerful Mega board, the smaller Nano board, the waterproof LilyPad, and a net-enabled Arduino Ethernet.
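As a hypothetical illustration of those input and output pins (the pin assignments below are assumptions for the sketch, not taken from any board documentation cited above), only a few more lines let a sensor reading drive an output:

    // Hypothetical sketch: read a sensor on analog pin A0 and use the
    // reading to set the brightness of an LED on PWM-capable pin 9.
    const int sensorPin = A0;   // assumed: any analog sensor, e.g. a potentiometer
    const int ledPin = 9;       // assumed: a PWM-capable output pin

    void setup() {
      pinMode(ledPin, OUTPUT);
    }

    void loop() {
      int reading = analogRead(sensorPin);            // 0..1023 from the ADC
      int brightness = map(reading, 0, 1023, 0, 255); // rescale to the 0..255 PWM range
      analogWrite(ledPin, brightness);                // set the LED brightness
      delay(10);                                      // brief pause between samples
    }

Add-on boards stack onto the same pin headers, which is how the basic board accommodates motors, displays and network interfaces.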

While Arduino counterfeits (i.e. illegal copies bearing the trademark) exist, Arduino-compatible clones can follow the specifications with a wide variety of features and costs.1019 Arduino LLC was founded in the United States, with Massimo Banzi as CEO. Production of officially trademarked Arduino boards continued with Smart Projects SRL in Italy, under Gianluca Martino.

In 2013, Intel introduced the Galileo, an Arduino-certified development board based on the Intel x86 architecture.

By 2014, there were an estimated 1.2 million official Arduino boards in use, and possibly an equal number of Chinese counterfeit copies claiming “Made in Italy” (Orsini 2014). By March 2015, there was a split among the five original cofounders, with Smart Projects SRL ceasing royalty payments and a new CEO, Federico Musto, renaming the company Arduino SRL (Banzi 2015). The trademark for Arduino in Italy was granted in 2010, and funding for manufacturing had been personally assumed by Gianluca Martino and Daniela Antonietti at Smart Projects, with much competition from counterfeits (Williams 2015). Courts in the United States, Switzerland and Italy have sided with Smart Projects (now Arduino SRL) on their use of trademarks. In May 2015, Arduino LLC announced that Arduino-branded boards would be manufactured outside of Italy, by Adafruit in the United States (Senese 2015). Further, in June 2015, Arduino LLC launched the Genuino brand to be manufactured by a longtime partner, Seeed Studio in Shenzhen, for distribution in China (Dougherty 2015).

By 2008, the maker movement had been validated as a “big idea” that had gained recognition over the preceding three years.1020 Not only were Make and Craft magazines successful, but online sites such as Instructables and Etsy were receiving notice in the business press (O’Reilly 2008).

The definition of “open source hardware” or “open hardware” has continued to be a challenge. By 2008, the practice of designing, sharing, distributing and modifying hardware designs over the Internet by makers was common. Sharing blueprints and sketches to make furniture and machinery predates the open source movement by centuries. The difference in open sourcing hardware was articulated by Eric von Hippel:

Most products are designed in software first. So you’re designing and simulating on the computer, and in the last step you turn it into hardware. If you think of open-source software as an information good, then open-source hardware is also an information good until the very last stage (Greene 2008).

Practically, debates on the definition center on whether the CAD files are or are not available under a Creative Commons license. For microcontroller boards, Arduino set the pace with CAD files licensed under Creative Commons, software licensed as open source, and identity protected by trademarking. Single-board computers have not been so cleanly defined.1021

From 2008, the BeagleBoard community has been following the spirit of open sourcing. The BeagleBoard was introduced as a “low-cost, fan-less single-board computer based on Texas Instruments' OMAP35x device family, with all of the expandability of today's desktop machines, but without the bulk, expense, or noise”.1022 Texas Instruments was encouraging “open source as a means to drive innovation, ultimately enabling our customers to create market-leading devices”.1023 The company funded a “small dreams” project to fabricate a prototype printed circuit board and host a web site, at the request of employees Jason Kridner and Gerald Coley, who were volunteering their time.1024 An evaluation board was backed by Digi-Key Electronics, and manufactured under contract by CircuitCo.1025 The goal was not to make the BeagleBoard a consumer product, but to enable makers to experiment with building an embedded system that might later be put into production by a manufacturer that Texas Instruments supplies.1026 Reference manuals and hardware documentation were provided first on the beagleboard.org web site under a Creative Commons license, and then on Github.1027 In the five years up to 2013, four generations of BeagleBoards were released (Erives 2013).

For a while, two major initiatives took different stances on open sourcing and hardware: the Open Source Hardware Association at oshwa.org, and the Open Hardware project at openhardware.org, led by Bruce Perens.

The Open Source Hardware Association originated in early 2010, with a Creative Commons fellow trying to turn a project of open source hardware modules into a company.1028 To share the questions this raised, an “Opening Hardware” workshop was convened in New York City in March 2010. The timing coincided with a major Arduino meeting in New York City, bringing together many stakeholders.1029 The norms of practice in open source hardware were discussed, leading to the eventual publishing of the Open Source Hardware (OSHW) Definition 0.1.

Iteration on the OSHW Definition continued through the first Open Hardware Summit in September 2010. The OSHW Definition 1.0 was released in February 2011.

Open-source hardware means sharing the files needed to build *and* modify your hardware. As the open-source hardware definition explains, that means the version of the files that you would prefer for making changes to the design, not an intermediate or obfuscated version. For mechanical stuff, this means the original CAD files. For circuit boards, it’s the original schematic and board layout files (Mellis 2012).

In April 2011, a community mark was selected as the symbol of abidance by the OSHW Definition. The gear logo unfortunately led to a trademark infringement suit by the Open Source Initiative (OSI) in August 2012, who were concerned about confusion amongst consumers (A. Shah 2012). In April 2013, the U.S. Patent and Trademark Office rejected an application for the trademark of “Open Source Hardware” as it was “merely descriptive” and not distinctive (R. Wilson 2013). In September 2015, the OSHWA reiterated its pursuit of a certification program, despite issues in licensing (Weinberg 2015).

In September 2007, Bruce Perens reactivated the openhardware.org web site.1030 The original 1998 vision was for an “Open Hardware Certification Program” as a self-certification by hardware manufacturers.1031 While there was little interest in self-certification in the early 2000s, the emergence of several projects and companies (including Arduino, Adafruit and Sparkfun) using the label of “open source” led Perens to host a placeholder page until July 2011, when a wiki for an Open Hardware Catalog was put into place.1032 Two issues of Open Hardware Journal were published, in November 2011 and February 2012. In the process of encouraging open hardware, Perens found that his activities may have actually worsened the freedoms1033 that he was promoting:

Open Hardware [is] backwards in a way. [….] Patents apply to hardware designs, but most Open Hardware designers never pursue a patent on their designs. What then do they license to others?

It turns out that we have a group of people at CERN, and one of my favorite lawyers and Yahoo, and even me, trying to add restrictions to something that is, for the most part, already in the public domain. And it came to me that this was backwards, and that we could be working against our own interest that way. [….]

The problem is that when we start licensing things that are actually in the public domain, we create norms that the courts take seriously. And they start enforcing licenses on things that could not be licensed before. We really can write new law when what we do gets to a court case, and we want to be careful what law that is. If we were responsible for taking hardware designs from public domain to copyrighted status, we'd be shooting ourselves in the foot.

So, for a while I was uncomfortable with my own Open Hardware evangelism. Was I doing the right thing? I think I've worked out the right path now and will be warning the community about this issue. […]

We also have a bunch of people who use “CC BY-NC” licenses on their designs and then call it Open Source Hardware! Funny how eager they are to call it “Open Source” and then they don't even follow the rules of Open Source. Open Source includes the right to use in any way. If it's “no commercial use allowed” like CC BY-NC, it's not Open Source (Perens 2014).

By February 2014, the openhardware.org wiki had been removed and replaced by a placeholder web page.

In 2015, Bruce Perens reiterated his legal interpretation that open source licenses work for software copyrights, but hardware designers should not expect protections unless the work has been patented.

Open Hardware licenses don’t work. So go ahead and make Open Hardware, but be aware of the fact that it’s essentially public-domain. Making the licenses work would be worse, because we’d also lose the right to implement designs we read about, etc. So, keep in mind that no matter what license you put on a schematic, copyright does not protect it and anyone can manufacture it with impunity unless you have a patent (Perens 2015).

For a party that owns an entire copyright for software, Perens advised that he continues to be “a big fan of dual-licensing, using one of the more restrictive Free Software licenses like the Affero GPL 3 and a commercial-license for those who would rather pay than be open”.

In January 2005, researchers in the fashion industry conducted a conference at the Lear Center at USC called “Ready to Share”. It inquired into whether technological developments – digitization, and cheap and easy replicability (as demonstrated by the Creative Commons) – presented a compelling model for creative industries to follow. While open sourcing in software and hardware was seen as enabling some forms of “crowdsourcing”, the fashion industry has long embraced sampling, appropriation and borrowed inspiration.

The scope of works of applied art eligible for copyright is described as rather limited. Works of art (e.g. music, sculpture) can be defined as property, and protected by copyright. Useful articles (e.g. perfume, culinary creations) are free to copy as utilitarian, and not protected under intellectual property law. Figure B.1 reveals a secondary dimension of tangibility: ideas and/or digital manifestations versus physical fixed expressions. Fashion design is utilitarian, like open source code, but is expressed in a fixed physical form.

Figure B.1 Fashion design produces apparel as "useful articles" in physical fixed expressions (from Blakley 2010a)

The open, participatory culture on the Internet and in digital media has been theorized through Claude Lévi-Strauss's notion of a “recombinant creative process as bricolage, a concept that refers to the constant mixing and morphing of incongruous 'found' elements into a new synthesis” (Bollier and Racine 2006). Fashion is seen as having a distinctive “ecology of creativity”, constantly expressing shifting cultural moods, social demographics and personal identities.

The ecology of creativity in fashion features an open design commons, limited copyright protection, a focus on marketing and branding, and competitive markets that reward innovation and speed. Intellectual property rights are not unimportant in this regime, to be sure, but neither do they obstruct new sorts of creativity and competition. Businesses still enjoy proprietary advantages -- their brand name and reputation -- but no one is allowed to privatize and lock up design itself. Fashion recognizes that pleasing a diverse, constantly changing consumer base in a timely way is the key to a profitable bottom line, and that staying one step ahead of fickle style trends that last months, not years, is imperative to success (Bollier and Racine 2006, 11).

Originality in fashion is built around “an ethic of homage, the respectful referencing and imitation of other people's creativity”, with talent framed within a recognized lineage of tradition. Separating “imitation” from “originality” is a challenge for copyright law. If a “derivative” rendition attracts an independent following, the value of “originality” diminishes. While a counterfeit – a garment that “falsely bears the label of another designer even though no license has been paid” – is legally prosecuted, a knockoff – one that “may be almost identical to a brand-name dress, but it does not purport to be anything but what it is” – has been embraced, aided by cheaper production technologies, faster logistics and shorter fashion cycles. Elite designers can charge a premium of perceived superiority and “originality”, while imitators cater to mid-market and lower-tier consumers who are not customers of elite brands.

While fashion may borrow from art and vintage styles, its interdependence with culture has also led it to absorb street fashion from urban hip-hop pioneers. The culture of appropriating, modifying and sharing materials over the Internet is seen to resemble that of fashion: both innovators and imitators draw on the building blocks of the past through bricolage.

Copyright has not generally been granted to apparel, because articles of clothing are considered “useful articles” rather than works of art. Design patents may be granted for ornamental designs, but clothing rarely meets the criteria of novelty and nonobviousness.

Fashion … challenges the idea often reflexively accepted by policymakers and courts that "more rights" automatically ensure "more creativity" and less rights will choke it. In the fashion industry, the absence of rights actually may feed the creative process. Fashion designers are free to borrow, imitate, revive, recombine, transform and share design elements without paying royalties or worrying about infringing intellectual property rights. Of course, fashion designers are not the only creators who draw on previous works in order to create (Cox and Jenkins 2006, 17).

The dominant business model in fashion is a counterexample that challenges other creative industries that rely on preventing unauthorized or unpaid uses of content.

The music industry has been aggressive in enforcing intellectual property law that is supposed to encourage innovation, prevent theft and reward artists. However, it is possible that an innovative musician could be delayed from sharing his work, and forced to make it more derivative and less original.

In the fashion industry, sampling, derivation and reappropriation all are accepted and common forms of creative innovation. Indeed, the creative process today is almost wholly reliant on forms of reuse and has deftly avoided the kind of fracas the music industry faces over intellectual property protections. However, there still are powerful institutions that help navigate the murky waters that separate legitimate influence from theft. Without the "thick" copyright protection afforded to the music industry, fashion depends more heavily on social regulation and a primitive but highly functional watchdog – shame (Sinnreich and Gluck 2006, 6).

In the music industry, the high cost of doing business and low success rates have made an industry with concentrated ownership structures and vertically integrated business organizations risk-averse. The fashion industry has resisted corporate consolidation on the same scale.

The elevation of fashion design to an art form is partially based on the lack of qualification for copyright protection (Blakley 2010b). Whereas music, film, photography, writing, sculpting and graphic design can be copyrighted as art, fashion design can incorporate elements of peers' creative works to enable greater creative possibilities and accelerate innovation.

The Innovative Design Protection and Piracy Prevention Act was introduced into the U.S. Congress in August 2010, and died when it was not enacted.1034 This bill proposed to extend copyright protection to fashion designs for three years. Previous bills had been introduced with the support of the Council of Fashion Designers of America, and the 2010 bill had the additional support of the larger American Apparel and Footwear Association.1035 The bill was criticized as hurting the fashion industry more than helping it.

Historically, fashion designers have been denied copyright protection because the courts decided long ago that utilitarian articles should not be protected by copyright. Otherwise, a handful of designers would own the seminal building blocks of our clothing. Every time a new blouse would be made, licensing fees would need to be paid to the supposed originator of that particular sleeve or collar.

Although this bill tries to get around that problem by making the overall design, not elements of the design, protectable, once any design is owned by someone, it has a chilling effect on other designers who intend to tap into the same trend. Supporters of the bill say the copyright period for fashion designs would only be three years…but three years is an eternity in the fast-changing world of global fashion. Now that this final version of the bill has eliminated a searchable registry of protected designs, I’m not sure how designers will be able to figure out what they are not allowed to make. And according to law professors Kal Raustiala and Chris Sprigman, manufacturers and retailers could also be held liable for any copies they sold (Blakley 2010b).

The Japanese Design Law covers apparel, but only if no identical or similar design existed before. The EU Community Design System covers apparel with a less stringent novelty standard, but it is easy to make a small change to a registered design and claim it as new. In Canada, copyright in works of artistic craftsmanship extends to finished useful articles only if fewer than fifty copies are made (Daogoo 2012). The Innovative Design Protection Act of 2012 passed out of the Senate Judiciary Committee but was not enacted, and had yet to be reintroduced in the 113th Congress.1036

The fashion industry could be portrayed as one where open sourcing as a behaviour has been an accepted way of doing business for decades. Legislation enacting intellectual property protection could see private sourcing introduced as a legality. The enforceability of copyright on apparel would probably lead to precedent-setting cases working through the courts for many years.

In August 2013, full open-sourcing in microprocessors would see the OpenPower Consortium founded by IBM, Google, Nvidia, Tyan and Mellanox (King 2013). In comparison to Power.org founded in 2004, the consortium would have “full access not only to IBM CPUs, but also to the entire gamut of Power-related hardware and software IP. Additionally, Consortium members [are] free to choose who they like, including IBM, to manufacture the customized Power chips they develop”.1037

These evolving contexts in computer hardware and fashion design illustrate that open sourcing is practical outside of the domain of software. Legal contexts, varying by jurisdiction, can be intricate and discouraging to those not trained in law. The opportunity to accelerate innovation through open sourcing has been recognized, however. Enterprises and individuals who are diligent may find their business contexts less constrained than their thinking.

B.5.5 By 2006, research on (commons-based) peer production crossed over from academia to popularity

The rise of the Internet as an everyday phenomenon was evident by 2006. Amongst G7 countries, the percentage of individuals using the Internet had risen from a range of 4% to 30% in 1998, to a range of 38% to 72% in 2006.1038 In the leading countries, Finland and Korea, use of the Internet had surpassed 65% of individuals by 2003, reaching almost 80% by 2006. The general trend towards adoption of the Internet is shown in Figure B.2.

Figure B.2 Percentage of individuals using the Internet (ITU)

Increased use of the Internet enabled the rise of open sourcing, not only with infrastructural projects such as the Apache server and Linux operating systems, but also participatory communities such as Wikipedia. The phenomenon was popularized as peer production in 2006, with the books by Donald Tapscott and Yochai Benkler.

The ideas of open sourcing and peer production had been preceded, for some decades, by leading thinkers who saw a digital information revolution ahead.

From 1985, the WELL (Whole Earth 'Lectronic Link) was one of the original online communities. Participation in persistent discussions changed the way people could communicate at a distance. The value of, and distinctions between, data, information and knowledge began to become apparent.

Information wants to be free because it has become so cheap to distribute, copy, and recombine -- too cheap to meter. It wants to be expensive because it can be immeasurably valuable to the recipient. That tension will not go away (Brand 1989, 202).

TCP/IP -- the protocol standards for the Internet -- would be declared the standard for all military computer networking in 1982. The first Interop conference in 1985 started the focus on broader adoption of TCP/IP. The proposal for use of hypertext in the World Wide Web was introduced by Tim Berners-Lee in 1989. IBM would promote a campaign on e-business in 1996.

In 1996, the publication of Co-opetition introduced a game-theoretic view of business, where strategies of cooperation could be recognized “as big a factor in business success as competition” (Brandenburger and Nalebuff 1996, 264). Co-opetition sees that “there are both win-win and win-lose elements in relationships” with customers, suppliers, complementers and competitors. A framework of Players, Added Values, Rules and Tactics was proposed as a way to link to a bigger game. In the cases presented, defeating a competitor was sometimes seen as the best strategy, but at other times the best strategy had multiple winners.

Also in 1996, The Death of Competition extended Gregory Bateson's ideas on coevolution to describe business ecosystems (Moore 1996, 9–21). Beyond a core business is (i) an extended enterprise including direct customers; customers of the customers; suppliers of the suppliers; standards bodies; and suppliers of complementary products and services; and (ii) the broader stakeholders in investors and owners; trade associations and labor unions; government agencies and other regulatory organizations; and competing organizations.1039 The premises of the ecosystem strategy include: (i) the collapse of traditional industries changes the way of competing, from molding new products to molding new ecosystems; (ii) new communities exist to bring innovations – as entirely new outcomes – to customers; (iii) the scope of what is contained in the ecosystem – from comprehensive to narrow – is a central strategic decision; and (iv) competitive advantage comes from knowing when and how to build ecosystems. Development of the ecosystem was seen in four stages: (i) pioneering; (ii) expansion; (iii) authority; and (iv) renewal. The automobile industry, with American manufacturers challenged by a rising alternative ecosystem from Japan in the 1970s, was seen as orienting towards new open ecosystems.1040

In 1999, Information Rules took the view that “The Information Economy” did not need a new set of principles to guide business strategy and public policy, but instead required a deeper reading of “the literature on differential pricing, bundling, signaling, licensing, lock-in [and] network economics” (Shapiro and Varian 1999, x). With the cost of producing information high and the cost of reproducing it low, pricing could be personalized to the individual, or based on group identity through third-degree price discrimination. Versioning could be a strategy to offer information products across a variety of market segments, either by tailoring to different customers, or by designing to accentuate the needs of different customers so that they self-select the version most aligned with the value they expect to receive. Rights management of content published with digital technology could take advantage of lower distribution costs either (i) to give away free samples while charging for the convenience of repeat viewing, selling similar but not identical products, or selling complements; or (ii) to maximize the value of intellectual property, rather than just protecting it for the sake of protection. Recognizing lock-in enables (i) buyers to bargain hard during initial negotiations, emphasizing their influence as a customer, and (ii) sellers to apply key principles of investing in an installed customer base, entrenching customers so that they become more committed over time, and leveraging value by selling complementary products, and access to those customers, to other suppliers. The economies of networks in the information economy have displaced the economies of scale of the industrial economy, leading to positive feedback with a trade-off of openness versus control. Cooperation and compatibility in network markets can see the game change through open standards, as alliances are assembled in formal bodies. These changes would have an impact on information policy, both in companies and in the government authorities that regulate them.

In 2003, Open Innovation was presented as a shift from the paradigm of closed innovation saying that “successful innovation requires control” to a new approach “that assumes that firms can and should use external ideas as well as internal ideas, and internal and external paths to market, as firms look to advance their technology” (Chesbrough 2003, xx–xxiv). The achievements and limits of closed innovation at Xerox PARC (1970 to 1986) and IBM (1945 to 1980) were compared to open innovation with IBM (after 1993 with Lou Gerstner and the rise of the Internet), Intel Capital (investing in its close suppliers), and Lucent Ventures Group (commercializing prior Bell Labs technologies beyond the needs of the core Lucent business). The Open Innovation paradigm leads to a business model where the firm should become both an active buyer and seller of intellectual property.1041

In 2004, The Success of Open Source analyzed the rise of this particular kind of software as an experiment around a distinctive notion of property:

Open source is an experiment in building a political economy -- that is a system of sustainable value creation and a set of governance mechanisms. In this case, it is a governance system that holds together a community of producers around this counterintuitive notion of property rights as distribution. It is also a political economy that taps into a broad range of human motivations and relies on a creative and evolving set of organizational structures to coordinate behaviour. What would a broader version of this political economy look like? (Weber 2004, 1)

The book first traces the history of open source through Unix and the origins of the Internet, through proliferating standards, and the founding of the Free Software Foundation. The invention by Linus Torvalds and the rise of Linux are described with two ideal types for the division of labour: (i) the hierarchy, described by Harlan Mills as breaking work into discrete pieces managed by separate teams, and by Frederick Brooks as conceptual integrity, with a master plan separating architecture from implementation;1042 and (ii) the open source process, where the key element is “voluntary participation and voluntary selection of tasks”.1043 Open source matured as a model of production following a crisis in incompatible Unix forks; the evolution of Linux 1991-1994; the transformation of NCSA Mosaic in 1994 to the Apache server in 1995, and IBM's role in the founding of the Apache Group in 1998; and the Red Hat and VA Linux IPOs in 1999.

The microfoundations of open source were explained in four ways: (i) individual motivations; (ii) the economic logic of the collective good; (iii) coordination, and its sustainment; and (iv) complexity managed with technology and governance institutions. The macro-organization of open source was explained with (i) the coordination of contributions of specialized knowledge on a focal point with neither authoritative command nor a price mechanism, i.e. through individual incentives, cultural norms and leadership practices; and (ii) complexity managed through technical design, sanctioning, license as social structure, and formal governance structures. Business models with open source through 2000 were seen as experiments,1044 and legal structures were still wrangling with copyright law, the GPL and the DMCA.

With the case studies mostly ending by 2000,1045 the open source process was hypothesized to have implications for (i) rethinking property as oriented more towards stewardship or guardianship rather than exclusion; (ii) organizing for distributed innovation, rather than just division of labour; (iii) the commons in economic and social life, with the potential for a deadweight loss from a “tragedy of the anticommons”; (iv) development in international economic geography, potentially leading to even more drastic inequality; (v) power shifts with changes in relational power; and (vi) how hierarchically structured organizations will manage relationships with networks.

In 2005, The World is Flat popularized the association between globalization and the rise of the Internet (Friedman 2005). The idea of a level playing field was described in terms of ten flatteners. Open sourcing, as a behaviour, was threaded through at least five of the flatteners -- e.g. (ii) Netscape, (iii) workflow software, (iv) uploading, (ix) informing, and (x) “the steroids” of digital, mobile, personal and virtual -- with the phenomenon embedded in the larger context of world changes in society and political economy.1046 As one of the best-selling books amongst business readers in the decade, The World is Flat represents a milestone in bringing the average household to a recognition of how much the world had changed over the prior decade.

By 2006, insight into the open sourcing phenomenon had accumulated into two works: the more academic The Wealth of Networks, speaking to shapers of policy, and the popularized book Wikinomics, targeted at a broad audience. Both of these publications, by respected researchers, represent an accumulation of the changes associated with open sourcing as a phenomenon. Insights into history, and potential changes in practices and institutions, were described.

The 2006 book The Wealth of Networks is subtitled How Social Production Transforms Markets and Freedom. The title can wryly be compared to The Wealth of Nations, published by Adam Smith in 1776. The work was both private sourcing, in its parallel release under the author's copyright as a hardcover edition by Yale University Press, and open sourcing, in its online version and wiki licensed under a Creative Commons BY-NC-SA license. Social production and exchange, enabled through (i) economies centered on information production, and (ii) communications interconnected pervasively (i.e. the Internet), was seen as having the promise “to play a much larger role, alongside property- and market-based production” than before (Benkler 2006, 3).

With the premise that information production is not as dependent on property rights and markets as the obsession with “intellectual property” might suggest, nine ideal-type information production strategies were described, as in Table B.3.

Table B.3 Ideal Type Information Production Strategies, from (Benkler 2006)
| Cost Minimization / Benefit Acquisition | Public Domain | Intrafirm | Barter / Sharing |
| Rights-based exclusion (make money by exercising exclusive rights) | Romantic Maximizers (authors, composers sell to publishers) | Mickey (reuses inventory for derivative works) | RCA (companies hold blocking patents, in pools) |
| Nonexclusion - Market (make money from information production, not from exercising exclusive rights) | Scholarly Lawyers (write articles to get clients; bands give music free and charge for performances; software customization, advice, training) | Know-how (firms that have cheaper or better production processes due to research, lower cost or higher quality) | Learning Networks (share information with similar organizations, e.g. newswires, professional engineering societies) |
| Nonexclusion - Nonmarket | Joe Einstein (give away information for free, in return for status, reputation or other motivations) | Los Alamos (share in-house information, public goods on government funding) | Limited sharing networks (release paper to selected peers for review before publication) |

IBM is described as “an excellent example of a business strategy based on nonexclusivity”. While it obtained the largest number of patents from 1993 to 2004, its revenues from “intellectual property” transfer, licensing and royalties declined from 2000 to 2003, at the same time that “Linux-related services” grew at a higher rate (Benkler 2006, 46–47). Open sourcing while private sourcing is situated at “the interface of social production and market-based businesses”:

IBM is effectively relying for its inputs on a loosely defined cloud of people who are engaged in productive social relations. It is making the judgment that the probability that a sufficiently good product will emerge out of this cloud is high enough that it can undertake a contractual obligation to its clients, even though no one in the cloud is specifically contractually committed to it to produce the specific inputs the firm needs in the timeframe it needs it. [….]

The presence of a formalized enforceable contract, for outputs in which the supplier can claim and transfer a property right, may change the probability of the desired outcome, but not the fact that in entering its own contract with its clients, the company is making a prediction about the required availability of necessary inputs in time. When the company turns instead to the cloud of social production for its inputs, it is making a similar prediction. [….]

In the case of companies like IBM or Red Hat, this means, at least partly, paying employees to participate in the open source development projects. But managing this relationship is tricky. The firms must do so without seeking to, or even seeming to seek to, take over the project; for to take over the project in order to steer it more "predictably" toward the firm's needs is to kill the goose that lays the golden eggs (Benkler 2006, 124).

This positioning has led to IBM contributing patents (e.g. around Linux) to the Free Software Foundation, or openly licensing with the software development community to extend a patent shield. With its size, IBM has “had to structure their relationship to the peer-production processes that they co-exist with in a helpful and non-threatening way”. This has often meant “support without attempting to assume 'leadership' of the project” (Benkler 2006, 125). With other companies (e.g. Meetup, del.icio.us, Flickr), the emergence of social production has meant “focusing on serving the demand of active users for platforms and tools that are much more loosely designed, late-binding – that is, optimized only at the moment of use and not in advance – variable in their uses, and oriented toward providing users with new, flexible platforms for relationships” (Benkler 2006, 126).

Looking forward to human development, the nonmarket, nonproprietary modalities are expected to change the industrial organization of related information industries in the sectors of (i) software; (ii) scientific publication; (iii) agricultural biotech; and (iv) biomedicine and health.1047 New commons-based approaches for development require policy-making institutions (e.g. patent offices, international intellectual property organizations) to evolve.

These changes are occurring as social ties are affected: not just a thickening of preexisting relations with friends, family and neighbours, but also looser relationships in virtual communities.1048

The institutional ecology of information production and exchange in the digital economy includes many regulatory and policy elements across a variety of industries. The basic functions in mediated human communications can be mapped in physical, logical and content layers, as in Table B.4.

Table B.4 Overview of the Institutional Ecology (Benkler 2006)
| Layer | Enclosure | Openness |
| Physical: Transport | Broadband (with FCC); DMCA ISP liability; municipal broadband barred by states | Open wireless networks; municipal broadband initiatives |
| Physical: Devices | CBDPTA (regulated “trusted systems”); operator-controlled mobile phones | Standardization; fiercely competitive market in commodity components |
| Logical: Transmission protocols | Privatized DNS/ICANN | TCP/IP; IETF; p2p networks |
| Logical: Software | DMCA anticircumvention; proprietary OS; Web browsers; software patents | Free software; W3C; P2P software widely used; social acceptability of hacking copy protection |
| Content | Copyright expansion; contractual enclosure; trademark dilution; database protection; linking and trespass to chattels; international “harmonization” to maximal exclusive rights regime | Increased sharing practices and licensing; musicians distribute freely; Creative Commons publishing; social disdain for copyright; jurisdictional arbitrage; developing nations with free information ecology |

The physical layer refers to the material things used to connect human beings to each other. The logical layer includes algorithms, standards and ways of translating from human meaning to machine language and back. The content layer is humanly understandable statements and utterances. The policy debate in each layer challenges whether sufficient institutional space is left for the social-economic practices of networked information production to emerge (Benkler 2006, 391–396). Enclosure is associated with private sourcing behaviour; openness is associated with open sourcing behaviour.

The 2006 book Wikinomics was subtitled How Mass Collaboration Changes Everything (Tapscott and Williams 2006). It was written in parallel with a private $4 million research program in 2004-2005, exploring “how new technology and collaborative models change business designs and competitive dynamics”.1049 Wikinomics is described as a “new mode of innovation and value creation … called “peer production” or peering -- which describes what happens when masses of people and firms collaborate openly to drive innovation and growth in their industries”.1050

The principles of Wikinomics included: (i) being open (where traditional companies had been closed to networking, sharing, and encouraging self-organization), particularly with standards; (ii) peering, as with Linux and Wikipedia; (iii) sharing, of intellectual property, computing power, bandwidth, content, and scientific knowledge; and (iv) acting globally, not just thinking globally, but also eliminating geographic redundancies with planetary capabilities.

The new mode of production was characterized by Wikipedia, and by IBM with the Apache server and then Linux. Tapscott and Williams describe The World is Flat as “otherwise helpful”, but criticize Thomas Friedman as “not seeing the forest for the trees” (Tapscott and Williams 2006, 91). They see public goods (e.g. open source software) and business as compatible, as “without the commons there could be no private enterprise”.1051 At IBM, Joel Cawley sees that shared infrastructure “does not decrease opportunities to create differentiated value, it increases them”. The key benefits of peer production for business are listed as (i) harnessing external talent; (ii) keeping up with users; (iii) boosting demand for complementary offerings; (iv) reducing costs; (v) shifting the locus of competition; (vi) taking the friction out of collaboration; and (vii) developing social capital (Tapscott and Williams 2006, 93–95).

For managers, Wikinomics design principles are prescribed: (i) taking cues from your lead users; (ii) building critical mass; (iii) supplying an infrastructure for collaboration; (iv) taking your time to get the structures and governance right; (v) making sure all participants can harvest some value; (vi) abiding by community norms; (vii) letting the process evolve; and (viii) honing your collaborative mind (Tapscott and Williams 2006, 286–289).

By 2007, the idea of open sourcing had become a mainstream topic in businesses of all scales. Other publications would deepen the histories of successes in the software industry and postulate parallel possibilities in other domains.1052 Questions would shift from why, to how.

B.6 Summary: Open sourcing behaviour maturing over a decade

While the focus of this book has been on open sourcing while private sourcing in seven specific cases, the larger trends in the decade 2001-2011 were an inescapable context for IBM. Inside the company, the spirit of “open, collaborative, multidisciplinary, global” came from software development practices that changed the way the business worked as a whole. In a coevolutionary path, the Creative Commons, commons-based peer production, open government data and open source hardware emerged as related ideas that have become part of contemporary society.

