March 31, 2009

Information capture and repackaging as core business

Some time ago, I blogged about the company 23andMe, which specializes in personal genetics by offering DNA testing kits and web-based interactive tools that allow individuals to understand their personal genetic information. The company was founded in 2006 by Linda Avey and Anne Wojcicki, and its investors include Genentech Inc., Google Inc., and New Enterprise Associates. The company recently announced that it has launched a Parkinson’s Disease (PD) initiative in collaboration with the Parkinson’s Institute and Clinical Center and the Michael J. Fox Foundation.

Community-based research
The PD initiative is planning to use an open platform approach where the two partner organizations above will reach out to their networks and encourage PD patients to enroll. The press release states that “the first-of-its-kind program will focus on enrolling 10,000 individuals with Parkinson's disease”, which means that a valuable database is likely to be built from the genetic information. At the same time, one of the major advantages for participating PD patients is that they will have access to an online community where they can meet others in the same situation as themselves. Commercial tests are usually $399 each, but Google co-founder Sergey Brin has made a personal contribution that lowers the price to $25 per participant for the benefit of PD research. The value of the contribution is not explicitly disclosed, but simple arithmetic on a price reduction of $374 per test across 10,000 participants gives an indication of its size (worked out below). Current 23andMe customers who are not PD patients, and who already have their genotypes in the 23andMe database, can also contribute to the initiative by participating as “healthy controls”.
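A minimal back-of-the-envelope estimate, using only the figures quoted above (the actual value of the contribution may of course differ):

($399 − $25) × 10,000 participants = $374 × 10,000 = $3,740,000, i.e. roughly $3.7 million.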

Intangibles as value-drivers in knowledge-based business
Knowledge-based industries, such as biotechnology, enable interesting novel structures due to the nature of intangibles. The PD initiative provides a prime example of a structure where information is the most important underlying object of transfer, although it is repackaged in many forms. First, the genetic information is inaccessible (physically present in the nuclei of cells) and hence without direct value to the PD patient. By enrolling in the initiative, that very same patient transfers (by physical means: DNA kit + post office as distribution channel) his genetic information (physically present as saliva), at a cost of $25, to the company’s database, where the information is presented back (by virtual means: analyzed data and predictions in a web browser) to the PD patient. The patient may now choose whether he would like to share or compare his data, or parts of it, with peers (by virtual means: presented as personal data) within a community of others in the same situation. At the same time, the data supplied by the PD patient contributes to a database that will increase in value for the company (by virtual means: analyzed data becomes building blocks for bioinformatic/computational tools to make discoveries) and for the participating institutions* (by virtual means: genomic associations that may lead to clinical studies - which may in turn provide further data that builds the database even more).

Perhaps even more interesting, at least from an IAM/IPM perspective, is the scalability of the model: 23andMe is planning to launch several new communities for other diseases, extracting tremendous value from its current assets and capabilities by segmenting parts of its existing business model (TBMDB).
* I have assumed that the institutions have access to the data for the sake of argument.

Intangible assets and property
The scalability of the model makes it very attractive, but what are the underlying assets, capabilities and activities needed to create it? The list of assets can be made long, but at a high level some key assets include a proprietary database of both understood and not-yet-understood data, tools to sequence and analyze genetic information, (probably) cost-efficient outsourced distribution channels for sending out and collecting information, external relational capital in the form of information suppliers / high-end investors / renowned advisors, internal human capital, a customer user base, trademarks claimed through use of the ™ symbol, and software to manage, analyze and “present data back” to the customer. Not to mention policies, agreements, strategies, competitor intelligence, technical know-how, etc., which would require a more in-depth investigation to be identified and described.

IAM/IPM capabilities to control and extract
Some quick searches in Google Patents and Patentlens for 23andMe as assignee did not yield any results. So how does the company maintain control over its value creation and extraction? A number of capabilities and IA/IP strategies probably exist for each of the assets, but if we look at the proprietary database as an example, some strategies possibly include:
• property-based copyright protection associated with the totality of organized data
• technical control by providing proprietary software which (only?) displays data from 23andMe
• market power through a large user-base where the network effects of online communities create higher barriers to entry for other players
• contractual control to ensure that rights to use the information within are maintained (see example below)

Example of contractual control between the customer and the company (17. 23andMe’s Proprietary Rights): “[...] Your saliva, once submitted to and analyzed by us, becomes our property. Any genetic information derived from your saliva remains your information. We retain the rights set forth in the consent form and any additional terms of service.”
“You retain copyright and any other rights you already hold in information and content you create and which you submit, post, or display on or through, the Services. By submitting, posting, or displaying the information and/or content, you give 23andMe a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to reproduce, adapt, modify, translate, publish, publicly perform, publicly display, and distribute any content which you submit, post, or display on or through the Services.”
“You agree that this license includes a right for 23andMe to make such content available to other companies, organizations, or individuals with whom 23andMe has relationships, and to use such content in connection with the provision of those services. [...]”

At the same time, it is interesting to think that a number of IA/IP strategies probably exist to extract even more value from the database, including;
• value capturing strategies to repackage data in the database into new value propositions (e.g. genetic associations to be used in studies, new genetic tools, new communities, database access for external parties, publication opportunities, intellectual objects for transactions or to form new ventures/entities, and so on)
• strategic alliance strategies to both evaluate existing data, and to collect new data for other diseases

Final thoughts
In my previous blog post, I discussed the changing role of patents in a knowledge economy and how companies must incorporate IP strategy into their core. I think that 23andMe provides an interesting example of how both control and value can be maintained efficiently, while managing several simultaneous value recipients, through detailed IA/IP strategies (although I have in this case interpreted the strategies as a case study based on publicly available information). I think that many organizations can learn from this model. IAM/IPM capabilities will need to become key to business strategy for every organization that wishes to be a player in a knowledge-based economy.

Tobias Thornblad
(Follow me on Twitter)

IAM/IPM capability and open innovation will be discussed further by prominent IP thought-leaders during CIP FORUM 2009, 6-9 Sep, Gothenburg, Sweden.

(Link to 23andMe’s blog: the Spittoon)

March 29, 2009

Onlive: Technical and business model innovation

Some time ago we posted some thoughts about Quake Live's business model and its new and inventive way of delivering the value experience to users. Well, it seems that business model has now been extended and altered.

Onlive is a new service enabling a high-octane gaming experience on your low-end notebook, laptop or even TV. Unlike Quake Live, you need a micro console and a controller to play with it. So in that sense it is actually a clear competitor to the existing gaming consoles currently available.

So, by scrutinizing Onlive's business model according to tbmdb.com's business model components, you clearly see that Onlive innovates upon several components. But essentially, the end-user gets the same or an enhanced value experience for less money and in a more flexible manner. However, Onlive needs different skills to manage external relations, create the proper control mechanisms and, especially, create the capabilities and assets needed to have virtually zero downtime on the service, since that has a huge impact on the value receiver - the customer.

One thing is for sure: this is a clear and awaited model for how to treat piracy issues without diminishing the value experience for the customer. That clearly speaks in favour of the adoption of this model of delivering console gaming. But whether it is Nintendo, Microsoft or Sony (PlayStation) who will do it in the long run, or whether perhaps Onlive can claim a sustainable market position, remains to be seen.

Have a look at the promotion video and see what you think. I'm quite sure that I will never want to go back once only a laptop is needed. Further, it is always nice to enable gaming without installation on a system managed by a grumpy ICT staff.




However, it is getting increasingly clear that in the future we will see a consolidation of IP-TV, gaming in the cloud and other similar services. They will probably utilize the same hardware and be provided by a single service provider.

Via Wikinomics; for more great frameworks for understanding business models: tbmdb.com

March 26, 2009

A thought-experiment to test IAM/IPM capabilities in the business arena

Patent examination is discussed in an interview with Bo Heiden (deputy director of CIP) in a recent blog post in IAM magazine by Joff Wild. The topic of the discussion is how to handle the current backlog in patent offices around the world. “Beginning in the 1990s the number of applications [at the USPTO] boomed, with a record 495,095 submissions during 2008. The backlog of unresolved applications has grown apace, increasing by nearly 73 percent between fiscal years 2002 and 2007 to about 730,000.”, at least according to Public Integrity. This backlog means that companies are forced to do business with non-granted patent applications. Heiden argues that this makes the status quo similar to a de facto automatic issuance system, and he therefore raises a provocative thought-experiment of a patent system without patent examination by the patent office. The patent offices would then, in the words of Mr Heiden, be “rubber-stamping authorities which merely certify that all the legalities associated with an application have been complied with”. This provocative stance certainly started some debate, both in the blog comments and in other blogs, where it was also discussed whether this was the same point that Lemley makes in Rational Ignorance at the Patent Office.

Switching the Onus of IP Awareness to the Business Arena
However, my personal interpretation of the discussion was that the whole point of the thought-experiment was to emphasize that companies are currently not accepting their full responsibility when it comes to IP. The status quo of patent prosecution seems, to some extent, to be a closed procedure between administrative bodies and patent departments rather than an integrated core function governing corporate strategy. Hence, an interesting question generated by the thought-experiment is: if the onus were on the business arena to assess inventiveness, industrial applicability and novelty, would right holders feel more of an obligation to bring to the market only patents for those inventions that would truly be determined to be “strong” and “valid”? Moreover, as Joff describes in the blog post, if there were severe penalties in place for those bringing suits based on non-inventive patents, people would remain wary of litigating unless they felt that they had a very strong case.

Patents in a Knowledge Economy
The traditional view of patents, with blocking as their only function, is gradually being replaced in the emerging knowledge economy by a view of patents as structural building blocks. This is especially evident in knowledge-intensive industries such as IT or biotech, where patents serve as an important vehicle to package information into value propositions for transfer. In my perspective, it is important to realize that ownership of information does not automatically increase in value the more that others are excluded. On the contrary, due to the compatibility that often exists between one's claimed information and others' claimed information, collectivization may often increase the total value, although the separate building blocks of information may (or rather: often need to) be proprietary. Investments in, and governance of, intellectual assets and properties therefore have the potential to drive wealth and growth in a creative transformation of R&D into products, ventures, commercial transactions and new markets. Examples of this creative transformation include:
• early-stage and venture incubators,
• technology transfer offices (TTOs),
• spin-out companies,
• standardization platforms,
• open innovation.

Moral Obligation to Establishing IAM/IPM Capabilities
Building, rather than blocking, needs to be recognized as the core function of patents in the knowledge economy to generate wealth and drive growth. This means that IP strategy needs to be fully aligned with corporate strategy (or the other way around) to achieve envisioned business goals. Obviously non-core IP is likely to emerge along the way providing opportunities for further value generation, such as new markets, spin-offs, licenses, etc. As Mr Heiden rightly argues in one of his comments in the blog discussion: “the real challenge for us IP professionals - to be relevant to business not the PTO.”

Furthermore, my prediction is that the required IP awareness in this new economy will generate norms that interpret a not-invented-here mentality as immoral (due to the cost of developing something yourself as opposed to licensing in the technology).

These are certainly interesting times to follow the norm development within the IAM/IPM sphere in many industries.

Tobias Thornblad
(Follow me on Twitter)

Patent Examination in the Knowledge Economy, and other IAM/IPM capability topics, will be discussed further by prominent IP thought-leaders during CIP FORUM 2009, 6-9 Sep, Gothenburg, Sweden.

Recent article about another way to solve the patent backlog: Crowd-sourcing

IPRs and Open Source Licenses - The Increasing importance of IPR Management for Core Business Strategy

An increasing number of talks and articles refer to new unorthodox forms of value creation beyond the hierarchical boundaries of a firm (some of the more popular books are “Here Comes Everybody” and Wikinomics). As is customary with new phenomena, there is no distinct and standardized definition for these trend(s). Some of the common, and at times overlapping, names are “Peer Production”, “Crowdsourcing”, “Open Innovation”, “Open Source”, etc. Articles in the area are rife with the potential benefits and increasing impact of peer or open value creation. Few, however, discuss the importance of IPRs and IPR management.

Control
IPR management has a lot to do with control, but the word “control” has bad, or at least one-sided, connotations to it. For most people, “control” in the context of IPRs signifies blocking others, which is the precise opposite of what is required in peer production. In the world of the hierarchical firm, the production, ownership and utilization rights to IPRs are well established. Within a firm, the rights to IPRs created by its employees are usually clear, governed by standardized employer/employee agreements. Within the boundaries of the firm, represented by a nexus of employer/employee agreements, the creators of intellectual properties accept that full ownership lies with the employer.

Impact on business model
For a company that wishes to tap into the benefits of peer production by issuing a new OS license or using an existing one, the situation becomes very different. Here the control of value creation lies outside the hierarchical boundaries of the firm. Value is added to a project both by internal employees (controllable with standard employer/employee agreements) and by external participants, with whom the only relation often is the one dictated by the OS license. With OS types of licenses, no one entity typically owns the total value created, but everyone shares the same utilization rights. This is a major difference compared to when the company has full ownership of the value created. With full ownership, an actor can do whatever it wishes. With shared ownership, however, different actors all have the same utilization rights as stated by the license. These utilization rights will in turn affect the range of possible business models at hand for a firm, since business models are affected by which utilization rights you have to the object offered to customers. Furthermore, since software copyright is shared among many actors outside of the hierarchy of the firm, changing the license terms becomes difficult. The latter signifies that not only is the range of business models at hand limited today, but they will most likely remain that way in the future, i.e. a question of business sustainability (this paper studies successful methods to achieve long-term sustainability with respect to business models on open source). Although this is a software example, the analogies are still valid for most collaborations where many different actors are involved in the value creation. Special care must therefore be taken whenever a firm wishes to issue or start using open source types of licenses.

Dual Licensing
There are companies that try to circumvent some of the shortcomings of OS licenses by starting a project, releasing it as open source, but also adding proprietary layers to the OS code (with possibilities to use the OS code with proprietary code without the proprietary code falling under the OS terms), and thus regaining the benefits of proprietary code. This is called dual licensing, split licensing or open core (there might be small differences in their exact definitions). In order to increase their level of ownership and utilization rights, these companies require that individual peer producers’ copyrights be handed over to them for the contributed code to become part of the next major software update. If individual peer producers do not agree, their contribution will not be included in the next software update, and since they individually cannot compete with the development efforts of a company, they agree and are content with the level of utilization rights granted to them by the OS license (or they will effectively have forked the development and over time will have to maintain it themselves).

Although dual licensing / open core does provide additional levels of utilization rights (and therefore additional types of possible business models), experience shows that dual licensors’ flexibility in modifying their OS license is also more limited than one would think. Many times, the wish of the dual licensor - the orchestrator of a peer producing community - to change the terms of the community license is not welcomed. Many times, when the dual licensor has gone ahead anyway and changed the terms of the license, the community has forked the project and effectively made it come under a single OS license over time (i.e. the updates developed by the community are often much more frequent and stable than what a company can compete with). Such examples include the popular Joomla (previously Mambo) and NeoOffice (previously OpenOffice), and there are even discussions regarding the possibility of Sun’s MySQL meeting the same fate with the development of Drizzle (e.g. here and here).

To conclude, as collaboration involving multiple actors increases (i.e. peer production / open innovation), strategic management of IPRs becomes increasingly important and should be an integrated part of core company strategy. In addition, since changing licenses involving multiple and often disparate actors is by nature hard, strategic IPR management matters even more, as it impacts not only current but also future business options.

....................................
The author of this blog article is Sina Keshavarzi. Sina is currently finalizing a master's in Intellectual Capital Management (ICM) at Chalmers University of Technology (CIP), as well as working with IC matters in the agricultural biotech industry.

Other IAM/IPM capability topics will be discussed further by prominent IP thought-leaders during CIP FORUM 2009, 6-9 Sep, Gothenburg, Sweden

March 25, 2009

Intangitopia licenses content under Creative Commons


We at Intangitopia are strong believers in the idea that you can create more by collaborating than by blocking others. That is why we have decided to license the content on this website under the Creative Commons Attribution-Noncommercial 2.5 Sweden license.


We believe that by codifying the terms under which we provide our content, we can also encourage our readers to take a more active part in sharing and developing our thoughts, as well as contributing their own. Since most of the content on this website consists of ideas with which we encourage debate and collaborative thinking, we also believe that this should be reflected in how we make the content accessible.

Jurisdiction
The license we have chosen is the Swedish version of the Creative Commons Attribution-Noncommercial license. The reason why we have chosen the Swedish version is the simplest one: we are currently based in Sweden. The difference from the original license is, however, small and no substantial parts differ. Here is a marked-up copy of the original license with explanations of the differences.

Attribution and noncommercial
We have decided upon the license which requires the licensee to attribute us as individual authors and, where appropriate, also refer back to Intangitopia. We have also decided to use the limitation that the licensed content can only be used for non-commercial purposes. But as always, we can waive this at any time upon request from a licensee.

This license makes it possible for a licensee not only to use the content but also to make derivative works from it, i.e. improve our thoughts, models, etc. Continued use, even of only parts of the content, must still be attributed back to us.

Please feel free to give any feedback on this choice of licensing model, or let us know if you have any requests for use of the material in any other way than provided by this license.

Johan Örneblad

March 22, 2009

IAM/IPM - IP Ethics - Market Behavior

This is the third blog post in an ongoing investigation into how IAM/IPM capabilities apply to IP ethics.
In earlier posts I have looked into:

1. IPM/IAM - IP Ethics: Innovation Management

2. IPM/IAM - IP Ethics: Corporate Profile

This post will look more into IP in relation to market behavior. There are some market practices that almost anyone would consider unethical, such as litigating against academic institutions or effectively blocking societal benefits for personal (financial) gain. Then there are the more uncertain cases, such as gaining a control position by making technology incompatible with that of other technology providers, or solely focusing on patent licensing without practicing the invention oneself. In the latter case, some may argue that this is not unethical as long as it is not the only revenue stream of a company (i.e. a mix of both patent licenses and the practicing of some inventions), others may feel that it only becomes unethical when patents are acquired and used solely for the purpose of litigating, whereas still others may just perceive all of the above as innovative “new” business models.

Access exclusivity: The ethical dimensions of patenting upstream technologies per se are often discussed in biotech (e.g. DNA patents, etc.). However, as most people would argue, the patent document protecting a biological invention is obviously not unethical in itself. The ethical considerations are (or should be) rather determined by how the intellectual property right is used. This brings the focus to how a value proposition is offered with sustained control. Is it ethical to let only one entity have access to the invention through an exclusive license? What if the exclusivity is only geographical? Is it more ethical to offer non-exclusive access controlled by a pricing mechanism? Of course, most people would argue that from a societal perspective, the more open the access and the lower the price, the better - but what if this means that the return on investment for the technology is never reached? The time and money invested (possibly external investments from public sources) are not recouped, diminishing the value for the technology inventor, its shareholders and investors, and potentially also affecting society.

Dependence vs loyalty: Many of today’s newer applications create utilities for their users (e.g. Facebook, MySpace, some mobile phone contracts) that increase as more people join, which in turn makes users persuade their friends to join as well. This is commonly known as the network effect. Common strategies to capture the value from this mechanism include lock-in strategies, which create a dependence on a certain provider due to substantial switching costs. While this may create lucrative revenue models, there could be a point, for enabling technologies, where the incompatibility between different technologies creates societal inefficiency problems. Participation in standardization efforts (in particular open standards) may therefore provide both an opportunity to design a future market and a way to ensure that transaction costs for society are kept to a minimum.
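As a rough illustration of why the network effect is so powerful (borrowing Metcalfe’s classic rule of thumb here, rather than anything specific to the companies above): a network of n users has n(n−1)/2 possible pairwise connections, so a community growing from 1,000 to 10,000 users goes from roughly 500,000 to roughly 50 million potential connections - the perceived value grows much faster than the user count, and so do the switching costs.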

Transparency: A knowledge-based business that is run efficiently will, unlike traditional industries where production and marketing of physical goods were in focus, have an inventory of valuable objects which do not show up on the financial statements. These objects, commonly known as intellectual assets (IA), may be what distinguishes one company from another, but will still be difficult for anyone external to the company, or for stakeholders such as shareholders, to know about. A US Supreme Court decision (1976) stated that under federal securities law and court decisions, a public company has an obligation to disclose a fact in its filings if "there is a substantial likelihood that a reasonable shareholder would consider it important". This may rightly be considered somewhat out of context (as the case is focused on negative effects), but the point that I am trying to make is that a lack of transparency is a matter that is likely to be taken seriously by the public, and should therefore be managed in a controlled manner. An efficient IA management system enables IA reporting where objects (R&D assets, processes, surprising research results, technology know-how, etc.) can be packaged and managed into a state where it can be decided whether certain assets should be propertized into patents, plant variety rights, trademarks, copyrights, design protection or similar. This is important since the balance between what to disclose to the public (and render non-patentable) and what to keep secret obviously needs to be fully aligned with the IP strategy of the company (meaning that assets should often be registered as intellectual property rights first, unless a defensive publication strategy is being pursued).

The latter is not a clear-cut ethics issue, but I think that it is still an interesting perspective from which to look at IAM, since the focus of IAM discussions is usually on how IAM can be used for tax savings, expanding the portfolio, etc. As I have mentioned in previous posts, this list should be viewed as some examples of cases I find interesting in relation to ethics rather than a holistic or comprehensive framework.

Tobias Thornblad
(follow me on: Twitter)

IP and Ethics, in relation to IAM/IPM capabilities, will be discussed more in-depth during the CIP FORUM 2009 event (6-9 Sep), where my fellow Intangitopians and I will actively participate.

March 19, 2009

Stimulus Package IP: Health IT part 1 - the new standard wars?

In the ever debated and updated stimulus plan there is a fair chunk of 17 billion USD (or the equivalent of roughly 27% of AIG's quarterly loss) for Health IT. Read a great summary of the Health IT implications here and find more info from initiatives here and here. Health IT in a broad sense is using IT to provide better and cheaper healthcare. In this sense it is focused on electronic medical records (EMR), earmarking the 17 billion in payments for hospitals and practitioners who implement them, and also 2 billion for infrastructure, administration and standardization. The timeframe is also of the essence: standards shall be set by the end of this year and payments will start in 2011.

I intend this to be a series I come back to, as I find it interesting for many reasons. Here are a few:
1) In my mind it is a great initiative - just imagine having all that data and using algorithms to compare MDs' assessments.
2) Interesting to see how the government handles standard setting in just 9 months.
3) There is a provision that an open source initiative must be investigated.
4) Two very interesting and in many ways opposite players, Google and Microsoft, have already started working on this.
5) It is a hot topic as it is very much intertwined with ethics, data protection and privacy.

Before starting off, I wish to say that I am no expert in EMRs, so comments are very welcome and I look forward to learning from them AND updating the series with good thoughts / ideas. I honestly think this is one of the most intertwined tech/legal/business/ethics/IP/standards issues of today.

Part 1 - The new standard wars?
The first thing that caught my attention was the fact that standardization plays such a crucial role in this, and also that the government is the one pulling the strings. As this is no new field, there is a lot of IP already out there and also a lot of unconnected systems. This leads to rather large questions regarding standards: "what" and "how". At the time of writing I have not yet heard of a czar being appointed or of any concrete measures being taken, but I see endless possibilities for how this can be solved - let me share some thoughts on scenarios.

How to set the standard:
* Patent pool with free access
Governmental control of IP through acquisition of key patents related to the largest / fastest / best system, then labelling that as the standard and, through monetary incentives, making it the prevailing one. The government could (and should) grant free access to the patents and also has the funds to do so. Likely subject to heavy lobbying activity prior to the choice, and thus could suffer critique and, in the worst case, low market penetration.
* Standards Organization
There is nothing ruling out the formation of an "ordinary" standards agency like in many other fields, relying on market powers, corporate negotiations and wealth redistribution to settle the issues. The risks are of course royalty stacking and an immediate 17 billion dollar mark-up on total revenues generated until the end of the stimulus package.

What to standardize:
* The whole nine yards
Just choosing one EMR option and then, once again, using monetary incentives to make it prevail. In theory (as always) this is easy, but in practice some MDs want one kind, some the other, some want voice control, some touch screen interfaces, etc. This could draw plenty of criticism, as the penalties for not adopting the system are likely not as high as the pain and frustration for practitioners of using a system they dislike.
* Software
A scalable and quick model, but with fewer IP possibilities. I would draw the analogy to having all EMRs report in one common file type so that all data could be imported into one large database (n.b. I'm no computer scientist; a purely hypothetical sketch of what such a common export format could look like follows after this list). I'm also assuming that with software only, there are many privacy issues to consider.
* Interface
Safe but expensive, time-consuming and full of interoperability problems. In my mind, given a longer time frame, it could be a safe way to develop a new interface only for EMRs, but in today's wireless world and its encumbrances that could be a monumental task.
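To make the "one common file type" idea from the software option above a bit more concrete, here is a minimal sketch in Python. It is purely my own illustration: the field names and the JSON format are invented for the example and do not reference any real EMR standard.

import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class EmrRecord:            # hypothetical minimal schema, for illustration only
    patient_id: str         # pseudonymized identifier
    visit_date: str         # ISO 8601 date, e.g. "2009-03-19"
    diagnosis_code: str     # e.g. an ICD code
    notes: str              # free-text observations

def export_record(record: EmrRecord) -> str:
    """Serialize one record to the agreed interchange format (JSON here)."""
    return json.dumps(asdict(record))

def import_records(lines: List[str]) -> List[EmrRecord]:
    """Load records exported by any vendor into one common collection."""
    return [EmrRecord(**json.loads(line)) for line in lines]

# Example: a record exported by one system can be re-imported by another.
record = EmrRecord("anon-0001", "2009-03-19", "I10", "routine check-up")
print(import_records([export_record(record)]))

The point is not the particular format but that, once every vendor exports to the same schema, aggregation and comparison across providers becomes a simple import step rather than a per-system integration project.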

Another interesting question is what happens if whatever system is chosen then turns into a lawsuit (patent or copyright), which also needs some careful thought. Would the government be ready to tackle trolls, or is it more keen on just developing an SOP and suddenly having the EMR market boom, only to realize it is totally encumbered?
One final thought, with the Blu-ray wars in mind: the Blu-ray camp was so keen on winning largely to make sure their hardware would be in everyone's home, with an ethernet port, once we all start downloading movies. One could hope that whoever makes these decisions also thinks one step ahead and tries to figure out what the long-term effects are of digitising healthcare and having all that data, not only focusing on choosing the short-term solution best fitting public opinion.

Marcus Malek
Follow me on Twitter

March 18, 2009

CIP FORUM 2009

Intangitopia would like to inform all readers about one of the biggest happenings in the realm of innovation management this year: CIP FORUM 2009. The international conference takes place between the 6th and the 9th of September and gathers the world’s greatest thought leaders in IA/IP/IC management. Seminars and workshops are offered by experienced professionals and scholars to promote new developments and best practices in the developing fields of intellectual property and intellectual capital.

Four of the Intangitopia bloggers will each actively partake and have distinct responsibility areas during CIP FORUM 2009. These responsibility areas will be explored and covered more in-depth here at Intangitopia in the time leading up to the gathering. The responsibility areas are as follows:

Mathias Hellman (follow on: Twitter) - covering and contributing to Early Innovation
Marcus Malek (follow on: Twitter) - covering and contributing to Financial Institutions
Johan Örneblad (follow on: Twitter) - covering and contributing to Open Innovation
Tobias Thornblad (follow on: Twitter) - covering and contributing to IAM/IPM Capabilities

We will all obviously also continue discussing the topics that we find interesting in relation to IA/IP/IC in a true Intangitopia spirit.

The previous CIP FORUM was held May 20th - 23rd, 2007. Some of the participants in 2007 included Ulf Petrusson (CIP), Beatrix de Russé (Thomson), Erik Noteboom (the European Commission), Kai Taubert (Procter & Gamble), Marshall Phelps (Microsoft), Ruth Keir (Pfizer), Ruud Peters (Philips) and Takeshi Isayama (Nissan), among others. Read more about CIP Forum 2007: IP Review, Anaqua IAM Perspectives, and Valea.


March 17, 2009

Ethics in IA/IP management - the Corporate Profile

My exploration of IP and ethics today takes me to IPM strategies that may be associated with the corporate profile. In my last blog post, I touched upon some of the complexities of managing innovation in an ethical manner, which may be seen as layers much more focused on technology, whereas today’s post will target brand management.

Brand Management Deconstructed
Firstly, it is important to recognize that branding encompasses much more than trademarking a name / logotype, registering a slogan or claiming a certain design scheme / trade dress as proprietary. Branding is the whole communicative relationship between a company and 1) its customers, 2) its collaborators, 3) its employees, 4) its end-users (if separate from customers), and 5) any entity or person that the venture will communicate with.
A “strong” brand is often said to:
* attract new customers / retain existing customers
* enable price premiums / drive sale volumes
* decrease sensitivity / volatility of revenue streams
* secure future revenue streams
* enable clear product differentiation and positioning
* reduce marketing costs
This is all true, but in knowledge-based business equally (more?) important aspects are how the brand is used to 1) block competitors, 2) control value propositions, 3) control relational networks / technology platforms, standardization efforts and markets, 4) control the creation of a strong IA/IP portfolio, 5) establish incentive structures and control human resources, 6) control the identity and perception of the venture, and 7) extract value (monetary or non-monetary).

Branding and Ethics
It is quite clear from the brand elements above that, with the “control” that branding enables, there is more than one path to take, and there will be more than one opinion about what “the right thing to do” is. Using semantics to describe the identity of the venture or the utilities of a product can create the perception of something that is of far more value than the original object. Below follow some examples where IP and ethics can be tricky:

Functional utility:
The debate over whether diagnostic tests are ethical is an interesting example of branding, where descriptive terms could be used either as a way to inform the patient about the uncertainty of the test or, unethically, to provide a false sense of comfort. A genetic test may confirm a clinical diagnosis only if the disease is a known, described, monogenic or chromosomal disorder with an evidence-based association to a disease-causing mutation. Conversely, genetic tests for polygenic complex disorders only assess an individual’s susceptibility (slight, moderate or strong) to certain diseases, providing more room for interpretation (and external influence).

Another example of a balancing act in which utilities are rightly or wrongly claimed relates to the placebo effect. If you go to the doctor to cure an ailment, he prescribes a medicine, and you feel better afterwards - you are cured. Would you feel the same way if it later turned out that the pill he gave you was Cebocap (a known sugar pill), or would you rather be inclined to sue him for malpractice? I guess the answer will always be “it depends”, but this is also why it is important for a company to control the communicative relationship.
Product identity: It is often easier to argue that something is unethical when it involves human health, but a similar discussion can be had for consumer goods as well. One of Procter & Gamble’s washing-up liquids (known as Yes or Fairy, depending on the market) is rumored to have gotten a boost in sales after the introduction of a minimal addition of eucalyptus, much owing to the success of branding the liquid with the ‘natural’ ingredient.

Corporate identity: We can all relate to how we perceive company brands as having distinct personalities (traditional examples include: Volvo - safety, Apple - design, etc.). Conscious IA/IP strategies have the ability to create these identities. There are many large corporations that have their scientists write periodical reviews about markets, current technology and future predictions. Product placements in these could be part of the branding strategy to create a demand, which may be justified in some cases and destructive to society in others (e.g. the tobacco industry denying links to lung cancer).

Co-branding: A value creating strategy that may be used both for the better and for the worse. Few customers reflect over the moral values in Ferrari lending its logotype to Acer computers, but what if Walt Disney were to co-brand with Marlboro? In this bizarre example, many would think it easy to point to who would be the ‘winner’ and whose brand would be diminished, but there are examples where the boundaries are fuzzier.

Claiming ethics to create an advantage: This may not be one of the most common strategies, but I thought it was an interesting case. In the March issue of Nature Biotech, the article “a balancing act” discusses Genentech’s petition for the FDA to immediately pull many of the in vitro diagnostic “home-brew tests” from the market. Genentech’s claim is that there is not adequate “scientific evidence of their validity” and that they pose “potential risk to patient safety”. This is seemingly an altruistic move, but the Nature article states that what Genentech would like the FDA to examine closely are home-brew tests used to assess patient suitability for Herceptin treatment - uses, not mentioned in the petition, that erode Genentech’s royalties from sales of ‘official’ companion diagnostic kits, not to mention potential future lost sales from Rituxan, Avastin and Tarceva. This could obviously be argued to be immoral, while it could also be considered a win-win situation (tougher regulation of diagnostics + more sales for Genentech), depending on who is the judge.

My exploration of ethics and IP will continue in later posts, where I will investigate market considerations and value extraction, among other things.

Tobias Thornblad


March 16, 2009

Ethics in Intellectual Asset/Property Management

I will commence an exploration into ethics and IP, which I have found to be a recurring topic in many IP blogs (incl. Intangitopia) where NPEs, exploitation of control positions and overly proprietary models are often discussed. This topic, however, obviously has many aspects and is way too broad to cover in a single blog post, or comprehensively, so this should be seen as an ongoing exploration. My focus will, as usual, be inclined towards biotech as this is the market I find the most interesting, but my intention is for the discussion to have a wider applicability. In this post, I will look at how innovation and IA management strategies can relate to ethics (from the perspective of society).

Innovation Management: Legal Considerations
Many of the issues that most people bring up when it comes to unethical practices concern the actual technology (or in some cases product) at hand. Morality and ordre public clauses have been designed to legally prevent many immoral practices, such as various forms of commercialization of the human body. My aim is to keep this post about ethics in legal practices on a more holistic level than the question of whether genes, stem cells or diagnostic tools should be patentable.
Lack of IP policy: This can be seen as somewhat counter-intuitive, but the fact is that by not claiming assets as property, much value could be lost. Not patenting important inventions, and instead keeping them as trade secrets, keeps society in the dark regarding valuable information rather than teaching it in return for ‘the right to exclude others’. Another aspect that should be considered is whether the full potential of a proprietary database is unleashed by not allowing any external access, or whether it could benefit all parties to allow access, e.g. possibly in exchange for a subscription fee.
Too defensive an IP policy: The opposite of the section above. Claiming stakes so broadly in the ground that many of the patent applications are far from reduced to practice, and are only used to scare off the competition. The obvious risk of this is that universities and non-competitors are also ‘scared off’, effectively inhibiting research in certain areas.
Keeping non-value-generating patents: A portfolio with a large number of patents that are not utilized is not only costly, but may block others from exploring the territory, and the ethical thing may just be to transfer, donate, sell or simply abandon them.
Unrealistic expansion of the legal scope: Patent claims are most often defined broadly to expand the legal scope. There is obviously a fine line between making good strategic sense and weakening your patent, so it may be argued to be somewhat self-regulating. Licensing out a patent using unrealistically broad reach-through claims, however, may stifle research by discouraging actors from licensing in such technology. The broader the claim, the more the patent holder can exclude others from using the technology.

Innovation Management: Technical Considerations
A related discussion, at least as important as the considerations above, is how ethical considerations can be built into an IPM strategy in a technical sense. One of the factors that should be considered from a technical viewpoint is how openness is taken into account when shaping the innovation. Ethical boards, informed consent requirements and regulation are often established to scrutinize at least some of the related research practices for ethical concerns, but what I am referring to is more early-stage.
Technical barriers: When designing an invention of interest, it is obviously desirable to technically prevent competitors from being able to easily replicate the technical function to the greatest extent possible. There is nothing wrong with this very logical strategy from a market competition perspective. However, considering that the way we learn is simply by imitating, experimenting and trying out alternative paths, a technical restriction that fully prevents reverse engineering (or legal documents that prevent experimental use) may not be perceived as ethical from a societal perspective.
Incompatibility: Another aspect is how closed the innovation design is in relation to what is currently being used in the market. Incompatibility with existing technology could be the basis for business models aiming to implement new market standards (e.g. Microsoft), and therefore makes perfect sense internally. Conversely, the external environment may have a different view, which can be seen in the many open source initiatives that have sprung up lately. Some actors may claim that their technology is disruptive and therefore needs to replace obsolete technologies to drive innovation, but the high switching costs, on the other hand, need to be paid by someone. It may rightly be argued that if the technology is truly disruptive, the benefits should outweigh the societal costs in the long term. This may be compared to the concept of ‘creative destruction’, where something new replaces an old industry, driving many businesses to bankruptcy only to develop society further.
Not productifying assets: This may be somewhat specific to so-called research tools in biotechnology. Valuable upstream technologies, e.g. biomolecules, that are used only internally as a step in a process (and which cannot be found by reverse engineering), for instance as a tool to perform a service, can provide an immense competitive advantage as trade secrets. But how well that specific technology benefits society will in this case depend only on how many customers the company can serve. Making a product out of the technology, on the other hand, would both allow third parties to perform as much research as they want using the tool and teach society how it is structurally built (through patenting).
Not killing projects: This last aspect is something which may be difficult for a company to do, as substantial time and monetary investments may already have been made. However, the opportunity cost of investing a never-ending stream of resources in one project usually means that resources are withheld somewhere else where value could be extracted more quickly. A cost-benefit analysis may be in order before continuation.

This list is by no means comprehensive, and is, as I mentioned, the start of a series of posts. The upcoming blog posts will look into how this translates into broader and higher strategy levels, both externally and internally, at company and market levels.

Tobias Thornblad


March 12, 2009

The Guardian opens up

The newspaper industry has for quite some time been struggling with declining sales and uncertainty about how to get paid for content provided on its websites. For the most part, the content has been provided for free and revenue has come from advertising.

It has proved hard to build a viable business model on this, and also to utilize the full potential of the material created by the journalists. Just imagine the amount of interesting articles published every day that could potentially be used over and over again in different forms and different media.

From my point of view, what has been lacking is both the willingness to spread the material created (far from all newspapers have all their material available on the web) and the means of actually making it available. That is why I look at the British newspaper the Guardian’s new attempt to make money out of its huge catalog of content as a really good sign that there might be a change.

“The Guardian today launched Open Platform, a service that will allow partners to reuse guardian.co.uk content and data for free and weave it "into the fabric of the internet".

Open Platform launched with two separate content-sharing services, which will allow users to build their own applications in return for carrying Guardian advertising.”

“The Cass Sculpture Foundation is using the service to add Guardian articles about British artists to its site.

Other partners for the launch of the service include web design firm Stamen and OpenStreetMap, a free, open alternative to commercial map data services. Stamen and OpenStreetMap developed a service that they hope will encourage Guardian readers to "geo-tag" the newspaper's content, positioning every article, video and picture on a map so users can find news, commentary, video and other content related to their area.”

I think that one of the keys to success as a newspaper in this era is to see that you actually have a huge content database which is usable not only as part of the day-to-day reporting but also as something much more. In the long tail model, which I think is applicable in this area, one of the key issues is to have a good search engine/model - not just to provide the content but also to make the users understand that it is there and make them understand that they want it.

Spreading the content with the help of other actors is probably one of the better ways to get your content out. That is also why I am so interested to see how the Guardian’s beta testing turns out. There might just be a working model, there might just…

Johan Örneblad

Found through Open... .

March 8, 2009

Biological standardization of functional modules

My exploration into genetic engineering for my M.Sc. thesis has led me down an interesting path of exciting new research in gene synthesis. For those of you who have not been in contact with this topic before, it can simply be described as the synthesis of gene-length DNA from chemically derived oligonucleotides, which in turn are short, single-stranded DNA fragments. The promise of the field is captured well in a Feb 23 posting on McKinsey & Company's What Matters: “Over the past few decades, most new jobs, wealth, and growth were created in the knowledge and digital realm. And while venture capital represented only about 0.2 percent of US GDP, the companies it created generated about 17 percent of economic activity. The Internet changed virtually every industry. Yet as far-reaching as the digital revolution was, the ability to code life will likely reach even further.”

BioBrick Standardization Process
DNA cannot currently be fabricated purely using an in vitro process (it still requires an intermediate step using a host organism, e.g. yeast or E. coli). Nevertheless, a transition period has certainly begun. An interesting model in this regard is the BioBricks Foundation, which (according to Wikipedia) was introduced by Tom Knight (MIT), Drew Endy (Stanford) and Christopher Voigt (UCSF). The trademarked words BioBrick and BioBricks refer to a specific brand of open source genetic parts, defined via an open technical standards-setting process:
1. You develop some scheme for standardizing some aspect of synthetic biology work.
2. You convince at least one other person, at a different location from you, that the scheme would help them with something that they care about.
3. You each demonstrate that the proposed standard works for each of you (i.e., the standard must work and be good for something).
4. You document your scheme in writing.
5. You request a BBF RFC number by asking for one (email the list)
6. The BBF technical standards group (i.e., the folks on this list) comment on the standard, try it out, propose revisions.
7. You revise the standard if necessary.
8. The standard is formally accepted as part of the definition for BioBrick parts. Congratulations, you win (publishers are standing by), the BioBricks technical standards suite is updated.
9. New, possible standards tremble before you! Goto 1.

Productification of Functional Modules
BioBrick parts are DNA sequences held in circular plasmids with precisely defined up- and downstream sequences (neither of which is considered part of the actual BioBrick part). Larger BioBrick parts are simple to create, as the up- and downstream sequences contain six restriction sites for specific restriction enzymes, allowing for the “chaining together” of smaller parts. According to the Guardian, there are 3 levels of BioBrick parts:
1) Parts: encode basic biological functions
2) Devices: made from a collection of parts and encode some human-defined functions (such as logic gates in electronic circuits)
3) Systems: perform tasks (such as counting)
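To make the “chaining together” of parts described above a bit more concrete, here is a minimal sketch in Python. It is purely my own illustration: the part names and the prefix/suffix/scar strings are placeholders rather than the real BioBrick sequences, and real assembly is of course done with restriction enzymes and ligase rather than string concatenation.

# Every BioBrick-style part is flanked by a standard prefix and suffix;
# composing two parts yields a new part that carries the same flanks,
# so composites (devices, systems) can themselves be chained further.
PREFIX = "<standard-prefix>"   # placeholder for the upstream sequence
SUFFIX = "<standard-suffix>"   # placeholder for the downstream sequence
SCAR = "<scar>"                # placeholder for the short junction left by assembly

def as_plasmid_insert(part_seq: str) -> str:
    """A part as it sits in its plasmid: prefix + part + suffix."""
    return PREFIX + part_seq + SUFFIX

def compose(upstream_part: str, downstream_part: str) -> str:
    """Join two parts; the result is itself a valid part that can be reused."""
    return upstream_part + SCAR + downstream_part

# Example: a promoter 'part' and a reporter 'part' become a 'device',
# which could in turn be composed into a larger 'system'.
promoter = "promoter_part_sequence"   # invented names, for illustration only
reporter = "reporter_part_sequence"
device = compose(promoter, reporter)
print(as_plasmid_insert(device))

The design point the standard captures is that composition is closed: the output of assembling two parts obeys the same physical standard as its inputs, which is what lets parts, devices and systems be stacked into ever larger constructs.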

Openness

I have not been able to find any IPR policies or ownership governance, but according to an article in Nature Biotech it seems as if the objective is to create openness and accessibility: “Quantitative descriptions of devices in the form of standardized, comprehensive datasheets are widely used in the electrical, mechanical, structural and other engineering disciplines. [...] We propose to adopt a similar framework for describing engineered biological devices. [...] Finally, because the receiver can be used in many systems and because we hope to promote the collaborative development and unfettered use of open libraries of standard biological parts and devices, all of the information describing the receiver is freely available through the Registry of Standard Biological Parts.”

Value Extraction
It will be interesting to see what types of value extraction models may arise from this open standard. Some early models related to education include:
* The International Genetically Engineered Machine (iGEM) competition
* The Build-A-Genome (BAG) class at Johns Hopkins University

To paraphrase Barry Schuler in his TED talk Genomics 101: there is a fine line between playing god and learning the laws of nature. We are not creating anything artificial; we are only rearranging the already existing building blocks of nature to understand what the rules of the game are.

Tobias Thornblad


More state-of-the-art gene synthesis:
http://www.cell.com/trends/biotechnology/abstract/S0167-7799(08)00285-0


 