Rod Collins's Blog, page 15
December 17, 2014
Who Are The Real Key Opinion Leaders For Pharmaceutical Companies?
by Samir Mistry, Pharm.D.
After working in the managed care pharmacy space for many years, I have worked both with and for many of the largest pharmaceutical manufacturers. There has always been a significant emphasis placed on having the support and endorsement of Key Opinion Leaders, or KOLs. Generally, KOLs were practicing physicians who were either considered experts in the medical specialty where the company’s drug was indicated for use, or were regionally and/or nationally influential on practice guidelines or the field of practice.
I have worked in the medical affairs department of one of the world’s largest pharmaceutical companies. As an educator of the company’s medications, I presented the company’s products to large national and regional health plans, and pharmacy benefit managers (PBMs). I trained with many of the product-specific medical science liaisons (MSLs), who constantly spoke about KOLs. New to the pharmaceutical industry, I inquired more and more about who the KOLs were and why they were important. I learned that the KOLs were the clinical experts in the medical field that aligned with their products. They either led clinical trials crucial for the drug or influenced the clinical practice guidelines. At the time, I never questioned or challenged the role and importance of KOLs. However, after leaving the pharmaceutical company and spending many years in the payer leadership space, I strongly believe that the KOL model should be revisited.
In today’s rapidly changing healthcare markets, a brand drug’s market share growth is significantly dependent on its preferable formulary status with large national and regional health plans and employers (payers). Based on this trend, consideration should be given to recognizing managed care leaders as the new KOLs of this growing industry. When considering who has influence with respect to a pharmaceutical company’s drugs, it is important to consider a number of factors. First, what type of disease does the drug treat, and how many competitors does it have? If the drug is unique, considered a “gold standard,” or indicated to treat severe diseases like cancer or HIV/AIDS, then there may not be any barriers to unrestricted access on drug formularies. But if the drug is used to treat a chronic condition such as hypertension or diabetes and there are multiple brand and generic competitors, then there should be a focus on formularies for both national and large regional health plans and large employers (payers). Following is a list of key questions that drive the discussion on the factors pharmaceutical companies may want to consider when developing KOL lists.
Who is the decision-maker?
Since regional and national health plan and employer formulary status can drive either success or failure against a drug’s sales targets, there should be some emphasis placed on who the decision-makers are for large health plan and employer formularies. Another factor to consider is the need for all formularies to be reviewed and approved by a Pharmacy and Therapeutics (P&T) Committee. If the payer (health plan or employer) manages its own custom formulary, the P&T committee is managed internally within the payer. If the payer uses its PBM’s national formulary, then the PBM’s P&T committee is used. With all of these items considered, the true decision-makers are those who drive the P&T committees’ decisions on the formulary, usually the Chief Medical Officer or Vice President/Director of Pharmacy. These individuals usually chair the P&T committee, present topics, and drive the decisions on which drugs are placed on the formulary and how.
Who influences the decision-maker?
Those who have influence on the decision-maker include the following: health plan medical directors, clinical pharmacists, directors of quality, and finally the pharmacy benefit manager (PBM) that administers the pharmacy benefit. A key influencer for health plans and employers, which may be underestimated, is the PBM. Many plans and employers utilize a PBM’s national formularies for their members. Additionally, formularies are influenced by a PBM’s rebate contracts with pharmaceutical companies to drive certain branded drugs. Therefore, another influencer could be an account executive from a PBM who manages the drug benefits for large payers that contract with the PBM for drug benefit services.
How can you change your approach for KOLs?
In considering a new approach to defining KOLs and key influencers for a pharmaceutical company’s products, I suggest developing a baseline assessment of the payers that most utilize the product. For example, if the drug treats chronic conditions like hypertension or diabetes, it would be a good idea to focus on large regional and national payers with Medicare lines of business. If the drug treats asthma, it would be a good idea to focus on Medicaid and commercial payers. The second step would be to identify the decision-makers, such as chief medical officers and vice presidents of pharmacy, for all of these payers and consider them the key focus. The next step consists of developing the list of influencers, such as medical directors, clinical pharmacists, and the PBM account executives for the payers. Another focal point to consider is intensifying the visits and discussions with decision-makers at payers where the company’s products hold an unfavorable formulary position. This is the best opportunity for growth.
In summary, recent developments in market behavior are challenging the old theory for defining the KOLs within pharmaceutical companies. I believe the true KOLs are the individuals who drive formulary decisions at payers. If a pharmaceutical company is able to influence a formulary change from a non-preferred or restricted position to a preferred or unrestricted position, it becomes much easier for physicians to write prescriptions, which can translate into significant drug market share growth.
Samir Mistry, Pharm.D. is a Senior Manager and the chief pharmacy expert in the healthcare practice at Optimity Advisors.
December 16, 2014
How Useless Are Book Publishers In The Digital Age?
by Robert Moss
I’m a big fan of Matthew Yglesias at Vox, and he definitely caught my eye recently when he boldly declared that “Amazon is doing the world a favor by crushing book publishers.”
The essence of his argument is that, in the old days of print books, publishers played a valuable economic role because they converted typewritten manuscripts into printed books and got them into the hands of distributors and retailers. The digital world is different because “transforming a writer’s words into a readable e-book product can be done with a combination of software and a minimal amount of training.”
Yglesias is not particularly subtle in his thesis: “In the brave new world of e-publishing . . . publishers are getting squeezed out because they don’t contribute anything of value.” Note that he’s not saying publishers contribute less value than they used to but rather that they don’t contribute anything of value at all.
Evan Hughes shot back with a defense of the publishing industry in The New Republic, offering the interesting analogy that “a publisher’s list of books is in essence a risk pool, a term most often associated with health insurance.” Authors would be a lot worse off, he argues, without that insurance, and many great books wouldn’t have been created had a publisher not been willing to take a chance with an advance. Publishers even put out things like poetry that aren’t particularly profitable, using their more lucrative titles to cover the bills.
This, to me, isn’t a particularly strong argument in support of publishers’ value. There are any number of ways an author could fund his or her next project if getting money upfront was the main thing preventing him or her from getting a successful book to market: loans, for instance, or using new crowd-funding sites like Kickstarter, or an old trick many authors have long resorted to with some success—working a day job.
Publishers offer a lot more value than just advance funding, and I say this as an author who has both written for traditional publishers and self-published my own eBooks. Here are just a few valuable things they do:
Working up front in the proposal stage to guide and shape the concept of the book. The mere process of trying to sell a book often helps refine and improve it.
Providing set deadlines to encourage authors to actually finish the book (for me, this part is actually a huge value)
Performing a range of editorial services: page editing, line editing, proofreading, index creation, etc. (though we fight like cats and dogs over that last one during contracting.)
Graphic design: one of the key best practices for making eBooks sell? Have a professional looking cover. It makes a big difference.
Illustrations: novels don’t need photos or illustrations, but many other types of books do. Publishers frequently pair authors up with photographers or illustrators and handle the work of sizing and getting the images ready for production, which takes time and skill even with all the great new software available to us.
Typesetting and layout: sure, in the digital world this is a lot less important and difficult than in the print world, but formatting an eBook is still a lot of work, and it does take a certain level of technical skill, including HTML coding. There’s a reason why there’s an entire subindustry out there that sells services to help self-publishing authors format their eBooks.
Sales: I know that we’re moving to a digital world, but physical books are likely to remain with us for quite some time. Publishers still have field sales forces that work face-to-face with book buyers and retailers. And, depending upon what kind of books an author publishes, non-bookstore channels like retail chains, gift shops, and specialty stores can be a fairly good income stream for authors.
Cachet & respect: A publisher’s imprint is a significant endorsement, one that makes a book more likely to get noticed. Again, in an eBook world with reader reviews and rankings and lots of social networks for endorsing, this becomes less important, but it still has a lot of value to be able to say, for instance, “Random House is publishing my new book.”
Publicity & marketing: Yglesias’ case is at its weakest when he declares that “publishers are terrible at marketing,” and that assertion is worth looking at in more detail.
Here’s the case Yglesias makes for how he knows that publishers are bad at marketing:
Authors always complain that publishers don’t market their books well
If publishers were any good at marketing then they would have a lot of leverage over Amazon instead of vice versa
Authors may one day discover that they do have the power to market their own books.
Wait, so authors do have the power to market their own books and they are also constantly whining about how their publishers are awful at it. So, how come more of them aren’t out there doing it themselves?
Maybe it’s because marketing your own book is hard work. Publishers may be lousy at marketing, but many authors (myself included) are even worse at it or, more to the point, don’t really enjoy it and are happy to have someone else do it for them. Even the publisher that I think did the worst job of marketing one of my books managed to get review copies into the hands of dozens of publications and get it reviewed quite widely. Just the mailing alone is expensive and very time consuming.
Will review copies matter when paper books go away altogether? Emailing someone an eBook or PDF version seems far less likely to grab their attention than a nice, well-packaged book. I suspect that means packaging and gimmicks and buzz generation—a.k.a. good old PR and branding—will become increasingly important in the digital era.
And then there’s writing press releases, and distributing them to the press (and figuring out the emails of everyone to send them to), and coordinating book tours. Authors despise book tours already, but how much more tiresome will they be when we have to make all the arrangements ourselves?
Book publishers, in other words, offer quite a bit of value to authors. The move to eBooks may be diminishing and changing that value, but it’s a gross overstatement to say that publishers “don’t contribute anything.”
Yglesias is absolutely spot on, though, in his point that the demise of hardcopy books—that is, the need to print and distribute physical books—changes the equation dramatically, just as the decline of printed newspapers and magazines is reshaping the equation within the news industry.
The really interesting question is what book publishers might look like in the future. Sure, an author today could conceivably go 100% eBook (passing up the not insubstantial slice of income that hardcopy books still generate) and pay a freelance editor to handle the editorial functions, a freelance photographer for illustrations, a graphic designer for cover and layout, and a PR person to do the marketing. That would involve not just finding and hiring and signing contracts but also paying and, possibly, firing those people, too. An author, in other words, could basically create his or her own virtual publishing company and spend an awful lot of time doing all sorts of things except for actually writing.
It’s true that many traditional book publishers don’t do these things very well, particularly when it comes to producing and marketing eBooks in the new world of branding, social media, and in-person events. But, that just suggests that we are likely to see the rise of new, more agile, and more effective digital book publishers in the future, not the demise of the publishing function itself.
Robert Moss @RobertFMoss is a Partner and the leader of the Technology Platform practice at Optimity Advisors.
December 9, 2014
Breaking Down Big Data: The Value In Metadata
by John Horodyski
“I never met a data that I didn’t like.” — Internet Meme
As a Partner at Optimity Advisors, my role is to work with clients to make data likeable: identifiable, discoverable, usable and ultimately, valuable. Companies are struggling to manage big data in a landscape of rapidly increasing production and diverse formats. The ability to collect and analyze internal and external data can dictate how well an organization will generate knowledge, and ultimately value. How can you start planning for this value?
Gary Drenik argues in his recent Forbes article,
“How to beat the big data giant? Start by thinking little data, as in David vs. Goliath. The first step in the little data process is to identify key business objectives that your organization would like to have data solve. Objectives need to be clearly defined … once defined, objectives serve as a roadmap for the identification of relevant data sourcing to generate new insights for evidence-based decision making by executive team members.”
Metadata is the best way to identify little data that becomes big data. Little data provides structure to what becomes big data. Invest the time, energy and resources to identify, define and organize your assets for discovery and increase their value.
What’s the Big Deal About Big Data?
Big data refers to data sets that are too large and complex to manipulate or interrogate with standard methods or tools. Information — and all its data and digital assets — has become more available, accessible and in some ways, more accountable in business. To better understand the big deal about big data, start with an understanding of the associated terms, which include:
Metadata, simply stated, is information that describes other data; essentially, data about data. It is the descriptive, administrative and structural data that defines assets.
Taxonomy is the classification of information into groups or classes that share similar characteristics. It provides the consistency and control in language that can power the single source of truth as expressed in a DAM or CMS and is a key enabler for organizing any large body of content.
A controlled vocabulary is a set of defined terms that populate a drop-down or pick list. Establishing “preferred terms” is a good way to provide control, authority and consistency to your digital assets. You not only need to know what it is you are describing but how it may best be described.
Structured data refers to information with a good level of control and organization, for example, a “date” value in an “Expiration Date” field. Structured data is usually found in a controlled data environment with inherent meaning and purpose.
Unstructured data lacks that control and meaning, offers a confused sense of purpose and requires analysis or interpretation to restore meaning. Using the example above, if a “date” is discovered with no “field” in which to provide that control and structure, what does that tell you? Wrangling all that data will create a more structured sense of purpose for the content in your organization. It makes information more relevant, palpable, understandable and useable.
Master Data is business critical data that is governed and shared across multiple systems, applications or departments within an organization. Master Data can be identifiers, attributes, relationships, reference data and yes, metadata!
Master Data Management (MDM) is the set of processes, tools and governance standards/policies that consistently define, manage and distribute Master Data. Everything starts with data modeling, and data modeling is inherently tied to metadata (ISO/IEC 11179).
The value that metadata, or little data, brings to big data is in the structure and meaning it provides. It serves asset discovery by identifying assets and allowing them to be found by relevant criteria. Metadata also brings similar assets together and distinguishes dissimilar assets. Value is added by managing data.
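The roles described above—controlled vocabularies providing authority over terms, and descriptive metadata enabling assets to be found by relevant criteria—can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the Asset class, vocabulary terms, and catalog are invented for the example, not drawn from any particular DAM or MDM product):

```python
from dataclasses import dataclass, field

# A hypothetical controlled vocabulary: the "preferred terms" for asset type.
ASSET_TYPES = {"photo", "video", "logo", "document"}

@dataclass
class Asset:
    """A digital asset with descriptive metadata ("data about data")."""
    asset_id: str
    title: str
    asset_type: str                      # must be a preferred term
    keywords: set = field(default_factory=set)

    def __post_init__(self):
        # Enforce the controlled vocabulary: reject unapproved terms.
        if self.asset_type not in ASSET_TYPES:
            raise ValueError(f"'{self.asset_type}' is not a preferred term")

def find_assets(catalog, keyword):
    """Discovery: metadata brings similar assets together by shared criteria."""
    return [a for a in catalog if keyword in a.keywords]

catalog = [
    Asset("A-001", "Spring campaign hero image", "photo", {"campaign", "spring"}),
    Asset("A-002", "Corporate logo, dark variant", "logo", {"brand"}),
    Asset("A-003", "Spring campaign teaser clip", "video", {"campaign", "spring"}),
]

matches = find_assets(catalog, "campaign")
print([a.asset_id for a in matches])  # → ['A-001', 'A-003']
```

The point of the sketch is the structure, not the code: because every asset carries consistent, controlled metadata, a simple query surfaces the two campaign assets and excludes the dissimilar one—exactly the identification and discovery value described above.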
Big Data Challenges – ‘I Still Haven’t Found What I’m Looking For’
We have an unprecedented wealth of data at our disposal, under considerable watch and scrutiny from creators, users and stakeholders. Organizations need to change accordingly to respond and create new solutions. The challenge with big data is how to manage it. That includes everything from:
identification
capture
curation
storage
search
sharing
analysis
Big data will only continue to grow. The emergence of the Internet of Things and new platforms will produce more information and data in locations both within and external to your business. Increasing your understanding of data and repositories will protect the organization from risks such as:
Savvy plaintiffs requesting big data in discovery
Data breaches caused by inadvertent data ingestion
Loss of consumer confidence in data protection once people realize their personal data is everywhere
There has never been a more important time to make data a priority in your strategic planning.
Best Practices
For managing metadata and digital assets in business, look for:
Metadata management and planning for new processes or systems
Clear ownership of, and documentation for, digital assets
Current documentation on metadata and controlled vocabularies
Determining the right questions to ask about best practices will establish if these practices are in place. While enterprise solution providers have not delivered on many tasks that could be automated, new platforms provide great opportunities for communication/engagement/risk management. Additionally, social media and a variety of other social collaboration tools will affect the workplace, blurring the boundaries of how and when business is conducted. Data sharing and collaboration will play an important part in this growth.
The ability to collect and analyze internal and external data can dictate how well an organization will generate knowledge and ultimately value. How can you start planning for this value? A few things to start working towards include:
1. Data Assessment & Organization Planning
Inventory and discover data life cycles and users
Create a well-planned data warehouse model to ensure valuable enterprise-wide information and metrics, as well as good performance and provisions for growth
2. Capability Assessment & Gap Analysis
Produce maps linking data to business processes and validate
Determine areas where redundant, obsolete or transient data may be reduced. You need to ensure that data in transition is handled accurately and implemented quickly to meet the speed of your business.
3. Modeling & Analysis
Develop plans to facilitate analysis and future action for an operational system, or to create visualizations and reports for your teams and interest groups, formatted as they need them and delivered when they need them.
Data must be delivered consistently, with standard definitions, and provide the ability to reconcile data models from various systems or data marts.
Making Value
The struggle to manage information within the big data landscape is as complex as the digital workflows it supports. This landscape includes the internal ecosystem and the wider geography of partners and third-party entities. The complexity of all of the available data is compounded with the increasing rate of production and diversity of formats.
Assets are critical to your business operations — they need to be discovered at all points of the digital lifecycle. Key to building trust in your data is ensuring its accuracy and usability. Leveraging meaningful metadata provides your best chance for a return on investment on the assets created and becomes an essential line of defense against lost opportunities. Your users’ digital experience is based on their ability to identify, discover and experience your brand in the way it was intended. Value is not found — it’s made — so make the data meaningful to you, your users and your organization by managing it well.
John Horodyski @jhorodyski is a Partner within the Media & Entertainment practice at Optimity Advisors, focusing on Digital Asset Management, Metadata and Taxonomy.
December 2, 2014
Being Adaptive Means Cultivating A Little Chaos
by Rod Collins
There are few words that strike more fear in seasoned managers than the word “chaos.” For most managers, chaos is, by definition, the absence of management. It’s what happens when managers lose control. It’s something to be avoided at all costs because its inevitable attributes of unpredictability and uncertainty are the mortal enemies of efficiency.
For the past century, the hallmark of management excellence has been efficiency. Thus, once a business achieves market dominance, management’s focus is all about sustaining its competitive advantage by cutting costs, eliminating variances, and improving economies of scale. When efficiency is the principal preoccupation of management, stability is the ultimate state of effective management.
But what happens when stability is not an option? What happens when the world is suddenly engulfed by a rapid pace of change and the most effective organizations are not necessarily the most efficient but rather the ones who are most adaptive? These were the questions that were on the mind of General Martin Dempsey when he reached out to Ori Brafman, after reading Brafman’s book The Starfish and the Spider, which described how leaderless organizations are oftentimes more adaptable than their centrally controlled counterparts.
The military, like most traditional organizations, believed efficiency was the pathway to excellence. However, given the realities of modern warfare, Dempsey was becoming increasingly aware that excellence on the battlefield had more to do with adaptability than efficiency. Like so many organizational leaders today, Dempsey was coming to terms with the realities of a rapidly changing world. In Dempsey’s first meeting with Brafman, the General related his concern that the army would need to change if its success depended upon its ability to quickly adapt. Dempsey recognized that imagination and innovation were the key ingredients of adaptability, and he was concerned that the army was deficient in both. When he asked Brafman what he needed to do to make the army more adaptive, the author’s response was a simple strategy: “Make the army more chaotic.”
An Unlikely Collaboration
In his latest book The Chaos Imperative, co-authored with Judah Pollack, Brafman relates the story of the unlikely collaboration between the career army general and the vegan Berkeley alumnus and how they used the power of chaos to increase the army’s excellence.
Getting the army to embrace chaos as a resource was no easy task. According to Brafman, “We tend to confront chaos as if it were an unruly beast—something to be contained as much as possible.” However, in attempting to squelch chaos, leaders often inadvertently stifle the new ideas that are the seeds of innovation and the pathways to future growth. Dempsey knew the army needed better and quicker avenues to new ideas, so he was open to learning different—maybe even uncomfortable—ways of thinking and acting.
An Emergent Process
In encouraging the army to become more chaotic, Brafman was not asking the soldiers to abandon all semblance of order, but rather to become proficient in a discipline he called “contained chaos.” While the notion that chaos can be managed in a disciplined way may sound counterintuitive, Brafman contends that “a little bit of chaos, encouraged but confined within borders, can be highly beneficial to an organization’s overall health.”
The discipline of “contained chaos” is an emergent process that involves three elements: white space, unusual suspects, and organized serendipity. White space is a time, place, or system unfettered by an established structure where people are free to self-organize, brainstorm, follow their imaginations, and make up their own rules. Interestingly, these are also the fundamental elements of “play.” Learning to value white space can be challenging for traditional organizational leaders who’ve come to believe that play is the antithesis of work. Nevertheless, a close look at today’s most well-run businesses reveals that their leaders understand that play is oftentimes a key component of work, which explains why Ping-Pong and pool tables have suddenly become standard office furniture in innovative companies. These amenities are not distractions from work, but rather the white space that spawns the emergence of new ideas.
The second element is the introduction of “unusual suspects.” These are outsiders who are not part of the usual group. These are the non-experts who don’t share the preconceived notions that make up the foundational knowledge of the experts. Experts, by definition, are those who know how things work and how things are done, given the current state of affairs. In a stable world, their knowledge is paramount. However, in a rapidly changing world, expert understanding can become a treacherous obstacle to change. That’s because the effective response to change often requires us to be innovative and to come up with ways to do things differently or even to do different things. The essence of innovation is the connecting of unusual things, such as a telephone and the Internet. If you want to enable unusual connections, include unusual suspects when working to solve your most important problems.
The final element is organized serendipity. When there is sufficient white space and the inclusion of unusual suspects in the problem solving process, the circumstances are ripe for “new and creative ideas to emerge out of nowhere.” Brafman describes this attribute as the paradox of chaos. By cultivating a little chaos, leaders set the conditions for serendipity to happen. This entails creating “pockets of chaos,” where leaders don’t give directions, but rather facilitate creative group processes from which better quality directions emerge from the collaboration of people with very different perspectives. When leading organized serendipity, “structure and efficiency are set aside or blocked off to create a more organic process that allows new ideas to come to the fore.”
An Intelligence Advantage
In embracing Brafman’s advice to make the army more chaotic, General Dempsey and his soldiers have learned to become more adaptable. They have come to understand the limits of planning, the value of disruptive thinking, and the power of organized serendipity when faced with the challenges of unconventional warfare. By learning how to create white space, include unusual suspects, and facilitate organized serendipity, what has emerged are not only more adaptive and effective solutions but also the most important intelligence needed to succeed in rapidly changing environments: the knowledge of what they didn’t know that they didn’t know. When an organization has the wherewithal to effectively manage chaos and the capacity to uncover the “unknown unknowns” before taking action, they create for themselves a huge intelligence advantage, which in turn becomes both an adaptive and a competitive advantage in mastering a rapidly changing world.
Rod Collins (@collinsrod) is Director of Innovation at Optimity Advisors and author of Wiki Management: A Revolutionary New Model for a Rapidly Changing and Collaborative World (AMACOM Books, 2014).
November 18, 2014
Transformation by Design: Employees at the Epicenter of Corporate Restructuring
by Carol Huggins
When business leaders decide to restructure or reorganize, they are transforming the company and, therefore, must carefully consider the design of such a change. Like an architect who takes months or years to thoughtfully design a building, a company must intricately and carefully plan for change. As they do so, they need to keep in mind that the most critical component of an organizational redesign is the employees, whose performance can make or break a company.
Whether for a start-up or a legacy corporation, the primary driver behind reorganization is usually profitable growth. Of course, such plans may also result from a desire for improvements in customer service performance or production quotas. But most important, and often given little consideration, is how the restructuring will impact the people, which is why the emphasis on careful planning and thoughtful design is so important.
Many senior managers simply focus on the what, the desired end goal of organizational change: Improved financial performance as a result of restructuring and resizing. They often ignore the human factor in such a modification. Instead, it’s the innovators who bolster the objectives by carefully composing how employees will proactively participate in reorganization and impact the bottom line, and how they will interact for the good of the firm after the change has been implemented.
Throughout my career, I’ve both been directly involved with and witnessed organizational change. In one instance, as a lodging company prepared to go public in the early 1990s, leadership made the strategic decision to streamline operations and eliminate the Assistant Manager position, thereby empowering non-exempt employees with such tasks as resolving customer disputes and making bank deposits. Imagine an hourly employee deciding to refund a dissatisfied guest! But it worked. The hourly workforce was invigorated, and property managers now had home office support for what were then unconventional solutions.
On another occasion, I observed a division of a Fortune 500 company as it implemented a mass reorganization. In this instance, employees felt under informed and many worried about job security. As the implementation began, a good number were in limbo for months, “officially displaced” as managers competed for their skills and loyalty. It was evident that minimal consideration had been put into the human aspects of the organizational redesign and delivery.
A revolutionary example where people are at the center of transformational change is the recent decision by Zappos to embrace Holacracy, which, according to Forbes magazine, is “a New Age approach to leadership that involves no job titles, no formal bosses, and lots of overlapping work circles instead.” Certainly, Holacracy is not the Holy Grail, and all eyes will be on Zappos to see how it fares with this new model. However, one advantage Holacracy has is that its organizational premise is people-centered and focused on providing clarity around aligning resources and accomplishing tasks.
Human resources matter most when a company decides to restructure. A well-considered design with employees at the forefront will result in an operational transformation that benefits the organization, its employees, and its customers, thereby improving financial performance.
Scott Wise, founder and CEO of Scotty’s Brewhouse restaurants, professes that his employees are the key to his success. They are more important than customers. He states, “If your employees believe in your dreams and values then ultimately they will make your guests happy, too.”
Carol Huggins is a Manager at Optimity Advisors
November 11, 2014
Beyond The Politics: An In-Depth Look At ACA’s Early Returns
by Taylor Anderson
Heading into the second year of the Affordable Care Act's (ACA) expanded healthcare system, many Americans are still unaware of what it may mean for them. News outlets are quick to highlight rate changes without providing much context around them, as the ACA remains a political flashpoint across partisan lines. Nevertheless, data is available to help shed light on the early effects the ACA has had on the US healthcare market and to help us understand its potential future impact.
Historical US Healthcare Costs
Often criticized as a law that will lead to drastic premium rate increases for individuals, the real story behind the ACA may have less to do with current premium increases and more to do with tempering the long-term year-over-year increases in healthcare costs, as evidenced by existing trends. The cost of healthcare in the United States has risen at a pace of 4-7 percent per year since before the ACA took effect. This cost burden for insurers has largely been passed on to consumers via premium increases. Premiums in the three years before the ACA took effect (2008-2010) rose by an average of 10% or more per year for individual consumers, according to a recent report.
One supporting explanation for rising healthcare costs is the corresponding increase in quality of care. Unfortunately, in the pre-ACA timeline the quality of healthcare from a consumer standpoint was largely stagnant or in decline. In one study, the United States ranked last among seven nations in 2004, 2007, and 2010 with regard to equity, access, efficiency, and population health. Working to reverse this trend is a key goal of the ACA. With premiums likely to rise, regardless of ACA impact, the change in quality and access to care will be the true indicators for whether the ACA leads to a more cost-effective healthcare system.
While it will take several years to acquire enough data for a comprehensive understanding of ACA impact, the early signs are positive. For the 2014 enrollment period, consumers searching for healthcare plans had 191 issuers from which to choose across the 36 states with federally facilitated marketplaces. For the 2015 enrollment period, this number has risen to 248 (an increase of almost 30%). In addition to this increase in choice for consumers, the Congressional Budget Office estimates that the ACA will reduce the number of people without health insurance by 25 million by 2016. This increase in choice and availability for consumers should help drive cost-competition among insurers in the marketplace.
The Subsidy Effect
As we investigate increases in 2014 and 2015 rates, critics will be quick to attribute any increases to the ACA. However, as indicated above, in the three years prior to ACA implementation (2008-2010), premiums rose by an average of 10% per year, consistent with the long-term trend of 4-7 percent annual increases in the cost of healthcare in the United States. The recent trends in rising premiums, as well as data on the new ACA tax subsidies, which help to reduce the actual cost for consumers, must be considered in any conversation about rate increases or decreases in the post-ACA health care market.
One flaw in how the media is reporting on premiums is the tendency to report simply on an insurer’s rate request in terms of an increase or decrease in percent. In reality, this figure provides one insurer’s rate trend over a two-year period; it is impossible to compare one insurer’s rate increase of 15% with another’s decrease of 5% without corresponding dollars. This post provides dollar amounts, where available, as we discuss the impact of federal subsidies on post-ACA marketplace rates.
For the 2014 enrollment year, pre-tax credit premiums for plans on the federally facilitated exchanges were lower than expected, with the weighted average second-lowest silver plan (the benchmark plan) coming in 16 percent below expectations. For a 27-year-old this meant $216 per month, for a 40-year-old $263, and for a 60-year-old $558. When the tax credits are factored in, the numbers are even more favorable for consumers.
Of the more than 8 million people who selected either a state-based or federally facilitated marketplace plan by March 31, 2014, approximately 6.8 million (85%) selected a plan with Federal tax credits. For those individuals who selected plans in the federally facilitated marketplace (FFM), their post-tax credit premiums were 76% less than the full premium. This amounts to a reduction from an average of $346 to $82 per month. For 69% of individuals selecting an FFM plan, post-tax credit premiums were less than $100. For 46% of them, premiums were $50 or less. In some cases, especially with lower cost bronze plans, the tax credit amount may exceed the cost of the plan, resulting in a $0 premium after tax credits for the enrollee.
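The subsidy arithmetic above is straightforward, and a short sketch makes it concrete (the function names are our own; the dollar figures are the averages cited above):

```python
# Illustrative arithmetic for post-tax-credit premiums. Credits cannot
# drive the enrollee's payment below $0.

def post_credit_premium(full_premium: float, tax_credit: float) -> float:
    """Premium the enrollee actually pays after tax credits, floored at $0."""
    return max(full_premium - tax_credit, 0.0)

def reduction_pct(full_premium: float, net_premium: float) -> int:
    """Percentage reduction from the full premium to the net premium."""
    return round(100 * (1 - net_premium / full_premium))

# Average FFM figures: $346 full premium, $82 after credits -> ~76% reduction
print(reduction_pct(346, 82))        # 76
# A generous credit on a low-cost bronze plan can zero out the premium
print(post_credit_premium(95, 120))  # 0.0
```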
Combined with Kaiser Foundation reports indicating that 57% of the 8 million exchange-insured individuals were previously uninsured, this data on tax credits demonstrates positive trends for both affordability and access to care. All of this is in addition to the fact that post-ACA plans are more robust than their predecessors. The provision for essential health benefits ensures that even premiums nominally higher in the post-ACA world will provide more comprehensive benefits for members.
The Consumer Bottom Line
Prior to the ACA, older consumers with pre-existing health conditions were often priced out of the individual market, had their condition(s) excluded from coverage, or were denied altogether. These policyholders often saw premiums rise significantly when they became ill, and there was no federal safety net or support (e.g. subsidies) to help them combat these costs. All of this contributed to higher costs and lower access to care.
The ACA has had a complex impact on consumers, but when placed within historical trends, it provides more access to care, at a similar premium amount. Even at these comparable premium levels, the actual cost for consumers is lower than reported due to the tax credits. With the help of tax credits, the actual cost of care is both more available and affordable than ever for a large number of consumers. The ACA can be better evaluated as time continues but early returns are promising for Americans.
Taylor Anderson is an Associate at Optimity Advisors.
November 4, 2014
Moving from “Mobile First” to “Mobile Only”
by Robert Moss
The concept of “mobile first” development—that is, developing for mobile device contexts first when creating a user experience and then adding layers of enhancements for larger form-factor devices—has been around a good four years or so now. (The term was coined by web designer Luke Wroblewski way back in 2010.) But, it’s still taking a long time to work its way into people’s minds.
On a regular basis I hear from clients that they have on their list of upcoming projects “our mobile app”—either launching a mobile app for the first time or enhancing the one they already have out there to make it more useful and “drive adoption.” Though they may even cite “mobile first” philosophies, it’s clear that in practice a “mobile app” for their organization is still a sort of adjunct to their core business—something generally being pleaded for by their sales and marketing departments so that they have something new and shiny to brag about or, unfortunately all too frequently, because their main competitor just released a mobile app and they desperately want to say, “we have one, too!”
But, the world is shifting beneath our feet even as we race to catch up with the last iteration of mobile technology. Increasingly, mobile devices are ceasing to be another way our customers can do business with us and are instead becoming the only way that a large number of our customers will do business with us. And, that means that for many users it’s not a question even of “mobile first” but of “mobile only.”
In the first stages of mobile application development, applications tended to be about “accessibility”—giving users a way to access those functions they needed—checking their email, looking up the status of an order, finding the nearest retail location of a store—when they were away from their office or their homes and not able to access their computers. It didn’t have to be as rich and effective an experience, and it didn’t have to be a complete experience. It just had to let them get a few basic things done.
In most companies’ initial forays into mobile-enabling their business, they typically looked to their web site and selected the top 5 or 6 functions that their customers or members needed—with an eye, in particular, for things that would be useful when someone was out and about doing things like shopping, commuting, and eating at restaurants. And, that made sense when you looked at mobile devices as an adjunct, something used when a regular desktop or laptop computer wasn’t available.
But, increasingly mobile devices are becoming the primary way of accessing the Internet for a significant number of Americans. The Pew Research Internet Project reports that in 2014, 64% of Americans used their smartphones to go online and, of them, 34% went online mostly using their phone rather than a larger device such as a laptop or desktop computer. Not surprisingly, those numbers skew even higher among younger Americans: 83% of 18-29 year-olds, for instance, have a smartphone vs. 49% of 50-64 year-olds.
For businesses, that means a totally new line of thought when it comes to conceiving and designing mobile applications. Can your customers do business with you from initiation through the full breadth of the customer lifecycle? That is, can they find you, shop for and purchase your products, maintain the relationship for customer support and maintenance, and ultimately renew or buy more of your products—all on a mobile phone?
For some innovative businesses, like Uber, the answer is a resounding yes. In fact, despite having been an Uber user for over a year, I didn’t even know whether they had a consumer web site until I checked it out while writing this post. It turns out they do, but you can only update your profile and view trip history, not actually buy their products—that is, request a ride—through the browser site.
Most other businesses, especially ones with older, more established business models, are somewhere on the spectrum from "not mobile-enabled at all" to "fully mobile-enabled." But, many are starting to catch up. Almost all the big retail banks now support check deposits on their mobile apps, which may well do away with the last reason many of their customers have to ever visit a physical bank location. In an interesting digital and bricks & mortar hybrid, the major retail pharmacies like Walgreens and CVS have apps that let their customers print photographs directly from their phone and pick them up at their convenience at the nearest retail location.
That key question—how could customers do business with us solely over their mobile phone?—needs to be at the front of our minds when we are formulating our digital commerce strategies and designing our online user experiences. After all, if our company isn't the one doing it, there's probably someone else out there who is.
Robert Moss @RobertFMoss is a Partner and the leader of the Technology Platform practice at Optimity Advisors.
October 28, 2014
The Many Faces Of Taxonomy
by Mindy Carner
Taxonomy is the classification of information into groups or classes that share similar characteristics and can be used to organize information in the context of relationships between subjects. However, the word “taxonomy” has become somewhat of a buzzword that may not be truly understood even by many people within an organization that is actually dependent on a strong system of classification. The ubiquity of taxonomy is what makes it seem buzzwordy, but there are many different types and formats that taxonomy can take. A simple breakdown of the various forms and uses of data classification can help to make them clearly understood and supported by the organization.
The sitemap taxonomy
A sitemap taxonomy can be applied in a basic way to support the navigation of a website by illustrating the information architecture of a website at a high level. The sitemap represents the site’s structure through the top-level navigation (potentially with drop downs), left-hand navigation (possibly multiple levels deep), as well as header and footer navigation. Web search engines use sitemaps to learn the structure of web sites and improve their presence in search results.
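For reference, the sitemap files that search engines consume follow a small XML format defined by the sitemaps.org protocol; a minimal example (the URLs are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```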
The e-commerce taxonomy
Most e-commerce sites are almost entirely driven by taxonomy. When a taxonomy is successful, users don’t even realize that they are involved in a search experience as they navigate through the filters and refinements of the navigation. For example, every page is innately a search result. Taxonomies that lead to product pages can be very large and complex. In order to coordinate pricing and style with gender and size, or provide the user with dynamic ‘faceted’ navigation, a taxonomy is the architecture through which users travel as they search, or browse, for that perfect item.
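At its core, the faceted navigation described above amounts to intersecting a product set against taxonomy-backed attribute filters. A minimal sketch (the product data and facet names are invented for illustration):

```python
# Minimal faceted-navigation sketch: each refinement the user clicks adds a
# filter, and the resulting "page" is simply the products matching every filter.
products = [
    {"name": "Oxford shirt", "gender": "men",   "size": "M", "price": 45},
    {"name": "Linen blouse", "gender": "women", "size": "S", "price": 60},
    {"name": "Polo shirt",   "gender": "men",   "size": "L", "price": 30},
]

def refine(items, **facets):
    """Return the items matching every selected facet value."""
    return [p for p in items if all(p.get(k) == v for k, v in facets.items())]

# The user browses to Men's, size M -- a search result rendered as a page
for product in refine(products, gender="men", size="M"):
    print(product["name"])  # Oxford shirt
```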
The enterprise taxonomy
An enterprise taxonomy is a combination of structures that can be defined, used, and implemented in many ways. Most enterprise taxonomies are used to drive the search engine that combines all major systems into one search index. A search requires structured metadata and taxonomy to help it understand the variety of content that it will index. Significantly, the enterprise taxonomy is much larger than just the intranet; it is all of the internal systems that the company uses for enterprise communication.
Digital and media asset management taxonomy
Taxonomies provide the metadata basis for a DAM (Digital Asset Management) or a MAM (Media Asset Management) that handle images, video and other content types. If these systems are used as the single source of truth for a domain of knowledge, the taxonomy can model that knowledge in a way that is meaningful. Users will be able to depend on the technology to provide an intuitive and easy experience for delivering content.
What about the overlap?
The advantage of a taxonomy is that it provides a single source of vocabulary across the many systems that an enterprise supports. Many companies have enterprise search, a DAM system, and possibly even an outward facing e-commerce site. If the taxonomy is maintained, it can support search across platforms. Optimity Advisors helps clients by developing a customized, departmental or enterprise-wide taxonomy for navigation or for enterprise search, and then helping them develop a program to sustain the discipline to meet their long-term goals.
Mindy Carner is a Senior Associate at Optimity Advisors.
October 21, 2014
DAM Experience Starts With User’s Expectations
by John Horodyski
“The first step in exceeding your customer’s expectations is to know those expectations” — Roy H. Williams
The content-driven experience is ultimately the expression of the relationship between a Digital Asset Management (DAM) system and the user. This dynamic is often overlooked when developing the rules or practices that drive a DAM. The ability to easily deliver or find the right content, at the right time is the goal of DAM and the cornerstone of delivering positive customer experience internally and externally. The perception of experience, however, is affected by expectations. Understanding and integrating those expectations into the design of the DAM infrastructure is key.
If You’re Happy and You Know It …
It’s been said that, “our happiness depends less on objective conditions and more on our own expectations. Expectations, however, tend to adapt to conditions.” We not only hold expectations high but also create them to serve our needs as much as possible. But what if our expectations are out of sync with the conditions of a DAM implementation? For example:
What are the assumptions about the type and completeness of the assets in the DAM? Do users assume that all digital content from the organization will be there?
What keywords are users expected to apply to retrieve content?
How much content is expected back from a search? Is it very narrow or do users want to see a variety of results and choose for themselves?
Gauge user expectations at the beginning of the DAM planning stage to match features and functionalities to these expectations.
Experience Touchpoints
What exactly is an “experience”? The Oxford English Dictionary (OED) defines “experience” as:
A practical contact with and observation of facts or events
An event or occurrence which leaves an impression on someone
The practical contact points of the DAM are sometimes referred to as “touchpoints.” The more touchpoints there are, the faster and more effective an organization will be in managing the content it has created, bought, sold or licensed. There are some key touchpoints that need to be addressed and understood by the organization in order to drive the user experience.
1. Onboarding
A user’s initial introduction to the DAM can provide a window into their expectations for how they would like to interface with the system. Teaching tagging concurrently with search and discovery will aid in creating top-down, judgment-driven evaluations and bottom-up, data-driven analysis. We recommend pursuing these efforts in parallel whenever possible.
2. Metadata
Good metadata design will increase the return on investment of the assets you have created and is a line of defense against lost opportunities. Think about the digital experience for your users in the way that they need to interact with content and assets to ensure easy identification, discovery and positive user experience.
3. Taxonomy
Without knowing the context of certain words in your organization, search can fail to apply specificity or weight to search terms. A strong controlled vocabulary helps prevent the dreaded “No results” page by linking synonyms, common misspellings, and acronyms to the terms that are actually written in the documents. Access to the right content is critical.
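The synonym linking that a controlled vocabulary provides can be sketched as a simple lookup that normalizes a query term to its preferred term before searching (the terms here are invented for illustration):

```python
# Sketch of synonym/misspelling normalization against a controlled vocabulary.
# Every variant maps to the preferred term actually used in document metadata.
vocabulary = {
    "auto": "automobile",
    "car": "automobile",
    "automoble": "automobile",   # common misspelling
    "TOC": "table of contents",  # acronym
}

def normalize(term: str) -> str:
    """Map a query term to its preferred vocabulary term (or leave it as-is)."""
    return vocabulary.get(term, term)

print(normalize("car"))        # automobile
print(normalize("automoble"))  # automobile -- no "No results" page
```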
4. Roles and Permissions
The other side of managing the customer experience is the construction of permissions, users’ roles and security for the DAM. It is never too early to start working on the business rules and practices around access to content. A comprehensive metadata model with fields that are specific to rights usage and management will be critical for users’ ability to use and reuse assets appropriately. Application of security protocols adds a layer of protection and conscientious management.
5. Content Stewardship
The power of DAM is driven by people who can make changes and align DAM with strategic goals of the organization. DAM champions might manage daily operations or act as advocates for budget allocation and interface with multiple stakeholders like Digital Asset Managers, Business Analysts and Technical Engineers. Strong governance ties people together to manage change.
Customer Experience Starts from the Beginning
“If we don’t take care of our customers, someone else will.” — Unknown
Identifying which problems to solve will be the greatest starting point on your DAM journey. First step: identify the specific audience and their needs. Take the time to understand usage scenarios — who will be using the DAM and what procedures / output do they need. By gathering feedback, the possibility of damaging workarounds will be minimized.
Workarounds will still be a problem. When in doubt, people will choose the path of least resistance and bypass the problem, thereby ignoring the issues to be solved. Even worse, while workarounds are often temporary by design, they can easily become entrenched within the culture and ultimately become habitual, if not a convention adopted by a team or department. The goal is to minimize workarounds and maximize your touchpoints.
John Horodyski @jhorodyski is a Partner within the Media & Entertainment practice at Optimity Advisors, focusing on Digital Asset Management, Metadata and Taxonomy.
October 14, 2014
Pssst! Standard Digital Rights Vocabulary Is Already Here
by Julia Goodwin
If your company uses a rights management system, you are translating complex, unstructured contract narrative into a digital rights vocabulary readable by a computer. Your organization is relying on this digital rights vocabulary to do its sales and financial transactions. So, the nagging question is, why can’t we all agree on a standard terminology across companies?
It’s hard not to feel the pull of the inevitable. The market inexorably rewards velocity: easier transactions, open, common language, elegant and simple solutions that remove impediments. Processes that slow the market tend to be replaced. Yet, an accepted digital rights language hasn’t garnered the adoption needed to serve the market’s needs.
Frequent claims against a common digital rights vocabulary include:
It’s too complicated to standardize; it’s a non-starter
A negotiating edge is lost if we have an exact and cohesive vocabulary
Common users won’t understand it; it needs a specialized legal background to interpret
You’ll never get agreement across companies or industries on terminology
There is surprising news within your own organization that points to the fact that standardized rights terms are possible: they’re already used, and successfully.
If you have a rights management system, you’ve already gone through the difficult exercise of getting many legal professionals to agree on the vocabulary they use to define rights: the names of rights, channels, territories, and languages, and contract attributes like options, exclusivity, or holdbacks.
All that professional agreement now lies in your rights system, which the company trusts to run availability reports and expiration reports, and to expedite and reveal all kinds of knowledge about assets and digital rights in your company.
So, you’ve proven that a standard rights vocabulary is possible; the arguments that negotiating advantages would be lost and that common users wouldn’t understand the terms become watered down.
Where a company’s standard rights language does break down is in the transactions between itself and its partners and customers—in the very heart of its sales and distribution transactions. In other words, in the most important parts of its business.
If one of the Digital Rights languages such as ODRL, Dublin Core’s RightsExpression or PRISM were to be perfected and adopted, and we all spoke the same language, what would happen?
Offers, Licensing and Sales agreements would be standardized and swiftly comparable and executed. What is your average contract review-to-execution time and costs today?
Reporting, invoicing and payments required for participations or royalties would be automated and executed timely. How many days a month does it take the Royalties Department to service these tasks today? What is that cost?
Rights Management Systems would be networked across shared partner contract points and requests for subscription content, renewals, status of revenues, and execution of options could take minutes. What do these activities cost now? How far away is your Rights Management information from the point of sales and distribution?
We could also brainstorm many innovative ways for digital rights metadata to travel with the asset, such as turning it off when it expires or watermarking it when copied illegally. The marriage of asset and digital rights metadata could finally make indisputable and less contentious DRM possible.
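To make this concrete, ODRL (now a W3C specification) defines a machine-readable policy format with a JSON-LD serialization. The following illustrative policy, with hypothetical identifiers, grants a time-limited distribution permission on an asset, the kind of agreement that today lives in contract narrative:

```json
{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "@type": "Agreement",
  "uid": "http://example.com/policy/1234",
  "permission": [{
    "target": "http://example.com/asset/9898",
    "action": "distribute",
    "assigner": "http://example.com/org/licensor",
    "assignee": "http://example.com/org/licensee",
    "constraint": [{
      "leftOperand": "dateTime",
      "operator": "lt",
      "rightOperand": "2016-01-01"
    }]
  }]
}
```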
In the healthcare field, there is a coding system called ICD-10, the International Statistical Classification of Diseases, 10th Revision. The system allows doctors, medical coders, billers and insurance payers to automate their processes without ambiguity. Imagine! With ICD-10 there is:
Agreement among International Physicians
A standard that is in its 10th revision
A common vocabulary for transactions involving multiple parties
Vocabulary and codes that precisely define all complex injuries, diseases, severity, anatomic site, etc., to do with the human body
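A few real ICD-10 codes show how precisely a shared vocabulary can pin down meaning for every party in a transaction (the descriptions are paraphrased and the lookup helper is our own illustration):

```python
# A tiny excerpt of ICD-10 codes, illustrating an unambiguous shared vocabulary.
icd10 = {
    "I10": "Essential (primary) hypertension",
    "E11.9": "Type 2 diabetes mellitus without complications",
    "J45.909": "Unspecified asthma, uncomplicated",
}

def describe(code: str) -> str:
    """Resolve a code to its single, agreed-upon description."""
    return icd10.get(code, "unknown code")

# Doctor, coder, biller, and payer all read the same meaning from one code
print(describe("E11.9"))  # Type 2 diabetes mellitus without complications
```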
If we can imagine this, somehow a common digital rights vocabulary seems a lot more do-able. We know how to do it individually. Let’s figure out how to do it together.
Julia Goodwin is a Senior Manager at Optimity Advisors