Rod Collins's Blog, page 16
October 7, 2014
Disruptive Technology And The Publishing Industry: A Look Backward
By Robert Moss
These days, you probably have in your home an example of a technological innovation that utterly and violently reshaped the entire business model of the publishing industry. In fact, if you’re like most Americans, you probably have more than one of them. They offered publishers much easier and wider distribution of their products, but in exchange they slashed publishers’ per-unit revenues by some 90% and reduced authors’ earnings to mere pennies. Predictably, it set industry veterans to wailing and gnashing their teeth, declaring that the business was doomed.
No, I'm not talking about smartphones or Kindles or iPads or even laptop computers. I'm talking about paperback books. The market disruption they caused happened 70 years ago.
A Paperback Revolution
The American Paperback Revolution began in 1939 when Robert F. DeGraff founded Pocket Books. DeGraff’s books were small (4-1/4 by 6-1/2 inches), printed on cheap paper, and bound in semi-stiff covers. They sold for twenty-five cents. In order to expand their market, Pocket sold the books not only through bookstores but also in drugstores, newsstands, and railroad stations. Two other companies quickly followed Pocket’s lead. Penguin Books, which had been operating in England since 1935, opened a U.S. branch in 1939. Two years later, magazine publisher Joseph Meyers began Avon Pocket-sized Books. The industry stalled during the Second World War, when paper supplies were strictly rationed, but after the war at least two dozen more firms raced into the market, including New American Library, Bantam, Fawcett, Popular Library, and Dell.
Let's look at some of the parallels between these two disruptive phenomena in the publishing world: what the rise of paperbacks did to traditional hardback book publishing in the middle part of the 20th century and what eBooks and online journalism are doing to the print publishing world today. (For convenience's sake, I'll roll eBooks and online journalism/magazines into a single category called "ePublishing.")
Lower Costs
Paperbacks: The low cover price of the new paperback format–twenty-five cents for a paperback compared to between two and three dollars for a typical hardback–made books more affordable for the average reader, but it slashed the per-unit revenue for publishers by almost 90%.
ePublishing: While traditional publishers are fighting to keep their prices from being slashed a full 90% (which would be pricing an eBook around $2.50 vs. a $25 hardback), they've been knocked down quite a lot—and there are plenty of $2.99 and even $.99 options out there competing against them. It's even worse for journalism, where ad revenues online are a mere fraction of those of print.
Wider Distribution
Paperbacks: The distribution of paperbacks, which were sold not only in bookstores but also in convenient locations such as drugstores and train stations, made books more widely available than ever before.
ePublishing: Now, you don’t even have to find a retail establishment. You can download a book from your bed, on a train, or while driving down the road in your car, and you can catch up on the latest news and read your favorite long-form journalism anytime, anywhere, too.
Wailing, Lamentation, Gnashing of Teeth
Paperbacks: Publishers declared it was the end of an era and that they were all going broke. Authors echoed the sentiment and, except for a few early adopters who embraced the new format, most "serious authors" rejected paperbacks as both a cheapening of their work and an assault on their income.
ePublishing: Ditto.
And the Sky Did Not Fall
Paperbacks: The book publishing industry got along just fine once publishers factored the economics and distribution requirements of paperback publishing into their business models. Authors learned how to value and sell their paperback rights, which became a major bargaining chip in their contract negotiations with publishers.
ePublishing: We will see. But it seems likely that the sky will not fall, either.
A Boon to Authors
In the end, what appeared at first to be a major threat to authors’ livelihood turned out to be an economic boon. We’ll use Raymond Chandler, the acclaimed Los Angeles detective novelist and creator of private eye Philip Marlowe, as an example.
At the start of the Paperback Revolution, authors had few options for making money from their books beyond traditional hardback royalties, for the sale of subsidiary rights was not a major factor.
Before paperbacks, the reprint market was limited to a few firms like Grosset & Dunlap, who produced cheap hardback reprints in small print runs–and paid even smaller royalties. In 1940, Raymond Chandler made a whopping $200 from Grosset & Dunlap’s $1 reprint edition of The Big Sleep, his first novel, which they produced from the same printing plates used by Knopf, the original publishers.
The movie industry was in its heyday, and authors could sell the movie rights to their books, but the returns were tiny compared to the prices bestsellers fetch today. In 1941 and 1942, Chandler sold the screen rights for his second novel, Farewell, My Lovely, to RKO Pictures, and those for his third, The High Window, to Twentieth Century-Fox. His combined take from the two was $2,750–a nice bonus, for sure, in 1942 dollars, but not enough to be the foundation for a career.
Enter the new cheap, widely distributed paperback editions. They increased authors' potential audience and allowed them to keep their books in print long after they ceased to be available in hardcover. These developments rewrote the terms of professional authorship in the United States, giving previously struggling novelists a new means of earning income from new works and, equally or even more important, from the back catalog of books they had finished years before.
The paperback industry had gotten started in the late 1930s, but it was largely put on hold by World War II paper shortages. Still, by the beginning of 1945 nearly 750,000 copies of The Big Sleep and Farewell, My Lovely, Chandler’s first two novels to be reprinted in paperback, had been sold. Four years later, over three million copies of Chandler’s works had been published. Despite the penny per copy royalty, with large sales the returns on the reprints were becoming significant.
Chandler had suspended writing novels to pursue income writing screenplays for the studios in Hollywood, a business that paid well but he thoroughly detested. In 1947, he wrote to his agent, “I am a damn fool not to be writing novels. I’m still getting $15,000 a year out of those I did write. If I turned out a really good one in the near future, I’d probably get a lot out of it.”
Chandler could not live on the reprint royalties alone, but they provided a significant portion of his income and helped accelerate his return to novel-writing as a full-time profession in the 1950s.
Electronic publishing likely stands to be a boon for both authors and publishers, too, once they figure out how to navigate the new electronic landscape. We’ll look at the way some of them are doing just that in an upcoming post.
It's a good reminder of how disruptive technologies often play out in an industry. At first, it seems like the sky is falling, and not every established player adapts its business model well enough to survive the change. Those that are able to adapt, however, often come out on the other side stronger than they were going in.
Robert Moss @RobertFMoss is a Partner and the leader of the Technology Platform practice at Optimity Advisors.
September 30, 2014
Records Management – An Invaluable And Holistic Information Tool
by Emily Lanois
How efficiently do you and your organization manage records? Does the term information management fill you with trepidation? A records management project that involves a large and diverse group of employees can yield unexpected benefits. It can sneakily change corporate culture and gather valuable insight into how to end inefficient and risky information management practices.
Records management is a discipline that provides great business value to companies that use it correctly. Its business case is most easily built on projected storage and legal discovery cost savings, along with the business value of preserving vital records for the correct length of time. A basic records management project includes creating and socializing a records management policy and records retention schedule. A retention schedule is nothing more than big-bucket records categories with associated retention periods.
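To make the "big bucket" idea concrete, here is a minimal sketch in Python of what a retention schedule might look like as a working artifact. The category names and retention periods are hypothetical examples for illustration, not recommended retention rules.

```python
from datetime import date, timedelta

# Hypothetical big-bucket retention schedule: record category -> retention period in years.
# Category names and periods are illustrative only, not legal or compliance guidance.
RETENTION_SCHEDULE = {
    "Contracts and Agreements": 10,
    "Financial and Tax Records": 7,
    "General Correspondence": 3,
    "Transitory / Convenience Copies": 1,
}

def is_past_retention(category, record_date, today=None):
    """Return True if a record in the given bucket has passed its retention date."""
    today = today or date.today()
    years = RETENTION_SCHEDULE[category]
    # Approximate years as 365-day blocks; a real schedule would apply exact fiscal or legal rules.
    return record_date + timedelta(days=365 * years) <= today

# Example: a contract created in 2012, checked against the 10-year bucket in 2024,
# is eligible for disposal.
print(is_past_retention("Contracts and Agreements", date(2012, 5, 1), today=date(2024, 1, 1)))
```

The point of keeping the buckets this coarse is exactly what the post describes: employees only need to place a record in one of a handful of categories for the retention logic to become mechanical.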
In addition to the concrete benefits that justify the resources used for creating a records management policy and retention schedule, there are a variety of unintended benefits to be gained that should influence the project’s structure. Incorporating a multitude of employees in a records management project creates conditions for an organizational change in information management culture, enables the project team to recognize and take steps towards correcting broader information management issues, and provides a way to identify and mitigate information management risks. Although it may be most time-efficient to establish the fundamentals of a records management project with a small committee, the additional time spent working with a broader segment of the organization creates value that makes the extra time well worth it and provides a base for a greater information management overhaul.
Change in Information Management Culture
When an organization has never participated in a formal records management project, its employees probably believe that keeping all information forever is most efficient. For this cultural belief to change, the employees need to participate in the creation of the records management program. If they are to realize the ease and value of records management, they need to be part of the process.
As with many daunting tasks, employees may overestimate the effort required to identify and manage company records until they try. Initial skepticism turns to support when employees understand how records can be managed using big “buckets.” It takes workflow-specific, face-to-face conversation to convey the ease with which records can be distinguished from non-records and then placed in big bucket retention categories. Once records are identified and categorized, employees are empowered—and often excited—to dispose of records that are past their retention dates.
Employees commonly describe the experience of identifying and categorizing their records (for ultimate preservation or disposal) as "cathartic." During a recent Records Management project led by Optimity, employees expressed relief that they had gained greater control of their department's records after doing a records inventory. When employees are empowered to preserve and destroy the records that they create, receive and maintain, they add value to the company as information stewards. They also become champions of the records retention schedule and records management policy, increasing their legitimacy in the organization. Companies receive an unexpected benefit from a simple records management project when corporate culture shifts and information management becomes a valued discipline.
Recognizing and Correcting Information Management Issues
Meeting with employees to gather the information needed to create a records management policy and retention schedule inevitably leads to conversations about information management pain points. Most business groups suffer from daily encounters with issues along the lines of:
Inefficient collaboration processes
Lack of repositories for records storage
Incomplete metadata (or identifying information) about records located in offsite storage
Difficulty accessing information that is used regularly, resulting in the creation of convenience copies and proliferation of sensitive information
Although the above issues translate into inefficiencies and frustration, they are very difficult to remediate without allocating extra resources to implement new processes and technology and to clean up poorly organized repositories. Additionally, employees rarely feel that it is their responsibility to spearhead an initiative that lies outside their job description. As a result, employees are often resigned to dealing with information management headaches as part of the job.
Gathering information about the creation, receipt and maintenance of records from a vast sample of employees provides the context for information management problems across business groups. Most employees are only privy to the information management issues that plague their day-to-day work. From the vantage point of a records management project that engages many employees, patterns are identifiable and business cases for enterprise-wide solutions emerge.
Additionally, senior-level support behind the records management project can be used to escalate both departmental and enterprise information management issues to higher levels of authority, accelerating company-wide information management change beyond the scope of records management (e.g., contract management, collaboration, knowledge transfer issues, etc.). The unexpected benefit of progress towards correcting major information management issues is achieved when problems are identified and high-level support is secured during a basic records management project.
Identification and Mitigation of Information Risks
Gathering records management information from a large cross section of an organization is like detective work. Each question reveals a clue about the information landscape. When asking employees to describe the types of records that they work with, information risks—often unknown to the employee—emerge in conversation. Examples of such risks may include:
Unstructured data repositories not known to management (discovery risk)
Data stored in antiquated technology systems that require great expense to read (financial risk in the event of discovery)
Unsecured PII or PHI stored in company electronic or physical repositories (legal risk)
Vital records stored exclusively on an employee's vulnerable hard drive (business, compliance and legal risk)
These types of risk are difficult to identify when doing so is the explicit goal at the outset, yet they surface easily through the natural interview process used to build a records management policy and retention schedule. Employees do not consider their practices risky and do not own most of the information they work with, so they are not very responsive to direct inquiries about potential information risks. They are even less likely to respond with helpful information when inquiries are directed broadly at their department and when there is no benefit to offering information. However, when people are simply interviewed about their practices, especially with the benefits of records management laid out beforehand, the details of information workflow and storage emerge and illuminate areas of risk.
The unexpected benefit of awareness about information risks allows an organization to mitigate them; more specifically, it directs the project's records management policy to explicitly forbid or mandate certain practices in order to prevent the behavior that created the risk.
The fundamentals of a records management project lay out the policy and guidelines for how employees should treat company records during their lifecycle and tell them when and how to discard records when the lifecycle ends. However, a records management project becomes an invaluable and holistic information management tool when large numbers of employees representing a wide cross section of the organization are involved.
Emily Lanois is an Associate at Optimity Advisors.
September 23, 2014
Alan Mulally’s Management Secret: Peer Accountability
by Rod Collins
With the retirement of Alan Mulally earlier this summer, Ford’s new CEO, Mark Fields, has a big set of shoes to fill. If he follows the lead of his predecessor and continues the management system that Mulally introduced, Fields is likely to take the automaker to even greater heights. In business, there are few things more powerful than a good management system. Fortunately for Fields, he has inherited a great management system.
When Mulally accepted the offer to become Ford’s chief executive in the summer of 2006, the carmaker was in the midst of a steady decline. Over the previous five years, Ford’s stock price had plummeted by more than half from more than $16 a share to less than $7. To solve its problems, Ford’s board of directors made what many auto insiders considered a bold move when they reached outside their industry and convinced Mulally to leave Boeing to become the carmaker’s new CEO. As the leader of Boeing’s Commercial Airplanes Group, Mulally had successfully taken on the formidable challenge from Europe’s Airbus Industrie by transforming the company into a lean and profitable enterprise. Ford’s board hoped Mulally would do the same for their ailing company.
The Problem is the System, Not the People
Upon assuming the leadership of Ford, Mulally brought a sense of focus that had been missing from the dysfunctional management team he inherited. Although his new board had given him carte blanche to revamp the leadership team, he advised them that he didn’t think he would need to replace many people. Mulally’s initial assessment of Ford’s failed management was that it was the system—and not the people—that was the problem. His solution was to use the peer accountability system that worked so well for him when he was at Boeing. At the heart of this system was a weekly leadership meeting he called the “business plan review” (BPR).
In these sessions, each member of the leadership team was expected to present a concise color-coded update of his or her progress toward meeting key company goals. Projects that were on track or ahead of schedule were colored green, yellow indicated that an initiative had potential issues or concerns, and red denoted programs that were behind schedule or off plan.
Initially, the leadership team resisted the BPR. They had important work to do and didn’t have time for these mandatory weekly sessions. They were used to working in their own fiefdoms, where their authority was unquestioned and they were in total control. Their notion of an effective leadership team was each individual leader doing his or her own thing and doing it well. In the old system, they kept to themselves and stayed out of each other’s way. And when Ford’s leadership team did gather together, they behaved more like fierce competitors than skilled collaborators. Mulally’s expectation that they would come together for weekly sessions where they would provide candid reports and be accountable to each other seemed like a crazy idea and a sure path to the unemployment line. It was not surprising that in those first BPR meetings, everyone on the leadership team reported everything as green.
The Breakthrough
One of the leaders who saw no value in the BPR was Fields, who protested to Mulally that he needed to keep focused on his business unit. Fields saw these meetings as a wasteful distraction from his real work. Mulally remained firm in his resolve to introduce the BPR, asking Fields to trust the process.
In his book, American Icon, Bryce G. Hoffman relates the story of how Fields unwittingly provided the breakthrough that overcame the leadership team's resistance to the BPR process. Fields had been the obvious internal candidate for the CEO job when the automaker recruited Mulally, and given Ford's culture of fierce competitors, Fields was sure that it was just a matter of time before Mulally nudged out his main internal rival. So, he decided that if he was going to lose his job, he might as well go out "in a blaze of glory." Fields' unit had been working on a project that was in serious trouble, and he decided that at the next BPR meeting he would color it red. He stunned his colleagues when he made this bold—what some in the room thought was a career-ending—move. But it delighted Mulally, who seized upon the moment to engage the whole leadership team on how they could collaborate to solve the business issue Fields had shared with the group.
Mulally understood that the prime lever of an effective organization is a highly collaborative senior leadership team. Without this lever, it's very difficult, if not impossible, to have a collaborative organization. He was determined to transform Ford's team of rivals into a team of collaborators. Fields' bold move helped make that a reality.
Building Shared Understanding Means Spending Quality Time Together
Color-coded status reports provide a level of transparency that is sometimes absent from the usual numerical reports, and processing these visual updates as a team instills a discipline of peer accountability that is often lacking in leadership teams. The cadence of frequently gathering the whole team in one place to review all key initiatives helps create a shared understanding about the most important issues of the business. But more importantly, as happened in the case of Ford, it provides critical opportunities for the team members to synchronize their activities to help create extraordinary performance.
Mulally was adamant about the BPR process because he understood that the key dynamic for building a highly effective team is not a one-time offsite team-building event, but rather a frequent cadence where everyone on the team gathers in the same place at the same time for crucial business conversations. The quality of a team is dependent upon the quality of the conversation, and that means taking the time to build a shared understanding of the business by spending quality time together.
In implementing this practice, Mulally was very careful to maintain an environment where it was safe to candidly report the actual status of key activities. Mulally impressed upon the team that there was no value in status meetings where everyone reports that all is well—even when things are not—because people are more concerned with maintaining an image than dealing with reality. When members of a team have a process where they feel that they are accountable to each other and it is safe to tell the truth about the actual state of their projects, they provide themselves with the opportunities to assist each other to more quickly resolve critical issues when they occur. The primary purpose of peer accountability is not to create more pressure for individual performance but rather to identify opportunities for the team to leverage its collective strength. That’s the fundamental dynamic that defines a great management system.
Rod Collins (@collinsrod) is Director of Innovation at Optimity Advisors and author of Wiki Management: A Revolutionary New Model for a Rapidly Changing and Collaborative World (AMACOM Books, 2014).
August 5, 2014
Digital Asset Innovation
by John Horodyski
“Without change, there is no innovation, creativity or incentive for improvement. Those who initiate change will have a better opportunity to manage the change that is inevitable.” — William Pollard
Innovation, an active process of introducing something “new” or “different,” involves a commitment to looking at the world differently and making the conscious effort to change. The decision to implement a Digital Asset Management (DAM) system can be an innovative moment in gaining operational and intellectual control of your digital assets.
DAM brings with it great responsibility for how the organization’s assets will be efficiently and effectively managed and is essential to growth. Any successful DAM implementation requires more than just new technology; DAM requires a foundation for digital strategy. Creating the whole DAM solution — and connecting it throughout your ecosystem — means that your digital assets can be part of this innovation by generating revenue, increasing efficiencies and enhancing your ability to meet new and emerging market opportunities for your users.
“If you want something new, you have to stop doing something old.” — Peter F. Drucker
Innovation Starts at Home
Every strategy needs a foundation — that solid base upon which a structure rests and where meaning may be established. Many structures deserve attention and preparedness for the roadmap of work to be done when building the “House of DAM.” More importantly, these structures should all be reviewed and discussed well before any technology has been considered, let alone purchased. Technology should never lead the decision-making process for DAM — the business sets the foundation for strategy first. Technology is important, and the vendor review and selection process is a critical step in this process, but that step must follow the business requirements and digital strategy.
DAM encompasses the management tasks and technological functionality designed to enhance the inventory, control and distribution of digital assets. Digital assets include rich media such as photographs, videos, graphics, logos and marketing collateral. While most systems are initially required to house a large variety of media, DAM must also consider the workflow surrounding the ingestion, annotation, cataloguing, storage, retrieval and distribution of digital assets for use and reuse in marketing or business operations. The metadata, or descriptive information, embedded in the asset aids these processes and increases the asset's value.
From IT staff to all users (past and present), keeping a record of what’s happened to the assets and how they were used will help inspire others who seek to innovate for their future use. Expanding markets and complex supply chains for digital images demand constant updating, rethinking and redesign of DAM. Innovative thinking is required to keep apace with the change.
“Innovation distinguishes between a leader and a follower.” — Steve Jobs
Innovation Process
Drawing inspiration from Rod Collins' book, "Wiki Management: A Revolutionary New Model for a Rapidly Changing and Collaborative World," we can identify five fundamental phases in the innovation and change management process that have a direct impact on how digital assets and DAM are understood and managed within an organization.
1. Understand What’s Most Important to Customers
Innovative leaders understand that the purpose of a business is to create customer value. The most effective organizations today are collaborative networks focused on delighting their customers.
The beginning of your DAM process must ask the question, "What's most important to your customers?" Customer/user value is a moving target influenced by many factors both inside and outside the organization. This Initial Picture of the values influencing consumer choices and the developments that might reshape future market behavior will collate observations from:
Stakeholder Interviews
Market Analyses
Customer Segmentation Data
Industry Trends
Disruptive Developments
Financial Analyses
Winning Value Propositions
2. Aggregate and Leverage Collective Intelligence
The most effective organizations are those that have the capacity to quickly access and leverage their collective intelligence. An effective method to process the Initial Picture with your staff is in a Collective Intelligence Lab, where a microcosm of your organization is gathered to:
Collectively process key information
Work cross-functionally in small groups to think holistically and expansively about new and different ways to achieve better results
Engage in innovative large group processes to identify the best solutions and their related requirements
This work results in an Innovation Portfolio, a strategic framework for proactively managing the growth of your company in a time of change, whether that change calls for maintenance, incremental change, platform change or disruptive change.
3. Build Shared Understanding by Bringing Everyone Together in Open Conversations
People are more likely to enthusiastically implement what they create. Fostering co-creation can be achieved with Collaboration Work-Thru and Iterative Check-In sessions. A Collaboration Work-Thru is a facilitated meeting process that empowers a diverse group of individuals to:
Quickly co-create a shared understanding of the key requirements for effective delivery
Identify the key drivers of timely implementation
Ensure that everyone in the organization is on the same page at the start of critical strategic initiatives or operational projects
The Iterative Check-In is a customized cadence for gathering cross-functional teams and updating the shared understanding as changes occur both inside and outside the organization.
4. Focus on the Critical Few Performance Drivers
Information is power. An important part of effective change management design is making sure that people have the information they need to deliver extraordinary performance. In this phase a simple Focused Scorecard is created that serves as:
A powerful frame of reference for everyone to independently gauge progress
A communication tool to promote high engagement
A mechanism for aligning the distributed work throughout the organization with what’s most important to the success of your business
5. Hold People Accountable to Their Peers
Peer-to-peer networks are far more effective than top-down hierarchies to innovate and collaborate at the level needed to keep pace with a rapidly changing world. In this final phase, you need to create customized Peer Accountability Metrics that:
Link individual compensation to collaborative action
Provide incentives for innovative contributions
Reward advancing business interests over individual interests
“Imagination is not only the uniquely human capacity to envision that which is not, and, therefore, the foundation of all invention and innovation. In its arguably most transformative and revelatory capacity, it is the power that enables us to empathize with humans whose experiences we have never shared.” — J.K. Rowling
Building the Business Innovation Case for DAM
A focus on innovation can have many positive impacts on the DAM. Consider the following examples of what innovative effects a DAM may have on users and the organization as a whole:
Support strategic organizational initiatives
Serve as a resource to gauge changes in customer values
Reduce costs
Generate new revenue opportunities
Provide better brand management
Improve collaboration and streamline creative workflow or competitiveness
Enable marketing agility and operational excellence
Brand and market position — and the technologies to support brand success — are essential to any organization's growth. In order to be successful, leadership will need (and want) to initiate and socialize an innovative process that starts small and then works towards a larger end state. Ultimately, this can have direct influence on workflows — from packaging to engineering to licensing to social media to focus groups — with full realization of the DAM serving as the key repository, the single source of truth for your assets. Technology adoption can be overly complex and challenging at times, but that does not need to be the case.
In order to harness that potential power, the right people need to be enabled to make changes, create innovative opportunities with digital assets and align DAM with the strategic goals of the organization. Make sure the opportunity exists to stand back and confirm that the right problems are being solved with this particular DAM solution.
“If you always do what you always did, you will always get what you always got.” — Albert Einstein
Innovation and Change
Technological innovation results in a constantly evolving business environment as data sharing transforms the organization. DAM is central to this change. Information and all its data and digital assets have become more available, more accessible and, in some ways, more accountable in business. We live in a "big data" world, with so much data at our disposal and under considerable scrutiny from our content creators, users and stakeholders alike. Our organizations need to change with the times and, more than this, respond well and be comfortable with our solutions.
Digital Asset Innovation Drives Your Brand
Content is still king, and the ability to strategically set the foundation for the kingdom and take control of your digital assets with DAM is within your reach. The demand for digital assets used for the design, production and distribution of content is high not only quantitatively but also qualitatively, given their necessity and criticality in current business operations.
Simply stated, digital asset innovation drives your brand. To get your digital house in order, know what your internal business units and external partners need, and understand how you will need to deliver assets today — and tomorrow — across multiple channels and devices. Creative professionals and those in marketing, communications, operations and other areas require content as a cost of remaining competitive and delivering what the consumer wants.
The ability to provide assets of high value and quality on a timely basis is no longer a wish; it is the expectation. Using DAM effectively can deliver innovation through knowledge and measurable cost savings, time-to-market gains and greater brand voice consistency — valuable and meaningful effects for your digital assets.
John Horodyski @jhorodyski is a Partner within the Media & Entertainment practice at Optimity Advisors, focusing on Digital Asset Management, Metadata and Taxonomy.
July 29, 2014
How Moore’s Law Is Ending Management As We Know It
by Rod Collins
As they struggle to find their footing in the unfamiliar landscape of a rapidly changing world, many managers are seeking refuge in change-management initiatives to navigate the new terrain. Unfortunately, anecdotal evidence suggests that more than 75 percent of these initiatives fail. Perhaps that’s because the very notion of change management is probably an oxymoron. To think that you can manage change is to imply that you could somehow control change. Change is going to happen whether we approve of it or not. Quite simply, change is impervious to our attempts to manage it. When it comes to change, the central challenge is not about managing change. It’s about managing at the pace of change, and that is an entirely different proposition because the only way you can manage at the pace of change is to change how you manage. To understand why this is so, we need to understand a phenomenon that has come to be known as Moore’s law.
In the mid-1960s, Intel cofounder Gordon Moore observed that the number of transistors that could be placed on a computer chip was doubling every twenty-four months. Said another way, our capacity to store and process information doubles every two years. Moore’s law explains why today’s average teenager has more computing power in her iPhone than the typical Fortune 500 company of the 1960s had in its multimillion-dollar computer center. It also explains why a nineteenth-century management model is unsustainable in a twenty-first century world.
If we were to graph Moore’s law for the period 1984 through 2014, we would find that the graph would start out linear until around 2004 when it would make an exponential turn. We suggest beginning with 1984 because that is the year Apple introduced the first Mac and transformed the computer from a piece of high-priced institutional equipment into an affordable household appliance available to the masses. If we define our computing capacity in 1984 as one unit and then double that amount every two years, we would find that our capacity to store and process information today is more than 32,000 times what it was in 1984. To highlight how rapidly the technological revolution is reshaping the business landscape, in 2016 that capacity will grow to more than 64,000 times when compared to 1984.
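As a quick check of that arithmetic, here is a minimal sketch in Python of the doubling calculation; the clean two-year doubling period and the 1984 baseline of one unit are the assumptions stated above.

```python
def capacity_multiple(year, base_year=1984, doubling_period=2):
    """Relative computing capacity versus the base year, assuming one doubling every two years."""
    doublings = (year - base_year) // doubling_period
    return 2 ** doublings

print(capacity_multiple(2014))  # 2**15 = 32768 -> "more than 32,000 times" the 1984 capacity
print(capacity_multiple(2016))  # 2**16 = 65536 -> "more than 64,000 times" the 1984 capacity
```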
Moore’s Law captures the dynamics of our journey between two very different worlds and serves as an analogy to explain where we have been, where we are going, and why what worked yesterday won’t necessarily work tomorrow.
Between 1984 and 2004, Moore's Law would be essentially a flat, linear line. Thus, if you were a manager in 1995 and were responsible for planning and delivering specific business results for 1998, you would analyze all the available data at your disposal for the period 1990 through 1995. You would thoroughly study all the dynamics and the relationships among the key factors that would guide your decisions in planning for what you would need to do to meet your 1998 goals. With this knowledge, you would do what other successful managers had done for decades. You would forecast the future based on your thorough analysis of the past, and it is highly likely that you would indeed deliver on the results because 1998 would be a linear extrapolation of the period that preceded it.
For the past century, the way we’ve managed our businesses has been based upon two fundamental assumptions that have been rock solid for all that time. The first assumption is that the past is a proxy for the future. This explains why the professional practice of management has been so data driven. In a linear world, when management’s job is to create the future, the secret sauce is to understand and then extrapolate from the past. The second assumption holds that the smartest organizations are those that leverage the intelligence of their smartest individuals. Accordingly, the dominant and near universal structure for both private and public sector organizations has been the top-down hierarchy, where the few at the top direct the activities of the many, based on the belief that the whole organization becomes smarter than it would otherwise be if the workers were allowed to follow their own judgments.
However, if you were a manager in 2005, accountable for meeting specific business goals in 2008, and you built your business plans on the extrapolation of your analysis of the period 2000–2005, it is highly likely that you would miss the mark because 2008 is not a linear extension of the period 2000–2005. That's because, after 2004, Moore's Law becomes exponential.
The significance of Moore’s law is that it demonstrates that we have been rapidly and suddenly thrust into an exponential world in which the rules are very different. In our exponential world, many managers are painfully discovering that the past is no longer a proxy for the future, and they are reluctantly learning from the remarkable successes of innovative businesses—such as Google, Linux, and Wikipedia—that the smartest organizations are now the ones that know how to aggregate and leverage their collective intelligence by designing organizations not as top-down hierarchies but as powerful collaborative networks.
Rod Collins (@collinsrod) is Director of Innovation at Optimity Advisors and author of Wiki Management: A Revolutionary New Model for a Rapidly Changing and Collaborative World (AMACOM Books, 2014).
July 15, 2014
Is A Dual Operating System the Solution for Managing In a Faster Moving World?
by Rod Collins
The single greatest challenge facing every business leader today is how to effectively lead organizations in a time of relentless change. As we've learned from the publishing and recording industries, no business model is safe when once-popular brands such as Borders and Tower Records can suddenly vanish from our shopping malls. We have also discovered that new brands in industries that didn't even exist a mere fifteen years ago, such as Google and Facebook, can achieve levels of success far faster than anyone thought possible. Whether they like it or not, business leaders need to come to terms with the uncomfortable reality that the rules that worked in the relatively stable twentieth century may not apply in a twenty-first century world radically reshaped by the digital revolution. If managers want to manage at the pace of accelerating change, they will need new solutions.
John Kotter’s latest book, Accelerate: Building Strategic Agility for a Faster Moving World is a timely guide for business leaders who are open to management innovation. According to Kotter, businesses have trouble keeping pace with the speed of change because their hierarchical management systems are designed for efficiency rather than strategic agility. While hierarchies can be workable structures for the tactical challenges of running large enterprises, they are limited in their capacity to quickly spot hazards and opportunities, nimbly formulate strategic options, and rapidly execute responsive action.
The problem with hierarchies is that strategic work has traditionally been limited to a small number of people at the top. This is a serious handicap because, as Kotter points out, mastering the challenges of relentless change requires “a radical increase in the number of people involved in creating and executing strategic initiatives.” In addition, the longstanding habit of doing strategic work in yearly cycles is problematic in a world where hazards and opportunities that can make or break the business can happen on any given day.
Kotter’s solution for a more dynamic approach to addressing the challenges of rapid change is the adoption of a dual operating system. This is accomplished by creating a second agile network structure that complements the traditional hierarchical organization. This second structure is not a cross-functional task force that reports to an executive champion, but rather a relatively autonomous system that is free from the usual bureaucratic processes that slow organizations down. The participants in the network are drawn from and continue to work in the hierarchical structure and serve as the catalysts for maintaining the necessary synergies between the two systems.
The key attribute that drives the effectiveness of the network is its lack of bureaucratic barriers. There is no ascription of authority in the network. Thus, anyone can propose an idea or an initiative, and no one has the wherewithal to unilaterally impose his or her will upon the network. All participants are equally empowered to build a consensus around their ideas. If one can garner the voluntary support of enough people, the network provides a powerful opportunity to rapidly develop an idea into a well-formulated strategic proposal for consideration by the company’s leadership team. Because the participants in the network are also part of the hierarchical delivery structure, key requirements for implementing a new idea are integrated into the iterative development of the strategic initiative, increasing the chances that innovations are not only rapidly formulated but are also rapidly deployed. This enables organizations to be both adaptive and productive.
Kotter points out that, while a dual operating system might seem like new idea, it is actually a common occurrence in the evolution of many, if not most, companies. That’s because most start-ups are initially organized as informal networks. As they begin to experience rapid growth and the need to adopt a more sophisticated management structure, most companies evolve into conventional hierarchical organizations. Kotter’s key insight is that, as a company transitions from a network to a hierarchy, there is a brief period in this natural evolution when both of these systems are operating at the same time. Kotter’s proposed solution is designed to harness the dynamics of this transitory phase into a more permanent modus operandi.
The linchpin that makes this arrangement work is the CEO, who serves as the steward for both systems, making sure that they are equally balanced, sufficiently autonomous, and most importantly, effectively interdependent. This stewardship enables the two systems to coexist in an organic relationship that supports both adaptability and productivity in responding to the challenges of a rapidly changing world.
While Kotter’s solution is a workable solution—even a necessary solution for many deeply entrenched bureaucracies—it is, in all likelihood, not a sustainable solution precisely because it is so dependent upon the stewardship of an enlightened CEO. What happens when the CEO moves on and is replaced by a successor who firmly believes in “old school” management? More likely than not, as we’ve learned from GM’s mishandling of its successful Saturn unit, we can expect that once innovative leaders move on, the higher performing networks they built will run the risk of being quickly demolished by the remaining bureaucrats who see no value in any structure that they can’t directly control.
Nevertheless, Kotter's dual operating system may be the only practical short-run survival solution for companies with long hierarchical histories. It is unrealistic to expect a decades-old, traditionally organized company to transform itself into an Amazon or a Zappos overnight. Before traditional companies can learn new ways of operating, they must first unlearn the old ways. The dual operating system is a useful transition strategy for accomplishing both the learning and the unlearning. As a long-term solution, however, the dual operating system may be problematic because collaborative networks, not bureaucratic hierarchies, are the future of management. Any major company created in the foreseeable future is far more likely to be organized like Google than General Motors.
When Larry Page and Sergey Brin first built their company, they were very proactive in making sure that Google had little, if any, hierarchical structure. They understood that, in a world radically transformed by the digital revolution, networks have become smarter, faster, and most importantly, more adaptive than hierarchies. This is the ultimate lesson that every organization needs to grasp to survive in a faster moving world. While a dual operating system is not a permanent solution for managing in a faster moving world, it is a good start for traditional business leaders who have the will and the courage to begin the job of transforming their legacy organizations into highly adaptive twenty-first century enterprises.
Rod Collins (@collinsrod) is Director of Innovation at Optimity Advisors and author of Wiki Management: A Revolutionary New Model for a Rapidly Changing and Collaborative World (AMACOM Books, 2014).
July 8, 2014
The Greatness of Information Governance
by John Horodyski
“I am not bound to win, but I am bound to be true. I am not bound to succeed, but I am bound to live by the light that I have. I must stand with anybody that stands right, and stand with him while he is right, and part with him when he goes wrong.” — Abraham Lincoln
Governance is good — in fact, it is great. Governance, and specifically IT Governance, is defined as, “putting structure around how organizations align IT strategy with business strategy, ensuring that companies stay on track to achieve their strategies and goals, and implementing good ways to measure IT’s performance. It makes sure that all stakeholders’ interests are taken into account and that processes provide measurable results.”
Governance helps us define the rules of the Digital Asset Management (DAM) road — a framework to ensure that program goals are met during implementation and in the future. Governance is the best way to manage and mitigate risk if it starts with a roadmap to ensure success of implementation during the first iteration and measurement tools to become formalized in the operating model. By developing a project charter, working committee and timelines, governance becomes an ongoing practice to deliver ROI, innovation and sustained success by building collaborative opportunities.
Participation from all levels of the organization is key. In particular, engaging the leadership by involving them in the big decisions, holding regular reviews and keeping them talking about DAM will yield the greatest benefits from DAM. IT Governance is the best way to manage change from implementation to maintenance of the technology itself.
However, we have reached a time in our history when we must implement Information Governance in order to move our information into the future. Information Governance is both more holistic and, at the same time, more specific than IT Governance, and it needs to address the data and information throughout your organization. It is the structure around how organizations align information management, beginning with metadata, taxonomy, policy development, stewardship and technology, to serve the creation, use and distribution of information.
The decision to implement a DAM system is the right step in the right direction to gaining operational and intellectual control of your digital assets, and it is to be taken seriously. Any successful DAM implementation requires more than just new technology — DAM requires a foundation for digital strategy. Creating the whole DAM solution and connecting it throughout your business means that assets can generate revenue, increase efficiencies and meet new and emerging market opportunities.
DAM is more than a sum total of its parts. Digital Asset Management must include a detailed review and analysis of all the contributing factors: digital assets, organization, workflow, security, etc. It takes a considerable effort to get everything in its place — there is no magic here.
“Corporate governance is concerned with holding the balance between economic and social goals and between individual and communal goals. The governance framework is there to encourage the efficient use of resources and equally to require accountability for the stewardship of those resources. The aim is to align as nearly as possible the interests of individuals, corporations and society.” — Sir Adrian Cadbury, Chairman of Cadbury and Cadbury Schweppes.
People, Process and Technology
Governance in the context of information is the ongoing practice of making and setting information policy, establishing standards and making decisions that will be enforced to achieve a unified, coherent information discovery experience. The goal of such governance is to:
Manage digital assets with Digital Asset Management (DAM)
Federate and manage metadata
Manage people, process and technology to support metadata standards
The governance structure establishes the strategic, operational and technical decision-making process required to ensure the DAM team excels in its mission. DAM governance provides strategic leadership, establishes priorities and policies, and is accountable and transparent to the organization. In addition, the governance standards should include a core metadata standard, prescribed workflows and, lastly, governance practices that will be carried out on an ongoing basis.
Information Governance Practices
Every organization has a different culture and will naturally take a different approach to Information Governance and how it is applied within an organization. Content owners must be engaged in upholding standards of practice and actively participate in the ongoing development of the taxonomy, metadata standards and governance practices to ensure the continued success of the enterprise search experience. The following elements must be maintained and updated with the site content:
Metadata
Metadata, simply stated, is information that describes other data — in essence, data about data. It is the descriptive, administrative and structural data that define your assets.
Descriptive metadata describes a resource for purposes such as discovery and identification (i.e., information you would use in a search). It can include elements such as title, creator, author and keywords
Structural metadata indicates how compound objects are put together, for example, how a digital image is configured as provided in EXIF data, or how pages are ordered to form chapters (e.g. file format, file dimension, file length)
Administrative metadata provides information that helps manage an asset. Two common subsets of administrative data are rights management metadata (which deals with intellectual property rights) and preservation metadata (which contains information needed to archive and preserve a resource).
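As an illustration of how those three layers of metadata might sit together on a single asset, here is a minimal, hypothetical record sketched in Python. The field names and values are invented for the example and are not a prescribed schema.

```python
# Hypothetical metadata record for one digital asset, grouped by the three layers described above.
asset_metadata = {
    "descriptive": {                     # supports discovery and identification
        "title": "Spring Campaign Hero Image",
        "creator": "Jane Doe",
        "keywords": ["spring", "campaign", "outdoor"],
    },
    "structural": {                      # how the object is put together
        "file_format": "JPEG",
        "dimensions_px": (4000, 2667),
        "file_size_bytes": 5242880,
    },
    "administrative": {                  # helps manage the asset over time
        "rights": "Licensed for web and print use through 2026",
        "preservation": {"archived": False, "storage_tier": "primary"},
    },
}

# Simple use: pull the descriptive fields a search index would ingest.
print(asset_metadata["descriptive"]["title"], asset_metadata["descriptive"]["keywords"])
```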
Controlled Vocabulary
A Controlled Vocabulary is for drop-downs/pick lists. The use of “preferred terms” is a good way to provide authority and consistency to digital assets. Each tag could point to a different topic, but, fundamentally, it’s the same principal element of the subject under review that is relevant. If the topic is “Country,” and you only have eight countries with which you work, then those eight countries are your controlled list. Control, and stronger yet, authority, is needed to describe your assets. You need to know what it is you are describing and how it may best be described.
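A minimal sketch, in Python, of how a controlled vocabulary with preferred terms might be enforced follows; the eight-country list and the variant spellings are hypothetical examples, not part of any particular DAM product.

```python
# Hypothetical controlled vocabulary for a "Country" field: the eight preferred terms,
# plus a map from common free-text variants to the preferred spelling.
PREFERRED_TERMS = {"Brazil", "Canada", "France", "Germany", "India", "Japan", "Mexico", "Spain"}

VARIANT_MAP = {
    "ca": "Canada",
    "deutschland": "Germany",
    "nippon": "Japan",
}

def normalize_country(raw):
    """Map a user-entered value onto the controlled list, or reject it."""
    term = raw.strip()
    term = VARIANT_MAP.get(term.lower(), term.title())
    if term not in PREFERRED_TERMS:
        raise ValueError(f"'{raw}' is not in the controlled vocabulary")
    return term

print(normalize_country("  deutschland "))  # -> "Germany"
print(normalize_country("mexico"))          # -> "Mexico"
```

The design choice is the one described above: free-text entry is never stored as-is; every value is resolved to a single preferred term or rejected, which is what gives the vocabulary its authority.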
Retention Policy
Content owners should create a retention policy and retention schedule that define the lifecycle of each asset. Because content varies in its nature, timeliness is a relative term, and site owners need overarching rules that drive retention decisions. To succeed, people and/or teams will need to review portions of site content monthly for duplication, version control, re-use rules and archiving. An annual review of all site content for general timeliness and relevance, for archiving and removal, and for adjusting the retention policy and schedule is also good practice.
Taxonomy
Taxonomy is the classification of information into groups or classes that share similar characteristics. It is a way to organize information to best solve a business problem based on user needs by exposing relationships between subjects. A well-designed taxonomy brings business processes into alignment, allowing users to intuitively navigate to the “right” content.
The best reason for creating and implementing a single, standard taxonomy across the enterprise is that it provides good business value. But more than that, it enhances and improves enterprise search and enables quick information discovery. Taxonomy provides the consistent and controlled vocabulary that can power the single source of truth as expressed in a DAM or CMS. It is a key enabler for organizing any large body of content. It is required for meaningful information management and critical to effective “findability.”
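As a simple illustration of how a taxonomy exposes those relationships, the sketch below models a small hierarchy and resolves the navigation path to a term. The category names are hypothetical; a production taxonomy would live in the DAM or a dedicated taxonomy management tool.

```python
# Illustrative sketch: a taxonomy as a simple parent/child hierarchy.
# Category names are invented for this example.
TAXONOMY = {
    "Marketing Assets": {
        "Photography": {"Product Shots": {}, "Lifestyle": {}},
        "Video": {"Broadcast": {}, "Social": {}},
    },
    "Brand Guidelines": {"Logos": {}, "Typography": {}},
}

def path_to(term, tree=TAXONOMY, trail=()):
    """Return the navigation path to a term as a tuple, or () if not found."""
    for node, children in tree.items():
        if node == term:
            return trail + (node,)
        found = path_to(term, children, trail + (node,))
        if found:
            return found
    return ()

print(path_to("Social"))  # -> ('Marketing Assets', 'Video', 'Social')
```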
Information Governance Process
There are seven steps to effective Information Governance:
1. Governance Process — Create Governance Council, Process and Guidelines
Create and Align Processes, Requirements and Controls
Creation of a Governance Council — A DAM Governance Council develops and maintains the metadata and taxonomy for the DAM and associated systems. The core council comprises key stakeholders from different departments within Communications, who represent issues brought to them by others in their areas.
Governance Communication Plan — Governance Council decisions are recorded for future reference so the context of each decision is understood and the timeframe of terminology changes is noted (e.g., after Q3 of 2013, “layout” was changed to “template” and “template” will be used going forward). A minimal sketch of such a terminology-change log appears after this list.
2. Establishing Decision Rights and Accountability
Approval Processes — All decisions are made by the Governance Council, with zero tolerance for “ad hoc” changes or “quick fixes.”
Data Stewards — It is recommended to have individual “Data Stewards” serving as data owners for their departments or plants. A Data Steward is someone who is responsible for maintaining a data element (a controlled vocabulary, etc.) in a metadata schema.
Requirements Gathering — Data Stewards help define data and specify data quality requirements for existing DAM functionality and for future functionality. A typical data quality issue is reconciling variant entries such as “Full Sun,” “full sun,” “fullsun” and “f sun” into a single controlled vocabulary term.
3. Performing Stewardship
Messaging — Regular monthly and quarterly communications and updates are recommended, delivered “in person” where possible and virtually (email, presentations, videos in the DAM, etc.)
Change Management — Data stewardship is evidenced by changes to the user interface, both functional and graphical (e.g., thumbnails), as well as keyword changes, search term modifications, and new metadata models and user types
Collecting Performance Metrics — It is recommended to have regular reporting on DAM user statistics
4. Managing Change and Issue Resolution
Authority — Only the DAM administrator(s) will be able to make changes to the metadata models configuring the system.
Change Requests — Any requests to “change” any of the functionality and/or business logic behind the DAM must be formally requested via a Change Request Form made available to users from the DAM.
Change Review — The change(s) will be reviewed on an as needed basis depending upon the volume of requests.
5. Building Governance Into Technology
Involving IT — Nominate a member of IT to the Governance Council
Mapping Corporate Systems — Information Management is a corporate issue, not just an IT issue
Managing Vendors — The Chair of the Governance Council maintains ownership of vendor management for metadata and taxonomy.
6. Stakeholder Care and Support and Stakeholder Communications
Keeping in Touch — In addition to bi-weekly meetings, there should be monthly and quarterly reports
Gathering Feedback — It is recommended to have quarterly reviews with user groups for 1:1 feedback
Measuring and Reporting — Define and measure the value the DAM delivers in order to retain active users and bring new business opportunities to the DAM.
7. Metadata and Taxonomy Governance
Keeping the metadata relevant — The taxonomy and metadata specifications must change over time as terms and vocabularies evolve.
Using metadata to control information across the enterprise and ensure accuracy and authenticity by:
Documenting / mapping the existing databases and their associated workflows in order to effectively manage technical, functional and business changes — one specification change in one database may well affect the entire workflow.
Ensuring the Governance Council owns, maintains and updates the master copy of the documentation and specifications.
Evaluating and assessing the User Interface (UI) and User Experience (UX) of the DAM, Taxonomy Navigation, Taxonomy, etc. on a regular basis (e.g., quarterly).
Monitoring the landscape — Start simple and add if required, be aware of the competition and how they name / categorize products, be sensitive to political and cultural viewpoints.
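As referenced in step 1, here is a minimal sketch of the kind of terminology-change log a Governance Communication Plan might keep. The data structure and function are assumptions made for illustration; the “layout” to “template” entry echoes the example given above.

```python
# Hypothetical terminology-change log kept by the Governance Council.
from datetime import date

TERM_CHANGES = [
    # (old term, preferred term, effective date, decision reference)
    ("layout", "template", date(2013, 10, 1), "Council minutes, Q3 2013"),
]

def preferred_term(term, as_of=None):
    """Resolve a legacy term to the preferred term in force on a given date."""
    as_of = as_of or date.today()
    for old, new, effective, _reference in TERM_CHANGES:
        if term == old and as_of >= effective:
            term = new
    return term

print(preferred_term("layout"))  # -> template
```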
“Organizations need to practice qualitative corporate governance rather than quantitative governance thereby ensuring it is properly run” and “You cannot legislate good behaviour.” — Mervyn King, Chairman of the King Committee on Corporate Governance
Key Practices of Information Governance
Information Governance practices address the tenets of information strategy and information management as they affect people, process and technology. They focus on the value of having reliable metadata, taxonomy, policy development and stewardship supported by technology.
Metadata Governance
Management of metadata and taxonomy on an enterprise level, enforcing global content stewardship by means of controlled vocabularies on the site level.
Metadata Stewardship
Formalization and accountability over the management of data, metadata and taxonomy by site content management and ownership.
Create Proactive and Reactive Governance
Establish Information Governance within existing and new systems such as internal technologies, websites and SharePoint sites. Build data governance stewardship into the data quality methodology and into the issue resolution process.
Performing Stewardship
Data stewardship is evidenced by the response to and ongoing updates to keywords and search terms, as well as additions of new metadata models and user types.
Building Governance into Technology
Need representation from IT on the Governance Committee (Information Management is a corporate issue, not just an IT issue).
Conclusion
DAM is not a project; it is a program. By definition, a project has a finite beginning and end, whereas a DAM requires considerable attention and governance at all stages and from all stakeholders. Governance is the process that helps ensure that when the initial phases of the DAM initiative are accomplished, you will be positioned to seek further capital and share the next generation of business value with executives.
Assessing the health of governance is one of the most telling indicators and accurate predictors of enterprise DAM success. Every organization needs a way to ensure that the creation, use and distribution of information sustain the organization's strategies and objectives. Information Governance is the qualitative and quantitative method for managing the greatness of information.
John Horodyski @jhorodyski is a Partner within the Media & Entertainment practice at Optimity Advisors, focusing on Digital Asset Management, Metadata and Taxonomy.
July 1, 2014
The High Cost of Assuming Enterprise Search Works Like Google
by Mindy Carner
A common mistake when seeking to improve access to content across the enterprise is assuming that intranet systems will act as “Google for the enterprise.” Unfortunately, this assumption is incorrect because the technology that Google employs is designed for the World Wide Web, not for an intranet. This means that the algorithms, measures and tricks that search engines use to rank your website cannot be mirrored for internal content.
The following list describes several elements of the internet search engine puzzle, and why they do not apply to internal search applications.
Inbound links: The value of a web page is a function of the number of other websites that link to it. Arguably, the more sites that link to yours, the more likely it is that your content is good and trusted, and the more credible a web page's content, the higher its ranking in search. This algorithm simply does not apply to intranet content, which does not have such wide variances in quality.
Click-throughs: Web search engines also use click-throughs to determine ranking for a page. The more times people click through to your link for a given search, the higher that link rises in the rankings. This signal, too, is absent from intranet searches (a simplified sketch of how these web signals feed a ranking score, and why an intranet has nothing to feed them with, follows this list).
Structured data: The web pages that constitute the bulk of content on the World Wide Web are structured data. That is because they are organized using HTML (HyperText Markup Language), which marks up web content with tags that tell a search engine what is a title, what is a header, and what is plain old paragraph text. HTML also includes keywords, descriptions and much more structured markup that a savvy page creator can use to tell search engines what a web page is about. Because intranets rarely, if ever, have such organizing mechanisms, they accumulate copious amounts of unstructured content: Office documents, PDFs, scanned and OCR'd files, and other digital assets with no formal structure to their content. This lack of formal structure makes for unstructured data, and for poor search results.
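To make the contrast concrete, here is a deliberately simplified sketch, not any real engine's algorithm, of a relevance score built from the web signals described above. On a typical intranet the link and click signals are simply missing, so the score collapses to keyword matching alone.

```python
# Toy relevance score for illustration only; real search engines are far more complex.
def rank(keyword_match, inbound_links=0, click_throughs=0):
    """Boost a basic keyword-match score with link and click-through signals."""
    return keyword_match * (1 + 0.1 * inbound_links + 0.05 * click_throughs)

web_page = rank(keyword_match=0.6, inbound_links=120, click_throughs=300)
intranet_doc = rank(keyword_match=0.6)  # no inbound links, no click history
print(round(web_page, 1), intranet_doc)  # the same content scores very differently
```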
Not understanding this vital difference between enterprise search applications and internet search engines can lead companies to invest thousands of dollars in a technology solution that will underperform unless they also invest in the people, processes and metadata necessary to transform unstructured data into structured data. Without these additional investments, an off-the-shelf technology application is likely to yield nothing more than a very expensive keyword search.
For an in-depth look at how taxonomy fills in the gaps of an enterprise search solution, please read the Optimity Advisors Orange Paper “Enterprise Search and Taxonomy: Filling the gaps of an out-of-the-box enterprise search solution.”
Mindy Carner is a Senior Associate in the Media and Entertainment practice at Optimity Advisors
June 24, 2014
When Video Rights Go Wrong
by Julia Goodwin
Your company has a contract for a video you’ve bought or sold. All the terms spelling out your rights are there in black and white. They’re good rights, too, and you’re glad you got them. But then something goes wrong. It seems that over time the rights were somehow separated from the video they were defining. Putting the two back together again after so many hands have touched the video and its rights appears to be a daunting challenge.
There’s no such thing as a one-size-fits-all prescription for managing video rights. From the small production company to the large media conglomerate, the right size and health of your video rights landscape is a subjective question that depends on the needs of the organization, its workflows and its business focus. If all is well, a company's video rights are tracked, it can share rights information when it needs to, it bills (or gets billed) properly for its distribution agreements, its assets are protected, and it can let go of most of its video rights anxiety.
If you’re not sure how your company is doing, here are some warning signs that your video rights may have gone wrong:
Internal consumers of rights (especially salespeople and derivative product divisions) complain about the amount of time it takes for them to get rights information
Your external participants complain they don’t get paid on time
You’re not sure if the royalties checks you’re getting are accurate
You have boxes of material that you’ll get around to cataloguing and monetizing some day when you have extra budget
You’re not sure how far out into the world your content has gone…without your knowledge or permission
You have multiple systems managing your video rights information
These are just a few of the most dramatic and painful symptoms of video rights gone wrong. With the proliferation of distribution channels for video and the monetary opportunities they bring, having a strong video rights ecosystem will ensure your company’s long-term success.
Companies invest heavily in systems and integrations to create or buy content, describe it, store it and send it out the door. Recently, it has become increasingly important to think carefully about the way video rights weave throughout those workflows, systems and integrations. This exercise will surface improvements to workflows and can introduce innovative new systems that optimize a company's time to market, by clarifying what the company is allowed to use and by protecting its content from misuse. A company can have thousands of videos in its Digital Asset Management (DAM) system, but if it doesn't know how it is allowed to use them, what are those assets really worth?
Earlier this year, we learned how videos and their rights can be a timeless asset, especially for nostalgia buffs. Thanks to a blog post by the Baseball Hall of Fame, baseball aficionados were treated to a lost moment in our national pastime when video footage of Babe Ruth and Lou Gehrig from 1925, held in the Fox Movietone archive at the University of South Carolina, made headlines around the world. In fact, the clip had been in the university's archive for many years and had even been licensed before. Suddenly, sparked by the blog post, the video lit up the media for days. It is interesting to draw a parallel to a company's video catalog. Could a company mine its archives for such videos and related rights? Could licensing older content create new interest and income, supported by social media? It is exciting to think about the creative ways a company might make its video rights go really right.
Knowing the state of a company's videos and the related rights is critical to an efficient and proactive video ecosystem. To learn more, see the Optimity Advisors Orange Paper on Video Rights.
Julia Goodwin is a Senior Manager within the Media & Entertainment practice at Optimity Advisors.