Social (media, technologies, applications) continues to penetrate the corporate world. To date the penetration has primarily been driven by the demand to take advantage of Facebook’s massive reach and enable companies to get closer to their customers. More recently, companies have started taking similar advantage of Twitter, LinkedIn and Google+. The quick adoption of social by companies of every size, including the enterprise, has led to the rapid success of companies like Buddy Media, Vitrue, and Wildfire, whose tools and services enable companies to establish their social presence, primarily on Facebook.
In the course of investing in social application companies over the past 3 years, we have considered several different uses of these technologies in business processes, from product development to marketing to customer service and job-applicant tracking. In a report published by the McKinsey Global Institute, the authors identify 10 different ways social technologies can add value to enterprise functions. Along with this broader usage, companies are beginning to ask more pointed questions about the value of social and the quantifiable benefits they receive from the use of such media, technologies and applications, as well as about the actions that will enable them to increase this value. Analytics can play an important role in answering these questions and be used to provide insights and the corresponding actions, i.e., Insight as a Service, to increase the value of social. While the initial use of social analytics has been by the CMO, such analytics can also be used by HR, customer support and other executives in the corporate suite.
Each use of social generates and leverages different types of data, from simple structured data about the characteristics of a brand’s fan base, to unstructured (and often messy and noisy) postings in activity streams, to more complex data captured in a social graph including Facebook’s Open Graph. Some of this data (e.g., tweets) is very fast with a short “shelf life,” while other data is more enduring, such as a brand’s social graph. All this is indeed big data and must be approached as such.
Collecting, organizing and preparing this data for analysis can also be challenging. In surveys conducted in 2011 by Gleanster, 89% of the surveyed executives found the tracking and measurement of results from social campaigns to be very challenging, while 94% found it difficult to generate actionable insights from the social campaign data they collect, which suggests that they may not even be collecting the right data. Some of this data can be useful even without being rigorously analyzed. For example, in a recently published whitepaper, Bazaarvoice states that social data alone is starting to impact CMOs’ decisions. It is also becoming evident to these executives that social data needs to be integrated with other corporate and third-party data in order to create the metrics and analytics to determine ROI. This points to the realization that while social data, including social graph data, is a new and less tested data type, early adopters of social are still able to recognize that this data by itself will not be sufficient for defining the value-analysis metrics and KPIs.
We are already starting to see that the analysis of social data, when done correctly, can yield significant results, allow business users to determine “what moves the needle” and establish the ROI of using social media and applications. But to determine which of the collected social data needs to be analyzed, one must first define what will constitute success in a particular process where social is used. For example, if social is used in marketing, will success be an increase in customer loyalty, higher brand engagement, or sales growth? What about in customer support? Will success be the optimization of the customer experience, or the reduction of the customer support costs? Once the objective is established, then one can define the metrics and KPIs that will be appropriate for determining its success. For example, to establish social ROI around customer loyalty, a CMO may want to understand whether prospects that are acquired through social media have a higher probability of becoming customers (i.e., convert) than prospects acquired through an email campaign, as well as understand the individual characteristics of the prospects that actually converted. Or the CMO may be interested in determining whether a particular social channel, such as Facebook, is leading to more conversions/sales than another such channel, such as YouTube. In its report on social analytics, Altimeter explains the various business objectives that can be achieved through the use of social and provides different metrics for understanding whether the objective has been successfully achieved. Obviously each metric is driven by different data. It is the analysis of these metrics and KPIs that provides the insights and actions which enable the determination of ROI.
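The channel comparison described above can be reduced to a simple calculation. The sketch below is purely illustrative: the channel names, acquisition counts, and conversion counts are invented for the example, not drawn from any real campaign.

```python
# Hypothetical comparison of conversion rates for prospects acquired
# through different channels. All figures are made up for illustration.

def conversion_rate(converted, acquired):
    """Fraction of acquired prospects that became customers."""
    return converted / acquired if acquired else 0.0

channels = {
    "social": {"acquired": 4000, "converted": 220},
    "email":  {"acquired": 9000, "converted": 360},
}

rates = {name: conversion_rate(c["converted"], c["acquired"])
         for name, c in channels.items()}

# The channel whose prospects convert at the highest rate
best = max(rates, key=rates.get)
```

In practice a CMO would also weigh cost per acquired prospect and the lifetime value of each converted customer, not just the raw conversion rate, before declaring a channel the winner.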
Social Analytics for Marketing
While the insights and actions derived from the analysis of social data can produce significant ROI in a variety of business functions and processes, early attempts to develop social analytics have centered around the marketing department’s needs because CMOs have been some of the earliest adopters of social media for branding and customer acquisition. But while CMOs have seen their brands successfully acquire many fans, they have had trouble determining which of these fans to try to convert to customers, and establishing the lifetime value of each such converted customer. We have seen three types of social analytics for the CMO: sentiment analysis, attribution analysis, and social graph analysis.
Sentiment analysis refers to the measurement of a consumer’s attitude towards a brand and its competitors. The attitude may be towards the overall brand or towards a particular product or service offered by the brand. Sentiment analysis can be used to identify certain trends (e.g., in a brand’s loyalty), but more importantly, it can be used to make predictions, such as predicting the broad sales performance of a product (and if combined with location-based and channel data, it can predict the performance around a particular geographic location, or predict the sales performance of the social channel in comparison to the sales performance of other channels, such as email). This type of analysis may be based on simple keyword-counting or on full-blown Natural Language Processing (NLP) of the social streams and other social content such as blogs and wikis. There are already tens of rather undifferentiated companies using keyword-counting to provide sentiment analysis solutions, and a few that use more sophisticated NLP approaches.
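To make the keyword-counting approach concrete, here is a minimal sketch of the simpler of the two techniques mentioned above. The word lists and sample posts are invented for illustration; a real lexicon would be far larger and would need to handle negation, sarcasm, and punctuation, which is precisely why NLP-based approaches are more sophisticated.

```python
# Minimal keyword-counting sentiment scorer. The lexicons below are
# toy examples, not a production sentiment dictionary.

POSITIVE = {"love", "great", "awesome", "recommend"}
NEGATIVE = {"hate", "terrible", "awful", "broken"}

def sentiment_score(post):
    """Return (#positive - #negative) keyword matches for one post."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "I love this brand and would recommend it",
    "terrible service, the product arrived broken",
]
scores = [sentiment_score(p) for p in posts]
```

Aggregating such scores over time, by geography, or by channel yields the trend and prediction signals described above, with accuracy bounded by the crudeness of the lexicon.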
Attribution analysis is the process of assessing the effectiveness of advertising campaigns by channel and each channel’s contribution to sales. As social has become an important channel, marketers have started using it for both direct response and brand advertising campaigns. Through attribution analytics the marketer is trying to determine what percentage of each social ad’s audience has been exposed to the ad, how many times each ad was seen, whether the use of social media “drove” prospects to properties owned by the brand (e.g., the brand’s web site, or a store), and whether social media is more effective than paid advertising. Given the dynamic nature of social media and the very large number of variables involved with each campaign, isolating the elements that made a campaign successful is hard, which makes attribution very difficult.
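Two of the simplest attribution models make the difficulty concrete: crediting only the final touch versus splitting credit evenly across all touches can assign the same conversions to channels very differently. The journeys below are invented examples, and both models are deliberate simplifications of the problem described above.

```python
# Sketch of last-touch vs. linear attribution over invented prospect
# journeys. Each journey is the ordered list of channel touches that
# preceded one conversion.

from collections import defaultdict

def last_touch(journeys):
    """Credit each conversion entirely to the final touch."""
    credit = defaultdict(float)
    for touches in journeys:
        credit[touches[-1]] += 1.0
    return dict(credit)

def linear(journeys):
    """Split each conversion's credit evenly across all touches."""
    credit = defaultdict(float)
    for touches in journeys:
        for t in touches:
            credit[t] += 1.0 / len(touches)
    return dict(credit)

journeys = [
    ["social", "email", "social"],  # converted after a social touch
    ["display", "social"],
]
```

Under last-touch, social receives all the credit; under the linear model, email and display each get a share. The gap between the two models, multiplied across thousands of variables per campaign, is why attribution is so contentious.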
Social graph analysis refers to the examination of the graph’s structure to reach certain conclusions. A graph’s structure consists of the data associated with each node and the connections/links between pairs of nodes. In a social graph a node may represent an individual (e.g., Evangelos Simoudis, his place of residence, his birthday, etc.) or an organization (e.g., the Coca-Cola Company, headquartered in Atlanta). Connections may represent certain relations between the nodes (e.g., Evangelos Simoudis is a fan of Coca-Cola, and John Doe is a friend of Evangelos Simoudis and reads his blog, which he publishes through a particular URL). This is a more recent form of social analytics and it is being made possible by the API-based accessibility of the various social graphs, such as Facebook’s Open Graph. Graph analytics are being used to identify a brand’s advocates and establish which of these advocates are real influencers. For example, one way of using the Open Graph to establish whether a fan is a brand’s advocate is by determining if the fan is publishing sponsored stories about the brand. By subsequently analyzing how these stories propagate along the social graph of each social channel (e.g., a sponsored story published in Facebook may be re-tweeted, thus moving from one social channel, and one social graph, to another) among the advocate’s social sphere, and the social sphere of each of his friends, a brand can determine which of these advocates are influencers. Finally, by analyzing the characteristics of these influencers (e.g., determining each influencer’s lifetime value), marketers can treat them differently than other advocates and fans. For example, the brand may decide to extend them special discounts to buy a product or service. Three of my own portfolio companies (Extole, 8thbridge and ThisMoment) have developed extensive social graph analytics that are now being used by their customers.
Extole and 8thbridge analyze Facebook’s Open Graph, whereas ThisMoment analyzes Google’s social graph. A very good example of Extole social graph analytics can be found here. According to eConsultancy research, 88% of surveyed companies indicated that social graph personalization, which is the result of social graph analytics, generates results.
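One step of the graph analytics described above, ranking a brand’s advocates by how far their stories propagate, can be sketched with a toy graph. The graph, the advocates, and the two-hop reach metric are all invented for the example; real social graphs are vastly larger and would be traversed through an API such as Facebook’s Open Graph rather than an in-memory dictionary.

```python
# Toy sketch: rank advocates by story propagation through a social graph.
# Adjacency list maps each user to the users who see a story they publish.

graph = {
    "alice": ["bob", "carol", "dave"],
    "bob":   ["alice", "erin"],
    "carol": ["frank"],
    "dave":  [],
    "erin":  [],
    "frank": [],
}

def reach(user, hops=2):
    """Count users reached within `hops` re-shares of a story from `user`."""
    seen, frontier = {user}, {user}
    for _ in range(hops):
        frontier = {n for u in frontier for n in graph.get(u, [])} - seen
        seen |= frontier
    return len(seen) - 1  # exclude the publisher

advocates = ["alice", "carol"]
influencer = max(advocates, key=reach)  # advocate with the widest reach
```

The advocate whose stories travel furthest is the candidate influencer; combining that reach with each influencer’s lifetime value is what lets a marketer decide who merits special discounts or other differentiated treatment.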
Conclusions
As the use of social tools expands in a variety of business processes, there is an increasing need to analyze the generated big data to extract insights and drive the right actions, which improve the performance of each such process.
Social data is the key ingredient in social analytics, but often it must be combined with other data to produce the necessary high-quality results, very much like in any other area where analytics is being used.
Though it is considered the most widely used form of social analytics, sentiment analysis is not the only type of analysis to be applied to social data. Attribution analysis and graph analysis are important forms of social analytics.
Marketing departments in general and the CMO in particular have been early adopters of social analytics. By using social analytics they are starting to demonstrate the benefit and associated ROI of applying social media, technologies and applications to a diverse set of tasks, including advertising effectiveness, and brand engagement.
A couple of weeks ago my partners and I hosted a meeting of our SaaS advisory board. This is one of our firm’s four advisory boards. I had written about the role of these boards and the value our firm and portfolio companies derive from our advisors’ insights, feedback and help. Our SaaS advisory board includes executives from SaaS application vendors, CIOs and CTOs of companies that are heavy users of SaaS applications, and leading SaaS consultants. One of the most interesting insights that came out of our meeting was that Chief Marketing Officers (CMOs) are starting to drive the corporate application agenda. This is consistent with Laura McLellan’s position, stated in a recent Gartner webinar, that by 2017 the CMO will spend more on IT than the CIO, as well as with the conclusions presented here.
There are several reasons for the CMO’s emerging power:
The accelerating move from offline to online for commerce, entertainment, and socialization. The web with its various personas (desktop, social, mobile, local) is enabling corporations to finally become truly customer-centric, i.e., to understand the problems customers face and provide mutually advantageous solutions.
The big data that is collected from the various online and offline interactions between a consumer and a brand can now be utilized effectively through a new generation of analytic solutions that enable corporations to better target existing customers and prospects with the right message, at the right time through the right channel, as well as to assess the effectiveness of each message based on the resulting actions.
Through the consumerization of the enterprise we are seeing a new generation of easier to “consume” applications that don’t resemble the monoliths of the past but instead take their cues from mobile applications, i.e., task-specific pieces of functionality that are easy to install, learn and use effectively.
The cloud that is making acquisition and deployment of these next-generation applications easier and faster.
The new generation of marketing personnel, including CMOs, that is more analytical, data-driven and technology-savvy.
Today we are seeing marketing departments acquiring applications to address online advertising for brand development and direct response commerce, social and mobile marketing and commerce, and analytics. Having anticipated this trend, over the past several years Trident Capital has been investing in such applications and today our relevant portfolio includes the online advertising technology companies Turn, Exelate, Jiwire, Brightroll, Sojern, the social marketing and commerce applicationsExtole and 8thbridge, and the retail analytic applications company Pivotlink.
Though the opportunities appear bright, based on our experiences with these portfolio companies we have learned that working with the marketing organization also presents several challenges:
There is little tolerance for long application-implementation periods, poorly functioning software, or the need for specialized personnel to operate these applications. Marketing departments want to see results quickly and, under today’s typical corporate budget environments, they don’t want to have to hire new people just so that they can use a new application.
Short period during which to demonstrate meaningful ROI. Marketing departments may be willing to evaluate several different applications but they will ultimately commit to the ones that give them quick time to value with sustainable and growing ROI.
Data is not always well organized and structured. This is the area where application vendors most frequently see a big difference between working with the marketing and the IT departments. For marketing departments, managing data and maintaining its quality are new tasks. These tasks become harder as the volume, velocity, complexity, and structure of customer data increase. Marketing application vendors must be prepared to help by providing appropriate services in this area, thus ensuring that the application’s time to value will be short.
Skills for data analysis for insight-generation are lacking. While marketing departments are becoming awash in data, they often don’t have the people who can effectively analyze this data. Again, the marketing application vendors need to step in and fill the void by offering their own data analytics and insight-generation services.
Shorter initial licensing contracts and smaller marketing campaigns. As they try to understand the value of the myriad of applications offered to them in order to implement their customer-centric strategies, marketing departments feel that they must first “get their feet wet.” In most cases this approach results in application licensing contracts that initially are short-term (1-3 months), or in smaller-dollar (typically $10-50K) marketing campaigns.
Need sales people who can first speak the marketer’s language rather than IT’s language. Over the past 30 years we have trained a cadre of application sales people who are expert at interacting with IT organizations, speaking IT’s language. This was necessary because front- and back-office enterprise applications, regardless of who was using them, were mostly purchased by the IT organization. If the next generation of marketing application companies is to be successful, they will need to hire sales people who can interact with marketing executives.
The sales cycles for these applications are becoming longer and more complex (see also here). Before the final decision for the licensing of these applications is made, IT is now becoming involved, and will continue to do so. Though the CMO’s prominence is rising, don’t expect the CIO’s role in marketing technology decisions to disappear. Over the past year our relevant portfolio companies started to see CIOs participating in important procurement decisions involving solutions for the marketing department.
The application’s user experience must be tailored to the marketing department’s users. We are starting to see application developers creating user experiences that are more akin to the practices being established around consumer software, particularly PostPC consumer applications. To easily adopt the multitude of new applications offered to them, marketing departments must want to interact with them and must be able to do so easily and with little or, preferably, no training.
As corporations become more customer-centric and the move to online continues, data-driven marketing departments stand to reap big rewards. For this reason they are acquiring a new generation of applications to help them improve their interactions with customers and prospects regardless of channel. Marketing application vendors must understand this trend along with its positive and negative implications, as well as the evolving roles of CMOs and CIOs, in order to best capitalize on it.
A few days ago I presented a webinar on Insight as a Service. In the presentation I tried to provide further details on the concept which I first introduced here and later elaborated here. I am including the webinar presentation (click on the slide below) and the notes because they elaborate further on Insight as a Service and provide some examples.
A few days ago I participated in Pacific Crest’s workshop for private SaaS companies. This workshop is held every year as part of Pacific Crest’s technology conference. In addition to the spirited discussion among SaaS company executives and investors, during the workshop Pacific Crest’s Brendan Barnicle and David Spitz presented the results of two surveys they conducted. Brendan spoke about the CIO survey (sample of about 100 CIOs) regarding SaaS trends and sentiment. David presented the results of the SaaS private company survey (sample of about 70 private SaaS companies) about business metrics. A number of Trident Capital’s SaaS portfolio companies were invited to participate in the workshop in addition to me. Of the material that was presented, the data that caught my attention included:
Overall 2011 IT budgets are expected to increase by 0.9% over 2010 numbers, compared to the 1.1% increase that was anticipated during 1H11. Of the surveyed CIOs 35% indicated their intention to re-evaluate their IT budgets during 2H11, with 60% of them anticipating selective budget increases.
SaaS application usage in the enterprise is increasing and adoption of such applications is becoming a higher priority. The surveyed CIOs indicated that today 16% of the applications used by their corporations are SaaS, whereas next year the number will be 17%.
CRM and BI/analytics, including web analytics, remain the top areas where SaaS applications are first being used in the enterprise. More importantly, according to the survey and other information presented by SaaS vendors in the conference, CIOs are now asking for suggestions on the types of SaaS applications to include in their portfolio. I think that CIOs are realizing that SaaS adoption is unstoppable, and they don’t want to be marginalized by opposing it as was the case just a couple of years ago. According to InformationWeek, 65% of contracts with SaaS companies are still being initiated by the business, and only 35% are initiated by CIOs.
During 2Q11 SaaS vendors were able to increase prices, around 5%, or they offered fewer discounts. Two reasons were given for this trend. First, corporations are finding that SaaS applications can drive revenues or significantly reduce cost. Second, their employees like using these applications so their usage is expanding within each company. Several of the private SaaS company executives participating in the workshop confirmed this pricing trend but they also countered that they are spending more time than in the past negotiating terms with clients.
Data is becoming an increasingly important component of every SaaS solution. SaaS vendors are starting to exploit the data they collect from each customer either in order to offer additional applications around this data, or to provide benchmarking services among their clients. Almost a year ago I had written about this opportunity and called it insight-as-a-service. In the main conference, Realpage reported how it is using data to offer 10 new applications to its customers.
Security, or the perception about the higher security vulnerability of SaaS applications, remains the biggest obstacle to the broader adoption by the enterprise of cloud computing in general and such applications in particular. The CIOs must also move from a product to a service mentality in order to better support the business units that adopt SaaS applications. The employees of SaaS companies realize the importance of service and are focusing on this issue much more than the employees of on-premise software companies.
During 2011 SaaS companies have been growing faster than during 2010 (median revenue will grow 44% during 2011 vs 40% during 2010) but with over 30% of the respondents projecting YoY growth that will be greater than 60%.
SaaS companies expecting to do $10-25M in 2011 revenue are growing the fastest compared to the other smaller and larger companies that were surveyed. These companies expect an average growth rate of 75%, about double last year’s rate, indicating that for SaaS companies $10-15M in revenue provides “escape velocity.”
In last year’s survey, SaaS companies that were using field sales were growing faster than those relying predominantly on inside sales. The 2011 survey results were more balanced: the companies using inside sales were growing at similar rates to those using field sales. Moreover, among the 2011 survey participants field and inside sales were the two dominant go-to-market models. Internet and channel appear to be used relatively infrequently as the primary sales models. As I had also reported from our own SaaS portfolio company results, the field sales model is being predominantly used by companies whose solutions command higher ACV (over $60-70K).
Median CAC per new-customer dollar was reported at $0.93, whereas for upsells and renewals it was $0.28 and $0.16 respectively, in line with expected patterns.
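The CAC-per-dollar metric above is simply sales and marketing spend divided by the annual contract value it produced, computed separately for new business, upsells, and renewals. The sketch below illustrates the arithmetic; the spend and bookings figures are hypothetical, chosen only so the ratios match the medians reported in the survey.

```python
# Illustrative CAC-per-dollar calculation. The spend and ACV figures
# are invented so the ratios reproduce the survey medians above.

def cac_per_dollar(sm_spend, acv_booked):
    """Sales & marketing cost to acquire $1 of annual contract value."""
    return sm_spend / acv_booked

bookings = {
    "new":     {"spend": 930_000, "acv": 1_000_000},
    "upsell":  {"spend": 140_000, "acv":   500_000},
    "renewal": {"spend":  64_000, "acv":   400_000},
}

cac = {kind: cac_per_dollar(v["spend"], v["acv"])
       for kind, v in bookings.items()}
```

The pattern the survey calls expected follows directly: renewals and upsells are far cheaper to book than new business, which is why net revenue retention matters so much to SaaS economics.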
The surveyed companies reported that they expected their gross margin at scale (defined as $50M in annual license revenue) to be 71%. Pacific Crest’s target model is at 79%. Companies that expect to grow by more than 45% expect to spend 38% of their budget on sales and marketing.
More companies, particularly those with larger ACVs, are reporting contract lengths of more than 1 year, again indicating a trend that we have also been seeing in our own portfolio companies. A few of the company executives that participated in the workshop indicated that their customers are asking for longer-term contracts (primarily 3 years) typically paying 1 year upfront. In fewer cases, SaaS company CEOs reported that they are pushing multiyear contracts on their customers.
Companies are moving away from pricing based on seats and are looking for other business models that are primarily related to the application’s usage.
Best in class churn among the companies surveyed was reported at around 5%.
Most companies that have reached “escape velocity,” i.e., annual revenues that are larger than $10-15M, have raised $25-40M.
Companies are reaching breakeven and are expecting to generate free cash flow at around $20M of annual revenue.
I walked away from the workshop with a few conclusions regarding the state of SaaS. First, the data presented reaffirmed the unstoppable wave that SaaS and cloud computing represent, along with the tremendous investment opportunity in these sectors. Second, the presented data further validated the data we track in our own SaaS portfolio companies: customer behavior metrics, annual growth rates, contract terms, ACV, CAC, churn, capital efficiency, and escape velocity. As I had written a couple of years ago, the 2008 recession catalyzed and even accelerated the adoption of SaaS applications in the same way the Y2K problem catalyzed the use of IT outsourcing during the late ‘90s and the first decade of this century. I expect that even if the economy continues to exhibit the weakness that has recently been reported, SaaS companies will do great. Enterprises have come to understand and appreciate the financial benefits resulting from the use of SaaS applications to both their top and bottom lines. Finally, over the past couple of years, more than ever before, enterprises have comprehended the importance of customer centricity and customer intimacy. Through the use of SaaS applications, corporations are able to more easily collect and analyze pertinent transactional customer data and mash it with social, mobile and sensor data creating big data collections of unprecedented detail that help them better understand their customers and thus become more customer-centric.
Despite the macro trends that continue to be challenging for the US economy, concerns about Europe and Greece’s impact on the continent’s economy, and the continued impasse for an agreement that will raise the US debt ceiling, SaaS companies have been reporting strong performance results for 2Q11. I got a good feeling for the quarter when our SaaS portfolio company CEOs started to report preliminary positive results before the end of June. We typically have to send messages asking them to update us on the quarter’s performance. During 1Q11 only 40% of our SaaS companies achieved their numbers. It was a different story during 2Q11 when 90% of our SaaS portfolio companies made or exceeded their financial targets.
The public SaaS companies that have already reported quarterly results (RightNow, Concur, NetSuite, SPS Commerce, Vocus, AthenaHealth) demonstrated strong performance, meeting or beating expectations. Salesforce and SuccessFactors will be reporting in the next few days but are expected to turn in strong results. During the quarter we didn’t see the M&A activity that was exhibited during 1Q11, but there were plenty of private company financings (both initial and follow-on rounds) indicating the continued strong investment interest in the sector. Most of these rounds commanded valuations that, depending on how you see it, either defy reason or point to investor bullishness on the strength of the SaaS model; but that’s another story. Based on the data provided to Trident by the private SaaS companies that sought financing during the quarter, we saw strong performance during 2Q11.
There were several elements of our SaaS portfolio’s performance we liked:
Increasing demand for SaaS applications particularly by the larger enterprises where business application updates remain one of the top priorities. We are seeing similar upmarket movement by several of the public SaaS vendors.
The percent of multiyear contracts is growing, with more of the larger customers opting for 3-year contracts. These customers are prepaying at least the first year which greatly helps SaaS companies with their cash flow needs.
As the demand for SaaS applications increases, more systems integrators are starting to support the model and are seeking partnerships with the right application vendors, including private companies. Our portfolio companies signed more partnership deals during this quarter than ever before.
Contract renewals met targets and several customers not only renewed but expanded their usage. This is the result of continued strong ROI that customers are seeing through the use of our portfolio’s SaaS applications.
Interest from international clients, particularly from Europe, continues to increase and several US customers are expanding the use of SaaS applications to their international operations and subsidiaries.
While it is still early for definite conclusions, sales pipelines for 3Q and 4Q are growing, generating optimism for strong results during these quarters, assuming the US economy does not stall.
The performance elements we didn’t like include:
Higher discounting and more deals signed at the end of the quarter. The larger customers, particularly those signing multi-year contracts, have continued to hold off until the end of the quarter before they commit. We have seen a few percentage points increase in the amount of discounting necessary in order to close several of these larger deals, compared to 1Q11. In addition to the discounting, as can be expected, signing a contract at the end of the quarter had a negative impact on the target MRR for these companies. However, backlogs grew as expected.
Increasing sales and marketing costs. The move upmarket by several of our SaaS companies, as well as by many of the public ones, necessitates a different and more expensive sales and marketing approach. Several of our companies are establishing additional field sales teams to adequately address customer expectations. This move is negatively impacting margins and challenges the high-leverage SaaS sales model that we at Trident, as long-term investors in this software sector, have come to expect.
Some of the larger companies looking to adopt SaaS applications expect hybrid cloud implementations primarily because they want to keep their data behind their firewall. Thus far private SaaS companies have been resisting this requirement because it will necessitate R&D investments to create hybrid cloud versions of their applications. It is not clear whether our companies will ultimately be successful in convincing customers, particularly in industries such as financial services, telco and health care, about the adequacy of the public cloud for their application needs.
We hope that the lackluster performance of 1Q11 was this year’s exception and that our SaaS companies during 3Q and 4Q will continue demonstrating the strong performance they exhibited during 2Q11.
A couple of years ago while analyzing the ways SaaS business applications could evolve we started thinking about the social web’s impact on business processes. At the time, Facebook’s success was accelerating while Twitter and Zynga were emerging. We know that consumer-oriented companies want to engage their customers and prospects in the places they frequent, i.e., the social web. We also started seeing early signs of consumer-oriented technology adoption by corporations. These realizations made us hypothesize that the social web will figure prominently in the next generation of business applications and that these applications will be built on top of new platforms that have the social web at their core. To date we have invested in three social application companies: Extole, a company that provides a social marketing application, 8thbridge, a company that provides a social shopping and commerce application, and Jobvite, a company that provides a social recruiting application.
As we examine the characteristics of our three investments, as well as those of other relevant companies we have considered investing in, we have concluded that the applications developed by such companies:
Are delivered over the cloud.
Automate business processes that target individuals (consumers or employees), e.g., marketing, shopping, recruiting, customer experience management, and collaboration. For example, Starbucks’ 13M Facebook fans or Coke’s 11M Facebook fans do nothing for these brands unless the brands can somehow convert that audience into demonstrated engagement. A couple of weeks ago 8thbridge, working with Paramount Pictures, created the Facebook store for the recent Transformers movie. In the first two days after the store went up, it attracted 900K new fans who, most importantly, generated 77K content interactions and also drove many ticket sales. As consumers increase their participation in social media, business executives continue to create programs to engage with them. In fact, 70% of interactive marketers are currently piloting processes to drive word-of-mouth marketing through their most vocal consumers, 58% have launched systems encouraging consumers to spread company messages, and 47% have launched social tools that allow consumers to support each other, e.g., forums.
Are implemented on new platforms that have social networking structures (e.g., access to and operations on a social graph) and capabilities (e.g., capitalizing on the social graph to enable viral distribution) at their core. In addition, these platforms have global reach, support large partner ecosystems, are device agnostic, and support online and mobile communications, advertising and commerce. Facebook has emerged as the strongest platform with these characteristics, in the same way that Google a few years earlier emerged as the platform for applications with search at their core. However, we expect that Google, Microsoft and Apple will soon augment their own web platforms with such features as well.
Are data-centric, in that they generate and operate on big data, and use analytics to provide a variety of insights pertinent to the business: how a fan uses his social graph to spread a message about a brand; how to improve user engagement around a brand; which customers must be nurtured because they are brand influencers or can attract talent to a particular company; which customers or brand fans must be rewarded because they can address product problems; which consumers should be offered an incentive, e.g., frequent flier miles, because they can drive a particular group-buying behavior, e.g., organizing a vacation with a group of their Facebook friends and purchasing plane tickets; and which customers are ready to defect because they are dissatisfied with a company’s service quality.
Require significant consulting services as corporations are in the very early stages of trying to determine how to take advantage of the social web and appropriately adjust their business processes and practices. In general, we are seeing that as much as 50% of these companies’ revenue could come from consulting services. Unfortunately, the established professional services organizations are not yet able to field the right solutions. As a result, the social application companies end up offering the services themselves.
We continue to look for additional investment opportunities in early and expansion stage companies whose applications use the social web to address important business processes in unique and valuable ways. Some application categories are becoming overcrowded with companies that have little differentiation and are being overfunded by investors. We, however, are looking for companies that have developed social applications whose value will not become obvious for another couple of years.
After being away for a few weeks (vacation and business travel), I returned to the Bay Area and attended GigaOm’s Structure conference and Amazon’s AWS Summit. I have been attending Structure since the first event but this was my first time attending the AWS Summit. I was drawn to these conferences for two reasons. First, while over the past 10 years we have invested and continue to invest heavily in SaaS application companies, we have foregone investment opportunities in companies that provide cloud infrastructure solutions, i.e., PaaS and IaaS. I have been trying to determine whether we should be considering investments in cloud infrastructure companies today, particularly capitalizing on our extensive experience from SaaS. Second, I am always interested to hear about cloud computing best practices that can be used by our portfolio companies.
Cloud adoption is growing among corporations of every size, but particularly SMBs. In a report published by Forrester in April 2011, the total size of the public cloud market is pegged at roughly $25B (with the majority today being in SaaS applications) and is projected to grow to $160B by 2020. A recent (May 2011) survey conducted by Morgan Stanley finds that the usage of public clouds for a variety of workloads is expected to show a 23% CAGR over at least the next three years, with higher usage rates for SaaS applications compared to cloud-based infrastructure. SMBs see the use of public clouds as a means of significantly reducing their hardware costs. Larger enterprises are using public clouds more for rapidly extending the functionality of internally developed applications, with mobile extensions cited most frequently, as well as for executing compute-intensive workloads like analytics. There is also a lot more talk about private clouds and hybrid clouds. In fact, most companies are either using all three types of cloud deployment or state their intent to use them; only 37% of companies surveyed indicated that they will use public cloud deployments exclusively. This means that SaaS application vendors may need to start thinking more seriously about offering versions of their software running on private and hybrid clouds rather than only the public cloud options that most, particularly the startups, have today. For investors this means that their SaaS companies may need to invest more in R&D and support as they roll out such options.
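Those two projections are mutually consistent, which is easy to check with a quick back-of-envelope calculation. A sketch in Python; it treats the roughly $25B Forrester figure as the 2011 baseline:

```python
# Back-of-envelope check: what compound annual growth rate (CAGR) takes
# the public cloud market from ~$25B (2011) to a projected $160B (2020)?
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

implied = cagr(25e9, 160e9, 2020 - 2011)
print(f"Implied CAGR: {implied:.1%}")  # roughly 23% per year
```

The implied rate comes out to roughly 23% a year, in line with the growth rate the Morgan Stanley survey reports for public cloud usage.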
Security, interoperability, vendor lock-in, reliability, complexity and data privacy continue to be listed as the top inhibitors to cloud adoption. Ironically, however, customers don’t always feel that public clouds are less secure than their own data centers, though in “security” they also tend to include regulatory and compliance issues. Privacy is a different story, and one that the US and Europe view differently.
In a report released during the Structure conference, with some numbers corroborated by Werner Vogels, CTO of Amazon.com, during his address at the AWS Summit, GigaOm ranks Amazon as the largest public cloud provider with 2010 revenue of $500M, with Rackspace second. In his address Dr. Vogels talked about the investments Amazon has made and continues to make around AWS. Rackspace, Salesforce, Microsoft, Google, IBM, and other large companies that want to be providers of public cloud infrastructure and services are also making huge investments in order to remain price-competitive and offer the broadest possible set of services. Based on what I heard during that week, and despite the fact that investments in cloud companies are increasing (64 investments in 2009 totaling $180M, and 93 investments in 2010 totaling $713M), I think that it will be hard, if not impossible, for a startup to compete effectively with the large public cloud computing providers. Public cloud infrastructure has therefore emerged as a battle of giants. I continue to maintain that the SaaS application space offers more opportunity to startup companies and their investors, Trident included.
Cloud continues to be defined by its benefits rather than its technology. Cloud computing benefits that were once merely claimed by vendors are now being validated by customers who have been using such solutions. Surveyed customers see cloud computing as a way to lower IT costs, increase corporate agility, and provide the foundation for 21st-century architectures, with the right levels of abstraction from infrastructure to platform resulting in higher-quality systems. Based on these benefits and on what was discussed and reported during the conferences, a few of the best practices that may be of use to SaaS application companies include:
The application’s architecture determines the level of success and the overall cost of deploying the solution to a public cloud. Moreover, the successful use of a public cloud by an application provider requires the continuous collaboration between the cloud provider and the application vendor.
Public cloud vendors strive to provide uniform, rather than peak-performance, characteristics across the computing resources they provision. This must be taken into account by the SaaS application vendors as they architect their solutions.
While a public cloud provider can offer a SaaS application vendor a low-cost way to develop and launch a new application, scaling costs continue to be high, often leading the application vendor to create its own cloud. Of course, companies the size of Zynga and Netflix are able to negotiate special pricing with their public cloud providers, e.g., AWS. However, for private companies that have not yet reached that scale, the options are more limited.
I expect that this year the number of venture investments and the amount invested in cloud computing companies will surpass last year’s numbers. I continue to see investment opportunities in the following areas:
Big Data Analytics. Analysis of big data is starting to become synonymous with cloud computing. I expect that this trend will continue as public cloud providers offer Hadoop and MapReduce options allowing analysts to send their large workloads to the cloud. Data movement and the “accessibility” of such services to business analysts (you shouldn’t need to be a data scientist just to use such a service) will remain issues that need to be addressed.
Enterprise mobile applications. Because of the proliferation of smartphones, enterprises are now starting to aggressively adopt mobile applications. As a first step, enterprises are looking to create mobile versions of their mission-critical applications, as well as develop mobile applications in completely new, innovative areas. Public and hybrid clouds are uniquely suited for the rapid development and deployment of these applications. Therefore, both specific enterprise mobile applications and environments for developing and deploying such applications are of interest as potential investments.
Data and application integration. I have often written about how the proliferation of public and hybrid clouds will increase the need for integrating data to such applications as well as integrating applications. For example, a cloud-based application may need to access data that is behind the firewall, or two cloud-based applications may need to be integrated to effectively automate a business process.
Data marketplaces and API management. As companies expose more of their data to cloud-based applications, the data itself and the APIs through which this access is accomplished will need to be managed. While I think that we are still in the very early stages of data marketplaces and API management, and it is not clear whether either of these areas will emerge into substantial markets and lead to the creation of large companies, there may exist a few interesting investment opportunities to consider.
Accelerating data movement. Moving large data sets from data centers to the cloud remains a vexing problem, particularly for big data analytics applications. While cloud-based processing and storage performance are improving, data movement has not yet made corresponding strides, so approaches that address this problem are ripe for investment.
I continue to be excited about the prospects offered by cloud computing and delighted by the increasing usage of all its layers (application to infrastructure) by small and large companies. I don’t feel that we have missed big opportunities by not investing below the cloud’s application layer, but I do see at least five areas that represent interesting targets for future investment.
Last October I wrote about emerging data analytic services that could be offered under the term Insight as a Service. More recently Mark Suster wrote about the need for data as a service, and Gordon Ritter in his post added his thoughts on Insight as a Service. In my post I had proposed three types of data that can be used for the creation of such insights: internal company data, web usage data, and third-party syndicated data. That same month I invested in Exelate, a company that provides a marketplace for online data and a Data Management Platform (DMP), both of which can be used to improve the targeting effectiveness of display advertising. Online advertising and financial analytics are two areas where data marketplaces are succeeding. Exelate and other marketplaces demonstrate that significant companies can be built in this area. Companies like Thomson Reuters and Bloomberg, and startups like cloud-based Xignite, have also been successful in selling financial data to application vendors. More recently we are seeing startups like Gnip, Factual, Infochimps and WebServius, as well as Microsoft’s Azure DataMarket, introduce cloud-based marketplaces offering a broad variety of commercial and open source data sets. The success of data-driven application companies such as Zillow, Recorded Future, Payscale and a few others provides proof that innovative solutions can be developed from third-party data. But the cost of the base data on top of which these solutions are developed is minuscule compared to the cost of processing and augmenting the data to provide unique value to the user of these solutions. So, outside the online marketing and financial services areas, under what conditions can venture investors make money investing in companies that offer data marketplaces?
Large quantities of data are made available daily. Even though Hadoop and other emerging open source big data management technologies are significantly reducing the cost of storing, managing and processing such data, I claim that creating a data marketplace from scratch is still an expensive proposition. For data offered through a marketplace to be valuable it must be unique and complete. For the marketplace to be successful, in addition to having data with these characteristics it must solve the data distribution and monetization problem. This is actually a chicken-and-egg problem: the marketplace must contain a sufficient variety of unique and complete data to attract buyers, but it needs data buyers to attract such data sources. Companies like Factual and Infochimps address this problem by investing heavily in marketing, offering free onboarding of data regardless of the data’s revenue potential, and focusing primarily on open source data that typically comes from governments.
Uniqueness is often created by augmenting a data set in a variety of ways. For example, Zillow processes Google Earth data. Exelate’s analytics group does the same to data contributed by online publishers, increasing its value and information content for the DSPs that use it by identifying trends, attributes that are predictive of specific desired outcomes, etc. It is not clear whether general-purpose data marketplaces will be able to provide such augmentation, because doing so would imply becoming experts in several different application areas. This kind of processing costs money, and both the processing and the resulting data are proprietary. As a result, the processed data may never find its way to a data marketplace.
Completeness of a data set is also very important. For example, if I want to compare the room prices of major hotel chains across the US in order to determine whether one chain is consistently more expensive than the others, I will need to obtain prices from every US city where the hotel chains being compared have properties. If I want to be comprehensive, I can’t be satisfied having prices for only 60% of the cities and 50% of the properties of each chain. I need a complete set.
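A completeness requirement like this is straightforward to measure once the required universe of keys is known. A minimal sketch; the city names and figures below are hypothetical, chosen to mirror the 60% example above:

```python
# Sketch: measure how complete a data set is relative to the full
# universe of keys (here, cities) it claims to cover.
def coverage(required: set, observed: set) -> float:
    """Fraction of required keys actually present in the data."""
    if not required:
        return 1.0
    return len(required & observed) / len(required)

# Hypothetical example: prices were collected for only some cities.
required_cities = {"New York", "Chicago", "Dallas", "Seattle", "Miami"}
cities_with_prices = {"New York", "Chicago", "Dallas"}

print(f"City coverage: {coverage(required_cities, cities_with_prices):.0%}")  # 60%
```

The same metric applies at any granularity, e.g., (city, property) pairs rather than cities alone, which is how the 50%-of-properties gap would show up.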
In addition to distribution, monetization, uniqueness and completeness, data marketplaces must address privacy, data ownership and data location. The marketplace must have clearly articulated policies about who owns the data and what operations can be performed on the licensed data. For example, I give my data to Facebook and feel relatively comfortable with its data privacy policies, or at least the settings I can control. We are now seeing applications built on top of Facebook with completely undefined privacy policies. (What data are these applications capturing in addition to what is provided by Facebook? How are they using it? Are they selling the data they capture?) Contributors to data marketplaces must also understand where their data is stored and what can be done with it. For example, is their data stored in a country where it can be easily subpoenaed or distributed with no control?
If these are indeed the necessary conditions for creating a financially-viable data marketplace, then rather than focusing on open source data, marketplaces must focus on licensing and distributing premium data that may be coming from companies like Zillow, Experian, Nielsen, etc. However, focusing on such premium data providers could imply a vertical, industry-specific approach to building marketplaces rather than a horizontal, general-purpose approach.
We are generating ever-increasing quantities of data that can be used in a new generation of innovative solutions. In the presence of all this data, data marketplaces offer data owners an attractive model for distributing and monetizing such data. However, early indications from companies that have built valuable data-driven solutions, as well as from application-specific data marketplaces, show that the type of processing and augmentation that must be performed on data before it can be effectively used by such solutions necessitates a vertical marketplace approach rather than a horizontal one.
Today, Friday the 13th of May, 2011, the Boulder BI Brain Trust heard from Larry Hill [find @lkhill1 on Twitter] and Rohit Amarnath [find @ramarnat on Twitter] of Full360 [find @full360 on Twitter] about the company's elasticBI™ offering.
Serving up business intelligence in the Cloud has gone through the general hype cycles of all other software applications, from early application service providers (ASP), through the software as a service (SaaS) pitches to the current Cloud hype, including infrastructure and platform as a service (IaaS and PaaS). All the early efforts have failed. To my mind, there have been three reasons for these failures.
Security concerns on the part of customers
Logistics difficulties in bringing large amounts of data into the cloud
Operational problems in scaling single-tenant instances of the BI stack to a large number of customers
Full360, a 15-year-old system integrator & consultancy, with a clientele ranging from startups to the top ten global financial institutions, has come up with a compelling Cloud BI story in elasticBI™, using a combination of open source and proprietary software to build a full BI stack from ETL [Talend OpenStudio as available through Jaspersoft] to the data mart/warehouse [Vertica] to BI reporting, dashboards and data mining [Jaspersoft partnered with Revolution Analytics], all available through Amazon Web Services (AWS). Full360 is building upon their success as Jaspersoft's primary cloud partner, and their involvement in the Rightscale Cloud Management stack, which was a 2010 winner of the SIIA CODiE award, with essentially the same stack as elasticBI.
Full360 has an excellent price point for medium-size businesses, or for departments within larger organizations. Initial deployment, covering set-up, engineering time and the first month's subscription, comes to less than a proof of concept might cost for a single piece of their stack. The entry-level monthly subscription, extended out for one year, is far less than an annual subscription or licensing cost for similar software once hardware depreciation and the cost of personnel to maintain the system are taken into account. Especially considering that the monthly fee includes operations management and a small amount of consulting time, this is a great deal for medium-size businesses.
The stack being offered is full-featured. Jaspersoft has, arguably, the best open source reporting tool available. Talend Open Studio is a very competitive data integration tool, with options for master data management, data quality and even an enterprise service bus for complete data integration from internal and external data sources and web services. Vertica is a very robust and high-performance column-store Analytic Database Management System (ADBMS) with "big data" capabilities that was recently purchased by HP.
All of this is wonderful, but none of it is really new, nor a differentiator from the failed BI services of the past, nor the on-going competition today. Where Full360 may win however, is in how they answer the three challenges that caused the failure of those past efforts.
Security
Full360's elasticBI™ handles the security question with the answer that it uses AWS security. More importantly, Full360 recognizes the security concerns: one of the presentation sections today, "Hurdles for Cloud BI", listed cloud security, data security and application security. All three are handled by AWS standard security practices. Whether or not this is sufficient, especially in the eyes of customers, is uncertain.
Operations
Operations and maintenance is one area where Full360 is taking great advantage of the evolution of current cloud services best practices and "devops" by using Opscode Chef recipes to handle deployment, maintenance, ELT and upgrades. However, whether or not this level of automation will be sufficient to counter the lack of a multi-tenant architecture remains to be seen. There are those who argue that the true differentiator of cloud, or even of the older SaaS model, and the ability to scale profitably at these price points, depends on multi-tenancy, which keeps all customers on the same version of the stack. The heart of providing multi-tenancy is in the database, and this is the point where most SaaS vendors, other than Salesforce.com (SFDC), fail. However, Jaspersoft does claim support for a multi-tenant architecture. It may be that Full360 will be able to maintain the balance between security/privacy and scalability through its use of devops, without creating a new multi-tenant architecture. Also, the point of cloud services isn't the cloud at all: the fact that the hardware, software or platform is in a remote or distributed data center isn't what matters. The point is elastic self-provisioning, the ability of customers to add resources on their own and be charged accordingly.
Data Volume
The entry-level data volume for elasticBI™ is the size of a departmental data mart today. But even today, successfully loading that much data into the cloud in a nightly ETL run simply isn't feasible. Full360 is leveraging Aspera's technology for high-speed data transfer, and AWS does support a form of good ol' fashioned "sneaker net", allowing customers to mail in hard drives. In addition, current customers with larger data volumes are drawing that data from the cloud, with the source being in AWS already or in SFDC. This problem will continue to be an "arms race" into the future, with data volumes, source location and bandwidth in a three-way pile-up.
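The bandwidth constraint is easy to quantify with a back-of-envelope calculation; the data volume and link speed below are illustrative assumptions, not Full360 figures:

```python
# How long does a nightly bulk load take over a WAN link?
def transfer_hours(gigabytes: float, megabits_per_sec: float) -> float:
    """Hours to move `gigabytes` of data over a link of the given speed."""
    bits = gigabytes * 8e9                    # GB -> bits (decimal units)
    return bits / (megabits_per_sec * 1e6) / 3600

# Illustrative: moving 500 GB nightly over a dedicated 100 Mbit/s link
# takes roughly 11 hours -- longer than most nightly batch windows,
# before accounting for protocol overhead and contention.
print(f"{transfer_hours(500, 100):.1f} hours")
```

This is why high-speed transfer technology like Aspera's, shipping physical drives, and keeping the source data in the cloud to begin with are the three practical answers today.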
In conclusion, Full360 has developed an excellent BI service to supplement its professional services offerings. Larger organizations are still wary of allowing their data out of their control, or may be afraid of the target that web services present for hackers, as exemplified by the recent break-ins at bank and retailer email marketing providers and at Sony. Smaller companies, which might find the price attractive enough to offset security concerns, haven't seen the need for BI. So the question remains whether or not the market is interested in BI in the cloud.
Back in January I had read some reports indicating that after a strong 4Q10, the US economy was showing signs of a slowdown. During January and February in particular, businesses slowed their investment activities. These sobering reports were counterbalanced by more upbeat reports that small and larger enterprises continue their switch to SaaS solutions, with CIOs looking to invest in new SaaS applications. This trend was reflected in the sales pipelines of our SaaS portfolio companies, which continued to show strong growth during the quarter. However, ultimately our SaaS portfolio performance was ordinary, more reminiscent of the performance during 2Q10 than that of 4Q10. In other words, about 40% of our SaaS portfolio companies made or exceeded their quarterly financial targets, with over 70% coming to within 80% of their targets. By comparison, over 80% of our SaaS companies made or exceeded their financial targets during 4Q10.
The public SaaS companies are starting to present a more hopeful story. As these companies report their quarterly results, e.g., SuccessFactors, RightNow, NetSuite, Constant Contact, SPS Commerce, Vocus, we are seeing bookings and revenue growth that meets or exceeds the guidance they had provided. This growth is coming from multiple industries, and particularly from the mid-to-upper enterprise segment. We are also seeing the public SaaS companies accelerate their M&A activity (see Salesforce, SuccessFactors, VMware) as they try to expand the functionality of their core solutions. Such moves will of course likely have a negative impact on margins.
Over the past couple of weeks my partners and I attended several board meetings to review the 1Q11 performance of our SaaS companies. As I mentioned above, the great momentum our SaaS companies achieved during 4Q10 did not carry into 1Q11, resulting in an OK quarter. Our analysis of their performance is leading us to the following conclusions:
Companies of different sizes and from several different industries are considering SaaS applications to address their business needs. This is reflected in the overall size of the sales pipelines of each of our SaaS portfolio companies, both in the number of opportunities and in the size of each opportunity. In general we are seeing:
the size of the average opportunity increasing,
the age of opportunities remaining low, i.e., few opportunities are more than 3-4 months old,
the distribution of contract duration remaining steady, i.e., the mix of one-, two- and three-year contracts is the same as in the previous two quarters,
the distribution of prospect sizes, i.e., among small, mid-size and larger enterprise companies, remaining the same as in the past two quarters, and
the geographic distribution of deals remaining US-centric, though the number of foreign companies in the pipelines is increasing.
Analysis of the deals that were won and lost during the quarter indicates that while companies are accelerating their adoption of SaaS applications, in this economic environment they tend to prefer to sign deals with larger, public SaaS vendors rather than smaller, private ones. Many other companies decided to push their purchase decisions out by 1-2 quarters, presumably to see how the economy does during this time.
Regardless of the customer size, and similar to 3Q10 and 4Q10, the majority of the sales closed during the last 2-3 weeks of the quarter.
Customer churn remained at 7-10%.
During the past two weeks I have also had the opportunity to talk with other investors with significant SaaS portfolios and heard that their companies reported similar quarterly performance.
Our management teams feel rather upbeat about 2Q11. They are basing their optimism on the discussions they are having with the prospects that didn’t buy during last quarter but have indicated that they will make a decision in the very near future, as well as on the overall size of their sales pipelines, which have continued to grow consistently. Finally, as investors we hope that the problems with Amazon’s AWS service that impacted many SaaS companies last week will not have any material negative impact on the 2Q11 revenue and bookings of these companies. It was fortuitous that the outage happened relatively early in the quarter.