A couple of weeks ago my partners and I hosted a meeting of our SaaS advisory board, one of our firm’s four advisory boards. I have written before about the role of these boards and the value our firm and portfolio companies derive from our advisors’ insights, feedback and help. Our SaaS advisory board includes executives from SaaS application vendors, CIOs and CTOs of companies that are heavy users of SaaS applications, and leading SaaS consultants. One of the most interesting insights to come out of our meeting was that Chief Marketing Officers (CMOs) are starting to drive the corporate application agenda. This is consistent with the position of Laura McLellan, who in a recent Gartner webinar stated that by 2017 the CMO will spend more on IT than the CIO, as well as with the conclusions presented here.
There are several reasons for the CMO’s emerging power:
The accelerating move from offline to online for commerce, entertainment, and socialization. The web with its various personas (desktop, social, mobile, local) is enabling corporations to finally become truly customer-centric, i.e., to understand the problems customers face and provide mutually advantageous solutions.
The big data that is collected from the various online and offline interactions between a consumer and a brand can now be utilized effectively through a new generation of analytic solutions that enable corporations to better target existing customers and prospects with the right message, at the right time through the right channel, as well as to assess the effectiveness of each message based on the resulting actions.
Through the consumerization of the enterprise we are seeing a new generation of easier to “consume” applications that don’t resemble the monoliths of the past but instead take their cues from mobile applications, i.e., task-specific pieces of functionality that are easy to install, learn and use effectively.
The cloud that is making acquisition and deployment of these next-generation applications easier and faster.
The new generation of marketing personnel, including CMOs, that is more analytical, data-driven and technology-savvy.
Today we are seeing marketing departments acquiring applications to address online advertising for brand development and direct response commerce, social and mobile marketing and commerce, and analytics. Having anticipated this trend, over the past several years Trident Capital has been investing in such applications, and today our relevant portfolio includes the online advertising technology companies Turn, Exelate, Jiwire, Brightroll and Sojern; the social marketing and commerce applications Extole and 8thbridge; and the retail analytics company Pivotlink.
Though the opportunities may appear bright, based on our experiences with these portfolio companies we have learned that working with the marketing organization also presents several challenges:
There is little tolerance for long application-implementation periods, poorly functioning software, or the need for specialized personnel to operate these applications. Marketing departments want to see results quickly and, under today’s typical corporate budget environments, they don’t want to have to hire new people just so that they can use a new application.
There is only a short period during which to demonstrate meaningful ROI. Marketing departments may be willing to evaluate several different applications, but they will ultimately commit to the ones that deliver quick time to value with sustainable and growing ROI.
Data is not always well organized and structured. This is the area where application vendors most frequently see a big difference between working with marketing departments and working with IT. For marketing departments, managing data and maintaining its quality are new tasks, and they become harder as the volume, velocity and complexity of customer data increase. Marketing application vendors must be prepared to help by providing appropriate services in this area, thus ensuring that the application’s time to value remains short.
Skills for data analysis and insight generation are lacking. While marketing departments are becoming awash in data, they often don’t have the people who can effectively analyze this data. Again, the marketing application vendors need to step in and fill the void by offering their own data analytics and insight-generation services.
Shorter initial licensing contracts and smaller marketing campaigns. As they try to understand the value of the myriad of applications offered to them in order to implement their customer-centric strategies, marketing departments feel that they must first “get their feet wet.” In most cases this approach results in application licensing contracts that are initially short-term (1-3 months), or in smaller-dollar (typically $10-50K) marketing campaigns.
Need sales people who can first speak the marketer’s language rather than IT’s language. Over the past 30 years we have trained a cadre of application sales people who are expert at interacting with IT organizations, speaking IT’s language. This was necessary because front- and back-office enterprise applications, regardless of who was using them, were mostly purchased by the IT organization. If the next generation of marketing application companies is to be successful, they will need to hire sales people who can interact with marketing executives.
The sales cycles for these applications are becoming longer and more complex (see also here). Before the final decision on licensing these applications is made, IT is now becoming involved, and will continue to be. Though the CMO’s prominence is rising, don’t expect the CIO’s role in marketing technology decisions to disappear. Over the past year our relevant portfolio companies started to see CIOs participating in important procurement decisions involving solutions for the marketing department.
The application’s user experience must be tailored to the marketing department’s users. We are starting to see application developers creating user experiences that are more akin to the practices being established around consumer software, particularly post-PC consumer applications. To adopt the multitude of new applications offered to them, marketing departments must want to interact with them and must be able to do so easily, with little or, preferably, no training.
As corporations become more customer-centric and the move to online continues, data-driven marketing departments stand to reap big rewards. For this reason they are acquiring a new generation of applications to help them improve their interactions with customers and prospects regardless of channel. Marketing application vendors must understand this trend along with its positive and negative implications, as well as the evolving roles of CMOs and CIOs, in order to best capitalize on it.
A couple of years ago while analyzing the ways SaaS business applications could evolve we started thinking about the social web’s impact on business processes. At the time, Facebook’s success was accelerating while Twitter and Zynga were emerging. We know that consumer-oriented companies want to engage their customers and prospects in the places they frequent, i.e., the social web. We also started seeing early signs of consumer-oriented technology adoption by corporations. These realizations made us hypothesize that the social web will figure prominently in the next generation of business applications and that these applications will be built on top of new platforms that have the social web at their core. To date we have invested in three social application companies: Extole, a company that provides a social marketing application, 8thbridge, a company that provides a social shopping and commerce application, and Jobvite, a company that provides a social recruiting application.
As we examine the characteristics of our three investments, as well as those of other relevant companies we have considered investing in, we have concluded that the applications developed by such companies:
Are delivered over the cloud.
Automate business processes that target individuals (consumers or employees), e.g., marketing, shopping, recruiting, customer experience management, collaboration. For example, Starbucks’ 13M Facebook fans or Coke’s 11M Facebook fans do nothing for these brands unless the brands can somehow turn them into demonstrated engagement. A couple of weeks ago 8thbridge, working with Paramount Pictures, created the Facebook store for the recent Transformers movie. In the first two days after the store went up it generated 900K new fans who, most importantly, produced 77K content interactions and also led to many ticket sales. As consumers increase their participation in social media, business executives continue to create programs to engage with them. In fact, 70% of interactive marketers are currently piloting processes to drive word-of-mouth marketing through their most vocal consumers, 58% have launched systems encouraging consumers to spread company messages, and 47% have launched social tools that allow consumers to support each other, e.g., forums.
Are implemented on new platforms with social networking structures, e.g., enable access and operations on a social graph, and capabilities, e.g., capitalize on the social graph to enable viral distribution, at their core. In addition, these platforms have global reach, support large partner ecosystems, are device agnostic, support online and mobile communications, advertising and commerce. Facebook has emerged as the strongest platform with these characteristics, in the same way that Google a few years earlier emerged as a platform for applications that had search at their core. However, we expect that Google, Microsoft and Apple will soon augment their own web platforms with such features as well.
Are data-centric, in that they generate and operate on big data, and use analytics to provide a variety of insights pertinent to the business: how a fan uses his social graph to spread a message about a brand; how to improve user engagement around a brand; which customers must be nurtured because they are brand influencers or can attract talent to a particular company; which customers or brand fans must be rewarded because they can address product problems; which consumers should be offered a reward, e.g., extra frequent-flier miles, because they can drive a particular group-buying behavior, e.g., organizing a vacation with a group of their Facebook friends and purchasing plane tickets; and which customers are ready to defect because they are dissatisfied with a company’s service quality.
Require significant consulting services as corporations are in the very early stages of trying to determine how to take advantage of the social web and appropriately adjust their business processes and practices. In general, we are seeing that as much as 50% of these companies’ revenue could come from consulting services. Unfortunately, the established professional services organizations are not yet able to field the right solutions. As a result, the social application companies end up offering the services themselves.
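The influencer-related insights described above come down to fairly simple analytics over engagement data. A minimal sketch of the idea, where the field names, weights and fan data are illustrative assumptions and not any vendor's actual model:

```python
# Illustrative influencer scoring: rank fans by weighted engagement.
# The weights and field names are assumptions for this sketch, not
# any vendor's actual scoring model.
def influencer_score(fan, w_shares=3.0, w_comments=2.0, w_likes=1.0):
    """Score a fan by how much engagement they generate; shares are
    weighted highest because they spread the message further."""
    return (w_shares * fan["shares"]
            + w_comments * fan["comments"]
            + w_likes * fan["likes"])

def top_influencers(fans, n=2):
    """Return the n highest-scoring fans, best first."""
    return sorted(fans, key=influencer_score, reverse=True)[:n]

fans = [
    {"name": "alice", "shares": 10, "comments": 4, "likes": 50},
    {"name": "bob",   "shares": 1,  "comments": 2, "likes": 5},
    {"name": "carol", "shares": 7,  "comments": 9, "likes": 20},
]
# alice: 30+8+50=88, carol: 21+18+20=59, bob: 3+4+5=12
print([f["name"] for f in top_influencers(fans)])  # → ['alice', 'carol']
```

In practice the scoring would of course draw on the full social graph rather than flat counts, but the nurture/reward decisions described above reduce to rankings of this kind.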
We continue to look for additional investment opportunities in early and expansion stage companies whose applications use the social web to address important business processes in unique and valuable ways. Some application categories are becoming overcrowded with companies that have little differentiation and are overfunded by investors. We, however, are looking for the companies that have developed social applications whose value will not become obvious for another couple of years.
After being away for a few weeks (vacation and business travel), I returned to the Bay Area and attended GigaOm’s Structure conference and Amazon’s AWS Summit. I have been attending Structure since the first event but this was my first time attending the AWS Summit. I was drawn to these conferences for two reasons. First, while over the past 10 years we have invested and continue to invest heavily in SaaS application companies, we have foregone investment opportunities in companies that provide cloud infrastructure solutions, i.e., PaaS and IaaS. I have been trying to determine whether we should be considering investments in cloud infrastructure companies today, particularly capitalizing on our extensive experience from SaaS. Second, I am always interested to hear about cloud computing best practices that can be used by our portfolio companies.
Cloud adoption is growing among corporations of every size, but particularly among SMBs. In a report published by Forrester in April 2011, the total size of the public cloud market is pegged at roughly $25B (with the majority today being SaaS applications) and is projected to grow to $160B by 2020. A recent (May 2011) survey conducted by Morgan Stanley establishes that the usage of public clouds for a variety of workloads is expected to show a 23% CAGR over at least the next three years, with higher usage rates for SaaS applications than for cloud-based infrastructure. SMBs see the use of public clouds as a means of significantly reducing their hardware costs. Larger enterprises are using public clouds more for rapidly extending the functionality of internally developed applications, with mobile extensions cited most frequently, as well as for executing compute-intensive workloads like analytics. There is also a lot more talk about private and hybrid clouds. Most companies, in fact, are either using all three types of cloud deployment or state their intent to use them; only 37% of the companies surveyed indicated that they will use public cloud deployments exclusively. This means that SaaS application vendors may need to start thinking more seriously about offering versions of their software that run on private and hybrid clouds, rather than only the public cloud options that most, particularly the startups, have today. For investors this will mean that their SaaS companies may need to invest more in R&D and support as they roll out such options.
Security, interoperability, vendor lock-in, reliability, complexity and data privacy continue to be listed as the top inhibitors to cloud adoption. Ironically, however, customers don’t always feel that public clouds are less secure than their own data centers. But under “security” they also tend to include regulatory and compliance issues. Privacy is a different story, and one that is viewed differently in the US and Europe.
In a report released during the Structure conference, with some numbers corroborated by Werner Vogels, CTO of Amazon.com, during his address at the AWS Summit, GigaOm ranks Amazon as the largest public cloud provider with 2010 revenue of $500M, with Rackspace second. In his address Dr. Vogels talked about the investments Amazon has made and continues to make around AWS. Rackspace, Salesforce, Microsoft, Google, IBM, and other large companies that want to be providers of public cloud infrastructure and services are also making huge investments in order to remain price-competitive and offer the broadest possible set of services. Based on what I heard during that week, and despite the fact that investments in cloud companies are increasing (64 investments in 2009 totaling $180M, and 93 investments in 2010 totaling $713M), I think it will be hard, if not impossible, for a startup to compete effectively with the large public cloud computing providers. Public cloud infrastructure has therefore become a battle of giants. I continue to maintain that the SaaS application space offers more opportunity to startup companies and their investors, Trident included.
Cloud continues to be defined by its benefits rather than its technology. Cloud computing’s benefits, once merely asserted by vendors, are now being validated by customers that have been using such solutions. Surveyed customers see cloud computing as a way to lower IT costs, increase corporate agility, and provide the foundation for 21st-century architectures with the right levels of abstraction, from infrastructure to platform, that will result in higher-quality systems. Based on these benefits and what was discussed and reported during the conferences, a few of the best practices that may be of use to SaaS application companies include:
The application’s architecture determines the level of success and the overall cost of deploying the solution to a public cloud. Moreover, the successful use of a public cloud by an application provider requires the continuous collaboration between the cloud provider and the application vendor.
Public cloud vendors strive to provide uniform, rather than peak-performance, characteristics across the computing resources they provision. This must be taken into account by the SaaS application vendors as they architect their solutions.
While a public cloud provider can offer a SaaS application vendor a low-cost way to develop and launch a new application, scaling costs remain high, often leading the application vendor to build out its own cloud. Of course, companies the size of Zynga and Netflix are able to negotiate special pricing with their public cloud providers, e.g., AWS. However, for private companies that have not yet reached that scale, the options are more limited.
I expect that this year the number of venture investments and the amount invested in cloud computing companies will surpass last year’s numbers. I continue to see investment opportunities in the following areas:
Big Data Analytics. Analysis of Big Data is starting to become synonymous with cloud computing. I expect that this trend will continue as public cloud providers offer Hadoop and MapReduce options allowing analysts to send their large workloads to the cloud. Data movement and the “accessibility” of such services to business analysts (you shouldn’t need to be a data scientist just to use such a service) will continue to be issues that need to be addressed.
Enterprise mobile applications. Because of the smartphone proliferation, enterprises are now starting to aggressively adopt mobile applications. As a first step enterprises are looking to create mobile versions of their mission critical applications, as well as develop mobile applications in completely new, innovative areas. Public and hybrid clouds are uniquely suited for the rapid development and deployment of these applications. Therefore, both specific enterprise mobile applications and environments for developing and deploying such applications are of interest for potential investments.
Data and application integration. I have often written about how the proliferation of public and hybrid clouds will increase the need for integrating data to such applications as well as integrating applications. For example, a cloud-based application may need to access data that is behind the firewall, or two cloud-based applications may need to be integrated to effectively automate a business process.
Data marketplaces and API management. As companies expose more of their data for cloud-based applications, the data itself and the APIs through which this access is accomplished will need to be managed. While I think that we are still in very early stages of data marketplaces and API management, and it is not clear whether either of these areas will emerge into substantial markets and lead to the creation of large companies, there may exist a few interesting investment opportunities to consider.
Accelerating data movement. Moving large data sets from data centers to the cloud remains a vexing problem, particularly for big data analytics applications. While cloud-based processing and storage performance are improving, data movement has not made corresponding strides, so approaches that address this problem are ripe for investment.
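To make the Big Data Analytics point above concrete: the Hadoop and MapReduce options that public cloud providers offer all implement the same map, shuffle, and reduce pattern. Here is a toy single-process illustration of that pattern in plain Python, counting words; it is a sketch of the concept, not an actual Hadoop or cloud-provider API:

```python
from collections import defaultdict

# Toy single-process illustration of the map -> shuffle -> reduce
# pattern behind Hadoop; not an actual Hadoop or cloud-provider API.
def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group all emitted values by their key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values into a final count."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data in the cloud", "big data analytics"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"], counts["data"])  # → 2 2
```

In a real deployment each phase runs in parallel across many machines, which is exactly what makes the "accessibility" issue real: the conceptual model is simple, but provisioning and tuning the distributed version is what business analysts should not have to do themselves.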
I continue to be excited about the prospects offered by cloud computing and delighted by the increasing usage of all its layers (application to infrastructure) by small and large companies. I don’t feel that we have missed big opportunities by not investing below the cloud’s application layer, but feel that there are at least five areas that represent interesting areas for future investments.
Last October I wrote about emerging data analytic services that could be offered under the term Insight as a Service. More recently Mark Suster wrote about the need for data as a service, and Gordon Ritter in his post added his thoughts on Insight as a Service. In my post I had proposed three types of data that can be used for the creation of such insights: internal company data, web usage data, and third-party syndicated data. That same month I invested in Exelate, a company that provides a marketplace for online data and a Data Management Platform (DMP), both of which can be used to improve the targeting effectiveness of display advertising. Online advertising and financial analytics are two areas where data marketplaces are succeeding. Exelate and other marketplaces demonstrate that significant companies can be built in this area. Companies like Thomson Reuters and Bloomberg, and startups like cloud-based Xignite, have also been successful in selling financial data to application vendors. More recently we are seeing the emergence of startups like Gnip, Factual, Infochimps and WebServius, as well as Microsoft's Azure DataMarket, all of which have introduced cloud-based marketplaces offering a broad variety of commercial and open source data sets. The success of data-driven application companies such as Zillow, Recorded Future, Payscale and a few others provides proof that innovative solutions can be developed from third-party data. But the cost of the base data on top of which these solutions are developed is minuscule compared to the cost of processing and augmenting the data to provide unique value to the user of these solutions. So, outside the online marketing and financial services areas, under what conditions can venture investors make money investing in companies that offer data marketplaces?
Large quantities of data are made available daily. Even though Hadoop and other emerging open source Big Data management technologies are significantly reducing the cost of storing, managing and processing such data, I claim that creating a data marketplace from scratch is still an expensive proposition. For data offered through a marketplace to be valuable it must be unique and complete. For the marketplace to be successful, in addition to having data with such characteristics it must solve the data distribution and monetization problem. This is actually a chicken-and-egg problem: the marketplace must contain a large enough variety of unique and complete data to attract buyers, but it needs data buyers to attract such data sources. Companies like Factual and Infochimps solve this problem by investing heavily in marketing, offering free onboarding of data regardless of the data’s revenue potential, and focusing primarily on open source data that typically comes from governments.
Uniqueness is often created by augmenting a data set in a variety of ways. For example, Zillow processes Google Earth data. Exelate’s analytics group does the same to data contributed by online publishers, increasing its value and information content for the DSPs that use it by identifying trends, attributes that are predictive of specific desired outcomes, etc. It is not clear whether general-purpose data marketplaces will be able to provide such augmentation, because doing so would imply that they become experts in several different application areas. This kind of processing costs money, and it is proprietary, along with the resulting data. As a result, the processed data may never find its way to a data marketplace.
Completeness of a data set is also very important. For example, if I want to compare the room prices of major hotel chains across the US in order to determine whether one chain is consistently more expensive than the others, I will need to obtain prices from every US city where the hotel chains being compared have properties. If I want to be comprehensive, I can’t be satisfied having prices for only 60% of the cities and 50% of the properties of each chain. I need a complete set.
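The completeness requirement in the hotel-price example amounts to a simple coverage check before the data set is trusted for a chain-wide comparison. A minimal sketch, where the city names and figures are illustrative:

```python
def coverage(required_cities, priced_cities):
    """Fraction of required cities for which prices are on hand."""
    required = set(required_cities)
    return len(required & set(priced_cities)) / len(required)

# Hypothetical chain-wide comparison: five cities are required, but
# prices have been obtained for only three of them so far.
required = ["new york", "chicago", "dallas", "seattle", "miami"]
have_prices = ["new york", "chicago", "dallas"]

print(f"{coverage(required, have_prices):.0%}")  # → 60%
```

A buyer evaluating a marketplace data set would run exactly this kind of check, per city and per property, and reject the set until coverage reaches 100% of the scope the comparison demands.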
In addition to distribution, monetization, uniqueness and completeness, data marketplaces must also address privacy, data ownership and data location. The marketplace must have clearly articulated policies about who owns the data and what operations can be performed on the licensed data. For example, I give my data to Facebook and feel relatively comfortable with their data privacy policies, or at least the settings I can control. We are now seeing applications being built on top of Facebook with completely undefined privacy policies. (What data are these applications capturing in addition to what is provided by Facebook? How are they using it? Are they selling the data they capture?) Contributors to data marketplaces must also understand where their data is stored and what can be done with it. For example, is their data stored in a country where it can be easily subpoenaed or distributed with no control?
If these are indeed the necessary conditions for creating a financially-viable data marketplace, then rather than focusing on open source data, marketplaces must focus on licensing and distributing premium data that may be coming from companies like Zillow, Experian, Nielsen, etc. However, focusing on such premium data providers could imply a vertical, industry-specific approach to building marketplaces rather than a horizontal, general-purpose approach.
We are generating ever-increasing quantities of data that can be used in a new generation of innovative solutions. In the presence of all this data, data marketplaces offer data owners an attractive model for distributing and monetizing such data. However, early indications from companies that have built valuable data-driven solutions, as well as from application-specific data marketplaces, show that the type of processing and augmentation that needs to be performed on data before it can be effectively used by such solutions necessitates a vertical marketplace approach rather than a horizontal one.
Today, Friday the 13th of May, 2011, the Boulder BI Brain Trust heard from Larry Hill [find @lkhill1 on Twitter] and Rohit Amarnath [find @ramarnat on Twitter] of Full360 [find @full360 on Twitter] about the company's elasticBI™ offering.
Serving up business intelligence in the Cloud has gone through the general hype cycles of all other software applications, from early application service providers (ASP), through the software as a service (SaaS) pitches to the current Cloud hype, including infrastructure and platform as a service (IaaS and PaaS). All the early efforts have failed. To my mind, there have been three reasons for these failures.
Security concerns on the part of customers
Logistics difficulties in bringing large amounts of data into the cloud
Operational problems in scaling single-tenant instances of the BI stack to a large number of customers
Full360, a 15-year-old system integrator & consultancy, with a clientele ranging from startups to the top ten global financial institutions, has come up with a compelling Cloud BI story in elasticBI™, using a combination of open source and proprietary software to build a full BI stack from ETL [Talend OpenStudio as available through Jaspersoft] to the data mart/warehouse [Vertica] to BI reporting, dashboards and data mining [Jaspersoft partnered with Revolution Analytics], all available through Amazon Web Services (AWS). Full360 is building upon their success as Jaspersoft's primary cloud partner, and their involvement in the Rightscale Cloud Management stack, which was a 2010 winner of the SIIA CODiE award, with essentially the same stack as elasticBI.
Full360 has an excellent price point for medium-size businesses, or departments within larger organizations. Initial deployment, covering set-up, engineering time and the first month's subscription, comes to less than a proof of concept might cost for a single piece of their stack. The entry-level monthly subscription, extended out for one year, is far less than annual subscription or licensing costs for similar software once you account for depreciation on the hardware and the cost of personnel to maintain the system. Especially considering that the monthly fee includes operations management and a small amount of consulting time, this is a great deal for medium-size businesses.
The stack being offered is full-featured. Jaspersoft has, arguably, the best open source reporting tool available. Talend Open Studio is a very competitive data integration tool, with options for master data management, data quality and even an enterprise service bus for complete data integration from internal and external data sources and web services. Vertica is a very robust and high-performance column-store Analytic Database Management System (ADBMS) with "big data" capabilities that was recently purchased by HP.
All of this is wonderful, but none of it is really new, nor a differentiator from the failed BI services of the past, nor the on-going competition today. Where Full360 may win however, is in how they answer the three challenges that caused the failure of those past efforts.
Full360's elasticBI™ handles the security question with the answer that they're using AWS security. More importantly, they recognized the security concerns: one of their presentation sections today, "Hurdles for Cloud BI," listed cloud security, data security and application security, all three of which are handled by AWS standard security practices. Whether or not this is sufficient, especially in the eyes of customers, is uncertain.
Operations and maintenance is one area where Full360 is taking great advantage of the evolution of current Cloud services best practices and "devops" by using Opscode Chef recipes for handling deployment, maintenance, ELT and upgrades. However, whether this level of automation will be sufficient to counter the lack of a multi-tenant architecture remains to be seen. There are those who argue that the true differentiators of Cloud, or even of the older SaaS model, and the ability to scale profitably at these price points, depend on multi-tenancy, which keeps all customers on the same version of the stack. The heart of providing multi-tenancy is in the database, and this is the point where most SaaS vendors, other than salesforce-dot-com (SFDC), fail. However, Jaspersoft does claim support for a multi-tenant architecture. It may be that Full360 will be able to maintain the balance between security/privacy and scalability through their use of devops, without creating a new multi-tenant architecture. Also, the point of Cloud services isn't the cloud at all. That is, the fact that the hardware, software, platform, what-have-you is in a remote or distributed data center isn't the point. The point is elastic self-provisioning: the ability of customers to add resources on their own and be charged accordingly.
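The database-level multi-tenancy discussed above is most commonly implemented as a shared-schema pattern: every row carries a tenant identifier and every query is scoped to it, so customers share one stack version without seeing each other's data. A generic sketch using SQLite, purely illustrative and not Full360's or Jaspersoft's actual implementation:

```python
import sqlite3

# Shared-schema multi-tenancy sketch: one table, every row tagged
# with a tenant_id, every query scoped by it. Generic illustration,
# not Full360's or Jaspersoft's actual implementation.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE reports (
    tenant_id TEXT NOT NULL,
    name      TEXT NOT NULL
)""")
db.executemany("INSERT INTO reports VALUES (?, ?)", [
    ("acme",   "q1_revenue"),
    ("acme",   "churn"),
    ("globex", "q1_revenue"),
])

def reports_for(tenant):
    """Scope every query by tenant_id so one customer never sees
    another customer's rows, while all share the same schema."""
    rows = db.execute(
        "SELECT name FROM reports WHERE tenant_id = ? ORDER BY rowid",
        (tenant,))
    return [name for (name,) in rows]

print(reports_for("acme"))   # → ['q1_revenue', 'churn']
print(reports_for("globex")) # → ['q1_revenue']
```

The hard part, and where most vendors stumble, is enforcing that scoping uniformly across reporting, ETL and upgrades; the devops-automation alternative sidesteps this by giving each customer an isolated single-tenant instance instead.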
The entry-level data volume for elasticBI™ is the size of a departmental data mart today. But even today, successfully loading that much data into the Cloud in a nightly ETL run simply isn't feasible. Full360 is leveraging Aspera's technology for high-speed data transfer, and AWS does support a form of good ol' fashioned "sneaker net," allowing customers to mail in hard drives. In addition, current customers with larger data volumes are drawing that data from the cloud, with the source being in AWS already, or from SFDC. This problem will continue to be an "arms race" into the future, with data volumes, source location and bandwidth in a three-way pile-up.
In conclusion, Full360 has developed an excellent BI service to supplement their professional services offerings. Larger organizations are still wary of allowing their data out of their control, or may be afraid of the target that web services present for hackers, as exemplified by the recent break-ins at Sony and at the email marketing (er, spamming) providers used by banks and retailers. Smaller companies, which might find the price attractive enough to offset security concerns, haven't seen the need for BI. So the question remains whether the market is interested in BI in the Cloud.
Back in January I read some reports indicating that after a strong 4Q10, the US economy was showing signs of a slowdown. During January and February in particular, businesses slowed their investment activities. These sobering reports were counterbalanced by more upbeat reports that small and large enterprises are continuing their switch to SaaS solutions, with CIOs looking to invest in new SaaS applications. This trend was reflected in the sales pipelines of our SaaS portfolio companies, which continued to show strong growth during the quarter. Ultimately, however, our SaaS portfolio's performance was ordinary, more reminiscent of 2Q10 than of 4Q10. In other words, about 40% of our SaaS portfolio companies made or exceeded their quarterly financial targets, with over 70% coming to within 80% of their targets. By comparison, over 80% of our SaaS companies made or exceeded their financial targets during 4Q10.
The public SaaS companies are starting to present a more hopeful story. As these companies report their quarterly results, e.g., SuccessFactors, RightNow, NetSuite, Constant Contact, SPS Commerce, Vocus, we are seeing bookings and revenue growth that meets or exceeds the guidance they had provided. This growth is coming from multiple industries and particularly from the mid-to-upper enterprise segment. We are also seeing the public SaaS companies accelerating their M&A activity (see Salesforce, SuccessFactors, VMWare) as they try to expand the functionality of their core solutions. Such moves will of course likely have a negative impact on margins.
Over the past couple of weeks my partners and I attended several board meetings to review the 1Q11 performance of our SaaS companies. As I mentioned above, the great momentum our SaaS companies achieved during 4Q10 did not carry into 1Q11, resulting in an OK quarter. Our analysis of their performance leads us to the following conclusions:
Companies of different sizes and from several different industries are considering SaaS applications to address their business needs. This is reflected in the overall size of the sales pipelines of each of our SaaS portfolio companies by number of opportunities and the size of each opportunity. In general we are seeing
the average opportunity size increasing,
the age of opportunities remaining low, i.e., few opportunities are more than 3-4 months old,
the distribution of contract durations remaining steady, i.e., the mix of one-, two- and three-year contracts has remained the same as in the previous two quarters,
the distribution of prospect sizes, i.e., among small, mid-size and larger enterprise companies, remaining the same as in the past two quarters, and
the number of foreign companies in the pipelines increasing, even as the geographic distribution of deals remains US-centric.
Analysis of the deals won and lost during the quarter indicates that while companies are accelerating their adoption of SaaS applications, in this economic environment they tend to prefer to sign deals with larger, public SaaS vendors rather than smaller, private ones. Many other companies decided to push their purchase decisions out by 1-2 quarters, presumably to see how the economy does during this time.
Regardless of the customer size, and similar to 3Q10 and 4Q10, the majority of the sales closed during the last 2-3 weeks of the quarter.
Customer churn remained at 7-10%.
During the past two weeks I have also had the opportunity to talk with other investors with significant SaaS portfolios and heard that their companies reported similar quarterly performance.
Our management teams feel rather upbeat about 2Q11. They base their optimism on the discussions they are having with the prospects that didn't buy last quarter but have indicated that they will make a decision in the very near future, as well as on the overall size of their sales pipelines, which have continued to grow consistently. Finally, as investors we hope that the problems with Amazon's AWS service that impacted many SaaS companies last week will not have any material negative impact on the 2Q11 revenue and bookings of these companies. It was fortuitous that the outage happened relatively early in the quarter.
Happy New Year to all! Like every year I am writing about the technology areas I will be following and focusing on during 2011. These areas build upon those my partners and I followed during 2010. During the holidays I wrote about online advertising, mobile and social web as areas Trident will continue to target.
Tablets and smartphones. In a couple of days I’ll be heading to CES where I expect that several vendors will be introducing new tablets and smartphones targeting different customer segments. My interest around these devices centers on the platforms they support, e.g., HTML5, novel features they will incorporate, e.g., NFC, the new types of applications these features will enable, e.g., mobile wallets, and the types of data they will be generating. Tablets and smartphones are sensor platforms.
Cloud computing, SaaS, and virtualization. Cloud computing was one of the biggest technology trends for 2010 and corporations continued to virtualize their data centers (see my comments from the Goldman conference). Cloud management and management of virtualized environments are two important areas we are targeting. Cloud management in particular is becoming a hot space. We just lost a deal in this area after significant competition with two other venture firms. I am also following closely the evolution of PaaS platforms and the SaaS applications they will be enabling, particularly now that enterprises have started aggressively adopting SaaS applications and developing their own cloud-based applications. We will continue looking for context-aware, social (see below) and vertical SaaS applications. In 2010 we invested in Acclaris (healthcare IT) but passed on several others.
App stores and application models. I am watching how the app store is developing as a general-purpose application distribution mechanism. App stores are moving beyond smartphones (see what Apple is doing with the Mac App Store) into other consumer electronics devices, e.g., TVs and cars (another area I’ll be watching at CES), and finally the enterprise. An area of interest is application discovery within app stores. As the number of applications offered by an app store increases, identifying those with the functionality appropriate for a particular task or specific business process will become very important. Finally, between the proliferation of app stores and the more extensive use of PaaS for application development, we see a new model emerging for enterprise application delivery and licensing. Enterprise application functionality will be developed in much smaller chunks and priced accordingly, very much as is happening today on smartphones.
Social computing for the enterprise. We are focusing on three areas within social computing for the enterprise: customer service where I think there is opportunity for significant innovation in business settings, marketing, where word of mouth and friend referral programs are proving very effective for B2C and B2B businesses, and Facebook ecommerce, because so many companies are now setting up their stores within Facebook. We are rethinking the workflows and business processes as we try to better understand how social computing can be used effectively in the enterprise.
Big data and analytics. We will be moving from just collecting and managing/organizing big data (web site data, social data, mobile data, data from the Internet of Things) to thinking about how to effectively analyze it. In-memory analytics, Hadoop, and Google’s Percolator are technologies we follow. Privacy and security will be important data-related issues that came to the fore during 2010 and will remain so during 2011. While I don’t expect to see technology-driven solutions to these issues, I anticipate that during 2011 we will need to engage in healthy dialogs about what data privacy in today’s environment really means.
Long gone are the days when Dreamforce was a smallish conference devoted to SaaS; the first conference 10 years ago had fewer than 1000 attendees. This year's conference had over 30k attendees (business users, IT users and vendors) almost 70% higher than last year's. The lines in and around the Moscone, the hotel rates and the jammed restaurants, bars and parking lots around the conference venue provided adequate proof of the high attendance. This was an event of high importance to Salesforce and even to SaaS in general. My impressions:
Based on the attendee affiliations (small and large companies, business and IT users, foreign and domestic delegates), the event provided additional proof that SaaS and cloud computing have penetrated the enterprise for good, as several of us have been predicting. Sarah Friar of Goldman Sachs calls it the "unstoppable SaaS wave." Heroku is a very significant acquisition for Salesforce. In addition to the development environment it provides, the one million Ruby application developers that make up Heroku's community, including developers of mobile applications, can be channeled toward the platform Salesforce is putting in place.
The introduction of database.com, along with Heroku's Ruby-based development environment, now positions Salesforce among the premier PaaS providers, along with Microsoft, VMWare, and maybe even Red Hat through its acquisition of Makara. This is a significant development since Salesforce's force.com platform and APEX language alone were not adequate to provide a general-purpose, world-class PaaS (in a previous post I wrote some initial thoughts on force.com). In addition, because of its applications heritage, Salesforce has a wealth of application know-how that it can bring to its PaaS, whereas companies like Microsoft and VMWare must rely on their third-party application developers to acquire the corresponding know-how. Salesforce needs to work quickly to integrate all its pieces (Chatter, Jigsaw, force.com, database.com, Heroku tools, etc.), in the process defining and exposing the right APIs. In this way developers will be able to create applications across a variety of tasks and complexity levels, not just CRM-related applications as was the case with force.com. It was already announced that objects and services (application and platform) will be exposed through SOAP and REST APIs. Developers will not be restricted to Ruby but will be able to use languages such as Java, C# and PHP. They will also be able to create their own data models. Moreover, by opening up its PaaS, Salesforce will allow developers to use applications developed on other, similar platforms like Azure.
The announcements of additional "clouds," such as the one for web site development, prove that Salesforce continues to have a strong vision for where SaaS and cloud computing can go.
As we've seen in previously published surveys, security is no longer the top concern for SaaS adoption. Data and application integration have claimed that spot, indicating that we are moving to a phase of trying to make on-premise systems work well with cloud-based ones. The presence of several major Indian and Chinese IT outsourcing companies, all of which had big booths at the show, indicates that they now see a significant opportunity around systems integration involving SaaS applications.
As investors we are particularly excited about the PaaS announcements. The emergence of another strong PaaS, and the competition it is bound to generate among Salesforce, VMWare, Microsoft, Red Hat, and potentially Google, will be beneficial on two fronts. First, the competition will result in further PaaS innovations. This is obviously good for SaaS application developers, who will consider more seriously a PaaS as a viable alternative on top of which to develop a new SaaS application. The improved capabilities of PaaS platforms will also accelerate application development, resulting in the creation of new, and most likely innovative, packaged SaaS applications; the type we as investors like funding. Second, the competition among PaaS providers will be good not only for the continuing penetration of SaaS applications, but also for lowering the operating costs of deploying and supporting a SaaS application, thus improving application vendors' margins. While the PaaS pricing Salesforce announced is on the high side, particularly for smaller ISVs, I expect that competition will lead to lower prices. My only concern from Salesforce's PaaS-related announcements is whether the company, at heart still an applications company, can develop the right DNA and evolve into an infrastructure company to ultimately implement the world-class PaaS it announced.
Over the past couple of years I have met with several startups that offer analytic solutions for mobile data. I have not invested in any of them. I felt that the data captured from feature phones and early generations of smartphones was not rich enough to lead to interesting and distinct analytics. For example, while data captured from a mobile web browser, such as sites visited, pageviews, and time spent browsing, could be analyzed, we didn’t need a new company to do that; Omniture could do it just fine. However, the new smartphones capture more interesting data. These data sets could drive the creation of new and interesting analytics. As a result, I am becoming interested in mobile data analytics and have been actively looking for investment opportunities in this sector.
The new smartphones are becoming sensor platforms, as well as computing platforms. In addition to the photo/video camera, touch screen, GPS and accelerometer, new types of sensors are being connected to smartphones. For example, Bling Nation has introduced a sensor that adheres to a smartphone and is linked to the user’s PayPal account. Our own portfolio company Zeo has announced that it will connect its sensor to the iPhone in order to capture sleep-related data. Some of the data sets generated by all these sensors that I find interesting include:
The time-series of GPS and accelerometer data for each subscriber. By analyzing these time-series one can predict where and when the subscriber will be next and offer relevant services at the predicted location, e.g., parking availability with offers from parking garages.
Data generated from the use of augmented reality (AR) applications can create new advertising opportunities, as well as opportunities to serve up relevant content the user had not thought to ask for.
Configuration data on the complete software stack running on each phone (from firmware to operating system to application software). This data can then be used, for example, by an app store to recommend newly available applications that will augment the user’s productivity. Such configuration databases today exist only in corporate IT settings.
Mobile payments data combined with geolocation data. Analyzing this data can lead to predictions about customer brand or product loyalty.
Entertainment-related applications, e.g., gaming, and health care applications, e.g., prescription dispensing, will also benefit from the analysis of this type of data. I am not certain whether new data management systems will be necessary for such data sets, though I imagine that the data will be big and complex, particularly as various time-series are captured, and will be stored in the cloud.
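The location-prediction idea above, anticipating where a subscriber will be next from their GPS/accelerometer time-series, can be sketched very simply: bucket the GPS trace into coarse grid cells and learn first-order transition counts. This is a toy under assumed grid sizes and data; a real system would add time-of-day features, smoothing, and far more history.

```python
from collections import Counter, defaultdict

def to_cell(lat, lon, size=0.01):
    # Coarsen a GPS fix into a grid cell (roughly 1 km at this size).
    return (round(lat / size), round(lon / size))

def train(trace):
    # Count cell-to-cell transitions observed in the trace.
    transitions = defaultdict(Counter)
    cells = [to_cell(lat, lon) for lat, lon in trace]
    for a, b in zip(cells, cells[1:]):
        transitions[a][b] += 1
    return transitions

def predict_next(transitions, lat, lon):
    # Most frequently observed next cell from the current one, if any.
    options = transitions.get(to_cell(lat, lon))
    return options.most_common(1)[0][0] if options else None

# A commute that repeatedly goes home -> cafe -> office (made-up coordinates):
trace = [(37.77, -122.42), (37.78, -122.41), (37.79, -122.40),
         (37.77, -122.42), (37.78, -122.41), (37.79, -122.40)]
model = train(trace)
print(predict_next(model, 37.78, -122.41))  # predicts the "office" cell
```

Once the next cell is predicted, services tied to that location, such as the parking-availability offers mentioned above, can be pre-fetched and presented before the subscriber arrives.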
The wireless carriers may not be in the best position to collect this data, not only because of their lack of experience with diverse data types, but also because they are regulated businesses. Google and Apple are in a much better position because they already collect much of this data through their Android and iOS platforms respectively. While these companies may also be best able to mine the data, they won’t enter this business in the near term. Instead it will be startups that first experiment with creating interesting data sets out of the collected data and analyzing them. My assumption, which also drives my interest in the sector, is that companies like Google will wait to see how these “experiments” go and then proceed to acquire the more interesting of the analytics startups.
Users will need to give their permission for this rich data to be collected and combined. Vendors, including wireless carriers, will get the users’ permission by offering free services (something for which consumers have shown interest and affinity), better experience (optimized bandwidth, improved application performance, more accurate recommendations around applications, products, services, social connections, etc), and more accurate targeting of ads in ad-supported services.
The mobile space remains highly fragmented and the talent to create and analyze these data sets may be hard to find. The new smartphone platforms present opportunities for collecting valuable data sets that will lead to the development of unique analytics which will in turn drive important and novel decisions. Startups can lead the way to create these analytics and the enterprise platforms that manage them.
The survey data presented in last August’s Pacific Crest SaaS workshop pointed to the need for a variety of data analytic services. These services, which can be offered under the banner of Insight-as-a-Service, can range from business benchmarking (e.g., comparing one business to peers that are also customers of the same SaaS vendor), to business process improvement recommendations based on a SaaS application’s usage (e.g., reducing the amount spent on search keywords by using the SEM application’s keyword-optimization module), to improving business practices by integrating syndicated data with a client’s own data (e.g., reducing the response time to customer service requests by crowdsourcing responses). Today I wanted to explore Insight-as-a-Service, as I think it can be the next layer in the cloud stack and can prove the real differentiator between existing and next-generation SaaS applications (see also here, and Salesforce’s acquisition of Jigsaw).
There are three broad types of data that can be used for the creation of insights:
Company data. This is the data a company stores in a SaaS application’s database. As SaaS applications add social computing components, e.g., Salesforce’s Chatter, or Yammer’s application, company data will become an even richer set.
Usage data. This is the Web data captured in the process of using a SaaS application, e.g., the modules accessed, the fields used, the reports created, even the amount of time spent on each report.
Syndicated data. This is third-party data, e.g., Bloomberg, LinkedIn, or open source, which can be integrated (mashed) with company data and/or usage data to create information-rich data sets.
Some of the issues that will need to be addressed for such services to be possible include:
Permission to use the data. For this to be possible, corporations must give permission for their company data to be used by the SaaS vendor for benchmarking. For example, if Salesforce customers are willing to make their data available then their sales forces’ effectiveness can be benchmarked against that of peer companies. It may be more likely for companies to give their permission if the data is abstracted or even aggregated in some way.
Data ownership. The ownership of usage data has not been addressed thus far. Before creating and offering insights, ownership will have to be addressed by the SaaS vendors and their customers. Once ownership is established, as I have written before, this data can, at the very least, be used by the SaaS vendor to provide better customer service, or even to identify upsell opportunities and customer churn situations. While some vendors, e.g., NetSuite, are starting to utilize parts of this usage data, utilization remains low.
Data privacy. Company and usage data will most definitely include details that may need to be protected and excluded from any analysis. The SaaS vendors will have to understand the data privacy issues and provide corporate clients with the necessary guarantees. Thus far SaaS vendors have only had to make data security guarantees. Privacy concerns around this data will be similar to those that currently surround the internet data that is being used to improve online advertising.
Potential need for pure-play Insight-as-a-Service vendors. The SaaS application companies may not prove capable of providing such insight services, and it may be necessary to create specialized vendors to offer them. Such pure-play vendors may have more appropriate and specialized know-how, which will be reflected in their software (essentially analytic applications that can organize, manipulate and present insights). In addition, they will be able to offer a broader range of benchmarking since they will be able to evaluate data across SaaS vendors. However, having such vendors will also necessitate moving company and usage data to yet another location/cloud, thus increasing security and privacy risks.
Eligibility for accessing these insights and the business models under which they can be offered. One approach would be for the SaaS application vendor to offer such insights as a separate product to its customers. Another approach, particularly if the insights are to be created by a pure-play insights vendor, would be for such vendors to create data coops. Under this scheme corporations contribute company and usage data to the coop, the Insight-as-a-Service vendor analyzes all contributed data, and offers the results only to the companies that belong to the coop. For this service the vendor can charge an annual subscription fee, not unlike what industry analysts like Forrester and Gartner charge. Internet data companies such as Datalogix, which has created a coop of retail purchase data, can serve as good models to consider. Another business model may be for the vendor, either the SaaS application vendor or the Insight-as-a-Service vendor, to share revenue with the companies providing the company and usage data. Internet data exchanges like BlueKai and eXelate provide good business-model examples to imitate.
Geography. As we’ve learned with consumer internet data, each country approaches data differently. For example, European countries are more restrictive with the use of collected data. SaaS companies must try to learn from the relevant experiences of internet data companies as they determine how to best offer such insight services.
Data normalization. Usage data will need to be normalized and then aggregated since each customer, and maybe even each individual user, uses a SaaS application differently. This could be tricky.
Hosted applications need not apply. Not all vendors will be able to offer such services. For these services to be successful, data from the entire customer base needs to be aggregated and organized. This implies that vendors claiming to offer SaaS solutions when they are only offering single-tenant hosted solutions deployed in, what amounts to, private clouds will not be able to provide such insight services. In fact, multi-tenant architectures will be even more important for insight-generation because they make data aggregation easier.
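The benchmarking mechanics behind several of these issues (permission, privacy, multi-tenant aggregation) can be sketched in a few lines: each customer sees its own metric plus anonymized peer statistics, and results are suppressed when the peer set is too small to stay anonymous. The metric, tenant names, and minimum-peer-count rule below are illustrative assumptions, not any vendor's actual method.

```python
from statistics import median

# Toy multi-tenant usage metric: monthly reports created per seat.
usage = {
    "acme": 14.0, "globex": 9.5, "initech": 11.0, "umbrella": 7.5,
}

def benchmark(tenant, metrics, min_peers=3):
    # Aggregate everyone except the requesting tenant.
    peers = [v for t, v in metrics.items() if t != tenant]
    if len(peers) < min_peers:
        # Refuse to report: a tiny peer set could de-anonymize a competitor.
        return None
    return {
        "you": metrics[tenant],
        "peer_median": median(peers),
        "peer_count": len(peers),
    }

print(benchmark("acme", usage))  # acme vs. an anonymized peer median
```

Note that this only works because all tenants' data sits in one aggregatable store, which is exactly the multi-tenancy argument made above: single-tenant hosted deployments have no such pool to benchmark against.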
Insight-as-a-Service can become the next layer of the cloud stack (following Infrastructure-as-a-Service, Platform-as-a-Service and Software-as-a-Service). In addition to the SaaS application vendors that can start offering such services, there exists an opportunity to create a new class of pure-play Insight-as-a-Service vendors. Regardless, vendors will need to start addressing these issues, and many more that I can’t anticipate at present. But since surveyed customers are already asking for such services, it is time to start creating them. The time for Insight-as-a-Service has arrived.