Steve: I'm speaking with Eric Head. He's the director of business development for Foresee Results, which is a company that uses the American Customer Satisfaction Index (ACSI) to integrate customer information for online retailers. Is that correct?
Eric Head: Yes. We essentially use customer satisfaction data from web sites, run it through the ACSI methodology, and then we produce an analysis for our clients that really shows where opportunities for improvement are.
So using ACSI methodology, we take consumer survey data and look at three key areas: the business model, the elements of satisfaction, and the drivers of satisfaction. There are multiple questions on the survey that cover each of these elements. There's overall satisfaction, which is made up of three questions, and then there are future behaviors such as likelihood to return to the site, likelihood to purchase online (if it's a multi-channel retailer), likelihood to purchase offline, recommendation likelihood, et cetera.
In a financial services setting, we could ask likelihood to pay bills or likelihood to manage their account online; if it's a self-service/customer-service type model, we ask likelihood to use the web site as opposed to the phone, the web being a less costly channel to do business. We tie satisfaction to our clients' business drivers through those future behaviors.
This survey data goes through a very sophisticated and complex methodology to generate two key numbers. One number is the score. On the ACSI's 0-to-100 scale, a website can ask itself: How are we doing on each of these elements? What are the drivers of customer satisfaction? Both are good to know.
But the real key to the methodology is the impact score. The impact score tells us whether certain elements are drivers of satisfaction or not. Essentially we have multiple questions -- three overall -- and through this complex regression analysis the engine automatically derives that impact. The higher the number, the stronger the correlation between that individual driver and overall customer satisfaction.
It's really much more than simple statistics. It's cause-and-effect analysis. So now, between the score (a low score signals an opportunity for improvement) and the impact (will improving it actually drive satisfaction?), our client, the business owner, has some great data to work with. Where do they place their resources? If they have scarce resources in terms of money, people, and development, where do they focus their efforts? In the example I'm showing you here, product information has a fairly low score, 79, yet it has the highest impact score at 1.4. So if we could improve on this, we're going to get the greatest return in moving overall satisfaction.
Now if we can move overall satisfaction, we can also derive its impact on the future behaviors, such as increasing the likelihood to return. So we can raise the scores on one side by improving the other.
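The idea of deriving driver impacts can be sketched in miniature. The real ACSI engine is a patented structural-equation model, so this ordinary-least-squares sketch is only a stand-in; the driver names, weights, and survey data below are all invented for illustration.

```python
import numpy as np

# Hypothetical survey data: each row is one respondent's 1-100 ratings
# of three satisfaction drivers (say, product information, navigation,
# and price competitiveness -- names are illustrative only).
rng = np.random.default_rng(0)
drivers = rng.uniform(50, 100, size=(300, 3))   # minimum sample of 300
true_weights = np.array([1.4, 0.8, 0.3])        # hidden "true" impacts
overall = drivers @ true_weights / true_weights.sum() + rng.normal(0, 2, 300)

# Estimate each driver's impact on overall satisfaction with plain OLS
# (the ACSI itself uses a far more sophisticated structural model).
X = np.column_stack([np.ones(300), drivers])
coef, *_ = np.linalg.lstsq(X, overall, rcond=None)
impacts = coef[1:]

# The driver with the largest estimated impact is where scarce
# resources should go first -- "product information" in this toy case.
print(impacts.argmax())  # -> 0
```

The payoff of this kind of analysis is exactly what Eric describes: a ranked list of where a dollar of improvement effort buys the most overall satisfaction.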
Steve: Are these demonstration results from a web site or a single product?
Eric: This would be a web site. It may even be just a section of a web site, so it could be showing, for example, shopping cart abandoners or people that put items in their cart and then leave without buying. We could focus and measure on that particular group.
Steve: This appears to be relatively clean data you are giving the retailers then, and not a big jumble of numbers that no one can decipher.
Eric: Right. It's not like traditional analytics, where we're tagging and tracking every bit of session data possible. What we're doing is tapping the minds of the visitors to understand the attitudes, the intentions, and the perceptions of their visit -- and, more importantly, to find out what the client could do to improve satisfaction, which then leads to the future behaviors. This is all about a cause-and-effect approach. If we can improve satisfaction, it's going to increase loyalty and increase retention, which directly drive business profitability: lower costs and more revenue.
Steve: What's the time frame on these samples of information? How long are they in place?
Eric: It depends on how frequently the data comes in. In any given analysis we do, we create an online dashboard; it's one of the ways we deliver data. How often the dashboard refreshes depends on how much traffic the site gets and what the response rate is. We could refresh the scores hourly if we wanted to. There's a minimum sample of 300 behind any results you can see online.
Steve: And this is the actual web page clients can view?
Eric: Right. The client has a login user ID and password that they can use to go in and take a look at what their satisfaction rate looks like, almost real-time. So, this really becomes the operational view. Let's say the client has a major marketing campaign going over the weekend. They could log into their portal and see what's happening to their satisfaction results with that marketing campaign while it's still happening.
The second way we deliver data is through monthly satisfaction performance reports. Those include the satisfaction summary. These reports can be put into a balanced scorecard or management report.
The third way we deliver data is through a satisfaction research analyst, or SRA. Each one of our clients is assigned an SRA. The SRAs, on a periodic basis, do deep dives into the data to look for insights. They'll do segmentation analysis. They'll do some crosstabs. They'll dig as far as they can within a certain segment or group for a client.
They'll look at first time visitors versus frequent visitors; look at those who have an account versus those who don't; those who purchase versus those who haven't; and those who have been in a store in the last 30 days versus those who have not. We can now start to understand the cross channel dynamics between different levels of customers.
The data is then presented in PowerPoint via conference call, where the SRAs go through key findings.
Steve: The SRAs make sure clients understand what they're looking at. So if clients want to focus on a particular segment, they can do that.
Eric: That's exactly right. There's a survey component to it, but it's really a means to an end. The real value is methodology, the ACSI, and the data that goes through it and the analysis that comes out the other side.
Steve: I wanted to talk about ACSI for a minute if we could, for online retailers who aren't familiar with that methodology. Where it came from and why it's relevant to online shoppers.
Eric: The American Customer Satisfaction Index has been around for close to 12 years now. It comes out of the National Quality Research Center at the University of Michigan Business School on a quarterly basis as a public index. It looks at a grouping of companies and a grouping of industries, and really tells us how satisfied Americans are with goods and services. There are about 200 companies in this index representing a wide variety of industries, over 50 different industries. So retail, financial services including banks and insurance, e-commerce including folks like Amazon and eBay, are all represented.
It's really the gold standard for measuring how satisfied Americans are with goods and services. The ACSI was developed by Dr. Claes Fornell, who built an approach to satisfaction based on a unique, very sophisticated, advanced, complex statistical modeling technique. He built a structural equation model -- a series of algorithms, basically -- and holds a patent on it.
This proprietary approach was developed back in the late 1980s, and the initial application measured satisfaction with goods and services in Sweden. In the early 1990s the American Society for Quality wanted a similar metric here in the United States, so they evaluated over 60 different methodologies and selected Fornell's. So in the early '90s, the ACSI was born.
The key thing to know about the methodology is that there is no more proven or credible approach to measuring satisfaction, and the evidence behind the ACSI shows it.
One study of the ACSI showed that a one-point increase by a company led to a $275 million increase in market cap. There's another study -- I believe it came out of Iowa -- that shows a one-point increase in the ACSI leading to a 7-percent increase in operational cash flow and a 4-percent decrease in cash-flow volatility. So whether your key business indicator is market cap or cash flow, this methodology absolutely predicts business performance for companies; that's the key takeaway, and really what makes this methodology so powerful.
Again, what we're doing here is applying that methodology to web site satisfaction. I made the statement earlier that there isn't a more proven, credible, accurate, or precise methodology. The ACSI has also proven to be a one-quarter leading indicator for the Gross Domestic Product (GDP). As Americans become more satisfied, about a quarter of a year later the macro-level national GDP rises.
Claes also did something that's very rare and unique among academic types: He actually put his money where his mouth is. He believes so strongly that satisfaction leads to business success that he went out and essentially built a mutual fund based on the ACSI. So instead of looking at key ratios and last quarter's earnings -- which is backward-looking information -- he looked at which companies are satisfying their customers today by the ACSI measuring stick, purchased the stocks of the companies that are rising, and sold short the stocks of the companies that are falling. The portfolio has done exceptionally well.
It's very powerful, yet at the same time, it's very intuitive. If you are satisfying the customers, they're much more likely to come back more often, to spend more of their hard-earned money; they're more likely to tell their friends and neighbors to spend money with you; and that leads to that profit stream which shows up in the stock price.
Danskin, the specialty women's athletic apparel company, essentially took our satisfaction data -- remember, it comes in on a continual basis -- overlaid it on their online revenue data, and found that satisfaction is a one-week leading indicator for online revenues. So as satisfaction rises today, they expect and predict that their online revenues will rise a week out.
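Danskin's exact analysis isn't described here, but a leading-indicator claim like this is typically checked with a lagged cross-correlation between the two weekly series. A minimal sketch with synthetic data (not Danskin's), where revenue is built to echo satisfaction one week later:

```python
import numpy as np

# Toy weekly series: 52 weeks of satisfaction scores, and a revenue
# series constructed so revenue[t] tracks satisfaction[t-1].
rng = np.random.default_rng(1)
satisfaction = 75 + rng.normal(0, 3, 52)
revenue = np.empty(52)
revenue[0] = 100.0
revenue[1:] = 100 + 4 * (satisfaction[:-1] - 75)   # one-week lag

def lagged_corr(lead, follow, lag):
    """Correlation between lead[t] and follow[t + lag]."""
    n = len(lead) - lag
    return np.corrcoef(lead[:n], follow[lag:])[0, 1]

# Correlation peaks at a lag of one week, which is what would flag
# satisfaction as a one-week leading indicator for online revenue.
best_lag = max(range(4), key=lambda k: lagged_corr(satisfaction, revenue, k))
print(best_lag)  # -> 1
```

With real data the peak correlation would be well below 1.0, but the same lag scan identifies how far satisfaction leads revenue.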
From the macro level to the company level to the micro web-site level, a lot of evidence shows that if you keep your focus on ACSI satisfaction measurements, you're going to improve your business and your financial success as your scores rise.
Steve: Which leads to happier consumers, which leads to more spending online.
Eric: Right, it feeds itself.
Steve: Now that is interesting. How long have you guys been around?
Eric: We have been around since fall of 2001, so over four years.
Steve: You currently measure many popular consumer retailers -- Target, Tower Records, Best Buy -- all of these companies which have, coincidentally, improved their web sites in the last couple of years, in terms of search and what products they offer online.
Eric: This really allows companies to focus on the customer, and by doing that and using a very scientific, credible, accurate, precise methodology, you as a company are going to really understand what's driving satisfaction, where you can spend your hard-earned budget, and what resources are going to have the greatest impact in improving satisfaction -- which leads to those loyalty and returns metrics, likelihood to return, recommend, purchase online, et cetera.
You'll notice that this is a very broad-based grouping of companies. We've done good work in the retail world, in financial services, in manufacturing companies, in web services companies, with folks like Ask Jeeves, the Weather Channel and Forbes. One of the reports that we generate is the benchmark report. It's done in an aggregate composite fashion, but our clients have the ability to tap into this network. There are literally hundreds of companies and web sites that have an ACSI Foresee Results number, and they can now look outside their organization with their results and start comparing themselves to non-direct competitive peer groups.
It's one of the ways the web has changed the rules of the game, in that when you come to a web site, you're comparing that web experience to the last five or six dozen web sites you've visited. So you may go out to Overstock, then to Best Buy, then to manage your account on Cingular, and then you go to DHL to track a hockey jersey, and then you go to yet another site. You're evaluating each experience: navigation, functionality, look and feel, based on the previous sites you've visited. So you truly can do a lot of best practice analysis of your web site regardless of the industry you are in.
Steve: And your customer base is not just retail -- it's hospitality, online services like Ask Jeeves and Weather.com. Is this a total of 4 million consumers between all of these companies?
Eric: We add companies as this slide continues to evolve, and this is the number of surveys we've done since the beginning, so it's over 5 million.
Steve: How many customer surveys are you doing a day total?
Eric: I believe the number is over 30,000 surveys a day. That is a lot of good, rich user data coming in from a broad-base group of companies and industries.
Steve: So, it benefits both the companies and the online shoppers and the online shopping experience?
Eric: Exactly, the industries as a whole benefit from this.
Steve: Terrific. So if a retailer wants to review your information then, can you tell them the web site?
Eric: It's ForeseeResults.com.
Steve: Is there anything else pertinent to Foresee that I'm not asking you about?
Eric: I think we've covered most of it, but let me mention our position relative to more traditional web analytics -- clickstream analytics, for example. We complement clickstream extremely well. Traditional clickstream analytics give you a lot of good, rich behavioral data. One of the benefits of the internet is that we can track and tag everything, so there's no shortage of behavioral data to gather. How many people came, where did they come from, what pages did they see, what items did they put in their cart, where did they abandon, what was the average time spent online -- all of that behavioral data is available to be gathered, collected, and organized.
But at the end of the day, especially in a multi-channel scenario, just because someone abandons doesn't mean it's a bad thing. It could be a very good thing. So traditional analytics is going to tell you that if someone abandons and didn't give you their credit card, that's a failure; when in reality, they may have been researching a purchase. They got what they wanted, they put a couple of items in their cart to build a shopping list, and they abandon because they are going to get in their car, drive to the store, touch the merchandise, and give their credit card at the store.
So the website was very successful in being a research tool -- it allowed the consumer in the comfort of their own home to figure out what they wanted, and then they went into the other channel -- offline, brick and mortar -- and the company ultimately got the benefit as a result of that web experience.
So with traditional behavioral data, you may only be getting half the picture. When Foresee comes in, we supply the attitude side of things: the attitudes, intentions, and perceptions of that site visit. So when you see a behavior such as abandonment, you can also see the attitudes, intentions, and next steps behind that particular behavior. What you get is a complete 360-degree view of not only what happened on your site, but the "why" behind it -- and the "why" is measured by this very scientific, credible, accurate methodology.
So we are starting to do a lot more integration with traditional clickstream analytics.
Steve: So that's a full customer life-cycle.
Eric: Correct. And again, that becomes very critical in an offline multi-channel world. There are a lot of statistics that show more than 50 percent of people research online and purchase offline. We saw a lot of that dynamic in the holiday research we did.
Steve: So can we go out on a limb to say that consumers can look forward to much more storefront and online integration?
Eric: Exactly. We live in a multi-channel world. Sam Taylor from Best Buy put it very well when he said, "Best Buy is customer-centric, but we're channel-agnostic." And Forrester talks about right-channeling: "We want to be there for the customer at the right time through the right channel and ultimately, as long as they are interacting on one of those channels, the company wins."
So the ideal scenario is that you are measuring satisfaction across all channels, and again, that's the power of this yardstick, the ACSI methodology. Now folks within the company who are responsible for the different channels can all be using the same measuring stick to look at satisfaction. Otherwise, the offline people are looking at one set of satisfaction data and the online people at another, and it's very difficult to figure out the cross-channel dynamics of the business if you're using different standards.
Steve: Right, everybody's on the same playing field essentially.
Steve: Wow. That's some cutting edge stuff.
Eric: It is, but it's been around a long time also.
Steve: It's taking some basic economic rules and bringing them to the online stage.
Eric: At the end of the day, Claes is an economist. And economics is all about the efficient allocation of scarce resources, and that, at the end of the day, is what we do. We allow our clients to figure out this question: "If I only have X amount of dollars to spend, where do I want to spend it that will have the greatest impact in driving my business?"
Steve: Eric, thanks for talking with us at Technology News.