Thursday
Sep 10, 2009

Privacy concerns in data-driven marketing are nothing new

I’m currently carrying out a little research using Google Books - a technique I use frequently to scan-read marketing books from the 80s and 90s to bring a little sense to the uber-digital world that I currently work in. There are a lot of useful, practical examples of techniques from pre-internet days that those of us who have only ever worked in the internet age are likely unaware of - and while I’ve been doing this for over 12 years, marketing has been around considerably longer.

Anyway, I digress.

While scanning a couple of books this morning, the same quotation appeared in two separate books from 2003 describing segmentation techniques and how to approach customer targeting - which made me think about the recent privacy issues we’re hearing about in web analytics, adserving, behavioural targeting and the like:

The computer and modern data processing provide the refinement – the means to treat individuals as individuals rather than parts of a large aggregate … the treatment of an individual as an individual will not be an unmixed blessing. Problems concerning the protection of privacy will be large.
Shubik (1967)

So, this privacy thing is not new; in fact it’s been on the minds of marketers for over 40 years now. Have organisations like Safecount (now a for-profit WPP company rather than a lobbying / protection group) addressed the public’s concerns over privacy? Are organisations like the Web Analytics Association and the IAB lobbying hard enough for standards and openness?

What of social media? With people increasingly opening up their lives to real-time streaming, is this a privacy concern? Is it ethical for marketers to extract this data for segmentation and targeting purposes? Should the public think more about protecting their personal data? These are interesting topics that will become increasingly important as software and tools give us the ability to crawl, collect, segment and analyse behaviour, attitudes and needs.

Wednesday
Sep 09, 2009

Will smarter software mean a loss of thousands of digital marketing jobs in the future?

Over the last few years there has been an explosion in the number of jobs created in digital marketing, yet the shift in advertising dollars hasn’t happened as predicted. 

This is not a new story.

But as I was thinking this afternoon about the kinds of jobs created, I wondered about the long-term health of digital marketing. Let’s face it, the way we do things has become rather bloated. One only needs to look to outspoken bloggers like George Parker on the “Big Dumb Agencies”, but the story there is generally about why some agencies don’t get it - my worry is about the inefficiencies of agencies in serving client briefs.

Imagine if you will, a briefing today for a digital campaign for a big agency, and think about the roles that we might see involved in a campaign from the agency side:

  • Account team (Account Exec, Account Manager, Account Director, Digital Strategist, Planning Director)
  • Creative team (ECD, Art Directors, Copywriters, Flash Engineers, Information Architect) 
  • Production team (Technology Director, Front End Programmers, Project Director, Project Manager)
  • Media team (Media Director, Media Planners, Ad Ops Manager, Search Planner)
  • Measurement team (Analytics Director, Researcher, Web Analyst, Media Analyst, Search Analyst)

OK, so this is a pretty big team and many of you will tell me that this is unrealistic – but it happens in the bigger agencies more often than you think. These folks aren’t full time on the project, but have roles nonetheless.

Now, think about some of the stories we’re hearing at the moment about the automation of digital processes.

So, even today what we’re seeing is a growing trend of software taking over the time-consuming work that (for example) would otherwise have an analyst spending 100% of her time running reports for various stakeholders rather than analysing data.

What I’m going to predict is that over the coming years we’ll see an explosion in the number of tools that make development, delivery, reporting and optimisation considerably more streamlined than they currently are.

The net impact is that as roles become more automated and streamlined, either jobs start disappearing or smart marketers and agencies learn to redeploy talent to the parts of the business that demand their skill sets. The digital industry has been incredibly inefficient ever since I started, but I think it’s only now that we’re starting to see the signs that software is going to change this.

Potentially a threat to people’s jobs? Yes. But I think the net impact of marketing automation is (a) increased migration of budgets to online initiatives and (b) increased performance of digital marketing – these are all good things, and essential ones, as the depth of digital channels is increasing faster than people can keep up with.

In the measurement space, expect to see increased emphasis on the vendors (Omniture, Webtrends, Google, Yahoo etc) streamlining analysis work through process automation and better access to smarter data collection, organisation, delivery and optimisation tools. The first to market with a faster way of doing business is going to have a clear competitive edge for a considerable amount of time – it’s a battle against the increasing complexity of analysis tools and against the failure to make the work of an analyst more straightforward.

These will be interesting years ahead of us, that’s for sure.

Will it be smarter software, or smarter marketing?

Thursday
Sep 03, 2009

Two Google Analytics filters that will fix problems with double counting of pagenames

How many of you extensively use Google Analytics filters to better organise and structure your data? Creative use of filters that organise and enhance your data can make analysing that data, getting value out of the investment in tracking, and making decisions considerably simpler.

Google Analytics filters can be very easy (or complex) to implement, depending on what you want to do with your data. In this article I’m going to show you two filters that should be applied by default to all profiles except the original profile.

Note how I say except the original profile - it is highly recommended to leave one profile in every account untouched and in its raw format, so that if you make mistakes with filters (worst case: you lose all your data) you’ll always have a backup to fall back on.

I’m going to run through some examples from http://www.ragu.com:

Filter #1: Lowercase

One of the simplest filters to install, yet one of the most powerful. The lowercase filter simply takes the dimension and turns all data lowercase.

Why is this important?

Google Analytics’ default tagging is very literal. In the case of content reporting, GA takes the entire URI and passes that string into the content reports. Some of your visitors may have a browser that modifies URLs, or may type in a URL that uses miXed cAse, or an internal anchor tag might link to .com/Content.html or similar. This results in potential double counting of content, so in the example below from ragu.com these pages will show up as separate pages:

/Content.html
/content.html

Using the lowercase filter we will de-duplicate these types of pages. Here’s what the filter looks like:
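
For reference, the configuration is about as simple as filters get (sketched here roughly as the fields appear in the standard Google Analytics filter manager):

  • Filter Type: Custom filter > Lowercase
  • Filter Field: Request URI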

Filter #2: Remove trailing slash

This is a more complex filter to install, as it uses the advanced filter functionality and requires you to use regular expressions to find a string and modify it. In this article I’m going to give you the exact settings required to run the filter - but I’m also going to try to explain how the filter works, so you can try out your own variations with regular expressions.

Why is this important?

So, if we look at the utmp variable that’s sent to Google Analytics in the two screenshots below we can see that for the first example (left) the pagename will be reported as /index.php/ideas/ and for the second example (right) the pagename will be reported as /index.php/ideas - in other words for the same page we’ll have two different pages in reporting.


This makes analysis frustrating and difficult.

Here’s what the filter looks like:
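
For reference, the settings are roughly as follows (a sketch based on the advanced filter screen in the Google Analytics filter manager - we’ll break down what each part means below):

  • Filter Type: Custom filter > Advanced
  • Field A -> Extract A: Request URI, ^/(.*?)/+$
  • Field B -> Extract B: (leave blank)
  • Output To -> Constructor: Request URI, /$A1
  • Field A Required: Yes; Field B Required: No; Override Output Field: Yes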

The regular expression we’re running in this filter, passing the Field A capture through to the output constructor, carries out the following operations:

1. Match a string in ‘Field A’ that meets the following criteria: ^/(.*?)/+$ – let’s break this down: ^ anchors the match to the beginning of the string, / matches the first slash, (.*?) lazily captures everything between that first slash and the trailing slash(es), and /+$ matches one or more slashes at the end of the string. The captured value is stored in a temporary variable called $A1 (Field A, capture group #1, ie the part held within the parentheses)

2. Take the value held in $A1 and write it into the output field as a slash followed by $A1 (ie /$A1), rebuilding the pagename without its trailing slash - see the worked example below
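
So, taking the earlier example: /index.php/ideas/ matches the pattern, $A1 captures index.php/ideas, and the constructor writes out /index.php/ideas – while /index.php/ideas (already without a trailing slash) doesn’t match the pattern and is left untouched. Both versions of the page now report under the same pagename.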

The use of parentheses to create the Field A and Field B temporary variables (eg $A1, $A2, $B1, $B2 etc) allows you to do some pretty cool stuff, but in this example we’re just using them to strip out the trailing slash from our pagenames.

Here’s a screencast (watch in HD on YouTube) showing exactly how to create this filter using the Google Analytics filter manager from the main settings page:

Now what?

With these two filters in place you will likely have de-duplicated around 95% of all instances where pagenames get double counted in Google Analytics, and you’ll be able to carry out content analysis without pulling your hair out!


Friday
Aug 28, 2009

How-to guide – integrating Google Website Optimizer with Omniture

I started work with a new client this week and came across an interesting, but I imagine fairly common, scenario. The client is a long-time Omniture Site Catalyst user and we have just agreed to move them up the analytics food chain and introduce some testing.

The client had no budget to make use of Omniture’s Test & Target product, which meant we looked to Google to provide the testing engine. I’m a big fan of all the testing tools we use with our clients, from the basic Google Website Optimizer through to the very powerful Autonomy Optimost, but we always make sure we’re deploying the most appropriate solution for the client.

So, here’s the problem: the IT team would not allow us to deploy GWO tags in their web form applications - not because of issues with Google, but because of the deployment time required to get conversion tags into the application (prior to commencement we had agreed the testing scope would be limited to landing pages and marketing content). This was potentially the end of the project: beyond tracking the conversion of visitors starting the application process, we wouldn’t be able to use GWO to measure real conversion performance.

The solution I’m going to describe shows how we can integrate GWO with Omniture, use Omniture to collect the test data, and then take the analysis offline and carry out the heavy-lifting statistical analysis in Excel.

Stage 1: Getting Google Website Optimizer data into Omniture

The easiest part of the process is capturing the test data in Omniture. The process we used has been made considerably simpler by a change made to the GWO code in 2008: a single line of JavaScript now allows us to read the GWO cookie and use its value.

Using the utmx(‘combination_string’) function we can read the variable from the GWO cookie that holds the cell ID of the test variation presented to a visitor:

// Read the combination string (eg "2-1-3") from the GWO cookie, if one is set
var comboID;
if (utmx('combination_string') != undefined) {
    comboID = utmx('combination_string');
}

With this code we now have a variable populated with a string like 2-1-3 that represents the test factors and variations a user will see. The next step is to send this data to Omniture. Those of you who pay attention to Adam Greco’s blog articles will know we’ll want to use an eVar to persist this data for a given period, say 30 days. The code to pass the GWO ID into Omniture is now very simple:

s.eVarXX = comboID;

But since we could be running multiple tests, I would recommend you also pass the test ID into the eVar, introducing a nice naming convention that makes future filtering of the data simpler, eg:

s.eVarXX = "TestID:YYYYYYYYYY > CellID:" + comboID;

Now, prior to adding the code to the page, we’ll want to ensure the eVar is set up correctly. To do this I would recommend an eVar configured with the following parameters:
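
Roughly sketched (the eVar name is just a suggestion, and the 30-day expiry matches the persistence period mentioned above), the settings in the Site Catalyst admin console would be along these lines:

  • Name: GWO Test ID
  • Type: Text String
  • Allocation: Original Value
  • Expire After: 30 days
  • Status: Enabled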

Since we’re also going to be analysing the test sample size, you will need to call ClientCare and have them enable page views and visits as metrics for the eVar. Not many folks know this is possible, but it is – and it makes your eVars considerably more powerful.

The next step is to launch your test; all your test data will now be sent into Omniture.

In order to complete the Google Website Optimizer setup you’ll still need a conversion point – so I’d recommend placing a conversion tag on a click event that takes a visitor to the pages you can’t tag, or applying an ‘engagement’ conversion event such as time on site or bounce rate.

Stage 2: Making it easy for you to quickly analyse the data in Site Catalyst

So, the heavy-lifting analysis isn’t possible inside Site Catalyst, as we’ll be analysing sample size, confidence levels and statistical significance – but to make your data easier to read I would recommend you apply a classification to your eVar. Based on our earlier setup, your eVar data should contain a single row for each cell (variation) in the test, eg:

GoogleTestID:2498413088 > ComboID:0-1-1
GoogleTestID:2498413088 > ComboID:0-1-2 (etc)

This is pretty ugly, so we can classify the data using SAINT to make it more readable – I’d suggest basic classifications of Test Name and Cell ID – but it wouldn’t be difficult to classify further and show all of the variations, either as new classifications or as long strings:
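
As an illustration (the test name here is made up), the SAINT file simply maps each raw key to friendlier values, along these lines:

Key                                        Test Name             Cell ID
GoogleTestID:2498413088 > ComboID:0-1-1    Homepage Hero Test    0-1-1
GoogleTestID:2498413088 > ComboID:0-1-2    Homepage Hero Test    0-1-2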

This will make it easy to read the data in Site Catalyst, but for those of you who want to get really fancy I’d recommend using another fairly hidden feature of Site Catalyst – classification hierarchies – which allow the analyst to drill down into variables that have been classified, eg:

Omniture.Classification.Hierarchy

This will present a report that allows you to first search for Test ID, then to drill down to the Combinations that have run within the test.

This setup, combined with selection of your chosen conversion events will be the basis for the final analysis – so far you’ve got your test data into Site Catalyst and have configured it to look pretty.

Stage 3: Automating data extraction

A big part of testing is making sure we only analyse the data once we’ve reached 90% confidence in the winning combination beating all others, or beating the control. In order to do this we need to regularly extract data and run some tests against it, and to do that we’ll want to use the Omniture Excel plug-in to automate the extraction of the raw data.

We’ll need the basic page views (ie sample size), conversions (your chosen custom event[s]) and a calculated conversion rate metric.

I won’t go through the details of using this tool; it’s best you refer to the great Omniture whitepapers on the subject.

Stage 4: Setting up the Statistical Analysis

This is where we need to get a little analytical. There are many articles published that describe how to integrate GWO data with Google Analytics, but so far I haven’t seen any of them stress that the analysis still needs to be done with the stats in mind. Here are the basics we’ll need to cover off in our Excel formulas:

First of all we’ll need the basic Omniture raw data from Stage 3.

Then we’ll need to calculate the following (there’s a rough sketch of these calculations after the list):

  • Confidence range @ 80% significance
  • Winning combination lift over all (ie second place winner)
  • Winning combination lift over control (ie combo id 0-0)
  • Standard Error
    • SE against second place winner
    • SE against control
  • Confidence to beat all (ie second place winner)
    • Statistical significance at 85%, 90%, 95% and 99%
  • Confidence to beat control (ie combo id 0-0)
    • Statistical significance at 85%, 90%, 95% and 99%
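
If you want to sanity-check the spreadsheet formulas, here’s a rough sketch of the core calculations in JavaScript – a simple two-proportion comparison using the normal approximation, which is roughly the kind of calculation that sits behind GWO’s ‘chance to beat’ figures. The function names, variable names and example numbers are purely illustrative (they’re not part of any Omniture or Google API):

// Conversion rate and standard error for a single test cell
function rate(conversions, pageViews) { return conversions / pageViews; }
function stdErr(p, n) { return Math.sqrt(p * (1 - p) / n); }

// Standard polynomial approximation of the normal CDF, used to turn a z-score into a confidence level
function normCdf(z) {
    var t = 1 / (1 + 0.2316419 * Math.abs(z));
    var d = 0.3989423 * Math.exp(-z * z / 2);
    var p = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
    return z > 0 ? 1 - p : p;
}

// Lift and confidence of cell B beating cell A (use the control, or the second-place cell, as A)
function compare(convA, viewsA, convB, viewsB) {
    var pA = rate(convA, viewsA), pB = rate(convB, viewsB);
    var seDiff = Math.sqrt(Math.pow(stdErr(pA, viewsA), 2) + Math.pow(stdErr(pB, viewsB), 2));
    var z = (pB - pA) / seDiff;
    return {
        lift: (pB - pA) / pA,           // eg 0.225 = 22.5% lift over A
        confidenceToBeat: normCdf(z)    // compare against 0.85, 0.90, 0.95, 0.99
    };
}

// Example: cell 0-1-1 (150 conversions from 5,100 page views) vs the control (120 from 5,000)
// compare(120, 5000, 150, 5100) -> { lift: ~0.225, confidenceToBeat: ~0.95 }

In Excel the equivalents are SQRT(p*(1-p)/n) for the standard errors and NORMSDIST(z) for the confidence level.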

Once we’ve done this we’ll end up with a spreadsheet that looks like this:


So, anyone familiar with Google Website Optimizer will recognise that this table pretty much represents the original formatting that the GWO reporting tool provides.

I have updated the spreadsheet with automation formulas that calculate Cell ID rank, lookups against control and 2nd place winner, conditional formatting to hide certain values, error handling, tests of multiple levels of statistical significance and a charting macro (though it’s still not quite as pretty as the original charts GWO offers).

With this in place, you can now use Google Website Optimizer to test web content where it’s not possible to track full conversion.

I’ll be updating my original spreadsheet over the coming weeks to incorporate additional analysis. Of course, for the more statistically minded, you could avoid the Excel analysis altogether and take the data straight into SPSS or SAS to carry out a full interaction analysis, perhaps running a model against the factor interactions – maybe even going as far as running Taguchi models when carrying out your DoE and using GWO as an exact replacement for Omniture’s Test & Target application…?

Lots of possibilities, so there you have it – Google Website Optimizer and Omniture Site Catalyst in complementary roles.

I hope this was useful for you. I’d love to hear how you are integrating tools.

Wednesday
Aug 19, 2009

What do your internet profiles say about you? What's your persona?

An interesting tool developed by MIT attempts to parse data from search engines, social media etc and run text-based analysis against it to develop a profile of you or your keywords.

Personas scours the web for information and attempts to characterize the person - to fit them to a predetermined set of categories that an algorithmic process created from a massive corpus of data. The computational process is visualized with each stage of the analysis, finally resulting in the presentation of a seemingly authoritative personal profile.

Fun and interesting, like a DNA profile. Here’s the Insightr profile:

Tuesday
Aug 18, 2009

Google predictive model forecasts search volume with average error of just 11.8% 

This morning I saw through a @GoogleAnalytics tweet that Google had updated its Insights for Search website with a collection of interesting new features, including forecasting, animations and multiple language support.

I was interested to learn of this update and delighted to see some great new features implemented as part of this release. It was also satisfying (and very interesting) to learn that India and Singapore index the highest globally for the search term “web analytics”.

This wasn’t new though, so what I was really interested in was the predicted search volume visualisation that Google Insights for Search offers. Here’s an example of the predictions for the term “basketball” (the trend with the dotted line):
Note - Google allows embedding of search volume, but excludes the predictions in the embed

So, this is where it gets more interesting - how does Google forecast this volume, especially given the enormous challenges of unpredictability, external dependent variables, seasonality and the like? It’s not a linear forecast we’re seeing here, so it looks like there’s some serious analytics going into this webpage. I did a little research and found a blog article from the Google research team describing some very interesting research into ways to model and forecast search volume, citing (among others) research from Nature that used search query data to predict influenza epidemics. The team summarises its work:

Specifically, we have used a simple forecasting model that learns basic seasonality and general trend. For each trends sequence of interest, we take a point in time, t, which is about a year back, compute a one year forecasting for t based on historical data available at time t, and compare it to the actual trends sequence that occurs since time t. The error between the forecasting trends and the actual trends characterizes the predictability level of a sequence, and when the error is smaller than a pre-defined threshold, we denote the trends query as predictable.

What the team have come up with is a very smart time-series model, validated against the following criteria (a quick sketch of the error metrics follows the list):

  • Error Metrics:
    • Mean Absolute Prediction Error (MAPE) < 25%
    • Max Absolute Prediction Error (MaxAPE) < 100%
    • Normalized Mean Sum of Squared Errors (NMSSE) < 10.0
  • Seasonal Consistency Metrics
    • Mean Absolute Difference of the ACF Coef. Sets (MeanAbsACFDiff) < 0.2
    • Max Absolute Difference of the ACF Coef. Sets (MaxAbsACFDiff) < 0.4
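
To make the first two error metrics concrete, here’s a quick sketch in JavaScript (with made-up numbers – an illustration of the metric definitions, not Google’s implementation) of how MAPE and MaxAPE are computed when a forecast is held out against the actual series:

// Mean and Max Absolute Prediction Error for a forecast held out against the actual series
function predictionErrors(actual, forecast) {
    var sum = 0, max = 0;
    for (var i = 0; i < actual.length; i++) {
        var err = Math.abs(actual[i] - forecast[i]) / Math.abs(actual[i]);  // |actual - forecast| / |actual|
        sum += err;
        if (err > max) { max = err; }
    }
    return { mape: sum / actual.length, maxApe: max };
}

// Illustrative weekly search-volume indices: the actual series vs the forecast made a year earlier
var actual   = [100, 110, 95, 120];
var forecast = [ 98, 118, 90, 128];
// predictionErrors(actual, forecast) -> { mape: ~0.053 (5.3%), maxApe: ~0.073 (7.3%) }
// Both sit comfortably inside the 25% and 100% thresholds above, so this query would count as predictable.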

Now, the model is validated when the MAPE is less than 25%, but for certain categories the MAPE is as low as 1.76% (the Food & Drink category):

Amazing work. Of course when it comes to certain categories with complex external factors that do not fit within the norms of the model there is much less fit, for example predicting the explosive growth in online communities:

The team acknowledges the impact of external factors, such as news:

It is important to emphasize that users’ search interest is not necessarily always related to consumer preferences, buying intentions, etc. and can be related sometimes to news or other associated events. A full discussion on the background and reasons for the following market observations is beyond the scope of this paper.

Nonetheless this is ground-breaking, in my opinion, for a couple of reasons:

  1. A simple user interface on a free tool is backed by deep, rich analytics that has been transformed into something easy to use
  2. The research Google has carried out in time-series analysis here could be applied to other time-series data - for example predicting web conversion, traffic or other metrics and introducing models of seasonality.

Here’s a direct link to the white paper - it’s complex stuff, but interesting!

Tuesday
Aug 18, 2009

The future of mobile advertising - a vision for 2020 from Acision and OgilvyOne

It’s nice to predict things, and to have a punt at how advertising channels will evolve and how consumer behaviour will impact their performance. This is exactly what Acision and OgilvyOne have done in a recently published whitepaper entitled “Mobile Advertising 2020 Vision - making the move towards mobile directed advertising with collaboration and individual control”.

The paper attempts to look into the future and makes some interesting predictions about how the advertising model will become much more user-centric, based around deep segmentation and preferences data - requiring the telcos to break down their current silos and split into distinct operating units. For those of us here in Singapore this means less operator SMS spam and more in the way of preference centres for us to share our likes and dislikes.

Mobile advertising in 2020 for the consumer is very much about the individual being in control of their own user experience.

One of the things I like about the paper is that it lends itself well to showing how mobile will become a channel enabler that allows people to connect with TVs, billboards, websites etc and create direct channels to, say, eCommerce and conversion paths. This has long been my thesis with web marketing: that digital in its early days was about creating a deeper dialogue beyond simply pushing messages. Mobile, of course, has the potential to do this in ways that the ‘traditional’ computer-based internet could never do, through ubiquitous, always-on, high-speed connectivity.

I’m not sure the paper goes far enough into the future to really identify the big changes that are likely; many of the technologies, capabilities and advertiser options it describes are already being practised by leading advertisers (particularly in Asia), though the telcos are still a major bottleneck in the process of innovation. Here’s the summary of the paper for you, but I recommend you download it and take 15 minutes to read through it:

What could the future look like then?

This leads to the question, will the following scene made famous by the movie Minority Report become the future of advertising? Perhaps…


Thursday
Aug 13, 2009

Congratulations to OgilvyInteractive for praise in Forrester Wave Study - Interactive Agencies

I was delighted to read this week that the team I used to work with at OgilvyInteractive in the US has been recognised as a leading interactive agency by Forrester in its US Interactive Agencies - Strategy Execution Wave Q3 2009.

This recognition is well deserved for some fine work across five key strategic disciplines for its clients, one of which, I’m delighted to say, is measurement and analytics, where Forrester described Ogilvy as follows:

OgilvyInteractive is a Leader because of its high level of competence across all major criteria — specifically measurement and analytics, account management, and social and emerging media. The agency has a large dedicated analytics team and wide array of proprietary tools and models for audience insight, social and emerging media, and analytics.

This is really nice to hear. In my previous role as Head of Web Analytics & Optimisation for OgilvyInteractive New York, working as part of the wider 360 analytics team, I saw first-hand the hard work we put into developing rigorous measurement processes and approaches, and into the relationships we built with our analytics partners - it’s great to see that work again recognised as leading in the space. I feel partially responsible for helping to win this recognition, as I was involved in the submission of our digital measurement approach and strategies to Forrester just before I left Ogilvy.

Since leaving Ogilvy I have continued the same approach to thought leadership in the digital strategy space through research, measurement, analysis and optimisation with Insightr, including reaching out to new partners such as Yahoo!

Congratulations again Ogilvy team, this was well deserved!

Thursday
Aug 13, 2009

A few recommendations for the Yahoo! Web Analytics team to consider

Based on our assessment of the current 9.5 version of Yahoo! Web Analytics, and on our experience of how Omniture and Google are currently running their support, sales and marketing operations, here’s a list of things we think Yahoo! will need to consider doing or implementing over the coming months to really set their market positioning straight:

  • Introduce a beta / testing programme in order to remain competitive with Google Analytics, to show licensed end users a roadmap or commitment to product development.
  • Introduce an open data system making use of APIs to allow developers to export / import data into and out of YWA
  • Develop a YWA blog and/or user community. Initially Yahoo! should leverage members of the YWACN to start writing articles for a new blog.
  • Launch or announce the Rubix segmentation engine (more info here) and become the global leader in providing visitor level segmentation.
  • Improve the user interface, to bring the power of the tool up to the standards being set by Omniture, Webtrends and Google Analytics – here’s a thought: Omniture make heavy use of the Yahoo! UI (YUI) API in their front end (see below). I’m primarily a Mac user and have had some real issues with Firefox 3.5 and the YWA user interface, with items not loading or working as desired; in particular the date picker control.
  • Invest developer time in addressing some of the key measurement challenges in focus today: media attribution, social media measurement, offline conversion, mobile, rich media.
  • Get into institutionalised testing – multi-variate, A/B – but focus more on the process. Help users take their data and develop test hypotheses and test matrices from YWA data, and help them turn those into test ideas. Maybe this is done by including a statistical significance calculator to help users determine whether the hypotheses developed in segmentation are significant (eg does Page X really under-perform against Page Y with statistical significance?).

We thought this was rather amusing: to follow up on the earlier comment about Omniture using Yahoo! APIs to render their Site Catalyst user interface, here are a couple of screenshots of Site Catalyst source code showing use of the API:

Now that Indextools is part of Yahoo! we’re fully expecting a slick new user interface to appear, just as Google delivered with its revamped front end.

Here’s looking forward to what Dennis and the YWA team will be doing over the coming months to ramp up capabilities!


Tuesday
Aug 11, 2009

Insightr Analysis of Yahoo! Web Analytics capabilities

Last week Insightr announced our membership of the Yahoo! Web Analytics Consultant Network (YWACN) and how we would be supporting clients in the Asia Pacific region in adopting the powerful analytics now offered by Yahoo!. This article is a follow-up to that announcement, where we want to start introducing the power of YWA to organisations who may not know much about the secret superstar (here’s a keyword for you: “rubix”) of web analytics.

Note to Dennis - I agree with Eric - please share Rubix with us soon! (and pray don’t tell us this exciting sounding project was cancelled with the Yahoo! acquisition)

This isn’t surprising. Back in the days before Yahoo! acquired Indextools, very few users outside of Europe had been exposed to the ‘mid-range’ analytics tool (defined by its pricing model rather than its capabilities) called Indextools. Of course there were a number of consultancies in Europe who were using Indextools for their clients and doing some tremendous work with the tool. Once Yahoo! announced the acquisition of Indextools there was immediate awareness in the industry that a new tool would be made available, and debate started over whether Yahoo! would make YWA a free tool like Google Analytics.

When I first interviewed Dennis Mortensen – around the time he and his family relocated to New York after the acquisition, while I still worked for Ogilvy – he was unsure of where the product direction would go, though he was very much aware of the huge potential Yahoo! offered the small Indextools team. Immediately Yahoo! carried out the same tasks the Urchin team went through after their Google acquisition: platform migration to a more reliable server network and the incorporation of new security policies and users/groups security.

Dennis did share with us some of his visions, which are now starting to come through in the product decisions. We had a particularly long and heated discussion around media attribution and the different ways this data could be modelled, so expect something interesting to happen in this space. We also talked about the impact of being connected at a cookie level to the vast Yahoo! user network (something Microsoft didn’t quite have with their MSN Live integration in Gatineau, simply because of the reach of Yahoo!) and being able to predict user interest and behaviour from this data set:

I’ll be the first to admit Yahoo! Web Analytics can be a little difficult to use after being spoilt with the slick user interfaces offered by Omniture and Google Analytics, but after a few minutes getting used to the quirks of the user interface you soon realise exactly how much power is under the hood of YWA.

Most of all we would like to send our enormous thanks to the Yahoo! team for adding Insightr to the YWACN network and for making the product free for clients to adopt. We think the strategy of working with consulting partners makes a lot of sense as we get under the hood of the tool; while it may upset many early adopters that YWA isn’t available to all, as Google Analytics is, there are plenty of consultants in the programme who are willing to share new accounts with eager testers.

Things we really like about Yahoo! Web Analytics

  • Report Scheduler
  • Automated Alerts
  • Report Customisation and Bookmarking (full access to all metrics and dimensions)
  • Chart Annotations for Analysis incorporation (for collaborative analysis amongst teams)
  • Custom Calendar Events highlighting holiday / content changes
  • Custom Colour Coding of Data in Tables for faster visual analysis (think Omniture Discover on Premise)
  • Drill down analysis at row level in reports (think Omniture full sub-relations)
  • Custom Segmentation Engine (think Omniture Discover on Demand)
  • Custom Actions (Goals or Events for those used to Google or Omniture)
  • Custom Variables and variable mapping
  • Custom merchandising categorisation (think merchandising eVars for Omniture users)
  • Scenario Analysis (think Omniture Fallout Reports)
  • 200,000 row limit on data downloads (considerably larger than permitted with Google Analytics for keyword analysis)

I’ll be writing more about some of these great features as well as sharing some thoughts on how Yahoo! can make the most of the product to meet the needs of a diverse community.