I am using Crazy Egg to analyze the OCW site's visitors' behavior and find out how they interact with the site's elements and content.

Unlike other tracking tools such as Google Analytics or Site Meter, which track traffic trends and behavior, Crazy Egg records every click a visitor makes on any element of the site. This should give us a clear picture not only of where our visitors are on the site but of what they are clicking on.

I can also test different versions of a page to see which works better. Armed with this information, we can better identify the best placement for AdSense ads or choose the design that improves the conversion rates of OCW pages.

Traditional site tracking tools offer you a ton of information, including: Popular Pages, Entry Pages, Exit Pages, Came From, Visitor Paths, Visit Duration, etc. But ironically, it is not possible to use this information to understand what users actually do on a page. Hence, these volumes of information are practically useless for deciding what is wrong with your web site and how you can improve it. The creators of Crazy Egg saw this gap and realized there was a big opportunity to help companies assess the effectiveness of each web page. Once you look at the problem from this point of view, it becomes obvious how important it is to measure and visualize the hot spots on each web page.
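
Under the hood, this kind of click mapping amounts to a page-wide click listener that reports each click's coordinates and target element to a collection server. Here is a minimal TypeScript sketch of the idea; the /collect endpoint and payload shape are hypothetical, not Crazy Egg's actual API:

```typescript
// Minimal sketch of page-wide click capture, in the spirit of what a
// click-mapping tool does. The endpoint and payload are hypothetical.
interface ClickRecord {
  x: number;        // click position relative to the document
  y: number;
  selector: string; // rough identity of the clicked element
  page: string;     // which page the click happened on
  ts: number;       // timestamp
}

function describe(el: Element): string {
  // Build a short identifier like "a#donate" or "img.course-thumb".
  const id = el.id ? "#" + el.id : "";
  const cls = el.className ? "." + String(el.className).split(/\s+/)[0] : "";
  return el.tagName.toLowerCase() + id + cls;
}

document.addEventListener("click", (e: MouseEvent) => {
  const record: ClickRecord = {
    x: e.pageX,
    y: e.pageY,
    selector: e.target instanceof Element ? describe(e.target) : "unknown",
    page: location.pathname,
    ts: Date.now(),
  };
  // Fire-and-forget beacon to the hypothetical collection endpoint.
  navigator.sendBeacon("/collect", JSON.stringify(record));
});
```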

Crazy Egg offers four methods to analyze your clicks:

1. Overlay Mode

2. List Mode

3. Heatmap Mode

4. Confetti Mode

Looking at these click visualizations allows you to determine whether your users are clicking where you intend them to click. Often, the results are surprising, because business concepts and design elements that are obvious to you are sometimes foreign to your users. Because the context in which they arrive at your page varies, their clicks may not be what you intend. For example, one thing you may not expect is that people click a lot on images, even when the images are not linked. These clicks can be frustrating to users, because their expectation of drilling in and learning more is not met. And because attention is scarce these days, any minor disappointment might lead them to leave the page.
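
A heat map itself is just those recorded clicks binned into grid cells and ranked by count; the densest cells are the hot spots. A small sketch of that aggregation step (the 20-pixel cell size and sample data are arbitrary):

```typescript
// Bin raw click coordinates into fixed-size grid cells and rank the
// hottest cells. This is the core aggregation behind a click heat map.
interface Click { x: number; y: number; }

function heatmap(clicks: Click[], cellSize = 20): Map<string, number> {
  const grid = new Map<string, number>();
  for (const c of clicks) {
    const key = `${Math.floor(c.x / cellSize)},${Math.floor(c.y / cellSize)}`;
    grid.set(key, (grid.get(key) ?? 0) + 1);
  }
  return grid;
}

// Example: find the five hottest 20x20-pixel regions on a page.
const clicks: Click[] = [
  { x: 103, y: 54 }, { x: 99, y: 61 }, { x: 310, y: 412 }, { x: 101, y: 58 },
];
const hottest = [...heatmap(clicks).entries()]
  .sort((a, b) => b[1] - a[1])
  .slice(0, 5);
console.log(hottest); // [ [ "5,2", 2 ], [ "4,3", 1 ], [ "15,20", 1 ] ]
```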

Phase I (Data Perspective):

Analyze existing data to determine current state of usability. 

  • 6/4/2009 – Hierarchical Sitemap 
Used to determine high-level site structure and course taxonomy; will aid in creation of segmentation filters in Google Analytics.  Review directory structure for SEO.
  • 6/4/2009 – Begin SEO analysis with AWR (Advanced Web Ranking)
    Determine current search engine rankings and keyword density levels based on GA keyword research.  Benchmark.
  • 6/8/2009 – Rough KPIs definition
  • 6/11/2009 – Final KPIs definition
    Define a set of measures to be tracked going forward.
  • 6/12/2009 – Begin setup of Google Analytics KPI Dashboard/Filters
Display KPI data in an easily interpretable way; continue tracking KPIs
  • 6/13/2009 – Crazy Egg application implemented
    Application provides insight into user behavior beyond statistical analysis


Milestone I Deliverables (Phase I Complete):

  • 6/23/2009 – Google Analytics KPI Dashboard delivered
    Will provide an ongoing set of measures to determine the effectiveness of campaigns
  • 6/23/2009 – Crazy Egg reports generated and finalized
    Will provide insight into user behavior and how users use the site.  Will indicate any navigational issues as well as the prominence of our calls to action (for this report, we will treat donations, newsletter signups, the 'email this page' function, feedback, viewing course home pages, and downloading .zip files as the calls to action).

 

Phase II (User Perspective):

  • 6/24/2009 – Phase II Kick-off Meeting
  • 6/26/2009 – Survey results analysis complete
Analyze results from past survey for usability and prepare a report.  There may be kernels of knowledge here that we overlooked or that will provide insight into site usability or future enhancements.
  • 6/30/2009 – Persona definition complete
Define individual personas based on user statistics.  These personas will aid in User Group selection and allow us to better focus our campaign efforts.
  • 7/1/2009 through 7/8/2009 – User Group solicitation complete, group defined
    Group selected based on user personas from TARs as well as friends/colleagues, etc.


  • 7/13/2009 – Roundtable questions and format defined
    These questions will serve as a springboard for an open roundtable discussion with our user group.
  • 7/15/2009 – Hold User Group Virtual Roundtable
  • 7/20/2009 – Launch usability survey in production environment
    Launch a short usability survey on the production site.  Random sampling based on session length or other criteria (could be completely random)?
  • 7/22/2009 – Roundtable feedback aggregated and documented
  • 7/23/2009 – Usability Lab prep begins
Determine pre- and post-exercise interview questions
  • 7/29/2009 – Conduct Usability study in MIT Usability Lab
Monitor a random sampling of OCW newbies as they interact with the site to bring visibility to usability issues.

 

Milestone II Deliverables (Phase II Complete):

  • 8/4/2009 – Presentation of User Group feedback
  • 8/4/2009 – Presentation of Usability Study results with recommendations
  • 8/4/2009 – Presentation of production usability survey results

 

Phase III (Branding/Outreach):

  • Prepare campaign effectiveness report
    Compare campaign dates with GA data (conversion funnels) to determine effectiveness and identify any bottlenecks/issues in the conversion stream.
  • Tag individual site banners with GA tags (donor campaigns)
    This will allow for better tracking of banner campaign effectiveness (see the event-tagging sketch after this list).  Prepare conversion funnels in GA for campaigns.
  • Compile AWR (Advanced Web Ranking) reports and recommendations
    Conduct a one-month trial of AWR to analyze search engine ranking and keyword trends among our competitors.  Looking not only at other OCW venues but at commercial and other OER venues as well, we will name the following as competitors: University of Michigan OCW, Utah State OCW, Johns Hopkins OCW, Tufts OCW, The Open University, Connexions, Curriki, Cramster, and Academic Earth.


  • Establish GA geographic segment filters/reports/dashboard (outreach)
    This will help identify geographic locations where outreach can be targeted.
  • Conduct A/B testing based on feedback from Phases I and II
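
For the banner-tagging item above, here is how the tagging might look with the classic ga.js event-tracking call. The account ID, the donor-banner CSS class, and the category/label names are placeholders, not OCW's actual setup:

```typescript
// Illustrative banner tagging with classic ga.js event tracking.
// The account ID and category/label names are placeholders.
declare const _gat: {
  _getTracker(account: string): {
    _trackPageview(path?: string): void;
    _trackEvent(category: string, action: string, label?: string): void;
  };
};

const pageTracker = _gat._getTracker("UA-XXXXXXX-X");
pageTracker._trackPageview();

// Attach a click handler to every donor-campaign banner so each click
// is recorded as a GA event before the visitor leaves the page.
document.querySelectorAll<HTMLAnchorElement>("a.donor-banner").forEach((banner) => {
  banner.addEventListener("click", () => {
    pageTracker._trackEvent("Donor Banners", "click", banner.id || banner.href);
  });
});
```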

Phase I:

Implement Crazy Egg on the following pages:

  • Newsletter Signup
  • Donate page
  • Feedback page
  • All Global pages (~60)
  • Top 20 courses (all pages within)
  • Select HFH pages
  • ‘email this page’ form

Through this tool, we will be able to tell how users use the site, not just that they use the site (which is all standard analytics tells us).
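
The implementation itself is light: Crazy Egg supplies a per-account tracking script to drop into each page template. A generic async-loader sketch follows; the script URL is a stand-in, not Crazy Egg's real endpoint, and the actual snippet would come from the Crazy Egg dashboard:

```typescript
// Generic async third-party tracker loader. The URL is a placeholder;
// the real snippet comes from the Crazy Egg account dashboard.
function loadTracker(src: string): void {
  const s = document.createElement("script");
  s.src = src;
  s.async = true; // don't block page rendering
  document.head.appendChild(s);
}

// Hypothetical per-site tracking script URL.
loadTracker("https://example.invalid/tracker/0000/0000.js");
```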

 

Analytics

The current WebTrends and Google Analytics (GA) data can be filtered and reports established to better reveal user behavior. 

  • Establish a set of Key Performance Indicators (KPI).
  • Track conversion goals/funnels (KPIs): email this page, newsletter forwards, donation conversions, newsletter conversions, feedback conversions, etc. (see the virtual-pageview sketch after this list).
  • Create Analytics dashboard for KPIs and goal funnels for Exec Management – monthly meetings to discuss results.
  • Use advanced segmentation in GA to reveal the demographics of conversion events. (Can help with outreach.)
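
For funnel steps that have no URL of their own (form submissions, 'email this page', etc.), a common classic-GA technique is to record each step as a virtual pageview that a goal definition can then match. A sketch, assuming a ga.js pageTracker is already set up as in the banner example; the form selector and virtual paths are illustrative:

```typescript
// Record a funnel step as a virtual pageview so a GA goal/funnel can
// match it even though the step has no real URL. Paths are illustrative.
declare const pageTracker: { _trackPageview(path?: string): void };

const form = document.querySelector<HTMLFormElement>("#newsletter-signup");
form?.addEventListener("submit", () => {
  // A GA goal would be defined on /virtual/newsletter/complete, with
  // earlier funnel steps tracked the same way.
  pageTracker._trackPageview("/virtual/newsletter/complete");
});
```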

 

Phase II:

User Group
Solicit current users to become members of a user group.  Select up to 20 users based on pre-defined criteria. 

  • Conduct individual interviews of group members: how they use the site, what they feel is lacking, what can be improved, how they would improve it, etc.
  • Host virtual conference to discuss OCW usability and brainstorm improvements as a group.
  • Beta release usability improvements to group and collect/analyze feedback – vet and implement changes.

 

Usability Study / Focus Group

Solicit a group of students/employees from campus and the surrounding area.  Provide an incentive (e.g., a Starbucks card) for participation in the study.  Users should have little or no knowledge of OCW and should never have visited the site.

  • Employ MIT’s usability lab and conduct multiple studies of user behavior followed by interviews/debriefings.
  • Conduct A/B testing on this group – prepare multiple visual iterations of key site elements: navigation, conversion objects, and user workflow.  Determine which are most effective based on user interviews.

 

Site Usability Survey in production environment

  • Conduct a selective usability survey on the production site with a small sampling of users (see the sampling sketch after this list).
  • Create a real-time 'dashboard' of the results indicating performance in key areas of usability (ease of navigation, effectiveness of content, ease of course discoverability, etc.)
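
One simple way to implement the small sampling is a per-visitor coin flip plus a cookie so nobody is prompted twice. The 2% rate and cookie name below are arbitrary illustration values:

```typescript
// Show the survey invitation to roughly 2% of visitors, at most once
// per browser. Rate and cookie name are arbitrary illustration values.
const SAMPLE_RATE = 0.02;
const COOKIE = "ocw_survey_seen";

function alreadyAsked(): boolean {
  return document.cookie.split("; ").some((c) => c.startsWith(`${COOKIE}=`));
}

function maybeInviteToSurvey(show: () => void): void {
  if (alreadyAsked()) return;
  // Remember the decision either way so the visitor isn't re-prompted.
  document.cookie = `${COOKIE}=1; max-age=${60 * 60 * 24 * 365}; path=/`;
  if (Math.random() < SAMPLE_RATE) show();
}

maybeInviteToSurvey(() => {
  // Hypothetical hook: open the survey widget here.
  console.log("Survey invitation shown");
});
```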

 

Phase III:

A/B testing on production site

  • Conduct A/B testing with a small percentage of site users through Google Webmaster Tools (see the bucketing sketch below).
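
Whichever tool ultimately runs the test, the core mechanic is assigning each visitor a stable variant so that repeat visits see the same version. A sketch using a simple hash of a visitor ID; the ID source and the 50/50 split are assumptions:

```typescript
// Deterministically bucket a visitor into variant "A" or "B" from a
// stable visitor ID, so repeat visits always see the same variant.
function hashString(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // simple 32-bit rolling hash
  }
  return h;
}

function assignVariant(visitorId: string, split = 0.5): "A" | "B" {
  // Map the hash onto [0, 1] and compare with the traffic split.
  return hashString(visitorId) / 0xffffffff < split ? "A" : "B";
}

console.log(assignVariant("visitor-12345")); // stable across visits
```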

 

OCW Discoverability/Marketing Effectiveness (Outreach)

  • Campaign effectiveness (banners)
    Measure banner clicks, track flow through conversion funnel
  • Newsletter effectiveness
    Track newsletter clickthroughs through the conversion funnel to determine the effectiveness of these mailings (see the link-tagging sketch after this list).
  • SEO Analysis
Utilize Advanced Web Ranking (http://www.advancedwebranking.com/) to determine current ranking against competitors, keyword density/effectiveness, inbound and outbound link quality and quantity, etc.
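
For the newsletter tracking above, the standard GA approach is to tag the links in each mailing with campaign parameters so the resulting visits appear as a distinct campaign in GA. A small helper sketch; the parameter values and issue naming are illustrative:

```typescript
// Tag a newsletter link with standard GA campaign parameters so visits
// from the mailing show up as a distinct campaign. Values illustrative.
function tagNewsletterLink(url: string, issue: string): string {
  const u = new URL(url);
  u.searchParams.set("utm_source", "newsletter");
  u.searchParams.set("utm_medium", "email");
  u.searchParams.set("utm_campaign", issue);
  return u.toString();
}

console.log(tagNewsletterLink("https://ocw.mit.edu/courses/", "2009-07"));
// https://ocw.mit.edu/courses/?utm_source=newsletter&utm_medium=email&utm_campaign=2009-07
```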

OCW Usability Assessment Project

 

Strategy

I firmly believe in the ACRE methodology: Acquire, Convert, Retain, Extend.  OCW has the power of the MIT name and its position as the forerunner in OCW to Acquire users. But once users are on the site, we must provide a pleasant and fulfilling experience to have them Convert (sign up for the newsletter, donate, provide feedback, explore course content, etc.), to Retain them (keep them on the site exploring course content as long as possible and have them come back in the future), and to Extend our user base (through word of mouth, the 'email this page' function, embeds, newsletter forwards, RSS feeds, etc.).  At present, course use is difficult to measure.  Ideally, new goals and measures will come about as a result of this usability exercise.

 

The Whole Picture

In order to determine the true usability of a website, several factors have to be considered beyond site analytics/statistics.  Although these methods do provide insight into user behavior, they present only a small view of the site's usability and should serve as a foundation for further research.  To get the whole picture, we must look beyond the statistical, or data, 'box' and consider the human user.  For this, we can employ tactics such as user groups, usability studies, software applications, and A/B and behavioral testing (as well as site analytics), among others.

 

Heuristic Nature of Humans

The goal of a website usability survey is to take the guesswork out of the user's viewing experience.  The user will generally have a clear intention when visiting a site.  If that intention cannot be immediately addressed on the landing page, the user will take a trial-and-error approach, making educated guesses at which path may lead to the desired content.  This usually ends in user frustration and a bounce, where the user exits the site and starts the investigation over somewhere else (most likely a search engine).

 

If users are provided with a clear, concise path to the content, heuristic behavior can be minimized, resulting in a better overall user experience and multiple return visits.
