
What is Google Analytics 4 and how does it differ from Universal Analytics? GA4 vs. Universal Analytics: a side-by-side comparison (Part 2)

In the first part of our exploration and tour of the past, we dived into the fascinating journey of Urchin, tracing its evolution from a pioneering analytics tool to its pivotal role in the creation of Google Analytics. This story highlighted key milestones, from Urchin’s early days and its acquisition by Google, to the transformative steps that led to the development of Google Analytics as a dominant force in web analytics.

In this next part of our story, we look at the newest big change in Google Analytics: the launch of Google Analytics 4 (GA4). We'll make GA4 easy to understand, explaining what it is and how it differs from the older version, Universal Analytics. We'll compare the two, pointing out what's new and better in GA4: how it handles data, respects privacy, tracks users, and uses machine learning for deeper insights.

This section is not just about the technical side of GA4. We'll also talk about what these changes mean for businesses and marketers. How will moving to GA4 affect them? What challenges might they face, and what new opportunities could the change bring in this fast-moving digital world?

Let’s dive into Google Analytics 4 together and see how it’s changing the game in understanding website data.

From Google Analytics to Universal Analytics

Post-acquisition, Google utilized the foundation laid by Urchin to develop what would become one of the most popular web analytics services worldwide. Urchin’s technology and insights were instrumental in the creation of Google Analytics, a platform that has since become a staple for webmasters, marketers, and businesses looking to gain insights into their online presence and performance.

Following the acquisition, Google set out to transform Urchin on Demand into a more accessible and powerful web analytics tool. In November 2005, Google launched the first version of Google Analytics. This launch marked a significant moment in the web analytics industry, as it made robust analytics tools available for free to webmasters and marketers worldwide. The initial release was so popular that Google had to temporarily halt new sign-ups due to overwhelming demand.

The early versions of Google Analytics focused primarily on providing insights into website traffic, such as page views, session durations, and bounce rates. It was a tool designed to help website owners understand how visitors interacted with their sites. This data was crucial for businesses and marketers to optimize their websites for better user engagement and conversion rates.

Over the years, Google Analytics underwent several updates to enhance its functionality and user interface. These updates included improved goal tracking, e-commerce reporting, and the introduction of custom reports, which allowed users to tailor their data analysis more closely to their specific needs.

In 2007, Google introduced a major update with the launch of Google Analytics v2. This version featured a completely redesigned user interface, making it more user-friendly and visually appealing. The update also included new features like internal site search tracking and event tracking, which provided deeper insights into user behavior on websites.

Another significant development came in 2011 with the introduction of real-time analytics, enabling users to see active visitors on their site and their activities as they happened. This feature was a game-changer for immediate data analysis and decision-making.

Despite these advancements, the digital landscape was rapidly evolving, with the rise of mobile devices and the need for cross-platform tracking. To address these challenges, Google announced Universal Analytics in late 2012, marking the next major phase in the evolution of Google Analytics. Universal Analytics brought significant enhancements, including the ability to track user interactions across different devices and platforms, providing a more comprehensive view of the customer journey.

Universal Analytics (UA) rolled out broadly in 2013 and quickly became the go-to platform for web tracking. However, on March 16, 2022, Google announced that Universal Analytics would be phased out, with properties set to stop processing data from July 2023. This marked a significant shift in Google's approach to web analytics, as indicated in their final announcement on the matter.

The period before Universal Analytics was thus characterized by continuous innovation and improvement, as Google Analytics evolved from a basic website traffic analysis tool into a sophisticated platform capable of providing deep insights into user behavior and digital marketing effectiveness.

What happened before GA4?

Google Analytics has changed a lot since 2005, when Google acquired a product named Urchin – that's where the term UTM, or Urchin Tracking Module, comes from.

Over time, this evolved into the original Google Analytics.

In 2013, Google introduced Universal Analytics (UA), which became the main way to track website data.

What was new in Universal Analytics

Universal Analytics (UA), introduced by Google as the next evolution of Google Analytics, brought several significant improvements and new features compared to the older versions of Google Analytics. Here are some of the key advancements:

  1. User ID tracking – UA introduced the User ID feature, allowing more accurate tracking of individual users across multiple devices. This was a major leap from the older versions, which were more focused on sessions and didn’t track individual users as effectively across various devices.
  2. Custom dimensions and metrics – Universal Analytics allowed for more customization in tracking. Users could define their custom dimensions and metrics to track data that was specific to their business needs, something that wasn’t as flexible in previous versions.
  3. Enhanced E-commerce reporting – UA brought in a more sophisticated e-commerce reporting feature, enabling deeper insights into customer purchasing behavior, product performance, and more detailed transaction data.
  4. Simplified and more accurate tracking code – the tracking code in UA (analytics.js) was simpler and more efficient than the older ga.js script, resulting in more accurate data collection and easier implementation (a minimal snippet appears just below this list).
  5. Offline data tracking – Universal Analytics provided the ability to track offline interactions, a feature not available in the previous versions. This allowed for a more comprehensive view of customer interactions beyond just their online activities.
  6. Session and campaign timeout handling – UA offered more control over session and campaign timeouts, allowing users to define the length of these periods based on their specific business requirements.
  7. More powerful data collection API – the measurement protocol in UA allowed for data collection from any digital device, not just websites, broadening the scope of analytics to include things like gaming consoles and customer relationship management systems.
  8. Enhanced site speed and performance metrics – UA included more detailed site speed and performance metrics, giving insights into how website performance impacted user experience.
  9. Custom channel groupings and content groupings – it allowed for more granular segmentation of traffic and content, helping in better analysis and understanding of user engagement.
  10. Mobile app analytics – UA was better equipped for mobile app analytics compared to its predecessors, providing detailed insights into mobile app usage and user engagement.

These improvements made Universal Analytics a more robust, flexible, and user-centric analytics tool, offering businesses deeper insights into their users' behaviors and preferences.
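
As a side note to point 4 above, this is roughly what the basic analytics.js setup looked like once the loader snippet was in place – a minimal sketch written as TypeScript for consistency with the other examples in this article, with a placeholder property ID:

```typescript
// Minimal Universal Analytics (analytics.js) page tracking.
// ga() is the command queue defined by the analytics.js loader snippet;
// "UA-XXXXX-Y" is a placeholder property ID.
declare function ga(...args: unknown[]): void;

ga('create', 'UA-XXXXX-Y', 'auto'); // create a tracker for the property
ga('send', 'pageview');             // record a pageview hit for the current page
```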

But on March 16, 2022, Google announced that UA would stop processing data in July 2023.

When was Universal Analytics deprecated?

“On July 1, 2023, this property will stop processing data. Starting in March 2023, for continued website measurement, migrate your original property settings to a Google Analytics 4 (GA4) property, or they’ll be copied for you to an existing GA4 property, reusing existing site tags.”

That was the original message marketers saw for the first time on March 16, 2022, when Google announced that Universal Analytics would stop processing hits on July 1, 2023, for standard UA properties, and on October 1, 2023, for UA 360 properties. Given the relatively short time frame, the announcement caught many marketers off guard and sparked a bit of panic within the marketing and web analytics industry. Although Google Analytics 4 had been available since October 2020, its adoption rate was still quite low at the time. This likely influenced Google's decision to expedite the transition by setting an early deprecation date for Universal Analytics.

While more than a year might have seemed like sufficient time to implement a new tracking script on a website, the critical role Google Analytics plays in performance measurement for most businesses made the transition period quite significant. Before starting a Google Analytics 4 implementation, it was essential to grasp its differences from Universal Analytics, so we prepared the following list of important distinctions to aid that understanding.

What were the main reasons Google introduced GA4?

Google introduced Google Analytics 4 (GA4) primarily to adapt to the evolving digital landscape where technology and user behavior are constantly changing. The traditional version of Google Analytics (Universal Analytics) was not equipped to provide accurate insights considering these rapid changes. Here are the main reasons for the introduction of GA4:

  1. Adapting to a changing digital landscape – GA4 was designed to keep up with advancements in technology and the evolving patterns of user behavior, which the older version could not accurately capture. Universal Analytics was built around a session-based data model focused primarily on desktop web data; as the internet shifted towards mobile and app-based interactions, its ability to provide insightful data started lagging. GA4 instead introduces an event-based data model that is more adaptable to the variety of user interactions across websites and apps. This model allows data from both web and app sources to be collected and analyzed in a unified manner, offering a more comprehensive view of user engagement – crucial as user interactions become more varied and less predictable, with multiple devices and platforms used interchangeably. GA4's flexible approach to measurement also anticipates the new platforms, devices, and user behaviors that could emerge in the future, ensuring the tool remains relevant and effective.
  2. Privacy-first approach – GA4 represents a shift towards privacy-centric tracking. With increasing concerns about user privacy and data protection, GA4 aims to address these challenges while still offering robust analytics. The decade-old Universal Analytics platform, with a data architecture that increasingly hindered scalability and fell short of emerging data privacy laws like GDPR and CCPA, created an urgent need for Google to facilitate a transition to a more modern solution. Google Analytics 4 has therefore been fundamentally designed with privacy at its core, and it ushers in a range of privacy-centric features to help businesses align with stringent data protection standards:
    • IP anonymization – by default, Google Analytics 4 anonymizes IP addresses, reducing the risk of personal data breaches and ensuring user anonymity.
    • Data retention controls – Google Analytics 4 offers more granular controls over data storage durations, allowing organizations to decide how long they retain data.
    • Server and data transfer locations – Google Analytics 4 provides clear policies regarding the location of its data servers and restricts data transfers, crucial for compliance with regulations like GDPR which have specific requirements about where and how personal data can be stored and processed.
    • Consent mode – this feature allows businesses to adjust how they collect and use data based on the consent provided by users, ensuring compliance with user privacy preferences (a minimal sketch follows this list).
    • User data deletion – Google Analytics 4 enables the deletion of users’ personal data upon request, supporting the ‘right to be forgotten’ as stipulated by privacy regulations.
    • PII handling rules – strict rules on the handling of personally identifiable information (PII) are implemented to prevent privacy violations.
  3. Cross-channel measurement – the platform is geared towards cross-channel measurement, which allows for a more integrated view of the customer journey across different platforms and devices.
  4. Advanced machine learning – GA4 incorporates Google's advanced machine learning to provide predictive insights, helping organizations anticipate future user actions and better understand patterns in website traffic and user behavior without relying on traditional page hits.
  5. Enhanced data visualization and analytics – with a focus on machine learning and data visualization, GA4 offers more comprehensive predictive analytics and insights, as well as improved data visualizations compared to its predecessor. It also simplifies the tracking process, making it more user-friendly.
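
To make the consent mode feature above concrete, here is a minimal sketch of Google's documented gtag.js Consent Mode calls; when and how the "update" call fires depends on your own consent banner, so that wiring is an assumption:

```typescript
// Consent Mode sketch: deny storage by default, grant it after opt-in.
// gtag() is the function defined by the Google tag (gtag.js) snippet.
declare function gtag(...args: unknown[]): void;

// Default consent state, set before any measurement happens:
gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied',
});

// Called later, e.g. from a cookie banner's "accept" handler (assumed):
function onUserAcceptedAnalytics(): void {
  gtag('consent', 'update', { analytics_storage: 'granted' });
}
```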

All key differences between GA4 and Universal Analytics

How does GA4 measure users compared to Universal Analytics?

Universal Analytics relies on “cookie-based” tracking to collect data. When a website uses Universal Analytics, it places a cookie in the user’s web browser. This cookie enables the platform to monitor and record the user’s web activity during their session on the site. Universal Analytics uses a session-based data model for measurement.

In contrast, GA4 offers a more versatile approach. According to Google, GA4 allows businesses to measure user activity across various platforms and devices using multiple forms of identity. This includes first-party data and “Google signals” from users who have consented to personalized ads. GA4 also continues to use cookies where available for tracking purposes. However, instead of relying solely on tracking sessions, GA4 adopts an event-based data model.

In today’s privacy-conscious world, the use of cookies is under scrutiny, and their prevalence may decrease over time. While this trend is arguably positive for user privacy, it poses challenges for digital marketers, who have traditionally relied on cookies for tracking and targeting.

|  | Universal Analytics properties | Google Analytics 4 properties |
| --- | --- | --- |
| Measurement | Session-based data model | Flexible event-based data model |
| Reporting | Limited cross-device and cross-platform reporting | Full cross-device and cross-platform reporting |
| Automation | Limited automation | Machine learning throughout to improve and simplify insight discovery |

In GA4, users are tracked using an event-based data model, differing from the session-based model used previously. Unlike Universal Analytics, which mainly uses the ‘Total Users’ metric, GA4 emphasizes ‘Active Users.’ It also tracks ‘Total Users,’ ‘New Users,’ and ‘Returning Users.’ While diving into this aspect of GA4 might not be advisable for beginners in Google Analytics, a guide on user measurement in GA4 could serve as a valuable resource for future reference.

In GA4, all hits are tracked as events

Let's take the definitions straight from Google, starting with the session-based model.

In UA properties, Analytics groups data into sessions, and these sessions are the foundation of all reporting. A session is a group of user interactions with your website that take place within a given time frame.

During a session, Analytics collects and stores user interactions, such as pageviews, events, and eCommerce transactions, as hits. A single session can contain multiple hits, depending on how a user interacts with your website.

How about the event-based model?

In GA4 properties, you can still see session data, but Analytics collects and stores user interactions with your website or app as events. Events provide insight into what's happening on your website or app, such as pageviews, button clicks, user actions, or system events.

Events can collect and send pieces of information that more fully specify the action the user took, or add further context to the event or user. This information could include things like the value of a purchase, the title of the page a user visited, or the geographic location of the user.
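
For illustration, here is a minimal sketch of a GA4 event carrying such parameters via gtag.js. The purchase event and its currency, value, and transaction_id parameters follow Google's documented schema; the values themselves are placeholders:

```typescript
// A GA4 event whose parameters add context to the action.
declare function gtag(...args: unknown[]): void;

gtag('event', 'purchase', {
  currency: 'USD',
  value: 129.99,             // the value of the purchase
  transaction_id: 'T_12345', // placeholder transaction ID
});
```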

This might not sound like much, but this is a huge, huge difference.

Event-based data vs. session-based data

The table below shows how Google Analytics captures data. On the left, the Universal Analytics property lists various "hit types," each representing a different kind of data. In web analytics, a "hit" refers to any interaction a visitor makes with your site or app, such as a click, page view, scroll, file download, purchase, or any other trackable action. On the right, the GA4 property simplifies this by categorizing all types of hits as events.

| Universal Analytics | Google Analytics 4 |
| --- | --- |
| Page view | Event |
| Event | Event |
| Social | Event |
| Transaction/e-commerce | Event |
| User timing | Event |
| Exception | Event |
| App/screen view | Event |

Events existed in UA as well, with an associated category, action, and label, but these classifications do not exist in GA4. Instead, GA4 works with event parameters – additional pieces of information about the action (event) a user took. Some event parameters are sent automatically, such as page_title, and additional ones can be added (you can log up to 25 event parameters with each event). Since the data models are fundamentally different, Google recommends not simply copying existing event logic from UA to GA4, but instead implementing new logic that makes sense in this new context.

Universal Analytics can track events such as button clicks, scroll depth, and downloads, but this necessitates the use of Google Tag Manager. In GA4, while some events still require Google Tag Manager for tracking (known as “recommended events” and “custom events”), there are also events that GA4 automatically measures without additional tools. These automatically tracked events are divided into two categories: “automatically collected events” and “enhanced measurement events.”

Event types in GA4

Let’s dive deeper into the different event categories in GA4, as outlined in the Google Developers guide for GA4.

There are four main event categories in GA4, but I'll focus first on two: automatically collected events and enhanced measurement events. These event types are logged automatically through either the gtag.js snippet or a Google Tag Manager configuration and don't require extra coding.

Here’s an overview of automatically collected events:

  1. session_start: Triggers when a user initiates a session on a website or app.
  2. first_visit: Fires if it’s a user’s first visit to the site or app.
  3. user_engagement: Fires while a page is in focus (or an app is in the foreground); GA4 tracks this automatically and uses it to calculate engagement time. Relatedly, a session counts as "engaged" once a visitor has been on a page for at least 10 seconds, viewed two or more pages, or completed a conversion event.

Now, regarding enhanced measurement events:

These include page views, scrolls, outbound clicks, site searches, video engagement, and file downloads. Enhanced measurement events are particularly noteworthy as an upgrade in GA4 compared to UA, where tracking these events requires additional effort.

In the Admin section of your GA4 property, under Data Stream, you can view these enhanced measurement events. Except for page views, you have the option to toggle off any of these enhanced measurement events if you choose.

  • Category 3 – Recommended Events – these events come with predefined names and parameters, tailored for various business models. Implementing them requires custom coding, often done via Google Tag Manager. Their defining feature is that Google has suggested specific names for them: for instance, when a visitor adds an item to their cart, Google suggests naming the event add_to_cart; the initiation of the checkout process should be labeled begin_checkout, and the submission of a contact form generate_lead. You can find a comprehensive list of these Recommended Events, with all the suggested names, on Google's website (a minimal sketch of one such event follows this list).
  • Category 4 – Custom Events – similar to recommended events, custom events also need custom coding and can be set up using Google Tag Manager. However, unlike recommended events, there are no predefined names or parameters from Google. For example, tracking internal link clicks on your site would fall under custom events. While GA4’s Enhanced Measurement can track outbound (external) link clicks, tracking internal link clicks requires more effort through custom event implementation. The following sections offer a comparison of how to approach this.
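
As the minimal sketch promised above, one common pattern for a recommended event is to push it into the Google Tag Manager data layer, from where a GA4 event tag forwards it to Analytics; the item values below are placeholders:

```typescript
// Push Google's recommended add_to_cart event to the GTM data layer.
// The dataLayer array is created by the GTM container snippet.
declare const dataLayer: Record<string, unknown>[];

dataLayer.push({
  event: 'add_to_cart', // Google's recommended event name
  ecommerce: {
    currency: 'EUR',
    value: 19.9, // price * quantity
    items: [{ item_id: 'SKU_123', item_name: 'Sample product', quantity: 1 }],
  },
});
```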

Tracking link click events in GA4 vs. UA

In Universal Analytics (UA), data tracking primarily revolves around pageviews. UA tracks when a URL loads, capturing that as a pageview. However, user actions that don’t lead to a new page loading on the tracked domain, such as video plays or clicks within or outside the domain, aren’t automatically tracked. For tracking these “events” like link clicks, Universal Analytics requires assistance from Google Tag Manager.

Setting this up can be quite intricate, especially for marketers doing it for the first time. It involves creating variables, triggers, and tags in Google Tag Manager to track specific actions, translating them into data in Google Analytics. For example, a Universal Analytics event tag might be configured to track all link clicks on a particular site, as sketched after the list below. A key difference in UA, compared to GA4, is the use of pre-defined "event parameters" – category, action, and label – which provide additional context to the tracked event and aid in data interpretation.

  • Category = link_click: This parameter is consistently used every time the tag is activated.
  • Action = {{Click URL}}: This variable captures the specific URL clicked by the user.
  • Label = {{Page URL}}: Another variable, this one records the URL of the page the user was on when the link was clicked.
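
Put together, the hit that such a GTM tag produces is roughly equivalent to the following sketch, written here as a direct analytics.js call rather than a Tag Manager configuration (the category name link_click follows the convention in the list above):

```typescript
// Rough equivalent of the GTM link-click tag as a direct analytics.js call.
declare function ga(...args: unknown[]): void;

document.addEventListener('click', (e) => {
  const link = (e.target as HTMLElement).closest('a');
  if (link) {
    // category = 'link_click', action = clicked URL, label = current page URL
    ga('send', 'event', 'link_click', link.href, window.location.href);
  }
});
```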

In contrast, GA4 simplifies this process by automatically tracking certain types of link clicks, reducing the need for such a detailed setup in many cases.

Event tracking in GA4

Contrary to Universal Analytics, GA4 is designed with event tracking capabilities built-in, rather than relying primarily on pageview tracking. As previously mentioned, GA4 automatically tracks certain types of events (like automatically collected events and enhanced measurement events). However, for recommended events and custom events, manual creation using Google Tag Manager is necessary.

GA4 automatically captures some basic “event parameters” with every event. These include:

  • language
  • page_location
  • page_referrer
  • page_title
  • screen_resolution

For additional parameters in recommended and custom events, there's an extra step involved: these parameters need to be registered as custom dimensions in GA4. This aspect of GA4 can be confusing at first, and for those who find it non-intuitive, a practical guide on understanding event parameters in GA4 might be beneficial.

While some events in GA4 can be tracked directly (like automatically collected events and enhanced measurement events), others, such as recommended events and custom events, still require the use of Google Tag Manager. An illustrative example of this is tracking internal link clicks, which falls under the category of events requiring additional setup in GA4.
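
For comparison with the UA setup above, here is a hedged sketch of how such a custom event might be sent directly with gtag.js. The event name internal_link_click and the parameter link_url are made-up names for illustration, and link_url would additionally need to be registered as a custom dimension to show up in reports:

```typescript
// Hypothetical GA4 custom event for internal link clicks.
declare function gtag(...args: unknown[]): void;

document.addEventListener('click', (e) => {
  const link = (e.target as HTMLElement).closest('a');
  // Only internal links: same hostname as the current page.
  if (link && link.hostname === window.location.hostname) {
    gtag('event', 'internal_link_click', { link_url: link.href });
  }
});
```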

Comparing reporting in Google Analytics 4 (GA4) and Universal Analytics (UA)

Universal Analytics was designed as a robust platform with a variety of standard reports, whereas GA4 offers relatively fewer standard reports, placing a greater emphasis on custom reporting or data exportation. Examining the area of acquisition reporting, which is crucial in Google Analytics, highlights the differences between GA4 and UA.

Acquisition reporting is essential for analyzing the performance of different traffic sources on a website. This type of reporting is critical for evaluating the effectiveness of channels like organic search, email, and social media in driving purchases or other conversions. It not only aids in understanding overall business performance but also assists in making informed decisions about budget allocation.

While the fundamental concept of acquisition reporting in GA4 is similar to that in UA, there are notable distinctions. GA4’s approach to acquisition reporting might require more customization or data manipulation compared to the more out-of-the-box solutions provided by UA. This shift indicates a move towards a more flexible, albeit potentially more complex, reporting framework in GA4.

GA4 reports

In GA4's Acquisition reporting section, there are just three standard reports available by default. A significant omission here is the Source/Medium report, which was highly favored for its effectiveness in analyzing traffic performance across various channels (like comparing google / organic with bing / organic or google / cpc).

The whole interface has changed, and some standard reports need to be activated via the Library in the lower-left corner (you also have to publish the changes, otherwise you will see no change in the default reports).

You can adjust all reports according to your needs (you can create your own views).

To conduct more nuanced analyses in GA4, additional effort is required. This could involve exporting data for deeper examination or creating custom "Exploration" reports. Another option is to integrate with Looker Studio (previously known as Data Studio) to construct more tailored reports. While these custom reports can be extremely useful, they initially demand more time and effort to set up compared to the more straightforward reporting system in Universal Analytics.

Universal Analytics reports

In Universal Analytics (UA), a significantly larger array of standard reports is available compared to GA4. When you dive into all the reporting sections in UA, you'll find a total of 30 standard reports. This is a stark contrast to the mere 3 standard reports offered in GA4's Acquisition section.

And, of course, we see the faithful Source / Medium report among the 30.


User reports (UA) in GA4 – cheat sheet

| Universal Analytics | Google Analytics 4 |
| --- | --- |
| Demographics data – Overview | User – User Attributes – Overview |
| Age / Gender / Location | User – User Attributes – Demographic details |
| Technical data – Overview | Tech – Tech Overview |
| Operating system / Device type / Browsers | Tech – Tech Details |

Acquisition reports (UA) in GA4 – cheat sheet

| Universal Analytics | Google Analytics 4 |
| --- | --- |
| Acquisition Overview | Lifecycle – Acquisition – Acquisition Overview |
| All Traffic – Channels | Lifecycle – Acquisition – Traffic acquisition: Session default channel group; Advertising – Performance – All Channels |
| All Traffic – Source/Medium | Lifecycle – Acquisition – Traffic acquisition: Session default channel group |
| All Channel Reports – by users | Lifecycle – Acquisition – Users |
| Google Ads | Lifecycle – Acquisition – Acquisition Overview – Google Ads tab |
| Search Console | Search Console – Search Console – Queries |
| Google Organic Search | Search Console – Search Console – Google organic search traffic: Landing page + query string |
| Organic Traffic (all) | Lifecycle – Acquisition – Traffic acquisition: Session default channel group |

Behaviour + event reports (UA) in GA4 – cheat sheet

| Universal Analytics | Google Analytics 4 |
| --- | --- |
| All Pages | Lifecycle – Engagement – Pages and screens |
| Events | Lifecycle – Engagement – Events |
| Behaviour (new vs. returning users) | Lifecycle – Retention (the Customer Lifetime Value metric can also be found here) |

E-commerce reports (UA) in GA4 – cheat sheet

| Universal Analytics | Google Analytics 4 |
| --- | --- |
| Conversions | Lifecycle – Engagement – Conversions |
| Items Revenue | Business Objectives – Drive Online Sales – E-commerce Purchases |
| Purchase Funnel | Business Objectives – Drive Online Sales – Purchase Journey |

Metrics comparison: GA4 vs. Universal Analytics

In GA4, there are three new metrics that differ from those in Universal Analytics (UA):

  1. Engaged session – Google defines this metric as a session that lasts longer than 10 seconds, includes a conversion event, or has two or more screen or page views.
  2. Average engagement time per session – this metric measures the duration of user engagement per session, essentially the time spent actively interacting with the page (like scrolling) while it remains the primary window on the screen.
  3. Engagement rate – this is calculated as the ratio of Engaged Sessions to total sessions. For example, if you have 1,000 total sessions and 130 qualify as Engaged Sessions (as per Google’s definition), the Engagement Rate would be 13%.

These metrics are not available in UA.

They replace some of the metrics that were phased out in GA4, such as average session duration, pages per session, and bounce rate (the percentage of single-page view sessions).

Regarding bounce rate, although initially absent in GA4, it was later introduced but with a different calculation method than in UA, which can be confusing. Understanding this difference is crucial for the accurate interpretation of data in GA4.

The reason behind these metric differences between GA4 and Universal Analytics relates to their respective data models. UA’s model is centered around sessions and pageviews, making it straightforward to calculate metrics like pages per session or bounce rate. In contrast, GA4’s model prioritizes event collection and processing over traditional page views and sessions, necessitating a different approach to metric calculations.

Difference between sessions in GA4 and UA

Definition of session in Universal Analytics (UA)

Sessions end when:

  • 30 minutes of inactivity (or your session timeout settings)
  • The clock passing midnight (resulting in a new session)
  • New campaign parameters are encountered (i.e. if you use UTM parameters for internal links on your website > therefore, this is not recommended by Google to use UTMs for internal linking of your website). 
Definition of session in Google Analytics 4 (GA4)

There are fewer ways a session can end. A session ends only after:

  • 30 minutes of inactivity (or your session timeout setting).

In GA4, sessions can carry over across midnight and are not affected by encountering new campaign parameters.

If your site has a global audience, this can cause discrepancies in the session figures you see for UA and GA4 respectively.

Why is bounce rate no longer used in Google Analytics 4?

There are plenty of reasons why Google decided to phase out the old bounce rate when it first launched GA4, from a perceived lack of relevancy to the popularity of single-page applications (SPAs). Bounce rate has been around since the beginning of Google Analytics, and over more than a decade marketers and agencies became attached to it. But in the age of single-page applications, the old definition of bounce rate isn't useful anymore: since there's technically only one "page," every visit is counted as a bounce, which is neither true nor helpful.

Single-Page Applications (SPAs):

  • Unlike old websites where every click loads a new page, SPAs rewrite the current page on-the-fly, making things fast and smooth.
  • Big names like Gmail, Netflix, and Facebook use this design, making your browsing quick and seamless without waiting for new pages to load.

Google now focuses on more meaningful interactions, looking at what users do on the page, rather than just counting page loads. This way, businesses can understand their audience better and see if their website is actually engaging or needs improvement.

Google has therefore chosen to take a more "positive" approach: instead of bounce rate, GA4 reports the new engagement rate metric, and bounce rate is simply the inverse of engagement rate.

Bounce rate is also calculated differently in GA4 than it was in UA.

UA vs. GA4 – how bounce rate and engagement rate are calculated

  • Bounce rate in Universal Analytics – if someone visits a website and looks at only a single page before leaving, that is counted as a "bounce" in UA.
  • Bounce rate in GA4 – the percentage of sessions that were not engaged, i.e. sessions that:
    • were less than 10 seconds long,
    • had zero conversion events, and
    • had fewer than 2 page or screen views.
  • Engagement rate in GA4 – the inverse of bounce rate (see the sketch after this list).
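
As a small sketch of the arithmetic, using the example figures from earlier (1,000 total sessions, 130 of them engaged):

```typescript
// Engagement rate and bounce rate are inverse metrics in GA4.
function engagementRate(engagedSessions: number, totalSessions: number): number {
  return engagedSessions / totalSessions;
}

const rate = engagementRate(130, 1000);
console.log(rate);     // 0.13 -> 13% engagement rate
console.log(1 - rate); // 0.87 -> 87% bounce rate
```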

UA vs. GA4 – longer data processing delay

  • As you migrate to GA4, one of the first things you may notice is the extended data processing time, which leads to delays.
  • Universal Analytics typically provides data within four hours.

  • GA4 introduces a wider range of data delays, spanning from 12 to 48 hours.

UA vs. GA4 – Google Tag Manager is even more important

In UA, many goals – such as destination-URL goals – could be configured directly in the Analytics interface. With GA4, this is no longer possible: all goals are event-based, so you need Google Tag Manager to set them up.

A lot of standard settings that UA had by default now need manual setup in GTM/GA4.

Segments in Google Analytics 4 (GA4) vs. Universal Analytics (UA)

Segments in both GA4 and Universal Analytics serve the same fundamental purpose: they enable the analysis of specific subsets of your Google Analytics data, providing deeper insights into user behavior and website or app performance. While segments operate similarly in both versions, allowing analysis of up to four segments simultaneously, there are differences in the types of segments you can create and the process of creating them.

In GA4, there are three types of segments available for creation:

  1. User segments – focus on the characteristics and behavior of individual users.
  2. Session segments – focus on specific sessions.
  3. Event segments – targeting particular events.

In contrast, Universal Analytics offers only two types of segments:

  1. User segments.
  2. Session segments.

The most notable difference lies in the process of creating these segments. In GA4, segment creation is integrated into the “Explorations” section, the same area used for creating custom reports. This integration signifies a more unified approach to data analysis and reporting in GA4. For a detailed comparison and guidance on creating custom segments, reviewing a walkthrough of segment creation in Google Analytics 4 versus Universal Analytics could be highly beneficial.

Website and app tracking in the same property

A highly anticipated feature of GA4 is its capability to consolidate the tracking of both website and app data within a single property. GA4 adopts the measurement approach used by Google Analytics for Firebase, which is tailored for mobile apps, where every user interaction is recorded as an event. This event-based model allows for a seamless integration of data across websites and mobile apps. The unified data framework simplifies the process of aggregating and analyzing user behavior across different platforms, providing a holistic view of the user journey. This means analysts and marketers can track a user's interaction from the website to the app, or vice versa, without losing context, making it considerably easier to assess combined data for comprehensive insights.

Additional Advantages of Google Analytics 4 (GA4)

Google highlights several key benefits of GA4, emphasizing its advanced capabilities in understanding user interactions and adapting to the evolving digital landscape:

  1. Enhanced user interaction tracking – GA4 is designed to measure, unify, and de-duplicate user interaction data, offering a clearer picture of the user journey across platforms and devices.
  2. Adaptability to privacy changes – the platform is built to adjust to changing privacy regulations and user expectations, ensuring data collection remains compliant and effective.
  3. Intelligent business insights – GA4 utilizes machine learning to uncover valuable business insights, helping users to understand and predict customer behaviors more accurately.
  4. Actionable data utilization – the system is geared towards helping businesses more effectively act on their data to achieve specific goals and objectives.
  5. Free integration with Google BigQuery – GA4 offers free integration with Google BigQuery, a feature previously available only in the paid GA360 plans with Universal Analytics. This integration is a significant enhancement, allowing for more complex data analysis and processing.

From a practical learning perspective, the most important advantage is that Google has positioned GA4 as the new standard for digital analytics. This means that, inevitably, GA4 will replace Universal Analytics for all digital marketers and measurement teams, making proficiency in GA4 not just beneficial but essential for future success in the field.

What is Google Analytics 4 and how does it differ from Universal Analytics? GA4 vs. Universal Analytics: a side-by-side comparison (Part 1)

Universal Analytics already saw its sunset in July 2023, paving the way for its successor, Google Analytics 4 (GA4). Yet even though GA4 became the standard for new Google Analytics properties on October 14, 2020, there was no widespread eagerness among marketers and web developers to transition from UA to GA4.

GA4 introduces a significantly different operational framework compared to UA, bringing new and improved features but also facing criticism for certain bugs and the absence of some popular UA features. This article will explore all major distinctions between the two platforms (Universal Analytics and Google Analytics 4), and we will also go into more detail to show you what new opportunities GA4 offers.

But first, a little bit of the history of this amazing web tracking tool.

Brief history of Urchin/Google Analytics

Our story begins in the late 1990s, a time when the internet was rapidly expanding and businesses were just beginning to realize the potential of an online presence.

Urchin began its journey in the realm of web analytics as a product of Urchin Software Corporation, whose roots go back to 1995. The company, focused on the web statistics and web analytics field, developed Urchin as a software solution to help businesses understand and interpret web traffic data. By 1998, Urchin Software Corporation had emerged as a pioneer in the field, and its groundbreaking product offered website owners invaluable insights into visitor behavior.

As the internet evolved in the early 2000s, Urchin Software Corporation adapted by introducing a new product, Urchin On Demand. This service marked a significant shift from traditional, software-based analytics to a more accessible, service-based model: it allowed users to monitor and analyze their web traffic through a hosted solution, eliminating the need to install complex software on their own servers. This move was pivotal in making web analytics more user-friendly and widely accessible, and Urchin On Demand became one of the early tools available for website traffic analysis.

The potential of Urchin Analytics did not go unnoticed by the tech giant Google. In April 2005, in a move that would significantly shape the future of web analytics, Google acquired Urchin Software Corporation. This acquisition was a strategic step for Google, as it sought to expand its footprint in the world of online analytics and advertising.

The legacy of Urchin Analytics is thus deeply intertwined with the evolution of web analytics as a whole, marking a significant chapter in the history of how businesses understand and interact with their digital audiences.

First (former Urchin) team day at Google, April 21, 2005, after the acquisition.

Urchin, one of the first website tracking tools: its evolution and the Urchin Software Corporation story

TL;DR: Urchin Software Corporation, originating in San Diego, CA, was co-founded by Paul Muret, Jack Ancone, Brett Crosby, and Scott Crosby. In April 2005, Google acquired the company, transforming Urchin into "Urchin from Google" and eventually evolving it into Google Analytics. With another anniversary of the acquisition having recently passed, it seemed an opportune moment to document the company's history for future reference. This account may not captivate those unconnected to its journey; it's more a personal closure of that chapter.

Perhaps, this story also subtly indicates that success doesn’t always require massive initial funding or rapid growth. Sometimes, a more modest approach with gradual progress can lead to significant achievements.

Founding of Urchin Software Corporation

In the late months of 1995, the seeds of what would become Urchin Software Corp. were sown by two post-college roommates, Paul Muret and Scott Crosby, in the Bay Park neighborhood of San Diego. Paul, who had been working in the Space Physics department at UCSD, stumbled upon the world of HTML 1.0 while uploading the department’s syllabus online. This exposure sparked an idea in him, a vision of a business opportunity to create websites for other businesses.

One evening, Paul returned home, brimming with excitement about this newfound opportunity. To illustrate his point, he showed Scott a simple website he had created for UCSD, featuring bright blue text on a grey background, with some of the text possibly even blinking in an early web aesthetic. Convinced by Paul’s enthusiasm and the potential in this nascent internet era, the two embarked on drafting a business plan.

Their plan, a blueprint for a venture into the digital frontier, was presented to Scott's uncle, Chuck Scott. Chuck, a figure of financial means, saw promise in the young entrepreneurs' vision. He agreed to invest $10,000 in their new company and even provided them with a small desk space in a corner of his office at C.B.S. Scientific. Little did he know, it would be a considerable time before this investment bore fruit, marking the humble beginnings of a journey that would significantly impact the digital analytics world.

In the wake of receiving financial backing from Chuck Scott, the fledgling company embarked on its journey by purchasing a Sun SPARC 20 for server duties and securing an ISDN line, a significant expense at the time. The office computers were interconnected using 10base2 networking, a system that relied on coaxial cables with twist-lock fittings, reminiscent of TV cables but now seen as antiquated.

The first Urchin webserver, running at 50 MHz, cost about $3,200 in 1995 money – roughly a third of the total capital Urchin Software Corp. had raised at the time.

Paul and Scott, the duo behind this venture, began the arduous task of customer acquisition. Their clientele grew gradually, mostly comprising small businesses that paid a modest monthly fee. Among their early clients were Cinemagic, a vintage movie poster company run by Herb and Roberta, and ReVest, a financial startup. The owner of ReVest was notably averse to using email, leading to website edits being communicated through lengthy thermal-transfer faxes that unspooled across the office floor each morning. Another notable client was a lesser-known division of Pioneer Electronics, specializing in the production of LaserDiscs, a format already considered archaic at the time.

Buoyed by these early successes, the company leased office space in a modest brownish-green building located in the faux-historic, theme park-like area of Old Town, San Diego, not far from Rockin’ Baja Lobster. The office could accommodate up to four desks, five if the vestibule was counted, possibly intended for a secretary. In 1997, the company welcomed a new member, Brett Crosby, Scott’s younger brother, marking a turning point as the business began to gain momentum. They managed to secure contracts with two of the larger local employers: Sharp Healthcare, a hospital system, and Solar Turbines, a power generation subsidiary of Caterpillar.

Despite these significant contracts, the company still catered to numerous small clients, hosting their websites on a single web server and charging a recurring fee. To accurately bill for bandwidth usage—a costly resource at the time—Paul developed a simple log analyzer. This tool not only tallied bytes transferred but also provided a user-friendly web interface, tracking referrers, “hits”, and pageviews. This innovation laid the groundwork for the first version of Urchin. After further enhancements, including date-range features and user authentication, Urchin was showcased to customers, receiving generally favorable feedback.
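
As a purely illustrative sketch (not Urchin's actual code), an analyzer of this kind boils down to parsing each Apache "combined" log line and tallying bytes transferred and hits per referrer; the log line below is invented:

```typescript
// Tally bytes transferred and hits per referrer from an Apache log line.
const line =
  '203.0.113.7 - - [10/Oct/1997:13:55:36 -0700] "GET /index.html HTTP/1.0" ' +
  '200 2326 "http://example.com/start.html" "Mozilla/4.0"';

// host, ident, user, [timestamp], "request", status, bytes, "referrer"
const pattern = /^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) (\d+|-) "([^"]*)"/;

const hitsByReferrer = new Map<string, number>();
let totalBytes = 0;

const m = pattern.exec(line);
if (m) {
  totalBytes += m[3] === '-' ? 0 : Number(m[3]); // response size in bytes
  hitsByReferrer.set(m[4], (hitsByReferrer.get(m[4]) ?? 0) + 1);
}
console.log(totalBytes, hitsByReferrer);
```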

This period was a critical one for the company: the Solar Turbines deal alone, bringing in $10,000 per month, played a vital role in keeping the business afloat for over a year. This early phase of struggle and gradual success was the foundation upon which the future of web analytics was built.

In 1997, the team behind what would become Urchin Software Corp embarked on their first-ever trade show adventure. In a creative twist, they borrowed giant blue light boxes from an underwear startup. These boxes, made of 1-inch thick particle board, were notably heavy and cumbersome, adding a unique challenge to their trade show debut.

To add a bit of flair to their booth, they enlisted the help of friends who, intrigued by the novelty of an internet trade show, volunteered to assist for the day. These friends, playfully referred to as “booth babes,” brought lively energy to the booth. However, the long hours and bustling environment of the trade show proved to be more demanding than anticipated. As a result, their initial enthusiasm vanished (probably), and they decided not to volunteer for such events again :-). This first trade show experience was a mix of improvisation, camaraderie, and learning, marking a memorable step in the company’s early journey.

In the late 1990s, a pivotal moment arose for the company that would later become Urchin Software Corp, thanks to a connection through Brett Crosby’s girlfriend, Julie, who worked in the advertising and web development industry. Julie was employed by Rubin Postaer Interactive (RPI), a company that still exists today, a subsidiary of the prominent Los Angeles-based RPA (Rubin Postaer and Associates), which managed the Honda.com account. It was discovered that Honda.com, then using WebTrends for web analytics, struggled with processing their daily Apache access logs within a single day, leading to a backlog.

Seizing this opportunity, the team managed to acquire a few days’ worth of server logs from Honda.com to process as a demonstration. Impressively, they completed the task in approximately 30 minutes, a feat that led to them becoming the web analytics solution for American Honda. This success marked a turning point, indicating the potential to build a business around Urchin’s log processing technology.

Around the same time, Jack Ancone joined the team as the CFO and relocated to San Diego. The company then moved into an office at 2165 India St.

The Urchin Software Company's office at 2165 India St. This is how it looked.

In the early days of 1998, a significant milestone was reached for the team behind Urchin Software Corp: they celebrated their first sale of the "Pro" version of Urchin, priced at $199. This moment prompted a strategic shift in their business model. They decided to focus solely on software, divesting themselves of their hosting and web development services, which were handed over, without any financial gain, to a local web development shop. This bold move transformed Urchin into a pure software company, a transition met with enthusiastic high-fives all around.

To support this new direction, the Urchin Software Company needed additional funding. They tapped into their family networks and collaborated with a boutique venture capital firm, Green Thumb Capital from New York City, brought in by Jack. This effort successfully raised $1 million, increasing their total external capital to approximately $1.25 million. Despite future attempts, this would be the last of their fundraising, except for a manageable debt of around $400,000, which was later repaid with interest and warrants. Green Thumb Capital, to their credit, never pressured them for returns, likely as surprised as anyone when Google later acquired Urchin.

During the late 1990s, as they navigated the challenges of selling enterprise software, the team opted for an unconventional, advertising-based strategy to capture market share. In an era where internet companies were often valued by the number of “eyeballs” they attracted, they released Urchin ASAP, a free version of their software supported by banner ads, alongside Urchin ISP. Both versions were tailored for hosting operations. The Urchin Software Company team thought they could make a significant fraction of a cent per click on these ads, on top of some infinitesimal CPM. Although the banner ads didn’t generate significant revenue, they did succeed in gaining valuable exposure.

One of the Urchin ASAP banner ads, which advertised itself when no one else wanted the space

This approach, combined with the quality of their software, set the stage for their first major breakthrough in the industry.

In the old days of the internet, a time when Tumblr and Blogger were still on the horizon and Geocities was a household name, there existed a platform known as Nettaxi. This relatively obscure service claimed to host a staggering 100,000 "sites," a figure that intrigued the creators of Urchin. Seeing Nettaxi as a potential goldmine for user engagement, they struck a unique deal: Urchin's sophisticated web analytics tools would be offered to Nettaxi completely free of charge, in exchange for the ad revenue generated from the traffic of all these sites. The financial outcome of this arrangement? A few cents, if that. The real value was in the claim it enabled: Urchin could now say it was servicing 100,000 sites, a statistic that significantly boosted their market presence.

In a move reminiscent of Google’s playful logo variations, the Urchin team introduced their own creative twist: the “Urchin of the Day.” This quirky feature, which involved changing the Urchin logo in the interface’s upper-left corner, was more than just a fun gimmick. It was an attempt to forge a closer bond with their user base. Whether it achieved this goal is debatable, but it certainly kept its designer, Jason Collins, busy and entertained for a considerable period. His creations, ranging from whimsical to downright hilarious, are still remembered fondly. Among these was a special design by Shepard Fairey, later famous for his iconic “Obama Hope” poster, who contributed a “Power to the People” version of the Urchin logo.

Urchin of the Day – created by Jason Collins (designer/graphic) – his creations weren’t just static images; they were dynamic, animated GIFs that brought a sense of life and movement to the company’s interface. With each new design, Jason’s talent shone, blending his love for cars with his flair for digital artistry, making the “Urchin of the Day” a much-anticipated reveal among the team and clients alike.

In the late 1990s, a small, ambitious team led by Brett Crosby, the VP of Sales and Marketing at Urchin Software Corp., was on a mission to elevate their latest creation, Urchin 2.0, into the spotlight. Their target? Earthlink – a giant in the internet service provider industry, nearly rivaling AOL in size and influence. The challenge, however, was making contact with someone influential within such a colossal organization. Undeterred, Brett resorted to the simplest yet most persistent method: repeatedly submitting inquiries through Earthlink’s web form, a testament to determination over sophistication.

“We had no idea how to reach anyone important at a place like that, so Brett did the natural thing and filled out a web form. Again, and again, and again. He must have submitted that thing 20 or 30 times. Finally, he got a response. Rob Maupin, VP of hosting (or something similar) agreed to a meeting. We were stunned.”

After what seemed like an endless stream of attempts, their persistence paid off. Rob Maupin, a high-ranking executive at Earthlink, responded, agreeing to a meeting.

So they were about to pitch to one of the internet's behemoths. They drove to Pasadena in their best vehicle – Brett's old but reliable Mercedes, bought for the purpose for $4,000 – while Scott Crosby, another key figure in the company, stayed at the office, partly out of managerial duty, partly out of sheer intimidation.

The meeting with Rob Maupin was a reality check. He bluntly criticized the Urchin 2.0 interface for its overwhelming use of blue, a fair point that the team had to concede.

Too blue blue blue blue blue: Urchin 2, the version used in the presentation for Earthlink.

Despite this, Maupin saw potential in Urchin's speed and efficiency, crucial factors for web hosting services then (and still today). After accommodating Earthlink's requests for modifications, Urchin struck a deal that would become a cornerstone of their success: $4,000 per month for unlimited use of Urchin software across all Earthlink-hosted websites.

By 2001, the company had evolved into Urchin Software Corporation, and it was time to seek additional funding. The process of pitching to venture capitalists was grueling and distracting, but eventually, they managed to secure commitments from two reputable firms. The funding, however, was scheduled to be finalized on September 12th, 2001 – a day after the world-changing events of 9/11. The aftermath of the tragedy put the investment on hold indefinitely.

Having already expanded in anticipation of the $7 million investment, Urchin found itself in a precarious financial position. They had to make drastic cuts, including laying off 12 employees and giving up office space, a day they somberly referred to as Black Friday. Facing a dire cash flow crisis, they had no choice but to seek loans from benefactors Chuck Scott and Jerry Navarra, who provided the much-needed funds in exchange for interest and warrants. This period marked a challenging phase for Urchin, with drastic cost reductions and employees voluntarily taking significant pay cuts to keep the company afloat. Despite the hardships, the team’s resilience and dedication kept Urchin alive, even when hope seemed fleeting.

In the early 2000s, the tech industry was still reeling from the aftermath of the dot-com bubble burst. Revenue streams were inconsistent, and growth was slower than anticipated.

The company’s primary revenue source had been substantial annual licensing deals, often involving lengthy and complex negotiations. One of their most significant contracts, exceeding $1 million, was secured by Jack Ancone with Cable & Wireless, a major player in the global telecommunications and hosting industry. However, despite these promising deals with companies like Winstar, KeyBridge, and Worldport, payments often fell through as these seemingly resource-rich companies faced their own financial limitations.

To invigorate sales and streamline processes, Urchin made a strategic decision to simplify their enterprise deals with hosting companies. This new approach, although less lucrative in the short term, was based on the modest deal they had previously struck with Earthlink. The new Site License Model (SLM) was straightforward: hosting companies would pay $5,000 per month for each physical data center, receiving unlimited access to Urchin software under a simple, one-page contract with no legalese and nothing really to negotiate. This model quickly gained popularity, attracting major hosting companies in the US and Europe, including Rackspace, Everyone’s Internet (aka EV1 Servers), The Planet, Media Temple, and many others.

By the fall of 2003, these deals had propelled Urchin into a cashflow-positive position. They were also successfully selling individual licenses to self-hosted organizations, including Fortune 500 companies and numerous university systems.

The sales team, having been significantly reduced during the company’s financial struggles in 2001, was small but mighty. Paul Botto, Nikki Morrissey, and Megan Cash, who had worked without pay during the toughest times, played a crucial role in Urchin’s recovery. Their efforts, combined with a new commission model that offered low base pay but high commission rates, led to a significant boost in sales.

Paul and Megan eventually joined Google, while Nikki chose a different path.

Urchin 4 had an easter egg that no one ever found. If you clicked a random “rivet” in the sexy brushed aluminum interface, you’d be treated to a photo of the illustrious Urchin dev team: Doug Silver, Nathan Moon, Paul, Jonathon Vance, Rolf Schreiber, and Jim Napier. Most of these guys were still at Google as of August 2016.

Urchin’s international expansion had its ups and downs, including a failed attempt to establish an office in Tokyo. However, the launch of a channel program, particularly in markets where English wasn’t the primary language, proved to be a wise move. Japan, for instance, became a strong market for Urchin, thanks to the efforts of Jason Senn, who managed the channel program and also took on the role of chief office builder.

Product-wise, Urchin was evolving. If Urchin 2 opened doors and Urchin 3 maintained standards, Urchin 4 was a game changer. It featured a modern, Apple-esque design and introduced the Urchin Traffic Monitor (UTM). The UTM was a pioneering method that combined Apache or IIS log files with cookies, enabling the identification of unique visitors. This hybrid approach of using both log files and cookies set Urchin apart from competitors who relied solely on one method or the other. Urchin’s innovative approach laid the groundwork for more advanced web analytics practices, foreshadowing the capabilities of future tools like Google Analytics.

Paul, looking here like a cartel drug lord, withdrew something like $53,000 in cash for Urchin Software Corporation employees’ Christmas bonuses in 2004. Funny thing is, Google also gave out actual cash bonuses for years after the former Urchin employees joined, millions of dollars in currency. Great minds think alike, I guess. This photo was taken on December 17, 2004.

Once upon a time in the tech world, Urchin Software Corporation released Urchin 4, a product that continued the company’s quirky tradition of supporting an incredibly diverse array of platforms. If you ever stumble upon Google’s Urchin 4 help page, you’ll be amused to see the list of supported operating systems, including the obscure Yellow Dog Linux. The team at Urchin had a vision: they believed that by supporting a wide range of platforms, they might break into major corporations or universities that used less common systems like AIX or HP-UX. However, reality proved different, with most customers opting for the Linux or Windows IIS versions.

The team’s enthusiasm for diverse platforms led them to acquire various servers from eBay, enjoying the challenge of getting Apache and a compiler running on each unique system. They even dabbled with a NeXT version, though they steered clear of DEC after struggling to boot up the machine.

Urchin 4 marked a turning point for the company. It was the first version that they felt could truly compete with any other product in the market, not just in back-end performance. But it was Urchin 5 that took things to a whole new level. It was a powerhouse of a product, albeit a bit overwhelming with its layers of menus and submenus. It was a dream for analytics enthusiasts, packed with features like e-commerce tracking, the Campaign Tracking Module, and multiserver versions. Urchin 6 introduced a groundbreaking feature: individual visitor history drill-down, a capability so sensitive that Google later decided to remove it entirely.

Until Urchin 5, the company had operated on a traditional software licensing model. But by 2004, it was clear that a hosted version was necessary. So, they invested in servers, upgraded their T1 line, and launched Urchin 6, available both on-premises and as a hosted service. This new business model was an instant success, with companies willing to pay for the convenience of not having to manage the software themselves.

By the summer of 2004, Urchin boasted the largest installed base among web analytics vendors, measured by the number of websites using their product. Tradeshows, once a daunting task, became enjoyable events for the team. It was at the Search Engine Strategies 2004 in San Jose that Urchin caught the eye of Google. Wesley Chan, a Product Manager, and David Friedberg from Corporate Development, were on the lookout for a web analytics company. Despite the unconventional approach of Urchin, they saw potential in what they found.

Paul Botto, Scott Crosby, and Brett Crosby at Search Engine Strategies 2004 in San Jose, where they first met with the Google people.

Brett Crosby prepares to “fax” the signed acquisition agreement back to Google. By this time, Urchin Software Corporation was sufficiently profitable that it was a tough decision to sell. Brett signed the actual, final paperwork in a tuxedo about 30 seconds before walking down the aisle at his wedding.

In a tale of ambition and success, a small but innovative company named Urchin Software Corporation found itself at a pivotal moment. Just a few weeks after catching the attention of tech giant Google at a tradeshow, an offer was made to acquire Urchin. This period was marked by interest from various players in the tech world, including WebSideStory, a public company at the time, which even offered a higher bid. However, the Urchin team believed Google was the right choice (and time showed that it was the right decision).

The process of selling Urchin to Google, however, was far from smooth. It was expected to conclude shortly after Google’s IPO in late 2004, but the legal intricacies, particularly around intellectual property and patent risks, made it a nerve-wracking experience. The founders were personally liable for any potential patent infringements, a daunting prospect given their new association with a major player like Google. The deal was finally sealed in April 2005, by which time Google’s stock had doubled, impacting the financials of the deal.

Joining Google in 2005 was a unique experience for the Urchin team. The company, still in its relative youth with around 3,000 employees, had a vibrant culture. Everyone could gather for a single, grand holiday party, and celebrities like MC Hammer were a common sight.

Jack, MC Hammer, and Chris Sacca (2005).

On their first day, the Urchin team met with Eric Schmidt, Google’s CEO at the time. Schmidt immediately recognized the potential of Urchin’s web analytics in relation to Google’s AdWords. He remained a supportive and accessible figure throughout their integration into Google. Brett Crosby, who later became Senior Director of Marketing at Google, even had an office next to Schmidt.

The Urchin team was initially placed in the “fishbowl” of Building 42 at Google’s Mountain View campus, in close proximity to Google’s founders, Larry Page and Sergey Brin. Sergey, known for his eccentricities, had a laser engraver in his office, complete with an air duct for venting gases. They also shared the space with Mike Stoppelman, a new Google engineer whose brother would soon found Yelp.

As Google prepared to launch Google Analytics, the rebranded Urchin product, in 2005, there was apprehension about its reception. Wesley Chan, the Google Product Manager leading the integration, initiated a daily “war room” to ensure the product’s success. The team was given specific objectives and tight timelines to meet. The effort involved educating the rest of Google about the product, with team members touring Google offices nationwide.

When “Urchin from Google” was announced as a free service for any website in the world, the response was overwhelming. The demand was so high that it strained Google’s infrastructure, leading to a temporary shutdown of new signups. This was a problem of success, but it frustrated many. Eventually, signups were reopened using an invitation model, and Google Analytics began its journey to becoming the ubiquitous tool it is known today.

In the grand narrative of tech acquisitions, Google’s purchase of various companies stands out. Among these acquisitions, some, like YouTube and Keyhole (which became Google Earth), soared to great heights. Others, however, like Dodgeball, faded into obscurity, victims of the complex dynamics within a large corporation. This phenomenon, partly due to Google’s acquisition strategy and partly due to the inertia and fog that often accompany big companies, led to many promising ventures dissolving into the corporate ether.

For acquisitions under a certain threshold, rumored to be around $50 million, the decision-making process was startlingly straightforward: a single VP’s approval could seal the deal. However, once these companies were integrated, they often found themselves adrift in the vast sea of Google’s operations. Without a high-level champion or a clear path to significant revenue generation – often benchmarked at around $100 million annually – these acquisitions struggled to maintain their identity and purpose.

Urchin, the company behind what would become Google Analytics, was one of the fortunate few. It found powerful allies in Wesley Chan, a product manager who recognized the need for robust analytics to bolster AdWords, and Eric Schmidt, then CEO of Google. Schmidt quickly grasped how web traffic analysis could enhance AdWords’ effectiveness. A few years later, an internal study at Google, conducted by a team of quantitative analysts, demonstrated a substantial increase in ad spending across a wide range of customers, validating the strategic importance of the acquisition.

As time passed, the original team from Urchin began to disperse within Google. Some left the company, but a significant number remained, continuing to contribute to Google Analytics. Notably, Paul, a key member of the Urchin team, rose to become a senior VP of engineering, overseeing not just Google Analytics but also the display ads segment.

You can read the whole Urchin Software Corporation story directly from co-founder Scott Crosby (brother of fellow co-founder Brett Crosby): https://urchin.biz/urchin-software-corp-89a1f5292999

Seven Urchin versions

There were seven different product versions of Urchin (the predecessor of the later, well-known Google Analytics/Universal Analytics).

Urchin 1: The early days of a web analytics pioneer

In the mid-1990s, the digital landscape was burgeoning, and amidst this backdrop, Paul Muret and Scott Crosby, fresh out of college, embarked on an entrepreneurial journey. Sharing an apartment in San Diego, they founded a company in 1995 with a vision to create business websites. Their venture was kickstarted with a modest $10,000 seed fund from Scott’s uncle, who also provided them with a workspace in his company, C.B.S. Scientific. This initial investment was channeled into acquiring a Sun SPARC 20 server and renting an expensive ISDN line, a significant step for young entrepreneurs.

Their business began to gain traction, securing clients and generating revenue through monthly fees. This success enabled them to move into their own office space, and in 1997, Scott’s brother, Brett Crosby, joined the team. The company was growing, attracting larger clients, yet all their websites were hosted on a single server, sharing that one ISDN line.

Paul Muret, demonstrating his programming prowess, developed a rudimentary log file analysis system. This system, initially basic, was capable of calculating website traffic and presenting it through a web interface. Gradually, he enhanced the system, adding metrics like pageviews, referrer data, and hits. This evolution marked the birth of Urchin, a simple yet effective tool for log file analysis.

Urchin’s potential was soon recognized when Brett’s girlfriend introduced them to Honda.com. Winning Honda.com as a client was a pivotal moment, as Urchin became their standard web analytics software. This success shaped the company’s future direction. Around this time, Jack Ancone joined the team as the CFO, and the company, then known as “Quantified Systems Inc.,” shifted its focus to encompass web development, hosting, and software development.

The development of Urchin continued, and in January 1998, the first professional version was released, priced at $199. This version marked a significant milestone, and soon after, a strategic decision was made to concentrate solely on software development, moving away from other business areas. This shift necessitated additional funding, and the team successfully raised $1 million to fuel their journey as a dedicated software company.

Web tracking software Urchin 2 and the evolution to Urchin 3

Early version of Urchin 2

This is how the Urchin 3 website and admin looked in the past

As Urchin continued to evolve, two distinct versions emerged: the commercial “Urchin ISP” and the free “Urchin ASAP.” The latter was an innovative approach, aiming to generate revenue through advertising banners. This model incorporated both CPM (cost-per-mille) and CPC (cost-per-click) for a banner displayed at the top of the Urchin web interface. Adding a touch of creativity, there was an “Urchin of the day” feature, where the Urchin logo was regularly updated with current, sometimes animated, graphics.

In 1999, Brett Crosby took on the challenge of promoting Urchin 2.0. After considerable effort, he secured a meeting with Rob Maupin from Earthlink. Despite initial reservations about the web interface’s overly blue color scheme, Maupin decided to give Urchin a chance.

This opportunity marked a significant turning point for Urchin. With a few software modifications, Urchin was soon established as the standard web analytics software for all websites hosted by Earthlink. This partnership was not only a testament to Urchin’s growing capabilities but also a lucrative deal, with Earthlink paying $4,000 a month for the service.

By 2001, Urchin had progressed to version 3, reflecting continuous improvements and growing recognition in the field of web analytics. This version marked another step in Urchin’s journey, setting the stage for further advancements and wider adoption.

The company underwent a significant transformation, rebranding itself as the Urchin Software Corporation. During a period of flourishing business, Urchin set its sights on expansion, successfully securing a promising $7 million in funding. However, the tragic events of September 11, 2001, disrupted these investment plans, leading to unforeseen financial challenges. The company had already committed funds based on these pledges, resulting in a liquidity crisis. This difficult phase forced Urchin to make tough decisions, including layoffs and office closures. In a bid to stay afloat, they turned to affluent individuals, notably Chuck Scott and Jerry Navarra, for financial support. From 2001 to 2002, Urchin faced a strenuous period, with exhaustive negotiations and some employees even forgoing their salaries to keep the company running.

During this time, Urchin 3 was offered in various configurations to cater to different business needs:

  • Urchin Dedicated: Designed for a single server hosting up to 25 websites, priced at $495, with each additional batch of 25 websites costing $295.
  • Urchin Enterprise: Aimed at larger operations with 2 servers and up to 25 websites, available for $4,995, and an additional $1,995 for each extra server.
  • Urchin Data Center (on request): This version was provided based on specific client requests.

Ultimately, Urchin decided to streamline its business model, focusing on simpler and more direct business deals, even if it meant earning less revenue. This strategic shift was aimed at stabilizing the company during a challenging period in its history.

Web tracking software – version Urchin 4

A demo of Urchin 4 is still available via the Wayback Machine: https://web.archive.org/web/20030207025325/http://www.urchin.com/products/tour/

In 2002, Urchin Software Corporation unveiled Urchin 4, a significant upgrade that sported a sleek design reminiscent of Apple’s aluminum aesthetic.

A pivotal innovation in Urchin 4 was the introduction of the “Urchin Traffic Monitor” (UTM). This feature marked a significant advancement by incorporating JavaScript tracking alongside traditional web server log file analysis. The use of browser cookies for tracking and visitor recognition laid the groundwork for what would eventually evolve into Google Analytics. Urchin 4 maintained its versatility, supporting a wide range of operating systems including AIX, FreeBSD, IRIX, Mac OS X, Red Hat Linux, Solaris, and Windows.
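
To make the hybrid approach concrete, here is a minimal Python sketch, not Urchin’s actual code, of how a log processor might prefer a visitor-ID cookie over the IP address when counting unique visitors. The log layout (a quoted Cookie field appended to Apache’s combined format) and the `__utma` cookie name are illustrative assumptions:

```python
import re
from collections import defaultdict

# An Apache "combined" log line with an extra quoted Cookie field appended
# (a hypothetical LogFormat; real Urchin deployments varied per server).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)" "(?P<cookies>[^"]*)"'
)

def visitor_key(m):
    """Prefer the visitor-ID cookie; fall back to IP plus user agent."""
    cookie = re.search(r'__utma=(\d+\.\d+)', m['cookies'])
    if cookie:
        return 'cookie:' + cookie.group(1)
    return 'ipua:' + m['ip'] + '|' + m['agent']

def count_unique_visitors(log_lines):
    """Count distinct visitors across all parseable log lines."""
    pageviews = defaultdict(int)
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m:
            pageviews[visitor_key(m)] += 1
    return len(pageviews)
```

Counting by cookie rather than by IP address is what lets a hybrid tool tell apart two visitors behind the same proxy, something pure log-file analyzers could not do.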

Web tracking software – version Urchin 5

Urchin 5 – had e-commerce/”ROI” tracking, the Campaign Tracking Module, and multiserver versions that could all conspire to get the price pretty high.

Following this, Urchin 5 was released, bringing with it notable enhancements such as E-Commerce tracking, campaign tracking, and support for multi-server environments. Scott Crosby, reflecting on this version in 2016, noted, “Urchin 4 was the first release I felt could compete with anyone in terms of back-end performance. But Urchin 5 was superior in every way, and I’m sure thousands of instances still run to this day. If anything, Urchin 5 was just too much of a good thing.”

The pricing structure for Urchin 5 was as follows:

  • Base Module: Priced at $895, it included 100 Profiles (up to 100 sites) and one Log Source for each profile (additional load balancing modules required for more servers).
  • Additional 100 Profiles: Available for $695.
  • Additional Load Balancing Module: Priced at $695, accommodating all profiles for load balancing.
  • Ecommerce Reporting Module: Available for $695.
  • Campaign Tracking Module: Priced at $3,995.
  • Profit Suite: A comprehensive package including Urchin 5, the E-commerce Module, and the Campaign Tracking Module, priced at $4,995.

Urchin 6

Image source and a help article for the Urchin User Explorer: https://support.google.com/urchin/answer/2633730?hl=en&ref_topic=2633609

Urchin 6 marked the final iteration under the Urchin brand, available through Google or authorized Urchin dealers. This version introduced a notable feature, “Individual Visitor History,” now known as “User Explorer.”

For a single license, Urchin 6 was priced at $2995, while hosting companies were charged $5000 monthly per physical data center. Notably, Urchin 6 was the first to offer a cloud solution, priced at $500 per month. This innovation contributed to Urchin becoming the world’s most popular web analysis tool by the summer of 2004, in terms of installation numbers.

During the Search Engine Strategies conference in San Jose in 2004, Google representatives Wesley Chan and David Friedberg encountered Urchin. This meeting led to Google extending an offer to acquire Urchin, despite competing offers, including a higher bid from WebSideStory. The acquisition was finalized in April 2005, a time when Google had about 3,000 employees, relatively small compared to its current size.

Installation guides for Urchin 6, such as the one for Windows, are still accessible online: https://support.google.com/urchin/answer/2591336?hl=de&ref_topic=2591275

Key differences between Urchin 5 and Urchin 6 (copied from the Google Urchin help):

  • Major Features:
    – Up to 1000 profiles (domains), log sources, e-commerce, and campaign tracking are all included with the base license; no add-on modules
    – Individual visitor-level tracking, including session (path) data
    – Comprehensive SEO/SEM campaign tracking features, 4 goals per profile
    – Rich cross-segmenting available from most reports
    – Full suite of visitor geo-location reports (not just visitor domain)
    – Processing speed roughly on par with Urchin 5 but with much richer reports
  • Platform Support:
    – Broad range of Linux platforms supported with only 2 builds (Linux 2.4 and 2.6 kernels)
    – Added support for FreeBSD 5 and FreeBSD 6
    – Dropped support for MacOS X and Solaris (may be reconsidered if sufficient demand is demonstrated)
  • Installers:
    – Windows installer is now distributed as an MSI package, with better unattended-install support and integration with SMS
  • Configuration:
    – Relational database (MySQL or PostgreSQL) administrative configuration backend
    – Support for configuration database hosted on the remote configuration server
  • Web Server:
    – Upgraded to the latest Apache 1.3.X release
    – OpenSSL and mod-ssl upgraded to the latest versions
    – Removed default modules not used by Urchin
    – Added mod_expires for proper cache control headers
  • Task Scheduler:
    – Scheduler now runs as two processes – master scheduler and slave scheduler
    – Tasks easily managed via scripting interface to back-end configuration DB
  • Visitor Tracking:
    – The old __utm.js tracking JavaScript has been replaced with a Google Analytics-compatible urchin.js tracking JavaScript (existing sites will need to upgrade)
  • Log Processing:
    – Geodata stored in memory (larger runtime memory footprint)
    – Range of Days feature in log sources allows multi-day search for log files matching a particular date pattern
    – Ability to run profiles entirely in memory
  • Data & Storage:
    – Profile databases now default to 100,000 records/month (instead of 10,000) with the option to increase up to 500,000 records per month
    – Expanded Geodata: full set of geolocation data from Quova, replaces domain-only MaxMind data
    – Monthly table record limit increased from a default of 10,000 to 100,000 records
    – 50 monthly files per profile, now organized by subdirectory
  • Reporting UI:
    – Flash replaces Adobe SVG for rendering graphs & charts
    – Report exporting only in CSV and XML (removed unreliable MS Word/Excel exporting)
    – All Profiles report is defunct
    – New visitor session/path-level reporting capabilities
  • E-commerce:
    – E-commerce transactions can be written directly to webserver logs via special functions in the tracking javascript (identical to GA)
    – External shopping cart logs in ELF2 format are also supported
  • Security:
    – Urchin 6 has gone through thorough quality assurance for cross-site scripting (XSS) and XSRF vulnerabilities

This is Urchin 6. Individual visitor history drill-down — potentially controversial I guess. But at least there wasn’t a “composite sketch” of the visitor. That would have been SO COOL. 

Urchin 7 – groundbreaking ancestor of Google Analytics

Urchin 7 – the new UI looks very similar to the later, well-known Google Analytics

Urchin 7 marked the final chapter of the Urchin series, now under the Google umbrella and aptly named “Urchin 7 by Google.” This version was made freely available to users, initially through an invitation-based model, leading to a rapid expansion in the use of “Urchin by Google.”

Many of the Urchin employee profiles are also linked in the original urchin.biz story mentioned above.

Appendix I: Key personalities connected to Urchin Software Corporation

Numerous members of the original Urchin team continue to make their mark at Google.

Notably, Paul Muret serves as the Vice President of Engineering for Analytics and Display Ads. Other team members have ventured into entrepreneurial roles, founding new companies. Below is a list of the Urchin team members over the years, presented in no particular order, along with links to their subsequent ventures where available.

  1. Paul Muret
  2. Brett Crosby — PeerStreet
  3. Scott Crosby
  4. Jack Ancone
  5. Paul Botto
  6. Rolf Schreiber
  7. Jason Senn
  8. Jim Napier
  9. Hui-Sok “Nathan” Moon
  10. Alden DeSoto
  11. Jonathon Vance(s with Wolves)
  12. Doug Silver
  13. Jason Collins
  14. Justin Beope — Upas Street Brewing
  15. Megan Cash
  16. Christian Powell
  17. Nikki Morrissey
  18. Mike Chipman — Actual Metrics (Angelfish product)
  19. Steve Gott
  20. Ted Ryan
  21. Jeromy Henry
  22. Annie Aubrey
  23. Alex Ortiz
  24. Kelley Wilson
  25. Christina Hild
  26. David Cerce
  27. Ryan Walker
  28. Nick Mihailovski
  29. Bill Rhodes
  30. Jason Chen
  31. Juba Smith
  32. Bret Aarons
  33. Merrick
  34. Bart Fromm
  35. Chi Kwan
  36. Ed Schwartz
  37. Andy Smith
  38. Ed Petersen
  39. Cindy Lee
  40. Davee Schultie
  41. Joanna Rocchio
  42. Ben Norton

In the next article, we will look more closely at the shift to Google Analytics/Universal Analytics and onward to Google Analytics 4. We will also cover what is new in GA4 and how it compares.

Churn rate

The churn rate is a vital metric for every app. Discover what the churn rate means, how to compute it, and why it serves as a crucial Key Performance Indicator (KPI).

The churn rate is the percentage of users who have stopped using an app. It can be users who stopped using the app altogether or those who uninstalled it. The choice between these definitions depends on the app’s type and goals. For instance, the customer churn rate focuses on users who have discontinued using the app’s products or services, which is valuable for subscription-based apps.

Essentially, the churn rate counts how many users leave the app within a specific time period.

Why is the churn rate important?

A high churn rate may indicate that you’re investing in user acquisition but not maximizing the return on investment. Identifying the reasons for churn allows you to enhance user retention and subsequently increase revenue. Some mobile app sectors anticipate a high churn rate, like hyper-casual gaming, which considers it part of their business strategy.

How to calculate the churn rate?

To calculate the churn rate for your app, decide whether you want to measure inactive users, uninstalls, or subscription cancellations. Then, select your time frame, such as annual or monthly churn rate. In-app event tracking helps pinpoint when users tend to churn, aiding in determining the most relevant measurement period and identifying areas for intervention to reduce churn.

Churn rate formula

Churn rate (%) = (users lost during the period ÷ users at the start of the period) × 100
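
To make the formula concrete, here is a minimal Python sketch of the calculation, using made-up numbers:

```python
def churn_rate(users_at_start, users_lost):
    """Churn rate as a percentage of the users present at the start of the period."""
    if users_at_start == 0:
        raise ValueError("need at least one user at the start of the period")
    return users_lost / users_at_start * 100

# Example with made-up numbers: 2,000 active users on May 1st,
# 150 of whom were gone (inactive, uninstalled, or unsubscribed) by May 31st.
print(f"{churn_rate(2000, 150):.1f}%")  # prints "7.5%"
```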

What is a good churn rate?

Sometimes, a negative KPI can actually be a positive outcome. A negative (net revenue) churn rate indicates that the app generated more additional revenue from its existing customers, through upgrades and expanded usage, than it lost from the users who stopped using it.

However, it’s important to recognize that churn is a natural part of any business. Some users may not like an app, find a better alternative, or simply no longer require its features. Nonetheless, churn rate remains a vital KPI for app developers to consider.

By analyzing the churn rate, app developers can determine if changes are needed to enhance customer retention. This may involve improving the user experience, optimizing features, or adjusting prices. The benchmarks for churn rate can vary depending on your app’s industry, location, and platform.

It’s crucial to understand that when a user churns, it doesn’t mean they’re lost forever. There are various strategies to re-engage these users, and you can explore deeper use cases for using churn rate as a KPI in our user lifecycle guide. Additionally, you can learn more about achieving a good retention rate and discover ten strategies for improving your app’s user retention.

Key performance indicators (KPI) – how to define KPIs, example KPIs, and more

In the realm of mobile marketing, key performance indicators (KPIs) are metrics utilized to evaluate the performance of a mobile application and the encompassing business ecosystem.

These crucial marketing KPIs encompass various in-app elements like user retention, monetization, and active user count. Additionally, external factors such as review scores and chart rankings, along with campaign indicators like click-through rates and campaign costs, play a pivotal role.

App KPI metrics can extend to impressions, clicks, installs, reattribution, sessions, and triggered events, all of which serve as fundamental components for subsequent calculations.

KPIs play a significant role in ensuring that your teams are working towards the overall objectives of the organization. Here are some of the primary reasons why key performance indicators are essential:

  1. Team alignment – whether assessing project success or employee performance, KPIs keep teams moving in the same direction.
  2. Health assessment – key performance indicators offer a realistic evaluation of your organization’s health, covering aspects like risk factors and financial indicators.
  3. Adaptation – KPIs provide clear insights into your achievements and setbacks, enabling you to focus more on what works and less on what doesn’t.
  4. Accountability – ensure that every team member adds value by using key performance indicators that help employees track their progress and assist managers in facilitating progress.

Types of KPIs

Key performance indicators come in various forms. While some measure monthly progress toward a goal, others have a longer-term perspective. The common thread among all KPIs is their connection to strategic objectives. Here’s an overview of some of the most common types of KPIs:

  1. Strategic KPIs – these overarching indicators monitor organizational goals. Executives typically rely on one or two strategic KPIs to assess the organization’s current status. Examples include return on investment, revenue, and market share.
  2. Operational KPIs – these KPIs usually measure performance over a shorter timeframe and focus on organizational processes and efficiencies. Examples include regional sales, monthly transportation costs, and cost per acquisition (CPA).
  3. Functional unit KPIs – many KPIs are specific to particular functions like finance or IT. IT may track metrics such as time to resolution or average uptime, while finance KPIs may include gross profit margin or return on assets. These functional KPIs can also be categorized as strategic or operational.
  4. Leading vs. Lagging KPIs – regardless of the type of KPI, it’s essential to distinguish between leading indicators and lagging indicators. Leading KPIs can help predict outcomes while lagging KPIs track what has already occurred. Organizations use a combination of both to ensure they monitor what matters most.

How to develop KPIs

With a vast amount of data available, it can be tempting to measure everything or, at the very least, the easiest things to measure. However, it’s crucial to ensure that you’re measuring only the key performance indicators (KPIs) that will help you achieve your business goals. The strategic focus is a vital aspect of defining KPIs. Here are some best practices for developing the right KPIs:

  1. Clarify KPI utilization – engage with individuals who will use the KPI report to understand their objectives and how they intend to use the information. This helps define relevant and valuable KPIs for business users.
  2. Align with strategic goals – ensure that your KPIs are directly related to your overall business objectives. Even if they are associated with specific business functions like HR or marketing, every KPI should directly support your overarching business goals.
  3. Create SMART KPIs – effective KPIs adhere to the SMART formula, which stands for specific, measurable, attainable, realistic, and time-bound. Examples include “Increase sales by 5% per quarter” or “Raise the Net Promoter Score by 25% over the next three years.”
  4. Maintain clarity – all members of the organization should comprehend your KPIs so they can take action based on them. This emphasizes the importance of data literacy. When people understand how to work with data, they can make decisions that drive positive outcomes.
  5. Plan for iteration – recognize that as your business and customer dynamics evolve, you may need to adjust your KPIs. Some KPIs may become irrelevant, or changes may be necessary based on performance. Ensure you have a plan in place to assess and modify KPIs as needed.
  6. Avoid KPI overload – the abundance of data and interactive data visualization in business intelligence can lead to the temptation to measure everything. However, remember that the essence of key performance indicators is to focus on the most critical objectives. Prevent KPI overload by concentrating on the most impactful measures.

How to set up KPI strategies

If your key performance indicators (KPIs) aren’t delivering the expected results, it’s time to refine your approach. Here are three actions you can take to ensure that people throughout the organization understand the significance of your KPIs and how to utilize them for data-driven decision-making that influences your business:

  1. Choose the most relevant KPIs – to ensure you measure what truly matters, it’s essential to include a mix of leading and lagging indicators. Lagging indicators help assess results over a specific timeframe, such as sales over the past 30 days. Leading indicators enable you to anticipate potential outcomes based on data, empowering you to make adjustments for improved results.
  2. Foster a KPI-focused culture – key performance indicators hold little value if individuals don’t comprehend their significance or how to utilize them (including understanding the KPI acronym). Enhance data literacy across your organization to ensure that everyone collaborates toward strategic objectives. Educate your employees, assign them relevant KPIs, and utilize a top-tier business intelligence platform to ensure everyone makes decisions that drive your business forward.
  3. Continuously improve – keep your key performance indicators up-to-date by revising them in response to shifts in the market, customer preferences, and organizational dynamics. Regularly convene to review KPIs, thoroughly assess performance to identify areas for adjustment, and communicate any modifications to keep teams well-informed and aligned.

Examples of key performance indicators (KPIs)

Each department within a business employs distinct key performance indicators (KPIs) to monitor their progress. Numerous organizations utilize KPI dashboards to provide a centralized platform for visualizing, evaluating, and analyzing their performance metrics. Here are some KPI examples categorized by department, along with a dashboard representation for each:

  1. Finance
  2. Sales
  3. Marketing
  4. Information Technology (IT)
  5. Customer Service

Examples of finance KPIs

Finance managers have various options to monitor financial progress, encompassing expenses, revenue, margins, and cash management. Here are some examples of key performance indicators (KPIs) that can assist in tracking financial performance:

  1. Gross profit margin (and %)
  2. Operating profit margin (and %)
  3. Net profit margin (and %)
  4. Operating expense ratio

Examples of sales KPIs

To ensure your sales teams meet their targets, it’s crucial to track and regularly assess sales-related KPIs, including those related to leads, opportunities, closed sales, and volume. Here are examples of KPIs for sales teams:

  1. New inbound leads
  2. New qualified opportunities
  3. Total pipeline value
  4. Sales volume by location
  5. Average order value

Examples of marketing KPIs

Effectively manage marketing spend, conversion rates, and other indicators of marketing success by defining KPIs that align with your organization’s strategic goals. Here are some marketing KPIs to consider:

  1. Marketing qualified leads (MQLs)
  2. Sales qualified leads (SQLs)
  3. Conversion rates (for specific goals)
  4. Social program return on investment (ROI) (by platform)
  5. Return on ad spend (ROAS)

Examples of IT KPIs

KPIs can help maintain accountability and provide early warnings about potential issues, from support tickets to server downtime. IT teams can set targets for KPIs such as:

  1. Total support tickets
  2. Open support tickets
  3. Ticket resolution time
  4. Security-related downtime
  5. IT costs vs. revenue
  6. Reopened tickets

Examples of customer service KPIs

Customer service leaders need to monitor progress related to customers, employees, and finances. KPIs should cover both short- and long-term targets, including support response times and customer satisfaction. Here are some customer service KPIs:

  1. First contact resolution rate
  2. Average response time
  3. Most active support agents
  4. Cost per conversation
  5. Customer effort score

Key information about the term “key performance indicators (KPIs)” to remember

  • KPI stands for key performance indicators.
  • KPIs are targets used to measure progress toward strategic objectives, crucial for business success.
  • KPIs can be strategic (e.g., revenue growth rate, net profit margin) or operational (e.g., order fulfillment time, time to market) and are specific to various business units.
  • Consider factors like usage, alignment with strategic goals, SMART criteria (specific, measurable, attainable, realistic, time-bound), clarity, adaptability, and focus on measuring the most important aspects.
  • Develop high-performing KPIs by balancing leading and lagging indicators, fostering a data-driven culture, and regularly reviewing and adapting KPIs to changing circumstances.

Attribution fraud – what is it and how to detect attribution fraud in your campaign data?

Have you ever wondered how some mobile apps seem to magically get credited with installs they didn’t earn? It’s called “attribution fraud”.

What is attribution fraud?

Imagine you just installed a cool new app. You found it on your own or maybe through a friend’s recommendation. Now, someone shady wants to swoop in and take credit for your discovery. Attribution fraud is their sneaky way of doing it.

Attribution fraud stands out from other app install fraud schemes due to a crucial distinction, making it a trickier challenge to uncover.

While various fraudulent tactics often involve promoting underperforming apps, attribution fraud takes a different approach. It results in paid installs that can, surprisingly, deliver some of the most outstanding performance within your advertising campaign. These installs come from actual, genuine users.

How does attribution fraud work?

Fraudulent vendors rely on malware, which is like a digital spy, quietly hiding on your device. This malware watches what you do and waits for you to install an app. When it detects a new app installation, it goes into action.

Let’s say you installed the app organically or through a legitimate source. The malware doesn’t care; it wants credit. So, it creates a fake report, making it look like you clicked on an ad just before installing the app. This way, it tricks the system into thinking the shady ad was the reason you installed the app.

The fraudsters win, and you might not even realize it. They’ve just stolen credit for your action.
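
To see why that injected click wins, here is a toy Python model of last-click attribution. The network names and the 7-day lookback window are illustrative assumptions, not any particular vendor’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Click:
    source: str        # the ad network that reported the click
    timestamp: float   # seconds since epoch

def attribute_install(clicks, install_time, window=7 * 24 * 3600):
    """Last-click attribution: the most recent click inside the lookback
    window gets full credit; with no qualifying click, the install is organic."""
    eligible = [c for c in clicks if 0 <= install_time - c.timestamp <= window]
    if not eligible:
        return "organic"
    return max(eligible, key=lambda c: c.timestamp).source

install_time = 1_700_000_000.0
# A legitimate click from three days before the install...
clicks = [Click("honest_network", install_time - 3 * 24 * 3600)]
# ...loses all credit to a fake click injected one second before the install.
clicks.append(Click("fraudulent_network", install_time - 1))
print(attribute_install(clicks, install_time))  # prints "fraudulent_network"
```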

Attribution fraud – paying for what you already own

To illustrate this concept, let me share a brief story. Imagine a skilled chef whose pizzeria has become famous thanks to his home-grown organic tomatoes. As the restaurant gains popularity, the chef’s demand for tomatoes increases, prompting him to purchase them from the local market. As the months go by, his own garden produces fewer and fewer tomatoes, even though he continues to spend time and money tending to it.

The tomatoes he’s purchasing from the market are equally delightful, and his pizzeria remains just as popular. However, he’s overlooking a crucial aspect – he’s being defrauded. The tomatoes he painstakingly cultivated are being illicitly taken from his garden and then peddled back to him by the market vendor. He’s essentially paying for something that is already rightfully his.

This analogy mirrors app install attribution fraud: installs that occur naturally or are generated by other advertising sources are incorrectly attributed.

Fraudulent vendors employ various methods to manipulate the last-click attribution model. Regardless of the tactics used, the outcome remains the same: marketers end up paying for installs they would have acquired anyway. In essence, they’re repurchasing their own tomatoes.

Attribution fraud affects a quarter of all paid app installs

Attribution fraud is a big issue, and it affects a lot of paid app installs. In January 2018, a company called Machine looked at 4.6 million app installs, and they found that 52% of them were fake. This means there were about 3,322 fake app installs every hour.

Most of these fake installs happened because of attribution fraud. In fact, 54% of them were because of attribution fraud. This means there were about 1.3 million fake app installs every month just because of attribution fraud.

These numbers show that attribution fraud is a serious problem. But not many people talk about it. One reason is that knowing these numbers can be scary. And telling clients about it can be even scarier. Some people might think it’s okay because they’re spending their budget and getting good results, even if it’s not real.

Another issue is that it’s very tough to find attribution fraud.

Most fake app installs can be found by looking at how users behave later. But because attribution fraud steals real and often natural installs, we can’t use this method to catch them.

Ways to spot attribution fraud in your current campaign data

Here are three ways to find attribution fraud in your campaign data:

  1. High click volumes: If a site reports an unusually large number of clicks, it might be a sign of fraud.
  2. Low click-to-install rate: If the rate of clicks to actual installs is very low for a particular site, it could indicate fraud.
  3. High conversion rates: If a site consistently performs much better than others, it may be suspicious.

If you notice any of these signs in your campaign data, you might be paying for stolen installs.
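
As a starting point for your own analysis, here is a small Python sketch that applies those three checks to per-source campaign data. The field names and thresholds are assumptions to tune against your own traffic, not industry standards:

```python
def flag_suspicious_sources(rows, max_clicks=500_000, min_cti=0.001, cvr_ratio=3.0):
    """rows: dicts with 'source', 'clicks', 'installs', and 'conversions' keys.
    Returns (source, reasons) pairs for traffic sources worth investigating."""
    total_installs = sum(r["installs"] for r in rows) or 1
    avg_cvr = sum(r["conversions"] for r in rows) / total_installs
    flagged = []
    for r in rows:
        cti = r["installs"] / (r["clicks"] or 1)       # click-to-install rate
        cvr = r["conversions"] / (r["installs"] or 1)  # install-to-conversion rate
        reasons = []
        if r["clicks"] > max_clicks:
            reasons.append("unusually high click volume")
        if cti < min_cti:
            reasons.append("very low click-to-install rate")
        if avg_cvr > 0 and cvr > cvr_ratio * avg_cvr:
            reasons.append("conversion rate far above the campaign average")
        if reasons:
            flagged.append((r["source"], reasons))
    return flagged
```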

Although it can be tough to accept that many of your installs are unnecessary expenses, filtering out attribution fraud has a significant benefit. Your paid installs will decrease, and organic installs will increase, without affecting your results in terms of conversions, in-app performance, or the number of app installs.

In simple terms, you’ll spend less to achieve the same outcomes. This means more budget to invest in installs genuinely generated by your advertising partners and a much better return on investment (ROI), which is the ultimate goal for any app marketer.

Ad server – what is it and what are ad servers used for?

An ad server is a software platform responsible for overseeing the deployment of digital advertising campaigns.

What exactly is an ad server? What’s the role of an ad server?

An ad server stores different versions of creative assets for an advertising campaign, including images, audio, and video files. It then determines which versions to display to specific users. Additionally, ad servers can gather data, such as click-through rates and impressions, offering valuable insights into an ad’s effectiveness.

In a matter of milliseconds as a webpage loads, an ad server selects the most appropriate ad to place in an available advertising slot within a mobile app or website. This selection is made from a pool of available advertisements.

Ad servers function as data-driven intermediaries, forging connections between advertisements and specific audiences by leveraging descriptive tags related to factors like geolocation, interests, and behaviors. For instance, when promoting outdoor gear, the ad server seeks individuals whose data signals an affinity for activities like hiking.

These advanced algorithms rely on a range of decision-making criteria, encompassing multiple targeting variables, the frequency of ad presentations, the placement and format of display, and the potential for generating revenue.
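
A production ad server’s decision engine is far more elaborate, but a toy Python sketch with invented fields can capture the criteria just described: targeting overlap, a per-user frequency cap, and revenue potential:

```python
from dataclasses import dataclass, field

@dataclass
class Ad:
    name: str
    bid_cpm: float          # revenue potential per 1,000 impressions
    target_tags: set        # e.g. {"hiking", "outdoor-gear"}
    frequency_cap: int = 3  # max impressions per user
    served: dict = field(default_factory=dict)  # user_id -> impressions so far

def select_ad(ads, user_tags, user_id):
    """Return the highest-value ad whose targeting overlaps this user's
    profile and whose per-user frequency cap is not yet exhausted."""
    eligible = [
        ad for ad in ads
        if ad.target_tags & user_tags and ad.served.get(user_id, 0) < ad.frequency_cap
    ]
    if not eligible:
        return None  # e.g. fall back to a house ad or leave the slot empty
    winner = max(eligible, key=lambda ad: ad.bid_cpm)
    winner.served[user_id] = winner.served.get(user_id, 0) + 1
    return winner

ads = [
    Ad("hiking-boots", bid_cpm=4.0, target_tags={"hiking"}),
    Ad("city-hotel", bid_cpm=6.5, target_tags={"travel"}),
]
print(select_ad(ads, {"hiking", "travel"}, "user-1").name)  # prints "city-hotel"
```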

How does an ad server work?

In response to the escalating requirements of the digital marketing sector, ad servers have progressed beyond their initial role of merely storing and delivering ads. They have incorporated real-time decision-making capabilities and the ability to provide insights into campaign performance.

This expansion understandably leads to some uncertainty regarding how the capabilities of ad servers align with those of other advertising technology platforms.

Ad servers are different from ad networks, which collect available ad spaces from certain publishers and arrange their sales to advertisers.

The main distinction between an ad server and programmatic tools like ad networks, exchanges, SSPs, and DSPs is that the ad server handles all the necessary elements in one place.

For publishers, it enables them to serve and manage all types of ads: direct ones, in-house promotions (house ads), and ads from various programmatic sources. For advertisers, ad servers help with creative management and tracking ads displayed on different publishers’ websites and apps.

Here’s how it works: When advertisers buy ad slots, they can upload their ad materials to the ad server of the network. Then, when a publisher’s website or mobile app requests an ad, the ad server can generate the right tags and display the appropriate ad to the specific user, all from a central location.

Many ad servers come bundled with demand-side platforms (DSPs) – the interfaces that allow advertisers to purchase inventory from publishers. A DSP relies on an ad server to store ad materials and serve them to websites and mobile apps.

Conversely, an ad server without a DSP does not enable advertisers to connect to the programmatic ecosystem, where they can participate in automated real-time bidding (RTB) auctions for ad placement.

Ad servers that don’t have the DSP interface (or an SSP, which is where publishers handle their ad space) engage in something called a “direct deal.”

In a direct deal, a publisher sells their ad space directly to an advertiser. They negotiate the terms through a more traditional, hands-on media buying arrangement.

In this process, the ad servers of both the publisher and the advertiser communicate with each other to show ads to visitors on the publisher’s website or app. It’s worth noting that even though they use the same technology, publishers and advertisers use ad servers differently.

Types of ad servers

There are two primary types of ad servers:

  1. First-party (used by publishers)
  2. Third-party (used by advertisers)

Both of these ad servers have similar technical abilities, but they serve slightly different purposes for each party involved.

To put it in simple terms, publishers employ ad servers to have direct control over where and to whom their ad space is displayed. On the other hand, advertisers use ad servers to gather and review campaign data across various networks and publishing platforms where their ads appear.

Let’s delve deeper into each type.

First-party ad servers (used by publishers)

First-party ad servers are utilized by publishers to effectively manage their own ad inventory and gauge the performance of advertisers’ campaigns in terms of revenue and conversions.

These ad servers empower publishers to oversee ad slots on their websites and apps and display ads that have been directly sold to advertisers. Additionally, publishers’ ad servers can analyze user data, encompassing factors like geolocation, language, online behavior, and demographic attributes (when users provide consent), such as age and gender.

Simultaneously, these servers process the predefined business rules that determine which ads are eligible to appear in specific slots and establish the order in which advertisers can fill them through bidding. Leveraging this data, the servers then select the most suitable ads from the highest-value advertisers to showcase in the available ad slots.

Regarding measurement, first-party servers primarily focus on evaluating how an ad performs within a particular placement on the publisher’s website or mobile app.

Third-party ad servers (used by advertisers)

Third-party ad servers serve as a tool for advertisers to indirectly work with multiple publishing platforms. They enable advertisers to store, distribute, and measure various versions of active ad campaigns.

Advertisers often use these ad servers as an effective way to experiment with different creative variations and measure how their campaigns perform across different placements.

Here are some benefits of using third-party ad servers:

  1. Streamlined creative management – advertisers can manage their creative materials without the need for constant updates with publishers.
  2. Efficient template creation – they can develop templates to quickly generate new creatives that meet the diverse requirements of various publishing platforms.
  3. Testing variations – advertisers can test multiple versions of the same campaign to determine which one works best for specific target audiences and on particular platforms.
  4. Real-time optimization – campaign delivery can be optimized in real time for better results.
  5. Frequency control – advertisers can limit how often a single ad is shown to a user through frequency capping.
  6. Budget management – ad spend can be evenly distributed across different placements within a specified timeframe.
  7. Comprehensive data collection – advertisers can gather detailed data about campaign performance across all their placements, leading to more transparent and accurate reporting compared to relying solely on each publisher’s data.
  8. Traffic and engagement analysis – this data allows for the measurement of traffic and engagement across multiple sources, helping optimize future advertising spend.
  9. Centralized insights – all key metrics and insights are consolidated in one location, facilitating efficient reporting.

In essence, third-party ad servers provide advertisers with a powerful tool to effectively manage and optimize their ad campaigns across various platforms, ultimately leading to more efficient and data-driven advertising efforts.

Understanding hosted vs. self-hosted ad servers

When it comes to ad servers, you have two main options: hosted and self-hosted. The difference between them is quite simple:

Hosted servers

    • Pros:
      • You don’t need much technical knowledge because an external service provider handles most of the work and offers training and support.
      • The provider keeps an eye on your server’s speed and reliability and takes responsibility for addressing any issues.
    • Cons:
      • This higher-touch service typically comes at a cost.
      • You have limited control over data ownership and customization options.

Self-hosted (or open-source) servers

    • Pros:
      • You pay a one-time setup fee; after that, your only recurring costs are server maintenance.
      • You have full control over your data and can fully customize both the front-end and back-end aspects.
    • Cons:
      • Installation, customization, and support require dedicated technical expertise.
      • An open-source server might lack certain features, necessitating the use of additional plug-ins to achieve the desired functionalities.

Choosing between these two options depends on factors like how much control you want, the associated costs, the speed of implementation, and how user-friendly the solution is for you.

Best ad server platforms

When it comes to ad server platforms, there are several options available for both first-party and third-party ad serving. Here are a few notable names that stand out for their scale and quality:

  1. DoubleClick (now Google Ad Manager):
    • DoubleClick, now known as Google Ad Manager since 2018, remains a top choice for publishers. Google acquired DoubleClick in 2008.
    • It effectively combines two services: DoubleClick for Publishers (DFP) and DoubleClick Ad Exchange (AdX). Additionally, it offers DoubleClick Campaign Manager (DCM) for advertisers and agencies.
  2. OpenX:
    • OpenX is an integrated SSP (supply-side platform) that combines an ad server with a real-time bidding exchange for programmatic ad placements.
  3. Kevel:
    • Kevel (formerly Adzerk) is an API-based ad-serving platform used by some of the world’s most heavily visited websites, such as Reddit and Ticketmaster.
    • What sets Kevel apart is its DIY approach, offering a wide range of APIs for building highly customized ad server solutions. However, this also means it requires dedicated internal expertise and management.
  4. ironSource:
    • ironSource focuses specifically on in-app advertising within the mobile gaming industry.
  5. Google and Facebook self-serve platforms:
    • Google and Facebook have self-serve ad platforms that allow advertisers, especially small businesses, to log in and set up their own ads and campaign management.
  6. AdMob:
    • AdMob, owned by Google, specializes in mobile app advertising. It offers a wide range of ad formats, including banner ads, interstitials, and rewarded video ads, allowing app developers to monetize their mobile applications effectively.
  7. AppNexus (Now Xandr):
    • AppNexus, now part of Xandr, provides a comprehensive suite of digital advertising solutions. It serves both publishers and advertisers, offering real-time bidding (RTB) capabilities, programmatic advertising, and data-driven insights.
  8. SmartyAds:
    • SmartyAds is an end-to-end programmatic advertising platform that caters to advertisers, publishers, and ad agencies. It offers a variety of ad formats, targeting options, and tools for campaign management and optimization.
  9. Revive Adserver:
    • Revive Adserver is an open-source ad-serving platform designed for publishers, advertisers, and ad networks. It allows for efficient ad management, targeting, and tracking while offering flexibility through its open-source nature.
  10. Rubicon Project (Now Magnite):
    • The Rubicon Project, now part of Magnite, offers a global exchange for programmatic advertising. It connects publishers and advertisers to optimize ad inventory and maximize revenue through real-time auctions and data-driven insights.
  11. Adform:
    • Adform is a comprehensive ad tech platform that provides solutions for advertisers, agencies, and publishers. It offers features like data management, ad serving, and programmatic buying to enhance the efficiency of digital advertising campaigns.
  12. Sizmek (Now Amazon Advertising):
    • Sizmek, now part of Amazon Advertising, offers an array of ad solutions, including creative optimization, data-driven targeting, and ad serving. It empowers advertisers to deliver engaging and effective campaigns across various digital channels.

These platforms vary in their features and specialties, so the choice depends on your specific needs and objectives. Whether you’re a publisher, advertiser, or part of a different business, there’s likely a platform that suits your requirements in the dynamic world of digital advertising.

How to choose the right ad server for your needs

Wondering which ad server is the best fit for your requirements? Your role and position in the market will play a significant role in guiding you toward the ideal solution for your business. As you evaluate your specific needs, the crucial factor to consider is the amount of time and effort you’re willing to invest in implementing a solution that aligns with your goals.

For publishers – publishers should look for platforms that support rich media ad formats and provide self-serve account management options for advertisers. Optimization tools that help prioritize high-CPM (cost per mille) ads are also essential.

For advertisers – advertisers should focus on features like conversion measurement, optimization capabilities through A/B testing, robust analytics, and possibly APIs that enable advanced customization.

For ad networks – ad networks may be interested in white-label solutions, custom permissions, and support for a diverse range of ad formats.

By assessing your unique needs and considering your role in the advertising landscape, you can make an informed choice about the ad server that best serves your objectives.

Key information about the term “ad server” to remember

  • Imagine ad servers as the behind-the-scenes engines that drive the entire digital advertising world, much like web servers do for websites. The process they manage is intricate, involving a web of interconnected technologies. When everything runs smoothly, you might not even notice, but if something goes awry, it can have a significant impact.
  • Depending on your business goals, ad servers utilize their core capabilities in various ways to assist you in overseeing your digital marketing endeavors. Whether you’re a publisher or an advertiser, an ad server can act as a dynamic control center for efficiently managing your ad inventory, creative materials, and partnerships.
  • When you own your ad server, you also possess your data. In this sense, your ad server becomes the central hub of business intelligence, processing vital information about your customers, investments, and the effectiveness of your strategies across different channels. It consolidates this invaluable data in one place, ready to inform critical decision-making processes.