
View through rate (VTR)

The view-through rate (VTR) is the percentage of viewers who watch a video ad all the way to the end. It helps marketers gauge whether an ad is working and identify where it can be improved.

What is View Through Rate (VTR)?

View Through Rate (VTR) measures the effectiveness of video ads, such as those on YouTube or Hulu, by showing what share of viewers watch the ad from beginning to end.

Typical performance metrics for video ads are views and impressions (the number of times the ad was shown). But these numbers don’t tell you how much of the ad people actually watched, or whether they reached the end and saw your call-to-action (CTA).

That’s where VTR comes in. It divides the number of viewers who watched the entire video ad by the total number of impressions, showing what share of the people who saw the ad start also saw it finish.

The percentage we get from VTR helps us know if an ad is working, where people stop watching, and how to make better ads.

How does VTR work?

VTR is mainly for ads that people can choose to skip.

Non-skippable ads usually have a VTR close to 100% because viewers must watch them to reach their content. The VTR can dip if viewers abandon the page entirely, but even then it tends to look high. You can still track VTR on these ads, but it says far less about the ad’s quality than it does with skippable ads.

If your ad is skippable, VTR gives you a read on viewers’ first impressions. A low VTR, for instance, suggests the video didn’t grab their attention enough to hold them to the end.

VTR doesn’t tell you everything, though. If someone clicks through to your site before the ad ends, that view won’t count toward the VTR, even though the ad did its job.

To get a fuller picture of how your ad is performing, check completion rates at checkpoints such as 25%, 50%, and 75% of the ad’s length. This shows where viewers typically skip or drop off.

If most people leave before the call-to-action (CTA) at the end, you may need to make the ad more engaging or move the CTA earlier.
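As a sketch of the checkpoint idea above, here is a small Python example that turns raw checkpoint counts into completion rates; the numbers and the function name are made up for illustration:

```python
def quartile_rates(impressions, checkpoint_views):
    """Share of impressions still watching at each checkpoint, as a percentage.

    checkpoint_views maps a checkpoint label (e.g. "25%") to the number
    of viewers who were still watching at that point in the ad.
    """
    return {label: round(views / impressions * 100, 1)
            for label, views in checkpoint_views.items()}

# Hypothetical counts for an ad shown 1,000 times
rates = quartile_rates(1000, {"25%": 600, "50%": 350, "75%": 150, "100%": 100})
print(rates)  # {'25%': 60.0, '50%': 35.0, '75%': 15.0, '100%': 10.0}
```

In this made-up data the biggest drop happens between 25% and 50%, which would point at the second quarter of the ad as the place to tighten up.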

This tracking works through code in the ad’s video player that records how much of the ad each viewer watches. Cookies can additionally reveal what people do after the ad, such as whether they visit your website.

VTR (View Through Rate) formula

To figure out the VTR, you use this simple math:

VTR = (Number of times the whole ad is watched / Total number of times the ad is shown) x 100

Imagine your video ad is shown to 1,000 people on places like YouTube or social media. If only 10 of those people watch your ad from start to finish, then you have 10 complete views out of 1,000 times your ad was seen. To find the VTR, you put these numbers into the formula like this:

VTR = (10 / 1,000) x 100
VTR = 0.01 x 100
VTR = 1%

So, the VTR is 1%, meaning 1% of the people who saw the ad watched it all the way through.
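The calculation above is simple enough to express as a short Python function (the function name is just for illustration):

```python
def view_through_rate(complete_views, impressions):
    """VTR = complete views / impressions, expressed as a percentage."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return complete_views / impressions * 100

# 10 complete views out of 1,000 impressions, as in the example above
print(view_through_rate(10, 1000))  # 1.0, i.e. a 1% VTR
```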

Understanding benchmarks: Improving your VTR

Is a higher VTR always better for your video ad? Not necessarily. A VTR of around 15% is generally considered good, but not every ad needs to reach it.

Consider your video ad’s goal and what you’re advertising. Sometimes even a low VTR is acceptable: if you’re advertising an expensive product, a handful of sales can cover the cost of producing the ad.

Before you try to make your VTR better, think about these things that can change it:

  • Where your ad shows up (like which websites or social media)
  • The age, gender, and interests of the people you want to watch your ad
  • What your video’s title and thumbnail look like
  • The style, feel, and topic of your video
  • How long your video is

Improving your VTR

To improve your VTR, first consider where your ad is shown. If the rate is low, the audience watching may not connect with your ad; research who your ad should target and place it in front of people more likely to respond.

Also, if your video is longer than two minutes, people might not watch the whole thing. They could get bored or distracted. Short videos, like under a minute or even 30 seconds, often work better.

You have about eight seconds to grab someone’s attention with your ad. If nothing interesting happens in those first eight seconds, people might stop watching.

The start of your video needs to be good and clear. If it’s poor quality or takes too long to load, people might not stick around to watch, which could make your VTR go down.

What’s the difference between VTR and VCR?

The Video Completion Rate (VCR) shows the percentage of people who watch a video all the way to the end or up to a certain point.

VTR and VCR measure essentially the same thing: how many people watch a video to the end. You can use either term, but for clarity it’s best to pick one, VTR or VCR, and use it consistently when discussing complete views.

What’s the difference between VTR and CTR?

Click-through rate (CTR) is about how many people click on something like an ad or a link compared to everyone who saw it.

VTR tells you how many people watch a whole video, but it doesn’t say what they do after watching. CTR, on the other hand, tells you how many people go to your website or a special page after watching the video.

Since VTR and CTR measure different things, you can use them together to judge how well a video ad works overall. For example, if few people watch the whole video (low VTR) but many click through to your website (high CTR), the ad is still doing its job. If many people watch the whole video (high VTR) but few click through (low CTR), the video may be holding attention without persuading viewers to act.
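The combinations described above can be sketched as a rough decision helper in Python; the benchmark defaults here are illustrative, not industry standards:

```python
def diagnose_ad(vtr, ctr, vtr_benchmark=15.0, ctr_benchmark=1.0):
    """Give a rough read on an ad from its VTR and CTR (both in percent)."""
    watching = vtr >= vtr_benchmark   # do viewers finish the video?
    clicking = ctr >= ctr_benchmark   # do viewers click through?
    if clicking and not watching:
        return "drives action even though viewers skip ahead"
    if watching and not clicking:
        return "holds attention but fails to drive action"
    if watching and clicking:
        return "both engages viewers and drives action"
    return "struggles on both attention and action"

print(diagnose_ad(vtr=5.0, ctr=2.0))   # drives action even though viewers skip ahead
print(diagnose_ad(vtr=20.0, ctr=0.2))  # holds attention but fails to drive action
```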

KEY INFORMATION ABOUT THE TERM “VTR” TO REMEMBER

  • View-through rate (VTR) measures how many people watch a whole video ad. You find it by dividing the number of complete views by the total number of impressions, then multiplying by 100 to get a percentage.
  • There’s a special code in the video that helps businesses see how many people watch the video and when they stop watching.
  • A good VTR can change depending on a few things like how long the video is, where it’s shown, and the style and feel of the video.
  • You can use VTR and VCR in the same way to see if a video ad is effective, based on how many people watch it all.
  • VTR and CTR are both ways to check how well a video ad works. But, CTR is about how many people go to your website or a special page after watching the ad.

Mobile malware

Mobile malware is malicious software designed to infiltrate mobile phones and tablets. It can arrive through ads or apps and may steal sensitive information, misuse your device, lock it and demand a ransom to unlock it, or generate fake internet traffic.

What is mobile malware?

Malware is a well-known problem on computers. But as mobile usage has grown, hackers and fraudsters increasingly target phones as well.

Mobile malware is malicious software that is typically downloaded onto your phone without your knowledge.

How mobile malware works

Sometimes malware is already on the phone when you buy it, especially on cheaper devices. More often, though, it arrives through apps, particularly ones sideloaded from outside the official app store.

Once installed, it can be used to steal sensitive information, misuse your phone, lock it and demand a ransom, or generate fake internet traffic.

Why does mobile malware matter?

For people who work in mobile marketing, mobile malware is a serious concern. It is often used to hijack clicks and installs, which can distort attribution data and undermine marketing efforts.

What is Google Analytics 4, and how does it differ from Universal Analytics? A side-by-side comparison (part 2)

In the first part of our exploration and tour of the past, we dived into the fascinating journey of Urchin, tracing its evolution from a pioneering analytics tool to its pivotal role in the creation of Google Analytics. This story highlighted key milestones, from Urchin’s early days and its acquisition by Google, to the transformative steps that led to the development of Google Analytics as a dominant force in web analytics.

In this part of the story, we look at the newest big change in Google Analytics: the launch of Google Analytics 4 (GA4). We’ll explain what GA4 is and compare it with the older Universal Analytics, pointing out what’s new and better, including how it handles data, respects privacy, tracks users, and uses machine learning for deeper insights.

This section is not just about the technical side of GA4. We’ll also discuss what these changes mean for businesses and marketers: how moving to GA4 will affect them, what challenges they might face, and what new opportunities could arise in this fast-changing digital world.

Let’s dive into Google Analytics 4 together and see how it’s changing the game in understanding website data.

Google Analytics’ shift to Universal Analytics

Post-acquisition, Google utilized the foundation laid by Urchin to develop what would become one of the most popular web analytics services worldwide. Urchin’s technology and insights were instrumental in the creation of Google Analytics, a platform that has since become a staple for webmasters, marketers, and businesses looking to gain insights into their online presence and performance.

Following the acquisition, Google set out to transform Urchin on Demand into a more accessible and powerful web analytics tool. In November 2005, Google launched the first version of Google Analytics. This launch marked a significant moment in the web analytics industry, as it made robust analytics tools available for free to webmasters and marketers worldwide. The initial release was so popular that Google had to temporarily halt new sign-ups due to overwhelming demand.

The early versions of Google Analytics focused primarily on providing insights into website traffic, such as page views, session durations, and bounce rates. It was a tool designed to help website owners understand how visitors interacted with their sites. This data was crucial for businesses and marketers to optimize their websites for better user engagement and conversion rates.

Over the years, Google Analytics underwent several updates to enhance its functionality and user interface. These updates included improved goal tracking, e-commerce reporting, and the introduction of custom reports, which allowed users to tailor their data analysis more closely to their specific needs.

In 2007, Google introduced a major update with the launch of Google Analytics v2. This version featured a completely redesigned user interface, making it more user-friendly and visually appealing. The update also included new features like internal site search tracking and event tracking, which provided deeper insights into user behavior on websites.

Another significant development came in 2009 with the introduction of real-time analytics, enabling users to see active visitors on their site and their activities in real-time. This feature was a game-changer for immediate data analysis and decision-making.

Despite these advancements, the digital landscape was rapidly evolving, with the rise of mobile devices and the need for cross-platform tracking. To address these challenges, Google introduced Universal Analytics in 2012, marking the next major phase in the evolution of Google Analytics. Universal Analytics brought significant enhancements, including the ability to track user interactions across different devices and platforms, providing a more comprehensive view of the customer journey.

Universal Analytics quickly became the go-to platform for web tracking. However, on March 16, 2022, Google announced that it would be phased out, with standard properties set to stop processing data on July 1, 2023. This marked a significant shift in Google’s approach to web analytics, as indicated in their final announcement on the matter.

The period before Universal Analytics was thus characterized by continuous innovation and improvement, as Google Analytics evolved from a basic website traffic analysis tool into a sophisticated platform capable of providing deep insights into user behavior and digital marketing effectiveness.

What happened before GA4?

Google Analytics has changed a lot since Google bought it in 2005. Back then, Google bought a product named “Urchin Analytics” – that’s where the term UTM, or Urchin Tracking Modules, comes from.

Over time, Urchin’s technology evolved into the original Google Analytics.

In 2012, Google introduced Universal Analytics (UA), which became the main way to track website data.

What was new in Universal Analytics

Universal Analytics (UA), introduced by Google as the next evolution of Google Analytics, brought several significant improvements and new features compared to the older versions of Google Analytics. Here are some of the key advancements:

  1. User ID tracking – UA introduced the User ID feature, allowing more accurate tracking of individual users across multiple devices. This was a major leap from the older versions, which were more focused on sessions and didn’t track individual users as effectively across various devices.
  2. Custom dimensions and metrics – Universal Analytics allowed for more customization in tracking. Users could define their custom dimensions and metrics to track data that was specific to their business needs, something that wasn’t as flexible in previous versions.
  3. Enhanced E-commerce reporting – UA brought in a more sophisticated e-commerce reporting feature, enabling deeper insights into customer purchasing behavior, product performance, and more detailed transaction data.
  4. Simplified and more accurate tracking code – the tracking code in UA (analytics.js) was simpler and more efficient compared to the older ga.js script. This resulted in more accurate data collection and easier implementation.
  5. Offline data tracking – Universal Analytics provided the ability to track offline interactions, a feature not available in the previous versions. This allowed for a more comprehensive view of customer interactions beyond just their online activities.
  6. Session and campaign timeout handling – UA offered more control over session and campaign timeouts, allowing users to define the length of these periods based on their specific business requirements.
  7. More powerful data collection API – the measurement protocol in UA allowed for data collection from any digital device, not just websites, broadening the scope of analytics to include things like gaming consoles and customer relationship management systems.
  8. Enhanced site speed and performance metrics – UA included more detailed site speed and performance metrics, giving insights into how website performance impacted user experience.
  9. Custom channel groupings and content groupings – it allowed for more granular segmentation of traffic and content, helping in better analysis and understanding of user engagement.
  10. Mobile app analytics – UA was better equipped for mobile app analytics compared to its predecessors, providing detailed insights into mobile app usage and user engagement.

These improvements made Universal Analytics a more robust, flexible, and user-centric analytics tool, offering businesses deeper insights into their users’ behaviors and preferences.

But on March 16, 2022, Google said UA would stop working in July 2023.

When was Universal Analytics deprecated?

“On July 1, 2023, this property will stop processing data. Starting in March 2023, for continued website measurement, migrate your original property settings to a Google Analytics 4 (GA4) property, or they’ll be copied for you to an existing GA4 property, reusing existing site tags.”

That was the message marketers saw for the first time on March 16, 2022, when Google announced that Universal Analytics would stop processing hits on July 1, 2023, for standard UA properties, and on October 1, 2023, for UA 360 properties. Given the relatively short time frame, the announcement caught many marketers off guard and sparked a bit of panic within the marketing and web analytics industry. Although Google Analytics 4 had been available since October 2020, its adoption rate was still quite low at the time. This likely influenced Google’s decision to expedite the transition by setting an early deprecation date for Universal Analytics.

While more than a year might have seemed like sufficient time to implement a new tracking script on a website, the critical role of Google Analytics for most businesses in performance measurement made the transition period quite significant. Prior to starting the implementation of Google Analytics 4, it was essential to grasp its differences from Universal Analytics. Our list of important distinctions was prepared to aid in this understanding.

What were the main reasons Google introduced GA4?

Google introduced Google Analytics 4 (GA4) primarily to adapt to the evolving digital landscape where technology and user behavior are constantly changing. The traditional version of Google Analytics (Universal Analytics) was not equipped to provide accurate insights considering these rapid changes. Here are the main reasons for the introduction of GA4:

  1. Adapting to a changing digital landscape – GA4 was designed to keep up with advancements in technology and the evolving patterns of user behavior, which the older version may not accurately capture. In an increasingly dynamic digital environment, Google Analytics 4 (GA4) has been engineered to provide a robust analytics tool that keeps pace with the rapid technological advancements and shifting user behaviors online. The predecessor, Universal Analytics, was based on a session-based data model primarily focused on desktop web data. As the internet landscape transitioned more towards mobile and app-based interactions, Universal Analytics’ capabilities to provide insightful data started lagging. GA4 introduces an event-based data model that is more adaptable and suitable for a variety of user interactions across websites and apps. This model allows for the collection and analysis of data from both web and app sources in a unified manner, offering a more comprehensive view of user engagement. This is crucial as user interactions become more varied and less predictable, with multiple devices and platforms being used interchangeably. Moreover, GA4’s flexible approach to data measurement recognizes that the digital landscape is not static. It acknowledges the introduction of new platforms, devices, and user behaviors that could emerge in the future, ensuring that the tool remains relevant and effective.
  2. Privacy-first approach – GA4 represents a shift towards privacy-centric tracking. With increasing concerns about user privacy and data protection, GA4 aims to address these challenges while still offering robust analytics. The decade-old Universal Analytics (UA) platform, with its data architecture that increasingly hindered scalability and its non-compliance with emerging data privacy laws like GDPR and CCPA, created an urgent need for Google to facilitate a transition to a more modern solution. This led to the development of Google Analytics 4 (GA4), which has been fundamentally designed with privacy at its core to meet these contemporary challenges. GA4 ushers in a range of privacy-centric features to help businesses align with stringent data protection standards. These features include:
    • IP anonymization – by default, Google Analytics 4 anonymizes IP addresses, reducing the risk of personal data breaches and ensuring user anonymity.
    • Data retention controls – Google Analytics 4 offers more granular controls over data storage durations, allowing organizations to decide how long they retain data.
    • Server and data transfer locations – Google Analytics 4 provides clear policies regarding the location of its data servers and restricts data transfers, crucial for compliance with regulations like GDPR which have specific requirements about where and how personal data can be stored and processed.
    • Consent mode – this feature allows businesses to adjust how they collect and use data based on the consent provided by users, ensuring compliance with user privacy preferences.
    • User data deletion – Google Analytics 4 enables the deletion of users’ personal data upon request, supporting the ‘right to be forgotten’ as stipulated by privacy regulations.
    • PII handling rules – strict rules on the handling of personally identifiable information (PII) are implemented to prevent privacy violations.
  3. Cross-channel measurement – the platform is geared towards cross-channel measurement, which allows for a more integrated view of the customer journey across different platforms and devices.
  4. Advanced machine learning – GA4 incorporates Google’s advanced machine learning to provide predictive insights and data, which can help organizations anticipate future actions of users and better understand the patterns in website traffic and user behavior without relying on traditional page hits​​.
  5. Enhanced data visualization and analytics – with a focus on machine learning and data visualization, GA4 offers more comprehensive predictive analytics and insights, as well as improved data visualizations compared to its predecessor. It also simplifies the tracking process, making it more user-friendly.

All key differences between GA4 and Universal Analytics

How does GA4 measure users compared to Universal Analytics?

Universal Analytics relies on “cookie-based” tracking to collect data. When a website uses Universal Analytics, it places a cookie in the user’s web browser. This cookie enables the platform to monitor and record the user’s web activity during their session on the site. Universal Analytics uses a session-based data model for measurement.

In contrast, GA4 offers a more versatile approach. According to Google, GA4 allows businesses to measure user activity across various platforms and devices using multiple forms of identity. This includes first-party data and “Google signals” from users who have consented to personalized ads. GA4 also continues to use cookies where available for tracking purposes. However, instead of relying solely on tracking sessions, GA4 adopts an event-based data model.

In today’s privacy-conscious world, the use of cookies is under scrutiny, and their prevalence may decrease over time. While this trend is arguably positive for user privacy, it poses challenges for digital marketers, who have traditionally relied on cookies for tracking and targeting.

How Universal Analytics properties and Google Analytics 4 properties compare:

  • Measurement: UA uses a session-based data model; GA4 uses a flexible event-based data model.
  • Reporting: UA offers limited cross-device and cross-platform reporting; GA4 offers full cross-device and cross-platform reporting.
  • Automation: UA has limited automation; GA4 applies machine learning throughout to improve and simplify insight discovery.

In GA4, users are tracked using an event-based data model, differing from the session-based model used previously. Unlike Universal Analytics, which mainly uses the ‘Total Users’ metric, GA4 emphasizes ‘Active Users.’ It also tracks ‘Total Users,’ ‘New Users,’ and ‘Returning Users.’ While diving into this aspect of GA4 might not be advisable for beginners in Google Analytics, a guide on user measurement in GA4 could serve as a valuable resource for future reference.

In GA4, all hits are tracked as an event

Let’s take it straight from Google’s documentation, starting with the session-based model.

In UA properties, Analytics groups data into sessions, and these sessions are the foundation of all reporting. A session is a group of user interactions with your website that take place within a given time frame.

During a session, Analytics collects and stores user interactions, such as pageviews, events, and eCommerce transactions, as hits. A single session can contain multiple hits, depending on how a user interacts with your website.

How about the event-based model?

In GA4 properties, you can still see session data, but Analytics collects and stores user interactions with your website or app as events. Events provide insight on what’s happening in your website or app, such as pageviews, button clicks, user actions, or system events.

Events can collect and send pieces of information that more fully specify the action the user took or add further context to the event or user. This information could include things like the value of purchase, the title of the page a user visited, or the geographic location of the user.

This might not sound like much, but it is a fundamental difference.
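To make the difference concrete, here is a purely illustrative Python sketch of the same visit represented both ways: UA grouping typed hits under a session, GA4 recording a flat stream of named events with parameters. The field names are simplified for illustration, not actual payload schemas:

```python
# Universal Analytics view: typed hits grouped under one session
ua_session = {
    "session_id": "abc123",
    "hits": [
        {"type": "pageview", "page": "/home"},
        {"type": "event", "category": "video", "action": "play", "label": "promo"},
        {"type": "transaction", "revenue": 49.0},
    ],
}

# GA4 view: every interaction is an event with a name and parameters
ga4_events = [
    {"name": "page_view", "params": {"page_location": "/home"}},
    {"name": "video_start", "params": {"video_title": "promo"}},
    {"name": "purchase", "params": {"value": 49.0, "currency": "USD"}},
]

# In UA the hit *type* carries the meaning; in GA4 the event *name* does,
# so every kind of interaction shares one uniform shape.
print(all(set(e) == {"name", "params"} for e in ga4_events))  # True
```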

Event-based data vs. session-based data

Displayed in the table is how Google Analytics captures data. On the left, the Universal Analytics property lists various “hit types,” each representing a different kind of data. In web analytics, a “hit” refers to any interaction a visitor makes with your site or app, such as a click, page view, scroll, file download, purchase, or any other trackable action. On the right, the GA4 property simplifies this by categorizing all types of hits as Events.

Universal Analytics hit types and their GA4 equivalent:

  • Page view → Event
  • Event → Event
  • Social → Event
  • Transaction/e-commerce → Event
  • User timing → Event
  • Exception → Event
  • App/screen view → Event

Events existed in UA as well, with an associated category, action, and label, but these classifications do not exist in GA4. Instead, GA4 works with event parameters – additional pieces of information about the action (event) a user took. Some event parameters, such as page_title, are sent automatically, and additional ones can be added (you can log up to 25 event parameters with each event). Since the data models are fundamentally different, Google recommends that you do not simply copy existing event logic from UA to GA4, but instead implement new logic that makes sense in the new context.
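As a sketch of the parameter model just described, here is a small helper that assembles a GA4-style event and enforces the 25-parameter limit; build_event is a hypothetical name for illustration, not a Google API:

```python
MAX_EVENT_PARAMS = 25  # GA4 allows up to 25 parameters per event

def build_event(name, **params):
    """Assemble a GA4-style event payload, rejecting oversized parameter sets."""
    if len(params) > MAX_EVENT_PARAMS:
        raise ValueError(f"GA4 events accept at most {MAX_EVENT_PARAMS} parameters")
    return {"name": name, "params": params}

event = build_event("add_to_cart", item_id="SKU_123", value=19.99, currency="USD")
print(event["name"], len(event["params"]))  # add_to_cart 3
```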

Universal Analytics can track events such as button clicks, scroll depth, and downloads, but this necessitates the use of Google Tag Manager. In GA4, while some events still require Google Tag Manager for tracking (known as “recommended events” and “custom events”), there are also events that GA4 automatically measures without additional tools. These automatically tracked events are divided into two categories: “automatically collected events” and “enhanced measurement events.”

Event types in GA4

Let’s dive deeper into the different event categories in GA4, as outlined in the Google Developers guide for GA4.

There are four main event categories in GA4, but I’ll focus on two first: automatically collected events and enhanced measurement events. These event types are logged automatically through either the gtag.js or Google Tag Manager configuration and don’t require extra coding.

Here’s an overview of automatically collected events:

  1. session_start: Triggers when a user initiates a session on a website or app.
  2. first_visit: Fires if it’s a user’s first visit to the site or app.
  3. user_engagement: This occurs when a visitor has been on a page for at least 10 seconds, viewed two pages, or completed a conversion event. GA4 tracks this automatically.

Now, regarding enhanced measurement events:

These include page views, scrolls, outbound clicks, site searches, video engagement, and file downloads. Enhanced measurement events are particularly noteworthy as an upgrade in GA4 compared to UA, where tracking these events requires additional effort.

In the Admin section of your GA4 property, under Data Stream, you can view these enhanced measurement events. Except for page views, you have the option to toggle off any of these enhanced measurement events if you choose.

  • Category 3 – Recommended Events – these events come with predefined names and parameters, tailored for various business models. Implementing them requires custom coding, often done via Google Tag Manager. Their defining feature is that Google has suggested specific names for these events. For instance, when a visitor adds an item to their cart, Google suggests naming this event add_to_cart. The initiation of the checkout process should be labeled as begin_checkout, and the submission of a contact form as generate_lead. You can find a comprehensive list of these Recommended Events on Google’s website, detailing all the suggested event names.
  • Category 4 – Custom Events – similar to recommended events, custom events also need custom coding and can be set up using Google Tag Manager. However, unlike recommended events, there are no predefined names or parameters from Google. For example, tracking internal link clicks on your site would fall under custom events. While GA4’s Enhanced Measurement can track outbound (external) link clicks, tracking internal link clicks requires more effort through custom event implementation. The following sections offer a comparison of how to approach this.

Tracking link click events in GA4 vs. UA

In Universal Analytics (UA), data tracking primarily revolves around pageviews. UA tracks when a URL loads, capturing that as a pageview. However, user actions that don’t lead to a new page loading on the tracked domain, such as video plays or clicks within or outside the domain, aren’t automatically tracked. For tracking these “events” like link clicks, Universal Analytics requires assistance from Google Tag Manager.

Setting this up can be quite intricate, especially for marketers doing it for the first time. It involves creating variables, triggers, and tags in Google Tag Manager to track specific actions, translating them into data in Google Analytics. For example, a Universal Analytics event tag might be configured to track all link clicks on a particular site. A key difference in UA, compared to GA4, is the use of pre-defined “event parameters” such as category, action, and label. These parameters provide additional context to the tracked event, aiding in data interpretation.

  • Category = link_click: This parameter is consistently used every time the tag is activated.
  • Action = {{Click URL}}: This variable captures the specific URL clicked by the user.
  • Label = {{Page URL}}: Another variable, this one records the URL of the page the user was on when the link was clicked.

In contrast, GA4 simplifies this process by automatically tracking certain types of link clicks, reducing the need for such a detailed setup in many cases.
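One way to picture the shift is a helper that maps the UA category/action/label setup above onto a GA4-style custom event with named parameters. This is an illustrative sketch, not an official mapping; as noted earlier, Google recommends designing GA4 events fresh rather than copying UA logic one-to-one:

```python
def ua_to_ga4_link_click(category, action, label):
    """Translate a UA link-click hit (category/action/label) into a
    GA4-style custom event with descriptive parameters (sketch only)."""
    return {
        "name": "link_click",            # hypothetical custom event name
        "params": {
            "link_url": action,          # UA's Action held the clicked URL
            "page_location": label,      # UA's Label held the page URL
            "ua_category": category,     # kept only for reference
        },
    }

evt = ua_to_ga4_link_click("link_click", "https://example.com/pricing", "/home")
print(evt["params"]["link_url"])  # https://example.com/pricing
```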

Event tracking in GA4

Contrary to Universal Analytics, GA4 is designed with event tracking capabilities built-in, rather than relying primarily on pageview tracking. As previously mentioned, GA4 automatically tracks certain types of events (like automatically collected events and enhanced measurement events). However, for recommended events and custom events, manual creation using Google Tag Manager is necessary.

GA4 automatically captures some basic “event parameters” with every event. These include:

  • language
  • page_location
  • page_referrer
  • page_title
  • screen_resolution

For additional parameters in recommended and custom events, there’s an extra step involved: these parameters need to be registered as custom dimensions in GA4. This aspect of GA4 can be initially confusing and for those finding it non-intuitive, consulting a practical guide on understanding event parameters in GA4 might be beneficial.

While some events in GA4 can be tracked directly (like automatically collected events and enhanced measurement events), others, such as recommended events and custom events, still require the use of Google Tag Manager. An illustrative example of this is tracking internal link clicks, which falls under the category of events requiring additional setup in GA4.
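For comparison, a GA4 custom event for internal link clicks can also be sent directly with gtag.js. This is only a sketch: the event name `internal_link_click`, the helper `trackInternalLink`, and the parameter `link_url` are illustrative names, and a custom parameter like `link_url` would need to be registered as a custom dimension before it shows up in reports.

```javascript
// dataLayer shim; in a real page the standard gtag.js snippet provides this.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Hypothetical helper: report an internal link click as a GA4 custom event.
// 'link_url' is a custom parameter and must be registered as a custom
// dimension in the GA4 admin before it appears in reports.
function trackInternalLink(linkUrl, pageUrl) {
  gtag('event', 'internal_link_click', {
    link_url: linkUrl,
    page_location: pageUrl, // GA4 collects page_location automatically as well
  });
}

trackInternalLink('/pricing', 'https://example.com/home');
```

Compared with the UA setup, there is no category/action/label scaffolding: the event is just a name plus a free-form bag of parameters.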

How to compare reporting in Google Analytics 4 (GA4) and Universal Analytics (UA)

Universal Analytics was designed as a robust platform with a variety of standard reports, whereas GA4 offers relatively fewer standard reports, placing a greater emphasis on custom reporting or data exportation. Examining the area of acquisition reporting, which is crucial in Google Analytics, highlights the differences between GA4 and UA.

Acquisition reporting is essential for analyzing the performance of different traffic sources on a website. This type of reporting is critical for evaluating the effectiveness of channels like organic search, email, and social media in driving purchases or other conversions. It not only aids in understanding overall business performance but also assists in making informed decisions about budget allocation.

While the fundamental concept of acquisition reporting in GA4 is similar to that in UA, there are notable distinctions. GA4’s approach to acquisition reporting might require more customization or data manipulation compared to the more out-of-the-box solutions provided by UA. This shift indicates a move towards a more flexible, albeit potentially more complex, reporting framework in GA4.

GA4 reports

In GA4’s Acquisition reporting section, there are just three standard reports available. A significant omission here is the Source/Medium report, which was highly favored for its effectiveness in analyzing traffic performance across various channels (like comparing google / organic with bing / organic or google / cpc).

The whole interface has changed, and some standard reports need to be activated via the Library in the lower-left corner (you also have to publish the changes; otherwise the default reports will not change).

You can adjust all reports according to your needs (you can create your own views).

To conduct more nuanced analyses in GA4, additional effort is required. This could involve exporting data for deeper examination or creating custom “Exploration” reports. Another option is to integrate with Looker Studio (previously known as Data Studio) to construct more tailored reports. While these custom reports can be extremely useful, they initially demand more time and effort to set up than the more straightforward reporting system in Universal Analytics.

Universal Analytics reports

In Universal Analytics (UA), a significantly larger array of standard reports is available compared to GA4. When you dive into all the reporting sections in UA, you’ll find a total of 30 standard reports, a stark contrast to the mere 3 standard reports offered in GA4.

And, of course, we see the faithful Source / Medium report among the 30.


User reports (UA) in GA4 – cheatsheet

Universal Analytics → Google Analytics 4

  • Demographics data – overview → User – User Attributes – Overview
  • Age / Gender / Location → User – User Attributes – Demographic details
  • Technical data – overview → Tech – Tech Overview
  • Operating system / Device type / Browsers → Tech – Tech Details

Acquisition reports (UA) in GA4 – cheatsheet

Universal Analytics → Google Analytics 4

  • Acquisition Overview → Lifecycle – Acquisition – Acquisition Overview
  • All Traffic – Channels → Lifecycle – Acquisition – Traffic acquisition: Session default channel group (also Advertising – Performance – All Channels)
  • All Traffic – Source/Medium → Lifecycle – Acquisition – Traffic acquisition: Session default channel group
  • All Channel Reports – by users → Lifecycle – Acquisition – Users
  • Google Ads → Lifecycle – Acquisition – Acquisition Overview – Google Ads tab
  • Search Console → Search Console – Search Console – Queries
  • Google Organic Search → Search Console – Search Console – Google organic search traffic: Landing page + query string
  • Organic Traffic (all) → Lifecycle – Acquisition – Traffic acquisition: Session default channel group

Behaviour + event reports (UA) in GA4 – cheatsheet

Universal Analytics → Google Analytics 4

  • All Pages → Lifecycle – Engagement – Pages and screens
  • Events → Lifecycle – Engagement – Events
  • Behaviour (new vs returning users) → Lifecycle – Retention (the Customer Lifetime Value metric also lives here)

E-commerce reports (UA) in GA4 – cheatsheet

Universal Analytics → Google Analytics 4

  • Conversions → Lifecycle – Engagement – Conversions
  • Items Revenue → Business Objectives – Drive Online Sales – E-commerce Purchases
  • Purchase Funnel → Business Objectives – Drive Online Sales – Purchase Journey

Metrics comparison: GA4 vs. Universal Analytics

In GA4, there are three new metrics that differ from those in Universal Analytics (UA):

  1. Engaged session – Google defines this metric as a session that lasts longer than 10 seconds, includes a conversion event, or has two or more screen or page views.
  2. Average engagement time per session – this metric measures the duration of user engagement per session, essentially the time spent actively interacting with the page (like scrolling) while it remains the primary window on the screen.
  3. Engagement rate – this is calculated as the ratio of Engaged Sessions to total sessions. For example, if you have 1,000 total sessions and 130 qualify as Engaged Sessions (as per Google’s definition), the Engagement Rate would be 13%.
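The definitions above translate into a simple computation. The sketch below uses hypothetical field names (`durationSec`, `conversions`, `views`) for a session record; the OR in `isEngaged` mirrors Google’s “longer than 10 seconds, includes a conversion event, or has two or more views” definition:

```javascript
// A session is "engaged" if ANY of the three conditions holds.
function isEngaged(session) {
  return session.durationSec > 10
      || session.conversions > 0
      || session.views >= 2;
}

// Engagement rate = engaged sessions / total sessions.
function engagementRate(sessions) {
  const engaged = sessions.filter(isEngaged).length;
  return engaged / sessions.length;
}
```

With 1,000 sessions of which 130 satisfy `isEngaged`, `engagementRate` returns 0.13, matching the 13% example above.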

These metrics are not available in UA.

They replace some of the metrics that were phased out in GA4, such as average session duration, pages per session, and bounce rate (the percentage of single-page view sessions).

Regarding bounce rate, although initially absent in GA4, it was later introduced but with a different calculation method than in UA, which can be confusing. Understanding this difference is crucial for the accurate interpretation of data in GA4.

The reason behind these metric differences between GA4 and Universal Analytics relates to their respective data models. UA’s model is centered around sessions and pageviews, making it straightforward to calculate metrics like pages per session or bounce rate. In contrast, GA4’s model prioritizes event collection and processing over traditional page views and sessions, necessitating a different approach to metric calculations.

Difference between sessions in GA4 and UA

Definition of session in Universal Analytics (UA)

A session ends when:

  • 30 minutes of inactivity pass (or whatever your session timeout setting is),
  • the clock passes midnight (which starts a new session), or
  • new campaign parameters are encountered (for example, when UTM parameters are used on internal links of your website; this is why Google recommends against using UTMs for internal linking).

Definition of session in Google Analytics 4 (GA4)

There are fewer ways for a session to end. A GA4 session ends only after:

  • 30 minutes of inactivity (or your session timeout setting).

Sessions now carry over across midnight, and they are not restarted by encountering new campaign parameters.

If your site has a global audience, this can cause discrepancies in the session figures you see for UA and GA4 respectively.
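These differing break rules can be illustrated with a toy sessionizer. Everything here is a simplification (real GA sessionization happens server-side, and UA’s midnight reset used the property’s configured timezone, whereas this sketch compares UTC dates), but it shows why the same hit stream yields different session counts:

```javascript
const TIMEOUT_MS = 30 * 60 * 1000; // 30-minute inactivity timeout (both models)

// Count sessions in a time-ordered list of hits: [{ ts: epochMs, campaign }].
// model is 'UA' or 'GA4'. Midnight is checked against the UTC date here;
// real UA used the property's configured timezone.
function countSessions(hits, model) {
  let sessions = 0;
  let prev = null;
  for (const hit of hits) {
    let newSession = prev === null || hit.ts - prev.ts > TIMEOUT_MS;
    if (model === 'UA' && prev !== null) {
      const crossedMidnight =
        new Date(hit.ts).toISOString().slice(0, 10) !==
        new Date(prev.ts).toISOString().slice(0, 10);
      newSession = newSession
        || crossedMidnight                 // UA only: new day, new session
        || hit.campaign !== prev.campaign; // UA only: new campaign parameters
    }
    if (newSession) sessions++;
    prev = hit;
  }
  return sessions;
}
```

Two hits ten minutes apart but on either side of midnight count as two UA sessions and one GA4 session; the same goes for a mid-visit campaign change.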

Why is bounce rate no longer used in Google Analytics 4?

There are plenty of reasons why Google decided to phase out the old bounce rate when it first launched GA4, from a perceived lack of relevancy to the popularity of single-page applications (SPA). Bounce rate has been around since the beginning of Google Analytics, which means it took more than a decade for marketers and agencies to become attached to the bounce rate. But in an age of single-page applications, the old definition of bounce rate isn’t useful anymore. Since there’s technically only one “page,” every visit is seen as a bounce, which isn’t true or helpful.

Single-Page Applications (SPAs):

  • Unlike old websites where every click loads a new page, SPAs rewrite the current page on-the-fly, making things fast and smooth.
  • Big names like Gmail, Netflix, and Facebook use this design, making your browsing quick and seamless without waiting for new pages to load.

Google now focuses on more meaningful interactions, looking at what users do on the page, rather than just counting page loads. This way, businesses can understand their audience better and see if their website is actually engaging or needs improvement.

Google has therefore chosen a more “positive” framing: instead of bounce rate, it reports a new metric, engagement rate, and bounce rate is defined as its inverse.

On top of that, bounce rate is calculated differently in GA4 than in UA.

GA UA vs GA4 – how bounce rate and engagement rate are calculated

  • Bounce rate in Universal Analytics – if someone visits your client’s website and views only a single page before leaving, that visit counts as a “bounce” in UA.
  • Bounce rate in GA4 – the percentage of sessions that were not engaged, i.e. sessions that:
    • lasted less than 10 seconds,
    • had zero conversion events, and
    • had fewer than 2 page or screen views.
  • Engagement rate in GA4 – the inverse of bounce rate.

GA UA vs GA4 – longer data processing delay

As you migrate to GA4, one of the first things you may notice is the extended data processing time. Universal Analytics typically provides data within four hours, while GA4 introduces a wider range of data delays, spanning from 12 to 48 hours.

GA UA vs GA4 – Google Tag Manager is even more important

In Universal Analytics, common goals (destination-URL goals, for example) could be configured directly in the interface. With GA4, this is no longer possible: all conversions are event-based, so you typically need Google Tag Manager to set them up.

A lot of standard settings which UA had in default, now need manual setup in GTM/GA4.

Segments in Google Analytics 4 (GA4) vs. Universal Analytics (UA)

Segments in both GA4 and Universal Analytics serve the same fundamental purpose: they enable the analysis of specific subsets of your Google Analytics data, providing deeper insights into user behavior and website or app performance. While segments operate similarly in both versions, allowing analysis of up to four segments simultaneously, there are differences in the types of segments you can create and the process of creating them.

In GA4, there are three types of segments available for creation:

  1. User segments – focus on the characteristics and behavior of individual users.
  2. Session segments – focus on specific sessions.
  3. Event segments – targeting particular events.

In contrast, Universal Analytics offers only two types of segments:

  1. User segments.
  2. Session segments.

The most notable difference lies in the process of creating these segments. In GA4, segment creation is integrated into the “Explorations” section, the same area used for creating custom reports. This integration signifies a more unified approach to data analysis and reporting in GA4. For a detailed comparison and guidance on creating custom segments, reviewing a walkthrough of segment creation in Google Analytics 4 versus Universal Analytics could be highly beneficial.

Website and app tracking in the same property

A highly anticipated feature of GA4 is its capability to consolidate the tracking of both website and app data within a single property. GA4 adopts the measurement approach utilized by Google Analytics Firebase, which is tailored for mobile apps, where every user interaction is recorded as an event. This event-based model allows for a seamless integration of data across websites and mobile apps. The unified data framework simplifies the process of aggregating and analyzing user behavior across different platforms, providing a holistic view of the user journey. This means analysts and marketers can track a user’s interaction from the website to the app, or vice versa, without losing context, making it considerably easier to assess combined data for comprehensive insights.

Additional Advantages of Google Analytics 4 (GA4)

Google highlights several key benefits of GA4, emphasizing its advanced capabilities in understanding user interactions and adapting to the evolving digital landscape:

  1. Enhanced user interaction tracking – GA4 is designed to measure, unify, and de-duplicate user interaction data, offering a clearer picture of the user journey across platforms and devices.
  2. Adaptability to privacy changes – the platform is built to adjust to changing privacy regulations and user expectations, ensuring data collection remains compliant and effective.
  3. Intelligent business insights – GA4 utilizes machine learning to uncover valuable business insights, helping users to understand and predict customer behaviors more accurately.
  4. Actionable data utilization – the system is geared towards helping businesses more effectively act on their data to achieve specific goals and objectives.
  5. Free integration with Google BigQuery – GA4 offers free integration with Google BigQuery, a feature previously available only in the paid GA360 plans with Universal Analytics. This integration is a significant enhancement, allowing for more complex data analysis and processing.

From a practical learning perspective, the most important advantage is that Google has positioned GA4 as the new standard for digital analytics. This means that, inevitably, GA4 will replace Universal Analytics for all digital marketers and measurement teams, making proficiency in GA4 not just beneficial but essential for future success in the field.

What is Google Analytics 4 and how does it differ from Universal Analytics? GA4 vs. Universal Analytics: a side-by-side comparison (part 1)

Universal Analytics saw its sunset in July 2023, paving the way for its successor, Google Analytics 4 (GA4). Although GA4 became the standard for new Google Analytics properties on October 14, 2020, there was no widespread eagerness among marketers and web developers to transition from UA.

GA4 introduces a significantly different operational framework compared to UA, bringing new and improved features but also drawing criticism for certain bugs and for the absence of some popular UA features. This article explores the major distinctions between the two platforms (Universal Analytics and Google Analytics 4) and goes into more detail on the new opportunities GA4 offers.

But first, a little bit of the history of this remarkable web tracking tool.

Brief history of Urchin/Google Analytics

Our story begins in the late 1990s, a time when the internet was rapidly expanding and businesses were just beginning to realize the potential of an online presence.

Urchin began its journey in the realm of web analytics as a product of Urchin Software Corporation, founded in San Diego in 1995. The company, focused on the web statistics and web analytics field, developed Urchin as a software solution to help businesses understand and interpret web traffic data. By 1998, Urchin Software Corporation had emerged as a pioneer in the field of web analytics, and its product, Urchin, was groundbreaking, offering website owners invaluable insights into visitor behavior.

As the internet evolved in the early 2000s, Urchin Software Corporation adapted by introducing a new product, Urchin On Demand. This service marked a significant shift from traditional, software-based analytics to a more accessible, service-based model: users could monitor and analyze their web traffic through a hosted solution, eliminating the need to install complex software on their own servers. This move was pivotal in making web analytics more user-friendly and widely accessible, and Urchin On Demand became one of the early tools available for website traffic analysis.

The potential of Urchin Analytics did not go unnoticed by the tech giant Google. In April 2005, in a move that would significantly shape the future of web analytics, Google acquired Urchin Software Corporation. This acquisition was a strategic step for Google, as it sought to expand its footprint in the world of online analytics and advertising.

The legacy of Urchin Analytics is thus deeply intertwined with the evolution of web analytics as a whole, marking a significant chapter in the history of how businesses understand and interact with their digital audiences.

First (former Urchin) team day at Google, April 21, 2005 after acquisition.


First website tracking software Urchin and its evolution and Urchin Software Corporation story

TL;DR: Urchin Software Corporation, originating in San Diego, CA, was co-founded by Paul Muret, Jack Ancone, Brett Crosby, and Scott Crosby. In April 2005, Google acquired the company, transforming Urchin into “Urchin from Google” and eventually evolving it into Google Analytics. With another anniversary of that acquisition having recently passed, it seemed an opportune moment to document the company’s history for future reference. This account may not captivate those unconnected to its journey; it’s more a personal closing of that chapter.

Perhaps, this story also subtly indicates that success doesn’t always require massive initial funding or rapid growth. Sometimes, a more modest approach with gradual progress can lead to significant achievements.

Founding of Urchin Software Corporation

In the late months of 1995, the seeds of what would become Urchin Software Corp. were sown by two post-college roommates, Paul Muret and Scott Crosby, in the Bay Park neighborhood of San Diego. Paul, who had been working in the Space Physics department at UCSD, stumbled upon the world of HTML 1.0 while uploading the department’s syllabus online. This exposure sparked an idea in him, a vision of a business opportunity to create websites for other businesses.

One evening, Paul returned home, brimming with excitement about this newfound opportunity. To illustrate his point, he showed Scott a simple website he had created for UCSD, featuring bright blue text on a grey background, with some of the text possibly even blinking in an early web aesthetic. Convinced by Paul’s enthusiasm and the potential in this nascent internet era, the two embarked on drafting a business plan.

Their plan, a blueprint for a venture into the digital frontier, was presented to Scott’s uncle, Chuck Scott. Chuck, a figure of financial means, saw promise in the young entrepreneurs’ vision. He agreed to invest $10,000 in their new company and even provided them with a small desk space in a corner of his office at C.B.S. Scientific. Little did he know, it would be a considerable time before this investment bore fruit, marking the humble beginnings of a journey that would significantly impact the digital analytics world.

In the wake of receiving financial backing from Chuck Scott, the fledgling company embarked on its journey by purchasing a Sun SPARC 20 for server duties and securing an ISDN line, a significant expense at the time. The office computers were interconnected using 10base2 networking, a system that relied on coaxial cables with twist-lock fittings, reminiscent of TV cables but now seen as antiquated.

The first Urchin webserver, running at 50 MHz, cost about $3,200 in 1995 dollars, roughly a third of the total capital Urchin Software Corporation had raised at the time.

Paul and Scott, the duo behind this venture, began the arduous task of customer acquisition. Their clientele grew gradually, mostly comprising small businesses that paid a modest monthly fee. Among their early clients were Cinemagic, a vintage movie poster company run by Herb and Roberta, and ReVest, a financial startup. The owner of ReVest was notably averse to using email, leading to website edits being communicated through lengthy thermal-transfer faxes that unspooled across the office floor each morning. Another notable client was a lesser-known division of Pioneer Electronics, specializing in the production of LaserDiscs, a format already considered archaic at the time.

Buoyed by these early successes, the company leased office space in a modest brownish-green building located in the faux-historic, theme park-like area of Old Town, San Diego, not far from Rockin’ Baja Lobster. The office could accommodate up to four desks, five if the vestibule was counted, possibly intended for a secretary. In 1997, the company welcomed a new member, Brett Crosby, Scott’s younger brother, marking a turning point as the business began to gain momentum. They managed to secure contracts with two of the larger local employers: Sharp Healthcare, a hospital system, and Solar Turbines, a power generation subsidiary of Caterpillar.

Despite these significant contracts, the company still catered to numerous small clients, hosting their websites on a single web server and charging a recurring fee. To accurately bill for bandwidth usage—a costly resource at the time—Paul developed a simple log analyzer. This tool not only tallied bytes transferred but also provided a user-friendly web interface, tracking referrers, “hits”, and pageviews. This innovation laid the groundwork for the first version of Urchin. After further enhancements, including date-range features and user authentication, Urchin was showcased to customers, receiving generally favorable feedback.

This period was a critical one for the company; the deal with Solar Turbines alone, bringing in $10,000 per month, played a vital role in keeping the business afloat for over a year. This early phase of struggle and gradual success was the foundation upon which the future of web analytics was built.

In 1997, the team behind what would become Urchin Software Corp embarked on their first-ever trade show adventure. In a creative twist, they borrowed giant blue light boxes from an underwear startup. These boxes, made of 1-inch thick particle board, were notably heavy and cumbersome, adding a unique challenge to their trade show debut.

To add a bit of flair to their booth, they enlisted the help of friends who, intrigued by the novelty of an internet trade show, volunteered to assist for the day. These friends, playfully referred to as “booth babes,” brought lively energy to the booth. However, the long hours and bustling environment of the trade show proved to be more demanding than anticipated. As a result, their initial enthusiasm vanished (probably), and they decided not to volunteer for such events again :-). This first trade show experience was a mix of improvisation, camaraderie, and learning, marking a memorable step in the company’s early journey.

In the late 1990s, a pivotal moment arose for the company that would later become Urchin Software Corp, thanks to a connection through Brett Crosby’s girlfriend, Julie, who worked in the advertising and web development industry. Julie was employed by Rubin Postaer Interactive (RPI), a company that still exists today, a subsidiary of the prominent Los Angeles-based RPA (Rubin Postaer and Associates), which managed the Honda.com account. It was discovered that Honda.com, then using WebTrends for web analytics, struggled with processing their daily Apache access logs within a single day, leading to a backlog.

Seizing this opportunity, the team managed to acquire a few days’ worth of server logs from Honda.com to process as a demonstration. Impressively, they completed the task in approximately 30 minutes, a feat that led to them becoming the web analytics solution for American Honda. This success marked a turning point, indicating the potential to build a business around Urchin’s log processing technology.

Around the same time, Jack Ancone joined the team as the CFO and relocated to San Diego. The company then moved into an office at 2165 India St.

The Urchin Software Company moved into an office at 2165 India St. This is how it looked.


In the early days of 1998, a significant milestone was reached for the team behind Urchin Software Corp. They celebrated their first sale of the “Pro” version of Urchin, priced at $199. This moment marked a turning point, prompting a strategic shift in their business model. They decided to focus solely on software, divesting themselves of their hosting and web development services. These segments were handed over, without any financial gain, to a local web development shop. This bold move transformed Urchin company into a pure software company, a transition met with enthusiastic high-fives all around.

To support this new direction, the Urchin Software Company needed additional funding. They tapped into their family networks and collaborated with a boutique venture capital firm, Green Thumb Capital from New York City, brought in by Jack. This effort successfully raised $1 million, increasing their total external capital to approximately $1.25 million. Despite future attempts, this would be the last of their fundraising, except for a manageable debt of around $400,000, which was later repaid with interest and warrants. Green Thumb Capital, to their credit, never pressured them for returns, likely as surprised as anyone when Google later acquired Urchin.

During the late 1990s, as they navigated the challenges of selling enterprise software, the team opted for an unconventional, advertising-based strategy to capture market share. In an era where internet companies were often valued by the number of “eyeballs” they attracted, they released Urchin ASAP, a free version of their software supported by banner ads, alongside Urchin ISP. Both versions were tailored for hosting operations. The Urchin Software Company team thought they could make a significant fraction of a cent per click on these ads, on top of some infinitesimal CPM. Although the banner ads didn’t generate significant revenue, they did succeed in gaining valuable exposure.

One of the Urchin ASAP banner ads, which advertised itself when no one else wanted the space

This approach, combined with the quality of their software, set the stage for their first major breakthrough in the industry.

In the old days of the internet, a time when Tumblr and Blogger were still on the horizon and Geocities was a household name, there existed a platform known as Nettaxi. This relatively obscure service claimed to host a staggering 100,000 “sites,” a figure that intrigued the creators of Urchin. Seeing Nettaxi as a potential goldmine for user engagement, they struck a unique deal: Urchin’s sophisticated web analytics tools would be offered to Nettaxi completely free of charge, in exchange for the ad revenue generated from the traffic of all these sites. The financial outcome of this arrangement? A few cents, if that. The real value, however, was that Urchin could now claim to be servicing 100,000 sites, a statistic that significantly boosted their market presence.

In a move reminiscent of Google’s playful logo variations, the Urchin team introduced their own creative twist: the “Urchin of the Day.” This quirky feature, which involved changing the Urchin logo in the interface’s upper-left corner, was more than just a fun gimmick. It was an attempt to forge a closer bond with their user base. Whether it achieved this goal is debatable, but it certainly kept its designer, Jason Collins, busy and entertained for a considerable period. His creations, ranging from whimsical to downright hilarious, are still remembered fondly. Among these was a special design by Shepard Fairey, later famous for his iconic “Obama Hope” poster, who contributed a “Power to the People” version of the Urchin logo.

Urchin of the Day – created by Jason Collins (designer/graphic) – his creations weren’t just static images; they were dynamic, animated GIFs that brought a sense of life and movement to the company’s interface. With each new design, Jason’s talent shone, blending his love for cars with his flair for digital artistry, making the “Urchin of the Day” a much-anticipated reveal among the team and clients alike.

In the late 1990s, a small, ambitious team led by Brett Crosby, the VP of Sales and Marketing at Urchin Software Corp., was on a mission to elevate their latest creation, Urchin 2.0, into the spotlight. Their target? Earthlink – a giant in the internet service provider industry, nearly rivaling AOL in size and influence. The challenge, however, was making contact with someone influential within such a colossal organization. Undeterred, Brett resorted to the simplest yet most persistent method: repeatedly submitting inquiries through Earthlink’s web form, a testament to determination over sophistication.

“We had no idea how to reach anyone important at a place like that, so Brett did the natural thing and filled out a web form. Again, and again, and again. He must have submitted that thing 20 or 30 times. Finally, he got a response. Rob Maupin, VP of hosting (or something similar) agreed to a meeting. We were stunned.”

After what seemed like an endless stream of attempts, their persistence paid off. Rob Maupin, a high-ranking executive at Earthlink, responded, agreeing to a meeting.

So they were about to pitch to one of the internet’s behemoths. They drove to Pasadena in their best vehicle, Brett’s old but reliable Mercedes, bought for this purpose for $4,000, while Scott Crosby, another key figure in the company, stayed at the office, partly out of managerial duty, partly out of sheer intimidation.

The meeting with Rob Maupin was a reality check. He bluntly criticized the Urchin 2.0 interface for its overwhelming use of blue, a fair point that the team had to concede.

Too blue, blue, blue: the Urchin 2 version, which was used in the presentation for Earthlink.


Despite this, Maupin saw potential in Urchin’s speed and efficiency, crucial factors for web hosting services then (and still today). After accommodating Earthlink’s requests for modifications, Urchin struck a deal that would become a cornerstone of their success: $4,000 per month for unlimited use of Urchin software across all Earthlink-hosted websites.

By 2001, the company had evolved into Urchin Software Corporation, and it was time to seek additional funding. The process of pitching to venture capitalists was grueling and distracting, but eventually, they managed to secure commitments from two reputable firms. The funding, however, was scheduled to be finalized on September 12th, 2001 – a day after the world-changing events of 9/11. The aftermath of the tragedy put the investment on hold indefinitely.

Having already expanded in anticipation of the $7 million investment, Urchin found itself in a precarious financial position. They had to make drastic cuts, including laying off 12 employees and giving up office space, a day they somberly referred to as Black Friday. Facing a dire cash flow crisis, they had no choice but to seek loans from benefactors Chuck Scott and Jerry Navarra, who provided the much-needed funds in exchange for interest and warrants. This period marked a challenging phase for Urchin, with drastic cost reductions and employees voluntarily taking significant pay cuts to keep the company afloat. Despite the hardships, the team’s resilience and dedication kept Urchin alive, even when hope seemed fleeting.

The business was divided into three main areas: web development, hosting, and software development, following a strategy of diversification.

In January 1998, the company celebrated its first sale of the “Pro” version of Urchin for $199. This milestone was followed by a strategic shift to focus solely on software development, leading to the abrupt discontinuation of hosting and web development services. The company transitioned into a pure software company, a decision marked by a symbolic high-five.

To support this new direction, the team raised $1 million through family connections and a boutique venture capital firm, Green Thumb Capital of NYC, bringing their total outside capital to approximately $1.25 million. Despite later attempts, this would be the last of their fundraising, except for a small amount of debt financing.

In an effort to capture market share in the late 1990s, the company adopted an advertising-based approach, releasing Urchin ASAP, a free counterpart to Urchin ISP, both aimed at hosting operations. The plan was to generate revenue through banner ads displayed at the top of each page, capitalizing on the era’s emphasis on “eyeballs” over profitability. This strategy did not yield significant financial returns, but it did provide valuable exposure and helped establish the software’s reputation in the market, laying the groundwork for the company’s first major breakthrough in web analytics.

In the early 2000s, the tech industry was still reeling from the aftermath of the dot-com bubble burst. Revenue streams were inconsistent, and growth was slower than anticipated.

The company’s primary revenue source had been substantial annual licensing deals, often involving lengthy and complex negotiations. One of their most significant contracts, exceeding $1 million, was secured by Jack Ancone with Cable & Wireless, a major player in the global telecommunications and hosting industry. However, despite these promising deals with companies like Winstar, KeyBridge, and Worldport, payments often fell through as these seemingly resource-rich companies faced their own financial limitations.

To invigorate sales and streamline processes, Urchin made a strategic decision to simplify their enterprise deals with hosting companies. This new approach, although less lucrative in the short term, was based on the modest deal they had previously struck with Earthlink. The new Site License Model (SLM) was straightforward: hosting companies would pay $5,000 per month for each physical data center, receiving unlimited access to Urchin software under a simple, one-page contract with no legalese and nothing really to negotiate. This model quickly gained popularity, attracting major hosting companies in the US and Europe, including Rackspace, Everyone’s Internet (aka EV1 Servers), The Planet, Media Temple, and many others.

By the fall of 2003, these deals had propelled Urchin into a cashflow-positive position. They were also successfully selling individual licenses to self-hosted organizations, including Fortune 500 companies and numerous university systems.

The sales team, having been significantly reduced during the company’s financial struggles in 2001, was small but mighty. Paul Botto, Nikki Morrissey, and Megan Cash, who had worked without pay during the toughest times, played a crucial role in Urchin’s recovery. Their efforts, combined with a new commission model that offered low base pay but high commission rates, led to a significant boost in sales.

Paul and Megan eventually joined Google, while Nikki chose a different path.

Urchin 4 had an easter egg that no one ever found. If you clicked a random “rivet” in the sexy brushed aluminum interface, you’d be treated to a photo of the illustrious Urchin dev team: Doug Silver, Nathan Moon, Paul, Jonathon Vance, Rolf Schreiber, and Jim Napier. Most of these guys were still at Google as of August 2016.

Urchin’s international expansion had its ups and downs, including a failed attempt to establish an office in Tokyo. However, the launch of a channel program, particularly in markets where English wasn’t the primary language, proved to be a wise move. Japan, for instance, became a strong market for Urchin, thanks to the efforts of Jason Senn, who managed the channel program and also took on the role of chief office builder.

Product-wise, Urchin was evolving. If Urchin 2 opened doors and Urchin 3 maintained standards, Urchin 4 was a game changer. It featured a modern, Apple-esque design and introduced the Urchin Traffic Monitor (UTM). The UTM was a pioneering method that combined Apache or IIS log files with cookies, enabling the identification of unique visitors. This hybrid approach of using both log files and cookies set Urchin apart from competitors who relied solely on one method or the other. Urchin’s innovative approach laid the groundwork for more advanced web analytics practices, foreshadowing the capabilities of future tools like Google Analytics.

Paul, looking here like a cartel drug lord, withdrew something like $53,000 in cash for Urchin Software Corporation employees’ Christmas bonuses in 2004. Funny thing is, Google also gave out actual cash bonuses for years after the former Urchin employees joined — millions of dollars in currency. Great minds think alike, I guess. This photo was taken on December 17, 2004.

Once upon a time in the tech world, Urchin Software Corporation released Urchin 4, a product that continued the company’s quirky tradition of supporting an incredibly diverse array of platforms. If you ever stumble upon Google’s Urchin 4 help page, you’ll be amused to see the list of supported operating systems, including the obscure Yellow Dog Linux. The team at Urchin had a vision: they believed that by supporting a wide range of platforms, they might break into major corporations or universities that used less common systems like AIX or HP-UX. However, reality proved different, with most customers opting for the Linux or Windows IIS versions.

The team’s enthusiasm for diverse platforms led them to acquire various servers from eBay, enjoying the challenge of getting Apache and a compiler running on each unique system. They even dabbled with a NeXT version, though they steered clear of DEC after struggling to boot up the machine.

Urchin 4 marked a turning point for the company. It was the first version that they felt could truly compete with any other product in the market, not just in back-end performance. But it was Urchin 5 that took things to a whole new level. It was a powerhouse of a product, albeit a bit overwhelming with its layers of menus and submenus. It was a dream for analytics enthusiasts, packed with features like e-commerce tracking, the Campaign Tracking Module, and multiserver versions. Urchin 6 introduced a groundbreaking feature: individual visitor history drill-down, a capability so sensitive that Google later decided to remove it entirely.

Until Urchin 5, the company had operated on a traditional software licensing model. But by 2004, it was clear that a hosted version was necessary. So, they invested in servers, upgraded their T1 line, and launched Urchin 6, available both on-premises and as a hosted service. This new business model was an instant success, with companies willing to pay for the convenience of not having to manage the software themselves.

By the summer of 2004, Urchin boasted the largest installed base among web analytics vendors, measured by the number of websites using their product. Tradeshows, once a daunting task, became enjoyable events for the team. It was at the Search Engine Strategies 2004 in San Jose that Urchin caught the eye of Google. Wesley Chan, a Product Manager, and David Friedberg from Corporate Development, were on the lookout for a web analytics company. Despite the unconventional approach of Urchin, they saw potential in what they found.

Paul Botto, Scott Crosby, and Brett Crosby, at Search Engine Strategies 2004, San Jose, where they first met with the Google people.

Brett Crosby prepares to “fax” the signed acquisition agreement back to Google. By this time Urchin Software Corporation was sufficiently profitable that it was a tough decision to sell. He signed the actual, final paperwork in a tuxedo about 30 seconds before walking down the aisle at his wedding.

In a tale of ambition and success, a small but innovative company named Urchin Software Corporation found itself at a pivotal moment. Just a few weeks after catching the attention of tech giant Google at a tradeshow, an offer was made to acquire Urchin. This period was marked by interest from various players in the tech world, including WebSideStory, a public company at the time, which even offered a higher bid. However, the Urchin team believed Google was the right choice (and time showed that it was the right decision).

The process of selling Urchin to Google, however, was far from smooth. It was expected to conclude shortly after Google’s IPO in late 2004, but the legal intricacies, particularly around intellectual property and patent risks, made it a nerve-wracking experience. The founders were personally liable for any potential patent infringements, a daunting prospect given their new association with a major player like Google. The deal was finally sealed in April 2005, by which time Google’s stock had doubled, impacting the financials of the deal.

Joining Google in 2005 was a unique experience for the Urchin team. The company, still in its relative youth with around 3,000 employees, had a vibrant culture. Everyone could gather for a single, grand holiday party, and celebrities like MC Hammer were a common sight.

Jack, MC Hammer, and Chris Sacca (2005).


On their first day, the Urchin team met with Eric Schmidt, Google’s CEO at the time. Schmidt immediately recognized the potential of Urchin’s web analytics in relation to Google’s AdWords. He remained a supportive and accessible figure throughout their integration into Google. Brett Crosby, who later became Senior Director of Marketing at Google, even had an office next to Schmidt.

The Urchin team was initially placed in the “fishbowl” of Building 42 at Google’s Mountain View campus, in close proximity to Google’s founders, Larry Page and Sergey Brin. Sergey, known for his eccentricities, had a laser engraver in his office, complete with an air duct for venting gases. They also shared the space with Mike Stoppelman, a new Google engineer whose brother would soon found Yelp.

As Google prepared to launch Google Analytics, the rebranded Urchin product, in 2005, there was apprehension about its reception. Wesley Chan, the Google Product Manager leading the integration, initiated a daily “war room” to ensure the product’s success. The team was given specific objectives and tight timelines to meet. The effort involved educating the rest of Google about the product, with team members touring Google offices nationwide.

When “Urchin from Google” was announced as a free service for any website in the world, the response was overwhelming. The demand was so high that it strained Google’s infrastructure, leading to a temporary shutdown of new signups. This was a problem of success, but it frustrated many. Eventually, signups were reopened using an invitation model, and Google Analytics began its journey to becoming the ubiquitous tool it is known today.

In the grand narrative of tech acquisitions, Google’s purchase of various companies stands out. Among these acquisitions, some, like YouTube and Keyhole (which became Google Earth), soared to great heights. Others, however, like Dodgeball, faded into obscurity, victims of the complex dynamics within a large corporation. This phenomenon, partly due to Google’s acquisition strategy and partly due to the inertia and fog that often accompany big companies, led to many promising ventures dissolving into the corporate ether.

For acquisitions under a certain threshold, rumored to be around $50 million, the decision-making process was startlingly straightforward: a single VP’s approval could seal the deal. However, once these companies were integrated, they often found themselves adrift in the vast sea of Google’s operations. Without a high-level champion or a clear path to significant revenue generation – often benchmarked at around $100 million annually – these acquisitions struggled to maintain their identity and purpose.

Urchin, the company behind what would become Google Analytics, was one of the fortunate few. It found powerful allies in Wesley Chan, a product manager who recognized the need for robust analytics to bolster AdWords, and Eric Schmidt, then CEO of Google. Schmidt quickly grasped how web traffic analysis could enhance AdWords’ effectiveness. A few years later, an internal study at Google, conducted by a team of quantitative analysts, demonstrated a substantial increase in ad spending across a wide range of customers, validating the strategic importance of the acquisition.

As time passed, the original team from Urchin began to disperse within Google. Some left the company, but a significant number remained, continuing to contribute to Google Analytics. Notably, Paul, a key member of the Urchin team, rose to become a senior VP of engineering, overseeing not just Google Analytics but also the display ads segment.

You can read the whole Urchin Software Corporation story directly from Urchin co-founder Scott Crosby (brother of fellow co-founder Brett Crosby): https://urchin.biz/urchin-software-corp-89a1f5292999

Seven Urchin versions

There were 7 different product versions of Urchin (the predecessor of the later well-known Google Analytics/Universal Analytics).

Urchin 1: The early days of a web analytics pioneer

In the mid-1990s, the digital landscape was burgeoning, and amidst this backdrop, Paul Muret and Scott Crosby, fresh out of college, embarked on an entrepreneurial journey. Sharing an apartment in San Diego, they founded a company in 1995 with a vision to create business websites. Their venture was kickstarted with a modest $10,000 seed fund from Scott’s uncle, who also provided them with a workspace in his company, C.B.S. Scientific. This initial investment was channeled into acquiring a Sun SPARC 20 server and renting an expensive ISDN line, a significant step for young entrepreneurs.

Their business began to gain traction, securing clients and generating revenue through monthly fees. This success enabled them to move into their own office space, and in 1997, Scott’s brother, Brett Crosby, joined the team. The company was growing, attracting larger clients, yet all their websites were hosted on a single server, sharing that one ISDN line.

Paul Muret, demonstrating his programming prowess, developed a rudimentary log file analysis system. This system, initially basic, was capable of calculating website traffic and presenting it through a web interface. Gradually, he enhanced the system, adding metrics like pageviews, referrer data, and hits. This evolution marked the birth of Urchin, a simple yet effective tool for log file analysis.

Urchin’s potential was soon recognized when Brett’s girlfriend introduced them to Honda.com. Winning Honda.com as a client was a pivotal moment, as Urchin became their standard web analytics software. This success shaped the company’s future direction. Around this time, Jack Ancone joined the team as the CFO, and the company, then known as “Quantified Systems Inc.,” shifted its focus to encompass web development, hosting, and software development.

The development of Urchin continued, and in January 1998, the first professional version was released, priced at $199. This version marked a significant milestone, and soon after, a strategic decision was made to concentrate solely on software development, moving away from other business areas. This shift necessitated additional funding, and the team successfully raised $1 million to fuel their journey as a dedicated software company.

Web tracking software Urchin 2 and the evolution to Urchin 3

Early version of Urchin 2


 

This is how the Urchin 3 website and admin interface looked at the time.

As Urchin continued to evolve, two distinct versions emerged: the commercial “Urchin ISP” and the free “Urchin ASAP.” The latter was an innovative approach, aiming to generate revenue through advertising banners. This model incorporated both CPM (cost-per-mille) and CPC (cost-per-click) for a banner displayed at the top of the Urchin web interface. Adding a touch of creativity, there was an “Urchin of the day” feature, where the Urchin logo was regularly updated with current, sometimes animated, graphics.

In 1999, Brett Crosby took on the challenge of promoting Urchin 2.0. After considerable effort, he secured a meeting with Rob Maupin from Earthlink. Despite initial reservations about the web interface’s overly blue color scheme, Maupin decided to give Urchin a chance.

This opportunity marked a significant turning point for Urchin. With a few software modifications, Urchin was soon established as the standard web analytics software for all websites hosted by Earthlink. This partnership was not only a testament to Urchin’s growing capabilities but also a lucrative deal, with Earthlink paying $4,000 a month for the service.

By 2001, Urchin had progressed to version 3, reflecting continuous improvements and growing recognition in the field of web analytics. This version marked another step in Urchin’s journey, setting the stage for further advancements and wider adoption.

The company underwent a significant transformation, rebranding itself as the Urchin Software Corporation. During a period of flourishing business, Urchin set its sights on expansion, successfully securing a promising $7 million in funding. However, the tragic events of September 11, 2001, disrupted these investment plans, leading to unforeseen financial challenges. The company had already committed funds based on these pledges, resulting in a liquidity crisis. This difficult phase forced Urchin to make tough decisions, including layoffs and office closures. In a bid to stay afloat, they turned to affluent individuals, notably Chuck Scott and Jerry Navarra, for financial support. From 2001 to 2002, Urchin faced a strenuous period, with exhaustive negotiations and some employees even forgoing their salaries to keep the company running.

During this time, Urchin 3 was offered in various configurations to cater to different business needs:

  • Urchin Dedicated: Designed for a single server hosting up to 25 websites, priced at $495, with each additional batch of 25 websites costing $295.
  • Urchin Enterprise: Aimed at larger operations with 2 servers and up to 25 websites, available for $4,995, and an additional $1,995 for each extra server.
  • Urchin Data Center (on request): This version was provided based on specific client requests.

Ultimately, Urchin decided to streamline its business model, focusing on simpler and more direct business deals, even if it meant earning less revenue. This strategic shift was aimed at stabilizing the company during a challenging period in its history.

Web tracking software – version Urchin 4

Demo for Urchin 4 (available only in Web Archive services): https://web.archive.org/web/20030207025325/http://www.urchin.com/products/tour/

In 2002, Urchin Software Corporation unveiled Urchin 4, a significant upgrade that sported a sleek design reminiscent of Apple’s aluminum aesthetic.

A pivotal innovation in Urchin 4 was the introduction of the “Urchin Traffic Monitor” (UTM). This feature marked a significant advancement by incorporating JavaScript tracking alongside traditional web server log file analysis. The use of browser cookies for tracking and visitor recognition laid the groundwork for what would eventually evolve into Google Analytics. Urchin 4 maintained its versatility, supporting a wide range of operating systems including AIX, FreeBSD, IRIX, Mac OS X, Red Hat Linux, Solaris, and Windows.
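To make the hybrid approach concrete, here is a minimal Python sketch of how a cookie-based visitor ID can be combined with a log-file fallback to count unique visitors. The field names and log structure are hypothetical illustrations, not Urchin’s actual internals.

```python
# Conceptual sketch of hybrid visitor identification: prefer a persistent
# cookie ID when present, and fall back to IP + user agent (the classic
# log-file-only heuristic). Field names here are illustrative only.

def visitor_key(hit: dict) -> str:
    """Return a stable visitor key for one log hit."""
    if hit.get("cookie_id"):
        # A cookie-based ID survives IP changes and shared proxies.
        return "cookie:" + hit["cookie_id"]
    # Log-only fallback: two people behind one proxy may collide here,
    # which is exactly the ambiguity cookies were introduced to resolve.
    return "ipua:" + hit["ip"] + "|" + hit["user_agent"]

def count_unique_visitors(hits: list) -> int:
    return len({visitor_key(h) for h in hits})

hits = [
    {"ip": "10.0.0.1", "user_agent": "Mozilla/4.0", "cookie_id": "abc123"},
    {"ip": "10.0.0.2", "user_agent": "Mozilla/4.0", "cookie_id": "abc123"},  # same person, new IP
    {"ip": "10.0.0.3", "user_agent": "Mozilla/4.0", "cookie_id": None},      # no cookie: fall back
]
print(count_unique_visitors(hits))  # → 2
```

A pure log-file analyzer would have counted three visitors above; the cookie lets the first two hits collapse into one, which is the advantage the UTM approach offered over either method alone.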

Web tracking software – version Urchin 5

Urchin 5 – had e-commerce/”ROI” tracking, the Campaign Tracking Module, and multiserver versions that could all conspire to get the price pretty high.

Following this, Urchin 5 was released, bringing with it notable enhancements such as E-Commerce tracking, campaign tracking, and support for multi-server environments. Scott Crosby, reflecting on this version in 2016, noted, “Urchin 4 was the first release I felt could compete with anyone in terms of back-end performance. But Urchin 5 was superior in every way, and I’m sure thousands of instances still run to this day. If anything, Urchin 5 was just too much of a good thing.”

The pricing structure for Urchin 5 was as follows:

  • Base Module: Priced at $895, it included 100 Profiles (up to 100 sites) and one Log Source for each profile (additional load balancing modules required for more servers).
  • Additional 100 Profiles: Available for $695.
  • Additional Load Balancing Module: Priced at $695, accommodating all profiles for load balancing.
  • Ecommerce Reporting Module: Available for $695.
  • Campaign Tracking Module: Priced at $3,995.
  • Profit Suite: A comprehensive package including Urchin 5, the E-commerce Module, and the Campaign Tracking Module, priced at $4,995.
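As a quick sanity check of the bundle economics implied by these list prices, a few lines of arithmetic show what the Profit Suite saved over buying the modules à la carte:

```python
# Urchin 5 list prices from the table above.
base = 895           # Base Module (100 profiles)
ecommerce = 695      # Ecommerce Reporting Module
campaign = 3995      # Campaign Tracking Module
profit_suite = 4995  # bundle price for all three

a_la_carte = base + ecommerce + campaign
print(a_la_carte)                 # 5585
print(a_la_carte - profit_suite)  # 590 saved by buying the bundle
```

In other words, the Profit Suite effectively discounted the three modules by $590, with the Campaign Tracking Module dominating the total either way.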

Urchin 6

Image Source and Help article to Urchin User Explorer: https://support.google.com/urchin/answer/2633730?hl=en&ref_topic=2633609

Urchin 6 marked the final iteration under the Urchin brand, available through Google or authorized Urchin dealers. This version introduced a notable feature, “Individual Visitor History,” now known as “User Explorer.”

For a single license, Urchin 6 was priced at $2,995, while hosting companies were charged $5,000 monthly per physical data center. Notably, Urchin 6 was the first to offer a cloud solution, priced at $500 per month. This innovation contributed to Urchin becoming the world’s most popular web analytics tool by the summer of 2004, in terms of installation numbers.

During the Search Engine Strategies conference in San Jose in 2004, Google representatives Wesley Chan and David Friedberg encountered Urchin. This meeting led to Google extending an offer to acquire Urchin, despite competing offers, including a higher bid from WebSideStory. The acquisition was finalized in April 2005, a time when Google had about 3,000 employees, relatively small compared to its current size.

Installation guides for Urchin 6, such as the one for Windows, are still accessible online: https://support.google.com/urchin/answer/2591336?hl=de&ref_topic=2591275

Key differences between Urchin 5 and 6 (this was copied from the Google Urchin help):

  • Major Features:
    – Up to 1000 profiles (domains), log sources, e-commerce, and campaign tracking are all included with the base license; no add-on modules
    – Individual visitor-level tracking, including session (path) data
    – Comprehensive SEO/SEM campaign tracking features, 4 goals per profile
    – Rich cross-segmenting available from most reports
    – Full suite of visitor geo-location reports (not just visitor domain)
    – Processing speed roughly on par with Urchin 5 but with much richer reports
  • Platform Support:
    – Broad range of Linux platforms supported with only 2 builds (Linux 2.4 and 2.6 kernels)
    – Added support for FreeBSD 5 and FreeBSD 6
    – Dropped support for MacOS X and Solaris (may be reconsidered if sufficient demand is demonstrated)
  • Installers:
    – Windows installer is now distributed as an MSI package, with better unattended-install support and integration with SMS
  • Configuration:
    – Relational database (MySQL or PostgreSQL) administrative configuration backend
    – Support for configuration database hosted on the remote configuration server
  • Web Server:
    – Upgraded to the latest Apache 1.3.X release
    – OpenSSL and mod-ssl upgraded to the latest versions
    – Removed default modules not used by Urchin
    – Added mod_expires for proper cache control headers
  • Task Scheduler:
    – Scheduler now runs as two processes – master scheduler and slave scheduler
    – Tasks easily managed via scripting interface to back-end configuration DB
  • Visitor Tracking:
    – The old __utm.js tracking javascript has been replaced with a Google Analytics-compatible urchin.js tracking javascript (existing sites will need to upgrade)
  • Log Processing:
    – geodata stored in memory (larger runtime memory footprint)
    – Range of Days feature in log sources allows multi-day search for log files matching a particular date pattern
    – Ability to run profiles entirely in memory
  • Data & Storage:
    – Profile databases now default to 100,000 records/month (instead of 10,000) with the option to increase up to 500,000 records per month
    – Expanded Geodata: full set of geolocation data from Quova, replaces domain-only MaxMind data
    – Monthly table record limit increased from a default of 10,000 to 100,000 records
    – 50 monthly files per profile, now organized by subdirectory
  • Reporting UI:
    – Flash replaces Adobe SVG for rendering graphs & charts
    – Report exporting only in CSV and XML (removed unreliable MS Word/Excel exporting)
    – All Profiles report is defunct
    – New visitor session/path-level reporting capabilities
  • E-commerce:
    – E-commerce transactions can be written directly to webserver logs via special functions in the tracking javascript (identical to GA)
    – External shopping cart logs in ELF2 format are also supported
  • Security:
    – Urchin 6 has gone through thorough quality-assurance testing for cross-site scripting (XSS) and XSRF vulnerabilities

This is Urchin 6. Individual visitor history drill-down — potentially controversial I guess. But at least there wasn’t a “composite sketch” of the visitor. That would have been SO COOL. 

Urchin 7 – groundbreaking ancestor of Google Analytics

Urchin 7 – The new UI looks very similar to later well-known Google Analytics

Urchin 7 marked the final chapter of the Urchin series, now under the Google umbrella and aptly named “Urchin 7 by Google.” This version was made freely available to users, initially through an invitation-based model, leading to a rapid expansion in the use of “Urchin by Google.”

Many of the Urchin employee profiles are also linked in the urchin.biz story mentioned above.

Appendix I: Key personalities connected to Urchin Software Corporation

Numerous members of the original Urchin team continue to make their mark at Google.

Notably, Paul Muret serves as the Vice President of Engineering for Analytics and Display Ads. Other team members have ventured into entrepreneurial roles, founding new companies. Below is a list of the Urchin team members over the years, presented in no particular order, along with links to their subsequent ventures where available.

  1. Paul Muret
  2. Brett Crosby — PeerStreet
  3. Scott Crosby
  4. Jack Ancone
  5. Scott Crosby
  6. Paul Botto
  7. Rolf Schreiber
  8. Jason Senn
  9. Jim Napier
  10. Hui-Sok “Nathan” Moon
  11. Alden DeSoto
  12. Jonathon Vance(s with Wolves)
  13. Doug Silver
  14. Jason Collins
  15. Justin Beope — Upas Street Brewing
  16. Megan Cash
  17. Christian Powell
  18. Nikki Morrissey
  19. Mike Chipman — Actual Metrics (Angelfish product)
  20. Steve Gott
  21. Ted Ryan
  22. Jeromy Henry
  23. Annie Aubrey
  24. Alex Ortiz
  25. Kelley Wilson
  26. Christina Hild
  27. David Cerce
  28. Ryan Walker
  29. Nick Mihailovski
  30. Bill Rhodes
  31. Jason Chen
  32. Juba Smith
  33. Bret Aarons
  34. Merrick
  35. Bart Fromm
  36. Chi Kwan
  37. Ed Schwartz
  38. Andy Smith
  39. Ed Petersen
  40. Cindy Lee
  41. Davee Schultie
  42. Joanna Rocchio
  43. Ben Norton

In the next article we will take a closer look at the shift from Urchin to Google Analytics/Universal Analytics and on to Google Analytics 4, including a description and comparison of what is new in GA4.

Churn rate

The churn rate is a vital metric for every app. Discover what the churn rate means, how to compute it, and why it serves as a crucial Key Performance Indicator (KPI).

The churn rate is the percentage of users who have stopped using an app. It can be users who stopped using the app altogether or those who uninstalled it. The choice between these definitions depends on the app’s type and goals. For instance, the customer churn rate focuses on users who have discontinued using the app’s products or services, which is valuable for subscription-based apps.

Essentially, the churn rate counts how many users leave the app within a specific time period.

Why is the churn rate important?

A high churn rate may indicate that you’re investing in user acquisition but not maximizing the return on investment. Identifying the reasons for churn allows you to enhance user retention and subsequently increase revenue. Some mobile app sectors anticipate a high churn rate, like hyper-casual gaming, which considers it part of their business strategy.

How to calculate the churn rate?

To calculate the churn rate for your app, decide whether you want to measure inactive users, uninstalls, or subscription cancellations. Then, select your time frame, such as annual or monthly churn rate. In-app event tracking helps pinpoint when users tend to churn, aiding in determining the most relevant measurement period and identifying areas for intervention to reduce churn.

Churn rate formula

Churn rate (%) = (users lost during the period ÷ users at the start of the period) × 100
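As a small worked example of the standard churn calculation (users lost in a period divided by users at the start of the period, expressed as a percentage):

```python
# Simple churn rate: users lost during the period divided by
# users at the start of the period, expressed as a percentage.

def churn_rate(users_at_start: int, users_lost: int) -> float:
    return users_lost / users_at_start * 100

# Example: 2,000 active users at the start of the month; 150 of them
# have churned (gone inactive, uninstalled, or cancelled) by month end.
print(churn_rate(2000, 150))  # → 7.5
```

So an app that starts the month with 2,000 users and loses 150 of them has a monthly churn rate of 7.5%. The same formula works for any period and any churn definition (inactivity, uninstalls, or cancellations), as long as both numbers use the same definition.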

What is a good churn rate?

Sometimes, a seemingly negative KPI can actually be a positive outcome. A negative churn rate (negative net revenue churn) means that the additional revenue gained from existing customers, through upgrades or expanded usage, exceeded the revenue lost from users who stopped using the app.

However, it’s important to recognize that churn is a natural part of any business. Some users may not like an app, find a better alternative, or simply no longer require its features. Nonetheless, churn rate remains a vital KPI for app developers to consider.

By analyzing the churn rate, app developers can determine if changes are needed to enhance customer retention. This may involve improving the user experience, optimizing features, or adjusting prices. The benchmarks for churn rate can vary depending on your app’s industry, location, and platform.

It’s crucial to understand that when a user churns, it doesn’t mean they’re lost forever. There are various strategies to re-engage these users, and you can explore deeper use cases for using churn rate as a KPI in our user lifecycle guide. Additionally, you can learn more about achieving a good retention rate and discover ten strategies for improving your app’s user retention.