
How to Sync Google Analytics and Hubspot Custom Events

by Tristan Donnally

Introduction

If your site is not hosted on the HubSpot CMS, or if you would like to use Google Analytics events as hooks for other automations within HubSpot, it may be useful to sync the two. As of this writing there is no easy way to have data flow from Google Analytics into HubSpot, and marketplace integrations only allow egress from HubSpot to Google Analytics. We will discuss at a high level how this can be set up.

Before we continue, it is very important that you have a unique identifier set up upstream within your tags that identifies users via a UID, and that this UID is present in a HubSpot field on each Contact. Without this shared identifier it is impossible to connect the two systems, and this article assumes you have it in place.

Getting Started with BigQuery

The first issue we will run into is that Google Analytics data is aggregated: it does not return data on a per-user basis. This is where BigQuery comes in. BigQuery is another Google product that can sync data from GA4 and expose individual user data. The export can run once a day or be streamed continuously. The setup is straightforward, but it assumes you already have a Google Cloud project with BigQuery enabled. If you do not, you can find docs for the setup here, and the steps are as follows:

Step 1: Create a Google-APIs-Console project and enable BigQuery

Note: You must be an Editor or above to create a Google-APIs-Console project and enable BigQuery.

  1. Log in to the Google Cloud Console.
  2. Create a new Google Cloud Console project or select an existing project.
  3. Navigate to the APIs table. Open the Navigation menu in the top-left corner, click APIs & Services, then click Library.
  4. Activate BigQuery. Under Google Cloud APIs, click BigQuery API. On the following page, click Enable.
  5. If prompted, review and agree to the Terms of Service.
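If you prefer the command line, the same API can be enabled with the gcloud CLI, assuming the CLI is installed and authenticated against the right project (the project ID below is a placeholder):

```shell
# Enable the BigQuery API for your project (replace the placeholder project ID)
gcloud services enable bigquery.googleapis.com --project=my-project-id
```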

Step 2: Link a Google Analytics 4 property to BigQuery

After you complete Step 1, you can enable BigQuery Export from Analytics Admin.

BigQuery Export is subject to the same collection and configuration limits as Google Analytics. If you need higher limits, you can upgrade your property to 360.

  1. In Admin, under Product Links, click BigQuery Links.

    Note: The previous link opens to the last Analytics property you accessed. You must be signed in to a Google Account to open the property. You can change the property using the property selector.

    • You must be an Editor or above at the property level to link an Analytics property to BigQuery.
    • You must also use an email address that has OWNER access to the BigQuery project (view Permissions below for detailed access requirements).
  2. Click Link.

  3. Click Choose a BigQuery project to display a list of projects for which you have access. If you have linked Analytics and Firebase (or plan to), consider exporting to the same Cloud project, which will facilitate easier joins with other Firebase data.

  4. Select a project from the list, then click Confirm.

  5. Select a location for the data. (If your project already has a dataset for the Analytics property, you can't configure this option.)

  6. Click Next.

  7. Select Configure data streams and events to select which data streams to include with the export and specific events to exclude from the export. You can exclude events by either clicking Add to select from a list of existing events or by clicking Specify event by name to choose existing events by name or to specify event names that have yet to be collected on the property.

  8. Click Done.

  9. Select Include advertising identifiers for mobile app streams if you want to include advertising identifiers.

  10. Select either or both a Daily (once a day) or Streaming (continuous) export of data. For Analytics 360 properties, you may also select Fresh Daily.

  11. Click Next.

  12. Review your settings, then click Submit.

Querying BigQuery

After the connection is set up and data has flowed in, we will need to query BigQuery to grab our event data. To do so, first install the google-cloud-bigquery library via the following command:

pip install google-cloud-bigquery

You will also need to authenticate. There are a couple of options when developing locally, but the easiest is to run the following command and log in with an account that has at least the BigQuery User role:

gcloud auth application-default login

After this you are ready to query your tables. Here is an example script that queries all of the unique user IDs from the previous day's export:

from datetime import datetime, timedelta

from google.cloud import bigquery

bq = bigquery.Client()
yesterday = datetime.now() - timedelta(days=1)
# Replace project.dataset with your GA4 export project and dataset
users_table = f"project.dataset.users_{yesterday.strftime('%Y%m%d')}"

query = f"""
SELECT user_id
FROM `{users_table}`
WHERE user_id IS NOT NULL
"""
user_rows = bq.query(query).result()
users = [row["user_id"] for row in user_rows]
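Since every daily export table embeds a date suffix, it can help to centralize the name-building in a small helper. This is just a sketch; the project and dataset names below are placeholders:

```python
from datetime import date


def export_table(project: str, dataset: str, prefix: str, day: date) -> str:
    """Build the fully-qualified name of a GA4 daily export table."""
    return f"{project}.{dataset}.{prefix}_{day.strftime('%Y%m%d')}"


print(export_table("my-project", "analytics_123456", "events", date(2024, 1, 15)))
# → my-project.analytics_123456.events_20240115
```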

You can then take that array and slice it into batches to query the events table like so:

events_table = f"events_{yesterday.strftime('%Y%m%d')}"

batch_size = 50
for i in range(0, len(users), batch_size):
    batch = users[i:i + batch_size]
    user_ids_str = "', '".join(batch)
    
    query = f"""
    SELECT
      user_id,
      event_timestamp,
      event_name,
      event_params
    FROM `project.dataset.{events_table}`
    WHERE user_id IN ('{user_ids_str}')
    ORDER BY user_id, event_timestamp
    """
    event_rows = bq.query(query).result()
    
    for row in event_rows:
        print(row.user_id, row.event_name)

This is a very simplified example, and you will need to decide how to organize your methods and manage the data, but this is the general process.
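One way to organize the batching logic is to pull it out into a small generator. This is a sketch, not tied to any particular API:

```python
from typing import Iterator, List


def batches(items: List[str], size: int) -> Iterator[List[str]]:
    """Yield successive slices of `items`, each at most `size` long."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


print(list(batches(["a", "b", "c", "d", "e"], 2)))
# → [['a', 'b'], ['c', 'd'], ['e']]
```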

Posting to Hubspot

Once we have our event rows, it is time to post the data to HubSpot. This involves searching for Contacts based on our unique IDs, storing those results in a dict, and then referencing them as we upload our custom events. Here is an example of how to call the search endpoint:

from typing import Dict

import requests

url = "https://api.hubapi.com/crm/v3/objects/contacts/search"
headers = {"Authorization": "Bearer YOUR_PRIVATE_APP_TOKEN"}
customer_id_property = "your_unique_id_property"  # the Contact property holding your UID
request_timeout = 30

# notice how we are just taking the first 100 elements of users.
# In practice you will pass these slices as arguments to a search method you create.
customer_ids = users[:100]

payload = {
    "filterGroups": [{
        "filters": [{
            "propertyName": customer_id_property,
            "operator": "IN",
            "values": customer_ids  # using an array lowers the number of API calls
        }]
    }],
    "properties": [customer_id_property],
    "limit": 100
}
r = requests.post(url, json=payload, headers=headers, timeout=request_timeout)

# store in a dict for instant lookups
user_map: Dict[str, str] = {}
for row in r.json().get("results", []):
    props = row.get("properties", {}) or {}
    cid = props.get(customer_id_property)
    if cid:
        user_map[cid] = row["id"]

This gives us a dict with the structure {user_id: contact ID}. As we iterate through the event rows we collected earlier, we can instantly look up contact IDs by our unique user ID.
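One detail worth handling: a GA4 user ID may have no matching Contact (for example, an anonymous visitor), in which case the lookup returns None and the event should be skipped. A quick illustration with made-up IDs:

```python
user_map = {"uid_123": "9876543210"}  # {GA4 user_id: HubSpot contact ID}

print(user_map.get("uid_123"))  # → 9876543210
print(user_map.get("uid_999"))  # → None (no matching Contact; skip this event)
```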

Finally we get to the actual posting of the event info to HubSpot custom events:

# Custom behavioral events are sent via the events API.
HS_EVENTS_URL = "https://api.hubapi.com/events/v3/send"

for row in event_rows:
    contact_id = user_map.get(row.user_id)
    if contact_id is None:
        continue  # no matching Contact in HubSpot; skip this event

    # In the BigQuery export, event_params is a repeated record, returned as dicts
    properties = {}
    for param in row.event_params:
        value = param["value"]
        # note: `or` treats 0 and "" as missing; check `is not None` if that matters
        properties[param["key"]] = (
            value.get("string_value")
            or value.get("int_value")
            or value.get("float_value")
            or value.get("double_value")
        )

    occurred_at = row.event_timestamp // 1000  # GA4 microseconds → epoch milliseconds

    payload = {
        "eventName": row.event_name,
        "objectId": contact_id,
        "occurredAt": str(occurred_at),
        "properties": properties,
    }

    # reuse the same auth headers and timeout as the search request
    r = requests.post(HS_EVENTS_URL, json=payload, headers=headers, timeout=request_timeout)
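Note that GA4's event_timestamp is microseconds since the Unix epoch, while the value posted above is epoch milliseconds. If your HubSpot setup expects an ISO 8601 timestamp for occurredAt instead, the conversion is a small standard-library helper (a sketch):

```python
from datetime import datetime, timezone


def ga4_ts_to_iso(event_timestamp_us: int) -> str:
    """Convert a GA4 event_timestamp (microseconds since epoch) to ISO 8601 UTC."""
    return datetime.fromtimestamp(event_timestamp_us / 1_000_000, tz=timezone.utc).isoformat()


print(ga4_ts_to_iso(1_700_000_000_000_000))
# → 2023-11-14T22:13:20+00:00
```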

And that is that! Your GA4 events will now be available in your Contacts' custom events.
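One last production concern: HubSpot rate-limits its APIs and returns HTTP 429 when you exceed the limits, so it is worth wrapping the posts in a simple retry. A minimal sketch using only the standard library, with `send` standing in for the actual `requests.post` call:

```python
import time


def post_with_retry(send, payload, max_attempts=3, base_delay=1.0):
    """Call send(payload); retry with exponential backoff while it returns HTTP 429."""
    status = None
    for attempt in range(max_attempts):
        status = send(payload)
        if status != 429:
            return status
        time.sleep(base_delay * (2 ** attempt))
    return status


# Example with a fake sender that is rate-limited once, then succeeds:
responses = iter([429, 200])
print(post_with_retry(lambda p: next(responses), {}, base_delay=0.0))
# → 200
```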

Conclusion

Hopefully this helps you create your own BigQuery-to-HubSpot integrations. The next step would be to create automations via HubSpot workflows that use the new event data in sales and marketing efforts. If you have any questions or are interested in having this implemented, feel free to reach out.
