#bigquery #firebase 

Free Product Analytics with Firebase + BigQuery + Rakam

Many of us landed on Firebase due to its powerful and free services such as Crashlytics, Remote Config, Cloud Messaging, Analytics and many more. When it comes to understanding how your users behave, analytics plays a crucial role. Building a custom data pipeline may cost you dearly in the early stages of your idea. So why not use Firebase Analytics? It is free and has a bunch of built-in reports and a dashboard showing the most critical metrics, like revenue, version adoption, active users and so on.

Hopefully, you crushed those early-stage numbers and reached KPIs you could not even dream of. Now you need to squeeze that data for more insights and understand more of it. Aggregated metrics with a few hours of caching are no longer sufficient. You need answers to your ad hoc questions; complex funnel, retention and segmentation reports, and you need them in real time. Now it is time to set up your brand new data pipeline and pay hundreds, if not thousands, of dollars. Right? Not necessarily.


To access unsampled data, we first have to link our project to BigQuery. Doing that will allow you to query not only Analytics but also A/B Testing, Crashlytics, Predictions and Cloud Messaging data. To link your project, simply follow the steps below:

  • Navigate to the settings page in your project.
  • Click the Integrations tab.
  • Click Link on the BigQuery card.

For more details, visit the Firebase documentation here.

Firebase BigQuery Schema

The Firebase schema on BigQuery has its pros and cons. It is hard to compose queries against, but efficient if you filter on partition suffixes. The canonical form is great for ease of modeling; however, the repeated records holding user attributes and event parameters need an UNNEST operation. Get more info about the schema here. Firebase also collects some well-known events by default, without any additional steps. They may differ from tracker to tracker; for more, check out here.

1. Partitioned Tables

There are two kinds of raw event tables, both stored in a date-sharded table format:

  • Historical tables: prefixed with events_ followed by a date, such as events_20200109
  • Daily table: prefixed with events_intraday_ followed by today's date, such as events_intraday_20200110

The good news is that you can use * to query all the events tables. But keep in mind that filtering on the partition suffix will help you reduce your costs (a lot!).

SELECT * FROM events_* WHERE _TABLE_SUFFIX BETWEEN '20200107' AND '20200109'

Add one more filter, AND event_name = 'in_app_purchase', and there are the revenue events for you!
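
Putting the two filters together, a daily revenue event count might look like the sketch below. The project and dataset names are placeholders; substitute your own (your Analytics dataset starts with analytics_).

```sql
-- Count purchase events per day over a three-day window.
-- Filtering on _TABLE_SUFFIX limits the scan to three daily shards.
SELECT
	event_date,
	COUNT(*) AS purchase_events
FROM `my_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20200107' AND '20200109'
	AND event_name = 'in_app_purchase'
GROUP BY event_date
ORDER BY event_date
```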

Need real-time data together with historical data? Use UNION ALL:

SELECT * FROM events_* WHERE _TABLE_SUFFIX BETWEEN '20200107' AND '20200109'
UNION ALL
SELECT * FROM events_intraday_* WHERE _TABLE_SUFFIX = '20200110'
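
As a more concrete sketch, the same union can feed an aggregate such as daily active users. Table names here are placeholders, and user_pseudo_id is the per-device identifier Firebase exports for each event:

```sql
-- Daily active users across historical shards plus today's intraday shard.
SELECT
	event_date,
	COUNT(DISTINCT user_pseudo_id) AS daily_active_users
FROM (
	SELECT event_date, user_pseudo_id
	FROM `my_project.analytics_123456789.events_*`
	WHERE _TABLE_SUFFIX BETWEEN '20200107' AND '20200109'
	UNION ALL
	SELECT event_date, user_pseudo_id
	FROM `my_project.analytics_123456789.events_intraday_*`
	WHERE _TABLE_SUFFIX = '20200110'
)
GROUP BY event_date
ORDER BY event_date
```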

2. User Properties and Event Parameters

All the common attributes such as device properties and geo information are there for every event, while custom parameters (like product_id in your in_app_purchase event) are stored as nested keys and values. If you're not familiar with BigQuery repeated record types, think of these fields as an array of JSON objects. Let's assume a purchase event where we send product_id and screen_name as event parameters:

		"key": "product_id",
		"value": {
			"int_value": null,
			"float_value": null,
			"double_value": null,
			"string_value": "consumables.oneYear"
		"key": "screen_name",
		"value": {
			"int_value": null,
			"float_value": null,
			"double_value": null,
			"string_value": "promotions.first_open"

How do we filter on the product_id, or even project it?



SELECT
	(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'product_id') AS product_id,
	(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'screen_name') AS screen_name
FROM events_*
WHERE event_name = 'in_app_purchase' AND _TABLE_SUFFIX BETWEEN '20200107' AND '20200109'

NOTE THAT: The same principle applies to user_properties. Keep in mind that this is an expensive operation and will increase your bytes processed.
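
For instance, segmenting events by a user property works the same way as unnesting event parameters. The table name and the plan_type property below are hypothetical placeholders for illustration:

```sql
-- Count events per value of a custom user property.
SELECT
	(SELECT value.string_value FROM UNNEST(user_properties) WHERE key = 'plan_type') AS plan_type,
	COUNT(*) AS events
FROM `my_project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20200107' AND '20200109'
GROUP BY plan_type
```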

Connecting your BigQuery to Rakam-BI

Rakam offers a free service for up to 3 team members, with various bootstrapping options (recipes) for providers such as Firebase, Segment, Tenjin, Rakam-API and many more coming soon. If you're ready to give it a try, start by exploring the demo project.

1. Create a service account

Create a service account for your project and grant it the following permissions:

  • bigquery.datasets.get
  • bigquery.jobs.create
  • bigquery.tables.get
  • bigquery.tables.getData
  • bigquery.tables.list

2. Connecting your BigQuery Datasource

Navigate to Connect new datasource and select Firebase, or click here. Get your project and dataset name (you can find them in the BigQuery console). Your dataset name will start with analytics_.

3. Installing Firebase Recipe

Once you connect your project, you will be directed to the recipe installation page. First choose your database and schema (project and dataset). Firebase datasets start with analytics_ followed by your project identifier. To discover your event parameters and user properties, Rakam will execute two SQL queries on your database. You can review these queries if you like, then click the Run SQL button. Lastly, if you are not sending in-app purchase data to Firebase, toggle Install Revenue Metrics off.

Click the Install button when the queries are finished. Rakam will now turn your events, parameters and user properties into models. This might take a few minutes depending on the number of distinct events and properties you have. Note that both queries scan your data over the last 15 days.

4. Ready to Explore

All your common and custom Firebase events are now modelled. Five minutes of your time now accomplishes what could otherwise take days.


Now you have dashboards based on session and purchase segmentations. Navigate to your boards and explore the dashboards installed by this recipe. We're adding more and more dashboards to the Firebase recipe every week; you can always go to the Recipes page and upgrade.


Explore more of the Segmentation, Funnel and Retention reports in the quick walkthrough video we prepared for you 🚀


Ilker Cam