We provide SDKs for Android, iOS, and the web. The SDKs automatically enrich the data with platform-specific properties and send it to your Rakam API.
The Rakam API automatically enriches your customer data with location, referrer, and device information.
The SDKs collect built-in properties such as the advertising id on mobile devices or the page URL on websites.
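As a rough sketch of what an SDK sends, the snippet below builds an event that merges user-supplied properties with the context the SDK collects automatically. The field names (`collection`, `properties`, `_url`) are illustrative assumptions, not Rakam's documented schema:

```python
import json

def build_event(collection, properties, context):
    # The SDK merges user-supplied properties with the platform
    # context it collects automatically (e.g. the advertising id
    # on mobile, the page URL on the web).
    event = {
        "collection": collection,
        "properties": {**properties, **context},
    }
    return json.dumps(event)

# A web pageview event; "_url" stands in for the SDK's built-in URL property.
payload = build_event(
    "pageview",
    {"user_id": "u-42"},
    {"_url": "https://example.com/pricing"},
)
```

The resulting JSON is what would be posted to your Rakam API endpoint, which then performs the server-side enrichment described above.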
You can ingest the data into either your PostgreSQL database or a Snowflake cluster, which scales horizontally with ease. Then run SQL queries on your event data using the schema sent from your SDKs.
It's the easiest way to start using Rakam. The API server enriches and sanitizes the data and inserts it into PostgreSQL in real time. We make use of partitioned tables and BRIN indexes in order to leverage PostgreSQL's analytical features. This deployment type scales up to roughly 100M events per month, since PostgreSQL is not horizontally scalable.
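To make the PostgreSQL layout concrete, here is a sketch of the kind of DDL this describes: a table range-partitioned by event time with a BRIN index on the timestamp column. The table and column names are illustrative assumptions, not Rakam's actual schema:

```python
# Illustrative DDL only; table/column names are assumptions.
create_table = """
CREATE TABLE events (
    collection text        NOT NULL,
    _time      timestamptz NOT NULL,
    properties jsonb
) PARTITION BY RANGE (_time);
"""

# One partition per month keeps scans and retention cheap.
create_partition = """
CREATE TABLE events_2024_01 PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
"""

# BRIN indexes store one small summary per block range, so they
# stay tiny and work well when events arrive in timestamp order.
create_index = """
CREATE INDEX events_time_brin ON events USING brin (_time);
"""
```

Because analytical queries typically filter on a time range, the planner can prune whole partitions and use the BRIN summaries to skip most of the remaining blocks.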
The Snowflake-based solution makes use of a distributed commit log (we support Kafka, Kinesis, and Google Pub/Sub) and a distributed storage system (we support S3 and Google Cloud Storage) for storing raw event data, plus a Snowflake cluster if you need SQL access. Since it's horizontally scalable, it's the big-data solution for workloads of more than 100M events per month.
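The flow above can be sketched as a toy pipeline: events are appended to a commit log, then flushed in batches to object storage as newline-delimited JSON that Snowflake can later load. The in-memory stand-ins below are assumptions for illustration; a real deployment would use a Kafka/Kinesis/Pub/Sub client and an S3/GCS client instead:

```python
import json

class CommitLog:
    """In-memory stand-in for a distributed commit log."""
    def __init__(self):
        self.records = []

    def append(self, event):
        self.records.append(json.dumps(event).encode())

def flush_to_storage(log, storage, key):
    # Write one object per batch of raw events; Snowflake can
    # later bulk-load these files for SQL access.
    storage[key] = b"\n".join(log.records)
    log.records.clear()

log = CommitLog()
log.append({"collection": "pageview", "_time": "2024-01-01T00:00:00Z"})
log.append({"collection": "click", "_time": "2024-01-01T00:00:05Z"})

storage = {}  # stand-in for an S3/GCS bucket
flush_to_storage(log, storage, "events/2024/01/01/batch-0.json")
```

Keeping raw events in cheap object storage and loading them into Snowflake on demand is what lets each layer of the pipeline scale out independently.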
We believe in using the right tool for the job. Here is a comparison matrix of similar solutions:
We're in favor of ELT (transforming the data inside the data warehouse) since it's easier to maintain.
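A small sketch of what "transform in the warehouse" means in practice: raw events are loaded as-is, and aggregates are defined as SQL on top of them. The table and view names below are illustrative assumptions:

```python
# ELT in a nutshell: load raw events first, transform with SQL
# inside the warehouse afterwards. Names are illustrative.
transform_sql = """
CREATE VIEW daily_pageviews AS
SELECT date_trunc('day', _time) AS day,
       count(*)                 AS views
FROM raw_events
WHERE collection = 'pageview'
GROUP BY 1;
"""
```

Because the transformation lives in the warehouse as SQL, changing it means redefining a view rather than redeploying an ingestion pipeline, which is what makes ELT easier to maintain.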