Rakam is a modular analytics platform that gives you a set of features to create your own analytics service.
A typical workflow with Rakam:
- Collect data from multiple sources: trackers, client libraries, webhooks, scheduled tasks, etc.
- Enrich and sanitize your event data with event mappers.
- Store events in a data warehouse (PostgreSQL, HDFS, S3, etc.) for later analysis.
- Analyze your event data with your own SQL queries and the integrated analytics APIs (funnel, retention, real-time reports, event streams).
- Build custom dashboards and real-time reports with Rakam BI.
- Develop your own modules to customize Rakam for your needs.
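The collection step above boils down to sending event JSON over HTTP. The sketch below assembles such a payload; the field names (`collection`, `api`, `properties`) and the endpoint path in the comment follow Rakam's general event shape but should be treated as assumptions, not a spec.

```python
import json

def build_event(collection, write_key, properties):
    """Assemble a single-event payload for an HTTP collect call.

    The field names here are assumptions based on Rakam's event JSON
    shape; check your deployment's API docs for the exact schema.
    """
    return {
        "collection": collection,
        "api": {"api_key": write_key},
        "properties": properties,
    }

event = build_event(
    "pageview",                      # target event collection
    "WRITE_KEY",                     # placeholder credential
    {"user_id": "u-42", "url": "/pricing"},
)

payload = json.dumps(event)
# A real client library or tracker would POST this, e.g.:
#   requests.post("https://<your-rakam-host>/event/collect", data=payload)
print(payload)
```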
All these features ship in a single package: you specify the modules you want in a configuration file (config.properties) and Rakam does the rest for you.
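For illustration, a config.properties might enable a storage backend and a couple of modules like this; the exact property keys vary by module and version, so treat these names as examples and consult each module's documentation.

```properties
# Illustrative module selection -- key names are examples only.
store.adapter=postgresql
store.adapter.postgresql.url=postgres://user:pass@127.0.0.1:5432/rakam
plugin.user.enabled=true
real-time.enabled=true
```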
We also provide cloud deployment tools for scaling your Rakam cluster easily.
If your event dataset fits on a single server, we recommend the PostgreSQL backend. Rakam collects all your events in row-oriented format on a PostgreSQL node, and every Rakam feature is supported in this deployment type.
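Conceptually, each event collection becomes a table and each event a row, and the analytics APIs generate SQL over those tables. The sketch below demonstrates the idea with SQLite standing in for PostgreSQL; the table layout and the funnel query are illustrative, not Rakam's actual schema or generated SQL.

```python
import sqlite3

# Row-oriented storage model: one table per event collection,
# one row per event. SQLite stands in for PostgreSQL here.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pageview (
        user_id TEXT,
        url     TEXT,
        _time   TEXT
    )
""")
events = [
    ("u-1", "/pricing", "2024-01-01T10:00:00"),
    ("u-1", "/signup",  "2024-01-01T10:05:00"),
    ("u-2", "/pricing", "2024-01-01T11:00:00"),
]
conn.executemany("INSERT INTO pageview VALUES (?, ?, ?)", events)

# A simple two-step funnel (pricing -> signup) expressed as plain SQL,
# the kind of query an analytics API would build for you.
converted = conn.execute("""
    SELECT COUNT(DISTINCT a.user_id)
    FROM pageview a
    JOIN pageview b
      ON a.user_id = b.user_id
     AND a.url = '/pricing'
     AND b.url = '/signup'
     AND b._time > a._time
""").fetchone()[0]
print(converted)  # only u-1 completed both steps
```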
However, Rakam is designed to be highly scalable in order to handle heavy workloads. You can configure Rakam to send events to a distributed commit log such as Apache Kafka or Amazon Kinesis in serialized Apache Avro format, process the data with PrestoDB workers, and store it in a distributed filesystem in a columnar format.
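The scalable pipeline decouples producers from storage: events are serialized onto a commit log, and workers drain it in batches and pivot the rows into columns. The sketch below illustrates that flow only; a plain list stands in for Kafka/Kinesis, JSON stands in for Avro, and a dict of lists stands in for a columnar file format.

```python
import json

commit_log = []  # stand-in for a Kafka topic / Kinesis stream

def produce(event):
    """Serialize an event and append it to the commit log."""
    commit_log.append(json.dumps(event))

def consume_batch(log, columns):
    """Drain the log and pivot row-oriented events into columnar form,
    the way a worker would before writing a columnar file."""
    store = {name: [] for name in columns}
    for raw in log:
        event = json.loads(raw)
        for name in columns:
            store[name].append(event.get(name))
    log.clear()
    return store

produce({"user_id": "u-1", "url": "/pricing"})
produce({"user_id": "u-2", "url": "/signup"})

columnar = consume_batch(commit_log, ["user_id", "url"])
print(columnar["url"])  # each column's values stored contiguously
```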