Heap

Definition and overview of Heap integration

The Heap integration allows you to seamlessly export Statsig events to your Heap app. This powerful connection enables you to track and analyze user behavior across feature gates and experiment group assignments.

When the Statsig client SDK is initialized, it automatically sends events to Heap, providing valuable insights into how users interact with your application. Additionally, the integration forwards other custom events logged through Statsig, ensuring a comprehensive view of user activity.

By leveraging the Heap integration, you can:

  • Gain visibility into feature adoption and usage patterns

  • Analyze the impact of experiments on user behavior

  • Segment users based on their exposure to specific features or experiment variants

This integration empowers you to make data-driven decisions and optimize your product experience by combining the strengths of Statsig's experimentation platform with Heap's robust analytics capabilities.

Event types and formatting

Statsig supports two main types of events: client SDK initialize events and exposures/custom events. Client SDK initialize events are sent to your Heap app whenever the Statsig client SDK is initialized, providing information about the user's feature gate and experiment assignments.

Exposures and custom events are forwarded by Statsig to Heap as they are received from SDKs, integrations, or the HTTP API. These events are sent in batches using a JSON format, with each event containing detailed information about the event type, user, timestamp, and relevant metadata.

The event structure includes fields such as eventName, user, userID, timestamp, value, metadata, statsigMetadata, timeUUID, and unitID. This standardized format allows for easy integration and analysis of the event data within Heap.
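To make the schema concrete, here is an illustrative sketch of a single forwarded event in a batch. Only the field names come from the structure described above; every value (and the `statsig::gate_exposure` event name) is a hypothetical placeholder, not confirmed output:

```python
import json

# Hypothetical example of one Statsig event as it might appear in a batch.
# Field names follow the schema described above; all values are made up.
sample_event = {
    "eventName": "statsig::gate_exposure",  # assumed exposure event name
    "user": {"userID": "user-123", "email": "user@example.com"},
    "userID": "user-123",
    "timestamp": 1700000000000,  # epoch milliseconds
    "value": None,
    "metadata": {"gate": "new_checkout_flow", "gateValue": "true"},
    "statsigMetadata": {"sdkType": "js-client", "sdkVersion": "4.0.0"},
    "timeUUID": "123e4567-e89b-12d3-a456-426614174000",
    "unitID": "user-123",
}

# Events are delivered in batches as JSON.
batch = json.dumps({"events": [sample_event]})
decoded = json.loads(batch)
print(decoded["events"][0]["eventName"])
```

Because the format is plain JSON, the same fields are available for filtering and segmentation once the events land in Heap.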

Some key event types include:

  • Custom events (logEvent): User-defined events with custom metadata

  • Feature gate exposures (checkGate): Information about a user's exposure to a specific feature gate

  • Dynamic config exposures (getConfig): Details about a user's exposure to a dynamic config

  • Experiment exposures (getExperiment): Data on a user's assignment to a specific experiment variant
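Once these events reach Heap, you can tell them apart by `eventName`. Below is a minimal sketch that buckets a batch into Statsig-generated exposures versus custom events; it assumes Statsig exposure events use a `statsig::` prefix, so verify the exact names against your own exported data:

```python
from collections import defaultdict

def bucket_events(events):
    """Group forwarded events into Statsig exposure events vs.
    user-defined custom events for analysis in Heap.

    Assumes Statsig-generated exposure events carry a 'statsig::'
    prefix in eventName; custom logEvent names are app-defined.
    """
    buckets = defaultdict(list)
    for event in events:
        kind = "exposure" if event["eventName"].startswith("statsig::") else "custom"
        buckets[kind].append(event["eventName"])
    return dict(buckets)

# Hypothetical batch mixing the event types listed above.
batch = [
    {"eventName": "statsig::gate_exposure"},    # checkGate
    {"eventName": "statsig::config_exposure"},  # getConfig / getExperiment
    {"eventName": "add_to_cart"},               # custom logEvent
]
print(bucket_events(batch))
# {'exposure': ['statsig::gate_exposure', 'statsig::config_exposure'], 'custom': ['add_to_cart']}
```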

These event types and the standardized JSON format make it straightforward to export your Statsig event data to Heap for further analysis. This integration enables you to gain a deeper understanding of how your feature gates, experiments, and custom events impact user behavior and product performance.


Steps to configure outbound events using Heap App ID

  • Navigate to your Heap Projects page and copy the App ID for your project.

  • Paste the App ID into the App ID input field of the Heap configuration on the Statsig Integrations page.

  • Save your changes to start sending events from Statsig to your configured Heap app.

Best practices for experiment design and monitoring

When designing experiments, start with a clear hypothesis and define the metrics you'll use to validate it. Ensure your sample size is large enough to detect the expected effect. Monitor your experiments closely, especially in the early stages, to catch any unexpected behavior or issues.

Use Statsig's built-in guardrail checks to ensure your experiments are set up correctly before launching. These checks help prevent common pitfalls like underpowered experiments or incorrect metric definitions. Regularly review your experiment results and be prepared to make adjustments if needed.
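To make "a sample size large enough to detect the expected effect" concrete, here is a minimal power calculation for comparing two conversion rates. This is the standard normal-approximation formula for a two-sided test, not something prescribed by Statsig, and the baseline and effect numbers are hypothetical:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(p_base, mde, alpha=0.05, power=0.8):
    """Approximate users needed per variant to detect an absolute lift
    of `mde` over a baseline conversion rate `p_base` (two-sided z-test,
    normal approximation with a pooled-variance estimate)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. ~1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # e.g. ~0.84 for 80% power
    p_avg = p_base + mde / 2
    variance = 2 * p_avg * (1 - p_avg)             # pooled across both groups
    return ceil(variance * (z_alpha + z_beta) ** 2 / mde ** 2)

# Hypothetical: 10% baseline conversion, hoping to detect a 2-point lift.
print(sample_size_per_group(0.10, 0.02))
```

Note how quickly the requirement grows as the detectable effect shrinks; halving the minimum detectable effect roughly quadruples the required sample, which is why underpowered experiments are such a common pitfall.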

Considerations for moving from proof of concept to production

As you transition from a proof of concept to production, consider the following:

  • Ensure your feature flags and experiments are properly documented and communicated to all relevant stakeholders. This includes the purpose, expected outcomes, and any dependencies.

  • Set up a governance process for managing feature flags and experiments in production. This may include requiring approvals for new experiments, establishing guidelines for when to ramp up or down, and defining roles and responsibilities.

  • Continuously monitor the performance and stability of your production environment. Have a plan in place for quickly rolling back or adjusting feature flags and experiments if issues arise. Statsig's change history and rollback capabilities can help streamline this process.

By following these best practices and considering the key factors for moving to production, you can successfully scale your experimentation program with Statsig and Heap. The combination of Statsig's powerful feature flagging and experimentation platform with Heap's robust analytics capabilities enables you to make data-driven decisions and continuously optimize your product.
