

Data Analytics

Today we are announcing a beta release of our new streaming data connector between Heroku Postgres and Apache Kafka on Heroku. Heroku runs millions of Postgres services and tens of thousands of Apache Kafka services, and we increasingly see developers choosing to start with Apache Kafka as the foundation of their data architecture. But for those who are Postgres-first, adopting Kafka is challenging without a full app rewrite. Developers want a seamless integration between the two services, and we are delivering it today, at no additional charge, for Heroku Private Spaces and Shield Spaces customers.
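To give a rough sense of the consuming side of such an integration, here is a minimal sketch of a plain Java Kafka consumer reading records from a topic of change events. The broker address, topic name, and consumer group are placeholders chosen for illustration; they are not details of the connector itself.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PostgresChangeConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address, group id, and topic name below are placeholders.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "postgres-change-readers");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("postgres.public.orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Each record carries one change event; here we simply print it.
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}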

Moving beyond Postgres and…

The data we store holds value, but refining data into meaning remains a difficult task. Over the last few months, we've taken a step back to figure out what we can do to help our users cross that divide, and rebuilt Heroku Dataclips from scratch with that goal in mind. The result is an experience that makes accessing and working with your data easier than ever, enabling anyone on your team familiar with SQL to take advantage of your most valuable asset without the need for specialized tools or knowledge of the database.

Dataclips is a flexible, lightweight…

The recent introduction of Platform Events and Change Data Capture (CDC) in Salesforce has launched us into a new age of integration capabilities. Today, it's possible to develop custom apps that respond to activity in Salesforce. Whether you're creating a memorable customer interaction or implementing an internal workflow for employees, consider an event-sourced design to improve responsiveness and durability of the app.

In this article, we'll look at an event-sourced app architecture that consumes the Salesforce Streaming API using the elegant jsforce JavaScript library in a Node app on Heroku.

Streaming with jsforce

Building a SaaS product, a system that handles sensor data from internet-connected thermostats or cars, or an e-commerce store often means handling a large stream of product usage data, or events. Managing event streams lets you see, in near real time, how users are interacting with your SaaS app or the products in your e-commerce store, which lets you spot anomalies and get immediate, data-driven feedback on new features. While this type of stream visualization is useful up to a point, pushing events into a data warehouse lets you ask deeper questions using SQL.
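As a sketch of what capturing such events might look like, the following Java snippet publishes a single product-usage event to Kafka with the standard producer client. The broker address, topic name, and JSON payload are assumptions made for the example, not a prescribed schema.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class UsageEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address and topic name are placeholders for illustration.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // A hypothetical "add to cart" event, keyed by user id so that all
            // events for one user land in the same partition and stay ordered.
            String event = "{\"user\":\"u-42\",\"action\":\"add_to_cart\",\"sku\":\"sku-123\"}";
            producer.send(new ProducerRecord<>("usage-events", "u-42", event));
            producer.flush();
        }
    }
}

From a topic like this, a connector or batch job can load the events into a data warehouse, where they can be queried with SQL.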

In this post, we’ll…

This is the first in a series of blog posts examining the evolution of web app architecture over the past 10 years. This post examines the forces that have driven the architectural changes and presents a high-level view of a new architecture. In future posts, we'll zoom in on the details of specific parts of the system. The standard web application architecture suitable for many organizations has changed drastically in the past 10 years. Back in Heroku's early days in 2008, a standard web application architecture consisted of a web process type to respond to HTTP requests, a database to persist…

Designing scalable, fault-tolerant, and maintainable stream processing systems is not trivial. The Kafka Streams Java library, paired with an Apache Kafka cluster, reduces the amount and complexity of the code you have to write for your stream processing system.

Unlike other stream processing systems, Kafka Streams frees you from having to worry about building and maintaining separate infrastructural dependencies alongside your Kafka clusters. However, you still need to worry about provisioning, orchestrating, and monitoring infrastructure for your Kafka Streams applications.
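As a minimal sketch of what such an application can look like, here is the classic word-count topology written with the Kafka Streams DSL; the broker address and topic names are assumptions for the example.

import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> lines = builder.stream("text-input");  // hypothetical input topic
        KTable<String, Long> counts = lines
            .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
            .groupBy((key, word) -> word)
            .count();  // backed by a state store managed by Kafka Streams
        counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

The count() step is backed by a state store whose changelog lives in Kafka itself, which is why no separate processing cluster or external database is required.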

Heroku makes it easy for you to deploy, run, and scale your Kafka Streams applications by…

Event-driven architectures are on the rise in response to fast-moving data and constellations of inter-connected systems. To support this trend, last year we released Apache Kafka on Heroku: a gracefully integrated, fully managed, and carefully optimized element of Heroku's platform that is the culmination of years of experience running many hundreds of Kafka clusters in production and contributing code to the Kafka ecosystem.

Today, we are excited to announce additional plans and pricing in our Kafka offering in order to make Apache Kafka more accessible, and to better support development, testing, and low volume…

Many of the compelling and engaging application experiences we enjoy every day are powered by event-based systems: requesting a ride and watching its progress, communicating with a friend or large group in real time, or connecting our increasingly intelligent devices to our phones and each other. Behind the scenes, similar architectures let developers connect separate services into single systems, or process huge data streams to generate real-time insights. Together, these event-driven architectures and systems are quickly becoming a powerful complement to the relational database and app server models that have been at the core of Internet applications for…

For almost two years now, the Heroku Dashboard has provided a metrics page to display information about memory usage and CPU load for all of the dynos running an application. Additionally, we've been providing aggregate error metrics, as well as metrics from the Heroku router about incoming requests: average and P95 response time, counts by status, etc.

Almost all of this information is being slurped out of an application's log stream via the Log Runtime Metrics labs feature. For applications that don't have this flag enabled, which is most applications on the platform, the relevant logs are still…

Many of Heroku's internal components make heavy use of logfmt to log information about what's going on in production. The format is hugely valuable in that it allows us to retroactively analyze what happened during any arbitrary request to our components, to query our log traces in very flexible ways, and, combined with Splunk, to easily generate arbitrary metrics on historical data. It's unquestionably been an invaluable tool for fixing countless bugs, tracking down the root cause of many production incidents, and assessing usage in ways that would have been difficult otherwise.
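To illustrate the format itself (rather than Heroku's internal tooling), here is a small Java sketch that renders a map of fields as a logfmt line; the field names and values are made up for the example.

import java.util.LinkedHashMap;
import java.util.Map;

public class LogfmtExample {
    // Render key=value pairs as a single logfmt line, quoting values that contain spaces.
    static String toLogfmt(Map<String, Object> fields) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, Object> e : fields.entrySet()) {
            if (sb.length() > 0) sb.append(' ');
            String value = String.valueOf(e.getValue());
            if (value.contains(" ")) value = "\"" + value + "\"";
            sb.append(e.getKey()).append('=').append(value);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> fields = new LinkedHashMap<>();
        fields.put("at", "info");
        fields.put("method", "GET");
        fields.put("path", "/widgets/42");
        fields.put("status", 200);
        fields.put("elapsed_ms", 13);
        // Prints: at=info method=GET path=/widgets/42 status=200 elapsed_ms=13
        System.out.println(toLogfmt(fields));
    }
}

Because each line is just whitespace-delimited key=value pairs, the same structure is easy to parse back into fields for flexible querying later.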

That said, when viewed in the wrong light,…
