Apache Kafka
- Engineering
- Last Updated: August 15, 2024
- Jonathan Brown
Modern applications have an unceasing buzz of user activity and data flows. Users send a flurry of one-click reactions to social media posts. Wearable tech and other IoT sensors work nonstop to transmit event data from their environments. Meanwhile, customers on e-commerce sites perform shopping cart actions or product searches that have an immediate impact on operations. Today’s software organizations need the ability to process and respond to this rich stream of real-time data.
That’s why they adopt an event-driven architecture (EDA) for their applications.
Long gone are the days of monolithic applications with components tightly coupled…
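As a rough illustration of the event-driven pattern described above, here is a minimal Python sketch (using the kafka-python client) of an application publishing a user-activity event to a Kafka topic. The broker address, topic name, and event fields are placeholders for illustration, not values from the post.

```python
# Minimal sketch: publishing a user "reaction" event to a Kafka topic.
# Broker address, topic name, and event shape are illustrative placeholders.
import json
from datetime import datetime, timezone

from kafka import KafkaProducer  # kafka-python client

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {
    "user_id": "u-123",
    "action": "reaction",
    "target": "post-456",
    "occurred_at": datetime.now(timezone.utc).isoformat(),
}

# Publish the event; downstream consumers react to the stream as it arrives.
producer.send("user-activity", value=event)
producer.flush()
```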
- Engineering
- Last Updated: May 09, 2024
- David Murray
One of our most important goals at Heroku is to be boring. Don’t get us wrong, we certainly hope that you’re excited about the Heroku developer experience — as heavy users of Heroku ourselves, we certainly are! But, even more so, we hope that you don’t have to spend all that much time thinking about Heroku. We want you to be able to spend your time thinking about the awesome, mission-critical things you’re building with Heroku, rather than worrying about the security, reliability, or performance of the underlying infrastructure they run on.
Keeping Heroku “boring” enough to…
- News
- Last Updated: October 08, 2020
- Scott Truitt
This summer, we announced the beta release of our new streaming data connectors between Heroku Postgres and Apache Kafka on Heroku. These connectors make Change Data Capture (CDC) possible on Heroku with minimal effort. Anyone with a Private or Shield Space, as well as a Postgres and an Apache Kafka add-on in that space, can use Streaming Data Connectors today at no additional charge.
Customers use connectors to build streaming data pipelines between Salesforce and external stores like a Snowflake data lake or an AWS Kinesis queue for integration with other data sources. They also refactor…
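As a rough sketch of the consuming side of such a pipeline, the snippet below reads change events from a Kafka topic with the kafka-python client. The topic name and the Debezium-style envelope fields ("op", "before", "after") are assumptions for illustration, not the documented output format of the connectors.

```python
# Minimal sketch: consuming change events that a streaming data connector
# writes to a Kafka topic. The topic name and the Debezium-style envelope
# are assumptions for illustration.
import json

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "connector.public.orders",          # hypothetical CDC topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for record in consumer:
    change = record.value
    op = change.get("op")                       # e.g. "c" create, "u" update, "d" delete
    row = change.get("after") or change.get("before")
    print(f"{op}: {row}")                       # forward to a data lake, queue, etc.
```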
- News
- Last Updated: July 10, 2020
- Scott Truitt
Today we are announcing a beta release of our new streaming data connector between Heroku Postgres and Apache Kafka on Heroku. Heroku runs millions of Postgres services and tens of thousands of Apache Kafka services, and we increasingly see developers choosing to start with Apache Kafka as the foundation of their data architecture. But for those who are Postgres-first, it is challenging to adopt Kafka without a full app rewrite. Developers want a seamless integration between the two services, and we are delivering it today, at no additional charge, for Heroku Private Spaces and Shield Spaces customers.
Moving beyond Postgres and…
- News
- Last Updated: June 11, 2020
- Scott Truitt
We are thrilled to announce that Heroku Shield for Redis is now generally available and certified for handling PHI, PII, and HIPAA-compliant data. Heroku Shield for Redis is the final missing data service for Heroku Shield, which is an integrated set of Heroku services with additional security features needed for building high compliance applications. All Heroku Managed Data Services — Heroku Connect, Heroku Data for Redis, Heroku Postgres, and Apache Kafka on Heroku — are now fully certified for handling PHI, PII, and HIPAA-compliant data as part of Heroku Shield. Security and compliance come standard with Heroku Shield,…
- News
- Last Updated: May 06, 2020
- Scott Truitt
Security is always top of mind for Heroku customers; COVID-19 has further increased the urgency for enterprises and developers to deliver more mission-critical applications with sensitive and regulated data.
Given the needs of our customers, including those in regulated industries like Health & Life Sciences and Financial Services, we are thrilled to announce that Heroku Private Spaces and Shield customers can now deploy a new Postgres, Redis, or Apache Kafka service with a key created and managed in their private AWS KMS account. With BYOK, enterprises gain full data custody and data access control without taking on the…
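As a hedged sketch only: creating a customer-managed key in your own AWS KMS account with boto3 looks roughly like the following. How the resulting key ARN is supplied when provisioning the add-on is not shown here and would follow the product documentation, not this snippet.

```python
# Minimal sketch: creating a customer-managed key in a private AWS KMS account.
# The region and description are illustrative; passing the resulting ARN to the
# add-on provisioning step is an assumption noted in the comment below.
import boto3

kms = boto3.client("kms", region_name="us-east-1")  # region is illustrative

response = kms.create_key(
    Description="BYOK key for a Heroku Private Space data service",
)
key_arn = response["KeyMetadata"]["Arn"]
print(key_arn)  # reference this ARN when creating the Postgres/Redis/Kafka add-on
```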
- News
- Last Updated: May 14, 2024
- Scott Truitt
Today, we’re thrilled to announce four new trusted data integrations that allow data to flow seamlessly and securely between Heroku and external resources in public clouds and private data centers:
Heroku Postgres via mutual TLS
Heroku Postgres via PrivateLink
Apache Kafka on Heroku via PrivateLink
Heroku Redis via PrivateLink
These integrations expand Heroku's security and trust boundary to cover the connections to external resources and the data that passes through them. They enable true multi-cloud app and data architectures and keep developers focused on delivering value versus managing infrastructure. Data is the driving force in modern app development, and…
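To illustrate what a mutual TLS connection might look like from application code, here is a minimal psycopg2 sketch. The host, database, credentials, and certificate paths are placeholders, and the SSL options shown are standard libpq connection parameters rather than anything specific to Heroku.

```python
# Minimal sketch: connecting to a Postgres database over mutual TLS with
# psycopg2. Host, database, user, password, and certificate file paths are
# placeholders; the SSL parameters are standard libpq connection options.
import psycopg2

conn = psycopg2.connect(
    host="example-db.private.example.com",
    dbname="appdb",
    user="app_user",
    password="change-me",
    sslmode="verify-full",        # verify the server certificate and hostname
    sslrootcert="ca.crt",         # CA that signed the server certificate
    sslcert="client.crt",         # client certificate presented to the server
    sslkey="client.key",          # private key for the client certificate
)

with conn.cursor() as cur:
    cur.execute("SELECT 1")
    print(cur.fetchone())
```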
- News
- Last Updated: October 01, 2019
- Scott Truitt
We are thrilled to announce that Apache Kafka on Heroku Shield is now generally available and certified for handling PHI, PII, and HIPAA-compliant data. Our newest managed data service unifies Heroku Shield, a set of Heroku platform services that offer additional security features needed for building high compliance applications, with Apache Kafka on Heroku, our fully-managed service based on the leading open-source solution for handling event streams.
Organizations of all sizes face relentless pressure to bring better apps and experiences to market, and those with a strong focus on data security like Health and Life Sciences (HLS) organizations…
- News
- Last Updated: July 23, 2019
- Scott Truitt
There are many reasons to choose Heroku Data services, but keeping the services you use secure and up-to-date ranks near the top. This foundation of trust is the most important commitment we make to our customers, and frequent and timely maintenances are one way we deliver on this promise.
We do everything we can to minimize downtime, which is typically 10–60 seconds per maintenance. There are ways for you to minimize disruption too (see the tips and tricks below). The rest of the post explains how we think about Heroku Data maintenances, how we perform…
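One common way for Kafka clients to ride out a short maintenance restart is to lean on retries rather than failing fast. The sketch below shows an illustrative kafka-python producer configuration; the specific values are assumptions, not recommendations from the post.

```python
# Minimal sketch: a producer configured to ride out a brief maintenance window
# by retrying instead of failing fast. The values are illustrative only.
from kafka import KafkaProducer  # kafka-python client

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    acks="all",                               # wait for all in-sync replicas
    retries=10,                               # retry transient broker errors
    retry_backoff_ms=1000,                    # pause between retry attempts
    request_timeout_ms=90_000,                # outlast a 10-60 second restart
    max_in_flight_requests_per_connection=1,  # preserve ordering across retries
)

producer.send("events", b"payload")
producer.flush()
```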
- Engineering
- Last Updated: July 11, 2019
- Ali Hamidi
This blog post is adapted from a talk given by Ali Hamidi at Data Council SF ’19 titled “Operating Multi-Tenant Kafka Services for Developers on Heroku.”
https://www.youtube.com/embed/-AtHKoTNR1k
Thousands of developers use Heroku’s Apache Kafka service to process millions of transactions on our platform—and many of them do so through our multi-tenant Kafka service. Operating Kafka clusters at this scale requires careful planning to ensure capacity and uptime across a wide range of customer use cases. With significant automation and test suites, we’re able to do this without a massive operations team.
In this post,…