Special Episode: Health Metrics at Scale
Hosted by Trey Ford and Jacob Silzer, with guest Tim Panagos.
The COVID-19 pandemic has forced many industries to rethink how they operate. Amidst those changes, businesses are looking for new ways to keep on top of rapidly changing health guidelines. Microshare is a provider of data-driven solutions for IoT devices. Tim Panagos, their CTO, talks about what they built and what they adapted in order to massively pivot their business to provide solutions that communities desperately need.
Jacob Silzer, Trusted Security Director at Heroku, is co-hosting this episode with Trey Ford, VP of Platform, Trust, and Strategy at Salesforce. They're sitting down with Tim Panagos, CTO of Microshare. Microshare began as a platform to aggregate wireless sensor data from IoT devices. Many of their use cases were for smart offices; for example, seeing how occupied a conference room was, whether a hot desk was open, and when a particular area had been cleaned. For hospitals, their platform monitored physical equipment, such as patient beds and medicine carts. These devices ran on a LoRaWAN network, which doesn't communicate through Wi-Fi, making it ideal for secure locations.
Then, COVID hit. Suddenly, information from these devices, which had been useful from a logistics and compliance standpoint, became extraordinarily important. By tracking the number of people in a room, the number of times an area was cleaned, or the movements of an unknown carrier, whole industries can monitor their overall health efficacy, comparing their performance not only with their past, but also with other similar businesses. Tim gives the example of an airport here: one could potentially see how one's cleaning efficiency compares to that of other airports around the world. Part of the success of their platform is their reliance on the blockchain. Businesses can monitor not only how their data has changed over time, but they can also look back on any block and identify what factors may have contributed to a decrease in performance.
Tim notes that the platform is not bulletproof. For starters, tracking equipment requires people to remember to add trackers; a simple step, to be sure, but one which is frequently forgotten as new crash carts are quickly added to meet increased demand. But more urgently, no one really knows how the future will change in response to the knowledge we currently have. Microshare was only able to pivot because they were wise enough to recognize that their business and their software needed to be flexible. They essentially threw out their 2020 roadmap in order to focus on tracking clean air and open spaces. The best way to move forward is not just to collect more data, but also to add context to it, which will create meaningful value.
Jake: So, hello, today, we are going to talk about Health Metrics at Scale with our special guest, Tim Panagos, CTO of Microshare. I am your host, Jacob Silzer, Trusted Security Director at Heroku. My passion really lies in cybersecurity. I work closely with customers to help educate them on best practices for securing their use of cloud services. Today, I'm also joined by our lively co-host, Trey Ford, VP of Platform, Trust, and Strategy. Trey, would you like to give a quick introduction?
Trey: Thanks, Jake. Hey, my name is Trey Ford. I am VP of Platform and focused on security. I'm a business security leader for Salesforce.
Jake: Tim, as our special guest, would you like to give a quick introduction of yourself and tell us a little bit more about Microshare and what the company mission is?
Tim: Sure. I'm Tim Panagos and I'm the co-founder and CTO for Microshare. And Microshare is now an eight-year-old company, really founded to democratize some of the data-driven technologies that I saw being used in large scale enterprises, trying to drive those technologies in a way that makes them more adaptable for less traditionally served industries.
Tim: Leading up to 2019, we were a seven-year-old startup. And like many startups of that age, we've been through several pivots. The mission that I described, democratizing technologies, was very broad. And we set out to really navigate and find out what people were really interested in, in terms of which technologies they really wanted to adopt in their businesses. So, we went through a number of pivots. But where we netted out was that we have a novel way of collecting and sharing data, hence the name Microshare.
Tim: And we really focused down on IoT information, Internet of Things information, specifically wireless sensor data, as an important new data source that didn't really have a traditional legacy home for that information. So, that's really where we navigated towards. And found as we continued to focus the startup, we really found that it was in the built space, smart buildings, smart spaces, that we really began to find some traction. So, our platform is really about bringing a lot of different wireless data together from a whole massive range of devices.
Tim: And making it so that businesses can use the combined views to make better decisions about how they manage their space. And typically, up to 2019, we were seeing use cases like occupancy, be that desk or be that conference room, knowing how many people are in a space, and whether you can find a flex desk if you're coming into work or what conference rooms might be available at any given time. We were seeing smart cleaning.
Tim: Because even leading up to 2019, we saw facilities managers really always being asked to do more with less relative to keeping spaces clean and comfortable for the occupants, never getting paid more on their leases. So, it's a race to zero there. And things like asset tracking. So, if I'm in a hospital environment, where are the patient beds, where's the medicine cart, those things. Obviously, a lot of variety there in terms of content.
Tim: But ultimately, it's employing the same wireless sensors and bringing them together with dashboards that allow people to operate. So, those were our real big use cases in 2019. And I think it was really teeing up 2020 to be a breakout year for us. Really getting scale, really getting a focus and having some fairly broad industry adoption of those use cases.
Jake: So, 2020 was a very tumultuous year. Obviously, there were a number of things that were unexpected that changed our way of life. Being in Seattle on the west coast, we had things like the West Coast fires. So, I had to work indoors. But also, we had COVID. So, how has Microshare adapted to some of these changes? How has this changed your product roadmap? And how are you helping some of your customers adapt to some of the changes of 2020?
Tim: Yeah, obviously, it's a very disruptive time to be a part of any business, right? But as a startup, I think we had an advantage in that. We were used to pivoting. If we were good at anything, it was being nimble. And so, although it was disruptive, we certainly had some really large sales that got put on the back burner when we came into March, April. And really had to rethink what we're up to. The good news is that the platform was built really to be flexible enough that we could throw new technologies onto it, and very quickly get that data in shape to be able to drive new kinds of decisions.
Tim: So, contact tracing was the immediate need in the marketplace, still is. And we were able to take all the tools and technologies that we were using really for asset tracking, and retool it a little bit to be able to help people use wearables to trace actual contacts and manage the contact tracing in a commercial environment. So, that was really an obsession of ours for the middle six months of this year. And where we sit today, it's awesome.
Tim: But we're a market leader in deploying contact tracing and commercial environments right now. I won't quite say a standstill, but it certainly wasn't a use case that we were contemplating has been a majority of our business and leading into the 2020 year. But that's really where 2020 has taken us. And as we get into 2020, the latter half, it's really about how to fold contact tracing into the wider offering set that we bring together.
Tim: Because I think what the science has said, and what people that we are working with on contact tracing are finding, is contact tracing is really just the start of your data-driven wellness journey, if you want to say it that way. And so, what we're really now seeing is the integration of contact tracing with some of the things that are more traditional for us, like smart cleaning.
Tim: Because visibility of cleaning in an environment really does go a long way to reassuring people even as we find out from the scientists that transmission is less likely through surfaces, it's a very visible activity. And it does put employees' minds at rest when they see people going out of their way to make sure that the built spaces are properly sanitized and cleaned for them. And air quality is really important.
Tim: Because now, we know that it is a respiratory disease, and it's airborne. And as you said, Jake, even air quality from smoke, all of a sudden, is a big issue for some of our customers. So, we folded in air quality as a key component, and occupancy, social distancing. Do we have too many people in a conference room? Do we have people spaced out over a flex space? And asset tracking is important too.
Tim: Can people locate the machinery that they need to operate? And are those machines clusters for contact? Do we need to increase the number of coffee machines or the number of metal punches, or whatever those machines that people tend to count on are? How do you change your operations from an asset perspective to also minimize the risk to the employees that are using those machines?
Tim: So, really, the latter half of 2020 is now about taking all of those data sources, folding them together to give you a really complete picture of what returning to work, and continuing to be productive and thriving in the post-COVID world, looks like. That's really what we're now focused on.
Jake: Let's dive a little bit into some of the technology that you're using as part of these use cases. You mentioned that you're doing contact tracing. I know that you have a range of different types of wearables that you're using. And some of those wearables are using somewhat unique technology. In the news, we hear a lot about people using cell phones for contact tracing, whether it be in Japan, Korea, other countries. But it sounds like you've taken a little bit of a different approach when it comes to the technology supporting your solutions.
Tim: That's right. We've been a member of the LoRa Alliance for a long time. It's an alliance of vendors who are organized around the LoRaWAN spec, which is an open source, I would almost say, wireless technology. So, open source from the perspective that the bandwidth all across the world, though it happens in different wireless bands, is free spectrum. So, anybody can set up a network and use it. That's what LoRaWAN was based on. And it's an LPWAN technology.
Tim: So, it's long range, low power, but also low bandwidth. And it's perfect for simpler low fidelity sensors. Because you can stand up a network, or you can take advantage of an existing public network to put sensors on it, and get data into our cloud solution. So, that was really the founding technology. At the core of it, we don't manufacture devices. We influence the design of them but we are not really a device group. And actually, most of my platform is cloud based.
Tim: So, I really don't even care about the wireless aspects of this. But managing a network at scale turns out to be something that most of our customers need to be done for them. They want to demystify it, or they want to derisk it. They don't want to be involved in most of that. They really just want the insights. They don't care about all that preliminary stuff. So, as we've gone into the market and really figured out what people are looking for, we've had to get deeper and deeper into device selection, wireless management, and things like that.
Tim: So, going into 2019, that was really our strength: using the 500-plus device vendors in the LoRa ecosystem to onboard the best possible measurement telemetry solutions for our customers. And then as we turned to contact tracing, we had this LoRaWAN backbone very much in our wheelhouse. And one of the cool things about that from a wireless perspective is that you can have a network in, let's just say, an office environment from a security perspective, and it doesn't need to touch your office network at all.
Tim: So, there's no Ethernet connections, there's no Wi-Fi connections. This network drops into a space and operates completely outside of the typical operating environment. And that really does put to bed a lot of the security concerns about integrating these devices as potential attack surfaces onto a broader, more attractive operational environment. And as we got into contact tracing, where speed is of the essence, people are looking to get back to work immediately, being able to turn these things around really quickly.
Tim: Because we didn't have to go through all these "what ports do you need to have open?" conversations and all these concerns that you go through in corporate security. And we were able to sidestep a lot of that complication because the networks don't overlap at all. And that, we found really helpful in contact tracing as well. But then, to get to the fine point about contact tracing: there's been a lot about cell phones as the locus of the individual detections, and it works fine.
Tim: But it only works fine if people have smartphones. And what we found really early on is a lot of the people that were coming to us couldn't depend on people having smartphones. Because A, they're either operating with large workforces in non-Western spaces, right? So you can't presume that a factory worker has a smartphone, at least not a secure and current one, right? And B, places where people couldn't bring their smartphones.
Tim: So, sensitive environments, clean rooms, high security installations, where people are not allowed to bring their smartphones even if they have them, what's the solution for that? So, we ended up really looking at wearables as a counterpoint to the cell phone. It's not that I believe you couldn't use mobile as an endpoint. And I think over time, people will realize that ultimately, you want a hybrid.
Tim: You want some smartphone data, you want some wearable data. And I think it will be up to us to fold that stuff in for people as they get into that environment. But at the moment, all of our deployments really are through Bluetooth wearables that connect in a novel way to a LoRaWAN network. And what that really allows people to do is, also from a trust perspective, that people don't have to worry about, "Oh, is my employer-
Trey: Let me pick on you for a minute. Hang on.
Tim: Please step in.
Trey: All right. So, this sounds like you guys practically stepped in it. I think you guys won the lottery. You've got some infrastructure, a business plan where you're tracking resources around an office or a manufacturing facility or even a secured environment. And you've got a network architecture, this LoRa architecture, that's not connected to corporate assets, not internet connected for any other corporate purposes, using a spectrum that's not a threat to Wi-Fi or anything else.
Trey: And we went from tracking resources for cleaning or key assets we need to identify, making sure cleaning crews touched all the right rooms, or people who've been showing up to use their meeting rooms, that sort of thing. And what we had the opportunity to do was pivot beautifully into a space where a commoditized, highly competitive, dumb device bracelet could be slipped on, join this network, track these resources.
Trey: And you're capturing all the telemetry to know who was close to whom and in what windows of time, whether those spaces have been recently cleaned. What's the air circulation? What's the blast radius look like if we had an event? Then, you're getting into more privacy stuff. From a security standpoint, you are articulating and defending this value proposition well: there's no overhead, there's no endpoint concerns.
Trey: There's no patching, there's no device ubiquity concerns. I think that all plays well. But this is just a beautiful go-to-market that was a perfect fit for you in my mind. What major changes did you have to make to your infrastructure to make this move?
Tim: No, I think that's right, Trey. And I did go out of my way to try to build an infrastructure that was adaptable. So, we wanted that right from the get go. Because it wasn't super clear how the market would evolve and what people would really care about. So, that was built into our cloud strategy, our wireless strategy, our device strategy, our business strategy was predicated on that. So, we did have at least the foresight to know that we weren't going to predict the market well, right?
Tim: We knew we weren't smart enough to get it right from the start. So, let's build it to be flexible. And like you said, and the world came our way. And I think that's what startup success, frankly, has always looked like for me is you just got to stick around long enough til your number comes up, right?
Trey: Well, you've also made the right investments coming in, right? In terms of application and data architecture, the way that you set up provisioning. You said this, and this isn't just a buzzword. I'd like for you to unpack this in one of these upcoming sections where you said democratizing this information, the ability for this information to be exposed, to be shared, to be collaborated against without violating some of the privacy or security sensitivities.
Trey: If you're deployed in a cleared facility, or a government facility, you don't want to correlate which human was on which floor. There's ways to easily sanitize that if you architect it that way. But I think you did that from the start.
Tim: Yeah, that's right. I think that was one of the original insights that we had was that the data privacy was going to be a hot topic, and had to be taken head on. And the way I thought about the problem was really, I guess, cliché to say, first principles, right?
Trey: I think it's the right place to start, 100%.
Tim: It is. It's also in vogue to say you do it that way. And I would say maybe we're just too stupid to know what we didn't know, so we started at the basics, right? But the way we thought about it was: what's it like to be an owner of data, a collector of data, a contributor of data in the 21st century? And we found ultimately, a lot of the backdrop assumptions, based on 50 years of technology, were: if I own the database, I own the data.
Tim: And you see a lot of friction over that simple assumption. And I said, "Is that really a baseline assumption? Do we have to make that assumption?" And so, instead of thinking about the world as, "I own the sensor, I own the network, I own the cloud instance, I own the database, therefore, I own the data," which is, I think, the normal enterprise intuition about how that works. And say, "Actually, what if you can't assume any of that stuff, if all of those things are fungible?"
Tim: But you know on the back end that the intent is people want to use data to make better decisions, what do you got to do in the middle to make it so that it's less clumsy privacy-wise, less clumsy from a sharing perspective, and less duplicative? Because if we keep following that stovepipe of the world, I think I use this analogy with you guys earlier.
Tim: But the notion that if there's five different people who are interested in where my dog is at any given time, and I keep bolting a new sensor onto his neck so that everybody can have their own stovepipe view of his location, actually, the dog is going to tip over, and he's not super strong. It's ridiculous, right? You can't really think about the world that way. You can't even really think about a conference room.
Tim: You got all these blinking lights in a conference room of things stuck to a table and up on the ceiling. And people start to get rightly nervous when they see all that stuff. So, there's-
Trey: It's inefficient and wasteful, but I want to zoom back to something else because I think this is fascinating. So, you decomposed something that I think was fashioned for the time when you were originally building up the architecture of your organization, your technology stack. There's this notion that people have data that they own. And then, there's observations about that data. We can talk about data, information, knowledge, the continuum of where you're creating value.
Trey: But when we talk about how data is used to drive value, there's a point where you're custodian of someone else's data. But the observations, the insights, the federation and access of some of those insights, where you're starting to move towards democratization, the technology allows you to collect the data. But the data and the observations against it provide the value. And how are you using this data to drive value? You started talking a little bit about federating that. Can you unpack that a little more for us?
Tim: Our view of how an ecosystem around data should be constructed is think of it as a pyramid, right? Or I guess the analogy of data is oil, right, that's ubiquitous now. But the realization is that oil is actually a fairly useless product. It's just sticky gross stuff, doesn't really burn, raw oil is not super useful. It's the process of refinement that turns it into a whole host of different things that are useful. And maybe then to drop that analogy to be a little more ecologically friendly.
Tim: When you talk about a pyramid of enrichment, at the baseline, you've got all of these things collecting raw data. And in our case, most of that raw data comes from wireless sensors, which is relatively stupid. It's not particularly rife with insight in and of itself. And if I gave it to most of my customers as raw information, most of them wouldn't really know what to do with it because it's fairly low level.
Tim: And I think that's where a lot of IoT projects have come and gone for people is they just assume that devices would be as smart as humans are. I see that the bathroom needs to be cleaned. Well, humans do a lot of intuitive processing that you're not even aware of. And when you put a sensor in, your intuition, I think, is that it will do what the human does quite naturally. But it doesn't know that the bathroom needs to be cleaned.
Tim: All it knows is there's a chemical signal, there's a motion signal. There's only 20 things about the physical world you can actually measure. And it doesn't make any leaps about what those measurements are going to matter to.
Trey: I feel really bad for commuters listening to this and having a mental picture.
Tim: I suppose, I suppose. And the next step up in that pyramid is then to continually combine, refine, and improve the data to begin to generate insights out of that raw stuff. And that can be the realm of AI. It can be the realm of a human analyst. There's a lot of different ways that data can go from raw to useful. But what we have envisioned is this idea that multiple participants in the ecosystem might themselves recombine each layer in the pyramid, and create novel insights.
Tim: Anybody who's doing AI knows that most machine learning wants a lot of rows of data and a lot of columns of data, ideally good quality data. So, you want a lot of features, which means you want to combine a lot of different sensor types. But they have to be correlated together so that they go along with the same row. And that's not trivial. But if you could do that, the insights you can generate would be novel, potentially powerful, potentially predictive, et cetera.
Tim: And we wanted to create an environment where that refinement was fluid. That the end purchaser could buy an insight without having any exposure to the raw data, or the data science that's happening in the middle. And I guess, turtles all the way down, right, the ability for people to jump in and add refinement. And that's where the "share" in the Microshare name really comes from. It's creating that ecosystem where we can get fluid with the exchange so we can add more value to it.
Trey: That makes a lot of sense. One of the things I'm most fascinated about was when we get towards benchmarking, when we start asking sophisticated business level questions: how are we doing in terms of conference room utilization? Or how are we doing in terms of office density, or any of these sorts of patterns? It's one thing to say, "Hey, we've got a data-driven program that's informing how we manage our facilities, how we manage cleaning, how we measure utilization, energy burn, all these sorts of things." Benchmarking becomes fascinating. Can you talk a little bit about that?
Tim: Yeah. So, we have, I think, four pillars that I talked about in our product go-to market. The absolute baseline is collect raw data, right, make that easy. If you can't do that, then all of what I've just said goes out the window. So, we spend a lot of time figuring out how to do that at scale. Which means if you want to deploy 10,000 sensors over a weekend, we're the company you're going to call because I've got them in stock.
Tim: I've got all the installation infrastructure necessary, I can onboard them. So, it takes that out of the science project realm. So, that's table stakes as far as I'm concerned. I think we're still differentiated in the market in that we do that, because not everybody's caught up to it, but that's just baseline, right? And so now, I've got a bunch of raw data, what do I do about it? And then the pillars: I think number one is scoring.
Tim: So, scoring is about aggregating and deriving a useful metric out of the raw data. So, scoring might be like a FICO score for your credit, right, your credit score. Taking a whole bunch of data and then reducing it to a single number that should represent the complexity of all that data in a useful way. Second pillar then is benchmarking, right? So, it helps me to know that my credit score might be a 350 but Jake's is a 750. That's more helpful, right?
Tim: Okay, well, now I can put myself in context of my peers and my potential peers, and now that score gets even more valuable. And then, on top of that is best practice. Because the very next thing I want to know is: what's Jake doing that I'm not doing? How do I improve my credit score? How do I get from 350 to 450, to 550, et cetera? The final pillar, which actually isn't really part of the stack, it sits behind all of these: you can't do any of that without sharing the data in a sensitive way behind the scenes.
Tim: Because in order to benchmark, I've got to implicitly be able to compare my scores with somebody else's scores. And in order to do best practices, I got to be able to see what I'm doing compared to what other people are doing. I think those are the value-add stacks we're trying to make available for all of the use cases we do. Some cases, we do that work. Some cases, our customers do it themselves.
Tim: Many cases, our partners are getting involved to do that data science. But I think that's how I think about the core stack. And benchmarking, I think, is the first thing that people can really latch on to. Because just being able to see: are people, are my peers, doing better at getting employees back to work? Is my cleaning crew doing a good job of cleaning the space that I have, right?
Tim: So, I run an airport, and I can see other airports around the world and compare: how is my cleaning efficiency doing? Is my customer feedback comparable versus the amount that I'm spending? And oh, if it's not, how are those people handling scheduling? How are they handling supplies? How are they doing just-in-time remediation? Then, you get into sharing best practices. But that's how I see it breaking out.
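The scoring and benchmarking pillars Tim describes can be sketched in a few lines of Python. Everything here is hypothetical: the cleaning-score formula and the function names are illustrative assumptions, not Microshare's actual algorithm.

```python
from statistics import mean

def cleaning_score(minutes_since_cleaned: list[float]) -> float:
    """Collapse raw telemetry (minutes since each zone was last cleaned)
    into one 0-100 number, like a credit score for cleaning.
    The formula is purely illustrative."""
    return max(0.0, 100.0 - mean(minutes_since_cleaned) / 10)

def benchmark(my_score: float, peer_scores: list[float]) -> float:
    """Percentile rank of my score among peers: the 'my credit score is
    a 350 but Jake's is a 750' comparison. Note the peers only share
    scores, never the raw data behind them."""
    if not peer_scores:
        return 100.0
    below = sum(1 for s in peer_scores if s < my_score)
    return 100.0 * below / len(peer_scores)
```

The point of the sketch is the layering: raw readings collapse into a score, and only scores cross organizational boundaries for benchmarking, which is how sharing can stay privacy-sensitive.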
Jake: Yeah. So, it sounds like you're really bringing an adult level of sophistication to the data that's being collected by these dumb devices. That's something that you had mentioned before that really resonated with me. With you collecting all of this data, especially at scale, how are you ensuring the integrity of that data and the quality of that data? I'm sure that's not a trivial process to undertake.
Tim: Particularly in IoT, there's a lot of buzz about insecurity and things like that around IoT. And for sure, as you begin to distribute your attack surface across 10,000 random tweeting nodes, that can be a real potential challenge. But what we have begun to do is really stamp metadata into the information that we collect. The metadata that I store about a particular sensor report is massively disproportionate to the data itself. I've stolen a lot of concepts out of blockchain, honestly. I've been gratuitous in my thievery-
Trey: I'm glad it's good for something, that's great.
Tim: It is good. It is good for something. And things like cryptographic signatures are one of the things that we have adapted, so that you can, at any time, look at the checksum of data: you can see how it was initially presented and compare that to what you've got at the back end. And as you aggregate, we keep track of the aggregated checksums. It's not quite blockchain because I'm not using distributed databases and things like that.
Tim: And in the future, maybe we do. If I'm honest, I just don't see it as mature enough today. But the storage mechanism and distribution mechanism barely matters in my model. What's important really is that we know the provenance of the information-
Trey: You've got fidelity on the supply chain. It all makes sense all the way through. I mean, that makes perfect sense. That's really great.
Tim: Absolutely right. If you think about that derivation process of continual refinement, you want to be able to trace from the final insight. Say it's a score on your cleaning, boiled down to a single number. Can I go backwards in time and say, "What was the data that went into that score? And did that data come from known sources? And can I get all the way back to the part numbers and the sources?" People are doing this in manufacturing these days, and it's been a big physical effort.
Tim: And when I looked at this, I said, "Why don't we do this immediately in the digital world because why is this any less important to trace your raw materials all the way back to the supply?" So, we've been spending a lot of time with that. If I'm honest though, most of our customers don't care at the moment. They're not at the level of sophistication where they're worried so much about this. Because just getting the data, the telemetry has been novel.
Tim: And if I'm honest, most of them don't really even fully appreciate what's possible with sharing. It's been our bet that as people in the marketplace get more trained on this, they become more aware. They'll get more sophisticated. And they'll start asking questions like, "Hey, wait a minute, how are you managing the provenance of this data?" We'll be like, "Hey, no problem. Here it is. We've been doing this all along."
Tim: It's a big bet because there's a fair amount of our engineering that's gone into these things that are below the perceptive level of our market. But I think at the end, it is what will differentiate us. And when I say we do things at scale, it's not just 10,000 devices, it's 10,000 reliable devices, and the insights that come from it have all of this hidden complexity that adds, hopefully, just that level of trust that people will need to make decisions on this information as well.
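The provenance scheme Tim describes (a cryptographic digest per sensor report, plus running checksums over the aggregate) might look roughly like this minimal sketch. The class and field names are assumptions for illustration, not Microshare's implementation.

```python
import hashlib
import json

def checksum(payload: dict) -> str:
    """Deterministic SHA-256 digest of a sensor report."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

class ProvenanceLog:
    """Append-only log where each record carries its own digest plus a
    running checksum chained from the previous record, so tampering with
    any earlier report is detectable later."""

    def __init__(self):
        self.records = []

    def append(self, payload: dict) -> None:
        prev = self.records[-1]["chain"] if self.records else "0" * 64
        digest = checksum(payload)
        chain = hashlib.sha256((prev + digest).encode()).hexdigest()
        self.records.append({"payload": payload, "digest": digest, "chain": chain})

    def verify(self) -> bool:
        """Walk the log and confirm every payload still matches its digest
        and the chain of checksums is unbroken."""
        prev = "0" * 64
        for rec in self.records:
            if checksum(rec["payload"]) != rec["digest"]:
                return False
            if rec["chain"] != hashlib.sha256((prev + rec["digest"]).encode()).hexdigest():
                return False
            prev = rec["chain"]
        return True
```

As Tim notes, this isn't blockchain proper (no distributed consensus), but it gives the same property that matters here: you can trace a derived insight back through verifiably unmodified source reports.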
Jake: Yeah. And I think it's interesting: if you're capturing data and how that data has changed over time, there may be future use cases that your customers haven't thought of yet. And then they can go back and look at that data and make important business decisions. So, you really are uniquely placed to capture all of this different information that companies can't traditionally track and store.
Tim: Absolutely, Jake. And I think if you go back to first principles, and this won't be shocking to your audience, right, because we probably have all lived it our entire careers, it comes down to the intrinsic value of data. If data weren't valuable, we wouldn't worry about cybersecurity, right? Throwing away data is a bad idea. But accumulating data is a risk. How do you balance those things? And how do you go back?
Tim: Five years from now, you'll look back. How do you know that that data at that time was high quality? And what was the context of the data? Context is everything. Because without context, the raw data is useless. So, how do you develop robust ways of maintaining that tagged context so that you can look back over time and get value out of it?
Trey: That's the fascinating thing. So, a moment ago, you made a comment about capturing metadata about this data. Of course, it's flowing in, and you're enriching it on some level. And pairing that with your, I'm going to call it, zero-to-one comment that customers don't yet care about that next level of how or why. I find on customer journeys that customer questions improve.
Trey: And that varies wildly based upon timeframe, or time in seat, or general organizational maturity. Then, you run into some of these really large institutions that have unbelievably good questions. They don't show up with a 200-question questionnaire; they show up with three questions. And they're the hard, hard questions that really inform where you're at. And when you start looking at observations, what's my cleaning team's "credit score"?
Trey: What is the amalgamated magical output number that scores them? How did you arrive at that? How do you enrich that? How does that compare to elsewhere? It's more than just a raw data point. There's a lot of context that you're going to have to find a way to marry up across all these institutions.
Tim: Yeah, that's right. So, I would say that the biggest value we add in what we call the smart network actually is not the network, it's the smart. And what that really boils down to is metadata attribution. It's very important to be able to, at any point in time, know what the context was. So, for instance, we don't store just the raw telemetry, where you'd then have to look up the tags that belong to it, because tags change over time and devices get replaced over time.
Tim: And if you try to do that as a reference table problem, it really quickly gets out of control. So, what you've got to make sure is that every piece of data carries along with it enough context that as the world changes after its initial collection, it always has what it needs to be able to tell you what it meant. And we see things like asset tracking, for instance, where people will put a tag on a crash cart in a hospital. And then, they'll track that around and get a sense of where it goes.
Tim: And then, three weeks later, they'll put the same tag on a wheelchair and get a sense of where all their wheelchairs end up. And that's cool. But if you looked at the data later and assumed it was on a wheelchair, when the initial behavior you saw was as a crash cart, it would look very different. So, how do you future-proof that by continuing to grab the right context and move it through? And actually, that's a fairly simple thing.
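The pattern Tim describes, denormalizing context onto each reading at ingest so the record stays self-describing even after tags are re-deployed, could be sketched like this. This is a minimal illustration only; the field names, registry, and crash-cart example are invented, and Microshare's actual data model isn't described in the episode:

```python
def enrich_reading(raw_reading, device_registry):
    """Attach a snapshot of the device's current context to the reading
    itself, so later tag changes can't rewrite what it meant."""
    context = device_registry[raw_reading["device_id"]]
    return {
        **raw_reading,
        "context": dict(context),  # a copy, not a reference to the live registry
        "context_captured_at": raw_reading["timestamp"],
    }

# Hypothetical scenario: the same physical tag moves from a crash cart
# to a wheelchair three weeks later.
registry = {"tag-42": {"asset_type": "crash_cart", "site": "hospital-1"}}
r1 = enrich_reading({"device_id": "tag-42", "timestamp": 1000, "rssi": -70}, registry)

registry["tag-42"]["asset_type"] = "wheelchair"  # tag is re-deployed
r2 = enrich_reading({"device_id": "tag-42", "timestamp": 2000, "rssi": -65}, registry)

assert r1["context"]["asset_type"] == "crash_cart"  # old reading keeps its old meaning
assert r2["context"]["asset_type"] == "wheelchair"
```

The alternative, a reference table joined at query time, is exactly the approach Tim warns "really quickly gets out of control" once tags and devices churn.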
Tim: The question is, what if I got the context wrong? So, this is a real problem we see, right? Like, "I'm going to tag it this way." And then, six months later, I realized I should have tagged it that way. What do I do now? I want to update tags. Okay, now, we've got a problem, right? Now, we've got data-
Trey: You have to unwind the blockchain at that point, right? It's all about unwinding the blockchain.
Tim: That's exactly what it's like, Trey. And that's one of the reasons why we can't really blockchain it, because you do have to allow some of the data to be updated in retrospect. But you have to be careful about that, and you have to keep the fingerprints of that change, because what if they made a mistake in the mistake?
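One hedged way to keep "fingerprints" of retrospective corrections without a full blockchain is an append-only correction log, where each entry hashes its predecessor and nothing is ever overwritten. The sketch below is purely illustrative (the class, fields, and hash-chaining scheme are invented, not Microshare's implementation):

```python
import hashlib
import json

def fingerprint(record):
    """Stable hash of a correction record (sorted keys for determinism)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class CorrectionLog:
    """Append-only log of tag corrections. Old values are never erased,
    so a mistake in the mistake is itself traceable."""
    def __init__(self):
        self.entries = []

    def correct(self, reading_id, field, old, new, reason):
        prev = self.entries[-1]["hash"] if self.entries else None
        record = {"reading_id": reading_id, "field": field,
                  "old": old, "new": new, "reason": reason, "prev": prev}
        record["hash"] = fingerprint(record)
        self.entries.append(record)
        return record

    def current_value(self, reading_id, field, original):
        """Replay corrections in order to get today's view of a field."""
        value = original
        for e in self.entries:
            if e["reading_id"] == reading_id and e["field"] == field:
                value = e["new"]
        return value

log = CorrectionLog()
log.correct("r-1", "asset_type", "crash_cart", "wheelchair", "mis-tagged at install")
log.correct("r-1", "asset_type", "wheelchair", "gurney", "mistake in the mistake")
assert log.current_value("r-1", "asset_type", "crash_cart") == "gurney"
assert len(log.entries) == 2  # both corrections preserved, neither overwritten
```

Unlike a blockchain, the log permits the *effective* value to change, while the chained hashes still make any tampering with the correction history detectable.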
Trey: Right. And I think this is that expensive wisdom, right? This is why you want to partner with someone. One of the things we run into with a lot of companies, and not a week goes by that I'm not told, "Hey, I built my own Heroku." And that's fun, because everyone's building their own stack and trying to do the right thing. But how does it scale? Where are the more expensive wisdom moments where we've got to solve this problem durably?
Trey: What are the got yous along the way? And this is something you will have solved many times over. And this is where I love partnering with firms that have thought this through have bled in those problem sets. I don't want to reinvent that. I don't want to learn those lessons. This is the good stuff.
Tim: Yeah. One of my favorite Tommy Boy quotes I have to get sophisticated with you guys is I could get a good look at a steak by putting my head at the cow's ass but I'd rather trust my butcher. Right? So, I'm trying to be the trusted butcher. Because most of my clients don't have the time, honestly, they got day jobs, they're under pressure. And they trust me to know what makes a good steak, in this case, how to know that data is secure, how to know that it has high provenance, how to know that I'm maximizing its value over time.
Tim: Even if they're not so worried about that yet because they just came to the counter to buy some meat, later when they go to grill this meat, and they'll find that I've done all the things I need to do to make it the best experience. Because what they care about is eating steak. They actually don't care about anything in the middle. And if that analogy also is not vegan friendly, what's another one? Ultimately, we have to be good custodians of the data.
Tim: Which also means not just doing the minimum, but anticipating whatever might conceivably happen with the data and trying to handle it today, even though it's expensive and it's tricky. Because we want them to have a good experience on the back end and not regret it.
Jake: So, I think that's a great segue. You're collecting all types of data. Much of that data is very sensitive, maybe personal information, even protected health information. So, I'd love to hear a little bit about how you approach security and, to some extent, compliance for all of the sensitive data that you're collecting.
Tim: Yeah. So, this is an ongoing journey, I think. Obviously, from a startup's perspective, doing things like ISO 27001 is a major goal of ours out ahead. The big boys have done SOC 2 and everything else, but it's massively expensive. And I find a lot of it is about getting your paperwork right. So, I don't want to pooh-pooh that, because operational discipline and process is really important to me. But you also have to escalate into that.
Tim: And so, we're on our journey from treating security as just a daily engineering task to now having the belt and suspenders of not only thinking about it when we engineer, but also having that triple check that goes along with more mature processes and documentation practices. And we're right in the middle of that. We just did the Cyber Essentials and GDPR certifications in Europe. That was our first official third-party certification.
Tim: So, we completed that in August. And those are little brothers to things like SOC 2, right? And we host everything through other infrastructure, right? So, our infrastructure providers, typically AWS and Azure, obviously, those guys are managing and maintaining much more sophisticated levels of compliance. And we know that we can stand on some of that. We can stand on their backs with some of that.
Tim: But as we evolve, we really want to have the tools and processes in place that allow us to stand up as well to those auditing requirements. And we're trying to do it a little ahead of what our customers are asking for. But Trey, I think as you pointed out, not all customers are created equal. We've got some that are super sophisticated, and really know their business. And yeah, we get the 300-page questionnaires that we have to fill out but I'm not entirely sure anybody reads when I submit.
Tim: But we're doing our really damnedest to answer those questions really well, and to understand what's underneath them, what the intent of them is. Because it all boils down to what do we need to do to be good custodians of the information entrusted with it?
Jake: And would you say that it's predominantly, I guess, regulations and customers that are driving your security compliance program? Or is it fear? Or is it something else?
Tim: I think as an entrepreneur, you have to be mostly fearless. So, dumb and fearless, I think, is the recipe that I've applied. But customers, I think, bring the fear for you, right? And sometimes it's well-defined fear, right? They've got their 300-page questionnaire. They've got the three boiled-down really good questions, and sometimes not. So, you have to be able to blind them with science a little bit.
Tim: You're like, "Look, I thought this problem through. Here's the white paper. Here's the complete questionnaire. Here are all the questions you should have asked me. And we've answered." That gives them a way to say, "Well, I'll buy this. I don't have time to go through it all. But I'm convinced that you've at least thought about it." And that generally gets us through to the next round. But I would say we're really customer driven on it.
Tim: And the challenge is always to stay just a step ahead so that we can keep our prices low. Because price sensitivity is a thing in the marketplace, for sure. Democratizing means not charging what Accenture would charge for such a project, with teams dedicated to continually answering cybersecurity questions. So, keeping my costs aligned but still being trusted, that's the edge we're walking on a daily basis, trying to just stay ahead of those needs.
Tim: So, it is absolutely, as CTO, a daily concern of mine: where are we with this, and what's the next level of sophistication, the next steps towards the next plateau? And regulation isn't insubstantial either. We do a lot of business in Europe, so GDPR is really important. But I think what people fail to realize is that there's a lot of wiggle room in GDPR. And probably, from my perspective, there's too much wiggle room.
Tim: That people feel a little more comfortable because you say GDPR, then they should. And I think that's true of things like the California Consumer Data Protection as well and bills that are modeled after those. They're a first step but I don't think they go far enough. And so, yeah, we want to be compliant with the regulation. But I don't want to stop there. Because I actually think we can do better.
Jake: I find the same thing as well. I find that a lot of these different regulations and compliance programs, they still leave a lot of wiggle room. And we're all trying to constantly catch up and address the security compliance privacy needs of our customers and the industry as a whole. And Trey, I know you have lots of thoughts on security and compliance.
Jake: Having worked with Trey, one thing he really shaped my view on is that compliance is really a byproduct of engineering excellence. That's something that's really stuck with me over the years. So, Trey, do you want to talk a little bit about your thoughts in the space?
Trey: Oh, dear. I'll make it a short one, yeah. What strikes me is that security should be the byproduct, again, of excellence in engineering, and compliance is basically evidence of having run that program well. I do take issue with the notion of "getting your paperwork right." But if you did build this from the ground up with that in mind, that's effectively what it is: making sure everything's documented correctly. What I love from your belt-and-suspenders reference is that you're bringing a third party in to check your work.
Trey: They're challenging your thinking. They're double-checking your assumptions. They're pushing against your implementation to make sure that you did effectively achieve those outcomes. GDPR, even PCI, they feel really loose when you look at them. Working against that as an implementer is scary, because we want clarity. But what they're trying to do is define outcomes. PCI was wise ahead of its time when it started pushing on the intent behind the control.
Trey: And so, when you look at the letter and intent, what they're trying to do is achieve a certain outcome. They're managing against a certain loss. GDPR is going to be the same way. I think the hardest part with that is it's so new and it's so foreign. And what we're getting into now is data sovereignty. And so, we think of ourselves as custodians, the additional layers of controls that follow what the different countries are going to want.
Trey: And now, I think we're seeing a game of I think it's political control. It's a power play. California likes to be the first on the map, and we're going to see a bunch of other states following. Now, what I think we need to be pushing towards is more of a global approach to this. This is something that I would love to see Salesforce get involved in. But ultimately, what we're looking at here is we have users that use our technologies that may or may not be willing victims, if you will.
Trey: Folks wearing your sensors didn't talk to you. They didn't opt into this. And so, what can you find out about their lives? What story can you tell? Stepping up, the customers that you have that I have, when we think about how we serve them, how do we enable them to defend themselves against us as a platform? And so, that gets into new concepts. Tokenization is not brand new, but in terms of implementation against a privacy standard, or custodianship standard, it's a great approach.
Trey: And these technologies are very new. And they come at a high engineering cost because you're introducing a lot of things that don't necessarily unlock new markets, new capabilities, new features, new sales. But back to Jake's core principle that it was pointing me at, I found that building the security program at Heroku, we were focused on serving the engineers. There are really bright people working through complex problems and failure modes and solution sets when we talk about expensive wisdom.
Trey: The question is, how do we serve them to achieve an outcome that creates safety, durability, and regulatory compliance on the outcome? Ultimately, that partnership's a beautiful thing. And I find, again, going all the way back to the first principles, how do we achieve it? What do we need to think about as we achieve it? That's a big one.
Tim: One of the challenges I have in many cases is the letter. Let's take my average customer and their 200-page security questionnaire. It's usually written with a very different organization in mind. It usually isn't written with cloud in mind, or with a modern, distributed organization in mind; it has more of an organizational focus. So, the intent is absolutely critical. I find myself trying to read through to the intent in some very pointed, precise questions, right?
Tim: There are some very precise questions that are nonsensical in my context. I, nevertheless, have to answer. Any advice there that you can offer?
Trey: I do have a lot of thoughts there. So, first and foremost, the spirit and intent of a control or regulation, I think, is virtuous. And if we focus on that, this is where you get into trouble, whether you've selected an auditor or had one forced on you. None of these regulations, except perhaps the CSA or really modern stuff, think about shared responsibility models. To your point, you work on AWS, you work on Azure. All right, think of your own database. Did you lock it down? Is it exposed?
Trey: How do you manage authentication? How are you logging? You pull that string. Now, you've had a body of common controls. How many regulations does it take to screw in a light bulb, I mean, to define a secure password? As it turns out, all of them. So, this is something that I've really tried to empower Jake with. Jake was my partner in responding to a lot of questionnaires inbound from customers that want to do the right thing.
Trey: And they've all got a unique questionnaire. So, the Cloud Security Alliance has something we call the CAIQ. It's a consensus questionnaire that gives a common body and a crosswalk through the CCM. A crosswalk of we want to talk about passwords, here's everybody in their dog that major standards globally that talk about password controls, this is how they map. And so, that turns into a fairly uniform singular way to do this.
Trey: You have to start assessment or someone check your work on that. You've at least got a starting place for that conversation. That should be confirming you're roughly tall enough to ride that ride. But it's one audit, one questionnaire and one starting place for a conversation.
Tim: That's great. I'm glad you said that, because that's what we're doing. We're using the CSA. And you added value, Trey, because I didn't know it was pronounced "cake." I can't tell you how much time organizationally we have wasted by saying C-A-I-Q; that is money in my pocket right there, the time saved. So, that's awesome. But yeah, I think we've tried to do that too, because we do see that the CSA is coming at it from a slightly more modern, less enterprise, but still robust perspective.
Tim: And the traceability back to all these other standards really does help align intent. So, we've been pretty happy with that. And what we've tried to do is be proactive with our customers. Because frankly, the business is usually our point of contact with sales, right? It's not generally IT that come shopping for what we offer. But we try to involve IT really early on, even though the business often resists that.
Tim: And one of the ways we try to involve them is to say, "Look, here, we have this robust questionnaire and a white paper that goes along with it. And can you please immediately go contact your IT department. If you have a CSO, or something like that, great, forward it on to them." And at least let them know that you're considering this. And we want to try to be proactive and say, "Look, here's the way we think about this problem."
Tim: "And if you still want to send me your questionnaire, I'll do my best to fill it out." But in the end, I hope to head those things off by being proactive and saying, "Look, let's engage in proactively that conversation about security." So, you're not like catching up later. Because I think that's the flaw in the process is, "Hey, we bought this thing, go take a look at it."
Jake: Yeah. And I think you're taking the right approach there, being proactive, right? The first thing I do when I get a questionnaire with like 3,000 questions on it is immediately schedule a meeting with the customer, just to make sure we're speaking the same language. What is the delineation of responsibility? Who's responsible for what? And we've actually created an extension to the CSA CAIQ. We call them our Security and Compliance Runbooks.
Jake: But that really walks the customer through what are these major responsibilities that they have when using the Heroku platform. And we found that a great way to bridge some of that gap and to quickly move on to the providing the customer value in helping them understand what they can do to make sure that their usage of the service is secure.
Tim: Yeah. That resonates a lot, Jake, because I think responsibility is often at the core of the confusion, right? I get a lot of questions about how we manage the admin passwords. And in the end, I shouldn't know what your admin password is because you should be the admin for some of these things. So, making it clear like this isn't a managed service, per se, although it is a SaaS offering, right?
Tim: So, you really got to get into the weeds there to talk about, "Well, obviously, I have a cloud admin, and how do I manage his password, but that's not specific to your deployment. And in the end, your deployment audit follow your password policies. But I don't know what they are. I just want to support them." So, there's so many layers to a cloud level accountability that's probably a helpful thing to do.
Jake: Well, I think we're heading towards the end here. I wanted to give you both an opportunity to share some closing thoughts. Trey, you go first, so that Tim can close us out.
Trey: What's fascinating about this is that this isn't a conversation about the actual IoT devices. This is about enriching and understanding not the data but the information, the context, and creating value that's meaningful on a personal level, on an organizational level, on an ecosystem level. I'm encouraged by the thinking going on. And I'm super excited about the pivot that you all were able to make in response to this.
Trey: This seems like a very natural logical extension. And I think this is also an encouragement for our listeners to think about building some of these capabilities from a security and privacy standpoint as they're designing and building their infrastructures.
Jake: So, Tim, do you have any closing thoughts as well that you'd like to leave us with?
Tim: Yeah. In the end, I think one of the things that I'm trying to move the market towards, and as you mentioned, Jake, being a good custodian of that personal information isn't just keeping it secure. Because I think the industry has this knee-jerk reaction to lock down. And in a time like this, the power of the data is in opening it up. But as you said earlier as well, we're flirting with health data here. We try to be clear about that line and not go over it.
Tim: But let's be honest, when lives are at stake, it's not a moral line that you can defend, even if it is a legal one. And so, we do need to be really careful with the information we have. But part of that care is to recognize that putting the data back into the hands of the people who can make best use of it, particularly in the context of COVID, giving the data back to the people who are going to be able to protect their coworkers and their friends and their family, because you've not locked down the data, you've taken the exact opposite approach to privacy, and opened it back up to them.
Tim: And giving them their own control of what real consent means. Giving them visibility, giving them control, giving them agency over the data and what it means. That's the opposite of locking it down. I think that's the wrong reaction. Although, it's the simple first reaction to minimizing the risk. In our case, I think the greatest risk is that we overlock this data. We don't exploit it to its full potential for making people's lives better. And I think that's part of the current moment. But I think it extends beyond as well.
Tim: Because as I see it, being data-driven isn't going to go away. And it shouldn't go away. Because individually and societally, it makes sense for people to run businesses better, to operate governments better, to make better individual choices. And as humans with monkey brains, I think having data at our fingertips that's carefully curated is what's going to be able to allow us to collaborate globally to make better use of resources to optimize human well-being and comfort.
Tim: All of those things that we struggle with society right now, I think in the end, I'm not saying that data is a panacea, but it's, I think, really critical recipe in the medicine that we all need to take. And if we lock it down too much, I'm just afraid it's going to stay out of the light. And that's where bad things happen, right? It's in the dark. So, as much as I want to be secure and everything else, our privacy stance is really to say privacy comes through transparency, and not through obfuscation.
Tim: And that is also about making the data more available to the end users. You have to put it back in the power of the people who are contributing the data. And that's sometimes controversial stance, but I think that's the path of true maturity. You go through cloud as an initial step and you go back to mobile, and that's taking it, centralizing it, then decentralizing it, and then decentralizing it some more. And Trey, you mentioned earlier, what does it mean to democratize? That's what it means. It's not trusting Big Brother with our data.
Tim: It's trusting each other with our data. And how do you do that sensibly? Right? You still have to be good custodians of that data. You still have to trust your platform. But yeah, that's the journey we're on. And again, I'm not sure the market is ready for it. But as a startup, if you're not going to do something different, what keeps you up in the morning, right? It isn't more IoT sensors in my case, it's more power to the people by giving them good insights to make decisions on.
Jake: Yeah, I couldn't agree more. And transparency is key to trust. So, I think you hit on all the right points there. Well, that's all the time that we have today. I would like to thank our special guests, Tim Panagos for an insightful and topical conversation about health metrics at scale. And I'd also like to thank our special co-host, Trey Ford, for bringing his perspective as always.
Trey: Thanks, Jake.
Tim: Yeah, this has been awesome. It's great to be part of Code[ish] and all the really cool topics you guys bring forward, so I'm humbled and gratified to be part of it.
A podcast brought to you by the developer advocate team at Heroku, exploring code, technology, tools, tips, and the life of the developer.