Transcript for:
Insights on Data Activator Presentation

Hey, Will, how's it going? I'm doing very well, thank you. How are you, Chris?

Oh, just, I'm having a fantastic day. I got all the lights up. I've got Thanksgiving under the belt. I have two Thanksgivings under the belt. I'm just so grateful.

Nice to hear. Yeah, that's great. Are you doing all right? Mostly, yeah. I sprained my ankle the other day, so I'm hobbling around everywhere, and then my home internet went out, so I'm somewhere else doing this, but I'm making it work.

Well, hang on. So, let me get this straight. You can't walk, and you have no internet, and yet you're here to bring us Data Activator, is that right?

I'm going to be here for you, man. Yeah, yeah, always. That's fantastic. Oh, and we've got people from Nigeria online, so how awesome is that? We are spanning the globe already. Good to see you. And we've got our friends on LinkedIn and on X, so good to see you guys there as well. And of course Aaron is here too. Post where you're from in the chat. And I've got something here that I really want to understand: how many people are already using Data Activator? Before we even talk about it, I'd like to know if you've tried it, if you're familiar with what it is, or what you're doing with it. That's something I'd like to get a feel for. Because, you know, Will, you showed this to me, how long ago? Six months, maybe nine months ago, maybe even a little longer than that. Yeah, it's been brewing for a little while, let's put it that way. Yeah. And straight up, when I saw this I thought, this is going to transform the way people interact with data, because now it's going to take us from just seeing data to letting data drive what we're doing. So I was just super excited. Oh yeah, I mean, hey, I know there's a lot of people using it, and when we launched the public preview in October, we were actually surprised by how many people went and tried it out.

It's been really good to see a load of people come and say, as you said, right, I've got this data. I look at it every day. Why can't something do that for me? Why can't we help get the data linked straight to the action that I would be taking?

Why can't I automate some of that? And it seems to be solving problems for people. It's brilliant. Well, and so we're going to get into that in just a second, because I'm going to make a statement here: I think Data Activator is going to save businesses the second they start using it. Someone is going to save their job, save their business, save their department by using Data Activator to notify them when some complex scenario has taken place and then do something. Data Activator is super powerful. Yeah. But we're going to get into that in just a second. I want to do introductions. Not everyone knows Will. For my audience, why don't you introduce yourself? Yeah, sure. Okay, so I'm Will Thompson. I am the group product manager for Data Activator. It's a very fancy title, but it basically means I lead the direction of the product in terms of what it does and what problems it tries to solve. And my main job... there's me and two guys who report to me.

There's going to be a third one, actually. It's very exciting. We're growing the team.

Our job really is to go to the market, go to all of our existing Fabric customers and new customers as well, and find out where our tools and our capabilities can help them. We listen to those problems, listen to the scenarios that they want help with, and then translate it back to the engineers to say, hey, this is what we need to go build to help these customers be successful. I've been running this team for a little over a year now. Before that, I was in the Power BI world, and a lot of people will know me from that. I looked after Power BI Desktop for a while, I looked after the DAX and modeling world, I looked after visuals way back when.

I was on the Power BI team since before it was Power BI, really, since 2012. That's when I moved over to the engineering world. Before that, I was at Microsoft in the sales world in the UK, again talking to customers and helping understand the business problems they're trying to solve, and then how our BI solution, as it was then, Reporting Services, Integration Services, Analysis Services, could apply to those problems and help make things better. Yeah, I mean, Will, you've done so much for this space, really. Tristan's calling it out: thank you so much for all your hard work in this area. And actually, this kind of leads into the leading question that I like to ask people. I'd like to understand what your personal journey was from five-year-old Will who wanted to be a firefighter or a policeman or superhero. How did you go from that to the group product manager for Data Activator?

How did that occur? It's a really topical conversation. I've got twin girls, they're eight, so the same age as Power BI.

They were born in May and we launched Power BI in July, so it was like having three babies. And we're having the same conversation with the girls at the moment, because one of them desperately wants to work in a zoo. She wants to work with cheetahs and wild animals. And the other one isn't quite so sure, but she thinks she wants to be a stylist and do hair and nails and stuff. These are things that kids love right now. Will that be something they continue to love when they're 18, 25, getting into their careers and figuring out what they want to do? So for me, at age five or eight or ten, I have no idea what I wanted to do. It was not a pre-envisaged career that I thought I'd get into. When I was a teenager, I really got into product design, like industrial product design and architecture. And I kind of had an idea that I was going to go and be a built environment architect or maybe a civil engineer or something.

I think I spent too much time playing SimCity as a kid. So I wanted to do road planning and all that sort of stuff. But something in that kind of built environment space. And then I looked at, okay, well, if I wanted to be an architect, what would I have to do at university? Oh, I just go and study for years and years and years.

I don't want to do seven years for an architecture degree to go and design car parks. Sounds like too much work. And I kind of thought to myself, well, okay, what else do I like? Well, I love computers. I've always been a big computer geek.

There's a great picture of me aged about five with a BBC Micro, sitting in my parents' front room or wherever, and I'm sitting in front of the keyboard with a big CRT monitor, counting something on my fingers. I'm obviously doing something semi-educational. And so I thought, okay, well, computing is obviously a thing that I could do and pick up, so I went and studied that at university. I wanted to do something that was a little bit more vocational than just computer science, so half of my degree course was geographical information systems. Again, it's that built environment thing: how do we use computers to help understand spatial relationships and data, or how do we model the real world in a digital environment? And I think some of that was what got me into thinking about the world of data and the world of applying technology to solve those kinds of problems. It must have inspired something.

I even did some data science modules as well. And this was 2003 or something, so I learned a bit about the statistics behind it, and we used a bit of SPSS and some of those packages that have been around forever. So you were in the geospatial world, and then you were on the Power BI visualization side? Yeah, I mean, eventually I got back into that. The mapping piece that I spent such a long time with, I never did anything with once I started working at Microsoft, until SQL Server 2008 R2 came out and it had maps in it. I was like, wow, this is great.

This is my thing. I know this stuff. So talking to customers about that and getting into the geospatial world when SQL 2005, 2008, I forget now, had geospatial data types.

I kind of got into that again. And then, yeah, when I moved to the engineering team and I got put on Power BI visuals, I was like, great. And I owned tables and I owned charts, and somebody else owned maps. I mean, there is some irony here that this person who had spent the best part of four years studying cartography and mapping, and how to use digital tools to do mapping, then didn't actually own that feature. Well, maybe that explains the state of mapping. Maybe. I mean, unfortunately, I can't completely absolve myself of responsibility, because then when I came back... So I spent a bit of time looking after visuals, then went away for six months to actually go and implement. This was a really fun little side journey that I went on.

It was once Power BI had launched and we'd spent, I don't know, 18 months, two years with it in the market. I was like, okay, looking for a new challenge, what could I do? It was about the time that Power BI and the Power Platform were coming together and we were bringing in a bunch of other stuff. James Phillips was the guy who was running the org. He said, I need somebody to go and build an internal implementation of Power BI.

so that we can monitor all of the different business units, so that we know how many users we've got across all these different products, and we have a common way of reporting and putting all this stuff into dashboards. I was like, great, I'd love to go do that, because it would give me, as a product manager on the team, a feel for what it really means to go and implement this stuff. As product managers, we're a little disconnected from actually implementing it. We use it day-to-day to do some analytics on telemetry or on how people are using the products, but we tend to do it because we all love it, and we tend to do it as individuals. There are not very many people on the team who have had that experience of, how do I actually roll it out across a large organization? Which folks like you are doing every day. That's your job: rolling it out at that scale and making sure that it adheres to all of the governance and all the processes in that sort of organization. Microsoft doesn't have that culture, because everybody's a technologist, so we all just do it ourselves. So that was a really interesting journey for me, to go spend a bit of time doing that. It was me and a guy called Hayden Richardson, who's another GPM on the Power BI team. We went off and did that for six months or so, and that was great. Is he still around? Yeah, yeah, he's okay. Now, when I went back into the Power BI product team, I looked after the DAX and modeling things. I saw Tristan's comment about creating DAX. Can we create DAX calculated columns with Data Activator?

The question, Tristan, is really, should we create DAX calculated columns with Data Activator? I think we all know the answer. The answer is no DAX columns.

That is our default. There's a whole ongoing debate about DAX columns. And to be clear, I think this is a five-to-ten-year war that's starting now, because I think it's going to take five to ten years to win this battle. I'm not short-sighted about it. I don't think by spring we'll have a victory on this front. I think it's a long battle, but if you don't start a ten-year battle, it's just going to take that much longer to get through it, so I'll start at some point. I think the interesting thing in the Data Activator world, and we'll talk more about it from a product point of view, is that so many of those concepts and principles that I was looking after in the DAX world, with however many years of BI experience backing them up, all go out the window, because we're not talking about warehouses and star schemas and columns and measures. We're talking about events. And real-time streaming data looks very different, and the way that you have to think about modeling it is quite different. It's maybe not entirely, you know, it's not like throw everything you knew about it out the window.

The principles of what you're trying to do with the data, what the business outcomes that you're trying to drive still apply. But the modeling techniques, the things you have to worry about change. And the...

the kind of the knobs and dials that you've got to optimize the system change. You know, whether it's like you've got events coming through in a stream, you want to act on some of those events, but what if some of them arrive late? What if there's some network issue that means a couple of those events arrive a minute late or two minutes late? How long are you willing to wait before you say, okay, we've got that minute's worth of data? You know, what latencies are you willing to allow?

It's not an issue in the batch world. Typically, if we're loading something once a day or even three or four times a day, you just load what's there. And if it's not there, you don't worry about it because it will get loaded next time and the latencies are not that important.

In the real-time world, we have to worry about these things. And that's been a really interesting mindset shift for me to kind of unpick some of the things like, okay, you don't have to worry about that. That stuff that you worry about in a BI world, you don't have to worry about it here. But there's a bunch of other stuff that you have to think about instead.
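The late-arrival trade-off Will describes, how long you hold a window open before finalizing it, can be sketched as a tiny tumbling-window aggregator. Everything here, names and API alike, is illustrative; it is not how Data Activator is implemented:

```python
from collections import defaultdict

class TumblingWindow:
    """Aggregate events into fixed windows, waiting a grace period
    for late arrivals before a window is finalized. Purely a sketch
    of the latency trade-off, not a real streaming API."""

    def __init__(self, window_secs, allowed_lateness_secs):
        self.window = window_secs
        self.lateness = allowed_lateness_secs
        self.buckets = defaultdict(list)   # window start -> values
        self.watermark = 0                 # highest event time seen

    def add(self, event_time, value):
        """Buffer an event; return any windows that are now final."""
        self.watermark = max(self.watermark, event_time)
        start = (event_time // self.window) * self.window
        self.buckets[start].append(value)
        # A window [start, start + window) is final once the watermark
        # has passed its end plus the allowed lateness.
        finalized = {}
        for s in sorted(self.buckets):
            if self.watermark >= s + self.window + self.lateness:
                finalized[s] = sum(self.buckets.pop(s))
        return finalized
```

With a 60-second window and 10 seconds of allowed lateness, an event that arrives a few seconds late still lands in its window; the aggregate is only emitted once no more late data is expected. A longer lateness setting means fewer missed events but a slower alert.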

That's been really interesting. And let's get into this in just a second, but I can hear Kevin talking to me: you have to start with the rules. So we do have a few rules in here. Number one, we are here to answer your questions. I do want to learn from Will on this stuff, but please, if you have questions, and we've already got this started in the chat, so you guys are good at this. If anyone's new here, hit hashtag new, because I'd like to see new people in the chat. That'd be awesome. And please prefix your question with a Q so we can make sure they get queued up. And please don't spam. Well, I guess Just Power BI Guy is a moderator; he could block you, or I could block you. But if you're spamming stuff... we're not a big enough group where that's much of an issue, so that hasn't been an issue. But I'm letting you know, Tristan, if you get out of hand, we're bringing the hammer. With that, though, we are going to dive into it, and we do take our communal sip. I think we've done this on every live stream. Because I'm not in my regular location, I don't have anything with me. It would normally be a cup of very English tea. One of the greatest things about working from home: my wife, who is a superstar, has been quite good at just bringing me constant streams of tea. Talking about streaming data, streams of tea. She'll come along, and you'll see it just kind of appear in the background as she delivers me a mug.

And it will be common-or-garden English breakfast tea. Bit of milk, no sugar. Don't need sugar, I'm sweet enough as it is. What have you got, Chris?

Well, I've been rotating through different protein shakes, so we've got a little bit of a protein drink today. Nice. We're going to dive right into that, and then we're going to get into Data Activator.

Grab your cup, grab your tea, grab your coffee, grab your protein shake, grab whatever you've got, and let's do that communal sip, and then we'll dive into it. It's important to hydrate. Very good, yes, it is important to hydrate. And if you're into keto... I didn't know what keto was until my wife came and told me what it was, I think about ten days ago, and I lost 16 pounds in 10 days. That stuff is unbelievably effective, so give that a shot. That's not an advertisement, because I don't know how I did it, but it works quite well. All right, let's head over to, for a baseline: what the heck is Data Activator? How does this work? How does it operate? So let's put it in context of the rest of Fabric. Microsoft Fabric is this new data platform that Microsoft has been building recently, and that for the most part went to GA at Ignite a couple of weeks ago. So Fabric is this data platform that provides a whole bunch of different capabilities for data ingestion, data engineering, all the analytic capabilities. And there's also a set of capabilities in there for real-time analytics. And Data Activator sits alongside those.

We're actually part of the same team organization within the Fabric world. Where Data Activator sits is saying: you've done all of this work to get your data pulled into this single platform, you've done your work to analyze it, pull out interesting insights, stick it in a report to visualize it, and help people understand the patterns in that data. But ultimately, what you're trying to do with all this data is make some change, or have some input into a business process, that drives the performance of your business. Some way of saying, hey, when I see a particular pattern happening, whether that's my sales dropping, or a load of delays in our packages being delivered, or a bunch of staff who are suddenly not able to deal with the incoming call volume in a call center.

There's some action that we need to take, whether that's simply notifying somebody, or maybe reaching into a system to actually tweak or tune something automatically, or calling out to go and set some meeting requests up. Whatever the action you want to take might be, Data Activator sits there, in a loop of monitoring data, looking for particular conditions, and then taking an action. That's where Data Activator sits. And the reason that we put it alongside everything else here in Fabric is that we don't really care where the data's coming from that we're going to go monitor. We can just as happily sit on top of one of these real-time streams as we can on top of a Power BI report.
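The monitor / condition / action loop at the heart of what Will describes can be reduced to a few lines. Data Activator is a no-code service, so this is only an illustrative sketch with made-up field names:

```python
def run_trigger(events, condition, action):
    """Evaluate a condition over each incoming event and invoke the
    action when it matches. Illustrative only: Data Activator is a
    no-code service, not a Python API."""
    results = []
    for event in events:
        if condition(event):
            results.append(action(event))
    return results
```

For example, `run_trigger(events, lambda e: e["sales"] < 100, lambda e: notify(e["region"]))` would fire a notification for every low-sales event; everything more sophisticated (windows, thresholds over time, state) layers on top of this basic loop.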

You know, we think about a bunch of data in a report that's being loaded maybe once a day. It's just a slow-moving stream. It's still a stream, just... It only gets updated once a day.

And to us, it all looks the same. More like a flood and then a drop. Yeah, a big flood of data that comes in once a day.

Yeah, great. But it's fine. We don't mind.

And over time, we're going to extend that further along that diagram, so that we can sit directly on top of things like the data warehouse, on top of your Data Factory, on top of your data engineering. Ideally, I think there's a great place we can get to eventually, where a data scientist building a notebook says, I've got this model that I've created that pulls in a bunch of data, does a bunch of fraud detection, or uses whatever kind of smart data-sciencey stuff they do, like magic, and the output is a scored list of potential fraud cases or something like that. Okay, great. Then what? Somebody goes and does something with that information.

Well, Data Activator can sit on top of that notebook and check it as it's being run continually over that new data. All your logic is in that notebook, but Data Activator can say, okay, for each of those outputs, we're going to do a thing. Or, you know, maybe we'll do some further processing on it: if the number of potential fraud cases in a particular account that you've got,

or the number of potential fraud cases for a particular customer, or whatever, goes over a certain threshold, we'll go do a thing. So it's linking together all of those different components. It's why Fabric as a platform has got so much potential value, when you think about starting to use these things together. And so I think maybe that gets into a question. Tristan was kind of getting at it, and I think we kind of talked about it: we have two real low-code automation tools at our hands. We've got Power Automate, and now we have Data Activator, and Data Activator hooks into Power Automate. So can we talk about why I would use one or the other? As Tristan said, is it Data Activator just for real-time streaming stuff, and then Power Automate for everything else? What's your take on it? It's a good question. We get this a lot when people see what you can do with Data Activator.
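Stepping back to the fraud-threshold example a moment ago, the per-customer counting could be sketched like this; the field names and threshold are hypothetical, and in the real scenario the notebook's scored output would define its own schema:

```python
from collections import Counter

def fraud_alerts(events, threshold):
    """Count scored fraud cases per customer and fire an alert once
    per customer when the count reaches the threshold. Field names
    are made up for illustration."""
    counts = Counter()
    alerts = []
    for e in events:
        if not e.get("is_fraud"):
            continue
        counts[e["customer_id"]] += 1
        # Fire exactly once, at the moment the threshold is reached.
        if counts[e["customer_id"]] == threshold:
            alerts.append(e["customer_id"])
    return alerts
```

The point of keeping the count inside the monitoring layer is that the notebook only scores individual cases; the "over a certain threshold per customer" logic lives in the trigger.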

There's often this question like where's the grey area, like where's the overlap between those two? And you know that first question about like is it just relevant for real-time, near real-time data? As I said, you know we don't really care how quickly or what the latency is of the data coming to Data Activator.

We'll very happily sit on top of that whether it's fast moving or slow moving and in fact combinations of the two. We can't quite do in the product yet, but we're planning to be able to say, okay, well, maybe I've got some reference data that's slower moving that comes from a data warehouse or comes from Power BI report. I only get some data once a day, but I'm going to combine it with the real time streams so that I can say, tell me when the tire pressure, the data from these sensors about my tire pressures come and go over or below a certain threshold.

But the alerts that I want to send have to go to a particular driver who's driving that truck. I've got that in some reference data. I want to combine the two, and when the event fires, I just need to know the latest value from that reference data. Go grab that, and that's going to tell me which driver I need to send an alert to. Combining that data together is something that Data Activator is very good at, and it's something that's difficult to do in Power Automate.

If you think about trying to sit on top of a stream of data from Power Automate, I know they've got connectors to Event Hubs, but everything that you need to know about how to evaluate that workflow needs to be in that event. It's possible, but it's very clunky to say, okay, when an event comes in, go and run a query to get some of the data, or, even worse, go and run a query to look at the history of those events. If I wanted to know the delta between the latest one and the previous one, how do I, in one Power Automate flow, go back and get the previous value and then continue evaluating my flow? That's hard, if not impossible. Excuse me.

That's really where Data Activator shines. We are about looking at a series of data points that represents a value going up and down or a series of events that are coming in, and being able to say, tell me when a certain number of them happens in a time period. or look at the previous one, do some calculation, and tell me when that hits a certain threshold.
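The "compare against the previous value" pattern Will contrasts with Power Automate is just a little bit of retained state. A minimal sketch, with an invented threshold parameter:

```python
def delta_trigger(points, max_delta):
    """Compare each value with the previous one in the series and
    flag when the jump exceeds max_delta. This per-stream memory is
    the 'previous event' lookup that a stateless per-event flow
    struggles with."""
    alerts = []
    prev = None
    for t, value in points:
        if prev is not None and abs(value - prev) > max_delta:
            alerts.append((t, prev, value))
        prev = value
    return alerts
```

A stateless per-event workflow would have to query event history on every invocation to recover `prev`; a stateful monitor simply carries it forward.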

Even further down the road, we're not there yet, but we'd like to be able to do things like simple state machines. So I've got maybe something that represents a user's journey around an application. They came to our insurance application, and they looked at auto insurance, and they bought that, and they went and looked at homeowner's insurance, but they didn't buy it. There's a series of events that we've got in terms of what the user did.

The action we might take is, let's send them an email and say, hey, it looked like you looked at that homeowner's insurance. You've already bought the auto insurance. We'll give you a discount on it. Setting up that kind of state machine: how would you do that with Power Automate? Yeah, that's right. Well, one of the things, when I first saw Data Activator, was the opportunity to address those complex automations that people in business would do all the time. If I put my salesman shoes on and I think, okay, hey, if I have a small customer who starts to really spend over a large period of time, I want to know about that, to set up a meeting and have an additional conversation. Or conversely, and this is where I say Data Activator is going to save people's businesses: if I've got customers who suddenly have a huge uptick in customer service incidents, I want to know about that so I can go out there and address that problem, because keeping a customer happy is so much easier and more valuable than trying to land new customers. And that's where Data Activator fits. Yeah, totally, it really is. I think being able to look at what happens over a potentially long period of time, and understand those patterns and the common behaviors that you see from a particular customer, and proactively say: when you see that pattern occurring, go and do a thing.
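The insurance journey described above is a small state machine over per-user events. A minimal sketch, with made-up event names (and, as Will notes, this is a future direction, not something Data Activator does today):

```python
def abandoned_quote_offers(events):
    """Tiny state machine over a per-user event stream: a user who
    bought auto insurance and viewed, but did not buy, homeowner's
    insurance gets a discount offer. Event names are illustrative."""
    state = {}  # user_id -> set of event names seen so far
    for user_id, event in events:
        state.setdefault(user_id, set()).add(event)
    offers = []
    for user_id, seen in state.items():
        if {"bought_auto", "viewed_home"} <= seen and "bought_home" not in seen:
            offers.append(user_id)
    return offers
```

A production version would need ordering and time windows (e.g., only offer if the homeowner's quote was abandoned within a week), but the core idea is accumulating per-entity state across a sequence of events.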

And we talk about kind of in the future, where do we want Data Activator to end up? There's this vision that we've got for applying AI in these cases as well, where the models, the rules, the conditions, the logic that Data Activator might express can be tailored and tuned as the business changes. And as the...

patterns that we see evolve. So we've talked about pulling data in from Power BI, from event streams, but there's this no-code environment that we have in the middle to build these rules and relationships between customers and purchases and whatever, and the triggers that go along with those. They need somebody to be able to go and say: yes, when this value crosses this threshold five times in a 10-minute period, then we want to fire a trigger. But what if you don't know what those rules are? What if you just want to say, something went wrong at 1 a.m. last night? I can show you all the data, and I can say there was a problem there. And the system, the AI magic in the system, can go and figure out from all those signals, or possibly all that noise, where's the signal?

What are the rules and conditions? This thing went up and this thing went down. And they go up and down all the time, but they went up and down at the same time, or one went up while the other went down. Something like that, that's the interesting condition. It can figure that out for you. We can use some Copilot, some magic, to say, hey, that's the trigger, that's the rule that you want to look for in the future, and then go take whatever action is needed. Well, and I really like this outline of: put a message in Teams, send an email, and then use third-party automation. Because isn't that kind of how you'd want to be notified, to make sure that whatever you've created with Data Activator is the appropriate response? Yeah. I don't want my clients emailed and spammed a billion times by my Power Automate or my Data Activator, right? Right. We've got a lot to do, I think, around managing the spam potential for this as well. At the moment, we can direct the messages to the appropriate person based on the data that's coming in. So if the events and the data that you've got in there, or that reference data, let you find the appropriate email to send this notification to, sure, we can direct it to the right person. But you're still going to get an alert every time that trigger fires. There are some things that you can do today to say, okay, well, I'm going to create these higher-order events, where I'm looking at the number of times something went wrong. That is actually just a new event stream. We're not going to fire an email each time something goes wrong, but we're going to count the number of times it goes wrong, and in a 24-hour period, if it goes wrong 24 times, or however many times, then we're going to send that message. So you can do a little bit of that today, but we want to make that a lot simpler: hey, I don't want to spam people all the time, I want to just send a daily digest, here's all the times that something went wrong over the course of a day. Or maybe I'll send you an email the first time it goes wrong, but if there are subsequent issues over a one-hour period, don't bother sending any more emails, I'm on it. We also hear some requirements about escalation: the first time it goes wrong, send me an email; if it goes wrong 10 times, give me a phone call; if it goes wrong 20 times and I still haven't done anything about it, email my manager. Those escalation paths are something we're thinking about building into that action system component on the right side of this diagram, to make sure that those notifications go to the right people with the right frequency, so that it's actionable rather than just becoming more noise. Right, right. And I think that's the biggest risk, that noisy events can just be deafening, right? Yeah. I know when I was a Power BI admin in a large organization, I was getting 15,000 emails a day on datasets that failed. I didn't do anything with any of those. It was impossible for Power BI to alert me to something that was of relevance, because there was just too much noise. Yeah. And actually finding the right person to send the alert to is really important. And we've already talked about, yes, I could pick out the email address based on the data that's coming through and send that email to the right person.
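One of the de-spamming behaviours described above, send the first alert and stay quiet for a while afterwards, is easy to sketch. The quiet-window length is an arbitrary illustration, not a Data Activator setting:

```python
def suppress(alert_times, quiet_secs):
    """Send the first alert, then suppress repeats that fall inside
    the quiet window. Times are plain seconds here; a real system
    would use wall-clock timestamps."""
    sent = []
    last = None
    for t in alert_times:
        if last is None or t - last >= quiet_secs:
            sent.append(t)
            last = t  # the quiet window restarts from each sent alert
    return sent
```

With a one-hour quiet window, a burst of failures produces one email; only a failure an hour or more after the last notification produces another. The daily-digest variant would instead batch everything and emit one summary per day.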

But even more useful is sending it to a whole Teams channel, so that there are potentially many people on the end of that who can take an action based on the alert. But there's also the concept of: what about when something goes wrong with the trigger itself? Maybe we didn't receive any data for a while because there's some upstream system that's not sending us the data properly. Well, we should notify you to say, hey, there's something wrong with your trigger; you're not going to get the alerts that you'd expect to get. But who does that go to? Does it go to the person who created the alert, or should it go to the same people who received it? Because if that's some person in the operations world who's actually fixing the issues, they probably can't do anything about the data that's coming in. That needs to go to somebody else, probably not even the person who created the trigger. So again, this is what I mean about these operational systems. It's very different to the BI world. There are some differences in what expectations people have about how they can build these systems and how these things fit together. Well, I think that does make sense. It kind of even leads to the question James has: can Data Activator detect an anomaly in a model, tell Power Automate to refresh that model, wait for the refresh to finish, then check to see that the anomaly still exists, and only then send an alert?

Yeah. Could you do something like that? Depending on the scenario, I don't know whether we have exactly what we'd need to support that today.

But in principle, yes. The idea that, okay, I've got a stream of data, I find an anomaly in it. Oh, one of the really common use cases we're hearing is people who want to track

and check data pipeline health. So run a data pipeline, and when it finishes, emit an event that says, hey, we just ran, it took two hours, we loaded 10,000 rows, great, everything looks good. The next day it runs again, and it only loads 2,000 rows and takes 30 seconds. Okay, well, it didn't fail as such, but that's weird, it's a lot less than we'd expect. So there's an anomaly. Let's detect that, and now, rather than notifying somebody, the first thing we'll do is run the pipeline again, because maybe there's some transient issue. Maybe we'll run it again and this time it will load all the data and everything will be good, and we don't need to alert anybody. We'll just listen for that next event that says, hey, the pipeline finished. Oh, okay, now it loaded 10,000 rows, no problem. Or, it's still only loading 2,000 rows, so now let's go and send that alert. So yes, you could totally set that up, assuming you're getting the right data through and sending the right things.

Internally, we've been calling those system events: the things that the rest of Fabric is emitting. That's something we're looking at a little bit further out, how we can expose more of that. It's the sort of stuff you get in the Fabric admin logs, the audit logs. How can we expose that for people to use in Data Activator, to help manage the way that they're using Fabric? It's not necessarily the business results, but managing the data platform. There's a ton of value in exposing that stuff as well. I remember one of the very first times we spoke to a few customers about this, there was a guy who was saying, I want to get a notification, or at least I want to set up a trigger, the first time a user opens a particular Power BI report.
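That re-run-before-alerting pattern boils down to a few lines of logic. A sketch under the single assumption that each pipeline run reports its row count; `rerun` and the thresholds are hypothetical stand-ins, not Data Activator or Fabric APIs:

```python
# Sketch of the "re-run before alerting" pattern described above.
# rows_loaded comes from the pipeline-finished event; rerun() re-executes
# the pipeline and returns the new row count.

EXPECTED_ROWS = 10_000
THRESHOLD = 0.5  # treat a run as anomalous below half the expected volume

def check_pipeline(rows_loaded: int, rerun) -> str:
    """Return the action taken: 'ok', 'recovered', or 'alert'."""
    if rows_loaded >= EXPECTED_ROWS * THRESHOLD:
        return "ok"                      # looks normal, nothing to do
    rows_after_rerun = rerun()           # maybe a transient issue: try again
    if rows_after_rerun >= EXPECTED_ROWS * THRESHOLD:
        return "recovered"               # second run was fine, no alert needed
    return "alert"                       # still anomalous: notify somebody
```

The point of the pattern is the middle branch: one free retry absorbs transient failures so humans only hear about persistent ones.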

We've got this report that we send out to everybody in our organization, and the first time that somebody opens it, I want to send them an email with a bunch of how-tos: what does the report show you? How do you interpret the values in that report? And I thought that's a really interesting case, because as a platform, on the product side, we've got telemetry that tells us when somebody opens a report,

and we've got the user ID and we've got the report ID. We can expose that back to the person who owns the report, the creator of the report, so that they could stick an alert on top of it and say, if it's the first time you've seen that user do it, then go and do a thing. Those are really interesting cases where, again, it's like, can you do that in Power Automate? Well, you could, but how would you know it's the first time the user's done it? You'd have to keep that record somewhere and build that into it.
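"Keep that record somewhere" is the crux of the first-time case. A minimal sketch of first-unique-usage detection, assuming only that each open event carries a user ID and a report ID (the in-memory set is illustrative; a durable system would persist it):

```python
# Sketch of "fire only the first time this user opens this report":
# remember the (user, report) pairs already seen and fire only on new ones.

def make_first_open_detector():
    seen: set[tuple[str, str]] = set()   # the state you'd otherwise have to
                                         # build and store yourself in Power Automate

    def on_report_open(user_id: str, report_id: str) -> bool:
        """Return True only the first time this user opens this report."""
        key = (user_id, report_id)
        if key in seen:
            return False
        seen.add(key)
        return True

    return on_report_open
```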

We can handle that: it's the first time, the first unique usage for that user, and we can go and do that sort of thing. So that's again a little bit further out. We want to expose some of those things in the future, but it gives you a flavor of where we're going.

Well, I absolutely love that. That story you're telling about data quality testing in your pipelines, actually being able to have that be an integrated component inside of your DevOps and all of your processing, would be just fantastic for regression testing. I love that. And Jai is asking, can we push notifications through ServiceNow?

Right, right. So we don't have native connectors to ServiceNow and a bunch of other things, and frankly, we're not going to build those as native Data Activator integrations, because there's this wonderful tool called Power Automate which is great at all of those. It's got hundreds of those connectors. So you can call a Power Automate workflow, and you can pass through whatever parameters you might need from the way the event has been handled and the way that we've detected those conditions. Whatever information you need, you pass that over to Power Automate and say, hey, go and call a flow that puts a ticket into ServiceNow. That offloads the connector work to that framework. If you've got some random esoteric system that you and four other people around the world use, there's probably a connector for it, because people are building all those connectors. We're not going to build that, but Power Automate can handle it, and so you can integrate it that way.

That's just, honestly, that's fantastic. I love the story that you're outlining here, especially in this middle here, the data models that you can stick on top of anything.
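The usual way to bridge into Power Automate like this is a flow built on its "When an HTTP request is received" trigger, which gives you a URL you can POST JSON to; the ServiceNow connector step then lives inside the flow. A rough sketch, where the flow URL and payload fields are placeholders, not a real integration:

```python
import json
import urllib.request

# Hypothetical hand-off to a Power Automate flow: POST the trigger context
# as JSON to the flow's HTTP trigger URL. The URL below is a placeholder.

FLOW_URL = "https://example.com/my-flow"

def build_payload(street: str, bikes_available: int) -> bytes:
    """Package the trigger context the flow needs to raise a ticket."""
    return json.dumps({
        "street": street,
        "bikesAvailable": bikes_available,
        "action": "create_servicenow_ticket",
    }).encode("utf-8")

def call_flow(payload: bytes) -> None:
    req = urllib.request.Request(
        FLOW_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fire-and-forget; real code would check status
```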

Then having our data scientists be able to directly integrate their work with the work that our business users are doing just completes the cycle and the story, right? It's such an awesome capability. This is fantastic. Now, why don't we jump into a little demo, and I'll show you a little bit of how this works.

You're reading my mind. Exactly, reading your mind. Nice. Yeah, that's great. All right, cool.

So the scenario that I have for this little demo is around these rental bikes. For those of you who didn't figure it out already from my accent, I'm from the UK. I didn't grow up in London, but I used to go there all the time for work.

And you'll see these bikes whizzing around the city. There's 800 or so of these docking station points where you can pick up a bike, use an app to unlock it, take it somewhere else, drop it off and carry it on your day. And you get them all over the world, right?

So these exist in big cities all over the place. The cool thing about London is that there's actually a public API that you can use to see how many bikes are available and how many open spaces there are at any of the docking stations around the city. Transport for London is the company that operates it, and they make all that data available to the public for free.

And we thought this was a great idea. We can call that REST API, though not directly. I just saw James's comment; actually, it's really relevant here.

We're calling that docking station API using Eventstream. We're actually doing a little bit more behind the scenes right now, but the Eventstream team are working on this, where you'll be able to just poll that REST API on a schedule, pull that data, turn it into a stream, and pull it into Fabric. What we're doing at the moment is polling that API, processing it with Eventstream, and pushing it into our docking stations reflex.
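For the curious, the polling side can be approximated in a few lines against the public Transport for London BikePoint endpoint. The response shape assumed here, an `additionalProperties` list with `NbBikes` and `NbEmptyDocks` keys, matches the public API as I recall it, but should be verified against the current TfL documentation:

```python
import json
import urllib.request

# Rough sketch of what the Eventstream polling described above does:
# fetch the TfL BikePoint feed and flatten each docking station into
# a simple event like the ones shown in the demo.

TFL_URL = "https://api.tfl.gov.uk/BikePoint"

def to_event(station: dict) -> dict:
    """Flatten one BikePoint record into a flat event dictionary."""
    props = {p["key"]: p["value"] for p in station.get("additionalProperties", [])}
    return {
        "bikePoint": station["id"],
        "street": station["commonName"],
        "lat": station["lat"],
        "lon": station["lon"],
        "bikesAvailable": int(props.get("NbBikes", 0)),
        "emptyDocks": int(props.get("NbEmptyDocks", 0)),
    }

def poll() -> list[dict]:
    """One polling pass over every docking station in the feed."""
    with urllib.request.urlopen(TFL_URL) as resp:
        return [to_event(station) for station in json.load(resp)]
```

Run on a schedule, each pass yields one event per docking station, which is the stream the reflex then listens to.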

The item that you create when you're using Data Activator, we call it a reflex. In the same way that in Power BI you create a report, in Data Activator you create a reflex. And there's a couple of reasons why we call it Reflex.

Partly it's because that's what our code name was, which we love. But it also describes what we're doing: it's a stimulus-response system. You think about the reflexes in your body: there's some stimulus that causes an action to happen.

That's kind of what we're doing here with Data Activator. We're listening for these inputs, taking an action accordingly. We talk about kind of building a digital nervous system for your business.

It's really a marketing phrase, but I love it as an aspiration for what we want to do and what we want to enable for people.

I mean, yeah, being able to have every limb in your organization sense and feel what's going on around it, and then take automatic responses, that's awesome, right? And it is in a low-code framework, right? Like, this is activated, right?

This is data activated, right? Yeah, so what we're actually looking at here is the event stream. So this is on the input.

I see James's comment is about output; we'll come back to that later. So this is actually the input side. Eventstream is the artifact we're using to pull that data in. And then once it's been pushed into the Data Activator reflex, this is the reflex view.

So I'm here in the data view where I'm looking at these are all the different events that we're listening to, the streams that we're looking at. And you can see these docking station events are coming through. So this is real time pulling that REST API and pulling in the data.

As these rows flash up in green, this is new information coming in. So you can see they're coming in, a few every second. And we're basically just cycling through all those docking stations, saying how many bikes are available in each of them at any given time. You won't see it on your phone yet; we'll come back to that. And so there's information about the street, the neighborhood, the latitude and longitude of it, back to my mapping stuff.

And then how many bikes are available and how many empty slots there are. So when we build these models, these triggers in Data Activator, We have to say, how do we interpret all these events? Because as a human, you can look at this and you know what a street is, you know what a neighborhood is. How do we tell the system what to look at, what to identify?

So we're going to create one of these docking station objects to track the information that's coming in for each of these. And the column in these events that tells us how to uniquely identify one of these docking stations is this bike point column. It's just an ID column that goes along with the events. Then I can also choose the other fields from these events that are relevant for me in building these triggers.

We see quite often that particularly with like IoT streams, there's a whole bunch of information that comes through that might tell you about, you know, the battery level of the IoT sensor or like the signal strength of its Wi-Fi connection. A lot of the time we don't care about that. Yeah, sure, if you're the infrastructure person, you do care about it, but for the actual business event data, that might not be relevant.

So you don't need to bring that all in, just choose the relevant pieces. And then when we come over to design mode, this is going to hopefully drop us over. Yeah. And we can see that we've built this docking station object. It's bringing those events in.

And there's a bunch of properties that are going along with that. They're going to pull out the values from those columns in the events and show them to us. Now, I spotted a little problem with my demo here that for some reason is not rendering my charts. So I am going to flip over to a video version of it.

and we're going to run this, because you'll see the same sort of thing. So what we'd see, here we go, great, is the values of how many bikes are available at each bike point.

All right, I've got to say, kudos to you for this level of preparation. Like, oh, my demo's not working, I'm going to pull up a video of it, already up and going. Like, what the heck, man?

For those of you who have been following me around over the last few weeks, I've done this demo before, so I've got a video prepared.

All right. Kudos. Okay. I'm sorry.

Keep going. Okay. So what we're looking at here is, for each of these five bike points, we actually take a sample.

You can see up in the corner here, there are 120 bike points overall that we've been looking at, but we're just sampling five of them here, because if we tried to plot all 120 on this chart, you wouldn't be able to read it. It would be a mess, right? There are some data visualization principles at work; it happens that I've spent however many years in the BI space learning these best practices, so we're applying them here as well. So we're picking these five, and you can see, as bikes get checked in, as people pick them up or drop them back off, these values go up and down.

So what we actually want to do is use that and build a trigger on top of it. We want to do something when the number of bikes in any of these docking stations goes up or down. And in this case, the scenario might be: if there aren't enough bikes in a particular space, we need to send people out with a truck to go and find bikes from somewhere else and redistribute them, so that there are bikes available wherever people want to go and pick one up.

So in building a trigger, what we have to do is say, what's the value that we want to check? What's the condition that we want to detect? What's the pattern we want to look for?

And then, what action do we want to take? So we're always doing this: select, detect, and act. That's the watchword for using Data Activator. What do I want to select?

What do I want to detect? And what action do I want to take? So I'm going to start off by selecting that property, the bike count property that got created already.

And that'll show us the same values going up and down again. And then the next thing is, okay, what's the condition that I want to detect here? So we can do a bunch of different things, whether it's, you know, becomes greater than or less than a particular value. And just pause it there.

When that value enters or exits a range. We can also trigger on top of logical values, true/false values, or, if you've got string values, any time it changes to or from a particular value. We see with some IoT sensors that they'll emit a status value: am I running, am I idle, am I overheated, whatever. You can react to those as well; it doesn't have to be a numeric thing. And there's a whole bunch of these that we're adding to over time. This is one of the areas where we want the most feedback: what do you need in these conditions? We've done the basic numeric things, but is there something more complex that you need? I talked about state machines, tracking user journeys or something. That's a much more advanced use case. We want to get there. What else is there?
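One subtlety worth spelling out: "becomes less than" describes a transition, not a state. The condition fires when the value crosses the threshold, not on every event while it sits below it. A sketch of that distinction (plain illustrative logic, not a Data Activator API):

```python
# Sketch of "becomes less than": fire only on the downward crossing of the
# threshold, not on every value that happens to be below it.

def becomes_less_than(values: list[float], threshold: float) -> list[bool]:
    """True at each position where the value crosses below the threshold."""
    fired = []
    previous = None
    for value in values:
        crossed = previous is not None and previous >= threshold and value < threshold
        fired.append(crossed)
        previous = value
    return fired
```

With a threshold of three, the series 5, 4, 2, 1, 4, 2 fires twice (at the 2s that follow a value at or above three), not at every sub-threshold reading.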

What else do you need? Actually, one of the things that we've added here, particularly for these numeric things, is this ability to say, rather than firing each time the number of available bikes becomes less than three, I only want to fire an alert when we see it drop below that threshold, let's say five times over the course of an hour. Or maybe I want to say only fire this alert when the number goes below three and stays there for 15 minutes or something.
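The "five times over the course of an hour" condition amounts to counting recent firings inside a sliding time window. A minimal sketch with illustrative parameters, again just to make the behavior concrete:

```python
from collections import deque

# Sketch of "only alert when the condition fires N times within a window":
# keep the timestamps of recent firings and drop the ones that age out.

def make_windowed_alert(times_required: int, window_seconds: float):
    firings: deque[float] = deque()

    def on_condition_fired(timestamp: float) -> bool:
        """Record a firing; True once enough firings fall inside the window."""
        firings.append(timestamp)
        while firings and timestamp - firings[0] > window_seconds:
            firings.popleft()            # discard firings older than the window
        return len(firings) >= times_required

    return on_condition_fired
```

Requiring three firings in an hour, two firings a few minutes apart stay silent; the third one within the window raises the alert, and a firing an hour later starts the count over.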

Really good case in point here: I was talking to an airline, and they were looking at reacting to gate changes. So you've got a bunch of people waiting around for the plane. It was meant to be coming in at gate 12, but for some reason, something happening out on the airfield, it's actually going to come in at a different gate, gate 15. They said one of the most common things that happens is that it will change, and then within a minute or two it will change again. If it changes once, that's an indication it's probably going to change a couple more times before they finally know where the plane's going to come in. And so they said, don't bother sending an alert every time it changes, because you'll just spam people, and they'll be running up and down the terminal trying to get to the right gate. Monitor it, and once the gate has settled, once we know it's changed to a particular gate and stayed there for 10 minutes, then send the alert. It's a really great way of reducing the amount of spam the system might generate. You don't want to react every time; wait until it's settled and we're a bit more confident about it. Give it a bit of time.

And what you can see at the bottom here, to help you understand how often this trigger would have fired, across all of the values, not just the five that we've been sampling, is how often this trigger would have been activated. So you can see here, there's a spike where it would have fired twice in this time period, at 3:05 UTC.
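The settling behavior in the gate-change story is a debounce: suppress alerts until the value has stayed put for the settle period. A simplified batch sketch of the idea; a real streaming version would arm a timer on each change rather than looking ahead, and the data here is made up:

```python
# Sketch of the gate-change settling logic described above: a change only
# produces an alert if no further change arrives within the settle window.

def settled_alerts(changes: list[tuple[float, str]], settle_seconds: float) -> list[str]:
    """changes is a list of (timestamp, gate); return the gates that settled."""
    alerts = []
    for i, (timestamp, gate) in enumerate(changes):
        is_last = i + 1 == len(changes)
        # a value "settles" when the next change is at least the window away
        if is_last or changes[i + 1][0] - timestamp >= settle_seconds:
            alerts.append(gate)
    return alerts
```

So three gate changes within a couple of minutes, with a ten-minute settle window, produce a single alert for the final gate instead of three.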

There's a couple of other times here where it would have fired a bit more often. And that might help you kind of just tweak the values. What's the threshold that I care about?

If my freezer temperature goes up and down, I'm going to send an alert when it gets to 34 degrees. Well, actually, you set that and you see, oh, it would have generated loads of alerts. Okay.

Maybe 35 degrees. Okay, that's a smaller number of alerts. There may be some reason why it's got to be that exact value, but if you're just looking to optimize what people are working on, you want to tweak it and tune it a little bit.

Right. Maybe people are coming in and out, so it's a big deal, but otherwise, right? Yeah, yeah, exactly.

And then the last piece is, what action do you want to take? So we talked about email and Teams messages, and I can use this custom action to call a Power Automate workflow as well. We'll come back to that one. It's grayed out at the moment because I haven't set one up in this demo. But for something like a Teams message, you'll see it fills it in with my details to start with.

So you can see my email address. But then you can put in some other details as well. So you want to send some context with this message typically.

So some kind of headline that's going to grab people's attention when they get this message. same sort of thing you'd get an email and then you can put in an optional message as well so um what i did here in this headline was you can see how i'm using these curly braces around street well that doesn't say go go and get the latest value from the street column um so for for the you know when that trigger fires for the event that fires that trigger go get go get the street value as well so we can send that through as context in this in this message and we could use that here to do what we're saying about sending it to the appropriate person if you had an email field came through you can put that in the recipients bucket put the curly braces around it and it will go and send that to the right person so let's go to I'm going to send a test alert and pull in the teams window in my video here we go uh so this is the kind of alert that you get so it's got that headline it's pulled in wellington road as the street that fired the event and then it gives me the information that i need about when did the trigger fire how many bikes there are how many empty slots there are what do i need to go do about it so you know as the person out in the truck like okay great i've got to get over to st john's wood i'm going to go look for some other bikes nearby um figure out where to go pick them up from and redistribute them back to this uh this street in wellington road to make sure there's stuff there people to use next time they want to go pick up one of these bikes pretty cool right so yeah this kind of operational alerting on top of streams of data think about how you would do this previously right you've got a bunch of event hub stuff to do you've got a bunch of like streaming analytics code whatever it might be it's probably a bunch of engineers involved this is not a low code this has not been a low code environment before and now we're making it hopefully i i like to think and 
the feedback we've got so far is that this is pretty straightforward lower it's defense well actually is there code in here arguably you know i've had people say well what you're doing is you're encoding the logic through the ui so you are coding right i'm not writing script i'm not i'm not yeah putting out my back skills or anything like that so yeah right but no there's there's not really any code here in the way that my people can get this is a problem no code solution yeah that's that's fantastic yeah and did you create this dashboard or did it create i i i did this stuff this is this is not the most beautiful report but uh with copilot we don't know these things yeah yeah sure yeah so what the other thing i wanted to point out here is like we've been talking about streaming data here but um not everybody's got that streaming data but i bet everybody who watches this has got a power bi report and like i said before you know we'll very happily sit on top of a power bi report as well so i'm going to go away from my demo here because the video that i recorded is a few months old we changed the ui a bit so i wanted to show this live so let's say i've got i've got a report here that i want to i want to monitor this and send me an alert when when some value changes here you'll see this little set alert button on any of the visual menus um it's also up in the ribbon um for your alerts and this gives you a little pain here to fill out to go and create one of these triggers on the flight. So I can say, do you want to get email or teams? What measure am I looking for?

In this case, it's this average occupancy rate. Tell me when it becomes greater than or less than, or changes from or to, a particular value, and type in your threshold. And we do things like picking up all the filters from the slicers and selections that you might have made in the report, so the alert always runs on the thing that you were looking at in the report. And that's been really important.

It's just like, it should just work, right? You should just get what you expect. Then I need to go say, okay, where do I want to go save this? I'm going to put this back into my workspace.

It's fine. I can create a new reflex and give it a name. When I hit Create Alert, this is going to set up that reflex, hook it up to the Power BI report, set a schedule so that it looks at this report every hour to pull in the latest values, set that condition, and start the trigger running behind the scenes. I don't need to do anything.

This is one of the reasons I changed my plan and wanted to show this live rather than the video, because that's it. All I need to do is fill out that pane.

For Power BI users, I don't really need to know that I'm using Data Activator. That just makes it super simple for people to just say, hey, I need an alert. Now my alert's running.

It's there already. Well, good. I'm going to start getting some Teams messages or emails, whatever I selected.

as soon as this value changes.

That's fantastic. So this is something that, like, everybody... this is completely integrated.

Yeah. Yeah. Everybody can go use this now, right? You can go try this out straight away.

Assuming you've got Fabric, assuming you've got a Fabric premium capacity, and it does need Fabric premium, you can set up a trial and go try this out. You just need to save that reflex item into a capacity. So, okay. All right. So I see how this works.

How do I, if I'm a report user, come and manage all of my reflexes? And maybe you don't have a story for this, I don't know. But let's say I've set up a whole bunch of them, and now I'm getting spammed, and my phone is just vibrating constantly at this point.

I mean, you asked, can we get these alerts on mobile? Yes: if you send a Teams message, it will come through in Teams mobile. If there are other ways that you want people to get alerted, like SMS, or maybe through the Power BI app, I'd love to hear about it. Please, please, please use our ideas forum. I know there have been problems with the Power BI ideas forum in the past. When I was in the Power BI team, I understood the pain of using it.

But because we're such a new product, there's not many data activator ideas. If you go and tag it and say, hey, here's a data activator idea, put it in the right category, I'll see it. In fact, I've got a data activator alert that tells me when the number of items goes up. So right now, everyone, we're really...

God, let's have five minutes to submit an idea and have it show up on the slides. Yeah, we'll see it show up. I think it's only checking once a day because it's actually going through a Power BI report.

I think it only refreshes once a day. So I won't see it yet, but I'll see it tomorrow. But please, yeah, let us know what systems you want to integrate with, both in terms of the input that you need to listen for.

What are the conditions you want to check for? And what are the actions you want to take? Those are the three buckets to think about. Like if it doesn't do it today, we'll take the feedback.

We'll do it. Yeah, that's a great idea. I think there's one on the forum already about that.

If there isn't, it's a great thing. We should put it on there. We've heard it before. Yeah, being able to kind of categorize them appropriately.

I'd love to know like what level of detail do you need? Like just high, medium, low? High, medium, critical?

Do you want a numeric ranking? Because if you think about all the alerts that might fly around in an organization, there could be loads, and everybody will make theirs high, because everybody's stuff is important, right? Why would you ever use medium? I'm joking, but I want to know what we need to build there. Maybe high, medium, and critical is enough. Please, please let us know.

That's great.

Yeah, so that would be my main call to action to everybody here: try it out, and then give us that feedback on the ideas forum. Because we're such a new product. Like I said, we went to public preview in October. We're shipping every week, same as the rest of Fabric.

And we can be really agile and adjust our plans based on people's feedback. I would love to look at the ideas forum tomorrow and go, oh, well, I'm going to throw my plan out the window, because everybody's asking for something different. We build what people need.

And I can attest that with Will and the entire Fabric platform, especially when products are in preview, your ability to respond is why I'm not a Tableau developer today. That just blew me away, how fast you were able to actually make changes to the service and these capabilities. So please, please, please go out and post your ideas on, gosh, my short-term memory has just gone to garbage, ideas.fabric.microsoft.com.

It's there, yeah. And I'll plug the community forums as well, community.fabric.microsoft.com, if you've got questions or ideas. I'd love to hear more about the scenarios you think this would be useful for, if you can't quite boil it down to a particular idea and just want to give us a bit of feedback and have a conversation. Like I said, there's me, there are two PMs who report to me, and we're about to hire a third. We check those forums pretty much every day. If we don't, it probably means we're sick or on holiday or whatever. I mean, the downside of having a small team is when one person's not around. One of the guys was just on paternity leave; he's just come back, but we spent three months, including the launch period, with me and one other person. That's not enough to run your product. I won't make that comment.

But my phone thing never worked, I don't know, we didn't do this, so we didn't get to that bit, but that's all right. Okay. All right, good deal. Just making sure I wasn't missing something here.

Awesome. Well, I'm very grateful that you took this time out of your day to show us what Data Activator is and answer some great questions. And honestly, I think Jai's got some great ideas, from categorizing alerts to, maybe even to spitball here, allowing us in the UI to put in an escalation.

All right. Like, if it's a low-value thing, here are my limits, here's how I want to get notified. If it's medium, right?

Allow me to fill out what I want on there, and natively create that notification experience for people without having to do much, right? That'd be awesome.

Yeah, that's a good one. Thank you. Cool. And if anybody else has got questions or thoughts or anything, you can find me on LinkedIn. I'm not on Twitter that much anymore; I do occasionally check it. I use Bluesky a bit more, trying to get away from Twitter, because it's not really my kind of platform anymore. But yeah, LinkedIn's a great place to find me.

I actually find myself using that way more in the last couple of months than I've ever done before. It's interesting. And I will say there's equal representation of YouTube and LinkedIn viewers. Oh, LinkedIn is above my YouTube viewership, so I've just got to get LinkedIn to monetize me. Awesome. Well, thank you very much, Will. Thank you, everyone, for spending your lunch hour with us. We really appreciate it. Will, any parting last words?

No, that's great. Thank you very much for having me. I really appreciate the time to come and spread the message about Data Activator.

All right, awesome. Thanks, everybody. Peace. See ya.

Baker Tilly Digital combines strategic industry insight and advanced technical expertise to uncover and solve your digital transformation challenges. If you're interested in learning more, check out our website at bakertilly.com/digital.