Transcript for:
Innovations in AI for Renewable Energy

Hi everyone, this is Rezwan. I work as a Senior Manager of Data Science at SB Energy (SoftBank Energy). We are a renewable energy platform: we develop energy projects, own them, and maintain them through their lifetime. Within SB Energy, on the data science side, we work on forecasting problems, anomaly detection, optimization, and predictive maintenance. We have been working with Abacus for the last six months and it has been an amazing journey; we have deployed a couple of models and are expanding our partnership with Abacus on multiple fronts.

Thank you. Terry, you have been a customer for a long time. Could you introduce yourself to the panel and go over the problems you are trying to solve with us?

We have been a customer thanks to you, Nandish. I'm with Johnson Controls and run a global team of data scientists that is anchored in our Global Services business, though we take on projects across the enterprise. What primarily pays the bills for the team is a set of churn, or retention, models in Global Services; I think we are up to 12 models in production predicting which customers are going to leave, across three domains and multiple regions. We also do anomaly detection, finding aberrant parts sales and purchases in the field. And we just started an optimization project with our finance department, for our CFO, working on order to cash: how can we reduce DSO, the time it takes to collect cash, across the enterprise. We have done a number of proofs of concept as well, but those are our big three projects right now.

Makes sense, thank you very much. Moving on to the next question: whenever you decide on AI projects, the first thing that comes to everybody's mind is whether to build in-house or buy from a vendor. Did you go through that same experience, and how did you make the decision between building in-house and going with a vendor? Rezwan, you can go first.

These days, developing an in-house solution is probably out of the window; it takes a lot of effort, resources, and bandwidth, especially for businesses like ours whose main focus is the renewable energy side rather than developing core ML solutions. So the question really becomes whether to go with traditional providers like AWS or Azure compared to Abacus. Initially we were developing solutions in Azure, and we recently moved to Abacus after doing a proof of concept with them. In choosing a partner, the most important things for me were whether you can have a collaborative effort in developing the solution and how much time you actually get from the provider. For example, taking a model into production requires a lot of back and forth, and if you want to do it in AWS or Azure there is a steep learning curve.
Throughout that process, if a provider can help me navigate the path and reach my objectives fairly easily, saving my bandwidth and time so I can focus on other tasks and on delivering value for my internal customers, that is what matters. In those respects Abacus has been really helpful: onboarding us onto the platform, developing the models, and deploying the models have all been fairly easy and straightforward, and we look forward to extending the partnership.

Thank you, Rezwan. Terry, what about you?

Same as Rezwan. We just didn't have the bandwidth to do it. For us it wasn't necessarily about building an MLOps platform in-house; we were hand-coding in Python, and the instant we switched to Abacus it saved us 80 percent of the time we were spending. And it's not just training the algorithm; there are so many features in the platform that improve efficiency. We were able to take the time we had been spending in Python on data wrangling, feature engineering, and things outside the platform and use it to make the models better. We simply couldn't have scaled the way we did, either by trying to build our own native platform or by doing it in Python. We did also evaluate Databricks; my company has a global partnership with Microsoft, so that was there for us to try, but there are features and functionality in Abacus that are unique and continue to evolve. Some of the things the team tells me they leverage today are pretty special.

Thank you both of you; I think that's a good advertisement for us. We actually see this a lot. Customers tend to fall into three buckets: in the beginning stage people try to build everything in-house, the second stage is using the cloud vendors, and the third stage is using specialized platforms like Abacus. Where cloud vendors typically fall short is that they don't pay attention to detail, they are not completely focused on the ML platform, and they have a bunch of pieces you have to stitch together. We have seen that happen with a lot of customers who then come to Abacus, just like you mentioned. Moving on to the next question: a lot of people might be curious how you found out about Abacus. Did we reach out to you, or did you see us somewhere? I'd like to hear your experience.

Of course. It's interesting: I was subscribed to Stanford's Digital Economy Lab seminar, and back in October last year there was a presentation from Abacus there. I tuned into that presentation and that is how I learned about Abacus. Then, in November I think, ChatGPT was released, so we were excited; we reached out to Abacus in December and things moved on from there. We did a couple of proofs of concept, signed an agreement in March, I believe, and continued on the journey.

Makes sense. Terry?

One of the members of my team somehow got connected with Abacus, had an initial call, and then brought me in. We really liked the platform, and the ease of doing business with the team was a big deal; it was very, very easy. We gave them a data set that we had used to train a model.
They ran it through the platform and the precision increased tremendously. It was very easy, and they were, and still are, very collaborative. We have a weekly session with one of their senior data scientists and he is very helpful. The barrier to entry was actually very low; it was very easy.

Awesome. And how was the experience of integrating Abacus with the rest of your infrastructure? How long did it take to realize value, and what support did you receive throughout the process? Rezwan, maybe you can go first and Terry next.

We still have most of our data in Azure, and we also pull data through third-party APIs. The integration is fairly straightforward: there are Azure connectors available, and we can also use notebooks to pull the data. Then there are feature groups, and we can pull the data or the inference outputs from the platform back into our own platforms. We didn't face any difficulties pulling data into or pushing data out of Abacus, and even when there are issues, there are people to sort them out, which is the amazing thing.

Terry, from your end, how has integrating Abacus with the rest of your infrastructure been?

This has been the challenge for us, and it isn't really anything to do with Abacus; we have a bunch of security protocols around anything coming into our network. In a perfect world we would be able to push and pull from our data store, our Snowflake instance, into Abacus and then output our inferences back into Snowflake, but we have to use an intermediate FTP site because our security team is not comfortable with that connection yet. There are alternatives, other things we could do, but the security requirements from our group have been something for us to consider. Infrastructure-wise, though, there have been no problems at all.
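As a concrete illustration of the kind of round trip described above (pulling source data from cloud storage, handing it to the platform, and landing inference outputs back in your own store), here is a minimal Python sketch. The Azure calls use the real azure-storage-blob SDK, but the container, blob names, platform endpoints, and token are placeholders, not the actual Abacus API; a real integration would go through the vendor's connectors or SDK.

```python
import io
import os

import pandas as pd
import requests
from azure.storage.blob import BlobServiceClient

# Pull a source table from Azure Blob Storage (connector-style pull).
# AZURE_CONN_STR, the container, and the blob name are placeholders.
blob_service = BlobServiceClient.from_connection_string(os.environ["AZURE_CONN_STR"])
blob = blob_service.get_blob_client(container="energy-data", blob="turbine_telemetry.csv")
telemetry = pd.read_csv(io.BytesIO(blob.download_blob().readall()))

# Hand the data to the ML platform. The endpoint, token, and payload shape are
# hypothetical stand-ins for whatever the platform's ingestion API expects.
resp = requests.post(
    "https://platform.example.com/api/v1/featureGroups/turbine_telemetry/rows",
    headers={"Authorization": f"Bearer {os.environ['PLATFORM_TOKEN']}"},
    json=telemetry.head(1000).to_dict(orient="records"),
    timeout=60,
)
resp.raise_for_status()

# Later, land inference outputs back in your own storage (the reverse direction).
preds = pd.DataFrame(
    requests.get(
        "https://platform.example.com/api/v1/deployments/forecast/latestPredictions",
        headers={"Authorization": f"Bearer {os.environ['PLATFORM_TOKEN']}"},
        timeout=60,
    ).json()
)
out_blob = blob_service.get_blob_client(container="energy-data", blob="forecast_outputs.csv")
out_blob.upload_blob(preds.to_csv(index=False), overwrite=True)
```

When a direct connection is not allowed, as in the Snowflake case above, the same pattern works with an intermediate SFTP or blob location on both sides of the security boundary.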
Awesome. For the next question, this is something we see all the time: once in a while we meet code-first data scientists who want to write all the code themselves and are hesitant to use any AI platform. They believe that writing the code gives them a lot of control, that they know exactly what is happening, that they have the code for the exact model, and so on. Have you had any such resistance from your internal data science teams, people who want to go through the process and build the models themselves rather than use a vendor like Abacus? Rezwan?

If I think about it carefully, it depends on the use case and the problem itself. If you are working on a very innovative, cutting-edge problem, say developing ChatGPT-5, then off-the-shelf solutions are not readily available, and in those instances you probably need to grind through the problem yourself to do better. But if the use case is something like forecasting or churn detection, for which plenty of off-the-shelf solutions and models are available, the best approach is to use the off-the-shelf solution and then try to improve the data rather than focusing on the model itself. Improve the data set and the data quality so the model gets the best available data. That is what we are doing: focusing on how we can add more data, which data works better for our model, and what we can use to improve the forecast, rather than trying to come up with the best possible model, because the accuracy difference from choosing a different model is fairly limited compared to the difference the data makes. So the data is the most important thing. The other point, on developing models internally versus on a platform: there is Google Colab, Abacus has its notebooks, you have Codespaces, and if you run notebooks in those platforms, integration and access to the code become very easy and collaboration is straightforward. Data scientists should be comfortable using the platform rather than keeping a cold-storage, local-computer type of setup. I haven't seen anyone not embracing that; I feel it is more intuitive to use cloud solutions for the ease of your work.

Makes a lot of sense. Terry?

I agree. I think all of AI at this point falls under the same explanation: it's an augment, not an either-or. The platform has been a fantastic augment to our work and has allowed us to focus on feature engineering and on building business context, which is the most important part. Training the algorithm is probably the last 10 or 15 percent. If someone really wants to do that by hand in Python, then by all means do it, but if your objective is to solve business problems and to scale, it is very sensible to leverage the capabilities of the platform and spend your time embedded with the business, figuring out how you can improve the features you feed to it. So the discussion of whether the platform is going to steal work from data scientists isn't really sensible, because it's an augment; it very much enhances the work.

I totally agree with both of you. The way I look at this problem is: you have data and you want to solve a problem, so there are two steps. The first is framing the problem so that a model can extract signal from the data you give it. On the second part, there has been enough research that a lot of models are available out of the box, and companies like us are doing more and more research in that area. Data scientists create a lot more value if they focus on the first piece: how do I frame the problem, how do I get more information to solve it. We have seen this time and again with our customers, where the real value is realized when we tweak the problem or frame it differently. It also aligns with one of our core principles, which is to not reinvent the wheel.
If something already exists, there is no point trying to build it again and again. I'm totally on board with what you both said. Moving on to the next piece: given that we have already mentioned ChatGPT and LLMs a couple of times, and generative AI and LLMs are currently the trend and a hot discussion topic, I would like to hear your perspective on how you are leveraging them in your business today, or how you plan to leverage generative AI and LLMs. Rezwan?

I have been answering all the questions first, so maybe we can flip the order from the next question on. For LLMs and large-language-model-related discussions, the way I see it, in a business or an industry they can help by improving productivity, and they improve productivity by democratizing information sharing within the team and through workflow automation. I have been thinking about how we can integrate a chat LLM into our work processes. When you see a fancy YouTube video of someone doing this and that, you get the information very quickly, but the more important value comes if you can connect the chat LLM with, for example, your Microsoft Teams, so that if the company has information it needs to share, I can access it without asking someone. And if I have a database, can I pull the data directly from it by writing chat prompts? That is different from the other things you can do with ChatGPT and chat LLMs, which are more learning-based use cases. So at the business scale, improving and automating the underlying workflows and sharing information are probably the areas where a chat LLM can help our use case significantly.

I've had five or six different instances where someone has come to me to evaluate what we should do with ChatGPT, and I handle it the same way I handle every other application: it's just another extension of applied math and applied science. What problem are you trying to solve, what outcome do you want, and what problem needs to be solved to deliver that outcome? If ChatGPT is the tool to do that, then it can be done. Generally, there is a class of applications in an enterprise, depending on the specific problem, that would be really well served by some flavor of ChatGPT-style augments or assistants. As an example, in the Abacus platform I know my team heavily uses the functionality to develop scripts with prompts, proofread scripts, review error codes, and find bugs in code. Any sort of augment or assistant application, and Rezwan mentioned a couple that I really like as well. But for me it all comes down to whether the technology solves a specific problem that leads to a particular outcome, and I treat it just like every other application at this point.
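Rezwan's idea of querying a database with chat prompts instead of SQL can be prototyped in a few lines. The sketch below is a minimal, hypothetical example, not the Abacus implementation: it uses the OpenAI chat API to translate a natural-language question into SQL against a small SQLite table. The model name, schema, and prompt are assumptions, and a production version would need guardrails such as read-only credentials and proper query validation.

```python
import sqlite3

from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

# Toy database standing in for an enterprise data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, monthly_spend REAL, region TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [("Acme", 850.0, "NA"), ("Globex", 1900.0, "EU"), ("Initech", 640.0, "NA")],
)

SCHEMA = "customers(name TEXT, monthly_spend REAL, region TEXT)"

def ask(question: str) -> list[tuple]:
    """Translate a natural-language question to SQL with an LLM, then run it."""
    client = OpenAI()
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You translate questions into a single SQLite SELECT statement "
                    f"against this schema: {SCHEMA}. Reply with SQL only."
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    # Basic cleanup in case the model wraps the SQL in a code fence.
    sql = completion.choices[0].message.content.strip().strip("`").removeprefix("sql").strip()
    if not sql.lower().startswith("select"):
        raise ValueError(f"Refusing to run non-SELECT statement: {sql}")
    return conn.execute(sql).fetchall()

print(ask("Which customers spent less than 1000 dollars this month?"))
```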
Makes a ton of sense. As Bindu mentioned in the previous presentation, we are trying to use AI everywhere in the platform; we want to make it easy for data scientists to build products, as Terry described, and we also have standalone use cases like the ones you mentioned, specifically chatting with your own data and pulling data from your tables without having to write code, which we call Data LLM. Those are the two use cases most heavily adopted by our customers. As for the use cases I see for LLMs in the enterprise as a platform owner: I think it is more about agents, combining code, a large language model, a vector store, and a bunch of tools, and orchestrating them really well so you can execute tasks without human involvement. One great example, an internal use case, is the AI chat that Bindu mentioned: you can literally chat with the platform. If you have data about customer churn, you can ask it to show all customers who are spending less than a thousand dollars this month, or ask it to plot a chart of customers spending more than a thousand dollars against the market cap of the company. You can do all these interesting things that would previously have required a lot of a data scientist's time to create on the platform. So basically it is going to be more about actions and chatting with your data; those are the main things I see going forward.

Cool, so maybe this time I'll flip the question and start with Terry. Terry, you have been a customer with us for a long time; what is your favorite feature of the platform?

There are many. From an efficiency standpoint, the native ability to train multiple algorithms at the same time. We haven't seen a lot of variability in precision across them; XGBoost tends to be the one that puts out the best results, the highest precision, for most of our models. But I separate what has been an efficiency saving for our team from what has had the biggest business impact, and the feature importance we get, for the model overall but also for each individual inference, has been invaluable. When we put an inference into the business, say a potentially at-risk contract, we can say here are the top five reasons why. It's not that you can't do that off the platform, but having it native to the platform is both an efficiency and productivity gain for us, and it has really helped us provide value to the business in terms of how they take up the output we give them. That has been a big bonus. I'll also say this: what we have found with the platform is that things that are barely past research papers, the latest technology available in the field, are in the platform, and if they're not, you're working with us to put them in. Over the almost two years we have been using it, that has been a huge benefit for us and something we appreciate.
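To make the point about per-inference feature importance concrete: outside the platform, one common way to produce "top five reasons" for an individual churn prediction is SHAP values on a tree model. The sketch below is a generic illustration on synthetic data, not Johnson Controls' model or the platform's internal method; the feature names and parameters are made up.

```python
import pandas as pd
import shap
import xgboost as xgb
from sklearn.datasets import make_classification

# Synthetic stand-in for a churn dataset (hypothetical feature names).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5, random_state=0)
features = ["contract_age", "service_calls", "days_since_visit", "invoice_disputes",
            "spend_trend", "sites_covered", "renewal_window", "nps_score"]
X = pd.DataFrame(X, columns=features)

model = xgb.XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(X, y)

# SHAP attributes each individual prediction to the features that drove it.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

def top_reasons(i: int, k: int = 5) -> pd.Series:
    """Return the k features that pushed prediction i the hardest, with their SHAP values."""
    contrib = pd.Series(shap_values[i], index=features)
    return contrib.reindex(contrib.abs().sort_values(ascending=False).index).head(k)

print("P(churn) =", model.predict_proba(X.iloc[[0]])[0, 1].round(3))
print(top_reasons(0))
```

The per-row SHAP values are what turn a bare risk score into "here are the top five reasons why" that an account team can act on.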
Thank you. Rezwan?

The main benefits I see are these. First, we were looking for an optimization solution, and the premium optimization provider is Gurobi; for our use cases, a license from Gurobi or a comparable optimization partner can be very expensive. We were trying to build it ourselves, and to the best of my knowledge an optimization use case is not available out of the box in Azure or AWS, so having Abacus helps us develop our optimization workflow. The second thing is the code generator: we have a GitHub Copilot subscription, but we are slowly moving to the Abacus code generator, and in the near future we may even be able to cancel the Copilot subscription. The third is the new chat AI interface that Abacus is launching; I would love to have that as a connector in our Microsoft Teams or Slack so we can pull data directly from Abacus through our native chat platform.

Makes a lot of sense. Integration with Slack and Teams is one of the most frequently requested features and we are on top of it; we should have it soon. Cool, now the other side of the coin. Terry, are there any features you wish to see in the Abacus platform in the near future?

I think you're probably there, or working on it, but the training of algorithms, that last 15 percent, is obviously the core of the platform, while the bulk of any data science project is feature engineering and data wrangling. The more of an augment that can be native to the platform from that perspective, in terms of making the connection between the performance of the algorithm and what is there, for instance whether the data is out of distribution (and I know those features are in the platform), anything to help with feature engineering and the preliminary work that consumes so much time, and that connection to data ingestion into the platform, would be a big help.

We actually believe AI agents, AI building AI, will solve that problem; that is the cutting-edge research we are working on. I totally get where you're coming from: the data wrangling phase takes a lot of time, while model training is really quick, and we are trying to use AI as much as possible to automate it. We have already made some progress and you will see a lot more in that area.

I was just going to add: especially for a standard application like forecasting or churn, it would help to have a way to evaluate what features are there and then suggest where to go next. If you are obviously not explaining much of the variance of the data set with what you have, some sort of native capability that suggests what to look for or what additional features might improve performance would be valuable. I know that is hard to do, but I feel like that is the next frontier: what do you have that equates to this level of performance, and what else could you look for to improve the model?
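One simple, platform-agnostic way to approximate what is being asked for here (flagging which existing features actually carry signal, and screening candidate new features before fully engineering them) is a quick relevance scan with permutation importance and mutual information. This is a generic sketch on synthetic data with made-up column names, not a description of any Abacus feature.

```python
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in: some features are informative, others are noise.
X, y = make_classification(n_samples=3000, n_features=10, n_informative=4,
                           n_redundant=2, random_state=1)
names = [f"feature_{i}" for i in range(10)]  # placeholder feature names
X = pd.DataFrame(X, columns=names)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

model = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_train, y_train)

# 1) Which current features does the trained model actually rely on?
perm = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=1)
reliance = pd.Series(perm.importances_mean, index=names).sort_values(ascending=False)

# 2) Which candidate columns (not yet in the model) carry any signal about the target?
candidates = pd.DataFrame({
    "candidate_a": y + pd.Series(range(len(y))) % 3,  # contains real signal
    "candidate_b": pd.Series(range(len(y))) % 7,      # pure noise
})
screen = pd.Series(mutual_info_classif(candidates, y, random_state=1),
                   index=candidates.columns)

print("Model relies on:\n", reliance.head(5), sep="")
print("\nCandidate feature screen (higher = more signal):\n", screen, sep="")
```

A scan like this does not replace domain-driven feature engineering, but it gives a cheap first answer to "which of my existing features matter, and which candidates are worth building out."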
Makes a lot of sense. Rezwan?

On top of the Microsoft Teams connector, one thing I found that would be really helpful for our use cases is decision interpretability. We use forecasting and we are building our optimization use case, but the interpretability piece is, I think, missing. For example, when we generate an inference or a forecast, or bid or offer through the optimization, how was that inference generated? If there is a spike in the forecast, or a flat line, how can we quickly connect it back to the raw data and interpret which underlying regressor was the driver of that spike? That interpretation would be extremely helpful. On the use-case side, especially for the energy and renewable energy business, GIS-based use cases, where solar energy developers, for example, can use a GIS platform to build predictive models, could be extremely helpful.

Makes a lot of sense, definitely noted. On the former, I'm not sure whether we have shown you this, but the explainability for forecasting models should address some of what you mentioned; maybe we can take a look at that offline. The latter is definitely noted as well; it makes sense for some of the use cases. Cool, so we are moving to the tail end of the panel. What suggestions do you have for other executives looking to start leveraging AI in their organizations? How should they go about thinking through the process, picking vendors versus building? Generally, any advice for others who are looking to start leveraging AI?

Not to belabor the point, but I think it is crucial not to do it just to do it. It is crucial to have clearly defined, quantifiable outcomes, and then two or three very clearly defined problems. The more problems you take on, the more complex it is to solve them, so really clearly define two or three problems that would deliver that quantifiable outcome, and then do that over and over. Sometimes it's AI, sometimes it's just basic statistics, sometimes it's maybe a dashboard; it is just solving problems. What has been crucial for us is to evaluate tools that help us solve problems efficiently, and the platform has obviously done that for us; we have been able to scale applications that we could never have hired enough people to scale. So for us it has been really important to focus very clearly on measurable outcomes, then on the problems to be solved to deliver those outcomes, and then to attach the technical solution to that.

Totally agree with Terry: finding the right problems and solving them in a greedy manner is the right approach here, because at the end of the day you have to support the business, and the highest-ROI problem is the most important problem. On top of that I would add that, while developing solutions or solving problems, having a crawl-walk-run type of strategy can be helpful.
Rather than jumping directly to the run stage, the best possible AI solution, in most cases a crawl or walk type of simpler solution can solve 60 to 70 percent of the problem in a very short time. Then, as each iteration or milestone is achieved, hold a retrospective to reassess the approach.

Thank you, that's very useful. Thank you, Terry and Rezwan, it was wonderful chatting with you. I think we are on time, so I'll hand over the mic to Bindu for the next panel. Thank you.