Well, thank you for joining us, everyone. As mentioned previously, I'm Yousef Alfaham, Technical Marketing Manager at Autodesk, and with me today is Ryan Brown, Technical Solutions Engineer at Autodesk. Our topic for today is holistic wastewater collection modeling using InfoWorks ICM.

Before we get into the content, a quick safe harbor statement: if any one of us mentions forward-looking statements pertaining to any of our solutions, we do not want you making purchasing decisions based on those statements. That is essentially the summary of the entire slide.

That transitions us to our agenda for today. We're first going to introduce InfoWorks ICM; some of you may have heard of it, some of you may not have. We'll explain why, and what some of the unique capabilities in ICM are, and then we'll dive right into wastewater collection modeling, covering it sequentially from network creation to model calibration to analyses and output generation. That's typically the logic behind a typical wastewater collection modeling project. We'll wrap up with some resources and then Q&A at the end.

It's fitting to discuss ICM in the context of the Autodesk water portfolio. We're primarily split into desktop solutions and cloud solutions: on the desktop we have water distribution, wastewater collection, stormwater, and flood; in the cloud we have asset management and operational analytics. Today we're talking about wastewater collection, and more specifically urban-scale wastewater collection, using our solution InfoWorks ICM.

So what is InfoWorks ICM? The question that's generally asked is what ICM stands for: it stands for Integrated Catchment Modeling. It allows you to model river, urban drainage, and overland systems and to conduct hydrologic and hydraulic studies or assessments.

Why InfoWorks ICM? As you might be familiar, a modeling project generally goes beyond a single modeler; you have multiple modelers or team members trying to work with and leverage the same model. One of the unique capabilities of InfoWorks ICM is the workgroup database. Historically we've positioned the on-premise workgroup database, which works as follows: you have a central data hub holding a master copy of your model that is kept up to date, and multiple users can access that central copy. In the event that the same element is edited by two different users, there's a clash or conflict; ICM flags it and tells you to resolve the conflict before you can proceed. It simplifies the collaborative environment as your team members work in and access the same model. That's what has historically been positioned, and in recent years it has transitioned to what I'll call a cloud workgroup database: similar to what I just described, except you don't need the on-premise IT infrastructure that the on-premise database required.

That takes us right into our topic, wastewater collection. This is going to be an engaging session, so we're going to ask you quite a few poll questions, and we hope for your participation and engagement. So let's get right into our first poll question.
The first poll question asks: what is your current solution for wastewater collection modeling? We'll give you a few seconds; there are four options: InfoWorks ICM; InfoSewer, InfoSWMM, or XPSWMM; a competitor solution; or we do not perform modeling. Thank you for responding, and please continue. It seems the leading responses right now are split between ICM and "we do not perform modeling," with the latter dominating to an extent. We'll give folks a few more seconds and then transition to the next slide. Okay, so on the responses: most of the attendees use ICM, the other majority do not perform any modeling, and we have some folks using InfoSewer, InfoSWMM, and XPSWMM, plus some using competitor modeling solutions as well.

That's a good segue to our next question: what is your preferred modeling solution for wastewater collection? Some of you may be using one tool but would like to use something else; that's really the driver behind this question. And if you don't perform any modeling, you might aspire to use something, or maybe you don't know yet. Responses are still coming in, so we'll give folks another second or two. All right, we can end the poll there. The majority of you responded that you'd prefer using InfoWorks ICM, which is great for us to hear, and hopefully this presentation continues to motivate that.

The final poll question before we kick this off, for those who perform wastewater collection modeling: what is the biggest pain point in your present wastewater collection modeling workflow? Is it in the network creation segment, in model calibration, in the analyses, or in output generation? Based on the responses coming in, the biggest pain point for most is typically model calibration. All right, we can go ahead and close the poll: most respondents highlighted model calibration as the main pain point in the wastewater collection modeling workflow. Thank you all for engaging with us.

This takes us to our first pillar. As you noticed from the previous poll question, we've segmented the webinar into four primary pillars under wastewater collection: the first is network creation, the second is model calibration, the third is modeling analysis, and the fourth is output generation, and the presentation will progress in that same sequence.

With network creation, the first thing that comes to mind is leveraging ICM's Open Data Import Centre to bring in your node and conduit shapefiles and map them accordingly. What you're seeing in this short video is how easy it is to use the Open Data Import Centre to bring in your node shapefiles and map the object fields to the import fields, and then to do the same for the conduits: you bring the shapefile in, map it between the object field and the import field, and in however many clicks you've imported your network's conduits, nodes, and any other existing features.
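For readers who want to script this step rather than click through the dialog, here is a minimal ICM Exchange (Ruby) sketch of the same Open Data Import Centre workflow. The database address, network ID, file paths, format code, ODIC table names, and the 'Error File' option key are all assumptions from memory, not from the demo; verify the odic_import_ex signature against your ICM version before relying on it.

```ruby
# Hedged sketch: scripted ODIC import of node and conduit shapefiles.
db  = WSApplication.open('snumbat://localhost:40000/Master', false) # hypothetical workgroup DB
net = db.model_object_from_type_and_id('Model Network', 17).open    # hypothetical network ID

# The mapping between object fields and import fields lives in the saved .cfg file.
net.odic_import_ex(
  'SHP',                                             # source format (assumed code)
  'C:\\odic\\wastewater.cfg',                        # saved ODIC configuration
  { 'Error File' => 'C:\\odic\\import_errors.txt' }, # assumed options-hash key
  'Node',    'C:\\gis\\manholes.shp',
  'Conduit', 'C:\\gis\\sewers.shp'
)

net.commit('Imported nodes and conduits via ODIC')   # save as a new network version
```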
From there, you might be interested in bringing in other background layers or GIS shapefiles, such as parcel data, through the GIS layer control within InfoWorks ICM. On top of that, some people prefer accessing models with web maps, so you can also use ICM's GIS layer control to bring in a web map server via a URL; when you confirm, it loads up in a second or two. That's pretty much the process to go from a blank ICM sheet to a network with your nodes, conduits, backgrounds, parcels, and so on.

Typically, after you've created your network, there's a desire to know where your data gaps are, because during the import process not all data is going to be complete. I'm going to pause here for a second: ICM provides the ability to validate your network once you've brought it in, with color coding that highlights what you need to pay attention to before you proceed to the next step of network creation. One of the unique things about it is that once you've validated your model, you'll see a message at the bottom saying, for example, that a node is missing its ground level; if you click on that element, its properties show inline validation color coding that immediately highlights what you need to populate, instead of you having to search for where the validation issue is. In my opinion, that's a pretty unique and streamlined process for validating your network and visually identifying where the gaps are.

The next thing we'll look at is plotting a profile, which once again highlights that there are some funky things happening in this long section, including possibly incorrect or missing invert elevations. Notice that when you click on an element in the profile, you see the inline validation color indicating that you need to input this number into ICM.

The thing is, data coming out of the upstream import process is almost never fully complete, and ICM gives you the ability to automate the filling of that missing data instead of having to go into each incomplete conduit or node and input values manually. ICM has an inference feature; as a reminder, this is the profile that had missing data in it. You could certainly go into each of those missing data points and populate the numbers manually, but with ICM you can automate that process and use the inference feature to ask the software to populate the missing conduit and node data, and also to flag that data. Where the mouse is right now, you're seeing flagged invert values: any data the software changes using the inference feature can be identified easily by its data flag, as you'll see toward the end of this video. How does it work? It's as simple as dragging and dropping the inference object onto your network, and it's complete.
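Before (or after) running inference, it can also help to script a quick inventory of the gaps. A minimal sketch using ICM's UI Ruby scripting, which is covered later in the Q&A (Network > Run Ruby script); hw_node and ground_level are standard InfoWorks table and field names:

```ruby
# Quick gap report: list nodes that still have no ground level.
net = WSApplication.current_network   # the network currently open in ICM

missing = net.row_objects('hw_node').select { |n| n.ground_level.nil? }
missing.each { |n| puts "#{n.node_id}: missing ground level" }
puts "#{missing.size} node(s) still need a ground level"
```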
What's the proof that it's been completed? Once you validate, all the red flags are gone, and when you plot that long section again you'll see how it looks in a second: none of the funky-looking inverts or missing elevations. And there's the data flag I was referring to: you're seeing "INF," which indicates that the value was populated through the inference feature. A cool thing about data flags, and this was a neat video that Ryan created: he changed the data in that field, and the flag automatically updated from INF, meaning an inferred data input, to RFB, Ryan's initials, meaning that he input or updated it. That's one of the unique things about ICM: the ability to track where your data came from. Was it imported from your shapefiles, updated through inference, or updated manually by a user?

So far the sequence we've progressed through is building your network, validating it, and populating your data gaps. Now suppose you already have a network, and there's a development or project getting built in your system, and that development is using Civil 3D to build its preliminary network. There is an integration between Civil 3D and InfoWorks ICM: through an add-in ribbon within Civil 3D, you can bring a pipe network you've built in Civil 3D into your InfoWorks ICM network, or even into a blank model. What you're seeing here is a pipe network in Civil 3D; if I pause for a second, on the top left-hand side, under an add-in, you have the import/export between Civil 3D and ICM. You simply need to download the add-in, and it's built into Civil 3D, so there aren't a lot of manual workarounds with exporting and importing. It pushes out a CSV file, and then you go to ICM and bring in that CSV file from Civil 3D; you can do the same going from InfoWorks ICM to Civil 3D. The idea is that for any development happening in your system, you can build out that network in Civil 3D, bring it into ICM, run the analyses, validate it, and maybe change diameters or profiles as needed.

Finally, after you have a completed network, the logical next step becomes your dry weather flow diurnals, or diurnal patterns. In ICM you have a streamlined ability to do that. In this short video we show how to set up a dry weather flow for a residential diurnal, and you can obviously create as many diurnals as you'd like. You'll see the profile number, where it says "profile 1" here, associated with the diurnal you're creating. That's important because when you assign diurnals to subcatchments, you assign them based on the profile number, not on the description you're inputting. This is an example where we're setting up a residential diurnal profile: we have the data in an Excel spreadsheet, we copy it and bring the factors in, and you see the graph update accordingly.
You can create as many diurnals as you'd like and assign them to the appropriate subcatchments, as we see in this video: we've selected our subcatchment, and in the properties there's your wastewater profile, the diurnal associated with it. It won't show the description, just the profile number.

The next thing, after you've developed your network, validated it, used the inference tool to populate missing data, maybe brought in a new network from Civil 3D, and input your diurnals, is that you might want to set up your rainfall events. We have two videos that demonstrate how. You can set up a custom, user-defined rainfall event; in that case it's not a design rainfall, so you uncheck that option and populate your user-defined rainfall event accordingly. In the second video we demonstrate how to set up an SCS design rainfall event: you have a long list of events to choose from, here we selected SCS, and then you populate the characteristics of your rainfall event and click OK; you can even graph it to look at it visually if you'd like. Similarly, you can do the same for a NOAA Atlas 14 rainfall event, which is what's shown in this example: you use the rainfall generator to get that data imported into ICM, you see all the data here, and you also have the ability to graph it. What's being demonstrated is the flexibility and the number of options you have as you set up your rainfall, from user-defined events to the various design rainfall options. That concludes our brief overview of ICM's capabilities in network creation, and now we transition to model calibration, which I'll pass on to Ryan to cover.

Thanks, Yousef. Can you all hear me all right? Yep? Okay, cool. So InfoWorks ICM has a time series database integration. This is a really neat capability in terms of being able to link directly into things like SCADA data, and it's really the first step in transitioning a model from something that's just used for planning into something that can be used for live modeling. The time series database we were looking at at the beginning was a series of different flow monitors, level sensors, and things like that. For this particular case, I know there's a flow meter on this pipe, so we can overlay the model data, in the reddish color, against the observed data, in the goldish color, and see how well they match up, along with the volumes associated with each, basically the area under the curve. We can also look at scatter graphs, export these out to CSVs, and do further statistical analysis if we want to, things like Nash-Sutcliffe and other metrics, to make sure you're seeing a nice one-to-one relationship between the observed and predicted values. Streamlining that process means you can calibrate the model more often, rather than doing it just every couple of years or so.
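As a flavor of the "further statistical analysis" mentioned here, a small, self-contained Ruby sketch of the post-export step: computing a Nash-Sutcliffe efficiency (NSE) from a CSV of observed versus modeled flows. The file name and column headers are hypothetical; an NSE of 1.0 is a perfect fit.

```ruby
require 'csv'

# Read paired observed/modeled flows exported from ICM (hypothetical layout).
obs, mod = [], []
CSV.foreach('flow_comparison.csv', headers: true) do |row|
  obs << row['observed'].to_f
  mod << row['modeled'].to_f
end

mean_obs = obs.sum / obs.size
ss_err   = obs.zip(mod).sum { |o, m| (o - m)**2 }  # squared model errors
ss_tot   = obs.sum { |o| (o - mean_obs)**2 }       # variance about the observed mean
nse      = 1.0 - ss_err / ss_tot

puts format('NSE = %.3f', nse)
```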
Now, that direct time series database link is a little bit advanced, you could say, for some utilities or organizations; maybe they just don't have a lot of growth in their system and really don't need a direct link into their SCADA data or to calibrate that often. The option there is a fairly similar idea, but the setup is a lot less complicated: using flow survey data. These flow surveys can include things like observed flow, as we're seeing here, but also level information, plus the rainfall event associated with the time period the observed event was taken in. So instead of having the time series database in the run, we just use the flow survey in that run and then run the simulations, and after it completes it looks fairly similar to what we saw before with the time series database integration, displaying the observed data on top of the predicted, or modeled, data.

Also pretty key is being able to use SQL queries. SQL queries can be used for just about anything, from editing the model to looking at results. They can be particularly useful for calibration because they lend themselves to things like adding scenarios, as we're seeing here, or multiplying against different factors. You can see I've added those scenarios using the SQL query, but they're all just copies of the base, and as we progress through here, I update the population in each one simply by multiplying a factor against it. I've also used these when modeling river sections and similar cases, just by applying a multiplication factor against them. You can also use them in conjunction with results data, referencing something in the results if there's a particular metric you're trying to meet: say you want to make sure your d/D is a certain value, and if it's not, you want to increase the pipe by three inches or so. You certainly have the capability of doing that with SQL queries, instead of manually going through and matching up between the model and the results and all that mess.
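To make the scenario idea concrete, here is a hedged one-liner in ICM's SQL dialect of the kind being described: applying a 20% growth factor to subcatchment population in a copied scenario, and flagging the edit so it stays traceable. This is not the demo's exact query; run it against the Subcatchment table, and verify that your ICM version accepts comma-separated assignments in a SET statement.

```sql
SET population = population * 1.2, population_flag = 'SQL';
```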
Okay, there we go. Also, and it's not that new now, though I still kind of consider it new, it's something we've rolled out that is available with any InfoWorks ICM license at no additional cost: the ability to run simulations in the cloud. In this particular scenario I have a NOAA ensemble of storm events; there are dozens and dozens within this file. The idea is that maybe we're running some Monte Carlo, climate-change types of analyses, hundreds if not thousands of simulations, trying to get an idea of what things look like. For a 1D system, and I'll get to this a little later, it's not necessarily as impactful if you're just running one scenario, but if you're running multiple scenarios, as you can see here, there's pre-processing and uploading to the cloud, but all these simulations run in parallel. So instead of running each simulation one by one on a local computer or local server with whatever hardware is there, which would have taken something like four hours, this particular instance took six minutes to run all these different scenarios. That's about the end there.

I did do a little bit of testing, like I mentioned, for one-dimensional systems. This is a fairly simple model, I think around 800 pipes, that typically takes only a couple of minutes to run, but I ran a thousand simulations, and as you can see, the cloud run time includes the time it takes to upload and download results too, just to make a fair comparison; not a huge difference there. But if we look at the 1D-2D analyses, we have some really great hardware in the cloud: GPU cards can be leveraged for the 2D computations, and the better the GPU card, the better it's going to go. As you can see here, 3,600 simulations took just over two hours. I tried doing the same thing on my local computer, ran it overnight, and gave up on it at 17 hours; based on the progress, it probably would have taken something like 24 to 30 hours. Pretty impactful. We've also had clients going from on-premise into the cloud to see what kind of benefits they'd get: they had six modeling simulations taking a couple of hours each, so running them all individually was six times two, about twelve hours. After moving to the cloud they were able to run them all in parallel, not to mention leveraging all that nice hardware without the IT overhead, and it took about two minutes for them all to run, really improving the efficiency on that project. And modeling analysis, this is you, right? Yeah? Okay, I'll hand it back.

Awesome. Just a quick recap before we progress: so far we've covered, sequentially, network creation and model calibration, and some of the unique capabilities that exist in ICM, from the time series database to cloud computing. That's the intent of what we're covering in this presentation and will continue to cover. Next we get to modeling analysis, but before we move on, some more engaging poll questions.

A question on steady state and EPS runs: do you predominantly utilize steady state runs, extended period simulations, or is it evenly split between both during your analyses? We'll give folks a few more seconds to answer. Okay, it seems most people are evenly split between both, with EPS a close second, and a good number of people primarily do steady state. That takes us to our next question, for people who only do steady state runs: which of the following do you perceive to be limitations of steady state runs, and which of those limitations would you find compelling enough to transition from steady state to EPS? We'll give folks a few more seconds on this one. It seems like a good split between "all of the above" and "it's a snapshot in time and not dynamic."

Okay, the final question, for people who strictly do steady state: what might be limiting your adoption of EPS?
Is it the absence of EPS capabilities in your present modeling solution, the number of required inputs, the perceived complexity of EPS, or the perceived adequacy of steady state? We're getting a lot of respondents highlighting that the number of required inputs for EPS is what's limiting their adoption, along with its perceived complexity.

Poll questions aside, thank you all for responding, engaging with us, and giving us your thoughts and feedback as we asked those questions. As a reminder, in this section we're talking about modeling analysis. After calibration, you might want to start running some analyses, and there's a lot you can do; we broke it down into what you can do with ICM from a planning standpoint, from a design standpoint, and from an operations standpoint.

From a planning standpoint, the first thing that comes to mind is leveraging the total storage dialog to identify the static, live, and dead storage in your network. As this video plays, the value is being able, in the planning phase, to identify where you might have measurable or significant dead storage, which can contribute to something like H2S formation. You're seeing how simple it is to identify the static, live, and dead storage, and in a second we'll show that after you've run that total storage analysis, you can go to your results table: apologies, I went too far ahead. This is the column showing dead storage; we sort it descending, so the greatest values appear on top, and then highlight the conduits in the network that have measurable dead storage. That allows you to detect things like this, which will pop up in just a second, features in your network that might be limiting the draining of areas of your network. That was the first use case we identified for leveraging ICM for planning purposes, and granted, this might be a simplified example, but the point applies throughout.

The other thing we wanted to cover is modeling to streamline design: surcharge and scripting. This is an example of a network, a long section, that's surcharging, which is obviously problematic. What you could do manually is upsize pipes through iterations and resolve it, which, depending on your experience, might take a significant amount of time. Instead, you can do one of two things: scripting or SQL. In this video we cover scripting, which might sound overwhelming, but it's maybe not as overwhelming as we make it out to be, and I'll explain why. The first step, after you've noticed the surcharging, is identifying the pipes that have a max surcharge state of one or two, depending on whether you're looking at surcharging by depth or by flow. In this example we're focusing strictly on pipes with a max surcharge state of two, and you can see there are quite a few of them here.
Iterating to upsize those pipes until they drop to a value lower than two could again take quite a bit of effort. So here's what you can do instead; this looks overwhelming, but we'll get to our script and explain it. The script of interest is right in the middle of your screen. What it says is: if a pipe's max surcharge state is less than two, there's nothing to upsize, but if it's two or greater, we upsize the pipe based on certain criteria we've input in the script. The convenience is that through version history you're also tracking all the iterations where the model upsized pipes incrementally; that's what you're seeing here: in every single increment, every version, each scenario represents an upsized pipe based on the script you ran. When you get to the final scenario, which you can select to validate that the script gave you what you needed, you run it, view the results, and plot the profile. Keep in mind that early on we were looking at a long section that was surcharging throughout; after running the script, we look at the long section again, and it's no longer surcharging. The idea is that things you could have done through many iterations... go ahead, Ryan. Just to add to that: basically, what that script does is loop through and continually re-run the model. We did get a question that's relevant here, so we can go ahead and answer it: how do you run a script in ICM? The language is called Ruby, and I believe it's just a matter of pointing ICM at the script file; keep going for now and I'll come back on and answer in more detail.

Yep, so as Ryan highlighted, that's the surcharge state and scripting approach. You can do something similar along those lines with SQL queries in ICM. The idea is the same as the previous example: you have the surcharge state, but you'd like to update the conduit size, the conduit width, based on set criteria, and you're probably noticing immediately that SQL seems a little simpler than the previous example. What we're saying in this SQL query is: identify a max surcharge state greater than or equal to two; if the conduit width is less than 17 inches, update the conduit width by adding 3 inches, and flag that data update with a flag called SQL; and if the max surcharge state is greater than or equal to two and the conduit width is 17 inches or greater, update the conduit width by adding 6 inches and flag it as SQL as well. Finally, there's a neat thing Ryan added: we asked the software to push out a CSV file with the information on the updated elements in the network. It's probably a simpler interface than the scripting we covered previously, but both options are there for you. And the idea is the same: all those steps that used to require multiple iterations, manual checking, going back and forth, adjusting, creating scenarios to check, and combing through version history, you no longer need to do manually.
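To make the two approaches concrete, here is first a hedged, single-pass Ruby sketch of the upsizing step. It is not the demo's script: the actual demo script additionally loops, re-running the simulation until nothing surcharges, which is omitted here. It assumes you've already selected the pipes with a max surcharge state of two (as in the video), that widths are being worked in inches (ICM may use native units internally, so check), and that a user flag such as 'SCR' has been defined in the flag tables; all of those are assumptions.

```ruby
# Hedged sketch: upsize the currently selected conduits per the demo's criteria.
net = WSApplication.current_network

net.transaction_begin                       # group the edits into one commit
net.row_objects_selection('hw_conduit').each do |c|
  c.conduit_width += (c.conduit_width < 17.0 ? 3.0 : 6.0)
  c.conduit_width_flag = 'SCR'              # hypothetical user flag for scripted edits
  c.write
end
net.transaction_commit
```

And here is a hedged reconstruction of the SQL query as narrated: the two SET statements mirror the stated criteria (under 17 inches gets 3 inches, otherwise 6 inches, both flagged 'SQL'). The sim. prefix for referencing a results field is an assumption; pick the actual max surcharge result from the SQL dialog's field list. The demo also wrote the updated elements out to CSV, which isn't shown here.

```sql
SET conduit_width = conduit_width + 3, conduit_width_flag = 'SQL'
WHERE sim.max_surcharge_state >= 2 AND conduit_width < 17;

SET conduit_width = conduit_width + 6, conduit_width_flag = 'SQL'
WHERE sim.max_surcharge_state >= 2 AND conduit_width >= 17;
```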
Scripting and SQL queries can automate that process. One thing worth mentioning about SQL queries is that once you've created one, you can use it across multiple networks, share it with team members and colleagues, and apply it accordingly.

The sequence so far in modeling analysis has been looking at how ICM can support you in the planning phase, by identifying static, live, and dead storage and limiting the opportunity for H2S formation due to dead storage, and then modeling to streamline design, using scripting and SQL queries to eliminate the multiple iterations that might exist in a specific workflow. The final example we cover is modeling to support operations. Ryan touched on this a little during his segment when he talked about the time series database, but in essence we're talking about using forecasted rainfall data, bringing it into ICM through the time series database, to predict or forecast how your system might respond to an upcoming rainfall or storm event. The key is being able to be proactive, having a response or operating procedure so that when the storm comes, you take certain steps to mitigate, say, an SSO or CSO event; that applies across the board for forecasted events. In addition, as Ryan highlighted, with the time series database you also have the ability to input observed data to look at past events and how your system behaved during them. That's the unique thing about the time series database: Ryan talked about the live connection, but here we're talking about using observed data as input to your model to see how your system would respond.

That concludes our modeling analysis portion, and we transition to output generation, where Ryan will cover this final piece. Yeah, I think I'm having a little bit of connection trouble, so I turned off my camera, which may help; are you hearing me all right? Yep. Okay. So in output generation we have things like the 3D view, being able to visualize things and make sure they are how you think they should be: looking at bridges this way, looking at the terrain (this is an exaggerated look). Actually, one second, this should be a better connection; do you want to hit the next one? It's not refreshing for me. Okay, so we've got the 3D view, and we've also got thematic mapping. With thematic mapping you can save themes and drag and drop them, so you can create libraries of the views you often want to see; of course, you can also right-click and customize them through the properties menu. Next up are profile views and plan views, as we're seeing, with the thematic mapping matching up between the profile view and the plan view; tabular results can be pulled, as well as graphical results. Next, looking at some of the reporting features, you can utilize SQL queries to build customized tables highlighting where certain issues might be going on.
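As an illustration of the kind of one-line reporting query meant here (this exact query is not from the demo), something like the following, run against the conduit table, would build a customized table of suspiciously flat pipes; the field names assume ICM's standard conduit fields, so verify them in the SQL dialog's field picker.

```sql
SELECT us_node_id, link_suffix, conduit_width, gradient
WHERE gradient < 0.001;
```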
Again, these are really flexible and not necessarily complex, certainly not as complex as some of the scripts we saw earlier; I know it's not pulling up, but these are relatively simple, one-line kinds of things, and that GitHub source is chock-full of them. These "factory" ICM SQL queries aren't built into ICM, but we're certainly happy to share this general list of common things users might want to see, and you can use it as a starting point for writing additional queries. With that, I'll hand it back to Yousef for the resources.

Awesome. We're right on time; we probably have about a minute or so. For resources, to wrap up, there are two things we want to highlight. We have on-demand training for InfoWorks ICM that you can access through the Autodesk website; it's about six hours, with a data set and a video you can follow along with. In addition, we have deeper trainings that are shorter in length and more specific to certain topics you might be interested in. Finally, we want to highlight our One Water blog, where we put out a lot of content pertaining to ICM: transitioning to ICM, case studies, thought leadership conversations, and sometimes workflow videos as well.

That's the end of our presentation, but we have a final poll question we hope you'll engage with: would you like to be contacted by a product expert for a demo? That poll should show up shortly. Is the poll active yet? Can we activate it? Just give me two seconds and you should see it in a moment. There it is. If you could, please respond to that poll question on whether you'd like to be contacted by a product expert for a demo; apologies for the delay, and we'll keep it open. Okay, did we want to go into some of the questions here? Yes, let's go through the Q&A; we'll leave that poll question open until the end, and Yousef and Ryan will answer some of the questions from the chat.

I literally have a fire alarm going off in the hall I'm in, so hopefully that's not too distracting or audible. The first question: is there an efficient process for importing open channel networks from GIS, for example having the top width, bottom width, and channel depth of a trapezoidal channel? (There we go, the alarm is off now.) Unfortunately, no, and it depends on whether you're looking to incorporate it into just the 2D surface or trying to build an actual river reach type of thing. We certainly have open channels within InfoWorks where you can define a trapezoidal channel in a 1D sense, in addition to a true river type of system that would be more like HEC-RAS, and we do have an importer for HEC-RAS geometry, at least. If you wanted to incorporate something like that into the 2D surface, you could bring those in as break lines: define the lines for the tops and bottoms of the channel
and bring them in as break lines, so that when you go through meshing, the mesh honors those spaces.

We got a question on limitations for the cloud simulations in terms of storage, run times, and things like that. I put more information in the chat, but to follow up: there is currently no limitation on storage or on the number of simulations that can be run. And really good to hear that the same person, Alex, had a client whose run took three days on their local computer and was cut down to four hours in the cloud.

We also got a question about calibration parameters. In this particular presentation we didn't really do calibration per se. The things you might want to look at for calibrating a model depend on what type of model it is, whether it's a stormwater model or a 2D model, but commonly the things that get tweaked are Manning's roughnesses for pipes or surfaces; you might also need to tweak weir coefficients or orifice coefficients based on the observed data. And if it's a wet weather analysis where you're trying to calibrate wet weather flows, then depending on the module you use, the more common one being RTK, you would adjust those RTK parameters.

On running a script: I did find out where that is. It's under the Network menu, about two-thirds of the way down, where there's a "Run Ruby script" option; you just navigate to the .rb file. If you're looking for software to write those scripts, Code Writer is a freely available one, so there are options for that. And again, I put the link to the GitHub in the chat as a resource; there are SQL queries in there as well as Ruby scripts for different things our support team has seen clients do and thought would be useful to share with a wider audience.

We got another question: where are the template SQLs available? I'm guessing it relates to the library I was showing earlier. I think it's up on GitHub; it would probably be a good idea for me to double-check and put together a transportable database with those factory SQL queries. Of course, you can always email us too, with our contact information there, and we can send those queries to you.

Another question was about inputting I&I into the model. As I mentioned when we talked about calibration, we have the traditional RTK method, with a short-, medium-, and long-term response for the inflow and infiltration. We also have what's called the ground infiltration model, which is a little more physics-based, using properties of the soil, infiltration rates, and things like that; it can be particularly accurate for areas that see variable groundwater tables. Of course, there will be more inputs with that, so there's a balance in terms of whether you really need it, the same kind of decision process we were talking about earlier with steady state versus EPS.
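For readers unfamiliar with the RTK method mentioned above: each of the three responses (short, medium, long) is a triangular unit hydrograph described by R (the fraction of rainfall that reaches the sewer), T (time to peak), and K (recession duration as a multiple of T). A small, self-contained Ruby illustration of that geometry, with entirely hypothetical parameter values:

```ruby
# Ordinates of one triangular RTK unit hydrograph.
#   r: fraction of rainfall volume entering the sewer
#   t: time to peak (hours); k: recession time as a multiple of t
#   volume: rainfall volume over the catchment for the unit event
def rtk_ordinates(r, t, k, volume, dt)
  t_base = t * (1 + k)                     # total duration of the triangle
  q_peak = 2.0 * r * volume / t_base       # triangle area equals r * volume
  (0..(t_base / dt).ceil).map do |i|
    time = i * dt
    q = time <= t ? q_peak * time / t : q_peak * (t_base - time) / (t_base - t)
    [q, 0.0].max                           # clamp past the end of the triangle
  end
end

# Fast response, e.g. R = 0.02, T = 1 h, K = 2, on a 1000 m3 unit event
p rtk_ordinates(0.02, 1.0, 2.0, 1000.0, 0.25).map { |q| q.round(1) }
```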
That relays us into the next question: what's the difference between a steady state run and an EPS, or extended period simulation, run? A steady state run basically means constantly running a fixed flow through the system. It's much more simplistic and doesn't take into account the storage that's inherently in the system, and because of that, it's where we see some of that oversizing of infrastructure. An EPS run does take storage and similar dynamics into account, which lets you size the system more optimally. Yousef, I know you have experience with that on the potable water side of things; I don't know if you want to add anything. Yeah, it's quite similar on the drinking water side with steady state and EPS, and really the trend is going toward EPS for the benefits and the optimized approach you can take with EPS over steady state.

That's going to be the conclusion of Q&A at this point; I'm seeing quite a few questions we haven't tackled, so please feel free to email us directly with any follow-up questions. With that, we'll conclude our portion of the webinar and transition it over. Thank you all once again for attending.

All right, this was great, thank you so much. This actually concludes today's eShowcase; thank you all for attending. The recording will be available on the WEF website, and we will send all registrants an email tomorrow with that link. Please visit our event calendar to sign up for future webcasts, and if your question was not responded to in the chat during Q&A, you can reach out to the presenters, or they will reach out to you once they get the chat transcript. Thank you so much for joining us today.