All right, so let's start this meeting. Welcome to this third ISSMGE TC222 online workshop. My name is Michaël Beaufils, I'm the vice-chair of this group, and I'm very happy to be here with you for this third workshop. You have the agenda here; we will try to keep to the times proposed, and as you can see, for the Q&A there is an invitation to use the Slido app. I suggest you try to reach it, as it will be used to ask questions to the different presenters, and as you can see on the agenda we have time set aside for that. So now I will give a brief introduction to ISSMGE TC222, and we will also introduce the topic of geotech standards, which is the topic of today. So let's go for it. Unfortunately Magnus, the chair of this group, is not here, so I will give this presentation on his behalf. ISSMGE, if you don't know it, is the International Society for Soil Mechanics and Geotechnical Engineering. We have 90 member societies, the national bodies, and around 20,000 individual members. The ISSMGE is organised in 37 technical committees: seven about fundamentals, named TC1 and other low numbers, 21 about applications, and nine about impact on society. TC222 is about BIM and digital twins for geotechnics, and it is one of the most recently created TCs in the ISSMGE. TC222 is organised with a council, and you can see here who the members of that council are: Magnus is the chair, I'm the vice-chair, our secretary is also here with us today, and the other members listed complete the TC222 council. Together we discuss how we organise the activities and the animation of the group. We have different backgrounds and specialities, which helps us highlight things about new technologies, about academia, and about applications in smart cities. If you want to follow us, there are multiple ways to do that. We have a website hosted by the ISSMGE, which you can reach with that link. On LinkedIn we have a page that some of you may have found; that's where we communicate about our events. We also have a YouTube channel, and I mention it because that's where we provide the recordings of the previous workshops. This workshop is recorded and the recording will be made available shortly afterwards. So now we will go into an introduction to digital standards for geotech. I will explain a little what it is about, and I remind you that you can ask questions thanks to Slido, with the link also pasted in the chat. Let's go for this brief introduction to what digital standards for geotech are. As I mentioned before, TC222 is one of the most digitally oriented TCs of the ISSMGE, and when you look at the topic, BIM and digital twins, and at that target, you can see that standards are necessary to achieve the goal. So in our terms of reference we define an objective about writing guidelines and recommendations, and within this objective we have something about standardisation. The plan for us is to be a link between the different organisations that are working on standards for geotechnics, and what we try to do is facilitate collaboration and the development of standards, especially with four organisations, but not limited to those four: buildingSMART International, the Open Geospatial Consortium (OGC), AGS and DIGGS. So you will have a presentation about each of those four organisations.
And what we try to do is this: when you are geotechnical engineers and you want to do something regarding the provision of geotechnical data, we will try to propose guidelines and technical recommendations so that you can easily define how to do it. If you want to know more about our terms of reference, you can have a look at them on the TC222 website. Regarding the topic of digital standards, I wanted to point out one paper. It is not the official position of the ISSMGE; it is a paper written by David Toll in 2009 about international data standards for geotechnical engineering. This paper was presented at the International Conference on Soil Mechanics and Geotechnical Engineering (ICSMGE), and if you have a look at it, there are statements that are still true and of course things that have changed since 2009. What I appreciate in that paper is the idea that the topic of standardisation should be addressed together. There was this idea of having a joint TC between the ISSMGE and the ISRM; this joint TC still exists, it is called JTC2, and as you can see there is this idea that standardisation should be done together. In this paper you can also see that some key players, or communities, were already identified as providers of solutions for the provision of geotechnical data. These key players in 2009 were AGS and DIGGS, and since then buildingSMART has also arrived, so those are exactly the four communities we will address today. Now, the key challenges for standardisation. Of course, when we talk about making standards, when you want to exchange data there is a big topic around semantics. This is because we have different cultures, we come from different domains with different backgrounds and different practices, so there are multiple ways to describe the same things, using the same words or not. The semantic element is quite important in order to exchange data properly, and that is something standardisation should address. Another point about standardisation is that there are different possible levels of sophistication. When you propose a standard you would of course like it to be human readable, but if you intend to reuse the data with machines, the format should also be machine readable, and sometimes a compromise has to be found between the two. Also, when you are writing standards there are multiple ways to go about it. You can standardise everything, so you have small boxes; this has some advantages but also some drawbacks. Or you can take another option, with bigger containers: things that are standardised as a whole but that keep the capacity to be more flexible inside. These are things that should be balanced. The other thing, of course, is the dependence on technology. We use digital tools, technology keeps moving, and there are different solutions on offer: CSV, XML and JSON. Each has advantages and drawbacks, and depending on the choice you will emphasise different capacities and results. I think the standards we will highlight today show these differences, but also the similarities.
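To make the human-readable versus machine-readable trade-off a little more concrete, here is a minimal, illustrative Python sketch showing the same fictitious borehole record serialised to CSV and to JSON. The field names and values are invented for illustration only and do not come from any of the standards discussed in the workshop.

```python
import csv, json, io

# A fictitious borehole record (field names invented for illustration only).
record = {"hole_id": "BH-01", "depth_m": 12.5, "test": "SPT", "blow_count": 23}

# CSV: compact and easy for a human to open in a spreadsheet,
# but the meaning of each column lives outside the file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record.keys())
writer.writeheader()
writer.writerow(record)
print(buf.getvalue())

# JSON: self-describing keys and nesting, easier for machines to parse,
# at the cost of being more verbose.
print(json.dumps(record, indent=2))
```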
I would also like to introduce an interesting activity, the Geotech Interoperability Experiment, which reflects this idea of working together and defining standards together. In January 2019 a geotechnical data standardisation workshop was held in Paris. The idea was that, as buildingSMART International wanted to address the topic of geotechnics, it should try to federate with the existing standards and the people already working on that topic, so people from OGC, but also AGS and DIGGS, were represented there. The outcome of that workshop was the intention to work together on the definition of common semantics, common tools and APIs, and also to provide guidance for people who want to expose data. The dream came true in February 2022, thanks to the funding of a project called MINnD in France, and that project supported the Geotech Interoperability Experiment, which tried to address what I just mentioned. This project came to an end in December 2023, so its results are already available. The work plan of this Geotech IE was to address two things, as I mentioned before: to find the balance between community-oriented goals and technically oriented goals. Regarding the community-oriented goals, it was about working on the semantics, having a common conceptual model, providing guidance through a technical paper, and also an implementation guide for software vendors. On the other side there was the idea that defining standards is nice, but we also want them to be implemented, so the aim was to extend the existing OGC geoscience standards and to provide technical documentation about how to use them with OGC APIs. The results of this Geotech IE are available; they are mostly divided into two documents. There is a white paper that gives a very simple overview of what we tried to do and of the philosophy, and there is a bigger technical paper in the form of a wiki, available from that link, which is a kind of index of all the deliverables produced in the project. I will just show you an overview of what is available for you. First there is the activity of identifying the geotechnical concepts and mapping between the existing standards. We came up with this classification, which was developed jointly with the IFC Tunnel project, and it led to the proposal to structure our different data into Book A, Book B and Book C, an idea that originally comes from French practice and the International Tunnelling Association. Here you can see the different concepts identified, and of course the idea was to see how they are currently addressed in the different formats and standards that were part of the project. Another activity, mostly focused on OGC, was about providing extensions to the OGC and ISO standards: while there were developments in AGS, in DIGGS and in IFC Tunnel, there were also activities to extend these standards so that they are compliant with the proposal we defined. One other activity was to work together on the use of APIs, especially one OGC API, to provide geotechnical investigation data. We had a common activity that was to take real data from borehole logs, CPT, SPT and pressuremeter tests and to see how they fit that API. So there was the development of an API, or more exactly an extension of an existing API, to do that, and we tried with people from AGS and DIGGS to see how it could be used to facilitate the provision of geotechnical data. So that's it for my introduction to this topic of digital standards for geotech.
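As a rough illustration of what providing geotechnical investigations through an OGC API can look like in practice, here is a minimal Python sketch that queries an OGC API - Features style endpoint for borehole features. The server URL, collection name and property names are hypothetical placeholders, not the actual Geotech IE deployment; only the general /collections/{id}/items pattern and the GeoJSON response shape come from the OGC API - Features standard.

```python
import requests

# Hypothetical server and collection names, for illustration only.
BASE_URL = "https://example.org/geotech/ogcapi"
COLLECTION = "boreholes"

# OGC API - Features exposes the items of a collection as GeoJSON features.
resp = requests.get(
    f"{BASE_URL}/collections/{COLLECTION}/items",
    params={"limit": 10, "bbox": "2.2,48.8,2.4,48.9"},  # bounding box around Paris
    headers={"Accept": "application/geo+json"},
    timeout=30,
)
resp.raise_for_status()

for feature in resp.json().get("features", []):
    props = feature.get("properties", {})
    geom = feature.get("geometry", {})
    # The property names depend on the server's schema; these are assumptions.
    print(props.get("name"), props.get("total_depth"), geom.get("coordinates"))
```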
Now I will hand over to Neil for the presentation about AGS, and I remind you that if you have questions, either for me or for Neil, Dan or Jonas, please use the Slido link to ask them; we will have time to address them. So thank you, and the floor is yours. Okay, I will just bring my slides up. Hello, can you hear me? I did freeze then. We can, and it looks like we have your slides now. Good, I froze for a little while, that was a little bit scary. Right. Good afternoon everyone, or indeed good morning or even good evening depending on where you are. My name is Neil Chadwick. I work as an independent consultant these days, although I spent many years with a major design consultancy, and I'm a member of the AGS Data Management Working Group, which I will introduce in a few minutes' time. I'm going to talk about the AGS data transfer format, and this is a fairly quick, twenty-minute introduction. In terms of what I'm going to cover: what is it, what is its history, how does it actually work (we will actually have a look at it), and then some things that are complementary, AGSi and AGS Piling. I'll introduce those, say a little about where AGS data might go in the future, and since today is very much about interoperability, I will do a little piece on that at the end. First of all, what is the AGS data format? Some of you will know it well, some have perhaps heard of it, some perhaps don't know it at all, so I'll start at the beginning. It is a data transfer format for ground investigation data, so that's geotechnical and geo-environmental data, and it's used for the transfer of data between different organisations, although these days it also gets used quite a lot within organisations, perhaps between different teams. Its scope is factual ground investigation data, so this is what you'll see in what we call factual ground investigation reports. It's the stuff you see on borehole logs, but also all the test data, including lab test data, and also monitoring; in fact that goes beyond investigations, and AGS can handle some types of construction monitoring as well, although it's not quite so commonly used for that. One thing it doesn't include is interpreted or design data, but we do now have a new format that does do that, called AGSi, and I will return briefly to that later in this talk. But first, on the main AGS data format, I'm going to give a little history, background and context. The AGS format was first published in, believe it or not, 1992. To put that into context, it's about the same age as the World Wide Web. I can vouch for the fact that we were using floppy disks to transfer data around at that time; email hadn't really arrived yet, and interestingly there weren't really any comparable standards to influence it, so I guess it evolved in a bit of a bubble. But we like to say that it was created by the industry, for the industry. It's over 30 years old now, but we're still using it; it's had a few updates, but it's still essentially the same, and if we're still using it we must have got something right. In fact it's been very successful: it's now routinely used here in the United Kingdom, on most projects it's a standard requirement, and it doesn't even get questioned.
It's also pretty well implemented by the specialist software that we tend to use. It has been adopted, and in some cases adapted, in a few other countries: in particular Australia, New Zealand (relatively recently, I think), Hong Kong (long-term users), also Singapore; it has taken off a little in Ireland, and there have been occasional sightings in mainland Europe and the Middle East. Now I just want to talk a little about the AGS part of "AGS data", because if you're not from the UK you probably won't be familiar with it. The AGS is an organisation, the Association of Geotechnical and Geoenvironmental Specialists. It's a trade association, basically a grouping of like-minded companies, a not-for-profit entity based in the United Kingdom, and its membership is for UK-based companies. We have a lot of ground investigation contractors, but also consultants, both specialist and generalist, and a few others, as you can see there; the membership is 183 different entities, so it's a pretty strong body. AGS data is actually only a small part of what the AGS does. The association is divided into several working groups, and the important one for today is the Data Management Working Group, who look after the AGS format; that's the group that, as I said, I am part of. So how does AGS actually get used? Sorry, a few technical difficulties here; there we go, that's better. I guess the most common use case is the provision of data associated with a ground investigation report produced by a ground investigation contractor: data provided in AGS format to a design consultant, who will then go and do their interpretation and design. Interestingly, in the UK these are usually different companies, different organisations; that's how we tend to work, which probably explains why AGS developed in the UK and why other countries maybe did not develop an equivalent to the same extent. But this is only one application of AGS, because if you look at the activities that we undertake before we even get that final report, you find that AGS data can be, and often is, used within the supply chain, as you can see here. And the design consultant shouldn't be the last person to see the data, because there may be other designers who are interested, there may be piling contractors, for example, who might want to see it, good clients will take an interest, and we also have the British Geological Survey, who have always collected data but are now collecting AGS data as well, and they are making it available to us to inform new projects. So that's a brief introduction to what AGS is about and where it came from; now let's have a look at how it actually works in practice. Essentially the AGS data format is made up of two components. The first is a data structure, which we define in what we call the data dictionary; you might think of that as the schema. Then we have the file transfer format, which is how we get from the structured data to the final file deliverable; you might think of that as the encoding. These are defined and underpinned by a set of rules, and all of this is of course captured in the documentation. Let's look at the data structure first. Data must conform to a specified structure, and the first important concept is that we divide the data up into what we call groups.
Groups are arranged by subject area. So, for example, we have the exploratory hole locations, with some general metadata for the holes; where we define the holes is a group called LOCA. Samples and sampling go together in a group called SAMP. For the different in situ tests we have a different group for each test type, or sometimes two, and the same applies to the lab tests, which means we end up with a lot of groups, as you'll see in a moment. Each group is then divided up into fields, otherwise known as headings, which are effectively the same thing. For example, in the SAMP group we have headings for the hole ID, depth, type, reference and so on. Now this might seem like a fairly familiar concept, and indeed it should be. It's not an object model as such, but it can be mapped to that sort of structure, and indeed it can be mapped to a database-type structure; in fact it has a lot in common with database structures. Basically our groups are analogous to the objects in an object model, or the tables you would find in a database, and the fields and headings are your object attributes, or the columns and fields you would find in a database table. So a fairly familiar concept, and another familiar concept is the linking of the groups, as you would link tables in a database or objects in an object model. We have a pretty logical hierarchical structure; we stick to parent-child relationships, we keep it simple, and we don't have relationships dashing off in all sorts of different directions. It's implemented using the concept of shared key fields. I'm not going to go through this in detail, but you can see that there is a logical flow: everything starts at the top with the holes in LOCA, and with lab tests you have to go through the sample first, because lab tests hang off samples, and so on (apologies for the phone, if you can hear it). So here is the full schema, or the full structure, and as you can see it's big: there are no fewer than 148 different groups here, each line being a group, so obviously I'm not going to go through all of that now. The colours here are simply unofficial groupings of the groups, really just to help navigation, so I'm going to concentrate on those to give you a flavour of what's actually included. On the left-hand side we have the locations and the exploratory hole construction details, covered by a set of groups; we have a few groups that cover the geological descriptions, and then we have the samples, so this is basically the information you would find on a borehole log. Then we also have, of course, geotechnical in situ testing, with about twenty-odd groups covering that, and then probably getting on for half the schema is the lab testing, as you can see here, predominantly geotechnical, with some aggregate testing as well. Interestingly, we have a slightly different structure to deal with the environmental testing, because that is all held within one single group, and a similar concept is used for the monitoring, which is dealt with in a single pair of groups covering the different types of monitoring. In terms of the file format, this is a snippet of a very, very simple AGS file, but it is essentially a text file, so you can open it in a text editor; we just happen to use an .ags file extension for it. It is a bespoke format, but it looks a lot like CSV; in fact we call it CSV-like. It's essentially blocks of CSV data, although they also have multiple header lines, as you can see if you look carefully, and each block here is the data for a particular group.
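As a loose illustration of that "blocks of CSV" idea, here is a minimal Python sketch that reads an AGS4-style file into one table per group, using the GROUP, HEADING and DATA row markers. It is a simplification for illustration only: it ignores the UNIT and TYPE header lines, the format rules, and error handling, and the two-group example string is invented rather than taken from a real deliverable.

```python
import csv
import io

# Invented two-group example in the AGS4 "CSV-like" style (illustrative only).
SAMPLE = '''"GROUP","LOCA"
"HEADING","LOCA_ID","LOCA_TYPE"
"UNIT","",""
"TYPE","ID","PA"
"DATA","BH01","CP"

"GROUP","SAMP"
"HEADING","LOCA_ID","SAMP_TOP","SAMP_TYPE"
"UNIT","","m",""
"TYPE","ID","2DP","PA"
"DATA","BH01","1.20","U"
'''

def parse_ags(text):
    """Return {group_name: list of row dicts}, skipping UNIT/TYPE lines."""
    tables, group, headings = {}, None, []
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue  # blank line between group blocks
        marker = row[0]
        if marker == "GROUP":
            group, headings = row[1], []
            tables[group] = []
        elif marker == "HEADING":
            headings = row[1:]
        elif marker == "DATA" and group:
            tables[group].append(dict(zip(headings, row[1:])))
    return tables

tables = parse_ags(SAMPLE)
print(tables["LOCA"])
print(tables["SAMP"])
```

The shared LOCA_ID value in the SAMP rows is what ties each sample back to its parent hole, which is the parent-child linking via key fields described above.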
In terms of the pros and cons of this, one of the good things is that it's easy to understand: you don't need to be a data specialist to be able to follow it, and pretty much anyone could, if they wanted to, paste parts of it out into an Excel file. That's not something we necessarily recommend, by the way, because plenty can go wrong, but it does happen. File size actually turns out to be relatively compact compared to some alternatives. The downside is that it is bespoke, so it does need a bit of work programmatically to parse if you're bringing it into an application. On the documentation: it's not a British Standard and it's not an ISO standard, and that is actually a deliberate decision that was made a few years ago. It's just a document published on the AGS website, although it has effectively become the standard for ground data transfer in the UK. It is freely available at that link; you don't need a login or anything like that, you should be able to access it directly. You will find some additional resources on the website that generally do require a login, or a fee to be paid at company level, things like code lists and guidance; in fact, in theory you do technically need to register to be able to use the format. Now I'm going to talk a little about some complementary additional formats, because AGS has grown in recent years. We'll start with one I mentioned briefly earlier, which is AGSi. This is a separate, new format for ground model and interpreted data. It is live, and we're at version 1. It differs quite significantly from the AGS format in that it uses a more up-to-date, object-model approach with JSON encoding, and the documentation is basically a website. Our current focus there is very much on implementation, both on projects and in software; we have an early adopters group set up, and if you're interested you're most welcome to reach out and get involved with that. Then we also have something called AGS Piling. This is a little behind AGSi: we have a fairly mature draft that has already been issued, but it hasn't quite reached beta status. We have a working group that was formed just under a year ago; in fact we had a meeting this morning. It's led by the AGS but also includes significant membership from the FPS, the Federation of Piling Specialists, which is a similar group to the AGS but for piling, and there is also interest from the Deep Foundations Institute. The idea with AGS Piling is that it will primarily include the construction record, the as-built details or the logged and measured data from construction, pile test data of course, but also design schedule information. Returning to the main AGS factual data format, where might it be going in the future? Probably the first big thing you'll see next is AGS version 4.2, which is expected to include an overhaul of the CPT and pressuremeter testing tables to address some issues with those (there are already drafts of those out for comment), and it is also looking to introduce some advanced lab testing for the first time.
Now, in terms of some of the topics that have been discussed recently within our group: we are looking at alternative file formats, maybe JSON, maybe not, we'll see, but it is on the radar. Also on the radar are APIs, and what they mean for AGS and the AGS format, because we are seeing people using APIs for dealing with geotechnical data at the moment. We think that if we do nothing it could become a bit of a challenge for us, but we can also see opportunities if we're proactive, so that's what we're looking to do at the moment. And then we have AGS 5. This would be expected to be a major update; there's no particular timeline at the moment, but we are now seriously discussing it, and in fact we have quite a major meeting on it coming up in May, where we're looking to really nail down the requirements and objectives and go from there. Obviously AGS doesn't exist in a bubble, so thinking about interoperability: first of all, we've participated, as Michaël has already mentioned, in the OGC geotechnical interoperability experiment, and as part of that we found that we could map AGS to the conceptual model that was developed. That revolves around the Book A material for the investigations, although we find that most of our data, in most of our groups, actually ends up as observations in the conceptual model. We also had a look at mapping to the SensorThings API model, which could be done, but there are difficulties there, because we end up with the AGS group data being split across many different objects, so it's not necessarily straightforward. One thing I would point out is that when we were doing this OGC work, the idea wasn't to develop a format to compete with or replace AGS, or indeed DIGGS, which you'll hear about in a moment; in fact the work that was done leveraged our existing, extensive and well-developed data dictionaries. Now IFC, which you're also going to hear about today, is an interesting one. We actually think this is a place where AGS data can be a complement, in that it could be a linked dataset: AGS data is good for structured data but not so good for complex geometry, whereas IFC is perhaps the other way around, good on complex geometry but not so good on structured data. So maybe AGS can exist side by side with IFC as the linked dataset; that may be something Jonas picks up on.
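To give a feel for why one AGS row fans out across several entities when mapped to the OGC SensorThings model, here is a rough, illustrative Python sketch. The entity names (Thing/Datastream/ObservedProperty/Observation/FeatureOfInterest are the standard SensorThings concepts, of which a few are shown) are real, but the field values, the loosely AGS-flavoured input row, and the exact assignment are assumptions made for illustration, not the official Geotech IE mapping.

```python
# One invented, AGS-flavoured row: a moisture content result on a sample.
ags_row = {"LOCA_ID": "BH01", "SAMP_TOP": 1.20, "moisture_content_pc": 23.5}

# The same information spread across SensorThings-style entities (illustrative).
feature_of_interest = {          # the physical sample that was observed
    "name": f"{ags_row['LOCA_ID']} sample at {ags_row['SAMP_TOP']} m",
}
observed_property = {            # what was measured
    "name": "moisture content",
    "definition": "https://example.org/props/moisture-content",  # placeholder URI
}
datastream = {                   # groups observations of one property
    "name": f"Lab moisture content for {ags_row['LOCA_ID']}",
    "unitOfMeasurement": {"name": "percent", "symbol": "%"},
    "ObservedProperty": observed_property,
}
observation = {                  # the measured value itself
    "result": ags_row["moisture_content_pc"],
    "Datastream": datastream,
    "FeatureOfInterest": feature_of_interest,
}
print(observation)
```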
So that's all I was going to say, and despite the interruptions I'm almost on time. Just a few headlines to sum up the AGS data format: it's well established, and I can't stress this enough, it is very well used in the United Kingdom, so it's real. It's getting on a bit, and maybe it looks a little old-fashioned in how it does things, but it's a relatively simple structure and file format, and that has served us well. The AGS data format itself is for factual ground investigation data, but we do have the new AGSi format for ground models and interpreted data, and we also have AGS Piling under development. And you can see that we have also been developing links with IFC and with the work of the OGC. Finally, if you want to find out a little more about AGS data, the documentation would be a good start, and if you want to go even further there are training courses available. They're not official AGS courses, they are commercial, but if you search for "AGS data format training" you should be able to find them, and if you're interested but struggling to find them you're welcome to reach out to me personally. So thank you very much, that's my presentation, and I think the idea was that we have a few minutes for questions if there are any on Slido. Yes, we have some questions, thank you very much Neil for this overview. We will try to pick out the interesting ones. There are questions about the advantages of AGS versus DIGGS versus IFC; you will get an overview of what DIGGS and IFC can do for geotech soon, so maybe I will just reframe that question for Neil: in your opinion, Neil, what is the best thing AGS does, its main advantage, and why has it been so successful? Why has AGS been so successful? Well, first of all, one reason it developed was that there was a clear demand, given the way we work in the UK, and that's one reason it has been able to become embedded. It didn't happen overnight, it took many, many years to get there. I also think its success is that it's simple, and that makes it relatively accessible. I sometimes look at some of the other things that have been developed and I think they're more for data scientists than for geotechnical engineers; there's always an interesting debate to have there, and we could probably debate that one all day, but that's one of the things I would say. We've also been quite stable; in fact there is an argument that we haven't updated frequently enough, but it is useful to have some stability so that people can work with it, and in particular so that the software developed to support it can keep up. If the software hadn't supported it, it wouldn't have had the success it has had. Thank you. There is another question: why didn't you go for formal standardisation, such as ISO certification or something like that? Well, I have to be a little careful about what I say here, and I don't necessarily know all the background, although there has been some discussion of this within the AGS recently in a different context. My understanding is that it was discussed, but we're talking about 15 or 20 years ago, and I wasn't around at the time, so I'm reporting what others have told me. Firstly, if we had gone down the BS or ISO route, AGS would have been required to change the format in certain ways to marry it in with other data formats, and it was already fairly well established, so there wasn't much appetite for that. There is also the matter of control, because it would potentially have led to losing a bit of control, so there is a trade-off between keeping control and having the status of a BS or ISO standard. In our case it has worked out absolutely fine, and there is one advantage which is maybe not immediately obvious: the AGS can make the documentation available for free, whereas people have to pay to view ISO standards and BSs and the like, albeit companies tend to pay for that. In our industry that is actually a significant factor. So yes, it's an interesting one, but a conscious decision was made a few years ago, and for AGS it has worked out fine. Okay, thanks. There are also a couple of questions that are more specific, especially about contributing to AGSi, if I understand correctly,
and also another one along the same lines; maybe those you could address directly in the chat? They're not easy ones to answer quickly, I suppose; yes, there are specific questions there, so it may be easier for me to answer them in the chat while I'm watching some of the others. Yes, maybe do that. And I don't want to get into AGS versus DIGGS now either, given that Dan is coming up to talk about DIGGS. Exactly, exactly; you will get an overview of what DIGGS is and also of what IFC can offer. There is also a question about what Book A is in the context of the data mapping: Book A is about the geotechnical investigation, that's the wording we used, and I think Jonas will come back to it in his presentation. So I propose we move on to the presentation by Dan about DIGGS, which was built upon AGS at the time. Dan, if you are here, I would propose that you share your screen. I am here, and hopefully we can get the slide presentation up. We have your presentation, Dan, we can see it. Okay. I didn't realise we were going to get into a knock-down, drag-out fight with AGS, but that's certainly not the intent, and actually DIGGS owes a lot to AGS in terms of its origin. So thank you everyone; for me it's good morning, and good afternoon or good evening to the rest of you. My name is Dan Ponti. I have been involved with the DIGGS project essentially since its inception, and I'm one of the people who has been involved in a lot of the schema design. My background is actually in geology, and also engineering geology; I was a research scientist at the US Geological Survey for almost 40 years, and I got involved in the DIGGS project in part because of our desire and need to begin to standardise borehole data and other types of geologic and geotechnical investigations. That was how I got into this, and I feel very strongly about trying to promote these sorts of standards, so I've continued to be involved even after leaving the Survey. The overview of what I'll talk about today follows a similar format to Neil's: what is DIGGS; a little about current implementations (we're clearly not as far along in terms of implementation as AGS, but things are beginning to kick off); a little about the data model, what exactly DIGGS does, what's in it and how it's organised; some recent extensions that are taking us a little beyond the ground investigation mode; some of the current and future efforts; and then a short discussion about DIGGS and our involvement in the Geotech IE. So, what is DIGGS? It's a US-based transfer standard, similar to AGS, for geotechnical and geoenvironmental data. As opposed to the flat-file, CSV-like format that AGS uses, DIGGS is an object-based XML schema with associated XML dictionaries for test properties and code lists. It is built on the Geography Markup Language, GML, the OGC standard that defines vocabularies for geometries and the locations of features, and that's essentially why we're wedded to XML. We also incorporate some other XML vocabularies: the GML 3.3 extensions, which bring in linear referencing and vector types,
which we use for developing local coordinate systems for boreholes and other types of what we call sampling features; I'll get into that a little later. We also use some parts of the WITSML standard. This comes from Energistics and is an oil-industry XML vocabulary for their drilling work, and what we use from them are the measure types and unit symbols. They have defined quantity classes according to different types of measurement, so we can define, for example, a pressure measure and be able to home in on the appropriate unit types for pressures and validate that within the schema. We found that very useful, so we've incorporated it as well. A lot of our structure and object structures are influenced by OGC's Observations and Measurements standard, the earlier version 1, which was current when we were building this, and by the GML coverage models for our observations, so we're leaning a lot on existing standards. We are open source and largely volunteer developed. We do support AGS4 groups and headings: one of our goals was to be able to interoperate with AGS and to accommodate information that's in the AGS standard. And we support both exploration or ground investigation data and construction activity information. A lot of DIGGS originated in the early 1990s from efforts that were underway in the highway transportation realm, as well as in earthquake engineering, to disseminate information in a standard fashion; there were all sorts of ways of doing that back then. But DIGGS as an XML standard really started in about 2004, with a pooled fund study funded through the Federal Highway Administration in the US. There was an initial 1.0 version of DIGGS created in the early 2000s that was not very usable; it was clunky and not very well designed. So that got revamped, in part under the auspices of the American Society of Civil Engineers and their Geo-Institute. We became a special project around 2012, and the Geo-Institute is now the overseer of DIGGS; we get a little funding from them through their special projects program to do some of the development work for the standard. The first stable production version was released in 2020, which was version 2.5, and we've just released a new update to that which includes a lot more: it's a richer data set, with a lot more test procedures associated with it, as well as extensions into construction activity information. You can find information at our website, diggsml.org. All of the working parts of the DIGGS standard are hosted in a GitHub repository under the DIGGSml organisation, where the schema repositories, documentation and other resources live, and we're working on revamping that and getting much more documentation online. As for the goals of DIGGS when we were developing it: we wanted to develop a standardised structure for geotechnical data that leveraged existing web-based technologies as much as we could. We didn't want to rely on any proprietary software or systems; we wanted something that could be readily validated, parsed, queried and transformed to other formats with existing tools. Some of the difficulty Neil mentioned is that it's harder to parse the AGS format because you need specialised software for it; we wanted to see whether we
could take advantage of existing tools, and use formats that would allow us to do that without having to worry about it. We also wanted to be able to accommodate more complex data relations; AGS's simplicity also brings some limitations. We wanted something flexible enough to accommodate different standards of practice and different levels of data detail, so most elements are optional; there is a lot of material in DIGGS that you don't need to use, but it's there if you need it. We also wanted it to be extensible, so that we could add to the schema as needs expanded without breaking what already existed, and it's also profileable, which means you can take a DIGGS schema and customise it to your needs within limitations: you can make optional elements mandatory, you can remove optional elements you don't need, and you can tailor it to your specific needs if you desire. What we wanted to do with this was accommodate a wide variety of practice, in languages, units of measure and so on, and not dictate practice or change workflow processes; this is just designed to be a pipeline to enable the transmission of information from one system or organisation to another, much as Neil showed AGS is used for. In terms of current implementations, one of the things that happened with our 2.5 release in 2020 was that AASHTO, the American Association of State Highway and Transportation Officials, which is a big standards body in the US around highway and road construction, made DIGGS a provisional standard of practice for the dissemination of geotechnical data. That has piqued some interest from vendors and departments of transportation in terms of how best to move towards a more standardised digital interchange of data. One of the problems with getting going, and I think Neil touched on this, is the chicken-and-egg problem: if you're a user of geotechnical information, why should you export or move things around in DIGGS format if there isn't any software to support it; and if you're on the software vendor side, why should you develop support for a standard that nobody is using? Until that breaks, things are a little slow to get going, but that's starting to happen now with DIGGS. Rocscience has a software product used for borehole log display, and they have now developed a module to import DIGGS into their system for producing borehole logs. 4DM has done something similar, except that they support both import and export of the DIGGS format. GRLWEAP, a piling analysis application, will in its new version 14 be able to import CPT and SPT data, along with some other ground information, into its analysis software for geotechnical characterisation and pile analysis. Geocomp now has export of their triaxial test data. TBLog is another boring-log and data-collection software package that can export to DIGGS, and Dataforensics, a small software company that builds tools for Bentley's OpenGround, has developed an import converter service to take DIGGS information and bring it
into the OpenGround cloud. So we're starting to see a fairly rapid initial acceptance of the DIGGS format from a number of different vendors, and these things are coming online; if they're not already online, they'll be online very soon in advertised releases. This recent uptake in vendor software support comes at an opportune time, because a lot of the US state departments of transportation are moving toward digital project delivery at a fairly rapid pace, so this is likely the driver that defines the purpose and need that really didn't exist much in the past, when individual organisations had their own workflows and there didn't seem to be much desire or need on their part for this sort of cross-platform standardisation. So things are coming together in an interesting way over the last couple of years. Another interesting application is a web-based product from a non-profit organisation called Geosetta. Geosetta is compiling and collecting historic geotechnical information, primarily from departments of transportation and from other sources across the US. They're integrating that information with soil map data and groundwater data, and DIGGS is the mechanism by which the data moves in and out of the platform. You can go online, pick out sites, and generate both boring logs and tabular information from the DIGGS data on the fly on the website. Beyond that, they're also using this information with machine learning tools to do predictive analysis and make estimates for soil characterisation, SBT estimates, based on the data being collected. So this is one of these interesting applications: when we begin to standardise these data sources, we can utilise massive amounts of data to feed into machine learning and AI projects for really interesting predictive applications. This is a cool thing that's also beginning, and DIGGS is the pipeline for all of that information. Now, the DIGGS data model. A DIGGS data file, or what we call a DIGGS instance, is simply a collection of XML objects, and if you're not familiar with XML, this is an example of what it looks like. This is actually some information describing lithology data within DIGGS. The fundamental piece is called an element, which is basically structured text that has a tag enclosed in angle brackets, here the legend code, and a value associated with it, here "silt". That's the fundamental piece of an XML element. You can also have data within the tag; we call those attributes. There's no real rule in XML as to which data goes into the element and which goes into the attributes; we use attributes primarily for metadata about the data, and in this case the attribute defines an identifier for a particular XML element. We also use identifiers a lot as pointers to resources: this particular element, the classification code, identifies the type of lithology, here called a sandy silt, and the URL sitting in this codeSpace attribute is a pointer to a dictionary definition of sandy silt associated with the Unified Soil Classification standard. So DIGGS doesn't tell you how to define geology, but it wants you to tell other people where that information comes from.
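For readers less familiar with XML, here is a small Python sketch using the standard xml.etree.ElementTree module that mirrors the idea just described: a nested "object" element, child "property" elements with text values, and attributes used for an identifier and a codeSpace-style pointer. The tag names and the dictionary URL are simplified, made-up stand-ins, not the actual DIGGS schema names.

```python
import xml.etree.ElementTree as ET

# Made-up tag names that mimic the structure described, not real DIGGS elements.
lith = ET.Element("Lithology", attrib={"id": "lith_1"})      # a complex "object"

legend = ET.SubElement(lith, "legendCode")                   # a simple property
legend.text = "silt"                                         # the element's value

classification = ET.SubElement(
    lith,
    "classificationCode",
    attrib={  # attribute pointing to an (invented) external dictionary entry
        "codeSpace": "https://example.org/dictionaries/uscs#sandy-silt"
    },
)
classification.text = "Sandy silt"

# Prints the serialised element as a single line of XML.
print(ET.tostring(lith, encoding="unicode"))
```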
This is done through these codeSpace attributes and the dictionaries, which DIGGS is developing sets of as well, but they can also be your own dictionaries. So we're not telling you how to do things, but we ask you to tell the rest of the world, as you transfer the data, how that information is defined. Obviously, elements can also contain other elements, so this lithology element contains a bunch of other elements incorporated within it, and when we have a complex element like that in DIGGS we call it an object; that's our XML object. In that context, the elements that are part of those XML objects are what we call properties, so the properties of this lithology object include things like its description, its legend code, and the classification symbol and classification codes. What a DIGGS instance looks like, then, is just a collection of all these different types of objects, which can be fairly complex, with objects within objects. We then have this fairly flat, top-level DIGGS data model in which every single one of these objects falls into one of these boxes, which we call object classes. An object class is a related set of DIGGS objects that all inherit from what we call an abstract type, a type of XML object with properties common to everything within that class, and the classes are associated with each other by referencing properties. So rather than having a hierarchical, top-down structure like AGS, we have more of an associative structure between the different types of objects and how they are interrelated. I'm not going to go through all of these in detail; I'll just talk about a few of them. One of the important ones is what we call a sampling feature. Sampling features are physical objects or locations through which we make observations, or from which we conduct an activity of some sort. A sampling feature is essentially a viewport into the world: it provides the dimensionality and the scope of the area we're looking at, and it also allows us to define a reference system that can locate observations within that sampling feature in a more convenient fashion. As an example, if we think about the world we have a target; if our target is the natural ground, we sample that target through sampling features, one of which happens to be a borehole. I'm showing an example here where we may have a borehole that we drill into the ground, or a trench wall, or a face cut into that surface; those are specific types of views into the world, and we model them in different ways. If we just look at the information from a borehole, our understanding of the target is essentially one-dimensional: it's a linear object, and we can locate information along that linear object by distance from some origin, so that's measured depth, standard borehole kinds of things. If we look at something along a trench, we can identify observations within the trench wall in two dimensions: we have an x and a y, or an x and a z coordinate, that identify locations within it. So we classify
these sampling features by their dimensionality. They carry properties that define the shape of the features and also how locations can be locally referenced within each feature. So we do a thing called local coordinate referencing, or linear referencing, which allows us to define locations in a simpler fashion than an absolute coordinate system. If we look at a borehole, for example, we have two geometries associated with it: one is the reference point, which is essentially the origin or the collar of the hole, and the other we call a centreline, which defines the borehole's trajectory; that's a line string whose vertices are defined in a real-world coordinate system in three dimensions. Then we can define an object we call a linear spatial referencing system, where we define the linear element as the centreline, the reference as the reference point or origin of that line, how the measurements are made, which is typically absolute measurements from the origin, and what the unit of measurement is. So we can say that we want a unit of measurement in metres as we track along that borehole, and then we can use that object as our spatial reference system when we define a point. If we have a sample located, say, at 4.2 metres down the borehole, we can have a point location object that defines that location and points to the spatial reference system we've defined for the borehole, and that tells us that 4.2 is measured along the trajectory of the borehole and that it's in metres. We can expand that from one dimension to two dimensions by looking at something like a cross-section, or a trench section, where we define a couple of other geometric features: we have a reference point, we have a reference edge, which is essentially equivalent to the centreline (the red line in this diagram), and then we have a polygon, or a series of surfaces or polygons, that defines the extent of the actual feature in two-dimensional space. Then we can define things within it, either as polygons, lines or points, like the fault, a sample location, or the geologic unit pointed out there. We can define a spatial reference system for that as well, but we add another component besides the linear element and the reference: a thing called an offset vector, which defines the direction of the second dimension. We use a unit vector; in this case we're looking at a vertical cross-section, so the unit vector says that all we have is motion in the z direction, but you can change that unit vector any way you want, rotate that surface, and define any orientation, whether vertical, horizontal in map view, or anything in between. Then we have another extension that adds a further orthogonal vector to define three-dimensional volumes and spaces, going beyond the idea of the single borehole. It's all encapsulated in the sampling feature.
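To make the linear referencing idea a bit more tangible, here is a small, self-contained Python sketch that interpolates a 3D position at a given measured distance along a borehole centreline defined as a 3D line string. The coordinates and the 4.2 m example are invented, and this shows only the geometric idea, not the DIGGS encoding of a linear spatial referencing system.

```python
import math

# Invented borehole centreline: vertices in a real-world (x, y, z) system, metres.
centreline = [(0.0, 0.0, 100.0), (0.5, 0.2, 90.0), (1.5, 0.5, 75.0)]

def locate_along(line, distance):
    """Return the 3D point at `distance` measured along the line from its origin."""
    remaining = distance
    for (x0, y0, z0), (x1, y1, z1) in zip(line, line[1:]):
        seg = math.dist((x0, y0, z0), (x1, y1, z1))
        if remaining <= seg:
            t = remaining / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0), z0 + t * (z1 - z0))
        remaining -= seg
    return line[-1]  # beyond the end of the centreline: clamp to the last vertex

# A sample recorded at 4.2 m "down-hole" resolves to a real-world position:
print(locate_along(centreline, 4.2))
```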
Finally, sampling activities; I'm going to go quickly because I'm running a little behind, but the point is that we separate the activities of sampling from the samples themselves. Basically, we can define a sampling activity that obtains samples; that activity produces a sample, or several samples, which is what activities do. We can then subsample from those through another activity, and those samples can potentially turn into specimens for laboratory testing. Specimens are not samples; they have different properties, which are associated with the test measurements themselves. Also, in our conceptual model we have two ways of defining observations: one is called a measurement and the other is called an observation system. Measurements are used for numerical information that is largely derived from testing equipment or sensors; laboratory tests and in situ tests use this. We have a defined structure, modelled after the GML coverage models, for these specialised measurements. One is called a Test, which identifies measurements or results associated with a spatial distribution of data; that's used for things like CPT tests or standard laboratory tests. We have a flip-side one called a Monitor, which uses a temporal domain, and that's used for our monitoring measurements. Those have similar structures, and they're associated with a procedure object, which defines the metadata and the intermediate observations and results for those tests. We currently have 61 laboratory test procedures defined, 16 in situ test procedures defined, some general procedures, and a dictionary that defines the observed properties; there are 349 of them currently. Observation systems are used for direct human observations, observations that require judgement or analysis, typically categorical data. There are different specialisations of those, but these are the types of information that typically go into borehole logs and that are descriptive or interpretive: soil classifications, stratigraphic designations, colour descriptions, geotechnical unit determinations, identification of faults; those sorts of things go into structures that are part of our observation systems. So those are the two types of observations. We're also developing some extensions to handle geophysical data; the challenges there are the massive amounts of information and the complexity of the raw data. We do this in two ways. One is for processed geophysical data, the final results, which use our Test object like every other measurement; we've developed a procedure object to report the processing steps, and we've added some pieces, some sampling feature specialisations and some geometries, to enable more compact encoding of that information so that we aren't blowing things up, as well as defining spatial-temporal coordinate referencing systems for things like seismic time sections and time volumes, in 2D and 3D. Then we have a container object for carrying raw geophysical data, and a procedure object that allows us to represent the metadata associated with the field survey, including source-receiver geometry, source-receiver configurations, and the equipment types.
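As a very loose illustration of the "coverage" idea behind the Test and Monitor objects (a domain of positions or times paired with a range of measured values), here is a small Python sketch for a fictitious CPT result and a fictitious monitoring record. The field names and numbers are invented; this sketches only the concept, not the actual DIGGS encoding.

```python
# Fictitious CPT data: the "domain" is depth along the sounding, the "range"
# holds the measured properties at each depth (concept sketch only).
cpt_test = {
    "procedure": {"name": "cone penetration test", "cone_area_cm2": 10},
    "domain": {"depths_m": [0.5, 1.0, 1.5, 2.0]},
    "range": {
        "tip_resistance_MPa": [1.2, 1.8, 2.4, 2.1],
        "sleeve_friction_kPa": [15.0, 22.0, 31.0, 27.0],
    },
}

# A Monitor-style result swaps the spatial domain for a temporal one.
monitor = {
    "domain": {"timestamps": ["2024-01-01T00:00Z", "2024-01-02T00:00Z"]},
    "range": {"groundwater_level_m": [3.2, 3.1]},
}

for depth, qc in zip(cpt_test["domain"]["depths_m"],
                     cpt_test["range"]["tip_resistance_MPa"]):
    print(f"{depth} m: qc = {qc} MPa")
```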
We're also getting into construction activities. These include primarily grouting right now, and we're moving into piling construction. We have two object classes for that: one for describing the construction activity, and the other we call a program, which is where we specify design criteria, specifications and performance specs. We've done this now for grouting; we support rock permeation, compaction and soil-mixing grouting, and we have a series of grout sample test procedures associated with these. The program components include grout mix design, grouting specifications and performance specs. We're currently working on developing data structures for measurement-while-drilling data, driven pile installation data and as-built driven pile information, along with code list and dictionary development and API development. Finally, we've been involved, as AGS has, with the Geotech IE. We're involved because we want our system to be interoperable and we want to learn what other folks are doing, so it's important for us to try to maintain interoperability with other standards and systems, and to align our structure with the OGC conceptual models for boreholes, sampling features, samples, specimens and the like. We've been quite successful in doing that, in large part because DIGGS incorporates a lot of these OGC concepts in its structure. We've contributed object property definitions and measurement property definitions to this effort, and we've helped develop requirements and extend the OGC SensorThings API to be able to accommodate these data; we've been successfully mapping to and from that system. So that's it; thank you very much, and if there's time for any questions I'll take them, otherwise we can do them later. Yes, thank you Dan, especially for attending this meeting at five in the morning where you are. As you said, we are a bit out of time, but I will take one question, which was about why you chose XML; I think the question was really XML versus JSON, but I guess it was simply that there was not so much JSON around when you started? There wasn't JSON at that time, and actually, because we wanted to leverage the geometry definitions from GML, that wedded us to XML. Given the way DIGGS is structured right now, I think XML is still the best format for it. JSON does have some limitations: it doesn't handle namespaces easily, although that is becoming better developed; at the time there were no JSON schemas, and that has changed; but there is still a lot more flexibility in the structure with XML. I'm not certain that we'll be moving to JSON soon, but as JSON continues to develop, that might be something we move toward. Okay, thanks. So I would propose to move on, and welcome Jonas. Jonas, if you're still with us? I am still here, hello. Fine, hello. So, same as for Neil and AGS: people want to know about IFC, so let's see what I can do. Good afternoon from Austria, and hello to all the other time zones. I will start sharing my screen, if it works. Here we are; now I need to organise my windows, and here we go. Okay, that's the final one. Yes. I can tell you a little about the IFC Tunnel project and the initiative to include geology and geotechnics in the IFC schema, starting with the motivation, of course. As most of you know, in tunnelling and other infrastructure projects you are dealing with the challenge of large volumes of data, big and complex sets of factual data especially, and you need to translate your geological models and geological conditions into geotechnical models that are
Okay, thanks. So I would propose to move on, and welcome Jonas. Jonas, if you are still with us? — I am still here, hello. — So, same as for Neil and AGS, people want to know about IFC; let's see what I can do. Good afternoon from Austria, and hello to all the other time zones. I will start sharing my screen... here we are, I just need to organize my windows... and here we go, that's the final one.

Yes, I can tell you a little bit about the IFC Tunnel project and the initiative to include geology and geotechnics in the IFC schema, starting with the motivation. As most of you know, in tunnelling and other infrastructure projects you are dealing with the challenge of large volumes of data: big, complex sets of factual data. In particular, you need to translate your geological models and geological conditions into geotechnical models that are adequate for certain design issues, and you frequently need to verify your models when additional data come in. Infrastructure projects are associated with high construction costs, and the ground conditions are responsible for, limit or define those costs and are a huge factor of uncertainty. This is the reason why many infrastructure owners are aware that they need to pay attention to this, and as they develop their digital tools for design and build BIM models and digital twins for their structures — for construction, design and operation — the ground conditions have also come into focus.

I start this from the other side, because IFC was of course not developed for ground models and geotechnical data; it came from the design side. Why we need ground models, and why we need data structures to handle our data, is clear to all of us — but why IFC? IFC is the established standard in open BIM projects in many parts of the world. One benefit is that it includes geometry along with properties and a detailed semantic model for buildings, equipment and many other things. So far, ground models have not really been supported by this schema; however, IFC ground models are already commonly requested by infrastructure owners, who say: we are doing the project in BIM, we use building information modelling, so please give us your ground model in IFC so that we can include it in our common data environment. So far, all these requests have had to be answered by custom solutions — proxy elements and existing schema constructs used to carry the geometry and the properties of the ground models — and this of course puts big limitations on these models and their applications. In BIM projects where the ground model comes in via IFC, it is usually only used as a background model; there are not many model-based workflows or model-based interpretations and analyses on IFC models so far, from my experience and the experience of our group. But we think this can change if the IFC schema supports geotechnical data and becomes more established in the software landscape for ground modelling and in our domain tools. Of course, there is no intention to replace the established formats we just heard about — that is why I started this introduction from the other side. There is a request to bring ground models and geotechnical data into the BIM world, and IFC could be one vehicle for that; one argument is that many of the standards we know, AGS and DIGGS for example, are not so strong on the geometry side.

So let's have an overview of what is happening at the moment on the IFC side in the course of the IFC Tunnel project. The project started with a requirements analysis in which domain experts — me, Michael and many others who are also joining this call today — wrote the requirements analysis report that maybe some of you know, defining use cases and typical exchange scenarios for geotechnical data and ground models. On that basis, the taxonomy and the conceptual model were developed. Of course, this went along with a lot of coordination:
people from very different backgrounds and very different initiatives sitting together at the table, bringing in what already exists, their personal experience, and contributions from all parts of the world, trying to set up a conceptual model that reflects, on the one hand, the workflows we think are relevant according to the requirements analysis, but that also takes what is there in existing standards like AGS, DIGGS and so on. This was the big activity of us domain experts, let's say, and the goal of the conceptual model was reached by the end of 2022. In 2023 the draft of the schema extension was shared with the technical team and with software vendors in the course of the deployment program, and at the moment the implementation of this conceptual model in the schema is happening; I will show you on some slides later where you can have a look at the draft versions.

The focus for us, according to our requirements analysis, was the few points you see here. There was a discussion about whether factual data should be included or supported by IFC at all: there are existing standards, so maybe it is not necessary to exchange them in IFC, and the question was whether it is really necessary to bring them into the common data environment of a huge infrastructure project — who wants to see the geotechnical tests in there? One argument for offering this option was visualization: you have a format that supports 3D and can easily be imported to show where information exists and where the gaps are. Another idea was to widen the range of tools that can be used to make spatial and geometrical analyses in 3D: with all the tools that are developed for BIM models, once the ground or the geotechnical data are available in IFC, many other tools can use them, and people can develop their own custom workflows for further investigations and analyses. Then, one goal was of course to support geotechnical models for design, and we also agreed in the domain team that broader or more general geological models are a most important source when we develop a model for a certain design case, so we want to support geological models as well as very specific geotechnical design models. Another goal was to link the ground model to the building model, to use the model to describe the ground–building interaction and to use it for prognosis and as support for design applications.

This is illustrated in this slide. You see again the "books" that appeared before and that Michael mentioned; they came from the French standard and have become established in our group over the last years, and not only there but also in several other publications. We think it is most important to distinguish what are observations and factual data, and what is interpretation. Of course the interface is not always clear — there is always a certain amount of interpretation included in an observation — but we need to express this somehow in the model, or to structure the model accordingly. Factual data typically include all kinds of geotechnical testing and field mapping; existing literature can even be imported or linked; monitoring data are an important source; and all of these should be supported by the IFC schema, to show what the database actually is before you do the interpretation.
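As a plain illustration of this Book A / Book B split — not IFC entities, all class and field names below are invented — the key design decision is that an interpreted unit keeps references back to the observations it rests on:

```python
# Illustrative sketch only (invented names): factual observations versus an
# interpreted unit that records which observations it is based on.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Observation:              # Book A: factual data
    observation_id: str
    method: str                 # e.g. "SPT", "field mapping", "piezometer"
    location: str               # e.g. borehole id or mapped zone
    depth_m: Optional[float]
    result: dict                # raw measured values


@dataclass
class InterpretedUnit:          # Book B: interpretation
    unit_id: str
    model: str                  # e.g. "geological", "geotechnical design"
    description: str
    based_on: List[str] = field(default_factory=list)  # observation ids


obs = Observation("OBS-17", "SPT", "BH-001", 12.5, {"N": 23})
unit = InterpretedUnit("GU-3", "geotechnical design",
                       "medium dense sand", based_on=[obs.observation_id])
```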
Then the interpreted models include not only geotechnical but also geological aspects — we will go into detail on that — and, with this example of a tunnel, we think it is most important to show the connection from a volumetric model to, for example, a linear structure like this tunnel, and to have a design-related prognosis model: a design-related element in your model schema that describes certain conditions specifically for that structure. This is the model that describes the building–ground interaction, and it is the one that can be used for further interpretation and further applications like contractual topics, time and quantity estimates, and cost.

So what are the most important aspects of those models, according to our analysis? Regarding observations, we made a collection of typical data that we think are essential to support; we also had to exclude some things, put the focus on certain ones, and bring it to a level of detail that is adequate, so as not to overload the IFC schema and not to interfere too much with the existing standards. Then we needed to decide which information should be linked to an object — what are actually the objects that we model, and what are other, semantic elements that do not necessarily need a geometry. Most important for us, in order not to overload the schema, was the chance to link external files: to use IFC as a kind of front end and visualization tool connected to the design model, but with the possibility to include very complex data sets. And then one thing that is already commonly applied in IFC is custom property sets, because of course we cannot define all the properties needed to describe every test and every aspect that might be relevant in a particular project, in a particular country, according to a particular standard, and so on.

For Book B, we said we wanted to have parallel models: you can classify the ground according to different aspects — you can have a stratigraphic geological model, a model of tectonic structures, a hydrogeological model, and a geotechnical model for a certain design case, but you might have a different geotechnical model for another application — and it must be possible to have multiple models that describe the same piece of ground. This was one condition. And, as I mentioned before, the geotech synthesis model is our way to define a link from the ground model to the building.

So what is the status? The conceptual model has been completed, as I said. Currently the technical team is working on it, along with the deployment project, where the software vendors are already included. The goal is to deliver the proposal in the course of this summer, and then the buildingSMART International board will need to review it and give its approval; this is expected to happen at the end of this year. If you are more interested in this process and the status, you can use the link below — the slides will of course be shared.

So let's have a look into the content and see which elements have been defined, starting with Book A, observations and measurement results. There are different kinds of objects.
The first ones I want to show you are these containers, let's say — the spatial elements that describe where a certain observation or measurement has been made. A typical example would of course be a borehole; it can also be a mapped zone or a tested zone, something where a certain method has been used. You might be aware that IFC 4.3 already included a simple concept for ground models, including boreholes. The shortcoming, from our perspective, was that this scheme only included interpretations that are mapped onto a borehole, to say that this part of the borehole belongs to a certain interpreted unit. We said we need to support Book A, so we changed the definition — we changed the semantics but kept the IfcBorehole element from 4x3 — and we established the possibility to attach observations to this object.

There are many options now for attaching your information to the model. The simplest one, which is commonly used already, is to model only the borehole in 3D as a cylinder, show it in a common data environment — look, we have a borehole here — then include a hyperlink in the properties, and somebody can drop the files on SharePoint or wherever, to query them. Of course there are much more sophisticated ways to do this in IFC, and the schema shall support linking external files to borehole objects. But it should also be possible to go into more detail and model the logs, or the observations made along the borehole, as 3D objects, and to attach property sets for the logging data. The schema that is now in the proposal includes a standardized, slim set of key properties that are usually logged, but, as I said before, most important is the option to use custom property sets, because borehole logging is never performed in the same way all over the world in all projects: there are a lot of software-specific, country-specific, standard-specific and geology-specific aspects you need to log, and it must be possible to define the properties you want to add to the borehole log according to your needs. It should also be possible to map existing standards and data dictionaries, like AGS or DIGGS, into such custom property sets — to use IFC really as the front end but keep the data structure as it is in the other standards. I have linked in a few places here the guideline published last year by the German–Austrian–Swiss branch of the ITA, because this was one of the initiatives running in parallel with IFC Tunnel and we adopted a lot of the thoughts from IFC Tunnel into that guideline; it might be interesting — it is available in English and in German — and you can have a closer look at it, especially regarding the granularity levels of models described there.

Now let's go to the measurements themselves. We mentioned the containers, the spatial objects; now some observation types have been defined, starting with the borehole log, a measurement-while-drilling interval, certain tests, and certain observations made during field mapping. Of course this does not cover all kinds of observations, but it is at a high enough level, I would say, to include most of the observations you can make, because, as I said, you can model the observation and then define your custom property sets to further distinguish what was actually observed or which test was performed — I will get there in a few minutes.
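As a rough illustration of this "IFC as front end" pattern — a borehole object carrying a slim custom property set plus a link to the full log kept in its native format — here is a minimal sketch using the open-source ifcopenshell library. This is not an official IFC Tunnel example: the property set name, property names and URL are invented, and IfcBuildingElementProxy is used as a generic stand-in rather than the IfcBorehole entity available from IFC 4.3 onwards.

```python
# Hedged sketch of the "front end" idea: a proxy object standing in for a
# borehole, with a custom property set that carries a few logged values and a
# hyperlink to the full log kept in its native format (AGS, DIGGS, ...).
import ifcopenshell
import ifcopenshell.api

model = ifcopenshell.file(schema="IFC4")

project = ifcopenshell.api.run("root.create_entity", model,
                               ifc_class="IfcProject", name="Site investigation")
borehole = ifcopenshell.api.run("root.create_entity", model,
                                ifc_class="IfcBuildingElementProxy", name="BH-001")

pset = ifcopenshell.api.run("pset.add_pset", model,
                            product=borehole, name="Pset_BoreholeLog_Custom")
ifcopenshell.api.run("pset.edit_pset", model, pset=pset, properties={
    "TotalDepth_m": 32.5,
    "DrillingMethod": "rotary core",
    "ExternalLog": "https://example.org/project/BH-001.ags",  # hypothetical URL
})

model.write("bh-001.ifc")
```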
Another example is tunnel documentation, because for IFC Tunnel it is of course necessary to include not only the information collected before the project but also information collected during execution — to include the documentation from inside the tunnel and to compare predicted versus encountered conditions. You can say that the tunnel face is an area where observations are made: it is a mapped zone, in this terminology, and on this mapped zone you can measure discontinuities, certain units and lithologies, water inflow and so on. Any of these observations can have a geometry and still be linked to the spatial element. It can look like this: you have the tunnel face itself, then two mapped units — one part here, one part here — and then you can go into more detail and include traces of discontinuities, overbreak and so on, and attach these geometries to your mapped unit and to your tunnel face.

Now some more details on the test results. We said we cannot copy existing data dictionaries and we cannot go into every detail and define ten different types of stiffness — elasticity modulus, deformation modulus, static and dynamic modulus and so on. There was a lot of discussion about how many properties should be included, and we agreed to do this in an exemplary way: to define some example properties that show how the model elements can be used. For example, we can have a property set for the dilatometer test, one for the CPT, one for the SPT, but with just a few key properties, using the definitions and terminology of the existing standards, so that any user can see how the observation and test result elements are meant to be used.

If we go into the documentation now — you will be able to use this link when you get the slides — you can jump to the buildingSMART International homepage, go to the change logs, scroll down, and look at the new entities included in IFC 4x4. I would like to take the example of the geoscience observation: you see the definition of a geoscience observation — detailed collected information, including measured parameters and so on — and you see that there is a geoscience observation type enumeration with several types of observation already defined: it can be a borehole log, a mapped feature, an in-situ test result. This is the level in the schema at which the kinds of observation are defined, and if you want to go into further detail — if you want to describe what kind of test was applied — you add the relevant property set to the test result element. The property set would be, for example, one for the in-situ CPT test, and what you see there is very thin; there is not much detail, it just lists cone resistance, sleeve friction, pore pressure, Young's modulus. There will be a few more such tests supported, but the intention is to leave this open, because we cannot try to copy the data dictionary from AGS and include hundreds of tests and hundreds of properties for the actual tests; we just provide the interface where this information can be connected.
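A toy sketch of that "thin interface" idea: a few key CPT values go into the slim test-specific property set, while everything else travels in a custom property set, with the source data staying in its native dictionary. The field and property names below are loosely inspired by AGS-style group headings but are my own invention; they are not the identifiers of the IFC 4x4 draft or of the AGS or DIGGS dictionaries.

```python
# Illustrative mapping only: one CPT reading from an existing dictionary
# split into a slim, test-specific property set plus a custom property set.
ags_style_record = {                       # pretend values from a CPT export
    "LOCA_ID": "CPT-07",
    "SCPT_DPTH": 4.20,                     # depth (m)
    "SCPT_RES": 12.4,                      # cone resistance (MPa)
    "SCPT_FRES": 0.18,                     # sleeve friction (MPa)
    "SCPT_PWP2": 0.35,                     # pore pressure u2 (MPa)
}

pset_cpt = {                               # slim, standardized part
    "ConeResistance_MPa": ags_style_record["SCPT_RES"],
    "SleeveFriction_MPa": ags_style_record["SCPT_FRES"],
    "PorePressure_MPa": ags_style_record["SCPT_PWP2"],
}

pset_custom = {                            # anything project/standard specific
    "SourceStandard": "AGS4",
    "SourceLocation": ags_style_record["LOCA_ID"],
    "Depth_m": ags_style_record["SCPT_DPTH"],
}
```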
There is also one common property set, because for many observations you need the same records independently of the test: at which depth in the borehole was it performed, what was the method, what type of test, and so on. There are further common property sets to describe the date and time, who worked on it, what the purpose of the test was, and so on.

Going back to the presentation: if you want to go into more detail and see what is actually being discussed, how decisions were made, or what is still pending — because you will see that many of the elements shown on the slides of the conceptual model are not yet available in the schema documentation — you can go to GitHub, which is also open to the public, search the issues, and find, for example, the comments on how to deal with IfcBorehole, how to deal with the geoscience observations and geoscience features, and so on. There is a lot of detail, you can participate in the discussion, and you can see what is behind the concepts that the technical team used to prepare the proposal to buildingSMART International. No time to go into more detail on that.

Back to the presentation, and on to Book B, the interpreted models. We said we need to distinguish several kinds of classification, so we now have three parallel concepts. You can have a geology model with geology elements like a geological unit, a fault, a contact or even a fold structure — those terms were mostly adopted from GeoSciML. Then you can have a hydrogeological model, which was aligned with GroundWaterML. And you can have a geotechnical model, where we distinguish two features: a geotechnical unit, which would represent for example a layer or a certain volume, and a discontinuity, which might carry discontinuity strength properties such as friction angle and cohesion for a feature that is actually modelled in 3D. This is again an overview of these units; you will find all of them in the documentation on the buildingSMART International homepage that I showed before.

I want to pick out the one that is most relevant for geotechnical engineers and designers. We usually provide a geotechnical model specific to a certain project and a certain application, and we define ground properties for the design. For now, the schema includes three approaches to describe the properties of such a geotechnical unit. One is a property set for soil-like materials, with properties for soil — of course this list could be endless, depending on the material models you want to apply in your geotechnical analysis, but again the approach would be to use custom property sets for anything beyond it. The second option is the description of rock material: we said that in rock you typically need to describe the intact rock properties and the discontinuity properties, and you usually have rock mass parameters that you use on a larger scale, so there are three property sets for these types of units.
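A plain-data sketch of that idea — invented names, not IFC entities: a rock-like unit carries the three rock-related property groups just described, while a soil-like unit carries a single soil property set instead.

```python
# Sketch only: a geotechnical unit with soil-like or rock-like property sets.
from dataclasses import dataclass, field
from typing import Dict

PropertySet = Dict[str, float]


@dataclass
class GeotechnicalUnit:
    name: str
    intact_rock: PropertySet = field(default_factory=dict)
    discontinuities: PropertySet = field(default_factory=dict)
    rock_mass: PropertySet = field(default_factory=dict)
    soil: PropertySet = field(default_factory=dict)


granite = GeotechnicalUnit(
    "GU-1 granite",
    intact_rock={"UCS_MPa": 120.0, "E_GPa": 45.0},
    discontinuities={"friction_angle_deg": 30.0, "cohesion_kPa": 0.0},
    rock_mass={"GSI": 60.0},
)

sand = GeotechnicalUnit("GU-2 sand",
                        soil={"friction_angle_deg": 33.0, "E_MPa": 40.0})
```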
Then we said it might be helpful to see a rough idea of your materials in the model, but usually in tunnelling you cannot predict in detail where these materials are: you have a rough idea from the geological model, but you cannot say that this ground type is here with 2 m thickness and so on. This is again a question of model granularity, and the third approach allows you to describe the distribution of certain materials purely in the semantics. Your model could look like this: you have a detailed model close to the portals, or in an area of high interest where you have more knowledge and more investigation data, where you model your materials in detail and define the geotechnical properties; but for most of the tunnel you will not be sure, and you just say that in this part you expect a certain distribution — say 60 to 80 % of this unit and the rest the other material. This can also be handled in the properties, and it gives the designer an idea of what to expect.

This brings us to the topic of uncertainty. As you know, in all of these working groups you end up in discussions about uncertainty, how to handle it, how to classify it and how to transport it, and we all know that a model is not useful without a description of its uncertainty. There are several approaches to bring this out of reports and discussions and into the model. One approach is to use different definitions for geotechnical units — that is what I just showed — and here I want to point you again to the guideline mentioned earlier, where this concept is explained in more detail. Then, something that shall be included in the IFC Tunnel concept is a parameter distribution model: any kind of overlapping model that describes a certain property in 3D, whether with a voxel model, shell models or some other parametric description. If you have an agreed classification for your uncertainty, you can of course provide it along with your model. You also have the chance to use the geotech synthesis model — the model that connects your ground model to the design or to a certain alignment — to define uncertainties along this alignment with properties. And of course, in the factual data or in the recommended geotechnical parameters, you can work with property ranges, to say that the cohesion is, say, 100 to 200 kilopascals: it is possible in IFC to use bounded values instead of fixed single values. One thing that is currently ongoing is further work on uncertainty property sets: taking existing ratings and approaches from the literature and from scientific publications and including them as a property set that can be attached to certain objects — a contact, a layer, but maybe also the overall model boundary — to say that these are the uncertainties associated with this model. The benefit would be that whoever receives the model gets all your input on its uncertainty along with the model itself.

With this I am at the end, and I hope there is some time left for questions.
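A sketch of those two ideas as plain data, with invented names: a bounded property value instead of a single number, and a semantic material distribution for a tunnel section where the geometry is not known in detail. In IFC itself a range like this could be carried by a bounded-value property (for example IfcPropertyBoundedValue), but that mapping is not shown here.

```python
# Illustrative only: a bounded parameter and a semantic prognosis of material
# distribution along part of an alignment.
bounded_cohesion = {
    "property": "cohesion",
    "unit": "kPa",
    "lower_bound": 100.0,
    "upper_bound": 200.0,
}

section_prognosis = {
    "alignment_range_m": (1200.0, 2400.0),   # chainage interval
    "expected_distribution": {                # expected volume fractions
        "GU-1 granite": (0.6, 0.8),           # 60-80 % expected
        "GU-2 fault material": (0.2, 0.4),
    },
    "uncertainty_class": "C",                 # project-specific rating
}
```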
We have one question, but it is more about implementation: people say that in the software tools they can only handle IFC 2x3, so there was a question about the implementation of IFC 4.3 or even 4.4 — do you have some information on that?

Yes. That always depends on the software vendors, of course, and this is the problem we all face. We are working on the edge, let's say, pushing the limits and trying to bring things into the IFC candidate versions, while the classical BIM software and most projects do not even use 4x3 yet, because they are always behind, and there have been big jumps in the last upgrades. Of course it will be a long way until this really reaches the market. I am aware of that, and it is of course painful, but you have to start at some point. I don't know who is in this group — if somebody from the software-development side is here they can comment — but of course this is a big hope for us. There are some viewers that can already open the candidate models, BIMvision for example: you can open the test models that have been exchanged with the technical team in BIMvision, and there might be some others. But until this really reaches the geotechnical software — until you can export your ground model from your modelling software, or request test data or borehole logs associated with an IFC file — there is still a way to go.

Thank you. There were also other questions, but the team answered them in the chat — Kazunori and others came to help — so I guess we will have more discussion about that. Thank you, Jonas, and I propose we move to the last presentation, which is about OGC standards and also what we did in the Geotech IE, so I will give the floor to myself. Just let me know if you can see my screen... yes? Here we are — thank you — my keyboard was dead for a moment, okay.

So, to finish our overview of the standards for geotech, here is an overview of OGC standards, especially one that has been mentioned before. OGC is the acronym for the Open Geospatial Consortium, the home of geospatial innovation, collaboration and standards. It is a very large group that defines standards for many different topics, which have in common the wish to expose data in conformance with the FAIR principles: to make data findable, accessible, interoperable and reusable. Many things are addressed in OGC, but in particular you find people from different domains providing data models — in geoscience, for example — and APIs, application programming interfaces, which are the tools that make the data accessible. OGC is certainly well known for the APIs that made it famous: some you have certainly already used are the OGC web services — the WxS services such as WMS and WFS — which are very convenient for accessing data like maps and geological data of all kinds, and there is also a newer series of APIs, for example OGC API Features and the SensorThings API that has already been mentioned.

Regarding the OGC data models for geoscience, the main one is GeoSciML, a data model intended to cover geoscience in general, starting with geology; then there was a first extension for hydrogeology called GroundWaterML2. The way these standards are developed and extended is discussed in a group called the Geoscience Domain Working Group, which decides on the extensions that should be made, and to do that we explore possibilities with interoperability experiments. We had one on boreholes, a second one on environmental linked features and web semantics, and there was also the Geotech IE that I mentioned before. All these activities help to make the geoscience standards as good as possible in order to address different purposes.
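To show what "accessible via these APIs" can mean in practice, here is a minimal, hypothetical sketch of querying borehole features from an OGC API Features endpoint. The server URL, collection name and property names are invented; only the general request pattern (collections of items returned as GeoJSON) comes from the standard.

```python
# Hedged sketch: fetching borehole features from an OGC API Features service.
import requests

BASE = "https://example.org/geoserver/ogcapi"          # hypothetical server
url = f"{BASE}/collections/boreholes/items"            # hypothetical collection

resp = requests.get(url, params={"limit": 10, "f": "json"}, timeout=30)
resp.raise_for_status()

for feature in resp.json().get("features", []):
    props = feature.get("properties", {})
    print(feature.get("id"), props.get("name"), props.get("total_depth"))
```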
So, what is the interest of the OGC Geoscience Domain Working Group in geotechnics? The first point — going back to David Toll's paper — is that GeoSciML has been designed with this kind of extension in mind. We first addressed the topic of geological maps and observations, but the idea of GeoSciML — and that is why it is not GeologyML but GeoSciML, for geoscience markup language — is to be extensible to other geoscience data, such as geotechnics. The objectives we had when working on this topic were to achieve semantic consistency in the geoscience community: we believe that the people of AGS, DIGGS and IFC who address geotechnics all belong to the geoscience community, and the idea was to avoid barriers between us and to facilitate the exchange of data. What we target is a seamless transition between GIS and BIM: you certainly use GIS daily, maybe BIM as well, and although these are two different environments, the data you want to use are often about the same things, even if they are not represented the same way. What the OGC Geoscience Domain Working Group brought to the project was GeoSciML and GroundWaterML2 as a basis for extensions towards geotechnics. Our idea in OGC was also to show what data models can do, including those from OGC, and one thing that is quite nice about the OGC standards is their connection with the OGC APIs and web services: all the data models proposed by OGC have been designed to be made accessible through these APIs, and that is one of their qualities — they go nicely with the APIs.

Regarding the deliverables we produced for the Geotech IE, here is an example of what you can see: for a given concept — for instance the material sample — you can see its different flavours depending on whether you are talking about the IFC version, the OGC version, AGS or DIGGS, with the different definitions and also the properties we identified as nice to have, independently of the tools you want to use. There are many of these; material sample is only one of them, and all the descriptions are available on the wiki mentioned here.

As I said before, our idea was also to extend the OGC standards. As Jonas mentioned, GeoSciML was a basis for many of them, but there were still things missing in order to properly address geotechnics. For example, the geotechnical unit, identified as relevant for geotechnical purposes, is proposed as a new concept in GeoSciML, and we also worked out how to connect it to the existing objects in GeoSciML and GroundWaterML2 — plenty of things were already there but needed some extension. And when it comes to describing the surrounding constructions and the design of the building, there are also OGC standards that can help, especially if you want to build this seamless transition between GIS and BIM: if you do not have a model of your infrastructure, or rather of the surrounding constructions, in BIM, you may have them in GIS, and that can be useful for your purpose, so we help you find the suitable data for that.

Another thing we worked on in this Geotech IE was how to provide the geotechnical investigation data. We had long discussions in IFC about whether this topic should be addressed at all, and the idea here was how to facilitate the reuse of the existing data models, like DIGGS and AGS. We worked on the basis of an ISO standard — at first it was an OGC standard that was then brought to ISO, which is something we are happy about — on how to describe these observations.
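The concept comparison mentioned above can be pictured as a small lookup structure: one shared concept, four "flavours", and an agreed set of properties worth carrying regardless of the tool. All the names below are placeholders I made up to illustrate the idea; they are not the real identifiers used by IFC, OGC, AGS or DIGGS.

```python
# Toy sketch of a cross-standard concept mapping for "material sample".
MATERIAL_SAMPLE_MAPPING = {
    "concept": "material sample",
    "shared_properties": ["identifier", "sampling method", "depth interval",
                          "date sampled", "parent borehole"],
    "flavours": {
        "OGC":   "sampling-feature-like element",
        "IFC":   "proposed geotechnical sample element",
        "AGS":   "sample group / rows in a data file",
        "DIGGS": "sample object in the XML schema",
    },
}

def shared_fields_present(record: dict) -> list:
    """Return which of the agreed shared properties a given record carries."""
    return [p for p in MATERIAL_SAMPLE_MAPPING["shared_properties"]
            if p in record]

print(shared_fields_present({"identifier": "S-12", "depth interval": (3.0, 3.5)}))
```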
As you can see in the picture on the right, which comes from a colleague, there are multiple ways to state that this face is happy. You can just record it as a pair of parameter and value, but the idea in Observations, Measurements and Samples is to go beyond that and provide the observation result together with all of its context: you do not just provide the result and what it is about, you also describe the observed property, the feature of interest, when and where the observation was made, by whom and how. This standard now also includes the steps of sampling and preparation that we often use in geotechnics. So we worked on this common basis, tried to find the connection between AGS and DIGGS, and looked at how it can be exposed in IFC, in order to reach this level of information and make the data reusable. There are plenty of other capabilities, like the fact that you can link observations together and thereby keep track of the provenance of the data, which is very useful in geotechnics — knowing how the data were produced.

We also extended an API, the SensorThings API, to fit the purposes of geotechnics. This API goes very nicely with the Observations, Measurements and Samples data model I mentioned before, and the idea was to extend it to boreholes, which are linear features. You can see how the data model looks and which classes enable you to describe what: you have, for example, the description of your borehole, its location and trajectory; you can describe the different samplings made from it; and you can associate an observation with a point, a line or an interval in the borehole, or with the sample itself. This is the data model we worked on; we tested it with AGS and DIGGS data and then reported the results. You can find them on the wiki of our page, including guidance and a demo, and the plugins that were developed are freely accessible to you.

As a kind of conclusion about OGC standards for geotech: I showed you that we managed to identify the semantics we are using — they are not perfectly aligned, and we can see the differences between IFC, AGS and DIGGS; we made multiple proposals for OGC standards extensions targeting consistency, which is our main goal; and we experimented with one API that could help to provide geotechnical data. The main use case we have in mind is that these APIs can bridge the gap — make the connection between the BIM and GIS environments. You can, for example, think about accessing geotechnical investigation data from a BIM environment thanks to an OGC API, about exposing data you have already organized with AGS through such an API, and — something we demonstrated — about accessing descriptions of geotechnical objects held in GIS from the BIM environment. These are the things we imagine would really help in order to target BIM and digital twins for geotechnics.
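A minimal sketch of what "accessing geotechnical investigation data thanks to an OGC API" could look like on the wire, assuming a SensorThings-style service is available. The endpoint and the borehole naming are invented, and the geotechnical extension itself is not reproduced here; only the generic SensorThings query pattern is shown.

```python
# Hedged sketch: pulling observations from a SensorThings API service. The
# entity names (Things, Datastreams, Observations) and the $filter/$expand/
# $top query options are part of the OGC SensorThings API; the URL is not.
import requests

BASE = "https://example.org/FROST-Server/v1.1"        # hypothetical endpoint

resp = requests.get(
    f"{BASE}/Observations",
    params={
        "$top": 20,
        "$expand": "Datastream($expand=ObservedProperty)",
        "$filter": "Datastream/Thing/name eq 'BH-001'",  # hypothetical borehole Thing
    },
    timeout=30,
)
resp.raise_for_status()

for obs in resp.json().get("value", []):
    prop = obs["Datastream"]["ObservedProperty"]["name"]
    print(obs["phenomenonTime"], prop, obs["result"])
```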
So here you go, that is the end from my side, and if you have any questions I can answer them. Do we have some questions for me? Let me have a look... no, it seems rather quiet on Slido. Okay, no problem — we are already past 4:00 here in Paris, so I will just propose that we do the wrap-up for this session. Let me share my screen again — if my computer cooperates.

So, the wrap-up, coming back with my hat of vice-chair of this group: some conclusions and perspectives. Thanks to this topic, I think we demonstrated that digital standards for geotechnical data are not a new topic: several standards groups have been working on it for a long time and continue to do so, but digital twins and BIM have relaunched the interest in, and the need for, geotechnical standards. This offers the possibility to work together, and of course this target also brings new perspectives and new technologies that could help us reach what we dream of regarding the provision of geotechnical data. So there is no question about the role of ISSMGE TC222: in our terms of reference we plan to continue to support the standardization activity that has been started in geotechnics, especially in connection with buildingSMART International, OGC, AGS and DIGGS. We also think that ISSMGE TC222 might be a place to discuss things that are more related to the domain of geotechnics than to the technologies: it could be a place to discuss the vocabularies that should be aligned, like the list of geotechnical procedures or the list of observed properties, so that we have a common basis. I also open the door to the APIs, as the DIGGS and AGS teams mentioned that this is something under study for them.

Regarding the next events, we will continue to offer workshops — we hope you liked this one. The next one will not be too far away: we will try to hold it in the last week of May, between May 27 and May 31; it is not scheduled yet, but we are working on it. We have two topics we would like to address: national geotechnical databases — or, more exactly, countries that find it valuable to share geotechnical data in common in order to build sustainably — and the links with AI and machine learning, since there is another TC in ISSMGE addressing that. To help us organize this, we invite you to contact us and propose a presentation if you are willing; we will be happy to receive proposals and will plan based on the answers we get. Another meeting, a physical one, should take place in Lisbon at the end of August, as the first physical meeting of ISSMGE TC222 — if you plan to go there, we will be happy to see you. I remind you that you can follow us on our website, on LinkedIn and on YouTube, and if you want to become an ISSMGE TC222 member, that is possible, but you first have to get in touch with your national society: there is a list of the ISSMGE national societies available from the website, and if there are empty places you can apply to become one of us; we would be happy to have you with us.

So thanks for your attention. I would also like to thank all the presenters who gave their time to make this a nice meeting. I hope you enjoyed it; I will say goodbye, and see you soon. Thank you.