Transcript for:
Webinar on Cyber Security Testing and Evaluation by CSIAC

Welcome, and thank you for joining us for this webinar presentation. We are the Cyber Security and Information Systems Information Analysis Center, or CSIAC, one of three IAC domains in the DOD Information Analysis Centers, operating under the Defense Technical Information Center (DTIC) within the Office of the Under Secretary of Defense for Research and Engineering. Our informative webinar series highlights current and emerging research and technology developments. It presents an opportunity for accelerating the DOD's leverage of these advancements by increasing awareness and fostering technical collaboration. CSIAC serves as one of the premier information research partners and curators of technology advancements and trends for the cybersecurity and information systems community. As such, our organization supports those working in the cybersecurity and information systems domain of DOD research and engineering. We do so by helping navigate the vast landscape of scientific and technical information, allowing our customers to get a head start on their technical projects with an understanding of the cybersecurity and information systems DOD research and engineering landscape. We provide research and analysis services, and we help unlock access to information, knowledge, and best practices from government, industry, and academia to stimulate innovation, foster collaboration, and eliminate redundancy. We hope you enjoy this webinar presentation and that it serves as a catalyst for community collaboration and improved DOD cybersecurity and information systems research.

Good day, everyone, and thank you for joining us for this webinar presentation. My name is Philip Payne, and I am the technical lead for the Cyber Security and Information Systems Information Analysis Center, or CSIAC. Before we get started today, I would like to note a couple of administrative items. First, if you are dialed in by phone and would like a copy of the slides, they were posted to the CSIAC webinar announcement; you can go to csiac.org/webinars and find today's webinar. I also put a link to that in the chat. When you click on it, at the bottom of the announcement it will say "to view the webinar slides, click here." Second, all participants are muted, but feel free to chat using the attendee chat window on the lower right-hand side of the webinar screen. You can use that to chat with each other, and I'll be monitoring that chat as well. However, if you would like to pose a question for the Q&A session at the end, please click the icon with three dots, labeled "More/Panel Options," to bring up the Q&A window as part of your layout. At the end of the presentation I will go over the Q&A, and for the benefit of those on the phone I'll read the questions out loud to the presenters. If you have a technical issue during the presentation, have no fear: the full presentation will be available online. Please check back to the CSIAC website once the webinar is posted; the GoToWebinar button will take you to the YouTube link.

With that said, I will introduce today's presenters. First we have Ms. Sarah Standard. Sarah Standard is a 1988 U.S. Naval Academy graduate and a retired Navy captain, retiring in 2013.
In 2014 she began working for ABIA LLC, where she developed and instructed a NAVAIR-specific cyber warfare course for the NAVAIR acquisition workforce, teaching over 3,000 in the first offering of the course. In 2016 she transitioned to serve as the cybersecurity/interoperability technical director for what is now the Executive Director for Developmental Test, Evaluation, and Assessments in the Office of the Under Secretary of Defense for Research and Engineering. We also have Nilo Thomas. Nilo graduated from New Mexico State University in 2013 with a BS in aerospace engineering. He worked for the Air Force's 47th Cyberspace Test Squadron for eight years, where he was a test manager leading the largest test programs in the squadron, Unified Platform and Joint Cyber Command and Control, the DOD's premier cyberspace weapon systems. In 2021 he started working for DOT&E, where he serves as the organization's software and cyber advisor, managing the diverse portfolio of DOT&E software and cyber initiatives on behalf of operational test and evaluation. With that said, we'll get started with today's webinar. Thank you.

Good morning, or good afternoon; I don't even know where all of you are, so it's morning for some of you and maybe afternoon for others, and both apply. I'm Sarah Standard, and Nilo and I are going to tag-team throughout this presentation. It will go fast; we have a very technically dense presentation, but we will not cover it in the depth that is on the slides. We have two primary topics to cover today: a quick overview of DOD test and evaluation policy and guidance updates, and then a deep dive into cyber T&E policy and guidance. Please let us know if you have questions, and we will take turns monitoring the chat to see if we can answer any questions on the fly.

All right, so this is the policy update. DOT&E and DTE&A published an enterprise T&E guidebook in 2022 to support our test and evaluation documentation policy, DoDI 5000.89. DoDI 5000.89 is our test and evaluation documentation policy; it sets the policy-level expectations for what is expected from DT and OT. We are currently revising this document, and we are revising it to meet the needs of various other documents, other manuals, that we are writing in the department. DOT&E has written various memorandums over its rich and storied history, and we're trying to consolidate those memorandums into several smaller DoD manuals (DoDMs). We are working jointly with DTE&A to write these manuals. The one that we're going to focus on today is the DoDM 5000.X on cyber T&E, but you can see that there are other ones here, like the TEMP DoDM, the software T&E DoDM, testing in EMSO environments, and testing and modeling and simulation. These DoDMs will also have supporting guidance, companion guides, over here on the far right, that will explain how to do the testing that the manuals require. So, if you're familiar with the DOD Cybersecurity T&E Guidebook version 2.1 that came out in 2020:
we are updating that guidebook to flow with the cyber DoDM and to partner with it to explain the "how" of what is required in that cyber DoDM. So with that, that's where we're going with DOD T&E policy at a high level. Again, there's DoDI 5000.89, the large overarching T&E policy. Then there are DoDMs underneath that, subset into specific domains of T&E that highlight unique challenges within the department that warranted additional policy. Then there's an enterprise T&E guidebook that focuses on the acquisition pathways themselves, and finally there are companion guides that cover the details of the DoDMs. And lastly, all of that is co-authored. All right, go ahead, Sarah.

All right, thank you, Nilo. In the DoD manual, the approach being co-developed between our two organizations is to supplement what's in DoDI 5000.89, which is also under revision, so that the new DoDI 5000.89 will be very different from what is published today. You can see several themes here that we're focusing on: trying to make sure that our systems are being delivered to be resilient, which is essentially what it comes down to, and using T&E to verify and validate that aspect. The manual will support the Adaptive Acquisition Framework. Then, on the companion guide (I'm on slide four for those of you who are following along at home), the companion guide is the how-to for cyber, and it does replace, as Nilo indicated, the current DOD Cybersecurity T&E Guidebook version 2.1. You will notice we have a change in terminology: we're dropping "cybersecurity" and moving to "cyber" to be more inclusive of all aspects of the threat space and all concerns that we have in terms of defending against that threat. There is an increased emphasis on iterative testing, continuous testing in particular if you're using automation, and being agile, being able to focus on the recover capabilities, the resilience capabilities, and your cyber survivability capabilities. For those of you that joined a couple of months ago, you heard the Cyber Survivability Endorsement presentation by Steve Pitcher, and then last month you would have heard the system security engineering cyber guidance presentation from Ms. Crowe's office; that was Katie. So we are piggybacking on top of those two prior presentations, and hopefully we'll be able to link those in for you today. A big part of what we want to focus on is the contractor's role and being able to have better integration of contractor and government testing, and then always considering the operational mission, regardless of whether you're doing operational testing or developmental testing. Shifting left, starting early, and iterating is the theme. All right, Nilo.

Okay, Sarah, you're going to have to click for me, but this is a visual representation of our key concepts for cyber T&E. These concepts enable iterative cyber T&E, and they're not tied to any specific acquisition pathway, but they are tied to the decisions within those acquisition pathways. So don't think of this as a timeline; think of it more as a concept mapping, if you will. The first click here is that the program needs to establish a cyber working group as early as possible, which inherits roles and responsibilities and key organizations from the T&E working-level integrated product team as it pertains to developing and updating the cyber T&E strategy.
You can see the expectations for maintaining and updating that cyber T&E strategy here on the left. The working group identifies representatives from the lead developmental test organization and the operational test agency, and they're each responsible for ensuring that testing satisfies DT and OT objectives. Go ahead and click, Sarah. Okay, the cyber T&E strategy needs to be integrated into the overarching strategy here, and the way to do that is by using what we call the integrated decision support key (IDSK), that green box underneath our TEMP and T&E strategy. These green arrows flowing off of it are like an extension of the IDSK; they explain how the IDSK maps to and supports a program decision. These stars, for example on the major capability acquisition pathway, could be Milestones A, B, and C, IOC, and FOC, and the IDSK maps to RMF programmatic decisions like the ATO process, and it also maps down to different tiers of testing, if you will: subcomponent testing, component testing, subsystem, system, and all the way up to system-of-systems-level testing. The idea with the IDSK is that you're mapping the data that you need to the decisions that you need to support, and your T&E is bringing that data in time for decision making at any step of the process.

Okay, so when we develop that strategy, we need to prioritize and scope our testing throughout the continuum of the life cycle, and the way the cyber working group does that is through these three activities down here in the bottom gray bar: we identify the attack surface, we identify the threats that are going to be representative against our system, and we conduct mission-based cyber risk assessments. So go ahead. Again, testing needs to be performed iteratively on these components. What the gray bar down there is doing is finding where the most critical components for testing are and scoping our testing, because we cannot test everywhere and everything; we have to focus our testing on the critical components. Here you can see we've got five different types of testing that we will dive deeper into throughout the slide deck: cyber OT&E, cyber live fire, cyber DT&E, contractor T&E, and integrated contractor-government T&E. Again, these test events, depending on your program, will be captured at different points in the program. And then lastly there are security verification tests, which we will also cover briefly. Go ahead, Sarah. All of this information needs to flow into databases held at the appropriate classification level, and the idea is that this data is accessible to all stakeholders. Go ahead, Sarah. And then the last thing is that all of this information, if it's in an accessible place, allows organizations like the program offices, the LDTO, the OTA, and the oversight organizations like USD(R&E) and DOT&E to write reports at any moment in time, to also support decision making. Go ahead, Sarah.

So you're probably familiar with the six phases that were in the previous guidebook. We have a mapping here of how those six phases map to this diagram. Go ahead, Sarah, click on the first one. Understanding the cybersecurity requirements was phase one in the old six-phase process, and basically you're always doing that in our process; you really should have been doing that always in the other process as well.
But the term "phase" might have implied that there was a timing aspect to it, that you do it early and then don't really do it anymore. So here we're trying to make sure that you're constantly going back to confirm that you understand the cyber requirements, and you're probably doing that, if you click, Sarah, in time for phase two: you're doing it in time to characterize your attack surface. So every time you need to re-characterize your attack surface, and you can see we have that occurring multiple times throughout this process, you would also want to prepare by understanding whether the cyber requirements have changed from then to now. Go ahead, Sarah. Phases three and four were our DT events, cooperative vulnerability identification and adversarial cybersecurity DT&E. The terms "cooperative" and "adversarial" have been pulled out of the DoDM, so we will mention them a few times to help cross-communicate, but overarchingly those terms are gone from our guidance. The department can still use that terminology; it's just not in our policy anymore. Either way, the test events that DT was expected to conduct are still here; they're just considered government cyber DT&E, and also contractor testing. Go ahead, Sarah. And then five and six were the cooperative vulnerability and penetration assessment and the adversarial assessment. These were more operational test events. You can see they're still captured down here as test events, as the arrows, and then the assessment parts are captured as reports, because the adversarial assessment and the cooperative vulnerability and penetration assessment should be grabbing data from many test events to roll up into an assessment of cyber mission effects on the system. Go ahead, Sarah. I guess this is you, right, Sarah? Sorry, I thought I was on mute; I had to play the back-and-forth game there.

Okay, so now we're going to dive into some of the technical details. We're looking now at that integrated decision support key. What do we mean by that? The integrated decision support key supports all of the decisions. What we mean by "complementary and independent" is that the testing is not part of the risk management framework, but the risk management framework and tests should inform each other. There was a question in chat about this: that's what we mean by "independent" of each other. The DOD CIO owns the RMF process; OUSD(R&E) and DOT&E own the T&E processes. So there are different policy owners, but the processes should be complementary, very much so; you should be planning those things to happen together. All right, so the cyber working group, which leads the program's cyber T&E strategy development, is responsible for ensuring they provide input into that IDSK. The cyber T&E strategy must be integrated with the program's T&E strategy; you have to plan all those different types of tests that Nilo was highlighting, and those accessible databases, those repositories of data, should enable the reuse of that data by any decision maker who needs it. The IDSK is a program-managed document that allows the program to schedule their testing to inform the decisions of interest. With each pathway those decisions can vary, but the IDSK helps to determine what tests I need to do to generate the data that will help answer the questions that inform the decisions.
So it's a building block approach, if you will. You have to have this data in order to have this decision making, so you want to focus on where you are going to get that data, who's going to be doing it, and what you need in place, and that helps you write your test and evaluation strategy as well as build this overarching IDSK. On the left side here is the building of the IDSK, doing the planning, writing your T&E strategy; on the right side is the execution piece, where the data actually gets generated. What I want you to focus on is the center section here. The program manager has decisions that have to be made: should I move forward, should I accept what the contractor has delivered, should I move into operational test? Those can be examples of decisions. In order to make those decisions, they have to define what they have to evaluate; in order to know what you want to evaluate, you have to define the test events, the experiments, and the modeling and simulation events that you need to generate the data; and then those test events, experiments, and M&S events define the resources that are required by the program, which then drives scheduling to inform the decisions. When you execute your schedule, you'll be doing these test events, which will generate that data, which will inform those decisions. So what you're documenting in your T&E strategy is the plan; what you're doing when you execute is collecting the data to inform the decisions. Hopefully that's clear. We want both operational and technical information captured in the IDSK, with operational capabilities and technical capabilities both able to be evaluated.

Here's a notional idea of what we mean by an IDSK. This is in no way prescriptive; you can do it any way you want. This is just trying to help people understand what we mean in the cyber area, the cyber attributes within an IDSK. Again, there are operational capabilities to evaluate on the operational side and technical capabilities to evaluate on the technical side. So you might have something focused on operational mission effects, or on the system data security risk management pieces, confidentiality, integrity, and availability; there's your RMF integrated with test. Then on the cyber survivability side there are mission-critical capabilities that have to be evaluated with real operators and defenders involved, but in the system focus you'll have prevent, mitigate, recover, and adapt, which you should be testing in developmental testing. So you can see there's potential for overlap. What will happen is the IDSK will help illuminate opportunities to do integrated testing: when you have contractor testing and government testing wanting to gather similar data, you can do an integrated contractor-government DT type of event, and if you have opportunities during government developmental tests where operational testers can also use the same data to inform their evaluation areas, then again you have another opportunity for integrated government DT/OT. That's the intent behind the IDSK.
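To make that building-block idea concrete, here is a minimal sketch in Python of the mapping the IDSK describes, from decisions to evaluation questions to test events, data, and resources. The decision and test-event names, tiers, and data elements below are hypothetical placeholders; a real IDSK is a program-managed document, not code, and this is only an illustration of the relationships.

```python
from dataclasses import dataclass

@dataclass
class TestEvent:
    name: str             # e.g., an integrated contractor-government DT event (made-up name)
    tier: str             # "component", "subsystem", "system", or "system of systems"
    data_produced: list   # data elements this event generates
    resources: list       # ranges, tools, threat emulation teams, etc.

@dataclass
class Decision:
    name: str                   # e.g., "Milestone B", "ATO", "IOC" (illustrative)
    evaluation_questions: list  # what must be evaluated to support the decision
    data_needed: list           # data elements required to answer those questions

def events_supporting(decision: Decision, events: list) -> list:
    """Return the test events whose produced data feeds this decision."""
    return [e for e in events if set(e.data_produced) & set(decision.data_needed)]

# Hypothetical example entries, for illustration only
events = [
    TestEvent("Contractor software scan #3", "component",
              ["known-vulnerability findings"], ["contractor lab"]),
    TestEvent("Integrated government cyber DT", "subsystem",
              ["recover-time measurements", "exploitation results"],
              ["cyber range", "red team"]),
]
milestone_b = Decision("Milestone B",
                       ["Is the design on track to prevent/mitigate/recover/adapt?"],
                       ["recover-time measurements", "exploitation results"])
print([e.name for e in events_supporting(milestone_b, events)])
```

Scheduling then falls out of this mapping: the events a decision depends on have to execute, and their data has to be in the repository, before that decision point.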
There are three major types of integrated testing that we think exist and that you can consider. I already mentioned contractor-government integrated testing, and I already mentioned integrated government DT/OT. What I did not mention was the integrated functional cyber test: when you are doing testing for your key performance parameters, your measures of effectiveness and measures of performance, you have an opportunity to add cyber into those test events, if you plan it right, obviously. That's the idea behind integrated functional cyber testing. It is in the current cyber T&E guidebook, so we do talk about it briefly, but we're trying to elevate it a little bit more as well. The draft IDSK is available for use if you want it; that's just the cyber portion, though, so the full IDSK is bigger than that.

All right, so let's focus on the cyber working group activities for scoping the cyber T&E. This is still me, right? Yes. I'm going to focus on the cyber requirements, which again go back to the Joint Staff cyber survivability attributes, the threat characterization piece, the attack surface characterization, mission-based cyber risk assessments, and using cyber T&E results. Three of those are in that gray bar in the diagram; the other two are built into the policy and the guidance. It's important for the cyber working group to understand the requirements that are in place for that system, whether they be cyber survivability, cybersecurity, or whatever they want to call them; it doesn't really matter. The cyber working group, unless it's a merged working group with the system security engineers and the cyber testers all in one big happy group, which is perfectly fine, you can do it that way, should be supporting the system security engineers to (a) define the performance specification and an actual design that accounts for mission risk and (b) implement testable, measurable prevent, mitigate, recover, and adapt capabilities. We advocate following the Joint Staff Cyber Survivability Endorsement even if you're not a joint system, and it's really important that the engineers develop metrics. Having the testers at the table with the system security engineers can help emphasize that metric piece, the measurement piece, and how we are really going to test this. It also helps because the testers come with a hacker kind of perspective; they come representing the threat, if you will. It's like having the threat at your table saying, "don't do that, because I'm going to do this." So it's helpful to have this integration, and in our DOD manual we say you should repeat this activity for each acquisition and IDSK decision.

These are the ten cyber survivability attributes, and when it comes to the resilience piece, I believe CSAs 7, 8, 9, and 10 help you get to resilience. The prevent piece is very important, but you also want to focus on that resilience. As written, it often isn't directly obvious how the CSAs are measurable, but if you start to think about how long it is taking me to manage my system performance if it's degraded by a cyber event, then you can get a measurable requirement out of that. How long does it take me to recover my system capabilities? How long does it take me to detect anomalies? How long does it take me to harden my cyber attack surface on a recurring basis? How long does it take me to patch my system? Those kinds of things can get you to measurable requirements on the engineering side.
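As a simple illustration of turning those questions into measurements, here is a minimal sketch that computes time-to-detect and time-to-recover from test-event timestamps. The timestamps and the 60-minute threshold are hypothetical placeholders, not DOD values; real requirements would come from the program's decomposed CSAs.

```python
from datetime import datetime

# Hypothetical timeline from a single cyber DT trial (timestamps are illustrative only)
trial = {
    "attack_start":         datetime(2023, 5, 1, 9, 0),
    "anomaly_detected":     datetime(2023, 5, 1, 9, 12),
    "capability_recovered": datetime(2023, 5, 1, 10, 3),
}

def minutes_between(t0, t1):
    return (t1 - t0).total_seconds() / 60.0

time_to_detect = minutes_between(trial["attack_start"], trial["anomaly_detected"])
time_to_recover = minutes_between(trial["anomaly_detected"], trial["capability_recovered"])

# A derived requirement might be phrased as a threshold, e.g. "recover within 60 minutes";
# the 60-minute figure here is a placeholder for illustration.
print(f"time to detect: {time_to_detect:.0f} min, "
      f"time to recover: {time_to_recover:.0f} min, "
      f"meets 60-min recovery threshold: {time_to_recover <= 60}")
```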
Now I'm pivoting over to Katie Watmore's presentation. The CSAs are very high-level requirements, and engineers have to have lower-level measurable requirements in order to be able to say yes, we're on track to achieving that cyber survivability attribute. So the performance specification has to articulate those cyber survivability attributes as actual requirements, and then the contractor has to take that performance specification, do the decomposition into lower levels of the system, and define the actual ways they're going to measure and demonstrate, as they're building the system, that they're on track to achieving that system survivability key performance parameter with a cyber survivability endorsement. It's often not done; I see it not done all the time. I think it's an important aspect that we need to get better at doing, and we need to get better at asking our contractors to help us with it.

When it comes to getting that mission and threat context, we use mission-based cyber risk assessments, and I'll get to those shortly. But in terms of reusing testing results: when you do these iterative activities, the threat assessment, the attack surface characterization, and the mission-based cyber risk assessments, and you take those testing results into account, then you can really get to an affordable way to do the cyber testing that we're advocating, so that you aren't trying to test everything in the world, because you can't; you scope to the critical cyber terrain that you care about. So you need to focus on what the threat is capable of and what we think the threat might be capable of. I know that's impossible to know fully, but you have to use the intelligence products that you get, and then take a look at where the critical components are inside your system, where the critical functionality is, how an attacker might gain access and exploit anything in there, and what they might be trying to do. Mission-based cyber risk assessments can help you do that kind of analysis, taking the current threat information and your attack surface, knowing what you interface with and all the parts and pieces within your system, and studying them from a mission-based perspective.

On the threat characterization piece, we've asked for some help from the intel community, and they built an explanation of how the intelligence community works to put into our companion guide, so this picture will be roughly described there. It's not the cyber working group's job to get the threat intel; it's to make sure that they have the threat intel, and understanding how threat intel is generated is important for the cyber working group to know where they can expect to see results coming from. Again, this has to be repeated for each acquisition and IDSK decision. If you look at that attack surface, this is just a fun picture to look at; there's a lot out there that our systems might be dependent upon, including critical infrastructure, the training devices, the maintenance devices and their supply chain, and the repair facilities. There are many different ways that an adversary might be able to get to the system or get information out about your system, and our defense industrial base is a big partner in this attack surface space, so you have to consider them as well. We have in the DOD manual a table that describes some evolving attack surface elements that we wanted to highlight and draw particular attention to; we will cover how to look at some of these in the companion guide, while the DOD manual just highlights them with some considerations for what you should be thinking about in terms of testing with respect to these different attack surface elements.
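As a small illustration of capturing and prioritizing attack surface elements like the ones just named (critical infrastructure, training and maintenance devices, supply chain, development and test environments), here is a toy inventory sketch. The element names, access paths, and criticality flags are made up for illustration and are not drawn from any program or from the DOD manual's table.

```python
# Toy attack-surface inventory; all entries are invented for illustration only.
attack_surface = [
    {"element": "maintenance laptop",        "category": "support equipment",
     "access_paths": ["USB", "Wi-Fi"],             "mission_critical": True},
    {"element": "contractor build pipeline", "category": "development/test environment",
     "access_paths": ["internet", "supply chain"], "mission_critical": True},
    {"element": "training simulator",        "category": "training device",
     "access_paths": ["removable media"],          "mission_critical": False},
]

# Scope testing toward mission-critical elements with the most access paths.
prioritized = sorted(
    (e for e in attack_surface if e["mission_critical"]),
    key=lambda e: len(e["access_paths"]),
    reverse=True,
)
for e in prioritized:
    print(e["element"], "->", ", ".join(e["access_paths"]))
```

An inventory like this is one way a cyber working group could keep the attack surface characterization current for each acquisition and IDSK decision.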
This is a modification of the famous "wheel of doom," or wheel of death, or wheel of access, that the Air Force originally created. This is the version that I published in the cyber table top guide for DOD. You'll notice I've added critical infrastructure, because the system is dependent upon critical infrastructure, and I've also added the development and test environments, processes, and tools, because everything in here has a development and test environment, process, and tool, even if it's a COTS product. So you have to think about all of that when you are thinking about where the key cyber terrain is for your system. The attack surface characterization can be separate from the MBCRA; it should precede the MBCRA, or it can be standalone, or it can be part of your MBCRA, your mission-based cyber risk assessment, and you should be working with the program protection team in these cases. All the entry points, exit points, and so on: you look at those in your attack surface characterization. You get the mission decomposition from the program protection team; now, if they haven't done it, that is an indicator that you need to do it. And this, again, is for each acquisition and IDSK decision: you should have current data on your attack surface. I think that's it on this slide.

So where do mission-based cyber risk assessments fit in the systems engineering process model? I think I grabbed this picture from DAU. You see at the top here what comes in is the requirements, the cyber survivability attributes one through ten, and there is a process by which a requirements analysis is performed. That requirements analysis process generates the performance specification, which goes into functional analysis and allocation, and there are feedback loops everywhere, as you can see. The technical performance measures become part of that systems analysis and control effort, which has to be done as a management process. Ultimately you're going to get some sort of design out of this, and that design has to go through a verification process; that's where T&E comes in. In the early stages, your mission-based cyber risk assessment should be informing your requirements, your engineering, and the testing that you need to do; that's where it fits. MBCRAs are used to inform concept selection, they can be used to inform your design before you lock it down, and they can be used to track your system's progress. When the government gets a delivery from the contractor, you should do an MBCRA there, and when the system is in operations you should be looking at what my system is still resilient to in my current threat environment. The minimum inputs are the latest system details, and you can see all of those here: a current threat characterization and a listing and analysis of existing known vulnerabilities, so if you have a bill of materials, great, because that will be very useful here. What you produce from the mission-based cyber risk assessment are estimates of mission impact, using input from your operational users, your defenders, your maintainers, engineers, and developers, and you get an attack surface characterization and an attack path analysis out of it. The reports will have scenarios of those attack paths, the threat vignettes, which can and should inform testing.
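To show how those inputs and products hang together, here is a minimal sketch of an MBCRA record that turns high-likelihood attack paths into threat vignettes for test planning. The field names, likelihood scale, and example values are assumptions made for illustration; they are not a prescribed MBCRA format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AttackPath:
    entry_point: str
    steps: List[str]
    mission_effect: str   # estimated effect on the mission if the path succeeds
    likelihood: str       # e.g., "low" / "medium" / "high"; the scale is program-defined

@dataclass
class MBCRA:
    # Minimum inputs named in the briefing
    system_details: str
    threat_characterization: str
    known_vulnerabilities: List[str]
    # Products named in the briefing
    attack_surface: List[str] = field(default_factory=list)
    attack_paths: List[AttackPath] = field(default_factory=list)

    def threat_vignettes(self) -> List[str]:
        """Turn high-likelihood attack paths into scenarios that can inform test planning."""
        return [f"Via {p.entry_point}: {' -> '.join(p.steps)} (effect: {p.mission_effect})"
                for p in self.attack_paths if p.likelihood == "high"]

# Hypothetical usage with placeholder values
assessment = MBCRA("v2.3 avionics build", "current cyber threat assessment",
                   ["CVE-XXXX-YYYY placeholder"],
                   attack_paths=[AttackPath("maintenance port",
                                            ["gain access", "pivot to mission computer"],
                                            "loss of navigation data", "high")])
print(assessment.threat_vignettes())
```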
Our companion guide will have much more detail than this. On using test results in particular: it's important, as part of this engineering process in that verification step, that your contractor is doing T&E as they go through this engineering loop, with respect to improving the system design in order to meet the requirements. It's important that cyber testing informs the engineering, the remediation that should happen, the prioritization of what to remediate and what to mitigate, and any maintenance that has to be performed. How are we going to know that our system is being attacked? We need some sort of maintenance training that says, when you see this, do that, or when you update the software, reset these configurations. These cyber tests should inform sustainment and defender processes. So there's a lot to think about, and now I think I'm turning it over to Nilo.

Thanks, Sarah. All right, what we're going to do now is talk about cyber testing, the planning and execution of specific cyber test events. Go ahead, Sarah. This table is a summary of the table that's in our cyber DoDM. You can see there are six topics here. The first three are pretty common to all test plans, but they do have a cyber spin to them: system, for example, talks about the architecture of the system; test environment talks about the conduct, conditions, assumptions, limitations, and constraints, and also describes the cyber environment of the system; time and resources is pretty self-explanatory. The next three are related more specifically to cyber testing. We need to capture cyber test activities, for example; this table here shows one row, but it is a pretty big row in our actual document. It includes reconnaissance activities, penetration activities, activities to verify previously found vulnerabilities, and how the test organizations will emulate the threats. Jumping up a row, we also want to talk about vulnerability tracking and retesting: how they are going to track the vulnerabilities, how they are going to use results from previous tests to identify vulnerabilities, and how they are going to measure the severity of those vulnerabilities. And then on the last row here we want to talk about defender activities: we need to capture how, during or after an attack, the testers will collect observations from the cyber defenders. These cyber defenders could be warfighters, the people equipped with the systems, operators, or cyber defenders such as CSSPs. We just want to make sure that we are tracking them across the process of detection, prevention, mitigation, response, and recovery. A lot of these plans will only address part of this; for example, DT test events may only conduct a penetration test in earlier lab environments, whereas OT will conduct a test on perhaps the entire system, or a large subset of it, in a realistic environment with trained operators and cyber defenders. So in the DoDM there's a lot of text in these sections, but it is not expected that every single test plan will cover every single item in that table. Go ahead, Sarah.

Like I was stating before, our figure captures five rows that we'll dive a little bit deeper into. I will start with the small, faded orange row called security verification testing. These are automated, continuous, and manual tests that tend to be the RMF compliance verification and security control assessments; they could be static application security testing or dynamic application security testing, and they could be software composition analysis. These are basically tests that can occur earlier in software code development to ensure that the software is free of, or at a lower risk from, some of the easier-to-catch vulnerabilities. Hopefully this information flows up to the DT and OT teams so that we can test vulnerabilities that may impact the mission further if they're not caught during development. I will note that these security verification tests aren't usually conducted by the DT or OT test teams; they're usually conducted by the contractor.
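As a toy illustration of the software composition analysis idea just mentioned, here is a minimal sketch that checks declared dependencies against an advisory list. The package names, versions, and advisory text are invented; a real program would rely on established SCA tooling rather than anything like this.

```python
# Toy software composition analysis: flag declared dependencies whose versions
# appear in an advisory list. Every name, version, and advisory below is made up.
declared_dependencies = {"examplelib": "1.2.0", "otherlib": "3.4.1"}

advisories = {
    ("examplelib", "1.2.0"): "EXAMPLE-2023-0001: remote code execution (fixed in 1.2.1)",
}

findings = [
    (name, version, advisories[(name, version)])
    for name, version in declared_dependencies.items()
    if (name, version) in advisories
]

for name, version, advisory in findings:
    print(f"{name} {version}: {advisory}")

# Findings like these would flow up to the DT and OT teams so the mission impact of
# anything unresolved can be examined in later government testing.
```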
The other thing I want to note is the other four rows that we'll dive a little bit deeper into: contractor testing, cyber DT&E, cyber LFT&E, and cyber OT&E. A goal of all of the testing here is to capture data and measurements for the resilience and survivability of the system. Our tests, at any point, are trying to determine access to the vulnerabilities, system exposures, and points of penetration; we're trying to figure out where we're vulnerable and how we could be exploited, and then we're also trying to figure out, once we are exploited, what happens to the mission. These tests are going to occur from the component level and up, so if there's a critical component that the IDSK identified early on, there could be a contractor test conducted early on to verify that that component is secure. The last thing I want to note is that we want to iterate this process: previous test events and previous decisions will inform the need for future testing. If there are previous test results that are perhaps negative, we might need to conduct additional testing, or a new threat profile might come out; there are various reasons why we would iterate through the process. Go ahead, Sarah.

Okay, I think this is me. I'm going to cover both contractor and government cyber DT. It is much more than vulnerability scans. This is a sore point for me, because over my seven years I've struggled to get programs to understand that the threat is relevant and you need to look at it earlier, rather than waiting until operational test, where it's going to be potentially impossible to address the problems that you identify. The whole point of cyber developmental test is to find problems, fix them if they're bad, and then test and verify that you did fix what was not good. So: identify and mitigate risks, identify engineering and technical issues, and measure the specified requirements for the system's prevent, mitigate, recover, and adapt capabilities. That's a big part of what I'm really trying to focus programs on: what are you measuring in test? Don't just count vulnerabilities; actually tell me what you're going to measure. How well do you prevent, mitigate, recover, or adapt? Then, finally, you need to verify that the products you're getting from the contractor are compliant with the contractual and technical requirements, including the STIGs and any exposures within the known National Vulnerability Database. You're going to look at all of that, and you're going to do it iteratively using contractor cyber testing. Your contract needs to require that the contractors are measuring and reporting on these requirements early enough so that the program can respond and recover, mitigate, remediate, and prioritize what needs to be fixed.
Then, when the product is delivered to the government, at whatever point you're getting the first delivery from a contractor and the follow-on deliveries, there should be a consideration of: have you delivered me a system that today meets the requirements I needed it to meet? Are there any exploitable vulnerabilities in my system, or any inability to meet the requirements as specified, right when I first get it? If so, the contract has to support some sort of get-well plan, if you will. If, on the other hand, they do meet that, that's an opportunity to award the contractor an incentive. So if they are able to deliver a system in which the government, when they do their acceptance testing, finds no mission-impacting vulnerabilities and no requirements that aren't being met, then they should get a bonus, some sort of incentive award. And then finally, government cyber developmental test will focus on the components, the subsystems, the prototypes, and all the developmental systems as they mature and eventually move into operational test, again trying to find and fix. So the contract has to support that remediation and mitigation, and the system security engineers are a big part of looking at these results and understanding them with the testers. We're trying to focus multiple test events on all of the critical components as you build up the system. All right, over to you.

Okay, I was answering a chat question. There's a question about, for software-intensive systems, how do you see cyber T&E integrating with software T&E, and we'll go ahead and address that. We are going to write an appendix in the cyber T&E companion guide that will integrate software factory processes better into cyber T&E. This is going to be a new section, so it'll be matured further as we actually execute these processes a little bit better, but for now we are going to write some guidance based off of research that our teams have done and a few pilots, which will carry this forward into the software T&E realm and hopefully integrate those two worlds a little bit better.

Okay, so back to the slide: cyber operational test and evaluation. I'll talk about this now. Go ahead. You can see there's a lot of text here. What I like to talk about with operational test and evaluation is what I call the three sufficiencies: OT&E has to be conducted, one, on a production-representative system; two, with operationally representative users; and three, in an operationally representative environment. For the cyberspace domain, this means that our cyber defenders need to be a part of this test, in addition to those three things. So we need to have a system that's ready to field, we need to have people that are trained to use that system, and we need to put that system in a place that is representative; and lastly, we need the cyber defenders as part of that testing. It's crucial for us to know how those systems respond and recover from cyber attacks, and many tests that we currently see in operational testing do not cover this. For example, we need to assess our COOP plans if we can, and determine the length of time it takes for the system to return to a nominal state.
You can see in the second paragraph here that we're trying to integrate cyber OT&E more with the larger T&E efforts, so we note that cyber OT&E is part of larger test events like the operational assessment, the initial operational test and evaluation, or follow-on test and evaluation. And then lastly, like I said, we are trying to make sure that the cyber testing for operational test meets those three sufficiencies: again, a production-representative system, operationally representative users, and an operationally representative environment. Go ahead, Sarah. A key goal of cyber OT&E, really the end goal of it, is to determine mission impact from cyber effects. Our red teams will exploit the system and determine how much damage they can do, and our cyber defenders are going to determine how they can recover from that damage. This is very crucial for our testing. Like I said on the last slide, the OT&E team assesses all of that as part of our overarching survivability, suitability, and effectiveness assessment, but they lean on everything that has come before them, all the security verification testing and all the DT testing, so it's also crucial that that occurs. Because Sarah and I have seen many times that these DT tests aren't being conducted, we're hoping that with the guidance, by developing the cyber working group, analyzing the requirements, and going through the MBCRA, you can walk through the process and scope testing throughout, so that this testing here at the end, which OT&E typically conducts, is not the first place where we see some of these problems. Go ahead, Sarah.

Again, I keep hammering this home, but we want to make sure that we are capturing what the users and defenders will do in OT&E. In DT&E there's more of a focus on how the system will prevent, mitigate, and recover from actions, but in operational test we have to bring the person in as part of the suitability assessment. There's a part of it where it doesn't matter if the system can defend perfectly if the operator has no idea what to do when he gets an alert, or if he gets an alert and misattributes it, or something like that. The human interfacing with the system really matters, and that's where training is critical, making sure the right people are there in the test; all of that is very important to do. And then lastly, going back real quick to the threat: we want to make sure that we're emulating a real threat. Our red teams are trained to emulate the real threats, and they do that to show people, at the end of the day: this is what a real threat could do to our system, this is how your cyber defenders reacted, there could potentially be some problems here, we are not ready, and we need to patch this up. That's really the end goal of cyber OT&E. Okay, go ahead.

All right, so now I'm going to talk to this newer concept, which is cyber live fire, and I will state that this is new and we're still working on it internally in DOT&E, but go ahead. Our current law notes that we need to emulate realistic survivability testing, and that law, from DOT&E's perspective, includes both kinetic and non-kinetic effects. The idea is that most programs that are already conducting live fire test and evaluation will have a live fire T&E working group that is also subordinate to the T&E WIPT, and the idea here is that the cyber working group and the live fire working group must coordinate to conduct what is called the mission-based risk assessment.
The mission-based cyber risk assessment will feed into the mission-based risk assessment, and this overarching process here is about making sure that cyber live fire can meet some of the objectives of the overarching live fire test team in the non-kinetic space. The idea there is that, like I said, LFT&E will further scope that testing, especially when cyber tests can affect the physical domain. We're trying to tie these domains closer together, to bring it all together into one large survivability assessment, if you will, where we talk about all domains of survivability. Additionally, another thing these cyber live fire tests could be is testing cyber effects on a full-up, production-representative system, similar to what is called the full-up system-level test in live fire; we test against that and see how the actual entire system handles it, and it may roll up to include destructive testing. For cyber T&E it might be the ultimate capstone test, and it is similar to what already exists with regard to the adversarial assessment, so there is some terminology that is similar here. Like I said, we are still working with R&E and internally to make sure that we have this concept worked out, but it will be further fleshed out in our companion guide. We are developing several pilots now; for example, we're trying to pilot that mission-based risk assessment (MBRA) process, and we're also piloting a full-up system-level (FUSL) cyber test event within the joint live fire program. Okay, next slide.

Okay, so the last thing here is reporting. These are the common concepts that need to be reported out across the cyber T&E program; note that I didn't say documents. Again, a given test report might not have all of this information in it; various test reports could cover various test points. For example, there could be a test or an assessment that's done just on the supply chain, and it only talks about the supply chain; others could focus on electromagnetic spectrum testing. In our older language, the CVPAs and CVIs focused on the rows in the center, vulnerability identification and exposure identification, while the AAs and ACDs focused on operational mission effects and the prevent, mitigate, and recover steps. So there's still an expectation that there are separate types of test events occurring that will report out separate pieces of this process, but throughout this process of cyber T&E you are capturing this information for your program so that we can identify, at the end of the day, where we are vulnerable and, when we are vulnerable, what that does to our mission.

Nilo, we have a question: when are these updated documents to be formally released? We are shooting for this year. Right now we are informally working through comments with the services on the manual, and the companion guide is in parallel, trying to keep up, I guess is what I would say. And then the other question was: are you including in your test manual the development of metrics to collect false positives and false negatives when you're assessing alerts and warnings to an operator? That might be a companion guide topic, not necessarily at the manual level, because that's a little bit deeper and more technical than the manual is intended to go.
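To illustrate the kind of companion-guide-level metric that question is asking about, here is a minimal sketch scoring defender alerts against red-team ground truth. The observation records and values are hypothetical, and the metric names are generic, not terms drawn from the policy documents discussed here.

```python
# Hypothetical scoring of defender alerts against red-team ground truth.
# Each record notes whether a real attack activity occurred in a time window and
# whether the operators received/acknowledged an alert for it. All values are made up.
observations = [
    {"attack": True,  "alerted": True},   # true positive
    {"attack": True,  "alerted": False},  # false negative (missed detection)
    {"attack": False, "alerted": True},   # false positive (spurious alert)
    {"attack": True,  "alerted": True},
]

tp = sum(1 for o in observations if o["attack"] and o["alerted"])
fn = sum(1 for o in observations if o["attack"] and not o["alerted"])
fp = sum(1 for o in observations if not o["attack"] and o["alerted"])

detection_rate = tp / (tp + fn) if (tp + fn) else 0.0     # share of real attacks that were caught
false_alarm_share = fp / (tp + fp) if (tp + fp) else 0.0  # share of alerts that were spurious
print(f"detection rate: {detection_rate:.2f}, false-alarm share: {false_alarm_share:.2f}")
```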
So again, there are two levels. The policy is the DoDM, the Department of Defense manual; that is right now in the WHS issuances portal and moving along. But we also have guidance, the cyber T&E companion guide, which will capture a lot of best practices and a lot of the text from the older cybersecurity guidebook that Sarah had written in the past. I don't want to say I wrote the whole darn thing, but you wrote it in spirit; some of it I did, right. And when are we releasing the policy on cyber live fire? At the same time. The cyber live fire policy is in DoDI 5000.89, which requires live fire consideration across the board, and the cyber portion of it is in the manual.

All right, so thank you for bearing with us. We've spent a long time discussing where we're going in cyber T&E. Like I stated earlier, we're trying to make sure that this process is iterative and recursive; it feeds back on itself. We need to start by understanding our cyber requirements; they need to be measurable, testable, meaningful, and achievable. We start with those CSAs from the Joint Staff J-6, which has the ten CSAs, and then we break those down into actual requirements. Then we need to apply our system threat analysis, identify what our threats are, and identify the attack surface of these systems. Then, as we start designing the system, we've got to decompose the mission, map it to the system, and find out where the critical components are, the components that we need to test. We take that decomposition and we actually design tests to inform the vulnerability, exposure, and exploitation picture of these systems, and then we use that information to inform remediation, mitigation, and maintenance of our systems at every level, from subcomponent all the way up to system of systems. We're hoping to do that through the mission-based cyber risk assessment process, which covers most of what I highlighted there. And then lastly, we're trying to use all the automated testing that comes before, in some of the more mature processes that you're hearing about, like software factories, where the software factory does a lot of the security scanning ahead of time; you pull that information forward so that it helps us scope our testing as well.

Lastly, I just want to hammer home that we're trying to make this more data-driven, so a lot of that data flows back into that large enterprise data repository, or maybe not enterprise, but at least a program-level data repository, so that people who work on that program can conduct their analyses, their assessments, and their reports at any moment in time. In an ideal world; we know we're not quite there yet, but that's where we want to go in our policy and in our guidance: create a place where this data sits within programs, and then use that information to help the T&E community drive decision making with relevant and timely access to test reports. And then lastly, Sarah has this little catch-all here that says: don't just count vulnerabilities; measure performance and capabilities in contested cyberspace. We're hoping that we can do that from the beginning: as the program matures its requirements, we come in and say, hey, look at these things,
put this text in your contract, and then we'll follow along for the ride as the program is developed, to make sure that we are measuring performance and capabilities across the life cycle. So with that, Sarah, do you have any last words? I do not have any last words; you did fantastic as always. But we will open the floor up for any other questions we didn't answer. If we missed your question in chat, apologies; you can ask it again, or maybe Phil can guide us on that. And I agree with Mike Lillian, though: we do want to be clear and jargon-free; that's part of the "just call it cyber test and evaluation" perspective, and of not being too overly sensitive about what's cooperative and what's adversarial. I see a question: are there additional resources for live fire cyber T&E? There are pilots going on for live fire cyber T&E, a couple of them, at NAVAIR and in the Air Force. There will be products produced on their processes from doing those, and we will make those available when we can. I hope that helps; I know that's not the answer you necessarily wanted. Phil, any other questions we've missed?

First and foremost, thank you for your time and your participation. This was actually the last of a three-part webinar series, as was mentioned previously. Today was a little unique because we had two presenters, and we had the ability to monitor the chat and answer questions as they came in, so I don't really see anything outstanding from the chat. One of the questions that I did receive privately, to one of the hosts in the chat, was: will the transcript of the chat be available? That is something that we normally do not do, but since today we did have the unique experience of a lot of interaction within the chat, I think it will be helpful for all of our members. I believe WebEx does have the functionality to export the chat; if not, then at the bare minimum we can do a copy and paste to make sure everybody has it available. And obviously those who were dialed in over the phone wouldn't be able to see the chat, so I think that'll be beneficial for everybody. Be on the lookout for that; we'll send it in a follow-up email so everybody can benefit from the answers that we did see in the chat. I know there were a lot of questions about whether or not this will be recorded. This will be recorded, and the recording will be uploaded to the CSIAC web page within a day or two; the slides are up there now as well, so please check back for that.

I do want to tackle a couple of questions going by here. Defense business systems are not exempt from cyber live fire tests, to the best of my knowledge. Nilo, about the live fire questions? Yeah, so my understanding of our live fire law is that if we consider a system to be vulnerable in the cyberspace domain, then, in quotes, live fire applies, but it is non-kinetic live fire. This is why there are still some discussions on how we're trying to determine this, but there's an argument to be made that potentially what cyber DT and OT already do is cyber live fire; it's just that the terminology is swapped around. So there's definitely some translation that we have to do between the live fire community and the cyber test and evaluation community here.
Potentially all the difference between cyber DT, cyber OT, and cyber live fire is a communication mapping back to the overarching live fire process. But to the larger question of whether cyber live fire is applicable: it is, because the system will be vulnerable in the cyberspace domain. Over. Yep, that's a great answer. And then I think we have another question here: there have been challenges in the past with sharing data due to the classification of some results; have there been discussions about streamlining processes or distilling information to help get relevant input to contractors and engineers? There are always conversations around that; thank you for that question, Derek. It is a known challenge, and I was just in a round table this morning where it came up again; it comes up every time, and we know it's a challenge. One of the things the Test Resource Management Center is working on is how to measure and share test data. Another useful thing, I think, would be for programs to get other programs' test results without knowing which program they're from; maybe that's how we can reuse information. So we have a lot of work to do, absolutely.

And how do these T&E principles get incorporated into the operational exercises DOT&E conducts? Go ahead; you know CAP. Okay, yes. Our Cybersecurity Assessment Program, CAP, supports exercise assessments for all the combatant commands already. When CAP is brought into one of those exercise assessments, they already bring a lot of the operational test expectations from the OT side here at DOT&E and pull them forward into the exercise assessments. They support the building of the blue and red cells, they bring the cyber threats, they execute the testing, and then they report back to the combatant command and back to our programs, so that information can flow backward: if a system was in that exercise, they will report back to the program office and say, hey, your system did this in this exercise. So we do kind of already do that. There's always more we could do to pull that forward, but it is definitely already something DOT&E is trying to do right now.

Okay, I guess I'm not seeing any more questions. Did I stop sharing? I think I did; I'm not sure anymore, I can't even tell, I still see the question slide. Oh, okay, so it doesn't let me stop sharing; let's stop share. There it is. Okay. All right, Phil, thank you very much for this opportunity. We welcome the opportunity to travel around to your program offices or your program executive offices, or wherever you need us; Nilo and I want to help people understand the policy and the guidance and be successful in executing it. Absolutely, thank you very much for your time and your participation. Like I said, the recording of this presentation will be up on the CSIAC website within a day or two; if you go to the announcement, you can find the slides now. I will be sending an email with, at the very least, a copy and paste of the Q&A in the chat from today, since it was so interactive, and I appreciate your time. Hopefully we see everyone at our next webinar, which I believe is scheduled for July 12th, where we have folks doing an overview of
the Risk Management Framework. But with that said, we'll sign off for today. Thank you very much.