Okay, I'm going to continue from where I left off. Last time I was listing some properties and results, and now I'm going to talk more about proofs. The underlying question is really the connection between type-II matrices and quantum automorphisms, which is interesting and which we really don't understand at all. So the situation is this: we are given a type-II matrix W of order n, and we define matrices Y_{i,j} of the following form. Just to remind you, one factor is the Schur ratio of two columns, W e_i / W e_j, and the other is the transpose of the inverse ratio, so Y_{i,j} is something like an outer product, scaled by 1/n. It is immediate from the definition that Y_{i,i} is 1/n times the all-ones matrix, that the transpose of Y_{i,j} is Y_{j,i}, and that if W is flat then Y_{i,j} is Hermitian; it is almost worth calling these out as exercises. The thing going on here is that all you can really use is the fact that W W^(-)T = nI, where W^(-) is the Schur inverse. The point is that when you multiply a matrix by a transpose, you are effectively taking dot products of rows: here you are taking the dot product of a row of W with a row of W^(-). In other words, you pick a row, you pick a second row, you replace each entry of the second by its reciprocal, and you take a dot product; the relation tells you that the result of that process is n if you used the same row twice, and zero otherwise. That relation is going to drive everything we do. The point, then, is that these matrices Y_{i,j} are idempotent. So we define script Y_W to be the n-by-n block matrix with (i,j)-block Y_{i,j}; we call it the matrix of idempotents of W. This should remind you a little of a quantum permutation.
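The defining relation is easy to check numerically; here is a small sketch (assuming NumPy, and using the cyclic-group character table, the Vandermonde example that comes up later, as the type-II matrix W):

```python
import numpy as np

n = 5
theta = np.exp(2j * np.pi / n)
# Character table of the cyclic group Z_n (a Vandermonde / DFT matrix):
# a standard example of a type-II matrix.
W = theta ** np.outer(np.arange(n), np.arange(n))

W_minus = 1 / W  # Schur (entrywise) inverse

# Type-II condition: W @ W_minus.T equals n * I.  Entry (i, j) is the dot
# product of row i of W with row j of W_minus: n if i == j, else 0.
assert np.allclose(W @ W_minus.T, n * np.eye(n))
```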
In a quantum permutation the entries are projections; these Y_{i,j} are in general not projections. There are interesting cases where they are, and we will be discussing that, but they are not required to be projections: they are idempotent, but they need not be Hermitian. The block matrix Y_W is symmetric, and there are a few further properties listed here which I'm not really going to need much, so I'll just leave them for you to look at. Now, a little fact from matrix theory: there are in some sense three ways of defining the product of two matrices. One is to say that the (i,j) entry of AB is the sum over k of A_{i,k} B_{k,j}. The second is to say that you take a row of A and take its dot product, so to speak, with a column of B; that is using the inner product. The third way is to use the outer product: if a_k denotes the k-th column of A and b_k^T the k-th row of B, then AB is the sum over k of the rank-one matrices a_k b_k^T. This is just multiplication written in a different form, but it is quite useful. It is not a formula you would want to compute with, since you would have to create all these rank-one matrices and add them up, and the other ways are more efficient, but it is still useful. In our case I apply this identity to the relation W W^(-)T = nI and get that the sum over i of (W e_i)(W^(-) e_i)^T is n times the identity. In other words, the columns of W, paired with the columns of the Schur inverse W^(-), behave like a dual basis up to a factor of n: these rank-one matrices are idempotent after scaling, and they sum to n times the identity. Now I want to look at this in an equivalent way, using something I call a dual basis.
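The outer-product form of matrix multiplication is easy to sanity-check; a minimal sketch in NumPy (the matrices here are arbitrary random examples):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

# Outer-product form of matrix multiplication:
# AB = sum over k of (column k of A)(row k of B).
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
assert np.allclose(outer_sum, A @ B)
```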
The setup is this: we are given vectors x_1 through x_m in a vector space, an inner product space probably, and a second list of vectors y_1 through y_m. We say the second list forms a dual basis to the first if the products y_i^T x_j equal the Kronecker delta δ_{i,j}. You could do this over a general field, but basically our spaces are going to be real or complex, so I'm not going to fuss about that. The nice thing about dual bases is that if there is a dual basis, that certifies that x_1 through x_m are linearly independent. To see it, write down the matrix X with columns x_1 through x_m and the matrix Y with columns y_1 through y_m; the dual basis condition says Y^T X = I, so if the y's satisfy this then X has a left inverse, and its columns are independent. Said at a high level: the vectors x_1 through x_m have a dual basis if and only if the matrix with these columns has a left inverse; there are different ways of looking at it. The point about dual bases is that there are times when it is actually easy to see that there is a dual basis, and that lets you certify very quickly that a set is linearly independent. I am not requiring that x_1 through x_m be an actual basis; sometimes it will be, but I use the term dual basis in this slightly more general sense. Now, given a dual basis, assume at this point that m equals n, the dimension of the space. Then Y^T X = I tells us that Y^T is the inverse of X, so I can swap the order to get X Y^T = I, and using the outer-product trick this says the sum over i of x_i y_i^T is the identity. It is also very easy to check that in the product of two of these rank-one matrices you get a factor y_i^T x_j, which is δ_{i,j}; so the rank-one matrices x_i y_i^T are a set of pairwise orthogonal idempotents summing to the identity. So these matrices are trying very hard to be a quantum permutation.
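Here is a quick numerical illustration of the dual-basis facts just described (a sketch assuming NumPy; X is an arbitrary invertible matrix, and Y is built so that y_i^T x_j = δ_ij):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 4))      # columns x_1..x_n, invertible here
Y = np.linalg.inv(X).T               # then y_i^T x_j = delta_ij: a dual basis
assert np.allclose(Y.T @ X, np.eye(4))

# Rank-one matrices x_i y_i^T: pairwise orthogonal idempotents summing to I.
E = [np.outer(X[:, i], Y[:, i]) for i in range(4)]
for i in range(4):
    for j in range(4):
        expected = E[i] if i == j else np.zeros((4, 4))
        assert np.allclose(E[i] @ E[j], expected)
assert np.allclose(sum(E), np.eye(4))
```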
The only problem is that the entries are not required to be projections; they are only required to be idempotents, but they do sum to the identity and they are pairwise orthogonal. Now, there are a couple of things to say about this. The first is that this generality actually earns its keep: it covers some useful things, and there is stuff we can do with it. And it is not much harder, not really any harder, to work at this level of generality and then specialize to the flat case when we need it. That is already a bit of a surprise, because usually if you have something that works nicely, like a quantum permutation, and you start making small changes to the definition, it all blows up in your face; here it does not. So, continuing with the linear algebra: if a matrix M commutes with x_i y_i^T, then of course M x_i y_i^T = x_i y_i^T M. Both sides have rank one; the column space of the right side is spanned by x_i, the column space of the left side is spanned by M x_i, and these two spaces are equal, so the span of x_i equals the span of M x_i. That tells you x_i is an eigenvector of M; just for a change, let lambda be the eigenvalue. A similar argument tells you that y_i^T is a left eigenvector, with an eigenvalue which temporarily I call mu. Now compute the traces of these two products: the trace of M x_i y_i^T is y_i^T M x_i, which on one hand is lambda times y_i^T x_i, that is lambda, and on the other hand is mu times y_i^T x_i, that is mu; so lambda equals mu. So this tells us that if M commutes with x_i y_i^T, then x_i is a right eigenvector, y_i^T is a left eigenvector, and they have the same eigenvalue; that is what I have written down here. Now assume we have n vectors x_1 through x_n with a dual basis y_1 through y_n, and that M and x_j y_j^T commute for all j. Then M equals M times the identity, which equals M times the sum of the idempotents, and that is the sum of lambda_j x_j y_j^T; and this is looking a lot like a spectral decomposition.
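The conclusion above can be tested on a concrete diagonalizable, non-symmetric matrix; a sketch assuming NumPy (the matrix M is an arbitrary example):

```python
import numpy as np

# A diagonalizable but non-symmetric (indeed non-normal) matrix:
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

evals, X = np.linalg.eig(M)   # columns of X: right eigenvectors x_i
Y = np.linalg.inv(X).T        # columns of Y: the dual basis; the rows of
                              # inv(X) are left eigenvectors of M

# "Spectral decomposition" for a diagonalizable matrix:
# M = sum_i lambda_i x_i y_i^T, the x_i y_i^T being orthogonal idempotents.
recon = sum(evals[i] * np.outer(X[:, i], Y[:, i]) for i in range(2))
assert np.allclose(recon, M)
```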
The main difference is that these idempotents are not Hermitian; but then neither is M, so we could not expect that. So this gives an analogue of the spectral decomposition, but it is only correct if the number of vectors n is the dimension of the underlying space. What we have done here is construct a spectral decomposition for diagonalizable matrices, because we required a basis of eigenvectors, and a matrix has a basis of eigenvectors if and only if it is diagonalizable. So we have extended the notion of spectral decomposition from symmetric matrices to diagonalizable matrices. Now it turns out that for purposes of numerical work, when you are trying to do real calculations, the nice matrices to work with are the normal matrices: if a matrix is normal, its behaviour is controlled by its eigenvalues; if it is not normal, that is no longer true, and a diagonalizable matrix does not have to be normal. So there are numerical reasons why we normally state the spectral decomposition under those stronger assumptions. But the point still stands: if there is a basis of eigenvectors, then we get something that looks very much like the spectral decomposition; indeed it is a spectral decomposition. So we will go with that. The consequence of all this: if Y is the matrix of idempotents of an n-by-n type-II matrix, then each block row and block column of Y sums to the identity; and if W is flat, then the entries of Y are Hermitian, hence projections, and Y is a quantum permutation. So the summary is that a type-II matrix gives us a matrix of idempotents, and a flat type-II matrix gives us a quantum permutation. From this point of view we have got a slight generalization of quantum permutations. We are not really concerned in this course with the generalization; if you are interested in spin models and link invariants, then the generalization is very important.
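Here is a sketch of the corollary for a small flat type-II matrix (assuming NumPy and the 1/n normalization of the blocks used above; the DFT matrix plays the role of W):

```python
import numpy as np

n = 4
theta = np.exp(2j * np.pi / n)
W = theta ** np.outer(np.arange(n), np.arange(n))   # flat type-II (DFT) matrix

def Yblock(i, j):
    # Block Y_{i,j}: ratio of columns i and j, times the transpose of the
    # inverse ratio, scaled by 1/n (normalization assumed as in the text).
    u = W[:, i] / W[:, j]
    v = W[:, j] / W[:, i]
    return np.outer(u, v) / n

for i in range(n):
    for j in range(n):
        Y = Yblock(i, j)
        assert np.allclose(Y @ Y, Y)           # idempotent
        assert np.allclose(Y.conj().T, Y)      # Hermitian, since W is flat
    # each block row sums to the identity
    assert np.allclose(sum(Yblock(i, j) for j in range(n)), np.eye(n))
```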
But for the purposes of the quantum stuff we are really only interested in what happens in the flat case. It is possible that could change one day, but I think, as things stand, in the quantum situation the type-II matrices you are going to deal with will be flat. They do come up in a number of places: there is work on so-called symmetric informationally complete POVMs, and on mutually unbiased bases, where flat type-II matrices appear, but we are not going into that. Still, as I said, the summary, and probably the important thing, is: if you give me a flat type-II matrix, essentially a flat unitary up to scaling, I can construct a quantum permutation from it. It has a special form, and we will see more about that as we go, but each flat unitary determines a quantum permutation. And in a sense we are in an unusual position: instead of saying "here is a graph, find some quantum automorphisms", we are in a situation where we have a quantum permutation and we ask what it is an automorphism of. So, to continue working with this, I am defining the swap operator S on C^n tensor C^n: it maps the vector a tensor b to the vector b tensor a. Now S is a permutation matrix, because it is just permuting the elements of a basis: the vectors of this form span, they contain a basis, so S is defined on a basis and permutes it; hence S squared is the identity. The claim is that if W is type II, then S Y_W S = Y_{W^T}. The point is that type-II matrices always come in pairs, W and W transpose, and the connection between them is interesting and useful. To prove the claim you basically just go through and replace everything by its definition; there is no point in my going through it in detail now, since I have nothing to add. You go through the definitions, you find the two sides are equal, and then you think long and hard about the indexing.
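The swap operator is concrete enough to build directly; a sketch in NumPy (the identity S(A tensor B)S = B tensor A below is the mechanism behind the claim S Y_W S = Y_{W^T}):

```python
import numpy as np

n = 3
S = np.zeros((n * n, n * n))
for a in range(n):
    for b in range(n):
        S[b * n + a, a * n + b] = 1   # S maps e_a (x) e_b to e_b (x) e_a

assert np.allclose(S @ S, np.eye(n * n))   # S is an involution

rng = np.random.default_rng(2)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
# Conjugating by S swaps tensor factors: S (A tensor B) S = B tensor A.
assert np.allclose(S @ np.kron(A, B) @ S, np.kron(B, A))
```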
The trouble is that, because we are working with tensor products of matrices, instead of single indices i and j we have pairs of indices i,i' and j,j'; that is all that is happening here. In other words, this is just another way of saying the same thing: we are taking the (r,s) entry of Y_{i,j}, and we can write that in tensor notation. So we get the result; there is nothing deep in it, you just have to grind through and try to keep your tensor products straight, which I can only partly do. This will be useful: it means we can prove things about Y_W and get the corresponding statements for Y_{W^T} more or less for free. Now, to present some of the results I am going to use bracket notation. This is not the Dirac bracket; it is the Lie bracket. For two matrices of the same order, the bracket [M, N] is defined to be MN minus NM. It is bilinear, so linear in M and linear in N; it is antisymmetric, so if you swap M and N the sign changes; and it is not associative. I am just using it as a convenient notation, because I can say that M and N commute if and only if [M, N] = 0, which I sometimes find slightly more convenient; we will not be doing any Lie theory, so do not read anything into it. Now we have the following theorem. If Y is the matrix of idempotents of a type-II matrix W, then the Nomura algebra N_W, which consists of the matrices that have the column ratios of W as eigenvectors, is exactly the set of matrices M such that [I tensor M, Y] = 0; and N_{W^T} is the set of matrices N such that [N tensor I, Y] = 0. It follows from this, as we will see in a little while, that the matrices in N_W and in N_{W^T} commute with each other; and certainly if Y is flat we get a coherent algebra.
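A small sketch of the bracket notation and of the theorem's condition, checked on the cyclic example (assuming NumPy and the same 1/n block normalization as before):

```python
import numpy as np

def bracket(M, N):
    # Lie bracket [M, N] = MN - NM: bilinear, antisymmetric, not associative.
    return M @ N - N @ M

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3))
N = rng.standard_normal((3, 3))
assert np.allclose(bracket(M, N), -bracket(N, M))   # antisymmetry
assert np.allclose(bracket(M, M), 0)                # [M, M] = 0

# Worked check of the theorem for the cyclic (DFT) example: the n-cycle's
# shift P lies in N_W, so I tensor P commutes with the block matrix Y.
n = 3
theta = np.exp(2j * np.pi / n)
W = theta ** np.outer(np.arange(n), np.arange(n))
blocks = [[np.outer(W[:, i] / W[:, j], W[:, j] / W[:, i]) / n
           for j in range(n)] for i in range(n)]
Y = np.block(blocks)
P = np.roll(np.eye(n), 1, axis=0)   # permutation matrix of the n-cycle
assert np.allclose(bracket(np.kron(np.eye(n), P), Y), 0)
```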
In that case we would have a quantum permutation, so we would be getting quantum automorphisms of the graphs in the algebra, and so we have some sort of connection between the Nomura algebra and quantum automorphisms. As far as I know, the only place where this connection is written down is basically these notes, and it is not entirely clear what we can do with it, but it is something of considerable interest to us. So, we claimed these two relations, and basically we just need to prove one; then we use our S Y_W S trick to convert to the W-transpose statement. Notice that both statements are about the matrix of idempotents of W; the second could have been stated with the matrix of idempotents of W transpose, but it is not, and that does not matter too much, because we have established the relation between Y_W and Y_{W^T}: one is permutation equivalent to the other. So, the proof of the first part. Think about the block structure again: I tensor M is block diagonal, with copies of M down the diagonal, and Y is a block matrix with blocks Y_{i,j}. If these two matrices commute, comparing blocks you see that M must commute with each of the blocks Y_{i,j}; that is why I was doing all that work with rank-one matrices. The Y_{i,j} are rank one, so if M commutes with Y_{i,j}, then the columns of Y_{i,j} give you an eigenvector for M, and the rows give you a left eigenvector. So commuting with all these blocks says exactly that all the column ratios of W are eigenvectors of M, and therefore M is in the Nomura algebra. (I probably should not have written a star there, but it does not affect anything I am doing.) For the second claim, we conjugate the first relation by S, inserting S squared = I where needed; and so S times N
tensor I times S is I tensor N, which is exactly what S does, and S Y_W S is Y_{W^T}. So the conclusion is: if W is type II, then N_W is a commutative coherent algebra, and if Y is the matrix of idempotents, then Y gives a quantum automorphism of each graph in N_W; let me be a bit more precise, if W is not flat we are getting a generalization of a quantum permutation rather than a quantum permutation itself. Now, there is a result whose proofs, stated in the notes and in various places, are rather painful, and which goes back to Nomura. It is giving us information which I have not proved yet; I would like to be able to derive it by playing with the matrix of idempotents, but I have not succeeded in doing that yet. The result: if W is an n-by-n type-II matrix and M is in the Nomura algebra N_W, then the matrix of eigenvalues of M is in the Nomura algebra N_{W^T}. So the map Theta, taking M to its matrix of eigenvalues, sends elements of N_W to elements of N_{W^T}. Moreover, the image of a matrix product MN is the Schur product of the images, Theta(MN) = Theta(M) Schur Theta(N); since we are dealing with an algebra here, the conclusion that follows is that the image has to be closed under Schur multiplication. We also find that if M and N are in the Nomura algebra, then the image of the Schur product, Theta(M Schur N), is equal to the ordinary product of the images Theta(M) Theta(N), with a factor of 1/n thrown in to confuse things. So it is this that tells us that the Nomura algebras are Schur-closed, on both sides: Theta_W maps N_W to N_{W^T}, and Theta_{W^T} maps N_{W^T} to N_W, so the composition of the two maps sends N_W to itself and N_{W^T} to itself, and you can check that the composition is basically the transpose map, hence invertible; that gives you the claim. This is an important result, essentially due to Nomura. It could be proved with about the same amount of difficulty as anything else I've
been doing in this last part of the lecture; it is not spectacularly difficult by any means. But I do not have a nice proof using the matrix of idempotents, and I would like to have one. Okay, so we have this relation: type-II matrices give us matrices of idempotents, flat type-II matrices give us quantum permutations, and we have these Nomura algebras, which are related to the quantum automorphism groups. That is all very well and good, but it would be nice to have some examples, and this is a bigger problem than it might appear at first; still, we will soldier on. The simplest case to deal with is perhaps to take the Vandermonde matrix again, the character table V of the cyclic group, with (j,k) entry theta^{jk} for theta a primitive n-th root of unity. It is type II, and the important property of the Vandermonde matrix is that the Schur product of any two columns is again a column of it, and the Schur ratio of any two columns is a column as well. So in fact the columns of the Vandermonde matrix form a group under Schur multiplication, with the Schur inverse as inverse and the all-ones vector as identity; you have what you might call a Schur group, which we will not go into, but it is a nice property. Now, if P is the permutation matrix of the n-cycle, then P is in the Nomura algebra of V, because each column of V is an eigenvector of P. In fact, and this is just linear algebra, you do not need type-II matrices for it, N_V is the algebra generated by P; I may even assign that as an exercise. The point is that the eigenvalues of P are all distinct, so its image under Theta is a diagonal matrix with distinct diagonal entries, and you can use that to show that N_V is in fact the algebra of polynomials in P, which is the algebra of circulant matrices. So we know what the Nomura algebra is in this case. And as I said, in some sense once you know P is in the algebra you do not need to know about type-II matrices or quantum permutations; you just do a few calculations along these lines.
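These calculations are short enough to carry out directly; a sketch assuming NumPy:

```python
import numpy as np

n = 6
theta = np.exp(2j * np.pi / n)
V = theta ** np.outer(np.arange(n), np.arange(n))

# The Schur (entrywise) product of columns i and j is column i + j (mod n),
# so the columns form a group under Schur multiplication.
for i in range(n):
    for j in range(n):
        assert np.allclose(V[:, i] * V[:, j], V[:, (i + j) % n])

# Every column of V is an eigenvector of the n-cycle's permutation matrix P,
# so P lies in the Nomura algebra of V.
P = np.roll(np.eye(n), 1, axis=0)
for j in range(n):
    Pv = P @ V[:, j]
    assert np.allclose(Pv, (Pv[0] / V[0, j]) * V[:, j])
```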
So we get the algebra of circulants; we know the Nomura algebra. That is nice, because then we have the relation between the Nomura algebras and the things that commute with I tensor M and so on, so we are in reasonable shape. But in this case we discover the bad news: all the entries of Y commute. That is not entirely trivial: each entry of Y in this case can be written in the form (1/n) x x^*, where x runs over the columns of V; since those columns are pairwise orthogonal, these are projections onto a fixed orthogonal set of vectors, and from that you can show that the entries of Y commute. So we do not get any interesting quantum automorphisms. Now, I am mildly annoyed here: we have this concept of a quantum automorphism group, which I will say more about in a later lecture, and you would expect the elements of the quantum automorphism group to be called quantum automorphisms; but the people in the area seem to reserve "quantum automorphism" for the elements of the quantum group that are not classical. So the terminology is a little bit confusing. In any case, the point is: if we take the Vandermonde matrix, we can determine the Nomura algebra, we know the entries of Y_W, and we get a quantum permutation; that is the good news. The bad news is that it is a direct sum of permutations. Still, it works. Now, for the second class of examples I want to work with Hadamard matrices, so we will start with an n-by-n Hadamard matrix. A general comment first: if W_1 and W_2 are type-II matrices, then their Kronecker product W_1 tensor W_2 is obviously a type-II matrix, and its Nomura algebra is the tensor product of the two Nomura algebras; again this is really straightforward, and it is worked up in the exercises. Now, the dimension of any Nomura algebra is at least two, because it contains I and J; if it is exactly two, it is trivial, a word we will be throwing around before very long. So, for example, we can get a Nomura
algebra of any dimension we want, larger than any bound you give me, over a range of values, by taking Kronecker products. It will turn out this is not a very interesting construction, but it should be kept in mind. Now, this is actually saying what I said before: if I have a flat type-II matrix W and the set of columns of W is closed under Schur multiplication, then all the entries of Y_W commute; there is the actual proof. It follows that if W is a Kronecker product of Vandermonde matrices, then the entries of Y_W commute. One reason this is important: I said the Vandermonde matrix is a type-II matrix; more generally, the character table of any finite abelian group is a type-II matrix, but the character table of a finite abelian group is a Kronecker product of Vandermonde matrices. So this is telling us that even if we go beyond Vandermonde matrices and cyclic groups to more general abelian groups, we are not going to get any quantum automorphisms with non-commuting entries; we cannot do any better than what we already have. Now, about Nomura algebras of dimension two being trivial: there is a result, due to Nomura and probably others, showing there are restrictions. We know that a Hadamard matrix of order greater than two has order divisible by four; if you want the Nomura algebra of an n-by-n Hadamard matrix to have dimension greater than two, there is a further restriction on n. I mention this just to show you that there are restrictions. Now, the bad news is that for all known Hadamard matrices, the entries of the matrix of idempotents commute. Here "all known" means Hadamard matrices up to order 28; I did the calculations many years ago and do not have any written record anywhere. So the smallest open case is order 32; at the time we were doing this there was no list of the Hadamard matrices of order 32, but I think there is one around now, so it could be pushed to 32 if anybody really cared.
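For a real Hadamard matrix the type-II condition is particularly transparent, since a plus-minus-one matrix is its own Schur inverse; a sketch assuming NumPy (Sylvester's order-4 matrix as the example, built as a Kronecker product):

```python
import numpy as np

H2 = np.array([[1, 1],
               [1, -1]])
H = np.kron(H2, H2)   # Sylvester Hadamard matrix of order 4

# For a +/-1 matrix the Schur inverse is the matrix itself, so the type-II
# condition W @ W_minus.T = n I is exactly the Hadamard condition H H^T = n I.
n = H.shape[0]
assert np.array_equal(1 / H, H.astype(float))
assert np.allclose(H @ (1 / H).T, n * np.eye(n))
```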
I have no real idea whether there is a Hadamard matrix whose matrix of idempotents has non-commuting entries; if there is one, the smallest example is on 32 vertices. So, in a sense, using flat type-II matrices we can get matrices of idempotents and non-trivial Nomura algebras, but all our quantum permutations are commutative, and that is bad news. Now, we can get non-commutative things if we drop the flatness restriction. There is a limited supply of type-II matrices that are not flat, have a non-trivial Nomura algebra, and are not products. I should be a little careful when I say "not products": I feel slightly nervous, because sometimes when you have a product construction there is a way of twisting the product a bit so that you get something which is technically not a product but works the same way; so I am using "product" in that vague sense. But the interesting thing is that the one family we do have, where we get some interesting type-II matrices, is based on the Hadamard graphs, which we discussed before. When I discussed Hadamard matrices I talked about the adjacency matrix: for an n-by-n Hadamard matrix H, the adjacency matrix of the Hadamard graph can be written as a 4n-by-4n block matrix built from the doubling of H, where we replace each entry by a 2-by-2 01-matrix so as to get rid of the minus ones. In fact you get four matrices to play with: the identity; a matrix with blocks J minus I on the diagonal, which is just two copies of a complete graph; the matrix of the Hadamard graph itself; and its complement. If you think about it for a moment you will see that these are all 01-matrices, they sum to the all-ones matrix, and they are Schur orthogonal. So it seems very plausible that these four could be the basis of a coherent algebra, and they are. If I had been discussing distance-regular graphs and such things I could do it all using that machinery, but I am just going to ask you to trust me.
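The doubling step can be sketched directly (assuming NumPy; `double` is a hypothetical helper name, not notation from the notes):

```python
import numpy as np

def double(H):
    # Replace each +1 entry of H by the 2x2 identity and each -1 entry by
    # the 2x2 swap matrix, giving a 01-matrix of twice the order.
    I2 = np.eye(2)
    SW = np.array([[0., 1.], [1., 0.]])
    n = H.shape[0]
    return np.block([[I2 if H[i, j] == 1 else SW for j in range(n)]
                     for i in range(n)])

H = np.kron([[1, 1], [1, -1]], [[1, 1], [1, -1]])   # Hadamard matrix, order 4
C = double(H)                                        # 8 x 8 01-matrix

assert set(np.unique(C)) == {0.0, 1.0}
assert np.allclose(C.sum(axis=1), H.shape[0])   # each row sums to n
# The complementary doubling fills in the rest of the all-ones matrix:
Cc = double(-H)
assert np.allclose(C + Cc, np.ones_like(C))
```

The bipartite Hadamard graph then has adjacency matrix [[0, C], [C^T, 0]], of order 4n.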
These four matrices are the canonical basis of a commutative coherent algebra; you can check that without much difficulty. Nomura proved that this algebra contains a type-II matrix W such that W is contained in its own Nomura algebra, and that then implies the dimension of the Nomura algebra is at least three. This gives a link invariant; as link invariants go it is not particularly interesting, but there is a link invariant there. So Hadamard graphs work, and that gives an infinite family of examples; I think there are also a couple of generalizations of this, a twisted product for instance, but anyway, I am just giving examples that this does work, and there are some further examples around. There is one more example I want to talk about, the Higman-Sims graph. The Higman-Sims graph is a strongly regular graph; it was first found by Mesner. For historical reasons, one reason it is interesting is that it has an interesting automorphism group, leading to the group called the Higman-Sims group; the graph is named after Higman and Sims, but it was discovered by Mesner independently and quite a bit earlier. It is a strongly regular graph on 100 vertices: it is 22-regular, there are no triangles, and any two vertices at distance two have exactly six common neighbours. With A its adjacency matrix, there are scalars alpha, beta, gamma such that the linear combination alpha I plus beta A plus gamma (J minus I minus A) is a type-II matrix W, and the Nomura algebra of W is non-trivial. This gave an interesting link invariant, which attracted a lot of interest. I should say that the numbers alpha, beta and gamma are not particularly pleasant, and the same applies to the associated formulas: you really do not want to try to calculate
what alpha, beta and gamma actually are; it is painful enough, even though the proof itself is quite short. So we have the Hadamard graphs and we have the Higman-Sims graph. Now, if you are going to generalize this construction, you need a triangle-free strongly regular graph with some other properties, and the problem is that the largest non-bipartite triangle-free strongly regular graph we know is the Higman-Sims graph: we do not know a strongly regular graph which is triangle-free, not bipartite, and with more than 100 vertices. This has been an open question for at least 50 years, and people are somewhat obsessed with it. So we do not know if there are more triangle-free strongly regular graphs or not; that is an interesting and important question in its own right, but for us the point is that more such graphs might give more link invariants. The Higman-Sims graph is triangle-free, and its automorphism group contains the Higman-Sims group, a sporadic simple group, as a subgroup of index two. So the automorphism group itself is not simple, but it has a subgroup of index two which is simple, the Higman-Sims simple group, one of the sporadic simple groups, so not constructed using Lie theory. And this is just saying what I said: triangle-free strongly regular graphs are not really common. We have many examples of type-II matrices; type-II matrices are not rare. The problem is that the ones we construct almost invariably have Nomura algebra of dimension two: they are related to designs, to equiangular lines, to mutually unbiased bases, all sorts of things give rise to type-II matrices, but in most cases, except for the ones I told you about, the Nomura algebra has dimension two, or comes from products, which does not buy you anything extra. Okay. Now, I have indicated once or twice that I am giving you a somewhat simplified definition of quantum automorphism, and I thought I would show you a little of what we are leaving out. I have been thinking of quantum permutations as matrices of projections,
with the projections coming from a finite-dimensional matrix algebra. In the general definition you are allowed to play with bigger algebras: you are allowed to use C*-algebras, so I want to say a little about that. With C*-algebras, the canonical example is the set of bounded linear maps on a Hilbert space; for example, the space of all n-by-n matrices acting on C^n. That is the canonical example of a C*-algebra. The key parts of a C*-algebra are an underlying Banach algebra, which I will explain in a second, and an involution; for the involution, think conjugate transpose. The difference between a general Banach algebra and a C*-algebra is the star: you have this involution acting, and I will explain that now. An involution on an algebra is a map a to a-star such that the star of the star is the original element, it is order-reversing, so (ab)* = b* a*, and it is linear over R but conjugate-linear over C. That is what a star is; again, conjugate transpose always works as the mental model. A Banach algebra is a normed algebra: you have an algebra of operators, a norm is defined, and the important property is that the norm of a product is at most the product of the norms of the factors; most norms you have seen are like that. A C*-algebra is then a Banach algebra with an involution satisfying the condition that the norm of a-star times a equals the norm of a squared. What this tells you, after you do enough work, is that the involution and the norm are closely connected; there is a strong connection between the norm and the involution. Now the definition: a compact quantum group is a C*-algebra with a unit (the operator theorists do not normally require their algebras to have an identity, but in this case you do), together with a homomorphism, and you will not like this, that maps the C*-algebra to its tensor square and satisfies two unlikely-looking conditions, which you may be better off not trying to decode right now.
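The axioms just listed can all be checked numerically for the canonical example, n-by-n matrices with the conjugate-transpose involution and the operator norm; a sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

star = lambda M: M.conj().T   # the involution: conjugate transpose

# Order-reversing and conjugate-linear (linear over R, not over C):
assert np.allclose(star(A @ B), star(B) @ star(A))
assert np.allclose(star(2j * A), -2j * star(A))

opnorm = lambda M: np.linalg.norm(M, 2)   # operator (spectral) norm
# Banach-algebra property, and the C* identity tying norm to involution:
assert opnorm(A @ B) <= opnorm(A) * opnorm(B) + 1e-12
assert np.isclose(opnorm(star(A) @ A), opnorm(A) ** 2)
```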
Now, a warning about tensor products: there is nothing simple about tensor products. The problem, when you are dealing with C*-algebras or normed algebras, is what norm you are going to use on the tensor product; it is not given to you for free, and so there are some conditions hiding in there. Just defining the norm on these tensor squares requires attention to detail, let us say. And then we have these two maps; the map from the algebra to its tensor square is called a co-product, and co-products are in a sense dual to products, but I do not think you want to know about that right now. My point is simply that there is a definition of these objects, called compact quantum groups. Now, in a C*-algebra you can define projections: a projection is an idempotent element which is equal to its own star. And a quantum permutation is then an n-by-n matrix of projections from a C*-algebra with unit, with rows and columns summing to the identity. Even now I am still suppressing some complications, some of which I did flag earlier. But the basic point is that when people like David Roberson talk about quantum automorphisms, they allow the projections to come from something bigger than just a matrix algebra, and it turns out that by increasing the size you actually change things: the two concepts are different. For our purposes it does not much matter, since it turns out that, whichever variant you use, deciding whether two graphs are quantum isomorphic is undecidable, so one is not easier than the other. And everybody will agree that the quantum permutations we use sit inside the quantum group. So my view is that the combinatorial questions we care about can be stated nicely just using the definition we have been using; if you want to go on to quantum information theory and the relation between quantum automorphism groups and quantum information, then you need to go into more detail about the definition. But if you choose to play with the combinatorics, like I do, then you can just smile when
they start worrying about whether this is the tensor-commuting or the quantum-commuting variant. Okay, and that is it for today.