Hi, I'm Tom Jenen with The View on AdTech. It's no secret that SSPs have evolved into demand partners, a shift that has moved them away from providing technology that comprehensively solves publishers' needs in a demand-agnostic way. Today, we're talking with one company that fills that gap. Nils Lind is the CEO and Founder of Assertive Yield, and we'll dive into their latest innovation, their Prebid Server, and the impact it's having on publishers' bottom lines. Nils, it's great to have you today.

Thank you very much for having me, Tom.

I was wondering: what has Assertive Yield developed to address publishers' revenue needs?

At Assertive Yield, we take a holistic view of the publisher's needs, always from the publisher's perspective. At the same time, we work with the publisher in an unbiased partnership, in the sense that we don't bring any demand of our own, and we optimize holistically to drive the highest combined revenue the publisher can achieve. We divide that in multiple ways. On one end, it starts with very granular and sophisticated data; on the other end, we have developed Machine Learning systems that can predict certain events and incoming revenue very precisely. They have become so sophisticated that we have already predicted over $1 billion in revenue. At the same time, we are able to capture everything that is happening and optimize the complete ad delivery on the page. So, from our perspective, it's a complete picture of what should happen, at what point in time, at what prices, and so on.

How is that different from what else has been going on in the industry?

A lot of our focus has been on the Machine Learning side itself; we come from a data-driven perspective. We've built one of the most granular solutions in the market in terms of tracking what is going on, and we have gained a very deep understanding of which actions produce which outcomes. We use that experience, in combination with the data science side, to really dig down and optimize things, and to not just stop at optimizing and measuring performance in one scenario, but to constantly reinvent and re-engineer these optimizations so that they do well in different scenarios within a constantly changing landscape.

And does the algorithm do that all on its own?

It's always a combination of manually helping it and optimizing it, but over the years it has become more and more automated. It's increasingly capable of dealing with these changes, automatically detecting them, and readjusting and retuning itself, because we don't have unlimited time to constantly adjust it.

Yeah. When did you start building this algorithm? Is it just one algorithm, by the way, or is it many?

It's multiple, or many, algorithms in general. For different use cases, it's often a combination of different algorithms that deal with and manage different scenarios, because one algorithm might do really well with something like Black Friday, while another is better when performance is quite static or stable. The data-tracking side goes back to 2017, but all of the Machine Learning started around 2020, or the end of 2019. That is when we really focused on how we could turn this data into outcomes that generate more revenue for the publisher.

Sometimes when I talk to people about Assertive Yield, I tend to look at it from the product-development side, almost as a linear progression. First, the company built its Analytics package, which was innovative and more complete than a lot of the other analytics packages on the market. But then, once the analytics were there, you had to make it easier for publishers to turn that data into optimizations that would be more automated. And so you built the Wrapper, and you built it as a client-side Wrapper, because you could get more data and make use of the analytics. Then, once you understood all of the demand that was coming through, based on the optimizations, it made sense to make Dynamic Flooring one of those optimizations. Dynamic Flooring was an outgrowth of understanding which bidders were going to bid, and how much they were going to bid when they did. But once you understood that, you also knew who wasn't going to bid, so Traffic Shaping made a lot of sense at that point. Is that kind of how you look at your progression?

Yeah, pretty much. We went from the Analytics stage to Machine Learning optimization, then to the Wrapper, and so on, step by step. On one hand, we have this very sophisticated, real-time, very granular dataset that covers not only revenue information but also engagement information; we call that Vitals Performance. It's a holistic picture of what is going on, and it's relevant for pretty much every team within a publisher: the Product team can use it, the AdOps team can use it, the RevOps team can use it, the Finance team can use it. It really brings everything together under one platform, with one data source to reference, so everyone can agree: this is the story.

Yeah. Even the Editorial teams can understand not just the traffic to a page, but also the value of that page for advertisers. They may or may not want to take advantage of that information, but it's there for them for the first time.

Yeah. Having all of this data available, we of course need to put in time to make use of it. We have Alerts and other things that automatically, in real time, let us know when something goes wrong or when there's an opportunity, but in any case, we need to put in time to get value out of it. So, sitting on this vast pool of data that is very granular and very accurate in nature, it made sense to start asking: what can we do automatically to reduce the amount of time someone would need to spend working with the data to get results? How can we generate those results automatically? That's where our Traffic Shaping was born, where Dynamic Flooring was born, where Supply Shaping was born. At the same time, we still want to do certain things manually, because not everything can be automated. I don't think there's any kind of system that can really operate at its full potential fully automatically; it's always manual optimization in combination with a machine that gets the best results. And a lot of the optimization we want to do requires speaking to the Engineering team, getting it translated, and properly testing and evaluating it, which is a very long process. So the next organic evolution step for us was: okay, we should make it simple to make those changes, make it very easy to analyze those changes, and produce accurate data on them. That is when we built the complete Revenue Management System, which is a Wrapper and a Tag Manager, something that controls everything that happens on the page related to advertisements, in some cases even the layouts.

Because there's so much that can be done, do you find that publishers need additional hands to make the best use of it, or can they actually do even more with fewer?

It can go in both directions. On one hand, of course, they need fewer resources to do the same things. But on the other hand, all of a sudden they can do a lot more, and they want to do a lot more, because it's no longer as tiresome or as annoying to do something when you can do it in a few minutes to an hour. So instead of testing one thing a month, all of a sudden you want to test 20 things a month.

Until quite recently, you were building this as a Wrapper-based, client-side solution. What prompted the shift to server-side?

When we integrate from a Wrapper perspective, it is in general the perfect integration, because we are in control of everything that is happening: where on the page the ad should show up and at what point in time, what the lazy-loading offset is, what the prefetching is, and so on. We can control everything. It's the way we can achieve the highest potential uplift, and it gives us the most holistic control and optimization potential. But integrating from a Wrapper perspective means ripping out everything that is there and re-implementing it. That is not something that can be done in a day. It is a longer process, and the more sophisticated a publisher is, and the more tools that publisher is using, the longer it can take. It's usually still relatively quick, anywhere from a week to a month, but it is still a considerable amount of resources that need to be put in.

Yeah, it feels like that. Just like you said in the beginning, there's been a huge gap in publisher-focused, demand-agnostic technology, such that publishers, especially the bigger publishers, have had to build their own to fill that gap. But now you come along, and you've spent years building out technology that fills that gap specifically, and it's your only job, so of course it seems like it would probably work better. At the same time, it's got to be a big lift for publishers to rip out what they've done before. Is that one of the goals of the Prebid server, to make it an easier task?

It kind of is. With Prebid server, on one end, things are of course moving server-side step by step, so it is the logical direction to go in. But for us, there was also the fact that at some point we started providing these kinds of algorithms and this Machine Learning to SSPs, which run them server-side; we have been doing that since around 2021. So we had also gained very strong sophistication doing this work server-side, and it became a logical step. We decided we wanted to run the auction server-side, or rather move to the server side, as that is where things are heading, with cookies going away and so on, and to integrate these optimizations server-side. So we took all of those sophistication gains and all of the improvements we have built over the years for SSPs and for publishers, and put them on top of a Prebid server. In the end, we now have a Prebid server that, in general, runs the auction, takes care of the privacy pieces, and a few other things, but primarily runs the auction, and we have enriched it with the Machine Learning we have been doing client-side for publishers for a long time, and server-side for SSPs.

How long would it take for a publisher who implements this? Because it doesn't take long to implement, right? You just said you could do it in a day. So how long would it take for them to start seeing some benefit?

Yeah. The really nice thing with the server-side integration is that it can run alongside, within the existing stack. Of course, we will only be able to improve this one piece of the entire puzzle.

Yes, not the wrapper.

Yeah. But we can already generate significant results there. For a publisher, it's a relatively easy lift: the integration is about as simple as integrating a new bidder on the stack, and there's not even a need to update ads.txt records, because, again, we are unbiased. We don't bring any demand; we purely work to help the publisher optimize what they already have, and squeeze more out of the demand partners they may already work with. The integration oftentimes takes anywhere from 30 minutes, if the publisher can quickly make those changes, to a day, and then they are live and running with it.

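To make that concrete, here is a minimal sketch of what that bidder-style integration can look like in Prebid.js, with a server-side path running alongside the existing client-side stack. The endpoint, account ID, and bidder codes below are placeholders for illustration, not Assertive Yield's actual values.

```ts
// Hypothetical Prebid.js server-to-server configuration. All values are
// placeholders; a real setup would use the vendor-provided endpoint and IDs.
declare const pbjs: { setConfig: (cfg: object) => void };

pbjs.setConfig({
  s2sConfig: [{
    accountId: "example-account",                          // placeholder account ID
    bidders: ["bidderA", "bidderB"],                       // bidders to route server-side
    endpoint: "https://pbs.example.com/openrtb2/auction",  // placeholder auction endpoint
    syncEndpoint: "https://pbs.example.com/cookie_sync",   // placeholder user-sync endpoint
    timeout: 1000,                                         // server-side auction timeout (ms)
  }],
});
```
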
Now, about seeing the results: initially, we can always see the real-time effects, and then over the course of roughly one week, it reaches roughly peak performance. There's a bit of a warm-up time, as the SSPs have to learn about this new path to buy inventory. The reason behind that is that, on one end, we are setting floor prices, and on the other hand, we are shaping the traffic: we cut out requests that are not really interesting for the SSP, and we try to send the requests that actually have a possibility of winning and generating revenue to the right SSPs. It takes the SSPs time to learn that. They are not going to see within a few minutes that this path of buying is very clean; they are only going to see it over a few days. Through that, we are then also able to tap into inventory that before was more or less untouched.

So that's the Traffic Shaping benefit from implementing the Prebid server: you're going to know who is going to bid and how much they're going to bid, so you can actually send only the traffic, only the bid requests, to partners who are going to provide something of value to the publisher. Is that right?

Yeah. That's correct.

How do you know which bidders are going to bid, and how much they are going to bid?

In the beginning, we know from the data we have from other publishers already running it. But at the same time, of course, we gather information specifically about the new publisher that just integrated it. As that data ramps up, we then prioritize the publisher's own data for their predictions, because there's always some variance between different IDs, different domains, and so on.

So is that contextual? Is that based on the page, or what is it based on?

From our perspective, we've gotten to a point where it's quite sophisticated, in the sense that we are creating user profiles. For every single user visiting the domain, we start to create a small profile, and we learn the historic information about that user. Over time, we assign the user to different segments, and through the combination of all of these things, we're able to get very close, as close as we can be, to an accurate prediction. This is also what was built over those many years, iteration after iteration after iteration: how can we be as precise as possible? On one end, we might say we don't care about the user because the cookie is going away anyway, but there will be alternatives; there already are alternatives. So for us, it's very important that we understand the user, that we understand the user's performance, and that we are able to understand the user's performance even if we have never seen that user before, by having lookalike information available for that given user.

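As a hypothetical sketch of that lookup, with names and fields that are illustrative rather than Assertive Yield's actual data model: use the visitor's own history when it exists, and fall back to lookalike segment information for a never-seen user.

```ts
// Hypothetical user-profile store with a lookalike fallback.
interface UserProfile {
  visits: number;      // how often we have seen this user
  avgCpm: number;      // historic value of this user's impressions
  segments: string[];  // segments this user has been assigned to
}

const profiles = new Map<string, UserProfile>();
const segmentAvgCpm = new Map<string, number>(); // learned per-segment averages

function predictedValue(userId: string, lookalikeSegment: string): number {
  const profile = profiles.get(userId);
  if (profile && profile.visits > 0) {
    return profile.avgCpm; // known user: use their own history
  }
  // Never-seen user: fall back to the lookalike segment's average value.
  return segmentAvgCpm.get(lookalikeSegment) ?? 0;
}
```
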
So how else can you use that information to impact publisher revenue positively?

On one end, it is, of course, shaping the traffic and making sure that the highest-value inventory goes to the SSPs. That lets us influence the SSP's decision on whether to run an auction for it, and how many DSPs are actually included in that auction. Quite often, an auction's size is somewhere around 4 to 8 bidders, although an SSP, at least a big one, works with hundreds, and then there are auctions where only one bidder is included. So it's really about the optimization the SSP does. By being very close to the publisher, and seeing the full picture of what is going on, we can predict much more precisely what is interesting for which partner, be more precise in cutting the traffic, and basically prepare the food for the DSPs so they can consume it more easily.

Okay.

But then, on the other side, for example, we are setting the floor prices, and on the floor-pricing side, it is a lot about being able to predict what a demand partner is willing to pay for the inventory, and what the value of the inventory is.

So for all demand partners, you're going to understand how much they're willing to pay, for every single auction, before the auction happens. Is that right?

Yeah. For every single auction, and even for each creative size that is part of that auction, and for each placement that is part of that auction, we are predicting the likely selling price for that inventory at that given moment, with an understanding that, say, we just moved from one month to the next, so there's a normal market drop in performance, or that we are moving toward the end of a month, so we already know revenue is going up.

I see. Dynamic Flooring sounds like something that other people have offered. How is yours different?

There are many different ways of doing Dynamic Flooring. For us, the focus has been on using that granular data for the most precise predictions, to come as close as possible to the price point the market is going to pay for that inventory, and then figuring out the right level of aggressiveness for setting the floor price, which depends on how close our prediction is to the likely outcome. For example, we could say that, with a 50% probability, the closing bid will be between $120 and $145. Now we need to decide where to put the floor price within that range, and that depends on how precise we are in the prediction, and on understanding how precise we are at that moment. The more we know about a user, the more precise we can become, and the more aggressive we can be in setting the floor price.

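A hypothetical sketch of that decision rule, where the linear interpolation and the names are illustrative rather than the actual model: place the floor inside the predicted closing-bid interval, higher when the prediction is more precise.

```ts
// Hypothetical floor-price chooser over a predicted closing-bid interval.
interface BidPrediction {
  low: number;        // lower bound of the predicted closing-bid range
  high: number;       // upper bound of the predicted closing-bid range
  confidence: number; // 0..1: how precise the prediction currently is
}

function chooseFloor(p: BidPrediction): number {
  // Low confidence: stay near the bottom to avoid blocking real bids.
  // High confidence: push toward the top to capture more of the bid.
  return p.low + (p.high - p.low) * p.confidence;
}

// The example from the conversation: a 50% probability that the closing
// bid lands between $120 and $145. With 0.8 confidence, the floor is $140.
console.log(chooseFloor({ low: 120, high: 145, confidence: 0.8 })); // 140
```
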
That's one side of what we do really well. The other side is understanding that there are different strategies: one works one week, and the next week a different strategy works. There's one strategy that works better on iOS, a different one that works better on Android, and it varies per country, per time of day, and so on. As we test all of these different strategies, we also need to understand at what point in time, and for what segment of traffic, we should switch the strategy being used, to keep the uplift at its peak.

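A hypothetical sketch of that per-segment switching, with segment keys, strategy names, and scoring that are all illustrative: track recent uplift per strategy and segment, and route each segment to whichever strategy is currently winning.

```ts
// Hypothetical per-segment strategy selector based on recent uplift.
type Segment = { os: "ios" | "android"; country: string; hour: number };

// Recent measured uplift, keyed by "strategy|os|country|hour".
const recentUplift = new Map<string, number>();

function segmentKey(s: Segment): string {
  return `${s.os}|${s.country}|${s.hour}`;
}

function pickStrategy(s: Segment, strategies: string[]): string {
  let best = strategies[0];
  let bestUplift = -Infinity;
  for (const strategy of strategies) {
    const uplift = recentUplift.get(`${strategy}|${segmentKey(s)}`) ?? 0;
    if (uplift > bestUplift) {
      bestUplift = uplift;
      best = strategy;
    }
  }
  return best; // the strategy currently performing best for this segment
}
```
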
So, Dynamic Flooring sounds really interesting, and all of this seems to be based on the original analytics that you've been working on from the very beginning. Are those analytics available in the Prebid server as well?

When we run Prebid server, we primarily include the Prebid-specific analytics, so that both the publisher and we can see what is going on, not only on the server side but also in the client-side auction, because not every single bidder supports server-side yet, and a publisher might prefer to keep some bidders client-side. This data is primarily used for analysis: it is available in real time, at very fine granularity, and it even supports custom data being pushed into it. So it's not only the tests and optimizations we are doing that we can break the data out by; it's also any test the publisher is running, so that we can understand the incremental impact of this Prebid server in combination with the other tests and optimizations the publisher is running.

So it sounds like a lot of this is based on the actual data that Assertive Yield is able to gather from the auctions on publishers' websites. It doesn't sound like it's very cookie-based, so how does this set publishers up for a Cookieless Future?

In terms of the cookieless future in general, we do have identity integrations server-side; a lot of the available Identity Solutions can work server-side. Of course, they also work automatically, out of the box, if they are running client-side, but we are adding more and more to the server side as well. To give an example: if we are optimizing or running AMP inventory, right now we can't use any Identity Solutions there. A lot of the bad performance we see on AMP traffic, particularly iOS traffic, is because we can't do anything for identity. But when we are running the server-side auctions, and have the server-side integrations of identity, all of a sudden we can use them. It's a similar case when we look at in-app inventory, where we no longer have a device ID: being able to use those server-side now opens up possibilities where before we were limited in those specific inventory types.

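For the client-side half of that picture, enabling an ID module in Prebid.js looks roughly like the sketch below; IDs configured this way are also passed along to the server-side auction path. The module and storage settings are just one example, and publishers choose which ID solutions to activate.

```ts
// Minimal Prebid.js user-ID module configuration (one example module).
declare const pbjs: { setConfig: (cfg: object) => void };

pbjs.setConfig({
  userSync: {
    userIds: [{
      name: "sharedId", // one example ID module among the many available
      storage: { type: "cookie", name: "_sharedid", expires: 30 }, // expires in days
    }],
  },
});
```
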
What kind of revenue impact do you see with that kind of Identity Solution?

It depends on which Identity Solutions the publisher wants to activate in the end. But when we look at iOS specifically, lifts of 15% are very common, and in many cases it's even a lift of 30%. Of course, at this point that does include ID-bridging solutions, transparently submitted through e-IDs and so on.

And that's all within the control of the publisher? They can pick and choose which ID solutions, or which aspects of them, they'd like to use?

Pretty much, yes.

Going forward, what do you recommend for publishers who are concerned about their future revenues?

From a publisher's perspective, in regards to future revenue, it is paramount, in my opinion, to squeeze out what is possible from the open market, so as not to be left behind there. There's a lot of sophistication among those publishers that have the development resources available, that are able to have 5 or 10 engineers just working on ad tech solutions, similar to the big sales houses. But a mid-sized publisher, or even a big publisher with limited resources, just can't get to that level of sophistication, and then there's double-digit revenue being left on the table from the open market. At the same time, there is of course the shift toward building more and more of a direct business, but also building on the identity stack, so that you can capture as much as possible from what is today the iOS inventory and the Firefox inventory. And as we know, more and more traffic will be without cookies.

So it's most important right now to get a clear sense of what is possible, and of how much revenue can be extracted out of your current stack, so that you can establish the right bar for all of your direct spend.

So obviously, if you're making more money from Programmatic, then you're going to charge even more for your Direct, because you know where that essential floor is. Is that right?

Yeah, correct. Most publishers never fully sell out directly, so you always have this portion of inventory, and sure, you can push up the prices, but you might not even need to, because 5% more on Open Auction, 10% more, 15% more, is still real money to the bottom line.

In their cookieless traffic today, publishers very often don't see full sell-through. In fact, they may not sell half of their cookieless traffic today at all. Do you see an uptick in the amount of sell-through that publishers are seeing when they work with these Identity Solutions?

Definitely. The Fill Rate is not back to the Android levels, but it's not far off. Also, the CPMs we see with all of the Identity Solutions integrated, at least for US inventory and some other countries (unfortunately, there's not as much possible on the European market), sometimes get to within 20% of Android performance.

So, looking at the Prebid server: overall, what would your revenue expectation be if you were a publisher implementing Assertive Yield's Prebid server? What would you like to see?

When we integrate Prebid server across publishers, it doesn't really matter what kind of publisher it is; it could be one that is purely Open Auction, or Programmatic, or one that is, let's say, 40% Direct. One story we always see is that the amount of revenue that comes from Prebid, as one of the revenue channels, oftentimes increases by double digits, in the 20%, 30%, sometimes 40% range. So it's a huge increase. Of course, it depends on how sophisticated the client-side auction already is, and whether flooring is already running there and so on, which can reduce it a bit, but it's still a significant lift. And if you look at the holistic picture, take for example a publisher running only Open Auction: a 20%, 30%, or 40% lift on Prebid alone results in an overall increase of usually around 15%. So with Ad Exchange and Amazon and everything included, a 15% lift just for moving bidders from client-side to server-side in order to utilize these optimizations is a pretty significant impact.

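The arithmetic behind that is straightforward; the revenue split below is an assumed illustration, not a figure from the conversation.

```ts
// Illustrative only: if Prebid drives roughly half of total revenue,
// a 30% lift on the Prebid channel alone is about a 15% overall lift.
const prebidShare = 0.5;  // assumed share of total revenue from Prebid
const prebidLift = 0.30;  // lift on the Prebid channel alone
const overallLift = prebidShare * prebidLift;
console.log(overallLift); // 0.15, i.e. roughly 15% overall
```
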
Of course, the even better step would be running those integrations client-side as well, to holistically capture everything that is going on, in which case it can go well beyond 20%. But purely with a very simple integration, moving things server-side and utilizing all of these Machine Learning-based optimizations, you can already achieve a significant lift for almost nothing in terms of integration work.

What do you think demand partners think of all this work that you're doing to help publishers make more money? Do you think they're slightly unhappy about having to pay more for their inventory?

Do you mean the Advertisers specifically, or more the middlemen, the SSPs and DSPs?

Probably more in the middle, yeah.

I don't think it makes any difference for them. They take their margin, usually a revenue share, and the advertising spend that is on the market remains more or less the same. One thing that is of course happening is that the optimizations we do have varying effects on different partners: for some of them we can squeeze out more, for others we can squeeze out less.

To give one example: if an SSP is applying Traffic Shaping on their end, meaning they are making the decision on whether to drop the incoming request or run an auction, and how many DSPs to include, quite often they use the incoming floor price as one of the decisioning factors. But if they only look at the floor price, without regard to where that floor price is coming from, they might quite often decide not to run an auction, because they assume it's a static floor, or a floor that doesn't have much precision behind it, and then they are missing out, because another SSP that submits it and sends it through is benefiting from us being so good at predicting the right floor price for that given auction.

So, do you think that overall it pushes demand partners to see more value in that publisher?

Definitely. Especially with the Traffic Shaping that is running, where we're cutting down requests, the SSP in the end receives less traffic overall, but what they receive is basically the traffic they can transact on. That improves what they look at, like auction RPM, meaning how much revenue the SSP generates per million auctions, for example per partner, per ID.

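Taken literally as described, that metric is just revenue normalized per million auctions; a tiny sketch:

```ts
// Auction RPM as described: revenue generated per million auctions.
function auctionRPM(revenueUSD: number, auctions: number): number {
  return (revenueUSD / auctions) * 1_000_000;
}

// Example: $500 earned across 4,000,000 auctions gives an auction RPM of $125.
console.log(auctionRPM(500, 4_000_000)); // 125
```
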
That is, of course, improving heavily as we filter the traffic, optimize it, and cut out the stuff they don't care about.

Right. So the demand partner wins because they see more of the quality traffic that they want to see, and the publisher wins because they sell more of the traffic that they want to sell, and at a higher price. So ultimately, it's a rising tide, and all of the boats are floating on it. Right?

Yeah, although they don't necessarily see more inventory, because we can't just create new inventory; they primarily see that better portion of inventory.

So they would see more inventory if they were throttling that inventory down in the past, and then decided to turn more of it on because it's efficient?

Yeah. So then the SSPs are saying: oh, traffic coming from this publisher, through this server-side integration, is like 200 times cleaner than what we receive client-side or even on the open market. Then the SSP is of course inclined to run a lot more auctions for that publisher and to include more bidders in those auctions, especially if those predictions are already better than what they can do internally. And they can be, because we have more data available, a more holistic picture of what is happening on the publisher's end, and because the model doing the predicting can really focus on that one publisher instead of thousands or tens of thousands of publishers.

When you look at the competitive side, how many other companies are offering something a lot like this?

If you look specifically at Prebid server, a lot of companies offer it, and it has been around for many, many years. What makes a big difference with the Assertive Yield solution is that, on one end, we are not an SSP, and we also don't plug in any demand of our own. Everything we do is really about squeezing more out of what you already have, so there is no way we can make any kind of money in the back or otherwise; we are fully focused on taking what you have and optimizing it. On the other hand, there's the Machine Learning and the optimizations we have built over the years. That really goes back 4 or 5 years, in which we have focused and focused and focused on how to do this floor-price prediction, how to figure out the right floor price to set, what the right strategy is to use in different cases, where we should cut traffic, where we should not cut traffic, and so on.

Your methodology, the Assertive Yield methodology, which includes the really sophisticated Machine Learning, plus the fact that it's so easy to implement, makes it a no-brainer for publishers to put into place and give it a try. They'll find out within a few days whether or not it's right, and then they can turn it off or keep going. Is that about right?

Yeah, it's pretty much investing half an hour to a day, depending on how efficient it is to integrate something like this, similar to integrating a new bidder; letting it run for a few days; and analyzing the results. In the majority of cases, we should see a 10 to 15% overall lift, not just on Prebid, but overall.

So then, a lot of publishers out there today have been turning off bidders, turning off different demand partners, maybe because they're resellers, maybe because they just want a better score from carbon-rating agencies. In this case, it sounds like, with the Prebid server from Assertive Yield installed, they would be able to try out new demand partners and see if they make any sense. Is that right?

Yeah, that's right. That's of course a big question here. As we all know, they all send to the same DSPs in the end. Some of them have those relationships directly, some of them are reselling other SSPs, and in the end we are hitting the same SSP even multiple times: once directly, once through resellers, and another time through, say, Amazon TAM and Open Bidding, and everything else. That creates a lot of inflated inventory, which often results in the Traffic Shaping being more aggressive for that given seat. But on the other side, it also gives some opportunities in terms of getting through to the DSP, and getting past frequency capping and other things that act as barriers. So it's a double-edged sword, kind of.

So, even though in the past it has been a real problem for both buyers and sellers to have too many demand partners plugged in, with Prebid server and the Traffic Shaping that's available, since there's going to be some intelligence behind the choices that are made automatically, plugging in a different demand provider, one that may have something unique to offer, carries no risk anymore, because the demand providers are only going to see what the logic presents to them.

Yeah, we could plug in more or less as many as we want. Of course, there's still a soft limit we want to maintain, in terms of who we want to work with: how likely are they to pay us long term, how healthy is that company, how direct are the paths to the demand partners, what are the discrepancies, what are the payment terms, how quickly do they pay us, how often do they deduct things, and then advertising quality, how well do they filter the inventory? There are many, many layers we want to consider there. But where client-side, for example, we would put a limit at 5 or 8 bidders, or something in that range, with server-side it doesn't really matter; we could run the auction with 20, 30, or 40. The moment we use that many, though, we will have resellers in there, resellers that are hitting Pubmatic, and all of the other big SSPs, the same as we are hitting them directly. And at that point, we really want to make sure that we have some kind of Traffic Shaping in there, so that our score for this given domain, for this given seat, and so on doesn't become completely low value and end up way below the average.

That sounds fair. What's next, beyond the Prebid server?

For us, Prebid server is overall still a big focus point: still optimizing it, still growing it, and so on. By now, the Traffic Shaping, the flooring, and all of that is integrated quite well and working really well, but there are still steps to improve it further, and also to give the publisher more control over how they want to integrate it, and the ability to make changes themselves. Right now, we make a lot of the decisions for the publisher, and we can make adjustments for the publisher, but in the end we want the publisher to have the option to decide, for example: how clean do I want to be with the auction? What level of potential revenue loss am I willing to take in order to send half as many requests out? Do I want to be very aggressive with how we are throttling what we are sending out, to be more green, or do I want to only cut out what doesn't cost me any revenue? These kinds of levels of control, in combination, of course, with identity and other things.

Where would you say are some of the good successes? Who would you say is really delivering, using the Assertive Yield platform to the best of its ability?

That's a tricky question. (laughs) Of course, we have people on the customer-success side who have a lot of experience using it, and a lot of experience with what kinds of optimizations make sense and what outcomes to expect, so for them, it's easy to use it to almost its full potential. At a publisher, quite often it's used by multiple people. On one hand, we have product people using it, making technical changes, and measuring the impact of changes to the product, the website. On the other hand, we have the RevOps people figuring out, for each test and everything they're running, what the impact is and what makes sense.

Okay.

So it depends from publisher to publisher. Oftentimes, it's also a collaboration between the publisher and us: the publisher comes with ideas, we come with ideas, and we design a roadmap as to what should be tested at what point in time, what has the highest potential for uplift, and what makes sense to integrate at what step.

So, can you talk a little bit about the support that a publisher would get working with Assertive Yield?

In general, it depends on what the publisher is looking for. We don't do a full white-glove service in the sense of doing everything, including demand; we are really positioned to the point where we never bring demand, because we want to be fully unbiased. Some publishers do everything internally and only ask certain product questions when they're getting stuck. With other publishers, we have a call every single week, even multiple times a week, where they just look over the data and the performance and make the final decisions; in many cases, we even bring all of the recommendations on what to do.

Sounds like it's really easy to get started and really easy to get the most use out of it. I know you've got a big event tomorrow. What comments do you think you're going to get from publishers? What do you think is going to be their biggest concern? What are publishers going to be talking about tomorrow?

I think a lot of it is about the general state of the industry. It's no secret that, overall, the industry in general is shrinking. If you look at the Open Web, on one end, traffic from social media has been down for a long time already, and now search traffic is going down as well; the traffic sources those publishers can draw on are becoming far fewer. At the same time, it means that everyone who was already able to build a direct relationship with the reader is in a good position, but even for them, it's now more difficult to maintain that readership and to gain new readership. And the industry has already been through the Covid period and its aftermath: on one end, the Covid period was very good in terms of more traffic and more revenue, especially during the second half; on the other hand, with everything having gone back to reality, it has created an even bigger rift between reality and the state the publishers were operating at.

I think the big topic, as it is very imminent right now, is what the right steps are and what the right direction is for this entire industry, and how we as publishers position ourselves properly: on one end, to stay relevant and sizable, keeping the readership levels and volumes we have going as much as we can, while at the same time remaining able to deal with all of the changes happening to the industry. On one end, it's getting more and more difficult in terms of overall traffic and in terms of monetization. At the same time, we have these huge changes happening within the industry, with the cookie going away and all of the privacy-related topics and other parts of it, which are a real, fundamental change.

Do you have recommendations for publishers? If you were a publisher today, where would you be investing this money that Assertive Yield's Prebid server is going to make you? Where would you reinvest that money?

I think I would try to invest it in the part that improves the reader...

Experience?

Not necessarily experience,
but basically retention: making sure that these readers come back as much as they possibly can, and having the right loop mechanisms in place to recapture the readership. At the same time, seeing how we can remain relevant in driving new readers in, within a changing landscape where social traffic has become very difficult to nonexistent and where search traffic is potentially becoming very difficult to nonexistent as well. But probably the biggest part is making the entire organization as efficient and as flexible as it can be, so that as the changes come, and as we know where we are heading, we are able to jump on that wagon as quickly as we can and capture as much of it as possible: preparing to be as dynamic as possible in the face of the changes that are coming.

Yeah, it feels like there's quite a lot for any publisher to consider, at the same time that they're uncertain about what next year's revenues are going to bring. It's hard for them to consider what kinds of investments to make, how many people they can hire, whether they can hire anyone at all, and how they can reallocate some of the resources they currently have into new opportunities. I can imagine a lot of the conversation is going to be around what they're going to do. And among the people in the room, there are always a lot of conversations with people who are programmatically focused. Programmatic certainly isn't going away, but there's been a lot of conversation around Direct demand, especially when they see that, from the marketers and from the DSPs, there's a lot of interest in having more deals in place. So there's quite a lot for any publisher to consider. When you were a publisher, and that was not too long ago, you started a game site, right?

I was operating multiple online communities, kind of similar to Reddit, I would say. Some were in the gaming space, some were more about automotive, motorcycles, and so on; mainly those two verticals.

Was that fun? Was it fun being a publisher?

It was fun for me because I'm technical, and it was the time when Prebid basically came out, so there was a lot of stuff to figure out, integrate, invent, and build around it. But even at that point in time, we had the experience that, from one day to the next, one of our main sales houses was acquired and decided to keep everything in-house from then on in terms of all of the direct spend. For us, that meant a third, a quarter, in some cases even four-fifths of our revenue disappeared. It had a significant impact. Right now, of course, everything is a bit more predictable. We can already see the performance impact on Chrome traffic of the cookies going away, somewhere around 30%, which is way better than what it is on iOS. That is primarily because advertisers are just focusing on Android, and SSPs are optimizing for Android inventory, because they can capture more revenue from it than from iOS. The amount of iOS inventory a DSP is seeing is far less than what is actually on the market; it even makes it difficult for a buyer to buy iOS inventory. But will that 30% go back up the moment the cookie alternatives mature, or the cookie is fully gone? Or will it remain at that level? What will happen as traffic overall goes down: do prices go up, or will they remain the same? Will the market become irrelevant for some of the buyers? Will they say it's actually not even worth it to spend 10 to 20% on open-market programmatic? So, a lot of questions.

Yeah, but that's what makes this industry fun: there are always more problems to solve, and innovative people who can turn their brains to solving those problems. So it's really great that you're in the industry, because we really need more of that brainpower.

Thank you.

Thank you very much, Nils, it's been great having you here. Please tune in for our next episode, and please do subscribe, like, and write in the comments. Thanks very much.

Thank you very much, Tom. It was a pleasure.