Rotem Shaul, Author at Primis (https://www.primis.tech)

From Discovery to Stagnation: AI is Shaping the Future of Publishing
https://www.primis.tech/insights/from-discovery-to-stagnation-ai-is-shaping-the-future-of-publishing/
Sun, 08 Sep 2024 13:02:09 +0000

In the publishing and advertising industries, there’s growing concern about the potential threats AI poses to publishers, news, and similar sectors. Many worry about implications for the free press and job security, rallying against these perceived threats. While these concerns are valid, the conversation needs a broader perspective.

Concerns vs. Reality

Free Press

AI doesn’t inherently threaten the concept of a free press. The fundamental right to publish and express opinions remains intact; what changes is the audience size. AI may reduce how often people consume traditional news sources, but the essence of a free press persists. Even if people visit news sites half as often and half of those sites shut down, enough sites will remain to cover every important story. Before the internet, we consumed news via television, newspapers, and radio, with far fewer options, yet the concept of a free press was just as strong.

[Chart: Number of Journalists in the US, 1950s to 2020s]

Job Losses and Historical Context

The fear of job losses due to AI is real but mirrors fears from past technological advancements, fears that ended up not materializing in most cases. In the 18th century, 80% of the world’s population worked in agriculture. Today, that number is around 5%, yet we don’t face 75% unemployment. Technological improvements in agriculture freed up labor for more productive and diverse occupations.

While the 18th-century farmer’s concern about world famine was genuine, it stemmed from self-interest bias. Today, the publishing and advertising industry’s fears about AI could be similarly biased, rooted in the desire to preserve the status quo.

Even if the free press remains intact and job losses ultimately add efficiency to the market, there is a larger concern.

Content Discovery vs. Definitive Answers

Progress is driven by the continuous exchange of ideas. Millions contribute to this process, with evolution and innovation created by passing ideas back and forth. To exchange ideas, one needs to be exposed to ideas, adopt opinions, and discuss.

AI provides a single, definitive response, short-circuiting the natural discovery process we’re accustomed to. We often discover new content without realizing it, like reading various restaurant menus over the years versus asking an AI for a single recommendation. While an AI’s answer might be convenient, over time we learn more about food by reading menus and trying new things, even dishes we didn’t like. This process of discovery and experimentation helps us evolve in ways that wouldn’t happen in a one-answer world. The same logic applies to discovering things on the web.

Search Engines

Search engines encourage users to explore multiple sources. On average, users visit 2-3 websites per search before finding the information they need. When including refinements and modifications to the search query, this number increases, potentially visiting 4-6 websites in total.

A personal experience with cocktail making: I used to search for recipes on Google, exploring multiple links. Some sites would add unique suggestions for upgrading a Margarita or an Old Fashioned. I evolved. Lately, I’ve been relying on ChatGPT and got a single, definitive recipe. While the cocktails are still great, my learning and creativity have plateaued. The shift from discovery to passive consumption limited my personal growth; I explore less than I used to.

When you apply the same concept to everything we search for, not just cocktails, relying on a single, definitive source of information prevents you from learning, testing, and forming independent opinions. The less people discover content, the slower new ideas will be assimilated.

Discovery Within Websites

Websites are built for discovery. You go on a news site to learn the latest news about local politics, and while scrolling the homepage, you get exposed to news about climate change, celebrity gossip, the latest in sports, and so much more. The most successful sites are those that have perfected keeping their users reading and discovering more.

The same applies to almost every category. Surfing a recipe site to get one recipe, you get exposed to others. You might not cook those other recipes right away, but those who love cooking save them, try them later, or simply learn something new.

The internet, whether it’s search engines or sites, is built around helping users discover more and more content. As they discover more content, opinions are shaped, skills improved, and ideas developed.

Frequency

The first time I heard about climate change, I wasn’t really bothered. It took a lot of time for my mind to digest the threat. The pace of change is related to frequency—how often and how many times I was exposed.

When addressing frequency from a content discovery point of view, it’s more than just repeating the message; exposure alone isn’t enough. Users need to spend real time consuming it, and that requires effort. Let’s look at this article as an example:

  • The Writer Matters: Most people didn’t make it to this part; some left because they don’t relate to my writing style. The same people, hearing the same opinion from someone they enjoy reading, would have read everything and maybe adopted some of the ideas I shared. Different people saying the same things differently results in more readers digesting new information.
  • Timing Matters: Reading this article takes a lot of time, meaning that if I “met” you at the wrong time, you wouldn’t have finished it. If others share the same content repeatedly, the chances of some writer “meeting” you at the right time increase, resulting in more people reading this perspective.
  • Counterargument: Many of you might not agree with me; some might want to prove me wrong by writing a counterargument. By doing so, they add another layer to the discussion. Fewer people discovering content will result in fewer pushbacks and counterarguments.

The Importance of Publishing

As sites receive less traffic, people will discover content less frequently, diminishing the crucial repetitive effect needed to drive any message. Additionally, we lose the diversity of people reading the same message in different styles, leading to fewer counterarguments and debates as exposure decreases.

AI as Counter-Technology to Historical Advancements

Technological advancements have historically helped people discover more information. Long ago, there was usually one source: a king, a church, or a rabbi, allowing only one point of view. New ideas took a long time to emerge. Newspapers provided 2-3 different points of view daily. Then came radio and TV, offering 10-15, and later hundreds of channels and opinions in real time. The internet let us discover much more, giving access to multiple perspectives. Each new technology cannibalized the old ones by being more efficient at exposing information: the internet replaced newspapers with easier access to content, and smartphones made it even easier than desktops. Every advancement resulted in more information; advancement equaled quantity.

AI, in the context of spreading information, does exactly the opposite. For the first time ever, we have a technological advancement that reduces the amount of information people are exposed to. This is a huge setback; as people discover fewer opinions, we revert to historical times when there was one source of information. If AI had existed long ago, its models would have been trained on the information available at the time and would answer with things like:

  • Women shouldn’t vote because they don’t have the same IQ as men.
  • Blacks were created to serve whites.
  • Gay people shouldn’t be accepted in our society.

And as in old times, we risk making new ideas surface more rarely and with more difficulty. You can’t debate the king, you can’t debate God (pope, rabbi), and you can’t debate AI. Stagnation in human progress is the risk we are facing.

The State of Publishing and Progress

The free press is not in existential danger. Even if half of all news sites lost traffic or closed, sufficient coverage would still exist, maintaining the legal right to criticize the government or anyone else. However, the state of publishing reflects global discovery and learning, from cocktail tips to diverse news perspectives. Reduced exposure to varied opinions can slow human progress and innovation.

The number of active content sites and related jobs can serve as a rough measure of how much new information people are accessing. Each site, even a small one, represents individuals searching, reading, and learning. Every site that shuts down signifies a shift towards passive consumption of fixed answers.

While definitive answers offer convenience, AI must enhance our pursuit of knowledge, not limit it. The state of publishing is our crystal ball, predicting the pace of future progress. A healthy publishing industry, with thriving publishers, signifies a brighter future, not just for those working in publishing and advertising.

Publishers – Shut Up This Time, or Don’t Blame Google Next Time
https://www.primis.tech/insights/publishers-shut-up-this-time-or-dont-blame-google-next-time-2/
Mon, 29 Jul 2024 14:00:18 +0000

Overcoverage Leads to Overreaction

Publishers hold immense power and influence. When they push a narrative, it quickly becomes the talk of the town. When many publishers align on the same agenda, the impact is huge.

Over the last decade, mainstream news sites, general interest magazines, financial journals, and TV channels have extensively covered internet privacy. Everyone loves going after giants, and attacking big platforms like Google brings traffic. Additionally, privacy issues are closely tied to the publishing world, leading to biased decision-making regarding the topic’s importance. In my opinion, the topic, though important, received far more coverage than its actual impact warranted.

This surge in privacy-related content amplified users’ concerns, leading to significant industry shifts as users worldwide demanded more privacy. This pressure led companies like Apple, Mozilla, and others to eliminate third-party cookies from their browsers. It also made Google contemplate doing the same with Chrome.

Browser Market Share

Chrome’s market share has seen a year-over-year decline. While some of this decline can be attributed to the natural growth of Safari, driven by the increasing popularity of Apple devices, in my opinion, privacy concerns have also eroded Chrome’s user base. This is evident when examining Chrome’s shrinking market share on non-Apple devices. The market share of privacy-focused browsers like Brave and DuckDuckGo climbing from near 0% to 2.7% further highlights the growing user demand for enhanced privacy features.

While Chrome is a free product, Chrome users are valuable to Google, and losing users directly impacts Google’s revenue.

Legislation and Regulation Heat

Simultaneously, legislators, seeing the demand for privacy, stepped in and created regulations like GDPR (General Data Protection Regulation) and CCPA (California Consumer Privacy Act). More countries, such as Brazil with its LGPD (Lei Geral de Proteção de Dados) and Canada with its PIPEDA (Personal Information Protection and Electronic Documents Act), are adding their own regulations. Additionally, more U.S. states, including New York, Virginia, and Colorado, are enacting their own privacy laws. These regulations further pressured companies, including Google, to prioritize user privacy.

Moreover, Google has faced additional legislative pressure regarding privacy issues, with various governments scrutinizing their data collection and usage practices. This has led to investigations and fines in several jurisdictions, pushing Google to adopt more stringent privacy measures to comply with the evolving regulatory landscape.

“Show me the incentive and I’ll show you the outcome.” – Charlie Munger

Google is between a rock and a hard place, stuck between two powerful incentives. On one hand, they are losing browser market share and facing increased scrutiny from users and legislators demanding better privacy protections. On the other hand, they have a significant stake in the advertising world and the substantial revenue associated with it. This dual pressure explains why, unlike other browsers, Google has been seeking a solution that would eliminate cookies without significantly harming advertisers.

Earlier this week, Google announced their decision to halt their plans to kill third-party cookies. It seems that the revenue from online advertising presented a more substantial incentive than the potential market share loss from not aligning with other browsers in eliminating cookies. This decision highlights the challenging balance Google must strike between adhering to privacy demands and maintaining its advertising revenue.

Overreaction and Self-Responsibility

The immense pressure from publishers’ coverage led people globally to demand more privacy. Google capitulated to this pressure, only to be blamed by the same publishers for wanting to harm them. Google did not want to kill cookies; they caved to the demands. If publishers apply too much pressure again, the cycle will repeat. This time, however, Google might not be able to backtrack. Excessive pressure could lead to more stringent regulations, reduced innovation, and a less balanced digital ecosystem.

Despite what this article’s clickbaity title suggests, I believe privacy is important and should be covered. However, publishers should be mindful that overcoverage can unintentionally cause overreaction. They must ensure they don’t give this topic more share of voice than it deserves, whether due to bias towards a topic associated with the publishing world or because of the allure of attacking big tech giants.

It’s crucial for all stakeholders to consider the long-term implications of their actions and strive for a balanced approach that addresses privacy concerns without stifling progress or disproportionately targeting specific entities. Responsible coverage is key to maintaining a balanced and informed public discourse.

For Transparency Into Resellers, Ads.txt Needs To Be More Like Sellers.json
https://www.primis.tech/insights/ads-txt-needs-to-be-more-like-sellers-json/
Thu, 03 Aug 2023 10:04:55 +0000

According to the IAB Tech Lab, ads.txt was supposed to be “a simple, flexible, and secure method that publishers and distributors can use to publicly declare the companies they authorize to sell their digital inventory.”

But, in practice, ads.txt has proven difficult for publishers to manage and maintain, leading to outdated and inaccurate entries and misrepresentations of reseller relationships.

The current model for ads.txt isn’t helping the industry understand how publishers are working with sellers and resellers. Instead, to improve transparency into these relationships, we need an ads.txt version 2.0 that uses JSON notation.

Ads.txt Introduces Unnecessary Complexity

When ads.txt was launched in 2017, our industry wasn’t as transparent as it is now. Instead of publishers declaring sellers, ads.txt required them to declare exchanges and seat IDs.

As a result, the ads.txt structure looks like this:

advertising system domain, seller account ID, relationship

For example, an ads.txt entry for PubMatic might look like this:

pubmatic.com, 123456, reseller

But it is challenging for publishers to understand what the IDs stand for and whom they are authorizing to sell their inventory, which creates confusion.
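To see why the entries are opaque, here is a minimal, illustrative sketch of parsing ads.txt lines; it is not a production validator, and real ones also handle the optional certification-authority ID field:

```python
def parse_ads_txt(text):
    """Parse ads.txt lines of the form: ad system domain, seller account ID, relationship."""
    entries = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) < 3:
            continue  # skip malformed lines
        entries.append({
            "ad_system": fields[0].lower(),
            "seller_id": fields[1],
            "relationship": fields[2].upper(),
        })
    return entries

entries = parse_ads_txt("pubmatic.com, 123456, reseller")
# Note the opaque account ID: nothing in the file tells the publisher who "123456" is.
```

Even a correct parse leaves the publisher staring at an anonymous ID, which is exactly the confusion described above.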

According to Jounce Media, the average publisher in the top 10K cohort authorized 205 supply paths in early 2020. By late 2022, that average tripled to 622, and it continues to grow.

According to Primis’ Sellers.guide, the average publisher lists 69 sellers in ads.txt while claiming to work with just 20 sellers. And publishers average about 26 companies in their ads.txt file that claim to be “direct” sellers. This lack of clarity leaves buyers uncertain about who actually represents the inventory they are bidding on.

Looking to Sellers.json

To improve transparency into reseller relationships, we need an ads.txt version 2.0 that uses the JSON notation instead.

In a post-sellers.json world, we can build better ads.txt protocols to improve efficiency and transparency. For example, a seller’s entry could look like this:

{
  "seller_name": "sellerX.com",
  "relationship": "reseller",
  "paths": ["google.com", "openx.com", "pubmatic.com", "rubiconproject.com", "triplelift.com"]
}

The IDs will be mapped by DSPs and ad tech platforms. A DSP will need to look up sellerX.com in the sellers.json files of those platforms, save the IDs, and know that when it gets an ad request from OpenX via seat ID 12345678, the request is coming from sellerX.com acting as a reseller.
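A hypothetical sketch of that lookup; the sellers.json fragment, the seat ID, and the function name here are all invented for illustration:

```python
# A simplified fragment of an exchange's sellers.json file (hypothetical data).
sellers_json = {
    "sellers": [
        {"seller_id": "12345678", "name": "sellerX.com", "seller_type": "INTERMEDIARY"},
    ]
}

def resolve_seat(sellers_json, seat_id):
    """Map an incoming bid request's seat ID back to a seller name and type."""
    for seller in sellers_json["sellers"]:
        if seller["seller_id"] == seat_id:
            return seller["name"], seller["seller_type"]
    return None, None  # unknown seat: a red flag worth investigating

name, seller_type = resolve_seat(sellers_json, "12345678")
# The DSP now knows the request comes from sellerX.com acting as an intermediary.
```

The point is that this mapping burden moves to the DSPs, who already consume sellers.json, instead of sitting with publishers.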

Publishers would have a window into seeing who really has access to sell their inventory, which will help protect them from hidden sellers and misrepresented relationships.

With this proposed approach, publishers will no longer need to use 622 ads.txt lines to represent 69 sellers. They would only need to maintain an ads.txt file with 69 lines – one for each seller.

And there are other benefits as well. Having 69 lines with sellers’ names attached to them, instead of over 600, will make removing sellers easier for publishers.

Additionally, if a company wants to piggyback on another company’s lines, it will be easier for a publisher to notice and ask sellerX why it is sending ads.txt lines for sellerY.

Today, different exchanges often use different names for the same company. But a new ads.txt model that uses sellers.json’s naming method will eradicate this problem.

A Better Framework for All 

Publishers often don’t have the tools or resources to effectively manage reseller relationships in a complex, highly technical setting. By allowing publishers to name companies instead of exchanges and seat IDs, we pass much of the technical responsibility of managing these relationships to ad tech companies that are more equipped to handle them.

Changing the infrastructure of ads.txt is extremely challenging and will require an industrywide commitment. But a big, one-time investment that results in better efficiency and trust for years may be worth it.

Previously published on AdExchanger’s “The Sell Sider” column.

Advertising Still Hasn’t Solved Its (Lack of) Transparency Problem
https://www.primis.tech/insights/advertisings-lack-of-transparency/
Sun, 21 Mar 2021 15:00:59 +0000

In 2017, Marc Pritchard, chief brand officer at P&G, the largest advertiser in the world, took to the stage at the annual IAB leadership meeting and said P&G would no longer engage with the industry’s “crappy media supply chain.”

The media-buying process is opaque and riddled with fraud, Pritchard said, and P&G demanded agencies, ad tech vendors and publishers subject themselves to third-party measurement and a universal viewability standard. If they failed to comply, P&G would pull its ad spend.

The industry scrambled to develop transparency solutions in the immediate wake of Pritchard’s comments. (People tend to listen when a marketer threatens to pull $7 billion in ad spend.) Among the proposed solutions were ads.txt and sellers.json.

Ads.txt was created in the summer of 2017 by the IAB to help publishers exert more control over their advertising inventory. Short for Authorized Digital Sellers, ads.txt limited a publisher’s inventory to a select list of trusted ad tech vendors.

Sellers.json, released in 2019 by the IAB, was a similar technology aimed at buyers. Using sellers.json, marketers could see the complex web of parties involved in buying, reselling, and eventually publishing their ads.

The technologies are very well-intentioned and well-designed, from a technology point of view — they both attempt to identify all of the intermediaries involved in placing an ad buy, and to map the often circuitous route an ad takes on its way from brand to publisher.

The problem with ads.txt and sellers.json is that they’re too damn complicated. Ads.txt and sellers.json files are meant to include a simple list of all accredited resellers. Instead, these files have become overloaded with extraneous information: a wall of text as complicated and indecipherable as the ad tech marketplace itself.

This complexity is why marketers have not utilized ads.txt and sellers.json to the fullest extent. Because some of the files contain thousands of lines, many marketers struggle to understand them, and others reject the idea of working with resellers entirely because they fear what they don’t understand. The net result is that the transparency initiatives are not transparent enough for the people they are meant to serve.

Meanwhile, fraudsters learned how to manipulate the system by taking advantage of the limited understanding of the people who operate it. Last year, consulting firm PwC found that just 26% of marketers’ programmatic ad spend ends up as viewable ad impressions. A separate PwC study found that 15% of ad spend vanishes into an “unknown delta.” Only 51% of ad spend actually makes its way to publishers, and of that 51%, about half is wasted on fraud and non-viewable impressions.

The worst part about the lack of transparency is that it prevents marketers from spending more on digital. If brands and agencies had a better idea of where their money was going and were confident their ad spend wasn’t being wasted, ad spend would increase and all parties would benefit.


The industry needs new tools — ones that enable ads.txt and sellers.json to live up to their full potential of streamlining the transparency process, instead of overwhelming marketers with inscrutable technical solutions. The tools need to be so easy that they are just as useful to the CMO and entry-level agency buyers as they are to data analysts and the engineers building the code.

Transparency means total visibility into every stage of the media supply chain. But transparency also means that everyone is able to understand what’s going on. Until we achieve that, the media supply chain will remain just like it was four years ago: crappy.

Show Me the Incentive, I’ll Show You the Outcome: Will Google Follow in Apple’s Footsteps?
https://www.primis.tech/insights/will-google-follow-apples-footsteps/
Thu, 15 Oct 2020 11:52:25 +0000

Following years of Safari changing the way it treats user privacy, Apple announced that its upcoming operating system, iOS 14, will only share the IDFA (Identifier for Advertisers) when users opt in. Some say this will be the deathblow for personalized advertising on Apple devices, as fewer than 5% of users are expected to opt in. Criticism of this move has ranged from accusing Apple of wanting to control the publishing world or give its own ad network an unfair competitive advantage, to a desire to harm Facebook and Google.

Here’s a different theory for why Apple would make such an aggressive decision. Charlie Munger once said, “Show me the incentive, I’ll show you the outcome.” To understand Apple’s incentive, we need to better understand who their clients are. Apple’s Q2 2020 report shows that hardware sales accounted for 77% of Apple’s revenue, while services accounted for the remaining 23%.

[Chart: Apple Q2 2020 revenue breakdown]

Apple’s services segment comprises many products – Apple Music, the App Store, iCloud, iTunes, Apple Books, Apple Pay, AppleCare, licensing, and more. In terms of revenue, the only product related to advertising is the App Store, but even here, most of the revenue is driven by app downloads that come from search or in-app purchases, not from paid advertising.

Once you understand that Apple’s revenue has little to do with advertising, figuring out an alternative agenda for the IDFA and Safari changes becomes easier. Apple’s clients are the end users buying hardware and services, so the incentive is to please them and gain more market share by answering the trend of users demanding more privacy. Taking advertising dollars from the market can be a nice upside, but it is not the main agenda.

Apple blocking dozens of third-party trackers also earns it trust from its already loyal base of consumers. Those consumers won’t mind Apple using first-party data as long as Apple is the sole company with access and commits to not sharing this data with ad tech companies they have never heard of.

Apple’s incentive to gain market share, along with recent privacy trends, will result in more privacy initiatives in the Apple ecosystem, regardless of what Tim Cook’s company decides to do in the ad space. This brings us to the elephant in the room: how Google will react.

The Google Angle

Google has a diverse revenue stream, so it needs to balance more types of clients and incentives – brands, agencies, publishers, O&O assets (YouTube, Search, etc.), Android, and, finally, the end user.

Because there are more types of incentives, analyzing the outcome becomes harder. This is why I believe Google hasn’t made up its mind about what to do. It releases ideas to the market to see how the public and the industry will react, while buying time for new technologies and trends to emerge.


My prediction is that Google’s incentive map will change and Google will do the same thing Apple is doing. As time goes by, we will see two trends: growing pressure from users demanding more privacy, combined with promising technology currently being developed to support a cookieless ad ecosystem. Those two trends will shift the incentive map enough that it will be in Google’s best interest to do the same thing Apple did and is still doing.

Conclusion

Looking at the incentive map, it is easier to understand that it was in Apple’s best interest to kill third-party cookies and IDFA sharing. As a company that mainly sells hardware and user experience, its clients pay extra to enjoy Apple’s walled-garden experience. Apple might use this opportunity to gain a new ad revenue stream, but that upside is small and won’t be the main incentive for its actions.

Google seems to be stalling, as the balance between privacy and revenue is not yet decisive. As time goes by, the threat to the advertising part of the business will grow smaller and the potential harm to the user part of the business will grow larger. This pincer movement will change the incentive map for Google, and it will ultimately be in its best interest to do what Apple is doing now. When it does, it won’t be to control publishing, fight Facebook, or any other conspiracy. It will be because changing incentives drive changing outcomes.

Is Ad Tech Consolidation Killing Programmatic?
https://www.primis.tech/insights/ad-tech-consolidation-killing-programmatic/
Thu, 28 Mar 2019 13:48:07 +0000

As featured on AdExchanger

Programmatic advertising has grown by hundreds of percentage points, and as the industry has matured, market consolidation has commenced.

Many vendors are too small and cannot compete with the industry’s maturing tech and big players’ data. Others are in an M&A frenzy to acquire better technology, sales efficiency, or data to position themselves better in the market.

From January 2016 to April 2018, advertisers reduced the average number of demand-side platforms (DSPs) they work with from 7.1 to 4.2, a 40% decline, according to eMarketer.

For some, the trend can seem natural, but there is a big downside for the whole programmatic industry and for DSPs: the wave of M&As is slowly annihilating the programmatic industry from within.

There are thousands of brands competing in online ad auctions, but somehow supply-side platforms (SSPs) get fewer and fewer competing bids as the years go by. This is not because there are fewer buyers that want the inventory – it’s mostly because the number of DSPs has decreased over time.

As a result, the real value of each impression is no longer set by the true supply and demand out there; it is heavily affected by the number of DSPs that represent the demand.

The Auction

To demonstrate this, we need to better understand the Vickrey auction, also known as a second-price auction, in which the highest bidder wins but pays the second-highest bid.

[Figure: How the market is supposed to work]

In a perfect world, the brands would compete against each other, and brand A would win the impression and pay $8. However, in a consolidated world, the scenario is increasingly becoming like this:

[Figure: How the market actually works]

The impression sold for $3, because the SSP set a floor. If the publisher used a $9 floor, it would have tripled its income. Setting floors on a whim is not a recipe for an efficient market.
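The dynamic above can be sketched as a toy second-price auction in Python; the numbers mirror the figures and are illustrative only, not any real exchange’s logic:

```python
def second_price(bids, floor=0.0):
    """Toy second-price auction: the highest bidder wins and pays the
    second-highest bid, or the floor if there is no competing bid."""
    valid = sorted((b for b in bids if b >= floor), reverse=True)
    if not valid:
        return None  # no bid cleared the floor
    return valid[1] if len(valid) > 1 else floor

# Many competing DSP bids: the clearing price tracks real demand.
competitive = second_price([10, 8, 5])      # brand A wins and pays 8
# After consolidation, one DSP carries all the demand as a single bid,
# so the price collapses to whatever floor the SSP happened to guess.
consolidated = second_price([10], floor=3)  # same demand, but pays only 3
```

The seller’s revenue now depends on how well the floor was guessed, not on how much buyers actually value the impression.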

The waste is huge. SSPs and publishers build teams and technologies to predict the best floors to use, attempting to find the sweet spot that simultaneously optimizes fill rates and CPMs.

With hundreds of thousands of domains selling billions of impressions to tens of thousands of buyers, one would expect that the market would support itself by engineering an auction that helps determine the true value of each impression, without people manually setting floors and investing in prediction algorithms to find the right floors.

First-Price Auction?

In a first-price auction, buyers know that if everyone bids their highest on a desirable impression, it might work out well for that one impression, but in the long term all buyers will lose. Buyers will have to adjust their bids and behavior to the market and not always bid the most they can.


So with first-price auctions, buyers will spend as much effort determining how much to bid as sellers spend identifying optimal price floors. We are not solving the problem – we’re just passing it on to someone else.

Solving the Problem

One easy engineering solution that would dramatically increase competition is to break down DSPs’ bid activity by trading desk, so that each trading desk gets its own seat ID and bids separately, even while using the same technology provider. An SSP can then send a bid request to each instance and get different bids:

[Figure: How the market can work]

As shown above, if the DSPs created different seat IDs for each trading desk, the market would get a better view of the true demand and supply. There is no reason why all trading desks should compete under the same bid as they do now.

The setup should also be quite simple – no need to change bidding algorithms or integrations.
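A toy illustration of the proposal, with hypothetical desk names and numbers, showing how per-desk seat IDs change what a second-price auction sees:

```python
def clearing_price(bids, floor):
    """Second-highest bid, or the floor when only one bid arrives.
    (Comparing the second price against the floor is omitted for brevity.)"""
    ordered = sorted(bids, reverse=True)
    return ordered[1] if len(ordered) > 1 else floor

desk_bids = {"deskA": 10, "deskB": 8, "deskC": 5}  # three desks on one DSP

# Today: the DSP effectively runs its internal auction first and forwards
# one bid, so the exchange's auction clears at the guessed floor.
today = clearing_price([max(desk_bids.values())], floor=3)    # clears at 3

# Proposed: each desk bids under its own seat ID, so the exchange sees
# the full demand and clears at the true second price.
proposed = clearing_price(list(desk_bids.values()), floor=3)  # clears at 8
```

Same demand, same technology provider; only the seat granularity changes, and the clearing price moves back toward real market value.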

The fact that people are optimizing floors manually and companies are building tech to predict the best floors per geo, device, etc., should be a wake-up call for everyone. The open market is not doing its job; consolidation is here to stay, and we need to change the bidding mechanism so the market functions properly. Shifting the problem from the seller to the buyer doesn’t solve anything.

We need to work together and find a better solution that will automatically determine a fair value for each impression in a way that represents the true demand and supply, as seen by trading desks, brands and publishers.