Contact Info
Amin Sayedi
Tepper School of Business
Ph.D. Program
Carnegie Mellon University
5000 Forbes Avenue
Pittsburgh, PA 15213
Email: ssay...@cmu.edu

Phone: (412) 268-2295


Papers Under Review and Working Papers


Competitive Poaching in Sponsored Search Advertising and Strategic Impact on Traditional Advertising, joint with K. Jerath and K. Srinivasan [pdf]
Working Paper

An important decision for a firm is how to allocate its advertising budget among different types of advertising. Most traditional channels of advertising, such as advertising on television and in print, serve to build consumer awareness of and desire for the firm's products. With recent developments in technology, sponsored search (or paid search) advertising at search engines in response to a keyword searched by a user has become an important part of most firms' advertising efforts. An advantage of sponsored search advertising is that, since the firm advertises in response to a consumer-initiated search, it is a highly targeted form of communication and the sales-conversion rate is typically higher than in traditional advertising. However, a consumer searches for a specific product or brand only if she is already aware of it through previous awareness-generating traditional advertising efforts. Moreover, competing firms can use sponsored search to free-ride on the awareness-building efforts of other firms by directly advertising on their keywords and thereby "poaching" their customers; anecdotal evidence shows that this is a frequent occurrence. In other words, traditional advertising builds awareness, while sponsored search is a form of technology-enabled communication that targets consumers in a later stage of the purchase process, which induces competitors to poach these potential customers.
Using a game theory model, we study the implications of these tradeoffs on the advertising decisions of competing firms, and on the design of the sponsored search auction by the search engine. We find that symmetric firms may follow asymmetric advertising strategies, with one firm focusing on traditional advertising and the other firm focusing on sponsored search with poaching. Interestingly, the search engine benefits from discouraging competition in its own auctions in sponsored search advertising by shielding firms from competitors' bids. This explains the practice of employing "keyword relevance" measures, under which search engines such as Google, Yahoo! and Bing under-weight the bids of firms bidding on competitors' keywords. We also obtain various other interesting insights on the interplay between sponsored search advertising and traditional advertising.
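A minimal sketch of the kind of asymmetric-equilibrium outcome described above, using purely hypothetical payoff numbers (the payoffs are illustrative placeholders, not quantities derived from the paper's model): each firm chooses between focusing on traditional advertising (T) or on sponsored search with poaching (P), and we enumerate pure-strategy Nash equilibria by checking unilateral deviations.

# Hypothetical 2x2 advertising game: strategies are
# T = focus on traditional advertising, P = focus on sponsored search / poaching.
# Payoffs are illustrative placeholders, not derived from the paper's model.
strategies = ["T", "P"]
payoffs = {
    ("T", "T"): (4, 4),
    ("T", "P"): (5, 6),   # firm 2 free-rides on firm 1's awareness building
    ("P", "T"): (6, 5),
    ("P", "P"): (2, 2),   # nobody builds awareness, both poach
}

def is_pure_nash(s1, s2):
    u1, u2 = payoffs[(s1, s2)]
    # Neither firm should gain by unilaterally switching strategies.
    best1 = all(payoffs[(d, s2)][0] <= u1 for d in strategies)
    best2 = all(payoffs[(s1, d)][1] <= u2 for d in strategies)
    return best1 and best2

equilibria = [(s1, s2) for s1 in strategies for s2 in strategies if is_pure_nash(s1, s2)]
print(equilibria)  # -> [('T', 'P'), ('P', 'T')]: symmetric firms, asymmetric strategies

With these placeholder payoffs, the only pure equilibria are the two asymmetric profiles, mirroring the qualitative finding above that symmetric firms may follow asymmetric advertising strategies.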




Exclusive Display in Sponsored Search Advertising, joint with K. Jerath [pdf]
Revision Invited to Marketing Science

As sponsored search becomes increasingly important as an advertising medium for firms, search engines are exploring more advanced bidding and ranking mechanisms to increase their revenue from sponsored search auctions. For instance, Google, Yahoo! and Bing are investigating auction mechanisms in which each advertiser submits two bids: one bid for the standard display format in which multiple advertisers are displayed, and one bid for being shown exclusively. If the exclusive-placement bid by an advertiser is high enough then only that advertiser is displayed, otherwise multiple advertisers are displayed and ranked based on their multiple-placement bids.
We call such auctions two-dimensional auctions and study two extensions of the Generalized Second Price (GSP) mechanism that are currently being evaluated by search engines. We show that allowing advertisers to bid for exclusivity always generates higher revenues for the search engine. Interestingly, even when the final outcome is multiple display, the search engine still extracts higher revenue, simply because the option to bid for exclusivity increases competition among advertisers. In fact, one of the auctions we consider can extract the full surplus of the bidders as search engine revenue under certain conditions. Under other conditions on the advertisers' valuations for exclusivity and the heterogeneity of those valuations, exclusive-display auctions can benefit the advertisers as well.
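A small sketch of the allocation rule described in the previous paragraph. The exact meaning of an exclusive bid being "high enough" is not specified above, so the threshold used here (best exclusive bid versus the sum of the multiple-placement bids) is an assumption for illustration only.

# Illustrative allocation rule for a two-dimensional auction.
# Each advertiser submits (multi_bid, exclusive_bid).
# The threshold below (exclusive bid vs. sum of multi bids) is an assumption
# for illustration; the abstract only says "high enough".
def allocate(bids):
    """bids: dict advertiser -> (multi_bid, exclusive_bid)."""
    best_excl = max(bids, key=lambda a: bids[a][1])
    total_multi = sum(m for m, _ in bids.values())
    if bids[best_excl][1] >= total_multi:
        return ("exclusive", [best_excl])
    ranking = sorted(bids, key=lambda a: bids[a][0], reverse=True)
    return ("multiple", ranking)

print(allocate({"A": (2.0, 9.0), "B": (3.0, 4.0)}))   # ('exclusive', ['A'])
print(allocate({"A": (2.0, 4.0), "B": (3.0, 4.0)}))   # ('multiple', ['B', 'A'])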




Publications


A Near Pareto Optimal Auction with Budget Constraints, joint with I. Hafalir and R. Ravi [pdf]
Forthcoming at Games and Economic Behavior

In a setup where a divisible good is to be allocated to a set of bidders with budget constraints, we introduce a mechanism in the spirit of the Vickrey auction. In the mechanism we propose, understating budgets or values is weakly dominated. Since the revenue is increasing in budgets and values, all equilibrium deviations from true valuations turn out to be beneficial to the auctioneer. We also show that the ex-post Nash equilibrium of our mechanism is near Pareto optimal in the sense that all full winners' values are above all full losers' values.
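A short sketch of the near-Pareto-optimality condition quoted above, not of the mechanism itself. For illustration it treats any bidder with a positive allocation as a "full winner" and any bidder with a zero allocation as a "full loser"; the paper's precise definitions may differ.

# Checker for the near-Pareto-optimality condition quoted above.
# Assumption for illustration: a "full winner" is taken to be a bidder with a
# positive allocation and a "full loser" one with zero allocation; the paper's
# precise definitions may differ.
def near_pareto_optimal(values, allocation):
    """values, allocation: dicts bidder -> value, bidder -> share of the good."""
    winner_values = [values[b] for b in values if allocation.get(b, 0) > 0]
    loser_values = [values[b] for b in values if allocation.get(b, 0) == 0]
    if not winner_values or not loser_values:
        return True
    return min(winner_values) >= max(loser_values)

values = {"b1": 10, "b2": 7, "b3": 3}
print(near_pareto_optimal(values, {"b1": 0.6, "b2": 0.4, "b3": 0.0}))  # True
print(near_pareto_optimal(values, {"b1": 0.6, "b2": 0.0, "b3": 0.4}))  # False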


We Know Who You Followed Last Summer: Inferring Social Link Creation Times in Twitter, joint with C. Borgs, J. Chayes, B. Karrer, B. Meeder and R. Ravi [pdf]

Understanding a network's temporal evolution appears to require multiple observations of the graph over time. These often expensive repeated crawls can only answer questions about what happened from observation to observation, not about what happened before or between network snapshots. Contrary to this picture, we propose a method for Twitter's social network that takes a single static snapshot of network edges and user account creation times and accurately infers when these edges were formed. The method can be exact in theory, and we demonstrate empirically that, for a large subset of Twitter relationships, it is accurate to within hours in practice. We study users who have a very large number of edges or who are recommended by Twitter. We examine the graph formed by these nearly 1,800 Twitter celebrities and their 862 million edges in detail, showing that a single static snapshot can give novel insights into Twitter's evolution. We conclude from this analysis that real-world events and changes to Twitter's interface for recommending users strongly influence network growth.
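A hedged sketch of the kind of single-snapshot inference described above. It assumes (this is an assumption, not something stated in the abstract) that the snapshot lists each user's followings in the order the follow edges were created, together with every account's creation time; an edge to an account created at time t cannot predate t, so scanning the list yields non-decreasing lower bounds on edge creation times.

# Illustrative sketch of bounding edge-creation times from a single snapshot.
# Assumption for illustration (not spelled out in the abstract): the snapshot
# gives each user's followings in the order the follow edges were created,
# plus every account's creation time.
def edge_time_lower_bounds(following_in_creation_order, account_created):
    """Return a lower bound on the creation time of each follow edge.

    An edge to an account created at time t cannot predate t, and edges
    appear in creation order, so the lower bounds are non-decreasing."""
    bounds = []
    current = float("-inf")
    for followee in following_in_creation_order:
        current = max(current, account_created[followee])
        bounds.append((followee, current))
    return bounds

created = {"alice": 100, "bob": 250, "carol": 180}
print(edge_time_lower_bounds(["alice", "bob", "carol"], created))
# [('alice', 100), ('bob', 250), ('carol', 250)]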

Appeared in the Twentieth International World Wide Web Conference, WWW '11 (Acceptance Rate 12%).


Game-theoretic Models of Information Overload in Social Networks, joint with C. Borgs, J. Chayes, B. Karrer, B. Meeder, R. Ravi and R. Reagans [pdf]

We study the effect of information overload on user engagement in an asymmetric social network like Twitter. We introduce simple game-theoretic models that capture rate competition between celebrities producing updates in such networks, where users non-strategically choose a subset of celebrities to follow based on the utility derived from high-quality updates as well as the disutility of having to wade through too many updates. Our two variants model users either dropping some potential connections (followership model) or leaving the network altogether (engagement model). We show that under very simple and natural models of celebrity rate competition, the followership model has no pure-strategy Nash equilibrium. We then identify special cases in both models where pure rate equilibria exist for the celebrities: for the followership model, we show existence of pure rate equilibria when there is a global ranking of the celebrities in terms of the quality of their updates to users; for the engagement model, pure rate equilibria exist when all users are interested in the same number of celebrities, or when they are interested in at most two. Finally, we give a finite though inefficient procedure to determine whether pure equilibria exist in the general case of the followership model.
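A toy sketch of rate competition in the spirit of the followership model; the specific utility functions (quadratic reading cost for users, followers-times-rate payoff for celebrities) and the discretized rate grid are illustrative assumptions, not the paper's exact models.

# Toy rate-competition sketch inspired by the description above; the utility
# forms below are illustrative assumptions, not the paper's exact models.
from itertools import combinations, product

QUALITY = {"c1": 3.0, "c2": 2.0}     # celebrities and their update quality
RATES = [0, 1, 2, 3]                  # discretized update rates
COST = 0.4                            # user's disutility scale for update volume
N_USERS = 10                          # identical users, for simplicity

def followed_set(rates):
    """Each (identical) user follows the subset maximizing quality benefit
    minus a quadratic cost in the total number of updates received."""
    celebs = list(rates)
    best, best_u = (), 0.0
    for k in range(len(celebs) + 1):
        for subset in combinations(celebs, k):
            total = sum(rates[c] for c in subset)
            u = sum(QUALITY[c] * rates[c] for c in subset) - COST * total ** 2
            if u > best_u:
                best, best_u = subset, u
    return set(best)

def payoff(celeb, rates):
    followers = N_USERS if celeb in followed_set(rates) else 0
    return rates[celeb] * followers

def pure_equilibria():
    eqs = []
    for r1, r2 in product(RATES, repeat=2):
        rates = {"c1": r1, "c2": r2}
        stable = all(
            payoff(c, rates) >= max(payoff(c, {**rates, c: d}) for d in RATES)
            for c in rates
        )
        if stable:
            eqs.append((r1, r2))
    return eqs

# Prints the rate profiles where neither celebrity gains by a unilateral change;
# the paper shows such pure equilibria can fail to exist in the followership model.
print(pure_equilibria())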

Appeared in the Seventh Workshop on Algorithms and Models for the Web Graph, WAW '10.


Trading Off Mistakes and Don't-know Predictions, joint with A. Blum and M. Zadimoghaddam [pdf]

We discuss an online learning framework in which the agent is allowed to say "I don't know," as well as to make incorrect predictions, on given examples. We analyze the trade-off between saying "I don't know" and making mistakes. If the number of don't-know predictions is forced to be zero, the model reduces to the well-known mistake-bound model introduced by Littlestone [Lit88]. On the other hand, if no mistakes are allowed, the model reduces to the KWIK framework introduced by Li et al. [LLW08]. We propose a general, though inefficient, algorithm for arbitrary finite concept classes that minimizes the number of don't-know predictions when a certain number of mistakes is allowed. We then present specific polynomial-time algorithms for the concept classes of monotone disjunctions and linear separators.
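As a concrete reference point, here is a minimal zero-mistake learner for monotone disjunctions that answers "I don't know" whenever it is unsure, i.e., the KWIK endpoint of the trade-off. It is a baseline sketch, not necessarily the algorithm from the paper, which trades some mistakes for fewer don't-know predictions.

# A minimal zero-mistake learner for monotone disjunctions that says
# "don't know" when unsure; a baseline at the KWIK end of the trade-off,
# not necessarily the algorithm from the paper.
class DisjunctionLearner:
    def __init__(self, n):
        self.candidates = set(range(n))   # variables that may be in the target
        self.confirmed = set()            # variables known to be in the target

    def predict(self, x):
        """x: set of indices of variables that are true in the example."""
        if x & self.confirmed:
            return True          # some confirmed variable fires: surely positive
        if not (x & self.candidates):
            return False         # no possible target variable fires: surely negative
        return None              # "I don't know"

    def update(self, x, label):
        if label is False:
            # A negative example rules out every variable it sets to true.
            self.candidates -= x
        elif label is True and len(x & self.candidates) == 1:
            # Exactly one candidate could explain the positive label.
            self.confirmed |= x & self.candidates

# Target disjunction: x0 OR x2 (unknown to the learner).
target = {0, 2}
learner = DisjunctionLearner(4)
for x in [{1, 3}, {2, 3}, {2}, {0, 2}]:
    guess = learner.predict(x)
    label = bool(x & target)
    learner.update(x, label)
    print(sorted(x), "->", "don't know" if guess is None else guess)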

Appeared as a spotlight paper in the Twenty-fourth Annual Conference on Neural Information Processing Systems, NIPS '10 (spotlight acceptance rate below 6%).


Expressive Auctions for Externalities in Online Advertising, joint with A. Ghosh [pdf]

When online ads are shown together, they compete for user attention and conversions, imposing negative externalities on each other. While the competition for user attention in sponsored search can be captured via models of clickthrough rates, the post-click competition for conversions cannot: since the value per click of an advertiser is proportional to the conversion probability conditional on a click, which depends on the other ads displayed, the private value of an advertiser is no longer one-dimensional, and the GSP mechanism is not adequately expressive. We study the design of expressive GSP-like mechanisms for the simplest form that an advertiser's private value can take in the presence of such externalities: the advertiser's value depends on exclusivity, i.e., whether her ad is shown exclusively or along with other ads.
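A small numerical illustration of why the private value becomes two-dimensional; the click and conversion numbers below are made up for illustration and are not data or parameters from the paper.

# Small numerical illustration of the two-dimensional value described above;
# all rates below are made-up numbers, not data from the paper.
def value_per_click(conv_rate, value_per_conversion):
    return conv_rate * value_per_conversion

value_per_conv = 50.0
conv_exclusive = 0.10   # conversion probability when shown alone
conv_shared = 0.04      # lower conversion probability when rivals are shown too

v_exclusive = value_per_click(conv_exclusive, value_per_conv)  # 5.0 per click
v_shared = value_per_click(conv_shared, value_per_conv)        # 2.0 per click

# A single GSP bid cannot express both numbers at once, which is why the
# paper studies GSP-like auctions where advertisers report a pair of values.
print(f"value per click if exclusive: {v_exclusive}, if shown with others: {v_shared}")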

Part of our results appeared in the Fifth Workshop on Ad Auctions, AAW '09.
Appeared in the Nineteenth International World Wide Web Conference, WWW '10 (Acceptance Rate 14%).


Mechanism Design for Complexity-constrained Bidders, joint with R. Kumar and M. Mahdian [pdf]

A well-known result due to Vickrey gives a mechanism for selling a number of goods to interested buyers in a way that achieves the maximum social welfare. In practice, a problem with this mechanism is that it requires the buyers to specify a large number of values. In this paper we study the problem of designing optimal mechanisms subject to constraints on the complexity of the bidding language, in a setting where buyers have additive valuations for a large set of goods. This setting is motivated by sponsored search auctions, where the valuations of the advertisers are more or less additive and the number of keywords up for sale is huge. We give a complete solution for this problem when the valuations of the buyers are drawn from simple classes of prior distributions. For a more realistic class of priors, we show that a mechanism akin to the broad match mechanism currently in use provides a reasonable bicriteria approximation.
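A toy illustration of what a complexity constraint on the bidding language means for a bidder with additive per-keyword values: the bidder may submit at most k distinct bid levels, so values must be coarsened. The grouping rule below (contiguous groups of sorted values, each bidding the group mean) is an illustrative choice, not the mechanism analyzed in the paper.

# Toy illustration of a constrained bidding language: an advertiser with
# additive per-keyword values may submit at most k distinct bid levels.
# The grouping rule (contiguous groups of sorted values, bid = group mean)
# is an illustrative choice, not the mechanism analyzed in the paper.
def quantize_bids(values, k):
    """Map per-keyword values to at most k distinct bid levels."""
    ordered = sorted(values.items(), key=lambda kv: kv[1])
    size = -(-len(ordered) // k)                 # ceiling division: chunk size
    bids = {}
    for start in range(0, len(ordered), size):
        chunk = ordered[start:start + size]
        level = sum(v for _, v in chunk) / len(chunk)
        for keyword, _ in chunk:
            bids[keyword] = level
    return bids

values = {"shoes": 1.0, "running shoes": 1.2, "sneakers": 1.1,
          "laptops": 4.0, "gaming laptops": 5.0}
bids = quantize_bids(values, k=2)
distortion = max(abs(values[kw] - bids[kw]) for kw in values)
print(bids)
print("max per-keyword distortion:", round(distortion, 2))  # 0.5 in this example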

Appeared in the Fifth Workshop on Internet and Network Economics, WINE '09.


Minimizing Movement, joint with E. Demaine, M. Hajiaghayi, H. Mahini, S. Oveisgharan, M. Zadimoghaddam
Journal Version [pdf] Conference Version [pdf]

We give approximation algorithms and inapproximability results for a class of movement problems. In general, these problems involve planning the coordinated motion of a large collection of objects (representing anything from a robot swarm or firefighter team to map labels or network messages) to achieve a global property of the network while minimizing the maximum or average movement. In particular, we consider the goals of achieving connectivity (undirected and directed), achieving connectivity between a given pair of vertices, achieving independence (a dispersion problem), and achieving a perfect matching (with applications to multicasting). This general family of movement problems encompasses an intriguing range of graph and geometric algorithms, with several real-world applications and a surprising range of approximability. In some cases, we obtain tight approximation and inapproximability results using direct techniques (without use of PCP), assuming just that P != NP.
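A much-simplified one-dimensional example of a movement problem, for intuition only: pebbles on integer points of a line must be moved onto distinct consecutive positions (a stand-in for achieving connectivity) while minimizing the maximum movement. The brute force below is not one of the paper's algorithms, which handle far more general graph and geometric settings.

# Simplified 1-D illustration of a movement problem: pebbles sit on integer
# points of a line and must be moved onto distinct consecutive positions
# (a stand-in for "achieving connectivity"), minimizing the maximum movement.
# This toy brute force is for intuition only.
def min_max_movement_to_block(positions):
    xs = sorted(positions)
    n = len(xs)
    best = None
    # Try every possible starting point of the contiguous target block;
    # matching sorted pebbles to consecutive targets in order is optimal in 1-D.
    for s in range(min(xs) - n, max(xs) + 2):
        cost = max(abs(xs[i] - (s + i)) for i in range(n))
        if best is None or cost < best[0]:
            best = (cost, s)
    return best

cost, start = min_max_movement_to_block([0, 4, 9])
print(cost, list(range(start, start + 3)))  # max movement 4, targets [3, 4, 5]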

Journal version appeared in ACM Transactions on Algorithms (TALG) 5(3), 2009.
Conference version appeared in Proceedings of the 18th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA '07.


Scheduling to Minimize Gaps and Power Consumption, joint with E. Demaine, M. Ghodsi, M. Hajiaghayi and M. Zadimoghaddam [pdf]

This paper considers scheduling tasks while minimizing the power consumption of one or more processors, each of which can go to sleep at a fixed cost alpha. There are two natural versions of this problem, both considered extensively in recent work: minimize the total power consumption (including computation time), or minimize the number of "gaps" in execution. For both versions in a multiprocessor system, we develop a polynomial-time algorithm based on sophisticated dynamic programming. In a generalization of the power-saving problem, where each task can execute in any of a specified set of time intervals, we develop a (1 + (2/3)alpha)-approximation, and show that dependence on alpha is necessary. In contrast, the analogous multi-interval gap scheduling problem is set-cover hard (and thus not o(lg n)-approximable), even in the special cases of just two intervals per job or just three unit intervals per job. We also prove several other hardness-of-approximation results. Finally, we give an O(sqrt(n))-approximation for maximizing throughput given a hard upper bound on the number of gaps.
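A small helper that evaluates the power objective described above for a fixed single-processor schedule (it does not compute an optimal schedule): during each idle gap the processor either stays on, paying the gap's length, or goes to sleep at the fixed cost alpha, so a gap contributes min(gap, alpha) on top of the computation time.

# Evaluating the power cost of a fixed single-processor schedule, as a small
# illustration of the objective above (it does not compute the optimal schedule).
# During an idle gap the processor either stays on (cost = gap length) or
# goes to sleep at fixed cost alpha, so each gap costs min(gap, alpha).
def power_cost(busy_intervals, alpha):
    """busy_intervals: sorted, disjoint (start, end) pairs when the CPU must run."""
    busy = sum(end - start for start, end in busy_intervals)
    gaps = sum(
        min(nxt_start - end, alpha)
        for (_, end), (nxt_start, _) in zip(busy_intervals, busy_intervals[1:])
    )
    return busy + gaps

schedule = [(0, 2), (3, 5), (10, 11)]
print(power_cost(schedule, alpha=4))  # busy 5 + gaps min(1,4) + min(5,4) = 10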

Appeared in Proceedings of the 19th ACM Symposium on Parallelism in Algorithms and Architectures, SPAA '07.


Spanning Trees with Minimum Weighted Degrees, joint with M. Ghodsi, H. Mahini, K. Mirjalali, S. Oveisgharan and M. Zadimoghaddam [pdf]

Given a metric graph G, we are concerned with finding a spanning tree of G in which the maximum weighted degree of the vertices is minimum. In a metric graph (or its spanning tree), the weighted degree of a vertex is defined as the sum of the weights of its incident edges. In this paper, we propose a 4.5-approximation algorithm for this problem. We also prove that it is NP-hard to approximate this problem within a factor of 2 - epsilon.
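For intuition about the objective, here is a brute-force check on a tiny metric graph: it enumerates all spanning trees and reports one minimizing the maximum weighted degree. Exhaustive search is only feasible because the example has four vertices; the edge weights are made up (they satisfy the triangle inequality) and have nothing to do with the paper's 4.5-approximation algorithm.

# Brute-force illustration of the objective above on a tiny metric graph:
# among all spanning trees, find one minimizing the maximum weighted degree.
from itertools import combinations

# Complete metric graph on 4 vertices; weights satisfy the triangle inequality.
weights = {(0, 1): 2, (0, 2): 3, (0, 3): 4, (1, 2): 2, (1, 3): 3, (2, 3): 2}
n = 4

def is_spanning_tree(edges):
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False          # cycle: n-1 edges with a cycle cannot span
        parent[ru] = rv
    return len(edges) == n - 1

def max_weighted_degree(edges):
    deg = [0] * n
    for u, v in edges:
        deg[u] += weights[(u, v)]
        deg[v] += weights[(u, v)]
    return max(deg)

best = min(
    (t for t in combinations(weights, n - 1) if is_spanning_tree(t)),
    key=max_weighted_degree,
)
print(best, max_weighted_degree(best))  # the path 0-1-2-3 wins here, with value 4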

Appeared in Information Processing Letters, IPL 104(3) 2007.