## "And I'm More Interested In *How* You Arrive At Your Answer . . ."

Articles about what questions D. E. Shaw’s interviewers asked Larry Summers bring back fond memories of my own interview process at the firm.

I was applying for a job in the mailroom, and I was interviewed by eight different people, who asked a variety of questions ranging from the usual sort (about my education and work experience, about my goals and habits, that sort of thing) to the sort of “brain teasers” that D. E. Shaw (along with Microsoft, McKinsey, and no doubt other human-capital-intensive shops) is famous for.

My favorite was the interviewer who asked me: “How many fax machines are there in New York City?”

“Pardon?”

“How many fax machines are there in New York City?”

“I don’t have any idea.”

“Well, I’m interested in hearing how you would answer that question – and I’m more interested in *how* you arrive at your answer than in what your answer ultimately is.”

So I said a number. And then I said, “no, wait, that sounds too small,” and said another, larger number. And then I said, again, “no, wait, that still sounds too small,” and I said another, even larger number.

“What are you doing?”

“I’m saying numbers and, if they sound too small when I hear them out loud, I’m saying bigger ones.”

That was the end of the interview.

I have no idea how or why, but they hired me anyway.

As an aside, here are four ways of dealing with the St. Petersburg Paradox (referenced in the *Slate* article linked above):

- You cannot get more than “all” the money. If all the world’s assets are worth $100 trillion or thereabouts, then you simply cannot get more money than that. In fact, your ability to accumulate additional wealth stops well short of that figure, but that’s an absolute limit. You hit that limit after 48 tosses. Therefore, if the expected value of each toss is $0.50, you would not spend more than $24 to play the game.

- The value of a sufficiently low-probability event, no matter what the potential payoff, is zero. We know this because we are not willing to spend any money at all to try to avert extremely unlikely but massively catastrophic potential events (e.g., the possibility that running a supercollider will destroy all matter or, less universally catastrophic, create a mini-black-hole that absorbs the earth). In other words, some “black swans” are so black that nobody on earth worries about meeting one. Depending on where you set the threshold for worthlessness, you’d cap the value of the game somewhere south of $40.

- If you are absolutely determined to assign value to events that simply will not happen if you play this game until the universe reaches heat death, and absolutely committed to the idea that “expected value” is all that matters, it’s worth pointing out that it takes time to toss a coin. Time has value. If it takes 5 seconds to toss a coin, then you could toss a coin roughly 6 million times in a year. If the expected value of a coin toss is $0.50, then the expected value of continuous play over a year is $3 million. The value of a $3 million/year annuity is finite, though its precise value depends on the prevailing interest rate. Change the number of coin tosses per year and you change the value. (I’m assuming, of course, that the player bears none of the costs of running the game, including hiring someone to flip coins; otherwise these would need to be factored in.)

- Finally, perhaps the best way to calculate the “value” of the game would be to think about how roughly equivalent games of pure chance (e.g., roulette, slots, lottery tickets) are priced in the real world – where those games are negative-expected-value. That gives you a “market” price of the chance at a huge payoff, or what people will pay for a shot at gains way out on the tails of the probability distribution. Let’s say a lottery ticket costs $1 and gives you a one-in-100 million chance at a $20 million payout. You’re paying five times expected value for that long-shot risk, but that’s not the point – you’re willing to pay it, so to you that risk is “worth” $1. To get a (roughly) $17 million payout in the coin-flip game would require 25 heads to come up in a row – that’s a (roughly) 1-in-34 million chance. So this game is worth (roughly) 3 lottery tickets, or $3. Of course, that number’s too low, since you could also get smaller prizes – but that’s true of lots of games of chance (e.g., slots) as well, and you could similarly compare the odds of getting prizes of different sizes and determine how many rounds of any of these other, negative-expected-value games the coin-toss game is “worth” based on the market value of the various payoffs and the relative probability of getting said payoffs. This is really a variation on the second method, but one that recognizes that, when we’re talking about gain, we use expected value for high-probability events, we are (sometimes) risk-seeking when we’re talking about low-probability events, and we assign no value at all to events beyond a certain horizon of probability. Setting $100 million as the most anyone might “plausibly” expect to take home, the “capped expected value” comes in at $14, but based on the pricing of comparable games of chance people would probably pay more – 2 to 3 times that.
(In fact, $40 is probably too high a price point, but if you re-work the game as buying a 1/10th “share” of the payout, you’d probably get takers at $4/share.)
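The cap-based arguments above are easy to check numerically. Here is a quick sketch (the function name is mine) assuming the standard formulation of the game: you win $2^(k−1) if the first tail arrives on toss k, so each possible toss contributes $0.50 to the uncapped expected value, and a wealth cap truncates the sum:

```python
def capped_expected_value(cap_dollars):
    """Expected value of the St. Petersburg game when no payout
    can exceed cap_dollars (payoff 2^(k-1) with probability 2^-k)."""
    ev, k = 0.0, 1
    while 2 ** (k - 1) <= cap_dollars:
        ev += 0.5  # payoff 2^(k-1) times probability 2^-k
        k += 1
    # All remaining (vanishingly likely) outcomes pay out the cap itself;
    # their total probability is 2^-(k-1).
    ev += cap_dollars * 2 ** -(k - 1)
    return ev

print(capped_expected_value(100e12))  # $100 trillion cap: about $24
print(capped_expected_value(100e6))   # $100 million cap: about $14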

Ha! Good times. I remember my first interview of the day (not at DE Shaw), a guy walked in, handed me a piece of paper with the stock equation on it, and said “derive Black-Scholes.” I did it. He nodded, then left the room. Then the brainteasers started…

My two favorites, as a first cut:

1) How many people need to be in a room until it is 50% likely that two of them have the same birthday?

2) The Monty Hall problem. I have almost gotten into fights with people when I tell them the answer.
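Both puzzles can be checked in a few lines; here is a sketch (function names are mine) that computes the birthday answer exactly and verifies Monty Hall by simulation:

```python
import random
from math import prod

# 1) Birthday problem: smallest n with P(some shared birthday) >= 50%.
def p_no_match(n):
    """Probability that n people all have distinct birthdays (365-day year)."""
    return prod((365 - i) / 365 for i in range(n))

n = 1
while 1 - p_no_match(n) < 0.5:
    n += 1
print(n)  # 23

# 2) Monty Hall, by Monte Carlo: switching wins about 2/3 of the time.
def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that hides a goat and isn't the player's pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=True))   # roughly 0.667
print(monty_hall(switch=False))  # roughly 0.333
```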

Side note: Your first answer to the Petersburg thing is my favorite to give. I like to throw in that you end up with an asymmetry problem, where the option becomes increasingly valuable (indeed, the big value is in the tail) the more likely it is that the other person will not be able to pay it under any circumstance of the world we live in.

A good point, in light of the CDS fiasco…

— rortybomb · Apr 13, 07:25 PM · #

Being able to understand the setup of the Monty Hall problem: 1 point. Knowing the right answer to the Monty Hall problem: 2 points. Being able to explain the answer to the Monty Hall problem in a way that makes sense and feels right: 20 points.

— Chet · Apr 14, 03:24 AM · #

re: how to answer “How many fax machines are there in New York City?”

I know the point is to check your inductive reasoning, but I immediately thought of Matthew Broderick’s *WarGames* strategy.

— JA · Apr 14, 12:48 PM · #

What was your final answer? I’m guessing one million. (That’s my answer, but I’m not revealing how I got there. Nobody’s offering me a job.)

— letterman · Apr 15, 05:55 PM · #