A company is looking to hire one worker out of n applicants, and the applicants can be unambiguously ranked from worst to best. The company interviews the applicants one at a time, in random order, and after each interview it must decide whether to hire that applicant or keep looking. A rejected applicant cannot be called back, and once the company hires someone, it can no longer interview other applicants. The goal is to maximize the probability of hiring the best applicant.
Let's suppose the k-th applicant turns out to be the best one the company has seen so far. Because the interview order is random, the overall best applicant is equally likely to be in any position, so the current applicant is the best overall exactly when the overall best falls among the first k, which happens with probability k/n. The company should stop and hire if k/n is at least the probability of success it would get by continuing the search. That continuation value doesn't depend on the current applicant and only shrinks as k grows, while k/n keeps growing, so the optimal policy is a threshold rule: skip some initial block of applicants, then hire the first one who beats everyone seen so far.
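To see where the threshold lands, here is the standard calculation, sketched in LaTeX, of the success probability of a cutoff rule that rejects the first r applicants and then takes the first best-so-far candidate (r is just my notation for that cutoff):

```latex
% The rule succeeds when the overall best sits at some position k > r
% (probability 1/n for each k) and every applicant between r+1 and k-1
% is skipped, which happens exactly when the best of the first k-1
% applicants lies among the first r (probability r/(k-1)).
P(r) \;=\; \sum_{k=r+1}^{n} \frac{1}{n}\cdot\frac{r}{k-1}
     \;=\; \frac{r}{n}\sum_{k=r+1}^{n}\frac{1}{k-1}
     \;\approx\; \frac{r}{n}\,\ln\!\frac{n}{r}.
% Writing x = r/n and maximizing -x \ln x gives x = 1/e, so the best
% cutoff is r \approx n/e, and P(r) \approx 1/e \approx 0.37.
```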
The optimal solution to the hiring problem ends up being this:
Reject the first n/e applicants, where n is the total number of applicants and e is the base of the natural logarithm (about 2.718). Then stop at the first applicant who is better than every applicant interviewed so far. If you get to the last candidate and haven't selected anyone, choose the last candidate.
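As a sketch of what that rule looks like in code (the function name and the use of numeric scores as a stand-in for applicant quality are my own choices, with higher scores meaning better applicants):

```python
import math

def hire(scores):
    """Apply the 1/e stopping rule to a list of applicant scores,
    seen one at a time in the given order. Returns the index of the
    applicant who gets hired."""
    n = len(scores)
    cutoff = int(n / math.e)  # reject roughly the first n/e applicants
    best_seen = max(scores[:cutoff], default=float("-inf"))
    for i in range(cutoff, n):
        # Stop at the first applicant better than everyone seen so far.
        if scores[i] > best_seen:
            return i
    return n - 1  # nobody beat the early group: take the last applicant
```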
For large values of n, the probability of selecting the top candidate with this method approaches 1/e, or about 37%. Of course, this is a toy example, and the constraints in the problem rarely hold (especially not when hiring employees). But it's an interesting way to reason about a class of problems.
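If you want to sanity-check that 37% figure, a quick Monte Carlo simulation (using the hypothetical hire function sketched above, with random permutations standing in for the interview order) agrees with it:

```python
import random

def success_rate(n=100, trials=100_000):
    """Estimate how often the 1/e rule hires the single best applicant."""
    wins = 0
    for _ in range(trials):
        scores = list(range(n))
        random.shuffle(scores)           # applicants arrive in random order
        picked = hire(scores)
        wins += scores[picked] == n - 1  # did we get the top applicant?
    return wins / trials

print(success_rate())  # typically prints something close to 0.37
```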