Introduction to Algorithms
3rd Edition
ISBN: 9780262033848
Authors: Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein
Publisher: MIT Press
Question
Chapter 5.4, Problem 7E
Program Plan Intro
To show that the probability is less than
Students have asked these similar questions
Hat-check problem. Use indicator random variables to solve the following problem, which is known as the hat-check problem. Each of n customers gives a hat to a hat-check person at a restaurant. The hat-check person gives the hats back to the customers in a random order. What is the expected number of customers who get back their own hat?
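Not part of the question itself, but a quick way to sanity-check the indicator-variable answer is a short Monte Carlo sketch in Python (function name and trial count are illustrative assumptions):

```python
import random

def expected_matches(n, trials=50_000, seed=1):
    """Estimate the expected number of customers who get their own hat back."""
    random.seed(seed)
    total = 0
    for _ in range(trials):
        hats = list(range(n))
        random.shuffle(hats)  # hats handed back in uniformly random order
        # X_i = 1 if customer i receives hat i; count of matches = sum of indicators
        total += sum(1 for i, h in enumerate(hats) if i == h)
    return total / trials
```

By linearity of expectation, E[X] = sum over i of P(customer i gets own hat) = n · (1/n) = 1, regardless of n, and the simulation should hover near 1.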
Hypergeometric distribution
Given user-defined numbers k and n, if n cards are drawn from a deck, find the probability that exactly k cards are black.
Also find the probability that at least k cards are black.
INPUT
11 7
OUTPUT
0.1628063397551007
0.24927823677714275
Any help would be greatly appreciated.
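Assuming a standard 52-card deck with 26 black cards (consistent with the sample output above), the two probabilities follow the hypergeometric pmf and can be computed directly with `math.comb`:

```python
from math import comb

def prob_exactly_black(n_drawn, k_black, deck=52, black=26):
    """P(exactly k of the n drawn cards are black): hypergeometric pmf."""
    return comb(black, k_black) * comb(deck - black, n_drawn - k_black) / comb(deck, n_drawn)

def prob_at_least_black(n_drawn, k_black, deck=52, black=26):
    """P(at least k of the n drawn cards are black): upper tail of the pmf."""
    return sum(prob_exactly_black(n_drawn, j, deck, black)
               for j in range(k_black, n_drawn + 1))

print(prob_exactly_black(11, 7))   # ≈ 0.1628063397551007
print(prob_at_least_black(11, 7))  # ≈ 0.24927823677714275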
From 1965 to 1974, in the U.S. there were M = 17,857,857 male live births and F = 16,974,194 female live births. We model the number of male live births as a binomial distribution with parameters size = M + F and prob = p. The following code computes the maximum likelihood estimator for p.
male = 17857857
female = 16974194
ll <- function(p){ dbinom(male, size = male + female, prob = p, log = TRUE) }
ps <- seq(0.01, 0.99, by = 0.001)
ll.ps <- ll(ps)
plot(ps, ll.ps, type='l')
phat <- ps[which.max(ll.ps)]
abline(v = phat, col='blue')
QUESTION: For this problem, can you give a theoretical formula for the maximum likelihood estimator, p̂, using M and F? (No need to compute the numerical value.)
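For intuition: the log-likelihood is M log p + F log(1 − p) plus a constant, and setting its derivative M/p − F/(1 − p) to zero gives the closed form p̂ = M/(M + F). A minimal Python sketch (an illustration mirroring the R grid search above, not part of the original) compares the two:

```python
import math

M, F = 17857857, 16974194

def ll(p):
    # Binomial log-likelihood up to an additive constant: the log binomial
    # coefficient does not depend on p, so it cannot change the argmax.
    return M * math.log(p) + F * math.log(1 - p)

ps = [round(0.01 + 0.001 * i, 3) for i in range(981)]  # 0.010, 0.011, ..., 0.990
phat_grid = max(ps, key=ll)

# Closed form from d/dp [M log p + F log(1-p)] = 0  =>  p = M / (M + F)
phat_exact = M / (M + F)
```

The grid maximizer lands on the grid point nearest the closed-form value (about 0.5127), which is what the R plot's blue vertical line marks.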
Similar questions
- Show that randomized quicksort runs in O(n log n) time with probability 1 − 1/n². Hint: use the Chernoff bound which states that if we flip a coin k times, then the probability that we get fewer than k/16 heads is less than 2^(−k/8).

- A tourist car operator finds that during the past few months, the car's use has varied so much that the cost of maintaining the car varied considerably. During the past 200 days, the demand for the car fluctuated as below:

  Trips per week | 0  | 1  | 2  | 3  | 4  | 5
  Frequency      | 16 | 24 | 30 | 60 | 40 | 30

  Simulate the demand for a 10-week period. Use the random numbers 82, 96, 18, 96, 20, 84, 56, 11, 52, 03.

- Consider the house rent prediction problem where you are supposed to predict the price of a house based on just its area. Suppose you have n samples with their respective areas x^(1), x^(2), ..., x^(n) and their true house rents y^(1), y^(2), ..., y^(n). Say you train a linear regressor that predicts f(x^(i)) = θ0 + θ1·x^(i). The parameters θ0 and θ1 are scalars and are learned by minimizing the mean-squared-error loss with L1-regularization through gradient descent with learning rate α and regularization strength constant λ. Answer the following questions.
  1. Express the loss function L in terms of x^(i), y^(i), n, θ0, θ1, λ.
  2. Compute ∂L/∂θ0.
  3. Compute ∂L/∂θ1.
  4. Write update rules for θ0 and θ1.
  Hint: d|w|/dw is 1 for w > 0, undefined for w = 0, and −1 for w < 0.
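One common hand-simulation convention for the demand question maps the two-digit random numbers 00–99 onto cumulative frequency ranges (freq/200 scaled to 100); that mapping is an assumption here, not stated in the question. A Python sketch:

```python
# Frequencies observed over 200 days; each maps to freq * 100 / 200 values of 00-99
freq = {0: 16, 1: 24, 2: 30, 3: 60, 4: 40, 5: 30}
random_numbers = [82, 96, 18, 96, 20, 84, 56, 11, 52, 3]

# Build cumulative upper bounds over 00..99 (assumed convention: 00-07 -> 0 trips,
# 08-19 -> 1, 20-34 -> 2, 35-64 -> 3, 65-84 -> 4, 85-99 -> 5)
ranges = []
upper = 0
for trips, f in sorted(freq.items()):
    upper += f * 100 // 200
    ranges.append((upper, trips))

def demand(r):
    """Map a two-digit random number to a simulated weekly demand."""
    for upper, trips in ranges:
        if r < upper:
            return trips
    return ranges[-1][1]

sim = [demand(r) for r in random_numbers]
print(sim, sum(sim))  # per-week demands and total demand over the 10 weeks
```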
- In the inductive hypothesis, we assume that p(k) is true and show that p(k+1) is true.

- For n ≥ 1, let a_n be the number of ways to tile a 1 × n strip with 1 × 1 squares and 1 × 2 dominoes. The squares can be blue or red, while all dominoes are white. What is a_2?

- Consider the same house rent prediction problem where you are supposed to predict the price of a house based on just its area. Suppose you have n samples with their respective areas x^(1), x^(2), ..., x^(n) and their true house rents y^(1), y^(2), ..., y^(n). Say you train a linear regressor that predicts f(x^(i)) = θ0 + θ1·x^(i). The parameters θ0 and θ1 are scalars and are learned by minimizing the mean-squared-error loss with L2-regularization through gradient descent with learning rate α and regularization strength constant λ. Answer the following questions.
  1. Express the loss function L in terms of x^(i), y^(i), n, θ0, θ1, λ.
  2. Compute ∂L/∂θ0.
  3. Compute ∂L/∂θ1.
  4. Write update rules for θ0 and θ1.

- An electrician has wired n lights, all initially on, so that: (1) light 1 can always be turned on/off, and (2) for k > 1, light k cannot be turned either on or off unless light k − 1 is on and all preceding lights are off. The question we want to explore is the following: how many moves are required to turn all n lights off? For n = 5, a solution sequence has been worked out below; fill in the missing entries. The lights are counted from left to right, so the first bit is the first light, and so on.
  11111 01111 11011 10011 00010 10010 11010

- Write an algorithm for straightforward pairings for a round-robin tournament.
  in: round index r (0 ≤ r ≤ 2 · (n − 1)/2); number of players n (1 ≤ n)
  out: sequence R of n player indices indicating the match pairings between players R_(2i) and R_(2i+1), for i = 0, ..., n/2 − 1; if n is odd, R_(n−1) indicates the resting player

- In the hiring problem, what is the probability that the n'th best candidate will be hired after being interviewed?

- From 1965 to 1974, in the U.S. there were M = 17,857,857 male live births and F = 16,974,194 female live births. We model the number of male live births as a binomial distribution with parameters size = M + F and prob = p. The following code computes the maximum likelihood estimator for p:

  M <- 17857857
  F <- 16974194
  ll <- function(p){ dbinom(M, size = M + F, prob = p, log = TRUE) }
  ps <- seq(0.01, 0.99, by = 0.001)
  ll.ps <- ll(ps)
  plot(ps, ll.ps, type = 'l')
  phat <- ps[which.max(ll.ps)]
  abline(v = phat, col = 'blue')

  Question: An estimator for p, denoted by p̂, is obtained by ps[which.max(ll.ps)]. Is this the maximum likelihood estimator? Why? (Explain the code.)

- Same setup and code as the previous question. Question: What can we learn from the plot?

- Alice and Bob are playing a match to see who is the first to win n games, for some fixed n > 0. Suppose Alice and Bob are equally competent, that is, each of them wins a game with probability 1/2. Further, suppose that they have already played i + j games, of which Alice won i and Bob won j. Give an efficient algorithm to compute the probability that Alice will go on to win the match. For example, if i = n − 1 and j = n − 3, then the probability that Alice will win the match is 7/8, since she must win any of the next three games.
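The match-probability question above has a clean dynamic-programming recurrence: writing a and b for the numbers of wins Alice and Bob still need, f(a, b) = (f(a−1, b) + f(a, b−1))/2 with f(0, b) = 1 and f(a, 0) = 0. A sketch using exact rationals (an illustration, not a prescribed implementation):

```python
from fractions import Fraction
from functools import lru_cache

def p_alice_wins(i, j, n):
    """Probability Alice wins a first-to-n match given the current score i:j."""
    @lru_cache(maxsize=None)
    def f(a, b):
        if a == 0:
            return Fraction(1)   # Alice already has her n wins
        if b == 0:
            return Fraction(0)   # Bob already has his n wins
        # The next game is a fair coin flip: either Alice's or Bob's count drops.
        return (f(a - 1, b) + f(a, b - 1)) / 2
    return f(n - i, n - j)

print(p_alice_wins(4, 2, 5))  # prints 7/8, matching the worked example
```

With memoization the table has O(n²) entries, each filled in O(1) time, which is the "efficient algorithm" the question asks for.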
Recommended textbooks for you
Operations Research : Applications and Algorithms
Computer Science
ISBN:9780534380588
Author:Wayne L. Winston
Publisher:Brooks Cole