Data Analysis with R - Second Edition by Tony Fischetti

Allocation of memory

Refer all the way back to Chapter 5, Using Data to Reason About the World. Remember when we created a mock population of women's heights in the US, and we repeatedly took 10,000 samples of 40 from it to demonstrate the sampling distribution of the sample means? In a code comment, I mentioned in passing that the snippet numeric(10000) preallocated a vector of 10,000 elements, but I never explained why we did that. Why didn't we just create a vector of length 1, and continually tack each new sample mean onto the end of it? This is demonstrated as follows:

set.seed(1)
all.us.women <- rnorm(10000, mean=65, sd=3.5)
means.of.our.samples.bad <- c(1)
# I'm increasing the number of
# samples to 30,000 to prove a point
for(i in 1:30000){ ...
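The excerpt above is truncated, but based on the chapter's description, a sketch of the full comparison might look like the following. The loop bodies and the use of system.time() here are my own reconstruction, not the book's exact code; the point is that growing a vector with c() forces R to copy the whole vector on every iteration, while filling a preallocated vector updates it in place:

```r
set.seed(1)
all.us.women <- rnorm(10000, mean=65, sd=3.5)

# Bad: grow the vector one element at a time with c().
# Each call to c() allocates a new, longer vector and copies
# everything over, so the total work is quadratic in the
# number of samples.
time.bad <- system.time({
  means.bad <- c()
  for (i in 1:30000) {
    means.bad <- c(means.bad, mean(sample(all.us.women, 40)))
  }
})

# Good: preallocate all 30,000 slots up front with numeric(),
# then assign into each slot. No repeated copying occurs.
time.good <- system.time({
  means.good <- numeric(30000)
  for (i in 1:30000) {
    means.good[i] <- mean(sample(all.us.women, 40))
  }
})

print(time.bad["elapsed"])
print(time.good["elapsed"])
```

On most machines the preallocated version finishes in a fraction of the time the c()-based version takes, and the gap widens as the number of iterations grows, which is presumably why the book bumps the sample count to 30,000 to make the difference obvious.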
