Chapter 6: Nodes, Edges, and Statistical (In)dependence
Welcome to Part 2 of our book! Congratulations on getting this far!
In this part, we’ll dive into the world of causal inference, combining all the knowledge we’ve gained so far and building on top of it. This chapter will introduce two powerful concepts: d-separation and estimands.
Combining these two concepts with what we’ve already learned will equip us with a flexible toolkit for computing causal effects.
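To give a small taste of d-separation before we define it formally, the sketch below checks it programmatically on a simple chain graph. It assumes the networkx library (version 2.8 or later, where nx.d_separated is available; in networkx 3.3+ the function is named nx.is_d_separator); this example is illustrative and not part of the book’s own code.

```python
# A minimal sketch of d-separation on a chain DAG, assuming networkx >= 2.8.
import networkx as nx

# Build a simple chain: X -> Z -> Y
G = nx.DiGraph([("X", "Z"), ("Z", "Y")])

# With an empty conditioning set, the chain path X -> Z -> Y is open,
# so X and Y are NOT d-separated (they are statistically dependent).
open_path = nx.d_separated(G, {"X"}, {"Y"}, set())

# Conditioning on Z blocks the chain, so X and Y become d-separated.
blocked_path = nx.d_separated(G, {"X"}, {"Y"}, {"Z"})

print(open_path)     # False
print(blocked_path)  # True
```

Here, d-separation in the graph mirrors conditional independence in the data: once we know Z, learning X tells us nothing more about Y.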
Further down the road, we’ll discuss the back-door and front-door criteria (two powerful methods for identifying causal effects) and introduce the more general framework of Judea Pearl’s do-calculus. Finally, we’ll present instrumental variables, a family of techniques broadly applied ...