Day 2: Working with Big Data
With Day 1’s table creation and manipulation under our belts, it’s time to start adding some serious data to our wiki table. Today, you’ll script against the HBase APIs, ultimately streaming Wikipedia content right into our wiki! Along the way, you’ll pick up some performance tricks for making faster import jobs. Finally, you’ll poke around in HBase’s internals to see how it partitions data into regions, achieving both performance and disaster-recovery goals.
Importing Data, Invoking Scripts
One common problem people face when trying a new database system is how to migrate data into it. Handcrafting Put operations with static strings, as you did in Day 1, is all well and good, but you can do better. ...
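To make the contrast concrete, here is a minimal sketch of what "doing better" looks like: looping over records and buffering puts instead of handcrafting each one. The `StubTable` class below is a hypothetical stand-in for a real HBase table handle (a real import script would issue puts through the HBase client API); the buffering logic mirrors the batched-write pattern used to speed up bulk imports.

```ruby
# Hypothetical stand-in for an HBase table handle; buffers puts and
# "flushes" them in batches, as a real client does to cut down on RPCs.
class StubTable
  attr_reader :flushed

  def initialize(buffer_limit)
    @buffer_limit = buffer_limit
    @buffer = []
    @flushed = []
  end

  # Queue a put; flush once the buffer fills.
  def put(row, column, value)
    @buffer << [row, column, value]
    flush if @buffer.size >= @buffer_limit
  end

  def flush
    @flushed.concat(@buffer)  # a real client would send the batch here
    @buffer.clear
  end
end

table = StubTable.new(100)

# Instead of one handwritten Put per static string, loop over the data.
articles = [
  ["Home",  "Welcome to the wiki!"],
  ["HBase", "A column-oriented datastore."]
]
articles.each do |title, text|
  table.put(title, "text:", text)  # 'text:' column as in Day 1's wiki table
end
table.flush
puts table.flushed.size
```

The same loop scales from two rows to millions; only the data source changes, which is exactly the property a Wikipedia-sized import needs.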