Chapter 2. Extracting Textual Insights with APIs
When you want to determine the approach to a research question or start working on a text analytics project, the availability of data is often the first stumbling block. A simple Google search or the more specific Google Dataset Search will turn up curated datasets, and we will use some of these in subsequent chapters of this book. Depending on your project, however, such datasets may turn out to be too generic and unsuitable for your use case. You might have to create your own dataset, and application programming interfaces (APIs) are one way to extract data programmatically in an automated fashion.
What You’ll Learn and What We’ll Build
In this chapter, we will provide an overview of APIs and introduce blueprints to extract data for your project from popular websites like GitHub and Twitter. You will learn about using authentication tokens, handling pagination, understanding rate limits, and automating data extraction. At the end of this chapter, you will be able to create your own datasets by making API calls to any identified service. While the blueprints are illustrated with specific examples such as GitHub and Twitter, they can be used to work with any API.
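As a preview of what handling authentication, pagination, and rate limits looks like in practice, the following is a minimal sketch, not the chapter's blueprint itself. It assumes a GitHub personal access token stored in a GITHUB_TOKEN environment variable and uses GitHub's public repository search endpoint for illustration.

import os
import time
import requests

# Illustrative sketch: page through GitHub's repository search results,
# pausing when the request quota is exhausted. Assumes GITHUB_TOKEN is
# set; without it, the calls still work but with a lower rate limit.
token = os.environ.get("GITHUB_TOKEN")
headers = {"Authorization": f"token {token}"} if token else {}

results = []
for page in range(1, 4):  # fetch only the first three pages as a demo
    response = requests.get(
        "https://api.github.com/search/repositories",
        params={"q": "text analytics", "per_page": 30, "page": page},
        headers=headers,
    )
    response.raise_for_status()
    results.extend(response.json()["items"])

    # GitHub reports the remaining request quota in the response headers.
    remaining = int(response.headers.get("X-RateLimit-Remaining", 1))
    if remaining == 0:
        reset_at = int(response.headers["X-RateLimit-Reset"])
        time.sleep(max(reset_at - time.time(), 0))

print(len(results), "repositories collected")

The same pattern of looping over pages and inspecting rate-limit headers carries over to most REST APIs, even though header names and pagination parameters differ from service to service.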
Application Programming Interfaces
APIs are interfaces that allow software applications or components to communicate with one another without having to know how they are implemented. The API provides a set of definitions and protocols including the kinds of requests that can be made, ...
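To make this concrete, here is a small, unauthenticated request against GitHub's public REST API. The endpoint and fields shown are only an example; any JSON-over-HTTP API follows the same request/response pattern.

import requests

# Request the metadata of a single public repository and read two
# fields from the JSON response. No authentication is required for
# this endpoint, though unauthenticated requests are rate limited.
response = requests.get("https://api.github.com/repos/python/cpython")
response.raise_for_status()

repo = response.json()
print(repo["full_name"], "-", repo["stargazers_count"], "stars")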