10 Service integration with Azure Data Factory
This chapter covers
- Building a single-step processing pipeline
- Using a secret key store
- Scheduling batch data processing
In previous chapters, you’ve learned how to use Azure services to ingest and transform data. Except for Stream Analytics (SA), which processes incoming data automatically, you have added data or triggered processing manually. In this chapter, you’ll learn how to move data between services on a schedule. You’ll learn how to move files between Azure Storage accounts and your Azure Data Lake Store (ADLS), and how to run U-SQL scripts on a schedule to transform data. You’ll use Azure Data Lake Analytics (ADLA) to read and transform data from multiple sources. ...
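To preview the kind of definition this chapter works toward, the following sketch shows roughly what a Data Factory pipeline with a single Copy activity, plus a daily schedule trigger that runs it, look like when expressed as JSON. The names used here (CopyBlobToAdls, BlobInputDataset, AdlsOutputDataset, DailyTrigger) and the start time are placeholders for illustration, not values from the book; your pipeline will reference the datasets and linked services you create yourself.

{
  "name": "CopyBlobToAdls",
  "properties": {
    "activities": [
      {
        "name": "CopyFileToDataLake",
        "type": "Copy",
        "inputs": [
          { "referenceName": "BlobInputDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "AdlsOutputDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "AzureDataLakeStoreSink" }
        }
      }
    ]
  }
}

{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyBlobToAdls",
          "type": "PipelineReference"
        }
      }
    ]
  }
}

The pipeline describes what to do (copy from a Blob Storage dataset to an ADLS dataset), while the trigger describes when to do it; keeping the two separate lets you attach the same pipeline to different schedules or run it on demand.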