The only dependency we need to develop a Spark word count program is Spark Core. The build plugin helps us package our application into a JAR:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <artifactId>mastering-hadoop-3</artifactId>
        <groupId>com.packt</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <artifactId>chapter6</artifactId>
    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId> ...
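The snippet is truncated before the `<build>` section, so the packaging plugin itself is not shown. As a hedged sketch of what such a plugin declaration commonly looks like, the following uses `maven-shade-plugin` to bundle the application and its dependencies into a single JAR; the plugin version and the `com.packt.WordCount` main class are assumptions for illustration, not values taken from this book's project:

```xml
<build>
    <plugins>
        <!-- Sketch only: shades dependencies into one runnable "fat" JAR -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version> <!-- assumed version -->
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <!-- Sets Main-Class in the JAR manifest; class name is hypothetical -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>com.packt.WordCount</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

With such a plugin in place, `mvn package` produces a JAR under `target/` that can be submitted to a cluster with `spark-submit`. Note that Spark itself is typically marked with `<scope>provided</scope>` so the cluster's own Spark libraries are not shaded into the JAR.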