This is a very basic example of applying Test-Driven Development (TDD) with PySpark, Spark's Python API.
- Use brew to install Apache Spark:
brew install apache-spark
- Change logging settings:
cd /usr/local/Cellar/apache-spark/2.1.0/libexec/conf
cp log4j.properties.template log4j.properties
- In log4j.properties, change the root logger level from INFO to ERROR:
log4j.rootCategory=ERROR, console
- Add this to your bash profile:
export SPARK_HOME="/usr/local/Cellar/apache-spark/2.1.0/libexec/"
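Depending on how you invoke the tests, plain Python may also need to find the pyspark package. A commonly used addition to the bash profile is sketched below; the py4j zip name is version-dependent (0.10.4 is assumed here for Spark 2.1.0), so check the actual filename under $SPARK_HOME/python/lib:

```shell
# Assumed paths: make pyspark importable from a plain Python process.
# Verify the py4j-*-src.zip version shipped with your Spark release.
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$PYTHONPATH"
```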
- Use nosetests to run the test:
nosetests -vs test_clustering.py
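The contents of test_clustering.py are not reproduced here. As an illustration only (hypothetical function and test names, not the real file), a nose-discoverable test often isolates the pure-Python part of a clustering job so it can be unit-tested without starting a SparkContext; in the Spark job the same function would then be applied to each point via rdd.map():

```python
import math


def closest_centroid(point, centroids):
    """Return the index of the centroid nearest to point (Euclidean distance)."""
    distances = [
        math.sqrt(sum((p - c) ** 2 for p, c in zip(point, centroid)))
        for centroid in centroids
    ]
    return distances.index(min(distances))


def test_point_is_assigned_to_nearest_centroid():
    # Two well-separated centroids: points near each one should map to it.
    centroids = [(0.0, 0.0), (10.0, 10.0)]
    assert closest_centroid((1.0, 1.0), centroids) == 0
    assert closest_centroid((9.0, 9.0), centroids) == 1
```

Keeping the distance logic free of Spark dependencies is what makes the red-green TDD loop fast: nosetests exercises it directly, and only integration tests need a real SparkContext.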
- Tested with:
Apache Spark 2.1.0
Python 3.5
nose 1.3.7