Spark-Shell Startup Errors

I downloaded the Spark binaries for Windows and ran into some issues when starting the spark-shell batch file. Below are the errors and their respective solutions:

Problem 1

ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(
at org.apache.hadoop.util.Shell.getWinUtilsPath(
at org.apache.hadoop.util.Shell.<clinit>(
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.findHadoopBinary(
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(
at org.apache.hadoop.hive.conf.HiveConf.<clinit>(


Download winutils.exe (matching your Hadoop version) and copy it to some directory, say C:/hadoop/bin

Define the environment variable HADOOP_HOME as C:/hadoop (the parent of the bin directory, not the bin directory itself)
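From a Windows command prompt, the two steps above can be sketched as follows (C:\hadoop is just the example directory used here; substitute wherever you actually placed winutils.exe):

```shell
:: Assumed layout: winutils.exe was copied to C:\hadoop\bin
:: Set HADOOP_HOME for the current cmd session:
set HADOOP_HOME=C:\hadoop

:: Persist it for future sessions (setx does NOT affect the current window):
setx HADOOP_HOME C:\hadoop

:: Optionally put the bin directory on PATH so winutils.exe resolves directly:
set PATH=%HADOOP_HOME%\bin;%PATH%
```

Note that HADOOP_HOME must point at the directory *containing* bin; Hadoop appends \bin\winutils.exe itself, which is why an unset variable produces the `null\bin\winutils.exe` path in the error above.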

Problem 2

ERROR SparkContext: Error initializing SparkContext.
java.lang.AssertionError: assertion failed: Expected hostname
at scala.Predef$.assert(Predef.scala:170)
at org.apache.spark.util.Utils$.checkHost(Utils.scala:931)
at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:31)


Set the environment variable SPARK_LOCAL_HOSTNAME to localhost
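This assertion often fires when Spark picks up a machine hostname it considers invalid (for example, a Windows computer name containing an underscore). Overriding the hostname can be sketched as:

```shell
:: Force Spark to bind using "localhost" instead of the machine's hostname:
set SPARK_LOCAL_HOSTNAME=localhost

:: Persist it for future cmd sessions as well:
setx SPARK_LOCAL_HOSTNAME localhost

:: Then relaunch the shell:
spark-shell
```

Setting it with `set` is enough to test the fix immediately; `setx` makes it stick across new command windows.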