problems with spark 2.3.x and Alluxio 2.8.0 java.lang.IllegalArgumentException: port out of range:-1

4 messages

Anton Kulaga
I am trying to make Spark 2.3.0 and Alluxio work together. However, I constantly get errors when I run a simple test from the website tutorial, like:
"""
val s = sc.textFile("alluxio://alluxio-master:19998/LICENSE")
s.count
"""

I get the following nasty errors:
"""
s: org.apache.spark.rdd.RDD[String] = alluxio://alluxio-master:19998/LICENSE MapPartitionsRDD[15] at textFile at <console>:46
java.lang.IllegalArgumentException: port out of range:-1
  at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
  at java.net.InetSocketAddress.<init>(InetSocketAddress.java:224)
  at alluxio.util.network.NetworkAddressUtils.getConnectAddress(NetworkAddressUtils.java:218)
  at alluxio.master.MasterInquireClient$Factory.getConnectDetails(MasterInquireClient.java:100)
  at alluxio.hadoop.AbstractFileSystem.connectDetailsMatch(AbstractFileSystem.java:540)
  at alluxio.hadoop.AbstractFileSystem.initialize(AbstractFileSystem.java:463)
  at alluxio.hadoop.FileSystem.initialize(FileSystem.java:27)
  at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
  at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
  at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
  at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
  at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
  at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
  at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:258)
  at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
  at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
  at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:200)
  at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
  at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
  at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
  at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
  at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2099)
  at org.apache.spark.rdd.RDD.count(RDD.scala:1162)
  ... 64 elided
"""

My current docker stack configuration is https://github.com/antonkulaga/bigdata-docker/blob/master/bigdata-extended.yml, and Alluxio itself seems to work fine (all tests pass on the Alluxio master and worker nodes, and the web UI works). I have also made sure to add the Alluxio client to the Spark classpath in Zeppelin, roughly as shown below.
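For reference, this is the kind of spark-defaults.conf entry I mean; the jar path here is only a placeholder, the actual location and version depend on how the Alluxio client is packaged in the image:
"""
# Hypothetical spark-defaults.conf entries: put the Alluxio client jar on both
# the driver and executor classpaths (adjust the path/version to your setup).
spark.driver.extraClassPath   /opt/alluxio/client/alluxio-<version>-client.jar
spark.executor.extraClassPath /opt/alluxio/client/alluxio-<version>-client.jar
"""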

Re: problems with spark 2.3.x and Alluxio 2.8.0 java.lang.IllegalArgumentException: port out of range:-1

Andrew Audibert
Hi Anton,

It looks like something is setting the "alluxio.master.port" property to -1. Could you be setting it in your /spark/conf/spark-defaults.conf? For debugging, try looking through the log4j output from Spark, and try increasing the log level to debug.
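For example, something along these lines; the property names are standard Spark settings and Alluxio 1.x client properties, but the hostname/port values are just taken from your alluxio://alluxio-master:19998 URI, so treat them as an illustrative sketch rather than known-correct config:
"""
# In /spark/conf/spark-defaults.conf, check for any Alluxio overrides; the client
# can also be pointed at the master explicitly via JVM system properties:
spark.driver.extraJavaOptions   -Dalluxio.master.hostname=alluxio-master -Dalluxio.master.port=19998
spark.executor.extraJavaOptions -Dalluxio.master.hostname=alluxio-master -Dalluxio.master.port=19998

# In /spark/conf/log4j.properties, raise Alluxio client logging to DEBUG:
log4j.logger.alluxio=DEBUG
"""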

Does the error persist across restarting the spark shell?

- Andrew

Re: problems with spark 2.3.x and Alluxio 2.8.0 java.lang.IllegalArgumentException: port out of range:-1

Anton Kulaga
In reply to this post by Anton Kulaga
After playing with the config a bit, I started to get Thrift errors instead (see https://alluxio.atlassian.net/browse/ALLUXIO-3298?atlOrigin=eyJpIjoiMWU3NDk0YjlkMjNiNGY3NGJhMTYzY2E1NDQ3ZDhkZTEiLCJwIjoiaiJ9 ).

Re: problems with spark 2.3.x and Alluxio 2.8.0 java.lang.IllegalArgumentException: port out of range:-1

Andrew Audibert
In case anyone else has this problem in the future, could you share what config you needed to change?

On Thursday, August 16, 2018 at 4:01:50 PM UTC-7, Anton Kulaga wrote:
After playing with the config a bit, I started to get Thrift errors instead (see https://alluxio.atlassian.net/browse/ALLUXIO-3298?atlOrigin=eyJpIjoiMWU3NDk0YjlkMjNiNGY3NGJhMTYzY2E1NDQ3ZDhkZTEiLCJwIjoiaiJ9 ).
