
apache spark - How does web UI calculate Storage Memory (in Executors tab)?

I'm trying to understand how Spark 2.1.0 allocates memory on nodes.

Suppose I'm starting a local PySpark REPL assigning it 2GB of memory:

$ pyspark --conf spark.driver.memory=2g

The Spark UI shows that 956.6 MB are allocated for storage memory:

(screenshot: the Executors tab showing Storage Memory of 956.6 MB)

I don't understand how that number is arrived at. This is my thinking process:

  1. Driver heap size is set to 2048 MB,
  2. According to docs: (2048 MB - 300 MB) * 0.6 = 1048.8 MB are used for both execution and storage regions (unified),
  3. Additionally, 1048.8 MB * 0.5 = 524.4 MB within the unified region should be reserved as the storage region immune to eviction (the same arithmetic is sketched in code after this list)
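Here is that arithmetic as a minimal Scala sketch (my own reconstruction of steps 1-3, assuming the defaults spark.memory.fraction = 0.6 and spark.memory.storageFraction = 0.5):

// expected numbers, assuming the heap is exactly 2048 MB
val heapMb     = 2048.0
val reservedMb = 300.0                        // fixed reserved memory
val unifiedMb  = (heapMb - reservedMb) * 0.6  // 1048.8 MB for execution + storage
val storageMb  = unifiedMb * 0.5              // 524.4 MB immune storage region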

So, how was the value 956.6 MB in Spark actually calculated?


1 Reply


You seem to be using local mode (with one driver that also acts as the only executor), but the reasoning below should also apply to the other cluster modes.

Enable the INFO logging level for BlockManagerMasterEndpoint to see how much memory Spark ends up with for the property you set on the command line (as spark.driver.memory), e.g. by adding the following to conf/log4j.properties:

log4j.logger.org.apache.spark.storage.BlockManagerMasterEndpoint=INFO

When you start spark-shell --conf spark.driver.memory=2g you'll see the following:

$ ./bin/spark-shell --conf spark.driver.memory=2g
...
17/05/07 15:20:50 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.8:57177 with 912.3 MB RAM, BlockManagerId(driver, 192.168.1.8, 57177, None)

As you can see, the available memory is 912.3 MB, which is calculated as follows (see UnifiedMemoryManager.getMaxMemory). Note that the calculation starts from Runtime.getRuntime.maxMemory, the maximum heap the JVM actually reports, which is somewhat less than the 2 GB you requested with spark.driver.memory; that is why you don't arrive at the 1048.8 MB from your step 2.

// local mode with --conf spark.driver.memory=2g
scala> sc.getConf.getSizeAsBytes("spark.driver.memory")
res0: Long = 2147483648

// maximum heap the JVM reports (slightly less than the -Xmx setting)
scala> val systemMemory = Runtime.getRuntime.maxMemory

// fixed amount of memory for non-storage, non-execution purposes
scala> val reservedMemory = 300 * 1024 * 1024

// minimum system memory required
scala> val minSystemMemory = (reservedMemory * 1.5).ceil.toLong

scala> val usableMemory = systemMemory - reservedMemory

scala> val memoryFraction = sc.getConf.getDouble("spark.memory.fraction", 0.6)

scala> val maxMemory = (usableMemory * memoryFraction).toLong
maxMemory: Long = 956615884

scala> import org.apache.spark.network.util.JavaUtils

scala> JavaUtils.byteStringAsMb(maxMemory + "b")
res1: Long = 912
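For completeness, here's the same calculation as a tiny standalone program (a simplified sketch modeled on UnifiedMemoryManager.getMaxMemory; the object name is hypothetical, not Spark's):

// simplified sketch of the getMaxMemory calculation above
object MaxMemorySketch extends App {
  val reservedMemory = 300L * 1024 * 1024            // fixed 300 MB reserve
  val systemMemory   = Runtime.getRuntime.maxMemory  // JVM max heap, below -Xmx
  val memoryFraction = 0.6                           // spark.memory.fraction default
  val maxMemory = ((systemMemory - reservedMemory) * memoryFraction).toLong
  println(s"unified region: $maxMemory bytes = ${maxMemory / 1024.0 / 1024.0} MiB")
}

Run with a 2 GB max heap (-Xmx2g), it prints roughly the 912 MiB figure from the log above.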

Now let's review how the web UI calculates the memory. It is supposed to simply display the value computed above, but it does its own conversion, and that's the surprising part.

How Storage Memory is displayed in the web UI is controlled by the custom JavaScript function formatBytes in utils.js, which (translated to Scala) looks as follows:

def formatBytes(bytes: Double) = {
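  // note: base 1000 (decimal units), not 1024 (binary units)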
  val k = 1000
  val i = math.floor(math.log(bytes) / math.log(k))
  val maxMemoryWebUI = bytes / math.pow(k, i)
  f"$maxMemoryWebUI%1.1f"
}
scala> println(formatBytes(maxMemory))
956.6

956.6! That's exactly what the web UI shows, and it is quite different from what Spark's UnifiedMemoryManager considers the available memory. Quite surprising, isn't it?
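To see that the only difference is the unit base, here's a small variant (my own sketch, not Spark code) parameterized by the base:

def format(bytes: Double, k: Int) = {
  val i = math.floor(math.log(bytes) / math.log(k))
  f"${bytes / math.pow(k, i)}%1.1f"
}
scala> println(format(maxMemory, 1000))  // decimal units, as the web UI does
956.6
scala> println(format(maxMemory, 1024))  // binary units, matching the log output
912.3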


I think it's a bug and filed it as SPARK-20691.

