java - Overhead of GC for memory in the JVM vs Swift-style ARC -


The company I work in has somewhat differing viewpoints regarding the JVM as a development platform.

Based on the paper here - http://people.cs.umass.edu/~emery/pubs/gcvsmalloc.pdf

they are saying the Oracle JVM requires a 3-5x memory overhead, i.e. to operate a 1 GB JVM application you need 3-5 GB of RAM to counteract the JVM overhead, and that Swift-style ARC answers the GC issues.

I have made the counter-arguments that the study was not conducted on the Oracle/Sun JVM but on an experimental VM, and that ARC has problems of its own, such as circular references.

Is there research conducted on the exact/approximate overhead of GC memory in the JVM? I could not find any.

My questions, summarized:

1) Is there a visible overhead to GC? Because a 3-5x cost in RAM seems unreasonable, if it is in fact true.

Also, big data applications such as Apache Spark, HBase, and Cassandra operate at terabyte/petabyte memory scale. If there were such an overhead in GC, why would anyone develop on such a platform?

2) ARC is considered inferior to other runtime tracing GC algorithms. If that is true, it would be helpful if there were papers directly comparing the effects of ARC's compile-time malloc/free against the JVM's runtime GC cleanup.

There is a claim by Chris Lattner that GC consumes 3-5x the memory, here - https://lists.swift.org/pipermail/swift-evolution/week-of-mon-20160208/009422.html

Is there a visible overhead to GC? Because a 3-5x cost in RAM seems unreasonable, if it is in fact true.

This is a misunderstanding. You can run the JVM with 99% of the heap used, GCing regularly. If you give the application more memory, it is able to work more efficiently: adding more memory to the heap can improve throughput. I have seen this work up to about 3x. Except in extreme cases, you are unlikely to see a benefit in adding more than that.
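To make the trade-off concrete, here is a minimal sketch (class and method names are my own) that reports the heap ceiling versus current usage. Run it with different `-Xmx` values to see that the headroom is a tunable throughput/RAM trade, not a fixed 3-5x requirement.

```java
// Sketch: inspect how much heap the JVM was given vs. how much is in use.
// Run with e.g. -Xmx256m to see the ceiling change.
public class HeapHeadroom {
    static long usedBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long max = Runtime.getRuntime().maxMemory();  // the -Xmx ceiling
        long used = usedBytes();
        // The JVM collects more often as 'used' approaches 'max'; extra
        // headroom trades RAM for fewer/cheaper collections.
        System.out.printf("max=%d MB, used=%d MB, headroom=%.0f%%%n",
                max >> 20, used >> 20, 100.0 * (max - used) / max);
    }
}
```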

Also, big data applications such as Apache Spark, HBase, and Cassandra operate at terabyte/petabyte memory scale. If there were such an overhead in GC, why would anyone develop on such a platform?

When working with big data, you use memory-mapped files and off-heap memory. This places the bulk of the data under the application's management, not the GC's. This is no different from how a database written in C++ might operate.
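As an illustration (names are mine, a minimal sketch): the mapped pages below live in the OS page cache, not on the Java heap, so the garbage collector never scans them.

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedFileDemo {
    // Write and read a long through a memory-mapped region; the data lives
    // outside the GC-managed heap, in pages backed by the file.
    static long roundTrip() throws IOException {
        Path p = Files.createTempFile("offheap", ".bin");
        try (FileChannel ch = FileChannel.open(p,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, 4096);
            buf.putLong(0, 42L);       // write through the mapping
            return buf.getLong(0);     // read it back
        } finally {
            Files.deleteIfExists(p);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("mapped read-back: " + roundTrip()); // prints 42
    }
}
```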

ARC is considered inferior to other runtime tracing GC algorithms.

I couldn't comment on how smart ARC is. Java doesn't place restrictions on how the GC should operate, but the sub-text is that it has to at least handle circular references. Anything less is assumed to be unacceptable.
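The circular-reference point can be demonstrated directly. In this sketch (my own names; `System.gc()` is only a hint, so the outcome is likely but not guaranteed), a two-node cycle becomes unreachable and a tracing collector reclaims it, which naive reference counting cannot do.

```java
import java.lang.ref.WeakReference;

public class CycleDemo {
    static final class Node { Node next; }

    // Build a two-node cycle, drop all external references, then ask the JVM
    // to collect. Returns true once the weak probe reports the cycle is gone.
    static boolean cycleCollected() throws InterruptedException {
        Node a = new Node();
        Node b = new Node();
        a.next = b;
        b.next = a;                               // circular reference
        WeakReference<Node> probe = new WeakReference<>(a);
        a = null;
        b = null;                                 // cycle unreachable from any root
        for (int i = 0; i < 20 && probe.get() != null; i++) {
            System.gc();                          // a hint, not a guarantee
            Thread.sleep(10);
        }
        return probe.get() == null;
    }

    public static void main(String[] args) throws InterruptedException {
        // A tracing collector reclaims the cycle despite the mutual references.
        System.out.println(cycleCollected() ? "cycle collected" : "still reachable");
    }
}
```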

BTW Java uses malloc/free via direct ByteBuffers.
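For instance (a minimal sketch, names are mine): `ByteBuffer.allocateDirect` takes its memory from a native allocation outside the Java heap.

```java
import java.nio.ByteBuffer;

public class DirectBufferDemo {
    // Direct buffers are backed by native (malloc-style) memory outside the
    // Java heap; it is released when the buffer object itself is reclaimed.
    static int roundTrip() {
        ByteBuffer buf = ByteBuffer.allocateDirect(1 << 20); // 1 MB off-heap
        buf.putInt(0, 7);
        return buf.getInt(0);
    }

    public static void main(String[] args) {
        System.out.println("direct read-back: " + roundTrip()); // prints 7
    }
}
```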

Jobs with data sets of, say, 1 GB

What makes a data set 1 GB? Compressed on disk it might be 100 MB. The raw uncompressed data might be 1 GB. The in-memory data structure might be 2 GB, and throughput might be faster if you use an extra 1 or 2 GB to work on that data structure.
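The disk-vs-raw gap is easy to see. This sketch (my own names, synthetic data) gzips a repetitive CSV-like payload; real log/CSV data sets often shrink by a similarly large factor, which is why "100 MB on disk" can mean "1 GB raw".

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class SizeDemo {
    // Compress a repetitive "data set" and compare sizes.
    static double compressionRatio() throws IOException {
        byte[] raw = "timestamp,value\n2021-01-01,42\n".repeat(100_000)
                .getBytes(StandardCharsets.UTF_8);       // ~3 MB raw
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(raw);
        }
        return (double) raw.length / out.size();
    }

    public static void main(String[] args) throws IOException {
        System.out.printf("raw is %.0fx larger than gzip%n", compressionRatio());
    }
}
```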

