I have a dynamic computation model based on which I run these transformations
Looks like "dynamic computation model" means "I write my code as if Scala had no types". We can help you with modelling, but we need simple but complete code samples showing what your task is
Ok, I can share that, but it's isolated. Still, you guys should be able to answer how to implement different collection types as generics in the same class, right?
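A minimal sketch of what that usually looks like, assuming you want one class that works over any stdlib collection (the name `Box` is hypothetical): bound a higher-kinded type parameter by `Iterable` so the class can use the shared collection API without knowing the concrete type.

```scala
import scala.language.higherKinds // needed on Scala 2 for the F[_] syntax

// One class, parameterized over both the collection constructor C
// and the element type A; C is constrained to stdlib collections.
class Box[C[X] <: Iterable[X], A](val items: C[A]) {
  def first: Option[A] = items.headOption // works for any Iterable
  def count: Int       = items.size
}

val fromList   = new Box[List, Int](List(1, 2, 3))
val fromVector = new Box[Vector, String](Vector("a", "b"))
```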
Are you sure? Here is the explanation from the documentation: "A Resilient Distributed Dataset (RDD), the basic abstraction in Spark. Represents an immutable, partitioned collection of elements that can be operated on in parallel."
It could be understood as a collection semantically, but not as a collection in the Scala stdlib sense
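A short sketch of the distinction being made here: RDD exposes collection-like operations (`map`, `filter`, ...), but `RDD[T]` does not extend `Iterable[T]` or any other `scala.collection` type, so it won't satisfy a generic bound like the `C[X] <: Iterable[X]` above. You only get a real local collection after materializing it with `collect()`.

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("demo").getOrCreate()
val rdd: RDD[Int] = spark.sparkContext.parallelize(Seq(1, 2, 3))

rdd.map(_ + 1).filter(_ % 2 == 0)     // collection-like API, but lazy and distributed
// val it: Iterable[Int] = rdd        // does not compile: RDD[Int] is not an Iterable[Int]
val local: Array[Int] = rdd.collect() // materializing to the driver yields a plain local collection
```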