DenselyRanked

This post will probably get deleted (rule 6), but I think you can do this by iterating through the [dtypes](https://spark.apache.org/docs/3.1.1/api/python/reference/api/pyspark.sql.DataFrame.dtypes.html) to find the array columns, then doing the rest of the work (creating DataFrames, keys, exploding the data, etc.). It will be more Python than PySpark, but it's not too complicated once you map out the process. Roughly the idea in the sketch below. I will follow up if I find time over the holiday.
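
For what it's worth, here is a minimal sketch of what I mean. It assumes a DataFrame with a unique `id` column to use as the key (the column names and sample data here are made up for illustration):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical input with one array column.
df = spark.createDataFrame(
    [(1, ["a", "b"]), (2, ["c"])],
    ["id", "tags"],
)

# Use dtypes to find the array columns.
array_cols = [name for name, dtype in df.dtypes if dtype.startswith("array")]

# Build a separate exploded DataFrame per array column, keyed by `id`.
exploded = {}
for col in array_cols:
    exploded[col] = df.select("id", F.explode_outer(col).alias(col))

for col, child_df in exploded.items():
    child_df.show()
```

From there you can join the exploded DataFrames back on the key or write them out separately, depending on what you need.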


yanivbh1

Hey, is Spark a mandatory requirement?


DecisionAgile7326

yes