```python linenums="1"
from collections import namedtuple

from pyspark.sql import SparkSession

# Event is assumed to be a simple record type, e.g. a namedtuple.
Event = namedtuple("Event", ["id", "name"])

# "local[*]" runs Spark locally with as many worker threads as cores.
spark = (
    SparkSession.builder.master("local[*]").appName("test").getOrCreate()
)
d = [
    Event(1, "abc"),
    Event(2, "ddd"),
]
spark
```
```txt
master MASTER_URL --> deployment mode, e.g. spark://host:port, mesos://host:port, yarn, or local.
```
- PROCESS_LOCAL: data is in the same JVM as the running code. This is the best locality possible.
- NODE_LOCAL: data is on the same node. Examples might be in...
Recently I read a blog post, Structured Streaming in PySpark, which is implemented on the Databricks platform. Then I tried to implement it in my local Spark. Some...