Jupyter magics and kernels for working with remote Spark clusters
Sparkmagic is a set of tools for interactively working with remote Spark clusters in Jupyter notebooks. Sparkmagic interacts with remote Spark clusters through a REST server (Livy). Features:

- Automatic visualization of SQL queries in the PySpark, Spark, and SparkR kernels; use an easy visual interface to interactively construct visualizations, no code required.
- Ability to capture the output of SQL queries as Pandas dataframes to interact with other Python libraries (e.g. matplotlib).
- Send local files or dataframes to a remote cluster (e.g. sending a pretrained local ML model straight to the Spark cluster).
- Authenticate to Livy via Basic Access authentication or via Kerberos.
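Sparkmagic reads the Livy endpoint and credentials from a JSON config file (by default `~/.sparkmagic/config.json`). A minimal sketch of the relevant fragment, assuming a Livy server on localhost and Basic Access authentication; the field names follow sparkmagic's example config, so check the project's `example_config.json` for the authoritative layout:

```json
{
  "kernel_python_credentials": {
    "username": "your-user",
    "password": "your-password",
    "url": "http://localhost:8998",
    "auth": "Basic_Access"
  }
}
```

The `url` should point at the cluster's Livy REST server; for Kerberos, the `auth` field is set accordingly instead of supplying a username and password.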
This operating system is built specifically for Raspberry Pi web server administration.
Note:
After logging in, run the following commands to resize the root partition so the OS can use the full disk:
sudo parted
(parted) print
(parted) resizepart
Partition number? 2
End? 100%
(parted) quit
After parted finishes, grow the filesystem (replace /dev/mmcblk0p2 with your root partition, as shown by print):
sudo resize2fs /dev/mmcblk0p2
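The interactive parted session above can also be sketched non-interactively. This is only a dry run that prints the commands rather than executing them; the device name /dev/mmcblk0 and partition number 2 are assumptions, so verify them with parted's print first and run the printed commands with care:

```shell
# Assumed device and root partition number -- check with: sudo parted /dev/mmcblk0 print
DISK=/dev/mmcblk0
PART=2
# Build the non-interactive equivalents of the parted/resize2fs steps above.
CMD_RESIZE_PART="sudo parted $DISK resizepart $PART 100%"
CMD_RESIZE_FS="sudo resize2fs ${DISK}p${PART}"
# Dry run: print the commands instead of executing them.
echo "$CMD_RESIZE_PART"
echo "$CMD_RESIZE_FS"
```

Removing the echo indirection and running the two commands performs the same expansion as the interactive session.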
For editing the swap memory size:
sudo nano /etc/dphys-swapfile
change:
CONF_SWAPSIZE=100
to:
CONF_SWAPSIZE=1000 (for roughly 1 GB of swap memory)
Press Ctrl+X, then Y, to save and exit nano, then reboot:
sudo reboot
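The same edit can be scripted instead of done by hand in nano. A minimal sketch that demonstrates the substitution on a local demo copy (dphys-swapfile.demo is a hypothetical file for illustration; the real file is /etc/dphys-swapfile and editing it requires sudo):

```shell
# Make a demo copy containing the default swap size line.
printf 'CONF_SWAPSIZE=100\n' > dphys-swapfile.demo
# Replace the swap size with 1000 MB, the same change made in nano above (GNU sed assumed).
sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=1000/' dphys-swapfile.demo
cat dphys-swapfile.demo   # CONF_SWAPSIZE=1000
```

On the Pi itself the sed command would target /etc/dphys-swapfile under sudo, followed by sudo reboot to apply the new swap size.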
For make...
The Cocoon Adaption Kit is a NetKernel module which enables the use of Cocoon components (Generators, Transformers, Serializers, Actions) from within the NetKernel XML Application Server.