# Jupyterlab debug

2/3/2024

## Quick Start

To do a quick test of the extension:

```
docker run -it -p 8888:8888 itsjafer/sparkmonitor
```

This Docker image has PySpark and several other related packages installed alongside the sparkmonitor extension.

## Setting up the extension

```
pip install jupyterlab-sparkmonitor # install the extension

# set up an IPython profile and add the kernel extension to it
ipython profile create --ipython-dir=.ipython
echo "c.InteractiveShellApp.extensions.append('sparkmonitor.kernelextension')" >> .ipython/profile_default/ipython_config.py

# run jupyter lab
IPYTHONDIR=.ipython jupyter lab --watch
```

With the extension installed, a SparkConf object called `conf` will be usable from your notebooks. You can use it as follows:

```python
from pyspark import SparkContext

# start the spark context using the SparkConf the extension inserted
sc = SparkContext.getOrCreate(conf=conf)

# the monitor should spawn below the cell, showing 4 jobs
```

If you already have your own Spark configuration, you will need to set `spark.extraListeners` to the sparkmonitor listener class and `spark.driver.extraClassPath` to the path to the listener jar inside the sparkmonitor Python package (path/to/package/sparkmonitor/listener.jar):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .config('spark.extraListeners', 'sparkmonitor.listener.JupyterSparkMonitorListener') \
    .config('spark.driver.extraClassPath', 'venv/lib/python3.7/site-packages/sparkmonitor/listener.jar') \
    .getOrCreate()
```
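The same two settings can also be applied outside notebook code, in Spark's `spark-defaults.conf`, so that every session picks up the listener. This is a sketch, not from the original post; the jar path is illustrative and depends on where pip installed the sparkmonitor package in your environment.

```
# conf/spark-defaults.conf
# hypothetical paths -- adjust to your own sparkmonitor install location
spark.extraListeners         sparkmonitor.listener.JupyterSparkMonitorListener
spark.driver.extraClassPath  /path/to/site-packages/sparkmonitor/listener.jar
```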