In a Jupyter Notebook using the PySpark3 kernel (HDInsight Spark cluster), I get the error "The code failed because of a fatal error: Neither SparkSession nor HiveContext/SqlContext is available", even though I followed all the steps described in "Safely manage Python environment on Azure HDInsight using Script Action": https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-python-package-installation?fbclid=IwAR1eDOEaBrRgJQMZY8HeCPW_C9pfVw6Qq6MOhA7q_PKpfzRa2R51QDR1dlE

Could you help me? The best option for me would be to add one new package (SpaCy) to the PySpark3 kernel (the py35 environment). How can I do this?
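For reference, the approach in the linked doc boils down to a script action that installs into the specific conda environment backing the kernel, rather than the base Anaconda environment. Below is a minimal sketch of such a script; the /usr/bin/anaconda/envs/py35 path and the conda-forge channel are assumptions based on the HDInsight 3.6 layout that doc describes, so verify them on your cluster:

#!/usr/bin/env bash
# Minimal sketch of an HDInsight script action that installs SpaCy into the
# py35 environment backing the PySpark3 kernel (paths assumed per the linked doc).

# Install into py35 directly; avoid modifying the base Anaconda environment,
# since changing its Python is a known way to break the Livy session and
# trigger "Neither SparkSession nor HiveContext/SqlContext is available".
sudo /usr/bin/anaconda/envs/py35/bin/conda install -c conda-forge spacy --yes

# Sanity-check with the same interpreter the kernel uses.
sudo /usr/bin/anaconda/envs/py35/bin/python -c "import spacy; print(spacy.__version__)"

Per the doc, the script action should be applied to head and worker nodes so executors can also import the package; if the kernel still fails to start afterwards, the Livy logs on the head node usually show why the Spark session could not be created.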


1 Answer

Waiting for an expert to reply.
