Course Details
Note: This course builds on the "Real World Vagrant - Build an Apache Spark Development Env! - Toyin Akin" course. If you do not already have a Spark environment installed (inside a VM or directly on your machine), you can take that course first.
Jupyter Notebook is a system, similar to Mathematica, that allows you to create "executable documents". Notebooks integrate formatted text (Markdown), executable code (Python), mathematical formulas (LaTeX), and graphics and visualizations (matplotlib) into a single document that captures the flow of an exploration and can be exported as a formatted report or an executable script.
The Jupyter Notebook is a web application that allows you to create and share documents that contain live code, equations, visualizations and explanatory text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, machine learning and much more.
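To make that concrete, here is a minimal sketch of what a single notebook code cell might contain; when run inside Jupyter, the figure renders inline beneath the cell. The sine-curve example is illustrative only, not taken from the course material:

```python
# Illustrative notebook cell: executable code plus an inline matplotlib figure.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 100)   # 100 sample points on [0, 2*pi]
plt.plot(x, np.sin(x), label="sin(x)")
plt.legend()
plt.title("Inline plot rendered in a Jupyter cell")
plt.show()
```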
Big data integration
Leverage big data tools, such as Apache Spark, from Python
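As a rough sketch of what "Spark from Python" looks like, assuming Spark and the pyspark package are already installed (for example via the Vagrant environment mentioned above):

```python
# Hedged sketch: a local SparkContext running one small distributed job.
from pyspark import SparkContext

sc = SparkContext("local[*]", "PythonSparkDemo")  # local mode, all cores
rdd = sc.parallelize(range(1, 11))                # distribute the numbers 1..10
print(rdd.map(lambda n: n * n).sum())             # sum of squares => 385
sc.stop()
```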
The Jupyter Notebook is based on a set of open standards for interactive computing. Think HTML and CSS for interactive computing on the web. These open standards can be leveraged by third party developers to build customized applications with embedded interactive computing.
Spark Monitoring and Instrumentation
As you create RDDs, perform transformations, and execute actions, you will be working heavily within the monitoring view of the Web UI.
Every SparkContext launches a web UI, by default on port 4040, that displays useful information about the application. This includes:
- A list of scheduler stages and tasks
- A summary of RDD sizes and memory usage
- Environmental information
- Information about the running executors
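The sketch below (illustrative names, local master assumed) shows the kind of job that populates those views; each action appears as a job at http://localhost:4040 for as long as the SparkContext is alive:

```python
# Hedged sketch: generate tasks and stages that are visible in the Web UI.
from pyspark import SparkContext

sc = SparkContext("local[*]", "WebUIDemo")
rdd = sc.parallelize(range(1000000), numSlices=8)  # 8 partitions => 8 tasks
evens = rdd.filter(lambda n: n % 2 == 0)           # transformation (lazy)
print(evens.count())                               # action: schedules a stage
input("Inspect http://localhost:4040, then press Enter to stop...")
sc.stop()
```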
Why Apache Spark ...
Apache Spark runs programs up to 100x faster than Hadoop MapReduce in memory, or 10x faster on disk. Apache Spark has an advanced DAG (directed acyclic graph) execution engine that supports acyclic data flow and in-memory computing. Apache Spark offers over 80 high-level operators that make it easy to build parallel apps, and you can use it interactively from the Scala, Python and R shells. Apache Spark can combine SQL, streaming, and complex analytics.
Apache Spark powers a stack of libraries including SQL and DataFrames, MLlib for machine learning, GraphX, and Spark Streaming. You can combine these libraries seamlessly in the same application.
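As a small, hedged illustration of that "same application" point, the snippet below mixes the DataFrame API with Spark SQL; the table name "people" and the sample rows are invented for the example:

```python
# Hedged sketch: DataFrame API and Spark SQL combined in one application.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("StackDemo").getOrCreate()
df = spark.createDataFrame(
    [("Alice", 34), ("Bob", 45), ("Carol", 29)], ["name", "age"]
)
df.createOrReplaceTempView("people")   # expose the DataFrame to SQL
spark.sql("SELECT name FROM people WHERE age > 30").show()
spark.stop()
```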
Updated on 22 March, 2018