Spark local mode: processing big files on an ordinary laptop
On January 4, a new version was released; this article is an introduction, written before describing experience of using it in real projects. Spark runs on most operating systems and can be launched in local mode even on an ordinary laptop. Given how easy Spark is to set up in this case, it would be a shame not to use its core features. In this article we will look at how to quickly configure, on a laptop, the processing of a large file (larger than the computer's RAM) using ordinary SQL queries. This makes such queries accessible even to an untrained user. Additionally connecting an IPython (Jupyter) notebook makes it possible to produce full-fledged reports. The article walks through a simple example of processing a file; other Python examples are available here.