parent
15d8b52193
commit
2252be0709
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,180 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Experimenting your Winpython installation\n",
|
||||||
|
"\n",
|
||||||
|
" . [Winpython_checker test, to see various packages](Winpython_checker.ipynb) \n",
|
||||||
|
" \n",
|
||||||
|
" . [Seaborn visualization Example](seaborn_demo_from_jakevdp.ipynb)\n",
|
||||||
|
" \n",
|
||||||
|
" . [QT libraries Example](Qt_libraries_demo.ipynb)\n",
|
||||||
|
"\n",
|
||||||
|
" . [Pandas Data-science example](dplyr_pandas.ipynb)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Tutorials and Demonstrations on Internet\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"## Introduction to DataScience\n",
|
||||||
|
" . [Python Data Science Handbook](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/README.md)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Games and Statistics\n",
|
||||||
|
" . [Pythonic Perambulations](http://jakevdp.github.io) from Jake Vanderplas, in particular http://jakevdp.github.io/blog/2017/12/18/simulating-chutes-and-ladders/\n",
|
||||||
|
" \n",
|
||||||
|
" . [Peter Norvig Studies](https://github.com/norvig/pytudes/tree/master/ipynb) from Peter Norvig, in particular http://norvig.com/sudoku.html"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Ipython Notebook Documentation\n",
|
||||||
|
" \n",
|
||||||
|
" . [IPython notebook-based online documentation](https://nbviewer.ipython.org/github/ipython/ipython/blob/master/examples/Index.ipynb)\n",
|
||||||
|
" \n",
|
||||||
|
" . [Galery of Interesting Notebooks](https://github.com/ipython/ipython/wiki/A-gallery-of-interesting-IPython-Notebooks)\n",
|
||||||
|
" \n",
|
||||||
|
" . Videos of Conferences and Trainings: [Europython](https://www.youtube.com/user/PythonItalia/playlists?shelf_id=4&view=50&sort=dd), [Pydata](https://www.youtube.com/user/PyDataTV) , [Scipy](https://www.youtube.com/user/EnthoughtMedia), [EuroScipy](https://www.youtube.com/channel/UCruMegFU9dg2doEGOUaAWTg), [Pycon 2018](https://www.youtube.com/channel/UCsX05-2sVSH7Nx3zuk3NYuQ/featured) , [Pycon 2017](https://www.youtube.com/channel/UCrJhliKNQ8g0qoE_zvL8eVg), [Pycon 2016](https://www.youtube.com/channel/UCwTD5zJbsQGJN75MwbykYNw)\n",
|
||||||
|
" "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Pandas\n",
|
||||||
|
"\n",
|
||||||
|
". Beginners Training Video: [\"Brandon Rhodes - Pandas From The Ground Up - PyCon 2015 \"](https://www.youtube.com/watch?v=5JnMutdy6Fw)\n",
|
||||||
|
"\n",
|
||||||
|
". Pandas [API reference](https://pandas.pydata.org/pandas-docs/stable/api.html)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Graphics :\n",
|
||||||
|
"\n",
|
||||||
|
" . Matplotlib : [Beginner's guide](https://matplotlib.org/users/beginner.html) , [Gallery](https://matplotlib.org/gallery.html) , [General Content](https://matplotlib.org/contents.html) \n",
|
||||||
|
" \n",
|
||||||
|
" . seaborn : [Tutorial](https://stanford.edu/~mwaskom/software/seaborn/tutorial.html) , [Gallery](https://stanford.edu/~mwaskom/software/seaborn/examples/index.html)\n",
|
||||||
|
" \n",
|
||||||
|
" . scikit-image : [Gallery](https://scikit-image.org/docs/dev/auto_examples/), [User Guide](https://scikit-image.org/docs/dev/user_guide.html)\n",
|
||||||
|
" \n",
|
||||||
|
" . holoviews : [Introduction](https://ioam.github.io/holoviews) , [Tutorials](https://ioam.github.io/holoviews/Tutorials/index.html)\n",
|
||||||
|
" \n",
|
||||||
|
" . bqplot: [Introduction](https://bqplot.readthedocs.io/en/stable/introduction.html)\n",
|
||||||
|
" \n",
|
||||||
|
" . Altair: [Introduction]](https://altair-viz.github.io/)\n",
|
||||||
|
" \n",
|
||||||
|
" . plotnine : [Gallery](https://plotnine.readthedocs.io/en/stable/gallery.html) , [Tutotials](https://github.com/has2k1/plotnine/blob/master/doc/external-resources.rst)\n",
|
||||||
|
" \n",
|
||||||
|
" . hvplot : [Gallery](https://hvplot.pyviz.org/)\n",
|
||||||
|
"\n",
|
||||||
|
" . PyQtGraph : [Gallery](http://www.pyqtgraph.org/)\n",
|
||||||
|
" "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## SQL\n",
|
||||||
|
" . IPython-SQL : [Tutorial](https://nbviewer.ipython.org/gist/catherinedevlin/6588378)\n",
|
||||||
|
" \n",
|
||||||
|
" . db.py : [Tutorial](https://nbviewer.ipython.org/github/yhat/db.py/blob/master/examples/db-example.ipynb)\n",
|
||||||
|
" \n",
|
||||||
|
" . baresql : [Tutorial](https://pypi.python.org/pypi/baresql)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"## Machine learning / Deep Learning\n",
|
||||||
|
" . scikit-learn : [Tutorial](https://scikit-learn.org/stable/tutorial/index.html) , [Gallery](https://scikit-learn.org/stable/auto_examples/index.html)\n",
|
||||||
|
" \n",
|
||||||
|
" . Theano: [Tutorial](https://deeplearning.net/software/theano/tutorial/), [Related Projects](https://github.com/Theano/Theano/wiki/Related-projects)\n",
|
||||||
|
" \n",
|
||||||
|
" . Keras: [Introduction]](https://keras.io/)\n",
|
||||||
|
"\n",
|
||||||
|
" . Tensorflow: [Tutorial](https://github.com/Hvass-Labs/TensorFlow-Tutorials) with [videos](https://www.youtube.com/playlist?list=PL9Hr9sNUjfsmEu1ZniY0XpHSzl5uihcXZ)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"## Qt User Interface Development :\n",
|
||||||
|
"\n",
|
||||||
|
" . PyQt4 tutorial: https://zetcode.com/gui/pyqt4/firstprograms/\n",
|
||||||
|
" \n",
|
||||||
|
" . PyQt5 tutorial: https://zetcode.com/gui/pyqt5/firstprograms/\n",
|
||||||
|
" \n",
|
||||||
|
" . guiqwt tutorial: https://pythonhosted.org/guiqwt/examples.html .\n",
|
||||||
|
" \n",
|
||||||
|
" . switching from guiqwt 2 to 3: https://github.com/PierreRaybaut/guiqwt/blob/master/doc/migrating_from_v2_to_v3.rst)\n",
|
||||||
|
" \n",
|
||||||
|
" . guidata: https://pythonhosted.org/guidata/examples.html\n",
|
||||||
|
" \n",
|
||||||
|
" "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"\n",
|
||||||
|
"## Winpython\n",
|
||||||
|
"\n",
|
||||||
|
". [Winpython Discussion Group](https://groups.google.com/forum/#!forum/winpython)\n",
|
||||||
|
" \n",
|
||||||
|
". [Other Winpython examples](http://nbviewer.ipython.org/github/winpython/winpython_afterdoc/tree/master/)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.7.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 1
|
||||||
|
}
|
@@ -0,0 +1,691 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "-knK4sZodDZg"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"##### Copyright 2018 The TensorFlow Authors.\n",
|
||||||
|
"\n",
|
||||||
|
"Licensed under the Apache License, Version 2.0 (the \"License\");"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "zAM8G9A4dF4R"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#@title Licensed under the Apache License, Version 2.0 (the \"License\"); { display-mode: \"form\" }\n",
|
||||||
|
"# you may not use this file except in compliance with the License.\n",
|
||||||
|
"# You may obtain a copy of the License at\n",
|
||||||
|
"#\n",
|
||||||
|
"# https://www.apache.org/licenses/LICENSE-2.0\n",
|
||||||
|
"#\n",
|
||||||
|
"# Unless required by applicable law or agreed to in writing, software\n",
|
||||||
|
"# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
|
||||||
|
"# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
|
||||||
|
"# See the License for the specific language governing permissions and\n",
|
||||||
|
"# limitations under the License."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "cPw5xFcq1kpw"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"<table class=\"tfo-notebook-buttons\" align=\"left\">\n",
|
||||||
|
" <td>\n",
|
||||||
|
" <a target=\"_blank\" href=\"https://colab.research.google.com/github/tensorflow/probability/blob/master/tensorflow_probability/examples/jupyter_notebooks/Eight_Schools.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n",
|
||||||
|
" </td>\n",
|
||||||
|
" <td>\n",
|
||||||
|
" <a target=\"_blank\" href=\"https://github.com/tensorflow/probability/blob/master/tensorflow_probability/examples/jupyter_notebooks/Eight_Schools.ipynb\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" />View source on GitHub</a>\n",
|
||||||
|
" </td>\n",
|
||||||
|
"</table>"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "5MzjGu_O7HwY"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# Eight schools\n",
|
||||||
|
"\n",
|
||||||
|
"The eight schools problem ([Rubin 1981](https://www.jstor.org/stable/1164617)) considers the effectiveness of SAT coaching programs conducted in parallel at eight schools. It has become a classic problem ([Bayesian Data Analysis](http://www.stat.columbia.edu/~gelman/book/), [Stan](https://github.com/stan-dev/rstan/wiki/RStan-Getting-Started)) that illustrates the usefulness of hierarchical modeling for sharing information between exchangeable groups.\n",
|
||||||
|
"\n",
|
||||||
|
"The Edward2 implemention below is an adaptation of an Edward 1.0 [tutorial](https://github.com/blei-lab/edward/blob/master/notebooks/eight_schools.ipynb). "
|
||||||
|
]
|
||||||
|
},
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "TNuvn0Ih4D_R"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# Imports"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "vznFo-cU7Pc_"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# installed by defaultf now\n",
|
||||||
|
"# !pip install -q tensorflow-probability"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "XMTEI6ep4D_S"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from __future__ import absolute_import\n",
|
||||||
|
"from __future__ import division\n",
|
||||||
|
"from __future__ import print_function\n",
|
||||||
|
"\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import seaborn as sns\n",
|
||||||
|
"\n",
|
||||||
|
"import tensorflow as tf\n",
|
||||||
|
"import tensorflow_probability as tfp\n",
|
||||||
|
"from tensorflow_probability import edward2 as ed\n",
|
||||||
|
"import warnings\n",
|
||||||
|
"\n",
|
||||||
|
"plt.style.use(\"ggplot\")\n",
|
||||||
|
"warnings.filterwarnings('ignore')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "cIbNcemwwO2y"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# The Data\n",
|
||||||
|
"\n",
|
||||||
|
"From Bayesian Data Analysis, section 5.5 (Gelman et al. 2013):\n",
|
||||||
|
"\n",
|
||||||
|
"> *A study was performed for the Educational Testing Service to analyze the effects of special coaching programs for SAT-V (Scholastic Aptitude Test-Verbal) in each of eight high schools. The outcome variable in each study was the score on a special administration of the SAT-V, a standardized multiple choice test administered by the Educational Testing Service and used to help colleges make admissions decisions; the scores can vary between 200 and 800, with mean about 500 and standard deviation about 100. The SAT examinations are designed to be resistant to short-term efforts directed specifically toward improving performance on the test; instead they are designed to reflect knowledge acquired and abilities developed over many years of education. Nevertheless, each of the eight schools in this study considered its short-term coaching program to be very successful at increasing SAT scores. Also, there was no prior reason to believe that any of the eight programs was more effective than any other or that some were more similar in effect to each other than to any other.*\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"For each of the eight schools ($J = 8$), we have an estimated treatment effect $y_j$ and a standard error of the effect estimate $\\sigma_j$. The treatment effects in the study were obtained by a linear regression on the treatment group using PSAT-M and PSAT-V scores as control variables. As there was no prior belief that any of the schools were more or less similar or that any of the coaching programs would be more effective, we can consider the treatment effects as [exchangeable](https://en.wikipedia.org/wiki/Exchangeable_random_variables)."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 516
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "rSngqHwAKv_j",
|
||||||
|
"outputId": "db4a11bc-9946-4be9-a362-3b65f29a08ca"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"num_schools = 8 # number of schools\n",
|
||||||
|
"treatment_effects = np.array(\n",
|
||||||
|
" [28, 8, -3, 7, -1, 1, 18, 12], dtype=np.float32) # treatment effects\n",
|
||||||
|
"treatment_stddevs = np.array(\n",
|
||||||
|
" [15, 10, 16, 11, 9, 11, 10, 18], dtype=np.float32) # treatment SE\n",
|
||||||
|
"\n",
|
||||||
|
"fig, ax = plt.subplots()\n",
|
||||||
|
"plt.bar(range(num_schools), treatment_effects, yerr=treatment_stddevs)\n",
|
||||||
|
"plt.title(\"8 Schools treatment effects\")\n",
|
||||||
|
"plt.xlabel(\"School\")\n",
|
||||||
|
"plt.ylabel(\"Treatment effect\")\n",
|
||||||
|
"fig.set_size_inches(10, 8)\n",
|
||||||
|
"plt.show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "S6Yj8WEDwI3L"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# Model\n",
|
||||||
|
"\n",
|
||||||
|
"To capture the data, we use a hierarchical normal model. It follows the generative process,\n",
|
||||||
|
"\n",
|
||||||
|
"\\begin{align*}\n",
|
||||||
|
"\\mu &\\sim \\text{Normal}(\\text{loc}{=}0,\\ \\text{scale}{=}10) \\\\\n",
|
||||||
|
"\\log\\tau &\\sim \\text{Normal}(\\text{loc}{=}5,\\ \\text{scale}{=}1) \\\\\n",
|
||||||
|
"\\text{for } & i=1\\ldots 8:\\\\\n",
|
||||||
|
"& \\theta_i \\sim \\text{Normal}\\left(\\text{loc}{=}\\mu,\\ \\text{scale}{=}\\tau \\right) \\\\\n",
|
||||||
|
"& y_i \\sim \\text{Normal}\\left(\\text{loc}{=}\\theta_i,\\ \\text{scale}{=}\\sigma_i \\right) \n",
|
||||||
|
"\\end{align*}\n",
|
||||||
|
"\n",
|
||||||
|
"where $\\mu$ represents the prior average treatment effect and $\\tau$ controls how much variance there is between schools. The $y_i$ and $\\sigma_i$ are observed. As $\\tau \\rightarrow \\infty$, the model approaches the no-pooling model, i.e., each of the school treatment effect estimates are allowed to be more independent. As $\\tau \\rightarrow 0$, the model approaches the complete-pooling model, i.e., all of the school treatment effects are closer to the group average $\\mu$. To restrict the standard deviation to be positive, we draw $\\tau$ from a lognormal distribution (which is equivalent to drawing $log(\\tau)$ from a normal distribution).\n",
|
||||||
|
"\n",
|
||||||
|
"Following [Diagnosing Biased Inference with Divergences](http://mc-stan.org/users/documentation/case-studies/divergences_and_bias.html), we transform the model above into an equivalent non-centered model:\n",
|
||||||
|
"\n",
|
||||||
|
"\\begin{align*}\n",
|
||||||
|
"\\mu &\\sim \\text{Normal}(\\text{loc}{=}0,\\ \\text{scale}{=}10) \\\\\n",
|
||||||
|
"\\log\\tau &\\sim \\text{Normal}(\\text{loc}{=}5,\\ \\text{scale}{=}1) \\\\\n",
|
||||||
|
"\\text{for } & i=1\\ldots 8:\\\\\n",
|
||||||
|
"& \\theta_i' \\sim \\text{Normal}\\left(\\text{loc}{=}0,\\ \\text{scale}{=}1 \\right) \\\\\n",
|
||||||
|
"& \\theta_i = \\mu + \\tau \\theta_i' \\\\\n",
|
||||||
|
"& y_i \\sim \\text{Normal}\\left(\\text{loc}{=}\\theta_i,\\ \\text{scale}{=}\\sigma_i \\right) \n",
|
||||||
|
"\\end{align*}\n",
|
||||||
|
"\n",
|
||||||
|
"We reify this model as an Edward2 program:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "EiEtvl1zokAG"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"def schools_model(num_schools, treatment_stddevs):\n",
|
||||||
|
" avg_effect = ed.Normal(loc=0., scale=10., name=\"avg_effect\") # `mu` above\n",
|
||||||
|
" avg_stddev = ed.Normal(\n",
|
||||||
|
" loc=5., scale=1., name=\"avg_stddev\") # `log(tau)` above\n",
|
||||||
|
" school_effects_standard = ed.Normal(\n",
|
||||||
|
" loc=tf.zeros(num_schools),\n",
|
||||||
|
" scale=tf.ones(num_schools),\n",
|
||||||
|
" name=\"school_effects_standard\") # `theta_prime` above\n",
|
||||||
|
" school_effects = avg_effect + tf.exp(\n",
|
||||||
|
" avg_stddev) * school_effects_standard # `theta` above\n",
|
||||||
|
" treatment_effects = ed.Normal(\n",
|
||||||
|
" loc=school_effects, scale=treatment_stddevs,\n",
|
||||||
|
" name=\"treatment_effects\") # `y` above\n",
|
||||||
|
" return treatment_effects\n",
|
||||||
|
"\n",
|
||||||
|
"log_joint = ed.make_log_joint_fn(schools_model)\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"def target_log_prob_fn(avg_effect, avg_stddev, school_effects_standard):\n",
|
||||||
|
" \"\"\"Unnormalized target density as a function of states.\"\"\"\n",
|
||||||
|
" return log_joint(\n",
|
||||||
|
" num_schools=num_schools,\n",
|
||||||
|
" treatment_stddevs=treatment_stddevs,\n",
|
||||||
|
" avg_effect=avg_effect,\n",
|
||||||
|
" avg_stddev=avg_stddev,\n",
|
||||||
|
" school_effects_standard=school_effects_standard,\n",
|
||||||
|
" treatment_effects=treatment_effects)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "jnVK-1yH9WCY"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# Bayesian Inference\n",
|
||||||
|
"\n",
|
||||||
|
"Given data, we perform Hamiltonian Monte Carlo (HMC) to calculate the posterior distribution over the model's parameters."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 34
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "-66vCUVrQRnb",
|
||||||
|
"outputId": "c6d7f3d0-073b-4d6c-9d7f-55826134d596"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"num_results = 5000\n",
|
||||||
|
"num_burnin_steps = 3000\n",
|
||||||
|
"\n",
|
||||||
|
"states, kernel_results = tfp.mcmc.sample_chain(\n",
|
||||||
|
" num_results=num_results,\n",
|
||||||
|
" num_burnin_steps=num_burnin_steps,\n",
|
||||||
|
" current_state=[\n",
|
||||||
|
" tf.zeros([], name='init_avg_effect'),\n",
|
||||||
|
" tf.zeros([], name='init_avg_stddev'),\n",
|
||||||
|
" tf.ones([num_schools], name='init_school_effects_standard'),\n",
|
||||||
|
" ],\n",
|
||||||
|
" kernel=tfp.mcmc.HamiltonianMonteCarlo(\n",
|
||||||
|
" target_log_prob_fn=target_log_prob_fn,\n",
|
||||||
|
" step_size=0.4,\n",
|
||||||
|
" num_leapfrog_steps=3))\n",
|
||||||
|
"\n",
|
||||||
|
"avg_effect, avg_stddev, school_effects_standard = states\n",
|
||||||
|
"\n",
|
||||||
|
"with tf.Session() as sess:\n",
|
||||||
|
" [\n",
|
||||||
|
" avg_effect_,\n",
|
||||||
|
" avg_stddev_,\n",
|
||||||
|
" school_effects_standard_,\n",
|
||||||
|
" is_accepted_,\n",
|
||||||
|
" ] = sess.run([\n",
|
||||||
|
" avg_effect,\n",
|
||||||
|
" avg_stddev,\n",
|
||||||
|
" school_effects_standard,\n",
|
||||||
|
" kernel_results.is_accepted,\n",
|
||||||
|
" ])\n",
|
||||||
|
"\n",
|
||||||
|
"school_effects_samples = (\n",
|
||||||
|
" avg_effect_[:, np.newaxis] +\n",
|
||||||
|
" np.exp(avg_stddev_)[:, np.newaxis] * school_effects_standard_)\n",
|
||||||
|
"num_accepted = np.sum(is_accepted_)\n",
|
||||||
|
"print('Acceptance rate: {}'.format(num_accepted / num_results))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 729
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "2-iMMOcFvE03",
|
||||||
|
"outputId": "3d6e2b56-d76e-41e4-f60e-29332149d350"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"fig, axes = plt.subplots(8, 2, sharex='col', sharey='col')\n",
|
||||||
|
"fig.set_size_inches(12, 10)\n",
|
||||||
|
"for i in range(num_schools):\n",
|
||||||
|
" axes[i][0].plot(school_effects_samples[:,i])\n",
|
||||||
|
" axes[i][0].title.set_text(\"School {} treatment effect chain\".format(i))\n",
|
||||||
|
" sns.kdeplot(school_effects_samples[:,i], ax=axes[i][1], shade=True)\n",
|
||||||
|
" axes[i][1].title.set_text(\"School {} treatment effect distribution\".format(i))\n",
|
||||||
|
"axes[num_schools - 1][0].set_xlabel(\"Iteration\")\n",
|
||||||
|
"axes[num_schools - 1][1].set_xlabel(\"School effect\")\n",
|
||||||
|
"fig.tight_layout()\n",
|
||||||
|
"plt.show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 153
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "l4t9XLxSszBe",
|
||||||
|
"outputId": "45b3857b-1abd-4dff-e177-692ce11a7aa3"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"print(\"E[avg_effect] = {}\".format(avg_effect_.mean()))\n",
|
||||||
|
"print(\"E[avg_stddev] = {}\".format(avg_stddev_.mean()))\n",
|
||||||
|
"print(\"E[school_effects_standard] =\")\n",
|
||||||
|
"print(school_effects_standard_[:, ].mean(0))\n",
|
||||||
|
"print(\"E[school_effects] =\")\n",
|
||||||
|
"print(school_effects_samples[:, ].mean(0))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "Wxp1uFW6RWMW"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Compute the 95% interval for school_effects\n",
|
||||||
|
"school_effects_low = np.array([\n",
|
||||||
|
" np.percentile(school_effects_samples[:, i], 2.5) for i in range(num_schools)\n",
|
||||||
|
"])\n",
|
||||||
|
"school_effects_med = np.array([\n",
|
||||||
|
" np.percentile(school_effects_samples[:, i], 50) for i in range(num_schools)\n",
|
||||||
|
"])\n",
|
||||||
|
"school_effects_hi = np.array([\n",
|
||||||
|
" np.percentile(school_effects_samples[:, i], 97.5)\n",
|
||||||
|
" for i in range(num_schools)\n",
|
||||||
|
"])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 516
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "yY-qBFTotd3F",
|
||||||
|
"outputId": "e4d9fc9b-1d40-47f6-ae34-ff8281155b42"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"fig, ax = plt.subplots(nrows=1, ncols=1, sharex=True)\n",
|
||||||
|
"ax.scatter(np.array(range(num_schools)), school_effects_med, color='red', s=60)\n",
|
||||||
|
"ax.scatter(\n",
|
||||||
|
" np.array(range(num_schools)) + 0.1, treatment_effects, color='blue', s=60)\n",
|
||||||
|
"\n",
|
||||||
|
"avg_effect = avg_effect_.mean()\n",
|
||||||
|
"\n",
|
||||||
|
"plt.plot([-0.2, 7.4], [avg_effect, avg_effect], 'k', linestyle='--')\n",
|
||||||
|
"\n",
|
||||||
|
"ax.errorbar(\n",
|
||||||
|
" np.array(range(8)),\n",
|
||||||
|
" school_effects_med,\n",
|
||||||
|
" yerr=[\n",
|
||||||
|
" school_effects_med - school_effects_low,\n",
|
||||||
|
" school_effects_hi - school_effects_med\n",
|
||||||
|
" ],\n",
|
||||||
|
" fmt='none')\n",
|
||||||
|
"\n",
|
||||||
|
"ax.legend(('avg_effect', 'Edward2/HMC', 'Observed effect'), fontsize=14)\n",
|
||||||
|
"\n",
|
||||||
|
"plt.xlabel('School')\n",
|
||||||
|
"plt.ylabel('Treatment effect')\n",
|
||||||
|
"plt.title('Edward2 HMC estimated school treatment effects vs. observed data')\n",
|
||||||
|
"fig.set_size_inches(10, 8)\n",
|
||||||
|
"plt.show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "2dV93ZSzGSIm"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"We can observe the shrinkage toward the group `avg_effect` above."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 51
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "LcljZ1prD91d",
|
||||||
|
"outputId": "c3eed6e1-5b00-4ddf-c40c-1c045c4f6b0c"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"print(\"Inferred posterior mean: {0:.2f}\".format(\n",
|
||||||
|
" np.mean(school_effects_samples[:,])))\n",
|
||||||
|
"print(\"Inferred posterior mean se: {0:.2f}\".format(\n",
|
||||||
|
" np.std(school_effects_samples[:,])))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "vWPCzgk7IMgt"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# Criticism\n",
|
||||||
|
"\n",
|
||||||
|
"To get the posterior predictive distribution, i.e., a model of new data $y^*$ given the observed data $y$:\n",
|
||||||
|
"\n",
|
||||||
|
"$$ p(y^*|y) \\propto \\int_\\theta p(y^* | \\theta)p(\\theta |y)d\\theta$$\n",
|
||||||
|
"\n",
|
||||||
|
"we \"intercept\" the values of the random variables in the model to set them to the mean of the posterior distribution and sample from that model to generate new data $y^*$."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "6eV4Cx0HQeMU"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"def interceptor(rv_constructor, *rv_args, **rv_kwargs):\n",
|
||||||
|
" \"\"\"Replaces prior on effects with empirical posterior mean from MCMC.\"\"\"\n",
|
||||||
|
" name = rv_kwargs.pop(\"name\")\n",
|
||||||
|
" if name == \"avg_effect\":\n",
|
||||||
|
" rv_kwargs[\"value\"] = np.mean(avg_effect_, 0)\n",
|
||||||
|
" elif name == \"avg_stddev\":\n",
|
||||||
|
" rv_kwargs[\"value\"] = np.mean(avg_stddev_, 0)\n",
|
||||||
|
" elif name == \"school_effects_standard\":\n",
|
||||||
|
" rv_kwargs[\"value\"] = np.mean(school_effects_standard_, 0)\n",
|
||||||
|
" return rv_constructor(*rv_args, **rv_kwargs)\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"with ed.interception(interceptor):\n",
|
||||||
|
" posterior = schools_model(\n",
|
||||||
|
" num_schools=num_schools, treatment_stddevs=treatment_stddevs)\n",
|
||||||
|
"\n",
|
||||||
|
"with tf.Session() as sess:\n",
|
||||||
|
" posterior_predictive = sess.run(\n",
|
||||||
|
" posterior.distribution.sample(sample_shape=(5000)))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 742
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "y3c8W--fPmph",
|
||||||
|
"outputId": "489e1285-2dcb-456e-ca2d-43b9138a1801"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"fig, axes = plt.subplots(4, 2, sharex=True, sharey=True)\n",
|
||||||
|
"fig.set_size_inches(12, 10)\n",
|
||||||
|
"fig.tight_layout()\n",
|
||||||
|
"for i, ax in enumerate(axes):\n",
|
||||||
|
" sns.kdeplot(posterior_predictive[:, 2*i], ax=ax[0], shade=True)\n",
|
||||||
|
" ax[0].title.set_text(\n",
|
||||||
|
" \"School {} treatment effect posterior predictive\".format(2*i))\n",
|
||||||
|
" sns.kdeplot(posterior_predictive[:, 2*i + 1], ax=ax[1], shade=True)\n",
|
||||||
|
" ax[1].title.set_text(\n",
|
||||||
|
" \"School {} treatment effect posterior predictive\".format(2*i + 1))\n",
|
||||||
|
"plt.show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "ATOOfzg0HMII"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# The mean predicted treatment effects for each of the eight schools.\n",
|
||||||
|
"prediction = posterior_predictive.mean(axis=0)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "MkwASzOLSgbs"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"We can look at the residuals between the treatment effects data and the predictions of the model posterior. These correspond with the plot above which shows the shrinkage of the estimated effects toward the population average."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 51
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "ulqqNf_AHMBm",
|
||||||
|
"outputId": "1c49d020-30a1-4f06-baa0-6403c7f6970b"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"treatment_effects - prediction"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "0KMqrBaGRo4S"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"Because we have a distribution of predictions for each school, we can consider the distribution of residuals as well."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "7j9RAYhIRDDz"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"residuals = treatment_effects - posterior_predictive"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"base_uri": "https://localhost:8080/",
|
||||||
|
"height": 742
|
||||||
|
},
|
||||||
|
"colab_type": "code",
|
||||||
|
"id": "zW1RKYtBRIhd",
|
||||||
|
"outputId": "64c06680-22b0-4eab-d3a3-b0efe39d0376"
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"fig, axes = plt.subplots(4, 2, sharex=True, sharey=True)\n",
|
||||||
|
"fig.set_size_inches(12, 10)\n",
|
||||||
|
"fig.tight_layout()\n",
|
||||||
|
"for i, ax in enumerate(axes):\n",
|
||||||
|
" sns.kdeplot(residuals[:, 2*i], ax=ax[0], shade=True)\n",
|
||||||
|
" ax[0].title.set_text(\n",
|
||||||
|
" \"School {} treatment effect residuals\".format(2*i))\n",
|
||||||
|
" sns.kdeplot(residuals[:, 2*i + 1], ax=ax[1], shade=True)\n",
|
||||||
|
" ax[1].title.set_text(\n",
|
||||||
|
" \"School {} treatment effect residuals\".format(2*i + 1))\n",
|
||||||
|
"plt.show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "PIReUYcT0CEZ"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# Acknowledgements\n",
|
||||||
|
"\n",
|
||||||
|
"This tutorial was originally written in Edward 1.0 ([source](https://github.com/blei-lab/edward/blob/master/notebooks/eight_schools.ipynb)). We thank all contributors to writing and revising that version."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {
|
||||||
|
"colab_type": "text",
|
||||||
|
"id": "g7cgoQ1XyqGv"
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"# References\n",
|
||||||
|
"1. Donald B. Rubin. Estimation in parallel randomized experiments. Journal of Educational Statistics, 6(4):377-401, 1981.\n",
|
||||||
|
"2. Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin. Bayesian Data Analysis, Third Edition. Chapman and Hall/CRC, 2013."
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"collapsed_sections": [
|
||||||
|
"cPw5xFcq1kpw",
|
||||||
|
"TNuvn0Ih4D_R",
|
||||||
|
"cIbNcemwwO2y",
|
||||||
|
"S6Yj8WEDwI3L",
|
||||||
|
"jnVK-1yH9WCY",
|
||||||
|
"vWPCzgk7IMgt",
|
||||||
|
"PIReUYcT0CEZ",
|
||||||
|
"g7cgoQ1XyqGv"
|
||||||
|
],
|
||||||
|
"name": "Eight_Schools.ipynb",
|
||||||
|
"provenance": [],
|
||||||
|
"toc_visible": true,
|
||||||
|
"version": "0.3.2"
|
||||||
|
},
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
@@ -0,0 +1,425 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Holoviews 'streams demo'\n",
|
||||||
|
"\n",
|
||||||
|
"#### from https://github.com/ioam/holoviews/blob/master/examples/user_guide/15-Streaming_Data.ipynb\n",
|
||||||
|
"#### (Holoviews/Datashader/Bokeh/Jupyter Notebook)\n",
|
||||||
|
"\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# These are examples from Holoviews Developer Philipp Rudiger:\n",
|
||||||
|
"# https://anaconda.org/philippjfr/working_with_streaming_data/notebook\n",
|
||||||
|
"# As of 2017-10-26:\n",
|
||||||
|
"# . this is bleeding-edge,\n",
|
||||||
|
"# . this Notebook is made to check it can work well also on Windows / WinPython.\n",
|
||||||
|
"#\n",
|
||||||
|
"# User may notice we're getting clother to a PyQtGraph style of graphics"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import time\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"import holoviews as hv\n",
|
||||||
|
"\n",
|
||||||
|
"from holoviews.streams import Pipe, Buffer\n",
|
||||||
|
"\n",
|
||||||
|
"import streamz\n",
|
||||||
|
"import streamz.dataframe\n",
|
||||||
|
"\n",
|
||||||
|
"hv.extension('bokeh')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Pipe\n",
|
||||||
|
"\n",
|
||||||
|
"A Pipe allows data to be pushed into a DynamicMap callback to change a visualization"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Pipe opening\n",
|
||||||
|
"pipe = Pipe(data=[])\n",
|
||||||
|
"vector_dmap = hv.DynamicMap(hv.VectorField, streams=[pipe])\n",
|
||||||
|
"vector_dmap.redim.range(x=(-1, 1), y=(-1, 1))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#Feeding pipe\n",
|
||||||
|
"x,y = np.mgrid[-10:11,-10:11] * 0.1\n",
|
||||||
|
"sine_rings = np.sin(x**2+y**2)*np.pi+np.pi\n",
|
||||||
|
"exp_falloff = 1/np.exp((x**2+y**2)/8)\n",
|
||||||
|
"\n",
|
||||||
|
"for i in np.linspace(0, 1, 25):\n",
|
||||||
|
" time.sleep(0.1)\n",
|
||||||
|
" pipe.send([x,y,sine_rings*i, exp_falloff])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Buffer\n",
|
||||||
|
"Buffer automatically accumulates the last N rows of the tabular data, where N is defined by the length.\n",
|
||||||
|
"\n",
|
||||||
|
"Plotting backends (such as bokeh) can optimize plot updates by sending just the latest patch. This optimization works only if the data object held by the Buffer is identical to the plotted Element data"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"\n",
|
||||||
|
"example = pd.DataFrame({'x': [], 'y': [], 'count': []}, columns=['x', 'y', 'count'])\n",
|
||||||
|
"dfstream = Buffer(example, length=100, index=False)\n",
|
||||||
|
"curve_dmap = hv.DynamicMap(hv.Curve, streams=[dfstream])\n",
|
||||||
|
"point_dmap = hv.DynamicMap(hv.Points, streams=[dfstream])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%opts Points [color_index='count', xaxis=None, yaxis=None] (line_color='black', size=5)\n",
|
||||||
|
"%%opts Curve (line_width=1, color='black')\n",
|
||||||
|
"curve_dmap * point_dmap"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"def gen_brownian():\n",
|
||||||
|
" x, y, count = 0, 0, 0\n",
|
||||||
|
" while True:\n",
|
||||||
|
" x += np.random.randn()\n",
|
||||||
|
" y += np.random.randn()\n",
|
||||||
|
" count += 1\n",
|
||||||
|
" yield pd.DataFrame([(x, y, count)], columns=['x', 'y', 'count'])\n",
|
||||||
|
"\n",
|
||||||
|
"brownian = gen_brownian()\n",
|
||||||
|
"for i in range(200):\n",
|
||||||
|
" dfstream.send(next(brownian))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"dfstream.clear()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### Asynchronous updates Using the Streamz library\n",
|
||||||
|
"\n",
|
||||||
|
"Let's start with a fairly simple example:\n",
|
||||||
|
"- Declare a streamz.Stream and a Pipe object and connect them into a pipeline into which we can push data.\n",
|
||||||
|
"- Use a sliding_window of 10, which will first wait for 10 sets of stream updates to accumulate. At that point and for every subsequent update, it will apply pd.concat to combine the most recent 10 updates into a new dataframe.\n",
|
||||||
|
"- Use the sink method on the streamz.Stream to send the resulting collection of 10 updates to Pipe.\n",
|
||||||
|
"- Declare a DynamicMap that takes the sliding window of concatenated DataFrames and displays it using a Scatter Element.\n",
|
||||||
|
"- Color the Scatter points by their 'count' and set a range, then display:\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"point_source = streamz.Stream()\n",
|
||||||
|
"pipe = Pipe(data=[])\n",
|
||||||
|
"point_source.sliding_window(20).map(pd.concat).sink(pipe.send) # Connect streamz to the Pipe\n",
|
||||||
|
"scatter_dmap = hv.DynamicMap(hv.Scatter, streams=[pipe])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%opts Scatter [color_index='count', bgcolor='black']\n",
|
||||||
|
"scatter_dmap.redim.range(y=(-4, 4))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"for i in range(100):\n",
|
||||||
|
" df = pd.DataFrame({'x': np.random.rand(100), 'y': np.random.randn(100), 'count': i},\n",
|
||||||
|
" columns=['x', 'y', 'count'])\n",
|
||||||
|
" point_source.emit(df)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"### StreamingDataFrame"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### A simple example"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"simple_sdf = streamz.dataframe.Random(freq='10ms', interval='100ms')\n",
|
||||||
|
"print(simple_sdf.index)\n",
|
||||||
|
"simple_sdf.example.dtypes"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%opts Curve [width=500 show_grid=True]\n",
|
||||||
|
"sdf = (simple_sdf-0.5).cumsum()\n",
|
||||||
|
"hv.DynamicMap(hv.Curve, streams=[Buffer(sdf.x)])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"The ``Random`` StreamingDataFrame will asynchronously emit events until it is stopped, which we can do by calling the ``stop`` method."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"simple_sdf.stop()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### Making use of the StreamingDataFrame API"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%opts Curve [width=500 show_grid=True]\n",
|
||||||
|
"source_df = streamz.dataframe.Random(freq='5ms', interval='100ms')\n",
|
||||||
|
"sdf = (source_df-0.5).cumsum()\n",
|
||||||
|
"raw_dmap = hv.DynamicMap(hv.Curve, streams=[Buffer(sdf.x)])\n",
|
||||||
|
"smooth_dmap = hv.DynamicMap(hv.Curve, streams=[Buffer(sdf.x.rolling('500ms').mean())])\n",
|
||||||
|
"\n",
|
||||||
|
"raw_dmap.relabel('raw') * smooth_dmap.relabel('smooth')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"source_df.stop()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### Controlling the backlog"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from functools import partial\n",
|
||||||
|
"multi_source = streamz.dataframe.Random(freq='5ms', interval='100ms')\n",
|
||||||
|
"sdf = (multi_source-0.5).cumsum()\n",
|
||||||
|
"hv.DynamicMap(hv.Table, streams=[Buffer(sdf.x, length=10)]) +\\\n",
|
||||||
|
"hv.DynamicMap(partial(hv.BoxWhisker, kdims=[], vdims=['x']), streams=[Buffer(sdf.x, length=100)])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### Updating multiple cells"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Since a ``StreamingDataFrame`` will emit data until it is stopped we can subscribe multiple plots across different cells to the same stream:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"hv.DynamicMap(hv.Scatter, streams=[Buffer(sdf.x)])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"multi_source.stop()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### Applying operations"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"hist_source = streamz.dataframe.Random(freq='5ms', interval='100ms')\n",
|
||||||
|
"sdf = (hist_source-0.5).cumsum()\n",
|
||||||
|
"dmap = hv.DynamicMap(hv.Dataset, streams=[Buffer(sdf.x, length=500)])\n",
|
||||||
|
"hv.operation.histogram(dmap, dimension='x')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"hist_source.stop()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"#### Datashading"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"The same approach will also work for the datashader operation letting us datashade the entire ``backlog`` window even if we make it very large:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%opts RGB [width=600]\n",
|
||||||
|
"from holoviews.operation.datashader import datashade\n",
|
||||||
|
"from bokeh.palettes import Blues8\n",
|
||||||
|
"large_source = streamz.dataframe.Random(freq='100us', interval='200ms')\n",
|
||||||
|
"sdf = (large_source-0.5).cumsum()\n",
|
||||||
|
"dmap = hv.DynamicMap(hv.Curve, streams=[Buffer(sdf.x, length=100000)])\n",
|
||||||
|
"datashade(dmap, streams=[hv.streams.PlotSize], normalization='linear', cmap=Blues8)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"large_source.stop()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.3"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
@@ -0,0 +1,182 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Qt Demo\n",
|
||||||
|
"\n",
|
||||||
|
"This will launch various Qt compatible packages\n",
|
||||||
|
"\n",
|
||||||
|
"Nota: as of 2018-04-29th, PySide2-5.11 compatibility is\n",
|
||||||
|
" - Ok for Qtconsole, Qtpy, pyzo, wppm \n",
|
||||||
|
" - ToDo for PyQtgraph, Spyder, guidata, guiqwt, rx\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Qt4 & Qt5 Dedicated Graphic libraries: PyQtgraph, guidata, guiqwt"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# PyQtgraph (Scientific Graphics and GUI Library for Python)\n",
|
||||||
|
"import pyqtgraph.examples; pyqtgraph.examples.run()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Guidata (Python library generating graphical user interfaces for easy dataset editing and display)\n",
|
||||||
|
"from guidata import tests; tests.run()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Guiqwt (Efficient 2D plotting Python library based on PythonQwt)\n",
|
||||||
|
"from guiqwt import tests; tests.run()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#QtDemo (if present)\n",
|
||||||
|
"!if exist \"%WINPYDIR%\\Lib\\site-packages\\PyQt5\\examples\\qtdemo\\qtdemo.py\" \"%WINPYDIR%\\python.exe\" \"%WINPYDIR%\\Lib\\site-packages\\PyQt5\\examples\\qtdemo\\qtdemo.py\"\n",
|
||||||
|
"!if exist \"%WINPYDIR%\\Lib\\site-packages\\PyQt4\\examples\\demos\\qtdemo\\qtdemo.pyw\"\"%WINPYDIR%\\pythonw.exe\" \"%WINPYDIR%\\Lib\\site-packages\\PyQt4\\examples\\demos\\qtdemo\\qtdemo.pyw\"\n",
|
||||||
|
"!if exist \"%WINPYDIR%\\Lib\\site-packages\\PySide2\\examples\\datavisualization\" \"%WINPYDIR%\\python.exe\" \"%WINPYDIR%\\Lib\\site-packages\\PySide2\\examples\\datavisualization\\bars3d.py\"\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Reactive programing: rx"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# from https://github.com/ReactiveX/RxPY/blob/master/examples/timeflie\n",
|
||||||
|
"from rx.subjects import Subject\n",
|
||||||
|
"from rx.concurrency import QtScheduler\n",
|
||||||
|
"import sys\n",
|
||||||
|
"\n",
|
||||||
|
"try:\n",
|
||||||
|
" from PyQt4 import QtCore\n",
|
||||||
|
" from PyQt4.QtGui import QWidget, QLabel\n",
|
||||||
|
" from PyQt4.QtGui import QApplication\n",
|
||||||
|
"except ImportError:\n",
|
||||||
|
" try:\n",
|
||||||
|
" from PyQt5 import QtCore\n",
|
||||||
|
" from PyQt5.QtWidgets import QApplication, QWidget, QLabel\n",
|
||||||
|
" except ImportError:\n",
|
||||||
|
" from PySide import QtCore\n",
|
||||||
|
" from PySide.QtGui import QWidget, QLabel\n",
|
||||||
|
" from PySide.QtGui import QApplication\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"class Window(QWidget):\n",
|
||||||
|
"\n",
|
||||||
|
" def __init__(self):\n",
|
||||||
|
" super(QWidget, self).__init__()\n",
|
||||||
|
" self.setWindowTitle(\"Rx for Python rocks\")\n",
|
||||||
|
" self.resize(600, 600)\n",
|
||||||
|
" self.setMouseTracking(True)\n",
|
||||||
|
"\n",
|
||||||
|
" # This Subject is used to transmit mouse moves to labels\n",
|
||||||
|
" self.mousemove = Subject()\n",
|
||||||
|
"\n",
|
||||||
|
" def mouseMoveEvent(self, event):\n",
|
||||||
|
" self.mousemove.on_next((event.x(), event.y()))\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"def main():\n",
|
||||||
|
" app = QApplication(sys.argv)\n",
|
||||||
|
" scheduler = QtScheduler(QtCore)\n",
|
||||||
|
"\n",
|
||||||
|
" window = Window()\n",
|
||||||
|
" window.show()\n",
|
||||||
|
"\n",
|
||||||
|
" text = 'TIME FLIES LIKE AN ARROW'\n",
|
||||||
|
" labels = [QLabel(char, window) for char in text]\n",
|
||||||
|
"\n",
|
||||||
|
" def handle_label(i, label):\n",
|
||||||
|
"\n",
|
||||||
|
" def on_next(pos):\n",
|
||||||
|
" x, y = pos\n",
|
||||||
|
" label.move(x + i*12 + 15, y)\n",
|
||||||
|
" label.show()\n",
|
||||||
|
"\n",
|
||||||
|
" window.mousemove.delay(i*100, scheduler=scheduler).subscribe(on_next)\n",
|
||||||
|
"\n",
|
||||||
|
" for i, label in enumerate(labels):\n",
|
||||||
|
" handle_label(i, label)\n",
|
||||||
|
"\n",
|
||||||
|
" sys.exit(app.exec_())\n",
|
||||||
|
"\n",
|
||||||
|
"if __name__ == '__main__':\n",
|
||||||
|
" main()\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"!pip download --dest C:\\WinP\\a QtPy"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.5"
|
||||||
|
},
|
||||||
|
"widgets": {
|
||||||
|
"state": {},
|
||||||
|
"version": "1.1.2"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 1
|
||||||
|
}
|
@ -0,0 +1,932 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Winpython Default checker"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import warnings\n",
|
||||||
|
"warnings.filterwarnings(\"ignore\", category=DeprecationWarning)\n",
|
||||||
|
"warnings.filterwarnings(\"ignore\", category=UserWarning)\n",
|
||||||
|
"warnings.filterwarnings(\"ignore\", category=FutureWarning)\n",
|
||||||
|
"# warnings.filterwarnings(\"ignore\") # would silence all warnings"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%matplotlib inline"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Compilers: Numba and Cython\n",
|
||||||
|
"\n",
|
||||||
|
"##### Requirement\n",
|
||||||
|
"To get Cython working, Winpython 3.5 users should install \"Microsoft Visual C++ Build Tools 2015\" (visualcppbuildtools_full.exe, a 4 Go installation) at https://beta.visualstudio.com/download-visual-studio-vs/\n",
|
||||||
|
"\n",
|
||||||
|
"To get Numba working, not-windows10 users may have to install \"Microsoft Visual C++ 2015 Redistributable\" (vc_redist) at <https://beta.visualstudio.com/download-visual-studio-vs/>\n",
|
||||||
|
"\n",
|
||||||
|
"#### Compiler toolchains"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking Numba JIT toolchain\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"image = np.zeros((1024, 1536), dtype = np.uint8)\n",
|
||||||
|
"\n",
|
||||||
|
"from pylab import imshow, show\n",
|
||||||
|
"from timeit import default_timer as timer\n",
|
||||||
|
"\n",
|
||||||
|
"def create_fractal(min_x, max_x, min_y, max_y, image, iters , mandelx):\n",
|
||||||
|
" height = image.shape[0]\n",
|
||||||
|
" width = image.shape[1]\n",
|
||||||
|
" pixel_size_x = (max_x - min_x) / width\n",
|
||||||
|
" pixel_size_y = (max_y - min_y) / height\n",
|
||||||
|
" \n",
|
||||||
|
" for x in range(width):\n",
|
||||||
|
" real = min_x + x * pixel_size_x\n",
|
||||||
|
" for y in range(height):\n",
|
||||||
|
" imag = min_y + y * pixel_size_y\n",
|
||||||
|
" color = mandelx(real, imag, iters)\n",
|
||||||
|
" image[y, x] = color"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"##### Numba (a JIT Compiler)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from numba import autojit\n",
|
||||||
|
"\n",
|
||||||
|
"@autojit\n",
|
||||||
|
"def mandel(x, y, max_iters):\n",
|
||||||
|
" c = complex(x, y)\n",
|
||||||
|
" z = 0.0j\n",
|
||||||
|
" for i in range(max_iters):\n",
|
||||||
|
" z = z*z + c\n",
|
||||||
|
" if (z.real*z.real + z.imag*z.imag) >= 4:\n",
|
||||||
|
" return i\n",
|
||||||
|
" return max_iters\n",
|
||||||
|
"\n",
|
||||||
|
"start = timer()\n",
|
||||||
|
"create_fractal(-2.0, 1.0, -1.0, 1.0, image, 20 , mandel) \n",
|
||||||
|
"dt = timer() - start\n",
|
||||||
|
"\n",
|
||||||
|
"print (\"Mandelbrot created by numba in %f s\" % dt)\n",
|
||||||
|
"imshow(image)\n",
|
||||||
|
"show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"##### Cython (a compiler for writing C extensions for the Python language)\n",
|
||||||
|
"WinPython 3.5 and 3.6 users may not have mingwpy available, and so need \"VisualStudio C++ Community Edition 2015\" https://www.visualstudio.com/downloads/download-visual-studio-vs#d-visual-c "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Cython + Mingwpy compiler toolchain test\n",
|
||||||
|
"%load_ext Cython"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%cython -a\n",
|
||||||
|
"# with %%cython -a , full C-speed lines are shown in white, slowest python-speed lines are shown in dark yellow lines \n",
|
||||||
|
"# ==> put your cython rewrite effort on dark yellow lines\n",
|
||||||
|
"def mandel_cython(x, y, max_iters):\n",
|
||||||
|
" cdef int i \n",
|
||||||
|
" cdef double cx, cy , zx, zy\n",
|
||||||
|
" cx , cy = x, y \n",
|
||||||
|
" zx , zy =0 ,0 \n",
|
||||||
|
" for i in range(max_iters):\n",
|
||||||
|
" zx , zy = zx*zx - zy*zy + cx , zx*zy*2 + cy\n",
|
||||||
|
" if (zx*zx + zy*zy) >= 4:\n",
|
||||||
|
" return i\n",
|
||||||
|
" return max_iters"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"start = timer()\n",
|
||||||
|
"create_fractal(-2.0, 1.0, -1.0, 1.0, image, 20 , mandel_cython) \n",
|
||||||
|
"dt = timer() - start\n",
|
||||||
|
"\n",
|
||||||
|
"print (\"Mandelbrot created by cython in %f s\" % dt)\n",
|
||||||
|
"imshow(image)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Graphics: Matplotlib, Pandas, Seaborn, Holoviews, Bokeh, bqplot, ipyleaflet, plotnine"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Matplotlib\n",
|
||||||
|
"# for more examples, see: http://matplotlib.org/gallery.html\n",
|
||||||
|
"from mpl_toolkits.mplot3d import axes3d\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"from matplotlib import cm\n",
|
||||||
|
"\n",
|
||||||
|
"fig = plt.figure()\n",
|
||||||
|
"ax = fig.gca(projection='3d')\n",
|
||||||
|
"X, Y, Z = axes3d.get_test_data(0.05)\n",
|
||||||
|
"ax.plot_surface(X, Y, Z, rstride=8, cstride=8, alpha=0.3)\n",
|
||||||
|
"cset = ax.contourf(X, Y, Z, zdir='z', offset=-100, cmap=cm.coolwarm)\n",
|
||||||
|
"cset = ax.contourf(X, Y, Z, zdir='x', offset=-40, cmap=cm.coolwarm)\n",
|
||||||
|
"cset = ax.contourf(X, Y, Z, zdir='y', offset=40, cmap=cm.coolwarm)\n",
|
||||||
|
"\n",
|
||||||
|
"ax.set_xlabel('X')\n",
|
||||||
|
"ax.set_xlim(-40, 40)\n",
|
||||||
|
"ax.set_ylabel('Y')\n",
|
||||||
|
"ax.set_ylim(-40, 40)\n",
|
||||||
|
"ax.set_zlabel('Z')\n",
|
||||||
|
"ax.set_zlim(-100, 100)\n",
|
||||||
|
"\n",
|
||||||
|
"plt.show()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Seaborn\n",
|
||||||
|
"# for more examples, see http://stanford.edu/~mwaskom/software/seaborn/examples/index.html\n",
|
||||||
|
"import seaborn as sns\n",
|
||||||
|
"sns.set()\n",
|
||||||
|
"df = sns.load_dataset(\"iris\")\n",
|
||||||
|
"sns.pairplot(df, hue=\"species\", height=1.5)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# altair-2.0.0 example\n",
|
||||||
|
"import altair as alt\n",
|
||||||
|
"\n",
|
||||||
|
"# Uncomment/run this line to enable Altair in JupyterLab/nteract:\n",
|
||||||
|
"#alt.renderers.enable('default') # api_v2\n",
|
||||||
|
"#alt.renderers.enable('notebook') # api_v2,if in Notebook\n",
|
||||||
|
"alt.Chart(df).mark_bar().encode(\n",
|
||||||
|
" x=alt.X('sepal_length', bin=alt.Bin(maxbins=50)),\n",
|
||||||
|
" y='count(*):Q',\n",
|
||||||
|
" color='species:N',\n",
|
||||||
|
" #column='species',\n",
|
||||||
|
").interactive() # api_v1 .configure_cell(width=200)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# temporary warning removal\n",
|
||||||
|
"import warnings\n",
|
||||||
|
"import matplotlib as mpl\n",
|
||||||
|
"warnings.filterwarnings(\"ignore\", category=mpl.cbook.MatplotlibDeprecationWarning)\n",
|
||||||
|
"# Holoviews\n",
|
||||||
|
"# for more example, see http://holoviews.org/Tutorials/index.html\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import holoviews as hv\n",
|
||||||
|
"hv.extension('matplotlib')\n",
|
||||||
|
"dots = np.linspace(-0.45, 0.45, 11)\n",
|
||||||
|
"fractal = hv.Image(image)\n",
|
||||||
|
"\n",
|
||||||
|
"layouts = {y: (fractal * hv.Points(fractal.sample([(i,y) for i in dots])) +\n",
|
||||||
|
" fractal.sample(y=y) )\n",
|
||||||
|
" for y in np.linspace(0, 0.45,11)}\n",
|
||||||
|
"\n",
|
||||||
|
"hv.HoloMap(layouts, kdims=['Y']).collate().cols(2)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Bokeh 0.12.5 \n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"from six.moves import zip\n",
|
||||||
|
"from bokeh.plotting import figure, show, output_notebook\n",
|
||||||
|
"N = 4000\n",
|
||||||
|
"x = np.random.random(size=N) * 100\n",
|
||||||
|
"y = np.random.random(size=N) * 100\n",
|
||||||
|
"radii = np.random.random(size=N) * 1.5\n",
|
||||||
|
"colors = [\"#%02x%02x%02x\" % (int(r), int(g), 150) for r, g in zip(50+2*x, 30+2*y)]\n",
|
||||||
|
"\n",
|
||||||
|
"output_notebook()\n",
|
||||||
|
"TOOLS=\"hover,crosshair,pan,wheel_zoom,box_zoom,reset,tap,save,box_select,poly_select,lasso_select\"\n",
|
||||||
|
"\n",
|
||||||
|
"p = figure(tools=TOOLS)\n",
|
||||||
|
"p.scatter(x,y, radius=radii, fill_color=colors, fill_alpha=0.6, line_color=None)\n",
|
||||||
|
"show(p)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Datashader (holoviews+Bokeh)\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"import holoviews as hv\n",
|
||||||
|
"import datashader as ds\n",
|
||||||
|
"from holoviews.operation.datashader import aggregate, shade, datashade, dynspread\n",
|
||||||
|
"from bokeh.models import DatetimeTickFormatter\n",
|
||||||
|
"hv.extension('bokeh')\n",
|
||||||
|
"\n",
|
||||||
|
"def time_series(T = 1, N = 100, mu = 0.1, sigma = 0.1, S0 = 20): \n",
|
||||||
|
" \"\"\"Parameterized noisy time series\"\"\"\n",
|
||||||
|
" dt = float(T)/N\n",
|
||||||
|
" t = np.linspace(0, T, N)\n",
|
||||||
|
" W = np.random.standard_normal(size = N) \n",
|
||||||
|
" W = np.cumsum(W)*np.sqrt(dt) # standard brownian motion\n",
|
||||||
|
" X = (mu-0.5*sigma**2)*t + sigma*W \n",
|
||||||
|
" S = S0*np.exp(X) # geometric brownian motion\n",
|
||||||
|
" return S\n",
|
||||||
|
"\n",
|
||||||
|
"def apply_formatter(plot, element):\n",
|
||||||
|
" plot.handles['xaxis'].formatter = DatetimeTickFormatter()\n",
|
||||||
|
" \n",
|
||||||
|
"drange = pd.date_range(start=\"2014-01-01\", end=\"2016-01-01\", freq='1D') # or '1min'\n",
|
||||||
|
"dates = drange.values.astype('int64')/10**6 # Convert dates to ints\n",
|
||||||
|
"curve = hv.Curve((dates, time_series(N=len(dates), sigma = 1)))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%opts RGB [finalize_hooks=[apply_formatter] width=800]\n",
|
||||||
|
"%%opts Overlay [finalize_hooks=[apply_formatter] width=800] \n",
|
||||||
|
"%%opts Scatter [tools=['hover', 'box_select']] (line_color=\"black\" fill_color=\"red\" size=10)\n",
|
||||||
|
"\n",
|
||||||
|
"from holoviews.operation.timeseries import rolling, rolling_outlier_std\n",
|
||||||
|
"smoothed = rolling(curve, rolling_window=50)\n",
|
||||||
|
"outliers = rolling_outlier_std(curve, rolling_window=50, sigma=2)\n",
|
||||||
|
"datashade(curve, cmap=[\"blue\"]) * dynspread(datashade(smoothed, cmap=[\"red\"]),max_px=1) * outliers"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#bqplot\n",
|
||||||
|
"from IPython.display import display\n",
|
||||||
|
"from bqplot import (Figure, Map, Mercator, Orthographic, ColorScale, ColorAxis,\n",
|
||||||
|
" AlbersUSA, topo_load, Tooltip)\n",
|
||||||
|
"def_tt = Tooltip(fields=['id', 'name'])\n",
|
||||||
|
"map_mark = Map(scales={'projection': Mercator()}, tooltip=def_tt)\n",
|
||||||
|
"map_mark.interactions = {'click': 'select', 'hover': 'tooltip'}\n",
|
||||||
|
"fig = Figure(marks=[map_mark], title='Interactions Example')\n",
|
||||||
|
"display(fig)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# ipyleaflet (javascript library usage)\n",
|
||||||
|
"from ipyleaflet import (\n",
|
||||||
|
" Map, Marker, TileLayer, ImageOverlay, Polyline, Polygon,\n",
|
||||||
|
" Rectangle, Circle, CircleMarker, GeoJSON, DrawControl\n",
|
||||||
|
")\n",
|
||||||
|
"from traitlets import link\n",
|
||||||
|
"center = [34.6252978589571, -77.34580993652344]\n",
|
||||||
|
"m = Map(center=[34.6252978589571, -77.34580993652344], zoom=10)\n",
|
||||||
|
"dc = DrawControl()\n",
|
||||||
|
"\n",
|
||||||
|
"def handle_draw(self, action, geo_json):\n",
|
||||||
|
" print(action)\n",
|
||||||
|
" print(geo_json)\n",
|
||||||
|
"m\n",
|
||||||
|
"m"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"dc.on_draw(handle_draw)\n",
|
||||||
|
"m.add_control(dc)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# plotnine: giving a taste of ggplot of R langage (formerly we were using ggpy)\n",
|
||||||
|
"from plotnine import ggplot, aes, geom_blank, geom_point, stat_smooth, facet_wrap, theme_bw\n",
|
||||||
|
"from plotnine.data import mtcars\n",
|
||||||
|
"ggplot(mtcars, aes(x='hp', y='wt', color='mpg')) + geom_point() +\\\n",
|
||||||
|
"facet_wrap(\"~cyl\") + theme_bw()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Ipython Notebook: Interactivity & other"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import IPython;IPython.__version__"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Audio Example : https://github.com/ipython/ipywidgets/blob/master/examples/Beat%20Frequencies.ipynb\n",
|
||||||
|
"%matplotlib inline\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"from ipywidgets import interactive\n",
|
||||||
|
"from IPython.display import Audio, display\n",
|
||||||
|
"def beat_freq(f1=220.0, f2=224.0):\n",
|
||||||
|
" max_time = 3\n",
|
||||||
|
" rate = 8000\n",
|
||||||
|
" times = np.linspace(0,max_time,rate*max_time)\n",
|
||||||
|
" signal = np.sin(2*np.pi*f1*times) + np.sin(2*np.pi*f2*times)\n",
|
||||||
|
" print(f1, f2, abs(f1-f2))\n",
|
||||||
|
" display(Audio(data=signal, rate=rate))\n",
|
||||||
|
" try:\n",
|
||||||
|
" plt.plot(signal); #plt.plot(v.result);\n",
|
||||||
|
" except:\n",
|
||||||
|
" pass\n",
|
||||||
|
" return signal\n",
|
||||||
|
"v = interactive(beat_freq, f1=(200.0,300.0), f2=(200.0,300.0))\n",
|
||||||
|
"display(v)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Networks graph Example : https://github.com/ipython/ipywidgets/blob/master/examples/Exploring%20Graphs.ipynb\n",
|
||||||
|
"%matplotlib inline\n",
|
||||||
|
"from ipywidgets import interact\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"import networkx as nx\n",
|
||||||
|
"# wrap a few graph generation functions so they have the same signature\n",
|
||||||
|
"\n",
|
||||||
|
"def random_lobster(n, m, k, p):\n",
|
||||||
|
" return nx.random_lobster(n, p, p / m)\n",
|
||||||
|
"\n",
|
||||||
|
"def powerlaw_cluster(n, m, k, p):\n",
|
||||||
|
" return nx.powerlaw_cluster_graph(n, m, p)\n",
|
||||||
|
"\n",
|
||||||
|
"def erdos_renyi(n, m, k, p):\n",
|
||||||
|
" return nx.erdos_renyi_graph(n, p)\n",
|
||||||
|
"\n",
|
||||||
|
"def newman_watts_strogatz(n, m, k, p):\n",
|
||||||
|
" return nx.newman_watts_strogatz_graph(n, k, p)\n",
|
||||||
|
"\n",
|
||||||
|
"@interact(n=(2,30), m=(1,10), k=(1,10), p=(0.0, 1.0, 0.001),\n",
|
||||||
|
" generator={'lobster': random_lobster,\n",
|
||||||
|
" 'power law': powerlaw_cluster,\n",
|
||||||
|
" 'Newman-Watts-Strogatz': newman_watts_strogatz,\n",
|
||||||
|
" u'Erdős-Rényi': erdos_renyi,\n",
|
||||||
|
" })\n",
|
||||||
|
"def plot_random_graph(n, m, k, p, generator):\n",
|
||||||
|
" g = generator(n, m, k, p)\n",
|
||||||
|
" nx.draw(g)\n",
|
||||||
|
" plt.title(generator.__name__)\n",
|
||||||
|
" plt.show()\n",
|
||||||
|
" "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Mathematical: statsmodels, lmfit, "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking statsmodels\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"plt.style.use('ggplot')\n",
|
||||||
|
"import statsmodels.api as sm\n",
|
||||||
|
"data = sm.datasets.anes96.load_pandas()\n",
|
||||||
|
"party_ID = np.arange(7)\n",
|
||||||
|
"labels = [\"Strong Democrat\", \"Weak Democrat\", \"Independent-Democrat\",\n",
|
||||||
|
" \"Independent-Independent\", \"Independent-Republican\",\n",
|
||||||
|
" \"Weak Republican\", \"Strong Republican\"]\n",
|
||||||
|
"plt.rcParams['figure.subplot.bottom'] = 0.23 # keep labels visible\n",
|
||||||
|
"plt.rcParams['figure.figsize'] = (6.0, 4.0) # make plot larger in notebook\n",
|
||||||
|
"age = [data.exog['age'][data.endog == id] for id in party_ID]\n",
|
||||||
|
"fig = plt.figure()\n",
|
||||||
|
"ax = fig.add_subplot(111)\n",
|
||||||
|
"plot_opts={'cutoff_val':5, 'cutoff_type':'abs',\n",
|
||||||
|
" 'label_fontsize':'small',\n",
|
||||||
|
" 'label_rotation':30}\n",
|
||||||
|
"sm.graphics.beanplot(age, ax=ax, labels=labels,\n",
|
||||||
|
" plot_opts=plot_opts)\n",
|
||||||
|
"ax.set_xlabel(\"Party identification of respondent\")\n",
|
||||||
|
"ax.set_ylabel(\"Age\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# lmfit test (from http://nbviewer.ipython.org/github/lmfit/lmfit-py/blob/master/examples/lmfit-model.ipynb)\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"def decay(t, N, tau):\n",
|
||||||
|
" return N*np.exp(-t/tau)\n",
|
||||||
|
"t = np.linspace(0, 5, num=1000)\n",
|
||||||
|
"data = decay(t, 7, 3) + np.random.randn(*t.shape)\n",
|
||||||
|
"\n",
|
||||||
|
"from lmfit import Model\n",
|
||||||
|
"\n",
|
||||||
|
"model = Model(decay, independent_vars=['t'])\n",
|
||||||
|
"result = model.fit(data, t=t, N=10, tau=1)\n",
|
||||||
|
"plt.plot(t, data) # data\n",
|
||||||
|
"plt.plot(t, decay(t=t, **result.values), color='orange', linewidth=5) # best-fit model"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## DataFrames: Pandas, Dask"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#Pandas \n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"\n",
|
||||||
|
"idx = pd.date_range('2000', '2005', freq='d', closed='left')\n",
|
||||||
|
"datas = pd.DataFrame({'Color': [ 'green' if x> 1 else 'red' for x in np.random.randn(len(idx))], \n",
|
||||||
|
" 'Measure': np.random.randn(len(idx)), 'Year': idx.year},\n",
|
||||||
|
" index=idx.date)\n",
|
||||||
|
"datas.head()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"### Split / Apply / Combine \n",
|
||||||
|
" Split your data into multiple independent groups.\n",
|
||||||
|
" Apply some function to each group.\n",
|
||||||
|
" Combine your groups back into a single data object.\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"datas.query('Measure > 0').groupby(['Color','Year']).size().unstack()"
|
||||||
|
]
|
||||||
|
},
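{
"cell_type": "markdown",
"metadata": {},
"source": [
"The same result, broken into the three explicit steps. This is a minimal sketch added for illustration; it reuses the `datas` DataFrame defined above and is not part of the original notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# minimal sketch (added): the same split / apply / combine, step by step\n",
"positive = datas.query('Measure > 0')           # keep only positive measures\n",
"grouped = positive.groupby(['Color', 'Year'])   # split into groups\n",
"counts = grouped.size()                         # apply a reduction to each group\n",
"counts.unstack()                                # combine into a Color x Year table"
]
},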
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Web Scraping: Beautifulsoup"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking Web Scraping: beautifulsoup and requests \n",
|
||||||
|
"import requests\n",
|
||||||
|
"from bs4 import BeautifulSoup\n",
|
||||||
|
"\n",
|
||||||
|
"URL = 'http://en.wikipedia.org/wiki/Franklin,_Tennessee'\n",
|
||||||
|
"\n",
|
||||||
|
"req = requests.get(URL, headers={'User-Agent' : \"Mining the Social Web\"})\n",
|
||||||
|
"soup = BeautifulSoup(req.text, \"lxml\")\n",
|
||||||
|
"\n",
|
||||||
|
"geoTag = soup.find(True, 'geo')\n",
|
||||||
|
"\n",
|
||||||
|
"if geoTag and len(geoTag) > 1:\n",
|
||||||
|
" lat = geoTag.find(True, 'latitude').string\n",
|
||||||
|
" lon = geoTag.find(True, 'longitude').string\n",
|
||||||
|
" print ('Location is at', lat, lon)\n",
|
||||||
|
"elif geoTag and len(geoTag) == 1:\n",
|
||||||
|
" (lat, lon) = geoTag.string.split(';')\n",
|
||||||
|
" (lat, lon) = (lat.strip(), lon.strip())\n",
|
||||||
|
" print ('Location is at', lat, lon)\n",
|
||||||
|
"else:\n",
|
||||||
|
" print ('No location found')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Operations Research: Pulp"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"scrolled": true
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Pulp example : minimizing the weight to carry 99 pennies\n",
|
||||||
|
"# (from Philip I Thomas)\n",
|
||||||
|
"# see https://www.youtube.com/watch?v=UmMn-N5w-lI#t=995\n",
|
||||||
|
"# Import PuLP modeler functions\n",
|
||||||
|
"from pulp import *\n",
|
||||||
|
"# The prob variable is created to contain the problem data \n",
|
||||||
|
"prob = LpProblem(\"99 pennies Problem\",LpMinimize)\n",
|
||||||
|
"\n",
|
||||||
|
"# Variables represent how many of each coin we want to carry\n",
|
||||||
|
"pennies = LpVariable(\"Number of pennies\",0,None,LpInteger)\n",
|
||||||
|
"nickels = LpVariable(\"Number of nickels\",0,None,LpInteger)\n",
|
||||||
|
"dimes = LpVariable(\"Number of dimes\",0,None,LpInteger)\n",
|
||||||
|
"quarters = LpVariable(\"Number of quarters\",0,None,LpInteger)\n",
|
||||||
|
"\n",
|
||||||
|
"# The objective function is added to 'prob' first\n",
|
||||||
|
"\n",
|
||||||
|
"# we want to minimize (LpMinimize) this \n",
|
||||||
|
"prob += 2.5 * pennies + 5 * nickels + 2.268 * dimes + 5.670 * quarters, \"Total coins Weight\"\n",
|
||||||
|
"\n",
|
||||||
|
"# We want exactly 99 cents\n",
|
||||||
|
"prob += 1 * pennies + 5 * nickels + 10 * dimes + 25 * quarters == 99, \"\"\n",
|
||||||
|
"\n",
|
||||||
|
"# The problem data is written to an .lp file\n",
|
||||||
|
"prob.writeLP(\"99cents.lp\")\n",
|
||||||
|
"prob.solve()\n",
|
||||||
|
"\n",
|
||||||
|
"# print (\"status\",LpStatus[prob.status] )\n",
|
||||||
|
"print (\"Minimal Weight to carry exactly 99 pennies is %s grams\" % value(prob.objective))\n",
|
||||||
|
"# Each of the variables is printed with it's resolved optimum value\n",
|
||||||
|
"for v in prob.variables():\n",
|
||||||
|
" print (v.name, \"=\", v.varValue)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Deep Learning: see tutorial-first-neural-network-python-keras"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Symbolic Calculation: sympy"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking sympy \n",
|
||||||
|
"import sympy\n",
|
||||||
|
"a, b =sympy.symbols('a b')\n",
|
||||||
|
"e=(a+b)**5\n",
|
||||||
|
"e.expand()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## SQL tools: sqlite, Ipython-sql, sqlite_bro, baresql, db.py"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking Ipython-sql, sqlparse, SQLalchemy\n",
|
||||||
|
"%load_ext sql"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%sql sqlite:///.baresql.db\n",
|
||||||
|
"DROP TABLE IF EXISTS writer;\n",
|
||||||
|
"CREATE TABLE writer (first_name, last_name, year_of_death);\n",
|
||||||
|
"INSERT INTO writer VALUES ('William', 'Shakespeare', 1616);\n",
|
||||||
|
"INSERT INTO writer VALUES ('Bertold', 'Brecht', 1956);\n",
|
||||||
|
"SELECT * , sqlite_version() as sqlite_version from Writer order by Year_of_death"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking baresql\n",
|
||||||
|
"from __future__ import print_function, unicode_literals, division # line needed only if Python2.7\n",
|
||||||
|
"from baresql import baresql\n",
|
||||||
|
"bsql = baresql.baresql(connection=\"sqlite:///.baresql.db\")\n",
|
||||||
|
"bsqldf = lambda q: bsql.df(q, dict(globals(),**locals()))\n",
|
||||||
|
"\n",
|
||||||
|
"users = ['Alexander', 'Billy', 'Charles', 'Danielle', 'Esmeralda', 'Franz', 'Greg']\n",
|
||||||
|
"# We use the python 'users' list like a SQL table\n",
|
||||||
|
"sql = \"select 'Welcome ' || c0 || ' !' as say_hello, length(c0) as name_length from users$$ where c0 like '%a%' \"\n",
|
||||||
|
"bsqldf(sql)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Transfering Datas to sqlite, doing transformation in sql, going back to Pandas and Matplotlib\n",
|
||||||
|
"bsqldf('''\n",
|
||||||
|
"select Color, Year, count(*) as size \n",
|
||||||
|
"from datas$$ \n",
|
||||||
|
"where Measure > 0 \n",
|
||||||
|
"group by Color, Year'''\n",
|
||||||
|
" ).set_index(['Year', 'Color']).unstack().plot(kind='bar')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking db.py\n",
|
||||||
|
"from db import DB\n",
|
||||||
|
"db=DB(dbtype=\"sqlite\", filename=\".baresql.db\")\n",
|
||||||
|
"db.query(\"select sqlite_version() as sqlite_version ;\") "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"db.tables"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# checking sqlite_bro: this should lanch a separate non-browser window with sqlite_bro's welcome\n",
|
||||||
|
"!cmd start cmd /C sqlite_bro"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# pyodbc \n",
|
||||||
|
"import pyodbc\n",
|
||||||
|
"\n",
|
||||||
|
"# look for pyodbc providers\n",
|
||||||
|
"sources = pyodbc.dataSources()\n",
|
||||||
|
"dsns = list(sources.keys())\n",
|
||||||
|
"sl = [' %s [%s]' % (dsn, sources[dsn]) for dsn in dsns]\n",
|
||||||
|
"print(\"pyodbc Providers: (beware 32/64 bit driver and python version must match)\\n\", '\\n'.join(sl))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# pythonnet\n",
|
||||||
|
"import clr\n",
|
||||||
|
"clr.AddReference(\"System.Data\")\n",
|
||||||
|
"clr.AddReference('System.Data.Common')\n",
|
||||||
|
"import System.Data.OleDb as ADONET\n",
|
||||||
|
"import System.Data.Odbc as ODBCNET\n",
|
||||||
|
"import System.Data.Common as DATACOM\n",
|
||||||
|
"\n",
|
||||||
|
"table = DATACOM.DbProviderFactories.GetFactoryClasses()\n",
|
||||||
|
"print(\"\\n .NET Providers: (beware 32/64 bit driver and python version must match)\")\n",
|
||||||
|
"for row in table.Rows:\n",
|
||||||
|
" print(\" %s\" % row[table.Columns[0]])\n",
|
||||||
|
" print(\" \",[row[column] for column in table.Columns if column != table.Columns[0]])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Qt libraries Demo\n",
|
||||||
|
"\n",
|
||||||
|
" \n",
|
||||||
|
"#### See [Dedicated Qt Libraries Demo](Qt_libraries_demo.ipynb)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Wrap-up"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# optional scipy full test (takes up to 10 minutes)\n",
|
||||||
|
"#!cmd /C start cmd /k python.exe -c \"import scipy;scipy.test()\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"!pip list"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.7rc1"
|
||||||
|
},
|
||||||
|
"widgets": {
|
||||||
|
"state": {
|
||||||
|
"056d32c70f644417b86a152d3a2385bd": {
|
||||||
|
"views": [
|
||||||
|
{
|
||||||
|
"cell_index": 14
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"2307e84bf81346d49818eef8862360ca": {
|
||||||
|
"views": [
|
||||||
|
{
|
||||||
|
"cell_index": 22
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"4e7a6f5db8e74905a08d4636afa3b82f": {
|
||||||
|
"views": [
|
||||||
|
{
|
||||||
|
"cell_index": 15
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"e762d7875083491eb2933958cc3331a9": {
|
||||||
|
"views": [
|
||||||
|
{
|
||||||
|
"cell_index": 21
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"version": "1.2.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
@ -0,0 +1,863 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Tom Augspurger Dplyr/Pandas comparison (copy of 2016-01-01)\n",
|
||||||
|
"\n",
|
||||||
|
"### See result there\n",
|
||||||
|
"http://nbviewer.ipython.org/urls/gist.githubusercontent.com/TomAugspurger/6e052140eaa5fdb6e8c0/raw/627b77addb4bcfc39ab6be6d85cb461e956fb3a3/dplyr_pandas.ipynb\n",
|
||||||
|
"\n",
|
||||||
|
"### to reproduce on your WinPython you'll need to get flights.csv in this directory"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"This notebook compares [pandas](http://pandas.pydata.org)\n",
|
||||||
|
"and [dplyr](http://cran.r-project.org/web/packages/dplyr/index.html).\n",
|
||||||
|
"The comparison is just on syntax (verbage), not performance. Whether you're an R user looking to switch to pandas (or the other way around), I hope this guide will help ease the transition.\n",
|
||||||
|
"\n",
|
||||||
|
"We'll work through the [introductory dplyr vignette](http://cran.r-project.org/web/packages/dplyr/vignettes/introduction.html) to analyze some flight data.\n",
|
||||||
|
"\n",
|
||||||
|
"I'm working on a better layout to show the two packages side by side.\n",
|
||||||
|
"But for now I'm just putting the ``dplyr`` code in a comment above each python call.\n",
|
||||||
|
"\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"### using R steps to get flights.csv\n",
|
||||||
|
"\n",
|
||||||
|
"un-comment the next cell unless you have installed R and want to get Flights example from the source\n",
|
||||||
|
"\n",
|
||||||
|
"to install R on your Winpython:\n",
|
||||||
|
"[how to install R](installing_R.ipynb)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"collapsed": true
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#%load_ext rpy2.ipython\n",
|
||||||
|
"#%R install.packages(\"nycflights13\", repos='http://cran.us.r-project.org')\n",
|
||||||
|
"#%R library(nycflights13)\n",
|
||||||
|
"#%R write.csv(flights, \"flights.csv\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"### using an internet download to get flight.qcsv"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Downloading and unzipg a file, without R method :\n",
|
||||||
|
"# source= http://stackoverflow.com/a/34863053/3140336\n",
|
||||||
|
"import io\n",
|
||||||
|
"from zipfile import ZipFile\n",
|
||||||
|
"import requests\n",
|
||||||
|
"\n",
|
||||||
|
"def get_zip(file_url):\n",
|
||||||
|
" url = requests.get(file_url)\n",
|
||||||
|
" zipfile = ZipFile(io.BytesIO(url.content))\n",
|
||||||
|
" zip_names = zipfile.namelist()\n",
|
||||||
|
" if len(zip_names) == 1:\n",
|
||||||
|
" file_name = zip_names.pop()\n",
|
||||||
|
" extracted_file = zipfile.open(file_name)\n",
|
||||||
|
" return extracted_file\n",
|
||||||
|
"\n",
|
||||||
|
"url=r'https://github.com/winpython/winpython_afterdoc/raw/master/examples/nycflights13_datas/flights.zip'\n",
|
||||||
|
"with io.open(\"flights.csv\", 'wb') as f:\n",
|
||||||
|
" f.write(get_zip(url).read())\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Some prep work to get the data from R and into pandas\n",
|
||||||
|
"%matplotlib inline\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"#%load_ext rpy2.ipython\n",
|
||||||
|
"\n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"import seaborn as sns\n",
|
||||||
|
"\n",
|
||||||
|
"pd.set_option(\"display.max_rows\", 5)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Data: nycflights13"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"flights = pd.read_csv(\"flights.csv\", index_col=0)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# dim(flights) <--- The R code\n",
|
||||||
|
"flights.shape # <--- The python code"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# head(flights)\n",
|
||||||
|
"flights.head()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Single table verbs"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"``dplyr`` has a small set of nicely defined verbs. I've listed their closest pandas verbs.\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"<table>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><b>dplyr</b></td>\n",
|
||||||
|
" <td><b>pandas</b></td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><code>filter()</code> (and <code>slice()</code>)</td>\n",
|
||||||
|
" <td><code>query()</code> (and <code>loc[]</code>, <code>iloc[]</code>)</td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><code>arrange()</code></td>\n",
|
||||||
|
" <td><code>sort_values</code> and <code>sort_index()</code></td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><code>select() </code>(and <code>rename()</code>)</td>\n",
|
||||||
|
" <td><code>__getitem__ </code> (and <code>rename()</code>)</td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><code>distinct()</code></td>\n",
|
||||||
|
" <td><code>drop_duplicates()</code></td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><code>mutate()</code> (and <code>transmute()</code>)</td>\n",
|
||||||
|
" <td>assign</td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td>summarise()</td>\n",
|
||||||
|
" <td>None</td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td>sample_n() and sample_frac()</td>\n",
|
||||||
|
" <td><code>sample</code></td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
" <tr>\n",
|
||||||
|
" <td><code>%>%</code></td>\n",
|
||||||
|
" <td><code>pipe</code></td>\n",
|
||||||
|
" </tr>\n",
|
||||||
|
"\n",
|
||||||
|
"</table>\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"Some of the \"missing\" verbs in pandas are because there are other, different ways of achieving the same goal. For example `summarise` is spread across `mean`, `std`, etc. It's closest analog is actually the `.agg` method on a `GroupBy` object, as it reduces a DataFrame to a single row (per group). This isn't quite what `.describe` does.\n",
|
||||||
|
"\n",
|
||||||
|
"I've also included the `pipe` operator from R (`%>%`), the `pipe` method from pandas, even though it isn't quite a verb."
|
||||||
|
]
|
||||||
|
},
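{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch, added for illustration, of the `summarise` / `.agg` analogy mentioned above; it assumes the `carrier` column of the flights data and is not part of the original vignette."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# minimal sketch (added): .agg on a GroupBy reduces each group to one row,\n",
"# the closest pandas analog of dplyr's summarise()\n",
"# ('carrier' is assumed to be a column of the flights data)\n",
"flights.groupby('carrier').agg({'dep_delay': 'mean', 'arr_delay': 'mean'}).head()"
]
},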
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Filter rows with filter(), query()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# filter(flights, month == 1, day == 1)\n",
|
||||||
|
"flights.query(\"month == 1 & day == 1\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"We see the first big *language* difference between R and python.\n",
|
||||||
|
"Many python programmers will shun the R code as too magical.\n",
|
||||||
|
"How is the programmer supposed to know that `month` and `day` are supposed to represent columns in the DataFrame?\n",
|
||||||
|
"On the other hand, to emulate this *very* convenient feature of R, python has to write the expression as a string, and evaluate the string in the context of the DataFrame."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"The more verbose version:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# flights[flights$month == 1 & flights$day == 1, ]\n",
|
||||||
|
"flights[(flights.month == 1) & (flights.day == 1)]"
|
||||||
|
]
|
||||||
|
},
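{
"cell_type": "markdown",
"metadata": {},
"source": [
"A further minimal sketch, added for illustration: the `query` string can also refer to local Python variables with `@` (the variable name below is made up for the example)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# minimal sketch (added): referring to a local Python variable inside query() with @\n",
"target_month = 1  # hypothetical local variable, not from the original notebook\n",
"flights.query(\"month == @target_month & day == 1\").head()"
]
},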
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# slice(flights, 1:10)\n",
|
||||||
|
"flights.iloc[:9]"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Arrange rows with arrange(), sort()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# arrange(flights, year, month, day) \n",
|
||||||
|
"flights.sort_values(['year', 'month', 'day'])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# arrange(flights, desc(arr_delay))\n",
|
||||||
|
"flights.sort_values('arr_delay', ascending=False)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"It's worth mentioning the other common sorting method for pandas DataFrames, `sort_index`. Pandas puts much more emphasis on indicies, (or row labels) than R.\n",
|
||||||
|
"This is a design decision that has positives and negatives, which we won't go into here. Suffice to say that when you need to sort a `DataFrame` by the index, use `DataFrame.sort_index`."
|
||||||
|
]
|
||||||
|
},
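{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch, added for illustration, of sorting by the index (the row labels) rather than by a column:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# minimal sketch (added): sort by the row labels (the index) instead of a column\n",
"flights.sort_index().head()"
]
},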
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Select columns with select(), []"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# select(flights, year, month, day) \n",
|
||||||
|
"flights[['year', 'month', 'day']]"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# select(flights, year:day) \n",
|
||||||
|
"flights.loc[:, 'year':'day']"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"collapsed": true
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# select(flights, -(year:day)) \n",
|
||||||
|
"\n",
|
||||||
|
"# No direct equivalent here. I would typically use\n",
|
||||||
|
"# flights.drop(cols_to_drop, axis=1)\n",
|
||||||
|
"# or fligths[flights.columns.difference(pd.Index(cols_to_drop))]\n",
|
||||||
|
"# point to dplyr!"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# select(flights, tail_num = tailnum)\n",
|
||||||
|
"flights.rename(columns={'tailnum': 'tail_num'})['tail_num']"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"But like Hadley mentions, not that useful since it only returns the one column. ``dplyr`` and ``pandas`` compare well here."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# rename(flights, tail_num = tailnum)\n",
|
||||||
|
"flights.rename(columns={'tailnum': 'tail_num'})"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Pandas is more verbose, but the the argument to `columns` can be any mapping. So it's often used with a function to perform a common task, say `df.rename(columns=lambda x: x.replace('-', '_'))` to replace any dashes with underscores. Also, ``rename`` (the pandas version) can be applied to the Index."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"One more note on the differences here.\n",
|
||||||
|
"Pandas could easily include a `.select` method.\n",
|
||||||
|
"[`xray`](http://xray.readthedocs.org/en/stable/), a library that builds on top of NumPy and pandas to offer labeled N-dimensional arrays (along with many other things) does [just that](http://xray.readthedocs.org/en/stable/indexing.html#indexing-with-labeled-dimensions).\n",
|
||||||
|
"Pandas chooses the `.loc` and `.iloc` accessors because *any valid selection is also a valid assignment*. This makes it easier to modify the data.\n",
|
||||||
|
"\n",
|
||||||
|
"```python\n",
|
||||||
|
"flights.loc[:, 'year':'day'] = data\n",
|
||||||
|
"```\n",
|
||||||
|
"\n",
|
||||||
|
"where `data` is an object that is, or can be broadcast to, the correct shape."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Extract distinct (unique) rows "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# distinct(select(flights, tailnum))\n",
|
||||||
|
"flights.tailnum.unique()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"FYI this returns a numpy array instead of a Series."
|
||||||
|
]
|
||||||
|
},
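{
"cell_type": "markdown",
"metadata": {},
"source": [
"A one-line alternative, added for illustration, that keeps the result as a pandas Series instead of a numpy array:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# minimal sketch (added): drop_duplicates on a Series returns a Series\n",
"flights['tailnum'].drop_duplicates()"
]
},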
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# distinct(select(flights, origin, dest))\n",
|
||||||
|
"flights[['origin', 'dest']].drop_duplicates()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"OK, so ``dplyr`` wins there from a consistency point of view. ``unique`` is only defined on Series, not DataFrames."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Add new columns with mutate() "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"We at pandas shamelessly stole this for [v0.16.0](http://pandas.pydata.org/pandas-docs/stable/whatsnew.html#whatsnew-0160-enhancements-assign)."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# mutate(flights,\n",
|
||||||
|
"# gain = arr_delay - dep_delay,\n",
|
||||||
|
"# speed = distance / air_time * 60)\n",
|
||||||
|
"\n",
|
||||||
|
"flights.assign(gain=flights.arr_delay - flights.dep_delay,\n",
|
||||||
|
" speed=flights.distance / flights.air_time * 60)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# mutate(flights,\n",
|
||||||
|
"# gain = arr_delay - dep_delay,\n",
|
||||||
|
"# gain_per_hour = gain / (air_time / 60)\n",
|
||||||
|
"# )\n",
|
||||||
|
"\n",
|
||||||
|
"(flights.assign(gain=flights.arr_delay - flights.dep_delay)\n",
|
||||||
|
" .assign(gain_per_hour = lambda df: df.gain / (df.air_time / 60)))\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"The first example is pretty much identical (aside from the names, `mutate` vs. `assign`).\n",
|
||||||
|
"\n",
|
||||||
|
"The second example just comes down to language differences. In `R`, it's possible to implement a function like `mutate` where you can refer to `gain` in the line calcuating `gain_per_hour`, even though `gain` hasn't actually been calcuated yet.\n",
|
||||||
|
"\n",
|
||||||
|
"In Python, you can have arbitrary keyword arguments to functions (which we needed for `.assign`), but the order of the argumnets is arbitrary since `dict`s are unsorted and `**kwargs*` is a `dict`. So you can't have something like `df.assign(x=df.a / df.b, y=x **2)`, because you don't know whether `x` or `y` will come first (you'd also get an error saying `x` is undefined.\n",
|
||||||
|
"\n",
|
||||||
|
"To work around that with pandas, you'll need to split up the assigns, and pass in a *callable* to the second assign. The callable looks at itself to find a column named `gain`. Since the line above returns a DataFrame with the `gain` column added, the pipeline goes through just fine."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# transmute(flights,\n",
|
||||||
|
"# gain = arr_delay - dep_delay,\n",
|
||||||
|
"# gain_per_hour = gain / (air_time / 60)\n",
|
||||||
|
"# )\n",
|
||||||
|
"(flights.assign(gain=flights.arr_delay - flights.dep_delay)\n",
|
||||||
|
" .assign(gain_per_hour = lambda df: df.gain / (df.air_time / 60))\n",
|
||||||
|
" [['gain', 'gain_per_hour']])\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Summarise values with summarise()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# summarise(flights,\n",
|
||||||
|
"# delay = mean(dep_delay, na.rm = TRUE))\n",
|
||||||
|
"flights.dep_delay.mean()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"This is only roughly equivalent.\n",
|
||||||
|
"`summarise` takes a callable (e.g. `mean`, `sum`) and evaluates that on the DataFrame. In pandas these are spread across `pd.DataFrame.mean`, `pd.DataFrame.sum`. This will come up again when we look at `groupby`."
|
||||||
|
]
|
||||||
|
},
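{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch, added for illustration, of getting several reductions in one call with `.agg` (available on DataFrames since pandas 0.20), the closest single-call analog of `summarise` on a whole DataFrame:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# minimal sketch (added): several reductions at once via DataFrame.agg\n",
"flights[['dep_delay', 'arr_delay']].agg(['mean', 'sum'])"
]
},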
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Randomly sample rows with sample_n() and sample_frac()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# sample_n(flights, 10)\n",
|
||||||
|
"flights.sample(n=10)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# sample_frac(flights, 0.01)\n",
|
||||||
|
"flights.sample(frac=.01)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Grouped operations "
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# planes <- group_by(flights, tailnum)\n",
|
||||||
|
"# delay <- summarise(planes,\n",
|
||||||
|
"# count = n(),\n",
|
||||||
|
"# dist = mean(distance, na.rm = TRUE),\n",
|
||||||
|
"# delay = mean(arr_delay, na.rm = TRUE))\n",
|
||||||
|
"# delay <- filter(delay, count > 20, dist < 2000)\n",
|
||||||
|
"\n",
|
||||||
|
"planes = flights.groupby(\"tailnum\")\n",
|
||||||
|
"delay = (planes.agg({\"year\": \"count\",\n",
|
||||||
|
" \"distance\": \"mean\",\n",
|
||||||
|
" \"arr_delay\": \"mean\"})\n",
|
||||||
|
" .rename(columns={\"distance\": \"dist\",\n",
|
||||||
|
" \"arr_delay\": \"delay\",\n",
|
||||||
|
" \"year\": \"count\"})\n",
|
||||||
|
" .query(\"count > 20 & dist < 2000\"))\n",
|
||||||
|
"delay"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"For me, dplyr's ``n()`` looked is a bit starge at first, but it's already growing on me.\n",
|
||||||
|
"\n",
|
||||||
|
"I think pandas is more difficult for this particular example.\n",
|
||||||
|
"There isn't as natural a way to mix column-agnostic aggregations (like ``count``) with column-specific aggregations like the other two. You end up writing could like `.agg{'year': 'count'}` which reads, \"I want the count of `year`\", even though you don't care about `year` specifically. You could just as easily have said `.agg('distance': 'count')`.\n",
|
||||||
|
"Additionally assigning names can't be done as cleanly in pandas; you have to just follow it up with a ``rename`` like before."
|
||||||
|
]
|
||||||
|
},
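As an aside not in the original post: pandas 0.25 later added "named aggregation", which addresses exactly this complaint by letting you name the outputs inside `.agg` and skip the follow-up `rename`. A minimal sketch with a toy frame standing in for `flights`:

```python
import pandas as pd

# Toy stand-in for the flights data (columns assumed for the example).
flights = pd.DataFrame({"tailnum": ["N1", "N1", "N2", "N2", "N2"],
                        "distance": [100, 200, 150, 300, 250],
                        "arr_delay": [5, 15, 0, 20, 10],
                        "year": [2013] * 5})

# pandas >= 0.25: name the output columns directly in .agg
delay = (flights.groupby("tailnum")
                .agg(count=("year", "count"),
                     dist=("distance", "mean"),
                     delay=("arr_delay", "mean"))
                .query("count > 1 & dist < 2000"))
print(delay)
```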
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"We may as well reproduce the graph. It looks like `ggplots` `geom_smooth` is some kind of lowess smoother. We can either us [seaborn](http://stanford.edu/~mwaskom/software/seaborn/):"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"fig, ax = plt.subplots(figsize=(12, 6))\n",
|
||||||
|
"\n",
|
||||||
|
"sns.regplot(\"dist\", \"delay\", data=delay, lowess=True, ax=ax,\n",
|
||||||
|
" scatter_kws={'color': 'k', 'alpha': .5, 's': delay['count'] / 10}, ci=90,\n",
|
||||||
|
" line_kws={'linewidth': 3});"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Or using statsmodels directly for more control over the lowess, with an extremely lazy\n",
|
||||||
|
"\"confidence interval\"."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {
|
||||||
|
"collapsed": true
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import statsmodels.api as sm"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"smooth = sm.nonparametric.lowess(delay.delay, delay.dist, frac=1/8)\n",
|
||||||
|
"ax = delay.plot(kind='scatter', x='dist', y = 'delay', figsize=(12, 6),\n",
|
||||||
|
" color='k', alpha=.5, s=delay['count'] / 10)\n",
|
||||||
|
"ax.plot(smooth[:, 0], smooth[:, 1], linewidth=3);\n",
|
||||||
|
"std = smooth[:, 1].std()\n",
|
||||||
|
"ax.fill_between(smooth[:, 0], smooth[:, 1] - std, smooth[:, 1] + std, alpha=.25);"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# destinations <- group_by(flights, dest)\n",
|
||||||
|
"# summarise(destinations,\n",
|
||||||
|
"# planes = n_distinct(tailnum),\n",
|
||||||
|
"# flights = n()\n",
|
||||||
|
"# )\n",
|
||||||
|
"\n",
|
||||||
|
"destinations = flights.groupby('dest')\n",
|
||||||
|
"destinations.agg({\n",
|
||||||
|
" 'tailnum': lambda x: len(x.unique()),\n",
|
||||||
|
" 'year': 'count'\n",
|
||||||
|
" }).rename(columns={'tailnum': 'planes',\n",
|
||||||
|
" 'year': 'flights'})"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"There's a little know feature to `groupby.agg`: it accepts a dict of dicts mapping\n",
|
||||||
|
"columns to `{name: aggfunc}` pairs. Here's the result:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"destinations = flights.groupby('dest')\n",
|
||||||
|
"r = destinations.agg({'tailnum': {'planes': lambda x: len(x.unique())},\n",
|
||||||
|
" 'year': {'flights': 'count'}})\n",
|
||||||
|
"r"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"The result is a `MultiIndex` in the columns which can be a bit awkard to work with (you can drop a level with `r.columns.droplevel()`). Also the syntax going into the `.agg` may not be the clearest."
|
||||||
|
]
|
||||||
|
},
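To make the `droplevel` remark concrete, here is a small illustration (added, not from the notebook) that builds columns shaped like the ones the dict-of-dicts `.agg` produces and flattens them:

```python
import pandas as pd

# Columns like the ones the dict-of-dicts .agg produces: (source column, new name).
cols = pd.MultiIndex.from_tuples([("tailnum", "planes"), ("year", "flights")])
r = pd.DataFrame([[3, 10], [2, 7]], index=["ATL", "BOS"], columns=cols)

# Keep only the inner level so the frame has plain 'planes' / 'flights' columns.
r.columns = r.columns.droplevel(0)
print(r)
```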
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Similar to how ``dplyr`` provides optimized C++ versions of most of the `summarise` functions, pandas uses [cython](http://cython.org) optimized versions for most of the `agg` methods."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# daily <- group_by(flights, year, month, day)\n",
|
||||||
|
"# (per_day <- summarise(daily, flights = n()))\n",
|
||||||
|
"\n",
|
||||||
|
"daily = flights.groupby(['year', 'month', 'day'])\n",
|
||||||
|
"per_day = daily['distance'].count()\n",
|
||||||
|
"per_day"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# (per_month <- summarise(per_day, flights = sum(flights)))\n",
|
||||||
|
"per_month = per_day.groupby(level=['year', 'month']).sum()\n",
|
||||||
|
"per_month"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# (per_year <- summarise(per_month, flights = sum(flights)))\n",
|
||||||
|
"per_year = per_month.sum()\n",
|
||||||
|
"per_year"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"I'm not sure how ``dplyr`` is handling the other columns, like `year`, in the last example. With pandas, it's clear that we're grouping by them since they're included in the groupby. For the last example, we didn't group by anything, so they aren't included in the result."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Chaining"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Any follower of Hadley's [twitter account](https://twitter.com/hadleywickham/) will know how much R users *love* the ``%>%`` (pipe) operator. And for good reason!"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# flights %>%\n",
|
||||||
|
"# group_by(year, month, day) %>%\n",
|
||||||
|
"# select(arr_delay, dep_delay) %>%\n",
|
||||||
|
"# summarise(\n",
|
||||||
|
"# arr = mean(arr_delay, na.rm = TRUE),\n",
|
||||||
|
"# dep = mean(dep_delay, na.rm = TRUE)\n",
|
||||||
|
"# ) %>%\n",
|
||||||
|
"# filter(arr > 30 | dep > 30)\n",
|
||||||
|
"(\n",
|
||||||
|
"flights.groupby(['year', 'month', 'day'])\n",
|
||||||
|
" [['arr_delay', 'dep_delay']]\n",
|
||||||
|
" .mean()\n",
|
||||||
|
" .query('arr_delay > 30 | dep_delay > 30')\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"A bit of soapboxing here if you'll indulge me.\n",
|
||||||
|
"\n",
|
||||||
|
"The example above is a bit contrived since it only uses methods on `DataFrame`. But what if you have some function to work into your pipeline that pandas hasn't (or won't) implement? In that case you're required to break up your pipeline by assigning your intermediate (probably uninteresting) DataFrame to a temporary variable you don't actually care about.\n",
|
||||||
|
"\n",
|
||||||
|
"`R` doesn't have this problem since the `%>%` operator works with any function that takes (and maybe returns) DataFrames.\n",
|
||||||
|
"The python language doesn't have any notion of right to left function application (other than special cases like `__radd__` and `__rmul__`).\n",
|
||||||
|
"It only allows the usual left to right `function(arguments)`, where you can think of the `()` as the \"call this function\" operator.\n",
|
||||||
|
"\n",
|
||||||
|
"Pandas wanted something like `%>%` and we did it in a farily pythonic way. The `pd.DataFrame.pipe` method takes a function and optionally some arguments, and calls that function with `self` (the DataFrame) as the first argument.\n",
|
||||||
|
"\n",
|
||||||
|
"So\n",
|
||||||
|
"\n",
|
||||||
|
"```R\n",
|
||||||
|
"flights >%> my_function(my_argument=10)\n",
|
||||||
|
"```\n",
|
||||||
|
"\n",
|
||||||
|
"becomes\n",
|
||||||
|
"\n",
|
||||||
|
"```python\n",
|
||||||
|
"flights.pipe(my_function, my_argument=10)\n",
|
||||||
|
"```\n",
|
||||||
|
"\n",
|
||||||
|
"We initially had grander visions for `.pipe`, but the wider python community didn't seem that interested."
|
||||||
|
]
|
||||||
|
},
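A minimal sketch of `.pipe` in action (the helper function below is made up for the example; it is not part of pandas or the original post):

```python
import pandas as pd

def top_delays(df, n=3):
    """Return the n rows with the largest departure delay (hypothetical helper)."""
    return df.nlargest(n, "dep_delay")

flights = pd.DataFrame({"dep_delay": [5, 40, 12, 33],
                        "dest": ["ATL", "BOS", "ORD", "LAX"]})

# R:      flights %>% top_delays(n = 2)
# pandas: flights.pipe(top_delays, n=2)
print(flights.pipe(top_delays, n=2))
```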
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Other Data Sources"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Pandas has tons [IO tools](http://pandas.pydata.org/pandas-docs/version/0.15.0/io.html) to help you get data in and out, including SQL databases via [SQLAlchemy](http://www.sqlalchemy.org)."
|
||||||
|
]
|
||||||
|
},
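A minimal sketch of the SQL round trip mentioned above, using an in-memory SQLite database (the table and column names are invented for the example):

```python
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite database, just for the demo.
engine = create_engine("sqlite://")

# Write a small frame out, then read it back with a query.
pd.DataFrame({"dest": ["ATL", "BOS"], "flights": [10, 7]}).to_sql(
    "per_dest", engine, index=False)
per_dest = pd.read_sql("SELECT * FROM per_dest", engine)
print(per_dest)
```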
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Summary"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"I think pandas held up pretty well, considering this was a vignette written for dplyr. I found the degree of similarity more interesting than the differences. The most difficult task was renaming of columns within an operation; they had to be followed up with a call to ``rename`` *after* the operation, which isn't that burdensome honestly.\n",
|
||||||
|
"\n",
|
||||||
|
"More and more it looks like we're moving towards future where being a language or package partisan just doesn't make sense. Not when you can load up a [Jupyter](http://jupyter.org) (formerly IPython) notebook to call up a library written in R, and hand those results off to python or Julia or whatever for followup, before going back to R to make a cool [shiny](http://shiny.rstudio.com) web app.\n",
|
||||||
|
"\n",
|
||||||
|
"There will always be a place for your \"utility belt\" package like dplyr or pandas, but it wouldn't hurt to be familiar with both.\n",
|
||||||
|
"\n",
|
||||||
|
"If you want to contribute to pandas, we're always looking for help at https://github.com/pydata/pandas/.\n",
|
||||||
|
"You can get ahold of me directly on [twitter](https://twitter.com/tomaugspurger)."
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.2"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 1
|
||||||
|
}
|
File diff suppressed because it is too large
@ -0,0 +1,183 @@
|
|||||||
|
"""
|
||||||
|
Matplotlib Minesweeper
|
||||||
|
----------------------
|
||||||
|
A simple Minesweeper implementation in matplotlib.
|
||||||
|
|
||||||
|
Author: Jake Vanderplas <vanderplas@astro.washington.edu>, Dec. 2012
|
||||||
|
License: BSD
|
||||||
|
"""
|
||||||
|
import numpy as np
|
||||||
|
from itertools import product
|
||||||
|
from scipy.signal import convolve2d
|
||||||
|
import matplotlib.pyplot as plt
|
||||||
|
from matplotlib.patches import RegularPolygon
|
||||||
|
|
||||||
|
|
||||||
|
class MineSweeper(object):
|
||||||
|
covered_color = '#DDDDDD'
|
||||||
|
uncovered_color = '#AAAAAA'
|
||||||
|
edge_color = '#888888'
|
||||||
|
count_colors = ['none', 'blue', 'green', 'red', 'darkblue',
|
||||||
|
'darkred', 'darkgreen', 'black', 'black']
|
||||||
|
flag_vertices = np.array([[0.25, 0.2], [0.25, 0.8],
|
||||||
|
[0.75, 0.65], [0.25, 0.5]])
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def beginner(cls):
|
||||||
|
return cls(8, 8, 10)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def intermediate(cls):
|
||||||
|
return cls(16, 16, 40)
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def expert(cls):
|
||||||
|
return cls(30, 16, 99)
|
||||||
|
|
||||||
|
def __init__(self, width, height, nmines):
|
||||||
|
self.width, self.height, self.nmines = width, height, nmines
|
||||||
|
|
||||||
|
# Create the figure and axes
|
||||||
|
self.fig = plt.figure(figsize=((width + 2) / 3., (height + 2) / 3.))
|
||||||
|
self.ax = self.fig.add_axes((0.05, 0.05, 0.9, 0.9),
|
||||||
|
aspect='equal', frameon=False,
|
||||||
|
xlim=(-0.05, width + 0.05),
|
||||||
|
ylim=(-0.05, height + 0.05))
|
||||||
|
for axis in (self.ax.xaxis, self.ax.yaxis):
|
||||||
|
axis.set_major_formatter(plt.NullFormatter())
|
||||||
|
axis.set_major_locator(plt.NullLocator())
|
||||||
|
|
||||||
|
# Create the grid of squares
|
||||||
|
self.squares = np.array([[RegularPolygon((i + 0.5, j + 0.5),
|
||||||
|
numVertices=4,
|
||||||
|
radius=0.5 * np.sqrt(2),
|
||||||
|
orientation=np.pi / 4,
|
||||||
|
ec=self.edge_color,
|
||||||
|
fc=self.covered_color)
|
||||||
|
for j in range(height)]
|
||||||
|
for i in range(width)])
|
||||||
|
[self.ax.add_patch(sq) for sq in self.squares.flat]
|
||||||
|
|
||||||
|
# define internal state variables
|
||||||
|
self.mines = None
|
||||||
|
self.counts = None
|
||||||
|
self.clicked = np.zeros((self.width, self.height), dtype=bool)
|
||||||
|
self.flags = np.zeros((self.width, self.height), dtype=object)
|
||||||
|
self.game_over = False
|
||||||
|
|
||||||
|
# Create event hook for mouse clicks
|
||||||
|
self.fig.canvas.mpl_connect('button_press_event', self._button_press)
|
||||||
|
|
||||||
|
def _draw_mine(self, i, j):
|
||||||
|
self.ax.add_patch(plt.Circle((i + 0.5, j + 0.5), radius=0.25,
|
||||||
|
ec='black', fc='black'))
|
||||||
|
|
||||||
|
def _draw_red_X(self, i, j):
|
||||||
|
self.ax.text(i + 0.5, j + 0.5, 'X', color='r', fontsize=20,
|
||||||
|
ha='center', va='center')
|
||||||
|
|
||||||
|
def _toggle_mine_flag(self, i, j):
|
||||||
|
if self.clicked[i, j]:
|
||||||
|
pass
|
||||||
|
elif self.flags[i, j]:
|
||||||
|
self.ax.patches.remove(self.flags[i, j])
|
||||||
|
self.flags[i, j] = None
|
||||||
|
else:
|
||||||
|
self.flags[i, j] = plt.Polygon(self.flag_vertices + [i, j],
|
||||||
|
fc='red', ec='black', lw=2)
|
||||||
|
self.ax.add_patch(self.flags[i, j])
|
||||||
|
|
||||||
|
def _reveal_unmarked_mines(self):
|
||||||
|
for (i, j) in zip(*np.where(self.mines & ~self.flags.astype(bool))):
|
||||||
|
self._draw_mine(i, j)
|
||||||
|
|
||||||
|
def _cross_out_wrong_flags(self):
|
||||||
|
for (i, j) in zip(*np.where(~self.mines & self.flags.astype(bool))):
|
||||||
|
self._draw_red_X(i, j)
|
||||||
|
|
||||||
|
def _mark_remaining_mines(self):
|
||||||
|
for (i, j) in zip(*np.where(self.mines & ~self.flags.astype(bool))):
|
||||||
|
self._toggle_mine_flag(i, j)
|
||||||
|
|
||||||
|
def _setup_mines(self, i, j):
|
||||||
|
# randomly place mines on a grid, but not on space (i, j)
|
||||||
|
idx = np.concatenate([np.arange(i * self.height + j),
|
||||||
|
np.arange(i * self.height + j + 1,
|
||||||
|
self.width * self.height)])
|
||||||
|
np.random.shuffle(idx)
|
||||||
|
self.mines = np.zeros((self.width, self.height), dtype=bool)
|
||||||
|
self.mines.flat[idx[:self.nmines]] = 1
|
||||||
|
|
||||||
|
# count the number of mines bordering each square
|
||||||
|
self.counts = convolve2d(self.mines.astype(complex), np.ones((3, 3)),
|
||||||
|
mode='same').real.astype(int)
|
||||||
|
|
||||||
|
def _click_square(self, i, j):
|
||||||
|
# if this is the first click, then set up the mines
|
||||||
|
if self.mines is None:
|
||||||
|
self._setup_mines(i, j)
|
||||||
|
|
||||||
|
# if there is a flag or square is already clicked, do nothing
|
||||||
|
if self.flags[i, j] or self.clicked[i, j]:
|
||||||
|
return
|
||||||
|
self.clicked[i, j] = True
|
||||||
|
|
||||||
|
# hit a mine: game over
|
||||||
|
if self.mines[i, j]:
|
||||||
|
self.game_over = True
|
||||||
|
self._reveal_unmarked_mines()
|
||||||
|
self._draw_red_X(i, j)
|
||||||
|
self._cross_out_wrong_flags()
|
||||||
|
|
||||||
|
# square with no surrounding mines: clear out all adjacent squares
|
||||||
|
elif self.counts[i, j] == 0:
|
||||||
|
self.squares[i, j].set_facecolor(self.uncovered_color)
|
||||||
|
for ii in range(max(0, i - 1), min(self.width, i + 2)):
|
||||||
|
for jj in range(max(0, j - 1), min(self.height, j + 2)):
|
||||||
|
self._click_square(ii, jj)
|
||||||
|
|
||||||
|
# hit an empty square: reveal the number
|
||||||
|
else:
|
||||||
|
self.squares[i, j].set_facecolor(self.uncovered_color)
|
||||||
|
self.ax.text(i + 0.5, j + 0.5, str(self.counts[i, j]),
|
||||||
|
color=self.count_colors[self.counts[i, j]],
|
||||||
|
ha='center', va='center', fontsize=18,
|
||||||
|
fontweight='bold')
|
||||||
|
|
||||||
|
# if all remaining squares are mines, mark them and end game
|
||||||
|
if self.mines.sum() == (~self.clicked).sum():
|
||||||
|
self.game_over = True
|
||||||
|
self._mark_remaining_mines()
|
||||||
|
|
||||||
|
def _button_press(self, event):
|
||||||
|
if self.game_over or (event.xdata is None) or (event.ydata is None):
|
||||||
|
return
|
||||||
|
i, j = map(int, (event.xdata, event.ydata))
|
||||||
|
if (i < 0 or j < 0 or i >= self.width or j >= self.height):
|
||||||
|
return
|
||||||
|
|
||||||
|
# left mouse button: reveal square. If the square is already clicked
|
||||||
|
# and the correct # of mines are marked, then clear surrounding squares
|
||||||
|
if event.button == 1:
|
||||||
|
if (self.clicked[i, j]):
|
||||||
|
flag_count = self.flags[max(0, i - 1):i + 2,
|
||||||
|
max(0, j - 1):j + 2].astype(bool).sum()
|
||||||
|
if self.counts[i, j] == flag_count:
|
||||||
|
for ii, jj in product(range(max(0, i - 1),
|
||||||
|
min(self.width, i + 2)),
|
||||||
|
range(max(0, j - 1),
|
||||||
|
min(self.height, j + 2))):
|
||||||
|
self._click_square(ii, jj)
|
||||||
|
else:
|
||||||
|
self._click_square(i, j)
|
||||||
|
|
||||||
|
# right mouse button: mark/unmark flag
|
||||||
|
elif (event.button == 3) and (not self.clicked[i, j]):
|
||||||
|
self._toggle_mine_flag(i, j)
|
||||||
|
|
||||||
|
self.fig.canvas.draw()
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == '__main__':
|
||||||
|
ms = MineSweeper.intermediate()
|
||||||
|
plt.show()
|
@ -0,0 +1,183 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Pandas demo from Jake Reback\n",
|
||||||
|
"see https://github.com/jreback/PyDataNYC2015 or http://pandas.pydata.org/pandas-docs/version/0.17.1/style.html#Table-Styles"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%matplotlib inline\n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import seaborn as sns\n",
|
||||||
|
"pd.options.display.max_rows=12\n",
|
||||||
|
"pd.options.display.width=80\n",
|
||||||
|
"\n",
|
||||||
|
"# read a csv\n",
|
||||||
|
"df = pd.read_csv('~/seaborn-data/iris.csv',index_col=False)\n",
|
||||||
|
"# adjust column names\n",
|
||||||
|
"df.columns = df.columns.str.replace('\\s+','_').str.lower()\n",
|
||||||
|
"# in pandas 0.17.1 and more : sample and style ... \n",
|
||||||
|
"def color_negative_red(val):\n",
|
||||||
|
" \"\"\"\n",
|
||||||
|
" Takes a scalar and returns a string with\n",
|
||||||
|
" the css property `'color: red'` for negative\n",
|
||||||
|
" strings, black otherwise.\n",
|
||||||
|
" \"\"\"\n",
|
||||||
|
" color = 'red' if (isinstance(val, (int, float)) and val < 3) else 'black'\n",
|
||||||
|
" return 'color: %s' % color\n",
|
||||||
|
"\n",
|
||||||
|
"def highlight_max(s):\n",
|
||||||
|
" '''\n",
|
||||||
|
" highlight the maximum in a Series\n",
|
||||||
|
" '''\n",
|
||||||
|
" is_max = s == s.max()\n",
|
||||||
|
" return ['background-color: yellow' if v else '' for v in is_max]\n",
|
||||||
|
"\n",
|
||||||
|
"cm = sns.light_palette(\"green\", as_cmap=True)\n",
|
||||||
|
"\n",
|
||||||
|
"(df\n",
|
||||||
|
" .sample (n=7)\n",
|
||||||
|
" .style\n",
|
||||||
|
" .applymap(color_negative_red, subset=pd.IndexSlice[['sepal_width', 'petal_width']])\n",
|
||||||
|
" .bar(subset=['sepal_length', 'petal_length'], color='#7F7FFF')\n",
|
||||||
|
" .background_gradient(subset=['sepal_width', 'petal_width'], cmap=cm)\n",
|
||||||
|
" .apply(highlight_max)\n",
|
||||||
|
" .highlight_null(null_color='red')\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# assign = define new temporary columns (like 'mutate' in R language)\n",
|
||||||
|
"(df\n",
|
||||||
|
" .query('sepal_length > 5')\n",
|
||||||
|
" .assign(sepal_ratio = lambda x: x.sepal_width / x.sepal_length,\n",
|
||||||
|
" petal_ratio = lambda x: x.petal_width / x.petal_length)\n",
|
||||||
|
" .plot\n",
|
||||||
|
" .scatter(x='sepal_ratio', y='petal_ratio', figsize=(8,4))\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# pipe = like '%>%' in R language\n",
|
||||||
|
"(df\n",
|
||||||
|
" .query('sepal_length > 5')\n",
|
||||||
|
" .assign(sepal_ratio = lambda x: x.sepal_width / x.sepal_length,\n",
|
||||||
|
" petal_ratio = lambda x: x.petal_width / x.petal_length)\n",
|
||||||
|
" .pipe(sns.pairplot, hue='species', height=1.5)\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Pandas interactive\n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"\n",
|
||||||
|
"# create a df with random datas\n",
|
||||||
|
"np.random.seed(24)\n",
|
||||||
|
"df = pd.DataFrame({'A': np.linspace(1, 10, 10)})\n",
|
||||||
|
"df = pd.concat([df, pd.DataFrame(np.random.randn(10, 4), columns=list('BCDE'))],\n",
|
||||||
|
" axis=1)\n",
|
||||||
|
"df.iloc[0, 2] = np.nan\n",
|
||||||
|
"\n",
|
||||||
|
"# interactive\n",
|
||||||
|
"#from IPython.html import widgets\n",
|
||||||
|
"from ipywidgets import widgets\n",
|
||||||
|
"@widgets.interact\n",
|
||||||
|
"def f(h_neg=(0, 359, 1), h_pos=(0, 359), s=(0., 99.9), l=(0., 99.9)):\n",
|
||||||
|
" return (df\n",
|
||||||
|
" .style\n",
|
||||||
|
" .background_gradient(\n",
|
||||||
|
" cmap=sns.palettes.diverging_palette(\n",
|
||||||
|
" h_neg=h_neg, h_pos=h_pos, s=s, l=l, as_cmap=True)\n",
|
||||||
|
" ).highlight_null()\n",
|
||||||
|
" )"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from IPython.display import HTML\n",
|
||||||
|
"\n",
|
||||||
|
"def hover(hover_color=\"#ff0f99\"):\n",
|
||||||
|
" return dict(selector=\"tr:hover\",\n",
|
||||||
|
" props=[(\"background-color\", \"%s\" % hover_color)])\n",
|
||||||
|
"\n",
|
||||||
|
"styles = [\n",
|
||||||
|
" hover(),\n",
|
||||||
|
" dict(selector=\"th\", props=[(\"font-size\", \"150%\"),\n",
|
||||||
|
" (\"text-align\", \"center\")]),\n",
|
||||||
|
" dict(selector=\"caption\", props=[(\"caption-side\", \"bottom\")])\n",
|
||||||
|
"]\n",
|
||||||
|
"html = (df.style.set_table_styles(styles)\n",
|
||||||
|
" .set_caption(\"Hover to highlight.\"))\n",
|
||||||
|
"html"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Usefull links\n",
|
||||||
|
"\n",
|
||||||
|
"### Beginners Training Video: [\"Brandon Rhodes - Pandas From The Ground Up - PyCon 2015 \"](https://www.youtube.com/watch?v=5JnMutdy6Fw)\n",
|
||||||
|
"\n",
|
||||||
|
"### Pandas [API reference](http://pandas.pydata.org/pandas-docs/stable/api.html)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.2"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
@ -0,0 +1,768 @@
|
|||||||
|
6,148,72,35,0,33.6,0.627,50,1
|
||||||
|
1,85,66,29,0,26.6,0.351,31,0
|
||||||
|
8,183,64,0,0,23.3,0.672,32,1
|
||||||
|
1,89,66,23,94,28.1,0.167,21,0
|
||||||
|
0,137,40,35,168,43.1,2.288,33,1
|
||||||
|
5,116,74,0,0,25.6,0.201,30,0
|
||||||
|
3,78,50,32,88,31.0,0.248,26,1
|
||||||
|
10,115,0,0,0,35.3,0.134,29,0
|
||||||
|
2,197,70,45,543,30.5,0.158,53,1
|
||||||
|
8,125,96,0,0,0.0,0.232,54,1
|
||||||
|
4,110,92,0,0,37.6,0.191,30,0
|
||||||
|
10,168,74,0,0,38.0,0.537,34,1
|
||||||
|
10,139,80,0,0,27.1,1.441,57,0
|
||||||
|
1,189,60,23,846,30.1,0.398,59,1
|
||||||
|
5,166,72,19,175,25.8,0.587,51,1
|
||||||
|
7,100,0,0,0,30.0,0.484,32,1
|
||||||
|
0,118,84,47,230,45.8,0.551,31,1
|
||||||
|
7,107,74,0,0,29.6,0.254,31,1
|
||||||
|
1,103,30,38,83,43.3,0.183,33,0
|
||||||
|
1,115,70,30,96,34.6,0.529,32,1
|
||||||
|
3,126,88,41,235,39.3,0.704,27,0
|
||||||
|
8,99,84,0,0,35.4,0.388,50,0
|
||||||
|
7,196,90,0,0,39.8,0.451,41,1
|
||||||
|
9,119,80,35,0,29.0,0.263,29,1
|
||||||
|
11,143,94,33,146,36.6,0.254,51,1
|
||||||
|
10,125,70,26,115,31.1,0.205,41,1
|
||||||
|
7,147,76,0,0,39.4,0.257,43,1
|
||||||
|
1,97,66,15,140,23.2,0.487,22,0
|
||||||
|
13,145,82,19,110,22.2,0.245,57,0
|
||||||
|
5,117,92,0,0,34.1,0.337,38,0
|
||||||
|
5,109,75,26,0,36.0,0.546,60,0
|
||||||
|
3,158,76,36,245,31.6,0.851,28,1
|
||||||
|
3,88,58,11,54,24.8,0.267,22,0
|
||||||
|
6,92,92,0,0,19.9,0.188,28,0
|
||||||
|
10,122,78,31,0,27.6,0.512,45,0
|
||||||
|
4,103,60,33,192,24.0,0.966,33,0
|
||||||
|
11,138,76,0,0,33.2,0.420,35,0
|
||||||
|
9,102,76,37,0,32.9,0.665,46,1
|
||||||
|
2,90,68,42,0,38.2,0.503,27,1
|
||||||
|
4,111,72,47,207,37.1,1.390,56,1
|
||||||
|
3,180,64,25,70,34.0,0.271,26,0
|
||||||
|
7,133,84,0,0,40.2,0.696,37,0
|
||||||
|
7,106,92,18,0,22.7,0.235,48,0
|
||||||
|
9,171,110,24,240,45.4,0.721,54,1
|
||||||
|
7,159,64,0,0,27.4,0.294,40,0
|
||||||
|
0,180,66,39,0,42.0,1.893,25,1
|
||||||
|
1,146,56,0,0,29.7,0.564,29,0
|
||||||
|
2,71,70,27,0,28.0,0.586,22,0
|
||||||
|
7,103,66,32,0,39.1,0.344,31,1
|
||||||
|
7,105,0,0,0,0.0,0.305,24,0
|
||||||
|
1,103,80,11,82,19.4,0.491,22,0
|
||||||
|
1,101,50,15,36,24.2,0.526,26,0
|
||||||
|
5,88,66,21,23,24.4,0.342,30,0
|
||||||
|
8,176,90,34,300,33.7,0.467,58,1
|
||||||
|
7,150,66,42,342,34.7,0.718,42,0
|
||||||
|
1,73,50,10,0,23.0,0.248,21,0
|
||||||
|
7,187,68,39,304,37.7,0.254,41,1
|
||||||
|
0,100,88,60,110,46.8,0.962,31,0
|
||||||
|
0,146,82,0,0,40.5,1.781,44,0
|
||||||
|
0,105,64,41,142,41.5,0.173,22,0
|
||||||
|
2,84,0,0,0,0.0,0.304,21,0
|
||||||
|
8,133,72,0,0,32.9,0.270,39,1
|
||||||
|
5,44,62,0,0,25.0,0.587,36,0
|
||||||
|
2,141,58,34,128,25.4,0.699,24,0
|
||||||
|
7,114,66,0,0,32.8,0.258,42,1
|
||||||
|
5,99,74,27,0,29.0,0.203,32,0
|
||||||
|
0,109,88,30,0,32.5,0.855,38,1
|
||||||
|
2,109,92,0,0,42.7,0.845,54,0
|
||||||
|
1,95,66,13,38,19.6,0.334,25,0
|
||||||
|
4,146,85,27,100,28.9,0.189,27,0
|
||||||
|
2,100,66,20,90,32.9,0.867,28,1
|
||||||
|
5,139,64,35,140,28.6,0.411,26,0
|
||||||
|
13,126,90,0,0,43.4,0.583,42,1
|
||||||
|
4,129,86,20,270,35.1,0.231,23,0
|
||||||
|
1,79,75,30,0,32.0,0.396,22,0
|
||||||
|
1,0,48,20,0,24.7,0.140,22,0
|
||||||
|
7,62,78,0,0,32.6,0.391,41,0
|
||||||
|
5,95,72,33,0,37.7,0.370,27,0
|
||||||
|
0,131,0,0,0,43.2,0.270,26,1
|
||||||
|
2,112,66,22,0,25.0,0.307,24,0
|
||||||
|
3,113,44,13,0,22.4,0.140,22,0
|
||||||
|
2,74,0,0,0,0.0,0.102,22,0
|
||||||
|
7,83,78,26,71,29.3,0.767,36,0
|
||||||
|
0,101,65,28,0,24.6,0.237,22,0
|
||||||
|
5,137,108,0,0,48.8,0.227,37,1
|
||||||
|
2,110,74,29,125,32.4,0.698,27,0
|
||||||
|
13,106,72,54,0,36.6,0.178,45,0
|
||||||
|
2,100,68,25,71,38.5,0.324,26,0
|
||||||
|
15,136,70,32,110,37.1,0.153,43,1
|
||||||
|
1,107,68,19,0,26.5,0.165,24,0
|
||||||
|
1,80,55,0,0,19.1,0.258,21,0
|
||||||
|
4,123,80,15,176,32.0,0.443,34,0
|
||||||
|
7,81,78,40,48,46.7,0.261,42,0
|
||||||
|
4,134,72,0,0,23.8,0.277,60,1
|
||||||
|
2,142,82,18,64,24.7,0.761,21,0
|
||||||
|
6,144,72,27,228,33.9,0.255,40,0
|
||||||
|
2,92,62,28,0,31.6,0.130,24,0
|
||||||
|
1,71,48,18,76,20.4,0.323,22,0
|
||||||
|
6,93,50,30,64,28.7,0.356,23,0
|
||||||
|
1,122,90,51,220,49.7,0.325,31,1
|
||||||
|
1,163,72,0,0,39.0,1.222,33,1
|
||||||
|
1,151,60,0,0,26.1,0.179,22,0
|
||||||
|
0,125,96,0,0,22.5,0.262,21,0
|
||||||
|
1,81,72,18,40,26.6,0.283,24,0
|
||||||
|
2,85,65,0,0,39.6,0.930,27,0
|
||||||
|
1,126,56,29,152,28.7,0.801,21,0
|
||||||
|
1,96,122,0,0,22.4,0.207,27,0
|
||||||
|
4,144,58,28,140,29.5,0.287,37,0
|
||||||
|
3,83,58,31,18,34.3,0.336,25,0
|
||||||
|
0,95,85,25,36,37.4,0.247,24,1
|
||||||
|
3,171,72,33,135,33.3,0.199,24,1
|
||||||
|
8,155,62,26,495,34.0,0.543,46,1
|
||||||
|
1,89,76,34,37,31.2,0.192,23,0
|
||||||
|
4,76,62,0,0,34.0,0.391,25,0
|
||||||
|
7,160,54,32,175,30.5,0.588,39,1
|
||||||
|
4,146,92,0,0,31.2,0.539,61,1
|
||||||
|
5,124,74,0,0,34.0,0.220,38,1
|
||||||
|
5,78,48,0,0,33.7,0.654,25,0
|
||||||
|
4,97,60,23,0,28.2,0.443,22,0
|
||||||
|
4,99,76,15,51,23.2,0.223,21,0
|
||||||
|
0,162,76,56,100,53.2,0.759,25,1
|
||||||
|
6,111,64,39,0,34.2,0.260,24,0
|
||||||
|
2,107,74,30,100,33.6,0.404,23,0
|
||||||
|
5,132,80,0,0,26.8,0.186,69,0
|
||||||
|
0,113,76,0,0,33.3,0.278,23,1
|
||||||
|
1,88,30,42,99,55.0,0.496,26,1
|
||||||
|
3,120,70,30,135,42.9,0.452,30,0
|
||||||
|
1,118,58,36,94,33.3,0.261,23,0
|
||||||
|
1,117,88,24,145,34.5,0.403,40,1
|
||||||
|
0,105,84,0,0,27.9,0.741,62,1
|
||||||
|
4,173,70,14,168,29.7,0.361,33,1
|
||||||
|
9,122,56,0,0,33.3,1.114,33,1
|
||||||
|
3,170,64,37,225,34.5,0.356,30,1
|
||||||
|
8,84,74,31,0,38.3,0.457,39,0
|
||||||
|
2,96,68,13,49,21.1,0.647,26,0
|
||||||
|
2,125,60,20,140,33.8,0.088,31,0
|
||||||
|
0,100,70,26,50,30.8,0.597,21,0
|
||||||
|
0,93,60,25,92,28.7,0.532,22,0
|
||||||
|
0,129,80,0,0,31.2,0.703,29,0
|
||||||
|
5,105,72,29,325,36.9,0.159,28,0
|
||||||
|
3,128,78,0,0,21.1,0.268,55,0
|
||||||
|
5,106,82,30,0,39.5,0.286,38,0
|
||||||
|
2,108,52,26,63,32.5,0.318,22,0
|
||||||
|
10,108,66,0,0,32.4,0.272,42,1
|
||||||
|
4,154,62,31,284,32.8,0.237,23,0
|
||||||
|
0,102,75,23,0,0.0,0.572,21,0
|
||||||
|
9,57,80,37,0,32.8,0.096,41,0
|
||||||
|
2,106,64,35,119,30.5,1.400,34,0
|
||||||
|
5,147,78,0,0,33.7,0.218,65,0
|
||||||
|
2,90,70,17,0,27.3,0.085,22,0
|
||||||
|
1,136,74,50,204,37.4,0.399,24,0
|
||||||
|
4,114,65,0,0,21.9,0.432,37,0
|
||||||
|
9,156,86,28,155,34.3,1.189,42,1
|
||||||
|
1,153,82,42,485,40.6,0.687,23,0
|
||||||
|
8,188,78,0,0,47.9,0.137,43,1
|
||||||
|
7,152,88,44,0,50.0,0.337,36,1
|
||||||
|
2,99,52,15,94,24.6,0.637,21,0
|
||||||
|
1,109,56,21,135,25.2,0.833,23,0
|
||||||
|
2,88,74,19,53,29.0,0.229,22,0
|
||||||
|
17,163,72,41,114,40.9,0.817,47,1
|
||||||
|
4,151,90,38,0,29.7,0.294,36,0
|
||||||
|
7,102,74,40,105,37.2,0.204,45,0
|
||||||
|
0,114,80,34,285,44.2,0.167,27,0
|
||||||
|
2,100,64,23,0,29.7,0.368,21,0
|
||||||
|
0,131,88,0,0,31.6,0.743,32,1
|
||||||
|
6,104,74,18,156,29.9,0.722,41,1
|
||||||
|
3,148,66,25,0,32.5,0.256,22,0
|
||||||
|
4,120,68,0,0,29.6,0.709,34,0
|
||||||
|
4,110,66,0,0,31.9,0.471,29,0
|
||||||
|
3,111,90,12,78,28.4,0.495,29,0
|
||||||
|
6,102,82,0,0,30.8,0.180,36,1
|
||||||
|
6,134,70,23,130,35.4,0.542,29,1
|
||||||
|
2,87,0,23,0,28.9,0.773,25,0
|
||||||
|
1,79,60,42,48,43.5,0.678,23,0
|
||||||
|
2,75,64,24,55,29.7,0.370,33,0
|
||||||
|
8,179,72,42,130,32.7,0.719,36,1
|
||||||
|
6,85,78,0,0,31.2,0.382,42,0
|
||||||
|
0,129,110,46,130,67.1,0.319,26,1
|
||||||
|
5,143,78,0,0,45.0,0.190,47,0
|
||||||
|
5,130,82,0,0,39.1,0.956,37,1
|
||||||
|
6,87,80,0,0,23.2,0.084,32,0
|
||||||
|
0,119,64,18,92,34.9,0.725,23,0
|
||||||
|
1,0,74,20,23,27.7,0.299,21,0
|
||||||
|
5,73,60,0,0,26.8,0.268,27,0
|
||||||
|
4,141,74,0,0,27.6,0.244,40,0
|
||||||
|
7,194,68,28,0,35.9,0.745,41,1
|
||||||
|
8,181,68,36,495,30.1,0.615,60,1
|
||||||
|
1,128,98,41,58,32.0,1.321,33,1
|
||||||
|
8,109,76,39,114,27.9,0.640,31,1
|
||||||
|
5,139,80,35,160,31.6,0.361,25,1
|
||||||
|
3,111,62,0,0,22.6,0.142,21,0
|
||||||
|
9,123,70,44,94,33.1,0.374,40,0
|
||||||
|
7,159,66,0,0,30.4,0.383,36,1
|
||||||
|
11,135,0,0,0,52.3,0.578,40,1
|
||||||
|
8,85,55,20,0,24.4,0.136,42,0
|
||||||
|
5,158,84,41,210,39.4,0.395,29,1
|
||||||
|
1,105,58,0,0,24.3,0.187,21,0
|
||||||
|
3,107,62,13,48,22.9,0.678,23,1
|
||||||
|
4,109,64,44,99,34.8,0.905,26,1
|
||||||
|
4,148,60,27,318,30.9,0.150,29,1
|
||||||
|
0,113,80,16,0,31.0,0.874,21,0
|
||||||
|
1,138,82,0,0,40.1,0.236,28,0
|
||||||
|
0,108,68,20,0,27.3,0.787,32,0
|
||||||
|
2,99,70,16,44,20.4,0.235,27,0
|
||||||
|
6,103,72,32,190,37.7,0.324,55,0
|
||||||
|
5,111,72,28,0,23.9,0.407,27,0
|
||||||
|
8,196,76,29,280,37.5,0.605,57,1
|
||||||
|
5,162,104,0,0,37.7,0.151,52,1
|
||||||
|
1,96,64,27,87,33.2,0.289,21,0
|
||||||
|
7,184,84,33,0,35.5,0.355,41,1
|
||||||
|
2,81,60,22,0,27.7,0.290,25,0
|
||||||
|
0,147,85,54,0,42.8,0.375,24,0
|
||||||
|
7,179,95,31,0,34.2,0.164,60,0
|
||||||
|
0,140,65,26,130,42.6,0.431,24,1
|
||||||
|
9,112,82,32,175,34.2,0.260,36,1
|
||||||
|
12,151,70,40,271,41.8,0.742,38,1
|
||||||
|
5,109,62,41,129,35.8,0.514,25,1
|
||||||
|
6,125,68,30,120,30.0,0.464,32,0
|
||||||
|
5,85,74,22,0,29.0,1.224,32,1
|
||||||
|
5,112,66,0,0,37.8,0.261,41,1
|
||||||
|
0,177,60,29,478,34.6,1.072,21,1
|
||||||
|
2,158,90,0,0,31.6,0.805,66,1
|
||||||
|
7,119,0,0,0,25.2,0.209,37,0
|
||||||
|
7,142,60,33,190,28.8,0.687,61,0
|
||||||
|
1,100,66,15,56,23.6,0.666,26,0
|
||||||
|
1,87,78,27,32,34.6,0.101,22,0
|
||||||
|
0,101,76,0,0,35.7,0.198,26,0
|
||||||
|
3,162,52,38,0,37.2,0.652,24,1
|
||||||
|
4,197,70,39,744,36.7,2.329,31,0
|
||||||
|
0,117,80,31,53,45.2,0.089,24,0
|
||||||
|
4,142,86,0,0,44.0,0.645,22,1
|
||||||
|
6,134,80,37,370,46.2,0.238,46,1
|
||||||
|
1,79,80,25,37,25.4,0.583,22,0
|
||||||
|
4,122,68,0,0,35.0,0.394,29,0
|
||||||
|
3,74,68,28,45,29.7,0.293,23,0
|
||||||
|
4,171,72,0,0,43.6,0.479,26,1
|
||||||
|
7,181,84,21,192,35.9,0.586,51,1
|
||||||
|
0,179,90,27,0,44.1,0.686,23,1
|
||||||
|
9,164,84,21,0,30.8,0.831,32,1
|
||||||
|
0,104,76,0,0,18.4,0.582,27,0
|
||||||
|
1,91,64,24,0,29.2,0.192,21,0
|
||||||
|
4,91,70,32,88,33.1,0.446,22,0
|
||||||
|
3,139,54,0,0,25.6,0.402,22,1
|
||||||
|
6,119,50,22,176,27.1,1.318,33,1
|
||||||
|
2,146,76,35,194,38.2,0.329,29,0
|
||||||
|
9,184,85,15,0,30.0,1.213,49,1
|
||||||
|
10,122,68,0,0,31.2,0.258,41,0
|
||||||
|
0,165,90,33,680,52.3,0.427,23,0
|
||||||
|
9,124,70,33,402,35.4,0.282,34,0
|
||||||
|
1,111,86,19,0,30.1,0.143,23,0
|
||||||
|
9,106,52,0,0,31.2,0.380,42,0
|
||||||
|
2,129,84,0,0,28.0,0.284,27,0
|
||||||
|
2,90,80,14,55,24.4,0.249,24,0
|
||||||
|
0,86,68,32,0,35.8,0.238,25,0
|
||||||
|
12,92,62,7,258,27.6,0.926,44,1
|
||||||
|
1,113,64,35,0,33.6,0.543,21,1
|
||||||
|
3,111,56,39,0,30.1,0.557,30,0
|
||||||
|
2,114,68,22,0,28.7,0.092,25,0
|
||||||
|
1,193,50,16,375,25.9,0.655,24,0
|
||||||
|
11,155,76,28,150,33.3,1.353,51,1
|
||||||
|
3,191,68,15,130,30.9,0.299,34,0
|
||||||
|
3,141,0,0,0,30.0,0.761,27,1
|
||||||
|
4,95,70,32,0,32.1,0.612,24,0
|
||||||
|
3,142,80,15,0,32.4,0.200,63,0
|
||||||
|
4,123,62,0,0,32.0,0.226,35,1
|
||||||
|
5,96,74,18,67,33.6,0.997,43,0
|
||||||
|
0,138,0,0,0,36.3,0.933,25,1
|
||||||
|
2,128,64,42,0,40.0,1.101,24,0
|
||||||
|
0,102,52,0,0,25.1,0.078,21,0
|
||||||
|
2,146,0,0,0,27.5,0.240,28,1
|
||||||
|
10,101,86,37,0,45.6,1.136,38,1
|
||||||
|
2,108,62,32,56,25.2,0.128,21,0
|
||||||
|
3,122,78,0,0,23.0,0.254,40,0
|
||||||
|
1,71,78,50,45,33.2,0.422,21,0
|
||||||
|
13,106,70,0,0,34.2,0.251,52,0
|
||||||
|
2,100,70,52,57,40.5,0.677,25,0
|
||||||
|
7,106,60,24,0,26.5,0.296,29,1
|
||||||
|
0,104,64,23,116,27.8,0.454,23,0
|
||||||
|
5,114,74,0,0,24.9,0.744,57,0
|
||||||
|
2,108,62,10,278,25.3,0.881,22,0
|
||||||
|
0,146,70,0,0,37.9,0.334,28,1
|
||||||
|
10,129,76,28,122,35.9,0.280,39,0
|
||||||
|
7,133,88,15,155,32.4,0.262,37,0
|
||||||
|
7,161,86,0,0,30.4,0.165,47,1
|
||||||
|
2,108,80,0,0,27.0,0.259,52,1
|
||||||
|
7,136,74,26,135,26.0,0.647,51,0
|
||||||
|
5,155,84,44,545,38.7,0.619,34,0
|
||||||
|
1,119,86,39,220,45.6,0.808,29,1
|
||||||
|
4,96,56,17,49,20.8,0.340,26,0
|
||||||
|
5,108,72,43,75,36.1,0.263,33,0
|
||||||
|
0,78,88,29,40,36.9,0.434,21,0
|
||||||
|
0,107,62,30,74,36.6,0.757,25,1
|
||||||
|
2,128,78,37,182,43.3,1.224,31,1
|
||||||
|
1,128,48,45,194,40.5,0.613,24,1
|
||||||
|
0,161,50,0,0,21.9,0.254,65,0
|
||||||
|
6,151,62,31,120,35.5,0.692,28,0
|
||||||
|
2,146,70,38,360,28.0,0.337,29,1
|
||||||
|
0,126,84,29,215,30.7,0.520,24,0
|
||||||
|
14,100,78,25,184,36.6,0.412,46,1
|
||||||
|
8,112,72,0,0,23.6,0.840,58,0
|
||||||
|
0,167,0,0,0,32.3,0.839,30,1
|
||||||
|
2,144,58,33,135,31.6,0.422,25,1
|
||||||
|
5,77,82,41,42,35.8,0.156,35,0
|
||||||
|
5,115,98,0,0,52.9,0.209,28,1
|
||||||
|
3,150,76,0,0,21.0,0.207,37,0
|
||||||
|
2,120,76,37,105,39.7,0.215,29,0
|
||||||
|
10,161,68,23,132,25.5,0.326,47,1
|
||||||
|
0,137,68,14,148,24.8,0.143,21,0
|
||||||
|
0,128,68,19,180,30.5,1.391,25,1
|
||||||
|
2,124,68,28,205,32.9,0.875,30,1
|
||||||
|
6,80,66,30,0,26.2,0.313,41,0
|
||||||
|
0,106,70,37,148,39.4,0.605,22,0
|
||||||
|
2,155,74,17,96,26.6,0.433,27,1
|
||||||
|
3,113,50,10,85,29.5,0.626,25,0
|
||||||
|
7,109,80,31,0,35.9,1.127,43,1
|
||||||
|
2,112,68,22,94,34.1,0.315,26,0
|
||||||
|
3,99,80,11,64,19.3,0.284,30,0
|
||||||
|
3,182,74,0,0,30.5,0.345,29,1
|
||||||
|
3,115,66,39,140,38.1,0.150,28,0
|
||||||
|
6,194,78,0,0,23.5,0.129,59,1
|
||||||
|
4,129,60,12,231,27.5,0.527,31,0
|
||||||
|
3,112,74,30,0,31.6,0.197,25,1
|
||||||
|
0,124,70,20,0,27.4,0.254,36,1
|
||||||
|
13,152,90,33,29,26.8,0.731,43,1
|
||||||
|
2,112,75,32,0,35.7,0.148,21,0
|
||||||
|
1,157,72,21,168,25.6,0.123,24,0
|
||||||
|
1,122,64,32,156,35.1,0.692,30,1
|
||||||
|
10,179,70,0,0,35.1,0.200,37,0
|
||||||
|
2,102,86,36,120,45.5,0.127,23,1
|
||||||
|
6,105,70,32,68,30.8,0.122,37,0
|
||||||
|
8,118,72,19,0,23.1,1.476,46,0
|
||||||
|
2,87,58,16,52,32.7,0.166,25,0
|
||||||
|
1,180,0,0,0,43.3,0.282,41,1
|
||||||
|
12,106,80,0,0,23.6,0.137,44,0
|
||||||
|
1,95,60,18,58,23.9,0.260,22,0
|
||||||
|
0,165,76,43,255,47.9,0.259,26,0
|
||||||
|
0,117,0,0,0,33.8,0.932,44,0
|
||||||
|
5,115,76,0,0,31.2,0.343,44,1
|
||||||
|
9,152,78,34,171,34.2,0.893,33,1
|
||||||
|
7,178,84,0,0,39.9,0.331,41,1
|
||||||
|
1,130,70,13,105,25.9,0.472,22,0
|
||||||
|
1,95,74,21,73,25.9,0.673,36,0
|
||||||
|
1,0,68,35,0,32.0,0.389,22,0
|
||||||
|
5,122,86,0,0,34.7,0.290,33,0
|
||||||
|
8,95,72,0,0,36.8,0.485,57,0
|
||||||
|
8,126,88,36,108,38.5,0.349,49,0
|
||||||
|
1,139,46,19,83,28.7,0.654,22,0
|
||||||
|
3,116,0,0,0,23.5,0.187,23,0
|
||||||
|
3,99,62,19,74,21.8,0.279,26,0
|
||||||
|
5,0,80,32,0,41.0,0.346,37,1
|
||||||
|
4,92,80,0,0,42.2,0.237,29,0
|
||||||
|
4,137,84,0,0,31.2,0.252,30,0
|
||||||
|
3,61,82,28,0,34.4,0.243,46,0
|
||||||
|
1,90,62,12,43,27.2,0.580,24,0
|
||||||
|
3,90,78,0,0,42.7,0.559,21,0
|
||||||
|
9,165,88,0,0,30.4,0.302,49,1
|
||||||
|
1,125,50,40,167,33.3,0.962,28,1
|
||||||
|
13,129,0,30,0,39.9,0.569,44,1
|
||||||
|
12,88,74,40,54,35.3,0.378,48,0
|
||||||
|
1,196,76,36,249,36.5,0.875,29,1
|
||||||
|
5,189,64,33,325,31.2,0.583,29,1
|
||||||
|
5,158,70,0,0,29.8,0.207,63,0
|
||||||
|
5,103,108,37,0,39.2,0.305,65,0
|
||||||
|
4,146,78,0,0,38.5,0.520,67,1
|
||||||
|
4,147,74,25,293,34.9,0.385,30,0
|
||||||
|
5,99,54,28,83,34.0,0.499,30,0
|
||||||
|
6,124,72,0,0,27.6,0.368,29,1
|
||||||
|
0,101,64,17,0,21.0,0.252,21,0
|
||||||
|
3,81,86,16,66,27.5,0.306,22,0
|
||||||
|
1,133,102,28,140,32.8,0.234,45,1
|
||||||
|
3,173,82,48,465,38.4,2.137,25,1
|
||||||
|
0,118,64,23,89,0.0,1.731,21,0
|
||||||
|
0,84,64,22,66,35.8,0.545,21,0
|
||||||
|
2,105,58,40,94,34.9,0.225,25,0
|
||||||
|
2,122,52,43,158,36.2,0.816,28,0
|
||||||
|
12,140,82,43,325,39.2,0.528,58,1
|
||||||
|
0,98,82,15,84,25.2,0.299,22,0
|
||||||
|
1,87,60,37,75,37.2,0.509,22,0
|
||||||
|
4,156,75,0,0,48.3,0.238,32,1
|
||||||
|
0,93,100,39,72,43.4,1.021,35,0
|
||||||
|
1,107,72,30,82,30.8,0.821,24,0
|
||||||
|
0,105,68,22,0,20.0,0.236,22,0
|
||||||
|
1,109,60,8,182,25.4,0.947,21,0
|
||||||
|
1,90,62,18,59,25.1,1.268,25,0
|
||||||
|
1,125,70,24,110,24.3,0.221,25,0
|
||||||
|
1,119,54,13,50,22.3,0.205,24,0
|
||||||
|
5,116,74,29,0,32.3,0.660,35,1
|
||||||
|
8,105,100,36,0,43.3,0.239,45,1
|
||||||
|
5,144,82,26,285,32.0,0.452,58,1
|
||||||
|
3,100,68,23,81,31.6,0.949,28,0
|
||||||
|
1,100,66,29,196,32.0,0.444,42,0
|
||||||
|
5,166,76,0,0,45.7,0.340,27,1
|
||||||
|
1,131,64,14,415,23.7,0.389,21,0
|
||||||
|
4,116,72,12,87,22.1,0.463,37,0
|
||||||
|
4,158,78,0,0,32.9,0.803,31,1
|
||||||
|
2,127,58,24,275,27.7,1.600,25,0
|
||||||
|
3,96,56,34,115,24.7,0.944,39,0
|
||||||
|
0,131,66,40,0,34.3,0.196,22,1
|
||||||
|
3,82,70,0,0,21.1,0.389,25,0
|
||||||
|
3,193,70,31,0,34.9,0.241,25,1
|
||||||
|
4,95,64,0,0,32.0,0.161,31,1
|
||||||
|
6,137,61,0,0,24.2,0.151,55,0
|
||||||
|
5,136,84,41,88,35.0,0.286,35,1
|
||||||
|
9,72,78,25,0,31.6,0.280,38,0
|
||||||
|
5,168,64,0,0,32.9,0.135,41,1
|
||||||
|
2,123,48,32,165,42.1,0.520,26,0
|
||||||
|
4,115,72,0,0,28.9,0.376,46,1
|
||||||
|
0,101,62,0,0,21.9,0.336,25,0
|
||||||
|
8,197,74,0,0,25.9,1.191,39,1
|
||||||
|
1,172,68,49,579,42.4,0.702,28,1
|
||||||
|
6,102,90,39,0,35.7,0.674,28,0
|
||||||
|
1,112,72,30,176,34.4,0.528,25,0
|
||||||
|
1,143,84,23,310,42.4,1.076,22,0
|
||||||
|
1,143,74,22,61,26.2,0.256,21,0
|
||||||
|
0,138,60,35,167,34.6,0.534,21,1
|
||||||
|
3,173,84,33,474,35.7,0.258,22,1
|
||||||
|
1,97,68,21,0,27.2,1.095,22,0
|
||||||
|
4,144,82,32,0,38.5,0.554,37,1
|
||||||
|
1,83,68,0,0,18.2,0.624,27,0
|
||||||
|
3,129,64,29,115,26.4,0.219,28,1
|
||||||
|
1,119,88,41,170,45.3,0.507,26,0
|
||||||
|
2,94,68,18,76,26.0,0.561,21,0
|
||||||
|
0,102,64,46,78,40.6,0.496,21,0
|
||||||
|
2,115,64,22,0,30.8,0.421,21,0
|
||||||
|
8,151,78,32,210,42.9,0.516,36,1
|
||||||
|
4,184,78,39,277,37.0,0.264,31,1
|
||||||
|
0,94,0,0,0,0.0,0.256,25,0
|
||||||
|
1,181,64,30,180,34.1,0.328,38,1
|
||||||
|
0,135,94,46,145,40.6,0.284,26,0
|
||||||
|
1,95,82,25,180,35.0,0.233,43,1
|
||||||
|
2,99,0,0,0,22.2,0.108,23,0
|
||||||
|
3,89,74,16,85,30.4,0.551,38,0
|
||||||
|
1,80,74,11,60,30.0,0.527,22,0
|
||||||
|
2,139,75,0,0,25.6,0.167,29,0
|
||||||
|
1,90,68,8,0,24.5,1.138,36,0
|
||||||
|
0,141,0,0,0,42.4,0.205,29,1
|
||||||
|
12,140,85,33,0,37.4,0.244,41,0
|
||||||
|
5,147,75,0,0,29.9,0.434,28,0
|
||||||
|
1,97,70,15,0,18.2,0.147,21,0
|
||||||
|
6,107,88,0,0,36.8,0.727,31,0
|
||||||
|
0,189,104,25,0,34.3,0.435,41,1
|
||||||
|
2,83,66,23,50,32.2,0.497,22,0
|
||||||
|
4,117,64,27,120,33.2,0.230,24,0
|
||||||
|
8,108,70,0,0,30.5,0.955,33,1
|
||||||
|
4,117,62,12,0,29.7,0.380,30,1
|
||||||
|
0,180,78,63,14,59.4,2.420,25,1
|
||||||
|
1,100,72,12,70,25.3,0.658,28,0
|
||||||
|
0,95,80,45,92,36.5,0.330,26,0
|
||||||
|
0,104,64,37,64,33.6,0.510,22,1
|
||||||
|
0,120,74,18,63,30.5,0.285,26,0
|
||||||
|
1,82,64,13,95,21.2,0.415,23,0
|
||||||
|
2,134,70,0,0,28.9,0.542,23,1
|
||||||
|
0,91,68,32,210,39.9,0.381,25,0
|
||||||
|
2,119,0,0,0,19.6,0.832,72,0
|
||||||
|
2,100,54,28,105,37.8,0.498,24,0
|
||||||
|
14,175,62,30,0,33.6,0.212,38,1
|
||||||
|
1,135,54,0,0,26.7,0.687,62,0
|
||||||
|
5,86,68,28,71,30.2,0.364,24,0
|
||||||
|
10,148,84,48,237,37.6,1.001,51,1
|
||||||
|
9,134,74,33,60,25.9,0.460,81,0
|
||||||
|
9,120,72,22,56,20.8,0.733,48,0
|
||||||
|
1,71,62,0,0,21.8,0.416,26,0
|
||||||
|
8,74,70,40,49,35.3,0.705,39,0
|
||||||
|
5,88,78,30,0,27.6,0.258,37,0
|
||||||
|
10,115,98,0,0,24.0,1.022,34,0
|
||||||
|
0,124,56,13,105,21.8,0.452,21,0
|
||||||
|
0,74,52,10,36,27.8,0.269,22,0
|
||||||
|
0,97,64,36,100,36.8,0.600,25,0
|
||||||
|
8,120,0,0,0,30.0,0.183,38,1
|
||||||
|
6,154,78,41,140,46.1,0.571,27,0
|
||||||
|
1,144,82,40,0,41.3,0.607,28,0
|
||||||
|
0,137,70,38,0,33.2,0.170,22,0
|
||||||
|
0,119,66,27,0,38.8,0.259,22,0
|
||||||
|
7,136,90,0,0,29.9,0.210,50,0
|
||||||
|
4,114,64,0,0,28.9,0.126,24,0
|
||||||
|
0,137,84,27,0,27.3,0.231,59,0
|
||||||
|
2,105,80,45,191,33.7,0.711,29,1
|
||||||
|
7,114,76,17,110,23.8,0.466,31,0
|
||||||
|
8,126,74,38,75,25.9,0.162,39,0
|
||||||
|
4,132,86,31,0,28.0,0.419,63,0
|
||||||
|
3,158,70,30,328,35.5,0.344,35,1
|
||||||
|
0,123,88,37,0,35.2,0.197,29,0
|
||||||
|
4,85,58,22,49,27.8,0.306,28,0
|
||||||
|
0,84,82,31,125,38.2,0.233,23,0
|
||||||
|
0,145,0,0,0,44.2,0.630,31,1
|
||||||
|
0,135,68,42,250,42.3,0.365,24,1
|
||||||
|
1,139,62,41,480,40.7,0.536,21,0
|
||||||
|
0,173,78,32,265,46.5,1.159,58,0
|
||||||
|
4,99,72,17,0,25.6,0.294,28,0
|
||||||
|
8,194,80,0,0,26.1,0.551,67,0
|
||||||
|
2,83,65,28,66,36.8,0.629,24,0
|
||||||
|
2,89,90,30,0,33.5,0.292,42,0
|
||||||
|
4,99,68,38,0,32.8,0.145,33,0
|
||||||
|
4,125,70,18,122,28.9,1.144,45,1
|
||||||
|
3,80,0,0,0,0.0,0.174,22,0
|
||||||
|
6,166,74,0,0,26.6,0.304,66,0
|
||||||
|
5,110,68,0,0,26.0,0.292,30,0
|
||||||
|
2,81,72,15,76,30.1,0.547,25,0
|
||||||
|
7,195,70,33,145,25.1,0.163,55,1
|
||||||
|
6,154,74,32,193,29.3,0.839,39,0
|
||||||
|
2,117,90,19,71,25.2,0.313,21,0
|
||||||
|
3,84,72,32,0,37.2,0.267,28,0
|
||||||
|
6,0,68,41,0,39.0,0.727,41,1
|
||||||
|
7,94,64,25,79,33.3,0.738,41,0
|
||||||
|
3,96,78,39,0,37.3,0.238,40,0
|
||||||
|
10,75,82,0,0,33.3,0.263,38,0
|
||||||
|
0,180,90,26,90,36.5,0.314,35,1
|
||||||
|
1,130,60,23,170,28.6,0.692,21,0
|
||||||
|
2,84,50,23,76,30.4,0.968,21,0
|
||||||
|
8,120,78,0,0,25.0,0.409,64,0
|
||||||
|
12,84,72,31,0,29.7,0.297,46,1
|
||||||
|
0,139,62,17,210,22.1,0.207,21,0
|
||||||
|
9,91,68,0,0,24.2,0.200,58,0
|
||||||
|
2,91,62,0,0,27.3,0.525,22,0
|
||||||
|
3,99,54,19,86,25.6,0.154,24,0
|
||||||
|
3,163,70,18,105,31.6,0.268,28,1
|
||||||
|
9,145,88,34,165,30.3,0.771,53,1
|
||||||
|
7,125,86,0,0,37.6,0.304,51,0
|
||||||
|
13,76,60,0,0,32.8,0.180,41,0
|
||||||
|
6,129,90,7,326,19.6,0.582,60,0
|
||||||
|
2,68,70,32,66,25.0,0.187,25,0
|
||||||
|
3,124,80,33,130,33.2,0.305,26,0
|
||||||
|
6,114,0,0,0,0.0,0.189,26,0
|
||||||
|
9,130,70,0,0,34.2,0.652,45,1
|
||||||
|
3,125,58,0,0,31.6,0.151,24,0
|
||||||
|
3,87,60,18,0,21.8,0.444,21,0
|
||||||
|
1,97,64,19,82,18.2,0.299,21,0
|
||||||
|
3,116,74,15,105,26.3,0.107,24,0
|
||||||
|
0,117,66,31,188,30.8,0.493,22,0
|
||||||
|
0,111,65,0,0,24.6,0.660,31,0
|
||||||
|
2,122,60,18,106,29.8,0.717,22,0
|
||||||
|
0,107,76,0,0,45.3,0.686,24,0
|
||||||
|
1,86,66,52,65,41.3,0.917,29,0
|
||||||
|
6,91,0,0,0,29.8,0.501,31,0
|
||||||
|
1,77,56,30,56,33.3,1.251,24,0
|
||||||
|
4,132,0,0,0,32.9,0.302,23,1
|
||||||
|
0,105,90,0,0,29.6,0.197,46,0
|
||||||
|
0,57,60,0,0,21.7,0.735,67,0
|
||||||
|
0,127,80,37,210,36.3,0.804,23,0
|
||||||
|
3,129,92,49,155,36.4,0.968,32,1
|
||||||
|
8,100,74,40,215,39.4,0.661,43,1
|
||||||
|
3,128,72,25,190,32.4,0.549,27,1
|
||||||
|
10,90,85,32,0,34.9,0.825,56,1
|
||||||
|
4,84,90,23,56,39.5,0.159,25,0
|
||||||
|
1,88,78,29,76,32.0,0.365,29,0
|
||||||
|
8,186,90,35,225,34.5,0.423,37,1
|
||||||
|
5,187,76,27,207,43.6,1.034,53,1
|
||||||
|
4,131,68,21,166,33.1,0.160,28,0
|
||||||
|
1,164,82,43,67,32.8,0.341,50,0
|
||||||
|
4,189,110,31,0,28.5,0.680,37,0
|
||||||
|
1,116,70,28,0,27.4,0.204,21,0
|
||||||
|
3,84,68,30,106,31.9,0.591,25,0
|
||||||
|
6,114,88,0,0,27.8,0.247,66,0
|
||||||
|
1,88,62,24,44,29.9,0.422,23,0
|
||||||
|
1,84,64,23,115,36.9,0.471,28,0
|
||||||
|
7,124,70,33,215,25.5,0.161,37,0
|
||||||
|
1,97,70,40,0,38.1,0.218,30,0
|
||||||
|
8,110,76,0,0,27.8,0.237,58,0
|
||||||
|
11,103,68,40,0,46.2,0.126,42,0
|
||||||
|
11,85,74,0,0,30.1,0.300,35,0
|
||||||
|
6,125,76,0,0,33.8,0.121,54,1
|
||||||
|
0,198,66,32,274,41.3,0.502,28,1
|
||||||
|
1,87,68,34,77,37.6,0.401,24,0
|
||||||
|
6,99,60,19,54,26.9,0.497,32,0
|
||||||
|
0,91,80,0,0,32.4,0.601,27,0
|
||||||
|
2,95,54,14,88,26.1,0.748,22,0
|
||||||
|
1,99,72,30,18,38.6,0.412,21,0
|
||||||
|
6,92,62,32,126,32.0,0.085,46,0
|
||||||
|
4,154,72,29,126,31.3,0.338,37,0
|
||||||
|
0,121,66,30,165,34.3,0.203,33,1
|
||||||
|
3,78,70,0,0,32.5,0.270,39,0
|
||||||
|
2,130,96,0,0,22.6,0.268,21,0
|
||||||
|
3,111,58,31,44,29.5,0.430,22,0
|
||||||
|
2,98,60,17,120,34.7,0.198,22,0
|
||||||
|
1,143,86,30,330,30.1,0.892,23,0
|
||||||
|
1,119,44,47,63,35.5,0.280,25,0
|
||||||
|
6,108,44,20,130,24.0,0.813,35,0
|
||||||
|
2,118,80,0,0,42.9,0.693,21,1
|
||||||
|
10,133,68,0,0,27.0,0.245,36,0
|
||||||
|
2,197,70,99,0,34.7,0.575,62,1
|
||||||
|
0,151,90,46,0,42.1,0.371,21,1
|
||||||
|
6,109,60,27,0,25.0,0.206,27,0
|
||||||
|
12,121,78,17,0,26.5,0.259,62,0
|
||||||
|
8,100,76,0,0,38.7,0.190,42,0
|
||||||
|
8,124,76,24,600,28.7,0.687,52,1
|
||||||
|
1,93,56,11,0,22.5,0.417,22,0
|
||||||
|
8,143,66,0,0,34.9,0.129,41,1
|
||||||
|
6,103,66,0,0,24.3,0.249,29,0
|
||||||
|
3,176,86,27,156,33.3,1.154,52,1
|
||||||
|
0,73,0,0,0,21.1,0.342,25,0
|
||||||
|
11,111,84,40,0,46.8,0.925,45,1
|
||||||
|
2,112,78,50,140,39.4,0.175,24,0
|
||||||
|
3,132,80,0,0,34.4,0.402,44,1
|
||||||
|
2,82,52,22,115,28.5,1.699,25,0
|
||||||
|
6,123,72,45,230,33.6,0.733,34,0
|
||||||
|
0,188,82,14,185,32.0,0.682,22,1
|
||||||
|
0,67,76,0,0,45.3,0.194,46,0
|
||||||
|
1,89,24,19,25,27.8,0.559,21,0
|
||||||
|
1,173,74,0,0,36.8,0.088,38,1
|
||||||
|
1,109,38,18,120,23.1,0.407,26,0
|
||||||
|
1,108,88,19,0,27.1,0.400,24,0
|
||||||
|
6,96,0,0,0,23.7,0.190,28,0
|
||||||
|
1,124,74,36,0,27.8,0.100,30,0
|
||||||
|
7,150,78,29,126,35.2,0.692,54,1
|
||||||
|
4,183,0,0,0,28.4,0.212,36,1
|
||||||
|
1,124,60,32,0,35.8,0.514,21,0
|
||||||
|
1,181,78,42,293,40.0,1.258,22,1
|
||||||
|
1,92,62,25,41,19.5,0.482,25,0
|
||||||
|
0,152,82,39,272,41.5,0.270,27,0
|
||||||
|
1,111,62,13,182,24.0,0.138,23,0
|
||||||
|
3,106,54,21,158,30.9,0.292,24,0
|
||||||
|
3,174,58,22,194,32.9,0.593,36,1
|
||||||
|
7,168,88,42,321,38.2,0.787,40,1
|
||||||
|
6,105,80,28,0,32.5,0.878,26,0
|
||||||
|
11,138,74,26,144,36.1,0.557,50,1
|
||||||
|
3,106,72,0,0,25.8,0.207,27,0
|
||||||
|
6,117,96,0,0,28.7,0.157,30,0
|
||||||
|
2,68,62,13,15,20.1,0.257,23,0
|
||||||
|
9,112,82,24,0,28.2,1.282,50,1
|
||||||
|
0,119,0,0,0,32.4,0.141,24,1
|
||||||
|
2,112,86,42,160,38.4,0.246,28,0
|
||||||
|
2,92,76,20,0,24.2,1.698,28,0
|
||||||
|
6,183,94,0,0,40.8,1.461,45,0
|
||||||
|
0,94,70,27,115,43.5,0.347,21,0
|
||||||
|
2,108,64,0,0,30.8,0.158,21,0
|
||||||
|
4,90,88,47,54,37.7,0.362,29,0
|
||||||
|
0,125,68,0,0,24.7,0.206,21,0
|
||||||
|
0,132,78,0,0,32.4,0.393,21,0
|
||||||
|
5,128,80,0,0,34.6,0.144,45,0
|
||||||
|
4,94,65,22,0,24.7,0.148,21,0
|
||||||
|
7,114,64,0,0,27.4,0.732,34,1
|
||||||
|
0,102,78,40,90,34.5,0.238,24,0
|
||||||
|
2,111,60,0,0,26.2,0.343,23,0
|
||||||
|
1,128,82,17,183,27.5,0.115,22,0
|
||||||
|
10,92,62,0,0,25.9,0.167,31,0
|
||||||
|
13,104,72,0,0,31.2,0.465,38,1
|
||||||
|
5,104,74,0,0,28.8,0.153,48,0
|
||||||
|
2,94,76,18,66,31.6,0.649,23,0
|
||||||
|
7,97,76,32,91,40.9,0.871,32,1
|
||||||
|
1,100,74,12,46,19.5,0.149,28,0
|
||||||
|
0,102,86,17,105,29.3,0.695,27,0
|
||||||
|
4,128,70,0,0,34.3,0.303,24,0
|
||||||
|
6,147,80,0,0,29.5,0.178,50,1
|
||||||
|
4,90,0,0,0,28.0,0.610,31,0
|
||||||
|
3,103,72,30,152,27.6,0.730,27,0
|
||||||
|
2,157,74,35,440,39.4,0.134,30,0
|
||||||
|
1,167,74,17,144,23.4,0.447,33,1
|
||||||
|
0,179,50,36,159,37.8,0.455,22,1
|
||||||
|
11,136,84,35,130,28.3,0.260,42,1
|
||||||
|
0,107,60,25,0,26.4,0.133,23,0
|
||||||
|
1,91,54,25,100,25.2,0.234,23,0
|
||||||
|
1,117,60,23,106,33.8,0.466,27,0
|
||||||
|
5,123,74,40,77,34.1,0.269,28,0
|
||||||
|
2,120,54,0,0,26.8,0.455,27,0
|
||||||
|
1,106,70,28,135,34.2,0.142,22,0
|
||||||
|
2,155,52,27,540,38.7,0.240,25,1
|
||||||
|
2,101,58,35,90,21.8,0.155,22,0
|
||||||
|
1,120,80,48,200,38.9,1.162,41,0
|
||||||
|
11,127,106,0,0,39.0,0.190,51,0
|
||||||
|
3,80,82,31,70,34.2,1.292,27,1
|
||||||
|
10,162,84,0,0,27.7,0.182,54,0
|
||||||
|
1,199,76,43,0,42.9,1.394,22,1
|
||||||
|
8,167,106,46,231,37.6,0.165,43,1
|
||||||
|
9,145,80,46,130,37.9,0.637,40,1
|
||||||
|
6,115,60,39,0,33.7,0.245,40,1
|
||||||
|
1,112,80,45,132,34.8,0.217,24,0
|
||||||
|
4,145,82,18,0,32.5,0.235,70,1
|
||||||
|
10,111,70,27,0,27.5,0.141,40,1
|
||||||
|
6,98,58,33,190,34.0,0.430,43,0
|
||||||
|
9,154,78,30,100,30.9,0.164,45,0
|
||||||
|
6,165,68,26,168,33.6,0.631,49,0
|
||||||
|
1,99,58,10,0,25.4,0.551,21,0
|
||||||
|
10,68,106,23,49,35.5,0.285,47,0
|
||||||
|
3,123,100,35,240,57.3,0.880,22,0
|
||||||
|
8,91,82,0,0,35.6,0.587,68,0
|
||||||
|
6,195,70,0,0,30.9,0.328,31,1
|
||||||
|
9,156,86,0,0,24.8,0.230,53,1
|
||||||
|
0,93,60,0,0,35.3,0.263,25,0
|
||||||
|
3,121,52,0,0,36.0,0.127,25,1
|
||||||
|
2,101,58,17,265,24.2,0.614,23,0
|
||||||
|
2,56,56,28,45,24.2,0.332,22,0
|
||||||
|
0,162,76,36,0,49.6,0.364,26,1
|
||||||
|
0,95,64,39,105,44.6,0.366,22,0
|
||||||
|
4,125,80,0,0,32.3,0.536,27,1
|
||||||
|
5,136,82,0,0,0.0,0.640,69,0
|
||||||
|
2,129,74,26,205,33.2,0.591,25,0
|
||||||
|
3,130,64,0,0,23.1,0.314,22,0
|
||||||
|
1,107,50,19,0,28.3,0.181,29,0
|
||||||
|
1,140,74,26,180,24.1,0.828,23,0
|
||||||
|
1,144,82,46,180,46.1,0.335,46,1
|
||||||
|
8,107,80,0,0,24.6,0.856,34,0
|
||||||
|
13,158,114,0,0,42.3,0.257,44,1
|
||||||
|
2,121,70,32,95,39.1,0.886,23,0
|
||||||
|
7,129,68,49,125,38.5,0.439,43,1
|
||||||
|
2,90,60,0,0,23.5,0.191,25,0
|
||||||
|
7,142,90,24,480,30.4,0.128,43,1
|
||||||
|
3,169,74,19,125,29.9,0.268,31,1
|
||||||
|
0,99,0,0,0,25.0,0.253,22,0
|
||||||
|
4,127,88,11,155,34.5,0.598,28,0
|
||||||
|
4,118,70,0,0,44.5,0.904,26,0
|
||||||
|
2,122,76,27,200,35.9,0.483,26,0
|
||||||
|
6,125,78,31,0,27.6,0.565,49,1
|
||||||
|
1,168,88,29,0,35.0,0.905,52,1
|
||||||
|
2,129,0,0,0,38.5,0.304,41,0
|
||||||
|
4,110,76,20,100,28.4,0.118,27,0
|
||||||
|
6,80,80,36,0,39.8,0.177,28,0
|
||||||
|
10,115,0,0,0,0.0,0.261,30,1
|
||||||
|
2,127,46,21,335,34.4,0.176,22,0
|
||||||
|
9,164,78,0,0,32.8,0.148,45,1
|
||||||
|
2,93,64,32,160,38.0,0.674,23,1
|
||||||
|
3,158,64,13,387,31.2,0.295,24,0
|
||||||
|
5,126,78,27,22,29.6,0.439,40,0
|
||||||
|
10,129,62,36,0,41.2,0.441,38,1
|
||||||
|
0,134,58,20,291,26.4,0.352,21,0
|
||||||
|
3,102,74,0,0,29.5,0.121,32,0
|
||||||
|
7,187,50,33,392,33.9,0.826,34,1
|
||||||
|
3,173,78,39,185,33.8,0.970,31,1
|
||||||
|
10,94,72,18,0,23.1,0.595,56,0
|
||||||
|
1,108,60,46,178,35.5,0.415,24,0
|
||||||
|
5,97,76,27,0,35.6,0.378,52,1
|
||||||
|
4,83,86,19,0,29.3,0.317,34,0
|
||||||
|
1,114,66,36,200,38.1,0.289,21,0
|
||||||
|
1,149,68,29,127,29.3,0.349,42,1
|
||||||
|
5,117,86,30,105,39.1,0.251,42,0
|
||||||
|
1,111,94,0,0,32.8,0.265,45,0
|
||||||
|
4,112,78,40,0,39.4,0.236,38,0
|
||||||
|
1,116,78,29,180,36.1,0.496,25,0
|
||||||
|
0,141,84,26,0,32.4,0.433,22,0
|
||||||
|
2,175,88,0,0,22.9,0.326,22,0
|
||||||
|
2,92,52,0,0,30.1,0.141,22,0
|
||||||
|
3,130,78,23,79,28.4,0.323,34,1
|
||||||
|
8,120,86,0,0,28.4,0.259,22,1
|
||||||
|
2,174,88,37,120,44.5,0.646,24,1
|
||||||
|
2,106,56,27,165,29.0,0.426,22,0
|
||||||
|
2,105,75,0,0,23.3,0.560,53,0
|
||||||
|
4,95,60,32,0,35.4,0.284,28,0
|
||||||
|
0,126,86,27,120,27.4,0.515,21,0
|
||||||
|
8,65,72,23,0,32.0,0.600,42,0
|
||||||
|
2,99,60,17,160,36.6,0.453,21,0
|
||||||
|
1,102,74,0,0,39.5,0.293,42,1
|
||||||
|
11,120,80,37,150,42.3,0.785,48,1
|
||||||
|
3,102,44,20,94,30.8,0.400,26,0
|
||||||
|
1,109,58,18,116,28.5,0.219,22,0
|
||||||
|
9,140,94,0,0,32.7,0.734,45,1
|
||||||
|
13,153,88,37,140,40.6,1.174,39,0
|
||||||
|
12,100,84,33,105,30.0,0.488,46,0
|
||||||
|
1,147,94,41,0,49.3,0.358,27,1
|
||||||
|
1,81,74,41,57,46.3,1.096,32,0
|
||||||
|
3,187,70,22,200,36.4,0.408,36,1
|
||||||
|
6,162,62,0,0,24.3,0.178,50,1
|
||||||
|
4,136,70,0,0,31.2,1.182,22,1
|
||||||
|
1,121,78,39,74,39.0,0.261,28,0
|
||||||
|
3,108,62,24,0,26.0,0.223,25,0
|
||||||
|
0,181,88,44,510,43.3,0.222,26,1
|
||||||
|
8,154,78,32,0,32.4,0.443,45,1
|
||||||
|
1,128,88,39,110,36.5,1.057,37,1
|
||||||
|
7,137,90,41,0,32.0,0.391,39,0
|
||||||
|
0,123,72,0,0,36.3,0.258,52,1
|
||||||
|
1,106,76,0,0,37.5,0.197,26,0
|
||||||
|
6,190,92,0,0,35.5,0.278,66,1
|
||||||
|
2,88,58,26,16,28.4,0.766,22,0
|
||||||
|
9,170,74,31,0,44.0,0.403,43,1
|
||||||
|
9,89,62,0,0,22.5,0.142,33,0
|
||||||
|
10,101,76,48,180,32.9,0.171,63,0
|
||||||
|
2,122,70,27,0,36.8,0.340,27,0
|
||||||
|
5,121,72,23,112,26.2,0.245,30,0
|
||||||
|
1,126,60,0,0,30.1,0.349,47,1
|
||||||
|
1,93,70,31,0,30.4,0.315,23,0
|
@ -0,0 +1,239 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Seaborn demo per Jake VanderPlas below"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from __future__ import print_function, division\n",
|
||||||
|
"\n",
|
||||||
|
"%matplotlib inline\n",
|
||||||
|
"import matplotlib.pyplot as plt\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import pandas as pd"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"plt.style.use('ggplot')\n",
|
||||||
|
"x = np.linspace(0, 10, 1000)\n",
|
||||||
|
"plt.plot(x, np.sin(x), x, np.cos(x));"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import seaborn as sns\n",
|
||||||
|
"sns.set()\n",
|
||||||
|
"plt.plot(x, np.sin(x), x, np.cos(x));"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"data = np.random.multivariate_normal([0, 0], [[5, 2], [2, 2]], size=2000)\n",
|
||||||
|
"data = pd.DataFrame(data, columns=['x', 'y'])\n",
|
||||||
|
"\n",
|
||||||
|
"for col in 'xy':\n",
|
||||||
|
" plt.hist(data[col], density=True, alpha=0.5)\n",
|
||||||
|
" # old Matplotlib would be plt.hist(data[col], normed=True, alpha=0.5)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"for col in 'xy':\n",
|
||||||
|
" sns.kdeplot(data[col], shade=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"sns.distplot(data['x']);"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"sns.kdeplot(data.x, data.y); # formerly sns.kdeplot(data)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with sns.axes_style('white'):\n",
|
||||||
|
" sns.jointplot(\"x\", \"y\", data, kind='kde');"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with sns.axes_style('white'):\n",
|
||||||
|
" sns.jointplot(\"x\", \"y\", data, kind='hex')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"iris = sns.load_dataset(\"iris\")\n",
|
||||||
|
"iris.head()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"tips = sns.load_dataset('tips')\n",
|
||||||
|
"tips.head()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"tips['tip_pct'] = 100 * tips['tip'] / tips['total_bill']\n",
|
||||||
|
"\n",
|
||||||
|
"grid = sns.FacetGrid(tips, row=\"sex\", col=\"time\", margin_titles=True)\n",
|
||||||
|
"grid.map(plt.hist, \"tip_pct\", bins=np.linspace(0, 40, 15));"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with sns.axes_style(style='ticks'):\n",
|
||||||
|
" g = sns.catplot(\"day\", \"total_bill\", \"sex\", data=tips, kind=\"box\")\n",
|
||||||
|
" g.set_axis_labels(\"Day\", \"Total Bill\");"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with sns.axes_style('white'):\n",
|
||||||
|
" sns.jointplot(\"total_bill\", \"tip\", data=tips, kind='hex')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"sns.jointplot(\"total_bill\", \"tip\", data=tips, kind='reg');"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"planets = sns.load_dataset('planets')\n",
|
||||||
|
"planets.head()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with sns.axes_style('white'):\n",
|
||||||
|
" g = sns.catplot(\"year\", data=planets, aspect=1.5)\n",
|
||||||
|
" g.set_xticklabels(step=5)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with sns.axes_style('white'):\n",
|
||||||
|
" g = sns.catplot(\"year\", data=planets, aspect=4.0,\n",
|
||||||
|
" hue='method', order=range(2001, 2015), kind=\"count\")\n",
|
||||||
|
" g.set_ylabels('Number of Planets Discovered')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Scikit-learn tutorial from pycon 2015 Jake VanderPlas [here](http://nbviewer.ipython.org/github/jakevdp/sklearn_pycon2015/blob/master/notebooks/Index.ipynb)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.7"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
Binary file not shown.
Binary file not shown.
Binary file not shown.
@ -0,0 +1,217 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Testing System Data access\n",
|
||||||
|
"\n",
|
||||||
|
"Depending of Drivers some test will works, and other may not\n",
|
||||||
|
"\n",
|
||||||
|
"Drivers may be at: https://www.microsoft.com/fr-fr/download/details.aspx?id=13255"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# pyodbc \n",
|
||||||
|
"import pyodbc\n",
|
||||||
|
"\n",
|
||||||
|
"# look for pyodbc providers\n",
|
||||||
|
"sources = pyodbc.dataSources()\n",
|
||||||
|
"dsns = list(sources.keys())\n",
|
||||||
|
"sl = [' %s [%s]' % (dsn, sources[dsn]) for dsn in dsns]\n",
|
||||||
|
"print(\"pyodbc Providers: (beware 32/64 bit driver and python version must match)\\n\", '\\n'.join(sl))\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# odbc to EXCEL .xls via pyodbc (beware 32/64 bit driver and pytho version must match)\n",
|
||||||
|
"import pyodbc, os\n",
|
||||||
|
"filename = os.path.join(os.getcwd(), 'test.xls')\n",
|
||||||
|
"todo = \"select * from [Sheet1$]\"\n",
|
||||||
|
"print(\"\\nusing pyodbc to read an Excel .xls file:\\n\\t\", filename)\n",
|
||||||
|
"if os.path.exists(filename):\n",
|
||||||
|
" CNXNSTRING = 'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=%s;READONLY=FALSE' % filename\n",
|
||||||
|
" cnxn = pyodbc.connect(CNXNSTRING, autocommit=True)\n",
|
||||||
|
" cursor = cnxn.cursor()\n",
|
||||||
|
" rows = cursor.execute(todo).fetchall()\n",
|
||||||
|
" print([column[0] for column in cursor.description])\n",
|
||||||
|
" print(rows)\n",
|
||||||
|
" cursor.close()\n",
|
||||||
|
" cnxn.close()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# odbc to ACCESS .mdb via pyodbc (beware 32/64 bit driver and python version must match)\n",
|
||||||
|
"import pyodbc, os\n",
|
||||||
|
"filename = os.path.join(os.getcwd(), 'test.mdb')\n",
|
||||||
|
"print(\"\\nusing pyodbc to read an ACCESS .mdb file:\\n\\t\", filename)\n",
|
||||||
|
"if os.path.exists(filename):\n",
|
||||||
|
" CNXNSTRING = 'Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s;READONLY=FALSE' % filename\n",
|
||||||
|
" cnxn = pyodbc.connect(CNXNSTRING, autocommit=False)\n",
|
||||||
|
" cursor = cnxn.cursor()\n",
|
||||||
|
" rows = cursor.execute(\"select * from users\").fetchall()\n",
|
||||||
|
" print([column[0] for column in cursor.description])\n",
|
||||||
|
" print(rows)\n",
|
||||||
|
" cursor.close()\n",
|
||||||
|
" cnxn.close()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# pythonnet\n",
|
||||||
|
"import clr\n",
|
||||||
|
"clr.AddReference(\"System.Data\")\n",
|
||||||
|
"import System.Data.OleDb as ADONET\n",
|
||||||
|
"import System.Data.Odbc as ODBCNET\n",
|
||||||
|
"import System.Data.Common as DATACOM\n",
|
||||||
|
"\n",
|
||||||
|
"table = DATACOM.DbProviderFactories.GetFactoryClasses()\n",
|
||||||
|
"print(\"\\n .NET Providers: (beware 32/64 bit driver and pytho version must match)\")\n",
|
||||||
|
"for row in table.Rows:\n",
|
||||||
|
" print(\" %s\" % row[table.Columns[0]])\n",
|
||||||
|
" print(\" \",[row[column] for column in table.Columns if column != table.Columns[0]])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# odbc to EXCEL .xls via pythonnet\n",
|
||||||
|
"import clr, os\n",
|
||||||
|
"clr.AddReference(\"System.Data\")\n",
|
||||||
|
"import System.Data.OleDb as ADONET\n",
|
||||||
|
"import System.Data.Odbc as ODBCNET\n",
|
||||||
|
"import System.Data.Common as DATACOM\n",
|
||||||
|
"\n",
|
||||||
|
"filename = os.path.join(os.getcwd(), 'test.xls')\n",
|
||||||
|
"todo = \"select * from [Sheet1$]\"\n",
|
||||||
|
"print(\"\\nusing pythonnet to read an excel .xls file:\\n\\t\", filename , \"\\n\\t\", todo)\n",
|
||||||
|
"if os.path.exists(filename):\n",
|
||||||
|
" CNXNSTRING = 'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=%s;READONLY=FALSE' % filename\n",
|
||||||
|
" cnxn = ODBCNET.OdbcConnection(CNXNSTRING)\n",
|
||||||
|
" cnxn.Open()\n",
|
||||||
|
" command = cnxn.CreateCommand()\n",
|
||||||
|
" command.CommandText = \"select * from [Sheet1$]\"\n",
|
||||||
|
" rows = command.ExecuteReader()\n",
|
||||||
|
" print ([rows.GetName(i) for i in range(rows.FieldCount)])\n",
|
||||||
|
" for row in rows:\n",
|
||||||
|
" print([row[i] for i in range(rows.FieldCount)])\n",
|
||||||
|
" command.Dispose()\n",
|
||||||
|
" cnxn.Close()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# odbc to ACCESS .mdb via pythonnet\n",
|
||||||
|
"import clr, os\n",
|
||||||
|
"clr.AddReference(\"System.Data\")\n",
|
||||||
|
"import System.Data.OleDb as ADONET\n",
|
||||||
|
"import System.Data.Odbc as ODBCNET\n",
|
||||||
|
"import System.Data.Common as DATACOM\n",
|
||||||
|
"\n",
|
||||||
|
"filename = os.path.join(os.getcwd(), 'test.mdb')\n",
|
||||||
|
"todo = \"select * from users\"\n",
|
||||||
|
"print(\"\\nusing odbc via pythonnet to read an ACCESS .mdb file:\\n\\t\", filename , \"\\n\\t\", todo)\n",
|
||||||
|
"\n",
|
||||||
|
"if os.path.exists(filename):\n",
|
||||||
|
" CNXNSTRING = 'Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s;READONLY=FALSE' % filename\n",
|
||||||
|
" cnxn = ODBCNET.OdbcConnection(CNXNSTRING)\n",
|
||||||
|
" cnxn.Open()\n",
|
||||||
|
" command = cnxn.CreateCommand()\n",
|
||||||
|
" command.CommandText = \"select * from users\"\n",
|
||||||
|
" rows = command.ExecuteReader()\n",
|
||||||
|
" print ([rows.GetName(i) for i in range(rows.FieldCount)])\n",
|
||||||
|
" for row in rows:\n",
|
||||||
|
" print([row[i] for i in range(rows.FieldCount)])\n",
|
||||||
|
" command.Dispose()\n",
|
||||||
|
" cnxn.Close()\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# DAO via pythonnet: works ONLY if you have the 32 (or 64 bit) driver.\n",
|
||||||
|
"import clr, os\n",
|
||||||
|
"clr.AddReference(\"System.Data\")\n",
|
||||||
|
"import System.Data.OleDb as ADONET\n",
|
||||||
|
"import System.Data.Odbc as ODBCNET\n",
|
||||||
|
"import System.Data.Common as DATACOM\n",
|
||||||
|
"\n",
|
||||||
|
"filename = os.path.join(os.getcwd(), 'test.accdb')\n",
|
||||||
|
"todo = \"select * from users\"\n",
|
||||||
|
"print(\"\\nusing DAO via pythonnet to read an ACCESS .mdb file:\\n\\t\", filename , \"\\n\\t\", todo)\n",
|
||||||
|
"if os.path.exists(filename):\n",
|
||||||
|
" # needs a driver in 32 or 64 bit like your running python\n",
|
||||||
|
" # https://www.microsoft.com/download/details.aspx?id=13255\n",
|
||||||
|
" CNXNSTRING = 'Provider=Microsoft.ACE.OLEDB.12.0; Data Source=%s;READONLY=FALSE' % filename\n",
|
||||||
|
" cnxn = ADONET.OleDbConnection(CNXNSTRING)\n",
|
||||||
|
" cnxn.Open()\n",
|
||||||
|
" command = cnxn.CreateCommand()\n",
|
||||||
|
" command.CommandText = todo\n",
|
||||||
|
" # command.CommandText = 'select id, name from people where group_id = @group_id'\n",
|
||||||
|
" # command.Parameters.Add(SqlParameter('group_id', 23))\n",
|
||||||
|
" rows = command.ExecuteReader()\n",
|
||||||
|
" print ([rows.GetName(i) for i in range(rows.FieldCount)])\n",
|
||||||
|
" for row in rows:\n",
|
||||||
|
" print([row[i] for i in range(rows.FieldCount)])\n",
|
||||||
|
" command.Dispose()\n",
|
||||||
|
" cnxn.Close()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.2"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 1
|
||||||
|
}
|
@ -0,0 +1,140 @@
# pyodbc
import pyodbc

# look for pyodbc providers
sources = pyodbc.dataSources()
dsns = list(sources.keys())
sl = [' %s [%s]' % (dsn, sources[dsn]) for dsn in dsns]
print("pyodbc Providers: (beware 32/64 bit driver and python version must match)\n", '\n'.join(sl))


# odbc to EXCEL .xls via pyodbc (beware 32/64 bit driver and python version must match)
import pyodbc, os
filename = os.path.join(os.getcwd(), 'test.xls')
todo = "select * from [Sheet1$]"
print("\nusing pyodbc to read an Excel .xls file:\n\t", filename)
if os.path.exists(filename):
    CNXNSTRING = 'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=%s;READONLY=FALSE' % filename
    try:
        cnxn = pyodbc.connect(CNXNSTRING, autocommit=True)
        cursor = cnxn.cursor()
        rows = cursor.execute(todo).fetchall()
        print([column[0] for column in cursor.description])
        print(rows)
        cursor.close()
        cnxn.close()
    except:
        print("\n *** failed ***\n")

# odbc to ACCESS .mdb via pyodbc (beware 32/64 bit driver and python version must match)
import pyodbc, os
filename = os.path.join(os.getcwd(), 'test.mdb')
print("\nusing pyodbc to read an ACCESS .mdb file:\n\t", filename)
if os.path.exists(filename):
    CNXNSTRING = 'Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s;READONLY=FALSE' % filename
    try:
        cnxn = pyodbc.connect(CNXNSTRING, autocommit=False)
        cursor = cnxn.cursor()
        rows = cursor.execute("select * from users").fetchall()
        print([column[0] for column in cursor.description])
        print(rows)
        cursor.close()
        cnxn.close()
    except:
        print("\n *** failed ***\n")


# pythonnet
import clr
clr.AddReference("System.Data")
import System.Data.OleDb as ADONET
import System.Data.Odbc as ODBCNET
import System.Data.Common as DATACOM

table = DATACOM.DbProviderFactories.GetFactoryClasses()
print("\n .NET Providers: (beware 32/64 bit driver and python version must match)")
for row in table.Rows:
    print(" %s" % row[table.Columns[0]])
    print(" ", [row[column] for column in table.Columns if column != table.Columns[0]])


# odbc to EXCEL .xls via pythonnet
import clr, os
clr.AddReference("System.Data")
import System.Data.OleDb as ADONET
import System.Data.Odbc as ODBCNET
import System.Data.Common as DATACOM

filename = os.path.join(os.getcwd(), 'test.xls')
todo = "select * from [Sheet1$]"
print("\nusing pythonnet to read an Excel .xls file:\n\t", filename, "\n\t", todo)
if os.path.exists(filename):
    CNXNSTRING = 'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=%s;READONLY=FALSE' % filename
    cnxn = ODBCNET.OdbcConnection(CNXNSTRING)
    try:
        cnxn.Open()
        command = cnxn.CreateCommand()
        command.CommandText = "select * from [Sheet1$]"
        rows = command.ExecuteReader()
        print([rows.GetName(i) for i in range(rows.FieldCount)])
        for row in rows:
            print([row[i] for i in range(rows.FieldCount)])
        command.Dispose()
        cnxn.Close()
    except:
        print("\n *** failed ***\n")


# odbc to ACCESS .mdb via pythonnet
import clr, os
clr.AddReference("System.Data")
import System.Data.OleDb as ADONET
import System.Data.Odbc as ODBCNET
import System.Data.Common as DATACOM

filename = os.path.join(os.getcwd(), 'test.mdb')
todo = "select * from users"
print("\nusing odbc via pythonnet to read an ACCESS .mdb file:\n\t", filename, "\n\t", todo)

if os.path.exists(filename):
    CNXNSTRING = 'Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s;READONLY=FALSE' % filename
    cnxn = ODBCNET.OdbcConnection(CNXNSTRING)
    try:
        cnxn.Open()
        command = cnxn.CreateCommand()
        command.CommandText = "select * from users"
        rows = command.ExecuteReader()
        print([rows.GetName(i) for i in range(rows.FieldCount)])
        for row in rows:
            print([row[i] for i in range(rows.FieldCount)])
        command.Dispose()
        cnxn.Close()
    except:
        print("\n *** failed ***\n")

# DAO via pythonnet: works ONLY if you have the matching 32-bit (or 64-bit) driver.
import clr, os
clr.AddReference("System.Data")
import System.Data.OleDb as ADONET
import System.Data.Odbc as ODBCNET
import System.Data.Common as DATACOM

filename = os.path.join(os.getcwd(), 'test.accdb')
todo = "select * from users"
print("\nusing DAO via pythonnet to read an ACCESS .accdb file:\n\t", filename, "\n\t", todo)
if os.path.exists(filename):
    # needs a driver matching your running Python (32- or 64-bit)
    # https://www.microsoft.com/download/details.aspx?id=13255
    CNXNSTRING = 'Provider=Microsoft.ACE.OLEDB.12.0; Data Source=%s;READONLY=FALSE' % filename
    cnxn = ADONET.OleDbConnection(CNXNSTRING)
    try:
        cnxn.Open()
        command = cnxn.CreateCommand()
        command.CommandText = todo
        # command.CommandText = 'select id, name from people where group_id = @group_id'
        # command.Parameters.Add(SqlParameter('group_id', 23))
        rows = command.ExecuteReader()
        print([rows.GetName(i) for i in range(rows.FieldCount)])
        for row in rows:
            print([row[i] for i in range(rows.FieldCount)])
        command.Dispose()
        cnxn.Close()
    except:
        print("\n *** failed ***\n")
File diff suppressed because it is too large
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
After Width: | Height: | Size: 74 KiB |
After Width: | Height: | Size: 77 KiB |
After Width: | Height: | Size: 81 KiB |
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@ -0,0 +1,603 @@
|
|||||||
|
A. HISTORY OF THE SOFTWARE
|
||||||
|
==========================
|
||||||
|
|
||||||
|
Python was created in the early 1990s by Guido van Rossum at Stichting
|
||||||
|
Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands
|
||||||
|
as a successor of a language called ABC. Guido remains Python's
|
||||||
|
principal author, although it includes many contributions from others.
|
||||||
|
|
||||||
|
In 1995, Guido continued his work on Python at the Corporation for
|
||||||
|
National Research Initiatives (CNRI, see http://www.cnri.reston.va.us)
|
||||||
|
in Reston, Virginia where he released several versions of the
|
||||||
|
software.
|
||||||
|
|
||||||
|
In May 2000, Guido and the Python core development team moved to
|
||||||
|
BeOpen.com to form the BeOpen PythonLabs team. In October of the same
|
||||||
|
year, the PythonLabs team moved to Digital Creations, which became
|
||||||
|
Zope Corporation. In 2001, the Python Software Foundation (PSF, see
|
||||||
|
https://www.python.org/psf/) was formed, a non-profit organization
|
||||||
|
created specifically to own Python-related Intellectual Property.
|
||||||
|
Zope Corporation was a sponsoring member of the PSF.
|
||||||
|
|
||||||
|
All Python releases are Open Source (see http://www.opensource.org for
|
||||||
|
the Open Source Definition). Historically, most, but not all, Python
|
||||||
|
releases have also been GPL-compatible; the table below summarizes
|
||||||
|
the various releases.
|
||||||
|
|
||||||
|
Release Derived Year Owner GPL-
|
||||||
|
from compatible? (1)
|
||||||
|
|
||||||
|
0.9.0 thru 1.2 1991-1995 CWI yes
|
||||||
|
1.3 thru 1.5.2 1.2 1995-1999 CNRI yes
|
||||||
|
1.6 1.5.2 2000 CNRI no
|
||||||
|
2.0 1.6 2000 BeOpen.com no
|
||||||
|
1.6.1 1.6 2001 CNRI yes (2)
|
||||||
|
2.1 2.0+1.6.1 2001 PSF no
|
||||||
|
2.0.1 2.0+1.6.1 2001 PSF yes
|
||||||
|
2.1.1 2.1+2.0.1 2001 PSF yes
|
||||||
|
2.1.2 2.1.1 2002 PSF yes
|
||||||
|
2.1.3 2.1.2 2002 PSF yes
|
||||||
|
2.2 and above 2.1.1 2001-now PSF yes
|
||||||
|
|
||||||
|
Footnotes:
|
||||||
|
|
||||||
|
(1) GPL-compatible doesn't mean that we're distributing Python under
|
||||||
|
the GPL. All Python licenses, unlike the GPL, let you distribute
|
||||||
|
a modified version without making your changes open source. The
|
||||||
|
GPL-compatible licenses make it possible to combine Python with
|
||||||
|
other software that is released under the GPL; the others don't.
|
||||||
|
|
||||||
|
(2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
|
||||||
|
because its license has a choice of law clause. According to
|
||||||
|
CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
|
||||||
|
is "not incompatible" with the GPL.
|
||||||
|
|
||||||
|
Thanks to the many outside volunteers who have worked under Guido's
|
||||||
|
direction to make these releases possible.
|
||||||
|
|
||||||
|
|
||||||
|
B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
|
||||||
|
===============================================================
|
||||||
|
|
||||||
|
PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
|
||||||
|
--------------------------------------------
|
||||||
|
|
||||||
|
1. This LICENSE AGREEMENT is between the Python Software Foundation
|
||||||
|
("PSF"), and the Individual or Organization ("Licensee") accessing and
|
||||||
|
otherwise using this software ("Python") in source or binary form and
|
||||||
|
its associated documentation.
|
||||||
|
|
||||||
|
2. Subject to the terms and conditions of this License Agreement, PSF hereby
|
||||||
|
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
|
||||||
|
analyze, test, perform and/or display publicly, prepare derivative works,
|
||||||
|
distribute, and otherwise use Python alone or in any derivative version,
|
||||||
|
provided, however, that PSF's License Agreement and PSF's notice of copyright,
|
||||||
|
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
|
||||||
|
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018 Python Software Foundation; All
|
||||||
|
Rights Reserved" are retained in Python alone or in any derivative version
|
||||||
|
prepared by Licensee.
|
||||||
|
|
||||||
|
3. In the event Licensee prepares a derivative work that is based on
|
||||||
|
or incorporates Python or any part thereof, and wants to make
|
||||||
|
the derivative work available to others as provided herein, then
|
||||||
|
Licensee hereby agrees to include in any such work a brief summary of
|
||||||
|
the changes made to Python.
|
||||||
|
|
||||||
|
4. PSF is making Python available to Licensee on an "AS IS"
|
||||||
|
basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
|
||||||
|
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
|
||||||
|
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
|
||||||
|
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
|
||||||
|
INFRINGE ANY THIRD PARTY RIGHTS.
|
||||||
|
|
||||||
|
5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
|
||||||
|
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
|
||||||
|
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
|
||||||
|
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
|
||||||
|
|
||||||
|
6. This License Agreement will automatically terminate upon a material
|
||||||
|
breach of its terms and conditions.
|
||||||
|
|
||||||
|
7. Nothing in this License Agreement shall be deemed to create any
|
||||||
|
relationship of agency, partnership, or joint venture between PSF and
|
||||||
|
Licensee. This License Agreement does not grant permission to use PSF
|
||||||
|
trademarks or trade name in a trademark sense to endorse or promote
|
||||||
|
products or services of Licensee, or any third party.
|
||||||
|
|
||||||
|
8. By copying, installing or otherwise using Python, Licensee
|
||||||
|
agrees to be bound by the terms and conditions of this License
|
||||||
|
Agreement.
|
||||||
|
|
||||||
|
|
||||||
|
BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
|
||||||
|
-------------------------------------------
|
||||||
|
|
||||||
|
BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
|
||||||
|
|
||||||
|
1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
|
||||||
|
office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
|
||||||
|
Individual or Organization ("Licensee") accessing and otherwise using
|
||||||
|
this software in source or binary form and its associated
|
||||||
|
documentation ("the Software").
|
||||||
|
|
||||||
|
2. Subject to the terms and conditions of this BeOpen Python License
|
||||||
|
Agreement, BeOpen hereby grants Licensee a non-exclusive,
|
||||||
|
royalty-free, world-wide license to reproduce, analyze, test, perform
|
||||||
|
and/or display publicly, prepare derivative works, distribute, and
|
||||||
|
otherwise use the Software alone or in any derivative version,
|
||||||
|
provided, however, that the BeOpen Python License is retained in the
|
||||||
|
Software, alone or in any derivative version prepared by Licensee.
|
||||||
|
|
||||||
|
3. BeOpen is making the Software available to Licensee on an "AS IS"
|
||||||
|
basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
|
||||||
|
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
|
||||||
|
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
|
||||||
|
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
|
||||||
|
INFRINGE ANY THIRD PARTY RIGHTS.
|
||||||
|
|
||||||
|
4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
|
||||||
|
SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
|
||||||
|
AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
|
||||||
|
DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
|
||||||
|
|
||||||
|
5. This License Agreement will automatically terminate upon a material
|
||||||
|
breach of its terms and conditions.
|
||||||
|
|
||||||
|
6. This License Agreement shall be governed by and interpreted in all
|
||||||
|
respects by the law of the State of California, excluding conflict of
|
||||||
|
law provisions. Nothing in this License Agreement shall be deemed to
|
||||||
|
create any relationship of agency, partnership, or joint venture
|
||||||
|
between BeOpen and Licensee. This License Agreement does not grant
|
||||||
|
permission to use BeOpen trademarks or trade names in a trademark
|
||||||
|
sense to endorse or promote products or services of Licensee, or any
|
||||||
|
third party. As an exception, the "BeOpen Python" logos available at
|
||||||
|
http://www.pythonlabs.com/logos.html may be used according to the
|
||||||
|
permissions granted on that web page.
|
||||||
|
|
||||||
|
7. By copying, installing or otherwise using the software, Licensee
|
||||||
|
agrees to be bound by the terms and conditions of this License
|
||||||
|
Agreement.
|
||||||
|
|
||||||
|
|
||||||
|
CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
|
||||||
|
---------------------------------------
|
||||||
|
|
||||||
|
1. This LICENSE AGREEMENT is between the Corporation for National
|
||||||
|
Research Initiatives, having an office at 1895 Preston White Drive,
|
||||||
|
Reston, VA 20191 ("CNRI"), and the Individual or Organization
|
||||||
|
("Licensee") accessing and otherwise using Python 1.6.1 software in
|
||||||
|
source or binary form and its associated documentation.
|
||||||
|
|
||||||
|
2. Subject to the terms and conditions of this License Agreement, CNRI
|
||||||
|
hereby grants Licensee a nonexclusive, royalty-free, world-wide
|
||||||
|
license to reproduce, analyze, test, perform and/or display publicly,
|
||||||
|
prepare derivative works, distribute, and otherwise use Python 1.6.1
|
||||||
|
alone or in any derivative version, provided, however, that CNRI's
|
||||||
|
License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
|
||||||
|
1995-2001 Corporation for National Research Initiatives; All Rights
|
||||||
|
Reserved" are retained in Python 1.6.1 alone or in any derivative
|
||||||
|
version prepared by Licensee. Alternately, in lieu of CNRI's License
|
||||||
|
Agreement, Licensee may substitute the following text (omitting the
|
||||||
|
quotes): "Python 1.6.1 is made available subject to the terms and
|
||||||
|
conditions in CNRI's License Agreement. This Agreement together with
|
||||||
|
Python 1.6.1 may be located on the Internet using the following
|
||||||
|
unique, persistent identifier (known as a handle): 1895.22/1013. This
|
||||||
|
Agreement may also be obtained from a proxy server on the Internet
|
||||||
|
using the following URL: http://hdl.handle.net/1895.22/1013".
|
||||||
|
|
||||||
|
3. In the event Licensee prepares a derivative work that is based on
|
||||||
|
or incorporates Python 1.6.1 or any part thereof, and wants to make
|
||||||
|
the derivative work available to others as provided herein, then
|
||||||
|
Licensee hereby agrees to include in any such work a brief summary of
|
||||||
|
the changes made to Python 1.6.1.
|
||||||
|
|
||||||
|
4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
|
||||||
|
basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
|
||||||
|
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
|
||||||
|
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
|
||||||
|
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
|
||||||
|
INFRINGE ANY THIRD PARTY RIGHTS.
|
||||||
|
|
||||||
|
5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
|
||||||
|
1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
|
||||||
|
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
|
||||||
|
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
|
||||||
|
|
||||||
|
6. This License Agreement will automatically terminate upon a material
|
||||||
|
breach of its terms and conditions.
|
||||||
|
|
||||||
|
7. This License Agreement shall be governed by the federal
|
||||||
|
intellectual property law of the United States, including without
|
||||||
|
limitation the federal copyright law, and, to the extent such
|
||||||
|
U.S. federal law does not apply, by the law of the Commonwealth of
|
||||||
|
Virginia, excluding Virginia's conflict of law provisions.
|
||||||
|
Notwithstanding the foregoing, with regard to derivative works based
|
||||||
|
on Python 1.6.1 that incorporate non-separable material that was
|
||||||
|
previously distributed under the GNU General Public License (GPL), the
|
||||||
|
law of the Commonwealth of Virginia shall govern this License
|
||||||
|
Agreement only as to issues arising under or with respect to
|
||||||
|
Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
|
||||||
|
License Agreement shall be deemed to create any relationship of
|
||||||
|
agency, partnership, or joint venture between CNRI and Licensee. This
|
||||||
|
License Agreement does not grant permission to use CNRI trademarks or
|
||||||
|
trade name in a trademark sense to endorse or promote products or
|
||||||
|
services of Licensee, or any third party.
|
||||||
|
|
||||||
|
8. By clicking on the "ACCEPT" button where indicated, or by copying,
|
||||||
|
installing or otherwise using Python 1.6.1, Licensee agrees to be
|
||||||
|
bound by the terms and conditions of this License Agreement.
|
||||||
|
|
||||||
|
ACCEPT
|
||||||
|
|
||||||
|
|
||||||
|
CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
|
||||||
|
--------------------------------------------------
|
||||||
|
|
||||||
|
Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
|
||||||
|
The Netherlands. All rights reserved.
|
||||||
|
|
||||||
|
Permission to use, copy, modify, and distribute this software and its
|
||||||
|
documentation for any purpose and without fee is hereby granted,
|
||||||
|
provided that the above copyright notice appear in all copies and that
|
||||||
|
both that copyright notice and this permission notice appear in
|
||||||
|
supporting documentation, and that the name of Stichting Mathematisch
|
||||||
|
Centrum or CWI not be used in advertising or publicity pertaining to
|
||||||
|
distribution of the software without specific, written prior
|
||||||
|
permission.
|
||||||
|
|
||||||
|
STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
|
||||||
|
THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
|
||||||
|
FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
|
||||||
|
FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
|
||||||
|
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
|
||||||
|
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
|
||||||
|
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
Additional Conditions for this Windows binary build
|
||||||
|
---------------------------------------------------
|
||||||
|
|
||||||
|
This program is linked with and uses Microsoft Distributable Code,
|
||||||
|
copyrighted by Microsoft Corporation. The Microsoft Distributable Code
|
||||||
|
is embedded in each .exe, .dll and .pyd file as a result of running
|
||||||
|
the code through a linker.
|
||||||
|
|
||||||
|
If you further distribute programs that include the Microsoft
|
||||||
|
Distributable Code, you must comply with the restrictions on
|
||||||
|
distribution specified by Microsoft. In particular, you must require
|
||||||
|
distributors and external end users to agree to terms that protect the
|
||||||
|
Microsoft Distributable Code at least as much as Microsoft's own
|
||||||
|
requirements for the Distributable Code. See Microsoft's documentation
|
||||||
|
(included in its developer tools and on its website at microsoft.com)
|
||||||
|
for specific details.
|
||||||
|
|
||||||
|
Redistribution of the Windows binary build of the Python interpreter
|
||||||
|
complies with this agreement, provided that you do not:
|
||||||
|
|
||||||
|
- alter any copyright, trademark or patent notice in Microsoft's
|
||||||
|
Distributable Code;
|
||||||
|
|
||||||
|
- use Microsoft's trademarks in your programs' names or in a way that
|
||||||
|
suggests your programs come from or are endorsed by Microsoft;
|
||||||
|
|
||||||
|
- distribute Microsoft's Distributable Code to run on a platform other
|
||||||
|
than Microsoft operating systems, run-time technologies or application
|
||||||
|
platforms; or
|
||||||
|
|
||||||
|
- include Microsoft Distributable Code in malicious, deceptive or
|
||||||
|
unlawful programs.
|
||||||
|
|
||||||
|
These restrictions apply only to the Microsoft Distributable Code as
|
||||||
|
defined above, not to Python itself or any programs running on the
|
||||||
|
Python interpreter. The redistribution of the Python interpreter and
|
||||||
|
libraries is governed by the Python Software License included with this
|
||||||
|
file, or by other licenses as marked.
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
--------------------------------------------------------------------------
|
||||||
|
|
||||||
|
This program, "bzip2", the associated library "libbzip2", and all
|
||||||
|
documentation, are copyright (C) 1996-2010 Julian R Seward. All
|
||||||
|
rights reserved.
|
||||||
|
|
||||||
|
Redistribution and use in source and binary forms, with or without
|
||||||
|
modification, are permitted provided that the following conditions
|
||||||
|
are met:
|
||||||
|
|
||||||
|
1. Redistributions of source code must retain the above copyright
|
||||||
|
notice, this list of conditions and the following disclaimer.
|
||||||
|
|
||||||
|
2. The origin of this software must not be misrepresented; you must
|
||||||
|
not claim that you wrote the original software. If you use this
|
||||||
|
software in a product, an acknowledgment in the product
|
||||||
|
documentation would be appreciated but is not required.
|
||||||
|
|
||||||
|
3. Altered source versions must be plainly marked as such, and must
|
||||||
|
not be misrepresented as being the original software.
|
||||||
|
|
||||||
|
4. The name of the author may not be used to endorse or promote
|
||||||
|
products derived from this software without specific prior written
|
||||||
|
permission.
|
||||||
|
|
||||||
|
THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS
|
||||||
|
OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
|
||||||
|
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
|
||||||
|
ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
|
||||||
|
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
|
||||||
|
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
|
||||||
|
GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
|
||||||
|
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
|
||||||
|
WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
|
||||||
|
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
|
||||||
|
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||||
|
|
||||||
|
Julian Seward, jseward@bzip.org
|
||||||
|
bzip2/libbzip2 version 1.0.6 of 6 September 2010
|
||||||
|
|
||||||
|
--------------------------------------------------------------------------
|
||||||
|
|
||||||
|
|
||||||
|
LICENSE ISSUES
|
||||||
|
==============
|
||||||
|
|
||||||
|
The OpenSSL toolkit stays under a double license, i.e. both the conditions of
|
||||||
|
the OpenSSL License and the original SSLeay license apply to the toolkit.
|
||||||
|
See below for the actual license texts.
|
||||||
|
|
||||||
|
OpenSSL License
|
||||||
|
---------------
|
||||||
|
|
||||||
|
/* ====================================================================
|
||||||
|
* Copyright (c) 1998-2018 The OpenSSL Project. All rights reserved.
|
||||||
|
*
|
||||||
|
* Redistribution and use in source and binary forms, with or without
|
||||||
|
* modification, are permitted provided that the following conditions
|
||||||
|
* are met:
|
||||||
|
*
|
||||||
|
* 1. Redistributions of source code must retain the above copyright
|
||||||
|
* notice, this list of conditions and the following disclaimer.
|
||||||
|
*
|
||||||
|
* 2. Redistributions in binary form must reproduce the above copyright
|
||||||
|
* notice, this list of conditions and the following disclaimer in
|
||||||
|
* the documentation and/or other materials provided with the
|
||||||
|
* distribution.
|
||||||
|
*
|
||||||
|
* 3. All advertising materials mentioning features or use of this
|
||||||
|
* software must display the following acknowledgment:
|
||||||
|
* "This product includes software developed by the OpenSSL Project
|
||||||
|
* for use in the OpenSSL Toolkit. (http://www.openssl.org/)"
|
||||||
|
*
|
||||||
|
* 4. The names "OpenSSL Toolkit" and "OpenSSL Project" must not be used to
|
||||||
|
* endorse or promote products derived from this software without
|
||||||
|
* prior written permission. For written permission, please contact
|
||||||
|
* openssl-core@openssl.org.
|
||||||
|
*
|
||||||
|
* 5. Products derived from this software may not be called "OpenSSL"
|
||||||
|
* nor may "OpenSSL" appear in their names without prior written
|
||||||
|
* permission of the OpenSSL Project.
|
||||||
|
*
|
||||||
|
* 6. Redistributions of any form whatsoever must retain the following
|
||||||
|
* acknowledgment:
|
||||||
|
* "This product includes software developed by the OpenSSL Project
|
||||||
|
* for use in the OpenSSL Toolkit (http://www.openssl.org/)"
|
||||||
|
*
|
||||||
|
* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
|
||||||
|
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
||||||
|
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
||||||
|
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
|
||||||
|
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
||||||
|
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
|
||||||
|
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
|
||||||
|
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
|
||||||
|
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
|
||||||
|
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
|
||||||
|
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
|
||||||
|
* OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||||
|
* ====================================================================
|
||||||
|
*
|
||||||
|
* This product includes cryptographic software written by Eric Young
|
||||||
|
* (eay@cryptsoft.com). This product includes software written by Tim
|
||||||
|
* Hudson (tjh@cryptsoft.com).
|
||||||
|
*
|
||||||
|
*/
|
||||||
|
|
||||||
|
Original SSLeay License
|
||||||
|
-----------------------
|
||||||
|
|
||||||
|
/* Copyright (C) 1995-1998 Eric Young (eay@cryptsoft.com)
|
||||||
|
* All rights reserved.
|
||||||
|
*
|
||||||
|
* This package is an SSL implementation written
|
||||||
|
* by Eric Young (eay@cryptsoft.com).
|
||||||
|
* The implementation was written so as to conform with Netscapes SSL.
|
||||||
|
*
|
||||||
|
* This library is free for commercial and non-commercial use as long as
|
||||||
|
* the following conditions are aheared to. The following conditions
|
||||||
|
* apply to all code found in this distribution, be it the RC4, RSA,
|
||||||
|
* lhash, DES, etc., code; not just the SSL code. The SSL documentation
|
||||||
|
* included with this distribution is covered by the same copyright terms
|
||||||
|
* except that the holder is Tim Hudson (tjh@cryptsoft.com).
|
||||||
|
*
|
||||||
|
* Copyright remains Eric Young's, and as such any Copyright notices in
|
||||||
|
* the code are not to be removed.
|
||||||
|
* If this package is used in a product, Eric Young should be given attribution
|
||||||
|
* as the author of the parts of the library used.
|
||||||
|
* This can be in the form of a textual message at program startup or
|
||||||
|
* in documentation (online or textual) provided with the package.
|
||||||
|
*
|
||||||
|
* Redistribution and use in source and binary forms, with or without
|
||||||
|
* modification, are permitted provided that the following conditions
|
||||||
|
* are met:
|
||||||
|
* 1. Redistributions of source code must retain the copyright
|
||||||
|
* notice, this list of conditions and the following disclaimer.
|
||||||
|
* 2. Redistributions in binary form must reproduce the above copyright
|
||||||
|
* notice, this list of conditions and the following disclaimer in the
|
||||||
|
* documentation and/or other materials provided with the distribution.
|
||||||
|
* 3. All advertising materials mentioning features or use of this software
|
||||||
|
* must display the following acknowledgement:
|
||||||
|
* "This product includes cryptographic software written by
|
||||||
|
* Eric Young (eay@cryptsoft.com)"
|
||||||
|
* The word 'cryptographic' can be left out if the rouines from the library
|
||||||
|
* being used are not cryptographic related :-).
|
||||||
|
* 4. If you include any Windows specific code (or a derivative thereof) from
|
||||||
|
* the apps directory (application code) you must include an acknowledgement:
|
||||||
|
* "This product includes software written by Tim Hudson (tjh@cryptsoft.com)"
|
||||||
|
*
|
||||||
|
* THIS SOFTWARE IS PROVIDED BY ERIC YOUNG ``AS IS'' AND
|
||||||
|
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
||||||
|
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
|
||||||
|
* ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
|
||||||
|
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
|
||||||
|
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
|
||||||
|
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
|
||||||
|
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
|
||||||
|
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
|
||||||
|
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
|
||||||
|
* SUCH DAMAGE.
|
||||||
|
*
|
||||||
|
* The licence and distribution terms for any publically available version or
|
||||||
|
* derivative of this code cannot be changed. i.e. this code cannot simply be
|
||||||
|
* copied and put under another distribution licence
|
||||||
|
* [including the GNU Public Licence.]
|
||||||
|
*/
|
||||||
|
|
||||||
|
|
||||||
|
This software is copyrighted by the Regents of the University of
|
||||||
|
California, Sun Microsystems, Inc., Scriptics Corporation, ActiveState
|
||||||
|
Corporation and other parties. The following terms apply to all files
|
||||||
|
associated with the software unless explicitly disclaimed in
|
||||||
|
individual files.
|
||||||
|
|
||||||
|
The authors hereby grant permission to use, copy, modify, distribute,
|
||||||
|
and license this software and its documentation for any purpose, provided
|
||||||
|
that existing copyright notices are retained in all copies and that this
|
||||||
|
notice is included verbatim in any distributions. No written agreement,
|
||||||
|
license, or royalty fee is required for any of the authorized uses.
|
||||||
|
Modifications to this software may be copyrighted by their authors
|
||||||
|
and need not follow the licensing terms described here, provided that
|
||||||
|
the new terms are clearly indicated on the first page of each file where
|
||||||
|
they apply.
|
||||||
|
|
||||||
|
IN NO EVENT SHALL THE AUTHORS OR DISTRIBUTORS BE LIABLE TO ANY PARTY
|
||||||
|
FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES
|
||||||
|
ARISING OUT OF THE USE OF THIS SOFTWARE, ITS DOCUMENTATION, OR ANY
|
||||||
|
DERIVATIVES THEREOF, EVEN IF THE AUTHORS HAVE BEEN ADVISED OF THE
|
||||||
|
POSSIBILITY OF SUCH DAMAGE.
|
||||||
|
|
||||||
|
THE AUTHORS AND DISTRIBUTORS SPECIFICALLY DISCLAIM ANY WARRANTIES,
|
||||||
|
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY,
|
||||||
|
FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. THIS SOFTWARE
|
||||||
|
IS PROVIDED ON AN "AS IS" BASIS, AND THE AUTHORS AND DISTRIBUTORS HAVE
|
||||||
|
NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR
|
||||||
|
MODIFICATIONS.
|
||||||
|
|
||||||
|
GOVERNMENT USE: If you are acquiring this software on behalf of the
|
||||||
|
U.S. government, the Government shall have only "Restricted Rights"
|
||||||
|
in the software and related documentation as defined in the Federal
|
||||||
|
Acquisition Regulations (FARs) in Clause 52.227.19 (c) (2). If you
|
||||||
|
are acquiring the software on behalf of the Department of Defense, the
|
||||||
|
software shall be classified as "Commercial Computer Software" and the
|
||||||
|
Government shall have only "Restricted Rights" as defined in Clause
|
||||||
|
252.227-7014 (b) (3) of DFARs. Notwithstanding the foregoing, the
|
||||||
|
authors grant the U.S. Government and others acting in its behalf
|
||||||
|
permission to use and distribute the software in accordance with the
|
||||||
|
terms specified in this license.
|
||||||
|
|
||||||
|
This software is copyrighted by the Regents of the University of
|
||||||
|
California, Sun Microsystems, Inc., Scriptics Corporation, ActiveState
|
||||||
|
Corporation, Apple Inc. and other parties. The following terms apply to
|
||||||
|
all files associated with the software unless explicitly disclaimed in
individual files.

The authors hereby grant permission to use, copy, modify, distribute,
and license this software and its documentation for any purpose, provided
that existing copyright notices are retained in all copies and that this
notice is included verbatim in any distributions. No written agreement,
license, or royalty fee is required for any of the authorized uses.
Modifications to this software may be copyrighted by their authors
and need not follow the licensing terms described here, provided that
the new terms are clearly indicated on the first page of each file where
they apply.

IN NO EVENT SHALL THE AUTHORS OR DISTRIBUTORS BE LIABLE TO ANY PARTY
FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES
ARISING OUT OF THE USE OF THIS SOFTWARE, ITS DOCUMENTATION, OR ANY
DERIVATIVES THEREOF, EVEN IF THE AUTHORS HAVE BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

THE AUTHORS AND DISTRIBUTORS SPECIFICALLY DISCLAIM ANY WARRANTIES,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. THIS SOFTWARE
IS PROVIDED ON AN "AS IS" BASIS, AND THE AUTHORS AND DISTRIBUTORS HAVE
NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR
MODIFICATIONS.

GOVERNMENT USE: If you are acquiring this software on behalf of the
U.S. government, the Government shall have only "Restricted Rights"
in the software and related documentation as defined in the Federal
Acquisition Regulations (FARs) in Clause 52.227.19 (c) (2). If you
are acquiring the software on behalf of the Department of Defense, the
software shall be classified as "Commercial Computer Software" and the
Government shall have only "Restricted Rights" as defined in Clause
252.227-7013 (b) (3) of DFARs. Notwithstanding the foregoing, the
authors grant the U.S. Government and others acting in its behalf
permission to use and distribute the software in accordance with the
terms specified in this license.

Copyright (c) 1993-1999 Ioi Kim Lam.
Copyright (c) 2000-2001 Tix Project Group.
Copyright (c) 2004 ActiveState

This software is copyrighted by the above entities
and other parties. The following terms apply to all files associated
with the software unless explicitly disclaimed in individual files.

The authors hereby grant permission to use, copy, modify, distribute,
and license this software and its documentation for any purpose, provided
that existing copyright notices are retained in all copies and that this
notice is included verbatim in any distributions. No written agreement,
license, or royalty fee is required for any of the authorized uses.
Modifications to this software may be copyrighted by their authors
and need not follow the licensing terms described here, provided that
the new terms are clearly indicated on the first page of each file where
they apply.

IN NO EVENT SHALL THE AUTHORS OR DISTRIBUTORS BE LIABLE TO ANY PARTY
FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES
ARISING OUT OF THE USE OF THIS SOFTWARE, ITS DOCUMENTATION, OR ANY
DERIVATIVES THEREOF, EVEN IF THE AUTHORS HAVE BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

THE AUTHORS AND DISTRIBUTORS SPECIFICALLY DISCLAIM ANY WARRANTIES,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. THIS SOFTWARE
IS PROVIDED ON AN "AS IS" BASIS, AND THE AUTHORS AND DISTRIBUTORS HAVE
NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR
MODIFICATIONS.

GOVERNMENT USE: If you are acquiring this software on behalf of the
U.S. government, the Government shall have only "Restricted Rights"
in the software and related documentation as defined in the Federal
Acquisition Regulations (FARs) in Clause 52.227.19 (c) (2). If you
are acquiring the software on behalf of the Department of Defense, the
software shall be classified as "Commercial Computer Software" and the
Government shall have only "Restricted Rights" as defined in Clause
252.227-7013 (c) (1) of DFARs. Notwithstanding the foregoing, the
authors grant the U.S. Government and others acting in its behalf
permission to use and distribute the software in accordance with the
terms specified in this license.

----------------------------------------------------------------------

Parts of this software are based on the Tcl/Tk software copyrighted by
the Regents of the University of California, Sun Microsystems, Inc.,
and other parties. The original license terms of the Tcl/Tk software
distribution is included in the file docs/license.tcltk.

Parts of this software are based on the HTML Library software
copyrighted by Sun Microsystems, Inc. The original license terms of
the HTML Library software distribution is included in the file
docs/license.html_lib.
@@ -0,0 +1,146 @@
"""Record of phased-in incompatible language changes.

Each line is of the form:

    FeatureName = "_Feature(" OptionalRelease "," MandatoryRelease ","
                              CompilerFlag ")"

where, normally, OptionalRelease < MandatoryRelease, and both are 5-tuples
of the same form as sys.version_info:

    (PY_MAJOR_VERSION, # the 2 in 2.1.0a3; an int
     PY_MINOR_VERSION, # the 1; an int
     PY_MICRO_VERSION, # the 0; an int
     PY_RELEASE_LEVEL, # "alpha", "beta", "candidate" or "final"; string
     PY_RELEASE_SERIAL # the 3; an int
    )

OptionalRelease records the first release in which

    from __future__ import FeatureName

was accepted.

In the case of MandatoryReleases that have not yet occurred,
MandatoryRelease predicts the release in which the feature will become part
of the language.

Else MandatoryRelease records when the feature became part of the language;
in releases at or after that, modules no longer need

    from __future__ import FeatureName

to use the feature in question, but may continue to use such imports.

MandatoryRelease may also be None, meaning that a planned feature got
dropped.

Instances of class _Feature have two corresponding methods,
.getOptionalRelease() and .getMandatoryRelease().

CompilerFlag is the (bitfield) flag that should be passed in the fourth
argument to the builtin function compile() to enable the feature in
dynamically compiled code.  This flag is stored in the .compiler_flag
attribute on _Future instances.  These values must match the appropriate
#defines of CO_xxx flags in Include/compile.h.

No feature line is ever to be deleted from this file.
"""

all_feature_names = [
    "nested_scopes",
    "generators",
    "division",
    "absolute_import",
    "with_statement",
    "print_function",
    "unicode_literals",
    "barry_as_FLUFL",
    "generator_stop",
    "annotations",
]

__all__ = ["all_feature_names"] + all_feature_names

# The CO_xxx symbols are defined here under the same names defined in
# code.h and used by compile.h, so that an editor search will find them here.
# However, they're not exported in __all__, because they don't really belong to
# this module.
CO_NESTED = 0x0010                      # nested_scopes
CO_GENERATOR_ALLOWED = 0                # generators (obsolete, was 0x1000)
CO_FUTURE_DIVISION = 0x2000             # division
CO_FUTURE_ABSOLUTE_IMPORT = 0x4000      # perform absolute imports by default
CO_FUTURE_WITH_STATEMENT = 0x8000       # with statement
CO_FUTURE_PRINT_FUNCTION = 0x10000      # print function
CO_FUTURE_UNICODE_LITERALS = 0x20000    # unicode string literals
CO_FUTURE_BARRY_AS_BDFL = 0x40000
CO_FUTURE_GENERATOR_STOP = 0x80000      # StopIteration becomes RuntimeError in generators
CO_FUTURE_ANNOTATIONS = 0x100000        # annotations become strings at runtime


class _Feature:

    def __init__(self, optionalRelease, mandatoryRelease, compiler_flag):
        self.optional = optionalRelease
        self.mandatory = mandatoryRelease
        self.compiler_flag = compiler_flag

    def getOptionalRelease(self):
        """Return first release in which this feature was recognized.

        This is a 5-tuple, of the same form as sys.version_info.
        """
        return self.optional

    def getMandatoryRelease(self):
        """Return release in which this feature will become mandatory.

        This is a 5-tuple, of the same form as sys.version_info, or, if
        the feature was dropped, is None.
        """
        return self.mandatory

    def __repr__(self):
        return "_Feature" + repr((self.optional,
                                  self.mandatory,
                                  self.compiler_flag))


nested_scopes = _Feature((2, 1, 0, "beta", 1),
                         (2, 2, 0, "alpha", 0),
                         CO_NESTED)

generators = _Feature((2, 2, 0, "alpha", 1),
                      (2, 3, 0, "final", 0),
                      CO_GENERATOR_ALLOWED)

division = _Feature((2, 2, 0, "alpha", 2),
                    (3, 0, 0, "alpha", 0),
                    CO_FUTURE_DIVISION)

absolute_import = _Feature((2, 5, 0, "alpha", 1),
                           (3, 0, 0, "alpha", 0),
                           CO_FUTURE_ABSOLUTE_IMPORT)

with_statement = _Feature((2, 5, 0, "alpha", 1),
                          (2, 6, 0, "alpha", 0),
                          CO_FUTURE_WITH_STATEMENT)

print_function = _Feature((2, 6, 0, "alpha", 2),
                          (3, 0, 0, "alpha", 0),
                          CO_FUTURE_PRINT_FUNCTION)

unicode_literals = _Feature((2, 6, 0, "alpha", 2),
                            (3, 0, 0, "alpha", 0),
                            CO_FUTURE_UNICODE_LITERALS)

barry_as_FLUFL = _Feature((3, 1, 0, "alpha", 2),
                          (3, 9, 0, "alpha", 0),
                          CO_FUTURE_BARRY_AS_BDFL)

generator_stop = _Feature((3, 5, 0, "beta", 1),
                          (3, 7, 0, "alpha", 0),
                          CO_FUTURE_GENERATOR_STOP)

annotations = _Feature((3, 7, 0, "beta", 1),
                       (4, 0, 0, "alpha", 0),
                       CO_FUTURE_ANNOTATIONS)
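The docstring of the file above explains how each _Feature records its optional and mandatory release tuples and how its compiler_flag can be passed as the flags argument of the built-in compile(). The following minimal sketch is an editorial illustration only, not part of the files added by this commit; it assumes a standard CPython 3.7+ interpreter where __future__.annotations is available.

# --- illustrative sketch (not part of the committed file) ---
import __future__

# Each _Feature exposes the release tuples described in the docstring.
print(__future__.division.getOptionalRelease())   # (2, 2, 0, 'alpha', 2)
print(__future__.division.getMandatoryRelease())  # (3, 0, 0, 'alpha', 0)

# The compiler_flag can be passed to compile() so that dynamically compiled
# code behaves as if it contained "from __future__ import annotations":
# the annotation below is stored as a string and never evaluated, so the
# undefined name does not raise at function definition time.
source = "def f(x: SomeUndefinedName) -> int:\n    return 0\n"
code = compile(source, "<example>", "exec",
               flags=__future__.annotations.compiler_flag)
namespace = {}
exec(code, namespace)
print(namespace["f"].__annotations__["x"])  # 'SomeUndefinedName' (a string)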
@@ -0,0 +1 @@
# This file exists as a helper for the test.test_frozen module.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Some files were not shown because too many files have changed in this diff.