
Add files via upload

Initial upload of interactive dashboard for the new dedicated repository.
Michael Pyrcz, 1 year ago
Commit c06937ac07
60 changed files with 49002 additions and 0 deletions
1. Interactive_Aggregate_Uncertainty.ipynb (+752 -0)
2. Interactive_Bayesian_Linear_Regression.ipynb (+582 -0)
3. Interactive_Bayesian_Updating.ipynb (+803 -0)
4. Interactive_Bootstrap.ipynb (+523 -0)
5. Interactive_Bootstrap_CowbowHat.ipynb (+717 -0)
6. Interactive_Bootstrap_Everthing.ipynb (+501 -0)
7. Interactive_Central_Limit_Theorem.ipynb (+484 -0)
8. Interactive_Confidence_Interval.ipynb (+792 -0)
9. Interactive_Convolution_kNearest.ipynb (+594 -0)
10. Interactive_Correlation_Coefficient.ipynb (+470 -0)
11. Interactive_Correlation_Coefficient_Issues.ipynb (+487 -0)
12. Interactive_DBSCAN.ipynb (+509 -0)
13. Interactive_DecisionMaking_Load_Data.ipynb (+517 -0)
14. Interactive_Decision_Making.ipynb (+405 -0)
15. Interactive_Decision_Tree.ipynb (+675 -0)
16. Interactive_Declustering.ipynb (+809 -0)
17. Interactive_Distribution_Transformations.ipynb (+1246 -0)
18. Interactive_Gibbs_Sampler.ipynb (+342 -0)
19. Interactive_Hypothesis_Testing.ipynb (+466 -0)
20. Interactive_LASSO_Regression.ipynb (+1051 -0)
21. Interactive_Linear_Regression.ipynb (+1209 -0)
22. Interactive_Linear_Solutions.ipynb (+965 -0)
23. Interactive_MDS_Impact_Correlation_Standardization.ipynb (+718 -0)
24. Interactive_Marginal_Joint_Conditional_Probability.ipynb (+804 -0)
25. Interactive_Model_Fitting.ipynb (+683 -0)
26. Interactive_Monte_Carlo_Methods.ipynb (+672 -0)
27. Interactive_Monte_Carlo_Simulation.ipynb (+889 -0)
28. Interactive_Neural_Network_Single_Layer.ipynb (+2289 -0)
29. Interactive_Norms.ipynb (+374 -0)
30. Interactive_Optimization.ipynb (+3469 -0)
31. Interactive_Optimization_Working.ipynb (+2331 -0)
32. Interactive_Overfit.ipynb (+408 -0)
33. Interactive_PCA.ipynb (+1694 -0)
34. Interactive_PCA_Eigen-Copy1.ipynb (+427 -0)
35. Interactive_PCA_Eigen.ipynb (+348 -0)
36. Interactive_PP_Plot.ipynb (+329 -0)
37. Interactive_Parametric_Distributions.ipynb (+615 -0)
38. Interactive_Polynomial_Solution.ipynb (+599 -0)
39. Interactive_PreDrill_Prediction.ipynb (+925 -0)
40. Interactive_QQ_Plot.ipynb (+312 -0)
41. Interactive_RadialBasisFunctions.ipynb (+502 -0)
42. Interactive_Ridge_Regresion.ipynb (+1029 -0)
43. Interactive_Sampling_Methods.ipynb (+317 -0)
44. Interactive_Simple_Kriging.ipynb (+683 -0)
45. Interactive_Simple_Kriging_Behavoir.ipynb (+899 -0)
46. Interactive_Simulation.ipynb (+911 -0)
47. Interactive_Sivia_Coin_Toss.ipynb (+383 -0)
48. Interactive_Spatial_Aggregate_Uncertainty.ipynb (+759 -0)
49. Interactive_Spatial_Aggregate_Uncertainty_Pad.ipynb (+762 -0)
50. Interactive_Spectral_Clustering.ipynb (+1304 -0)
51. Interactive_Spurious_Correlations.ipynb (+379 -0)
52. Interactive_String_Effect.ipynb (+596 -0)
53. Interactive_Trend.ipynb (+675 -0)
54. Interactive_Variogram_Calculation.ipynb (+1063 -0)
55. Interactive_Variogram_Calculation_Modeling.ipynb (+1257 -0)
56. Interactive_Variogram_Calculation_Modeling_Krige.ipynb (+1558 -0)
57. Interactive_Variogram_Modeling.ipynb (+1555 -0)
58. Interactive_Variogram_Nugget_Effect.ipynb (+412 -0)
59. Interactive_Variogram_h_scatter.ipynb (+453 -0)
60. Interactive_kMeans_Clustering.ipynb (+720 -0)

File diff suppressed because it is too large
+ 752 - 0
Interactive_Aggregate_Uncertainty.ipynb


File diff suppressed because it is too large
+ 582 - 0
Interactive_Bayesian_Linear_Regression.ipynb


File diff suppressed because it is too large
+ 803 - 0
Interactive_Bayesian_Updating.ipynb


File diff suppressed because it is too large
+ 523 - 0
Interactive_Bootstrap.ipynb


File diff suppressed because it is too large
+ 717 - 0
Interactive_Bootstrap_CowbowHat.ipynb


File diff suppressed because it is too large
+ 501 - 0
Interactive_Bootstrap_Everthing.ipynb


File diff suppressed because it is too large
+ 484 - 0
Interactive_Central_Limit_Theorem.ipynb


File diff suppressed because it is too large
+ 792 - 0
Interactive_Confidence_Interval.ipynb


File diff suppressed because it is too large
+ 594 - 0
Interactive_Convolution_kNearest.ipynb


File diff suppressed because it is too large
+ 470 - 0
Interactive_Correlation_Coefficient.ipynb


File diff suppressed because it is too large
+ 487 - 0
Interactive_Correlation_Coefficient_Issues.ipynb


File diff suppressed because it is too large
+ 509 - 0
Interactive_DBSCAN.ipynb


File diff suppressed because it is too large
+ 517 - 0
Interactive_DecisionMaking_Load_Data.ipynb


File diff suppressed because it is too large
+ 405 - 0
Interactive_Decision_Making.ipynb


File diff suppressed because it is too large
+ 675 - 0
Interactive_Decision_Tree.ipynb


File diff suppressed because it is too large
+ 809 - 0
Interactive_Declustering.ipynb


File diff suppressed because it is too large
+ 1246 - 0
Interactive_Distribution_Transformations.ipynb


+ 342 - 0
Interactive_Gibbs_Sampler.ipynb

@@ -0,0 +1,342 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "ede095b1",
+   "metadata": {},
+   "source": [
+    "<p align=\"center\">\n",
+    "    <img src=\"https://github.com/GeostatsGuy/GeostatsPy/blob/master/TCG_color_logo.png?raw=true\" width=\"220\" height=\"240\" />\n",
+    "\n",
+    "</p>\n",
+    "\n",
+    "## Interactive Gibbs Sampler \n",
+    "\n",
+    "### Michael J. Pyrcz, Professor, The University of Texas at Austin \n",
+    "\n",
+    "*Novel Data Analytics, Geostatistics and Machine Learning Subsurface Solutions*"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "2e15b023",
+   "metadata": {},
+   "source": [
+    "#### Gibbs Sampler\n",
+    "\n",
+    "I teach the Gibbs Sampler as part of my lecture on Markov chain Monte Carlo (McMC) methods. This is critical to understand solution methods Bayesian machine learning methods. See my lectures:\n",
+    "\n",
+    "* [Bayesian linear regression lecture](https://youtu.be/LzZ5b3wdZQk?si=3Uu2pvCjsl1fH5qU)\n",
+    "* [Markov chain Monte Carlo](https://youtu.be/7QX-yVboLhk?si=o7CSimpgFhjT1Vxo)\n",
+    "* [Bayesian Linear Regression Example](https://youtu.be/JG69fxKzwt8?si=ywn9xC_Pe8YQwR2f)\n",
+    "\n",
+    "Gibbs sampler is one of the most intuitive methods for McMC.\n",
+    "\n",
+    "* as usual we don't have access to the joint distribution, but we have access to the conditional distributions. \n",
+    "* instead of sampleing directly from the joint distribution (not available), we sequentially sample from the conditional distribution! \n",
+    "\n",
+    "For a bivariate example, features $X_1$ and $X_2$, we proceed as follows:\n",
+    "\n",
+    "1. Assign random values for $𝑋_1^{\\ell=0}$, $X_2^{\\ell=0}$\n",
+    "<p></p>\n",
+    "2. Sample from $𝑓(𝑋_1|X_2^{\\ell=0})$ to get $𝑋_1^{\\ell=1}$ \n",
+    "<p></p>\n",
+    "3. Sample from $𝑓(𝑋_2|X_1^{\\ell=1})$ to get $𝑋_2^{\\ell=1}$ \n",
+    "<p></p>\n",
+    "4. Repeat for the next steps / samples, $\\ell = 1,\\ldots,𝐿$\n",
+    "\n",
+    "Although we only applied the conditional distribution, the resulting samples will have the correct joint distribution.\n",
+    "\n",
+    "\\begin{equation}\n",
+    "f(X_1,X_2)\n",
+    "\\end{equation}\n",
+    "\n",
+    "We never needed to use the joint distribution, we only needed the conditionals!\n",
+    "\n",
+    "* Bayesian Linear Regression - we apply Gibbs sampler to sample the posterior distributions of the model parameters given the data.\n",
+    "\n",
+    "#### Gibbs Sampler for Bivariate Gaussian Distribution\n",
+    "\n",
+    "Below I build out an interactive Gibbs sampler to sample the bivariate joint Gaussian distribution from only the conditional distributions!\n",
+    "\n",
+    "#### Load and Configure the Required Libraries\n",
+    "\n",
+    "The following code loads the required libraries and sets a plotting default."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "da837ef7",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%matplotlib inline\n",
+    "supress_warnings = False\n",
+    "import os                                               # to set current working directory \n",
+    "import sys                                              # supress output to screen for interactive variogram modeling\n",
+    "import numpy as np                                      # arrays and matrix math\n",
+    "import pandas as pd                                     # DataFrames\n",
+    "from scipy.stats import norm                            # Gaussian PDF\n",
+    "import matplotlib.pyplot as plt                         # plotting\n",
+    "import seaborn as sns                                   # plot PDF\n",
+    "from sklearn.model_selection import train_test_split    # train and test split\n",
+    "from sklearn import tree                                # tree program from scikit learn (package for machine learning)\n",
+    "from sklearn import metrics                             # measures to check our models\n",
+    "import scipy.spatial as spatial                         #search for neighbours\n",
+    "from matplotlib.patches import Rectangle                # build a custom legend\n",
+    "from matplotlib.ticker import (MultipleLocator, AutoMinorLocator) # control of axes ticks\n",
+    "import math                                             # sqrt operator\n",
+    "from ipywidgets import interactive                      # widgets and interactivity\n",
+    "from ipywidgets import widgets                            \n",
+    "from ipywidgets import Layout\n",
+    "from ipywidgets import Label\n",
+    "from ipywidgets import VBox, HBox\n",
+    "cmap = plt.cm.inferno                                   # default color bar, no bias and friendly for color vision defeciency\n",
+    "plt.rc('axes', axisbelow=True)                          # grid behind plotting elements\n",
+    "if supress_warnings == True:\n",
+    "    import warnings                                     # supress any warnings for this demonstration\n",
+    "    warnings.filterwarnings('ignore')                  "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "2b57659e",
+   "metadata": {},
+   "source": [
+    "#### Declare Functions\n",
+    "\n",
+    "The following functions for clean code. \n",
+    "\n",
+    "* Just a improved grid for the plot."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "id": "a333fd85",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def add_grid():\n",
+    "    plt.gca().grid(True, which='major',linewidth = 1.0); plt.gca().grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.gca().tick_params(which='major',length=7); plt.gca().tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e382a4e2",
+   "metadata": {},
+   "source": [
+    "#### Interactive Gibbs Sampler to Sample the Bivariate Gausian Distribution Dashboard\n",
+    "\n",
+    "Here's a dashboard with a cool visualization for my interactive Gibbs sampler."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "id": "90be8276",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "l = widgets.Text(value='                                  Interactive Gibbs Sampler Demo, Prof. Michael Pyrcz, The University of Texas at Austin',\n",
+    "                 layout=Layout(width='750px', height='30px'))\n",
+    "\n",
+    "nsample = widgets.IntSlider(min=1, max = 101, value=10, step = 1, description = '$n_{sample}$',orientation='horizontal', \n",
+    "           style = {'description_width': 'initial'},layout=Layout(width='370px', height='30px'),continuous_update=False)\n",
+    "rho = widgets.FloatSlider(min=-1.0, max = 1.0, value=0.7, step = 0.1, description = r'$\\rho_{X_1,X_2}$',orientation='horizontal',\n",
+    "           style = {'description_width': 'initial'},layout=Layout(width='370px', height='30px'),continuous_update=False)\n",
+    "\n",
+    "ui = widgets.HBox([nsample,rho],)\n",
+    "ui2 = widgets.VBox([l,ui],)\n",
+    "\n",
+    "def run_plot(nsample,rho):\n",
+    "    mu1 = 0.0; sig1 = 1.0; mu2 = 0.0; sig2 = 1.0; seed = 73073; nc = 200\n",
+    "    \n",
+    "    L = nsample\n",
+    "    np.random.seed(seed=seed)\n",
+    "    x1 = np.zeros(L); x2 = np.zeros(L); x = np.linspace(-3,3,nc)\n",
+    "    \n",
+    "    x1[0] = np.random.rand(1) * 6.0 - 3.0; x2[0] = np.random.rand(1) * 6.0 - 3.0; \n",
+    "    \n",
+    "    plt.subplot(111)\n",
+    "    plt.scatter(x1[0],x2[0],color='grey',edgecolor='black',s=15,zorder=4)\n",
+    "    \n",
+    "    case = 0\n",
+    "    \n",
+    "    for l in range(1,L):\n",
+    "        if case == 0: # update x2\n",
+    "            x1[l] = x1[l-1]\n",
+    "            lmu = mu2 + rho * (sig2/sig1) * (x1[l] - mu1); lstd = 1 - rho**2\n",
+    "            x2[l] = np.random.normal(loc = lmu,scale = lstd,size = 1)\n",
+    "            case = 1\n",
+    "            plt.scatter(x1[l],x2[l],color='blue',edgecolor='black',s=15,alpha=1.0,zorder=100)\n",
+    "            plt.plot([x1[l-1],x1[l]],[x2[l-1],x2[l]],color='black',lw=1,alpha = max((l-(L-20))/20,0),zorder=4)\n",
+    "            plt.plot([x1[l-1],x1[l]],[x2[l-1],x2[l]],color='white',lw=3,alpha = max((l-(L-20))/20,0),zorder=3)\n",
+    "            if l == L-1:\n",
+    "                #plt.plot([x1[l],x1[l]],[-3,3],color='blue',alpha=0.7,zorder=10)\n",
+    "                pdf = norm.pdf(x, loc=lmu, scale=lstd)*0.5\n",
+    "                mask = pdf > np.percentile(pdf,q=40)\n",
+    "                plt.fill_betweenx(x[mask],x1[l]+pdf[mask],np.full(len(x[mask]),x1[l]),color='blue',alpha=0.2,zorder=2)\n",
+    "                plt.plot(x1[l]+pdf[mask],x[mask],color='blue',alpha=0.7,zorder=1)\n",
+    "                plt.arrow(x1[l-1],x2[l-1],0,x2[l]-x2[l-1],color='black',lw=0.5,head_width=0.05,length_includes_head=True,zorder=100)\n",
+    "                plt.scatter(x1[l],x2[l],color='white',edgecolor='blue',s=30,linewidth=1,alpha=1.0,zorder=100)\n",
+    "                plt.annotate(r'$f_{X_2|X_1}$ = ' + str(np.round(x1[l],2)),xy=[x1[l]+0.02,max(x[mask])-0.2],color='blue',rotation=-90)\n",
+    "        elif case == 1: # update x1\n",
+    "            x2[l] = x2[l-1]\n",
+    "            lmu = mu1 + rho * (sig1/sig2) * (x2[l] - mu2); lstd = 1 - rho**2\n",
+    "            x1[l] = np.random.normal(loc = lmu,scale = lstd,size = 1)\n",
+    "            case = 0\n",
+    "            plt.scatter(x1[l],x2[l],color='red',edgecolor='black',s=15,alpha=1.0,zorder=100)\n",
+    "            plt.plot([x1[l-1],x1[l]],[x2[l-1],x2[l]],color='black',lw=1,alpha = max((l-(L-20))/20,0),zorder=4)\n",
+    "            plt.plot([x1[l-1],x1[l]],[x2[l-1],x2[l]],color='white',lw=3,alpha = max((l-(L-20))/20,0),zorder=3)\n",
+    "            if l == L-1:\n",
+    "                #plt.plot([-3,3],[x2[l],x2[l]],color='red',alpha=0.7,zorder=10)\n",
+    "                pdf = norm.pdf(x, loc=lmu, scale=lstd)*0.5\n",
+    "                mask = pdf > np.percentile(pdf,q=40)\n",
+    "                plt.fill_between(x[mask],x2[l]+pdf[mask],np.full(len(x[mask]),x2[l]),color='red',alpha=0.2,zorder=2)\n",
+    "                plt.plot(x[mask],x2[l]+pdf[mask],color='red',alpha=0.7,zorder=1)\n",
+    "                plt.arrow(x1[l-1],x2[l-1],x1[l]-x1[l-1],0,color='black',lw=0.5,head_width=0.05,length_includes_head=True,zorder=100)\n",
+    "                plt.scatter(x1[l],x2[l],color='white',edgecolor='red',s=30,linewidth=1,alpha=1.0,zorder=100)\n",
+    "                plt.annotate(r'$f_{X_1|X_2}$ = ' + str(np.round(x2[l],2)),xy=[min(x[mask])-0.5,x2[l]+0.1],color='red')\n",
+    "    \n",
+    "    df = pd.DataFrame(np.vstack([x1,x2]).T, columns= ['x1','x2'])\n",
+    "    if L > 20:\n",
+    "        sns.kdeplot(data=df,x='x1',y='x2',color='grey',linewidths=1.0,alpha=min(((l-20)/20),1.0),levels=5,zorder=1)\n",
+    "    add_grid()\n",
+    "    plt.xlim([-3.5,3.5]); plt.ylim([-3.5,3.5]); plt.xlabel(r'$X_1$'); plt.ylabel(r'$X_2$'); plt.title('Gibbs Sampler - Bivariate Joint Gaussian Distribution')\n",
+    "    plt.subplots_adjust(left=0.0,bottom=0.0,right=1.0,top=1.1); plt.show() # set plot size \n",
+    "    \n",
+    "# connect the function to make the samples and plot to the widgets    \n",
+    "interactive_plot = widgets.interactive_output(run_plot, {'nsample':nsample,'rho':rho})\n",
+    "interactive_plot.clear_output(wait = True)               # reduce flickering by delaying plot updating  "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "faaceed1",
+   "metadata": {},
+   "source": [
+    "### Interactive Gibbs Sampler Demonstation \n",
+    "\n",
+    "#### Michael Pyrcz, Professor, The University of Texas at Austin \n",
+    "\n",
+    "Set the number of samples and correlation coefficient and observe the Gibbs sampler.\n",
+    "\n",
+    "### The Inputs\n",
+    "\n",
+    "* **$n_{sample}$** - number of samples, **$\\rho_{X_1,X_2}$** - correlation coefficient"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "id": "899c4fa6",
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "b530fb44f2a843f1960cddbbc6aef73d",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='                                  Interactive Gibbs Sampler Demo, Prof. Michael Pyr…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "e0e8fc2fdb43415697a1eaf74174806f",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output()"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "display(ui2, interactive_plot)                           # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "07eb83a5",
+   "metadata": {},
+   "source": [
+    "#### Comments\n",
+    "\n",
+    "This was a basic demonstration of the Gibbs sampler for McMC. I have many other demonstrations and even basics of working with DataFrames, ndarrays, univariate statistics, plotting data, declustering, data transformations and many other workflows available at https://github.com/GeostatsGuy/PythonNumericalDemos and https://github.com/GeostatsGuy/GeostatsPy. \n",
+    "  \n",
+    "#### The Author:\n",
+    "\n",
+    "### Michael J. Pyrcz, Professor, The University of Texas at Austin \n",
+    "*Novel Data Analytics, Geostatistics and Machine Learning Subsurface Solutions*\n",
+    "\n",
+    "With over 17 years of experience in subsurface consulting, research and development, Michael has returned to academia driven by his passion for teaching and enthusiasm for enhancing engineers' and geoscientists' impact in subsurface resource development. \n",
+    "\n",
+    "For more about Michael check out these links:\n",
+    "\n",
+    "#### [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n",
+    "\n",
+    "#### Want to Work Together?\n",
+    "\n",
+    "I hope this content is helpful to those that want to learn more about subsurface modeling, data analytics and machine learning. Students and working professionals are welcome to participate.\n",
+    "\n",
+    "* Want to invite me to visit your company for training, mentoring, project review, workflow design and / or consulting? I'd be happy to drop by and work with you! \n",
+    "\n",
+    "* Interested in partnering, supporting my graduate student research or my Subsurface Data Analytics and Machine Learning consortium (co-PIs including Profs. Foster, Torres-Verdin and van Oort)? My research combines data analytics, stochastic modeling and machine learning theory with practice to develop novel methods and workflows to add value. We are solving challenging subsurface problems!\n",
+    "\n",
+    "* I can be reached at mpyrcz@austin.utexas.edu.\n",
+    "\n",
+    "I'm always happy to discuss,\n",
+    "\n",
+    "*Michael*\n",
+    "\n",
+    "Michael Pyrcz, Ph.D., P.Eng. Professor, The Hildebrand Department of Petroleum and Geosystems Engineering, Bureau of Economic Geology, Jackson School of Geosciences, The University of Texas at Austin\n",
+    "\n",
+    "#### More Resources Available at: [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)  \n",
+    "  "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "f51344db",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
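The markdown cell in Interactive_Gibbs_Sampler.ipynb above lays out the bivariate Gibbs sampling steps. Below is a minimal standalone sketch of that procedure, assuming standard Gaussian margins (zero means, unit standard deviations), so the conditional mean is rho * x_other and the conditional standard deviation is sqrt(1 - rho^2); the function name and the seed are illustrative only and not part of the notebook.

```python
import numpy as np

def gibbs_bivariate_gaussian(rho, n_samples, seed=73073):
    """Sample a standard bivariate Gaussian with correlation rho using only the
    two conditional distributions, X1|X2=x ~ N(rho*x, 1-rho**2) and vice versa."""
    rng = np.random.default_rng(seed)
    x1, x2 = rng.uniform(-3.0, 3.0, size=2)        # step 1: random starting values
    cond_std = np.sqrt(1.0 - rho**2)               # conditional standard deviation
    samples = np.empty((n_samples, 2))
    for l in range(n_samples):
        x1 = rng.normal(rho * x2, cond_std)        # step 2: draw from f(X1 | X2)
        x2 = rng.normal(rho * x1, cond_std)        # step 3: draw from f(X2 | X1)
        samples[l] = (x1, x2)                      # step 4: repeat for l = 1, ..., L
    return samples

samples = gibbs_bivariate_gaussian(rho=0.7, n_samples=5000)
print(np.corrcoef(samples, rowvar=False)[0, 1])    # approaches 0.7 as the chain grows
```

Running it with a few thousand samples reproduces the target correlation, which is the point the notebook's dashboard makes visually.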

File diff suppressed because it is too large
+ 466 - 0
Interactive_Hypothesis_Testing.ipynb


File diff suppressed because it is too large
+ 1051 - 0
Interactive_LASSO_Regression.ipynb


File diff suppressed because it is too large
+ 1209 - 0
Interactive_Linear_Regression.ipynb


File diff suppressed because it is too large
+ 965 - 0
Interactive_Linear_Solutions.ipynb


File diff suppressed because it is too large
+ 718 - 0
Interactive_MDS_Impact_Correlation_Standardization.ipynb


File diff suppressed because it is too large
+ 804 - 0
Interactive_Marginal_Joint_Conditional_Probability.ipynb


File diff suppressed because it is too large
+ 683 - 0
Interactive_Model_Fitting.ipynb


File diff suppressed because it is too large
+ 672 - 0
Interactive_Monte_Carlo_Methods.ipynb


File diff suppressed because it is too large
+ 889 - 0
Interactive_Monte_Carlo_Simulation.ipynb


File diff suppressed because it is too large
+ 2289 - 0
Interactive_Neural_Network_Single_Layer.ipynb


File diff suppressed because it is too large
+ 374 - 0
Interactive_Norms.ipynb


File diff suppressed because it is too large
+ 3469 - 0
Interactive_Optimization.ipynb


File diff suppressed because it is too large
+ 2331 - 0
Interactive_Optimization_Working.ipynb


File diff suppressed because it is too large
+ 408 - 0
Interactive_Overfit.ipynb


File diff suppressed because it is too large
+ 1694 - 0
Interactive_PCA.ipynb


+ 427 - 0
Interactive_PCA_Eigen-Copy1.ipynb

@@ -0,0 +1,427 @@
+{
+ "cells": [
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "7fede23e",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import pandas as pd                                       # DataFrames and plotting\n",
+    "import numpy as np\n",
+    "import matplotlib.pyplot as plt                           # plotting\n",
+    "from matplotlib.colors import ListedColormap              # custom color maps\n",
+    "import matplotlib.ticker as mtick\n",
+    "from matplotlib.patches import Rectangle\n",
+    "import matplotlib as mpl\n",
+    "from mpl_toolkits.axes_grid1 import make_axes_locatable\n",
+    "from numpy.linalg import eig                              # Eigen values and Eigen vectors\n",
+    "from sklearn.decomposition import PCA                     # PCA program from scikit learn (package for machine learning)\n",
+    "from sklearn.preprocessing import StandardScaler          # normalize synthetic data\n",
+    "from ipywidgets import interactive                        # widgets and interactivity\n",
+    "from ipywidgets import widgets                            \n",
+    "from ipywidgets import Layout\n",
+    "from ipywidgets import Label\n",
+    "from ipywidgets import VBox, HBox\n",
+    "import warnings\n",
+    "warnings.filterwarnings('ignore')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "dd001700",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "12effd17",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "70e4f054",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "80e526c4",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "m = 4\n",
+    "mean = np.zeros((m))\n",
+    "#cov = np.zeros((m,m))\n",
+    "cov = np.full((m,m),0.8)\n",
+    "for i in range(0,m):\n",
+    "    cov[i,i] = 1.0\n",
+    "cov[2,3] = cov[3,2] = 0.2; cov[1,3] = cov[3,1] = -0.2; cov[2,0] = cov[0,2] = 0.4; cov[1,0] = cov[0,1] = -0.5\n",
+    "\n",
+    "data = np.random.multivariate_normal(mean = mean, cov = cov, size = 1000)\n",
+    "data = StandardScaler(copy=True, with_mean=True, with_std=True).fit(data).transform(data)\n",
+    "\n",
+    "df = pd.DataFrame(data, columns=np.array(['' + str(i) for i in range(0, m)]))\n",
+    "\n",
+    "plt.subplot(121)                                        # plot correlation matrix with significance colormap\n",
+    "sns.heatmap(cov,vmin = -1.0, vmax = 1.0,linewidths=.5, fmt= '.1f',cmap = plt.cm.Spectral_r)\n",
+    "plt.title('Target Covariance Matrix')\n",
+    "\n",
+    "plt.subplot(122)\n",
+    "sns.heatmap(df.iloc[:,:].corr(),vmin = -1.0, vmax = 1.0,linewidths=.5, fmt= '.1f',cmap = plt.cm.Spectral_r)\n",
+    "plt.title('Actual Covariance Matrix')\n",
+    "\n",
+    "plt.subplots_adjust(left=0.0, bottom=0.0, right=2.0, top=1.1, wspace=0.2, hspace=0.2); plt.show()\n",
+    "\n",
+    "df.head()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "26aa283f",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "df.iloc[:,:].corr()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "ea5b2515",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "df.describe()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "83b1e382",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "nsample = 100\n",
+    "dpalette = sns.color_palette(\"rocket_r\",n_colors = 3)   # matrix scatter plot with points and density estimator\n",
+    "palette = sns.color_palette(\"rocket\")\n",
+    "matrixplot = sns.pairplot(df.sample(n=nsample),diag_kind = 'kde',palette = dpalette,diag_kws={'edgecolor':'black'},plot_kws=dict(s=50, edgecolor=\"black\", linewidth=0.5,alpha=0.2))\n",
+    "matrixplot.map_lower(sns.kdeplot, levels=3, color=\"black\")\n",
+    "plt.subplots_adjust(left=0.0, bottom=0.0, right=0.5, top=0.6, wspace=0.2, hspace=0.3); plt.show()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "58970b63",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "cov_actual = np.cov(data,rowvar = False)\n",
+    "eigen_values,eigen_vectors = eig(cov_actual)\n",
+    "sorted_indices = np.argsort(-eigen_values)\n",
+    "sorted_eigen_vectors = eigen_vectors[:, sorted_indices]\n",
+    "sorted_eigen_values = np.sort(-eigen_values)*-1\n",
+    "\n",
+    "plt.subplot(121)\n",
+    "plt.plot(np.arange(1,5,1),np.cumsum(sorted_eigen_values)/np.sum(sorted_eigen_values)*100,color='darkorange',alpha=0.8)\n",
+    "plt.scatter(np.arange(1,5,1),np.cumsum(sorted_eigen_values)/np.sum(sorted_eigen_values)*100,color='darkorange',alpha=0.8,edgecolor='black')\n",
+    "plt.plot([1,4],[95,95], color='black',linestyle='dashed')\n",
+    "plt.xlabel('Principal Component'); plt.ylabel('Cumulative Variance Explained'); plt.title('Cumulative Variance Explained by Principal Component')\n",
+    "fmt = '%.0f%%' # Format you want the ticks, e.g. '40%'\n",
+    "yticks = mtick.FormatStrFormatter(fmt); \n",
+    "plt.xlim(1,4); plt.ylim(0,100.0); plt.annotate('95% variance explained',[3.0,90])\n",
+    "plt.gca().yaxis.set_major_formatter(yticks)\n",
+    "\n",
+    "plt.subplot(122)\n",
+    "im = plt.imshow(sorted_eigen_vectors,cmap = plt.cm.Spectral_r)\n",
+    "plt.title('Actual Covariance Matrix')\n",
+    "cbar = plt.colorbar(\n",
+    "    im, orientation=\"vertical\", ticks=np.linspace(-1, 1, 10)\n",
+    ")\n",
+    "cbar.set_label('Component Loadings', rotation=270, labelpad=20)\n",
+    "plt.xlim([-0.5,3.5]); plt.ylim([-0.5,3.5])\n",
+    "plt.gca().set_xticks([0,1, 2, 3],[1,2,3,4]); plt.gca().set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "for x in np.arange(0.5,4.5,1.0):\n",
+    "    plt.plot([x,x],[-0.5,3.5],c='black',lw=3)\n",
+    "    plt.plot([-0.5,3.5],[x,x],c='black',lw=1,ls='--')\n",
+    "\n",
+    "plt.subplots_adjust(left=0.0, bottom=0.0, right=2.01, top=0.9, wspace=0.2, hspace=0.3); plt.show()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "5142ac6d",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "fig = plt.figure(figsize=(6, 6))\n",
+    "gs = fig.add_gridspec(2,2 ,width_ratios=(1.0, 1.0))\n",
+    "\n",
+    "plt_center = fig.add_subplot(gs[0, 1])\n",
+    "plt_x = fig.add_subplot(gs[0, 0],sharey=plt_center) \n",
+    "plt_y = fig.add_subplot(gs[1, 1],sharex=plt_center) \n",
+    "plt_extra = fig.add_subplot(gs[1, 0]) \n",
+    "\n",
+    "# im = plt_center.imshow(sorted_eigen_vectors,cmap = plt.cm.Spectral_r)\n",
+    "# plt_center.set_title('Actual Covariance Matrix')\n",
+    "# cbar = plt.colorbar(\n",
+    "#     im, orientation=\"vertical\", ticks=np.linspace(-1, 1, 10)\n",
+    "# )\n",
+    "# cbar.set_label('Component Loadings', rotation=270, labelpad=20)\n",
+    "\n",
+    "for i in range(0,m):\n",
+    "    for j in range(0,m):\n",
+    "        color = (sorted_eigen_vectors[j,i] + 1.0)/(2.0)\n",
+    "        plt_center.add_patch(Rectangle((i-0.5,j-0.5), 1, 1,color = plt.cm.RdGy_r(color),fill=True))\n",
+    "\n",
+    "plt_center.set_xlim([-0.5,3.5]); plt_center.set_ylim([-0.5,3.5])\n",
+    "plt_center.set_xticks([0,1, 2, 3],[1,2,3,4]); plt_center.set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "for x in np.arange(0.5,3.5,1.0):\n",
+    "    plt_center.plot([x,x],[-0.5,3.5],c='black',lw=3)\n",
+    "    plt_center.plot([-0.5,3.5],[x,x],c='black',lw=1,ls='--')\n",
+    "plt_center.set_title('Eigen Vectors / Principal Component Loadings')  \n",
+    "\n",
+    "plt_x.barh(y=np.array([0,1,2,3],dtype='float'),width=np.var(data,axis=0),color='darkorange',edgecolor='black')\n",
+    "plt_x.set_xlim([1.2,0]); plt_x.set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "plt_x.set_ylabel('Feature'); plt_x.set_xlabel('Variance')\n",
+    "plt_x.set_title('Original Feature Variance') \n",
+    "\n",
+    "plt_y.bar(x=np.array([0,1,2,3],dtype='float'),height=sorted_eigen_values,color='darkorange',edgecolor='black')\n",
+    "plt_y.set_ylim([2.5,0]); plt_y.set_xticks([0,1, 2, 3],[1,2,3,4])\n",
+    "plt_y.set_xlabel('Feature'); plt_y.set_ylabel('Variance')\n",
+    "plt_y.set_title('Projected Feature Variance')  \n",
+    "\n",
+    "for i in range(0,m):\n",
+    "    for j in range(0,m):\n",
+    "        color = (cov_actual[j,i] + 1.0)/(2.0)\n",
+    "        plt_extra.add_patch(Rectangle((i-0.5,j-0.5), 1, 1,color = plt.cm.bwr(color),fill=True))\n",
+    "\n",
+    "plt_extra.set_xlim([-0.5,3.5]); plt_extra.set_ylim([-0.5,3.5])\n",
+    "plt_extra.set_xticks([0,1, 2, 3],[1,2,3,4]); plt_extra.set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "for x in np.arange(0.5,3.5,1.0):\n",
+    "    plt_extra.plot([x,x],[-0.5,3.5],c='black',lw=3)\n",
+    "    plt_extra.plot([-0.5,3.5],[x,x],c='black',lw=1,ls='--')\n",
+    "plt_extra.set_title('Eigen Vectors / Principal Component Loadings')  \n",
+    "\n",
+    "plt.subplots_adjust(left=0.0, bottom=0.0, right=1.51, top=1.50, wspace=0.2, hspace=0.2); plt.show()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "36698d45",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "\n",
+    "\n",
+    "\n",
+    "m = 4; cstr = 0.0\n",
+    "\n",
+    "mean = np.zeros((m))                         # make inputs for multivariate dataset\n",
+    "#cov = np.zeros((m,m))\n",
+    "cov = np.full((m,m),0.8*cstr)\n",
+    "for i in range(0,m):\n",
+    "    cov[i,i] = 1.0\n",
+    "cov[2,3] = cov[3,2] = 0.2*cstr; cov[1,3] = cov[3,1] = -0.2*cstr; cov[2,0] = cov[0,2] = 0.4*cstr; \n",
+    "cov[1,0] = cov[0,1] = -0.5*cstr\n",
+    "\n",
+    "data = np.random.multivariate_normal(mean = mean, cov = cov, size = 1000) # draw samples from MV Gaussian\n",
+    "data = StandardScaler(copy=True, with_mean=True, with_std=True).fit(data).transform(data)\n",
+    "\n",
+    "cov_actual = np.cov(data,rowvar = False)\n",
+    "\n",
+    "eigen_values,eigen_vectors = eig(cov_actual) # Eigen values and vectors \n",
+    "sorted_indices = np.argsort(-eigen_values)\n",
+    "sorted_eigen_vectors = eigen_vectors[:, sorted_indices]\n",
+    "sorted_eigen_values = np.sort(-eigen_values)*-1\n",
+    "\n",
+    "fig = plt.figure(figsize=(6, 6))\n",
+    "gs = fig.add_gridspec(2,2 ,width_ratios=(1.0, 1.0))\n",
+    "\n",
+    "plt_center = fig.add_subplot(gs[1, 1])\n",
+    "plt_x = fig.add_subplot(gs[1, 0],sharey=plt_center) \n",
+    "plt_y = fig.add_subplot(gs[0, 1],sharex=plt_center) \n",
+    "plt_extra = fig.add_subplot(gs[0, 0]) \n",
+    "\n",
+    "for i in range(0,m):\n",
+    "    for j in range(0,m):\n",
+    "        color = (sorted_eigen_vectors[j,i] + 1.0)/(2.0)\n",
+    "        plt_center.add_patch(Rectangle((i-0.5,j-0.5), 1, 1,color = plt.cm.RdGy_r(color),fill=True))\n",
+    "        plt_center.annotate(np.round(sorted_eigen_vectors[j,i],1),(i-0.1,j-0.05))\n",
+    "\n",
+    "plt_center.set_xlim([-0.5,3.5]); plt_center.set_ylim([-0.5,3.5])\n",
+    "plt_center.set_xticks([0,1, 2, 3],[1,2,3,4]); plt_center.set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "for x in np.arange(0.5,3.5,1.0):\n",
+    "    plt_center.plot([x,x],[-0.5,3.5],c='black',lw=3)\n",
+    "    plt_center.plot([-0.5,3.5],[x,x],c='black',lw=1,ls='--')\n",
+    "plt_center.set_title('Eigen Vectors / Principal Component Loadings')  \n",
+    "plt_center.set_xlabel('Eigen Vector'); plt_center.set_ylabel('Feature')\n",
+    "\n",
+    "plt_x.barh(y=np.array([0,1,2,3],dtype='float'),width=np.var(data,axis=0),color='darkorange',edgecolor='black')\n",
+    "plt_x.set_xlim([1.2,0]); plt_x.set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "plt_x.set_ylabel('Feature'); plt_x.set_xlabel('Variance')\n",
+    "plt_x.set_title('Original Feature Variance') \n",
+    "\n",
+    "plt_y.bar(x=np.array([0,1,2,3],dtype='float'),height=sorted_eigen_values,color='darkorange',edgecolor='black')\n",
+    "plt_y.set_ylim([0,2.5]); plt_y.set_xticks([0,1, 2, 3],[1,2,3,4])\n",
+    "plt_y.set_xlabel('Eigen Value'); plt_y.set_ylabel('Variance')\n",
+    "plt_y.set_title('Sorted, Projected Feature Variance')  \n",
+    "\n",
+    "for i in range(0,m):\n",
+    "    for j in range(0,m):\n",
+    "        color = (cov_actual[j,i] + 1.0)/(2.0)\n",
+    "        plt_extra.add_patch(Rectangle((i-0.5,j-0.5), 1, 1,color = plt.cm.BrBG(color),fill=True))\n",
+    "\n",
+    "plt_extra.set_xlim([-0.5,3.5]); plt_extra.set_ylim([3.5,-0.5])\n",
+    "plt_extra.set_xticks([0,1, 2, 3],[1,2,3,4]); plt_extra.set_yticks([0,1, 2, 3],[1,2,3,4])\n",
+    "for x in np.arange(0.5,3.5,1.0):\n",
+    "    plt_extra.plot([x,x],[-0.5,3.5],c='black',lw=2)\n",
+    "    plt_extra.plot([-0.5,3.5],[x,x],c='black',lw=2)\n",
+    "plt_extra.set_title('Original Covariance Matrix')  \n",
+    " \n",
+    "cplt_extra = make_axes_locatable(plt_extra).append_axes('left', size='5%', pad=0.3)\n",
+    "fig.colorbar(mpl.cm.ScalarMappable(norm=mpl.colors.Normalize(vmin=-1.0, vmax=1.0), cmap=plt.cm.BrBG),\n",
+    "             cax=cplt_extra, orientation='vertical')\n",
+    "cplt_extra.yaxis.set_ticks_position('left')\n",
+    "\n",
+    "plt.subplots_adjust(left=0.0, bottom=0.0, right=1.51, top=1.50, wspace=0.2, hspace=0.2); plt.show()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "26c81bfe",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "cov"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "dfd37f62",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "eigen_vectors_sorted[3][3]"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8ffc30bb",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "sorted_eigen_vectors"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "ed00b8a9",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "n_components = 4\n",
+    "pca = PCA(n_components=n_components,)\n",
+    "pca.fit(data)\n",
+    "print(np.round(pca.components_,3))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a6c58c46",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "pca.explained_variance_"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "39565068",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "eigen_values"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4fdb342b",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "sorted_indices = np.argsort(-eigen_values)\n",
+    "sorted_eigen_vectors = eigen_vectors[:, sorted_indices]\n",
+    "sorted_eigen_vectors"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "4bd12717",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "eigen_vectors"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "df84d9b7",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "sorted_indices"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "8b72f080",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
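Interactive_PCA_Eigen-Copy1.ipynb above obtains principal components by eigen-decomposition of the sample covariance matrix and cross-checks them against scikit-learn's PCA. A condensed sketch of that check follows, using numpy.linalg.eigh (appropriate for a symmetric covariance) and a hypothetical positive-definite toy covariance rather than the notebook's matrix.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(73073)
m = 4
cov = 0.6 ** np.abs(np.subtract.outer(np.arange(m), np.arange(m)))  # AR(1)-style, positive-definite toy covariance
data = rng.multivariate_normal(mean=np.zeros(m), cov=cov, size=1000)

# eigen-decomposition of the sample covariance, sorted by descending eigenvalue
eigen_values, eigen_vectors = np.linalg.eigh(np.cov(data, rowvar=False))
order = np.argsort(-eigen_values)
sorted_eigen_values = eigen_values[order]          # projected (principal component) variances
sorted_eigen_vectors = eigen_vectors[:, order]     # columns are the component loadings

# scikit-learn PCA recovers the same variances, and the same loadings up to sign
pca = PCA(n_components=m).fit(data)
print(np.round(sorted_eigen_values, 3))
print(np.round(pca.explained_variance_, 3))
```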

File diff suppressed because it is too large
+ 348 - 0
Interactive_PCA_Eigen.ipynb


+ 329 - 0
Interactive_PP_Plot.ipynb

@@ -0,0 +1,329 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "76df9c91",
+   "metadata": {},
+   "source": [
+    "<p align=\"center\">\n",
+    "    <img src=\"https://github.com/GeostatsGuy/GeostatsPy/blob/master/TCG_color_logo.png?raw=true\" width=\"220\" height=\"240\" />\n",
+    "\n",
+    "</p>\n",
+    "\n",
+    "## P-P Plot Interactive Demonstration\n",
+    "\n",
+    "### P-P (Probability-Probablity) Plots in Python \n",
+    "\n",
+    "Interactive demonstration of P-P plots to compare two distributions, cumulative distribution functions. \n",
+    "\n",
+    "* P-P plots map data values between two distributions, and then scatter plot the cumulative probability values.\n",
+    "\n",
+    "* A lecture that covers these concepts is available [Q-Q plots and P-P plots](https://www.youtube.com/watch?v=RETZus4XBNM&list=PLG19vXLQHvSB-D4XKYieEku9GQMQyAzjJ&index=23&t=4s).\n",
+    "\n",
+    "This interactive dashboard may be applied to support teaching data science.\n",
+    "\n",
+    "#### Jason Bott, Undergraduate Student, The University of Texas at Austin\n",
+    "\n",
+    "####  [GitHub](https://github.com/jasonbott124) | [GoogleScholar](https://scholar.google.com/citations?user=31Ae8UkAAAAJ&hl=en) | [LinkedIn](https://www.linkedin.com/in/jason-bott-a52944270/) | [Eportfolio](https://jasonseportfolio5.wordpress.com/) | Email: jbott@utexas.edu\n",
+    "\n",
+    "#### Michael Pyrcz, Professor, The University of Texas at Austin \n",
+    "\n",
+    "##### [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "96b3cd54",
+   "metadata": {},
+   "source": [
+    "#### Importing Packages\n",
+    "\n",
+    "We will need some standard packages. These should have been installed with Anaconda 3."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "7fb9fccd",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%matplotlib inline\n",
+    "from ipywidgets import interactive        # widgets and interactivity\n",
+    "from ipywidgets import widgets                            \n",
+    "from ipywidgets import Layout\n",
+    "from ipywidgets import Label\n",
+    "from ipywidgets import VBox, HBox\n",
+    "\n",
+    "import numpy as np                        # ndarrays for gridded data\n",
+    "import pandas as pd                       # DataFrames for tabular data\n",
+    "from scipy import stats                   # inverse percentiles, percentileofscore function for P-P plots\n",
+    "import os                                 # set working directory, run executables\n",
+    "\n",
+    "import matplotlib.pyplot as plt           # plotting\n",
+    "import matplotlib.gridspec as gridspec\n",
+    "plt.rc('axes', axisbelow=True)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "82a0811b",
+   "metadata": {},
+   "source": [
+    "#### Widgets and Display\n",
+    "Next, we need to create our widgets and format the overall display"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "id": "82630901",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# interactive calculation of the sample set (control of source parametric distribution and number of samples)\n",
+    "l = widgets.Text(value='           Interactive P-P Plot | Jason Bott, Undergraduate Student, the University of Texas at Austin | Michael Pyrcz, Professor, The University of Texas at Austin',layout=Layout(width='950px', height='30px'),continuous_update=True)\n",
+    "\n",
+    "n1 = widgets.IntSlider(min=0, max = 1000, value = 100, step = 10, description = '$n_{1}$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "n1.style.handle_color = 'red'\n",
+    "\n",
+    "m1 = widgets.FloatSlider(min=0.2, max = 0.8, value = 0.3, step = 0.1, description = '$\\overline{x}_{1}$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "m1.style.handle_color = 'red'\n",
+    "\n",
+    "s1 = widgets.FloatSlider(min=0.0, max = 0.2, value = 0.03, step = 0.005, description = '$s_1$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "s1.style.handle_color = 'red'\n",
+    "\n",
+    "ui1 = widgets.VBox([n1,m1,s1],)                               # basic widget formatting \n",
+    "\n",
+    "n2 = widgets.IntSlider(min=0, max = 1000, value = 100, step = 10, description = '$n_{2}$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "n2.style.handle_color = 'blue'\n",
+    "\n",
+    "m2 = widgets.FloatSlider(min=0.2, max = 0.8, value = 0.2, step = 0.1, description = '$\\overline{x}_{2}$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "m2.style.handle_color = 'blue'\n",
+    "\n",
+    "s2 = widgets.FloatSlider(min=0, max = 0.2, value = 0.03, step = 0.005, description = '$s_2$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "s2.style.handle_color = 'blue'\n",
+    "\n",
+    "ui2 = widgets.VBox([n2,m2,s2],)                               # basic widget formatting \n",
+    "\n",
+    "nq = widgets.IntSlider(min=10, max = 1000, value = 100, step = 1, description = '$n_q$',orientation='horizontal',layout=Layout(width='300px', height='30px'),continuous_update=False)\n",
+    "nq.style.handle_color = 'gray'\n",
+    "\n",
+    "# plot = widgets.Checkbox(value=False,description='Make Plot')\n",
+    "\n",
+    "ui3 = widgets.VBox([nq,],)                                # basic widget formatting \n",
+    "\n",
+    "ui4 = widgets.HBox([ui1,ui2,ui3],)                               # basic widget formatting \n",
+    "\n",
+    "ui2 = widgets.VBox([l,ui4],)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "0bacdefc",
+   "metadata": {},
+   "source": [
+    "#### P-P plot Function\n",
+    "\n",
+    "We create a function that calculates and matches the values from both data distributions. And plots the cumulative probilities."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "id": "27c8eadb",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#function to take parameters, make sample and PP plot\n",
+    "def double_p(n1, m1, s1, n2, m2, s2, nq):\n",
+    "    \n",
+    "#     n1 = 100; mean1 = 0.35; stdev1 = 0.06 \n",
+    "#     n2 = 50; mean2 = 0.3; stdev2 = 0.05\n",
+    "    \n",
+    "    seed = 73073; #nq = 100\n",
+    "    xmin=0.0; xmax=0.6\n",
+    "    np.random.seed(seed=seed)\n",
+    "    \n",
+    "    X1 = np.random.normal(loc=m1,scale=s1,size=n1)\n",
+    "    X2 = np.random.normal(loc=m2,scale=s2,size=n2)\n",
+    "    \n",
+    "    min_X = min(X1.min(),X2.min())\n",
+    "    max_X = max(X1.max(),X2.max())\n",
+    "    \n",
+    "    X_values = np.linspace(min_X,max_X,nq)\n",
+    "\n",
+    "    X1_cumul_probs = []; X2_cumul_probs = []\n",
+    "\n",
+    "    for X in X_values:\n",
+    "        X1_cumul_probs.append(stats.percentileofscore(X1,X)/100)\n",
+    "        X2_cumul_probs.append(stats.percentileofscore(X2,X)/100)\n",
+    "    \n",
+    "    X1_cumul_probs = np.asarray(X1_cumul_probs); X2_cumul_probs = np.asarray(X2_cumul_probs)\n",
+    "    fig = plt.figure()\n",
+    "    spec = fig.add_gridspec(2, 3)\n",
+    "\n",
+    "    #P-P plot\n",
+    "    ax0 = fig.add_subplot(spec[:, 1:])\n",
+    "    plt.scatter(X1_cumul_probs,X2_cumul_probs,color='darkorange',edgecolor='black',s=20,label='P-P plot')\n",
+    "    plt.plot([0,1.0],[0,1.0],ls='--',color='red')\n",
+    "    plt.grid(); plt.xlim([0.0,1.0]); plt.ylim([0.0,1.0]); plt.xlabel(r'$F^{-1}_{X_1}(x)$ - Cumulative Probability'); plt.ylabel(r'$F^{-1}_{X_2}(x)$ - Cumulative Probability'); \n",
+    "    plt.title('P-P Plot'); plt.legend(loc='lower right')\n",
+    "\n",
+    "    #Histogram\n",
+    "    ax10 = fig.add_subplot(spec[0, 0])\n",
+    "    plt.hist(X1,bins=np.linspace(xmin,xmax,30),color='red',alpha=0.5,edgecolor='black',label=r'$X_1$',density=True)\n",
+    "    plt.hist(X2,bins=np.linspace(xmin,xmax,30),color='yellow',alpha=0.5,edgecolor='black',label=r'$X_2$',density=True)\n",
+    "    plt.grid(); plt.xlim([xmin,xmax]); plt.ylim([0,15]); plt.xlabel('Porosity (fraction)'); plt.ylabel('Density')\n",
+    "    plt.title('Histograms'); plt.legend(loc='upper right')\n",
+    "    \n",
+    "    #CDF\n",
+    "    ax11 = fig.add_subplot(spec[1, 0])\n",
+    "    plt.scatter(np.sort(X1),np.linspace(0,1,len(X1)),color='red',alpha=0.5,edgecolor='black',s=30,label=r'$X_1$')\n",
+    "    plt.scatter(np.sort(X2),np.linspace(0,1,len(X2)),color='yellow',alpha=0.5,edgecolor='black',s=30,label=r'$X_2$')\n",
+    "    plt.grid(); plt.xlim([xmin,xmax]); plt.ylim([0,1]); plt.xlabel('Porosity (fraction)'); plt.title('CDFs'); plt.legend(loc='lower right')\n",
+    "\n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=1.5, top=1.4, wspace=0.3, hspace=0.3); plt.show()\n",
+    "\n",
+    "interactive_plot = widgets.interactive_output(double_p, {'n1': n1, 'm1': m1, 's1': s1, 'n2': n2, 'm2': m2, 's2': s2, 'nq': nq}) #creates an object called interactive_plot that calls the double_p() function\n",
+    "interactive_plot.clear_output(wait = True) #reduce flickering by delaying plot updating"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "63fd9c7b",
+   "metadata": {},
+   "source": [
+    "### P-P Plot for Comparing Distributions\n",
+    "\n",
+    "* demonstration of P-P plots to compare distributions, while interactively varying the distributions\n",
+    "\n",
+    "#### Jason Bott, Undergraduate Student and Michael Pyrcz, Professor, The University of Texas at Austin \n",
+    "\n",
+    "Let's make 2 random datasets, $\\color{red}{X_1}$ and $\\color{blue}{X_2}$ and calculate their P-P plot.\n",
+    "\n",
+    "* **$\\color{red}{n_1}$**, **$\\color{blue}{n_2}$** number of samples, **$\\color{red}{\\overline{x}_1}$**, **$\\color{blue}{\\overline{x}_2}$** means and **$\\color{red}{s_1}$**, **$\\color{blue}s_2$** standard deviation of the 2 sample sets\n",
+    "* **$\\color{grey}{n_q}$**: number of regular bins over the range of values"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "id": "2ef49ab5",
+   "metadata": {
+    "scrolled": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "b1b0656181e54c11b9eab66506771cce",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='           Interactive P-P Plot | Jason Bott, Undergraduate Student, the University…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "5724e2c78fa74728a3bdc66004562464",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': '<Figure size 432x288 with 3 Axes>', 'i…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "display(ui2, interactive_plot) #displays the widgets and plots"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "a0ba1653",
+   "metadata": {},
+   "source": [
+    "#### Comments\n",
+    "\n",
+    "This was a basic interactive demonstration of a P-P plot in Python.\n",
+    "\n",
+    "#### The Authors:\n",
+    "\n",
+    "### Jason Bott, Undergraduate Student, University of Texas at Austin\n",
+    "*Geostatistics, Geophysics, Polar Geophysics, Volcanism, Subglacial Volcanism, Exploration Geophysics*\n",
+    "\n",
+    "Just a passionate student who enjoys all things geoscience. If you would like to contact me, I can be reached through email at: jbott@utexas.edu\n",
+    "\n",
+    "For more about Jason check out these links:\n",
+    "\n",
+    "####  [GitHub](https://github.com/jasonbott124) | [GoogleScholar](https://scholar.google.com/citations?user=31Ae8UkAAAAJ&hl=en) | [LinkedIn](https://www.linkedin.com/in/jason-bott-a52944270/) | [Eportfolio](https://jasonseportfolio5.wordpress.com/) \n",
+    "\n",
+    "\n",
+    "### Michael Pyrcz, Associate Professor, University of Texas at Austin \n",
+    "*Novel Data Analytics, Geostatistics and Machine Learning Subsurface Solutions*\n",
+    "\n",
+    "With over 17 years of experience in subsurface consulting, research and development, Michael has returned to academia driven by his passion for teaching and enthusiasm for enhancing engineers' and geoscientists' impact in subsurface resource development. \n",
+    "\n",
+    "For more about Michael check out these links:\n",
+    "\n",
+    "#### [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n",
+    "\n",
+    "#### Want to Work With Michael?\n",
+    "\n",
+    "I hope this content is helpful to those that want to learn more about subsurface modeling, data analytics and machine learning. Students and working professionals are welcome to participate.\n",
+    "\n",
+    "* Want to invite me to visit your company for training, mentoring, project review, workflow design and / or consulting? I'd be happy to drop by and work with you! \n",
+    "\n",
+    "* Interested in partnering, supporting my graduate student research or my Subsurface Data Analytics and Machine Learning consortium (co-PIs including Profs. Foster, Torres-Verdin and van Oort)? My research combines data analytics, stochastic modeling and machine learning theory with practice to develop novel methods and workflows to add value. We are solving challenging subsurface problems!\n",
+    "\n",
+    "* I can be reached at mpyrcz@austin.utexas.edu.\n",
+    "\n",
+    "I'm always happy to discuss,\n",
+    "\n",
+    "*Michael*\n",
+    "\n",
+    "Michael Pyrcz, Ph.D., P.Eng. Associate Professor The Hildebrand Department of Petroleum and Geosystems Engineering, Bureau of Economic Geology, The Jackson School of Geosciences, The University of Texas at Austin\n",
+    "\n",
+    "#### More Resources Available at: [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "a1d28d92",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
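Interactive_PP_Plot.ipynb above builds the P-P plot from scipy.stats.percentileofscore. Below is a minimal non-interactive sketch of the same construction, with fixed, illustrative sample parameters in place of the notebook's widgets.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(73073)
x1 = rng.normal(loc=0.3, scale=0.03, size=100)            # sample set 1
x2 = rng.normal(loc=0.2, scale=0.03, size=100)            # sample set 2

# evaluate both empirical CDFs at the same regularly spaced values
values = np.linspace(min(x1.min(), x2.min()), max(x1.max(), x2.max()), 100)
p1 = np.array([stats.percentileofscore(x1, v) / 100 for v in values])   # F_X1(v)
p2 = np.array([stats.percentileofscore(x2, v) / 100 for v in values])   # F_X2(v)

plt.scatter(p1, p2, color='darkorange', edgecolor='black', s=20)
plt.plot([0, 1], [0, 1], ls='--', color='red')            # 45 degree line: identical distributions
plt.xlabel('$F_{X_1}(x)$ - Cumulative Probability'); plt.ylabel('$F_{X_2}(x)$ - Cumulative Probability')
plt.title('P-P Plot'); plt.show()
```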

+ 615 - 0
Interactive_Parametric_Distributions.ipynb

@@ -0,0 +1,615 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "collapsed": true
+   },
+   "source": [
+    "<p align=\"center\">\n",
+    "    <img src=\"https://github.com/GeostatsGuy/GeostatsPy/blob/master/TCG_color_logo.png?raw=true\" width=\"220\" height=\"240\" />\n",
+    "\n",
+    "</p>\n",
+    "\n",
+    "## Data Analytics \n",
+    "\n",
+    "### Interactive Parametric Distributions  in Python \n",
+    "\n",
+    "\n",
+    "#### Michael Pyrcz, Professor, The University of Texas at Austin \n",
+    "\n",
+    "##### [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Here's a demonstration of making and general use of parametric distributions in Python. This demonstration is part of the resources that I include for my courses in Spatial / Subsurface Data Analytics at the Cockrell School of Engineering at the University of Texas at Austin.  \n",
+    "\n",
+    "#### Parametric Distributions\n",
+    "\n",
+    "I provide an set of interactives with thses parametric distributions:\n",
+    "\n",
+    "* Uniform\n",
+    "* Triangular\n",
+    "* Binomial\n",
+    "* Poisson\n",
+    "* Gaussian\n",
+    "* Lognormal\n",
+    "\n",
+    "Change the distribution parameters and watch the probability density functions and cumulative distribution functions changes.\n",
+    "\n",
+    "I have a lecture on these parametric distributions available on [YouTube](https://www.youtube.com/watch?v=U7fGsqCLPHU&t=1687s).   \n",
+    "\n",
+    "#### Getting Started\n",
+    "\n",
+    "Here's the steps to get setup in Python with the GeostatsPy package:\n",
+    "\n",
+    "1. Install Anaconda 3 on your machine (https://www.anaconda.com/download/). \n",
+    "2. From Anaconda Navigator (within Anaconda3 group), go to the environment tab, click on base (root) green arrow and open a terminal. \n",
+    "3. In the terminal type: pip install geostatspy. \n",
+    "4. Open Jupyter and in the top block get started by copy and pasting the code block below from this Jupyter Notebook to start using the geostatspy functionality. \n",
+    "\n",
+    "#### Importing Packages\n",
+    "\n",
+    "We will need some standard packages. These should have been installed with Anaconda 3."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import numpy as np                        # ndarrys for gridded data\n",
+    "import pandas as pd                       # DataFrames for tabular data\n",
+    "import os                                 # set working directory, run executables\n",
+    "import matplotlib.pyplot as plt           # for plotting\n",
+    "from matplotlib.ticker import (MultipleLocator, AutoMinorLocator) # control of axes ticks\n",
+    "plt.rc('axes', axisbelow=True)            # set axes and grids in the background for all plots\n",
+    "from scipy import stats                   # summary statistics\n",
+    "import math                               # trigonometry etc.\n",
+    "import scipy.signal as signal             # kernel for moving window calculation\n",
+    "import random                             # for randon numbers\n",
+    "import seaborn as sns                     # for matrix scatter plots\n",
+    "from scipy import linalg                  # for linear regression\n",
+    "from sklearn import preprocessing\n",
+    "import geostatspy.GSLIB as GSLIB\n",
+    "from ipywidgets import interactive        # widgets and interactivity\n",
+    "from ipywidgets import widgets                            \n",
+    "from ipywidgets import Layout\n",
+    "from ipywidgets import Label\n",
+    "from ipywidgets import VBox, HBox"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Uniform Distribution\n",
+    "\n",
+    "We start with the uniform distribution.\n",
+    "\n",
+    "* a continuous, maximum uncertainty distribution, with constant density between the minimum and maximum values."
+   ]
+  },
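+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For reference, the uniform density is constant over the interval, $f_x(x) = \\frac{1}{x_{max} - x_{min}}$ for $x_{min} \\le x \\le x_{max}$, and zero elsewhere. Note that SciPy's `stats.uniform` is parameterized with `loc` = minimum and `scale` = maximum - minimum, as used in the interactive cell below. A quick sanity check (the specific numbers are just an example):\n",
+    "\n",
+    "```python\n",
+    "from scipy import stats\n",
+    "\n",
+    "# uniform on [10, 30]: loc = min, scale = max - min\n",
+    "print(stats.uniform.pdf(20.0, loc=10.0, scale=20.0))   # 1/(30-10) = 0.05\n",
+    "print(stats.uniform.cdf(20.0, loc=10.0, scale=20.0))   # (20-10)/(30-10) = 0.5\n",
+    "```"
+   ]
+  },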
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "e075b713d22a4c2cab7fd2d5d23e8f08",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='        Uniform Parametric Distribution Demonstration, Michael Pyrcz, Associate Pro…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "d9949455235c45638ff203791b31740a",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output()"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "# interactive calculation of the random sample set (control of source parametric distribution and number of samples)\n",
+    "l = widgets.Text(value='        Uniform Parametric Distribution Demonstration, Michael Pyrcz, Associate Professor, The University of Texas at Austin',layout=Layout(width='950px', height='30px'))\n",
+    "\n",
+    "zmin = widgets.FloatSlider(min=0.0, max = 100.0, value = 10.0, description = 'Min',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zmin.style.handle_color = 'red'\n",
+    "zmax = widgets.FloatSlider(min=0.0, max = 100.0, value = 30.0, description = 'Max',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zmax.style.handle_color = 'red'\n",
+    "\n",
+    "ui1 = widgets.HBox([zmin,zmax],kwargs = {'justify_content':'center'}) \n",
+    "ui = widgets.VBox([l,ui1],kwargs = {'justify_content':'center'})\n",
+    "\n",
+    "def f_make(zmin,zmax):\n",
+    "    xvals = np.linspace(0.0,100.0,1000)\n",
+    "    pdf = stats.uniform.pdf(xvals, loc = zmin, scale = zmax-zmin)\n",
+    "    cdf = stats.uniform.cdf(xvals, loc = zmin, scale = zmax-zmin)\n",
+    "\n",
+    "    plt.subplot(121)\n",
+    "    plt.plot(xvals,pdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.fill_between(xvals, 0, pdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot([0,100],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.gca().set_ylim(bottom=0.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Density\"); plt.title(\"Uniform Probability Density Function, $f_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    " \n",
+    "    plt.subplot(122)\n",
+    "    plt.fill_between(xvals, 0, cdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot(xvals,cdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.plot([0,100],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.ylim(0,1.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Cumulative Probability\"); plt.title(\"Uniform Cumulative Distribution Function, $F_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "    \n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=2.3, top=1.2, wspace=0.2, hspace=0.3)  \n",
+    "    plt.plot()\n",
+    "\n",
+    "interactive_plot = widgets.interactive_output(f_make, {'zmin':zmin,'zmax':zmax})\n",
+    "interactive_plot.clear_output(wait = True)                # reduce flickering by delaying plot updating    \n",
+    "    \n",
+    "display(ui, interactive_plot)                            # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Binomial Distribution\n",
+    "\n",
+    "Now the binomial distribution.\n",
+    "\n",
+    "* a discrete distribution of the probablity of a specific number of successes given $P(success)$ and a number of trails, $n$."
+   ]
+  },
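+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For reference, the binomial probability mass function is $P(X = k) = \\binom{n}{k} p^k (1-p)^{n-k}$, where $p = P(success)$ and $n$ is the number of trials. A quick sanity check against SciPy (the specific numbers are just an example):\n",
+    "\n",
+    "```python\n",
+    "import math\n",
+    "from scipy import stats\n",
+    "\n",
+    "n, p, k = 10, 0.5, 3\n",
+    "print(stats.binom.pmf(k, n, p))                      # ~0.1172\n",
+    "print(math.comb(n, k) * p**k * (1 - p)**(n - k))     # same value by the formula\n",
+    "```"
+   ]
+  },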
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "632051a6ea18494c8f49eb64fcb89188",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='                                      Binomial Parametric Distribution Demonstratio…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "43b7d1906a0441199f5f85395a31beab",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output()"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "# interactive calculation of the random sample set (control of source parametric distribution and number of samples)\n",
+    "l = widgets.Text(value='                                      Binomial Parametric Distribution Demonstration, Michael Pyrcz, Associate Professor, The University of Texas at Austin',layout=Layout(width='950px', height='30px'))\n",
+    "\n",
+    "zp = widgets.FloatSlider(min=0.0, max = 1.0, value = 0.5, step = 0.01, description = '$P(Success)$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zp.style.handle_color = 'red'\n",
+    "zn = widgets.IntSlider(min=2, max = 100, value = 10, description = '$N_{trials}$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zn.style.handle_color = 'red'\n",
+    "\n",
+    "ui1 = widgets.HBox([zp,zn],kwargs = {'justify_content':'center'}) \n",
+    "ui = widgets.VBox([l,ui1],kwargs = {'justify_content':'center'})\n",
+    "\n",
+    "def f_make(zp,zn):\n",
+    "    xvals = np.linspace(0,101,102)\n",
+    "    pdf = stats.binom.pmf(xvals, zn, zp)\n",
+    "    cdf = stats.binom.cdf(xvals, zn, zp)\n",
+    "    \n",
+    "    plt.subplot(121)\n",
+    "    plt.plot(xvals,pdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.fill_between(xvals, 0, pdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot([1,zn],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.gca().set_ylim(bottom=0.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Probability\"); plt.title(\"Binomial Probability Density Function, $f_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks    \n",
+    "    \n",
+    "    plt.subplot(122)\n",
+    "    plt.fill_between(xvals, 0, cdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot(xvals,cdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.plot([0,zn],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.ylim(0,1.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Cumulative Probability\"); plt.title(\"Binomial Cumulative Distribution Function, $F_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "    \n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=2.3, top=1.2, wspace=0.2, hspace=0.3)  \n",
+    "    plt.plot()\n",
+    "\n",
+    "interactive_plot = widgets.interactive_output(f_make, {'zp':zp,'zn':zn})\n",
+    "interactive_plot.clear_output(wait = True)                # reduce flickering by delaying plot updating    \n",
+    "    \n",
+    "display(ui, interactive_plot)                            # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Poisson Distribution\n",
+    "\n",
+    "Now the Poisson distribution.\n",
+    "\n",
+    "* a discrete distribution of the probablity of a specific number of successes given the average number of successes, $\\lambda$, over an interval."
+   ]
+  },
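+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For reference, the Poisson probability mass function is $P(X = k) = \\frac{\\lambda^k e^{-\\lambda}}{k!}$. A quick sanity check against SciPy (the specific numbers are just an example):\n",
+    "\n",
+    "```python\n",
+    "import math\n",
+    "from scipy import stats\n",
+    "\n",
+    "lam, k = 5.0, 2\n",
+    "print(stats.poisson.pmf(k, lam))                       # ~0.0842\n",
+    "print(lam**k * math.exp(-lam) / math.factorial(k))     # same value by the formula\n",
+    "```"
+   ]
+  },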
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "87b31a418b594d9f80f9443587c23890",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='                                      Poisson Parametric Distribution Demonstration…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "7a8da6e12ab242b796f9aebc71617f19",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output()"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "# interactive calculation of the random sample set (control of source parametric distribution and number of samples)\n",
+    "l = widgets.Text(value='                                      Poisson Parametric Distribution Demonstration, Michael Pyrcz, Associate Professor, The University of Texas at Austin',layout=Layout(width='950px', height='30px'))\n",
+    "\n",
+    "zlambda = widgets.IntSlider(min=1, max = 15, value = 5, step = 1.0, description = '$\\\\lambda$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zlambda.style.handle_color = 'red'\n",
+    "\n",
+    "ui1 = widgets.HBox([zlambda],kwargs = {'justify_content':'center'}) \n",
+    "ui = widgets.VBox([l,ui1],kwargs = {'justify_content':'center'})\n",
+    "\n",
+    "def f_make(zlambda):\n",
+    "    n = 30\n",
+    "    xvals = np.linspace(0,n+1,n+2)\n",
+    "    pdf = stats.poisson.pmf(xvals, zlambda)\n",
+    "    cdf = stats.poisson.cdf(xvals, zlambda)\n",
+    "    \n",
+    "    plt.subplot(121)\n",
+    "    plt.plot(xvals,pdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.fill_between(xvals, 0, pdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot([0,n],[0,0],color='black')\n",
+    "    plt.xlim(0,n); plt.ylim(0,0.40)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Probability\"); plt.title(\"Poisson Probability Density Function, $f_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "     \n",
+    "    plt.subplot(122)\n",
+    "    plt.fill_between(xvals, 0, cdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot(xvals,cdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.plot([0,n],[0,0],color='black')\n",
+    "    plt.xlim(1,n); plt.ylim(0,1.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Cumulative Probability\"); plt.title(\"Poisson Cumulative Distribution Function, $F_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "        \n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=2.3, top=1.2, wspace=0.2, hspace=0.3)  \n",
+    "    plt.plot()\n",
+    "\n",
+    "interactive_plot = widgets.interactive_output(f_make, {'zlambda':zlambda})\n",
+    "interactive_plot.clear_output(wait = True)                # reduce flickering by delaying plot updating    \n",
+    "    \n",
+    "display(ui, interactive_plot)                            # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Gaussian / Normal Distribution\n",
+    "\n",
+    "Now the Gaussian distribution.\n",
+    "\n",
+    "* an unbounded distribution (infinite tails) distribution parameterized by mean and standard deviation."
+   ]
+  },
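+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For reference, the Gaussian probability density function is $f_x(x) = \\frac{1}{\\sigma \\sqrt{2 \\pi}} e^{-\\frac{(x - \\mu)^2}{2 \\sigma^2}}$, and about 68%, 95% and 99.7% of the probability falls within 1, 2 and 3 standard deviations of the mean. A quick sanity check with SciPy:\n",
+    "\n",
+    "```python\n",
+    "from scipy import stats\n",
+    "\n",
+    "# probability within one standard deviation of the mean (standard normal)\n",
+    "print(stats.norm.cdf(1.0) - stats.norm.cdf(-1.0))   # ~0.6827\n",
+    "```"
+   ]
+  },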
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "ddf7abee469a40ca8b9f22c7e2afdde8",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='                                      Gaussian Parametric Distribution Demonstratio…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "b9f9ad202dbb476fbfa86e705c8454f3",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output()"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "# interactive calculation of the random sample set (control of source parametric distribution and number of samples)\n",
+    "l = widgets.Text(value='                                      Gaussian Parametric Distribution Demonstration, Michael Pyrcz, Associate Professor, The University of Texas at Austin',layout=Layout(width='950px', height='30px'))\n",
+    "\n",
+    "zmean = widgets.FloatSlider(min=0.0, max = 100.0, value = 50.0, description = '$\\overline{x}$/$\\\\mu$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zmean.style.handle_color = 'red'\n",
+    "zstdev = widgets.FloatSlider(min=0.0, max = 30.0, value = 5.0, description = '$s$/$\\\\sigma$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zstdev.style.handle_color = 'red'\n",
+    "\n",
+    "ui1 = widgets.HBox([zmean,zstdev],kwargs = {'justify_content':'center'}) \n",
+    "ui = widgets.VBox([l,ui1],kwargs = {'justify_content':'center'})\n",
+    "\n",
+    "def f_make(zmean,zstdev):\n",
+    "    xvals = np.linspace(0.0,100.0,1000)\n",
+    "    pdf = stats.norm.pdf(xvals, loc = zmean, scale = zstdev)\n",
+    "    cdf = stats.norm.cdf(xvals, loc = zmean, scale = zstdev)\n",
+    "\n",
+    "    plt.subplot(121)\n",
+    "    plt.plot(xvals,pdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.fill_between(xvals, 0, pdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot([0,100],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.gca().set_ylim(bottom=0.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Density\"); plt.title(\"Gaussian Probability Density Function, $f_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "    \n",
+    "    plt.subplot(122)\n",
+    "    plt.fill_between(xvals, 0, cdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot(xvals,cdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.plot([0,100],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.ylim(0,1.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Cumulative Probability\"); plt.title(\"Gaussian Cumulative Distribution Function, $F_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "    \n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=2.3, top=1.2, wspace=0.2, hspace=0.3)  \n",
+    "    plt.plot()\n",
+    "\n",
+    "interactive_plot = widgets.interactive_output(f_make, {'zmean':zmean,'zstdev':zstdev})\n",
+    "interactive_plot.clear_output(wait = True)                # reduce flickering by delaying plot updating    \n",
+    "    \n",
+    "display(ui, interactive_plot)                            # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Lognormal Distribution\n",
+    "\n",
+    "Now the lognormal distribution.\n",
+    "\n",
+    "* an unbounded distribution (infinite tails) distribution parameterized by mu and sigma."
+   ]
+  },
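+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For reference, $X$ is lognormal if $\\ln(X)$ is Gaussian with mean $\\mu$ and standard deviation $\\sigma$; the arithmetic mean of $X$ is $e^{\\mu + \\sigma^2/2}$. SciPy's `stats.lognorm` takes the shape parameter `s` = $\\sigma$ and `scale` = $e^{\\mu}$, which is the parameterization used in the interactive cell below. A quick sanity check (the specific numbers are just an example):\n",
+    "\n",
+    "```python\n",
+    "import math\n",
+    "from scipy import stats\n",
+    "\n",
+    "mu, sigma = 1.0, 1.0\n",
+    "print(stats.lognorm.mean(s=sigma, scale=math.exp(mu)))   # ~4.4817\n",
+    "print(math.exp(mu + sigma**2 / 2))                       # same value by the formula\n",
+    "```"
+   ]
+  },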
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "eed2f3b8c78a44aba8518cb30243d02d",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='                                      Lognormal Parametric Distribution Demonstrati…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "efa4afa4a1e6448c88db537ae1cff6e0",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output()"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "# interactive calculation of the random sample set (control of source parametric distribution and number of samples)\n",
+    "l = widgets.Text(value='                                      Lognormal Parametric Distribution Demonstration, Michael Pyrcz, Associate Professor, The University of Texas at Austin',layout=Layout(width='950px', height='30px'))\n",
+    "\n",
+    "zmu = widgets.FloatSlider(min=0.1, max = 10.0, value = 1.0, description = 'Mu/$\\\\mu$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zmu.style.handle_color = 'red'\n",
+    "zsigma = widgets.FloatSlider(min=0.1, max = 3.0, value = 1.0, description = 'Sigma/$\\\\sigma$',orientation='horizontal',layout=Layout(width='400px', height='50px'),continuous_update=False)\n",
+    "zsigma.style.handle_color = 'red'\n",
+    "\n",
+    "ui1 = widgets.HBox([zmu,zsigma],kwargs = {'justify_content':'center'}) \n",
+    "ui = widgets.VBox([l,ui1],kwargs = {'justify_content':'center'})\n",
+    "\n",
+    "def f_make(zmu,zsigma):\n",
+    "    xvals = np.linspace(0.0,100.0,1000)\n",
+    "    pdf = stats.lognorm.pdf(xvals, s = zsigma, scale = math.exp(zmu))\n",
+    "    cdf = stats.lognorm.cdf(xvals, s = zsigma, scale = math.exp(zmu))\n",
+    "\n",
+    "    plt.subplot(121)\n",
+    "    plt.plot(xvals,pdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.fill_between(xvals, 0, pdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot([0,100],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.gca().set_ylim(bottom=0.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Density\"); plt.title(\"Lognormal Probability Density Function, $f_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "      \n",
+    "    plt.subplot(122)\n",
+    "    plt.fill_between(xvals, 0, cdf, facecolor='darkorange', interpolate=True, alpha = 0.7)\n",
+    "    plt.plot(xvals,cdf,color='black',alpha=0.8,linewidth=2)\n",
+    "    plt.plot([0,100],[0,0],color='black')\n",
+    "    plt.xlim(0,100); plt.ylim(0,1.0)\n",
+    "    plt.xlabel(\"$x$\"); plt.ylabel(\"Cumulative Probability\"); plt.title(\"Lognormal Cumulative Distribution Function, $F_x(x)$\")\n",
+    "    plt.grid(True, which='major',linewidth = 1.0); plt.grid(True, which='minor',linewidth = 0.2) # add y grids\n",
+    "    plt.tick_params(which='major',length=7); plt.tick_params(which='minor', length=4)\n",
+    "    plt.gca().xaxis.set_minor_locator(AutoMinorLocator()); plt.gca().yaxis.set_minor_locator(AutoMinorLocator()) # turn on minor ticks\n",
+    "      \n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=2.3, top=1.2, wspace=0.2, hspace=0.3)  \n",
+    "    plt.plot()\n",
+    "\n",
+    "interactive_plot = widgets.interactive_output(f_make, {'zmu':zmu,'zsigma':zsigma})\n",
+    "interactive_plot.clear_output(wait = True)                # reduce flickering by delaying plot updating    \n",
+    "    \n",
+    "display(ui, interactive_plot)                            # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "There are many other parametric distributions that we could have included. Also we could have demonstrated the distribution fitting. \n",
+    "\n",
+    "#### Comments\n",
+    "\n",
+    "This was a basic interactive demonstration of common parametric distributions. \n",
+    "\n",
+    "I have other demonstrations on the basics of working with DataFrames, ndarrays, univariate statistics, plotting data, declustering, data transformations, trend modeling and many other workflows available at [Python Demos](https://github.com/GeostatsGuy/PythonNumericalDemos) and a Python package for data analytics and geostatistics at [GeostatsPy](https://github.com/GeostatsGuy/GeostatsPy). \n",
+    "  \n",
+    "I hope this was helpful,\n",
+    "\n",
+    "*Michael*\n",
+    "\n",
+    "#### The Author:\n",
+    "\n",
+    "### Michael Pyrcz, Professor, The University of Texas at Austin \n",
+    "*Novel Data Analytics, Geostatistics and Machine Learning Subsurface Solutions*\n",
+    "\n",
+    "With over 17 years of experience in subsurface consulting, research and development, Michael has returned to academia driven by his passion for teaching and enthusiasm for enhancing engineers' and geoscientists' impact in subsurface resource development. \n",
+    "\n",
+    "For more about Michael check out these links:\n",
+    "\n",
+    "#### [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n",
+    "\n",
+    "#### Want to Work Together?\n",
+    "\n",
+    "I hope this content is helpful to those that want to learn more about subsurface modeling, data analytics and machine learning. Students and working professionals are welcome to participate.\n",
+    "\n",
+    "* Want to invite me to visit your company for training, mentoring, project review, workflow design and / or consulting? I'd be happy to drop by and work with you! \n",
+    "\n",
+    "* Interested in partnering, supporting my graduate student research or my Subsurface Data Analytics and Machine Learning consortium (co-PIs including Profs. Foster, Torres-Verdin and van Oort)? My research combines data analytics, stochastic modeling and machine learning theory with practice to develop novel methods and workflows to add value. We are solving challenging subsurface problems!\n",
+    "\n",
+    "* I can be reached at mpyrcz@austin.utexas.edu.\n",
+    "\n",
+    "I'm always happy to discuss,\n",
+    "\n",
+    "*Michael*\n",
+    "\n",
+    "Michael Pyrcz, Ph.D., P.Eng. Professor, Cockrell School of Engineering and The Jackson School of Geosciences, The University of Texas at Austin\n",
+    "\n",
+    "#### More Resources Available at: [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.11.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}

File diff suppressed because it is too large
+ 599 - 0
Interactive_Polynomial_Solution.ipynb


File diff suppressed because it is too large
+ 925 - 0
Interactive_PreDrill_Prediction.ipynb


File diff suppressed because it is too large
+ 312 - 0
Interactive_QQ_Plot.ipynb


File diff suppressed because it is too large
+ 502 - 0
Interactive_RadialBasisFunctions.ipynb


File diff suppressed because it is too large
+ 1029 - 0
Interactive_Ridge_Regresion.ipynb


File diff suppressed because it is too large
+ 317 - 0
Interactive_Sampling_Methods.ipynb


File diff suppressed because it is too large
+ 683 - 0
Interactive_Simple_Kriging.ipynb


File diff suppressed because it is too large
+ 899 - 0
Interactive_Simple_Kriging_Behavoir.ipynb


File diff suppressed because it is too large
+ 911 - 0
Interactive_Simulation.ipynb


File diff suppressed because it is too large
+ 383 - 0
Interactive_Sivia_Coin_Toss.ipynb


File diff suppressed because it is too large
+ 759 - 0
Interactive_Spatial_Aggregate_Uncertainty.ipynb


File diff suppressed because it is too large
+ 762 - 0
Interactive_Spatial_Aggregate_Uncertainty_Pad.ipynb


File diff suppressed because it is too large
+ 1304 - 0
Interactive_Spectral_Clustering.ipynb


File diff suppressed because it is too large
+ 379 - 0
Interactive_Spurious_Correlations.ipynb


File diff suppressed because it is too large
+ 596 - 0
Interactive_String_Effect.ipynb


File diff suppressed because it is too large
+ 675 - 0
Interactive_Trend.ipynb


File diff suppressed because it is too large
+ 1063 - 0
Interactive_Variogram_Calculation.ipynb


File diff suppressed because it is too large
+ 1257 - 0
Interactive_Variogram_Calculation_Modeling.ipynb


File diff suppressed because it is too large
+ 1558 - 0
Interactive_Variogram_Calculation_Modeling_Krige.ipynb


File diff suppressed because it is too large
+ 1555 - 0
Interactive_Variogram_Modeling.ipynb


+ 412 - 0
Interactive_Variogram_Nugget_Effect.ipynb

@@ -0,0 +1,412 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "collapsed": true
+   },
+   "source": [
+    "\n",
+    "<p align=\"center\">\n",
+    "    <img src=\"https://github.com/GeostatsGuy/GeostatsPy/blob/master/TCG_color_logo.png?raw=true\" width=\"220\" height=\"240\" />\n",
+    "\n",
+    "</p>\n",
+    "\n",
+    "## Spatial Data Analytics \n",
+    "\n",
+    "### Interactive Demonstration of the Variogram Nugget Effect \n",
+    "\n",
+    "#### Michael Pyrcz, Associate Professor, The University of Texas at Austin \n",
+    "\n",
+    "##### Contacts: [Twitter/@GeostatsGuy](https://twitter.com/geostatsguy) | [GitHub/GeostatsGuy](https://github.com/GeostatsGuy) | [www.michaelpyrcz.com](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446)\n",
+    "\n",
+    "This a simple demonstration of the variogram nugget effect structure for a 1D datasets with variable spatial continuity and visualization.\n",
+    "\n",
+    "* we will see that the nugget effect results from random error\n",
+    "\n",
+    "* we will perform the calculations in 1D for fast run times and ease of visualization.\n",
+    "\n",
+    "#### Load the required libraries\n",
+    "\n",
+    "The following code loads the required libraries.\n"
+   ]
+  },
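+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For reference, the experimental semivariogram calculated below is,\n",
+    "\n",
+    "$\\gamma(\\mathbf{h}) = \\frac{1}{2 N(\\mathbf{h})} \\sum_{i=1}^{N(\\mathbf{h})} \\left[ z(\\mathbf{u}_i) - z(\\mathbf{u}_i + \\mathbf{h}) \\right]^2$\n",
+    "\n",
+    "where $N(\\mathbf{h})$ is the number of data pairs separated by lag vector $\\mathbf{h}$. If independent random error with variance $c_0$ is added to the data, the squared differences increase by $2 c_0$ on average for all $\\mathbf{h} > 0$, so the semivariogram is shifted up by $c_0$, the nugget effect.\n"
+   ]
+  },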
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import os                                                   # to set current working directory \n",
+    "import numpy as np                                          # arrays and matrix math\n",
+    "import matplotlib.pyplot as plt                             # for plotting\n",
+    "from matplotlib.gridspec import GridSpec                    # custom matrix plots\n",
+    "plt.rc('axes', axisbelow=True)                              # set axes and grids in the background for all plots\n",
+    "from ipywidgets import interactive                          # widgets and interactivity\n",
+    "from ipywidgets import widgets                            \n",
+    "from ipywidgets import Layout\n",
+    "from ipywidgets import Label\n",
+    "from ipywidgets import VBox, HBox\n",
+    "import math                                                 # for square root\n",
+    "from geostatspy import GSLIB                                # affine correction\n",
+    "seed = 73073"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "If you get a package import error, you may have to first install some of these packages. This can usually be accomplished by opening up a command window on Windows and then typing 'python -m pip install [package-name]'. More assistance is available with the respective package docs.  "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Set the working directory\n",
+    "\n",
+    "I always like to do this so I don't lose files and to simplify subsequent read and writes (avoid including the full address each time).  Also, in this case make sure to place the required (see below) data file in this working directory.  "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "#os.chdir(\"C:\\PGE337\")                                      # set the working directory"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Declare Functions\n",
+    "\n",
+    "We need a variogram calculator that is fast and works well with 1D.\n",
+    "\n",
+    "* I have modified the gam function from GeostatsPy below.\n",
+    "\n",
+    "References:\n",
+    "\n",
+    "Pyrcz, M.J., Jo. H., Kupenko, A., Liu, W., Gigliotti, A.E., Salomaki, T., and Santos, J., 2021, GeostatsPy Python Package, PyPI, Python Package Index, https://pypi.org/project/geostatspy/."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def gam(array, tmin, tmax, xsiz, ysiz, ixd, iyd, nlag, isill):\n",
+    "    \"\"\"GSLIB's GAM program (Deutsch and Journel, 1998) converted from the\n",
+    "    original Fortran to Python by Michael Pyrcz, the University of Texas at\n",
+    "    Austin (Jan, 2019).\n",
+    "    :param array: 2D gridded data / model\n",
+    "    :param tmin: property trimming limit\n",
+    "    :param tmax: property trimming limit\n",
+    "    :param xsiz: grid cell extents in x direction\n",
+    "    :param ysiz: grid cell extents in y direction\n",
+    "    :param ixd: lag offset in grid cells\n",
+    "    :param iyd: lag offset in grid cells\n",
+    "    :param nlag: number of lags to calculate\n",
+    "    :param isill: 1 for standardize sill\n",
+    "    :return: TODO\n",
+    "    \"\"\"\n",
+    "    if array.ndim == 2:\n",
+    "        ny, nx = array.shape\n",
+    "    elif array.ndim == 1:\n",
+    "        ny, nx = len(array),1\n",
+    "        array = array.reshape((ny,1))\n",
+    "\n",
+    "    nvarg = 1  # for multiple variograms repeat the program\n",
+    "    nxy = nx * ny  # TODO: not used\n",
+    "    mxdlv = nlag\n",
+    "\n",
+    "    # Allocate the needed memory\n",
+    "    lag = np.zeros(mxdlv)\n",
+    "    vario = np.zeros(mxdlv)\n",
+    "    hm = np.zeros(mxdlv)\n",
+    "    tm = np.zeros(mxdlv)\n",
+    "    hv = np.zeros(mxdlv)  # TODO: not used\n",
+    "    npp = np.zeros(mxdlv)\n",
+    "    ivtail = np.zeros(nvarg + 2)\n",
+    "    ivhead = np.zeros(nvarg + 2)\n",
+    "    ivtype = np.zeros(nvarg + 2)\n",
+    "    ivtail[0] = 0\n",
+    "    ivhead[0] = 0\n",
+    "    ivtype[0] = 0\n",
+    "\n",
+    "    # Summary statistics for the data after trimming\n",
+    "    inside = (array > tmin) & (array < tmax)\n",
+    "    avg = array[(array > tmin) & (array < tmax)].mean()  # TODO: not used\n",
+    "    stdev = array[(array > tmin) & (array < tmax)].std()\n",
+    "    var = stdev ** 2.0\n",
+    "    vrmin = array[(array > tmin) & (array < tmax)].min()  # TODO: not used\n",
+    "    vrmax = array[(array > tmin) & (array < tmax)].max()  # TODO: not used\n",
+    "    num = ((array > tmin) & (array < tmax)).sum()  # TODO: not used\n",
+    "\n",
+    "    # For the fixed seed point, loop through all directions\n",
+    "    for iy in range(0, ny):\n",
+    "        for ix in range(0, nx):\n",
+    "            if inside[iy, ix]:\n",
+    "                vrt = array[iy, ix]\n",
+    "                ixinc = ixd\n",
+    "                iyinc = iyd\n",
+    "                ix1 = ix\n",
+    "                iy1 = iy\n",
+    "                for il in range(0, nlag):\n",
+    "                    ix1 = ix1 + ixinc\n",
+    "                    if 0 <= ix1 < nx:\n",
+    "                        iy1 = iy1 + iyinc\n",
+    "                        if 1 <= iy1 < ny:\n",
+    "                            if inside[iy1, ix1]:\n",
+    "                                vrh = array[iy1, ix1]\n",
+    "                                npp[il] = npp[il] + 1\n",
+    "                                tm[il] = tm[il] + vrt\n",
+    "                                hm[il] = hm[il] + vrh\n",
+    "                                vario[il] = vario[il] + ((vrh - vrt) ** 2.0)\n",
+    "\n",
+    "    # Get average values for gam, hm, tm, hv, and tv, then compute the correct\n",
+    "    # \"variogram\" measure\n",
+    "    for il in range(0, nlag):\n",
+    "        if npp[il] > 0:\n",
+    "            rnum = npp[il]\n",
+    "            lag[il] = np.sqrt((ixd * xsiz * il) ** 2 + (iyd * ysiz * il) ** 2)\n",
+    "            vario[il] = vario[il] / float(rnum)\n",
+    "            hm[il] = hm[il] / float(rnum)\n",
+    "            tm[il] = tm[il] / float(rnum)\n",
+    "\n",
+    "            # Standardize by the sill\n",
+    "            if isill == 1:\n",
+    "                vario[il] = vario[il] / var\n",
+    "\n",
+    "            # Semivariogram\n",
+    "            vario[il] = 0.5 * vario[il]\n",
+    "    return lag, vario, npp"
+   ]
+  },
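+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "A quick illustration of calling the gam function above on a 1D array (a hypothetical pure-noise series, just to show the calling convention used later in this notebook):\n",
+    "\n",
+    "```python\n",
+    "z = np.random.normal(size=200)                                    # 1D series of pure noise\n",
+    "lag, gamma, npairs = gam(z, -9999, 9999, 1.0, 1.0, 0, 1, 20, 1)   # lag offset of 1 cell, 20 lags\n",
+    "# with isill=1 the semivariogram is standardized; for pure noise it\n",
+    "# fluctuates around the sill of 1.0 at all lags - a pure nugget effect\n",
+    "```"
+   ]
+  },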
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Interactive Interface\n",
+    "\n",
+    "Here's the interactive interface. I make a correlated 1D data set, add noise and then calculate the histogram and variogram with and without noise. \n",
+    "\n",
+    "* the user specifies the proportion of noise and the spatial continuity range of the original data."
+   ]
+  },
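+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "A note on what to expect (a sketch of the reasoning, assuming the noise is independent of the signal): the demonstration builds $Z = \\sqrt{1 - p} \\cdot S + \\sqrt{p} \\cdot \\epsilon$, where $S$ is the spatially correlated signal, $\\epsilon$ is independent noise and $p$ is the noise proportion, so the total variance is unchanged. With the variogram standardized by the sill, this gives\n",
+    "\n",
+    "$\\gamma_Z(\\mathbf{h}) = p + (1 - p) \\, \\gamma_S(\\mathbf{h}), \\quad \\mathbf{h} > 0$\n",
+    "\n",
+    "so the apparent nugget effect should read off directly as the noise proportion set on the slider.\n"
+   ]
+  },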
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {
+    "scrolled": false
+   },
+   "outputs": [],
+   "source": [
+    "\n",
+    "n = 200; mean = 0.20; stdev = 0.03; nlag = 100; pnoise = 0.5; \n",
+    "\n",
+    "l = widgets.Text(value='                                                     Variogram Nugger Effect Demonstration, Prof. Michael Pyrcz, The University of Texas at Austin',\n",
+    "                 layout=Layout(width='930px', height='30px'))\n",
+    "\n",
+    "pnoise = widgets.FloatSlider(min=0.0,max = 1.0,value=0.0,step = 0.05,description = 'Noise %',orientation='horizontal',style = {'description_width': 'initial'},layout=Layout(width='500px',height='30px'),continuous_update=False)\n",
+    "vrange = widgets.IntSlider(min=1,max = 100,value=30,step = 5,description = 'Spatial Continuity Range',orientation='horizontal',style = {'description_width': 'initial'},layout=Layout(width='500px',height='30px'),continuous_update=False)\n",
+    "\n",
+    "ui = widgets.HBox([pnoise,vrange],)\n",
+    "ui2 = widgets.VBox([l,ui],)\n",
+    "\n",
+    "def run_plot(pnoise,vrange):\n",
+    "\n",
+    "    psignal = 1 - pnoise\n",
+    "\n",
+    "    np.random.seed(seed = seed)\n",
+    "    data0 = np.random.normal(loc=0.20,scale=0.03,size=n+1000)\n",
+    "    \n",
+    "    kern1 = np.ones(vrange)\n",
+    "    data1 = np.convolve(data0,kern1,mode='same')\n",
+    "    data1_sub = GSLIB.affine(data1[500:n+500],mean,stdev)\n",
+    "    \n",
+    "    data1_sub_rescale = GSLIB.affine(data1[500:n+500],mean,stdev*math.sqrt(psignal))\n",
+    "    data1_sub_noise = data1_sub_rescale + np.random.normal(loc=0.0,scale = stdev*math.sqrt(pnoise),size=n)\n",
+    "    data1_sub_noise = GSLIB.affine(data1_sub_noise,mean,stdev)\n",
+    "    \n",
+    "    #fig, axs = plt.subplots(2,3, gridspec_kw={'width_ratios': [2, 1, 1, 1]})\n",
+    "    \n",
+    "    fig = plt.figure()\n",
+    "    spec = fig.add_gridspec(2, 3)\n",
+    "    \n",
+    "    ax1 = fig.add_subplot(spec[0, :])\n",
+    "    plt.plot(np.arange(1,n+1),data1_sub,color='blue',alpha=0.3,lw=3,label='Original')\n",
+    "    plt.plot(np.arange(1,n+1),data1_sub_noise,color='red',alpha=0.3,lw=3,label='Original + Noise')\n",
+    "    plt.xlim([0,n]); plt.ylim([mean-4*stdev,mean+4*stdev])\n",
+    "    plt.xlabel('Location (m)'); plt.ylabel('Porosity (%)'); plt.title('Porosity Over Location, Original and with Random Noise')\n",
+    "    plt.grid(); plt.legend(loc='upper right')\n",
+    "    \n",
+    "    ax2 = fig.add_subplot(spec[1, 0])\n",
+    "    plt.hist(data1_sub,color='blue',alpha=0.3,edgecolor='black',bins=np.linspace(mean-4*stdev,mean+4*stdev,30),\n",
+    "             label='Original')\n",
+    "    plt.hist(data1_sub_noise,color='red',alpha=0.3,edgecolor='black',bins=np.linspace(mean-4*stdev,mean+4*stdev,30),\n",
+    "             label='Original + Noise')\n",
+    "    plt.xlim([mean-4*stdev,mean+4*stdev]); plt.ylim([0,30])\n",
+    "    plt.xlabel('Porosity (%)'); plt.ylabel('Frequency'); plt.title('Histogram')\n",
+    "    plt.grid(); plt.legend(loc='upper right')\n",
+    "    \n",
+    "    ax3 = fig.add_subplot(spec[1, 1])\n",
+    "    labels = ['Signal','Noise',]\n",
+    "    plt.pie([psignal, pnoise,],radius = 1, autopct='%1.1f%%', \n",
+    "                colors = ['#0000FF','#FF0000'], explode = [.02,.02],wedgeprops = {\"edgecolor\":\"k\",'linewidth':1,\"alpha\":0.3},)\n",
+    "    plt.title('Variance of Signal and Noise')\n",
+    "    plt.legend(labels,loc='lower left')\n",
+    "    \n",
+    "    ax4 = fig.add_subplot(spec[1, 2])\n",
+    "    data1_sub_reshape = data1_sub.reshape((n,1))\n",
+    "    lag,gamma,npp = gam(data1_sub,-9999,9999,1.0,1.0,0,1,nlag,1)\n",
+    "    _,gamma_noise,_ = gam(data1_sub_noise,-9999,9999,1.0,1.0,0,1,nlag,1)\n",
+    "    plt.scatter(lag,gamma,s=30,color='blue',alpha=0.3,edgecolor='black',label='Original')\n",
+    "    plt.scatter(lag,gamma_noise,s=30,color='red',alpha=0.3,edgecolor='black',label='Original + Noise')\n",
+    "    plt.plot([0,nlag],[1.0,1.0],color='black',ls='--')\n",
+    "    plt.xlim([0,nlag]); plt.ylim([0,2.0]); plt.grid(); plt.legend(loc='upper right')\n",
+    "    plt.xlabel('Lag Distance (h)'); plt.ylabel('Variogram'); plt.title('Experimental Variogram')\n",
+    "\n",
+    "    plt.subplots_adjust(left=0.0, bottom=0.0, right=2.0, top=1.6, wspace=0.1, hspace=0.3); plt.show()\n",
+    "\n",
+    "# connect the function to make the samples and plot to the widgets    \n",
+    "interactive_plot = widgets.interactive_output(run_plot, {'pnoise':pnoise,'vrange':vrange})\n",
+    "interactive_plot.clear_output(wait = True)               # reduce flickering by delaying plot updating"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Take some time to observe a random phenomenon. \n",
+    "\n",
+    "* see any patterns, e.g., strings of low or high values, increasing or decreasing trends?\n",
+    "\n",
+    "#### Add Spatial Correlation\n",
+    "\n",
+    "We can use convolution to add spatial continuity to a random set of values\n",
+    "\n",
+    "* we won't go into the details, but the convolution kernel can actually be related to the variogram in sequential Gaussian simulation.\n",
+    "\n",
+    "* we apply an affine correction to ensure that we don't change the mean or standard deviation with the convolution, we just change the spatial continuity\n",
+    "\n",
+    "* since we are using convolution, it is likely that there will be edge artifacts, so we have 'cut off' the edges of the model (500 m on each side)."
+   ]
+  },
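+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Here is a minimal standalone sketch of the convolution and affine correction steps described above (it uses only NumPy instead of GSLIB.affine from geostatspy; the constants are just illustrative):\n",
+    "\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "rng = np.random.default_rng(73073)\n",
+    "white = rng.normal(size=1200)                       # uncorrelated values\n",
+    "kernel = np.ones(30)                                # kernel length ~ continuity range\n",
+    "smooth = np.convolve(white, kernel, mode='same')    # moving average adds correlation\n",
+    "smooth = smooth[500:700]                            # trim away edge artifacts\n",
+    "\n",
+    "# affine correction: restore the target mean and standard deviation\n",
+    "target_mean, target_stdev = 0.20, 0.03\n",
+    "smooth = (smooth - smooth.mean()) / smooth.std() * target_stdev + target_mean\n",
+    "print(round(smooth.mean(), 3), round(smooth.std(), 3))   # ~0.2, ~0.03\n",
+    "```"
+   ]
+  },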
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "100c9cb7acbd4ae7b830fc4c82f8bf54",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "VBox(children=(Text(value='                                                     Variogram Nugger Effect Demons…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    },
+    {
+     "data": {
+      "application/vnd.jupyter.widget-view+json": {
+       "model_id": "836cee2e2cd84072b1db197fe1481830",
+       "version_major": 2,
+       "version_minor": 0
+      },
+      "text/plain": [
+       "Output(outputs=({'output_type': 'display_data', 'data': {'text/plain': '<Figure size 432x288 with 4 Axes>', 'i…"
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "display(ui2, interactive_plot)                           # display the interactive plot"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Comments\n",
+    "\n",
+    "This was an interactive demonstration of the variogram nugget effect structure resulting from the addition of random noise to spatial data. \n",
+    "\n",
+    "I have many other demonstrations on simulation to build spatial models with spatial continuity and many other workflows available [here](https://github.com/GeostatsGuy/PythonNumericalDemos), along with a package for geostatistics in Python called [GeostatsPy](https://github.com/GeostatsGuy/GeostatsPy). \n",
+    "  \n",
+    "We hope this was helpful,\n",
+    "\n",
+    "*Michael*\n",
+    "\n",
+    "***\n",
+    "\n",
+    "#### More on Michael Pyrcz and the Texas Center for Geostatistics:\n",
+    "\n",
+    "### Michael Pyrcz, Associate Professor, University of Texas at Austin \n",
+    "*Novel Data Analytics, Geostatistics and Machine Learning Subsurface Solutions*\n",
+    "\n",
+    "With over 17 years of experience in subsurface consulting, research and development, Michael has returned to academia driven by his passion for teaching and enthusiasm for enhancing engineers' and geoscientists' impact in subsurface resource development. \n",
+    "\n",
+    "For more about Michael check out these links:\n",
+    "\n",
+    "#### [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n",
+    "\n",
+    "#### Want to Work Together?\n",
+    "\n",
+    "I hope this content is helpful to those that want to learn more about subsurface modeling, data analytics and machine learning. Students and working professionals are welcome to participate.\n",
+    "\n",
+    "* Want to invite me to visit your company for training, mentoring, project review, workflow design and / or consulting? I'd be happy to drop by and work with you! \n",
+    "\n",
+    "* Interested in partnering, supporting my graduate student research or my Subsurface Data Analytics and Machine Learning consortium (co-PIs including Profs. Foster, Torres-Verdin and van Oort)? My research combines data analytics, stochastic modeling and machine learning theory with practice to develop novel methods and workflows to add value. We are solving challenging subsurface problems!\n",
+    "\n",
+    "* I can be reached at mpyrcz@austin.utexas.edu.\n",
+    "\n",
+    "I'm always happy to discuss,\n",
+    "\n",
+    "*Michael*\n",
+    "\n",
+    "Michael Pyrcz, Ph.D., P.Eng. Associate Professor The Hildebrand Department of Petroleum and Geosystems Engineering, Bureau of Economic Geology, The Jackson School of Geosciences, The University of Texas at Austin\n",
+    "\n",
+    "#### More Resources Available at: [Twitter](https://twitter.com/geostatsguy) | [GitHub](https://github.com/GeostatsGuy) | [Website](http://michaelpyrcz.com) | [GoogleScholar](https://scholar.google.com/citations?user=QVZ20eQAAAAJ&hl=en&oi=ao) | [Book](https://www.amazon.com/Geostatistical-Reservoir-Modeling-Michael-Pyrcz/dp/0199731446) | [YouTube](https://www.youtube.com/channel/UCLqEr-xV-ceHdXXXrTId5ig)  | [LinkedIn](https://www.linkedin.com/in/michael-pyrcz-61a648a1)\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.9.12"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}

File diff suppressed because it is too large
+ 453 - 0
Interactive_Variogram_h_scatter.ipynb


File diff suppressed because it is too large
+ 720 - 0
Interactive_kMeans_Clustering.ipynb