{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"#### Imports"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"C:\\Users\\shahla.huseynova\\source\\repos\\Mooi-Kickstart\\pyrecoy\\pyrecoy\\pyrecoy2\\__init__.py\n"
]
}
],
"source": [
"from datetime import timedelta, datetime\n",
"import json\n",
"import pprint\n",
"from copy import deepcopy\n",
"import pytz\n",
"\n",
"import cufflinks\n",
"cufflinks.go_offline()\n",
"import numpy as np\n",
"from numpy.polynomial import Polynomial\n",
"import pandas as pd\n",
"from tqdm.notebook import tqdm\n",
"\n",
"import sys\n",
"\n",
"import os\n",
" \n",
"# Path to the folder containing the alternate version of PyRecoy\n",
"\n",
"alternate_pyrecoy_path = 'C:\\\\Users\\\\shahla.huseynova\\\\source\\\\repos\\\\Mooi-Kickstart\\\\pyrecoy\\\\pyrecoy'\n",
" \n",
"# Add the path to sys.path\n",
"\n",
"if alternate_pyrecoy_path not in sys.path:\n",
"\n",
" sys.path.insert(0, alternate_pyrecoy_path)\n",
" \n",
"# Now import PyRecoy\n",
"\n",
"import pyrecoy2\n",
"\n",
"# Check the version or path to confirm\n",
"\n",
"print(pyrecoy2.__file__)\n",
"\n",
"\n",
"from pyrecoy2.assets import Heatpump, Eboiler, GasBoiler, HotWaterStorage\n",
"from pyrecoy2.colors import *\n",
"from pyrecoy2.converters import *\n",
"from pyrecoy2.financial import calculate_eb_ode, get_tax_tables, get_tax_rate, get_grid_tariffs_electricity\n",
"from pyrecoy2.framework import TimeFramework\n",
"from pyrecoy2.casestudy import CaseStudy\n",
"from pyrecoy2.plotting import ebitda_bar_chart, npv_bar_chart\n",
"from pyrecoy2.reports import CaseReport, ComparisonReport, BusinessCaseReport, SingleFigureComparison\n",
"from pyrecoy2.sensitivity import SensitivityAnalysis\n",
"from pyrecoy2.prices import get_tennet_data, get_afrr_capacity_fees_nl\n",
"from pyrecoy2.forecasts import Mipf, Forecast\n",
"\n",
"%load_ext autoreload\n",
"%autoreload 2"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"#### Development backlog"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"* aFRR (can improve the optimisation case)\n",
"* Report\n",
"\n",
"--\n",
"* Create strategy on imbalance POS (buy 100% day-ahead, and respond to high prices)\n",
"* Graphs\n",
" * EBITDA vs. baseline (earnings vs baseline)\n",
"* Show COP curves in different cases, just for illustration\n",
"* Energy report --> Check + add gas\n",
"* Fix comparison reports\n",
"* Model verification"
]
},
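{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"A rough sketch of the backlog idea \"buy 100% day-ahead and respond to high prices\": procure the expected heatpump consumption day-ahead, and pause the heatpump (selling the pre-bought energy back) whenever the positive imbalance price sits far above the day-ahead price. The price series, column names and switching margin below are illustrative assumptions, not model output.\n",
"\n",
"```python\n",
"import numpy as np\n",
"import pandas as pd\n",
"\n",
"# Hypothetical quarter-hourly prices in EUR/MWh; the model would use the Mipf data instead\n",
"idx = pd.date_range('2023-01-01', periods=96, freq='15T', tz='Europe/Amsterdam')\n",
"prices = pd.DataFrame({\n",
"    'day_ahead': 80.0,\n",
"    'imbalance_pos': 80.0 + np.random.default_rng(0).normal(0, 60, len(idx)),\n",
"}, index=idx)\n",
"\n",
"margin = 50  # EUR/MWh, illustrative threshold for pausing the heatpump\n",
"pause = prices['imbalance_pos'] > prices['day_ahead'] + margin\n",
"print(f'Heatpump paused in {pause.mean():.0%} of the quarter hours')\n",
"```"
]
},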
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"#### Meeting Notes"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"##### Meeting 25-11-2020"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"* aFRR can help optimisation case\n",
"* SDE++ should be included\n",
"* Tsource sensitivity really gives interesting insights\n",
"* Sensitivities should be verified (especially CO2, Tsink, Tsource, time period)\n",
"* AP TNO: Update/verify COP curve\n",
"* AP TNO: Update CAPEX\n",
"* AP Mark: \n",
" * Create graphs on COP curve with different Tsource, Tsink\n",
" * Generate table and output in .csv (send it to Andrew)\n",
"* Investigate opportunity to lower the COP and negative electricity prices\n",
" * Technically feasible, but not really needed/possible to do it in this project\n",
"* Could be interesting to run this model on a usecase with higher delta T\n",
"* Conclusion: Finalize this model, but not add too much extra complexity, next steps is to go towards industrial partners with the results"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"# ENCORE : Heatpump Optimisation Framework\n",
"\n",
"***\n",
"© Mark Kremer \n",
"July 2020\n",
"##### Cases\n",
"In this model, 3 cases are compared:\n",
"* **Baseline** : All heat is supplied by steam turbine\n",
"* **Heatpump case** : All heat is supplied by heatpump\n",
"* **Hybrid case** : Steam turbine and heatpump run in hybrid mode, and are optimised on costs in real-time"
]
},
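{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"In the hybrid case the dispatch decision boils down to comparing the marginal cost of heat from both sources in each interval. The sketch below illustrates that comparison with made-up prices and a fixed COP; the helper names and numbers are assumptions and do not reflect the pyrecoy2 implementation, which works on full price and COP time series.\n",
"\n",
"```python\n",
"def marginal_heat_cost_heatpump(e_price, cop):\n",
"    # EUR per MWh of heat: electricity price divided by the COP\n",
"    return e_price / cop\n",
"\n",
"def marginal_heat_cost_boiler(gas_price, efficiency=0.9):\n",
"    # EUR per MWh of heat: gas price divided by the boiler efficiency\n",
"    return gas_price / efficiency\n",
"\n",
"# Illustrative prices (EUR/MWh of electricity/fuel) and an assumed COP\n",
"e_price, gas_price, cop = 80.0, 30.0, 3.5\n",
"\n",
"hp_cost = marginal_heat_cost_heatpump(e_price, cop)   # ~22.9 EUR/MWh_th\n",
"gb_cost = marginal_heat_cost_boiler(gas_price)        # ~33.3 EUR/MWh_th\n",
"\n",
"use_heatpump = hp_cost < gb_cost\n",
"print(f'heatpump {hp_cost:.1f} vs boiler {gb_cost:.1f} EUR/MWh_th -> run heatpump: {use_heatpump}')\n",
"```"
]
},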
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"#### Loading config"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"class Config():\n",
" start = '2023-01-01'\n",
" end = '2023-12-31'\n",
" # year = '2019'\n",
" # startdate = year + '-01-01'\n",
" # enddate = str(year) + '-12-31'\n",
" # start = datetime.strptime(startdate, \"%Y-%m-%d\").astimezone(pytz.timezone('Europe/Amsterdam'))\n",
" # end = datetime.strptime(enddate, \"%Y-%m-%d\").astimezone(pytz.timezone('Europe/Amsterdam'))\n",
" # start = start.astimezone(pytz.UTC)\n",
" # end = end.astimezone(pytz.UTC)\n",
" \n",
" hp_vdg_e_power = 23.3 # MW\n",
" hp_ndg_e_power = 7.7 # MW\n",
" hp_min_load = 0\n",
" hp_lifetime = 25\n",
" hp_capex = 200_000 # EUR/MWth\n",
" hp_opex = 0.01 # in % of CAPEX\n",
" hp_devex = 0.005 # in % of CAPEX\n",
" \n",
" gb_power = 35 # MW\n",
" gb_efficiency = 0.9\n",
" \n",
" storage_power = 25\n",
" storage_volume = 1000\n",
" storage_cap_per_volume = 50 * 1e-3\n",
" storage_lifetime = 25\n",
" storage_temperature = 95\n",
" storage_min_level = storage_volume * storage_cap_per_volume * 0.05\n",
" storage_capex_per_MW = 7_000\n",
" storage_capex_per_MWh = 5_000\n",
" storage_opex_perc_of_capex = 0.02\n",
" # storage_initial_level = 5\n",
" threshold = 20\n",
" \n",
" tax_bracket_g = 4 \n",
" tax_bracket_e = 4\n",
" \n",
" include_transport_costs = False\n",
" grid_operator = 'Liander'\n",
" connection_type = 'TS/MS'\n",
" \n",
" discount_rate = 0.1\n",
" project_duration = 12\n",
"\n",
" forecast = 'ForeNeg'\n",
" gas_price_multiplier = 1\n",
" e_price_multiplier = 1\n",
" e_price_volatility_multiplier = 1\n",
" co2_price_multiplier = 1\n",
" tsource_delta = 0\n",
" tsink_delta = 0\n",
" energy_tax_multiplier = 1\n",
" \n",
" # Review the SDE implementation\n",
" sde_base_amount = 81\n",
" longterm_gas_price = 24.00\n",
" longterm_co2_price = 37.90\n",
" sde_switch_price_correction = 40\n",
" \n",
" day_ahead_buying_perc = 0.3\n",
" \n",
" afrr_capacity_fee = 25_000\n",
"\n",
"c = Config()"
]
},
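{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"The storage settings above imply a capacity of `storage_volume * storage_cap_per_volume` = 1000 * 0.05 = 50 MWh, with `storage_min_level` fixed at 5% of that (2.5 MWh). The `*_multiplier` and `*_delta` attributes are the natural hooks for sensitivity runs; a variant configuration can be built by copying the base config and overriding them, as in the sketch below. The `config_variant` helper is only an illustration of that pattern, not how `SensitivityAnalysis` consumes it.\n",
"\n",
"```python\n",
"from copy import deepcopy\n",
"\n",
"def config_variant(base, **overrides):\n",
"    # Copy the configuration and overwrite the selected attributes on the copy\n",
"    variant = deepcopy(base)\n",
"    for name, value in overrides.items():\n",
"        setattr(variant, name, value)\n",
"    return variant\n",
"\n",
"# Example: a scenario with 50% higher gas prices and a source that is 2 K warmer\n",
"c_high_gas = config_variant(c, gas_price_multiplier=1.5, tsource_delta=2)\n",
"```"
]
},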
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"class Store():\n",
" pass"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Model set-up"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"# Define the file paths\n",
"excel_file_path = r'C:\\Users\\shahla.huseynova\\source\\repos\\Mooi-Kickstart\\Kickstart\\V2\\Demand_Data_Smurfit_Preprocessed2.xlsx'\n",
"output_directory = r'C:\\Users\\shahla.huseynova\\source\\repos\\Mooi-Kickstart\\Kickstart\\V2\\data'\n",
"output_file_name = 'smurfit_demand_preprocessed2.csv'\n",
"csv_file_path = os.path.join(output_directory, output_file_name)\n",
"\n",
"df = pd.read_excel(excel_file_path)\n",
"# df = df.fillna(0) \n",
"\n",
"# # Save the DataFrame to CSV\n",
"df.to_csv(csv_file_path, index=False)"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def load_demand_data(c, s):\n",
" demand = pd.read_csv('data/smurfit_demand_preprocessed2.csv', delimiter=',', decimal=';')\n",
" dt_index = pd.date_range(\n",
" start=s.time_fw.start,\n",
" end=s.time_fw.start + timedelta(days=365), \n",
" freq='1T',\n",
" tz='Europe/Amsterdam')\n",
" dt_index = dt_index[:len(demand)]\n",
" demand.index = dt_index\n",
" demand['Total demand'] = demand['MW (VDG)'] + demand['MW (NDG)']\n",
" demand = demand[c.start:c.end]\n",
" return demand"
]
},
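{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"`load_demand_data` expects `s.time_fw` to be set before it is called; in this notebook that happens inside `setup_model` further down, which also attaches the result as `s.demand`. For a stand-alone sanity check of the demand file, something like the sketch below could be used (illustrative only; it mirrors the `TimeFramework` call made later in `setup_model`).\n",
"\n",
"```python\n",
"s_check = Store()\n",
"s_check.time_fw = TimeFramework(start=c.start, end=c.end)\n",
"\n",
"demand_check = load_demand_data(c, s_check)\n",
"print(demand_check[['MW (VDG)', 'MW (NDG)', 'Total demand']].describe())\n",
"print('rows:', len(demand_check), '| from', demand_check.index.min(), 'to', demand_check.index.max())\n",
"```"
]
},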
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"s = Store()"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"testets\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"C:\\Users\\shahla.huseynova\\source\\repos\\Mooi-Kickstart\\pyrecoy\\pyrecoy\\pyrecoy2\\framework.py:41: UserWarning:\n",
"\n",
"The chosen timeperiod spans 365.99930555555557 days, which is not a full year. Beware that certain functions that use yearly rates might return incorrect values.\n",
"\n"
]
},
{
"ename": "OperationalError",
"evalue": "(pyodbc.OperationalError) ('08001', '[08001] [Microsoft][ODBC Driver 17 for SQL Server]TCP Provider: The wait operation timed out.\\r\\n (258) (SQLDriverConnect); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Invalid connection string attribute (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. (258)')\n(Background on this error at: https://sqlalche.me/e/14/e3q8)",
"output_type": "error",
"traceback": [
"\u001b[1;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[1;31mOperationalError\u001b[0m Traceback (most recent call last)",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:3280\u001b[0m, in \u001b[0;36mEngine._wrap_pool_connect\u001b[1;34m(self, fn, connection)\u001b[0m\n\u001b[0;32m 3279\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m-> 3280\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 3281\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m dialect\u001b[38;5;241m.\u001b[39mdbapi\u001b[38;5;241m.\u001b[39mError \u001b[38;5;28;01mas\u001b[39;00m e:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:310\u001b[0m, in \u001b[0;36mPool.connect\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 303\u001b[0m \u001b[38;5;250m\u001b[39m\u001b[38;5;124;03m\"\"\"Return a DBAPI connection from the pool.\u001b[39;00m\n\u001b[0;32m 304\u001b[0m \n\u001b[0;32m 305\u001b[0m \u001b[38;5;124;03mThe connection is instrumented such that when its\u001b[39;00m\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 308\u001b[0m \n\u001b[0;32m 309\u001b[0m \u001b[38;5;124;03m\"\"\"\u001b[39;00m\n\u001b[1;32m--> 310\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_ConnectionFairy\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_checkout\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:868\u001b[0m, in \u001b[0;36m_ConnectionFairy._checkout\u001b[1;34m(cls, pool, threadconns, fairy)\u001b[0m\n\u001b[0;32m 867\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m fairy:\n\u001b[1;32m--> 868\u001b[0m fairy \u001b[38;5;241m=\u001b[39m \u001b[43m_ConnectionRecord\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mcheckout\u001b[49m\u001b[43m(\u001b[49m\u001b[43mpool\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 870\u001b[0m fairy\u001b[38;5;241m.\u001b[39m_pool \u001b[38;5;241m=\u001b[39m pool\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:476\u001b[0m, in \u001b[0;36m_ConnectionRecord.checkout\u001b[1;34m(cls, pool)\u001b[0m\n\u001b[0;32m 474\u001b[0m \u001b[38;5;129m@classmethod\u001b[39m\n\u001b[0;32m 475\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mcheckout\u001b[39m(\u001b[38;5;28mcls\u001b[39m, pool):\n\u001b[1;32m--> 476\u001b[0m rec \u001b[38;5;241m=\u001b[39m \u001b[43mpool\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_do_get\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 477\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\impl.py:145\u001b[0m, in \u001b[0;36mQueuePool._do_get\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 144\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m:\n\u001b[1;32m--> 145\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m util\u001b[38;5;241m.\u001b[39msafe_reraise():\n\u001b[0;32m 146\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_dec_overflow()\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\langhelpers.py:70\u001b[0m, in \u001b[0;36msafe_reraise.__exit__\u001b[1;34m(self, type_, value, traceback)\u001b[0m\n\u001b[0;32m 69\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mwarn_only:\n\u001b[1;32m---> 70\u001b[0m \u001b[43mcompat\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 71\u001b[0m \u001b[43m \u001b[49m\u001b[43mexc_value\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 72\u001b[0m \u001b[43m \u001b[49m\u001b[43mwith_traceback\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexc_tb\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 73\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 74\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\compat.py:208\u001b[0m, in \u001b[0;36mraise_\u001b[1;34m(***failed resolving arguments***)\u001b[0m\n\u001b[0;32m 207\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 208\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m exception\n\u001b[0;32m 209\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[0;32m 210\u001b[0m \u001b[38;5;66;03m# credit to\u001b[39;00m\n\u001b[0;32m 211\u001b[0m \u001b[38;5;66;03m# https://cosmicpercolator.com/2016/01/13/exception-leaks-in-python-2-and-3/\u001b[39;00m\n\u001b[0;32m 212\u001b[0m \u001b[38;5;66;03m# as the __traceback__ object creates a cycle\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\impl.py:143\u001b[0m, in \u001b[0;36mQueuePool._do_get\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 142\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 143\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_create_connection\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 144\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:256\u001b[0m, in \u001b[0;36mPool._create_connection\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 254\u001b[0m \u001b[38;5;250m\u001b[39m\u001b[38;5;124;03m\"\"\"Called by subclasses to create a new ConnectionRecord.\"\"\"\u001b[39;00m\n\u001b[1;32m--> 256\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_ConnectionRecord\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:371\u001b[0m, in \u001b[0;36m_ConnectionRecord.__init__\u001b[1;34m(self, pool, connect)\u001b[0m\n\u001b[0;32m 370\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m connect:\n\u001b[1;32m--> 371\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m__connect\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 372\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mfinalize_callback \u001b[38;5;241m=\u001b[39m deque()\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:665\u001b[0m, in \u001b[0;36m_ConnectionRecord.__connect\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 664\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m--> 665\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m util\u001b[38;5;241m.\u001b[39msafe_reraise():\n\u001b[0;32m 666\u001b[0m pool\u001b[38;5;241m.\u001b[39mlogger\u001b[38;5;241m.\u001b[39mdebug(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mError on connect(): \u001b[39m\u001b[38;5;132;01m%s\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, e)\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\langhelpers.py:70\u001b[0m, in \u001b[0;36msafe_reraise.__exit__\u001b[1;34m(self, type_, value, traceback)\u001b[0m\n\u001b[0;32m 69\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mwarn_only:\n\u001b[1;32m---> 70\u001b[0m \u001b[43mcompat\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 71\u001b[0m \u001b[43m \u001b[49m\u001b[43mexc_value\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 72\u001b[0m \u001b[43m \u001b[49m\u001b[43mwith_traceback\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexc_tb\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 73\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 74\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\compat.py:208\u001b[0m, in \u001b[0;36mraise_\u001b[1;34m(***failed resolving arguments***)\u001b[0m\n\u001b[0;32m 207\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 208\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m exception\n\u001b[0;32m 209\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[0;32m 210\u001b[0m \u001b[38;5;66;03m# credit to\u001b[39;00m\n\u001b[0;32m 211\u001b[0m \u001b[38;5;66;03m# https://cosmicpercolator.com/2016/01/13/exception-leaks-in-python-2-and-3/\u001b[39;00m\n\u001b[0;32m 212\u001b[0m \u001b[38;5;66;03m# as the __traceback__ object creates a cycle\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:661\u001b[0m, in \u001b[0;36m_ConnectionRecord.__connect\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 660\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mstarttime \u001b[38;5;241m=\u001b[39m time\u001b[38;5;241m.\u001b[39mtime()\n\u001b[1;32m--> 661\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdbapi_connection \u001b[38;5;241m=\u001b[39m connection \u001b[38;5;241m=\u001b[39m \u001b[43mpool\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_invoke_creator\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[0;32m 662\u001b[0m pool\u001b[38;5;241m.\u001b[39mlogger\u001b[38;5;241m.\u001b[39mdebug(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mCreated new connection \u001b[39m\u001b[38;5;132;01m%r\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, connection)\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\create.py:590\u001b[0m, in \u001b[0;36mcreate_engine..connect\u001b[1;34m(connection_record)\u001b[0m\n\u001b[0;32m 589\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m connection\n\u001b[1;32m--> 590\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m dialect\u001b[38;5;241m.\u001b[39mconnect(\u001b[38;5;241m*\u001b[39mcargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mcparams)\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\default.py:597\u001b[0m, in \u001b[0;36mDefaultDialect.connect\u001b[1;34m(self, *cargs, **cparams)\u001b[0m\n\u001b[0;32m 595\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconnect\u001b[39m(\u001b[38;5;28mself\u001b[39m, \u001b[38;5;241m*\u001b[39mcargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mcparams):\n\u001b[0;32m 596\u001b[0m \u001b[38;5;66;03m# inherits the docstring from interfaces.Dialect.connect\u001b[39;00m\n\u001b[1;32m--> 597\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdbapi\u001b[38;5;241m.\u001b[39mconnect(\u001b[38;5;241m*\u001b[39mcargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mcparams)\n",
"\u001b[1;31mOperationalError\u001b[0m: ('08001', '[08001] [Microsoft][ODBC Driver 17 for SQL Server]TCP Provider: The wait operation timed out.\\r\\n (258) (SQLDriverConnect); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Invalid connection string attribute (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. (258)')",
"\nThe above exception was the direct cause of the following exception:\n",
"\u001b[1;31mOperationalError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[1;32mIn[7], line 25\u001b[0m\n\u001b[0;32m 22\u001b[0m s\u001b[38;5;241m.\u001b[39mdemand \u001b[38;5;241m=\u001b[39m load_demand_data(c, s)\n\u001b[0;32m 23\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m s\n\u001b[1;32m---> 25\u001b[0m s \u001b[38;5;241m=\u001b[39m \u001b[43msetup_model\u001b[49m\u001b[43m(\u001b[49m\u001b[43mc\u001b[49m\u001b[43m)\u001b[49m\n",
"Cell \u001b[1;32mIn[7], line 6\u001b[0m, in \u001b[0;36msetup_model\u001b[1;34m(c)\u001b[0m\n\u001b[0;32m 4\u001b[0m s\u001b[38;5;241m.\u001b[39mtime_fw \u001b[38;5;241m=\u001b[39m TimeFramework(start\u001b[38;5;241m=\u001b[39mc\u001b[38;5;241m.\u001b[39mstart, end\u001b[38;5;241m=\u001b[39mc\u001b[38;5;241m.\u001b[39mend)\n\u001b[0;32m 5\u001b[0m \u001b[38;5;66;03m# s.minute_ix = s.time_fw.dt_index(freq='1T')\u001b[39;00m\n\u001b[1;32m----> 6\u001b[0m mipf \u001b[38;5;241m=\u001b[39m \u001b[43mMipf\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstart\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43ms\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mtime_fw\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mstart\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mend\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43ms\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mtime_fw\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mend\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mfrom_database\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43madd_days_to_start_end\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mdata\n\u001b[0;32m 7\u001b[0m s\u001b[38;5;241m.\u001b[39mmipf \u001b[38;5;241m=\u001b[39m mipf\n\u001b[0;32m 8\u001b[0m s\u001b[38;5;241m.\u001b[39mbaseline \u001b[38;5;241m=\u001b[39m CaseStudy(time_fw\u001b[38;5;241m=\u001b[39ms\u001b[38;5;241m.\u001b[39mtime_fw, freq\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124m1T\u001b[39m\u001b[38;5;124m'\u001b[39m, name\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mBaseline\u001b[39m\u001b[38;5;124m'\u001b[39m)\n",
"File \u001b[1;32m~\\source\\repos\\Mooi-Kickstart\\pyrecoy\\pyrecoy\\pyrecoy2\\forecasts.py:98\u001b[0m, in \u001b[0;36mMipf.__init__\u001b[1;34m(self, *args, **kwargs)\u001b[0m\n\u001b[0;32m 97\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m__init__\u001b[39m(\u001b[38;5;28mself\u001b[39m, \u001b[38;5;241m*\u001b[39margs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs):\n\u001b[1;32m---> 98\u001b[0m \u001b[38;5;28msuper\u001b[39m()\u001b[38;5;241m.\u001b[39m\u001b[38;5;21m__init__\u001b[39m(\u001b[38;5;241m*\u001b[39margs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[0;32m 99\u001b[0m forecasts \u001b[38;5;241m=\u001b[39m get_imbalance_forecasts_from_database_on_quarter_start_time(kwargs[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mstart\u001b[39m\u001b[38;5;124m'\u001b[39m], kwargs[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mend\u001b[39m\u001b[38;5;124m'\u001b[39m],\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mNLD\u001b[39m\u001b[38;5;124m'\u001b[39m)\n\u001b[0;32m 100\u001b[0m forecasts[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mPublicationTime\u001b[39m\u001b[38;5;124m'\u001b[39m] \u001b[38;5;241m=\u001b[39m pd\u001b[38;5;241m.\u001b[39mto_datetime(forecasts[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mPublicationTime\u001b[39m\u001b[38;5;124m'\u001b[39m], utc\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m)\n",
"File \u001b[1;32m~\\source\\repos\\Mooi-Kickstart\\pyrecoy\\pyrecoy\\pyrecoy2\\forecasts.py:45\u001b[0m, in \u001b[0;36mForecast.__init__\u001b[1;34m(self, filename, start, end, freq, folder_path, from_database, add_days_to_start_end)\u001b[0m\n\u001b[0;32m 42\u001b[0m end \u001b[38;5;241m=\u001b[39m datetime\u001b[38;5;241m.\u001b[39mstrptime(end, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m%\u001b[39m\u001b[38;5;124mY-\u001b[39m\u001b[38;5;124m%\u001b[39m\u001b[38;5;124mm-\u001b[39m\u001b[38;5;132;01m%d\u001b[39;00m\u001b[38;5;124m\"\u001b[39m)\u001b[38;5;241m.\u001b[39mastimezone(pytz\u001b[38;5;241m.\u001b[39mtimezone(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mEurope/Amsterdam\u001b[39m\u001b[38;5;124m'\u001b[39m))\n\u001b[0;32m 43\u001b[0m \u001b[38;5;28mprint\u001b[39m(end)\n\u001b[1;32m---> 45\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdata \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget_dataset\u001b[49m\u001b[43m(\u001b[49m\u001b[43mstart\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mend\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mfreq\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mfolder_path\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mfolder_path\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43madd_days_to_start_end\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43madd_days_to_start_end\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 47\u001b[0m \u001b[38;5;66;03m# print(self.data)\u001b[39;00m\n\u001b[0;32m 49\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mlen\u001b[39m(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdata) \u001b[38;5;241m==\u001b[39m \u001b[38;5;241m0\u001b[39m:\n",
"File \u001b[1;32m~\\source\\repos\\Mooi-Kickstart\\pyrecoy\\pyrecoy\\pyrecoy2\\forecasts.py:61\u001b[0m, in \u001b[0;36mForecast.get_dataset\u001b[1;34m(self, start, end, freq, folder_path, add_days_to_start_end)\u001b[0m\n\u001b[0;32m 58\u001b[0m start \u001b[38;5;241m=\u001b[39m start\u001b[38;5;241m.\u001b[39mastimezone(pytz\u001b[38;5;241m.\u001b[39mutc)\n\u001b[0;32m 59\u001b[0m end \u001b[38;5;241m=\u001b[39m end\u001b[38;5;241m.\u001b[39mastimezone(pytz\u001b[38;5;241m.\u001b[39mutc)\n\u001b[1;32m---> 61\u001b[0m dam \u001b[38;5;241m=\u001b[39m \u001b[43mget_day_ahead_prices_from_database\u001b[49m\u001b[43m(\u001b[49m\u001b[43mstart\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mend\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mNLD\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[0;32m 62\u001b[0m dam \u001b[38;5;241m=\u001b[39m dam\u001b[38;5;241m.\u001b[39mresample(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124m15T\u001b[39m\u001b[38;5;124m'\u001b[39m)\u001b[38;5;241m.\u001b[39mffill()\n\u001b[0;32m 64\u001b[0m imb \u001b[38;5;241m=\u001b[39m get_imbalance_prices_from_database(start, end, \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mNLD\u001b[39m\u001b[38;5;124m'\u001b[39m)\n",
"File \u001b[1;32mc:\\users\\shahla.huseynova\\documents\\asset_studies_recoy\\asset-case-studies\\asset-case-studies\\pyrecoy\\pyrecoy\\pyrecoy\\prices.py:627\u001b[0m, in \u001b[0;36mget_day_ahead_prices_from_database\u001b[1;34m(start, end, CountryIsoCode, tz)\u001b[0m\n\u001b[0;32m 626\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mget_day_ahead_prices_from_database\u001b[39m(start, end, CountryIsoCode, tz\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mutc\u001b[39m\u001b[38;5;124m\"\u001b[39m):\n\u001b[1;32m--> 627\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mget_price_data_from_database\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 628\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mDayAheadPrices\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[0;32m 629\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mHourStartTime\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[0;32m 630\u001b[0m \u001b[43m \u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mPrice\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 631\u001b[0m \u001b[43m \u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mDAM\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 632\u001b[0m \u001b[43m \u001b[49m\u001b[43mstart\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 633\u001b[0m \u001b[43m \u001b[49m\u001b[43mend\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 634\u001b[0m \u001b[43m \u001b[49m\u001b[43mCountryIsoCode\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 635\u001b[0m \u001b[43m \u001b[49m\u001b[43mtz\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtz\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 636\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mc:\\users\\shahla.huseynova\\documents\\asset_studies_recoy\\asset-case-studies\\asset-case-studies\\pyrecoy\\pyrecoy\\pyrecoy\\prices.py:603\u001b[0m, in \u001b[0;36mget_price_data_from_database\u001b[1;34m(database_name, time_index_column, database_columns, rename_columns, start, end, CountryIsoCode, tz, to_datetime_columns)\u001b[0m\n\u001b[0;32m 601\u001b[0m table \u001b[38;5;241m=\u001b[39m database_name\n\u001b[0;32m 602\u001b[0m md \u001b[38;5;241m=\u001b[39m MetaData(ENGINE_PRICES)\n\u001b[1;32m--> 603\u001b[0m table \u001b[38;5;241m=\u001b[39m \u001b[43mTable\u001b[49m\u001b[43m(\u001b[49m\u001b[43mtable\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mmd\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mautoload\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\u001b[43m)\u001b[49m\n\u001b[0;32m 604\u001b[0m session \u001b[38;5;241m=\u001b[39m sessionmaker(bind\u001b[38;5;241m=\u001b[39mENGINE_PRICES)()\n\u001b[0;32m 605\u001b[0m end \u001b[38;5;241m=\u001b[39m end \u001b[38;5;241m+\u001b[39m timedelta(days\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m+\u001b[39m\u001b[38;5;241m1\u001b[39m)\n",
"File \u001b[1;32m:2\u001b[0m, in \u001b[0;36m__new__\u001b[1;34m(cls, *args, **kw)\u001b[0m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\deprecations.py:309\u001b[0m, in \u001b[0;36mdeprecated_params..decorate..warned\u001b[1;34m(fn, *args, **kwargs)\u001b[0m\n\u001b[0;32m 302\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m m \u001b[38;5;129;01min\u001b[39;00m kwargs:\n\u001b[0;32m 303\u001b[0m _warn_with_version(\n\u001b[0;32m 304\u001b[0m messages[m],\n\u001b[0;32m 305\u001b[0m versions[m],\n\u001b[0;32m 306\u001b[0m version_warnings[m],\n\u001b[0;32m 307\u001b[0m stacklevel\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m3\u001b[39m,\n\u001b[0;32m 308\u001b[0m )\n\u001b[1;32m--> 309\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m fn(\u001b[38;5;241m*\u001b[39margs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\sql\\schema.py:615\u001b[0m, in \u001b[0;36mTable.__new__\u001b[1;34m(cls, *args, **kw)\u001b[0m\n\u001b[0;32m 613\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m table\n\u001b[0;32m 614\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m:\n\u001b[1;32m--> 615\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m util\u001b[38;5;241m.\u001b[39msafe_reraise():\n\u001b[0;32m 616\u001b[0m metadata\u001b[38;5;241m.\u001b[39m_remove_table(name, schema)\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\langhelpers.py:70\u001b[0m, in \u001b[0;36msafe_reraise.__exit__\u001b[1;34m(self, type_, value, traceback)\u001b[0m\n\u001b[0;32m 68\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m \u001b[38;5;66;03m# remove potential circular references\u001b[39;00m\n\u001b[0;32m 69\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mwarn_only:\n\u001b[1;32m---> 70\u001b[0m \u001b[43mcompat\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 71\u001b[0m \u001b[43m \u001b[49m\u001b[43mexc_value\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 72\u001b[0m \u001b[43m \u001b[49m\u001b[43mwith_traceback\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexc_tb\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 73\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 74\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 75\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m compat\u001b[38;5;241m.\u001b[39mpy3k \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info[\u001b[38;5;241m1\u001b[39m]:\n\u001b[0;32m 76\u001b[0m \u001b[38;5;66;03m# emulate Py3K's behavior of telling us when an exception\u001b[39;00m\n\u001b[0;32m 77\u001b[0m \u001b[38;5;66;03m# occurs in an exception handler.\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\compat.py:208\u001b[0m, in \u001b[0;36mraise_\u001b[1;34m(***failed resolving arguments***)\u001b[0m\n\u001b[0;32m 205\u001b[0m exception\u001b[38;5;241m.\u001b[39m__cause__ \u001b[38;5;241m=\u001b[39m replace_context\n\u001b[0;32m 207\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 208\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m exception\n\u001b[0;32m 209\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[0;32m 210\u001b[0m \u001b[38;5;66;03m# credit to\u001b[39;00m\n\u001b[0;32m 211\u001b[0m \u001b[38;5;66;03m# https://cosmicpercolator.com/2016/01/13/exception-leaks-in-python-2-and-3/\u001b[39;00m\n\u001b[0;32m 212\u001b[0m \u001b[38;5;66;03m# as the __traceback__ object creates a cycle\u001b[39;00m\n\u001b[0;32m 213\u001b[0m \u001b[38;5;28;01mdel\u001b[39;00m exception, replace_context, from_, with_traceback\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\sql\\schema.py:611\u001b[0m, in \u001b[0;36mTable.__new__\u001b[1;34m(cls, *args, **kw)\u001b[0m\n\u001b[0;32m 609\u001b[0m metadata\u001b[38;5;241m.\u001b[39m_add_table(name, schema, table)\n\u001b[0;32m 610\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 611\u001b[0m table\u001b[38;5;241m.\u001b[39m_init(name, metadata, \u001b[38;5;241m*\u001b[39margs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkw)\n\u001b[0;32m 612\u001b[0m table\u001b[38;5;241m.\u001b[39mdispatch\u001b[38;5;241m.\u001b[39mafter_parent_attach(table, metadata)\n\u001b[0;32m 613\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m table\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\sql\\schema.py:686\u001b[0m, in \u001b[0;36mTable._init\u001b[1;34m(self, name, metadata, *args, **kwargs)\u001b[0m\n\u001b[0;32m 682\u001b[0m \u001b[38;5;66;03m# load column definitions from the database if 'autoload' is defined\u001b[39;00m\n\u001b[0;32m 683\u001b[0m \u001b[38;5;66;03m# we do it after the table is in the singleton dictionary to support\u001b[39;00m\n\u001b[0;32m 684\u001b[0m \u001b[38;5;66;03m# circular foreign keys\u001b[39;00m\n\u001b[0;32m 685\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m autoload:\n\u001b[1;32m--> 686\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_autoload\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 687\u001b[0m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 688\u001b[0m \u001b[43m \u001b[49m\u001b[43mautoload_with\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 689\u001b[0m \u001b[43m \u001b[49m\u001b[43minclude_columns\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 690\u001b[0m \u001b[43m \u001b[49m\u001b[43m_extend_on\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43m_extend_on\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 691\u001b[0m \u001b[43m \u001b[49m\u001b[43mresolve_fks\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mresolve_fks\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 692\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 694\u001b[0m \u001b[38;5;66;03m# initialize all the column, etc. objects. done after reflection to\u001b[39;00m\n\u001b[0;32m 695\u001b[0m \u001b[38;5;66;03m# allow user-overrides\u001b[39;00m\n\u001b[0;32m 697\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_init_items(\n\u001b[0;32m 698\u001b[0m \u001b[38;5;241m*\u001b[39margs,\n\u001b[0;32m 699\u001b[0m allow_replacements\u001b[38;5;241m=\u001b[39mextend_existing \u001b[38;5;129;01mor\u001b[39;00m keep_existing \u001b[38;5;129;01mor\u001b[39;00m autoload\n\u001b[0;32m 700\u001b[0m )\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\sql\\schema.py:719\u001b[0m, in \u001b[0;36mTable._autoload\u001b[1;34m(self, metadata, autoload_with, include_columns, exclude_columns, resolve_fks, _extend_on)\u001b[0m\n\u001b[0;32m 711\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m autoload_with \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m 712\u001b[0m autoload_with \u001b[38;5;241m=\u001b[39m _bind_or_error(\n\u001b[0;32m 713\u001b[0m metadata,\n\u001b[0;32m 714\u001b[0m msg\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mNo engine is bound to this Table\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124ms MetaData. \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[0;32m 715\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mPass an engine to the Table via \u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[0;32m 716\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mautoload_with=\u001b[39m\u001b[38;5;124m\"\u001b[39m,\n\u001b[0;32m 717\u001b[0m )\n\u001b[1;32m--> 719\u001b[0m insp \u001b[38;5;241m=\u001b[39m \u001b[43minspection\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43minspect\u001b[49m\u001b[43m(\u001b[49m\u001b[43mautoload_with\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 720\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m insp\u001b[38;5;241m.\u001b[39m_inspection_context() \u001b[38;5;28;01mas\u001b[39;00m conn_insp:\n\u001b[0;32m 721\u001b[0m conn_insp\u001b[38;5;241m.\u001b[39mreflect_table(\n\u001b[0;32m 722\u001b[0m \u001b[38;5;28mself\u001b[39m,\n\u001b[0;32m 723\u001b[0m include_columns,\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 726\u001b[0m _extend_on\u001b[38;5;241m=\u001b[39m_extend_on,\n\u001b[0;32m 727\u001b[0m )\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\inspection.py:64\u001b[0m, in \u001b[0;36minspect\u001b[1;34m(subject, raiseerr)\u001b[0m\n\u001b[0;32m 62\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m reg \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[0;32m 63\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m subject\n\u001b[1;32m---> 64\u001b[0m ret \u001b[38;5;241m=\u001b[39m \u001b[43mreg\u001b[49m\u001b[43m(\u001b[49m\u001b[43msubject\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 65\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m ret \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m 66\u001b[0m \u001b[38;5;28;01mbreak\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\reflection.py:182\u001b[0m, in \u001b[0;36mInspector._engine_insp\u001b[1;34m(bind)\u001b[0m\n\u001b[0;32m 180\u001b[0m \u001b[38;5;129m@inspection\u001b[39m\u001b[38;5;241m.\u001b[39m_inspects(Engine)\n\u001b[0;32m 181\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_engine_insp\u001b[39m(bind):\n\u001b[1;32m--> 182\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mInspector\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_construct\u001b[49m\u001b[43m(\u001b[49m\u001b[43mInspector\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_init_engine\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbind\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\reflection.py:117\u001b[0m, in \u001b[0;36mInspector._construct\u001b[1;34m(cls, init, bind)\u001b[0m\n\u001b[0;32m 114\u001b[0m \u001b[38;5;28mcls\u001b[39m \u001b[38;5;241m=\u001b[39m bind\u001b[38;5;241m.\u001b[39mdialect\u001b[38;5;241m.\u001b[39minspector\n\u001b[0;32m 116\u001b[0m \u001b[38;5;28mself\u001b[39m \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mcls\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;21m__new__\u001b[39m(\u001b[38;5;28mcls\u001b[39m)\n\u001b[1;32m--> 117\u001b[0m \u001b[43minit\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbind\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 118\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\reflection.py:128\u001b[0m, in \u001b[0;36mInspector._init_engine\u001b[1;34m(self, engine)\u001b[0m\n\u001b[0;32m 126\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_init_engine\u001b[39m(\u001b[38;5;28mself\u001b[39m, engine):\n\u001b[0;32m 127\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mbind \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mengine \u001b[38;5;241m=\u001b[39m engine\n\u001b[1;32m--> 128\u001b[0m \u001b[43mengine\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mconnect\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[38;5;241m.\u001b[39mclose()\n\u001b[0;32m 129\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_op_context_requires_connect \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n\u001b[0;32m 130\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdialect \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mengine\u001b[38;5;241m.\u001b[39mdialect\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:3234\u001b[0m, in \u001b[0;36mEngine.connect\u001b[1;34m(self, close_with_result)\u001b[0m\n\u001b[0;32m 3219\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconnect\u001b[39m(\u001b[38;5;28mself\u001b[39m, close_with_result\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mFalse\u001b[39;00m):\n\u001b[0;32m 3220\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Return a new :class:`_engine.Connection` object.\u001b[39;00m\n\u001b[0;32m 3221\u001b[0m \n\u001b[0;32m 3222\u001b[0m \u001b[38;5;124;03m The :class:`_engine.Connection` object is a facade that uses a DBAPI\u001b[39;00m\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 3231\u001b[0m \n\u001b[0;32m 3232\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[1;32m-> 3234\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_connection_cls\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mclose_with_result\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mclose_with_result\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:96\u001b[0m, in \u001b[0;36mConnection.__init__\u001b[1;34m(self, engine, connection, close_with_result, _branch_from, _execution_options, _dispatch, _has_events, _allow_revalidate)\u001b[0m\n\u001b[0;32m 91\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_has_events \u001b[38;5;241m=\u001b[39m _branch_from\u001b[38;5;241m.\u001b[39m_has_events\n\u001b[0;32m 92\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 93\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_dbapi_connection \u001b[38;5;241m=\u001b[39m (\n\u001b[0;32m 94\u001b[0m connection\n\u001b[0;32m 95\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m connection \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[1;32m---> 96\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m \u001b[43mengine\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraw_connection\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 97\u001b[0m )\n\u001b[0;32m 99\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_transaction \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_nested_transaction \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[0;32m 100\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m__savepoint_seq \u001b[38;5;241m=\u001b[39m \u001b[38;5;241m0\u001b[39m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:3313\u001b[0m, in \u001b[0;36mEngine.raw_connection\u001b[1;34m(self, _connection)\u001b[0m\n\u001b[0;32m 3291\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mraw_connection\u001b[39m(\u001b[38;5;28mself\u001b[39m, _connection\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m):\n\u001b[0;32m 3292\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Return a \"raw\" DBAPI connection from the connection pool.\u001b[39;00m\n\u001b[0;32m 3293\u001b[0m \n\u001b[0;32m 3294\u001b[0m \u001b[38;5;124;03m The returned object is a proxied version of the DBAPI\u001b[39;00m\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 3311\u001b[0m \n\u001b[0;32m 3312\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[1;32m-> 3313\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_wrap_pool_connect\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mpool\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mconnect\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m_connection\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:3283\u001b[0m, in \u001b[0;36mEngine._wrap_pool_connect\u001b[1;34m(self, fn, connection)\u001b[0m\n\u001b[0;32m 3281\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m dialect\u001b[38;5;241m.\u001b[39mdbapi\u001b[38;5;241m.\u001b[39mError \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[0;32m 3282\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m connection \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[1;32m-> 3283\u001b[0m \u001b[43mConnection\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_handle_dbapi_exception_noconnection\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 3284\u001b[0m \u001b[43m \u001b[49m\u001b[43me\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdialect\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\n\u001b[0;32m 3285\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 3286\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 3287\u001b[0m util\u001b[38;5;241m.\u001b[39mraise_(\n\u001b[0;32m 3288\u001b[0m sys\u001b[38;5;241m.\u001b[39mexc_info()[\u001b[38;5;241m1\u001b[39m], with_traceback\u001b[38;5;241m=\u001b[39msys\u001b[38;5;241m.\u001b[39mexc_info()[\u001b[38;5;241m2\u001b[39m]\n\u001b[0;32m 3289\u001b[0m )\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:2117\u001b[0m, in \u001b[0;36mConnection._handle_dbapi_exception_noconnection\u001b[1;34m(cls, e, dialect, engine)\u001b[0m\n\u001b[0;32m 2115\u001b[0m util\u001b[38;5;241m.\u001b[39mraise_(newraise, with_traceback\u001b[38;5;241m=\u001b[39mexc_info[\u001b[38;5;241m2\u001b[39m], from_\u001b[38;5;241m=\u001b[39me)\n\u001b[0;32m 2116\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m should_wrap:\n\u001b[1;32m-> 2117\u001b[0m \u001b[43mutil\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 2118\u001b[0m \u001b[43m \u001b[49m\u001b[43msqlalchemy_exception\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mwith_traceback\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexc_info\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m2\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mfrom_\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43me\u001b[49m\n\u001b[0;32m 2119\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 2120\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 2121\u001b[0m util\u001b[38;5;241m.\u001b[39mraise_(exc_info[\u001b[38;5;241m1\u001b[39m], with_traceback\u001b[38;5;241m=\u001b[39mexc_info[\u001b[38;5;241m2\u001b[39m])\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\compat.py:208\u001b[0m, in \u001b[0;36mraise_\u001b[1;34m(***failed resolving arguments***)\u001b[0m\n\u001b[0;32m 205\u001b[0m exception\u001b[38;5;241m.\u001b[39m__cause__ \u001b[38;5;241m=\u001b[39m replace_context\n\u001b[0;32m 207\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 208\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m exception\n\u001b[0;32m 209\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[0;32m 210\u001b[0m \u001b[38;5;66;03m# credit to\u001b[39;00m\n\u001b[0;32m 211\u001b[0m \u001b[38;5;66;03m# https://cosmicpercolator.com/2016/01/13/exception-leaks-in-python-2-and-3/\u001b[39;00m\n\u001b[0;32m 212\u001b[0m \u001b[38;5;66;03m# as the __traceback__ object creates a cycle\u001b[39;00m\n\u001b[0;32m 213\u001b[0m \u001b[38;5;28;01mdel\u001b[39;00m exception, replace_context, from_, with_traceback\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\base.py:3280\u001b[0m, in \u001b[0;36mEngine._wrap_pool_connect\u001b[1;34m(self, fn, connection)\u001b[0m\n\u001b[0;32m 3278\u001b[0m dialect \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdialect\n\u001b[0;32m 3279\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m-> 3280\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 3281\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m dialect\u001b[38;5;241m.\u001b[39mdbapi\u001b[38;5;241m.\u001b[39mError \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[0;32m 3282\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m connection \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:310\u001b[0m, in \u001b[0;36mPool.connect\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 302\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconnect\u001b[39m(\u001b[38;5;28mself\u001b[39m):\n\u001b[0;32m 303\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Return a DBAPI connection from the pool.\u001b[39;00m\n\u001b[0;32m 304\u001b[0m \n\u001b[0;32m 305\u001b[0m \u001b[38;5;124;03m The connection is instrumented such that when its\u001b[39;00m\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 308\u001b[0m \n\u001b[0;32m 309\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[1;32m--> 310\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_ConnectionFairy\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_checkout\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:868\u001b[0m, in \u001b[0;36m_ConnectionFairy._checkout\u001b[1;34m(cls, pool, threadconns, fairy)\u001b[0m\n\u001b[0;32m 865\u001b[0m \u001b[38;5;129m@classmethod\u001b[39m\n\u001b[0;32m 866\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_checkout\u001b[39m(\u001b[38;5;28mcls\u001b[39m, pool, threadconns\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m, fairy\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m):\n\u001b[0;32m 867\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m fairy:\n\u001b[1;32m--> 868\u001b[0m fairy \u001b[38;5;241m=\u001b[39m \u001b[43m_ConnectionRecord\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mcheckout\u001b[49m\u001b[43m(\u001b[49m\u001b[43mpool\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 870\u001b[0m fairy\u001b[38;5;241m.\u001b[39m_pool \u001b[38;5;241m=\u001b[39m pool\n\u001b[0;32m 871\u001b[0m fairy\u001b[38;5;241m.\u001b[39m_counter \u001b[38;5;241m=\u001b[39m \u001b[38;5;241m0\u001b[39m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:476\u001b[0m, in \u001b[0;36m_ConnectionRecord.checkout\u001b[1;34m(cls, pool)\u001b[0m\n\u001b[0;32m 474\u001b[0m \u001b[38;5;129m@classmethod\u001b[39m\n\u001b[0;32m 475\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mcheckout\u001b[39m(\u001b[38;5;28mcls\u001b[39m, pool):\n\u001b[1;32m--> 476\u001b[0m rec \u001b[38;5;241m=\u001b[39m \u001b[43mpool\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_do_get\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 477\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m 478\u001b[0m dbapi_connection \u001b[38;5;241m=\u001b[39m rec\u001b[38;5;241m.\u001b[39mget_connection()\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\impl.py:145\u001b[0m, in \u001b[0;36mQueuePool._do_get\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 143\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_create_connection()\n\u001b[0;32m 144\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m:\n\u001b[1;32m--> 145\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m util\u001b[38;5;241m.\u001b[39msafe_reraise():\n\u001b[0;32m 146\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_dec_overflow()\n\u001b[0;32m 147\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\langhelpers.py:70\u001b[0m, in \u001b[0;36msafe_reraise.__exit__\u001b[1;34m(self, type_, value, traceback)\u001b[0m\n\u001b[0;32m 68\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m \u001b[38;5;66;03m# remove potential circular references\u001b[39;00m\n\u001b[0;32m 69\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mwarn_only:\n\u001b[1;32m---> 70\u001b[0m \u001b[43mcompat\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 71\u001b[0m \u001b[43m \u001b[49m\u001b[43mexc_value\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 72\u001b[0m \u001b[43m \u001b[49m\u001b[43mwith_traceback\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexc_tb\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 73\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 74\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 75\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m compat\u001b[38;5;241m.\u001b[39mpy3k \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info[\u001b[38;5;241m1\u001b[39m]:\n\u001b[0;32m 76\u001b[0m \u001b[38;5;66;03m# emulate Py3K's behavior of telling us when an exception\u001b[39;00m\n\u001b[0;32m 77\u001b[0m \u001b[38;5;66;03m# occurs in an exception handler.\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\compat.py:208\u001b[0m, in \u001b[0;36mraise_\u001b[1;34m(***failed resolving arguments***)\u001b[0m\n\u001b[0;32m 205\u001b[0m exception\u001b[38;5;241m.\u001b[39m__cause__ \u001b[38;5;241m=\u001b[39m replace_context\n\u001b[0;32m 207\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 208\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m exception\n\u001b[0;32m 209\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[0;32m 210\u001b[0m \u001b[38;5;66;03m# credit to\u001b[39;00m\n\u001b[0;32m 211\u001b[0m \u001b[38;5;66;03m# https://cosmicpercolator.com/2016/01/13/exception-leaks-in-python-2-and-3/\u001b[39;00m\n\u001b[0;32m 212\u001b[0m \u001b[38;5;66;03m# as the __traceback__ object creates a cycle\u001b[39;00m\n\u001b[0;32m 213\u001b[0m \u001b[38;5;28;01mdel\u001b[39;00m exception, replace_context, from_, with_traceback\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\impl.py:143\u001b[0m, in \u001b[0;36mQueuePool._do_get\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 141\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_inc_overflow():\n\u001b[0;32m 142\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 143\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_create_connection\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 144\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m:\n\u001b[0;32m 145\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m util\u001b[38;5;241m.\u001b[39msafe_reraise():\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:256\u001b[0m, in \u001b[0;36mPool._create_connection\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 253\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_create_connection\u001b[39m(\u001b[38;5;28mself\u001b[39m):\n\u001b[0;32m 254\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Called by subclasses to create a new ConnectionRecord.\"\"\"\u001b[39;00m\n\u001b[1;32m--> 256\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_ConnectionRecord\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m)\u001b[49m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:371\u001b[0m, in \u001b[0;36m_ConnectionRecord.__init__\u001b[1;34m(self, pool, connect)\u001b[0m\n\u001b[0;32m 369\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m__pool \u001b[38;5;241m=\u001b[39m pool\n\u001b[0;32m 370\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m connect:\n\u001b[1;32m--> 371\u001b[0m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m__connect\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 372\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mfinalize_callback \u001b[38;5;241m=\u001b[39m deque()\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:665\u001b[0m, in \u001b[0;36m_ConnectionRecord.__connect\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 663\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mfresh \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n\u001b[0;32m 664\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m--> 665\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m util\u001b[38;5;241m.\u001b[39msafe_reraise():\n\u001b[0;32m 666\u001b[0m pool\u001b[38;5;241m.\u001b[39mlogger\u001b[38;5;241m.\u001b[39mdebug(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mError on connect(): \u001b[39m\u001b[38;5;132;01m%s\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, e)\n\u001b[0;32m 667\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 668\u001b[0m \u001b[38;5;66;03m# in SQLAlchemy 1.4 the first_connect event is not used by\u001b[39;00m\n\u001b[0;32m 669\u001b[0m \u001b[38;5;66;03m# the engine, so this will usually not be set\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\langhelpers.py:70\u001b[0m, in \u001b[0;36msafe_reraise.__exit__\u001b[1;34m(self, type_, value, traceback)\u001b[0m\n\u001b[0;32m 68\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m \u001b[38;5;66;03m# remove potential circular references\u001b[39;00m\n\u001b[0;32m 69\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mwarn_only:\n\u001b[1;32m---> 70\u001b[0m \u001b[43mcompat\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_\u001b[49m\u001b[43m(\u001b[49m\n\u001b[0;32m 71\u001b[0m \u001b[43m \u001b[49m\u001b[43mexc_value\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 72\u001b[0m \u001b[43m \u001b[49m\u001b[43mwith_traceback\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexc_tb\u001b[49m\u001b[43m,\u001b[49m\n\u001b[0;32m 73\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[0;32m 74\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[0;32m 75\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m compat\u001b[38;5;241m.\u001b[39mpy3k \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_exc_info[\u001b[38;5;241m1\u001b[39m]:\n\u001b[0;32m 76\u001b[0m \u001b[38;5;66;03m# emulate Py3K's behavior of telling us when an exception\u001b[39;00m\n\u001b[0;32m 77\u001b[0m \u001b[38;5;66;03m# occurs in an exception handler.\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\util\\compat.py:208\u001b[0m, in \u001b[0;36mraise_\u001b[1;34m(***failed resolving arguments***)\u001b[0m\n\u001b[0;32m 205\u001b[0m exception\u001b[38;5;241m.\u001b[39m__cause__ \u001b[38;5;241m=\u001b[39m replace_context\n\u001b[0;32m 207\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m--> 208\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m exception\n\u001b[0;32m 209\u001b[0m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[0;32m 210\u001b[0m \u001b[38;5;66;03m# credit to\u001b[39;00m\n\u001b[0;32m 211\u001b[0m \u001b[38;5;66;03m# https://cosmicpercolator.com/2016/01/13/exception-leaks-in-python-2-and-3/\u001b[39;00m\n\u001b[0;32m 212\u001b[0m \u001b[38;5;66;03m# as the __traceback__ object creates a cycle\u001b[39;00m\n\u001b[0;32m 213\u001b[0m \u001b[38;5;28;01mdel\u001b[39;00m exception, replace_context, from_, with_traceback\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\pool\\base.py:661\u001b[0m, in \u001b[0;36m_ConnectionRecord.__connect\u001b[1;34m(self)\u001b[0m\n\u001b[0;32m 659\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m 660\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mstarttime \u001b[38;5;241m=\u001b[39m time\u001b[38;5;241m.\u001b[39mtime()\n\u001b[1;32m--> 661\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdbapi_connection \u001b[38;5;241m=\u001b[39m connection \u001b[38;5;241m=\u001b[39m \u001b[43mpool\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_invoke_creator\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[0;32m 662\u001b[0m pool\u001b[38;5;241m.\u001b[39mlogger\u001b[38;5;241m.\u001b[39mdebug(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mCreated new connection \u001b[39m\u001b[38;5;132;01m%r\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, connection)\n\u001b[0;32m 663\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mfresh \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\create.py:590\u001b[0m, in \u001b[0;36mcreate_engine..connect\u001b[1;34m(connection_record)\u001b[0m\n\u001b[0;32m 588\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m connection \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m 589\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m connection\n\u001b[1;32m--> 590\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m dialect\u001b[38;5;241m.\u001b[39mconnect(\u001b[38;5;241m*\u001b[39mcargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mcparams)\n",
"File \u001b[1;32mC:\\ProgramData\\anaconda3\\lib\\site-packages\\sqlalchemy\\engine\\default.py:597\u001b[0m, in \u001b[0;36mDefaultDialect.connect\u001b[1;34m(self, *cargs, **cparams)\u001b[0m\n\u001b[0;32m 595\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mconnect\u001b[39m(\u001b[38;5;28mself\u001b[39m, \u001b[38;5;241m*\u001b[39mcargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mcparams):\n\u001b[0;32m 596\u001b[0m \u001b[38;5;66;03m# inherits the docstring from interfaces.Dialect.connect\u001b[39;00m\n\u001b[1;32m--> 597\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mdbapi\u001b[38;5;241m.\u001b[39mconnect(\u001b[38;5;241m*\u001b[39mcargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mcparams)\n",
"\u001b[1;31mOperationalError\u001b[0m: (pyodbc.OperationalError) ('08001', '[08001] [Microsoft][ODBC Driver 17 for SQL Server]TCP Provider: The wait operation timed out.\\r\\n (258) (SQLDriverConnect); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]Invalid connection string attribute (0); [08001] [Microsoft][ODBC Driver 17 for SQL Server]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online. (258)')\n(Background on this error at: https://sqlalche.me/e/14/e3q8)"
]
}
],
"source": [
"def setup_model(c):\n",
" s = Store()\n",
" \n",
" s.time_fw = TimeFramework(start=c.start, end=c.end)\n",
" # s.minute_ix = s.time_fw.dt_index(freq='1T')\n",
" mipf = Mipf(None, start=s.time_fw.start, end=s.time_fw.end, from_database=True, add_days_to_start_end=False).data\n",
" s.mipf = mipf\n",
" s.baseline = CaseStudy(time_fw=s.time_fw, freq='1T', name='Baseline')\n",
" s.hpcase = CaseStudy(time_fw=s.time_fw, freq='1T', name='Heatpump only', data=mipf)\n",
" #s.hpcase_sde = CaseStudy(time_fw=s.time_fw, freq='1T', name='Heatpump + SDE', data=mipf)\n",
" #s.optcase1 = CaseStudy(time_fw=s.time_fw, freq='1T', name='Optimisation', data=mipf)\n",
" #s.afrr_case = CaseStudy(time_fw=s.time_fw, freq='1T', name='Optimisation + aFRR', data=mipf)\n",
" s.storage_case_PCM = CaseStudy(time_fw=s.time_fw, freq='1T', name='Heatpump + Storage_PCM', data=mipf)\n",
" # s.storage_case_TCM = CaseStudy(time_fw=s.time_fw, freq='1T', name='Heatpump + Storage_TCM', data=mipf)\n",
" # s.storage_case_battery = CaseStudy(time_fw=s.time_fw, freq='1T', name='Heatpump + Storage_battery', data=mipf)\n",
" s.cases = list(CaseStudy.instances.values())\n",
" #s.optcases = [s.hpcase, s.hpcase_sde, s.optcase1, s.afrr_case]\n",
" s.optcases = [s.hpcase, s.storage_case_PCM]\n",
" #s.sde_cases = [s.hpcase_sde, s.optcase1, s.afrr_case]\n",
" s.sde_cases = []\n",
" \n",
" s.demand = load_demand_data(c, s)\n",
" return s\n",
"\n",
"s = setup_model(c)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.demand.resample('15T').mean()[['Tsource (VDG)', 'Tsink (VDG)']].iplot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.demand.describe()"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Load in data"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def add_afrr_prices(c, case):\n",
" try:\n",
" aFRR_signal = pd.read_csv(f'data/aFRR_{c.start}.csv', delimiter=';', decimal=',', index_col='datetime')\n",
" aFRR_signal.index = case.data.index\n",
" except:\n",
" data = get_tennet_data('balansdelta2017', pd.to_datetime(c.start), pd.to_datetime(c.end))\n",
" data.index = data[[\"datum\", \"tijd\"]].apply(lambda x: \" \".join(x), axis=1)\n",
" data.index = pd.to_datetime(data.index, format=\"%d-%m-%Y %H:%M\").tz_localize(\n",
" \"Europe/Amsterdam\", ambiguous=True\n",
" )\n",
" data = data[~data.index.duplicated(keep=\"first\")]\n",
" date_ix = pd.date_range(\n",
" data.index[0], data.index[-1], freq=\"1T\", tz=\"Europe/Amsterdam\"\n",
" )\n",
" data = data.reindex(date_ix)\n",
" aFRR_signal = data[['Hoogste_prijs_opregelen', 'Mid_prijs_opregelen', 'Laagste_prijs_afregelen']]\n",
" aFRR_signal.to_csv(f'data/aFRR_{c.start}.csv', sep=';', decimal=',', index_label='datetime')\n",
"\n",
" try:\n",
" aFRR_prices = pd.read_csv(f'data/aFRR_prices_{c.start}.csv', delimiter=';', decimal=',', index_col='datetime')\n",
" aFRR_prices.index = case.data.index\n",
" except:\n",
" data = get_aFRR_prices_nl(pd.to_datetime(c.start), pd.to_datetime(c.end))\n",
" data.index = pd.date_range(\n",
" start=c.start,\n",
" end=pd.to_datetime(c.end) + timedelta(days=1),\n",
" tz='Europe/Amsterdam', \n",
" freq='15T', \n",
" closed='left'\n",
" )\n",
" aFRR_prices = data.reindex(case.data.index, method='ffill')\n",
" aFRR_prices.to_csv(f'data/aFRR_prices_{c.start}.csv', sep=';', decimal=',', index_label='datetime')\n",
" \n",
" case.data = pd.concat([case.data, aFRR_signal, aFRR_prices], axis=1)\n",
" return case"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def increase_volatility_by_factor(col, factor):\n",
" mean = col.mean()\n",
" diff_to_mean = col - mean\n",
" new_diff = diff_to_mean * factor\n",
" return mean + new_diff\n",
"\n",
"def multiply_by_factor(col, factor):\n",
" mean = col.mean()\n",
" diff_to_mean = col - mean\n",
" \n",
" cond = diff_to_mean > 0\n",
" diff_to_mean[cond] *= factor\n",
" diff_to_mean[~cond] /= factor\n",
" return mean + diff_to_mean"
]
},
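{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Illustrative sanity check on the two helpers above (toy numbers, not model inputs):\n",
"# increase_volatility_by_factor scales all deviations from the mean symmetrically,\n",
"# while multiply_by_factor widens the spread asymmetrically (values above the mean\n",
"# are multiplied by the factor, values below the mean are divided by it).\n",
"toy_prices = pd.Series([20.0, 40.0, 60.0, 80.0])\n",
"pd.DataFrame({\n",
"    'original': toy_prices,\n",
"    'volatility x2': increase_volatility_by_factor(toy_prices, 2),\n",
"    'spread x2': multiply_by_factor(toy_prices, 2),\n",
"})"
]
},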
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def load_data(c, s):\n",
" if hasattr(s, 'afrr_case'):\n",
" s.afrr_case = add_afrr_prices(c, s.afrr_case)\n",
" \n",
" for case in s.cases:\n",
" case.add_gasprices()\n",
" case.add_co2prices(perMWh=True)\n",
" \n",
" case.data['Gas prices (€/MWh)'] *= c.gas_price_multiplier\n",
" case.data['CO2 prices (€/MWh)'] *= c.co2_price_multiplier\n",
" case.data['CO2 prices (€/ton)'] *= c.co2_price_multiplier\n",
" \n",
" for case in s.optcases:\n",
" case.data['NEG'] = multiply_by_factor(case.data['NEG'], c.e_price_multiplier)\n",
" case.data['ForeNeg'] = multiply_by_factor(case.data['ForeNeg'], c.e_price_multiplier)\n",
" case.data['DAM'] = multiply_by_factor(case.data['DAM'], c.e_price_multiplier)\n",
" \n",
" case.data['NEG'] = increase_volatility_by_factor(case.data['NEG'], c.e_price_volatility_multiplier)\n",
" case.data['ForeNeg'] = increase_volatility_by_factor(case.data['ForeNeg'], c.e_price_volatility_multiplier)\n",
" \n",
" if hasattr(s, 'afrr_case'):\n",
" for case in [s.afrr_case]:\n",
" case.data['Hoogste_prijs_opregelen'] = multiply_by_factor(case.data['Hoogste_prijs_opregelen'], c.e_price_multiplier)\n",
" case.data['Hoogste_prijs_opregelen'] = increase_volatility_by_factor(case.data['Hoogste_prijs_opregelen'], c.e_price_volatility_multiplier)\n",
" case.data['aFRR_up'] = multiply_by_factor(case.data['aFRR_up'], c.e_price_multiplier)\n",
" case.data['aFRR_up'] = increase_volatility_by_factor(case.data['aFRR_up'], c.e_price_volatility_multiplier)\n",
"\n",
" s.demand[['Tsource (VDG)', 'Tsource (NDG)']] += c.tsource_delta\n",
" s.demand[['Tsink (VDG)', 'Tsink (NDG)']] += c.tsink_delta\n",
" for case in s.cases:\n",
" case.data = pd.concat([case.data, s.demand], axis=1) \n",
"\n",
" s.eb_ode_g = get_tax_rate('gas', 2020, 4)['EB+ODE'] * c.energy_tax_multiplier\n",
" s.eb_ode_e = get_tax_rate('electricity', 2020, 4)['EB+ODE'] * c.energy_tax_multiplier\n",
" s.grid_fees = get_grid_tariffs_electricity(c.grid_operator, 2020, c.connection_type)\n",
" s.grid_fee_per_MWh = s.grid_fees['kWh tarief'] * 1000\n",
" return s\n",
"\n",
"s = load_data(c, s)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.hpcase.data.head()"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Assets"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### COP curve"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def cop_curve(Tsink, Tsource):\n",
" Tsink += 273\n",
" Tsource += 273\n",
"\n",
" c1 = 0.267 * Tsink / (Tsink - Tsource)\n",
" c2 = 0.333 * Tsink / (Tsink - Tsource)\n",
" \n",
" return Polynomial([c2, c1])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"cop_curve(140, 13)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"cop_curve(140, 73)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"heatpump = Heatpump(\n",
" name='Heatpump',\n",
" max_th_power=1,\n",
" min_th_power=0,\n",
" cop_curve=cop_curve\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#heatpump.get_cop(heat_output=load, Tsink=140, Tsource=13)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#heatpump.get_cop(heat_output=load, Tsink=140, Tsource=99)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"import itertools\n",
"\n",
"source_Ts = np.arange(25, 75) + 273\n",
"sink_Ts = np.arange(80, 170) + 273\n",
"\n",
"df = pd.DataFrame(columns=list(sink_Ts), index=list(source_Ts))\n",
"for sourceT, sinkT in itertools.product(source_Ts, sink_Ts):\n",
" df.loc[sourceT, sinkT] = heatpump.get_cop(heat_output=0.5, Tsink=sinkT, Tsource=sourceT)\n",
" \n",
"#df.to_csv('cops_at_50perc_load.csv', sep=';', decimal=',')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sourceT = 63 + 273\n",
"sinkT = 140 + 273\n",
"loads = np.arange(0, 1, 0.01)\n",
"\n",
"cop_series = pd.Series(index=loads, dtype='float')\n",
"load_series = pd.Series(index=loads, dtype='float')\n",
"for load in loads:\n",
" cop = heatpump.get_cop(heat_output=load, Tsink=sinkT, Tsource=sourceT)\n",
" cop_series[load] = cop\n",
" load_series[load] = load / cop "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"fig_cop_curve = cop_series.iplot(title=f'Heatpump COP curve at Tsource={sourceT} and Tsink={sinkT}', yTitle='COP', xTitle='Thermal load in %', colors=recoygreen, asFigure=True, dimensions=(600,400))\n",
"fig_cop_curve"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_load_curve = load_series.iplot(title=f'Heatpump Load curve at Tsource={sourceT} and Tsink={sinkT}', yTitle='E-load', xTitle='Thermal load in %', colors=recoygreen, asFigure=True, dimensions=(600,400))\n",
"fig_load_curve"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def cop_curve_new(Tsink, Tsource):\n",
" Tsink += 273\n",
" Tsource += 273\n",
" Tlift = Tsink - Tsource\n",
"\n",
" c0 = 0.0005426*Tlift**2 - 0.1178*Tlift + 6.962\n",
" c1 = 6.7058 \n",
" c2 = -1.79\n",
" \n",
" return Polynomial([c0, c1, c2])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sourceT = 63 + 273\n",
"sinkT = 140 + 273\n",
"cop_curve_new(sourceT, sinkT)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"heatpump = Heatpump(\n",
" name='Heatpump',\n",
" max_th_power=1,\n",
" min_th_power=0,\n",
" cop_curve=cop_curve_new\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"import itertools\n",
"\n",
"source_Ts = np.arange(25, 75) + 273\n",
"sink_Ts = np.arange(80, 170) + 273\n",
"\n",
"df = pd.DataFrame(columns=list(sink_Ts), index=list(source_Ts))\n",
"for sourceT, sinkT in itertools.product(source_Ts, sink_Ts):\n",
" df.loc[sourceT, sinkT] = heatpump.get_cop(heat_output=0.5, Tsink=sinkT, Tsource=sourceT)\n",
" \n",
"#df.to_csv('cops_at_50perc_load.csv', sep=';', decimal=',')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sourceT = 63 + 273\n",
"sinkT = 140 + 273\n",
"loads = np.arange(0, 1, 0.01)\n",
"\n",
"cop_series = pd.Series(index=loads, dtype='float')\n",
"load_series = pd.Series(index=loads, dtype='float')\n",
"for load in loads:\n",
" cop = heatpump.get_cop(heat_output=load, Tsink=sinkT, Tsource=sourceT)\n",
" cop_series[load] = cop\n",
" load_series[load] = load / cop "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sourceTs = np.arange(50, 100)\n",
"sinkT = 140 + 273\n",
"load = 1 \n",
"cop_series = pd.Series(index=sourceTs, dtype='float')\n",
"\n",
"for sourceT in sourceTs:\n",
" cop = heatpump.get_cop(heat_output=load, Tsink=sinkT, Tsource=sourceT + 273)\n",
" cop_series[sourceT] = cop\n",
" \n",
"cop_series.iplot(yrange=[0, 8], xTitle='Source Temperature', yTitle='COP', dimensions=(800, 400))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"fig_cop_curve = cop_series.iplot(title=f'Heatpump COP curve at Tsource={sourceT} and Tsink={sinkT}', yTitle='COP', xTitle='Thermal load in %', colors=recoygreen, asFigure=True, dimensions=(600,400))\n",
"fig_cop_curve"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_load_curve = load_series.iplot(title=f'Heatpump Load curve at Tsource={sourceT} and Tsink={sinkT}', yTitle='E-load', xTitle='Thermal load in %', colors=recoygreen, asFigure=True, dimensions=(600,400))\n",
"fig_load_curve"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Create and assign assets"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def create_and_assign_assets(c, s):\n",
" heatpump_vdg = Heatpump(\n",
" name='Heatpump VDG',\n",
" max_th_power=c.hp_vdg_e_power,\n",
" min_th_power=c.hp_vdg_e_power * c.hp_min_load,\n",
" cop_curve=cop_curve_new\n",
" )\n",
"\n",
" heatpump_ndg = Heatpump(\n",
" name='Heatpump NDG',\n",
" max_th_power=c.hp_ndg_e_power,\n",
" min_th_power=c.hp_ndg_e_power * c.hp_min_load,\n",
" cop_curve=cop_curve_new\n",
" )\n",
"\n",
" capex_vdg = c.hp_capex*(heatpump_vdg.max_th_power) \n",
" capex_ndg = c.hp_capex*(heatpump_ndg.max_th_power)\n",
" heatpump_vdg.set_financials(capex=capex_vdg, opex=c.hp_opex*capex_vdg, devex=c.hp_devex*capex_vdg, lifetime=25)\n",
" heatpump_ndg.set_financials(capex=capex_ndg, opex=c.hp_opex*capex_ndg, devex=c.hp_devex*capex_ndg, lifetime=25)\n",
"\n",
" gasboiler = GasBoiler(\n",
" name='Gasboiler',\n",
" max_th_output=c.gb_power,\n",
" efficiency=c.gb_efficiency\n",
" )\n",
" gasboiler.set_financials(capex=0, opex=0, devex=0, lifetime=25)\n",
" \n",
" waterstorage = HotWaterStorage(\n",
" name='HotWaterStorage',\n",
" rated_power=c.storage_power, \n",
" capacity_per_volume=c.storage_cap_per_volume,\n",
" volume=c.storage_volume, \n",
" temperature=c.storage_temperature,\n",
" min_storagelevel=c.storage_min_level,\n",
" # initial_storagelevel=c.storage_initial_level\n",
" )\n",
" capex_ws = c.storage_capex_per_MW * waterstorage.max_power + c.storage_capex_per_MWh * waterstorage.capacity\n",
" opex_ws = c.storage_opex_perc_of_capex * capex_ws\n",
" waterstorage.set_financials(capex=capex_ws, opex=opex_ws, devex=0, lifetime=c.storage_lifetime)\n",
" \n",
" s.baseline.add_asset(gasboiler)\n",
" s.hpcase.add_asset(heatpump_vdg)\n",
" s.hpcase.add_asset(heatpump_ndg)\n",
" s.storage_case_PCM.add_asset(heatpump_vdg)\n",
" s.storage_case_PCM.add_asset(heatpump_ndg)\n",
" s.storage_case_PCM.add_asset(waterstorage)\n",
"# s.hpcase_sde.add_asset(heatpump_vdg)\n",
"# s.hpcase_sde.add_asset(heatpump_ndg)\n",
"# s.optcase1.add_asset(heatpump_vdg)\n",
"# s.optcase1.add_asset(heatpump_ndg)\n",
"# s.optcase1.add_asset(gasboiler)\n",
"# s.afrr_case.add_asset(heatpump_vdg)\n",
"# s.afrr_case.add_asset(heatpump_ndg)\n",
"# s.afrr_case.add_asset(gasboiler)\n",
" return s\n",
"\n",
"s = create_and_assign_assets(c, s)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.assets['HotWaterStorage']"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Optimization"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Strategies"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def baseline_sim(case):\n",
" gasboiler = list(case.assets.values())[0]\n",
" data = case.data\n",
" demand = (data['MW (VDG)'] + data['MW (NDG)']).to_list()\n",
"\n",
" minutes = iter(range(len(case.data)))\n",
" th_output = [0] * len(case.data)\n",
" gas_input = [0] * len(case.data)\n",
"\n",
" for m in minutes:\n",
" th_output[m], gas_input[m] = gasboiler.set_heat_output(demand[m])\n",
"\n",
" data['output_MW_th'] = np.array(th_output)\n",
" data['output_MWh_th'] = np.array(data['output_MW_th']/60)\n",
" data['gb_input_MW'] = np.array(gas_input)\n",
" data['gb_input_MWh'] = np.array(data['gb_input_MW']/60)\n",
" case.data = data.round(5)\n",
" return case"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def hponly(case):\n",
" hp_vdg = case.assets['Heatpump VDG']\n",
" hp_ndg = case.assets['Heatpump NDG']\n",
" demand_vdg = case.data['MW (VDG)'].to_list()\n",
" demand_ndg = case.data['MW (NDG)'].to_list()\n",
" Tsink_vdg = case.data['Tsink (VDG)'].to_list()\n",
" Tsink_ndg = case.data['Tsink (NDG)'].to_list()\n",
" Tsource_vdg = case.data['Tsource (VDG)'].to_list()\n",
" Tsource_ndg = case.data['Tsource (NDG)'].to_list()\n",
"\n",
" hp_vdg_input = [0] * len(case.data)\n",
" hp_ndg_input = [0] * len(case.data)\n",
" hp_vdg_output = [0] * len(case.data)\n",
" hp_ndg_output = [0] * len(case.data)\n",
"\n",
" minutes = iter(range(len(case.data)))\n",
" for m in minutes:\n",
" demand = demand_vdg[m]\n",
" if demand != 0:\n",
" hp_vdg_input[m], hp_vdg_output[m] = hp_vdg.set_heat_output(\n",
" heat_output=demand,\n",
" Tsink=Tsink_vdg[m],\n",
" Tsource=Tsource_vdg[m]\n",
" )\n",
"\n",
" demand = demand_ndg[m]\n",
" if demand != 0:\n",
" hp_ndg_input[m], hp_ndg_output[m] = hp_ndg.set_heat_output(\n",
" heat_output=demand_ndg[m],\n",
" Tsink=Tsink_ndg[m],\n",
" Tsource=Tsource_ndg[m]\n",
" )\n",
"\n",
" case.data['hp_output_MW'] = np.array(hp_vdg_output) + np.array(hp_ndg_output)\n",
" case.data['hp_input_MW'] = np.array(hp_vdg_input) + np.array(hp_ndg_input)\n",
" case.data['cop'] = case.data['hp_output_MW'] / -case.data['hp_input_MW']\n",
" \n",
" for col in case.data.columns:\n",
" if col.endswith('MW'):\n",
" case.data[col + 'h'] = case.data[col] / 60\n",
"\n",
" case.data = case.data.round(3)\n",
" return case"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"Tref = 0\n",
"Cp = 4190 #J/kgK\n",
"MWtoJs = 1000_000\n",
"\n",
"def power_to_mass_flow(power_MW, Tsink, Tref, Cp):\n",
" return power_MW * MWtoJs /(Cp*(Tsink - Tref))\n",
"def energy_to_storage(hp_heat_output_MW, process_demand_MW):\n",
" return hp_heat_output_MW - process_demand_MW #MW\n",
"\n",
"def Tsource_calculation(Tstorage, discharge_power, Tsource, process_mass_flow): \n",
" discharge_mass_flow = power_to_mass_flow(discharge_power, Tstorage, Tref, Cp)\n",
" \n",
" combined_mass_flow = (discharge_mass_flow + process_mass_flow)\n",
" if combined_mass_flow == 0:\n",
" return Tsource\n",
" else: \n",
" return (Tstorage * discharge_mass_flow + Tsource * process_mass_flow) / combined_mass_flow"
]
},
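{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Worked example for the helpers above (toy numbers, not taken from the case data):\n",
"# 1 MW of process demand at a 140 degC sink corresponds to roughly\n",
"# 1e6 / (4190 * 140) ~ 1.7 kg/s of water mass flow. Mixing that flow with storage\n",
"# water at a hypothetical 85 degC discharge raises the effective source temperature\n",
"# seen by the heat pump above the original 60 degC.\n",
"process_flow = power_to_mass_flow(power_MW=1, Tsink=140, Tref=Tref, Cp=Cp)\n",
"mixed_tsource = Tsource_calculation(Tstorage=85, discharge_power=0.5, Tsource=60, process_mass_flow=process_flow)\n",
"process_flow, mixed_tsource"
]
},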
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def hp_storage_opt_enginge(c, ws, hp_vdg, hp_ndg, pos, neg, dam, demand_vdg, demand_ndg, tsource_vdg, tsink_vdg, tsource_ndg, tsink_ndg):\n",
" \n",
" from_storage_vdg_MW = 0\n",
" to_storage_vdg_MW = 0 \n",
" from_storage_ndg_MW = 0\n",
" to_storage_ndg_MW = 0 \n",
" \n",
" if neg < dam - c.threshold:\n",
" # overproduce\n",
" if demand_vdg != 0:\n",
" desired_hp_load_vdg = min(demand_vdg + ws.charging_power_limit, hp_vdg.max_th_power)\n",
" e_load_vdg, th_load_vdg = hp_vdg.set_heat_output(desired_hp_load_vdg, tsink_vdg, tsource_vdg)\n",
"\n",
" to_storage_vdg_MW = th_load_vdg - demand_vdg\n",
" to_storage_vdg_MW = -ws.charge(round(to_storage_vdg_MW, 3))\n",
"\n",
" extra_charging_power_constraint = ws.max_power - to_storage_vdg_MW\n",
" else:\n",
" e_load_vdg, th_load_vdg = (0,0)\n",
" \n",
" if demand_ndg != 0:\n",
" desired_hp_load_ndg = min(\n",
" demand_ndg + min(ws.charging_power_limit, extra_charging_power_constraint), \n",
" hp_ndg.max_th_power\n",
" )\n",
" e_load_ndg, th_load_ndg = hp_ndg.set_heat_output(desired_hp_load_ndg, tsink_ndg, tsource_ndg)\n",
"\n",
" to_storage_ndg_MW = th_load_ndg - demand_ndg\n",
" to_storage_ndg_MW = -ws.charge(round(to_storage_ndg_MW, 3))\n",
" else:\n",
" e_load_ndg, th_load_ndg = (0,0)\n",
" \n",
"\n",
" elif pos > dam + c.threshold:\n",
" # take from storage\n",
" if demand_vdg != 0:\n",
" from_process_massflow_vdg = power_to_mass_flow(demand_vdg, tsink_vdg, Tref, Cp)\n",
" try:\n",
" from_storage_vdg_MW = ws.discharging_power_limit\n",
" tsource_vdg = Tsource_calculation(ws.temperature, from_storage_vdg_MW, tsource_vdg, from_process_massflow_vdg)\n",
" e_load_vdg, th_load_vdg = hp_vdg.set_heat_output(demand_vdg, tsink_vdg, tsource_vdg)\n",
" except: \n",
" from_storage_vdg_MW = 0\n",
" tsource_vdg = Tsource_calculation(ws.temperature, from_storage_vdg_MW, tsource_vdg, from_process_massflow_vdg)\n",
" e_load_vdg, th_load_vdg = hp_vdg.set_heat_output(demand_vdg, tsink_vdg, tsource_vdg)\n",
"\n",
" from_storage_vdg_MW = ws.discharge(round(from_storage_vdg_MW, 3))\n",
" else:\n",
" e_load_vdg, th_load_vdg = (0,0)\n",
" \n",
" #print({f'from_storage_vdg_MW='})\n",
" if demand_ndg != 0:\n",
" from_process_massflow_ndg = power_to_mass_flow(demand_ndg, tsink_ndg, Tref, Cp)\n",
" \n",
" try:\n",
" from_storage_ndg_MW = min(ws.max_power - from_storage_vdg_MW, ws.discharging_power_limit)\n",
" tsource_ndg = Tsource_calculation(ws.temperature, from_storage_ndg_MW, tsource_ndg, from_process_massflow_ndg)\n",
" e_load_ndg, th_load_ndg = hp_ndg.set_heat_output(demand_ndg, tsink_ndg, tsource_ndg)\n",
" except: \n",
" from_storage_vdg_MW = 0\n",
" tsource_ndg = Tsource_calculation(ws.temperature, from_storage_ndg_MW, tsource_ndg, from_process_massflow_ndg)\n",
" e_load_ndg, th_load_ndg = hp_ndg.set_heat_output(demand_ndg, tsink_ndg, tsource_ndg)\n",
"\n",
" from_storage_ndg_MW = ws.discharge(round(from_storage_ndg_MW))\n",
" else:\n",
" e_load_ndg, th_load_ndg = (0,0)\n",
" else:\n",
" e_load_vdg, th_load_vdg = hp_vdg.set_heat_output(\n",
" heat_output=demand_vdg,\n",
" Tsink=tsink_vdg,\n",
" Tsource=tsource_vdg\n",
" )\n",
"\n",
" e_load_ndg, th_load_ndg = hp_ndg.set_heat_output(\n",
" heat_output=demand_ndg,\n",
" Tsink=tsink_ndg,\n",
" Tsource=tsource_ndg\n",
" )\n",
" return e_load_vdg, th_load_vdg, e_load_ndg, th_load_ndg, tsource_vdg, tsource_ndg, to_storage_vdg_MW, from_storage_vdg_MW, to_storage_ndg_MW, from_storage_ndg_MW, ws, hp_vdg, hp_ndg\n",
"# Neg-take price, Pos-feed price"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def hponly_with_storage(case):\n",
" hp_vdg = case.assets['Heatpump VDG']\n",
" hp_ndg = case.assets['Heatpump NDG']\n",
" ws = case.assets['HotWaterStorage']\n",
" demands_vdg = case.data['MW (VDG)'].to_list()\n",
" demands_ndg = case.data['MW (NDG)'].to_list()\n",
" Tsink_vdg = case.data['Tsink (VDG)'].to_list()\n",
" Tsink_ndg = case.data['Tsink (NDG)'].to_list()\n",
" Tsource_vdg = case.data['Tsource (VDG)'].to_list()\n",
" Tsource_ndg = case.data['Tsource (NDG)'].to_list()\n",
" dam_prices = case.data['DAM'].to_list()\n",
" pos_prices = case.data['POS'].to_list()\n",
" neg_prices = case.data['NEG'].to_list()\n",
"\n",
" hp_vdg_input = [0] * len(case.data)\n",
" hp_ndg_input = [0] * len(case.data)\n",
" hp_vdg_output = [0] * len(case.data)\n",
" hp_ndg_output = [0] * len(case.data)\n",
"\n",
" minutes = len(case.data)\n",
" storage_levels = [None] * minutes\n",
" to_storage_list = [0] * minutes\n",
" from_storage_vdg_list = [0] * minutes\n",
" to_storage_vdg_list = [0] * minutes\n",
" from_storage_ndg_list = [0] * minutes\n",
" to_storage_ndg_list = [0] * minutes\n",
" \n",
" new_tsources_vdg = case.data['Tsource (VDG)'].to_list()\n",
" new_tsources_ndg = case.data['Tsource (NDG)'].to_list()\n",
" \n",
" ws.set_chargelevel(ws.min_chargelevel)\n",
" \n",
" for m in range(minutes):\n",
" tsource_vdg = Tsource_vdg[m]\n",
" tsink_vdg = Tsink_vdg[m]\n",
" tsource_ndg = Tsource_ndg[m]\n",
" tsink_ndg = Tsink_ndg[m]\n",
" demand_vdg = demands_vdg[m]\n",
" demand_ndg = demands_ndg[m]\n",
" \n",
" e_load_vdg, th_load_vdg, e_load_ndg, th_load_ndg, tsource_vdg, tsource_ndg, to_storage_vdg_MW, from_storage_vdg_MW, to_storage_ndg_MW, from_storage_ndg_MW, ws, hp_vdg, hp_ndg = hp_storage_opt_enginge(\n",
" c, ws, hp_vdg, hp_ndg, pos_prices[m], neg_prices[m], dam_prices[m], demand_vdg, demand_ndg, \n",
" tsource_vdg, tsink_vdg, tsource_ndg, tsink_ndg\n",
" )\n",
" \n",
" hp_vdg_input[m] = e_load_vdg\n",
" hp_vdg_output[m] = th_load_vdg\n",
" hp_ndg_input[m] = e_load_ndg\n",
" hp_ndg_output[m] = th_load_ndg\n",
" \n",
" storage_levels[m] = ws.chargelevel\n",
" new_tsources_vdg[m] = tsource_vdg\n",
" new_tsources_ndg[m] = tsource_ndg\n",
" to_storage_vdg_list[m] = to_storage_vdg_MW\n",
" from_storage_vdg_list[m] = from_storage_vdg_MW\n",
" to_storage_ndg_list[m] = to_storage_ndg_MW\n",
" from_storage_ndg_list[m] = from_storage_ndg_MW\n",
" \n",
" case.data['hp_output_MW'] = np.array(hp_vdg_output) + np.array(hp_ndg_output)\n",
" case.data['hp_input_MW'] = np.array(hp_vdg_input) + np.array(hp_ndg_input)\n",
" case.data['cop'] = case.data['hp_output_MW'] / -case.data['hp_input_MW']\n",
" case.data['tsource_vdg'] = new_tsources_vdg\n",
" case.data['tsource_ndg'] = new_tsources_ndg\n",
" case.data['to_storage_ndg_MW'] = to_storage_ndg_list\n",
" case.data['from_storage_ndg_MW'] = from_storage_ndg_list\n",
" case.data['to_storage_vdg_MW'] = to_storage_vdg_list\n",
" case.data['from_storage_vdg_MW'] = from_storage_vdg_list\n",
" case.data['to_storage_MW'] = case.data['to_storage_vdg_MW'] + case.data['to_storage_ndg_MW']\n",
" case.data['from_storage_MW'] = case.data['from_storage_vdg_MW'] + case.data['from_storage_ndg_MW']\n",
" case.data['storage_level_MWh'] = storage_levels\n",
" \n",
" for col in case.data.columns:\n",
" if col.endswith('MW'):\n",
" case.data[col + 'h'] = case.data[col] / 60\n",
"\n",
" case.data = case.data.round(3)\n",
" return case\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# charge_times = print(sum(s.storage_case_PCM.data['to_storage_MW']!= 0))\n",
"# discharge_times = print(sum(s.storage_case_PCM.data['from_storage_MW']!= 0))\n",
"# # 154476 times out of 525600, the storage is used to charge/dischage\n",
"# # from 39 000 to 110 000 discharge time increases\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM = hponly_with_storage(s.storage_case_PCM)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data.sample(2)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data[['DAM', 'POS', 'NEG', 'hp_output_MW', 'to_storage_MW', 'Total demand']].describe()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data[['DAM', 'POS', 'NEG', 'hp_output_MW', 'to_storage_MW', 'from_storage_MW','Total demand']].sample(20)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.hpcase.data.columns"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data.columns"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# hp_selection = s.hpcase.data[['DAM', 'POS', 'NEG', 'hp_input_MW', 'hp_output_MW', 'cop', 'Total demand']]\n",
"hp_selection = s.hpcase.data[['DAM', 'POS', 'NEG', 'Total demand']]\n",
"storage_selection = s.storage_case_PCM.data[['hp_input_MW', 'hp_output_MW', 'cop', 'to_storage_MW', 'from_storage_MW']]\n",
"\n",
"selection = pd.concat([hp_selection, storage_selection], axis=1)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"selection.sample(20)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"selection.mean()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"########################\n",
"\n",
"# This is where we left\n",
"\n",
"########################"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def cost_function(th_load, cop, electricity_cost, alt_heat_price, demand):\n",
" return (\n",
" th_load / cop * electricity_cost\n",
" + (demand - th_load) * alt_heat_price\n",
" )"
]
},
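{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Quick illustration of the cost function above (hypothetical prices, not case data):\n",
"# with a COP of 3, electricity at 60 EUR/MWh and alternative (gas) heat at 40 EUR/MWh,\n",
"# serving 1 MWh_th with the heat pump costs 20 EUR versus 40 EUR from the alternative,\n",
"# so the optimisation would pick the full heat pump load in this minute.\n",
"demand_toy = 1.0  # MWh_th\n",
"cost_hp_full = cost_function(th_load=demand_toy, cop=3, electricity_cost=60, alt_heat_price=40, demand=demand_toy)\n",
"cost_hp_off = cost_function(th_load=0, cop=3, electricity_cost=60, alt_heat_price=40, demand=demand_toy)\n",
"cost_hp_full, cost_hp_off"
]
},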
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def hybrid_imb_optimisation(case, decimals, s):\n",
" gb = case.assets['Gasboiler']\n",
" hp_vdg = case.assets['Heatpump VDG']\n",
" hp_ndg = case.assets['Heatpump NDG']\n",
" demand_vdg = case.data['MW (VDG)'].round(decimals).to_list()\n",
" demand_ndg = case.data['MW (NDG)'].round(decimals).to_list()\n",
" Tsink_vdg = case.data['Tsink (VDG)'].round(decimals).to_list()\n",
" Tsink_ndg = case.data['Tsink (NDG)'].round(decimals).to_list()\n",
" Tsource_vdg = case.data['Tsource (VDG)'].round(decimals).to_list()\n",
" Tsource_ndg = case.data['Tsource (NDG)'].round(decimals).to_list()\n",
" fore_neg = case.data[c.forecast].fillna(999).round(decimals).to_list()\n",
" gas_prices = case.data['Gas prices (€/MWh)'].round(decimals).to_list()\n",
" co2_prices = case.data['CO2 prices (€/MWh)'].round(decimals).to_list()\n",
" eb_ode_g = s.eb_ode_g\n",
" eb_ode_e = s.eb_ode_e\n",
" \n",
" gb_input = [0] * len(case.data)\n",
" gb_output = [0] * len(case.data)\n",
"\n",
" minutes = range(len(case.data))\n",
" hp_output = [0] * len(case.data)\n",
" hp_input = [0] * len(case.data)\n",
"\n",
" for m in tqdm(minutes):\n",
" dem_vgd = demand_vdg[m]\n",
" if dem_vgd != 0:\n",
" max_load = min(hp_vdg.max_th_power, dem_vgd)\n",
" min_load = hp_vdg.min_th_power\n",
" Tsink = Tsink_vdg[m]\n",
" Tsource = Tsource_vdg[m]\n",
" cop_max_load = hp_vdg.get_cop(heat_output=max_load, Tsink=Tsink, Tsource=Tsource)\n",
" cop_min_load = hp_vdg.get_cop(heat_output=min_load, Tsink=Tsink_vdg[m], Tsource=Tsource_vdg[m])\n",
" \n",
" cost_full_load = cost_function(\n",
" th_load=max_load,\n",
" cop=cop_max_load,\n",
" electricity_cost=fore_neg[m] + eb_ode_e - c.sde_switch_price_correction,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_vgd\n",
" )\n",
" \n",
" cost_min_load = cost_function(\n",
" th_load=min_load,\n",
" cop=cop_min_load,\n",
" electricity_cost=fore_neg[m] + eb_ode_e,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_vgd\n",
" )\n",
" \n",
" if cost_full_load < cost_min_load:\n",
" hp_vdg_input, hp_vdg_output = hp_vdg.set_heat_output(max_load, Tsink, Tsource)\n",
" else:\n",
" hp_vdg_input, hp_vdg_output = hp_vdg.set_heat_output(min_load, Tsink, Tsource)\n",
" else:\n",
" hp_vdg_input, hp_vdg_output = (0, 0)\n",
"\n",
" dem_ngd = demand_ndg[m]\n",
" if dem_ngd != 0:\n",
" max_load = min(hp_ndg.max_th_power, dem_ngd)\n",
" min_load = hp_ndg.min_th_power\n",
" Tsink = Tsink_ndg[m]\n",
" Tsource = Tsource_ndg[m]\n",
" cop_max_load = hp_ndg.get_cop(heat_output=max_load, Tsink=Tsink, Tsource=Tsource)\n",
" cop_min_load = hp_ndg.get_cop(heat_output=min_load, Tsink=Tsink, Tsource=Tsource)\n",
" \n",
" cost_full_load = cost_function(\n",
" th_load=max_load,\n",
" cop=cop_max_load,\n",
" electricity_cost=fore_neg[m] + eb_ode_e,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_ngd\n",
" )\n",
" \n",
" cost_min_load = cost_function(\n",
" th_load=min_load,\n",
" cop=cop_min_load,\n",
" electricity_cost=fore_neg[m] + eb_ode_e,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_vgd\n",
" )\n",
" \n",
" if cost_full_load <= cost_min_load:\n",
" hp_ndg_input, hp_ndg_output = hp_ndg.set_heat_output(max_load, Tsink=Tsink, Tsource=Tsource)\n",
" else:\n",
" hp_ndg_input, hp_ndg_output = hp_ndg.set_heat_output(min_load, Tsink=Tsink, Tsource=Tsource)\n",
" else:\n",
" hp_ndg_input, hp_ndg_output = (0, 0)\n",
"\n",
" hp_out = hp_vdg_output + hp_ndg_output\n",
" hp_output[m] = hp_out\n",
" hp_input[m] = hp_vdg_input + hp_ndg_input\n",
" remaining_demand = max(dem_vgd+dem_ngd-hp_out, 0)\n",
" gb_output[m], gb_input[m] = gb.set_heat_output(remaining_demand)\n",
"\n",
" case.data['hp_output_MW'] = np.array(hp_output)\n",
" case.data['hp_input_MW'] = np.array(hp_input)\n",
" case.data['gb_output_MW'] = np.array(gb_output)\n",
" case.data['gb_input_MW'] = np.array(gb_input)\n",
"\n",
" for col in case.data.columns:\n",
" if col.endswith('MW'):\n",
" case.data[col + 'h'] = case.data[col] / 60\n",
"\n",
" case.data = case.data.round(5)\n",
" return case"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"aFRR\n",
"* Bid in a volume (X MW) --> Strategy is to only bid in on aFRR down\n",
"* Remaining demand is filed in by gasboiler\n",
"* Bid price at switch price?\n",
"* Assume direct response to 0% for now?\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def calc_afrr_capacity(case):\n",
" hp_vdg = case.assets['Heatpump VDG']\n",
" hp_ndg = case.assets['Heatpump NDG']\n",
" \n",
" capacity = 0\n",
" for hp in [hp_vdg, hp_ndg]:\n",
" max_th_output = hp.max_th_power\n",
" cop = hp.get_cop(max_th_output, Tsink=135, Tsource=60)\n",
" e_power = max_th_output / cop\n",
" capacity += e_power\n",
" return capacity"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def aFRR_optimisation(case, s):\n",
" s.afrr_case.afrr_capacity = calc_afrr_capacity(s.afrr_case)\n",
" gb = case.assets['Gasboiler']\n",
" hp_vdg = case.assets['Heatpump VDG']\n",
" hp_ndg = case.assets['Heatpump NDG']\n",
" demand_vdg = case.data['MW (VDG)'].to_list()\n",
" demand_ndg = case.data['MW (NDG)'].to_list()\n",
" Tsink_vdg = case.data['Tsink (VDG)'].to_list()\n",
" Tsink_ndg = case.data['Tsink (NDG)'].to_list()\n",
" Tsource_vdg = case.data['Tsource (VDG)'].to_list()\n",
" Tsource_ndg = case.data['Tsource (NDG)'].to_list()\n",
" afrr_up = case.data['Hoogste_prijs_opregelen'].fillna(-999).to_list()\n",
" gas_prices = case.data['Gas prices (€/MWh)'].to_list()\n",
" co2_prices = case.data['CO2 prices (€/MWh)'].to_list()\n",
" eb_ode_g = s.eb_ode_g\n",
" eb_ode_e = s.eb_ode_e\n",
" \n",
" gb_input = [0] * len(case.data)\n",
" gb_output = [0] * len(case.data)\n",
"\n",
" minutes = range(len(case.data))\n",
" hp_output = [0] * len(case.data)\n",
" hp_input = [0] * len(case.data)\n",
"\n",
" for m in tqdm(minutes):\n",
" dem_vgd = demand_vdg[m]\n",
" if dem_vgd != 0:\n",
" max_load = min(hp_vdg.max_th_power, dem_vgd)\n",
" min_load = hp_vdg.min_th_power\n",
" Tsink = Tsink_vdg[m]\n",
" Tsource = Tsource_vdg[m]\n",
" cop_max_load = hp_vdg.get_cop(heat_output=max_load, Tsink=Tsink, Tsource=Tsource)\n",
" cop_min_load = hp_vdg.get_cop(heat_output=min_load, Tsink=Tsink_vdg[m], Tsource=Tsource_vdg[m])\n",
" \n",
" cost_full_load = cost_function(\n",
" th_load=max_load,\n",
" cop=cop_max_load,\n",
" electricity_cost=afrr_up[m] + eb_ode_e - c.sde_switch_price_correction,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_vgd\n",
" )\n",
" \n",
" cost_min_load = cost_function(\n",
" th_load=min_load,\n",
" cop=cop_min_load,\n",
" electricity_cost=afrr_up[m] + eb_ode_e,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_vgd\n",
" )\n",
" \n",
" if cost_full_load < cost_min_load:\n",
" hp_vdg_input, hp_vdg_output = hp_vdg.set_heat_output(max_load, Tsink, Tsource)\n",
" else:\n",
" hp_vdg_input, hp_vdg_output = hp_vdg.set_heat_output(min_load, Tsink, Tsource)\n",
" else:\n",
" hp_vdg_input, hp_vdg_output = (0, 0)\n",
"\n",
" dem_ngd = demand_ndg[m]\n",
" if dem_ngd != 0:\n",
" max_load = min(hp_ndg.max_th_power, dem_ngd)\n",
" min_load = hp_ndg.min_th_power\n",
" Tsink = Tsink_ndg[m]\n",
" Tsource = Tsource_ndg[m]\n",
" cop_max_load = hp_ndg.get_cop(heat_output=max_load, Tsink=Tsink, Tsource=Tsource)\n",
" cop_min_load = hp_ndg.get_cop(heat_output=min_load, Tsink=Tsink, Tsource=Tsource)\n",
" \n",
" cost_full_load = cost_function(\n",
" th_load=max_load,\n",
" cop=cop_max_load,\n",
" electricity_cost=afrr_up[m] + eb_ode_e,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_ngd\n",
" )\n",
" \n",
" cost_min_load = cost_function(\n",
" th_load=min_load,\n",
" cop=cop_min_load,\n",
" electricity_cost=afrr_up[m] + eb_ode_e,\n",
" alt_heat_price=gas_prices[m] + co2_prices[m] + eb_ode_g/case.assets['Gasboiler'].efficiency,\n",
" demand=dem_vgd\n",
" )\n",
" \n",
" if cost_full_load <= cost_min_load:\n",
" hp_ndg_input, hp_ndg_output = hp_ndg.set_heat_output(max_load, Tsink=Tsink, Tsource=Tsource)\n",
" else:\n",
" hp_ndg_input, hp_ndg_output = hp_ndg.set_heat_output(min_load, Tsink=Tsink, Tsource=Tsource)\n",
" else:\n",
" hp_ndg_input, hp_ndg_output = (0, 0)\n",
"\n",
" hp_out = hp_vdg_output + hp_ndg_output\n",
" hp_output[m] = hp_out\n",
" hp_input[m] = hp_vdg_input + hp_ndg_input\n",
" remaining_demand = max(dem_vgd+dem_ngd-hp_out, 0)\n",
" gb_output[m], gb_input[m] = gb.set_heat_output(remaining_demand)\n",
"\n",
" case.data['hp_output_MW'] = np.array(hp_output)\n",
" case.data['hp_input_MW'] = np.array(hp_input)\n",
" case.data['gb_output_MW'] = np.array(gb_output)\n",
" case.data['gb_input_MW'] = np.array(gb_input)\n",
"\n",
" for col in case.data.columns:\n",
" if col.endswith('MW'):\n",
" case.data[col + 'h'] = case.data[col] / 60\n",
"\n",
" case.data = case.data.round(5)\n",
" return case"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Run optimisation"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def run_optimisation(c, s):\n",
" s.baseline = baseline_sim(s.baseline)\n",
" s.hpcase = hponly(s.hpcase)\n",
" s.storage_case_PCM = hponly_with_storage(s.storage_case_PCM)\n",
"# s.hpcase_sde.assign_algorithm(hponly)\n",
"# s.hpcase_sde.run()\n",
" \n",
"# s.optcase1.assign_algorithm(hybrid_imb_optimisation)\n",
"# s.optcase1.run(decimals=2, s=s)\n",
" \n",
"# s.afrr_case.assign_algorithm(aFRR_optimisation)\n",
"# s.afrr_case.run(s=s)\n",
" \n",
" for case in [s.hpcase, s.storage_case_PCM]: # [s.hpcase, s.hpcase_sde, s.optcase1, s.afrr_case]:\n",
" case.mean_cop = case.data['hp_output_MW'].sum() / case.data['hp_input_MW'].abs().sum()\n",
" \n",
" return s\n",
"\n",
"s = run_optimisation(c, s)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.hpcase.data['hp_output_MW'].sum() /s.hpcase.data['hp_input_MW'].abs().sum() "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data['hp_output_MW'].sum() /s.storage_case_PCM.data['hp_input_MW'].abs().sum() "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.hpcase.data.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vis_data = s.storage_case_PCM.data[[\n",
" 'storage_level_MWh', \n",
" #'to_storage_ndg_MW', \n",
" #'to_storage_vdg_MW', \n",
" 'to_storage_MW', \n",
" #'from_storage_ndg_MW', \n",
" #'from_storage_vdg_MW',\n",
" 'from_storage_MW'\n",
"]]\n",
"\n",
"cops = pd.concat([s.hpcase.data['cop'], s.storage_case_PCM.data['cop']], axis=1)\n",
"vis_data = pd.concat([vis_data, cops], axis=1)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vis_data.loc['2023-01-01', :].iplot(subplots=True, shape=(5,1), dimensions=(1000, 800))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"total_to_storage = s.storage_case_PCM.data['to_storage_MWh'].sum()\n",
"total_from_storage = s.storage_case_PCM.data['from_storage_MWh'].sum()\n",
"total_to_storage, total_from_storage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data.columns"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"hp_output = s.storage_case_PCM.data['hp_output_MW'].round(2)\n",
"demand_and_to_storage = (s.storage_case_PCM.data['Total demand'] + s.storage_case_PCM.data['to_storage_MW']).round(2)\n",
"hp_output.equals(demand_and_to_storage)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pd.concat([hp_output, demand_and_to_storage], axis=1).resample('H').sum().iplot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Financials"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def calculate_sde_subsidy(c, s, case):\n",
" hp_capacity = abs(case.assets['Heatpump VDG'].max_th_power) + abs(case.assets['Heatpump NDG'].max_th_power)\n",
" case.full_load_hours = abs(case.data['hp_output_MWh'].sum() / hp_capacity)\n",
"\n",
" subsidized_hours = min(case.full_load_hours, 8000)\n",
" subsidized_MWh = subsidized_hours * hp_capacity\n",
"\n",
" base_amount_per_MWh_th = round(c.sde_base_amount*0.173 + c.longterm_gas_price, 2)\n",
" base_subsidy_amount = round(base_amount_per_MWh_th * subsidized_MWh, 2)\n",
" long_term_gas_price_LHV = round(EURperHHV_to_EURperLHV(c.longterm_gas_price), 2)\n",
" base_amount_gas = (2/3)*long_term_gas_price_LHV*0.9\n",
" mean_ttf_price_LHV = round(EURperHHV_to_EURperLHV(case.data['Gas prices (€/MWh)'].mean()), 2)\n",
" correction_amount_gas = max(mean_ttf_price_LHV, base_amount_gas)\n",
" avoided_gas_consumption = subsidized_MWh/0.9\n",
" correction_gas = round(correction_amount_gas * avoided_gas_consumption, 2)\n",
"\n",
" base_amount_co2 = (2/3)*c.longterm_co2_price\n",
" avoided_co2_emission = avoided_gas_consumption * 0.226\n",
" mean_ets_price = round(EURperHHV_to_EURperLHV(case.data['CO2 prices (€/ton)'].mean()), 2)\n",
" correction_amount_co2 = max(base_amount_co2, mean_ets_price)\n",
" correction_co2 = round(correction_amount_co2 * avoided_co2_emission, 2)\n",
"\n",
" sde_subsidy_corrected = max(base_subsidy_amount - (correction_gas + correction_co2), 0)\n",
" sde_per_MWh_th = sde_subsidy_corrected / subsidized_MWh\n",
" sde_per_MWh_e = sde_subsidy_corrected / case.data['hp_input_MWh'].abs().sum()\n",
"\n",
" case.sde_results = {\n",
" 'base_amount': base_subsidy_amount,\n",
" 'correction_gas': correction_gas,\n",
" 'correction_co2': correction_co2,\n",
" 'corrected_amount': sde_subsidy_corrected,\n",
" 'sde_per_MWh_th': sde_per_MWh_th,\n",
" 'sde_per_MWh_e': sde_per_MWh_e\n",
" }\n",
" return case"
]
},
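{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Back-of-the-envelope check of the SDE++ correction logic above, with hypothetical\n",
"# placeholder numbers (not project inputs): the subsidy per MWh_th is the base amount\n",
"# minus the market-price corrections for avoided gas and CO2, floored at zero.\n",
"subsidized_MWh_toy = 10_000  # MWh_th produced by the heat pump\n",
"base_toy = 60 * subsidized_MWh_toy  # base amount at a hypothetical 60 EUR/MWh_th\n",
"avoided_gas_toy = subsidized_MWh_toy / 0.9  # avoided gas at 90% boiler efficiency\n",
"correction_gas_toy = 25 * avoided_gas_toy  # hypothetical corrected gas price of 25 EUR/MWh\n",
"correction_co2_toy = 50 * avoided_gas_toy * 0.226  # hypothetical 50 EUR/ton, 0.226 ton CO2 per MWh gas\n",
"max(base_toy - (correction_gas_toy + correction_co2_toy), 0) / subsidized_MWh_toy"
]
},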
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data.columns"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def collect_cashflows(c, s):\n",
" s.hpcase.generate_electr_market_results(nom_col='hp_input_MWh', real_col='hp_input_MWh')\n",
" s.storage_case_PCM.data['nomination_MWh'] = s.hpcase.data['hp_input_MWh']\n",
" s.storage_case_PCM.generate_electr_market_results(nom_col='nomination_MWh', real_col='hp_input_MWh')\n",
"\n",
"# s.hpcase_sde.generate_electr_market_results(nom_col='hp_input_MWh', real_col='hp_input_MWh')\n",
" \n",
"# s.optcase1.data['DA Nom'] = s.hpcase.data['hp_input_MWh'] * c.day_ahead_buying_perc\n",
"# s.optcase1.generate_electr_market_results(nom_col='DA Nom', real_col='hp_input_MWh')\n",
" \n",
"# s.afrr_case.data['POS'] = s.afrr_case.data['aFRR_up'].fillna(value=s.afrr_case.data['POS'])\n",
"# s.afrr_case.data['DA Nom'] = s.hpcase.data['hp_input_MWh']\n",
"# s.afrr_case.generate_electr_market_results(nom_col='DA Nom', real_col='hp_input_MWh')\n",
"# s.afrr_case.add_cashflow('aFRR capacity fee (€)', c.afrr_capacity_fee * s.afrr_case.afrr_capacity)\n",
" \n",
" for case in [s.baseline]: #, s.optcase1, s.afrr_case]:\n",
" case.add_gas_costs(gasvolumes_col='gb_input_MWh')\n",
" case.add_co2_costs(volume_cols='gb_input_MWh', fuel='gas')\n",
" case.add_eb_ode(commodity='gas', tax_bracket=c.tax_bracket_g)\n",
" \n",
" for case in [s.hpcase, s.storage_case_PCM]: #, s.hpcase_sde, s.optcase1, s.afrr_case]:\n",
" \n",
" case.add_eb_ode(commodity='electricity', tax_bracket=c.tax_bracket_e, cons_col='hp_input_MWh')\n",
" case.add_grid_costs(\n",
" power_MW_col='hp_input_MW',\n",
" grid_operator=c.grid_operator,\n",
" year=2020, \n",
" connection_type=c.connection_type\n",
" )\n",
" \n",
" for case in s.sde_cases:\n",
" case = calculate_sde_subsidy(c=c, s=s, case=case)\n",
" case.add_cashflow('SDE++ subsidy (€)', case.sde_results['corrected_amount'])\n",
" return s\n",
"\n",
"s = collect_cashflows(c, s)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def calculate_financials(c, s):\n",
" for case in s.cases:\n",
" case.calculate_ebitda(c.project_duration)\n",
"\n",
" for case in [s.hpcase, s.storage_case_PCM]: #, s.hpcase_sde, s.optcase1, s.afrr_case]:\n",
" case.calculate_business_case(\n",
" project_duration=c.project_duration, \n",
" discount_rate=c.discount_rate, \n",
" baseline=s.baseline\n",
" )\n",
" \n",
" return s\n",
"\n",
"s = calculate_financials(c, s)"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Visualisations"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"fig_demands_over_time = s.demand['Total demand'].resample('H').mean().iplot(\n",
" title='Smurfit Kappa: Heat demand by Hour in MW', \n",
" yTitle='MW', \n",
" colors=recoygreen,\n",
" asFigure=True,\n",
" dimensions=(800, 400)\n",
")\n",
"fig_demands_over_time"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"demands_fig = s.demand[['Tsource (VDG)', 'Tsink (VDG)', 'Tsource (NDG)', 'Tsink (NDG)']].resample('H').mean().iplot(\n",
" kind='box',\n",
" title='Smurfit Kappa: Source and Sink temperatures',\n",
" color=recoygreen,\n",
" yTitle='Temperature in degrees C',\n",
" legend=False,\n",
" asFigure=True,\n",
" dimensions=(800, 400)\n",
")\n",
"demands_fig"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data.columns"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"temp_comparison = pd.concat([s.storage_case_PCM.data[['tsource_ndg', 'tsource_vdg']], s.storage_case_PCM.data[['Tsource (NDG)', 'Tsource (VDG)']]], axis=1)\n",
"temp_comparison.columns = ['Tsource (NDG) -after', 'Tsource (VDG) - after', 'Tsource (NDG) - before', 'Tsource (VDG) - before']\n",
"temp_comparison.resample('15T').mean().iplot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s.storage_case_PCM.data[['tsource_ndg', 'tsource_vdg']].idxmin()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# s.storage_case_PCM.data.loc['2019-01-04 07:45:00']"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"case = s.storage_case_PCM\n",
"hp_vdg = case.assets['Heatpump VDG']\n",
"hp_ndg = case.assets['Heatpump NDG']\n",
"ws = case.assets['HotWaterStorage']\n",
"ws.storage_level = 47.083 + 25\n",
"pos = 35.650\n",
"neg = 43.850\n",
"dam = 68.400\n",
"demand_vdg = 0\n",
"demand_ndg = 0\n",
"tsource_vdg = 29.635\n",
"tsink_vdg = 134.840\n",
"tsource_ndg = 21.420\n",
"tsink_ndg = 118.755\n",
"\n",
"hp_storage_opt_enginge(c, ws, hp_vdg, hp_ndg, pos, neg, dam, demand_vdg, demand_ndg, tsource_vdg, tsink_vdg, tsource_ndg, tsink_ndg)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#e_load_vdg, th_load_vdg, e_load_ndg, th_load_ndg, tsource_vdg, tsource_ndg, to_storage_vdg_MW, from_storage_vdg_MW, to_storage_ndg_MW, from_storage_ndg_MW, ws, hp_vdg, hp_ndg"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"Tsource_VDG_before = s.hpcase.data['Tsource (VDG)'].round(2)\n",
"Tsource_VDG_after = s.storage_case_PCM.data['Tsource (VDG)'].round(2)\n",
"Tsource_VDG_before.equals(Tsource_VDG_after)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"Tsource_NDG_before = s.hpcase.data['Tsource (NDG)'].round(2)\n",
"Tsource_NDG_after = s.storage_case_PCM.data['Tsource (NDG)'].round(2)\n",
"Tsource_NDG_before.equals(Tsource_NDG_after)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"test_hp = Heatpump(\n",
" name='Heatpump for Testing',\n",
" max_th_power=1,\n",
" min_th_power=0.3,\n",
" cop_curve=cop_curve\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Tsrc_vdg, Tsnk_vdg, Tsrc_ndg, Tsnk_ndg = s.demand[['Tsource (VDG)', 'Tsink (VDG)', 'Tsource (NDG)', 'Tsink (NDG)']].mean().to_list()\n",
"# mean_gas_price = (s.optcase1.data['Gas prices (€/MWh)'].mean() \n",
"# + s.eb_ode_g \n",
"# + s.optcase1.data['CO2 prices (€/MWh)'].mean()\n",
"# )\n",
"\n",
"# max_gas_price = (s.optcase1.data['Gas prices (€/MWh)'].max() \n",
"# + s.eb_ode_g \n",
"# + s.optcase1.data['CO2 prices (€/MWh)'].max()\n",
"# )\n",
"\n",
"# min_gas_price = (s.optcase1.data['Gas prices (€/MWh)'].min() \n",
"# + s.eb_ode_g \n",
"# + s.optcase1.data['CO2 prices (€/MWh)'].min()\n",
"# )\n",
"\n",
"# Tsrc_vdg_min, Tsnk_vdg_min, Tsrc_ndg_min, Tsnk_ndg_min = s.demand[['Tsource (VDG)', 'Tsink (VDG)', 'Tsource (NDG)', 'Tsink (NDG)']].min().to_list()\n",
"# Tsrc_vdg_max, Tsnk_vdg_max, Tsrc_ndg_max, Tsnk_ndg_max = s.demand[['Tsource (VDG)', 'Tsink (VDG)', 'Tsource (NDG)', 'Tsink (NDG)']].max().to_list()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"import plotly.graph_objs as go\n",
"def create_load_trace(gas_price, Tsnk, Tsrc, name):\n",
" loads = []\n",
" eprices = list(range(1000))\n",
" for eprice in eprices:\n",
" _, load = test_hp.set_opt_load(\n",
" electricity_cost=eprice + s.eb_ode_e,\n",
" alt_heat_price=gas_price / 0.9,\n",
" demand=1, \n",
" Tsink=Tsnk,\n",
" Tsource=Tsrc\n",
" )\n",
" loads.append(load)\n",
" trace = go.Scatter(x=eprices, y=loads, name=name)\n",
" return trace"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# import plotly.graph_objects as go\n",
"# import numpy as np\n",
"\n",
"# fig = go.Figure()\n",
"\n",
"# configs = {\n",
"# 'mean': [mean_gas_price, Tsnk_vdg, Tsrc_vdg],\n",
"# 'unfav_gas': [min_gas_price, Tsnk_vdg, Tsrc_vdg],\n",
"# 'fav_gas': [max_gas_price, Tsnk_vdg, Tsrc_vdg],\n",
"# 'unfav_all': [min_gas_price, Tsnk_vdg_max, Tsrc_vdg_min],\n",
"# 'fav_all': [max_gas_price, Tsnk_vdg_min, Tsrc_vdg_max],\n",
"# }\n",
"\n",
"# for name, config in configs.items():\n",
"# trace = create_load_trace(*config, name)\n",
"# fig.add_trace(trace)\n",
"\n",
"# fig.update_layout(title='Switch prices for different configurations')\n",
"# fig.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"date = s.baseline.data.index[0].strftime('%Y-%m-%d')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"fig_steamboiler = s.baseline.data[['Total demand', 'output_MW_th']].abs().loc[date, :].iplot(\n",
" subplots=True, \n",
" title=f'Steamboiler only case on {date}',\n",
" subplot_titles=['Total demand', 'Steam boiler output (MW)'],\n",
" legend=False,\n",
" dimensions=(800, 400),\n",
" colors=[recoydarkblue, recoygreen], \n",
" asFigure=True,\n",
" shape = (2, 1)\n",
")\n",
"fig_steamboiler"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"fig_heatpump = s.hpcase.data[['Total demand', 'hp_output_MW']].abs().loc[date, :].iplot(\n",
" subplots=True, \n",
" title=f'Heatpump only case on {date}', \n",
" subplot_titles=['Total demand', 'Heatpump output (MW)'],\n",
" legend=False,\n",
" dimensions=(800, 400),\n",
" colors=[recoydarkblue, recoygreen], \n",
" asFigure=True,\n",
" shape = (2, 1)\n",
")\n",
"fig_heatpump"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# fig_optcase = s.optcase1.data[['hp_output_MW', 'gb_output_MW', 'NEG']].loc[date, :].iplot(\n",
"# subplots=True, \n",
"# title=f'Hybrid case on {date}',\n",
"# subplot_titles=['Heatpump output (MW)', 'Steam boiler output (MW)', 'Imbalance price (€/MWh)'],\n",
"# legend=False,\n",
"# dimensions=(800, 600),\n",
"# colors=[recoydarkblue, recoygreen, recoyred],\n",
"# asFigure=True,\n",
"# shape=(3,1)\n",
"# )\n",
"# fig_optcase"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# date = '2019-09-04'\n",
"\n",
"# fig_optcase2 = s.optcase1.data[['hp_output_MW', 'gb_output_MW', 'NEG']].loc[date, :].iplot(\n",
"# subplots=True, \n",
"# title=f'Hybrid case on {date}',\n",
"# subplot_titles=['Heatpump output (MW)', 'Steam boiler output (MW)', 'Imbalance price (€/MWh)'],\n",
"# legend=False,\n",
"# dimensions=(800, 600),\n",
"# colors=[recoydarkblue, recoygreen, recoyred],\n",
"# asFigure=True,\n",
"# shape=(3,1)\n",
"# )\n",
"# fig_optcase2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"report = ComparisonReport(\n",
" cases=s.optcases, \n",
" kind='electr_market_results',\n",
")\n",
"report.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"casereport = ComparisonReport(cases = s.cases, kind='ebitda_calc', baseline=s.baseline, comparison='relative')\n",
"casereport.show(presentation_format=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"BusinessCaseReport(s.hpcase).show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"#price chart\n",
"from copy import deepcopy\n",
"\n",
"data = deepcopy(s.baseline.data)\n",
"data = data[data.columns[:2]]\n",
"data[data.columns[1]] = MWh_gas_to_tonnes_CO2(data[data.columns[1]])\n",
"data = data.rename(columns={\"CO2 prices (€/ton)\": \"CO2 prices (€/MWh)\"})\n",
"\n",
"data.resample('D').mean().iplot(dimensions=(800, 300), title='Gasprices vs. CO2 prices', colors=recoycolors)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# #price chart\n",
"# from copy import deepcopy\n",
"\n",
"# s.optcase1.data['DAM'].resample('D').mean().iplot(dimensions=(800, 300), title='Electricity Prices', colors=recoycolors)\n",
"\n",
"# _source_output = s.optcase1.data[['hp_output_MWh', 'gb_output_MWh']].resample('M').sum()\n",
"# _total_output = _source_output.sum(axis=1)\n",
"# _data = _source_output.divide(_total_output, axis=0).rename(\n",
"# columns={'hp_output_MWh':'Heat pump', 'gb_output_MWh':'Gasboiler'}\n",
"# ) * 100\n",
"\n",
"# production_fig = _data.iplot(\n",
"# kind='bar',\n",
"# barmode='stack',\n",
"# colors=[recoydarkblue, recoygreen],\n",
"# title='Hybrid case: Heat production per Month by Source in % share',\n",
"# yTitle='Share of production in %',\n",
"# dimensions=(600, 400),\n",
"# asFigure=True\n",
"# )\n",
"\n",
"# production_fig = production_fig.update_layout(legend_traceorder=\"reversed\")\n",
"# production_fig"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"report = ComparisonReport(s.cases, kind='capex').report\n",
"report = report[report.index.str.contains('CAPEX')].T\n",
"capex_fig = report.iplot(\n",
" kind='bar', \n",
" barmode='relative',\n",
" colors=recoycolors,\n",
" title='CAPEX by Casestudy',\n",
" yTitle='CAPEX in €',\n",
" dimensions=(600, 400),\n",
" asFigure=True,\n",
")\n",
"\n",
"capex_fig = capex_fig.update_layout(legend_traceorder=\"reversed\")\n",
"capex_fig"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"cashflow_report = ComparisonReport(cases = s.cases, kind='cashflows')\n",
"\n",
"fig = cashflow_report.report.T.iplot(\n",
" kind='bar',\n",
" barmode='relative',\n",
" colors=recoycolors,\n",
" title='OPEX breakdown',\n",
" asFigure=True,\n",
" yTitle='€',\n",
" dimensions=(800, 600)\n",
")\n",
"\n",
"ebitda_report = ComparisonReport(cases=s.cases, kind='ebitda_calc')\n",
"scat = go.Scatter(\n",
" mode='markers',\n",
" y=ebitda_report.report.loc['EBITDA (€)', :].values, \n",
" x=ebitda_report.report.columns, \n",
" line=go.scatter.Line(color=recoydarkgrey),\n",
" marker=dict(\n",
" color=recoydarkgrey,\n",
" size=20,\n",
" line=dict(\n",
" color=recoydarkgrey,\n",
" width=3\n",
" ),\n",
" symbol='line-ew'\n",
" ),\n",
" name='EBITDA (€)'\n",
")\n",
"fig.add_trace(scat)\n",
"fig"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# ebitda_report.show(comparison='relative')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_series = SingleFigureComparison(s.cases, 'ebitda', label='EBITDA').report\n",
"ebitda_graph = _series.iplot(\n",
" kind='bar',\n",
" title='Yearly EBITDA by Casestudy in €',\n",
" colors=recoygreen,\n",
" dimensions=(600, 400), \n",
" yTitle='EBITDA in €',\n",
" asFigure=True,\n",
" yrange=[_series.min() * 1.2, max(_series.max() * 2, 0)]\n",
")\n",
"\n",
"ebitda_graph.update_traces(\n",
" text=_series.values/1000_000, \n",
" textposition='outside', \n",
" texttemplate=\"%{text:.1f}M €\", \n",
")\n",
"\n",
"ebitda_graph"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_series = SingleFigureComparison(s.optcases, 'npv', label='NPV').report\n",
"\n",
"npv_graph = _series.iplot(\n",
" kind='bar',\n",
" title='NPV by Casestudy in €',\n",
" colors=recoygreen,\n",
" dimensions=(600, 400), \n",
" yTitle='NPV in €',\n",
" asFigure=True,\n",
" yrange=[0, _series.max() * 1.1]\n",
")\n",
"\n",
"npv_graph.update_traces(\n",
" text=_series.values/1000_000, \n",
" textposition='outside', \n",
" texttemplate=\"%{text:.1f}M €\", \n",
")\n",
"\n",
"npv_graph"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"irr_report = (SingleFigureComparison(s.optcases, 'irr', label='IRR').report * 100)\n",
"irr_fig = irr_report.iplot(\n",
" kind='bar',\n",
" title='IRR by Casestudy in %',\n",
" colors=recoygreen,\n",
" dimensions=(600, 400),\n",
" yTitle='IRR in %',\n",
" asFigure=True,\n",
" yrange=[0, irr_report.max() * 1.2]\n",
")\n",
"\n",
"irr_fig.update_traces(\n",
" text=irr_report.values, \n",
" textposition='outside', \n",
" texttemplate=\"%{text:.0f} %\", \n",
")\n",
"\n",
"irr_fig"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_series = (SingleFigureComparison(s.optcases, 'spp', label='Simple Payback Time').report)\n",
"spt_fig = _series.iplot(\n",
" kind='bar',\n",
" title='Simple Payback Time by Casestudy in Years',\n",
" colors=recoygreen,\n",
" dimensions=(600, 400),\n",
" yTitle='Years',\n",
" asFigure=True,\n",
" yrange=[0, _series.max() * 1.2]\n",
")\n",
"\n",
"spt_fig.update_traces(\n",
" text=_series.values, \n",
" textposition='outside', \n",
" texttemplate=\"%{text:.1f} years\", \n",
")\n",
"\n",
"spt_fig "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"mean_cops = (SingleFigureComparison(s.optcases, 'mean_cop', label='COP').report)\n",
"cop_fig = mean_cops.iplot(\n",
" kind='bar',\n",
" title='Mean COP by Casestudy',\n",
" colors=recoygreen,\n",
" dimensions=(600, 400),\n",
" yTitle='COP',\n",
" asFigure=True,\n",
" yrange=[0, mean_cops.max() * 1.2]\n",
")\n",
"\n",
"cop_fig.update_traces(\n",
" text=mean_cops.values, \n",
" textposition='outside', \n",
" texttemplate=\"%{text:.1f}\", \n",
")\n",
"\n",
"cop_fig "
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Sensitivity analysis"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"We have agreed that for the sensitivity analysis we will vary the following key assumptions in the flex-model:\n",
"1. Market prices of gas and electricity. We will use the actual prices for 2019, the actual prices for 2020.\n",
"2. aFRR prices +/- 30%\n",
"3. CAPEX\n",
"4. CO2 price\n",
"5. Tsource +/- 10 degrees Celsius\n",
"6. Tsink\n",
" * Roam off the peak / lower pressure\n",
" * Stabalize / running average per hour/ 2 hours\n",
"7. __A scenario with a constraint on grid capacity and a scenario without grid capacity as a constraint__\n",
"8. Energy Tax and ODE +/- 30%"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def setup():\n",
" _c = Config()\n",
" s = setup_model(c=_c)\n",
" return s\n",
"\n",
"def routine(c, s):\n",
" s = load_data(c=c, s=s)\n",
" s = create_and_assign_assets(c=c, s=s)\n",
" #s = preprocessing(c=c, s=s)\n",
" s = run_optimisation(c=c, s=s)\n",
" #s = postprocessing(c=c, s=s)\n",
" s = collect_cashflows(c=c, s=s)\n",
" s = calculate_financials(c=c, s=s)\n",
" return s"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"%time _s = setup()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"%time result = routine(c, _s)\n",
"npv = result.hpcase.npv\n",
"npv"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sensitivity:Storage temperature\n",
"values = range(80, 120, 10)\n",
"param = 'storage_temperature'\n",
"kpis = ['npv','irr','spp', 'mean_cop']\n",
"\n",
"sens = SensitivityAnalysis(c, _s, routine, param, values, kpis)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='volume(m3)',\n",
" yTitle='NPV in €',\n",
" yrange=[output.min().min()*1.1, 0],\n",
" title='Sensitivity: Storage volume',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Sensitivity: Water storage volume"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"(_s.demand['MW (VDG)'] + _s.demand['MW (NDG)']).to_list()[:3]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%time _s = setup()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"values = range(100, 500, 100)\n",
"param = 'storage_volume'\n",
"kpis = ['npv','irr','spp']\n",
"\n",
"sens = SensitivityAnalysis(c, _s, routine, param, values, kpis)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"[case.name for case in s.cases][1:]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='volume(m3)',\n",
" yTitle='NPV in €',\n",
" yrange=[output.min().min()*1.1, 0],\n",
" title='Sensitivity: Storage volume',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('irr', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='volume(m3)',\n",
" yTitle='irr in %',\n",
" yrange=[output.min().min()*1.1, 0],\n",
" title='Sensitivity: IRR',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='volume(m3)',\n",
" yTitle='years',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: payback time',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('mean_cop', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='volume(m3)',\n",
" yTitle='COP',\n",
" yrange=[output.min().min()*1.1, 0],\n",
" title='Sensitivity: COP',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Sensitivity: CAPEX per MW storage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"values = range(4_000, 10_000, 2_000)\n",
"param = 'storage_capex_per_MW'\n",
"kpis = ['npv', 'irr', 'spp', 'mean_cop']\n",
"\n",
"sens = SensitivityAnalysis(c, _s, routine, param, values, kpis)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='CAPEX',\n",
" yTitle='NPV in Euro',\n",
" yrange=[output.min().min()*1.1, 0],\n",
" title='Sensitivity: NPV in euro',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='CAPEX/MW',\n",
" yTitle='years',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: payback time',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Sensitivity: CAPEX per MWh storage"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"values = range(1_000, 5_000, 1000)\n",
"param = 'storage_capex_per_MWh'\n",
"kpis = ['npv', 'spp', 'mean_cop']\n",
"\n",
"sens = SensitivityAnalysis(c, _s, routine, param, values, kpis)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='CAPEX/MWh',\n",
" yTitle='NPV',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: NPV in Euro',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='CAPEX/MWh',\n",
" yTitle='years',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: Payback time',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Sensitivity: threshold price"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"values = range(10, 60, 10)\n",
"param = 'threshold'\n",
"kpis = ['npv', 'spp', 'mean_cop']\n",
"\n",
"sens = SensitivityAnalysis(c, _s, routine, param, values, kpis)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='threshold',\n",
" yTitle='NPV',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: NPV in Euro',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='threshold',\n",
" yTitle='years',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: SPP in years',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('mean_cop', case_names=[case.name for case in s.cases][1:])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='threshold',\n",
" yTitle='COP',\n",
" yrange=[output.min().min()*1.2, 0],\n",
" title='Sensitivity: SOP',\n",
" colors=recoycolors,\n",
" dimensions=(600, 400),\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Heat Pump CAPEX"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [0.7, 1, 1.3, 2]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.hp_capex *= value\n",
" configs[value*100] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"%%time\n",
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"\n",
"sens_capex = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='CAPEX factor (%)',\n",
" yTitle='Payback Period in years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Heat Pump CAPEX (€)',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_capex"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: CO2 prices"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"mean = 24.86\n",
"co2_prices = [10, 25, 50, 100]\n",
"\n",
"for price in co2_prices:\n",
" _c = Config()\n",
" multiplier = price / mean\n",
" _c.co2_price_multiplier = multiplier\n",
" configs[price] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"%%time\n",
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('ebitda', case_names=['Baseline', 'Heatpump + SDE', 'Heatpump only', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Mean CO2 price in €',\n",
" yTitle='EBITDA in €',\n",
" yrange=[output.min().min() * 1.1, 0],\n",
" title='Sensitivity: CO2 prices > EBITDA',\n",
" colors=[recoypurple, recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"sens_co2_spp = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Mean CO2 price in €',\n",
" yTitle='Payback Period in Years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: CO2 prices',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_co2_spp"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Mean CO2 price in €',\n",
" yTitle='NPV in €',\n",
" yrange=[output.min().min() * 10, output.max().max()*1.1],\n",
" title='Sensitivity: CO2 prices > NPV',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400)\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Gas prices"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [0.7, 1, 1.3]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.gas_price_multiplier = value\n",
" configs[value * 100] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"%%time\n",
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Gas price factor (%)',\n",
" yTitle='NPV in €',\n",
" yrange=[output.min().min()*1.5, output.max().max()*1.1],\n",
" title='Sensitivity: Gas prices',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400)\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Day Ahead buying amount"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [0, 0.5, 1]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.day_ahead_buying_perc = value\n",
" configs[value * 100] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Volume in %',\n",
" yTitle='NPV in €',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Volume bought on Day-Ahead market in %',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400)\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Electricity prices"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [0.7, 1, 1.3]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.e_price_multiplier = value\n",
" configs[value * 100] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"sens_eprices = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Electricity price factor in %',\n",
" yTitle='Payback Period in years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Electricity prices',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_eprices"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Electricity price volatility"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [0.7, 1, 1.3]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.e_price_volatility_multiplier = value\n",
" configs[value] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"sens_evol = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Volatility factor',\n",
" yTitle='Payback period in years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: E-price volatility',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_evol"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: aFRR capacity fee"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [10_000, 25_000, 50_000]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.afrr_capacity_fee = value\n",
" configs[value] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"sens_affr_fee = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='aFRR capacity fee in €/MW',\n",
" yTitle='Payback Period in years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: aFRR capacity fee',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_affr_fee"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Energy tax"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [0.7, 1, 1.3]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.energy_tax_multiplier = value\n",
" configs[value] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('npv', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Factor',\n",
" yTitle='NPV in €',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Energy taxes',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400)\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Tsource"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [-25, -10, 0, 10, 25]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.tsource_delta = value\n",
" configs[value] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.loc[-25, 'Heatpump only'] = np.nan\n",
"sens_tsource = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Tsource delta',\n",
" yTitle='Payback Period in years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Tsource',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_tsource"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('mean_cop', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Tsource delta',\n",
" yTitle='COP',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Tsource > Mean COP',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400)\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Tsink"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"_s = setup()\n",
"configs = {}\n",
"values = [-25, -10, 0, 10, 25]\n",
"for value in values:\n",
" _c = Config()\n",
" _c.tsink_delta = value\n",
" configs[value] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"sens = SensitivityAnalysis(_s, routine, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('spp', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"sens_tsink = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Tsink delta',\n",
" yTitle='Payback Period in Years',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Tsink',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_tsink"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"output = sens.single_kpi_overview('mean_cop', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"sens_tsink_cop = output.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Tsink delta',\n",
" yTitle='COP',\n",
" yrange=[0, output.max().max()*1.1],\n",
" title='Sensitivity: Tsink > Mean COP',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400),\n",
" asFigure=True\n",
")\n",
"sens_tsink_cop"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"### Sensitivity: Time period"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"def routine2(c, s):\n",
" s = setup_model(c=c)\n",
" s = load_data(c=c, s=s)\n",
" s = create_and_assign_assets(c=c, s=s)\n",
" s = run_optimisation(c=c, s=s)\n",
" s = collect_cashflows(c=c, s=s)\n",
" s = calculate_financials(c=c, s=s)\n",
" return s"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"configs = {}\n",
"start_values = ['2018-01-01', '2019-01-01', '2019-11-01']\n",
"\n",
"for value in start_values:\n",
" _c = Config()\n",
" _c.start = value\n",
" _c.end = (pd.to_datetime(value) + timedelta(days=364)).strftime('%Y-%m-%d')\n",
" configs[value] = _c"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"%%time\n",
"sens = SensitivityAnalysis(_s, routine2, configs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"result = sens.single_kpi_overview('npv', case_names=['Heatpump only', 'Heatpump + SDE', 'Optimisation', 'Optimisation + aFRR'])\n",
"result.iplot(\n",
" mode='lines+markers',\n",
" symbol='circle-dot',\n",
" size=10,\n",
" xTitle='Start date',\n",
" yTitle='NPV in €',\n",
" yrange=[0, result.max().max()*1.1],\n",
" title='Sensitivity: Modelled time period',\n",
" colors=[recoygreen, recoyyellow, recoydarkblue, recoyred],\n",
" dimensions=(600, 400)\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"## Report"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"renderer = 'svg'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# __Presentation Flexible Heatpumps__\n",
"\n",
"ENCORE meeting: 17-12-2020\n",
"Mark Kremer\n",
"\n",
" \n",
" \n",
" \n",
"\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## __Central question:__\n",
"#### *Can Smurfit Kappi shorten the Payback Period of an investment in a Heatpump by operating it in a flexible manner?*"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Flexible operations on imbalance market__\n",
"\n",
"Benefiting from fluctuations in electricity market prices by ramping the asset up- and down (increasing and decreasing the electricity consumption)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_example_1 = s.hpcase.data[['DAM']].iloc[:60*24].iplot(\n",
" title='Day-Ahead prices on arbitrary day in €/MWh',\n",
" yrange=[-100, 200],\n",
" colors=recoygreen, \n",
" yTitle='Price in €/MWh',\n",
" xTitle='Time of Day',\n",
" dimensions=(800, 400),\n",
" asFigure=True\n",
")\n",
"\n",
"fig_example_1.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Imbalance prices are very volatile, with prices below -100 and above 200 €/MWh on a daily basis."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_example_2 = s.hpcase.data[['DAM', 'POS']].rename(columns={'POS':'IMB'}).iloc[:60*24].iplot(\n",
" title='Imbalance Prices on arbitrary day in €/MWh',\n",
" colors=[recoygreen, recoydarkblue], \n",
" yTitle='Price in €/MWh',\n",
" xTitle='Time of Day',\n",
" dimensions=(800, 400),\n",
" asFigure=True\n",
")\n",
"\n",
"fig_example_2.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"It is possible to benefit from these fluctiations, if you have __flexibility__\n",
"* Storage options\n",
"* Hybrid installations (e.g. with gas-powered assets)\n",
"\n",
"In this case we are looking at a __hybrid set-up of a Steamboiler and a Heatpump__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Simulations & mathematical modelling__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* To answer the central question, we have build a simulation model\n",
"* The model simulates the operations of a hybrid set-up of a Heatpump and a Steamboiler over the timespan of 1 years (on a 1 minute basis)\n",
"* The goal of the model is to minimize the operating costs, in order to reach the shortest Payback Period\n",
"* We are taking into account all major investment and operating costs, including:\n",
" * Asset CAPEX\n",
" * Commodity costs for gas and electricity\n",
" * Energy taxes\n",
" * Grid transport costs\n",
" * SDE++ subsidies\n",
" * Maintenance costs\n",
" * CO2 allowances\n",
"* The output of the model is a Payback Period for an investment in an heatpump, in different scenario's"
]
},
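{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"A minimal, illustrative sketch of how the Payback Period and NPV follow from CAPEX and yearly savings. The figures below are placeholders taken from the round numbers quoted later in this presentation (about 6 M€ CAPEX, about 1.5 mln € yearly savings); the discount rate and project duration are assumptions for illustration, not model inputs or outputs."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Illustrative sketch only: placeholder figures, not outputs of the pyrecoy2 model\n",
"capex = 6_000_000            # investment in €\n",
"yearly_savings = 1_500_000   # EBITDA improvement vs. baseline in €/year\n",
"discount_rate = 0.05         # assumed\n",
"project_duration = 15        # assumed, in years\n",
"\n",
"simple_payback = capex / yearly_savings\n",
"npv = -capex + sum(yearly_savings / (1 + discount_rate) ** t for t in range(1, project_duration + 1))\n",
"simple_payback, round(npv)"
]
},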
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Casestudies__\n",
"5 main scenario's"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"1. Steamboiler only (baseline, baseload)\n",
"2. Heatpump only (stand-alone, baseload, without SDE++)\n",
"3. Heatpump + SDE\n",
"4. Heatpump + SDE + Steam boiler (hybrid set-up) on Imbalance market\n",
"5. Heatpump + SDE + Steam boiler (hybrid set-up) on aFRR (secondary reserve market)\n",
"\n",
"Besides that, we modelled 11 sensitivities:\n",
"* Heatpump CAPEX\n",
"* Gas & CO2 prices\n",
"* Electricity prices & volatility\n",
"* Energy taxes\n",
"* Bidding strategies\n",
"* Source temperatures (affecting COP)\n",
"* Sink temperatures (affecting COP)\n",
"* Time period (2018, 2019, 2020)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Smurfit Kappa case__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The model is based on the context of Smurfit Kappa (paper factory)\n",
"* Currently a Steamboiler is providing the 20-30 MW of average heat demand for drying processes\n",
"* We add a 31 MW heatpump (to make sure it can cover entire demand)\n",
"* The steam demand must be fulfilled at all times, by either the heatpump or the gasboiler\n",
"* The heatpump and steam boiler can both respond very quickly (within minutes) within a flexible range (30%-100% for heatpump)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_demands_over_time.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* Source temperatures of around 65 degrees C\n",
"* Sink temperatures of 125-170 degrees C \n",
"* Average Temperature lift of about 85 degrees C"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"demands_fig.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Heat pump__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"COP roughly between 4 and 1.5, depending on load, Tsource and Tsink"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def cop_curve(Tsink, Tsource):\n",
" Tsink += 273\n",
" Tsource += 273\n",
"\n",
" c1 = 0.267 * Tsink / (Tsink - Tsource)\n",
" c2 = 0.333 * Tsink / (Tsink - Tsource)\n",
" \n",
" return Polynomial([c2, c1])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sourceT = 63\n",
"sinkT = 140\n",
"cop_curve(sourceT, sinkT)"
]
},
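{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"A quick check of the 'COP roughly between 1.5 and 4' claim, assuming the polynomial returned by cop_curve is evaluated on the relative load (30-100%); the temperatures are the example values from the cell above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Evaluate the COP polynomial over the flexible load range (assuming its variable is the relative load)\n",
"loads = np.linspace(0.3, 1.0, 8)\n",
"cop_poly = cop_curve(Tsink=sinkT, Tsource=sourceT)\n",
"pd.Series(cop_poly(loads), index=loads.round(2), name='COP')"
]
},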
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_cop_curve.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Optimisation__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* At each moment in time, we calculate the cheapest option to produce the required heat. \n",
"* Taking into account the COP fluctuations, due to changing Tsource and Tsink\n",
"* Taking into account fluctuating market prices (electricity, gas, CO2)\n",
"* We are predicting real-time electricity prices using our forecasting models"
]
},
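{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"A minimal sketch of the dispatch rule described above: at each timestep, compare the marginal cost of one MWh of heat from the heat pump with that from the gas boiler and pick the cheaper source. The function name, prices and efficiency below are illustrative assumptions, not the pyrecoy2 optimisation engine."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Illustrative dispatch rule; not the pyrecoy2 optimisation engine\n",
"def cheapest_heat_source(e_price, gas_price, co2_price, cop, boiler_eff=0.9):\n",
"    '''Return the asset producing 1 MWh of heat at the lowest marginal cost (€/MWh heat).'''\n",
"    hp_cost = e_price / cop                          # heat pump: electricity cost divided by COP\n",
"    gb_cost = (gas_price + co2_price) / boiler_eff   # gas boiler: gas + CO2 cost divided by efficiency\n",
"    return ('heatpump', hp_cost) if hp_cost <= gb_cost else ('gasboiler', gb_cost)\n",
"\n",
"cheapest_heat_source(e_price=40, gas_price=15, co2_price=5, cop=3.0)"
]
},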
{
"cell_type": "markdown",
"metadata": {},
"source": [
"__Some example days:__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Steamboiler only is following demand pattern"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_steamboiler.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Similar pattern for heatpump only case"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_heatpump.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Hybrid set-up is responding to price fluctuactions, steam boiler taking over at high prices"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_optcase.show(renderer=renderer)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig_optcase2.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Business case__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"CAPEX of around 6 M€ (200.000 €/MW), which need to be earned back by savings in operating costs"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"capex_fig.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* Savings in EBITDA compared to the baseline are about 1.5 mln € without subsidy, and up to 4.5 mln € including subsidy\n",
"* The optimisation on aFRR allows for a 30-40% improvement in EBITDA"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ebitda_graph.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Resulting in a Payback Period of 5.4 years without subsidy, and 1.8 years with subsidy"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"spt_fig.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The added value of the optimisation is limited (in absolute terms), which is explained by the high COP of the heatpump"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"production_fig.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* The heatpump is filling in 95-98 % of the demand. \n",
"* Because of its high COP, it is almost always cheaper to run than the steam boiler\n",
"* Switch price is on average around 90€/MWh (excluding subsidies)"
]
},
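{
"cell_type": "markdown",
"metadata": {
"tags": [
"exclude"
]
},
"source": [
"A rough back-of-the-envelope check on the quoted switch price: the break-even electricity price equals the gas-based heat cost multiplied by the COP, ignoring electricity taxes and grid costs. The gas price, CO2 cost, boiler efficiency and COP below are assumed round figures for illustration only."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"exclude"
]
},
"outputs": [],
"source": [
"# Break-even ('switch') electricity price; assumed round figures for illustration only\n",
"cop = 3.0          # assumed mean COP\n",
"gas_price = 20     # €/MWh gas (assumed)\n",
"co2_cost = 7       # €/MWh gas (assumed)\n",
"boiler_eff = 0.9   # assumed gas boiler efficiency\n",
"\n",
"switch_price = cop * (gas_price + co2_cost) / boiler_eff\n",
"switch_price  # €/MWh electricity at which heat pump and gas boiler break even"
]
},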
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Sensitivities__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If CAPEX is 200%, subsidy is needed to keep a good Payback Time"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sens_capex.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Subsidies are protecting the business case againsts low CO2 prices"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sens_co2_spp.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* The businesscase is quite sensitive to Tsource and Tsink differences, because they directly impact the COP\n",
"* The Smurtfit Kappa case, with a temperature lifte of about 85 degrees C on average, looks favorable. \n",
"* When the temperature lift is higher, the COP will decrease and the business case will degrade"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sens_tsource.show(renderer=renderer)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sens_tsink.show(renderer=renderer)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sens_tsink_cop.show(renderer=renderer)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sens_eprices.show(renderer=renderer)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### __Conclusions__"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"* The business case for a Heat Pump seems favourable\n",
"* Flexible operation, using aFRR, can improve the operational results by 30-40%\n",
"* However, this only results in a marginal improvement of the business case\n",
"* SDE++ has a very favourable effect on the business case, but is not critical\n",
"* The business case is notably sensitive to the temperature lift required, and is therefore strongly dependent on the specific use case."
]
}
],
"metadata": {
"interpreter": {
"hash": "fea6e7ab4c0ed1d184e153838ac4b2d24a0985a5f17e99f4f437a114b66796b8"
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
},
"toc-autonumbering": false,
"toc-showmarkdowntxt": false,
"widgets": {
"application/vnd.jupyter.widget-state+json": {
"state": {},
"version_major": 2,
"version_minor": 0
}
}
},
"nbformat": 4,
"nbformat_minor": 4
}