From: Rishi S. <ris...@gm...> - 2024-02-20 18:11:38
Hi all,

A small reminder about this. Since asking this question I have played around with the parameters, and I found the same pattern of runtimes first decreasing rapidly and then slowly increasing with the number of assets when pricing American basket options, even with different tolerances and with random values for the underlying asset spots, interest rates, volatilities, etc. A flat-forward yield term structure was used for both dividends and interest rates, and a constant volatility was assumed.

To reiterate the questions:

1. Why would the run time for pricing American basket options first decrease rapidly with an increasing number of assets before slowly increasing again? I am very new to quantitative finance and hence want to make sure I'm not doing anything obviously wrong.
2. Is the errorEstimate() on the option price a moving standard deviation of the Monte Carlo values over some fixed window size before convergence? If not, how is it estimated?
3. Does requiredTolerance refer to an absolute tolerance or a relative tolerance?

Thanks again for looking into this!

Most Cordially,
Rishi

On Sat, Feb 10, 2024 at 11:14 PM Rishi Sreedhar <ris...@gm...> wrote:

> Dear QuantLib community,
>
> I've been exploring how to price American basket options
> using MCAmericanBasketEngine, when I found something strange. The time it
> took to produce a result for a requiredTolerance of 1e-2 was decreasing as
> I increased the number of assets [d]. (See attached plot for reference.)
>
> Isn't this surprising? The landscape from which Monte Carlo has to sample
> becomes significantly more complex when simulating larger baskets, so
> shouldn't the time increase with the number of assets?
> The parameters I am using are:
>
> d = 4  # number of assets
> underlying_r = np.array([0.3 for i in range(d)])
> underlying_volatilities = np.array([0.5 for i in range(d)])
> underlying_spots = np.array([100.0 for i in range(d)])
> underlying_dividend_rate = np.zeros(d)
>
> β = 0.5
> underlying_correlation_mat = (β*np.ones((d,d)) + np.identity(d))/(1+β)
>
> Also, could someone please point me to where I can learn more about the
> actual algorithms implemented behind the pricing engines, and what
> parameters like requiredTolerance mean? I see that requiredTolerance
> sets an upper bound on the errorEstimate(), but how is this errorEstimate
> calculated?
>
> Thank you so much again for taking the time to answer these very beginner
> questions!
> Most Cordially,
> Rishi
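P.S. For completeness, here is a quick numerical check of the equicorrelated matrix above. With β = 0.5 it has unit diagonal and constant off-diagonal correlation β/(1+β) = 1/3, and it is positive definite, so the Cholesky factorisation needed for generating correlated paths exists:

```python
import numpy as np

# Same values as in the quoted message
d = 4
beta = 0.5
underlying_correlation_mat = (beta * np.ones((d, d)) + np.identity(d)) / (1 + beta)

# Diagonal entries are (beta + 1)/(1 + beta) = 1, as required of a correlation matrix
assert np.allclose(np.diag(underlying_correlation_mat), 1.0)

# An equicorrelated matrix with rho in (-1/(d-1), 1) is positive definite;
# here rho = 1/3, and the eigenvalues are 1 + (d-1)*rho = 2 and 1 - rho = 2/3
min_eig = np.linalg.eigvalsh(underlying_correlation_mat).min()
```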
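P.P.S. To make questions 2 and 3 concrete, here is the kind of estimator I have in mind: a plain standard error of the sample mean, compared against an absolute tolerance as a stopping rule. All names here are my own sketch; I don't know whether this matches what QuantLib actually does internally:

```python
import numpy as np

def mc_estimate(samples):
    """Sample mean and its standard error (the std deviation of the mean).

    My guess for errorEstimate(): not a moving-window standard deviation,
    but sigma_hat / sqrt(N) over all N samples drawn so far.
    """
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean()
    # ddof=1: unbiased sample standard deviation
    std_err = samples.std(ddof=1) / np.sqrt(samples.size)
    return mean, std_err

def run_until_tolerance(draw_batch, tol, batch=1024, max_samples=1_000_000):
    """Keep drawing batches until the standard error falls below an
    *absolute* tolerance tol (my reading of requiredTolerance)."""
    samples = np.empty(0)
    while samples.size < max_samples:
        samples = np.concatenate([samples, draw_batch(batch)])
        _, err = mc_estimate(samples)
        if err <= tol:
            break
    return mc_estimate(samples)

# Toy payoff distribution standing in for discounted option payoffs
rng = np.random.default_rng(42)
mean, err = run_until_tolerance(lambda n: rng.normal(10.0, 2.0, n), tol=1e-2)
```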