I think time frame matters. I did a histogram of daily SPY price changes from 1993 to 2019; it is almost Gaussian, with fat tails and zero mean. However, the histogram of yearly price changes for the same period is not Gaussian and has a non-zero mean.
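The scaling behind that observation can be sketched with synthetic data (real SPY closes would be loaded from a CSV; the Student-t draws here are just a stand-in for fat-tailed daily returns, and the drift and volatility numbers are illustrative assumptions, not fitted values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for ~27 years of daily log returns: small positive drift,
# fat-ish tails via Student-t (not actual SPY data)
n_years, days_per_year = 27, 252
daily = 0.0003 + 0.01 * rng.standard_t(df=4, size=n_years * days_per_year)

# Daily mean is tiny relative to daily std, so the histogram looks zero-centered
print(daily.mean() / daily.std())

# Aggregating to yearly returns: the mean grows ~252x but the std only
# ~sqrt(252)x, so the yearly histogram shifts visibly away from zero
yearly = daily.reshape(n_years, days_per_year).sum(axis=1)
print(yearly.mean() / yearly.std())
```

The mean-to-std ratio is roughly sqrt(252) ≈ 16x larger at the yearly horizon, which is why the zero-mean appearance is a property of the sampling window, not of the underlying drift.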
I'm not trying to win any argument, just trying to have a healthy discussion and build a different viewpoint. Cheers!
I don't think, though; I know, and that's the difference. Now if you really have something that you KNOW works and you make money on it, obviously that's a fact, so you can disregard what I say. All I can tell you is I KNOW time frames don't make a difference as far as an edge goes, generally speaking. I know because I've seen live examples, I've seen the math in both numerical and visual form, and I also put real live trades on the line every day using it. That doesn't mean there's no strategy or situation where a bigger time frame wouldn't have made you profitable when a small one didn't. I would never claim that; there are so many different trades happening that of course that would be a dumb statement on my part. I am simply saying that if you find a technical edge, as far as pattern prediction or expectation of price movement and continuation, that works over and over consistently, then changing the time frame will not change that edge. Yes, a smaller time frame of course means less ATR, etc., and if you're trading huge size you will need a bigger time frame to get all your shares/contracts in without moving the market and destroying your own edge (I don't think there are many people on here who have to worry about that).
It was 20 years ago and the Matlab files are lost to drive technology changes, so I'm speaking from memory that probably isn't 100% accurate. I remember converting daily returns on indexes and a bunch of individual stocks from different sectors, over periods of a few weeks to a year, into the frequency domain. A couple of relevant points made here would argue for someone redoing my work: some aspects of markets have undeniably changed over the last 20 years, and I didn't have access to transaction-level or even minute-level data then, or the computational power to do the transforms on that much data. I definitely would have done EKF (extended Kalman filter) work, because my main capstone project was a novel use of GPS signals where I was using EKFs a lot. My general recollection is that the EKFs tend to diverge rapidly on their own tangent, and the lack of clear frequency components tends to amplify that tendency.
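Since the original Matlab files are gone, here is a minimal Python sketch of the transform step being described: taking a return series into the frequency domain with an FFT and looking for dominant components. The series is simulated, and the injected 6-day sine is artificial, purely to show what a genuine periodicity would look like against the noise floor (real returns generally show no such peak, which matches the recollection above):

```python
import numpy as np

rng = np.random.default_rng(1)

# One year of simulated daily returns: white noise plus one artificial
# 6-day cycle (the cycle is a planted signal, not a claim about markets)
n = 252
returns = rng.normal(0, 0.01, n) + 0.006 * np.sin(2 * np.pi * np.arange(n) / 6)

# Magnitude spectrum of the demeaned series; frequencies in cycles/trading day
spec = np.abs(np.fft.rfft(returns - returns.mean()))
freqs = np.fft.rfftfreq(n, d=1.0)

# Largest non-DC bin recovers the planted 6-day period
peak = freqs[np.argmax(spec[1:]) + 1]
print(f"dominant period = {1 / peak:.1f} days")
```

On pure noise (drop the sine term), the argmax lands on an arbitrary bin and the spectrum is flat, which is the "lack of clear frequency components" described.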
A few comments from Nassim Taleb on GARCH. For silver, in 46 years 94 percent of the kurtosis came from one single observation. We cannot use standard statistical methods with financial data. GARCH (a method popular in academia) does not work because we are dealing with squares: the variance of the squares is analogous to the fourth moment, and we do not know the variance. But we can work very easily with Pareto distributions. They give us less information, but it is nevertheless more rigorous if the data are uncapped or if there are any open variables.

This is a diagnostics tour of the properties of the SP500 index over its history. We engage in a battery of tests and check what statistical picture emerges. Clearly, its returns are power-law distributed (with some added complications, such as an asymmetry between upside and downside), which, again, invalidates common methods of analysis. We look, among other things, at:

• The behavior of kurtosis under aggregation (as we lengthen the observation window)
• The behavior of the conditional expectation E(X|X>K) for various values of K
• The maximum-to-sum plot (MS plot)
• Drawdowns (that is, maximum excursions over a time window)
• Extremes and records, to see if extremes are independent

These diagnostics allow us to confirm that an entire class of analyses in L2, such as modern portfolio theory, factor analysis, GARCH, conditional variance, or stochastic volatility, are methodologically (and practically) invalid. Of course there's more information from him and others regarding this type of analysis.
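The "94 percent of the kurtosis from one observation" point is easy to check numerically. This sketch (my own illustration, not Taleb's code) measures the share of the fourth-moment sum contributed by the single largest observation, for a thin-tailed versus a fat-tailed sample; the Pareto tail index of 1.5 is an arbitrary choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def max_kurtosis_share(x):
    """Fraction of the sum of (x - mean)^4 contributed by the single
    largest observation -- the statistic behind the silver example."""
    d4 = (x - x.mean()) ** 4
    return d4.max() / d4.sum()

# Thin-tailed Gaussian sample: no single draw dominates the fourth moment
gauss = rng.normal(size=10_000)

# Fat-tailed Pareto sample (tail index 1.5, so the fourth moment is
# theoretically infinite): one draw can carry most of the measured kurtosis
pareto = rng.pareto(1.5, size=10_000)

print(max_kurtosis_share(gauss))   # tiny
print(max_kurtosis_share(pareto))  # typically a large fraction
```

When one observation carries most of the fourth moment, the sample kurtosis never converges as you add data, which is the stated reason variance-of-squares methods like GARCH break down on such series.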
Which book of his was that from? His work in Dynamic Hedging is pretty solid. From there on it becomes increasingly tedious grandstanding and you have to take the unsupported rants against "academia" with a massive grain of salt.
I am really not trying to be a d*ck, so if I come across like that it isn't my intention. But how do you know some aspects of the markets have undeniably changed in the last 20 years? I mean, sure, at face value you can defend that statement and it's true. I am speaking about price prediction, pattern recognition, etc. That hasn't changed as far back as I can go. Now what has changed is of course things like QE, major events, etc. That hasn't changed how price moves; it has only changed the velocity and how long a certain pattern stays in effect. Example: say there's a buy trigger pre-QE. That trend would end sooner and reverse, and mean reversion to the downside wasn't quite as hard to achieve. After QE the same buy trigger worked, but now it would continue further due to the easing and fresh funds coming in. So yes, of course you had to adapt your strategy, particularly if you're a long-and-short trader. But the fundamental way that price moves hasn't changed at all; the buy triggers just last longer and there are way fewer short triggers. I totally get that you need a way to effectively measure this, and I'm not claiming it's easy. I am just stating what I know, and was curious exactly what you meant by your comment and whether you think that or actually know that.
My fault for not being more specific. I was talking about some known things that have changed, like the volatility skew for puts, which actually goes back a bit further than 20 years but is the cleanest example. If I were to purely theorize, I might hypothesize that HFT has introduced some short-duration harmonics that weren't there before, which you might be able to see if you were looking at trade-by-trade data in the frequency domain. Also, I'm solely discussing what you see, if anything, when you transform the price changes of a security over time into the frequency domain, including performing various filtering while in the frequency domain. It's a somewhat arcane corner of digital signal processing that EEs get excited about, and it's also the underlying basis for Kalman filtering among other DSP filtering methods (getting back to the thread topic). So like you, I'm not trying to be a dick, so I apologise if it comes off that way, but this is a couple of levels of rigor above triggers and pattern matching: it's looking to see whether those patterns truly exist when you examine them in the frequency domain. In my investigation they didn't, but I can see why it may still be a fruitful area for research if you're already familiar with the DSP space from another profession, or if you want to spend a few months coming up to speed and already have the math background needed.
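A minimal sketch of the kind of frequency-domain filtering being described: transform the series, zero the high-frequency bins, and transform back. The price series is simulated, the cutoff of 10 bins is arbitrary, and a real analysis would use actual data and proper filter design rather than this brick-wall cutoff (which causes Gibbs ringing at the edges); it only illustrates the round trip through the frequency domain:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated daily closes: geometric random walk as a stand-in for real prices
n = 252
prices = 100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, n)))

# FFT, zero everything above a cutoff, inverse FFT: the crudest low-pass filter
spec = np.fft.rfft(prices)
cutoff = 10                # keep only the 10 lowest-frequency bins (arbitrary)
spec[cutoff:] = 0
smoothed = np.fft.irfft(spec, n)

# The filtered series varies far less day-to-day than the raw one
print(np.std(np.diff(smoothed)) < np.std(np.diff(prices)))
```

If markets had stable periodic components, energy would concentrate in a few bins and this kind of filtering would leave a usable signal; finding the energy spread flat across bins is the negative result described above.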