
MU
Micron Technology, Inc.
stock NASDAQ

At Close
Dec 5, 2025 3:59:57 PM EST
237.18 USD  +4.646% (+10.53)  Volume: 21,160,980
Bid: 0.00   Ask: 0.00   Spread: 0.00
Pre-market
Dec 5, 2025 9:28:30 AM EST
229.25 USD  +1.147% (+2.60)  Volume: 358,896
After-hours
Dec 5, 2025 4:58:30 PM EST
236.33 USD  -0.358% (-0.85)  Volume: 98,562
MU Reddit Mentions

We have sentiment values and mention counts going back to 2017. The complete data set is available via the API.
MU Specific Mentions
As of Dec 7, 2025 8:13:40 AM EST (7 minutes ago)
Includes all comments and posts. Mentions per user per ticker capped at one per hour.
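As a rough illustration of how such a cap can be applied (ChartExchange's actual pipeline is not public, so the column names and logic below are assumptions), a pandas-style sketch might look like this:

import pandas as pd

def cap_mentions(mentions: pd.DataFrame) -> pd.DataFrame:
    # Assumes columns 'user', 'ticker', and a datetime 'timestamp';
    # keeps at most one mention per user per ticker per clock hour.
    mentions = mentions.copy()
    mentions['hour'] = mentions['timestamp'].dt.floor('h')
    return (mentions.sort_values('timestamp')
                    .drop_duplicates(subset=['user', 'ticker', 'hour'], keep='first')
                    .drop(columns='hour'))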
7 min ago • u/shadowpawn • r/stocks • so_whats_your_game_plan_for_2026 • C
Yes, $MU turned into my #2 stock in 2025, behind my $SOFI.
sentiment 0.54
5 hr ago • u/ChairmanMeow1986 • r/ValueInvesting • a_logical_answer_to_the_question_when_should_i • C
Look, I like MU here; this just isn't the place to talk about it. It's my semiconductor pick on valuation, but r/ValueInvesting is not the place to document a trade like this right now. So I will, again, limit myself to hoping it works out for you. Respect the community you post in.
sentiment 0.95
7 hr ago • u/themarketapex • r/wallstreetbets • weekend_discussion_thread_for_the_weekend_of • C
- RAM is great again
- That basically means MU = NVDA?
- So MU = 4T market cap
- Therefore MU 350C 12/13?
I don't see how this can possibly go wrong!
sentiment 0.39
7 hr ago • u/Rexobe • r/mauerstrassenwetten • 2_advent_rädern • B
*Ruprecht, Ruprecht, guter Gast*
*Hast du mir was mitgebracht?*
*Hast du Calls, dann setz dich nieder,*
*Hast du Puts, dann geh nur wieder.*
Da der erste Teil ganz gut angekommen ist, machen wir heute zum zweiten Advent mit der zweiten Strategie weiter. Das Rädern.
*Das Rädern (auch: Radebrechen, radebreken, mit dem rade stozen, mittelhochdeutsch rederen, oder auf das Rad flechten) war eine vom europäischen Mittelalter bis in die Frühe Neuzeit praktizierte Form der Hinrichtung, bei der den Betroffenen mit einem großen Wagenrad, dem Richtrad, zunächst die Glieder gebrochen wurden, um sie anschließend zwischen die Radspeichen zu flechten und zur Abschreckung auszustellen. Als besonders qualvolle und entehrende Todesart war das Rädern der Sanktionierung von Kapitalverbrechen vorbehalten. Quelle:* [*https://de.wikipedia.org/wiki/R%C3%A4dern*](https://de.wikipedia.org/wiki/R%C3%A4dern)
Klingt ungefähr wie das, was ich an der Börse jeden Tag fühle.
Grundsätzlich ist das Rädern eine sehr einfache Strategie und deswegen erfreut sie sich auf LasEs wohl auch größter Beliebtheit. Im Prinzip funktioniert es so: Man verkauft Aufleger einer Aktie, die man bereit ist zum Preis des Auflegers zu kaufen und das ganze spielt man so lange bis man assignt wird. Anschließend verkauft man gedeckte Anrufe, bis man auch dort assignt wird und die Stöcker wieder aus der Deponie ausgebucht werden.
Je mehr ihr darüber lest, desto mehr Zusatzregeln könnt ihr zu der Strategie finden. Zum Beispiel Anrufe nur über dem Preis verkaufen, zu dem die Stöcker eingebucht wurden. Oder nur so viele Aufleger verkaufen, wie ihr mit Geld besichern könnt.
Das mach ich natürlich nicht, ich spiele das ganze komplett auf Margarine, da ich ja auch ohne Rädern schon zu mindestens 100% im Markt investiert bin.
Der erste Schritt zum Rädern ist also die Auswahl der richtigen Aktie. Wie ich das mache, ist relativ einfach: Ich nehme die größte Bumsbude, die gerade jemand im Täglichen erwähnt. Je mehr implizite Volatilität, desto besser und wenn sie dann gerade noch in einem Aufwärtstrend ist, fang ich an zu rädern. Weiter unten findet ihr die Liste an Tickern, die ich genutzt habe. Absolut retardiert, ich weiß.
[Es ist literarisch so](https://preview.redd.it/a432vpuitk5g1.png?width=468&format=png&auto=webp&s=52b8e2e0109d7e8091efa58fe4b82a01db41f257)
Ein typisches Setz-auf, bei mir würde also zum Beispiel so aussehen:
[Ticker: $RKLB, 32 Tage Restlaufzeit, 38$ Put beim Aktienkurs von 45,54$, 162$ Prämie \(4,26%\)](https://preview.redd.it/f6noduuntk5g1.png?width=904&format=png&auto=webp&s=0d8062f46ee056b9c89cddcea385071588d21440)
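For anyone who wants to sanity-check the numbers in that example, here is a minimal sketch of the premium-yield arithmetic; the strike, premium, and days to expiration come from the example above, while the annualization is my own back-of-the-envelope addition.

strike = 38.0            # short put strike from the $RKLB example
premium = 162.0          # premium received for one contract (100 shares)
days_to_expiry = 32

collateral = strike * 100                       # cash needed if the put is assigned
yield_pct = premium / collateral * 100          # 162 / 3800 ≈ 4.26 %
annualized = yield_pct * 365 / days_to_expiry   # ≈ 48.6 % if repeated all year, ignoring assignments

print(f"{yield_pct:.2f}% per cycle, ~{annualized:.1f}% annualized")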
As you can see, we once again successfully cap our gains and let the losses run free.
I was assigned a total of 18 times this year on 102 sold puts. As mentioned above, I then immediately sell covered calls against the shares, which, just like the puts, pay about 5% premium. Many people only sell calls above the price at which the shares were booked in; I don't. How am I supposed to know whether the stock will ever reach that price again, or whether it will just sit in my depot as a bag forever? No, the wheel has to keep turning.
And here is the list of tickers with the gains/losses assigned to them over the year:
|Wheel Bullish|$16,678.04|
|:-|:-|
|ASTS|$5,804.27|
|HIMS|$1,624.56|
|WOLF|$337.29|
|CELH|-$2,696.09|
|NVDA|$1,571.98|
|CHWY|$500.35|
|MU|-$559.44|
|AAL|$132.84|
|RKLB|$4,481.17|
|RDDT|-$2,383.88|
|HOOD|$2,963.97|
|AMZN|-$1,981.78|
|FTNT|$124.78|
|GME|$2,624.64|
|BRK B|$384.22|
|GOOG|$1,214.11|
|UNH|-$499.26|
|CRWV|-$3,409.33|
|NET|$1,271.11|
|PARA|$218.40|
|RBLX|$40.95|
|SMCI|-$1,153.41|
|ONDS|$1,474.68|
|BE|$3,557.81|
|BABA|$386.38|
|OKLO|$373.15|
|NBIS|-$891.95|
|VSAT|$477.89|
|APLD|-$470.45|
|SHOP|$1,222.94|
|IREN|-$162.19|
|SOFI|$98.33|
But even I am not bullish on every single stock. And every now and then I simply need money, because I don't like it when the cash balance in my depot is any number other than 0, positive or negative. Which brings us to the bearish wheel. That may be new to some of you.
First I short a stock that u/laStealer recommends shorting (e.g. $ON or $NKE) and then sell puts against it. If the price then drops below the strike price, the missing shares get booked back in.
[Ticker: $JACK, 32 days to expiration, $12.50 put at a share price of $14.94](https://preview.redd.it/uro90xsjuk5g1.png?width=904&format=png&auto=webp&s=bdf3ba64a63af639ddf428dd1cf3fa43cba1926c)
I'm essentially only running the second half of the wheel here. You could also write naked calls without shorting the stock first and wait for them to be assigned, so that a negative number of shares gets booked into your depot.
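To make the mechanics of this bearish leg a bit more concrete, here is a small payoff sketch at expiry; the short-entry price and strike come from the $JACK example above, while the put premium is a made-up placeholder.

short_entry = 14.94   # price at which 100 shares are sold short ($JACK example)
put_strike = 12.50    # strike of the put sold against the short position
put_premium = 0.45    # assumed premium per share (hypothetical, not from the post)

for expiry_price in (11.00, 12.50, 14.00, 16.00):
    stock_pnl = (short_entry - expiry_price) * 100
    # Short put: assigned below the strike (buys the shares back), expires worthless above it
    put_pnl = (put_premium - max(put_strike - expiry_price, 0)) * 100
    print(f"price {expiry_price:5.2f}: short stock {stock_pnl:+9.2f}, short put {put_pnl:+8.2f}, total {stock_pnl + put_pnl:+9.2f}")

Below the strike the combined profit is capped at (short_entry - put_strike + put_premium) per share, which is exactly the "cap the gains, let the losses run" shape the post jokes about.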
On the bearish side, I played this over the year with the following companies:
|Wheel Bearish|$4,920.93|
|:-|:-|
|TOL|$894.41|
|PTON|$834.40|
|CMG|$580.73|
|ON|$817.72|
|JACK|$2,245.71|
|NKE|-$162.37|
|CAVA|-$289.67|
^(Not investment advice. Don't do this; cap your losses, not your gains.)
sentiment -1.00
12 hr ago • u/jyl8 • r/ValueInvesting • a_logical_answer_to_the_question_when_should_i • C
I assume we’re talking about selling a stock that has done well.
Most of the time, I have a valuation range in mind, usually P/E; for certain deep cyclicals, could be P/S; for banks could be P/TB; for REITs, usually P/AFFO; etc. Hopefully I bought when the stock was at the low end. When the stock is at the high end, I ask whether I think the denominator is going to accelerate big time for some reason or something else will happen to increase the valuation range. If no then I’ll typically sell half, and then sell the rest when the stock starts rolling over. If yes then I usually don’t sell.
Example: I bought ULTA in 2024 at various prices between $350 and $400. At $550 it was bumping against the top of its valuation range and I sold half. Now it is $600 and I need to figure out if I overlooked coming acceleration or revaluation, thus if I’m going to finish exiting.
Of course, it is just as easy to make a mistake on the exit as the entry.
Example: I bought MU in 2022 around $60, exited in 2025 around $130, watched it go down and thought “boy am I smart”, then turned my attention to other names, now it is over $200 and it’s clear I was actually very stupid - failed to re-assess for acceleration/revaluation.
sentiment 0.10
14 hr ago • u/coconutconsidered • r/wallstreetbets • options_are_free_money_chip_shortageamdnvidia_ram • C
MU $400 by end of 2026. The memory shortage is just getting started.
sentiment -0.25
14 hr ago • u/Hiro-Nishi • r/wallstreetbets • options_are_free_money_chip_shortageamdnvidia_ram • C
Okay, I will switch from GOOG and AMZN to NVDA and MU.
sentiment 0.23
14 hr ago • u/featherbirdcalls • r/wallstreetbets • options_are_free_money_chip_shortageamdnvidia_ram • C
What calls on MU or EWY would you buy now?
sentiment 0.46
15 hr ago • u/luciusbentley7 • r/wallstreetbets • options_are_free_money_chip_shortageamdnvidia_ram • C
This is the way. For sure through 2026 in my own opinion. MU just set up to begin its uptrend again after a correction. Going to run up into earnings we hope. Get ready to rock.
sentiment 0.80
15 hr ago • u/tugrulthelol • r/wallstreetbets • options_are_free_money_chip_shortageamdnvidia_ram • C
Hell yeah, I've also gone full port into MU calls and shares. I think the RAM shortage has the greatest upside, especially since NVDA and the like have already reached sky-high valuations. But why EWY, though? To get exposure to Samsung or something?
sentiment 0.52
15 hr ago • u/grassmunkie • r/wallstreetbets • options_are_free_money_chip_shortageamdnvidia_ram • C
I haven't looked at EWY, but I suppose you're using it as a proxy for Samsung and SK Hynix. Seems reasonable, but MU is plenty for me. I like the pure play on DRAM and a little NAND.
There is no end yet in sight to the DRAM supply issue, so I think we still have time to run. The RAM party usually ends 6-12 months before the supply issues start being addressed, but 2026 is sold out and 2027 is when the TPU and Rubin go into volume production. The only thing that ends the party early is if OpenAI goes bust, but they have Microsoft, SoftBank, and Nvidia standing behind them with deep pockets. Anything can happen, but this cycle is definitely the craziest one we've ever seen.
sentiment 0.90
15 hr ago • u/Legitimate_Paint2018 • r/stockstobuytoday • which_do_you_hold • Discussion • B
1. $OKLO +393%
2. $IREN +355%
3. $OPEN +347%
4. $CIFR +316%
5. $MP +298%
6. $ONDS +254%
7. $HOOD +254%
8. $NBIS +254%
9. $ASTS +250%
10. $QBTS +221%
11. $JMIA +218%
12. $PL +216%
13. $EOSE +208%
14. $MU +182%
15. $PGY +167%
sentiment -0.60
16 hr ago • u/Moonchips12345 • r/investing • pick_your_3_growth_stocks_for_2026 • C
MU, NVO, GOOG
sentiment 0.00
19 hr ago • u/Independent_Care9569 • r/investing • pick_your_3_growth_stocks_for_2026 • C
MU - 2026 is the year we all become very familiar with the memory oligopoly. This stock will be $400 by EOY 2026.
sentiment 0.00
21 hr ago • u/LOLIMJESUS • r/investing • pick_your_3_growth_stocks_for_2026 • C
MU and DDOG will continue to benefit greatly from the AI / datacenter arms race. Also AFRM, as the consumer will need a way to continue their expensive lifestyle habits.
sentiment 0.46
22 hr ago • u/Evening_Squirrel_754 • r/investing • pick_your_3_growth_stocks_for_2025 • C
AVGO, WDC, MU
sentiment 0.00
23 hr ago • u/grassmunkie • r/wallstreetbets • has_mu_truly_exited_the_crucial_consumer_business • C
It's been a while since I've been as bullish on a stock as I am on MU.
Trade of a lifetime.
sentiment 0.63
1 day ago • u/Leo6-2 • r/algorithmictrading • second_opinion_needed_i_recreated_weekly_rotation • Backtest • B
Weekly Rotation Strategy vs SPY buy and hold

Hey everyone, I recreated a trading strategy from a book by a trader who now teaches others, so I figure it's legit and not just hype. But now I'm stuck: it's outputting as a vector, and I'm questioning whether my backtest results are realistic or my code is off.
Where do I go from here? I could run walk-forward tests or Monte Carlo simulations, but realistically, since it's based on weekly candles, I can handle entries/exits manually and use it more like an indicator—no execution issues there, right? The main doubt is whether I backtested it correctly, so I'd love a second opinion on validating it properly, like manual charting or key metrics (win rate, drawdown).
This is the strategy:
The Weekly Rotation strategy is a simple, long-only momentum approach for S&P 500 stocks. It requires just one weekly check (typically Friday after close) to select and rotate into the top 10 strongest performers, aiming to beat the S&P 500 with lower drawdowns by staying in cash during bear markets.​
# Key Requirements
* **Universe**: All current, delisted, and joining/leaving S&P 500 stocks for full testing.
* **Filters**: Stocks must have 20-day average volume > 1M shares and price > $1 USD.
* **Market Condition**: SPY close must be above its 200-day SMA (with a 2% buffer below); a small numeric sketch of this check follows the rule lists.
* **Max Positions**: 10 stocks, each sized at 10% of total equity (e.g., $100K equity = $10K per position).
# Entry Rules
* On Friday close, confirm market is "up" (SPY > 200-day SMA band).
* From filtered stocks, select those with 3-day RSI < 50 (avoids overbought).
* Rank by highest 200-day Rate of Change (ROC, or % gain); pick top 10.
* Buy all positions market-on-open Monday.​
# Exit and Rotation Rules
* Every Friday, re-rank stocks by 200-day ROC.
* Hold if still in top 10; sell and replace if dropped out (market-on-open next day).
* No hard stops normally (rotation handles weakness), but optional 20% stop loss per position if desired.
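To make the 2% buffer in the Market Condition rule concrete, here is a minimal numeric sketch; the SPY values are invented, and the 0.98 multiplier simply mirrors the rule above and the `calculate_indicators` code below.

spy_close = 590.0
spy_sma_200 = 600.0

band = spy_sma_200 * 0.98        # 2% below the 200-day SMA -> 588.0
market_is_up = spy_close > band  # 590.0 > 588.0 -> True, new entries are allowed

print(market_is_up)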
"""
Bensdorp Weekly Rotation Strategy - CORRECTED Implementation
Based on "The 30-Minute Stock Trader" by Laurens Bensdorp

pip install pandas numpy yfinance matplotlib seaborn
"""

import pandas as pd
import numpy as np
from pathlib import Path
from datetime import datetime, timedelta
from typing import Dict, List, Tuple, Optional
import warnings
warnings.filterwarnings('ignore')

try:
    import yfinance as yf
except ImportError:
    import subprocess
    subprocess.check_call(['pip', 'install', 'yfinance'])
    import yfinance as yf

try:
    import matplotlib.pyplot as plt
    import seaborn as sns
except ImportError:
    import subprocess
    subprocess.check_call(['pip', 'install', 'matplotlib', 'seaborn'])
    import matplotlib.pyplot as plt
    import seaborn as sns

sns.set_style('darkgrid')


# ============================================================================
# DATA LAYER - Parquet-based local database
# ============================================================================

class MarketDataDB:
    """Local market data storage using Parquet files"""

    def __init__(self, db_path: str = "./market_data"):
        self.db_path = Path(db_path)
        self.db_path.mkdir(parents=True, exist_ok=True)
        self.price_path = self.db_path / "prices"
        self.price_path.mkdir(exist_ok=True)

    def _get_ticker_file(self, ticker: str) -> Path:
        return self.price_path / f"{ticker}.parquet"

    def download_ticker(self, ticker: str, start_date: str, end_date: str,
                        force_refresh: bool = False) -> pd.DataFrame:
        """Download and cache ticker data"""
        file_path = self._get_ticker_file(ticker)

        if file_path.exists() and not force_refresh:
            df = pd.read_parquet(file_path)
            df.index = pd.to_datetime(df.index)
            last_date = df.index[-1].date()
            today = datetime.now().date()

            if (today - last_date).days <= 1:
                return df[start_date:end_date]
            else:
                new_data = yf.download(ticker, start=last_date, end=end_date,
                                       progress=False, auto_adjust=True)
                if not new_data.empty:
                    df = pd.concat([df, new_data[new_data.index > df.index[-1]]])
                    df.to_parquet(file_path)
                return df[start_date:end_date]

        print(f"Downloading {ticker}...")
        try:
            df = yf.download(ticker, start=start_date, end=end_date,
                             progress=False, auto_adjust=True)
            if not df.empty:
                df.to_parquet(file_path)
            return df
        except Exception as e:
            print(f"Error downloading {ticker}: {e}")
            return pd.DataFrame()

    def download_universe(self, tickers: List[str], start_date: str,
                          end_date: str, force_refresh: bool = False) -> Dict[str, pd.DataFrame]:
        """Download multiple tickers"""
        data = {}
        failed = []
        for ticker in tickers:
            try:
                df = self.download_ticker(ticker, start_date, end_date, force_refresh)
                if not df.empty and len(df) > 220:  # Need 200+ for indicators + buffer
                    data[ticker] = df
                else:
                    failed.append(ticker)
            except Exception as e:
                failed.append(ticker)

        if failed:
            print(f"Skipped {len(failed)} tickers with insufficient data")

        return data


# ============================================================================
# INDICATOR CALCULATIONS - CORRECTED
# ============================================================================

class TechnicalIndicators:
    """Technical indicators - EXACT book methodology"""

    @staticmethod
    def sma(series: pd.Series, period: int) -> pd.Series:
        """Simple Moving Average"""
        return series.rolling(window=period, min_periods=period).mean()

    @staticmethod
    def rsi_wilder(series: pd.Series, period: int = 3) -> pd.Series:
        """
        CORRECTED: Wilder's RSI using exponential smoothing
        Book uses 3-day RSI < 50 to avoid overbought stocks

        This is THE critical fix - original used simple moving average
        """
        delta = series.diff()

        # Separate gains and losses
        gain = delta.where(delta > 0, 0)
        loss = -delta.where(delta < 0, 0)

        # Wilder's smoothing: use exponential weighted moving average
        # alpha = 1/period gives the Wilder smoothing
        avg_gain = gain.ewm(alpha=1/period, min_periods=period, adjust=False).mean()
        avg_loss = loss.ewm(alpha=1/period, min_periods=period, adjust=False).mean()

        rs = avg_gain / avg_loss
        rsi = 100 - (100 / (1 + rs))

        return rsi

    @staticmethod
    def roc(series: pd.Series, period: int = 200) -> pd.Series:
        """
        Rate of Change (Momentum)
        Book: "highest rate of change over last 200 trading days"
        """
        return ((series - series.shift(period)) / series.shift(period)) * 100


# ============================================================================
# STRATEGY IMPLEMENTATION - CORRECTED BOOK RULES
# ============================================================================

class BensdorpWeeklyRotation:
    """
    Weekly Rotation Strategy - CORRECTED implementation

    CRITICAL DIFFERENCES FROM BROKEN VERSION:
    1. Uses Wilder's RSI (exponential), not SMA-based RSI
    2. Executes on MONDAY OPEN, not Friday close
    3. Top 10 selection FIRST, then RSI filter for NEW entries only
    4. Proper rotation: keep anything in top 10, exit anything that drops out

    Entry Rules (Friday evening analysis, Monday morning execution):
    1. Friday close: Check SPY > 200-day SMA (with 2% buffer)
    2. Friday close: Rank all stocks by 200-day ROC
    3. Friday close: Select top 10 by momentum
    4. Friday close: For NEW entries only, filter RSI < 50
    5. Monday open: Execute trades

    Exit Rules:
    1. Hold as long as stock remains in top 10 by ROC
    2. Exit when stock drops out of top 10
    3. No stop losses (rotation serves as exit)
    """

    def __init__(self, initial_capital: float = 10000):
        self.initial_capital = initial_capital
        self.capital = initial_capital
        self.positions = {}  # {ticker: shares}
        self.trades = []
        self.equity_curve = []
        self.indicators = TechnicalIndicators()

    def calculate_indicators(self, data: Dict[str, pd.DataFrame],
                             spy_data: pd.DataFrame) -> pd.DataFrame:
        """Calculate indicators - Friday close data"""

        # Need at least 200 days of SPY data
        if len(spy_data) < 200:
            return pd.DataFrame()

        # Calculate SPY market regime
        spy_sma = self.indicators.sma(spy_data['Close'], 200)
        spy_sma_band = spy_sma * 0.98  # 2% buffer

        # Check if SPY SMA is valid (not NaN)
        spy_sma_value = spy_sma.iloc[-1]
        if isinstance(spy_sma_value, pd.Series):
            spy_sma_value = spy_sma_value.iloc[0]
        if pd.isna(spy_sma_value):
            return pd.DataFrame()

        spy_close_value = spy_data['Close'].iloc[-1]
        if isinstance(spy_close_value, pd.Series):
            spy_close_value = spy_close_value.iloc[0]
        spy_close = float(spy_close_value)

        spy_band_value = spy_sma_band.iloc[-1]
        if isinstance(spy_band_value, pd.Series):
            spy_band_value = spy_band_value.iloc[0]
        spy_band = float(spy_band_value)

        indicator_data = []

        for ticker, df in data.items():
            if len(df) < 203:  # Need 200 for ROC + 3 for RSI
                continue

            try:
                # Calculate indicators using CORRECTED methods
                rsi_3 = self.indicators.rsi_wilder(df['Close'], 3)  # WILDER'S RSI
                roc_200 = self.indicators.roc(df['Close'], 200)

                # Get values
                last_rsi = float(rsi_3.iloc[-1])
                last_roc = float(roc_200.iloc[-1])
                last_close = float(df['Close'].iloc[-1])
                last_volume = float(df['Volume'].iloc[-1])

                # Skip if NaN
                if pd.isna(last_rsi) or pd.isna(last_roc):
                    continue

                # Calculate 20-day average volume for liquidity filter
                avg_volume_20 = float(df['Volume'].rolling(20).mean().iloc[-1])

                indicator_data.append({
                    'ticker': ticker,
                    'date': df.index[-1],
                    'close': last_close,
                    'volume': last_volume,
                    'avg_volume_20': avg_volume_20,
                    'rsi_3': last_rsi,
                    'roc_200': last_roc,
                    'spy_close': spy_close,
                    'spy_sma_band': spy_band
                })

            except Exception:
                continue

        return pd.DataFrame(indicator_data)

    def get_weekly_signals(self, indicators: pd.DataFrame) -> Tuple[List[str], List[str]]:
        """
        CORRECTED rotation logic - matches book exactly

        Key insight: "Solution C" from C# code:
        1. Rank ALL stocks by momentum
        2. Top 10 = target portfolio
        3. KEEP: anything we hold that's still in top 10
        4. ENTER: new positions from top 10, but ONLY if RSI < 50
        5. EXIT: anything not in top 10
        """

        if indicators.empty:
            return [], []

        # Extract SPY regime
        spy_close = float(indicators['spy_close'].iloc[0])
        spy_band = float(indicators['spy_sma_band'].iloc[0])

        # Check market regime: SPY > 200 SMA band
        if spy_close <= spy_band:
            # Bear market: exit everything
            return [], list(self.positions.keys())

        # Filter valid stocks (liquidity + price)
        valid = indicators[
            (indicators['close'] > 1.0) &
            (indicators['avg_volume_20'] > 1_000_000)
        ].copy()

        if valid.empty:
            return [], list(self.positions.keys())

        # STEP 1: Rank by 200-day ROC (momentum)
        valid = valid.sort_values('roc_200', ascending=False)

        # STEP 2: Top 10 by momentum = TARGET PORTFOLIO
        top_10 = valid.head(10)
        top_10_tickers = set(top_10['ticker'].values)

        # STEP 3: KEEP - positions we already hold that are still in top 10
        keeps = [t for t in self.positions.keys() if t in top_10_tickers]

        # STEP 4: ENTER - new positions from top 10 with RSI < 50 filter
        available_slots = 10 - len(keeps)

        # Filter top 10 for new entries: must have RSI < 50 and we don't already hold it
        entry_candidates = top_10[
            (~top_10['ticker'].isin(self.positions.keys())) &
            (top_10['rsi_3'] < 50)
        ]

        enters = entry_candidates['ticker'].head(available_slots).tolist()

        # STEP 5: EXIT - anything we hold that's NOT in top 10
        exits = [t for t in self.positions.keys() if t not in top_10_tickers]

        return enters, exits

    def execute_trades(self, friday_date: datetime, enters: List[str], exits: List[str],
                       friday_data: Dict[str, pd.DataFrame],
                       monday_data: Dict[str, pd.DataFrame]):
        """
        CORRECTED: Execute trades at MONDAY OPEN, not Friday close

        friday_date: Date of signal generation
        friday_data: Data up to and including Friday (for portfolio valuation)
        monday_data: Data including Monday (for execution prices)
        """

        # Calculate portfolio value using Friday close prices
        portfolio_value = self.capital
        for ticker, shares in self.positions.items():
            if ticker in friday_data:
                try:
                    price = float(friday_data[ticker]['Close'].iloc[-1])
                    if not pd.isna(price):
                        portfolio_value += shares * price
                except (ValueError, TypeError, IndexError):
                    pass

        # Execute exits first (Monday open price)
        for ticker in exits:
            if ticker in self.positions and ticker in monday_data:
                shares = self.positions[ticker]
                try:
                    # Get Monday's open price
                    monday_open = float(monday_data[ticker]['Open'].iloc[-1])
                    if pd.isna(monday_open):
                        continue
                except (ValueError, TypeError, IndexError, KeyError):
                    # If no Open price, use Close
                    try:
                        monday_open = float(monday_data[ticker]['Close'].iloc[-1])
                    except:
                        continue

                proceeds = shares * monday_open
                self.capital += proceeds

                self.trades.append({
                    'date': monday_data[ticker].index[-1],  # Actual Monday date
                    'ticker': ticker,
                    'action': 'SELL',
                    'shares': shares,
                    'price': monday_open,
                    'value': proceeds
                })

                del self.positions[ticker]

        # Execute entries (Monday open price)
        if enters:
            position_size = portfolio_value * 0.10  # 10% per position

            for ticker in enters:
                if ticker in monday_data:
                    try:
                        # Get Monday's open price
                        monday_open = float(monday_data[ticker]['Open'].iloc[-1])
                        if pd.isna(monday_open) or monday_open <= 0:
                            continue
                    except (ValueError, TypeError, IndexError, KeyError):
                        try:
                            monday_open = float(monday_data[ticker]['Close'].iloc[-1])
                        except:
                            continue

                    shares = int(position_size / monday_open)
                    cost = shares * monday_open

                    if self.capital >= cost and shares > 0:
                        self.positions[ticker] = shares
                        self.capital -= cost

                        self.trades.append({
                            'date': monday_data[ticker].index[-1],  # Actual Monday date
                            'ticker': ticker,
                            'action': 'BUY',
                            'shares': shares,
                            'price': monday_open,
                            'value': cost
                        })

    def record_equity(self, date: datetime, data: Dict[str, pd.DataFrame]):
        """Record portfolio value at end of day"""
        portfolio_value = self.capital

        for ticker, shares in self.positions.items():
            if ticker in data:
                try:
                    price = float(data[ticker]['Close'].iloc[-1])
                    if not pd.isna(price):
                        portfolio_value += shares * price
                except (ValueError, TypeError, IndexError):
                    pass

        self.equity_curve.append({
            'date': date,
            'equity': float(portfolio_value),
            'cash': float(self.capital),
            'num_positions': len(self.positions)
        })


# ============================================================================
# BACKTESTING ENGINE - CORRECTED
# ============================================================================

class Backtester:
    """Backtest engine with CORRECTED execution timing"""

    def __init__(self, strategy: BensdorpWeeklyRotation, data_db: MarketDataDB):
        self.strategy = strategy
        self.data_db = data_db

    def run(self, universe: List[str], start_date: str, end_date: str,
            benchmark: str = 'SPY') -> pd.DataFrame:
        """Run backtest with MONDAY OPEN execution"""

        print(f"\n{'='*70}")
        print(f"BACKTEST: Bensdorp Weekly Rotation (CORRECTED)")
        print(f"Period: {start_date} to {end_date}")
        print(f"Universe: {len(universe)} stocks")
        print(f"Initial Capital: ${self.strategy.initial_capital:,.2f}")
        print(f"{'='*70}\n")

        # Download data
        print("Loading market data...")
        data = self.data_db.download_universe(universe, start_date, end_date)
        spy_data = self.data_db.download_ticker(benchmark, start_date, end_date)

        print(f"Loaded {len(data)} stocks with sufficient history\n")

        # Find all Fridays
        all_dates = spy_data.index
        fridays = []
        for i, date in enumerate(all_dates):
            if date.dayofweek == 4:  # Friday = 4
                fridays.append(date)

        print(f"Simulating {len(fridays)} weeks of trading...")
        print("Each week: Friday analysis → Monday execution\n")

        trades_count = 0
        for i, friday in enumerate(fridays):
            # Get data up to Friday close
            historical_data = {
                ticker: df.loc[:friday]
                for ticker, df in data.items()
                if friday in df.index
            }
            spy_historical = spy_data.loc[:friday]

            # Skip warmup period
            if len(spy_historical) < 200:
                continue

            # Calculate indicators (Friday close)
            indicators = self.strategy.calculate_indicators(
                historical_data, spy_historical
            )

            if indicators.empty:
                # Record equity even if no signals
                self.strategy.record_equity(friday, historical_data)
                continue

            # Get signals (Friday evening)
            enters, exits = self.strategy.get_weekly_signals(indicators)

            # Find next Monday for execution
            next_monday = None
            for future_date in all_dates[all_dates > friday]:
                if future_date.dayofweek == 0:  # Monday = 0
                    next_monday = future_date
                    break

            # If no Monday found (end of data), use next trading day
            if next_monday is None:
                next_available = all_dates[all_dates > friday]
                if len(next_available) > 0:
                    next_monday = next_available[0]
                else:
                    # End of data
                    self.strategy.record_equity(friday, historical_data)
                    continue

            # Get Monday data for execution
            monday_data = {
                ticker: df.loc[:next_monday]
                for ticker, df in data.items()
                if next_monday in df.index
            }

            # Execute trades (Monday open)
            if enters or exits:
                self.strategy.execute_trades(
                    friday, enters, exits,
                    historical_data, monday_data
                )
                trades_count += len(enters) + len(exits)

            # Record equity (use latest available data)
            latest_data = monday_data if monday_data else historical_data
            latest_date = next_monday if next_monday else friday
            self.strategy.record_equity(latest_date, latest_data)

            # Progress
            if (i + 1) % 50 == 0:
                current_equity = self.strategy.equity_curve[-1]['equity']
                print(f" Week {i+1}/{len(fridays)}: ${current_equity:,.0f}, "
                      f"{len(self.strategy.positions)} positions, {trades_count} total trades")

        print(f"\nBacktest complete! Total trades: {trades_count}\n")

        if not self.strategy.equity_curve:
            raise ValueError("No equity data recorded!")

        return pd.DataFrame(self.strategy.equity_curve).set_index('date')


# ============================================================================
# PERFORMANCE ANALYTICS
# ============================================================================

class PerformanceAnalytics:
    """Performance metrics calculation"""

    @staticmethod
    def calculate_metrics(equity_curve: pd.DataFrame,
                          benchmark_curve: pd.DataFrame,
                          risk_free_rate: float = 0.02) -> Dict:
        """Calculate all performance metrics"""

        strategy_returns = equity_curve['equity'].pct_change().dropna()
        benchmark_returns = benchmark_curve.pct_change().dropna()

        # Align dates
        common_dates = strategy_returns.index.intersection(benchmark_returns.index)
        strategy_returns = strategy_returns.loc[common_dates]
        benchmark_returns = benchmark_returns.loc[common_dates]

        # CAGR
        total_years = (equity_curve.index[-1] - equity_curve.index[0]).days / 365.25
        strategy_cagr = float(
            (equity_curve['equity'].iloc[-1] / equity_curve['equity'].iloc[0])
            ** (1 / total_years) - 1
        ) * 100

        benchmark_cagr = float(
            (benchmark_curve.iloc[-1] / benchmark_curve.iloc[0])
            ** (1 / total_years) - 1
        ) * 100

        # Maximum Drawdown
        cummax = equity_curve['equity'].cummax()
        drawdown = (equity_curve['equity'] - cummax) / cummax * 100
        max_dd = float(drawdown.min())

        bench_cummax = benchmark_curve.cummax()
        bench_drawdown = (benchmark_curve - bench_cummax) / bench_cummax * 100
        bench_max_dd = float(bench_drawdown.min())

        # MAR Ratio
        mar_ratio = abs(strategy_cagr / max_dd) if max_dd != 0 else 0
        bench_mar = abs(benchmark_cagr / bench_max_dd) if bench_max_dd != 0 else 0

        # Sharpe Ratio
        excess_returns = strategy_returns - (risk_free_rate / 252)
        sharpe = float(np.sqrt(252) * excess_returns.mean() / strategy_returns.std())

        bench_excess = benchmark_returns - (risk_free_rate / 252)
        bench_sharpe = float(np.sqrt(252) * bench_excess.mean() / benchmark_returns.std())

        # Sortino Ratio
        downside_returns = strategy_returns[strategy_returns < 0]
        sortino = (
            float(np.sqrt(252) * excess_returns.mean() / downside_returns.std())
            if len(downside_returns) > 0 else 0
        )

        # Total Return
        total_return = float(
            (equity_curve['equity'].iloc[-1] / equity_curve['equity'].iloc[0] - 1) * 100
        )
        bench_total_return = float(
            (benchmark_curve.iloc[-1] / benchmark_curve.iloc[0] - 1) * 100
        )

        return {
            'strategy_cagr': strategy_cagr,
            'benchmark_cagr': benchmark_cagr,
            'strategy_total_return': total_return,
            'benchmark_total_return': bench_total_return,
            'strategy_max_dd': max_dd,
            'benchmark_max_dd': bench_max_dd,
            'mar_ratio': mar_ratio,
            'benchmark_mar': bench_mar,
            'sharpe_ratio': sharpe,
            'benchmark_sharpe': bench_sharpe,
            'sortino_ratio': sortino,
            'total_trades': len(strategy_returns),
            'volatility': float(strategy_returns.std() * np.sqrt(252) * 100)
        }

    @staticmethod
    def print_metrics(metrics: Dict):
        """Pretty print metrics"""

        print(f"\n{'='*70}")
        print(f"PERFORMANCE SUMMARY")
        print(f"{'='*70}\n")

        print(f"{'Total Return':<30} Strategy: {metrics['strategy_total_return']:>8.2f}% | Benchmark: {metrics['benchmark_total_return']:>8.2f}%")
        print(f"{'CAGR':<30} Strategy: {metrics['strategy_cagr']:>8.2f}% | Benchmark: {metrics['benchmark_cagr']:>8.2f}%")
        print(f"{'Maximum Drawdown':<30} Strategy: {metrics['strategy_max_dd']:>8.2f}% | Benchmark: {metrics['benchmark_max_dd']:>8.2f}%")
        print(f"{'MAR Ratio (CAGR/MaxDD)':<30} Strategy: {metrics['mar_ratio']:>8.2f} | Benchmark: {metrics['benchmark_mar']:>8.2f}")
        print(f"{'Sharpe Ratio':<30} Strategy: {metrics['sharpe_ratio']:>8.2f} | Benchmark: {metrics['benchmark_sharpe']:>8.2f}")
        print(f"{'Sortino Ratio':<30} Strategy: {metrics['sortino_ratio']:>8.2f}")
        print(f"{'Volatility (Annualized)':<30} Strategy: {metrics['volatility']:>8.2f}%")

        print(f"\n{'='*70}")
        print(f"KEY INSIGHTS:")
        print(f"{'='*70}")

        outperformance = metrics['strategy_cagr'] - metrics['benchmark_cagr']
        dd_improvement = abs(metrics['strategy_max_dd']) - abs(metrics['benchmark_max_dd'])

        print(f"✓ Outperformance: {outperformance:+.2f}% CAGR vs benchmark")
        print(f"✓ Drawdown difference: {dd_improvement:+.2f}% vs benchmark")
        print(f"✓ Risk-adjusted (MAR): {(metrics['mar_ratio']/metrics['benchmark_mar']-1)*100:+.1f}% vs benchmark")
        print(f"✓ Risk-adjusted (Sharpe): {(metrics['sharpe_ratio']/metrics['benchmark_sharpe']-1)*100:+.1f}% vs benchmark")
        print(f"{'='*70}\n")


# ============================================================================
# VISUALIZATION
# ============================================================================

class StrategyVisualizer:
    """Professional visualizations"""

    @staticmethod
    def plot_results(equity_curve: pd.DataFrame,
                     benchmark_curve: pd.DataFrame,
                     trades: List[Dict]):
        """Create comprehensive charts"""

        fig, axes = plt.subplots(3, 1, figsize=(14, 10))
        fig.suptitle('Bensdorp Weekly Rotation Strategy - CORRECTED Backtest',
                     fontsize=16, fontweight='bold')

        # Equity curves
        ax1 = axes[0]
        ax1.plot(equity_curve.index, equity_curve['equity'],
                 label='Strategy (CORRECTED)', linewidth=2, color='#2E86AB')

        benchmark_normalized = (
            benchmark_curve / benchmark_curve.iloc[0] * equity_curve['equity'].iloc[0]
        )
        ax1.plot(benchmark_curve.index, benchmark_normalized,
                 label='S&P 500 (Buy & Hold)', linewidth=2,
                 color='#A23B72', alpha=0.7)

        ax1.set_ylabel('Portfolio Value ($)', fontsize=11, fontweight='bold')
        ax1.set_title('Equity Curve Comparison', fontsize=12, fontweight='bold')
        ax1.legend(loc='upper left', fontsize=10)
        ax1.grid(True, alpha=0.3)
        ax1.yaxis.set_major_formatter(plt.FuncFormatter(lambda x, p: f'${x/1000:.0f}K'))

        # Drawdown
        ax2 = axes[1]
        cummax = equity_curve['equity'].cummax()
        drawdown = (equity_curve['equity'] - cummax) / cummax * 100

        ax2.fill_between(drawdown.index, drawdown, 0,
                         color='#F18F01', alpha=0.5, label='Drawdown')
        ax2.set_ylabel('Drawdown (%)', fontsize=11, fontweight='bold')
        ax2.set_title('Strategy Drawdown', fontsize=12, fontweight='bold')
        ax2.legend(loc='lower left', fontsize=10)
        ax2.grid(True, alpha=0.3)

        # Positions
        ax3 = axes[2]
        ax3.plot(equity_curve.index, equity_curve['num_positions'],
                 linewidth=2, color='#6A994E')
        ax3.set_ylabel('# Positions', fontsize=11, fontweight='bold')
        ax3.set_xlabel('Date', fontsize=11, fontweight='bold')
        ax3.set_title('Portfolio Exposure', fontsize=12, fontweight='bold')
        ax3.set_ylim(0, 11)
        ax3.grid(True, alpha=0.3)

        plt.tight_layout()
        plt.savefig('backtest_CORRECTED.png', dpi=150, bbox_inches='tight')
        print("✓ Chart saved as 'backtest_CORRECTED.png'")
        plt.show()


# ============================================================================
# MAIN EXECUTION
# ============================================================================

def main():
    """Run corrected backtest"""

    # Test both the book period AND recent period
    START_DATE = '2020-01-01'  # Book's period
    # START_DATE = '2020-01-01'  # Recent period for comparison
    END_DATE = datetime.now().strftime('%Y-%m-%d')
    INITIAL_CAPITAL = 10000

    # S&P 500 sample
    SP500_SAMPLE = [
"NVDA","AAPL","MSFT","AMZN","GOOGL","GOOG","AVGO","META","TSLA","BRK.B","LLY","WMT","JPM","V","ORCL","JNJ","XOM","MA","NFLX","COST","PLTR","ABBV","BAC","AMD","HD","PG","KO","GE","CVX","CSCO","UNH","IBM","MU","MS","WFC","CAT","MRK","AXP","GS","PM","TMUS","RTX","CRM","ABT","TMO","MCD","APP","PEP","AMAT","ISRG","LRCX","INTC","DIS","LIN","C","T","AMGN","QCOM","UBER","NEE","INTU","APH","NOW","VZ","TJX","SCHW","BLK","ANET","ACN","DHR","BKNG","GEV","GILD","TXN","KLAC","SPGI","BSX","PFE","SYK","BA","COF","WELL","LOW","UNP","ADBE","PGR","MDT","ETN","PANW","ADI","CRWD","DE","HON","PLD","CB","HCA","BX","CEG","COP","HOOD","KKR","PH","VRTX","MCK","ADP","LMT","CME","CVS","BMY","MO","NEM","SO","CMCSA","NKE","SBUX","DUK","TT","MMM","MMC","GD","DELL","ICE","DASH","MCO","WM","ORLY","SHW","CDNS","SNPS","AMT","MAR","UPS","HWM","REGN","NOC","BK","ECL","USB","APO","TDG","AON","PNC","WMB","CTAS","EMR","MNST","ELV","CI","RCL","MDLZ","EQIX","ITW","ABNB","GLW","COIN","JCI","COR","CMI","GM","PWR","TEL","RSG","HLT","AZO","NSC","CSX","ADSK","TRV","FDX","CL","AEP","AJG","MSI","FCX","FTNT","KMI","SPG","WBD","EOG","SRE","TFC","STX","VST","MPC","PYPL","IDXX","APD","ROST","AFL","DDOG","PSX","WDC","WDAY","ZTS","ALL","VLO","SLB","PCAR","BDX","DLR","O","F","D","URI","NDAQ","LHX","EA","MET","NXPI","BKR","EW","CAH","CBRE","PSA","ROP","XEL","LVS","OKE","DHI","FAST","EXC","TTWO","CARR","CMG","CTVA","AME","FANG","GWW","KR","MPWR","ROK","A","AMP","ETR","AXON","MSCI","DAL","FICO","OXY","TGT","YUM","AIG","PEG","PAYX","SQ","IQV","CCI","VMC","HIG","KDP","CPRT","EQT","TRGP","PRU","VTR","GRMN","HSY","EBAY","CTSH","MLM","NUE","SYY","GEHC","KMB","ON","EFX","GIS","STZ","AVB","DD","IRM","DTE","KEYS","BR","AWK","FITB","VICI","ACGL","NDSN","ODFL","WAB","PCG","DOW","FTV","TROW","SYF","TER","AEE","ZBH","HUBB","BIIB","TDY","ZBRA","CHTR","PPG","OTIS","DXCM","WTW","CTLT","ARES","WEC","LYB","MCHP","CSGP","WY","TSCO","HST","AZN","RMD","FSLR","DOV","ANSS","NTNX","EA","CTRA","KHC","PSTG","LH","INVH","KVUE","CNC","SMCI","RJF","LYV","GOOG","ILMN","DVA","ESS","WAT","TRMB","SWK","LUV","WST","AES","LDOS","FE","DRI","GPC","AVY","HOLX","TTWO","EXPD","CMS","BLDR","ALGN","STLD","ARE","EG","BRO","ES","MKC","JBHT","CNP","IT","WDC","NVR","NTRS","EPAM","POOL","BALL","HBAN","BF.B","EXPE","VTRS","PKG","J","RF","PODD","CAG","GL","STE","CFG","AKAM","BBWI","EQR","SBAC","TPR","K","DAY","FDS","NTAP","IP","ENPH","MGM","SWKS","MAS","COO","DFS","AIZ","TECH","TYL","PAYC","CHRW","MRNA","KEY","TXT","MAA","JKHY","HRL","ULTA","LNT","UDR","NI","HII","KIM","ALLE","KMX","RVTY","CE","DGX","REG","WBA","AMCR","CPT","JNPR","MTCH","APA","BXP","EVRG","RL","PFG","HSIC","BWA","ALB","SOLV","PARA","CRL","CPB","IVZ","NWS","NWSA","MOH","WYNN","HAS","PNW","BG","FRT","FOXA","FOX","VFC","EXE","HOOD","DASH","GEV","APP"
    ]

    # Initialize system
    data_db = MarketDataDB()
    strategy = BensdorpWeeklyRotation(initial_capital=INITIAL_CAPITAL)
    backtester = Backtester(strategy, data_db)

    # Run backtest
    equity_curve = backtester.run(
        universe=SP500_SAMPLE,
        start_date=START_DATE,
        end_date=END_DATE,
        benchmark='SPY'
    )

    # Load benchmark
    benchmark = data_db.download_ticker('SPY', START_DATE, END_DATE)

    # Calculate metrics
    analytics = PerformanceAnalytics()
    metrics = analytics.calculate_metrics(equity_curve, benchmark['Close'])

    # Print results
    analytics.print_metrics(metrics)

    # Visualize
    visualizer = StrategyVisualizer()
    visualizer.plot_results(equity_curve, benchmark['Close'], strategy.trades)

    # Save trade log
    trades_df = pd.DataFrame(strategy.trades)
    trades_df.to_csv('trade_log_CORRECTED.csv', index=False)
    print("✓ Trade log saved as 'trade_log_CORRECTED.csv'\n")

    return strategy, equity_curve, metrics


if __name__ == "__main__":
    strategy, results, metrics = main()

    print("\n" + "="*70)
    print("CORRECTED BACKTEST COMPLETE")
    print("="*70)
    print("\nCRITICAL FIXES APPLIED:")
    print(" ✓ Wilder's RSI (exponential smoothing)")
    print(" ✓ Monday open execution (not Friday close)")
    print(" ✓ Correct rotation logic (top 10 first, then RSI filter)")
    print(" ✓ Proper position sizing and timing")
    print("\nFiles generated:")
    print(" • backtest_CORRECTED.png")
    print(" • trade_log_CORRECTED.csv")
    print(" • ./market_data/ (cached data)")
    print("="*70 + "\n")

sentiment 1.00
1 day ago • u/kleft123 • r/stocks • which_stocks_show_potential_as_the_final_wave_of • C
MU
sentiment 0.00
1 day ago • u/ftntvg • r/investing • if_the_ai_bubble_did_pop_would_the_three_major • C
If the AI bubble pops, yes memory stocks would get obliterated. They're essentially a levered play versus owning NVDA or ORCL.
Doesn't mean they can't run up another 50% if you think the AI trade can go much higher. Just know you are essentially owning a 2-3x levered ETF on NVDA by buying MU, SNDK, STX, etc.
Cheers.
sentiment 0.60

