CRM · Salesforce, Inc. · NYSE

At Close
Dec 8, 2025 3:59:58 PM EST
259.54 USD  -0.395% (-1.03)  Volume 10,769,077
Bid 247.60 | Ask 263.75 | Spread 16.15
Pre-market
Dec 8, 2025 9:28:30 AM EST
261.69 USD  +0.430% (+1.12)  Volume 42,055
After-hours
Dec 8, 2025 4:50:30 PM EST
259.35 USD  -0.073% (-0.19)  Volume 59,197
CRM Reddit Mentions

We have sentiment values and mention counts going back to 2017. The complete data set is available via the API.
Take me to the API
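For example, a request for the mention data might look like the sketch below; the endpoint path, parameter names, and response shape are placeholders, not the documented ChartExchange routes, so consult the API reference for the real ones.

# Hypothetical sketch only: the URL, query parameters, and JSON shape are
# placeholders, not ChartExchange's documented API.
import requests

def fetch_reddit_mentions(symbol: str, api_key: str) -> list[dict]:
    resp = requests.get(
        "https://api.example.com/reddit/mentions",     # placeholder endpoint
        params={"symbol": symbol, "apikey": api_key},  # placeholder parameters
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# mentions = fetch_reddit_mentions("CRM", api_key="...")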
CRM Specific Mentions
As of Dec 8, 2025 7:04:01 PM EST (<1 min. ago)
Includes all comments and posts. Mentions are capped at one per user, per ticker, per hour.
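To make that capping rule concrete, here is a minimal pandas sketch (an illustration only, not ChartExchange's actual pipeline) that keeps at most one mention per user, per ticker, per clock hour:

# Minimal sketch: dedupe raw mention rows so each (user, ticker) pair
# counts at most once per hour. Data below is made up for illustration.
import pandas as pd

mentions = pd.DataFrame({
    "user":   ["u/alice", "u/alice", "u/bob"],
    "ticker": ["CRM", "CRM", "CRM"],
    "ts":     pd.to_datetime(["2025-12-08 18:05", "2025-12-08 18:40", "2025-12-08 18:10"]),
})

capped = (
    mentions
    .assign(hour=mentions["ts"].dt.floor("h"))            # bucket timestamps by hour
    .drop_duplicates(subset=["user", "ticker", "hour"])   # one row per user/ticker/hour
)
print(len(capped))  # 2: alice's second CRM mention in the same hour is dropped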
1 hr ago • u/Beetlejuice_hero • r/stocks • what_are_you_bagholding • C
Had been CRM but that came roaring back and had a banger Q that should keep it going up.
DECK 105 cost basis, averaged down at 80, now slightly in green. Very good fundamentals.
CPB I wanted a boring dividend play which I guess it is but it just lumbers along. 32 cost basis.
Meh, overall great year so can't complain.
sentiment 0.92
2 hr ago • u/Abject_Set8851 • r/ValueInvesting • how_many_of_you_actually_trust_your_dcf_models • C
MRK, CRM, and ALB.
sentiment 0.00
10 hr ago • u/dislikesmoonpies • r/wallstreetbets • daily_discussion_thread_for_december_08_2025 • C
Thinking of selling DDOG and moving it into CRM. I'm already well into the green as far as DDOG goes. Agreed?
sentiment 0.49
10 hr ago • u/stealthagents • r/fidelityinvestments • fidelity_ai_assistant_sucks_and_is_zero_help • C
It sounds frustrating dealing with AI when you need direct assistance. If you're having challenges with business tasks like CRM management or client follow-ups, Stealth Agents can lend a hand. Our team has over a decade of industry-specific expertise to help keep your operations organized.
sentiment 0.67
10 hr ago • u/Realistic-Event1930 • r/wallstreetbets • daily_discussion_thread_for_december_08_2025 • C
TTD, ADBE, CRM, ZETA, SAP
sentiment 0.00
12 hr ago • u/er0ldo89 • r/smallstreetbets • spy_0dte_puts • C
Damn I’m super bullish because of earnings for oracle adobe and one more big name Thursday. Also fed rate cut same day. CRM was a good indicator last week for bullish sentiment imo.
sentiment 0.72
22 hr ago • u/Beetlejuice_hero • r/stocks • rstocks_weekend_discussion_saturday_dec_06_2025 • C
RamCockUpMyAss - my CRM is popping off. More Green this week hopefully.
I say on all panic threads (back in April, a couple weeks ago): **NEVER** turn off your S&P auto buy. Have it in a totally separate account that you forget the PW to if it helps.
I fk around with some individual holdings and take profits sometimes, and sometimes it's ill-advised, but that S&P auto buy is always on no matter what.
Thanks RamCockUpMyAss.
sentiment 0.77
23 hr ago • u/UsefulInvestigator60 • r/wallstreetbets • what_are_your_moves_tomorrow_december_08_2025 • C
What the fuck is proprietary about CRM now. It was an “ai” sales software before ai. Now anyone can automate. Puts on that CNBC Peloton style dog shit.
sentiment -0.86
1 day ago • u/Happy-Champion1661 • r/wallstreetbets • weekly_earnings_thread_128_1212 • C
I don't really like ORCL and AVGO setups here
even if you ignore Fed week, ORCL might have a run up into ER and then just die. AVGO is good, but the move might already be priced in by the small run-up the past few days. Both went up a ton last ER, so puts seem incorrect here.
ADBE had high IV for very little payoff last time, maybe this is the right time? calls at open post IV crush, similar to CRM
sentiment -0.62
1 day ago • u/Routine-District-588 • r/ValueInvesting • meta_is_the_best_mag7_value_wise • C
TTD? Really? It's pricey as fk and the moat is really unclear; I'd rather go with Zeta on this. ADBE is solid, although I think there is a real risk that amateurs will shift to cheaper products. PINS I don't like. STZ I know them but don't know a lot about them. LRN I don't know. I do hold CRM, bought before earnings (was stupid and did not check the earnings date).
sentiment -0.71
1 day ago • u/doughboy_491 • r/stocks • what_is_your_diversified_ai_portfolio • C
My AI portfolio is GOOGL NVDA META AVGO AMZN ANET MRVL BABA PLTR AMD but the AI and AI infrastructure play is awfully crowded and largely played out by investors (at least in public markets). I think the software companies are the next leg in the rally in stocks based on Agentic AI and it will be a much more difficult investor environment because there will be definite winners and losers that are hard to discern right now. Among these are CRM MSFT PANW ORCL WDAY NOW NET CYBR OKTA SNOW TEAM OS
sentiment 0.48
1 day ago • u/Valueandgrowthare • r/ValueInvesting • adobe_stock_every_youtuber_is_bullish_on_adbe • C
You can integrate BPM into CRM and ERP like Salesforce and SAP or Oracle with autonomy and cross enterprise action. You can also have Ai powered monitoring or cybersecurity like Dynatrace and Datadog.
Now, besides Firefly, what can Adobe bring to increase ARR? Other that rising prices.
sentiment 0.74
1 day ago • u/HMonster224 • r/ValueInvesting • stocks_where_customers_hate_the_product • C
Yeah, exactly. It's way too difficult to switch (and expensive to do so because you basically have to hire outside consultants to do it for you or you'll totally screw it up). It's especially awful for smaller companies because it's really not designed for that. I basically have PTSD from being the owner of the CRM contract for the Sales & Marketing functions at my past couple of jobs.
sentiment -0.65
1 day ago • u/HMonster224 • r/ValueInvesting • stocks_where_customers_hate_the_product • C
Yes, absolutely can't stand them. And it's not just the product (which is overcomplicated and hard to use). It is insanely expensive, they nickel and dime you for every little thing, and your rep calls once a year to "renegotiate" your contract, which literally means they tell you how much they are increasing your contract cost YoY and you have no choice but to pay it. They're not technically a monopoly but close enough that you are basically stuck with them. It basically feels like paying the mafia, just without the actual threat of death. Actually, no, that's really disrespectful to the mafia, they don't deserve to be compared to CRM.
sentiment -0.19
1 day ago • u/PerfectInAllThings • r/ValueInvesting • stocks_where_customers_hate_the_product • C
I hate CRM.
Lousy product, worked with it across multiple employers, so experienced various iterations and all were terrible.
That said, yes I bought some CRM before earnings and enjoyed the recent run up.
I hate this company so much though.
sentiment -0.84
1 day ago • u/stephendt • r/ValueInvesting • after_checking_out_this_wave_of_ai_chip_data_ive • C
Ironically the OP was written by AI. Also I used AI to find this list of companies below. I do find it baffling that people think this is a fad. Anyway, here's your list of companies that have "AI products" and are also profitable.
Microsoft - Copilot and Azure AI for productivity and enterprise workloads
Alphabet (Google) - Gemini and Vertex AI for search, content generation, and developer tools
Meta Platforms - Llama models powering generative assistants and content tools
Amazon - AWS AI/Bedrock models used across cloud, retail, logistics
Salesforce - Einstein and Agentforce for CRM automation and enterprise agents
Adobe - Firefly AI for creative generation in imaging, video, design
ServiceNow - Now AI platform for workflow automation and enterprise agents
Oracle - OCI AI and Fusion AI for enterprise apps and data platforms
Intuit - Generative AI assistants in QuickBooks and TurboTax
Accenture - Enterprise AI transformation services and proprietary AI platforms
IBM - Watsonx platform for enterprise AI training, governance, and deployment
Tesla - Neural-network stack powering Full Self-Driving
Autodesk - Generative design and AI automation in CAD and modelling tools
sentiment 0.67
2 days ago • u/shrindcs • r/wallstreetbets • weekend_discussion_thread_for_the_weekend_of • C
Any1 have any info on CRM buying c3.ai? Supposed rumor
sentiment 0.00
2 days ago • u/Myers112 • r/wallstreetbets • unusual_adobe_on_dec_4_anyone_else_notice_this • C
CRM earnings sympathy play
sentiment 0.60
2 days ago • u/er0ldo89 • r/smallstreetbets • 75_and_4k_in_under_a_month • C
At the very least keep 100 shares. Also check out CRM, oracle and adobe
sentiment 0.30
3 days ago • u/Leo6-2 • r/algorithmictrading • second_opinion_needed_i_recreated_weekly_rotation • Backtest • B
Weekly Rotation Strategy vs SPY buy and hold

Hey everyone, I recreated a trading strategy from a book by a trader who now teaches others, so I figure it's legit and not just hype. But now I'm stuck: it's outputting as a vector, and I'm questioning whether my backtest results are realistic or whether my code is off.
Where do I go from here? I could run walk-forward tests or Monte Carlo simulations, but realistically, since it's based on weekly candles, I can handle entries/exits manually and use it more like an indicator, so no execution issues there, right? The main doubt is whether I backtested it correctly, so I'd love a second opinion on validating it properly, like manual charting or key metrics (win rate, drawdown).
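One way to act on the Monte Carlo idea mentioned above is to bootstrap the weekly strategy returns the backtest produces and look at the spread of simulated outcomes. A minimal sketch, assuming the weekly returns are available as a pandas Series (the function name and resampling scheme are illustrative, not from the original post):

# Minimal Monte Carlo check on backtest output: resample weekly returns with
# replacement and summarize the distribution of CAGR and max drawdown.
import numpy as np
import pandas as pd

def monte_carlo_summary(weekly_returns: pd.Series, n_runs: int = 1000, seed: int = 0) -> pd.DataFrame:
    rng = np.random.default_rng(seed)
    rows = []
    for _ in range(n_runs):
        sample = rng.choice(weekly_returns.values, size=len(weekly_returns), replace=True)
        equity = np.cumprod(1 + sample)                      # simulated equity path
        running_peak = np.maximum.accumulate(equity)
        max_dd = float(((equity - running_peak) / running_peak).min())
        cagr = float(equity[-1] ** (52 / len(sample)) - 1)   # annualize weekly returns
        rows.append({"cagr": cagr, "max_drawdown": max_dd})
    return pd.DataFrame(rows).describe(percentiles=[0.05, 0.5, 0.95])

# Usage (assuming `equity_curve` is the DataFrame returned by the backtest below):
# weekly_returns = equity_curve["equity"].pct_change().dropna()
# print(monte_carlo_summary(weekly_returns))

This is the strategy: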
The Weekly Rotation strategy is a simple, long-only momentum approach for S&P 500 stocks. It requires just one weekly check (typically Friday after close) to select and rotate into the top 10 strongest performers, aiming to beat the S&P 500 with lower drawdowns by staying in cash during bear markets.​
# Key Requirements
* **Universe**: All current, delisted, and joining/leaving S&P 500 stocks for full testing.
* **Filters**: Stocks must have 20-day average volume > 1M shares and price > $1 USD.
* **Market Condition**: SPY close must be above its 200-day SMA (with 2% buffer below).​
* **Max Positions**: 10 stocks, each sized at 10% of total equity (e.g., $100K equity = $10K per position).
# Entry Rules
* On Friday close, confirm market is "up" (SPY > 200-day SMA band).
* From filtered stocks, select those with 3-day RSI < 50 (avoids overbought).
* Rank by highest 200-day Rate of Change (ROC, or % gain); pick top 10.
* Buy all positions market-on-open Monday.​
# Exit and Rotation Rules
* Every Friday, re-rank stocks by 200-day ROC.
* Hold if still in top 10; sell and replace if dropped out (market-on-open next day).
* No hard stops normally (rotation handles weakness), but optional 20% stop loss per position if desired.
"""
Bensdorp Weekly Rotation Strategy - CORRECTED Implementation
Based on "The 30-Minute Stock Trader" by Laurens Bensdorp

pip install pandas numpy yfinance matplotlib seaborn
"""

import pandas as pd
import numpy as np
from pathlib import Path
from datetime import datetime, timedelta
from typing import Dict, List, Tuple, Optional
import warnings
warnings.filterwarnings('ignore')

try:
    import yfinance as yf
except ImportError:
    import subprocess
    subprocess.check_call(['pip', 'install', 'yfinance'])
    import yfinance as yf

try:
    import matplotlib.pyplot as plt
    import seaborn as sns
except ImportError:
    import subprocess
    subprocess.check_call(['pip', 'install', 'matplotlib', 'seaborn'])
    import matplotlib.pyplot as plt
    import seaborn as sns

sns.set_style('darkgrid')


# ============================================================================
# DATA LAYER - Parquet-based local database
# ============================================================================

class MarketDataDB:
    """Local market data storage using Parquet files"""

    def __init__(self, db_path: str = "./market_data"):
        self.db_path = Path(db_path)
        self.db_path.mkdir(parents=True, exist_ok=True)
        self.price_path = self.db_path / "prices"
        self.price_path.mkdir(exist_ok=True)

    def _get_ticker_file(self, ticker: str) -> Path:
        return self.price_path / f"{ticker}.parquet"

    def download_ticker(self, ticker: str, start_date: str, end_date: str,
                        force_refresh: bool = False) -> pd.DataFrame:
        """Download and cache ticker data"""
        file_path = self._get_ticker_file(ticker)

        if file_path.exists() and not force_refresh:
            df = pd.read_parquet(file_path)
            df.index = pd.to_datetime(df.index)
            last_date = df.index[-1].date()
            today = datetime.now().date()

            if (today - last_date).days <= 1:
                return df[start_date:end_date]
            else:
                new_data = yf.download(ticker, start=last_date, end=end_date,
                                       progress=False, auto_adjust=True)
                if not new_data.empty:
                    df = pd.concat([df, new_data[new_data.index > df.index[-1]]])
                    df.to_parquet(file_path)
                return df[start_date:end_date]

        print(f"Downloading {ticker}...")
        try:
            df = yf.download(ticker, start=start_date, end=end_date,
                             progress=False, auto_adjust=True)
            if not df.empty:
                df.to_parquet(file_path)
            return df
        except Exception as e:
            print(f"Error downloading {ticker}: {e}")
            return pd.DataFrame()

    def download_universe(self, tickers: List[str], start_date: str,
                          end_date: str, force_refresh: bool = False) -> Dict[str, pd.DataFrame]:
        """Download multiple tickers"""
        data = {}
        failed = []
        for ticker in tickers:
            try:
                df = self.download_ticker(ticker, start_date, end_date, force_refresh)
                if not df.empty and len(df) > 220:  # Need 200+ for indicators + buffer
                    data[ticker] = df
                else:
                    failed.append(ticker)
            except Exception:
                failed.append(ticker)

        if failed:
            print(f"Skipped {len(failed)} tickers with insufficient data")

        return data


# ============================================================================
# INDICATOR CALCULATIONS - CORRECTED
# ============================================================================

class TechnicalIndicators:
    """Technical indicators - EXACT book methodology"""

    @staticmethod
    def sma(series: pd.Series, period: int) -> pd.Series:
        """Simple Moving Average"""
        return series.rolling(window=period, min_periods=period).mean()

    @staticmethod
    def rsi_wilder(series: pd.Series, period: int = 3) -> pd.Series:
        """
        CORRECTED: Wilder's RSI using exponential smoothing
        Book uses 3-day RSI < 50 to avoid overbought stocks

        This is THE critical fix - original used simple moving average
        """
        delta = series.diff()

        # Separate gains and losses
        gain = delta.where(delta > 0, 0)
        loss = -delta.where(delta < 0, 0)

        # Wilder's smoothing: use exponential weighted moving average
        # alpha = 1/period gives the Wilder smoothing
        avg_gain = gain.ewm(alpha=1/period, min_periods=period, adjust=False).mean()
        avg_loss = loss.ewm(alpha=1/period, min_periods=period, adjust=False).mean()

        rs = avg_gain / avg_loss
        rsi = 100 - (100 / (1 + rs))

        return rsi

    @staticmethod
    def roc(series: pd.Series, period: int = 200) -> pd.Series:
        """
        Rate of Change (Momentum)
        Book: "highest rate of change over last 200 trading days"
        """
        return ((series - series.shift(period)) / series.shift(period)) * 100


# ============================================================================
# STRATEGY IMPLEMENTATION - CORRECTED BOOK RULES
# ============================================================================

class BensdorpWeeklyRotation:
    """
    Weekly Rotation Strategy - CORRECTED implementation

    CRITICAL DIFFERENCES FROM BROKEN VERSION:
    1. Uses Wilder's RSI (exponential), not SMA-based RSI
    2. Executes on MONDAY OPEN, not Friday close
    3. Top 10 selection FIRST, then RSI filter for NEW entries only
    4. Proper rotation: keep anything in top 10, exit anything that drops out

    Entry Rules (Friday evening analysis, Monday morning execution):
    1. Friday close: Check SPY > 200-day SMA (with 2% buffer)
    2. Friday close: Rank all stocks by 200-day ROC
    3. Friday close: Select top 10 by momentum
    4. Friday close: For NEW entries only, filter RSI < 50
    5. Monday open: Execute trades

    Exit Rules:
    1. Hold as long as stock remains in top 10 by ROC
    2. Exit when stock drops out of top 10
    3. No stop losses (rotation serves as exit)
    """

    def __init__(self, initial_capital: float = 10000):
        self.initial_capital = initial_capital
        self.capital = initial_capital
        self.positions = {}  # {ticker: shares}
        self.trades = []
        self.equity_curve = []
        self.indicators = TechnicalIndicators()

    def calculate_indicators(self, data: Dict[str, pd.DataFrame],
                             spy_data: pd.DataFrame) -> pd.DataFrame:
        """Calculate indicators - Friday close data"""

        # Need at least 200 days of SPY data
        if len(spy_data) < 200:
            return pd.DataFrame()

        # Calculate SPY market regime
        spy_sma = self.indicators.sma(spy_data['Close'], 200)
        spy_sma_band = spy_sma * 0.98  # 2% buffer

        # Check if SPY SMA is valid (not NaN)
        spy_sma_value = spy_sma.iloc[-1]
        if isinstance(spy_sma_value, pd.Series):
            spy_sma_value = spy_sma_value.iloc[0]
        if pd.isna(spy_sma_value):
            return pd.DataFrame()

        spy_close_value = spy_data['Close'].iloc[-1]
        if isinstance(spy_close_value, pd.Series):
            spy_close_value = spy_close_value.iloc[0]
        spy_close = float(spy_close_value)

        spy_band_value = spy_sma_band.iloc[-1]
        if isinstance(spy_band_value, pd.Series):
            spy_band_value = spy_band_value.iloc[0]
        spy_band = float(spy_band_value)

        indicator_data = []

        for ticker, df in data.items():
            if len(df) < 203:  # Need 200 for ROC + 3 for RSI
                continue

            try:
                # Calculate indicators using CORRECTED methods
                rsi_3 = self.indicators.rsi_wilder(df['Close'], 3)  # WILDER'S RSI
                roc_200 = self.indicators.roc(df['Close'], 200)

                # Get values
                last_rsi = float(rsi_3.iloc[-1])
                last_roc = float(roc_200.iloc[-1])
                last_close = float(df['Close'].iloc[-1])
                last_volume = float(df['Volume'].iloc[-1])

                # Skip if NaN
                if pd.isna(last_rsi) or pd.isna(last_roc):
                    continue

                # Calculate 20-day average volume for liquidity filter
                avg_volume_20 = float(df['Volume'].rolling(20).mean().iloc[-1])

                indicator_data.append({
                    'ticker': ticker,
                    'date': df.index[-1],
                    'close': last_close,
                    'volume': last_volume,
                    'avg_volume_20': avg_volume_20,
                    'rsi_3': last_rsi,
                    'roc_200': last_roc,
                    'spy_close': spy_close,
                    'spy_sma_band': spy_band
                })

            except Exception:
                continue

        return pd.DataFrame(indicator_data)

    def get_weekly_signals(self, indicators: pd.DataFrame) -> Tuple[List[str], List[str]]:
        """
        CORRECTED rotation logic - matches book exactly

        Key insight: "Solution C" from C# code:
        1. Rank ALL stocks by momentum
        2. Top 10 = target portfolio
        3. KEEP: anything we hold that's still in top 10
        4. ENTER: new positions from top 10, but ONLY if RSI < 50
        5. EXIT: anything not in top 10
        """

        if indicators.empty:
            return [], []

        # Extract SPY regime
        spy_close = float(indicators['spy_close'].iloc[0])
        spy_band = float(indicators['spy_sma_band'].iloc[0])

        # Check market regime: SPY > 200 SMA band
        if spy_close <= spy_band:
            # Bear market: exit everything
            return [], list(self.positions.keys())

        # Filter valid stocks (liquidity + price)
        valid = indicators[
            (indicators['close'] > 1.0) &
            (indicators['avg_volume_20'] > 1_000_000)
        ].copy()

        if valid.empty:
            return [], list(self.positions.keys())

        # STEP 1: Rank by 200-day ROC (momentum)
        valid = valid.sort_values('roc_200', ascending=False)

        # STEP 2: Top 10 by momentum = TARGET PORTFOLIO
        top_10 = valid.head(10)
        top_10_tickers = set(top_10['ticker'].values)

        # STEP 3: KEEP - positions we already hold that are still in top 10
        keeps = [t for t in self.positions.keys() if t in top_10_tickers]

        # STEP 4: ENTER - new positions from top 10 with RSI < 50 filter
        available_slots = 10 - len(keeps)

        # Filter top 10 for new entries: must have RSI < 50 and we don't already hold it
        entry_candidates = top_10[
            (~top_10['ticker'].isin(self.positions.keys())) &
            (top_10['rsi_3'] < 50)
        ]

        enters = entry_candidates['ticker'].head(available_slots).tolist()

        # STEP 5: EXIT - anything we hold that's NOT in top 10
        exits = [t for t in self.positions.keys() if t not in top_10_tickers]

        return enters, exits

    def execute_trades(self, friday_date: datetime, enters: List[str], exits: List[str],
                       friday_data: Dict[str, pd.DataFrame],
                       monday_data: Dict[str, pd.DataFrame]):
        """
        CORRECTED: Execute trades at MONDAY OPEN, not Friday close

        friday_date: Date of signal generation
        friday_data: Data up to and including Friday (for portfolio valuation)
        monday_data: Data including Monday (for execution prices)
        """

        # Calculate portfolio value using Friday close prices
        portfolio_value = self.capital
        for ticker, shares in self.positions.items():
            if ticker in friday_data:
                try:
                    price = float(friday_data[ticker]['Close'].iloc[-1])
                    if not pd.isna(price):
                        portfolio_value += shares * price
                except (ValueError, TypeError, IndexError):
                    pass

        # Execute exits first (Monday open price)
        for ticker in exits:
            if ticker in self.positions and ticker in monday_data:
                shares = self.positions[ticker]
                try:
                    # Get Monday's open price
                    monday_open = float(monday_data[ticker]['Open'].iloc[-1])
                    if pd.isna(monday_open):
                        continue
                except (ValueError, TypeError, IndexError, KeyError):
                    # If no Open price, use Close
                    try:
                        monday_open = float(monday_data[ticker]['Close'].iloc[-1])
                    except Exception:
                        continue

                proceeds = shares * monday_open
                self.capital += proceeds

                self.trades.append({
                    'date': monday_data[ticker].index[-1],  # Actual Monday date
                    'ticker': ticker,
                    'action': 'SELL',
                    'shares': shares,
                    'price': monday_open,
                    'value': proceeds
                })

                del self.positions[ticker]

        # Execute entries (Monday open price)
        if enters:
            position_size = portfolio_value * 0.10  # 10% per position

            for ticker in enters:
                if ticker in monday_data:
                    try:
                        # Get Monday's open price
                        monday_open = float(monday_data[ticker]['Open'].iloc[-1])
                        if pd.isna(monday_open) or monday_open <= 0:
                            continue
                    except (ValueError, TypeError, IndexError, KeyError):
                        try:
                            monday_open = float(monday_data[ticker]['Close'].iloc[-1])
                        except Exception:
                            continue

                    shares = int(position_size / monday_open)
                    cost = shares * monday_open

                    if self.capital >= cost and shares > 0:
                        self.positions[ticker] = shares
                        self.capital -= cost

                        self.trades.append({
                            'date': monday_data[ticker].index[-1],  # Actual Monday date
                            'ticker': ticker,
                            'action': 'BUY',
                            'shares': shares,
                            'price': monday_open,
                            'value': cost
                        })

    def record_equity(self, date: datetime, data: Dict[str, pd.DataFrame]):
        """Record portfolio value at end of day"""
        portfolio_value = self.capital

        for ticker, shares in self.positions.items():
            if ticker in data:
                try:
                    price = float(data[ticker]['Close'].iloc[-1])
                    if not pd.isna(price):
                        portfolio_value += shares * price
                except (ValueError, TypeError, IndexError):
                    pass

        self.equity_curve.append({
            'date': date,
            'equity': float(portfolio_value),
            'cash': float(self.capital),
            'num_positions': len(self.positions)
        })


# ============================================================================
# BACKTESTING ENGINE - CORRECTED
# ============================================================================

class Backtester:
    """Backtest engine with CORRECTED execution timing"""

    def __init__(self, strategy: BensdorpWeeklyRotation, data_db: MarketDataDB):
        self.strategy = strategy
        self.data_db = data_db

    def run(self, universe: List[str], start_date: str, end_date: str,
            benchmark: str = 'SPY') -> pd.DataFrame:
        """Run backtest with MONDAY OPEN execution"""

        print(f"\n{'='*70}")
        print("BACKTEST: Bensdorp Weekly Rotation (CORRECTED)")
        print(f"Period: {start_date} to {end_date}")
        print(f"Universe: {len(universe)} stocks")
        print(f"Initial Capital: ${self.strategy.initial_capital:,.2f}")
        print(f"{'='*70}\n")

        # Download data
        print("Loading market data...")
        data = self.data_db.download_universe(universe, start_date, end_date)
        spy_data = self.data_db.download_ticker(benchmark, start_date, end_date)

        print(f"Loaded {len(data)} stocks with sufficient history\n")

        # Find all Fridays
        all_dates = spy_data.index
        fridays = []
        for i, date in enumerate(all_dates):
            if date.dayofweek == 4:  # Friday = 4
                fridays.append(date)

        print(f"Simulating {len(fridays)} weeks of trading...")
        print("Each week: Friday analysis → Monday execution\n")

        trades_count = 0
        for i, friday in enumerate(fridays):
            # Get data up to Friday close
            historical_data = {
                ticker: df.loc[:friday]
                for ticker, df in data.items()
                if friday in df.index
            }
            spy_historical = spy_data.loc[:friday]

            # Skip warmup period
            if len(spy_historical) < 200:
                continue

            # Calculate indicators (Friday close)
            indicators = self.strategy.calculate_indicators(
                historical_data, spy_historical
            )

            if indicators.empty:
                # Record equity even if no signals
                self.strategy.record_equity(friday, historical_data)
                continue

            # Get signals (Friday evening)
            enters, exits = self.strategy.get_weekly_signals(indicators)

            # Find next Monday for execution
            next_monday = None
            for future_date in all_dates[all_dates > friday]:
                if future_date.dayofweek == 0:  # Monday = 0
                    next_monday = future_date
                    break

            # If no Monday found (end of data), use next trading day
            if next_monday is None:
                next_available = all_dates[all_dates > friday]
                if len(next_available) > 0:
                    next_monday = next_available[0]
                else:
                    # End of data
                    self.strategy.record_equity(friday, historical_data)
                    continue

            # Get Monday data for execution
            monday_data = {
                ticker: df.loc[:next_monday]
                for ticker, df in data.items()
                if next_monday in df.index
            }

            # Execute trades (Monday open)
            if enters or exits:
                self.strategy.execute_trades(
                    friday, enters, exits,
                    historical_data, monday_data
                )
                trades_count += len(enters) + len(exits)

            # Record equity (use latest available data)
            latest_data = monday_data if monday_data else historical_data
            latest_date = next_monday if next_monday else friday
            self.strategy.record_equity(latest_date, latest_data)

            # Progress
            if (i + 1) % 50 == 0:
                current_equity = self.strategy.equity_curve[-1]['equity']
                print(f" Week {i+1}/{len(fridays)}: ${current_equity:,.0f}, "
                      f"{len(self.strategy.positions)} positions, {trades_count} total trades")

        print(f"\nBacktest complete! Total trades: {trades_count}\n")

        if not self.strategy.equity_curve:
            raise ValueError("No equity data recorded!")

        return pd.DataFrame(self.strategy.equity_curve).set_index('date')


# ============================================================================
# PERFORMANCE ANALYTICS
# ============================================================================

class PerformanceAnalytics:
    """Performance metrics calculation"""

    @staticmethod
    def calculate_metrics(equity_curve: pd.DataFrame,
                          benchmark_curve: pd.DataFrame,
                          risk_free_rate: float = 0.02) -> Dict:
        """Calculate all performance metrics"""

        strategy_returns = equity_curve['equity'].pct_change().dropna()
        benchmark_returns = benchmark_curve.pct_change().dropna()

        # Align dates
        common_dates = strategy_returns.index.intersection(benchmark_returns.index)
        strategy_returns = strategy_returns.loc[common_dates]
        benchmark_returns = benchmark_returns.loc[common_dates]

        # CAGR
        total_years = (equity_curve.index[-1] - equity_curve.index[0]).days / 365.25
        strategy_cagr = float(
            (equity_curve['equity'].iloc[-1] / equity_curve['equity'].iloc[0])
            ** (1 / total_years) - 1
        ) * 100

        benchmark_cagr = float(
            (benchmark_curve.iloc[-1] / benchmark_curve.iloc[0])
            ** (1 / total_years) - 1
        ) * 100

        # Maximum Drawdown
        cummax = equity_curve['equity'].cummax()
        drawdown = (equity_curve['equity'] - cummax) / cummax * 100
        max_dd = float(drawdown.min())

        bench_cummax = benchmark_curve.cummax()
        bench_drawdown = (benchmark_curve - bench_cummax) / bench_cummax * 100
        bench_max_dd = float(bench_drawdown.min())

        # MAR Ratio
        mar_ratio = abs(strategy_cagr / max_dd) if max_dd != 0 else 0
        bench_mar = abs(benchmark_cagr / bench_max_dd) if bench_max_dd != 0 else 0

        # Sharpe Ratio
        excess_returns = strategy_returns - (risk_free_rate / 252)
        sharpe = float(np.sqrt(252) * excess_returns.mean() / strategy_returns.std())

        bench_excess = benchmark_returns - (risk_free_rate / 252)
        bench_sharpe = float(np.sqrt(252) * bench_excess.mean() / benchmark_returns.std())

        # Sortino Ratio
        downside_returns = strategy_returns[strategy_returns < 0]
        sortino = (
            float(np.sqrt(252) * excess_returns.mean() / downside_returns.std())
            if len(downside_returns) > 0 else 0
        )

        # Total Return
        total_return = float(
            (equity_curve['equity'].iloc[-1] / equity_curve['equity'].iloc[0] - 1) * 100
        )
        bench_total_return = float(
            (benchmark_curve.iloc[-1] / benchmark_curve.iloc[0] - 1) * 100
        )

        return {
            'strategy_cagr': strategy_cagr,
            'benchmark_cagr': benchmark_cagr,
            'strategy_total_return': total_return,
            'benchmark_total_return': bench_total_return,
            'strategy_max_dd': max_dd,
            'benchmark_max_dd': bench_max_dd,
            'mar_ratio': mar_ratio,
            'benchmark_mar': bench_mar,
            'sharpe_ratio': sharpe,
            'benchmark_sharpe': bench_sharpe,
            'sortino_ratio': sortino,
            'total_trades': len(strategy_returns),
            'volatility': float(strategy_returns.std() * np.sqrt(252) * 100)
        }

    @staticmethod
    def print_metrics(metrics: Dict):
        """Pretty print metrics"""

        print(f"\n{'='*70}")
        print("PERFORMANCE SUMMARY")
        print(f"{'='*70}\n")

        print(f"{'Total Return':<30} Strategy: {metrics['strategy_total_return']:>8.2f}% | Benchmark: {metrics['benchmark_total_return']:>8.2f}%")
        print(f"{'CAGR':<30} Strategy: {metrics['strategy_cagr']:>8.2f}% | Benchmark: {metrics['benchmark_cagr']:>8.2f}%")
        print(f"{'Maximum Drawdown':<30} Strategy: {metrics['strategy_max_dd']:>8.2f}% | Benchmark: {metrics['benchmark_max_dd']:>8.2f}%")
        print(f"{'MAR Ratio (CAGR/MaxDD)':<30} Strategy: {metrics['mar_ratio']:>8.2f} | Benchmark: {metrics['benchmark_mar']:>8.2f}")
        print(f"{'Sharpe Ratio':<30} Strategy: {metrics['sharpe_ratio']:>8.2f} | Benchmark: {metrics['benchmark_sharpe']:>8.2f}")
        print(f"{'Sortino Ratio':<30} Strategy: {metrics['sortino_ratio']:>8.2f}")
        print(f"{'Volatility (Annualized)':<30} Strategy: {metrics['volatility']:>8.2f}%")

        print(f"\n{'='*70}")
        print("KEY INSIGHTS:")
        print(f"{'='*70}")

        outperformance = metrics['strategy_cagr'] - metrics['benchmark_cagr']
        dd_improvement = abs(metrics['strategy_max_dd']) - abs(metrics['benchmark_max_dd'])

        print(f"✓ Outperformance: {outperformance:+.2f}% CAGR vs benchmark")
        print(f"✓ Drawdown difference: {dd_improvement:+.2f}% vs benchmark")
        print(f"✓ Risk-adjusted (MAR): {(metrics['mar_ratio']/metrics['benchmark_mar']-1)*100:+.1f}% vs benchmark")
        print(f"✓ Risk-adjusted (Sharpe): {(metrics['sharpe_ratio']/metrics['benchmark_sharpe']-1)*100:+.1f}% vs benchmark")
        print(f"{'='*70}\n")


# ============================================================================
# VISUALIZATION
# ============================================================================

class StrategyVisualizer:
    """Professional visualizations"""

    @staticmethod
    def plot_results(equity_curve: pd.DataFrame,
                     benchmark_curve: pd.DataFrame,
                     trades: List[Dict]):
        """Create comprehensive charts"""

        fig, axes = plt.subplots(3, 1, figsize=(14, 10))
        fig.suptitle('Bensdorp Weekly Rotation Strategy - CORRECTED Backtest',
                     fontsize=16, fontweight='bold')

        # Equity curves
        ax1 = axes[0]
        ax1.plot(equity_curve.index, equity_curve['equity'],
                 label='Strategy (CORRECTED)', linewidth=2, color='#2E86AB')

        benchmark_normalized = (
            benchmark_curve / benchmark_curve.iloc[0] * equity_curve['equity'].iloc[0]
        )
        ax1.plot(benchmark_curve.index, benchmark_normalized,
                 label='S&P 500 (Buy & Hold)', linewidth=2,
                 color='#A23B72', alpha=0.7)

        ax1.set_ylabel('Portfolio Value ($)', fontsize=11, fontweight='bold')
        ax1.set_title('Equity Curve Comparison', fontsize=12, fontweight='bold')
        ax1.legend(loc='upper left', fontsize=10)
        ax1.grid(True, alpha=0.3)
        ax1.yaxis.set_major_formatter(plt.FuncFormatter(lambda x, p: f'${x/1000:.0f}K'))

        # Drawdown
        ax2 = axes[1]
        cummax = equity_curve['equity'].cummax()
        drawdown = (equity_curve['equity'] - cummax) / cummax * 100

        ax2.fill_between(drawdown.index, drawdown, 0,
                         color='#F18F01', alpha=0.5, label='Drawdown')
        ax2.set_ylabel('Drawdown (%)', fontsize=11, fontweight='bold')
        ax2.set_title('Strategy Drawdown', fontsize=12, fontweight='bold')
        ax2.legend(loc='lower left', fontsize=10)
        ax2.grid(True, alpha=0.3)

        # Positions
        ax3 = axes[2]
        ax3.plot(equity_curve.index, equity_curve['num_positions'],
                 linewidth=2, color='#6A994E')
        ax3.set_ylabel('# Positions', fontsize=11, fontweight='bold')
        ax3.set_xlabel('Date', fontsize=11, fontweight='bold')
        ax3.set_title('Portfolio Exposure', fontsize=12, fontweight='bold')
        ax3.set_ylim(0, 11)
        ax3.grid(True, alpha=0.3)

        plt.tight_layout()
        plt.savefig('backtest_CORRECTED.png', dpi=150, bbox_inches='tight')
        print("✓ Chart saved as 'backtest_CORRECTED.png'")
        plt.show()


# ============================================================================
# MAIN EXECUTION
# ============================================================================

def main():
    """Run corrected backtest"""

    # Test both the book period AND recent period
    START_DATE = '2020-01-01'  # Book's period
    # START_DATE = '2020-01-01'  # Recent period for comparison
    END_DATE = datetime.now().strftime('%Y-%m-%d')
    INITIAL_CAPITAL = 10000

    # S&P 500 sample
    SP500_SAMPLE = [
"NVDA","AAPL","MSFT","AMZN","GOOGL","GOOG","AVGO","META","TSLA","BRK.B","LLY","WMT","JPM","V","ORCL","JNJ","XOM","MA","NFLX","COST","PLTR","ABBV","BAC","AMD","HD","PG","KO","GE","CVX","CSCO","UNH","IBM","MU","MS","WFC","CAT","MRK","AXP","GS","PM","TMUS","RTX","CRM","ABT","TMO","MCD","APP","PEP","AMAT","ISRG","LRCX","INTC","DIS","LIN","C","T","AMGN","QCOM","UBER","NEE","INTU","APH","NOW","VZ","TJX","SCHW","BLK","ANET","ACN","DHR","BKNG","GEV","GILD","TXN","KLAC","SPGI","BSX","PFE","SYK","BA","COF","WELL","LOW","UNP","ADBE","PGR","MDT","ETN","PANW","ADI","CRWD","DE","HON","PLD","CB","HCA","BX","CEG","COP","HOOD","KKR","PH","VRTX","MCK","ADP","LMT","CME","CVS","BMY","MO","NEM","SO","CMCSA","NKE","SBUX","DUK","TT","MMM","MMC","GD","DELL","ICE","DASH","MCO","WM","ORLY","SHW","CDNS","SNPS","AMT","MAR","UPS","HWM","REGN","NOC","BK","ECL","USB","APO","TDG","AON","PNC","WMB","CTAS","EMR","MNST","ELV","CI","RCL","MDLZ","EQIX","ITW","ABNB","GLW","COIN","JCI","COR","CMI","GM","PWR","TEL","RSG","HLT","AZO","NSC","CSX","ADSK","TRV","FDX","CL","AEP","AJG","MSI","FCX","FTNT","KMI","SPG","WBD","EOG","SRE","TFC","STX","VST","MPC","PYPL","IDXX","APD","ROST","AFL","DDOG","PSX","WDC","WDAY","ZTS","ALL","VLO","SLB","PCAR","BDX","DLR","O","F","D","URI","NDAQ","LHX","EA","MET","NXPI","BKR","EW","CAH","CBRE","PSA","ROP","XEL","LVS","OKE","DHI","FAST","EXC","TTWO","CARR","CMG","CTVA","AME","FANG","GWW","KR","MPWR","ROK","A","AMP","ETR","AXON","MSCI","DAL","FICO","OXY","TGT","YUM","AIG","PEG","PAYX","SQ","IQV","CCI","VMC","HIG","KDP","CPRT","EQT","TRGP","PRU","VTR","GRMN","HSY","EBAY","CTSH","MLM","NUE","SYY","GEHC","KMB","ON","EFX","GIS","STZ","AVB","DD","IRM","DTE","KEYS","BR","AWK","FITB","VICI","ACGL","NDSN","ODFL","WAB","PCG","DOW","FTV","TROW","SYF","TER","AEE","ZBH","HUBB","BIIB","TDY","ZBRA","CHTR","PPG","OTIS","DXCM","WTW","CTLT","ARES","WEC","LYB","MCHP","CSGP","WY","TSCO","HST","AZN","RMD","FSLR","DOV","ANSS","NTNX","EA","CTRA","KHC","PSTG","LH","INVH","KVUE","CNC","SMCI","RJF","LYV","GOOG","ILMN","DVA","ESS","WAT","TRMB","SWK","LUV","WST","AES","LDOS","FE","DRI","GPC","AVY","HOLX","TTWO","EXPD","CMS","BLDR","ALGN","STLD","ARE","EG","BRO","ES","MKC","JBHT","CNP","IT","WDC","NVR","NTRS","EPAM","POOL","BALL","HBAN","BF.B","EXPE","VTRS","PKG","J","RF","PODD","CAG","GL","STE","CFG","AKAM","BBWI","EQR","SBAC","TPR","K","DAY","FDS","NTAP","IP","ENPH","MGM","SWKS","MAS","COO","DFS","AIZ","TECH","TYL","PAYC","CHRW","MRNA","KEY","TXT","MAA","JKHY","HRL","ULTA","LNT","UDR","NI","HII","KIM","ALLE","KMX","RVTY","CE","DGX","REG","WBA","AMCR","CPT","JNPR","MTCH","APA","BXP","EVRG","RL","PFG","HSIC","BWA","ALB","SOLV","PARA","CRL","CPB","IVZ","NWS","NWSA","MOH","WYNN","HAS","PNW","BG","FRT","FOXA","FOX","VFC","EXE","HOOD","DASH","GEV","APP"
]

    # Initialize system
    data_db = MarketDataDB()
    strategy = BensdorpWeeklyRotation(initial_capital=INITIAL_CAPITAL)
    backtester = Backtester(strategy, data_db)

    # Run backtest
    equity_curve = backtester.run(
        universe=SP500_SAMPLE,
        start_date=START_DATE,
        end_date=END_DATE,
        benchmark='SPY'
    )

    # Load benchmark
    benchmark = data_db.download_ticker('SPY', START_DATE, END_DATE)

    # Calculate metrics
    analytics = PerformanceAnalytics()
    metrics = analytics.calculate_metrics(equity_curve, benchmark['Close'])

    # Print results
    analytics.print_metrics(metrics)

    # Visualize
    visualizer = StrategyVisualizer()
    visualizer.plot_results(equity_curve, benchmark['Close'], strategy.trades)

    # Save trade log
    trades_df = pd.DataFrame(strategy.trades)
    trades_df.to_csv('trade_log_CORRECTED.csv', index=False)
    print("✓ Trade log saved as 'trade_log_CORRECTED.csv'\n")

    return strategy, equity_curve, metrics


if __name__ == "__main__":
    strategy, results, metrics = main()

    print("\n" + "="*70)
    print("CORRECTED BACKTEST COMPLETE")
    print("="*70)
    print("\nCRITICAL FIXES APPLIED:")
    print(" ✓ Wilder's RSI (exponential smoothing)")
    print(" ✓ Monday open execution (not Friday close)")
    print(" ✓ Correct rotation logic (top 10 first, then RSI filter)")
    print(" ✓ Proper position sizing and timing")
    print("\nFiles generated:")
    print(" • backtest_CORRECTED.png")
    print(" • trade_log_CORRECTED.csv")
    print(" • ./market_data/ (cached data)")
    print("="*70 + "\n")

sentiment 1.00

