SF4380
Certain online platforms requirement to provide information pertaining to algorithm use
Legislative Session 94 (2025-2026)
Related bill: HF3980
AI Generated Summary
Purpose
To increase transparency and user control over how online platforms use algorithmic recommender systems, with the aim of protecting consumers in Minnesota. The bill would require platforms to disclose how their algorithms work, give users more control over the types of items shown or blocked, and establish rules, penalties, and audits to ensure careful design and ongoing evaluation of these systems.
Key terms and definitions (essential concepts)
- Covered online platform: a platform that operates in Minnesota and uses algorithmic recommender systems to rank or display items to users.
- Algorithmic recommender system: the computer processes that decide the order, rank, or prominence of items shown to users.
- Personal data / user-provided data / engagement data: data about a user or their interactions that platforms use to tailor recommendations.
- Holdout group: a group of users exempt from certain design changes, used for testing comparisons.
- Long-term user value: outcomes that reflect a user’s forward-looking preferences, measured over time.
- High-value data: user-provided data or predictions from user survey data used to tailor experiences.
- Long-term holdout assessment: an ongoing process to evaluate how changes to algorithms affect long-term user value.
- Default: the preselected setting used by the platform for a service or feature.
Main provisions
Design transparency requirements
- Platforms must prominently display on their site or app:
- A list of each algorithmic recommender system used.
- A description of each input to every algorithm and the data source for each input.
- The weights used in each recommender, categorized into four quartile groups by importance (see the illustrative sketch after this list).
- The commissioner of commerce will adopt rules that further clarify these disclosure requirements.
- Platforms must annually disclose high-level objectives, key results, and performance metrics used to evaluate product teams responsible for algorithmic design.
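The bill does not say how the quartile disclosure must be presented. Purely as an illustrative sketch, assuming hypothetical input names and weights, the snippet below shows one way a platform might bucket recommender inputs into four importance groups:

```python
# Hypothetical sketch only: one way to bucket recommender input weights into
# four quartile groups by importance. Input names and weights are invented.

def quartile_groups(weights):
    """Assign each input to a quartile group (Q1 = most important)."""
    ranked = sorted(weights, key=lambda name: abs(weights[name]), reverse=True)
    n = len(ranked)
    groups = {"Q1": [], "Q2": [], "Q3": [], "Q4": []}
    for i, name in enumerate(ranked):
        bucket = min(3, (4 * i) // n)  # rank position -> quartile index
        groups[f"Q{bucket + 1}"].append(name)
    return groups

example_weights = {           # hypothetical inputs and weights
    "watch_time": 0.42,       # engagement data
    "explicit_follow": 0.31,  # user-provided data
    "survey_satisfaction": 0.15,
    "recency": 0.08,
    "click_rate": 0.04,
}
print(quartile_groups(example_weights))
# {'Q1': ['watch_time', 'explicit_follow'], 'Q2': ['survey_satisfaction'],
#  'Q3': ['recency'], 'Q4': ['click_rate']}
```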
User choice and defaults
- For services that use personal data in recommendations, the algorithm must default to maximizing long-term user value metrics.
- Platforms must provide an accessible user interface that lets users clearly express preferences about the types of items shown or blocked.
- Platforms must take reasonable steps to ensure outputs align with user preferences.
- A user’s exercise of these rights (choosing specific recommendation types or blocking items) should not be used to withhold or degrade products or services, lower their quality, or raise their prices, except as necessary to comply with this section or its rules.
Protections for minors
- When an algorithmic recommender system uses personal data and is provided to a covered minor, it must default to maximizing long-term user value metrics applicable to minors.
Holdout assessments and accountability
- Platforms must maintain at least one holdout group and subject all design changes to a long-term holdout assessment.
- Annually, platforms must publish a disclosure showing:
- The long-term user value metrics used.
- Aggregate, anonymized results for each metric, both for the holdout group(s) and for the rest of the user base (see the illustrative sketch after this list).
- The commissioner will establish rules for conducting these assessments, including how to form holdout groups, disclosure requirements, and possible exemptions.
- If a platform changes a design in a way that reduces direct and immediate harm to users without increasing engagement or revenue, an exemption from some holdout requirements may apply.
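The mechanics of the long-term holdout assessment are left to rulemaking. As a rough, non-authoritative sketch of the kind of aggregate, anonymized comparison the annual disclosure calls for, the snippet below uses invented user records and a single hypothetical long-term value metric:

```python
# Hypothetical sketch only: aggregating an anonymized long-term user value
# metric for the holdout group versus the rest of the user base. Records and
# the metric are invented for illustration.
from statistics import mean

# Each record: (is_in_holdout_group, long_term_value_score for one user)
records = [
    (True, 0.72), (True, 0.65), (True, 0.70),
    (False, 0.61), (False, 0.68), (False, 0.59), (False, 0.64),
]

def aggregate_by_group(rows):
    """Return per-group means only, so no individual user data is exposed."""
    holdout = [score for in_holdout, score in rows if in_holdout]
    rest = [score for in_holdout, score in rows if not in_holdout]
    return {
        "holdout_group_mean": round(mean(holdout), 3),
        "rest_of_users_mean": round(mean(rest), 3),
        "holdout_group_size": len(holdout),
        "rest_of_users_size": len(rest),
    }

print(aggregate_by_group(records))
# {'holdout_group_mean': 0.69, 'rest_of_users_mean': 0.63,
#  'holdout_group_size': 3, 'rest_of_users_size': 4}
```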
Independent auditing
- Platforms must, at least annually, obtain an independent audit of their long-term holdout assessments.
- Auditors must have full access to information and operations; the platform must cooperate fully to ensure accurate, comprehensive reporting.
Enforcement and remedies
- Violations are treated as unfair and deceptive acts under Minnesota consumer protection laws.
- The attorney general may enforce these provisions.
- A user injured by a violation can seek remedies, including:
- Monetary damages of $5,000 per user per violation (adjusted for inflation).
- Punitive damages of $7,000 per violation (also inflation-adjusted) for reckless or knowing violations.
- Attorney fees, litigation costs, and other appropriate relief (like injunctive or declaratory relief).
Relationship to other law and constitutional considerations
- The new requirements are in addition to, and do not limit, other Minnesota laws, regulations, or common law where applicable.
- If there is a conflict with other laws, the provision that affords greater consumer protection applies.
- The section should not be construed to override the First Amendment or federal Section 230 protections.
Significant changes to existing law
- Introduces a comprehensive framework for transparency in how algorithmic recommender systems operate on platforms serving Minnesota users, including disclosure of inputs, data sources, and the relative importance of factors (weights).
- Establishes a rights-based user control approach where defaults push toward long-term value, but users can explicitly set preferences regarding recommendations and item blocking.
- Creates a formal holdout and long-term assessment regime to systematically test and publicly report on the impact of algorithmic changes on long-term user value, with mandatory annual independent audits.
- Adds robust enforcement and financial remedies, including per-user damages, punitive damages, and attorney fees, for violations.
- Requires ongoing rulemaking by the state commissioner to clarify disclosures and assessment processes, aligning with consumer protection goals.
Practical impact (in plain terms)
- Minnesota residents would gain more visibility into how online platforms decide what to show them and why.
- Platforms would need to let users tune preferences and would have to default toward choices that benefit users over the long term.
- Platforms would have to regularly test how their changes affect users and publicly report the results, with independent audits to back up the findings.
- Violations could lead to monetary penalties for platforms and possible legal action by consumers, with meaningful damages and fees.
Relevant Terms
online platform, algorithmic recommender system, covered online platform, Minnesota, user preferences, long-term user value, holdout group, long-term holdout assessment, high-value data, user-provided data, engagement data, personal data, input, data source, weights, four quartile groups, accessible user interface, default, covered minor, independent audit, commissioner of commerce, unfair and deceptive act, remedies (damages, punitive damages, attorney fees), rulemaking, First Amendment, 47 U.S.C. § 230
Actions
| Date | Chamber | Where | Type | Name | Committee Name |
|---|---|---|---|---|---|
| March 11, 2026 | Senate | | Action | Introduction and first reading | |
| March 11, 2026 | Senate | | Action | Referred to | Commerce and Consumer Protection |
| March 17, 2026 | Senate | | Action | Author added | |