SF4674 (Legislative Session 94 (2025-2026))
Social media behavioral threat assessment reporting requirement establishment
Related bill: HF4423
AI-Generated Summary
Purpose
The bill would require social media platforms to continuously review user-generated content to identify and report articulable threats of targeted violence directed at Minnesota educational facilities or religious institutions. It creates a formal process with the Minnesota Fusion Center (MNFC) to receive and coordinate threat reports, outlines definitions used for the process, and establishes enforcement, data handling, and speech protections.
Key Provisions
- Establishes definitions to guide the reporting system, including terms like social media platform, account interactions, content, user, and time-sensitive content.
- Requires social media platforms to continuously monitor user-generated content to identify articulable threats of targeted violence and report them to the MNFC.
- Platforms may use algorithms or artificial intelligence to help identify threats, but a human reviewer must assess the content and decide whether a report is warranted.
- Mandates reporting within 24 hours of discovering an articulable threat; if the content indicates the threat could be acted on within 24 hours of posting, the report must instead be made immediately through any practicable means.
- The MNFC must designate staff who specialize in social media threat assessments to coordinate reports and facilitate reporting.
- Adds a list of threat assessment factors to help determine when to report, including violent ideation, interest in violence, aggressive behavior, direct threats, expressions of desperation, fixations or stalking, and preparation for targeted violence.
- Preserves ordinary reporting outside this section's mandate; platforms remain free to make other lawful reports to authorities.
- Sets penalties for noncompliance, including potential civil penalties up to $1,000,000 per violation and injunctions, plus possible recovery of certain litigation costs.
- Provides data handling rules: information obtained through this process is treated as criminal investigative data and subject to applicable data practices laws.
- Protects free speech: the section does not authorize government entities to infringe constitutionally protected speech outside the investigation of an articulable threat.
- Grants compliant platforms immunity from liability: a platform is not liable for the criminal actions of others solely because it complied with this law.
Definitions (Selected Highlights)
- Educational facility: includes child care centers, schools, and postsecondary institutions in Minnesota.
- Religious institution: has the meaning given by state law and includes affiliated facilities located in Minnesota.
- Targeted violence: a deliberate act of violence, in violation of criminal law, directed at a specific person, group, or location.
- MNFC: Minnesota Fusion Center, the state agency or unit designated to receive and coordinate threat information.
- Articulable threat: a threat that can be described specifically enough to be evaluated as indicating a real risk of violence.
How Reporting Works
- Platforms must review content and report articulable threats to the MNFC within the specified timeframes.
- Review must use human judgment to determine if a report is warranted, even when automated tools are involved.
- The MNFC may request data and personal information to support threat assessments and must accommodate reasonable resource-saving requests.
- The platform’s reporting should occur through practicable means of communication and follow MNFC procedures.
Threat Assessment Factors
When deciding whether to report, platforms should consider multiple indicators, including:
- Violent ideation or fantasies about murder or violence.
- Evidence of interest in violent topics or groups that support violence, or prior violence incidents.
- Aggressive or violent behavior in user-generated content (e.g., domestic violence, harassment, stalking).
- Direct threats made using platform features.
- Expressions of desperation or despair signaling potential harm to self or others.
- Fixations or stalking related to a person, place, belief, or cause.
- Communications that indicate training or preparation for targeted violence, including weapons or related materials.
Enforcement and Penalties
- The Attorney General may bring enforcement actions for violations, including civil penalties up to $1,000,000 per violation and injunctions.
- The state may recover reasonable litigation costs if it prevails.
- Violating platforms may face additional equitable remedies as available under law.
Data Practices and Privacy
- Information obtained through this process is treated as criminal investigative data.
- Data practices rules apply to how information is stored, shared, and used.
Free Speech and Immunity
- The bill affirms free speech rights outside the scope of investigating articulable threats.
- Platforms that comply with the requirements are not liable for criminal actions by others that occur as a result of their users’ speech.
Relevant Terms
articulable threat, targeted violence, educational facility, religious institution, Minnesota Fusion Center (MNFC), social media platform, account interactions, content, user, user-generated content, express preferences, algorithmic ranking system, time sensitive, human review, threat assessment factors, civil penalty, injunction, data practices, criminal investigative data, immunity, free speech.
Bill text versions
- Introduction (PDF)
Actions
| Date | Chamber | Where | Type | Name | Committee Name |
|---|---|---|---|---|---|
| March 23, 2026 | Senate | | Action | Introduction and first reading | |
| March 23, 2026 | Senate | | Action | Referred to | Commerce and Consumer Protection |
Citations
[
{
"analysis": {
"added": [],
"removed": [],
"summary": "This bill amends Minnesota Statutes 2024, section 325M.31, to define terms for social media platform regulation and for purposes of sections 325M.30 to 325M.34 and 325M.36.",
"modified": []
},
"citation": "325M.31",
"subdivision": ""
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "The bill references the meaning of 'Religious institution' as defined in section 327.30 subdivision 1 paragraph f, to be used in relation to the social media threat reporting provisions.",
"modified": []
},
"citation": "327.30",
"subdivision": "subdivision 1 paragraph f"
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "Cross-reference to Minnesota Statutes for the definition of broadband service, tied to section 116J.39 subdivision 1.",
"modified": []
},
"citation": "116J.39",
"subdivision": "subdivision 1"
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "The bill cross-references enforcement procedures under Minnesota Statutes section 8.31.",
"modified": []
},
"citation": "8.31",
"subdivision": ""
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "Federal law reference defining telecommunications carrier as used in the bill's context.",
"modified": []
},
"citation": "47 U.S.C. § 153",
"subdivision": ""
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "The bill references Minnesota Statutes, section 13.82, subdivision 7, for criminal investigative data handling by the MNFC.",
"modified": []
},
"citation": "13.82",
"subdivision": "subdivision 7"
},
{
"analysis": {
"added": [],
"removed": [],
"summary": "Section 2 establishes social media behavioral threat assessment and reporting requirements under Minnesota Statutes 2024 section 325M.36.",
"modified": []
},
"citation": "325M.36",
"subdivision": ""
}
]