With the passage of Senate Bill 262, Florida has become the latest state to wake up to the political capital that a state privacy law can provide. And while we see a lot of the “usual suspects” that populate other state privacy laws (e.g., notice, consumer rights, collection and use restrictions) – which we have posted on frequently – Florida didn’t just look to privacy with SB 262. It also addressed two other issues that seem to be on the mind of Governor DeSantis: government censorship of online social media platforms, and protection of a minor’s personal information.
Government Censorship – Fla. Stat. §112.23
This new section of the Florida Statutes specifically prohibits any governmental entity from asking a social media platform to remove content or accounts unless the account or content is used in the commission of a crime or violates the Florida public records law. While there have been claims that social media platforms have been used to censor conservative politicians and activists, it isn’t clear how this particular provision of the law will play out over time. One interesting side effect of this prohibition is that social media platforms could be used to disseminate content that would appear to violate several of the state’s own content restriction laws – for example, the Parental Rights in Education Law.
Social media platforms are defined very broadly. In effect, any online forum for a school would fit the definition of a social media platform, since one is defined as “a form of electronic communication through which users create online communities or groups to share information…”. While SB 262 doesn’t remove risk from the school itself under the Parental Rights in Education Act, it does seem to create conundrums. For example, a school’s online community might now be insulated with respect to content that seems to be prohibited in the classroom. This follows because the content restrictions in Florida law do not appear to be criminal in nature (i.e., you don’t go to jail for a violation) and thus wouldn’t fall under one of the exceptions in §112.23(4).
Protection of Children – Fla. Stat. §501.1753
This section of the bill is instructive in that it expands protections for “children” to anyone who is a minor. Other laws that address child protections (e.g., COPPA and the CCPA) don’t cover all minors – they stop coverage at an age under 18. COPPA only applies to children under 13, and the CCPA extends its coverage to minors under 16. Florida, however, is treating anyone under the age of 18 as in need of protection. This may have unintended negative consequences for businesses that deal with emancipated minors.
Fortunately, the law seems to apply only to “online platforms”. However, online platforms include social media platforms – which are defined broadly (note the discussion of §112.23 above). The applicable prohibitions do seem to limit the scope further by applying only to online platforms “predominately accessed by children”. Still, the scope of platforms “predominately accessed” by 16- to 18-year-olds could include educational collaboration tools for trade schools, driver’s ed programs, and similar types of businesses that have both in-person and virtual presences.
Florida Digital Bill of Rights – Fla. Stat. §501.702 et seq.
This section of the bill is where the primary privacy protections are introduced. Here we find all the usual privacy rights and controller obligations which are present in the Colorado/Virginia style of laws. What is different from all the other state privacy laws is the way Florida has limited the scope of application. This is done in the definition of “controller” – which, of course, is the primary entity regulated by the “Digital Bill of Rights”.
A “controller” in other state laws usually has either a minimum revenue amount of $25 million, a minimum percentage of revenue made from the sale of personal information, or some minimum number of affected state residents. Some states have removed the revenue floor and are triggered based on the percentage of revenue from the sale of personal information or the number of affected state residents. In either of these models, the law covers a majority of the businesses that operate in that state.
Florida has significantly narrowed this definition of “controller”, and thus the application of the law. A controller in Florida must fulfill all of the criteria in §§501.702(9)(a)(1)–(5) and satisfy at least one of the following:
- derive 50% or more of its global revenue from the sale of ads,
- operate a cloud supported smart speaker with voice command, or
- operate a platform which offers at least 250,000 different software applications.
The important item to note is that the criterion in §501.702(9)(a)(5) requires that the business generate in excess of $1 billion in global gross annual revenue.
The revenue trigger in other states’ laws is independent of the other “processing” criteria (e.g., the percentage of revenue generated from selling data, or the number of state residents affected). In Florida, both revenue and processing criteria are required.
Setting aside the processing criteria, if we just look at the number of businesses in Florida with more than $1 billion in revenue, that number is less than 6,000 (this assumes that the roughly 6,400 billion-plus companies do not all operate in Florida). By contrast, the number of companies operating in Florida is over 588,000. This means that even without the processing criteria, the law would apply to only approximately 1% of the businesses operating in Florida.
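That 1% figure can be sanity-checked with a quick back-of-the-envelope calculation (a sketch only; the company counts are the approximate figures cited in this post, not precise data):

```python
# Approximate figures cited above.
billion_plus_companies = 6_400   # companies nationwide with $1B+ revenue
florida_companies = 588_000      # companies operating in Florida

# Upper bound: even if every billion-plus company operated in Florida,
# the share of Florida businesses meeting the revenue criterion is:
share = billion_plus_companies / florida_companies
print(f"{share:.1%}")  # prints 1.1%
```

Layering the processing criteria on top of this revenue criterion can only shrink that share further.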
If we add the other criteria to the billion-plus revenue criterion, that number gets even smaller. So, if you look at all of the Florida criteria needed to be a “controller”, it starts to look like the only rights Floridians have are against a tiny number of big tech companies.
It could be argued that, in a practical sense, Florida really doesn’t have much of a Digital Bill of Rights, because the state effectively requires privacy compliance from fewer than 1% of the companies with whom its residents might interact.