Commons:WMF support for Commons/WikiLegal for Commons

From Wikimedia Commons, the free media repository

The Legal department would like to encourage Commons contributors to use this page to highlight topics of general importance, where the input of the department may be needed, and that can then be researched for a WikiLegal article. Please read more about this initiative as explained by the promoter, User:SSpalding (WMF).

  • How do you know when a question is worth asking the WMF?
    Here are some rules of thumb about what might make a good question.
    1. Has a question - one that is not a routine question of copyright or licensing - been asked multiple times? Maybe it relates to new technology (e.g. "deepfakes" or the AI example below).
    2. Does the question involve emerging cultural or legal trends (e.g., the rise in authoritarianism worldwide, the rise of pervasive government surveillance, changes in attitudes towards online privacy as it relates to images of people)?
    3. Would getting a definitive answer on a topic allow the community to create a new policy? Could it decide the fate of hundreds or thousands of images?
    4. Do you believe an existing policy needs to be rethought because of a new context? For example, there are existing, well-established policies about non-sexual nudity and sexual content. But there are often questions and debates about these policies. Much of that debate is philosophical, but some of it could come from "legal" reasons (e.g. new laws related to intermediary liability, tightening restrictions in certain countries about displaying images of nudity more generally). If there seems to be a large amount of disagreement, then we could put out a memo on the topic to help move the debate along by adding current, objective legal research into the debate.
    Conversely, it may not be appropriate if something only relates to a specific image or a handful of images and whether or not they should be deleted. This shouldn't be a substitute for the Helpdesk, for example.
    It’s best to surface topics of more general importance.
  • Here is an example of a topic of general importance:
    AI-generated images: There are currently both philosophical and legal debates taking place on Commons about when AI-generated images belong on Commons and when they do not. The philosophical debates about what Commons should and shouldn't be are for the community. But whether certain images must be removed from Commons, for example because of copyright or other underlying rights, is an open legal question. Currently, these images are "generally" not protectable in the U.S. by copyright law (which underlies the Commons community's current position), but an investigation into what other jurisdictions are doing now, or may do in the future, on the topic could be informative. Moreover, other rights (e.g. moral rights) may exist outside of the U.S. and may limit certain users' interest in uploading certain images in their jurisdictions.

Please use the talk page for any questions and/or reach out directly to User:SSpalding (WMF).

Ripe questions for Legal's consideration

Joint authorship and copyleft licenses

[It's not entirely clear to me how this process is supposed to work, so apologies if I am overstepping by simply posting this (and subsequent) issues here. I believe they are "ripe" and the kinds of things that would be appropriate, but if they should have passed through a gatekeeper type thingy first I apologise.]

In 2019, a copyright discussion on English Wikisource regarding the Bethesda Statement on Open Access Publishing—a pretty core text for the free culture movement—triggered a huge and ultimately fruitless thread (procedurally closed as no consensus just to get it off the backlog). The most constructive outcome of the discussion was the draft guidance Commons:Joint authorship, written primarily by Bluerasberry.

The issue was discussed at length in the deletion discussion (so I won't repeat that here), and the draft guidance at Commons provides a reasonable overview, but a nutshell summary is this: In the case of a joint work, can one author unilaterally license it under a Creative Commons license, thereby depriving the co-authors of the possibility of commercially exploiting it, or must all co-authors agree to such a license?

A legal discussion of the issue can be found in "Avoiding Joint Pain: Treatment of Joint Works of Authorship Conditions" (May/June 2010, Baker Donelson) by E. Scott Johnson. It establishes that joint authors have an undivided interest in the whole work, and then casts the issue in two parts: (1) can one joint author issue an exclusive license and thereby prevent other joint authors from issuing further licenses, and (2) can one joint author destroy the copyright by releasing the work to the public domain or releasing it under a copyleft license. Johnson concludes on a particular side of the issue, but it is still a good discussion from a legal perspective.

The issue has mostly been discussed in a US legal context, but due to Commons' well-established policy of requiring compatible licensing or copyright status in both the US and the country of origin, the international legal context is also relevant. Due to the language commonality and the hard requirement to obey US law, the most relevant jurisdictions are the English-speaking ones (i.e. the UK and to a degree the EU), although in principle the issue may arise in any jurisdiction.

The issue potentially affects a moderate (but increasing) number of works, although there are not many currently open controversies on WMF projects that I am aware of. The core of the issue depends on legal rather than project policy considerations, and there is an already extant kernel for a Commons policy that might be developed into a clear, more or less bright-line, policy if the legal situation can be settled clearly one way or the other. Alternately, if the legal issue should turn out to be inconclusive, guidance on what the scope for project policy is would be very valuable. --Xover (talk) 09:27, 31 January 2023 (UTC)

Concerning oneself with trifles

De minimis non curat lex, in the context of copyright, is a wonderful principle for the law and the legal system, and for pedants-of-necessity like the subsets of the community on WMF projects that concern themselves with copyright issues. It admirably counteracts tendencies to become more catholic than the pope and makes efficient many otherwise near-insolvable issues.

Commons has provided guidance in the form of Commons:De minimis, which has served, and continues to serve, the community well, and being policy it can be relied upon easily and efficiently in the majority of simple cases.

However, de minimis being an assessment rather than a bright-line rule makes its application very difficult in even moderately complicated cases, and it makes it vulnerable to hijacking by even good-faith contributors who for whatever reason treat it as a magic wand. Without a clear multi-part test or factors to assess, previous deletion discussions effectively become "precedent" for what falls within de minimis, regardless of the fact that these deletion discussions (unlike actual case law) are affected by lots of irrelevant factors (who participates, community power dynamics, how widely noticed the discussion is, whether the specific case is of current news interest, etc.).

For example, in a 2019 deletion discussion about a report by the US federal government that included several photos and other material whose copyrights were owned by third parties, the discussion was closed as keep with the rationale "A few copyright images in a 450 pages document are obviously de minimis. We can't have the images extracted here, but the whole document is OK." This despite a parallel discussion on the village pump concluding fairly clearly in the opposite direction.

That deletion discussion is now being cited as precedent for other cases where third-party material is included wholesale in otherwise compatibly licensed (often {{PD-USGov}}) works. The most recent example that I personally was involved with is this currently open deletion discussion concerning the US Supreme Court's decision in Campbell v. Acuff-Rose Music, Inc., where the Court included verbatim and in full the lyrics for Roy Orbison's rock ballad "Oh, Pretty Woman" and the parody "Pretty Woman" by 2 Live Crew. Based on the precedent of the previous deletion discussion it is an entirely valid and logical argument to make, but at this point the expansion of de minimis' scope is leading to apparently absurd results, like entire separable copyrighted textual works being includable in derivative works so long as the containing work has enough filler text. (NB! this particular deletion discussion has a different issue that I think is appropriate here, and for which I intend to make a separate proposal)

A similar but distinct example is the Louvre Pyramid, designed by I. M. Pei in 1988, and generally held to be protected by Pei's copyright. Photos of the pyramid are therefore generally not permitted on Commons (but English Wikipedia hosts some under their fair use EDP). But despite being an example used on COM:De minimis itself, images such as example 1, example 2, example 3, example 4, and example 5 are routinely kept in the many many deletion discussions related to it (and others are probably deleted despite being actual de minimis cases).

Addendum: I have been told that there is now case law in France that allegedly creates a freedom of panorama out of whole cloth. I don't think that materially changes the issue, because 1) the court is essentially saying that this pseudo-freedom of panorama hinges on a de minimis assessment ("Is the depicted copyrighted object incidental and unavoidably included, or is it the main subject of the picture?") and 2) we could in any case just assume there was no such case law in order to find what the de minimis assessment would be like given the example photos listed above. Several of them very clearly include the pyramid as a main compositional element, and probably would not exist without the pyramid, so it is not obvious that we are within the bounds even of this French case law, much less of the status quo ante. --Xover (talk) 08:20, 2 February 2023 (UTC)
The French court case is quite different from a simple de minimis argument. File:Louvre Museum Wikimedia Commons.jpg is the typical image relying on this case. One can't take a picture of the whole building without including the pyramid; therefore the picture is allowed. It doesn't matter that the pyramid takes a central place in the picture, and that it could be sold as a postcard of the pyramid. However, cropping the pyramid out as a separate picture is not OK. Yann (talk) 12:49, 2 February 2023 (UTC)

These three example cases are, I believe, fairly representative of a large swathe of the de minimis-relevant cases that crop up with relatively high frequency, and are examples of where more specific tests or more concrete factors for analysis would lead to more consistent handling, less contentious community discussions, and better (clearer, easier to apply) policy on multiple (potentially all) WMF projects. Or put another way, if we had a clearer set of factors or tests that would have resolved these three specific issues, we could more easily resolve a great number of similar cases since and in the future. --Xover (talk) 10:47, 31 January 2023 (UTC)

Government use of third-party material

In Georgia v. Public.Resource.Org the US Supreme Court radically expanded the edicts of government doctrine from covering only texts with the force of law to also covering works authored by those empowered to speak with the force of law. This is a very clear and precedent-setting decision which makes a whole lot of issues for related works easier, and it resolves some previously very knotty issues with lay people attempting to make force-of-law assessments (for example, must an international treaty be self-executing to be covered by the edicts of government doctrine?). {{PD-EdictGov}} has been amended on multiple projects, and the change is slowly (conservatively) working its way into policy and practice on the projects.

However, Georgia v. PRO is now also starting to be cited as precedent in cases that bear directly on the general issue of third-party materials included in works by the US government. Most immediately relevant for this suggested issue are ASTM v. PRO and ICC v. UpCodes, where the appeals court, in a summary judgment posture, also discusses (but does not rule on) the issue of the government's use of third-party copyrighted material.

At issue in these cases, as far as it concerns us here, are model codes (building codes, fire safety codes, etc.) developed by standards developing organizations (SDOs) and directly promoted by them to state governments for verbatim or partial adoption, or for incorporation by reference, in that jurisdiction's code. The court in these cases goes far in saying that, in cases that fit this fact pattern, the mandating of these third-party standards in a law results in the otherwise copyrighted material becoming the law, which (primarily on due process grounds) must therefore be freely available to citizens.

And because the legal basis for considering a court's opinion in itself public domain is also the edicts of government doctrine, the argument in the two cases is taken to apply also to cases that do not fit the same pattern of facts. Case in point: Commons:Deletion requests/File:Campbell v. Acuff-Rose Music.pdf, which concerns the US Supreme Court's decision in Campbell v. Acuff-Rose Music, Inc., where the Court included verbatim and in full the lyrics for Roy Orbison's rock ballad "Oh, Pretty Woman" and the parody "Pretty Woman" by 2 Live Crew. If the inclusion of one third-party copyright in a work covered by edicts nullifies that copyright, so too must surely all other inclusions of third-party copyrights in works covered by edicts. In other words, we are now debating whether Roy Orbison's "Oh, Pretty Woman" is in the public domain through the whim of Justice Souter in including the works at issue in an appendix.

Because this nexus of issues is in an area of the law that is in motion, and bears directly on a large number of works on multiple WMF projects, it would be very useful to have some guidance. In particular, it would be useful to address what the projects' scope is for relying on ASTM v. PRO and ICC v. UpCodes when assessing whether a third-party copyright included in a law is valid. Do these two cases settle the issue? Can we rely on them for copyrighted standards mandated but not actually included verbatim in a law? What about things covered by edicts that are not themselves law, like opinions of the court? How about works covered by {{PD-USGov}} instead of {{PD-EdictGov}}; do they have the same effect on third-party copyrights? There are many thousands of technical standards that are in effect mandated by law, and that are otherwise eligible for hosting on WMF projects (Commons, Wikisource, and Wikipedia would all be relevant), but mandated by reference rather than direct inclusion. Can we now host these, and what are the factors to consider when making the call?

And regarding court cases, what is the basis for the court's inclusion of third-party copyrighted material as evidence? If the basis is fair use (which I believed was well settled and uncontroversial, cf. {{PD-USGov-Judiciary}}), this appears to be surprising to some; but if it is any other reason, it is likely to lead to absurd results. And the basis for the court's use of third-party copyrighted material affects how we apply things like the above referenced decisions to other fact patterns. --Xover (talk) 11:47, 31 January 2023 (UTC)

A legal opinion might be useful, but IMHO the issue is not whether inclusion of copyrighted material in a legal text voids the copyright of that material. The argument is for de minimis, in the same way that inclusion of a copyrighted item in a general picture (of a place or object) doesn't void the copyright of the item. See King of Hearts's opinion in the DR you cite above. Yann (talk) 19:44, 31 January 2023 (UTC)
I suggest we keep individual deletion discussions in the individual discussion. That deletion discussion is mentioned above because it is one example of somewhere this issue has been brought up, and to illustrate the kinds of arguments that are being brought up without having to paraphrase or quote large parts of it. I don't think WMF Legal would ever want to wade into an individual deletion discussion (barring Office actions and the like), nor am I sure we would want them to. In any case, the de minimis aspect of it is covered in #Concerning oneself with trifles above. Xover (talk) 08:08, 2 February 2023 (UTC)
Sorry for a bit of a delay in replying; I was waiting until there was a bit of discussion, as well as until we had some meaningful updates from WMF Legal to give. On this point, @Yann is right that one could argue that many instances are de minimis, and @Xover's suggestion that fair use could be used in many individual deletion discussions is also accurate. I think it's also accurate to say that we don't want to interfere with individual deletion discussions.
But one thing that we believe might be of general importance is illustrating any current U.S. law in edge cases in government public domain works when neither of those defenses are compelling. As a concrete example, we think there might be useful observations about a small subset of NASA material for example. We are currently putting something together about that internally. Getting it to a point where it can be published as a wikilegal article is a somewhat low priority though (while the TOU consultation is going on). But I did want to jump in to say that we saw this and potentially have some (limited) additional information to add sometime in the future. SSpalding (WMF) (talk) 23:38, 28 February 2023 (UTC)

Consent

Consent is an issue which is sometimes legal and sometimes social. I do not think progress could be made on this issue without community conversation, but I also think it would be helpful to have some legal guidance to help identify what options exist.

The intro here said "there are existing, well-established policies about non-sexual nudity and sexual content". I disagree. I think this is a misunderstanding: the community is identifying problems in this space, but no communication channel exists to make Legal aware of them, and consequently someone has come to the conclusion that the policies are giving useful guidance. The norm in the wiki community is that the serious community policymakers who have defined the last generation of social norms on the Wikimedia platform and beyond have typically never had a social exchange of any kind with any lawyer at the Wikimedia Foundation or elsewhere. There are plenty of sex issues to discuss, but they are taboo. I have no reason to believe this is a settled issue. I am an organizer for Wikimedia LGBT+, and I am sure that no one has ever asked for the community's organized LGBT+ perspective on sexual content. The LGBT+ community discusses sex more than any other Wikimedia community, so if the LGBT+ community has not reviewed this, then I doubt that any other community could have made any policies well-established without leaving records of however they did that.

An interesting aspect of consent to nudity and porn is that the discussion generally applies to all kinds of personal consent. Patient consent to medical photography of their personal health conditions is one major issue in Wikipedia and medical research, and the Wikimedia platform is probably the only forum in the world that could respectably bring together the porn industry and the medical industry for a serious conversation. Other major sectors in this space are movies and television, where, as a New Yorker, I can confirm that consent for big-budget productions still happens with paper, photocopied, pre-digital consent forms out of physical file cabinets; advertising, where models consent to be the face of products or just to appear as stock images for anything; and fashion, where the models get major exposure with unusual consequences. I have raised some of these issues in a proposal to track consent in Wikimedia Commons at Commons:Model_license/Case_studies. There are consent processes in porn, medicine, movies, advertising, fashion, and Wikimedia Commons. If we got everyone in the room to sync best practices, the conversation would be astounding and productive.

Revenge is its own topic, and the model for discussion is en:Revenge porn. It comes up with some regularity in Wikimedia projects. I do not think WMF Legal keeps data and statistics about this, or if there are any, WMF Legal is not transparent about what it knows or how it detects issues. I think we need to reconsider this issue for consent. Revenge porn is not the only issue, and this discussion does generally apply. One issue is nude pics in Wikimedia projects with consent; another issue is withdrawal of consent by the model years later; another issue is non-porn images or information which someone claims is used for revenge. In meta:Volunteer Response Team, there are thousands of people who claim that someone is editing Wikipedia either about them or about nothing in particular, but whatever the case, the user claims to be personally harassed by people they know. I wish I had the data to do text mining on OTRS/VRT messages, but the community has trouble evaluating claims and, as best I can tell, most of these claims are invalid in the sense that the source of the problem is Wikipedia editors routinely making articles about people, and no harassment is actually occurring despite the user's paranoia. That is still not a comfort to the person who writes in asking for help and protection. We have thousands of people who report being victimized and no good community response to these people claiming a right to consent to personal data in Wikipedia. There is no typical case here, but at d:Wikidata_talk:WikiProject_Biography#Athlete_and_model_requests_deletion_for_personal_privacy I described a case where an athlete and model with media coverage asked for privacy in Wikidata specifically, which right now is unusual but will be more common as Wikidata becomes better known.

Aside from revenge personal data and content, data science is making big changes that affect consent. One of the changes is that very soon we will have demographic data on almost everyone in the world. I expect that all university faculty will have Wikidata items, because they publish the knowledge which Wikipedia cites for fact checking. There are projects which celebrate diversity in gender, nationality, and culture which - without consent - do things like label artists and scientists as women worth profiling, or list people of African descent who have won awards, or import LGBT+ data for Pride events. There is an inherent conflict here, because celebrating diversity also greatly increases vulnerability by making members of underserved communities much more visible, and much easier to target for harassment. There are lots of requests from people who are the subjects of Wikimedia diversity campaigns to take down their Wikipedia articles and photos. I have never seen rich Western white guys do this, except after the media reports on their fraud or deviancy. As we begin the process of automating coverage of everything, and dumping non-free photos into the AI generator to make a Wikimedia Commons-compatible version of their portrait, we need to talk about how open data and open media can harm individuals, and what we should do when all our copyright is in order but a subject does not consent and writes in complaining of harm.

This is related to consent but is kind of a side issue: other user-generated content platforms have a process for identity verification. On the Wikimedia platform we still sometimes have people scanning their passports and government IDs and mailing them in to confirm identity. We have done this thousands of times. I looked at the documentation, and I do not think this has ever been discussed ethically or with WMF Legal review. I think it just started happening, and then somehow WMF Legal years ago encouraged it further, because some Internet people were doing it before the WMF had a legal department. Anyway, I have documentation of this for English Wikipedia at en:Wikipedia:Identity verification, but to obtain consent we have to have a process for determining how sure we are of the identity of the person giving consent. Also, there are certain consent-related privileges that we give to user accounts that verify their identities.

Here are some outcomes I want:

  • Development of a Wikimedia community consent license for situations where a person wants to document their consent for something
  • Development of a complaint process which captures data for when and why subjects of Wikimedia coverage talk about consent
  • Establishment of long term communication channels for consent with industries including porn, medicine, advertising, movies, and fashion
  • Development of an identity verification system
  • Research data collection and intro on consent, including multiple scholarly peer-reviewed perspectives on the issue, some fundamental datasets, and better developed wiki policy. The point of this is to establish entry points for researchers and industry professionals who want to study or adopt Wikimedia practices.

Bluerasberry (talk) 15:54, 1 February 2023 (UTC)

This opens up lots of process and policy questions beyond Wikimedia Commons and beyond the scope of Wikilegal articles, but it is all interesting and useful nonetheless. I'll probably have a few questions, but here is the first off the top of my head:
1. Image consent issues often come up within the WMF Legal department when someone "changes their mind" after uploading a photo (of themselves) to Commons. That person doesn't realize the "permanence" of open licensing and has no legitimate claim under current policy to have the image deleted. Wikimedia Commons policies seem to discourage people from deleting media that is otherwise allowed by applicable law.
Is this a problem that needs additional nuance? For example, are edge-case scenarios that you brought up, such as revenge porn or medical consent (where there are specialized laws or human rights principles that might prevent these from being hosted), not taken into consideration? If so, how prevalent is this issue?

SSpalding (WMF) (talk) 00:01, 1 March 2023 (UTC)

@SSpalding (WMF): Yes, I do think this problem is more than just responding to complaints, because outside of Wikimedia Commons the media environment is proactive in addressing consent in ways that Commons is not. I think there is some appetite for proactive options, but it is challenging to organize all the use cases and stakeholders, and, as you requested, to collect data on the prevalence of the problem.
There are two ways that we could get prevalence information. One is to organize community conversation, such as from Wikimedia Medicine and Wikimedia LGBT+, which would lead those organizations to issue community statements trying to describe prevalence. The advantage of that would be community interpretation of the issue, but the disadvantage is lack of data. There is a Wikimedia community desire for data on many issues, but the WMF has discouraged the Wikimedia community from collecting data, with the rationale that the Foundation has software in development to do so more ethically. Briefly: I remember the community wanting complaint data since 2012; there was a WMF tool pilot presented by WMF Trust & Safety at WikiConference North America 2016; there was the 2019 meta:Community health initiative/User reporting system; and currently the WMF is developing the meta:Private Incident Reporting System. If that tool existed, then we could get accurate data. Please endorse the tool's development and do not let it be vaporware or postponed. Until the tool exists, take my report that the problem exists, and know that community statements are possible if you need more clarity.
It is not easy for me to assign a dollar amount to the investment those groups would want in order to fix the problem, but as a general principle, community organizers get anxious hearing about the same problems for years with no procedure to address them. Getting that reporting system would be a boon.
You asked about nuance, and the approach of my text above is not about active complaints from people featured in photos, but about the proactive accessions process of the Wikimedia Commons community. I think we need an optional consent license, comparable to a Creative Commons copyright license, for voluntary use or negotiated use. The reason why I think we need this is that documenting consent to model is already a norm in many contexts, and for the same reasons that the practice exists elsewhere, Commons should provide options to allow it too. One nuance: nowhere but Commons would anyone think to have one consent process for so many diverse media types, including for example both medicine and erotica. Another nuance: even though Commons is supposed to check primarily copyright, sometimes we have Commons reviewers who make unsolicited requests to show proof of consent. For example, at Wikipedia:Featured_picture_candidates/July-2008#The_moment_of_birth some reviewers requested proof that the model consented to the photo, arguing that Commons cannot show a vulnerable person while only checking copyright. Often these issues have something to do with female nudity, and often these discussions spark calls to meta:Address the gender gap. We could call this "negotiated use": Commons reviewers might suggest that anyone uploading sensitive media consider documenting consent, and if the uploader declines, then even without a complaint the Commons reviewers may pass judgement that the content is out of scope due to being problematic to reuse. Bluerasberry (talk) 21:55, 18 March 2023 (UTC)

AI artwork-content

"US Copyright Office rules AI-generated artwork, content not legally protected": original Copyright Office decision (PDF); The Hill article about it. The issue has been under discussion at deletion requests, e.g. Commons:Deletion_requests/File:Alice_and_Sparkle_cover.jpg, but there may be (?) additional related issues not addressed. This apparently resolves copyrightability, presumably if all parties are in the United States (are there any rulings in other countries?). It probably does not resolve whether there are any COM:DW issues. It might or might not resolve issues around the proprietary software used; I don't know. I request we get WikiLegal involved for guidelines for such cases. -- Infrogmation of New Orleans (talk) 17:28, 1 March 2023 (UTC)

Lack of a clear public domain policy

Wikimedia Commons does not yet have a clear and universally accepted policy about its definition of the public domain. This is a threat to progress in the GLAMwiki space.

Last year my upload of images from the Fine Arts Museum of Ghent, which I had carefully checked were public domain (in Belgium, as in most of the world, the copyright term is 70 years after the death of the creator), was flagged for copyright infringement: while some of the artists had died sufficiently long ago, some of their work would technically not be public domain in the USA's legal context. For example, while the artist Frits Van Den Berghe died in 1939, some of his work was produced after 1928 and is therefore still within 95 years of publication, the other current criterion in the USA. Some of my work was flagged and I nearly got banned.

The current uncertainty puts a brake on activities in the GLAMwiki space, decreases the hard-won trust and enthusiasm of newly open GLAMs, and can lead to unnecessary conflict between editors. Therefore, I would argue that Wikimedia Commons needs a clear policy on which public domain term(s) to adhere to or, in case it is decided that different countries' legal frameworks need to be respected, which ones users need to respect. It is also important that it is explained why the policy is one way or the other. Sam.Donvil (talk) 17:00, 16 March 2023 (UTC)

We do have a clear policy: the object must be free of copyright both in the country of origin (in this case, Belgium) and in the US. It is unfortunate that US copyright law is incompatible with most European copyright laws, but we have to live with this. Ymblanter (talk) 19:29, 16 March 2023 (UTC)

Great Seal of the State of Arizona

The official state document (PDF) states in part: "The great seal of this state shall in no way be employed by anyone other than a state agency for the purpose of advertising or promoting the sale of any article of merchandise whatever within this state or for promoting any other commercial purpose. The secretary of state may promulgate rules for the use of the great seal of this state or any facsimile, copy, likeness, imitation or other resemblance of the great seal. Any person who knowingly violates this section is guilty of a class 3 misdemeanor." An image carrying threats of legal penalties (including fines and jail time) for unauthorized or commercial use does not seem to me realistically "freely licensed" for Commons. Others disagree. See e.g. Commons:Deletion requests/File:State Seal of Arizona.svg for more detailed discussion. I request guidelines from WikiLegal. -- Infrogmation of New Orleans (talk) 21:59, 29 January 2024 (UTC)

The definition of "free" is based on the copyright status alone, which is independent of the use made of the image. It is "free". But as with many other images here (such as images of people or trademarked logos), if you use them in advertising without obtaining those other rights, yes, you have a problem. Insignia laws like this are standard across states and most countries (including similar U.S. federal symbols). This is explicitly covered by policy at Commons:Non-copyright restrictions. The {{Insignia}} template is listed in the "See also" section there, as a member of Category:Restriction tags, where we have many similar tags denoting other non-copyright restrictions. An image of U.S. currency is "free" based on the copyright, but that doesn't give you the right to make counterfeit bills from it, and so on. Carl Lindberg (talk) 23:10, 29 January 2024 (UTC)
  •  Comment "Non-copyright restrictions" - fair enough. But if we just list the image as "public domain", I doubt that even fairly savvy reusers of Commons media would usually have in mind that reuse of that image in ways that would be perfectly acceptable for most PD images could get them thrown into jail for 30 days by the State of Arizona (plus fine and probation). We have templates to alert of trademark and personality rights; perhaps we need some clear warning template when such "non-copyright restrictions" are spelled out in law and have potentially dramatic consequences. -- Infrogmation of New Orleans (talk) 23:39, 29 January 2024 (UTC)
  • When Commons uses "public domain", it is always strictly in relation to the copyright. A file which is public domain by copyright (perhaps PD-ineligible) may have a very real trademark on it. That's why we have these restriction tags, to try to remind people of them -- but it's always their responsibility to know the law where they intend to use it, since there can be many local laws we don't know about (such as those on Nazi symbols in some countries). And in this case, the law is about using the seal in an advertising context specifically; here "use" is defined more like in trademark law, as implying an official connection with the state. Hosting it here is not a "use". We have the {{Insignia}} tag for precisely this case (which the SVG in question has, as do a great many files on Commons). It can be a bit shocking to see those laws for the first time, but they have pretty much always been there. 18 USC 713 is one on the U.S. Great Seal (and others). You can be fined for misusing the Red Cross symbol, and that law section has others. Until recently, "Smokey Bear" was similarly protected (repealed in 2020). Note that those laws are in Title 18 of the United States Code, not Title 17 (which is the copyright title). These laws are similar in intent to trademark, since those symbols cannot be officially trademarked per the Paris Convention for the Protection of Industrial Property (the prohibition is in Article 6ter, if you're interested). Since they cannot be trademarked, governments now routinely pass laws such as this to protect their seals from being misused. Carl Lindberg (talk) 00:00, 30 January 2024 (UTC)

See also