
The Dulin Report


Should we abolish Section 230?

February 1, 2021

Section 230 of the Communications Decency Act shields interactive computer services from liability for inappropriate or illegal content published by their users, as long as the service moderates that content in good faith:
(1) Treatment of publisher or speaker

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

Without Section 230 protections, companies like Twitter and Facebook would require an army of lawyers and editors constantly monitoring content. Advertising revenue alone would not be enough to cover the costs. As I proposed in the past, social media companies would have to charge users for publishing.

Since I wrote my original post on the subject in 2018, I have had some time to think, and my views have evolved.

Section 230 does not need to be abolished — it needs to be revised. We need to clarify the distinction between hosts, content-sharing services, content-discovery services, content-consumption services, discussion boards, and publishers.

Hosts

Hosts are the easiest to define. A host offers infrastructure for hosting user content. The user has a great degree of control over the content and how it is published. Hosts do not repurpose or modify user content in any way, though they may offer a discovery mechanism such as search.

At the crudest level, a service like AWS is a host. They offer the equivalent of running a server in your basement. You can host whatever you want on their hardware.

WordPress.com is also a host. They offer a highly customizable platform for publishing. I would put Tumblr in this category as well.

Hosts have limited ways to earn money from their users. They can either charge users directly or make another arrangement, such as asking the user to place an advertising banner amid their content, with the banner relevant only to the surrounding content. What hosts do not do is mine user-generated content for purposes other than search and discovery.

Hosts are not publishers. Users are.

Hosts are like landlords. They let you use their property. They do not get in the way of you decorating your apartment as you see fit. They reserve the right to evict you for illegal, abusive, or inappropriate activity. Landlords are not liable for their renters' behavior, and neither should hosts be.

Content-sharing platforms

Content-sharing platforms allow users to share and discover content, such as images. Similar to hosts, they do not repurpose user-generated content in any shape or form.

Just like hosts, they have limited ways to monetize their services. They can charge users for advanced services (like high-resolution images). They can advertise directly to users, similar to how I described advertising by hosts above — advertising must be relevant only to the surrounding content.

Flickr.com is one example of such a platform. Flickr does not repurpose content.

Vimeo is a video-sharing platform: the users who generate this content have full control over where, when, and how it appears.

YouTube repurposes user content: they modify user videos to insert advertising, and the moment they do that, they become a publisher.

Content-discovery services

Search engines are content-discovery services.

As long as search engines do not repurpose the content their users discover, they should not be held liable for it. The moment a search engine repurposes the content, it becomes a publisher.

For example, Google News is a news search engine that also repurposes content. They should be treated as a publisher.

Content-consumption platforms

Content-consumption platforms include products such as RSS feed readers and news aggregators like Apple News, which do not repurpose aggregated content for anything other than summarizing it and delivering it to the user.

Some aggregators have a curated news section. By curating content, the aggregator app repurposes it. In that case, the curator acts as a publisher and is liable for the content they curate.

Discussion boards

Discussion boards are lightly moderated forums where users discuss a related set of interests and topics. Examples include old-school BBS systems, Usenet newsgroups, email lists, hosted phpBB bulletin boards, Discord, IRC, Telegram, Signal, etc.

Additionally, I would classify the comments section of a newspaper as a discussion board.

In this case, the spirit of the original Section 230 protections should apply. As long as there is good-faith moderation, the party responsible for the board should not be held liable for users' content.

Publishers

A publisher repurposes user-generated content. The user gives up most of their rights to control where, when, and how their content shows up, and what their content is used for.

The New York Times is a publisher: their journalists are users who produce content, and their editors repurpose it. The journalists have little say over where and how their content shows up.

Medium is a publisher as well. They limit ways in which users can customize the look and placement of the content they generate. Medium editors pick and choose featured content and control discovery mechanisms.

YouTube, Facebook, and Twitter are most certainly publishers. They do not give users any mechanism to customize when, where, and how their content appears to others. They also repurpose the content for purposes other than displaying it to other users.

A publisher, or any internet service that repurposes user-generated content for motives other than display and discovery, should most certainly be held liable for the content they propagate.

Where do social networks fit?

I propose a simple rule:
Does the internet service repurpose user-generated content for motives other than display and discovery?

If the answer is yes, then the service should be considered a publisher and therefore held liable for the content they propagate.

Consider Facebook as a case study.

Facebook offers a free service to users. Users generate content, which Facebook collects. Facebook repurposes user-generated content to track users and show them personalized ads in places other than Facebook itself. Facebook’s terms of service state as much:
Specifically, when you share, post, or upload content that is covered by intellectual property rights on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings).

Based on the proposed rule above, Facebook should be held liable for the content they host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of.

To avoid liability, Facebook would have to modify how they operate:

  • Advertising should only be pertinent to the content near which ads are displayed and should not track a user from place to place,

  • User-generated content should not be repurposed (i.e. Facebook may not use, distribute, modify, run, copy, display, translate, or otherwise create any derivative works of user-generated content), and

  • Algorithms may not decide when, where, and how user-generated content is displayed.