Engine
We support tech entrepreneurship and startups through economic research, policy analysis, and advocacy.
6,720 Tweets · 2,618 Following · 8,146 Followers
Tweets
Engine retweeted
Jess Miers Jul 12
I never thought about how difficult content moderation calls are until I did this activity at COMO D.C. The takeaway: content moderation is HARD. Without context + nuances of the platform (ex: one thing on Twitch means something else on Twitter), it's challenging to draw the line
Engine retweeted
decipherably david Jul 12
It’s (always) content moderation season in DC so the & “Nuts and Bolts” series on this issue is very much needed. Especially with Prof. , along with ’s , ’s , & ’s Erica Fox, on the panel!
Engine retweeted
Katie McAuliffe Jul 12
. on why people — not computers — make content moderation decisions: “These are all contextual, complicated things that you can’t program computers to do”
Engine Jul 12
Replying to @EngineOrg
We're now holding a content moderation demo to show some of the decisions that online platforms have to make as they police third-party content. Attendees select what should happen with specific content -- whether it should be left up, taken down, or escalated to supervisors.
Engine Jul 12
Replying to @ericgoldman
The nuclear remedy for content is just to take it down. That ends all vetting of the content and ends the conversation around it. There are lots of things that could be done, including labeling and not offering people help to find specific terms --
Engine Jul 12
Replying to @kev097
There's a distinction between content that is in itself a crime and content that is part of a larger context that makes it hateful --
Engine retweeted
Katie McAuliffe Jul 12
Happening now: and discussion on content moderation with Erica Fox of , , &
Engine Jul 12
The final panel in our three-part series on content moderation is now underway! We're pleased to have Erica Fox from & on our panel to discuss content moderation practices across the Internet.
Engine Jul 12
Join us today at noon for the final panel in our three-part series on Internet platforms & user-generated content. We’ll discuss content moderation practices across the Internet, including how considerations change as you move away from the end user:
Engine retweeted
Ron Wyden Jul 9
Thank you for bringing together the powerful voices of . This is an equity issue, an economic issue and an innovation issue. America needs to step up.
Engine Jul 11
The USMCA's digital trade chapter is poised to open the door to a new era of innovation. Congress shouldn't pick it apart but instead work to pass implementing legislation:
Engine Jul 11
This morning, will be discussing the CASE Act. The bill, which would change copyright enforcement in the US, would open startups & consumers to new risks and incentivize bad faith copyright claims. Our full statement on the bill:
Engine Jul 3
Join us next Friday, July 12 for the final panel in our three-part series on the Nuts and Bolts of Content Moderation. We’ll discuss content moderation practices across the Internet, including how considerations change as you move away from the end user:
Engine Jul 2
A common misconception about Section 230 is that it gives an unfair advantage to tech giants. Although the law provides legal protections for all Internet platforms, startups have thrived the most as a result of the law.
Engine retweeted
Beaufort Digital Jun 29
: Beaufort, S.C. - nice article on Engine - a nonprofit tech policy, research, and advocacy organization that focuses on startups and tech-related legislation - with input from
Engine Jun 28
Replying to @LizWoolery
The 1st Amendment only applies to government actors & government actions. Online platforms are private companies. The 1st Amendment has no part in the discussion about what content a platform can and might moderate. --
Engine Jun 28
Replying to @CarlSzabo
Section 230 is an electronic version of the Good Samaritan doctrine. If a platform removes content that it believes should be removed, it is given the protection to do so. --
Engine Jun 28
We're pleased to have and on our panel today to discuss some of the myths of content moderation and look at what Section 230 actually does.
Engine Jun 28
Our Nuts and Bolts panel discussion on the myths of content moderation will begin shortly! We'll be discussing how and why platforms police user content, the differences between platforms and publishers, and the laws behind content moderation.
Engine Jun 27
The Trump administration wants to use tariffs to force China to cease predatory & anti-competitive trade practices, but Engine is concerned the tariffs will dampen activity & chill growth and innovation in the tech sector: