Four short links: 15 January 2019

Inside Actions, Live Coding, Science is Hard, and Censorship Factories

By Nat Torkington
January 15, 2019
  1. The Life of a GitHub Action (Jessie Frazelle) — When you go through orientation at Google, they walk you through “The Life of a Query,” and it was one of my favorite things. So, I am re-applying the same for a GitHub Action.
  2. Live Coding: OSCON Edition (Suz Hinton) — an 8-minute live “speed run” of me coding JavaScript to remotely control an Arduino. (via Twitter)
  3. The Association between Adolescent Well-being and Digital Technology Use (Nature) — The widespread use of digital technologies by young people has spurred speculation that their regular use negatively impacts psychological well-being. Current empirical evidence supporting this idea is largely based on secondary analyses of large-scale social data sets. Though these data sets provide a valuable resource for highly powered investigations, their many variables and observations are often explored with an analytical flexibility that marks small effects as statistically significant, thereby leading to potential false positives and conflicting results. Here we address these methodological challenges by applying specification curve analysis (SCA) across three large-scale social data sets (total n = 355,358) to rigorously examine correlational evidence for the effects of digital technology on adolescents. The association we find between digital technology use and adolescent well-being is negative but small, explaining at most 0.4% of the variation in well-being. Taking the broader context of the data into account suggests these effects are too small to warrant policy change. As an author said on Twitter, “The paper powerfully visualizes that without pre-registering analysis plans beforehand, analytical bias can allow researchers to tell almost any story with powerful data resources.”
  4. China’s Censorship Factories (NY Times) — someone has to learn what objectionable things are being said online so the filters can be tuned properly. Beyondsoft employs over 4,000 workers like Mr. Li at its content reviewing factories. That is up from about 200 in 2016. They review and censor content day and night. […] Many online media companies have their own internal content review teams, sometimes numbering in the thousands. They are exploring ways to get artificial intelligence to do the work. The head of the AI lab at a major online media company, who asked for anonymity because the subject is sensitive, said the company had 120 machine learning models. […] New hires start with weeklong “theory” training, during which senior employees teach them the sensitive information they didn’t know before. Honestly, I could quote the whole thing. It’s a wtf paradise—like William Gibson and George Orwell got drunk and sketched a story.