Four short links: 28 September 2017

Deep Learning, Knowledge Base, Algorithm Transparency, and Formal Methods

By Nat Torkington
September 28, 2017
  1. New Theory Cracks Open the Black Box of Deep Learning (Quanta) — The talk (on YouTube) and paper (on arXiv) are interesting, but the article itself has plenty of layman-accessible morsels: "Tishby and Shwartz-Ziv also made the intriguing discovery that deep learning proceeds in two phases: a short ‘fitting’ phase, during which the network learns to label its training data, and a much longer ‘compression’ phase, during which it becomes good at generalization, as measured by its performance at labeling new test data."
  2. YAGO — YAGO is a large semantic knowledge base, derived from Wikipedia, WordNet, WikiData, GeoNames, and other data sources. (A minimal query sketch appears after this list.)
  3. ProPublica Seeks Source Code for New York City’s Disputed DNA Software — good to see more places legally testing opaque algorithms.
  4. New Ways of Coding (The Atlantic) — from testing to formal methods, a readable and accurate survey of discontent about modern software development. "The real problem in getting people to use TLA+, he said, was convincing them it wouldn’t be a waste of their time. Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter ‘formal methods’ (so called because they involve mathematical, ‘formally’ precise descriptions of programs), their deep-seated instinct is to recoil." And yet, they’re useful. (A toy model-checking sketch follows this list.)
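Since YAGO ships as RDF, the natural way to poke at it is SPARQL. Below is a minimal sketch, assuming the SPARQLWrapper Python library and a public SPARQL endpoint — the endpoint URL is a placeholder, so check the YAGO project page for the real one. It just pulls a few English labels.

```python
# Minimal sketch of querying a YAGO SPARQL endpoint from Python.
# The endpoint URL is an assumption/placeholder, not a confirmed address.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://yago-knowledge.org/sparql/query"  # hypothetical endpoint

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?entity ?label WHERE {
        ?entity rdfs:label ?label .
        FILTER (lang(?label) = "en")
    }
    LIMIT 5
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["entity"]["value"], "->", row["label"]["value"])
```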
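And for a feel of what a model checker such as TLA+'s TLC actually does, here is a toy sketch in Python (not TLA+): it exhaustively enumerates every reachable state of a made-up two-process lock and asserts a mutual-exclusion invariant in each one. Real checkers work from a spec rather than a hand-coded transition function, but the exhaustive-search core is the same idea.

```python
# Toy explicit-state model checker: breadth-first search over all
# reachable states of a small system, checking an invariant in each.
# The two-process lock system and its state encoding are invented for
# this example.
from collections import deque

def next_states(state):
    """Yield every state reachable in one step from `state`.
    A state is (pc0, pc1, lock): each pc is 'idle' or 'crit',
    lock is the index of the holder or None."""
    pcs, lock = list(state[:2]), state[2]
    for i in range(2):
        if pcs[i] == "idle" and lock is None:
            # acquire the free lock and enter the critical section
            new = pcs.copy(); new[i] = "crit"
            yield (new[0], new[1], i)
        elif pcs[i] == "crit":
            # leave the critical section and release the lock
            new = pcs.copy(); new[i] = "idle"
            yield (new[0], new[1], None)

def invariant(state):
    """Mutual exclusion: both processes are never in 'crit' at once."""
    return not (state[0] == "crit" and state[1] == "crit")

def check(initial):
    """Breadth-first search of the entire state space."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        assert invariant(state), f"invariant violated in {state}"
        for succ in next_states(state):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    print(f"OK: {len(seen)} reachable states, invariant holds in all")

check(("idle", "idle", None))
```

Running it prints that the invariant holds in all three reachable states; delete the `lock is None` guard and the assertion trips — exactly the kind of bug this style of exhaustive checking surfaces.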