Radar Trends to Watch: November 2022
Developments in AI, Programming, Quantum Computing, and More
Maintaining a separate category for AI is getting difficult. We’re seeing important articles about AI infiltrating security, programming, and almost everything else, even biology. That sounds like a minor point, but it’s important: AI is eating the world. What does it mean when an AI system can reconstruct what somebody wants to say from their brainwaves? What does it mean when cultured brain cells can be configured to play Pong? They don’t play well, but not long ago that was a major achievement for AI.
- Hugo Bowne-Anderson interviews Shreya Shankar about her ethnographic study of MLOps practices. This is a must-listen! Shreya talks about pain points, good practices, and how the real world differs from what you’re taught in school.
- Useful Sensors is a startup that produces sensors with AI built in. Their first product is the Person Sensor, a $10 camera that detects faces and computes their location relative to the camera.
- The Bias Buccaneers is a group of volunteers who are creating a competition for detecting bias in AI systems. Microsoft and Amazon, among others, are backing it. The practice of auditing AI systems, while it has had a slow start, is poised to grow as regulations covering AI gain traction.
- Microsoft has released an open source toolkit for AI-based precision farming.
- Facebook’s No Language Left Behind project has released an open source model (along with code and training data) that can translate between any of 200 languages.
- The creators of Stable Diffusion have announced Harmonai, a community for building AI tools that generate music. They have released an application called Dance Diffusion.
- Researchers have developed a turtle-like robot that can both swim and walk on land by changing the shape of its legs. Applications may include monitoring aquatic ecosystems and underwater farming.
- If you’re interested in writing AI software to generate code (and not just using Copilot), Evan Pu has begun a series of blog posts on program synthesis.
- An AI application called Transkribus is capable of reading old handwriting. Anyone who has done archival research will know immediately what a big problem this is.
- Transformers revolutionized natural language processing. Capital One is exploring the use of Transformers for tabular data, which could lead to a similar revolution in financial applications.
- Google’s AudioLM uses large language model techniques to produce spoken audio and music. The prompts are audio clips, rather than texts, and the output sounds more natural than other audio synthesis software.
- Can AI be used to develop new algorithms for problems humans think are well-understood, like matrix multiplication? DeepMind’s AlphaTensor says yes. This result won’t get as much attention as generative art, but it is likely to be more important.
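To see the kind of saving AlphaTensor searches for, consider Strassen’s classic, hand-discovered construction: it multiplies two 2×2 matrices with 7 scalar multiplications instead of the naive 8, a saving that compounds when applied recursively to large matrices. A minimal Python sketch:

```python
# Strassen's identity: 7 scalar multiplications (m1..m7) instead of 8.
# AlphaTensor searches automatically for identities of this kind.
def strassen_2x2(A, B):
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

# Naive multiplication for comparison: 8 multiplications.
def naive_2x2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

AlphaTensor rediscovered Strassen-style identities and reportedly found new ones, including a 47-multiplication scheme for 4×4 matrices in modular arithmetic, beating the 49 that recursive Strassen needs.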
- The White House has revealed its AI Bill of Rights. It’s an impressive document but, unlike similar efforts in Europe and elsewhere, says little about enforcement.
- Tesla has demonstrated a prototype of Optimus, its humanoid home robot. Reactions are mixed; the demonstration was unimpressive, though they appear to have done years’ worth of R&D in a very short time. It’s also not clear that their robot is solving the right problems.
- Not to be outdone by Make-A-Video and Phenaki (another text-to-video AI generator), Google has announced Imagen Video and DreamFusion. DreamFusion generates 3D models that can be viewed from any angle. (Others have done something similar based on Stable Diffusion.)
- An AI system can now reconstruct continuous language sequences from non-invasive recordings of brain activity.
- A proposed law in the EU would allow people to sue AI companies for damages after being harmed by a result from an AI system.
- A fascinating method for detecting audio deepfakes has achieved 99% accuracy. It models what a human vocal tract would have to do to produce the sounds. Most deepfakes require impossible configurations of the vocal tract.
- Clive Thompson’s thoughts on the future of programming are worth reading, particularly on the influence of tools like Copilot on “code-adjacent” programmers; that is, non-professionals who do limited programming as part of their jobs. For them, Copilot will be a superpower.
- Another kind of automatic code generation: the OpenAPI Generator is a tool that automatically generates client libraries, stubs, and other code for APIs that are documented according to the OpenAPI Specification.
- Contributions of code to open source software projects appear to be declining, possibly because of security concerns. If this hypothesis is correct, it is counterproductive. Open source is critical infrastructure, and critical infrastructure needs to be maintained.
- wasmCloud is a platform for developing components with WebAssembly that can run anywhere, including in the cloud or on small networked devices. It includes the ability to form a mesh network that’s independent of where components run.
- Matt Welsh predicts that the future of computing will not be about programming; it will be about training large models for specialized applications.
- Another tool for deploying containers: nerdctl, a Docker-compatible CLI for containerd. How many is too many? We don’t know whether nerdctl is a winner, but it’s important to watch for lightweight alternatives to Docker.
- Terraform will be offering a no-code option for cloud configuration. This will simplify cloud deployment, making it possible for developers to deploy directly without assistance from an operations group.
- The Cassandra database will support ACID transactions, taking advantage of a new consensus protocol named Accord.
- More alternatives to the Electron framework: Last month, we mentioned Rust’s Tauri. Now there’s an Electron-like framework for Go, called Wails.
- Steve Yegge has emerged from retirement to take a job as Head of Engineering at Sourcegraph, a company that’s making tools for searching, navigating, and understanding code across multiple repositories. We normally wouldn’t consider a “new hire” noteworthy, but everything Steve does is worth watching. Be on the lookout for some excellent tools.
- Constellation is the first implementation of Confidential Kubernetes: a Kubernetes distribution designed for confidential computing. Confidential computing isn’t limited to individual nodes; Constellation authenticates all of the nodes in a Kubernetes cluster, and ensures that data is always encrypted, even while in use.
- A cryptocurrency mining operation named PurpleUrchin is using free resources offered by services like GitHub and Heroku. The security community suspects that their goal isn’t mining coins but executing a 51% attack against one of the smaller currencies.
- A standard for passkeys that is supported by Google, Apple, and Microsoft, and that is easy to use, means that (at last, maybe) we can do away with passwords.
- Model spinning is a new attack against language models that causes them to take a specific viewpoint on a subject in response to trigger words in the prompt: for example, taking a positive or negative viewpoint on a political figure. It could be used to generate interactive propaganda.
- Random number generation is fundamental to many algorithms and games, particularly algorithms related to privacy and security. However, generating good random sequences is difficult. Flaws in devices like automatic card shufflers show how tricky the problem can be.
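The distinction matters in everyday code, too. A minimal Python sketch (illustrative of the general pitfall, not the shuffler flaw itself): the default `random` module is a deterministic generator, fine for simulations but predictable to anyone who learns its seed or state, while `secrets` draws from the OS entropy pool:

```python
import random
import secrets

# random's Mersenne Twister is deterministic: the same seed reproduces
# the entire sequence, so it must never be used for adversarial settings.
rng = random.Random(42)
replay = random.Random(42)
assert [rng.random() for _ in range(5)] == [replay.random() for _ in range(5)]

# For tokens, keys, or card shuffles with money on the table, draw from
# the OS entropy pool via the secrets module instead.
token = secrets.token_hex(16)         # 32 hex characters of OS randomness
deck = list(range(52))
secrets.SystemRandom().shuffle(deck)  # shuffle backed by os.urandom
```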
- The platform engineering movement is gaining steam. Platform engineering requires treating the developer’s environment as a product, and developing that environment into a platform that’s easy to work in, and that makes it simple and safe for developers to push working code into production.
- Aurora is a collaboration between the Chrome browser’s development team and developers of frameworks like React and Next.js that intends to make the browser a better target for these frameworks.
- Mitre’s D3FEND is a public knowledge graph of cybersecurity countermeasures. It is a counterpart to ATT&CK, a knowledge graph of tactics and techniques used by attackers.
- Sternum is an observability and security platform designed for IoT devices based on Linux or RTOS. It is difficult to get information from devices once they’re in the field. Sternum performs anomaly detection, in addition to providing information about user-defined traces.
- What can you trust in the software supply chain? Nothing, not even compilers; a new paper shows that compilers can be used to insert undetectable backdoors into models for machine learning. Trust nothing.
- Downcoding is a new attack against common methods for anonymizing data. It was designed specifically as a challenge to privacy regulations: organizations that collect and share data have to do better to preserve privacy. Anonymization isn’t enough.
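To make the failure mode concrete, here is a toy Python sketch (hypothetical data, and a generic k-anonymity measurement rather than the downcoding attack itself) showing how generalizing quasi-identifiers can still leave a group of one:

```python
from collections import Counter

# Hypothetical (ZIP code, age) records released after "anonymization".
records = [
    ("02139", 34), ("02139", 36), ("02134", 35),
    ("02134", 36), ("02139", 71),  # one record in an outlier age band
]

# Generalize quasi-identifiers: truncate the ZIP, bucket age by decade.
def generalize(zip_code, age):
    return (zip_code[:4] + "*", f"{(age // 10) * 10}s")

groups = Counter(generalize(z, a) for z, a in records)
k = min(groups.values())  # k-anonymity: size of the smallest group
# The 70s age band forms a group of one: anyone who knows their neighbor
# is in their seventies re-identifies that record exactly.
```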
- Apple’s App Store now allows apps that purchase NFTs, or deliver services through NFTs. However, all payments must be made through in-app purchases.
- The British artist Damien Hirst has started to burn the originals of artworks that he has sold as NFTs to “complete the transformation” of the work into the digital world. The artworks are burned in a public exhibition, making the burning itself a work of performance art.
- Metatheft: A threat actor has injected dApps into cryptocurrency scam sites. These dApps divert funds sent to the scammer to the threat actor’s accounts, thus stealing directly from the scam’s victims. The scammers presumably have no interest in reporting these thefts to authorities.
- A Korean research group has developed a platform for collaboration in the Metaverse. Its fundamental ideas include working together in a shared virtual space, location recognition, gesture recognition, and minimizing latency between users.
- It’s rumored that Apple’s VR headset will perform an iris scan when someone puts it on, to authenticate the user to apps.
- Facebook/Meta has figured out how to add legs to its Metaverse avatars. This was their “most requested feature.” Nobody seems impressed.
- Pong played by a dish of cultured neurons? Dishbrain doesn’t play well, but it’s surprising that it plays at all.
- Human brain cells transplanted into rat brains grow and integrate themselves with the rat’s brain cells, eventually becoming roughly one sixth of a functioning brain. This could become a platform for researching human brain diseases. And we have to ask: are these rats just rats?
- Researchers have used a quantum computer to find solutions to the Fermi-Hubbard model, a problem that can’t be solved efficiently on classical computers. Unlike previous demonstrations of quantum advantage, which had no practical value, the Fermi-Hubbard model has important implications for battery and solar cell research.