Radar Trends to Watch: April 2023

Developments in AI, Security, Programming, and More

By Mike Loukides
April 4, 2023

In March, it felt like large language models sucked all the air out of the room. There were so many announcements and claims and new waiting lists to join that it was difficult to find news about other important technologies. Those technologies still exist, and are still developing. There’s a world beyond AI.

One important shift in the past month: The new cybersecurity strategy for the United States shifts responsibility from customers to software and service providers. If something bad happens, it’s no longer (entirely) your fault; vendors need to build more secure software and services. The use of memory-safe languages (particularly Rust, but also older languages like Java) and safety-focused new contenders like Zig will help make software more secure.


Artificial Intelligence
  • According to Simon Willison, GPT4All is the easiest way to get a (small) large language model running on a laptop. It’s the base LLaMA model, fine-tuned on 800,000 questions and answers generated by GPT-3.5.
  • Hugging Face has created a tool called Fair Diffusion for de-biasing images generated by generative graphics tools. With minimal changes to the image, Fair Diffusion changes gender and ethnic characteristics to reflect diversity in populations. It’s suggested that similar techniques will work for language models.
  • Databricks has released Dolly, a small large language model (6B parameters). Dolly is important as an exercise in democratization: it is based on an older model (EleutherAI’s GPT-J), and required only half an hour of training on a single machine.
  • ChatGPT has announced a plugin API. Plugins allow ChatGPT to call APIs defined by developers, which can be used to retrieve data and perform actions for users. Unauthorized plugins became available almost immediately, for purposes like generating hate speech and looking up crypto prices.
  • A Quick and Sobering Guide to Cloning Yourself: Yes, you can. Start with ChatGPT, add a text-to-speech service that duplicates your voice and a service that generates video from a still photo, and you’re there.
  • Prompt engineering–the technique of crafting prompts that cause a language model to produce exactly the result you want–is a new sub-discipline in computer science. Here is a good summary of prompt engineering techniques.
  • Simulating bad drivers greatly reduces the time it takes to train AI systems for autonomous vehicles. Simulations can quickly generate dangerous scenarios that rarely occur in real life.
  • Google has opened a waiting list for its Bard chat application, based on Google’s LaMDA language model. Unlike ChatGPT and GPT-4, Bard has access to information on the Web. It isn’t a substitute for search, though it will generate links to Google searches along with its response.
  • Stanford’s Alpaca 7B model, a clone of LLaMA 7B, was trained in part on output from ChatGPT, greatly reducing the training cost. The total cost of training was under $600.
  • Glaze is a free tool for “cloaking” digital artwork. It changes images in a way that isn’t detectable by humans, but that makes it difficult for a generative model to copy the work.
  • Baidu has announced Ernie Bot, a multimodal large language model and chatbot that should be comparable to GPT-4. So far, reviewers are unimpressed.
  • Microsoft has announced that it will be building ChatGPT-like capabilities into its Office 365 products (Word, PowerPoint, Excel, and Outlook).
  • Google has announced that it is building generative AI into every product. It is also making an API for its PaLM model available to the public.
  • GPT-4 was released on Pi Day (March 14) with limited public access: chat access for ChatGPT Plus subscribers, and a waitlist for API access. The most notable change is that it will be able to work with images, although that capability isn’t available initially. Errors are still an issue, although they are less common.
  • A research group at Stanford has released Alpaca, a version of Facebook/Meta’s LLaMA 7B model that has been tuned to run on smaller systems. They will release the weights when they receive permission from Meta.
  • llama.cpp is a port of Facebook’s LLaMA 7B model to C++. It runs on macOS (possibly only on Apple silicon). The author is working on support for larger models. Dalai is an NPM-based tool that automates downloading, building, and running llama.cpp. There are reports of llama.cpp running on Windows, Android phones, and even Raspberry Pi.
  • Writeout is a free audio transcription and translation service powered by Whisper, OpenAI’s speech recognition model. Whisper is built on the same Transformer architecture as the GPT-series large language models.
  • How can we design programming languages that can easily be generated by automated tools? An important question in an age of AI.
  • The Romanian government has deployed an AI “advisor” to the Cabinet that summarizes citizens’ comments. Romanians can submit remarks via a website or social media, using a special tag.
  • Andrew Ng writes that economic incentives will prevent “watermarking” (in which generative AI systems add data to their output to identify it as AI-generated) from being effective.
  • Google has published an update on its Universal Speech Model, part of its 1,000 Languages Initiative. The goal is to build a single model for the 1,000 most widely spoken languages in the world, many of which have relatively few speakers.
  • Someone has developed a Stable Diffusion plugin for Photoshop. It is open source, and available on GitHub.
  • Not to be outdone by Microsoft’s Kosmos, Google has announced PaLM-E, an “embodied” language model that incorporates visual and other sensor inputs, and has been embedded into robots.
  • Microsoft is incorporating conversational AI into its productivity tools, including its PowerPlatform and Dynamics 365, where it can perform tasks like summarizing a website and drafting responses to customer queries.
  • Microsoft has built a Multimodal Large Language Model called Kosmos-1. Kosmos-1 is a language model that has also been trained on images. It is capable of solving visual puzzles and analyzing the content of images, while using human language: you can ask it about visual objects.
  • Microsoft has built an experimental framework for controlling robots with ChatGPT. ChatGPT converts natural language commands into code, which is then reviewed by a human and uploaded to the computer. Robotics aside, this may be a preview of programming’s future.
  • A judge in Cartagena, Colombia has used ChatGPT as an aid when drafting a decision in a court case, including GPT’s full responses in the decision.
  • The US FTC says that companies selling AI products need to be careful that the claims they make about those products are accurate.
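
The prompt-engineering item above is easiest to see with an example. Here is a minimal sketch of one common technique, few-shot prompting, where the prompt includes worked examples so the model can infer the task from them. The function name, task, and examples are hypothetical, not from any particular library.

```python
# A sketch of few-shot prompting: worked examples are embedded in the
# prompt so the model can infer the task before answering a new query.
# Everything here (names, task, examples) is illustrative.

def build_few_shot_prompt(instruction, examples, query):
    """Assemble an instruction, worked examples, and a new query
    into a single prompt string for a language model."""
    parts = [instruction, ""]
    for question, answer in examples:
        parts.append(f"Q: {question}")
        parts.append(f"A: {answer}")
        parts.append("")
    parts.append(f"Q: {query}")
    parts.append("A:")  # the model is expected to complete from here
    return "\n".join(parts)

prompt = build_few_shot_prompt(
    instruction="Classify the sentiment of each review as positive or negative.",
    examples=[
        ("The battery lasts all day.", "positive"),
        ("It broke after a week.", "negative"),
    ],
    query="The screen is gorgeous.",
)
print(prompt)
```

The resulting string would be sent to a model’s completion API; more elaborate techniques (chain-of-thought, role prompts) extend the same idea of shaping the input text rather than the model.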

Programming
  • The Zig programming language is worth watching. It is a simple imperative language, with optional runtime safety checks, designed to compete with C, C++, and Rust. It has a long way to go before it catches up with Rust (let alone C++), but it is starting to gain traction.
  • GitHub has announced Copilot X, its vision for the next-generation Copilot. Copilot X will include a voice interface and, relying on GPT-4, the ability to explain code, add comments, answer questions about documentation, and even explain Git pull requests.
  • Slim.ai has a service that optimizes containers by throwing out everything that isn’t needed for the application. As Kelsey Hightower has said, the best software is the software you don’t ship.
  • Will WebAssembly become a general purpose programming tool? One area where it might fit is serverless. Minimal startup time, a secure sandbox, and cross-platform support are all desirable for serverless apps.
  • Miller is a tool that is conceptually similar to sed, awk, and other Unix command line utilities, except that it has been designed to work with CSV, TSV, and JSON files.
  • GitHub now requires the use of 2-factor authentication (2FA).
  • The PostgreSQL database has long been recognized as the best of the open source databases, but its popularity has always lagged behind MySQL. According to a StackOverflow survey, it is finally getting the attention it deserves.
  • Rust was designed as a “memory safe” language, and probably makes the strongest guarantees about memory safety of any widely used language. Here’s a post that demonstrates what “memory safety” means.
  • 8th Light has published a short series (and a video) discussing what programmers should know about data regulation.
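
The memory-safety item above is concrete enough to sketch. The following is a small compiling example (the function name is my own, not from the cited post) showing what the borrow checker enforces; the commented-out lines are the kinds of bugs Rust rejects at compile time.

```rust
// A sketch of Rust's ownership rules. The returned slice borrows from
// `text`, and the function signature records that relationship: the
// result cannot outlive the string it points into.
fn longest_word(text: &str) -> &str {
    text.split_whitespace()
        .max_by_key(|w| w.len())
        .unwrap_or("")
}

fn main() {
    let sentence = String::from("memory safety without garbage collection");
    let word = longest_word(&sentence);
    assert_eq!(word, "collection");

    // Ownership of the string now moves to `_owned`...
    let _owned = sentence;
    // println!("{}", sentence); // ...so this use-after-move won't compile.
}
```

The point is that dangling pointers and use-after-free are compile-time errors rather than runtime vulnerabilities, with no garbage collector involved.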

Security
  • The Evasive.AI platform, developed for Oak Ridge National Laboratory, generates malware samples along with the training data that security systems will need to detect and quarantine the malware.
  • Microsoft Exchange Online will start delaying and blocking email messages from Exchange servers that are no longer under support and that haven’t received patches.
  • VEX (Vulnerability Exploitability eXchange) is a new machine-readable standard for reporting whether software products are actually affected by known vulnerabilities. It is designed for use with Software Bills of Materials (SBOMs).
  • The US has released its national cybersecurity strategy. Its key points are that it shifts responsibility from end-users to software and service providers, and stresses the importance of long-term investments. The Lawfare blog provides an excellent summary.
  • Phishing continues to be an important attack vector; one current variant uses a voice call as a follow-up to a bogus email about a service or charge.
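
The VEX item above is easier to grasp with an example. Here is a rough sketch of what a VEX statement can look like in the CycloneDX JSON format, one of the formats used for VEX; the component reference is hypothetical, and field names should be checked against the current CycloneDX specification.

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "vulnerabilities": [
    {
      "id": "CVE-2021-44228",
      "source": { "name": "NVD" },
      "analysis": {
        "state": "not_affected",
        "justification": "code_not_reachable",
        "detail": "The vulnerable logging class is never loaded."
      },
      "affects": [
        { "ref": "urn:cdx:example/acme-app@1.2.3" }
      ]
    }
  ]
}
```

A scanner can match the `affects` reference against components in an SBOM and suppress alerts for vulnerabilities the vendor has marked `not_affected`, which is the point of pairing VEX with SBOMs.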

Web and Metaverse

  • Beauty filters on social media aren’t new. But the newest hyperrealistic beauty filters are close to undetectable, even in video (as on TikTok). Regardless of the consequences, they will inevitably be part of an AR-enhanced metaverse.
  • Lidar has become much less expensive, and is now cheap enough to be integrated into consumer devices (including the iPhone 12). It enables many exciting projects–from building 3D worlds to digitally preserving cities in Ukraine that are in danger of being destroyed by bombing.
  • Web fingerprinting is a technique for identifying and tracking users that relies only on the characteristics of the browser and computer they are using. It doesn’t require cookies, and it’s unaffected by VPNs or even Tor. And it’s available “as a Service.”
  • Google has begun a limited roll-out of client-side encryption for Gmail and Calendar.
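
The fingerprinting item above rests on a simple idea: many individually unremarkable attributes, hashed together, yield a near-unique identifier. Here is a minimal sketch; the attribute names and values are hypothetical stand-ins for what a real script reads from the browser.

```python
# A sketch of browser fingerprinting: hash a canonical serialization of
# many browser/device attributes into one stable identifier. The
# attributes below are illustrative, not a real fingerprinting payload.
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Hash a canonical (key-sorted) serialization of the attributes."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "2560x1440x24",
    "timezone": "America/New_York",
    "fonts": ["Arial", "DejaVu Sans", "Noto Serif"],
    "canvas_hash": "d41d8cd9",  # canvas rendering differs per GPU/driver
}

print(fingerprint(browser))  # a stable ID, no cookies required
```

Because the hash depends only on attributes the browser reveals, clearing cookies or routing traffic through a VPN doesn’t change it; only altering the reported attributes does.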

Hardware
  • A more sophisticated version of lidar can better understand pedestrian behavior and its relationship to auto traffic.
  • An autonomous robot has been developed to measure leaf angles on corn plants. Measuring leaf angles is important because it shows how effective the plants are at photosynthesis.

Biology
  • Over 200 people have been treated with experimental CRISPR-based genetic therapies. While these treatments have been effective at curing previously untreatable diseases, they raise questions about cost, which can easily run into the millions of dollars.
