Radar trends began as an internal resource for O’Reilly. It’s a monthly list of things that I find interesting or important—possibly not “trends,” strictly understood, but ideas that might become trends.
Most items on the list have links to media sources—some original sources, more frequently other reporting, whichever I think is more informative. Some items are personal observations or summaries of interesting conversations.
Many items are about technology, conceived rather broadly. Over time, topics will include biology and biotech, design and user experience, ethics, open source communities, energy, and more.
We hope you find these observations useful and informative!
Germany, France, and Japan have formed an alliance for “human-centered” AI; Canada is a potential member as well. The move is partly intended to give them “critical mass” in AI research, rather than finishing in third place, and partly driven by distrust of the ethical stances of the US and China: they see a market opportunity for ethical AI.
RunwayML is yet another entry in the “create a deep learning model with minimal programming” sweepstakes. Jeremy Howard’s platform.ai sounds the most radical; there’s also AI2GO (from xnor.ai), AutoML (from Google Cloud Platform), and others. Are we on the edge of programmerless AI? Is this Software 2.0?
Cerebras has announced a trillion-transistor chip. It’s by far the largest chip ever marketed (roughly the size of a sheet of paper), and it’s designed for training AI systems. Although it has huge power requirements (15 kW), it will probably make AI development less power hungry, and certainly less time consuming. In the long run, though, it probably doesn’t solve the problem of power consumption: given faster, more powerful processors, people will build bigger, more complex models.
We’ve seen Snorkel (now a startup) and the other tools from Chris Ré’s lab at Stanford. Scale AI appears to be doing something similar (partially automated image tagging, though it still uses contract workers on the back end). This is an important step in the democratization of AI. As Ré said at the O’Reilly Artificial Intelligence Conference in New York, data collection and model building have largely been automated, but data tagging and cleaning are stubbornly dependent on human labor.
The Linux Foundation’s Confidential Computing Consortium aims to protect data in use: data that is being computed on, as opposed to data at rest (in storage) or in flight (being transferred). This requires a combination of hardware and software to build a trusted computing environment. It has significant backing from Intel, Microsoft, and Red Hat/IBM.
Supply chain attacks aren’t entirely new, but they’re becoming more common. The idea is to attack the open source supply chain: find a project that isn’t well maintained (there are many), and submit changes that create a backdoor, which can then be exploited in other projects that include it as a dependency. Backdoors (and other security problems) can be very subtle, and easy to introduce into a project that isn’t being watched carefully. Examples of successful exploits include Webmin and RubyGems.
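Detecting a backdoored dependency after the fact is hard; one common defense is to pin a cryptographic digest for every artifact you install and verify it before use (this is what pip’s hash-checking mode does). A minimal Python sketch of the idea; the package name and pinned digest below are hypothetical:

```python
import hashlib

# Hypothetical pin list: artifact filename -> known-good SHA-256 digest,
# recorded when the dependency was first audited.
PINNED_DIGESTS = {
    "example_pkg-1.0.tar.gz": "digest-recorded-at-audit-time",
}

def verify_artifact(name, data, pinned=PINNED_DIGESTS):
    """Return True only if the artifact's SHA-256 matches its pinned digest.

    A backdoored re-release of the same version would produce a different
    digest and be rejected before installation.
    """
    digest = hashlib.sha256(data).hexdigest()
    return pinned.get(name) == digest
```

This doesn’t help when the malicious change ships in a legitimately versioned release you then re-pin, but it does stop silent tampering with an artifact you’ve already vetted.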
Google’s proposal for controls on cookies and browser fingerprinting is interesting on several levels. It establishes a privacy budget: a publisher can make a limited number of calls asking for information about the browser, enough to give it partial information, but not complete information. The publisher decides which pieces of information to spend its budget on. This isn’t as extreme as Apple’s restrictions, or what Mozilla is likely to implement in Firefox, at least in part because Google is dependent on advertiser income.
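The budget idea can be sketched as a simple bit-counting ledger: each fingerprinting surface costs some estimated number of bits of identifying information, and queries past the cap are refused. The surfaces, costs, and 10-bit cap below are illustrative assumptions, not Chrome’s actual design:

```python
# Hypothetical per-surface costs, in estimated bits of identifying information.
BIT_COST = {"user_agent": 2.0, "screen": 3.0, "fonts": 6.0, "language": 1.5}

class PrivacyBudget:
    """Toy per-site ledger: grant queries until the bit budget is exhausted."""

    def __init__(self, cap_bits=10.0):
        self.cap = cap_bits
        self.spent = 0.0

    def request(self, surface):
        """Return True and charge the budget, or False if over the cap.

        A real implementation might return a coarsened/generic value
        instead of refusing outright.
        """
        cost = BIT_COST[surface]
        if self.spent + cost > self.cap:
            return False
        self.spent += cost
        return True
```

The interesting design question is visible even in the toy: a site can spend its budget on a few high-entropy surfaces (fonts) or many low-entropy ones, but either way it can’t accumulate enough bits to single out a user.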
Google is talking about replacing passwords with biometrics. They already have fingerprint recognition on Android, but they are also starting to use biometrics for access to other services. We’ve heard for years that passwords would disappear; maybe it’s finally time? On the other hand, fingerprints are ultimately just objects in databases, and like anything else in a database, they can be stolen. Once a fingerprint is compromised, you can’t change it.
Brain-computer interface trends
It’s not just Elon Musk who wants to put wires in your brain. Facebook does, too. Here’s Facebook’s take on brain-computer interfaces. The ostensible motivation is pretty much the same as Musk’s: helping people with disabilities first, but then a big step forward in user interfaces.
One researcher is building neural networks with real neurons (and putting neurons into chips, where they live for a couple of months). It’s very futuristic research, but biological computation could become something worth following. Biological circuits aren’t terribly fast, but they’ve certainly solved some connectivity, power, and density issues.
I’ve seen recent interest in web frameworks within frameworks: frameworks within React, frameworks within Vue, and so on. I think this says something about the fragility of the current state of web programming. React, Angular, and their peers are just too complex. I don’t know whether metaframeworks are the answer (I suspect they’re not), but their existence is certainly a signal of the problem.
RISC-V is an open source instruction set architecture for building CPUs, ranging from embedded controllers to high-performance processors. It competes directly with Intel’s chips and the other mass-market CPUs. It’s interesting to see that Adafruit and Raspberry Pi are getting involved.
The Federal Reserve has announced that it will create its own electronic system for clearing payments. This is effectively a government “blockchain” (though who knows whether it will use blocks or chains). My guess is that it’s too little, too late: it launches in 2023 at the earliest, and it will compete with projects already underway at major banks.