Distributed Artificial Intelligence

Many amateur "AI professionals" are "philosophers" who ignore as much experience and data as necessary in order to make pompous categorical statements that impress the rubes but cannot be translated into hardware. Sine data, cogito ergo sum insipiens.

Megabrain AI, or superfast single core AI, will never happen because it is suboptimal. Distributed AI, many nodes including humans and other lifeforms, is happening now.

Human brains are distributed, low power multiprocessors relying heavily on pattern matching, moving ions and molecules across synapses rather than electrons through wires. The brain does all its work, including physiological maintenance, on about 20 watts of power, and does so at a results-generation rate matched to the problems encountered in the natural environment. Brains could work faster, but energy is costly to gather.
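Purely as an order-of-magnitude illustration of how frugal that 20 watt budget is, here is a rough calculation. The synapse count and average firing rate below are commonly cited round numbers, not measurements, and the answer is only good to an order of magnitude:

    # Rough energy budget per synaptic event. The figures below are commonly
    # cited order-of-magnitude estimates, not measurements.
    BRAIN_POWER_W = 20        # total brain power budget
    SYNAPSES = 1e14           # order-of-magnitude synapse count
    AVG_RATE_HZ = 1.0         # order-of-magnitude average activity rate

    events_per_second = SYNAPSES * AVG_RATE_HZ
    joules_per_event = BRAIN_POWER_W / events_per_second
    print(f"~{joules_per_event:.0e} J per synaptic event")   # roughly 2e-13 J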

There is no reason to think that engineers (human or artificial) will design AI in excess of productive need or available resources. The productive need is vast, perhaps unbounded, but it is situational and specific.

The biggest extant AI, the worldwide network of Google computers, handles a huge parallel task, serving millions of simultaneous queries. Google uses a crapton of parallel computers to do that. Using fewer, faster computers would be more expensive and less energy efficient.

Speed (aggregate throughput) scales with the total capacitance switched per second, CF, and in CMOS the achievable clock frequency F scales roughly with supply voltage V, so speed is proportional to CV while switching power is proportional to CV²F, roughly CV³. Increasing C (parallelism) and reducing V is therefore a win, more results per watt-hour. Nature knows this, Intel knows this, nVidia knows this, Google knows this, I know this. The millennialist AI community does not.
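A back-of-the-envelope sketch of that tradeoff, using the first-order model above (F proportional to V, power proportional to CV²F); the core counts and voltages are illustrative only:

    # First-order CMOS voltage/parallelism tradeoff (illustrative model only):
    #   clock frequency F scales linearly with supply voltage V
    #   throughput  ~ cores * F
    #   power       ~ cores * V**2 * F
    def design(cores, v):
        f = v                          # F proportional to V (normalized units)
        throughput = cores * f         # results per unit time
        power = cores * v**2 * f       # dynamic switching power
        return throughput, power

    for name, cores, v in [("1 fast core  @ 1.0 V", 1, 1.0),
                           ("4 slow cores @ 0.5 V", 4, 0.5)]:
        tput, pwr = design(cores, v)
        print(f"{name}: throughput {tput:.1f}, power {pwr:.2f}, "
              f"results per unit energy {tput / pwr:.1f}")

    # Four half-voltage cores deliver twice the throughput at half the power,
    # i.e. four times as many results per watt-hour.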

Google response time is a matter of two things: network delay, and how many canned responses they've stored up for generic search queries. Typically, each customer-facing compute node pulls the search words deemed "important" out of a search query, ignores the rest, assembles a pointer, then does one disk lookup on the machine the pointer points to, spewing the results of a generic search that was performed hours or weeks or years ago, along with the revenue-generating ads that attach to the search words. That approach is fast and cheap, and works like a human brain. Also, like the typical brain, it is annoying to those of us who are trying to find exceptional results.
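A minimal sketch of that precomputed-lookup pattern; the stopword list, hash-based pointer, and in-memory bin store are illustrative stand-ins, not Google's actual pipeline:

    import hashlib

    # Illustrative stand-ins, not Google's actual pipeline.
    STOPWORDS = {"the", "a", "an", "of", "for", "to", "in", "and", "or", "how"}

    # Pretend bin store: canned result pages keyed by a pointer, built by
    # batch indexing jobs hours, weeks, or years before the query arrives.
    PRECOMPUTED_BINS = {}

    def pointer_for(query):
        """Pull out the 'important' words, ignore the rest, assemble a pointer."""
        important = sorted(w for w in query.lower().split() if w not in STOPWORDS)
        return hashlib.sha1(" ".join(important).encode()).hexdigest()

    def precompute(query, results):
        """Batch job: store a canned result page for a generic query."""
        PRECOMPUTED_BINS[pointer_for(query)] = results

    def serve(query):
        """Customer-facing node: one lookup, no fresh search."""
        return PRECOMPUTED_BINS.get(pointer_for(query),
                                    ["(no canned bin; fall back to a slower path)"])

    precompute("distributed artificial intelligence", ["result 1", "result 2"])
    print(serve("the distributed artificial intelligence"))   # hits the same canned bin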

Given speed-of-light network delays, there is no reason to spend gigabucks to further reduce the response time of the computers. It is better to create more Google data centers, closer to the product AKA search users, so they can be harvested faster and more efficiently. There is no need to build a typical Google data center bigger than the product field it harvests. Google's data center in The Dalles, Oregon, is an exception; electricity is so damned cheap here that many of the backroom tasks, like efficiently sorting the internet into bins and assigning search pointers to them, are best done where power is cheap. Then those bins are replicated to production data centers around the world. Of course, those distributed data centers can also assist with bin creation during times of low product demand.
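To put the speed-of-light argument in numbers, a rough latency sketch; the distances are illustrative, and the two-thirds-of-c figure for signal speed in optical fiber is an approximation:

    # Round-trip network latency from distance alone, ignoring routing,
    # queueing, and server time. Signal speed in optical fiber is roughly
    # two thirds of c (an approximation).
    C_KM_PER_MS = 299_792.458 / 1000          # speed of light, km per millisecond
    FIBER_KM_PER_MS = C_KM_PER_MS * 2 / 3     # about 200 km per millisecond

    def round_trip_ms(km):
        return 2 * km / FIBER_KM_PER_MS

    for label, km in [("same metro area", 50),
                      ("across a continent", 4_000),
                      ("across an ocean", 10_000)]:
        print(f"{label:>18}: at least ~{round_trip_ms(km):.1f} ms round trip")

    # Past a few thousand kilometers the wire dominates response time, so
    # building data centers closer to the users beats building faster servers.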

Google, like nature, puts all its eggs in MANY baskets. Nuke three Google data centers, and with a little bit of routing magic, the rest will shoulder the load. The unit of genetic reproduction is the species, not the individual; a too-small subset of a species is nonviable (for humans, the minimum is estimated at more than 500 individuals, for sufficient genetic diversity and accident tolerance).
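A toy capacity check for that failover claim; the data center count and utilization figures are invented for illustration:

    # Toy N-minus-k capacity check: can the surviving data centers absorb the
    # load when some are knocked out? Counts and utilization are invented.
    def survives(total_centers, lost, normal_utilization):
        """True if the remaining centers can carry the whole load."""
        remaining = total_centers - lost
        if remaining <= 0:
            return False
        load = total_centers * normal_utilization   # total work, in center-units
        return load / remaining <= 1.0

    print(survives(total_centers=20, lost=3, normal_utilization=0.6))   # True
    print(survives(total_centers=20, lost=3, normal_utilization=0.9))   # False: too little slack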

Ah, well, Google's customers (the advertisers) get access to the product they want (you and me), and this is a very efficient way to deliver the best quality product to the customers. Your brain craves access to the glucose that feeds it, so it feeds your body unhealthy crap.

AI, like brains, is a tool to solve a problem with the time and resources available. The best AI computes plausible solutions in advance of need, as power-efficiently as possible. Computing lots of solutions in parallel, distributed in time to match the rate of problem manifestation, is the optimum way to proceed. For problems manifesting at daily rates, solution systems 80 AU in diameter are adequate, since a round trip across such a system at light speed takes roughly a day, and emergent threats accelerating to sublight velocities will need much longer than that to threaten those systems.
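Checking that 80 AU figure against light-speed delay (straightforward arithmetic, nothing assumed beyond the physical constants):

    # Light-crossing time of an 80 AU diameter system.
    AU_KM = 149_597_870.7      # one astronomical unit, in kilometers
    C_KM_S = 299_792.458       # speed of light, km/s

    diameter_au = 80
    one_way_s = diameter_au * AU_KM / C_KM_S
    print(f"one-way light time across {diameter_au} AU: {one_way_s / 3600:.1f} hours")
    print(f"round trip: {2 * one_way_s / 3600:.1f} hours")   # roughly one day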