The Future of Computing

Introduction

We have only scratched the surface of what we will eventually do with computers and automation. Meanwhile, some seem to want the technology to stagnate, to become a "mature industry," when it is far from what it could be.

The future is going to provide all sorts of advances in hardware and software, but will upcoming consumer innovations be hugely useful, as they could be, or so constrained as to be useless?


Many products coming to market now are pretty lame. We don't need a new smartphone just to get a few more pixels and a couple of slick but useless "features" on top.

So it goes: there are things that aren't on the market but should be, and things that are on the market but shouldn't be.

Then there are the things that are on the market, but don't even exist.

Fake Tech?

A.I.

We've already covered this topic in Stop Killer Robots Now. A.I. — Artificial Intelligence — is an abused term. A.I. refers to certain specific areas of research and development in computer science. While aspects of A.I. continue to improve, we are far away from an actual "thinking machine." That fantasy is merely fodder for Hollywood movies and hype to sell naïve investors the most recent junk stock.

Wherever there is great ignorance of some technology — or ignorance can be created — you can expect a real maelstrom of kooky exploitation.

And of course, when it comes to kooky exploitation, we need look no further than its mascot, Elon Musk. This is the type of BS that gets headlines: a vaporware announcement goes out, and the gullible stumble all over themselves to heap praise on Musk and ask where to pre-order something that doesn't even exist in reality, but is just Musk sputtering out his latest brain fart.

Quick & Dirty Summary

  • Fake Technologies and Disappointments: Quantum Computing, H.P.'s "The Machine"

  • Nascent Technologies: VR and Holograms, Digital Currencies, Drones, 3-D Printers

  • Missing Technologies: New Operating Systems/BIOS, Speculative Ideas

  • Current Gizmos: Displays, Cameras

  • Coming Technologies: Solid State Batteries, Automation in Transport

  • Threats: Microsoft, "Internet of Things," Demise of PCs

With robotics, which is closely tied to A.I., we should expect some sort of household robot to be offered, just not the imaginary "artificial people" that Musk lies about. Of course, we've been expecting this for over 20 years now. Still, it's easy to visualize feasible uses for existing technology. It seems manufacturers should be able to come up with something that can do light general outdoor maintenance, handle guard duty, pool patrol and surveillance, and act as a porter, carrying things around. You have to imagine there is a giant impetus to make something that industry can sell for the price of a car. If cars become mere commodities, standardized and without the wide variation we are currently used to, there will be a great need for something else that industrial concerns can move their production lines to.

In fact, I can see a multi-purpose robot with those three specific abilities being the first step. A patrol/hauling/lawn-cutting and raking robot — something possibly just over the horizon. Priced in the hundred-thousand-dollar range, it would be just useful enough to be purchased by "early adopters" among the upper-middle and rich classes. That would lead to advances, more robot capabilities, a price drop, wider acceptance, and so on.

It's a bit strange there isn't anything like that already. Sony, Honda and Boston Dynamics especially seem to have the sophistication to make consumer robots. They probably are just wary about making the sizable investment gamble that full-fledged production lines would entail. It would be a smart bet for Apple, too, a better idea than their foolish initiative to make an "Apple Car."

Now don't just gloss over this. That there are no $199,999.99 robot butlers is a tell. It means they aren't all that far along with A.I. and robotics, or else it would be a cash bonanza they could exploit. To make matters stickier, it seems a lot of those YouTube videos of sophisticated robots performing feats have been faked! How is this not considered fraud? Well, they aren't misrepresenting a product for sale, so maybe they can claim poetic license or something.

Awesom-O and Butters

Quantum Computing (QC)

Researching QC, you find it's another of these seemingly promising vaporware technologies. There's a peculiar thing about QC, the same problem we've seen before whenever someone is trying to flog a sketchy technology: there is no clear description of what it is or does. If you research the topic, you find different "explanations," all very weak, and all seeming to be modified versions of the same single source.

QC doesn't seem to be "computing" at all — it gets away from binary calculations into "binary, and anything in between." Oh, boy, here we go. This reminds me of an old sci-fi story where the author described how the aliens' computers didn't use our (old-fangled) binary systems, but new-fangled trinary systems.

It doesn't work that way. Binary is good, not something to be transcended up an advancing scale to "trinary," "quaternary(?)," "quinternary(?)"... It has to do with the way the inner workings of computers, the transistors, operate, and with the fact that it's easier to ensure signals are accurate if they can only be "on" or "off," and to work with binary logic.

The "quantum computer" isn't actually a computer at all, in fact. That is, if it even exists or can be made at all.

We've also seen similar confusion, of course, with the Bitcoin boondoggle, as the proponents trip over themselves to explain something that, in the final analysis, doesn't make sense, and doesn't really perform in the advertised manner.

One notable claim of this QC baloney-fest is that IBM has supposedly designed a working quantum computer. Now, one of the few touted advantages of these machines is that they are supposed to be able to factor large numbers down into their prime components. For example, the factors of 15 are 3 and 5, because 3 times 5 is 15. But when you multiply two really large primes together, the product can be used in data encryption, because it could take a computer years to factor that product back into its primes. This fact is used to protect your data, via cryptography. Note that in public-key systems, the published "public key" is built on the product of two large primes, while the "private key" depends on knowing those primes themselves.
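To make the factoring point concrete, here is a minimal Python sketch with toy-sized numbers of my own choosing, not anything from a real key. Trial division recovers the factors of a small product instantly; real keys use primes hundreds of digits long, which is exactly what makes this brute-force search hopeless.

```python
import math

def factor_semiprime(n):
    """Recover two prime factors of n by brute-force trial division.

    Instant for toy numbers; the moduli used in real public-key cryptography
    are hundreds of digits long, which is why this approach is hopeless there.
    """
    if n % 2 == 0:
        return 2, n // 2
    for candidate in range(3, math.isqrt(n) + 1, 2):
        if n % candidate == 0:
            return candidate, n // candidate
    return None  # n itself is prime (or 1)

# The example from the text: 15 = 3 * 5.
print(factor_semiprime(15))             # (3, 5)

# Two modest, well-known primes (the 10,000th and 100,000th primes).
p, q = 104_729, 1_299_709
print(factor_semiprime(p * q))          # (104729, 1299709), roughly 50,000 divisions
```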

Now, a quantum computer is supposed to excel at factoring large numbers. So you'd expect that the first thing this so-called IBM quantum computer would be tasked with is solving a really big factoring problem.

But no: their so-called "quantum computer" produces "results" that are only stable for 90 millionths of a second. Kind of hard to read that off your monitor. Better to say: ridiculous. The whole quantum computing hype is indicative of yet another scam. It isn't going far out on a limb to say it will go nowhere. What might happen (since so much hype, ego and cash is involved) is that they'll quietly drop QC and move on to the next big thing, or they'll rebrand some existing tech as "QC," a sneaky detour.

They're already saying that D-Wave, the company that claims to have produced a quantum "supercomputer," isn't really making quantum computers in the true spirit of the thing. Muddying the waters, or jealous competitors?

Anyway, the takeaway is that nobody's factoring any large numbers into their prime components, and no one's doing anything tangible with these "quantum computers." So, to the trolls, shills and boobs out there hyping QC: please shut up about it until there's a real working product to show.

Disappointment

HP & "The Machine"

Hewlett-Packard had a good idea. They were going to produce a new, advanced computer, and they even gave it a name: "The Machine." It was designed to use "memristors," a new type of passive component that "remembers" the direction of the current last put through it, even after a power-down. The memristor acquires low resistance if current flows one way, high resistance if it flows the opposite way. It's somewhat reminiscent of a transistor, whose current flow is controlled by the voltage applied to one of its three terminals. In fact, a memristor can be employed as a switching device and therefore be used to replace transistors.
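As a rough intuition pump (not HP's actual device physics, just a toy model assumed for illustration), here is a Python sketch of a memristive element: its resistance drifts toward a low value when current flows one way and toward a high value the other way, and the state persists when the drive is removed.

```python
class ToyMemristor:
    """A deliberately simplified memristive element (not HP's actual model).

    An internal state variable x in [0, 1] drifts with the direction of the
    applied current: one polarity pushes the device toward its low-resistance
    state, the other toward high resistance. Crucially, x is retained when
    no current flows, which is the "memory" in memristor."""

    def __init__(self, r_on=100.0, r_off=16_000.0, drift_rate=0.05):
        self.r_on, self.r_off, self.drift_rate = r_on, r_off, drift_rate
        self.x = 0.0  # 0 = fully high-resistance, 1 = fully low-resistance

    @property
    def resistance(self):
        return self.r_on * self.x + self.r_off * (1.0 - self.x)

    def apply_current(self, amps, seconds=1.0):
        # Positive current nudges the device toward R_on, negative toward R_off.
        self.x += self.drift_rate * amps * seconds
        self.x = min(1.0, max(0.0, self.x))

m = ToyMemristor()
print(f"start: {m.resistance:.0f} ohms")
for _ in range(10):
    m.apply_current(+1.0)          # "write" in one direction
print(f"after forward current: {m.resistance:.0f} ohms")
# Power the circuit down and come back later -- the state is still there.
print(f"retained state x = {m.x:.2f}")
for _ in range(10):
    m.apply_current(-1.0)          # reverse the polarity to "erase"
print(f"after reverse current: {m.resistance:.0f} ohms")
```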

In any case, this new tech was scheduled to be employed in a commercially available computer. Using memristors as extremely fast memory across the whole architecture, including as a replacement for the old-style hard drive, would remove the bottleneck caused by juggling various types of memory. That, along with using fiber-optic connections internally and taking advantage of some optical signal processing, would make the new machine considerably faster and simpler than what we use now.

Well, the mass production of memristors hit a snag. They finally displayed a working model of The Machine, but weren't able to mass-produce it economically, so they dropped the whole project. It seems short-sighted, of course, but the project also lost its leader to retirement, which probably took the wind out of the sails within HP.

What's really disappointing is the general lack of investment by almost all companies into the fundamental research needed to keep new advances coming.

The frequency (clock speed) of processors has also plateaued. My 11-year-old machine reports a clock of 2.4 GHz, about the same as brand-new machines. That does not seem right. You used to expect an order-of-magnitude increase in speed, or more, over that span of time.

There was a "law," Moore's law, that says that the density of computer chips was expected to double about every two years. This ad hoc rule was obeyed pretty closely for quite a while, but we have already approached physical limits of how much we can shrink transistors, the building blocks of microprocessors. At the same time, there are real physical limits to how fast the clock speed on our current architectures can be.

And the two are tied together. You need to pack the transistors closer together if you want to increase processing speed. But that makes for more heat dissipation problems.

It looks like we hit the wall, the technological limit.

These days, to increase capability, they add more "processing cores" and run processes in parallel. (It used to be that one process followed another, in serial fashion, one at a time.) Of course this requires more logic, more work, and even modification of the software that runs on the machine to take advantage of the extra cores.
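A minimal Python sketch of that serial-versus-parallel point: the same CPU-bound work run one task at a time, then spread across cores with a process pool. Any speedup depends on how many cores you have and on the work actually being divisible, which is exactly the software-modification burden just mentioned.

```python
import time
from multiprocessing import Pool

def busy_work(n):
    """A deliberately CPU-bound task: the sum of squares up to n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [5_000_000] * 8

    start = time.time()
    serial = [busy_work(n) for n in jobs]        # one after another
    print(f"serial:   {time.time() - start:.2f}s")

    start = time.time()
    with Pool() as pool:                         # one worker per core by default
        parallel = pool.map(busy_work, jobs)     # same work, spread across cores
    print(f"parallel: {time.time() - start:.2f}s")

    assert serial == parallel
```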

The "wall," though, is to a large extent a money wall in working with transistor technology. This Extreme Tech article gives an interesting insight, showing it's a wall of cost that keeps these advances slow. From the article: "...chip design costs are rising so quickly, they could effectively kill long-term semiconductor scaling across the entire industry... And all of that cash is being spun out in search of smaller and smaller improvements. By 3nm, the relative price/performance improvement expected to be offered is around 20 percent, compared to approximately 30 percent today."

So this new advance using memristors, smaller and quicker than transistors, would be a great boon. Could it be something we can look forward to in a decade or so, or is the technology lost to corporate mismanagement?

Nascent Technologies

Virtual Reality (VR), Holograms and Projection Technology

Isn't it virtual unreality that we want?

It bears some thought whether VR should be considered "Fake Tech." Is it just a pipe dream to have a virtual environment — like the holodeck in Star Trek? At the current level of technology, it is impossible. There is so-called virtual imaging via those geeky-looking gimmick wrap-around goggles, but that's not quite the same deal.

Delays in hologram technology show it's not yet advanced enough or satisfying enough. There are no practical applications to make a viable consumer product, though you'd think this would be ideally suited for movie theaters.

You'd think some commercial use for VR exists, but the reality is, it's mostly just talk now. Perhaps it's a cost issue, how to make a good return on investment.

Virtual reality could rekindle interest in movies, resuscitate failing movie theaters — though I suppose with some blockbusters still pulling in hundreds of millions at the box office, there's no need for the movie industry to panic quite yet.

We have already seen things like "floating in the air" touch screens, simulated in movies. Of course, there we face the issue of practicality. Most useful work is done via typing, and it is impractical to implement "floating" keyboards.

But it would be great if you could just call up a computer, and a more practical interface, with a few hand gestures in any room of the house. That type of VR would be great for interactive games, 3-D modeling, architectural plans and visualizing chemical models, among other things.

"Virtual Presence"

Improving networking over the Internet is a good place to concentrate efforts. There used to be a program called "PC Anywhere" that let you hook up remotely, say to the office computer from home. It was discontinued in 2014, reportedly for security reasons. There are several next-generation replacements for the product, though.

Building on that, how about something like "Virtual Presence?"

"Virtual Presence" could, for example, allow everyone in the community to vote at City Council, on various proposals and expenditures. Politicians probably wouldn't like that! But in the practical sense, it would certainly make referenda much easier and therefore more widely used. It's easy to envision something like this as a sort of teleconferencing, allowing interactive participation like voting, presentations, and such. Talk about your grass-roots politics! Probably this would be viewed as too empowering, so don't expect it any time soon.

BTC & Digital Currencies

Note how useful, informative consumer products don't get produced, while nonsense like "bitcoins" (BTC) does.

This site has already proposed an idea for a proper cryptocurrency that would actually work, but so far as I know, no one has stepped in to fill the void. It's interesting how multiple entities are releasing their own cryptos, as a publicity stunt, though.

If we're going to see all this democratization, it seems that everyone should have their own server. And then, why not provide each person with his or her own "digital coin," implemented on that server, if we want true egalitarianism with this new tech? This wouldn't be a way to speculate and make windfall unearned profits, but merely a tool to assist trade and commerce between people. Each person's coin would find its own level — its own relative trade value — based on the reliability, honesty, and work ethic of the particular owner.
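Just to make the idea concrete (purely a sketch of the concept, not a proposal for real currency software, and the names and amounts below are invented), each person's server would need little more than a ledger of issuance and transfers, with the coin's trade value left entirely to whoever chooses to accept it:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalCoin:
    """A toy model of one person's coin: a ledger kept on that person's own server.

    There is deliberately no speculation mechanism here; the coin is "worth"
    whatever trading partners decide, based on the issuer's track record.
    """
    issuer: str
    ledger: list = field(default_factory=list)

    def issue(self, amount, note=""):
        self.ledger.append({"type": "issue", "to": self.issuer,
                            "amount": amount, "note": note})

    def transfer(self, recipient, amount, note=""):
        if amount > self.balance(self.issuer):
            raise ValueError("cannot transfer more than the issuer holds")
        self.ledger.append({"type": "transfer", "from": self.issuer,
                            "to": recipient, "amount": amount, "note": note})

    def balance(self, holder):
        total = 0
        for entry in self.ledger:
            if entry.get("to") == holder:
                total += entry["amount"]
            if entry.get("from") == holder:
                total -= entry["amount"]
        return total

coin = PersonalCoin("alice.example.org")
coin.issue(100, "weekly issuance")
coin.transfer("bob.example.org", 30, "lawn mowing")
print(coin.balance("alice.example.org"), coin.balance("bob.example.org"))  # 70 30
```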

Drones

We aren't seeing the fanatical hobbyist influence so much anymore. Remember when the first "hobbyist computers" came out, like the Sinclair or the TRS-80? They inspired quite a fan base of dedicated followers.

Today, those surges of geeky enthusiasm don't seem to spark up like they used to.

People these days, noses buried in their phones, just don't seem motivated to explore individual challenges and innovations anymore.

For example, we aren't seeing competitions to build endurance drones, perhaps ones that attempt to go overseas, or around the world, using solar power to recharge.

We do see a lot of commercial interest in applications like having drones replace couriers and pizza delivery people. There may be a business opportunity here: anyone who pulls this off (a proper, reliable drone self-guidance/navigation system) will have a great "in" to provide autonomous car software.

Practical drones that we aren't seeing, and should be, are fast drones that run ahead of trains, along the track, required for all trains, freight and passenger. These drones would find and report back on track conditions, remove smaller obstacles, and signal the train to stop for particular hazards.

Yet nobody does that, and the possible reasons are worrisome. Do they want to have the odd derailment here and there?

Also, there's no reason we can't have rental drone robots — like those dog robots you've seen — that we could use to explore places remotely, take snapshots and upload them to the net, or pick up packages and mail. Why is this not touted as the future? One would think they've thought of it.

3-D Printers

What happened to... 3-D printers? Are they too useful, and therefore suppressed to keep monopolies safe, like the market for expensive replacement car parts, some of which can now be printed out?

Are they still just too expensive, or not living up to expectations, or what? There seems to be an unlimited world of applications for these things as their capabilities expand and the variety of materials they can use increases. It's hard to figure why these game-changers don't get the same attention and promotion that the nonsense does.

What's Missing?

New Operating Systems/BIOS

How about a new operating system (OS) for computers, one that takes all the lessons learned in computing and applies them, taking advantage of new technologies?

The improvements in memory and processors, new database technology, and new advanced concepts in computer science should be mustered to make a killer system (and something that's snappy, that opens your program almost instantly when you click on its icon).

How about full-disk indexing built into the OS, so you can find that phrase in that article you saved somewhere on your terabyte disk, without having to run auxiliary programs?
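Nothing exotic is required in principle. A crude user-space stand-in for what an OS-level index would do might look like the following sketch (the directory and file types are just illustrative assumptions; a real indexer would watch the filesystem for changes and persist its index):

```python
import os
from collections import defaultdict

def build_index(root):
    """Map each word to the set of text files under root that contain it."""
    index = defaultdict(set)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith((".txt", ".md", ".html")):
                continue  # only index plain-text-ish files in this sketch
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for word in f.read().lower().split():
                        index[word.strip(".,;:!?\"'()")].add(path)
            except OSError:
                pass  # unreadable file; skip it
    return index

def search(index, phrase):
    """Return the files containing every word of the phrase."""
    word_sets = [index.get(w, set()) for w in phrase.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()

# Illustrative usage: point it at your own documents directory.
idx = build_index(os.path.expanduser("~/Documents"))
print(search(idx, "quantum computing"))
```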

How about a multi-processor system that works by plugging additional processors into a backplane as more power is required? Really, why not?

Certainly, music and video players should have their own dedicated processors and buffered memory. Note that multi-core processors solved the problem of running multiple programs at once, after a lot of engineering work. Let's see that work leveraged to really speed things up.

Decades ago, at one point, they were bragging that they were going to put a graphical interface on the BIOS (the program that starts before your operating system, enabling the various systems of the computer for the OS to manage). That went nowhere, it seems, unfortunately.

What About These?

  • Good speech recognition, or handwriting-to-text, would be nice. Google has a hit with Google Translate. If only that could work for voice, too! Auto-generated subtitles on YouTube seem to work very well — they have trouble with the same words I do.

  • Foolproof programming would also be welcome. There are still many memory leaks in programs (where the computer starts to over-consume memory resources) and bottlenecks where your machine seems to freeze — will we ever see some technique that bypasses all this? (A minimal example of the classic leak pattern appears in the sketch after this list.)

  • We still don't have the video phones we were promised 50 years ago — there is streaming on-line, but we don't have just plain telephones with video. Of course one reason is privacy. No one wants to be on camera with their hair a mess, in a bathrobe, toast crumbs at the corner of their mouth.

  • How about a new type of touch pad that's actually a small monitor itself, showing what's on screen, in place of the ordinary touch pad? It probably doesn't need to be as high-res or expensive as the main monitor, but it does need good touch resolution, so it can interpret what you draw on it. It should be flat so you can draw on it, select text, and use gestures like pinching, while the main monitor itself stays untouched. Touch screens as your main monitor are a bad idea, and uncomfortable to use. (Update: It looks like Asus did this exact thing with their ZenBook Pro laptop in 2018 — but it doesn't seem to have set the world afire, because we aren't seeing the idea imitated by other manufacturers.)
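As promised in the "foolproof programming" item above, here is a minimal sketch of the classic leak pattern: a cache that only ever grows. The program still "works"; it just quietly consumes more and more memory, which is the kind of thing better tooling or language support could catch. (The function names are invented for illustration.)

```python
# A classic slow leak: a cache with no eviction policy. Every distinct key
# adds an entry, nothing is ever removed, and memory use only goes up.
_cache = {}

def compute(key):
    return key * 2                      # stand-in for genuinely expensive work

def leaky_lookup(key):
    if key not in _cache:
        _cache[key] = compute(key)      # grows forever across distinct keys
    return _cache[key]

# One simple fix: bound the cache and let old entries be evicted.
from functools import lru_cache

@lru_cache(maxsize=10_000)              # least-recently-used entries are dropped
def bounded_lookup(key):
    return compute(key)
```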

What should be coming sometime in the next decades is the phone as your computer, powerful enough that you don't need a separate desktop or laptop. It would be nice if it incorporated a projector that could throw an image onto any suitable flat surface, or onto a portable roll-up screen. Or, if you could link up wirelessly to any video monitor or TV that was handy, that would be exciting.

What's important with computers is the data. That should be the focus, and better respected. The computer should rightfully be viewed as the utility, the way to access that data. As such, people, and especially businesses, should all have their own servers, rather than relying on the "cloud" nonsense (which is, of course, someone else's computer).

A computer should always be online, working for its owner. Like Bitcoin rigs, but doing useful work instead. Bitcoin itself is really just sophisticated piracy, taking advantage of the naïve, the non-technically savvy and the gullible.

How about a PB, a Personal Butler, instead of a "PC"? Your PB could look for jobs, items you want to buy or barter, cars, and the best prices on flights.

Yep, I want a PC to do actual work for me: online searches without the grinding monotony; looking for opportunities, items I want to buy, reviews, and definitions; searching for information on something I'm working on; and so on.

I want to wake up to, "You have insights waiting."

This would be enabled by a modified web browser program that we leave running all night to do searches for us, then present the organized and simplified results, along with our daily websites, in the morning. Like the morning newspaper, but tailored to personal interests, and more informative, entertaining and relevant.

Admittedly, search engines are part of the way there, but I'm thinking of a step beyond that, with websites optimized to be "scraped" so that a program can pick up tailored data for presentation. Instead of visiting multiple websites each day, you just get a digest with all the topics you have preselected, the useless parts filtered out.
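A bare-bones sketch of that overnight digest, assuming the third-party feedparser library and using placeholder feed URLs and interests (substitute your own); a real version would also summarize, deduplicate and rank:

```python
import feedparser  # third-party RSS/Atom parser: pip install feedparser

# Placeholder feeds and interests -- substitute your own.
FEEDS = [
    "https://example.com/technology.rss",
    "https://example.org/science.rss",
]
INTERESTS = ["robotics", "memristor", "solid state battery", "linux"]

def overnight_digest():
    """Collect headlines matching your interests into one morning digest."""
    lines = []
    for url in FEEDS:
        feed = feedparser.parse(url)
        for entry in feed.entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(topic in text for topic in INTERESTS):
                lines.append(f"- {entry.get('title', '(untitled)')}\n  {entry.get('link', '')}")
    return "\n".join(lines) or "No insights waiting today."

if __name__ == "__main__":
    print("You have insights waiting:\n")
    print(overnight_digest())
```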

Gimmicks and Gizmos

There are some cool new products that have been released, like the following:

Display Technology

LG Display's crazy 65-inch OLED TV can roll up like a poster

Interesting: The world's first big-screen TV that can be rolled up to hide away when not in use.

Cameras

High-definition camera using multiple small lenses

This new camera doesn't need those bulky lenses we're used to seeing, but uses optics knowledge and multiple small thin lenses to produce hi-def, high-quality images, like those made by the huge lenses pros use. (That means a much less bulky, heavy, expensive camera, for the photographer.)

On the Way?

Solid State Batteries

These are tough to figure. Are solid-state batteries simply electrolytic capacitors? Nominally they aren't, but what's the real difference?

Batteries depend on chemical reactions, transferring charge via ions. But caps, too, may use similar chemistry, as in electrolytic capacitors.

It raises the question of whether there is some mid-point between the electrolytic capacitor and battery that provides some new and marvelous properties, a "Batacitor," or a "Capattery."
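The back-of-the-envelope numbers show why the two still occupy different niches. The figures below are typical ballpark values, not any specific product: a capacitor stores E = ½CV², while a battery stores roughly its voltage times its charge capacity.

```python
# Rough energy-storage comparison (ballpark figures, not any specific product).

# A large supercapacitor: ~3000 farads at 2.7 volts.
C, V_cap = 3000.0, 2.7
E_cap = 0.5 * C * V_cap ** 2            # E = 1/2 * C * V^2, in joules

# An ordinary AA cell: ~1.5 volts, ~2.5 amp-hours.
V_bat, amp_hours = 1.5, 2.5
E_bat = V_bat * amp_hours * 3600        # E = V * Q, with Q in coulombs (A*s)

print(f"supercapacitor: ~{E_cap / 1000:.1f} kJ")   # ~10.9 kJ
print(f"AA battery:     ~{E_bat / 1000:.1f} kJ")   # ~13.5 kJ
```

Similar totals, but the capacitor is far larger and heavier per joule, while it can deliver and absorb its energy much faster and survive far more charge cycles; a genuine "Batacitor" would have to close that gap from both sides.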

Self-Sailing Ships

An interesting new proposal is autonomous, unmanned ships, which would save labor costs on long ocean voyages by ditching the crew. I see these automated ships also implement something I mentioned above as useful for railroad trains: drones that scout ahead to check for problems and issues.

Automated Cars

You know, automated cars aren't very practical at this stage, at least for long-distance travel. But they could be made practical in towns. If normal cars were banned from city centers, and people had to park outside the limits and take electric "pods" into town, you can imagine the potential. Surface parking lots would be freed for development, pollution would be lessened, and people would often get much closer to their destinations than when they have to park. No parking fees! Streets could be freed of buses as well, perhaps. Basically, a sort of pod-taxi network that operates within towns.

The bad aspects of these pod vehicles are the same issues as the subway or city bus: sitting in someone else's stain and mess. Remember, though, that can be alleviated when people have to pay for their messes. A pod could record its internal condition after each passenger, taking a photo with an internal camera, and any damage could be charged to the offending user.

Streets could be reconfigured, for pedestrians, for bicycles and skateboarders, and other "personal mobility devices." It would be really nice to see some streets canopied, as well, which would alleviate one of the most objectionable aspects of being a pedestrian or cyclist: inclement weather.

Confining autonomous vehicles to urban centers would go a long way toward making car autonomy practical. The sensors and paraphernalia necessary to enable sensible self-driving cars would be limited in scope to the cities, and therefore much cheaper to deploy.

Dark Side

Looking at wasted resources, or what is coming (or threatened) and shouldn't be:

  • The "Internet of Things," another snooping initiative, is an absurd plan that has apparently slowed, or they've just gone underground with it and are going to spring it on us later.

  • The commercialization and emasculation of the Internet is disappointing.

  • Smart meters are being forced on people. Electricity should be getting cheaper, especially with electric cars coming to provide new demand. Instead, the utilities act as if this is some great burden. Suddenly when their product is in demand, which is what any rational industry wants, they become martyrs.

  • Microsoft Windows "Sets," a tabbed interface to group programs together in one window (like tabs in browser windows, but for applications), is absurd.

  • There is even talk of eliminating personal computers.

Elaborating on some of these points:

Microsoft

As always, Microsoft dominates the bad news category, as Windows 10/11 and their other spyware continue to get worse.

With that useless Sets initiative, they forget that they already have the Taskbar on screen. Much better to save the currently open programs and files as a named "project," and leave it at that. Tabs make no sense when you want to move things from window to window. Also, if you could save whole arrangements like that to a memory stick to carry around with you, that would be a boon.

It's pretty much moot now, though, for it looks like Sets, like so many of Microsoft's boondoggles, was a failure and has been withdrawn.
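The named-"project" alternative suggested above needs very little machinery, though. Here is a sketch of the idea (the file names and application commands are invented placeholders): save a working set to a small file you could even carry on a memory stick, and relaunch it later.

```python
import json
import subprocess
import sys

def save_project(name, items, path="projects.json"):
    """Store a named working set: each item is a launch command plus the file it opens."""
    try:
        with open(path) as f:
            projects = json.load(f)
    except FileNotFoundError:
        projects = {}
    projects[name] = items
    with open(path, "w") as f:
        json.dump(projects, f, indent=2)

def open_project(name, path="projects.json"):
    """Relaunch everything that was part of the named project."""
    with open(path) as f:
        projects = json.load(f)
    for item in projects[name]:
        subprocess.Popen(item["command"] + [item["file"]])

if __name__ == "__main__":
    # Invented example entries -- substitute your own editors and documents.
    save_project("quarterly-report", [
        {"command": ["libreoffice", "--writer"], "file": "report_draft.odt"},
        {"command": ["gimp"], "file": "cover_art.xcf"},
    ])
    if "--open" in sys.argv:
        open_project("quarterly-report")
```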

It is believable that they create these make-work projects as a distraction, a flashy "new" thing to appeal to the gullible and prompt sales. The things that are "hard" are thus circumvented, as are the things that are important, useful and empowering. This often happens in a "mature" industry (i.e., a monopoly or semi-monopoly).

People are still tolerating the useless Windows crapware. Mind-boggling. Unbelievably, it may be that most people have somehow never learned that Microsoft Windows is indeed spying on them and their machines. But how people put up with the intolerable loading times, as the thing melds with Microsoft and installs "updates" while feeding back its filthy little spy report and their data, is stunning.

But now they've ratcheted things up a notch, with their overstepping goons making demands about what you can and can't do online. If you rightfully criticize Microsoft, for example, they can pull your software license.

It's as though the gas station were coming after you. "Since you're using our gas, you can only drive where we tell you to." And the computer rags are mostly mute on the issue! We know where their money comes from.

I don't know why people are using Microsoft products in the first place, but this hits new lows.

These agreements, imposed from on high by Microsoft, have no legal substance. Their unilateral demands aren't recognized in law, but that doesn't stop them from fooling people.

Add the problem of MS software being inefficient, and it's a burden that shouldn't be used in the first place. Why continue to support their schemes? Microsoft is criminal.

Use Linux, or continue to be a victim. I guess even Apple, while not perfect, is the better choice.

The Internet

Websites are devolving to advertising and pablum, with the same information constantly being regurgitated in different ways.

People put up with an absurd level of ads on certain sites. Often, you land on a bad website and it freezes up your browser. Fortunately, ad blockers are pretty effective at limiting that, so now we get accosted with "Turn Off Your Ad Blocker" demands, and this is almost always at sites you can do without anyway. Often, I'll just close the page if they insist.

Yes, instead of being snappy, as they should be with all their processor power, we have lagging computers, especially on web pages, with long load times because those pages are hopelessly burdened with tracking, JavaScript overkill, and so on. Of course, the bad programs provided by the likes of Microsoft aren't helping the situation.

Also, the Net has empowered distractions to suck us dry. If it's not banal Facebook and Twitter, it's something else that preys on people, like the Bitcoin scheme.

"Internet of Things"

This idea sounds worse with each passing month, yet this non-viable nonsense is pushed relentlessly. That's because it's a component of the scheme to put everything and everyone under some sort of control and surveillance. To limit consumption and so forth, they will have the power to turn your very appliances on and off. As with other stupidities, it should be obvious at first glance that you can't punish the refrigerator: it has a fixed usage, and it's going to require that amount of juice regardless of any conspiratorial dreams of dictatorship.

Hilarious, too, is their boast (threat) that the device can order your milk and other food. Complete idiocy. It may be tied into an initiative where the grocers get guaranteed deliveries. You can imagine how that will affect prices when people are locked into food-supplier contracts.

They want to regulate your house lighting, your computer usage, and your TV to stop "over-consumption." More important to certain snoops, is to use all this regulation as a spying technique, so they know what you're doing at all times.

"Phasing Out" Personal Computers

Here's an interesting idea: they want to slowly "phase out" laptops and desktop computers. It probably isn't any more likely than phasing out telephones and TVs. But the computer industry has slowed to a no-growth situation, so it will stabilize and stagnate to an extent. That means they're going to have trouble finding increased profitability. To that end, they're cheapening everything, like making laptop batteries very hard to replace and removing the DVD drive, which was never a quality piece to begin with. One article even made a case that Apple will phase out the Mac.

There's a bright side, as mentioned previously, if the laptop can be merged with the cellphone through a little more miniaturization. Using small built-in projectors in place of a panel monitor and keyboard would make sense for a real portable.

Bottom Line

The lack of the enthusiastic, heady growth of past decades in the computing field is disappointing. It's as though the drive for innovative things is being sucked up in the wake of nonsense notions and a lot of wasted effort: the Facebooks, the "Internet of Things," RFID tracking, nonsense software like "MS-Sets," and unnecessary new "iPhones" released for fashion's sake, not because of major advancement. But despite all that, there is room for many more surprises and breakthroughs, mostly in the area of robotics.

