
Computer Corner

Robots and AI: Something Wrong with the World

They’re Still Pushing the Same Old BS

It’s a daily barrage, and recently it was about using AI (artificial intelligence) at pharmaceutical companies. I won’t waste your time with a link. And, unbelievably, we were also treated to old war criminal, 98-year-old Henry Kissinger, weighing in with his ill-formed opinion about AI. Who in his or her right mind cares? The pharmaceutical article was talking about data analysis and perhaps expert systems, not AI. Words matter. When they start on this line of deception, it’s inevitably for a bad purpose, like fooling the stockholders or fooling the general public in an ongoing brainwashing scheme.

Even plain automation lags; they still don’t have anything practical yet. Even a subway train driver can’t be replaced right now on older, non-automated trains, which would be the first place you’d try out “automated” driving control systems. None of these “commentators” or “experts” pushing this has any common sense: you always see new tech implemented in simple venues first, then in more complex uses.

But here’s the real test, and what would be evidence of real sincerity: A NASCAR-type event, where all the cars are strictly autonomous, unmanned. All the manufacturers could compete in this event, to show off their systems (or adaptations of some system they purchase). In fact, in an actual functioning society, this would be required before any of these systems could be trialed on the roads (looking at you, Tesla), and any systems that crashed in this trial wouldn’t be allowed.

An article states that on Dec. 22, 2021, a semi navigated an 80-mile stretch of highway between Tucson, Arizona, and Phoenix without any human intervention, in a test by San Diego-based TuSimple. I wrote to ask for video of this, and TuSimple responded that it had a YouTube video of the event. It’s not a bad start, but the issue isn’t whether partially autonomous driving is possible; it’s when it will be pretty much foolproof and able to work in harsh conditions, not just on a prepared route.

The studies say that semis are inefficient and not cost-effective for anything over 150 miles, with rail being the better choice. Obviously, with proper planning, including, say, dedicated lanes for electric semis on the highways, this could be improved. And since semis go long distances despite that, even cross-country, the rail system planners are obviously greatly at fault. I like one person’s idea that semis should have depots very near highways and freeways, but they should also be integrated with rail stations, and with ports in coastal areas.

No, we’re a long way from fully automated driving. Elon Musk’s Boring Company tunnel, which stretches 1.7 miles beneath the Las Vegas Convention Center, was built to ferry convention visitors. This small-scale effort has manually piloted cars; they can’t even automate that, the simplest task for automation, safely, reliably, effectively and cheaply.

And of course there’s the “robots taking over” meme. A woman told me robots were going to take over one day soon and lead to unemployment. I downplayed it as unlikely, but to deaf ears. She knew I work in I.T., yet she was suddenly the expert. Based on what? Something seen on the evening news? Well, where else? I’ve spoken with others who should know better, or at least be teachable, but the result is always intransigence, obstinacy and blind dogma. They have no defense of their claims, so what’s wrong in the world when so many are so easily brainwashed?

Here’s the grim window into their thoughts. They really seem to believe this…

Sci-fi Robots

As I mentioned in an earlier article, there aren’t even personal “butler” robots to go fetch the groceries from the car, and everyone thinks we’re in Black Mirror or Fahrenheit 451 with electric hounds on the loose, rending hapless jaywalkers limb from limb.

Well, at least I try to get a kick out of the silliness, though it’s tinged with sadness. The comments on a sensationalized article about a dummy-head, so-called robot that made an expression that “even scared its creators” are typical. The ominous and foreboding remarks ran along the lines of, “It’s not long now,” “There is much to fear,” “Soon they’ll be taking over.” Fearsome, all right: their childish perceptions. How manipulable is the public, anyway?

We saw the same type of pomposity with the Bitcoin farce, from which so many self-styled experts emerged overnight.

Have you noticed that these screeching climate goofs they put out there for public consumption never gripe about Bitcoin and its tremendous waste? It’s a transparent racket to rob money from the simple and gullible, and there’s no significant resistance to it, or warnings about it, from media or government. Bitcoin is organized crime and a confidence racket, and its implementation is a disaster that has destroyed endless amounts of real value in the form of electricity, wasted human effort and computer components, solely to make a very few rich.

What about the idea that there are secret super-robots in some skunk works somewhere, being hidden from us to be unleashed in a big surprise? Highly unlikely, much like the BS that the Lockheed Martin Skunk Works has “secretly done everything we’ve seen in Star Trek.” What an egotistical bunch of maroons, to try to sell that to us.

CPU Comparison

Curious about how one might pick between different CPUs (computer processors) when buying a computer, I dug up these articles.

Tom’s Hardware Rankings

Gaming Scan Rankings

Well, that’s overkill; it’s overwhelming. It really tells us bugger all. Yes, we could guess that the higher-numbered, higher-priced processors are the best, that the i9 is going to be better than the i3, and so on, but that’s not really what most people need to know.

Don’t give me that “everyone needs something different” BS; there are probably one or two processors from each company that are best for everyone except specialized users: a sweet spot.

It would have been nice to start with, or at least include, a chart of price/performance, but there are several factors in performance, as they demonstrate. I guess if you’re selecting a computer, it’s still a good idea to run an online benchmark, if you can find a good one and remember to do it in the stressful midst of shopping. Maybe make a checklist of things to verify when selecting a new machine, and include “run benchmark” on the list.

It seems it would be a great exercise for someplace like Tom’s Hardware to offer its own benchmark with dynamic charting, so that when you test your machine, it shows where your machine rates on the graph of CPU rankings. They should also highlight the “sweet spot” areas on the charts. This could be selectable: “If your priority is gaming, click here to highlight the best; if your priority is general use…” It might be of interest to contact Tom’s and request they do this, but, strangely, I got a 404 – Not Found when clicking on their “Contact” page, LOL.

This isn’t the Tom’s of old. Tom’s has been commandeered by some big corporation, it seems, and when you do get to a corporate contact page it’s all about services they offer. Talk about the Umbrella Corporation taking over, with the corresponding zombies.

Mission Creep

Something unseen by most people is the complexity of programming, which continues its wild ride into excess. There is a serious problem with the approach to programming: they add features, subtract features, and sometimes it’s overwhelming, or unnerving. The biggest problem was, and continues to be, of course, the folly and twisted mindset whereby languages like C# and JavaScript grow ever more complicated with each version, when the exact reverse should be happening.

Genius generally lies in simplification, not complication, and there are few geniuses to come in and clean things up, which is why things get so complicated over time. Fault lies in not recognizing the simple facts: people are very limited in their capabilities but unfettered in their egos, so they think they can do anything when they can merely do something. A perfect example is when “old-time, traditionalist” programmers didn’t like the idea of the programming language performing garbage collection automatically, without the programmer having to write instructions telling the computer when to do it (“garbage collection” in computer programming is simply the name for freeing up internal computer memory when that area of memory is no longer being used by a program).
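
To make the garbage-collection point concrete, here’s a minimal sketch in TypeScript of what automatic memory management spares the programmer; the function and numbers are invented for illustration:

// Minimal sketch: memory management in a garbage-collected language.
function summarize(): string {
  // A large temporary buffer, allocated on demand.
  const samples: number[] = new Array(1_000_000).fill(0);
  return `sample count: ${samples.length}`;
  // No free() or delete call is needed: once `samples` becomes
  // unreachable, the garbage collector reclaims the memory on its
  // own schedule. In a manually managed language, forgetting that
  // step would be a memory leak.
}

console.log(summarize());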

The whole point of computers is automation, but they objected to automation, thinking that taking on the unnecessary burden made them somehow “better programmers,” or “more authoritative,” or, worst of all, “more hardcore,” *groan*, which is nonsense. When complication piles on complication, instead of anyone seeking better, easier solutions, it usually means the authors/programmers don’t understand it themselves. There are limitations of knowledge and ability even among the most skilled, and we need to constantly review, rework and simplify to address that. In fact, there’s a term in coding that recently acquired wider use, refactoring, to describe rewriting and cleaning up software.
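
And here’s a tiny, hypothetical TypeScript sketch of what refactoring means in practice. The behavior stays the same; only the duplication is cleaned up (the tax rates are made up for the example):

// Before: the same rounding logic is duplicated in two places.
function priceWithTaxUS(price: number): number {
  return Math.round(price * 1.07 * 100) / 100;
}
function priceWithTaxEU(price: number): number {
  return Math.round(price * 1.21 * 100) / 100;
}

// After refactoring: one function, same behavior, less to maintain.
function priceWithTax(price: number, rate: number): number {
  return Math.round(price * (1 + rate) * 100) / 100; // round to cents
}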

It’s sometimes tough to understand how and why anything in computing works at all, the way it’s all cobbled together. Things are getting worse rather than better: as new complications in coding come out, you see many websites in pretty sad states of affairs, as people try to adapt things they don’t fully understand.

Here’s a dirty secret about programming: it’s mostly a patchwork. People come in and patch together a solution, and hope the changes don’t affect something else adversely. But it’s too costly, time-consuming and inconvenient in most cases to start from scratch. And starting from scratch will often get you something worse than the old system!

But why care about that? That’s their problem. Except it’s not, when it affects you: when your data is stolen, or something goes wrong in an application and inconveniences you or causes you problems. Or, if you run a company, you pay immensely for this type of nonsense.

We have to wonder how much time, energy, effort and money is squandered on bad programming, wasted efforts, dead-ends and the like. It’s like other waste in the world: we all end up paying for it in the end, usually through higher prices.

More than that, often something is extended and overextended to do things that are necessary, but beyond the original scope and intent of the program or system.

Another problem: there’s no need for endless complication, but there is a perverse desire in people to continue with something and add complication past the point of rationality.

It comes back to bite them, as in the case of Microsoft, which can’t even maintain its own browser, or produce a new one. The old browser, Internet Explorer, always was a worthless piece of Billcrap. The new browser, Edge, isn’t anything new at all. It’s a “port.” It’s a “reskin.” It’s actually Google’s Chromium browser, disguised!

Yes, the biggest software company, with revenue of $143 billion in 2020, can’t even produce its own browser, but the shills are out there, doing comparisons of which one is “better”!

Edge and Chrome are built on the Chromium open-source browser using the Blink rendering engine. So you can see original software development is overwhelming, even for big companies.

Horrors of Programming

Software needs simplification, not complication. So of course, it becomes insanely complex.

Cars, overly complex now, have gotten to the point where they’re almost unrepairable. A complaint about German luxury cars, made to a German mechanic, was met with the response that they’re only supposed to last four or five years anyway.

You know there are a lot of best practices, but if you go to work in this field, it’s all just, “get it done as fast as possible.”

Horrors of the Internet: Internet Link Rot, Bad Websites, Internet Deterioration & Simple Fixes

It’s surprising how disappointing many articles on the web are. Many are reprints of other articles; there’s lots of unsubstantiated opinion, lots of vanity pieces, lots of BS and propaganda, time wasted comparing two browsers that are basically the same thing… I mentioned it would be nice if someplace like Tom’s Hardware did a useful article letting you benchmark your own PC or test a prospective one, but that would be too much like work for them. Something useful often is. Sure, it’s a lot of work, but couldn’t they devote the resources they spend on fluff pieces to doing something real?

They’re in that “zone” where what they do is good enough to keep their publication running and attract enough readers, for a while at least, but not good enough to reach that next level of providing real value. And they have to tread lightly, because if they step on any toes at the chip-makers, they could be punished by the withdrawal of ad revenue.

This is to say, they’ve found a sort of sweet spot with these publications: basically just taking data they can gather from somewhere else (there are news services, presumably, for all sorts of businesses, including places like the chip-makers), then reformatting it, juggling it around a bit, and presenting it.

The “Tragedy of the Common” Internet

The “tragedy of the commons” refers to people exploiting things they have no ownership interest in but that are “up for grabs” as common holdings (like over-fishing lakes or the ocean). The internet is this new, fantastic resource, but it suffers from a sort of tragedy of the commons, with material getting outdated, links breaking, ads disguised as information…

A more curable problem with the web is the lack of dates on things, and files not being kept up to date.

A simple fix would be to add new HTML tags to the spec that tell search engines and users the creation date and the last-update date… Yes, after railing against added complexity, I am proposing yet another addition, but this is a pretty simple yet useful one: a compulsory field for all HTML files that records the creation date of the article, with perhaps another field to indicate the last update. Then browsers and search engines could post those dates beside the link. It’s not a fanciful idea; it’s an urgent one. It would be handy for developers, too.
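
As a rough sketch of how this could work, here’s how a search engine or browser might pull the proposed dates out of a page, in TypeScript. The tag names “doc:created” and “doc:updated” are my own invented placeholders, not part of any HTML spec:

// Hypothetical page metadata under the proposed scheme.
const page = `
  <meta name="doc:created" content="2021-12-22">
  <meta name="doc:updated" content="2022-02-01">
`;

// Extract one of the proposed date fields so it can be shown beside a link.
function extractDate(html: string, field: string): string | undefined {
  const match = html.match(
    new RegExp(`<meta\\s+name="${field}"\\s+content="([^"]+)"`)
  );
  return match?.[1];
}

console.log(extractDate(page, "doc:created")); // "2021-12-22"
console.log(extractDate(page, "doc:updated")); // "2022-02-01"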

Why couldn’t search engines incorporate a feature that allows the user to rate a link, based on how useful and how relevant to the search string it was, much the same as online video hosts have a “thumbs up/down” feature? Actually, it needs at least three ratings: quality, relevance to the search string, and whether or not it’s an ad. There’s a business opportunity here for an ambitious developer.
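
A minimal sketch of what such a rating could look like, again in TypeScript; the names and scales are just assumptions for illustration:

// Hypothetical three-part rating a search engine could collect per result.
interface LinkRating {
  url: string;
  quality: number;    // e.g., 1-5: how good is the content?
  relevance: number;  // e.g., 1-5: how well does it match the search string?
  isAd: boolean;      // is the result really an advertisement in disguise?
}

// Average quality across user ratings, to re-rank or flag a result.
function averageQuality(ratings: LinkRating[]): number {
  if (ratings.length === 0) return 0;
  return ratings.reduce((sum, r) => sum + r.quality, 0) / ratings.length;
}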

Link Rot/Not Using Links

This is another real issue with the internet, and one that also could be addressed easily. It’s another thing John Dvorak railed against years ago, in the ’80s and ’90s.

Link rot, where old links stop working, is annoying, because a lot of the time this happens even at large corporations or newspapers, which are supposed to be reliable references. In this case it falls to the developer/maintainer of the site to check for bad links periodically, and it’s a considerable amount of boring work that doesn’t provide much, if any, return on investment.
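
A periodic check doesn’t have to be elaborate, though. Here’s a minimal TypeScript sketch, assuming a runtime with a global fetch (such as Node 18+); the URLs are placeholders:

// Placeholder URLs; a real site would pull these from its own pages.
const links = [
  "https://example.com/",
  "https://example.com/old-article",
];

async function checkLinks(urls: string[]): Promise<void> {
  for (const url of urls) {
    try {
      // HEAD keeps the check cheap; some servers only answer GET.
      const res = await fetch(url, { method: "HEAD" });
      if (!res.ok) console.warn(`Broken (${res.status}): ${url}`);
    } catch {
      console.warn(`Unreachable: ${url}`);
    }
  }
}

checkLinks(links);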

Sometimes they want to cover up or hide things, so the linked article goes away.

Another annoyance is when links aren’t used at all. Everyone’s in a rush these days, and links get skipped in an article. We already saw that I had to write to TuSimple, since the article about its product on ZeroHedge didn’t include a link. This falls more under internet etiquette, it seems, but it’s something for authors to consider when they’re writing, so as not to waste their readers’ time.

There’s another thing: bare links. A bare link is one without any accompanying description to help the reader recover the material if the link breaks. Avoiding bare links simply involves describing the context and where the link goes, in case it does break.

