The tech world experienced more insanity last week. We finally got confirmation from AMD that the CTS Labs security report was a tempest in a teapot, but the big question remained unanswered. A self-driving Uber vehicle killed a pedestrian, but we didn't ask the right questions. Facebook admitted that it gave our information to a bad actor -- we not only failed to reach the right conclusion, but also forgot what really would wake up Mark Zuckerberg.
I'll address all three topics and close with my product of the week: Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, a book that helps explain why so many tech companies seem to be behaving really badly.
CTS Labs' Statement of the Obvious
I touched on CTS Labs in last week's column, mentioning the screwy report that came from a company staffed with questionable folks. CTS Labs appeared out of nowhere to showcase what it claimed were major security problems with AMD's parts.
Since then, we have found out that these "major" problems weren't major at all, and that exploiting them would require administrative privileges -- the same kind of privileges Snowden had when he executed one of the biggest security breaches in U.S. history.
The CTS Labs warning was like having someone give you a security report on your home and telling you that anyone who got the keys to your house could enter it and leave your refrigerator open or your faucets on.
I don't know about you, but I wouldn't pay for that report, because I kind of already know that if someone gets the keys to my house they could steal all my stuff, murder me in my sleep, and pretty much do anything they wanted. I might not have thought about the refrigerator and faucet thing -- but really, I'm a tad more concerned about the murder or theft potential.
The big question is: Who funded this report? CTS Labs hired a U.S. public relations company to push it, but no rational person would pay for such a report, and AMD, the only firm that might make use of it, appeared to get it for free.
AMD did promise to fix the issues so that administrators could not do the three things they probably wouldn't do anyway. (I mean, as we saw with Snowden, they already can steal everything, erase all the files, or plant a virus. So what is this about -- giving them some extra time to do what, exactly?)
I wonder who would benefit from AMD getting some bad press? Let's think really, really hard...
Uber's Downward Spiral
Speaking of Intel, er, Uber -- here is a company that really seems to have a death wish. The reason I say this is that it was successfully sued for stealing self-driving technology from Waymo, a Google subsidiary. There's some irony in someone stealing from Google...
Anyway, Uber promised not to use it, so what has it been using in the self-driving cars it has been road testing? We know that Mobileye sensors are among the technologies in use, because that was announced.
You remember Mobileye -- it is the Intel-owned developer of the technology that Elon Musk blamed after a Tesla driver using Tesla's "Autopilot" feature (I still think Tesla should change that misleading name) drove into a trailer and died.
Yes, none other than Elon Musk rejected Mobileye. Apparently, Uber didn't read much into that, and it may have resulted in some poor woman's death.
Here's the deal -- optical technology is limited to what it can see. That means, just like your eyes, it performs poorly when visibility is poor. If you watch the video of the accident, you will see that the pedestrian who was hit suddenly emerges from the dark right before the car hits her.
What I'd like you to try is watching the video with your foot on the floor while pretending it is on the accelerator. When you see the woman, try to move your foot to where the brake pedal would be. Keep in mind that it likely would take 2 seconds, at least, for the car to stop.
You'll see that you likely would have hit her too. Were you using traditional cruise control, you likely wouldn't even have gotten to the brake pedal in time. The lesson here really isn't a self-driving car lesson -- it is a "wear reflective clothing at night" lesson.
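To put rough numbers on that thought experiment, here is a back-of-envelope sketch of how far a car travels between the moment a hazard appears and the moment it stops. The speed, reaction time and deceleration figures below are illustrative assumptions, not data from the Uber incident.

```python
# Back-of-envelope stopping-distance estimate.
# All figures are assumed for illustration, not measured from the accident.
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_ms2=7.0):
    """Distance covered during driver reaction plus full braking."""
    v = speed_mph * MPH_TO_MS             # speed in meters/second
    reaction_d = v * reaction_s           # distance traveled before braking starts
    braking_d = v ** 2 / (2 * decel_ms2)  # kinematics: v^2 = 2 * a * d
    return reaction_d + braking_d

# At an assumed 40 mph, the car covers roughly 50 meters before stopping --
# most of it during the reaction window, before the brakes even engage.
print(round(stopping_distance_m(40), 1))
```

The takeaway from the arithmetic matches the thought experiment: at typical road speeds, the reaction window alone eats up more distance than the headlights reveal of an unlit pedestrian.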
Self-driving cars should be able to see things that you can't, but optical sensors, for the most part, can't either. After being implicated in two fatal accidents, perhaps it is time to look at another technology. Interestingly, Intel developed something that can see through rain and snow, but it doesn't market it as part of its solution.
If the whole insider trading thing didn't bug you, I'll bet this especially makes you all warm and fuzzy about Intel Inside, doesn't it?
Facebook and Cambridge Analytica
Speaking of bad actors, Facebook is in the dog house for supplying members' personal information to Cambridge Analytica, which then used it not only to turn out more votes for President Trump, but also to discourage votes for Hillary Clinton.
I'm still amazed that after all that has come to light the administration still insists the U.S. had a legitimate election -- but then again, it also maintains that the president didn't have an affair with Stormy Daniels. Maybe it is time to ask the White House to return Steve Jobs' reality distortion field -- I think it is broken.
To pile on, Cambridge Analytica has been connected to despots winning elections. In fact, its entire gig appears to be getting into office bad people whom folks shouldn't vote for. It is kind of what it does. You know, Arby's is about the meats, and Cambridge Analytica is about screwing over voters.
If that is what it does, and it is really good at doing it, why aren't we now a tad more focused on making sure that Cambridge Analytica -- or a similar operation -- doesn't do that to us again?
As for Facebook, I get that deleting your account may sound like you are doing something, but you aren't the customer at firms like Facebook and Google -- you are the product. If you want to get a firm's attention, you'll have greater effect by boycotting the advertisers than by deleting your account.
Remember the NRA? Quitting really didn't seem to bother that group -- but boy, Delta pulling its support woke it the hell up. Maybe another path?
Still speaking of bad actors, Google decided to restrict all gun videos on YouTube. Let's stop a moment and try to figure out why. Do people buy guns from gun videos? Or do people tend to buy guns when they feel that someone will take away their right to buy them?
Who causes gun sales to go up? Republicans who support gun sales, or Democrats who don't? Historically it is the latter, which suggests that Google's move probably will cause people to buy more guns rather than fewer.
If Google really wanted to stop school shootings, then taking some of the billions it makes and giving more voice to the students driving that initiative certainly would be a start. Since we know kids don't understand consequences, creating compelling videos that showcase those consequences might help too.
I mean, here is a company that was highlighted as a bad actor in Brotopia, that instituted a hiring policy favoring engineers -- who are mostly men -- in an industry known for excessive discrimination against women, and that is called out again in the book I recommend as my product of the week. Maybe it is time for Google to try being one of the good guys?
As I mentioned in last week's column on fake news, we really need to focus on what is important. Intel is off the rails, and I'm not just talking about insider trading. Uber and Intel could kill autonomous cars, which otherwise would save rather than take lives. Facebook appears to be on the wrong side of democracy (and if there were ever a firm that needed a crisis team...). Google, even when it tries to do something good, does something bad.
Why does it suddenly seem like so many powerful tech firms are run by idiots? Still, as consumers, we do have a choice of which companies we buy from and which get our business. Perhaps we all should be taking a bit more time to choose good actors rather than bad ones.
Product of the Week: Technically Wrong
A book everyone should read -- particularly those of us who have had bad experiences with companies like Amazon, Google, Facebook and Uber -- is Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech.
Here is the net of it: These firms really, really don't give a crap about your experience. In many cases you aren't even the customer, even though they fool you into thinking you are. You are basically their product, and you'd think they'd care about such a valuable product -- but they don't, because you don't pay them money.
We are effectively a new class of slave. I expect that the government eventually will come around to the idea that this really isn't a good thing for its citizens. These tech giants make billions of dollars from our personal information. We don't make billions -- they do. That's slavery. Slaves don't make money, slavers do.
That isn't the author's term, but she is clear that this new class of firms is hostile to consumers, and she clearly has deep knowledge of them (the book is massively referenced, just as Brotopia is).
Remember I'm the guy who nearly was killed because of Facebook, was cut off by Amazon for daring to question questionable charges, was cut off by eBay for not wanting to give it double access to my bank account, and has long thought that Google should be synonymous with theft and sexual misconduct.
If you often feel screwed by this new class of company, read this book and you'll understand why. It doesn't really tell you what to do about it, but perhaps it will help you choose the company you want to work for or do business with. Just maybe it will help the next wave of tech companies to be something other than James Bond villains.
Because Technically Wrong helped explain why so many new companies are assh*les, it is my product of the week.