Okay, so back in the mid-1970s, LISP was the language of choice for creating AI applications. Then, along came Prolog from France, which Japan later championed for its Fifth Generation project. A computerized psychotherapist named ELIZA, dating back to the mid-1960s, dominated the AI talk for regular people.
AI couldn't do anything but give developers headaches back then.
Now, we have all this computing power sitting in warehouse-like buildings, heating up the landscape so much that you can almost see it in satellite images. How is it useful?
We can ask for photo manipulations. We can ask for essays to be written. We can ask for a workout routine.
Microsoft has just introduced a new type of laptop computer with dedicated AI hardware, right down to a dedicated key on the keyboard. However, aren't they the people who put Cortana in the dustbin? And Clippy before that?
I've no doubt that in 40 years, AI will be useful for the average person. Right now, I can see people using AI to cheat. AI can also help to detect when someone is cheating.
I could see people training AI to hack. Imagine hackers who have tried so many combinations that they haven't slept in days. If they could train an AI on example after example of things to try, they could hand it the list and get some sleep.
Android 15 and iOS 18 are coming with some AI enhancements. We'll see what works. If Siri worked better than Apple PlainTalk from 1998, I'd be impressed. Google Assistant is better but isn't much more useful.
People who are worried about Skynet-type AI are far too early. When someone gets that level of AI to work, I'll be impressed, but it's more likely that humanity will eliminate itself long before that happens.
Update 2024.07.16: It's odd to me to see all these headlines about ethics and AI. I've seen people use other people's work for years without paying for its use, without even giving credit to the author.
Update 2024.07.24: I'm waiting for Apple to stop fixing security flaws in iOS 17 to push us to move to iOS 18, which will include AI in multiple doses. Apple hasn't been getting operating system features ready in time for the release, so they've been shipping something with basic features and adding what they promised in spurts. I understood this during the pandemic, when people weren't as coordinated as they should have been.
I've seen people from Apple, Google, HP, and other Silicon Valley-based companies coming to shop after their three-hour commute. They want their own house, and things are less expensive out here. Well, they were less expensive before they arrived.
In any case, they've got to get it together and build software that isn't full of mistakes. You'd think that they could train AI to look for mistakes in their programming code and highlight them to be fixed. That might be the best use of AI right now.
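For what it's worth, the pieces to try that already exist. Below is a minimal sketch, assuming the official openai Python package (v1+) and an API key in the environment; the model name and the buggy example function are just illustrative, not anyone's actual review pipeline.

```python
# Rough sketch of AI-assisted code review: hand a snippet to a chat model and
# ask it to point out likely mistakes. Assumes the `openai` package and an
# OPENAI_API_KEY in the environment; the model name and the buggy example
# function are purely illustrative.
from openai import OpenAI

client = OpenAI()

SNIPPET = '''
def average(numbers):
    total = 0
    for n in numbers:
        total += n
    return total / len(numbers)  # blows up on an empty list
'''

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any capable chat model would do
    messages=[
        {"role": "system",
         "content": "You are a code reviewer. List likely bugs, one per line."},
        {"role": "user", "content": SNIPPET},
    ],
)

print(response.choices[0].message.content)
```

In practice you'd wire something like this into the build or the pull-request flow, so the model's nitpicks show up before a human ever looks at the code.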
Update 2024.10.23: Just saw a Siri vs. ChatGPT article claiming that Siri is two years behind ChatGPT. I have to question whether Siri is only two years behind. Maybe that's the latest build they're testing and not the one most people are using, because that one seems to be four to six years behind everything else.