Moral AI

I'm keeping my house dumb for as long as I can. I don't trust my Android phone as it is. I had a discussion about Google's voice search on Android, and I maintain it is always listening, even if only passively. The top contributor on the Google forum disagreed, saying it only listens when you say the words "Ok Google." I have to ask: how does it know you said "Ok Google" if it's not listening?
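To make that point concrete, here is a minimal sketch of how any wake-word feature has to work. This is purely illustrative and not Google's actual code; mic, detect_hotword, and send_to_cloud are hypothetical placeholders. The point is simply that the microphone must be read continuously, even if the matching happens on the device.

import collections

FRAME_MS = 30        # size of each audio chunk read from the mic
BUFFER_FRAMES = 100  # keep roughly the last few seconds of audio

def listen_forever(mic, detect_hotword, send_to_cloud):
    # The mic is read continuously into a small rolling buffer kept on the device.
    ring = collections.deque(maxlen=BUFFER_FRAMES)
    while True:
        frame = mic.read(FRAME_MS)     # always listening, even if only locally
        ring.append(frame)
        if detect_hotword(ring):       # on-device match for the wake phrase
            send_to_cloud(list(ring))  # only now would audio leave the device

In other words, "only listens after the wake word" really means "listens all the time, but claims to only act on it after the wake word."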

I think we need to step back and really look at A.I. and not turn it loose with too much authority. Weapons are being developed that are supposed to ultimately save lives by keeping soldiers out of harm's way. A.I.-controlled drones with kill capabilities are in the works, with the proposed intent of going on dangerous missions and eliminating certain targets without the need for boots on the ground. Saving lives is good, but A.I. with kill authority could potentially do more harm than good. Sure, there is always human error to worry about, but there is no replacing gut instinct and human compassion.

Some say I'm paranoid. Well, just because I'm paranoid does not mean they are not out to get me.
 
Under the Geneva Conventions, autonomous weapons capable of making the decision to kill on their own are illegal. The demilitarized zone between the Koreas has machine-gun turrets with this capability, but a person has to pull the trigger. The USA has autonomous weapons, and it only abides by the Geneva Conventions when dealing with other Geneva-compliant countries and with terrorists it claims have no affiliation with known terror groups. Everyone else is fair game for these kill bots.
 
On this topic: a few years ago, when I'd flown out of state, my toddler grandchild played near my phone, speaking and babbling into it. Suddenly, a voice from the phone interrupted my grandbaby and tersely asked if I needed to call an ambulance! (My voice is quite similar to my grandchild's.)

(So maybe it's already there, as far as moral A.I. goes.)
 
Interesting. I just read an article this week about new cars having the ability to call 911 if started without the key or "chip" (i.e., stolen), but also being able to call if you break the state's "super speeder" laws or reckless driving laws. It also said that every car now has the ability to have devices like Google Assistant or Alexa installed for voice control of the radio, GPS, and phone services.
Wow, when you throw OnStar, Google Home, Alexa, smart appliances, home computers, and webcams into the mix (if you have a laptop, tablet, or smartphone, you have a webcam), you realize we have lost all privacy. There is no escaping it unless you get rid of everything from the TV to the internet. We are doomed... lol

If something can be abused, it will be abused.