ChatGPT, Bard, et al

pcwildman

If you are trying to send a pic, it's not coming through.

GreenMoon07
Idilis’ flag? Take a wild guess
paper_llama

Well, this was quickly overrun with children. Maybe DD will exercise some blocking...

pcwildman

Starland is our own little Utopia, where all the women are strong, all the men are good-looking, and all the children are above average.

DiogenesDue
Philfiu wrote:
Chat GTP is very good for my homework my mom doesn’t know it but I like to do English homework with chat gyp

It shows.

DiogenesDue
zolidmill wrote:

Well, this was quickly overrun with children. Maybe DD will exercise some blocking...

No can do, this thread is also an experiment. There's a parallel thread running somewhere, and this is the placebo group that establishes the baseline.

paper_llama
DiogenesDue wrote:
zolidmill wrote:

Well, this was quickly overrun with children. Maybe DD will exercise some blocking...

No can do, this thread is also an experiment. There's a parallel thread running somewhere, and this is the placebo group that establishes the baseline.

I'll delete my serious posts then. Have fun.

paper_llama
Philfiu wrote:
What?

If you're in a junkyard and you don't know where the refuse is, you're it.

Or something like that.

DiogenesDue
paper_llama wrote:

I'll delete my serious posts then. Have fun.

That also impacts the experiment, but your choice.

paper_llama
DiogenesDue wrote:
paper_llama wrote:

I'll delete my serious posts then. Have fun.

That also impacts the experiment, but your choice.

Ignoring my post, choosing to interact with children instead, then telling me you never took the topic seriously to begin with does have an impact, true. Maybe not how I'd run the experiment, but your choice.

DiogenesDue
paper_llama wrote:

Ignoring my post, choosing to interact with children instead, then telling me you never took the topic seriously to begin with does have an impact, true. Maybe not how I'd run the experiment, but your choice.

I didn't ignore your post. But I guess I can't respond to it now 🙂.

I am taking both threads seriously.

pcwildman

I've always laughed every time I hear the name Diogenes. Cynicism, huh? Sounds a lot like Buddhism. 🤣

DiogenesDue
pcwildman wrote:

I've always laughed every time I hear the name Diogenes. Cynicism, huh? Sounds a lot like Buddhism. 🤣

The Cynics' philosophy (their actual philosophy, not the "cynical" interpretations invented later) does have similarities with Buddhism, yes.

pcwildman

#70 I was having a little fun with our new friend #59. I play Deus Ex: Mankind Divided. It's one of the last Thief versions, and you can play it non-lethally, which I mostly do. Every once in a while I will go Full Metal Jacket, and usually you are trying to stop the evil cops from killing the innocent populace. I like Thief because it was the first game where you didn't just kill people or monsters, a la Doom. In the original it was best not to have any interaction at all with anyone. In the new one you have to just pretend that, yes, these are evil cops and I must stop them and protect the innocent. One shot, one down. Two if he has a helmet. You play this game and you realize how ridiculous Hollywood machine-gun scenes are. I'm killing to save lives. That was our reasoning behind Hiroshima. 😥 I also just went broke saving money.

DiogenesDue
MelvinGarvey wrote:

I don't know if anyone pointed this out already, but Isaac Asimov himself refuted his "3 laws of robotics" in his "I, Robot", demonstrating that if pushed to their extreme (which a machine would surely do), the 3 laws of robotics would inevitably lead an AI to decide to murder some people, on the reasoning that it calculated fewer people would be dead in the end, and so on, and so forth.

The 3 laws of robotics fall down in several ways, because words are fuzzy and vague and the 3 laws are based on words...

Isaac Asimov's Three Laws of Robotics are:

* First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
* Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
* Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

It doesn't take much to mess up the 3 laws, which is why Asimov was able to write so many stories using them as a plot device.

- The definition of "harm" is vague.

- The split between harming "by action" and allowing harm "through inaction" sets up contradictions that cannot be resolved.

- The definition of "action" versus "inaction" is also vague. If someone pushes a robot over and it is going to fall on someone who is trying to move out of the way, which is considered "action" or "inaction"...for the robot to attempt to maintain balance and keep it's existing standing state, to fall but attempt to avoid hitting the person, to fall unimpeded and unmoving, or...?

- The definition of "obey" is vague.

- The definition of "would conflict with" is vague.

- The definitions of "protect" and "existence" are vague.

...and so on. Language is a very loose approximation of reality, so trying to tell a computer in human words how to act in reality doesn't work. Just like programming today, you have to be 100% explicit to get an AI to do anything correctly. That requires a non-vague language with no interpretations involved. Such a human language does not exist.
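To make the vagueness concrete, here is a minimal sketch in Python (all names below are hypothetical, invented purely for illustration; nothing here comes from Asimov or from any real robotics API). The moment you try to encode the First Law directly, every predicate the law depends on turns out to be unspecified:

```python
# Hypothetical sketch: a naive attempt to encode Asimov's First Law.
# The control flow is trivial; the undefined predicates are the problem.

from dataclasses import dataclass


@dataclass
class Action:
    description: str


@dataclass
class Human:
    name: str


def causes_harm(action: Action, human: Human) -> bool:
    # "Harm" is never defined by the law: physical injury only?
    # Emotional distress? Long-term statistical risk? There is no
    # decision procedure to implement here without inventing one.
    raise NotImplementedError('"harm" is not defined by the First Law')


def allows_harm_through_inaction(action: Action, human: Human) -> bool:
    # "Inaction" is just as fuzzy: doing nothing, doing something else,
    # or failing to prevent a harm the robot never predicted?
    raise NotImplementedError('"inaction" is not defined by the First Law')


def first_law_permits(action: Action, human: Human) -> bool:
    # Everything interesting has been pushed into the two stubs above,
    # which is exactly where the natural-language law stops specifying anything.
    return not causes_harm(action, human) and not allows_harm_through_inaction(
        action, human
    )
```

Any runnable version has to replace those two stubs with someone's interpretation of "harm" and "inaction", which is the whole point: the laws don't survive the translation from words into a decision procedure.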

pcwildman

@DiogenesDue, that is a headful. The sense I am getting is, no matter how well we code, we will never be able to instill mores, a soul, a heart, morals, right and wrong, whatever you want to call it, into a machine. Which is what we've all supposed all along, but there's a lot more to it than that, now that we are actually at, or approaching, the crossroads. Terminator still rules. Carl the Draper. ROTF. Judgement is a relative thing. ChatGPT acknowledges at every turn that it is not human. I don't even know what to make of all this. The possibilities are endless. I have some pondering to do. Thanks for all the info.

 

Kavmaj
I think AI may cost some jobs in the short run, but in the long run it will create jobs. Many of the jobs that exist today didn’t exist 30 years ago.
Kavmaj
Wdym by regurgitative?
TheRealTorchLit
Rubbish, comparing it to vomit
Kavmaj
“That’s the same thing said of the Industrial Revolution.” I’m not sure how to interpret that statement.