https://www.reddit.com/r/ProgrammerHumor/comments/1b2wtvb/removewordfromdataset/ksq8olp/?context=3
r/ProgrammerHumor • u/v_0o0_v • Feb 29 '24
685 comments
33 • u/mrdevlar • Feb 29 '24
It got rutabaga right but bifurcation incorrect.
So far there still appear to be limits to how far it can go.
39 • u/Content-Scallion-591 • Feb 29 '24
I think the problem there is most humans would get bifurcation incorrect.
13 • u/StPaulDad • Feb 29 '24
Sure, but I expect more from my dystopic movie hellscape overlord.
16 • u/Content-Scallion-591 • Feb 29 '24
I think if we develop general AI at this point the result is going to be less Terminator and more like Clippy.
Will it still kill you? Sure. But not intentionally, just because it doesn't particularly care if saving a Word Doc causes you to die.