I bet shareholders of SpaceX are thrilled to be exposed to this for no reason
SpaceX owners do not care. If they were risk-averse, they would have dumped SpaceX like it was toxic waste.
In a broader sense, this "I bet Oracle shareholders hate their bad PR" attitude is really zero-sum. It's pervasive on HN, and we rarely see bad PR snowball beyond niche discussions. I want $BIGCORP to collapse as much as the next guy, but the outrage-driven comments don't seem to reflect the market's response.
You would lose this bet.
This is the equivalent of suing Photoshop when people use it to make something you don't like.
I don't expect this kind of dramatic internet-forum-style false equivalence on HN.
Argue your opinion on its own terms, rather than simply pointing to something else and saying "they're the same".
It’s more like emailing Google for CSAM and suing Google for emailing it back.
Or in the near future, “Why are we suing the robot company?! Bob told the robot to kill the child!”
does photoshop have a “make porn of this person” button? does it have a “make me csam” button?
Nor did Grok. You had to prompt what you wanted to have happen.
Typing a description of an image is comparable to finding a button in a menu in terms of ease/enablement. The real difference is that Grok explicitly decided not to add even simple (if imperfect) guards, which would have imposed no real burden on their part, and that is morally and spiritually hideous.
I hope they experience consequences.
And it had blatantly insufficient safety guards around that. I saw photos of 4th graders, labeled accordingly, that Grok happily modified. Given that the model does have access to the text content of the posts it's replying to, that's wildly negligent. This is not a simple brush tool we're talking about, and refusing requests to strip clothes off people is one of the most basic measures that could be taken here.
It's called the "brush" tool.
Do you really think that when people make things that carry risks from their use or misuse, they have no responsibility to mitigate those risks or prevent that misuse?
Adobe doesn't provide tools explicitly designed to enable the creation of child pornography — in fact their tools try to prevent its creation — and they don't profit from the sale of it. But, of course, Musk fanboys can be reliably counted upon to support profiteering from child sexual abuse in any form.
"Something you don't like" as a description for the deliberate sexualization of children for profit, as if it's not an objective moral harm, is telling on yourself here. Just because the loudest leaders in Silicon Valley have been trying to convince every one of their sycophants that sexually abusing kids is no big deal doesn't mean the rest of us who are normal have bought into it.
Reducing safety filters doesn't mean that it's explicitly designed for child pornography. This is like thinking free speech is designed for child pornography.
You seem to be leaning heavily on analogy, which is inherently flawed. The entire point of analogy is that you are comparing two different things without actually comparing them - just declaring them equal. It's a weak rhetorical tool for petty arguments.
one time i mentioned peter thiel and dang immediately responded scolding me for not engaging in constructive debate.